Hey, is support for ollama planned? I tried it, and thought the ollama API and the OpenAI API were the same, but I can't get it working.
Kind regards,
pmf
Hi, ollama works well via the OpenAI API. Here is a working config:
{
  "homeserver": "https://<redacted>",
  "user_id": "@<redacted>:<redacted>",
  "password": "<redacted>",
  "device_id": "matrix_chatgpt_bot",
  "gpt_api_endpoint": "http://192.168.0.163:11434/v1/chat/completions",
  "gpt_model": "qwen2.5"
}
Make sure http://192.168.0.163:11434/v1/chat/completions is accessible by matrix_chatgpt_bot.
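To illustrate why pointing `gpt_api_endpoint` at ollama works, here is a minimal sketch of the request body an OpenAI-style chat completions endpoint expects. The IP, port, and model name are taken from the config above; the helper function name is hypothetical, not part of matrix_chatgpt_bot.

```python
import json

# Endpoint and model from the config above (assumed reachable on your LAN).
ENDPOINT = "http://192.168.0.163:11434/v1/chat/completions"
MODEL = "qwen2.5"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for an OpenAI-compatible chat completion call.

    ollama accepts this same shape on its /v1/chat/completions route,
    which is why the bot's OpenAI client can talk to it unchanged.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request(MODEL, "Hello from matrix_chatgpt_bot")
print(json.dumps(payload))
```

Sending this body as a POST to the endpoint (e.g. with `curl -d @payload.json`) is a quick way to confirm the bot's host can actually reach the ollama server before debugging the bot itself.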
Thanks a lot, that solved the issue!