# Setting Up Ollama for Seamless Integration with NextChat

Please ensure your NextChat client version is v2.11.2 or later, and your Ollama version is later than v0.1.24.
For different system environments, refer to the configuration methods outlined at https://github.com/ollama/ollama/blob/main/docs/faq.md.
1. Configure Ollama CORS:
   - If you are accessing Ollama from the NextChat client, allow the client's origin by setting the `OLLAMA_ORIGINS` environment variable before starting Ollama.
   - If you are accessing Ollama from a non-local source, make it reachable by setting `OLLAMA_HOST` to `0.0.0.0`.

   It is recommended to replace `0.0.0.0` with the specific domain or IP address you intend to use.
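As a sketch, the two settings above can be applied as environment variables before launching the server (`OLLAMA_ORIGINS` and `OLLAMA_HOST` are documented in the Ollama FAQ linked earlier; adjust the values to your deployment):

```shell
# Allow cross-origin requests from any origin (NextChat client use case).
export OLLAMA_ORIGINS="*"

# Listen on all interfaces so non-local clients can reach the API.
# Prefer a specific IP address or domain instead of 0.0.0.0 in production.
export OLLAMA_HOST="0.0.0.0"

ollama serve
```

On macOS, if Ollama runs as an app rather than from a shell, the equivalent is `launchctl setenv OLLAMA_ORIGINS "*"` followed by restarting Ollama.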
2. (Optional) Configure an HTTPS certificate for the Ollama API:
   - This step is necessary only if you are accessing the Ollama API over HTTP from a NextChat instance hosted on an HTTPS website, since browsers block mixed (HTTPS-to-HTTP) requests.
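One common way to satisfy this requirement (a sketch, not the only option) is to terminate TLS with a reverse proxy such as nginx in front of Ollama; the domain and certificate paths below are placeholders you would replace with your own:

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.com;                  # placeholder domain

    ssl_certificate     /etc/ssl/certs/ollama.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/ollama.key;

    location / {
        proxy_pass http://127.0.0.1:11434;           # default Ollama port
        proxy_set_header Host $host;
    }
}
```

NextChat would then be pointed at `https://ollama.example.com/` instead of the plain-HTTP address.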
3. Configure the Ollama API in NextChat:
   - Update `OpenAI Endpoint` to your Ollama deployment endpoint (e.g. `http://localhost:11434/`).
   - Leave `OpenAI API Key` empty.
   - Set `Custom Model` to the model you want to use (e.g. `gemma`, `mistral`, `llama`). Please ensure you have the model installed in your Ollama instance.
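To confirm the endpoint NextChat will call is reachable, you can exercise Ollama's OpenAI-compatible chat route directly with `curl`. The model name `gemma` here is an assumption — substitute any model you have pulled:

```shell
# Query Ollama's OpenAI-compatible endpoint; no API key is required.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gemma",
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```

A JSON chat-completion response confirms the setup; an error about an unknown model means you still need to run `ollama pull <model>` first.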