Ollama support for AI chatbot?
Can we get Ollama local AI models to use in the AI chatbot? It would be more private and give users more flexibility.
All Replies (1)
Hi, I believe you can. Enter about:config in the address bar and set your local chatbot's URL in the browser.ml.chat.provider preference. You may also need to switch browser.ml.chat.hideLocalhost to false so that localhost URLs are accepted.
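For example, the two preferences might end up looking like this (the URL and port here are hypothetical; use whatever address your local chatbot front end actually serves):

```
browser.ml.chat.provider      http://localhost:8080
browser.ml.chat.hideLocalhost false
```

Note that this preference expects a web page the sidebar can load, so you would point it at a browser-based front end for your local model rather than at a bare API endpoint.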