Ollama support for AI chatbot?
Can we get Ollama local AI models to use in the AI chatbot? It would be more private and give users more flexibility.
Chosen Solution
Hi, I guess you can. Enter about:config in the address bar and set your local chatbot URL in the browser.ml.chat.provider preference. You may also need to set browser.ml.chat.hideLocalhost to false.
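For reference, the two preferences might look something like this in about:config (11434 is Ollama's default port, so it's only an assumption here; note the preference expects a URL Firefox can load as a chat page, so you'd typically point it at a local web UI running in front of Ollama rather than the bare API):

```
browser.ml.chat.provider      = http://localhost:11434
browser.ml.chat.hideLocalhost = false
```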
All Replies (3)
Thank you so much 😄
Also, is there any way to remove or increase the page-length limit for summarizing? Since it's local, I can just let it summarize much longer pages with no problem.
Yes, you can increase browser.ml.chat.maxLength in about:config.
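For example (the default value differs between Firefox versions, so the number below is only illustrative, not a recommendation):

```
browser.ml.chat.maxLength = 100000
```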