

Ollama support for AI chatbot?

  • 3 replies
  • 0 have this problem
  • Last reply by TyDraniu
  • Solved

Can we get Ollama local AI models to use in the AI chatbot? It would be more private and give users more flexibility.



All Replies (3)

Chosen Solution

Hi, I guess you can. Enter about:config in the address bar and set your local chatbot URL in the browser.ml.chat.provider preference. Perhaps you also need to switch browser.ml.chat.hideLocalhost to false.
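As a concrete sketch of that reply, the same two preferences could also be set in a `user.js` file in the Firefox profile directory instead of through about:config. The localhost URL below is an assumption, not part of the reply — it should point at whatever local chat page fronts your Ollama models (Ollama's API itself has no chat page, so a local web UI is typically needed):

```js
// user.js sketch — point the Firefox AI chatbot at a local provider.
// The URL is an example assumption: replace it with your local chat UI's address.
user_pref("browser.ml.chat.provider", "http://localhost:8080");
// Allow localhost URLs to be used as the chatbot provider.
user_pref("browser.ml.chat.hideLocalhost", false);
```

Firefox reads `user.js` on startup, so a restart is needed for the prefs to take effect; changes made in about:config apply immediately.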

thank you so much😄

Also, is there any way to remove or increase the limit on the summarize-page length? Since it's local, I can let it summarize much longer pages without a problem.

Yes, you can increase browser.ml.chat.maxLength in about:config.
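In the same `user.js` style, a sketch of raising that limit (the value is an arbitrary illustration, not a documented default or recommendation):

```js
// user.js sketch — raise the character budget for page summarization.
// 200000 is an illustrative value, not a default or recommended setting.
user_pref("browser.ml.chat.maxLength", 200000);
```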
