Custom LLM
I don't plan to use any provided AI chatbot. I want to use models that I have downloaded and tested myself and which run on my PC. So please add the option to connect through the major LLM APIs, the way tools far simpler than Firefox already allow.
Chosen solution
Oh OK, I found it. You have to install something like Open WebUI (locally, with Docker for example), connect it to your local provider such as LM Studio (possibly with some tweaking: API key, CORS), and then write the Open WebUI URL into about:config > browser.ml.chat.provider so Firefox can use this UI in its chat window.
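For reference, a minimal sketch of the setup described above, assuming Docker is installed and LM Studio is serving its OpenAI-compatible API on its default port 1234 (the container name, host port 3000, and API key are placeholders to adjust for your install):

```shell
# Run Open WebUI locally and point it at the LM Studio endpoint on the host.
# host.docker.internal lets the container reach services running on the host.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:1234/v1 \
  -e OPENAI_API_KEY=your-local-key \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Then, in Firefox, open about:config and set:
#   browser.ml.chat.provider = http://localhost:3000
```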
At this point, one may ask why use Firefox for this at all, since we could just keep a tab open on Open WebUI directly. But we can select text on a web page and ask questions about it, and Firefox adds the selection to the chat window and prompt automatically. Not the best integration, but at least there is some added value.
And Paul's tutorial adds a nice complement to this, especially regarding adding and customizing prompts: https://support.mozilla.org/en-US/questions/1570431#answer-1805199 Now we can feel powerful.
All Replies (5)
You can go to browser.ml.chat.provider in about:config and set the URL to whatever you like. Even something like http://localhost:1234/ for a LLM server instance running locally.
Good to know, but how do I set up the model (I have several) and the API key (yes, I use one locally)? I found a few settings for smartassist options, but nothing for this one.
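For context on the "several models" part: an OpenAI-compatible local server such as LM Studio exposes its loaded models at GET /v1/models, and you pick one by passing its id in subsequent requests. Here is a small sketch of what such a response looks like and how to pull out the model ids; the payload and model names are illustrative, not real output:

```python
import json

# Illustrative sample of a GET /v1/models response from an
# OpenAI-compatible local server (model ids are made up).
sample = json.loads("""{
  "object": "list",
  "data": [
    {"id": "llama-3.1-8b-instruct", "object": "model"},
    {"id": "qwen2.5-7b-instruct", "object": "model"}
  ]
}""")

# Each entry's "id" is what you pass as the "model" field in chat requests.
model_ids = [m["id"] for m in sample["data"]]
print(model_ids)  # ['llama-3.1-8b-instruct', 'qwen2.5-7b-instruct']
```

With a live server you would fetch this from something like http://localhost:1234/v1/models, sending your local key in an Authorization: Bearer header.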
Actually, it seems unsuited. Firefox sends a GET request rather than using a typical LLM API, which works with POST. It looks more like you are supposed to provide the URL of a website here. There is also browser.ml.chat.hideLocalhost, which is enabled by default and can be disabled to show a localhost entry in the chatbot list. But used this way it fills the option you mention with localhost:8080, so it probably expects a website.
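You can check this kind of thing yourself with a tiny stand-in server that records the HTTP method of whatever hits it; pointing browser.ml.chat.provider at its address would reveal the method Firefox uses. A self-contained sketch using only the Python standard library (the `?q=hello` query string is just an illustration):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

seen_methods = []  # records the method of each incoming request

class MethodLogger(BaseHTTPRequestHandler):
    def _handle(self):
        seen_methods.append(self.command)  # "GET", "POST", ...
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    do_GET = do_POST = _handle

    def log_message(self, *args):
        pass  # silence per-request console logging

# Port 0 asks the OS for a free port; use server.server_port to find it.
server = HTTPServer(("127.0.0.1", 0), MethodLogger)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A navigation-style request, like a browser loading a page, is a GET.
url = f"http://127.0.0.1:{server.server_port}/?q=hello"
urllib.request.urlopen(url).read()
server.shutdown()
print(seen_methods)  # ['GET']
```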
Hi
This independent article may help:
https://dev.to/plwt/advanced-configuration-of-the-ai-chatbot-in-firefox-4l0p