Inquiry regarding integration of local LLMs (Ollama/LM Studio) into the Mozilla ecosystem
Hi Mozilla Community,
With the rapid growth of local AI, I am interested in how we can better utilize local models—specifically via tools like Ollama or LM Studio—within the Mozilla ecosystem.
Given Mozilla's core commitment to privacy and user agency, using local LLMs seems like a perfect fit for:
- Privacy-preserving browser assistants: using Firefox extensions to interact with local endpoints (e.g., localhost:11434 for Ollama).
- Local content analysis: summarizing or analyzing web pages without sending sensitive data to third-party cloud providers.
- WebGPU research: exploring how local inference could be integrated more deeply into the browser experience.

Are there any ongoing projects or planned features that would allow users to connect their local AI endpoints to Firefox? Or is there a recommended community-driven way to implement this via extensions?
I would love to hear your thoughts on this direction.
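To make the extension idea concrete, here is the kind of code I have in mind — a rough sketch that assumes Ollama's local REST API (`POST /api/generate` with `stream: false`) and a locally pulled model named `llama3`; both are assumptions about the Ollama side, not Mozilla-provided APIs:

```javascript
// Sketch: calling a local Ollama endpoint from a Firefox extension,
// so page text never leaves the machine. Assumes Ollama is running on
// localhost:11434 and a model named "llama3" has been pulled.

const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the JSON body Ollama's /api/generate endpoint expects;
// stream: false makes it return a single JSON object instead of chunks.
function buildSummaryRequest(pageText, model = "llama3") {
  return {
    model,
    prompt: `Summarize the following page:\n\n${pageText}`,
    stream: false,
  };
}

// Pull the generated text out of a parsed (non-streaming) Ollama reply.
function extractResponseText(reply) {
  return typeof reply.response === "string" ? reply.response : "";
}

// In an extension background script (the manifest would need a host
// permission for http://localhost:11434/*):
async function summarizePage(pageText) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSummaryRequest(pageText)),
  });
  return extractResponseText(await res.json());
}
```

The appeal of this pattern is that the extension only ever talks to 127.0.0.1, which lines up well with Mozilla's privacy stance.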
Chosen solution
You can set browser.ml.chat.provider to http://localhost:11434 in about:config to point the sidebar AI chat at a local endpoint.
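As a user.js sketch, the same setup would look like this — note that browser.ml.chat.hideLocalhost is my assumption about the pref that exposes localhost providers in the chatbot's provider list, so verify it in your Firefox build before relying on it:

```js
// user.js sketch for the Firefox AI chatbot sidebar (verify pref names in about:config)
user_pref("browser.ml.chat.provider", "http://localhost:11434");
user_pref("browser.ml.chat.hideLocalhost", false); // assumption: shows localhost providers
```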