Firefox
Created: 11 hours ago
Smart Window lets you connect your own AI model instead of using the ones provided by Firefox. This is helpful if you want more control, use a specific provider, or run a model locally on your device.
You can connect either:
- A remote model (such as OpenRouter)
- A local model running on your device (such as Lemonade Server or Ollama)
Note: If you use a custom model, Smart Window may not work as expected. This feature and these instructions are catered towards users who are familiar with these services and tools.
Table of Contents
- Use a remote model (OpenRouter)
- Use a local model
Use a remote model (OpenRouter)
- Create an OpenRouter account if you do not have one already, at https://openrouter.ai/
- Generate an API Key in OpenRouter, and copy it to a secure place.
- OpenRouter API keys begin with sk-or-v1-
- Open the OpenRouter models page and choose a model you would like to use.
- Take note of its model ID (for example, z-ai/glm-4.5-air:free)
- In Firefox, open .
- Go to > >
- Select
- Fill in the fields:
- Model name: Paste the OpenRouter model ID from step 3
- Model endpoint: Enter the OpenRouter API endpoint, which is typically https://openrouter.ai/api/v1
- API key: Paste your OpenRouter API key from step 2
- Click
- Open a Smart Window, and start using the Assistant
Tip: You can find free models on OpenRouter by searching for “free” on the models page.
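Before entering these values in Firefox, you may want to confirm that your key and model ID work. OpenRouter exposes an OpenAI-compatible chat completions API, so a minimal sketch using only Python's standard library looks like this (the key and model ID below are placeholders, substitute your own):

```python
import json
import urllib.request

# Placeholders: substitute your own API key and chosen model ID.
API_KEY = "sk-or-v1-your-key-here"
MODEL_ID = "z-ai/glm-4.5-air:free"
ENDPOINT = "https://openrouter.ai/api/v1"

def build_chat_request(endpoint, api_key, model_id, prompt):
    """Build an OpenAI-compatible chat completion request."""
    url = endpoint.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request(ENDPOINT, API_KEY, MODEL_ID, "Say hello.")
# Uncomment to actually send the request (requires network access and a valid key):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the request succeeds here, the same endpoint, model ID, and key should work in the Firefox settings fields.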
Use a local model
Example: Lemonade Server
- Download and install Lemonade Server at https://lemonade-server.ai/. You must use version 10.2.0 or newer.
- Run Lemonade Server and download a model of your choice using the app instructions.
- In a command line terminal, set a larger context size with the command lemonade config set ctx_size=8192
- Reload the model from the UI or with the command lemonade unload (the next time you make a request, the model will load with your new settings)
- In Firefox, open .
- Go to > >
- Select
- Fill in the fields:
- Model name: Enter your model name from step 2 (for example, SmolLM3-3B-GGUF)
- Model endpoint: Enter the Lemonade Server endpoint, which is typically http://localhost:13305/api/v1
- Note that no API key is required for Lemonade Server
- Click
- Open a Smart Window, and start using the Assistant
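To check that Lemonade Server is running and see which models it is serving, you can query its endpoint directly. Assuming it follows the OpenAI-compatible convention of a /models route under the endpoint shown above, a small Python sketch:

```python
import json
import urllib.request

# The default endpoint from the steps above; adjust if yours differs.
ENDPOINT = "http://localhost:13305/api/v1"

def models_url(endpoint):
    """URL of the OpenAI-compatible model-listing route."""
    return endpoint.rstrip("/") + "/models"

def list_models(endpoint):
    """Return the IDs of models the local server currently offers."""
    with urllib.request.urlopen(models_url(endpoint), timeout=5) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

# Requires Lemonade Server to be running locally:
# print(list_models(ENDPOINT))
```

The model IDs returned are the names you would paste into the Model name field.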
Example: Ollama
- Download and install Ollama at https://ollama.com/download
- Run Ollama, and follow the instructions on the site to download a local model of your choice
- Open the Firefox settings screen, and go to > > , and select
- Fill in the fields:
- Model name: Enter your model name from step 2 (for example, qwen3.5:4b)
- Model endpoint: Enter the Ollama API endpoint, which is typically http://localhost:11434/v1
- Note that no API key is required for Ollama
- Click
- Open a Smart Window, and start using the Assistant
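Ollama's /v1 endpoint speaks the same OpenAI-compatible protocol, but with no API key. A minimal sketch of a single chat turn against the local server (the model name is an example; use whichever model you pulled in step 2):

```python
import json
import urllib.request

# Default Ollama endpoint from the steps above; model name is an example.
ENDPOINT = "http://localhost:11434/v1"
MODEL_ID = "qwen3.5:4b"

def chat(endpoint, model_id, prompt):
    """Send one chat turn to a local OpenAI-compatible server (no API key)."""
    req = urllib.request.Request(
        endpoint.rstrip("/") + "/chat/completions",
        data=json.dumps({
            "model": model_id,
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Requires Ollama to be running with the model already pulled:
# print(chat(ENDPOINT, MODEL_ID, "Say hello."))
```

If this returns a reply, Smart Window should be able to reach the same endpoint with the same Model name and no API key.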