Situation
Chatbox AI is an AI client application and smart assistant, compatible with many cutting-edge AI models and APIs. Assuming you have Ollama running on a machine on your local network, you can access it from any other device on that network through a web browser interface.
Solution
I) Configuring the Ollama Service
You need to ensure that the remote Ollama service is properly configured and exposed on your local network so that Chatbox can access it. Only a small configuration change is required.
How to Configure the Remote Ollama Service?
By default, the Ollama service listens only on the local machine and does not serve external requests. To make the Ollama service available on your network, you need to set the following two environment variables:
OLLAMA_HOST=0.0.0.0
OLLAMA_ORIGINS=*
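Here, OLLAMA_HOST=0.0.0.0 tells Ollama to listen on all network interfaces instead of only on localhost, and OLLAMA_ORIGINS=* lets browser-based clients such as the Chatbox web app make cross-origin requests. Once the variables are applied and Ollama has restarted (platform-specific steps below), a quick local sanity check, assuming the default port, is:
curl http://localhost:11434
This should reply with "Ollama is running".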
I.1) Configuring on Windows
On Windows, Ollama inherits your user and system environment variables.
- Exit Ollama via the taskbar
- Open Settings (Windows 11) or Control Panel (Windows 10), and search for “Environment Variables”
- Click to edit your account’s environment variables
- Edit or create a new variable OLLAMA_HOST for your user account, with the value 0.0.0.0
- Edit or create a new variable OLLAMA_ORIGINS for your user account, with the value *
- Click OK/Apply to save the settings
- Launch the Ollama application from the Windows Start menu.
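If you prefer the command line, the same user-level variables can be set with setx; this is a sketch of the equivalent commands, to be run in a Command Prompt before relaunching Ollama:
setx OLLAMA_HOST 0.0.0.0
setx OLLAMA_ORIGINS "*"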
I.2) Configuring on Linux
If Ollama is running as a systemd service, use systemctl to set the environment variables:
Invoke systemctl edit ollama.service to edit the systemd service configuration. This will open an editor.
Under the [Service] section, add a line for each environment variable:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Save and exit.
Reload systemd and restart Ollama:
systemctl daemon-reload
systemctl restart ollama
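After the restart, you can check that Ollama is now listening on all interfaces rather than only on localhost (a quick sketch, assuming the default port 11434):
sudo ss -tlnp | grep 11434
The local address column should show 0.0.0.0:11434 (or *:11434) instead of 127.0.0.1:11434.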
II) Service IP Address
II.1) After configuration, the Ollama service will be available on your local network (such as your home WiFi). You can use the Chatbox client on other devices to connect to this service.
Access: https://web.chatboxai.app/
Open Settings and, as the model provider, choose Ollama API.
In the API Host field, enter the IP address of the Ollama service. This is your computer's address on the current network, typically in the form: 192.168.XX.XX:11434
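If you are unsure of the address, run ipconfig on Windows (note the IPv4 Address of the active adapter) or ip addr show on Linux (note the inet address of the active interface). From another device on the network, you can then confirm the service is reachable; this is a sketch, with 192.168.XX.XX standing in for your own address:
curl http://192.168.XX.XX:11434/api/tags
A JSON list of the models installed on the host indicates everything is working.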
II.2) Choose the model and start chatting!
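Note that Chatbox can only offer models that are already installed on the Ollama host. If the model list is empty, pull one on the host first; llama3.2 below is just an example model name:
ollama pull llama3.2
ollama list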
Considerations
You may need to allow the Ollama service port (default 11434) through your firewall, depending on your operating system and network environment. To avoid security risks, do not expose the Ollama service on public networks. A home WiFi network is a relatively safe environment.
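For example, on a Linux host with ufw, you could open the port only to devices on your local subnet; this is a sketch, and the 192.168.1.0/24 range is an assumption you should adjust to match your own network:
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp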