How to install Open WebUI without Docker
This guide walks you through setting up Open WebUI (formerly Ollama WebUI) without Docker. Docker is the officially recommended route for ease of setup and support, but this manual approach can be useful for developers or for environments where Docker is not available. Keep in mind that troubleshooting an unsupported installation may require independent effort.
Project Components:
- Frontend: Web interface you interact with.
- Backend: Handles communication and functionality behind the scenes. Both components need to run simultaneously for development.
Requirements:
- Node.js >= 20.10: For building the frontend.
- Python >= 3.11: For running the backend.
- Ollama: For serving the models the Web UI connects to.
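Before proceeding, it can help to confirm your installed versions meet the minimums above. The helper below is a minimal sketch (not part of the project) that compares dotted version strings numerically; feed it the output of `node --version` or `python3 --version` with any leading "v" included:

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """True if `installed` (e.g. "v20.11.1") is at least `minimum` (e.g. "20.10").

    Simple numeric comparison of dotted components, not full semver.
    """
    to_tuple = lambda v: tuple(int(p) for p in v.lstrip("v").split("."))
    inst, req = to_tuple(installed), to_tuple(minimum)
    # Pad the shorter tuple with zeros so "20.10" compares against "20.10.0".
    width = max(len(inst), len(req))
    return inst + (0,) * (width - len(inst)) >= req + (0,) * (width - len(req))
```

For example, `meets_minimum("v20.11.1", "20.10")` is true, while `meets_minimum("3.10.12", "3.11")` is false.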
Let's get started with the installation steps:
- Clone the Open WebUI repository:
git clone https://github.com/open-webui/open-webui.git
- Navigate to the project directory:
cd open-webui/
- Copy the environment file:
cp -RPp .env.example .env
Notes: This file stores settings and configurations. Update it according to your needs.
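As a rough sketch of what a setting in this file might look like (the variable name below is illustrative; use the keys actually present in your copy of the example file), pointing the backend at a locally running Ollama instance:

```
# Illustrative only -- check the example file in the repository for the
# actual variable names; Ollama listens on port 11434 by default.
OLLAMA_BASE_URL='http://localhost:11434'
```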
Build the frontend:
- Install the frontend dependencies and build the static assets:
npm install
npm run build
Start the backend:
- Create a virtualenv:
cd ./backend
# install virtualenv package
pip install -U virtualenv
# Create a virtualenv in backend folder
virtualenv venv
# Activate virtualenv (Linux/macOS)
source venv/bin/activate
# On Windows, use: source venv/Scripts/activate
- Install the Python dependencies and serve the built frontend from the backend:
pip install -r requirements.txt -U
bash start.sh
Access Open WebUI:
- Open http://localhost:8080/ in your web browser. The interface should be up and running!
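If you'd rather check from a script than a browser, the sketch below polls a URL until the server answers. It is a hypothetical helper, not part of Open WebUI; any HTTP response (even an error status) counts as "up", since it proves the backend is listening:

```python
import time
import urllib.error
import urllib.request


def wait_for_http(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Return True once `url` answers any HTTP status, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=5)
            return True
        except urllib.error.HTTPError:
            return True  # the server answered, even if with an error status
        except OSError:
            time.sleep(interval)  # not up yet; retry until the deadline
    return False
```

For example, `wait_for_http("http://localhost:8080/")` should return True shortly after `start.sh` launches the backend.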
Tips and Troubleshooting:
If you encounter errors, check the console output for clues. Refer to the Open WebUI documentation for further configuration options and advanced features. Remember, non-Docker setups are not officially supported, so be prepared to do some troubleshooting yourself.
Enjoy Open WebUI!
This tutorial should get you started with Open WebUI without Docker. While Docker remains the preferred installation method, this manual approach offers flexibility for specific situations. With a little effort, you can have this powerful interactive tool up and running.