Open WebUI Unveiled: Installation and Configuration

Bhavik Jikadara

--

What is Open WebUI?

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.

Here are some exciting tasks on the Open WebUI roadmap:

  • 🔊 Local Text-to-Speech Integration: Seamlessly incorporate text-to-speech functionality directly within the platform, allowing for a smoother and more immersive user experience.
  • 🛡️ Granular Permissions and User Groups: Empower administrators to finely control access levels and group users according to their roles and responsibilities. This feature ensures robust security measures and streamlined management of user privileges, enhancing overall platform functionality.
  • 🔄 Function Calling: Empower your interactions by running code directly within the chat. Execute functions and commands effortlessly, enhancing the functionality of your conversations.
  • ⚙️ Custom Python Backend Actions: Empower your Open WebUI by creating or downloading custom Python backend actions. Unleash the full potential of your web interface with tailored actions that suit your specific needs, enhancing functionality and versatility.
  • 🔧 Fine-tune Model (LoRA): Fine-tune your model directly from the user interface. This feature allows for precise customization and optimization of the chat experience to better suit your needs and preferences.
  • 🧠 Long-Term Memory: Witness the power of persistent memory in our agents. Enjoy conversations that feel continuous as agents remember and reference past interactions, creating a more cohesive and personalized user experience.
  • 📚 Enhanced Documentation: Elevate your setup and customization experience with improved, comprehensive documentation.

Prerequisites

You’ll need to fulfil a few prerequisites:

  • Docker installed and running on your machine (the Quick Start below runs Open WebUI as a Docker container).
  • Either Ollama (installed locally or reachable on another server) or an OpenAI API key, depending on which backend you plan to use.
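A quick way to confirm Docker is ready before moving on (standard Docker CLI commands, nothing specific to Open WebUI):

# Print the installed Docker version and confirm the daemon is reachable
docker --version
docker info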

Quick Start

Step 1: How to install Ollama

Ollama is an open-source tool that lets you run large language models (LLMs) directly on your local machine, with no remote server required. This means you can use LLMs for a variety of tasks, such as generating text, translating languages, and writing different kinds of creative content, without having to worry about privacy or data security.
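As a minimal sketch of the install itself (assuming a Linux machine; macOS and Windows users can instead download the installer from https://ollama.com), the official install script plus a quick smoke test look like this (llama3 here is just one example model):

# Download and run the official Ollama install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull an example model and open an interactive prompt to verify the install
ollama run llama3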

Step 2: How to set up Open WebUI with Docker

  • Use git clone followed by the repository’s URL to make a local copy of the Open WebUI repository hosted on GitHub:
git clone https://github.com/open-webui/open-webui.git
  • Once the repository is cloned, change into its folder with the cd command:
cd open-webui

If Ollama is on your computer, use this command:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
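To check that the container came up cleanly, the usual Docker commands apply:

# Confirm the container is running, then follow its logs
docker ps --filter name=open-webui
docker logs -f open-webui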

If Ollama is on a different server, use this command:

  • To connect to Ollama on another server, change the OLLAMA_BASE_URL to the server's URL:
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
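Before wiring Open WebUI to a remote instance, it helps to confirm the Ollama API is reachable from the machine running Docker. Ollama exposes a /api/tags endpoint that lists installed models (https://example.com stands in for your server’s URL, as above):

# Should return a JSON list of the models available on the remote server
curl https://example.com/api/tags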

Installation for OpenAI API Usage Only

  • If you’re only using the OpenAI API, use this command:
docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
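To avoid hardcoding the key into the command, one option is to pass it through from an environment variable; this is plain Docker behaviour, not an Open WebUI feature:

# Export the key once (or load it from a secrets manager), then pass it through to the container
export OPENAI_API_KEY=your_secret_key
docker run -d -p 3000:8080 -e OPENAI_API_KEY=$OPENAI_API_KEY -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main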

After installation, you can access Open WebUI at http://localhost:3000. Enjoy! 😄

First, sign up and then log in. Let’s get started using Open WebUI!
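When a new release of the image ships, upgrading follows the standard Docker pattern; your chats and settings survive because they live in the open-webui volume, not in the container (a sketch using only standard Docker commands):

# Stop and remove the old container (the open-webui data volume is kept)
docker stop open-webui && docker rm open-webui

# Pull the latest image and recreate the container with the same options as before
docker pull ghcr.io/open-webui/open-webui:main
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main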

Thank you for taking the time to read our Medium blog post. We truly appreciate your interest and support!

--

Written by Bhavik Jikadara

🚀 AI/ML & MLOps expert 🌟 Crafting advanced solutions to speed up data retrieval 📊 and enhance ML model lifecycles. buymeacoffee.com/bhavikjikadara