Open WebUI

AI Chat & Assistants

A self-hosted AI chat UI compatible with Ollama and OpenAI-compatible APIs, capable of running fully offline.

4.4
Web / Self-hosted (Docker)

What is Open WebUI?

Open WebUI is an open-source, self-hosted AI chat platform. It supports a wide range of LLM runners, including Ollama and OpenAI-compatible APIs, and can operate completely offline. It offers advanced features such as RAG (Retrieval-Augmented Generation), voice and video calls, document processing, and Python tool calling. It supports 9 vector databases, including ChromaDB, PostgreSQL, Qdrant, and Milvus, and includes prompt injection protection via LLM-Guard. It is one of the most popular self-hosted AI tools of 2026 for individuals and organizations that prioritize privacy.

Open WebUI screenshot

Pricing

Free (open source)

Key features

Multi-model support
RAG
Voice & video calls
Document processing
Prompt injection protection
Support for 9 vector databases

Pros and cons

Pros

  • Completely free and open source
  • Works offline
  • Excellent privacy protection

Cons

  • Requires technical knowledge (e.g., Docker) for setup
  • Japanese localization of the UI is incomplete
  • Requires your own GPU/server

Frequently asked questions

Q. Is Open WebUI free?

A. Yes, it is completely open source and free. However, you need a server or PC (possibly with a GPU) to run it.

Q. What is the difference from Ollama?

A. Ollama is an engine for running LLMs locally, while Open WebUI is a frontend (user interface) for it. By connecting Open WebUI to Ollama, you can use local AI through a ChatGPT-like UI.
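As a sketch of that connection, Open WebUI is typically deployed as a Docker container pointed at a running Ollama instance. The image name `ghcr.io/open-webui/open-webui:main` and the `OLLAMA_BASE_URL` environment variable follow the project's published instructions; the host port, volume name, and Ollama address below are illustrative assumptions that may differ for your setup:

```shell
# Run Open WebUI in Docker, connecting it to Ollama on the host machine.
# Assumes Ollama is already listening on its default port 11434.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# Then open http://localhost:3000 in a browser to reach the chat UI.
```

The named volume keeps chat history and settings across container restarts; check the current Open WebUI documentation for up-to-date flags.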

Similar tools

Explore more on AIpedia