Minimalistic UI for Ollama LMs - This powerful React interface for LLMs drastically improves the chatbot experience and works offline.
Updated Oct 6, 2024 - TypeScript
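As a rough sketch of how a web UI like the ones listed here might talk to a local Ollama server (assuming Ollama's default endpoint at http://localhost:11434 and its /api/chat route; the helper name is illustrative):

```typescript
// Minimal sketch: stream a chat reply from a local Ollama server.
// Assumes Ollama is running on its default port (11434).
type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

async function streamChat(
  model: string,
  messages: ChatMessage[],
  onToken: (t: string) => void,
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Ollama streams newline-delimited JSON objects; each carries a partial message.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content);
    }
  }
}

// Example usage (model name is illustrative):
// await streamChat("llama3", [{ role: "user", content: "Hello!" }], (t) => console.log(t));
```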
LLMX: the easiest third-party local LLM UI for the web!
A single-file tkinter-based Ollama GUI project with no external dependencies.
A UI client for Ollama written in Compose Multiplatform, focused on running DeepSeek R1 locally.
Dive is an open-source AI agent desktop application that seamlessly integrates any tool-call-capable LLM with a frontend MCP server, as part of the Open Agent Platform initiative.
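A hedged sketch of what a tool-call round trip against Ollama's /api/chat endpoint can look like (the tool definition, model name, and function names below are illustrative and not taken from the Dive project):

```typescript
// Illustrative tool definition passed to a tool-call-capable model via Ollama.
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather", // hypothetical tool for demonstration only
      description: "Look up the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

async function chatWithTools(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // any model with tool-call support
      messages: [{ role: "user", content: "What's the weather in Paris?" }],
      tools,
      stream: false,
    }),
  });
  const data = await res.json();
  // If the model decided to call a tool, the calls appear on the reply message.
  for (const call of data.message?.tool_calls ?? []) {
    console.log(call.function.name, call.function.arguments);
  }
}
```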
Simpler than simple: run LLMs on your computer easily via Ollama, no GPU required.
Ollama with Let's Encrypt Using Docker Compose
Full featured demo application for OllamaSharp
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
An excellent cross-platform local AI chat client, compatible with any large model that supports the Ollama or OpenAI API. Local deployment protects your data privacy, and it can be used as both an Ollama client and an OpenAI client.
A Chrome extension that provides an Ollama web UI for servers on localhost or other hosts, helping you manage models and chat with any open-source model. 🚀💻✨
A frontend for Ollama LLMs, built with React.js and the Flux architecture.
OllamaOne is an Ollama GUI client.
Transform your writing with TextLLaMA! ✍️🚀 Simplify grammar, translate effortlessly, and compose emails like a pro. 🌍📧
Simple web UI for Ollama
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
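A small sketch of the model-management side such a UI needs, using Ollama's /api/tags (list installed models) and /api/pull (download a model) routes; the function names are illustrative:

```typescript
// List locally installed models via Ollama's /api/tags endpoint.
async function listModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  const data = await res.json();
  return (data.models ?? []).map((m: { name: string }) => m.name);
}

// Pull a model, reporting streamed status lines as Ollama downloads it.
async function pullModel(name: string, onStatus: (s: string) => void): Promise<void> {
  const res = await fetch("http://localhost:11434/api/pull", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, stream: true }),
  });
  if (!res.body) throw new Error("no response body");
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Progress arrives as newline-delimited JSON objects with a "status" field.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (line.trim()) onStatus(JSON.parse(line).status ?? "");
    }
  }
}
```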
ollama web_ui - simple and easy to use.