12 Tools to Provide a Web UI for Ollama

Don't want to interact with AI models through Ollama's CLI? Fret not, we have some neat web UI tools that make it easy!

Ollama is a free and open-source tool that lets users run Large Language Models (LLMs) locally. It makes the AI experience simpler by letting you interact with the LLMs in a hassle-free manner on your machine.

You can run many of the most popular open-source LLMs with it.

Unfortunately, it only offers a CLI, which may not be everyone's cup of tea ☕ So, you can choose to enhance your experience (and make it easier) by running LLMs locally with a web user interface. To accomplish that, you have a few open-source tools that provide a web UI.

Let me highlight some options here.

📋
Make sure you have Ollama installed before you try working with the front-end UI. If you do not have Ollama installed, refer to our Ollama installation guide.
Running AI Locally Using Ollama on Ubuntu Linux
Running AI locally on Linux because open source empowers us to do so.
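If you are starting from scratch, getting Ollama running is usually just a couple of commands. Here is a rough sketch (the install script URL is Ollama's official one, and llama3.2 is just an example model; check ollama.com for the current instructions):

```shell
# Install Ollama on Linux using the official install script
# (always review a script before piping it to your shell):
curl -fsSL https://ollama.com/install.sh | sh

# Pull an example model and start chatting from the CLI:
ollama run llama3.2
```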

1. Page Assist

page assist firefox add-on

Page Assist is an interesting open-source browser extension that lets you run local AI models. It supports Ollama, and gives you a good amount of control to tweak your experience. You can install it on Chromium-based browsers or Firefox.

From letting you easily manage installed models and add files to analyze or research, to enabling internet search, it is a convenient way to access LLMs right in your browser.

You can also decide to share your output with the world using a self-hosted URL (not sure who would want that, but the option is there).
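Since Page Assist (like other browser-based front-ends) talks to Ollama's local API, Ollama needs to be running and reachable. A quick sketch of how that usually works (the OLLAMA_ORIGINS variable is Ollama's way of allowing cross-origin requests; "*" below is a permissive example, not a recommendation):

```shell
# Ollama serves its API on http://localhost:11434 by default.
# If a browser extension or web UI cannot connect, allowing its origin
# via OLLAMA_ORIGINS may help ("*" permits all origins):
OLLAMA_ORIGINS="*" ollama serve

# Quick sanity check that the API is up and lists your models:
curl http://localhost:11434/api/tags
```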

2. Open WebUI

open webui

Open WebUI is the most popular and feature-rich solution for getting a web UI for Ollama. The project initially aimed at helping you work with Ollama, but as it evolved, it set out to be a web UI provider for all kinds of LLM solutions.

It supports OpenAI-compatible APIs and works entirely offline. You can install it quickly using Docker or Kubernetes. Furthermore, it features a Progressive Web App for mobile and image generation integrations.

If you want a web UI for Ollama, I think this is an easy recommendation. However, if you are looking for something different, there are plenty of other options too.
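To give you an idea, the Docker route usually looks something like this (image name, ports, and flags are based on Open WebUI's documentation at the time of writing; double-check the project's README before running):

```shell
# Run Open WebUI in a container, connecting to Ollama on the host.
# --add-host lets the container reach the host's Ollama instance.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Then open http://localhost:3000 in your browser.
```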

3. Lobe Chat

Lobe Chat is a local and privacy-focused ChatGPT-like UI framework.

You can deploy your private Ollama chat application using Lobe Chat, and it should look pretty sleek. It is also available as a one-click script if you use Pinokio, the AI browser.

Lobe Chat also supports voice conversations and text-to-image generation. Furthermore, you can enhance its capabilities using plugins. It features support for Progressive Web App as well.
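For a self-hosted deployment, Lobe Chat also ships a Docker image. As a sketch (the image name, port, and OLLAMA_PROXY_URL variable are taken from the project's documentation; verify them before use):

```shell
# Run Lobe Chat and point it at an Ollama instance on the host:
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  --name lobe-chat \
  lobehub/lobe-chat

# The UI should then be available at http://localhost:3210.
```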

Suggested Read 📖

I Installed AI Apps in a Single Click on My Linux System With Pinokio
You can also install AI apps in one click with the Pinokio AI browser.

4. Text Generation Web UI

text generation web ui

A web UI that focuses entirely on text generation capabilities, built using the Gradio library, an open-source Python package for building web UIs for machine learning models.

Text Generation Web UI features three different interface styles: a traditional chat-like mode, a two-column mode, and a notebook-style mode. You also get OpenAI-compatible API support and transformers library integration.

5. Ollama UI

ollama ui

If you do not need anything fancy, or special integration support, but more of a bare-bones experience with an accessible web UI, Ollama UI is the one.

It is a simple HTML-based UI that lets you use Ollama on your browser. You also get a Chrome extension to use it. As you can see in the screenshot, you get a simple dropdown option to select the model you want to use, and that's it.

6. Ollama GUI

ollama gui

While all the others let you access Ollama and other LLMs irrespective of the platform (on your browser), Ollama GUI is an app for macOS users.

The app is free and open-source. Built using the SwiftUI framework, it looks pretty, which is why I didn't hesitate to add it to the list.

Yes, it may not be a web UI tool that lets you access the model from your phone or any browser, but it is a feasible option for macOS users.

7. Lord of LLMs Web UI

A pretty descriptive name: Lord of LLMs, a.k.a. LoLLMs Web UI, is a decently popular solution for LLMs that includes support for Ollama.

It supports a range of abilities that include text generation, image generation, music generation, and more. You can integrate it with the GitHub repository for quick access and choose from the different personalities offered.

You can set it up on Linux using the automatic installation script, which is an easy way to get started with it.

8. LibreChat

LibreChat is an open-source ChatGPT alternative that you can deploy locally or in the cloud.

It is compatible with Ollama. You can use open-source LLMs as well as popular proprietary services like Google Vertex AI, ChatGPT, and more. It is tailored to be a ChatGPT clone, so you will not find anything unique about the UI. But, for some, that familiarity can make things easier to navigate.

Suggested Read 📖

13 Best Open Source ChatGPT Alternatives
Looking for open-source ChatGPT alternatives? We curated some of the best ones for you to take a look at.

9. Minimal LLM UI

Want something basic but capable?

Minimal LLM UI is a no-frills option that provides a web UI for Ollama built with React, aiming to give you a clean and modern design.

You can switch between LLMs, and save your conversation locally using a database.

10. Enchanted

Enchanted is an open-source app that lets you connect to your private models. It is compatible with Ollama and gives you a seamless experience across the Apple ecosystem (iOS, macOS, Vision Pro).

This is a useful tool for users who want Apple platform support, giving you a native GUI while still leaving the option to configure a web UI to access on macOS.

11. Msty.app (Non-FOSS)

msty.app screenshot

Msty is a fascinating non-FOSS app available across multiple platforms, providing a local-first UI to run AI models or LLMs.

The highlights of this tool include conversational branches, and the ability to add knowledge stacks using Obsidian vaults (and other services).

12. Hollama

hollama ui

Hollama is yet another minimal web UI option. It offers a publicly hosted version that you can use without signing up, with your data stored locally.

You can also run Hollama locally using a Docker image. Unlike most of the others, it does not pack in many features. But if you only want a quick web UI for Ollama, you can try it out.
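As a rough sketch, running the Docker image looks like this (the image name and port are assumptions based on Hollama's README; verify them before running):

```shell
# Run the Hollama web UI locally (image name/port assumed from the README):
docker run --rm -d -p 4173:4173 --name hollama ghcr.io/fmaclen/hollama:latest

# Then open http://localhost:4173 and point it at your Ollama server.
```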

Wrapping Up

Considering we are just getting started with local and private AI, I am certain that several more tools will show up that let us use Ollama for interesting use cases.

For me, the Page Assist extension seems like a time-saver (with no setup), letting me run AI models locally while having the ability to search the internet.

💬 What is your favorite tool to get a web UI for Ollama? Would you mind sharing how you use Ollama + web UI in your day-to-day life? Use the comments section below and let's talk!

About the author
Ankush Das

A passionate technophile who also happens to be a Computer Science graduate. You will usually see cats dancing to the beautiful tunes sung by him.
