New Ollama UI

The new Ollama UI offers a polished, user-friendly interface for interacting with local AI models, providing a simple alternative to more complex GUIs like LM Studio while maintaining the core strengths of Ollama’s CLI and API tools. Although still basic and lacking advanced features like robust model management and synchronization, it lays a strong foundation for future enhancements aimed at both new users and power users.

The video introduces the new Ollama UI, a surprising development from the team behind Ollama, which is known primarily for its command-line interface (CLI) and API-focused local AI tooling. While the UI is not yet fully featured, it is polished and offers a straightforward, single-install solution that could suit many users. The presenter, who was part of Ollama’s founding team, is excited about this evolution, noting the team’s disciplined approach of perfecting the core product before expanding its scope. The main competitor mentioned is LM Studio, which has a GUI but is criticized as overly complicated, making Ollama’s new UI a promising alternative.

The UI itself is simple and user-friendly. Users can enter prompts and receive responses, selecting from any locally installed model or downloading new ones directly through the interface. Model management is still basic, however: there is no way to remove models or to filter them by attributes such as size or type. The presenter notes some quirks, such as the model list matching only the beginning of model names when filtering, and models not beginning to download until they are first used rather than immediately upon selection. Despite these limitations, the UI supports drag-and-drop for images and handles file types such as PDFs and plain text, which adds welcome versatility.
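For readers curious what the UI is presumably doing under the hood, here is a minimal sketch against Ollama's documented local HTTP API, covering the same three actions the interface exposes: listing installed models, pulling a new one, and sending a chat message. The model name "llama3.2" is just an example, and this assumes a default Ollama server running on localhost.

```python
# Minimal sketch of the documented Ollama HTTP API that the new UI
# presumably wraps; "llama3.2" is an example model name, not a requirement.
import requests

BASE = "http://localhost:11434"

# List all locally installed models (what the UI's model picker shows).
models = requests.get(f"{BASE}/api/tags").json().get("models", [])
print([m["name"] for m in models])

# Pull a model that isn't installed yet; the UI defers this until first use.
requests.post(f"{BASE}/api/pull", json={"model": "llama3.2", "stream": False})

# Send a chat message and read back the full (non-streamed) reply.
resp = requests.post(
    f"{BASE}/api/chat",
    json={
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,
    },
)
print(resp.json()["message"]["content"])
```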

The chat functionality covers the basics, such as opening new chats and a sidebar listing all conversations, but offers no keyboard shortcuts for these actions, which is surprising given the team’s keyboard-driven CLI heritage. The app is standalone with no synchronization across devices, so chats are local to each installation. The settings panel has recently expanded to include options such as default context length and network exposure, with hints at upcoming features involving user sign-in, though details remain under wraps. The presenter has reservations about a single global context length, since context windows vary by model and larger values can exceed available GPU memory.
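The global-context-length concern already has a per-model workaround in the existing API: the documented "options" field on a request overrides the context length for that request alone, while the server-wide default can be set with the OLLAMA_CONTEXT_LENGTH environment variable in recent releases (network exposure is similarly controlled via OLLAMA_HOST). A hedged sketch, again using "llama3.2" as a stand-in model name:

```python
# Per-request context-length override via the documented "options" field;
# this sidesteps a global default that may not suit every model or GPU.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",           # example model name
        "messages": [{"role": "user", "content": "Summarize our chat."}],
        "options": {"num_ctx": 8192},  # context window for this request only
        "stream": False,
    },
)
print(resp.json()["message"]["content"])
```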

Looking ahead, the presenter is optimistic about the UI’s future development. They hope to see improvements such as richer keyboard shortcuts, better model management and filtering, easier model importing and publishing, and more access to metrics and logs for monitoring; the lack of a Prometheus metrics endpoint is noted as a missed opportunity. Overall, the UI is seen as a strong foundation that will evolve over time, serving new users who prefer a simple interface while power users can continue to rely on more advanced tools like Msty or Open WebUI.

In conclusion, the new Ollama UI represents a significant step forward for the Ollama team, offering a polished and accessible way to interact with local AI models without the complexity of the command line. While it may not immediately attract users of more feature-rich GUIs like LM Studio, it provides a solid base that will likely improve with future updates. The presenter encourages viewers to try the UI and subscribe for more content, acknowledging that while it won’t replace every existing tool, it fills an important niche for ease of use and simplicity in local AI deployment.