Running Local LLMs

I’m using Large Language Models (LLMs) a fair amount these days. I strongly prefer local models that keep my data private. Here are some of the programs I’ve tested and my thoughts about them.

Summary

Feature                 Ollama  Open WebUI  AnythingLLM  MindMac  LM Studio
Command-line            Y       -           -            -        -
Web Interface           -       Y           -            -        -
Native Application      Y       -           Y            Y        Y
OpenAI-Like API         Y       -           -            -        Y
Free                    Y       Y           Y            N        Y
Open Source             ?       ?           ?            ?        ?
Custom Prompts          ?       Y           ?            ?        ?
Sticky System Prompt*   ?       Y           N            ?        N
Prompt Templates        ?       ?           ?            ?        Y

* System messages that are easy to set (without training) and remain set through multiple chats.

Ollama

I really like Ollama. It’s easy to set up and use, and it makes installing popular models simple. I layer some of the other tools here on top of it because it provides an OpenAI-like API for local models.
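
As a rough illustration, here’s a minimal sketch of talking to a local model through that API using the official openai Python client. It assumes Ollama is running on its default port (11434) and that you’ve already pulled a model; the model name below is just an example.

```python
# Minimal sketch: chat with a local model via Ollama's
# OpenAI-compatible endpoint, using the openai Python client.
# Assumes a model has been pulled first, e.g. `ollama pull llama3`.
from openai import OpenAI

# Ollama listens on localhost:11434 by default. The API key is
# unused locally, but the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # example name; use any model you've pulled
    messages=[
        # The system message sets persistent behavior for the chat;
        # this is the "system prompt" referred to in the table above.
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Why run LLMs locally?"},
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-like, most tools and libraries that speak the OpenAI API can be pointed at Ollama just by changing the base URL.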

The only downside to Ollama is that it’s a command-line tool. That’s great when I want a command line, but sometimes I’d like a GUI, which is where some of the other tools I’m using come in.

Open WebUI

This one worked well for me for a while, but lately it’s been locking up a lot. The UI becomes unresponsive and “goes away” for long periods while it’s sending data to APIs. This seems like a recent phenomenon: maybe an update broke it, maybe I’m doing more complex things lately, or maybe my chat history is causing issues.

There is no automatic chat history clearing that I can find. I’d like to delete my chat history after a specific period of time to keep things clean.

LM Studio

This is one of the first LLM GUIs I tried. It worked pretty well, but I moved away from it when I started using Open WebUI. Now I’m trying it again.

MindMac

This is a tool I’ve tried only recently.

There’s a question, for me, of whether having a native app is a good thing. It is if the app isn’t just a web interface anyway (like Slack); if it still uses a web interface, it’s just extra weight.

AnythingLLM

This is a newer tool.

Written by Joel Dare on September 9, 2024.

