How to Run Open WebUI Locally on Your Apple Mac: A Step-by-Step Guide

Run Open WebUI on MacBook Air M2 easily. Follow my guide for setup and automation to get ChatGPT-like responses locally.

If you're interested in running LLMs locally on your Apple Mac, you may have experienced challenges with various desktop tools like msty, gpt4all, etc. I encountered similar issues on my MacBook Air M2 with 8 GB RAM—none of these tools provided satisfactory responses. Whether it was the memory limitations, model selections, or prompts I used, the results were underwhelming.

# Introduction to Open WebUI

When Simon Willison shared his success running Open WebUI via uvx, I decided to give it a try. In Open WebUI I use the "chatgpt-4o-latest" and "gpt-4o" models, which give me results akin to ChatGPT chats; I found this impressive enough that I have since disabled all other models except these two.

Though I'm still getting accustomed to it, here's how I successfully installed Open WebUI. The steps generally go like this:

  1. Install uv
  2. Initialize a Python virtual environment
  3. Install Open WebUI
  4. Run the application

For those who prefer automation, you can add a script to your shell. Here's a detailed breakdown:

# Step-by-Step Installation Guide

# Install uv

```shell
brew install uv
```

# Set Up the Environment

```shell
mkdir openwebui && cd openwebui
uv init --python=3.11 .
uv venv
```
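`uv venv` creates a standard virtual environment in `.venv`; activating it puts its `bin/` directory first on `PATH`, which is how the `open-webui` command becomes available after the install step. A minimal sketch of the same mechanism, using Python's stdlib `venv` module and a throwaway path purely for illustration:

```shell
# Illustration of what activation does, using a throwaway environment
# built with the stdlib venv module (uv venv produces the same layout).
python3 -m venv /tmp/demo-venv
source /tmp/demo-venv/bin/activate
command -v python   # now resolves inside /tmp/demo-venv/bin
```

Deactivating (`deactivate`) restores the previous `PATH`.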

# Install Open WebUI

```shell
uv pip install open-webui
```

# Run Open WebUI

```shell
source .venv/bin/activate  # puts open-webui on PATH
open-webui serve
```

Open WebUI is now accessible at http://localhost:8080/.
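To confirm the server is actually up before opening the browser, you can check whether anything is listening on the default port (8080 here is an assumption based on the default; the port is configurable):

```shell
# Check whether anything is listening on Open WebUI's default port.
if lsof -i :8080 -sTCP:LISTEN > /dev/null 2>&1; then
  echo "port 8080: listening"
else
  echo "port 8080: nothing listening yet"
fi
```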

# Automate the Startup with a Script

Typing multiple commands to start Open WebUI each time can be cumbersome, so I asked Claude for a way to automate the process while also logging output. My prompt:

> I have Open WebUI installed via pip in a virtual environment. I invoke it via source .venv/bin/activate and run open-webui serve; I want to automate this whenever I open the shell, and pipe the log to a file. How can I do this? I should also be able to stop Open WebUI.

Here's the script it provided:

```shell
#!/bin/bash

# Function to start Open WebUI
startwebui() {
    if pgrep -f "open-webui serve" > /dev/null; then
        echo "OpenWebUI is already running"
        return 1
    fi
    mkdir -p "$HOME/logs"  # ensure the log directory exists
    (source "$HOME/openwebui/.venv/bin/activate" &&
     open-webui serve > "$HOME/logs/webui.log" 2>&1 &)
    echo "OpenWebUI started. Logs at ~/logs/webui.log"
}

# Function to stop Open WebUI
stopwebui() {
    pkill -f "open-webui serve"
    echo "OpenWebUI stopped"
}
```
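The `> "$HOME/logs/webui.log" 2>&1 &` tail of the serve line is what sends both stdout and stderr to the log while backgrounding the process. The redirection part can be seen in isolation with a throwaway command (the paths here are just for the demo):

```shell
# Demonstrate the `> logfile 2>&1` pattern used in startwebui:
# stdout and stderr both end up in the same log file.
mkdir -p /tmp/demo-logs
( echo "normal output"; echo "error output" >&2 ) > /tmp/demo-logs/webui.log 2>&1
cat /tmp/demo-logs/webui.log   # shows both lines
```

Order matters: `2>&1` must come after the `>` redirection, or stderr would still go to the terminal.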

# Instructions to Implement the Script

  1. Save the script to ~/scripts/webui-control.sh and make it executable:

    chmod +x ~/scripts/webui-control.sh
    
  2. Add the script to your shell configuration file (e.g., ~/.zshrc) so the functions are available in every new shell:

    source ~/scripts/webui-control.sh
    

# Start Open WebUI Easily

Now, whenever you want to start Open WebUI, simply type startwebui in your terminal, then visit http://localhost:8080/ to enjoy locally running ChatGPT.

# Enjoy locally running ChatGPT

I'm excited to share that after testing various prompts, the responses I receive through Open WebUI are strikingly similar to those from ChatGPT. Running Open WebUI locally on my Mac has been a gratifying experience, offering both convenience and efficiency.

I encourage you to try running Open WebUI locally and experience the benefits firsthand.


Under: #aieconomy , #tech , #code