2025-07-28

Native OpenWebUI Start Checklist

Prototype.

1. Activate Python Virtual Environment

cd ~/glitch-stack/open-webui/backend
source venv/bin/activate
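
If venv/ does not exist yet, a one-time setup along these lines should work (this assumes requirements.txt lives in backend/, as in the layout above):

python3 -m venv venv             # create the virtual environment
source venv/bin/activate         # activate it
pip install -r requirements.txt  # install backend dependencies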

2. Start Backend

python3 app/main.py
  • Success Indicator: Should say Uvicorn running on http://0.0.0.0:8080 or similar.

  • If it fails, check:

    • Port 8080 already in use: ss -tulpn | grep 8080 (see the note after this list)

    • Dependencies: pip install -r requirements.txt
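
    If port 8080 is already taken, the ss output includes the owning PID. Something like this frees it (only if you are sure that process is safe to stop):

      sudo ss -tulpn | grep 8080    # note the pid= field in the output
      kill <PID>                    # replace <PID> with the number from that field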


3. Start Frontend (Vite Dev Server)

Open another terminal:

cd ~/glitch-stack/open-webui
npm run dev
  • Success Indicator: Vite should report it is ready and serving on http://localhost:5173 (or a fallback port such as 5174 if 5173 is taken).

  • If you see ENOSPC, the inotify watch limit has been exhausted. Check the current value (a fix is sketched after the command):

    cat /proc/sys/fs/inotify/max_user_watches
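
    If the value is low (8192 is a common Linux default), raising it usually clears the ENOSPC error. The 524288 below is just a widely used value, not an OpenWebUI-specific requirement:

    sudo sysctl fs.inotify.max_user_watches=524288                            # apply immediately
    echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf    # persist across reboots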
    

4. Access OpenWebUI

In your browser (on the same machine or another device on the LAN):

http://localhost:5173
http://<glitchh3x-LAN-IP>:5173 (or :5174, etc.)
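
If the LAN URL does not respond, the dev server may be bound to localhost only. Vite's --host flag binds it to all interfaces (whether this is needed depends on the project's vite config):

npm run dev -- --host    # the extra -- passes --host through npm to vite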

5. Verify Backend Connectivity

Open the browser console (F12 > Network tab) and check that requests to:

http://localhost:8080/api/*

are returning 200 OK. If they are not, the usual causes are (a quick terminal check follows this list):

  • Backend isn’t running

  • Port/firewall conflict

  • Wrong base URL
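
A quick way to test the backend from a terminal, independent of the browser (the exact endpoint varies by OpenWebUI version; /api/config is used here only as an example):

curl -i http://localhost:8080/api/config    # expect HTTP/1.1 200 OK and a JSON body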


6. Verify Ollama

curl http://localhost:11434/api/tags
  • If that fails, restart Ollama:

pkill ollama
OLLAMA_HOST=0.0.0.0 ollama serve
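
Note that ollama serve runs in the foreground, so either leave it in its own terminal or background it. A rough sketch (the log path is arbitrary):

pkill ollama                                                        # stop any existing instance
nohup env OLLAMA_HOST=0.0.0.0 ollama serve > ~/ollama.log 2>&1 &    # detach, listen on all interfaces
curl http://localhost:11434/api/tags                                # should list the installed models as JSON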

✅ Final Check

Component   | Command              | Status Check
Backend     | python3 app/main.py  | Uvicorn running on 0.0.0.0:8080
Frontend    | npm run dev          | Vite ready on port 5173+
Ollama      | ollama serve         | curl localhost:11434/api/tags returns OK
Browser     | Access OpenWebUI     | UI loads, model connects

🧰 Optional: Auto-Start Script

A shell script can launch the backend and frontend together in a single tmux (or screen) session.
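
A minimal sketch, assuming tmux is installed and the same paths used above (the session name and script name are arbitrary placeholders):

#!/usr/bin/env bash
# start-openwebui.sh -- launch the OpenWebUI backend and frontend in one tmux session
set -euo pipefail

SESSION="openwebui"

# Window 1: backend (activate the venv, then start the API server)
tmux new-session -d -s "$SESSION" -n backend \
  'cd ~/glitch-stack/open-webui/backend && . venv/bin/activate && python3 app/main.py'

# Window 2: frontend (Vite dev server)
tmux new-window -t "$SESSION" -n frontend \
  'cd ~/glitch-stack/open-webui && npm run dev'

# Attach to watch both windows
tmux attach -t "$SESSION"

Once attached, Ctrl-b n switches between the backend and frontend windows, and Ctrl-b d detaches while leaving both running.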