2025-07-19

Patch Deep Friday - Test Drive 1 - fully localized.

 

Test Drive Sequence: Patch Deep Friday Stack

1. 🔍 Verify Docker Containers

Run:

bash
docker ps

You should see 3 containers:

  • ollama (port 11434)

  • openwebui (port 3000)

  • n8n (port 5678)
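The container check above can be scripted. A minimal sketch — the `printf` simulates the output of `docker ps --format '{{.Names}}'` so the snippet runs even without Docker present:

```shell
# Expected container names; in practice feed the loop with:
#   running=$(docker ps --format '{{.Names}}')
# The printf below simulates that output for illustration.
running=$(printf 'ollama\nopenwebui\nn8n\n')
for c in ollama openwebui n8n; do
  if printf '%s\n' "$running" | grep -qx "$c"; then
    echo "$c: up"
  else
    echo "$c: MISSING"
  fi
done
```

`grep -qx` matches the whole line, so a container named `n8n-worker` won't be mistaken for `n8n`.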


2. 🌐 Access OpenWebUI (LLM Chat)

  • Open browser to: http://localhost:3000

  • You should see a clean Web UI with model selection (llama3/mistral/phi)

  • 🧪 Test Prompt:

    “Explain the difference between symmetric and asymmetric encryption.”

🧠 Confirm that:

  • Model responds correctly

  • You can switch between models (llama3 → mistral → phi)

  • There’s no outbound internet traffic (confirm with netstat or ss if paranoid)
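The leak check can be reduced to a one-liner that flags established connections whose peer address is not loopback. The heredoc-style `sample` simulates `ss -tn` output so the sketch runs anywhere; in practice replace the `printf` with the real `ss -tn` call:

```shell
# Flag established connections whose peer (column 5 of `ss -tn`)
# is not loopback. The sample variable simulates ss output.
sample='ESTAB 0 0 127.0.0.1:3000 127.0.0.1:52100
ESTAB 0 0 192.168.1.5:44321 93.184.216.34:443'
printf '%s\n' "$sample" | awk '$5 !~ /^127\./ && $5 !~ /^\[::1\]/ {print "possible leak: " $5}'
```

An empty result means nothing is talking to the outside world; any line printed deserves a closer look.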


3. 🧠 API Test via Ollama

In terminal:

bash
curl http://localhost:11434/api/generate -d '{"model":"llama3","prompt":"What is OPSEC?"}'

You should receive a stream of newline-delimited JSON objects; the "response" fields concatenate into the full answer. Add "stream": false to the payload if you prefer a single JSON object.
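Joining those streamed "response" fields back together takes one pipeline. The `stream` variable below simulates a short two-chunk reply so the sketch is self-contained; in practice pipe the curl output from above into the same `sed | tr` stage:

```shell
# Ollama streams newline-delimited JSON objects; extract each
# "response" field with sed and join them with tr.
# The variable simulates two chunks of a real stream.
stream='{"model":"llama3","response":"OPSEC ","done":false}
{"model":"llama3","response":"is operational security.","done":true}'
printf '%s\n' "$stream" | sed -n 's/.*"response":"\([^"]*\)".*/\1/p' | tr -d '\n'; echo
```

Note the sed trick only handles response chunks without embedded escaped quotes; jq would be more robust if it is installed inside the stack.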


4. 🤖 Access n8n

  • Open: http://localhost:5678

  • Log in with:

    • User: admin

    • Password: kalikotrocks

  • 🧪 Test Workflow:

    1. Create new workflow

    2. Add a "Cron" node → set to run every minute

    3. Add "Set" node → Output: message: Hello Kalikot

    4. Connect & Activate

This validates the n8n automation engine.
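The same Cron → Set workflow can be kept as an importable JSON export instead of rebuilding it by hand each time. A sketch follows — the node type names (`n8n-nodes-base.cron`, `n8n-nodes-base.set`) and parameter shapes are assumptions based on older n8n releases (newer versions use a Schedule Trigger node), so adjust them to whatever your n8n version exports:

```shell
# Write a minimal two-node workflow export (Cron -> Set) that can be
# imported through the n8n UI. Node type names and parameter layout
# are assumptions for an older n8n release; verify against your own
# workflow export before relying on them.
cat > hello-kalikot.json <<'EOF'
{
  "name": "Hello Kalikot",
  "nodes": [
    { "name": "Cron", "type": "n8n-nodes-base.cron",
      "parameters": { "triggerTimes": { "item": [ { "mode": "everyMinute" } ] } },
      "position": [250, 300] },
    { "name": "Set", "type": "n8n-nodes-base.set",
      "parameters": { "values": { "string": [
        { "name": "message", "value": "Hello Kalikot" } ] } },
      "position": [500, 300] }
  ],
  "connections": {
    "Cron": { "main": [ [ { "node": "Set", "type": "main", "index": 0 } ] ] }
  }
}
EOF
echo "wrote hello-kalikot.json"
```

The cleanest way to get a known-good version of this file is to build the workflow once in the UI and use its export as the template.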


5. 📁 File Ingestion Test (Optional)

If you’ve created a ~/offline-ai-stack/data/docs folder:

  • Drop a .txt file (e.g., test-snippet.txt)

  • Launch a Python container:

bash
docker run -it --rm -v ~/offline-ai-stack/data:/app/data python:3.11-bullseye bash

  • Inside:

bash
pip install llama-index chromadb # Then build simple doc index

We can automate this later via n8n.
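Staging the folder and sample file can itself be scripted. A small sketch — `DATA_DIR` is a convenience variable introduced here so the snippet can be dry-run outside the stack; by default it targets the `~/offline-ai-stack/data` path used above:

```shell
# Stage the docs folder and a sample snippet for the ingestion test.
# DATA_DIR defaults to the stack's data folder; override it to
# dry-run somewhere else.
DATA_DIR="${DATA_DIR:-$HOME/offline-ai-stack/data}"
mkdir -p "$DATA_DIR/docs"
printf 'Symmetric ciphers share one key; asymmetric ciphers use a key pair.\n' \
  > "$DATA_DIR/docs/test-snippet.txt"
echo "staged: $DATA_DIR/docs/test-snippet.txt"
```

Because the folder is bind-mounted into the Python container at /app/data, the file appears there immediately with no copy step.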


6. 🛑 When Done

To gracefully shut down:

bash
cd ~/offline-ai-stack
docker compose down

🎯 Final Check

| Layer | Status |
| --- | --- |
| Docker | ✅ Up and running |
| Ollama API | ✅ Responds to curl requests |
| WebUI | ✅ Loads, switches models |
| n8n | ✅ Login, build workflows |
| File Mounts | ✅ Validated via busybox/python |