🧠 Patch Deep Friday: Project Summary
Project Purpose:
Patch Deep Friday is a fully offline, automation-ready AI assistant and workflow-agent stack running on Kalikot, a Kali Linux host. Patch is Reboot's brother AI, designed to learn, recover, and automate. It blends local LLM capability with workflow orchestration and document understanding, all without internet dependence.
⚙️ Section 1: Hardware and Host Setup
Primary host: Kalikot
Project directory: `~/offline-ai-stack/`
🐳 Section 2: Docker Stack (Active)
We’re using Docker Compose to run three core services:
✅ Confirmed running via `docker-compose up`:
```yaml
version: '3.8'
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
  openwebui:
    image: ghcr.io/open-webui/open-webui
    ports:
      - "3000:8080"   # Open WebUI listens on 8080 inside the container
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    volumes:
      - n8n-data:/home/node/.n8n
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=patch
      - N8N_BASIC_AUTH_PASSWORD=fr1d4ysecure
    restart: unless-stopped

volumes:
  ollama-data:
  n8n-data:
```
🧠 What This Gives Us:
- Ollama: Local LLM backend (used for running models like `llama3`, `mistral`, `codellama`, etc.).
- Open WebUI: Visual front-end for chatting with models served by Ollama.
- n8n: Automation engine to build local workflows (e.g., file parsing, alerting, internal triggers).
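Once the stack is up, the Ollama backend can also be driven directly over HTTP. A minimal Python sketch of a non-streaming call to Ollama's `/api/generate` endpoint (the endpoint and JSON fields follow Ollama's documented API; the model name `llama3` is just an example):

```python
import json

# Ollama's generate endpoint, as exposed by the compose stack above.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request body for Ollama."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def extract_reply(raw: bytes) -> str:
    """Pull the generated text out of a non-streaming Ollama response."""
    return json.loads(raw)["response"]
```

With a model pulled, the request body could be sent with `urllib.request.urlopen(OLLAMA_URL, data=build_request("llama3", "hello"))` and the response bytes passed to `extract_reply`.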
🧠 Section 3: Software and Authentication
Hugging Face Token
- Token `deep-friday` saved successfully to: `/home/xxxxxx/.cache/huggingface/stored_tokens`
- Git credential helper not yet configured.
- No `git push` setup, since Patch runs offline.
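Since Patch never goes online, it may also help to pin Hugging Face tooling to the local cache. A small sketch using the standard offline environment variables (these must be set before importing `transformers` or `huggingface_hub`):

```python
import os

# Force Hugging Face libraries to resolve everything from the local cache
# instead of phoning home -- fits the fully-offline goal of this stack.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"
```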
🖥️ Section 4: Interface Access
Web UIs:
- Open WebUI → http://localhost:3000
- n8n → http://localhost:5678
- Ollama API → http://localhost:11434
Gmode Access Plans:
- Planning access across the `gmode` VLAN network.
- Intention to expose Open WebUI and n8n via internal IPs.
- Future remote entry point via Tor + Tails + auth.
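The Tor entry point above would eventually mean an onion service on Kalikot. A minimal `torrc` sketch (the directory path is an assumption, and client authorization would still need to be layered on top for the planned auth):

```
HiddenServiceDir /var/lib/tor/patch/
HiddenServicePort 80 127.0.0.1:3000
```

This maps the onion address to the local Open WebUI port; a second `HiddenServicePort` line could do the same for n8n on 5678.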
🛠️ Section 5: Networking and Remote Control
Current State:
- Working from Kalikot inside the `gmode` network.
- Planning secure remote access using Tails + Onion + login.
- Need to activate SSH on the Pop!_OS side (for the Tracer III Evo) to work across the LAN.
🧪 Section 6: Additional Experiments
Attempted:
- Flashing Kali ARM to the uConsole (CM4) → resulted in a black screen (driver issue).
- Switched to Raspberry Pi OS Lite (64-bit) on the uConsole.
- Tried a minimalist Kali install on the uConsole → broke due to `libgtk-3-0t64` and `libnettle.so.8` conflicts.
🧠 Section 7: Project Identity and Naming
- Patch = learner, fixer, automation role (opposite but complementary to Reboot).
- Middle name: Deep (for deep learning, depth of analysis).
- Last name: Friday (born on Pi Day: March 14).
✅ Section 8: Completed Tasks
- ✅ Named Kalikot and Patch.
- ✅ Fully configured Docker Compose file with 3 services.
- ✅ Set up Hugging Face token.
- ✅ Started Open WebUI and confirmed connection to Ollama.
- ✅ Logged into n8n locally with basic auth.
- ✅ Defined goal of full offline stack.
- ✅ Isolated issues with ARM + Raspberry Pi display drivers.
⚠️ Section 9: To-Do / In Progress
- Add PDF, DOCX, TXT, and XLSX document loaders to Patch for RAG.
- Finalize offline model download (e.g., LLaMA 3 or Phi-3 via `ollama pull`).
- Create a basic n8n flow for doc → extract → summarize.
- Enable LAN access to services from other `gmode` VLAN devices.
- Enable Tor/Onion access with an authentication layer.
- Retry a lightweight local UI for the uConsole (Pi OS + text-based tools).
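Whatever loaders end up in front of the RAG pipeline above, the documents will need to be split before embedding. A minimal sketch of a character-based chunker (the size and overlap values are arbitrary defaults, not tuned settings):

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks for embedding/RAG."""
    assert size > overlap >= 0, "chunk size must exceed overlap"
    chunks = []
    step = size - overlap  # advance by size minus overlap each iteration
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break  # last chunk already covers the tail of the text
    return chunks
```

A real setup would put format-specific extractors (PDF, DOCX, XLSX) in front of this and likely chunk by tokens or sentences instead of raw characters.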
📌 TL;DR Summary
- Patch Deep Friday is now operational with a 3-service Docker stack: Ollama (LLM), Open WebUI (chat), and n8n (automation).
- All services are running locally on Kalikot, your 32 GB Dell 7490 with Kali Linux.
- You've configured offline authentication, interface access, and future remote/Tor expansion plans.
- ARM builds on the uConsole ran into GTK/Nettle library conflicts; the fallback to Pi OS Lite is in motion.
- Current stage: ready for RAG setup and automation flows in n8n.