Installation

This guide walks through deploying hemlock-lab from a fresh clone to a fully operational validation lab.

Prerequisites

Make sure you've completed the Prerequisites checklist before starting.


Step 1: Clone the Repository

git clone https://github.com/professor-moody/hemlock-lab.git
cd hemlock-lab

Step 2: Start Ollama

Ollama runs on the host. Make sure it's running and has the required models:

ollama serve &        # If not already running
ollama pull smollm2:135m
ollama pull nomic-embed-text
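
Before moving on, you can confirm Ollama is reachable and both models are present by querying its local API (11434 is Ollama's default port):

curl -s http://localhost:11434/api/tags | jq -r '.models[].name'

The output should include smollm2:135m and nomic-embed-text.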

Step 3: Start the Stack

cd docker
docker compose up -d

This builds and starts 6 containers:

[+] Building langchain-rag...
[+] Building llamaindex-rag...
[+] Building unstructured-rag...
[+] Building haystack-rag...
[+] Building colpali-rag...
[+] Running 6/6
 ✔ Container chromadb         Healthy
 ✔ Container langchain-rag    Started
 ✔ Container llamaindex-rag   Started
 ✔ Container unstructured-rag Started
 ✔ Container haystack-rag     Started
 ✔ Container colpali-rag      Started

First build

The first docker compose up builds all pipeline images, which may take several minutes. Subsequent starts use cached images.
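You can check container state at any point with Compose itself (run from the docker/ directory):

docker compose ps

All six services should show a running (or healthy) status once startup completes.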


Step 4: Verify Health

for port in 8000 8100 8101 8102 8103 8104; do
  echo -n "Port ${port}: "
  if body=$(curl -sf "http://localhost:${port}/health"); then
    echo "$body" | jq -r '"\(.framework // "chromadb") [\(.status // "ok")]"' 2>/dev/null
  else
    echo "DOWN"
  fi
done

Expected output:

Port 8000: chromadb [ok]
Port 8100: langchain [ok]
Port 8101: llamaindex [ok]
Port 8102: unstructured [ok]
Port 8103: haystack [ok]
Port 8104: colpali [ok]
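
To inspect a single pipeline's full health payload, hit its /health endpoint directly:

curl -s http://localhost:8100/health | jq

The response carries at least the framework and status fields that the loop above formats.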

Configuration

Override defaults via environment variables or a .env file in the docker/ directory:

Variable             Default            Purpose
OLLAMA_MODEL         smollm2:135m       LLM model for generation
OLLAMA_EMBED_MODEL   nomic-embed-text   Embedding model
SYSTEM_PROMPT_FILE   (empty)            Custom system prompt path
# Example .env file
echo 'OLLAMA_MODEL=llama3.2:3b' > docker/.env
cd docker && docker compose up -d   # Recreates the stack with the new model
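
A fuller .env can set all three variables at once. Note the SYSTEM_PROMPT_FILE path below is illustrative, not a path shipped with the project; it must resolve inside the pipeline containers:

# docker/.env
OLLAMA_MODEL=llama3.2:3b
OLLAMA_EMBED_MODEL=nomic-embed-text
SYSTEM_PROMPT_FILE=/app/prompts/custom.txt   # hypothetical path; adjust to your mount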

Troubleshooting

Symptom                       Cause                            Fix
Container exits immediately   Ollama not running on host       Start Ollama: ollama serve
ChromaDB health fails         Container still starting         Wait 10s for healthcheck to pass
Pipeline returns 503          ChromaDB or Ollama unreachable   Check docker compose logs <service>
Build fails                   Docker not running               Start Docker Desktop or Docker Engine
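
The logs command in the table takes any service name from Step 3. For example, to follow a failing pipeline's logs in real time (from the docker/ directory):

docker compose logs -f --tail=50 langchain-rag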

For more details, see Troubleshooting.


Next Steps


Legacy: Proxmox Deployment

The original deployment used make setup-proxmox to create a VM (VMID 260, 172.16.50.50), followed by make deploy for a 4-phase provisioning process (base OS → services → seed data → verification). The Proxmox scripts are still available in lab-scripts/ for bare-metal environments.
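
For reference, the legacy flow was (run from the repository root):

make setup-proxmox   # Create the VM (VMID 260 at 172.16.50.50)
make deploy          # 4-phase provisioning: base OS → services → seed data → verification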