# Price Is Right - Host-Based Setup
A simplified host-based microservices implementation of "The Price is Right" deal hunting system.
## Overview
This setup runs all services directly on the host without Docker containers, using a shared Python virtual environment and direct Ollama connection.
## Prerequisites
- Python 3.11+
- Ollama running on port 11434
- Required Ollama models: `llama3.2` and `llama3.2:3b-instruct-q4_0` (a quick verification sketch follows this list)
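If you want to confirm these prerequisites programmatically, a quick check like the following works. It is a convenience sketch, not part of the project, and it only assumes the default Ollama REST endpoint (`/api/tags`) on port 11434 plus the model names listed above.

```python
import json
import sys
import urllib.request

REQUIRED_MODELS = ["llama3.2", "llama3.2:3b-instruct-q4_0"]

# The services expect Python 3.11 or newer.
assert sys.version_info >= (3, 11), "Python 3.11+ is required"

# Ask the local Ollama daemon (default port 11434) which models are installed.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    installed = {m["name"] for m in json.load(resp)["models"]}

# Installed names usually carry a tag suffix (e.g. "llama3.2:latest"),
# so also compare against the tag-less base names.
known = installed | {name.split(":")[0] for name in installed}
missing = [m for m in REQUIRED_MODELS if m not in known]
print("Missing models:", missing or "none")
```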
## Quick Start
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  # or with uv:
  uv pip install -r requirements.txt
  ```

- Start all services:

  ```bash
  python service_manager.py start
  ```

- Access the UI:

  - Main UI: http://localhost:7860
  - Notification Receiver: http://localhost:7861

- Stop all services:

  ```bash
  python service_manager.py stop
  ```
## Service Architecture
| Service | Port | Description |
|---|---|---|
| Scanner Agent | 8001 | Scans for deals from RSS feeds |
| Specialist Agent | 8002 | Fine-tuned LLM price estimation |
| Frontier Agent | 8003 | RAG-based price estimation |
| Random Forest Agent | 8004 | ML model price prediction |
| Ensemble Agent | 8005 | Combines all price estimates |
| Planning Agent | 8006 | Orchestrates deal evaluation |
| Notification Service | 8007 | Sends deal alerts |
| Notification Receiver | 8008 | Receives and displays alerts |
| UI | 7860 | Main web interface |
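Each service is an independent HTTP process bound to a localhost port, so any of them can be exercised in isolation. The snippet below shows the idea by calling the Ensemble Agent on port 8005; the `/price` path and the request/response fields are illustrative assumptions, not taken from the actual service code.

```python
import json
import urllib.request

# Hypothetical request to the Ensemble Agent; the endpoint path and payload
# shape are assumptions for illustration only.
payload = json.dumps({"description": "Refurbished 55-inch 4K smart TV"}).encode()
req = urllib.request.Request(
    "http://localhost:8005/price",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # combined price estimate from the underlying agents
```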
## Service Management

### Start Services

```bash
# Start all services
python service_manager.py start

# Start specific service
python service_manager.py start scanner
```

### Stop Services

```bash
# Stop all services
python service_manager.py stop

# Stop specific service
python service_manager.py stop scanner
```

### Check Status

```bash
python service_manager.py status
```

### Restart Service

```bash
python service_manager.py restart scanner
```
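Under the hood, a host-based manager like `service_manager.py` usually just spawns each service as a child process and keeps track of its handle. The sketch below illustrates that pattern under stated assumptions; the service-to-script mapping and structure are not the actual implementation.

```python
import subprocess

# Illustrative mapping of service names to entry points; real paths may differ.
SERVICES = {"scanner": "services/scanner_agent.py"}

processes: dict[str, subprocess.Popen] = {}

def start(name: str) -> None:
    """Spawn a service as a child process and remember its handle."""
    processes[name] = subprocess.Popen(["python", SERVICES[name]])

def stop(name: str) -> None:
    """Terminate a running service and wait for it to exit."""
    proc = processes.pop(name, None)
    if proc is not None:
        proc.terminate()
        proc.wait(timeout=10)

start("scanner")
stop("scanner")
```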
## Data Files
- `data/models/` - Contains .pkl model files (immediately accessible)
- `data/vectorstore/` - ChromaDB vector store
- `data/memory.json` - Deal memory storage
- `logs/` - Service log files
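Because the model files live directly on the host filesystem, a service can unpickle them without any volume mounts or copies; for example (the filename below is an assumption, use whatever sits in `data/models/`):

```python
import pickle
from pathlib import Path

# Load a serialized model straight from the shared data directory.
# "random_forest.pkl" is an assumed filename for illustration.
model_path = Path("data/models") / "random_forest.pkl"
with model_path.open("rb") as f:
    model = pickle.load(f)
print(type(model))
```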
## Key Features
- No Docker overhead - Services start instantly
- Direct file access - .pkl files load immediately
- Single environment - All services share the same Python environment
- Direct Ollama access - No proxy needed (see the sketch after this list)
- Easy debugging - Direct process access and logs
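With no proxy in front of Ollama, any service (or a quick script) can talk to the daemon's REST API on port 11434 directly. Here is a minimal sketch using only the standard library; the prompt is a placeholder:

```python
import json
import urllib.request

# Direct call to the local Ollama daemon; no proxy or container networking involved.
body = json.dumps({
    "model": "llama3.2",
    "prompt": "Estimate a fair price for a refurbished 55-inch 4K smart TV.",
    "stream": False,
}).encode()
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=body,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```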
## Troubleshooting
- Port conflicts: Check if ports are already in use (a port probe sketch follows this list)

  ```bash
  python service_manager.py status
  ```

- Ollama connection issues: Ensure Ollama is running on port 11434

  ```bash
  ollama list
  ```

- Service logs: Check individual service logs in the `logs/` directory

- Model loading: Ensure required models are available

  ```bash
  ollama pull llama3.2
  ollama pull llama3.2:3b-instruct-q4_0
  ```
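To diagnose port conflicts without going through the service manager, you can probe the ports directly; a small sketch (the list below just mirrors the architecture table plus the two UI ports):

```python
import socket

# Service ports from the architecture table, plus the UI ports.
PORTS = [8001, 8002, 8003, 8004, 8005, 8006, 8007, 8008, 7860, 7861]

for port in PORTS:
    with socket.socket() as s:
        s.settimeout(0.5)
        in_use = s.connect_ex(("localhost", port)) == 0
        print(f"port {port}: {'in use' if in_use else 'free'}")
```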
## Development
- All services are in the `services/` directory
- Shared code is in the `shared/` directory
- The service manager handles the process lifecycle
- Logs are written to the `logs/` directory (see the log-tail sketch below)
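Since each service writes its own file under `logs/`, a quick way to inspect recent output is to read the tail of the relevant file; the `scanner.log` filename below is an assumption about the naming scheme.

```python
from pathlib import Path

# Print the last few lines of a service's log; "scanner.log" is an assumed name.
log_file = Path("logs") / "scanner.log"
for line in log_file.read_text().splitlines()[-20:]:
    print(line)
```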