- Interactive code explainer using Llama 3.2 via Ollama - Notebook and Terminal modes - Local LLM integration (no API costs) - Multiple examples included
# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434/v1
OLLAMA_MODEL=llama3.2

# OpenAI (if you want to use OpenAI API instead)
# OPENAI_API_KEY=your_api_key_here
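For reference, a minimal sketch of how these variables could be consumed from Python, assuming the openai and python-dotenv packages and Ollama's OpenAI-compatible endpoint; the file name, explain() function, and prompts below are illustrative assumptions, not taken from the repository:

# explain_snippet.py -- hypothetical example, not part of the repository
import os

from dotenv import load_dotenv   # python-dotenv: reads the .env file above
from openai import OpenAI        # official OpenAI client, pointed at Ollama

load_dotenv()  # loads OLLAMA_BASE_URL, OLLAMA_MODEL, and (optionally) OPENAI_API_KEY

client = OpenAI(
    base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
    # Ollama ignores the API key, but the client requires a non-empty value
    api_key=os.getenv("OPENAI_API_KEY", "ollama"),
)

def explain(code: str) -> str:
    """Ask the locally served Llama 3.2 model to explain a code snippet."""
    response = client.chat.completions.create(
        model=os.getenv("OLLAMA_MODEL", "llama3.2"),
        messages=[
            {"role": "system", "content": "You are a concise code explainer."},
            {"role": "user", "content": f"Explain what this code does:\n\n{code}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(explain("print(sum(range(10)))"))

Because Ollama exposes an OpenAI-compatible API at the /v1 path, switching to the hosted OpenAI API would only require uncommenting OPENAI_API_KEY and changing the base URL and model name.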