# Code Explainer - LLM Engineering Week 1
An AI-powered code explanation tool using Llama 3.2 via Ollama.
## 🎯 Project Overview
This project demonstrates prompt engineering and local LLM integration by building a code explanation assistant. The tool analyzes code snippets and provides beginner-friendly, line-by-line explanations.
## ✨ Features
- Local LLM Integration: Uses Ollama with Llama 3.2 (no API costs!)
- Two Modes of Operation:
  - 📓 Notebook Mode: Interactive Jupyter notebook with rich Markdown display
  - 💻 Terminal Mode: Interactive CLI for continuous code explanation
- Smart Explanations:
  - Summarizes the overall purpose
  - Breaks the code down line by line
  - Highlights key programming concepts
  - Uses beginner-friendly language
- Multiple Examples: Loops, list comprehensions, OOP, recursion, decorators
- Streaming Responses: Real-time output as the model generates
## 🛠️ Technologies Used
- Ollama: Local LLM runtime
- Llama 3.2: Meta's language model
- OpenAI Python SDK: API-compatible interface
- IPython: Rich markdown display
- python-dotenv: Environment management
## 📋 Prerequisites
- Ollama installed - Download from ollama.com
- Python 3.8+
- Llama 3.2 model pulled:

  ```bash
  ollama pull llama3.2
  ```
## 🚀 Setup
1. Clone the repository

   ```bash
   git clone <your-repo-url>
   cd llm-engineering/week1/my-solutions
   ```

2. Install dependencies

   ```bash
   pip install -r requirements.txt
   ```

3. (Optional) Configure environment

   ```bash
   cp .env.example .env
   # Edit .env if needed
   ```
## 💡 Usage

### Notebook Mode
- Open `day1-solution.ipynb` in Jupyter or VS Code
- Run cells sequentially
- Use the `explain_code_interactive()` function with your own code:

```python
explain_code_interactive("""
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)
""")
```
### Terminal Mode

```bash
python code_explainer.py
```
Then:

- Paste your code
- Press Enter twice (an empty line ends input)
- Get your explanation!

Commands:

- `quit` / `exit` / `q` - Exit
- `clear` - Start fresh
- `examples` - See sample code snippets
## 🎓 Why Ollama?
I chose Ollama over OpenAI API for this project because:
- ✅ No API Costs: Completely free to use
- ✅ Privacy: All data stays local
- ✅ Offline: Works without an internet connection
- ✅ Learning: Hands-on experience with local LLM deployment
- ✅ Speed: Fast responses on local hardware
## 📝 Examples Included
- Recursion - Fibonacci sequence
- Loops - Simple iteration
- List Comprehensions - Filtering and mapping
- Object-Oriented Programming - Classes and methods
- Decorators - Advanced Python concepts
## 🔧 Customization
### Change the model

Edit `code_explainer.py`:

```python
explainer = CodeExplainer(model="llama3.2:3b")  # Use a smaller model
```
### Adjust temperature

A lower temperature gives more consistent output; a higher one gives more creative output:

```python
temperature=0.3  # Current setting for code explanations
```
### Modify the system prompt

Customize how the model explains code by editing the `system_prompt` variable.
## 🤝 Contributing
This is a course assignment, but feel free to fork and improve!
## 📄 License
MIT License - feel free to use for learning purposes.
## 👤 Author
İrem İrem
LLM Engineering Course - Week 1 Assignment
## 🙏 Acknowledgments
- Ed Donner - Course instructor for the excellent LLM Engineering course
- The Ollama team - For the Ollama framework
- Meta - For Llama 3.2 model