# Code Explainer - LLM Engineering Week 1

An AI-powered code explanation tool using Llama 3.2 via Ollama.

## 🎯 Project Overview

This project demonstrates prompt engineering and local LLM integration by building a code explanation assistant. The tool analyzes code snippets and provides beginner-friendly, line-by-line explanations.

## ✨ Features

- **Local LLM Integration**: Uses Ollama with Llama 3.2 (no API costs!)
- **Two Modes of Operation**:
  - 📓 **Notebook Mode**: Interactive Jupyter notebook with rich Markdown display
  - 💻 **Terminal Mode**: Interactive CLI for continuous code explanation
- **Smart Explanations**:
  - Summarizes the overall purpose
  - Line-by-line breakdown
  - Highlights key programming concepts
  - Beginner-friendly language
- **Multiple Examples**: Loops, list comprehensions, OOP, recursion, decorators
- **Streaming Responses**: Real-time output as the model generates

## 🛠️ Technologies Used

- **Ollama**: Local LLM runtime
- **Llama 3.2**: Meta's language model
- **OpenAI Python SDK**: API-compatible interface to the local Ollama server
- **IPython**: Rich Markdown display
- **python-dotenv**: Environment management

## 📋 Prerequisites

1. **Ollama installed** - Download from [ollama.com](https://ollama.com)
2. **Python 3.8+**
3. **Llama 3.2 model pulled**:

```bash
ollama pull llama3.2
```

## 🚀 Setup

### 1. Clone the repository

```bash
git clone <repository-url>
cd llm-engineering/week1/my-solutions
```

### 2. Install dependencies

```bash
pip install -r requirements.txt
```

### 3. (Optional) Configure environment

```bash
cp .env.example .env
# Edit .env if needed
```

## 💡 Usage

### Notebook Mode

1. Open `day1-solution.ipynb` in Jupyter or VS Code
2. Run the cells sequentially
3. Call the `explain_code_interactive()` function with your own code:

```python
explain_code_interactive("""
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)
""")
```

### Terminal Mode

```bash
python code_explainer.py
```

Then:

- Paste your code
- Press Enter twice (empty line)
- Get your explanation!

**Commands:**

- `quit` / `exit` / `q` - Exit
- `clear` - Start fresh
- `examples` - See sample code snippets

## 🎓 Why Ollama?

I chose Ollama over the OpenAI API for this project because:

- ✅ **No API Costs**: Completely free to use
- ✅ **Privacy**: All data stays local
- ✅ **Offline**: Works without an internet connection
- ✅ **Learning**: Hands-on experience with local LLM deployment
- ✅ **Speed**: Fast responses on local hardware

## 📝 Examples Included

1. **Recursion** - Fibonacci sequence
2. **Loops** - Simple iteration
3. **List Comprehensions** - Filtering and mapping
4. **Object-Oriented Programming** - Classes and methods
5. **Decorators** - Advanced Python concepts

## 🔧 Customization

### Change the model

Edit `code_explainer.py`:

```python
explainer = CodeExplainer(model="llama3.2:3b")  # Use a smaller model
```

### Adjust the temperature

A lower temperature gives more consistent output; a higher one gives more creative output:

```python
temperature=0.3  # Current setting for code explanations
```

### Modify the system prompt

Customize how the model explains code by editing the `system_prompt` variable.

## 🤝 Contributing

This is a course assignment, but feel free to fork and improve!

## 📄 License

MIT License - feel free to use it for learning purposes.

## 👤 Author

İrem İrem
LLM Engineering Course - Week 1 Assignment

## 🙏 Acknowledgments

- **Ed Donner** - Instructor of the excellent LLM Engineering course
- **The Ollama team** - For the Ollama framework
- **Meta** - For the Llama 3.2 model
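
## 🔌 How the local connection works (sketch)

As noted under Technologies Used, the tool talks to the local Llama 3.2 model through the OpenAI Python SDK's API-compatible interface, with streaming output and a low temperature. For readers new to that pattern, here is a minimal, hypothetical sketch of such a client. The actual setup in `code_explainer.py` may differ; the base URL, placeholder API key, system prompt, and helper function below are assumptions rather than the project's exact code.

```python
# Hypothetical sketch: pointing the OpenAI Python SDK at a local Ollama server.
# Assumes Ollama is running on its default port (11434) and `ollama pull llama3.2` has been run.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the SDK, but ignored by Ollama
)

SYSTEM_PROMPT = "You are a patient tutor. Explain the given code line by line for beginners."

def explain(code: str) -> None:
    """Stream an explanation of `code` to stdout as the model generates it."""
    stream = client.chat.completions.create(
        model="llama3.2",
        temperature=0.3,   # low temperature keeps explanations consistent
        stream=True,       # print tokens as they arrive instead of waiting for the full reply
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Explain this code:\n\n{code}"},
        ],
    )
    for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="", flush=True)
    print()

if __name__ == "__main__":
    explain("squares = [x * x for x in range(10) if x % 2 == 0]")
```

Because Ollama exposes an OpenAI-compatible endpoint, the same SDK calls work against either a hosted OpenAI model or the local Llama 3.2 model; only the `base_url` and `model` change.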