Merge branch 'main' of github.com:bluebells1/llm_engineering into sm-branch-wk4
429
week2/community-contributions/04_tribot_debate.ipynb
Normal file
File diff suppressed because one or more lines are too long
557
week2/community-contributions/05_weathermate_ai_agent.ipynb
Normal file
@@ -0,0 +1,557 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"attachments": {},
|
||||
"cell_type": "markdown",
|
||||
"id": "ae1ef804-3504-488d-af86-5a0da36fea78",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# ☀️🏃♀️ WeatherMate\n",
|
||||
"----\n",
|
||||
"\n",
|
||||
"**WeatherMate** is a conversational **AI agent** that analyzes real-time weather conditions and suggests the best activities and events based on location. Whether it's sunny, rainy, or snowy, WeatherMate helps you make the most of your day! \n",
|
||||
"\n",
|
||||
"Here's how it works:\n",
|
||||
"1. Get current weather conditions for the user's location.\n",
|
||||
"2. Recommend suitable indoor or outdoor activities based on the weather.\n",
|
||||
"3. Find relevant events using the Ticketmaster API.\n",
|
||||
"4. Merge both activity suggestions and events into a single, structured response.\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"Large Language Models (LLMs), by themselves, cannot fetch real-time data such as weather information. To enable LLMs to access and use such real-time data, we integrate **external tools.** \n",
|
||||
"\n",
|
||||
"In this notebook, we will implement a weather API, allowing the assistant to fetch real-time weather information and use it for personalized activity suggestions based on current weather conditions. This is an essential step in transforming an LLM into a more interactive and data-driven AI assistant.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"In this notebook, we will develop a conversational AI Agent that helps users receive personalized activity recommendations based on real-time weather data.\n",
|
||||
"\n",
|
||||
"- 🧑💻 Skill Level: Advanced\n",
|
||||
"- 📤 Output Format: conversational chat\n",
|
||||
"- 🚀 Tools:\n",
|
||||
" - Weather API integration \n",
|
||||
" - Ticketmaster API\n",
|
||||
" - OpenAI with external tool handling\n",
|
||||
" - Gradio for the UI\n",
|
||||
"\n",
|
||||
"🛠️ Requirements\n",
|
||||
"- ⚙️ Hardware: ✅ CPU is sufficient — no GPU required\n",
|
||||
"- 🔑 OpenAI API Key\n",
|
||||
"- 🔑 Weather API integration (https://www.weatherapi.com)\n",
|
||||
"- 🔑 Ticketmaster API (https://developer.ticketmaster.com/explore/)\n",
|
||||
"\n",
|
||||
"⚙️ Customizable by user\n",
|
||||
"- 🤖 Selected model\n",
|
||||
"- 📜 system_prompt: Controls model behavior\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"📢 Find more LLM notebooks on my [GitHub repository](https://github.com/lisekarimi/lexo)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "ad262788",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"**Class Diagram**\n",
|
||||
"\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "d6b7a492-f510-4ba4-bbc3-239675d389dd",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# imports\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"import json\n",
|
||||
"import requests\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"from openai import OpenAI\n",
|
||||
"import gradio as gr\n",
|
||||
"from datetime import datetime\n",
|
||||
"\n",
|
||||
"# Initialization\n",
|
||||
"\n",
|
||||
"load_dotenv(override=True)\n",
|
||||
"\n",
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"if not openai_api_key:\n",
|
||||
" print(\"❌ OpenAI API Key is missing!\")\n",
|
||||
"\n",
|
||||
"weather_api_key = os.getenv('WEATHERAPI_KEY')\n",
|
||||
"if not weather_api_key:\n",
|
||||
" print(\"❌ Weather API Key is missing!\")\n",
|
||||
"\n",
|
||||
"ticketmaster_api_key = os.getenv('TICKETMASTER_KEY')\n",
|
||||
"if not ticketmaster_api_key:\n",
|
||||
" print(\"❌ TicketMaster API Key is missing!\")\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"MODEL = \"gpt-4o-mini\"\n",
|
||||
"openai = OpenAI()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "347dbe00-5826-4aa6-9d2c-9d028fc33ec8",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Get today's date and day name\n",
|
||||
"today_str = datetime.today().strftime('%Y-%m-%d')\n",
|
||||
"day_name = datetime.today().strftime('%A')\n",
|
||||
"\n",
|
||||
"nb_activity = 10\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"system_message = f\"\"\"\n",
|
||||
"You are a fun and helpful assistant for an Activity Suggestion App.\n",
|
||||
"Your job is to recommend **up to {nb_activity} activities** based on the real-time weather fetched from the API, ensuring a mix of **indoor, outdoor, and event-based activities** whenever possible.\n",
|
||||
"\n",
|
||||
"The total must always be **10 or fewer**, following this rule:\n",
|
||||
"**nb_events + nb_indoors + nb_outdoors ≤ 10**.\n",
|
||||
"\n",
|
||||
"You must **analyze and think carefully** to determine the best combination of activities and events for the user.\n",
|
||||
"- Evaluate **weather conditions** to decide if outdoor activities are suitable.\n",
|
||||
"- Check **event availability** and select the most relevant ones.\n",
|
||||
"- Balance **indoor, outdoor, and event-based activities** dynamically to provide the best experience.\n",
|
||||
"\n",
|
||||
"If one of these categories is unavailable, that's fine—just provide the best possible suggestions without exceeding **10 activities**.\n",
|
||||
"Deliver everything **in one go—no waiting!**\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"### **Understanding Relative Dates**\n",
|
||||
"- Always interpret relative dates based on **{today_str} ({day_name})**.\n",
|
||||
"- The weekend always refers to Saturday and Sunday.\n",
|
||||
"- \"Next {day_name}\" should refer to the **closest upcoming occurrence** of that day.\n",
|
||||
"- If the user asks for a time range (e.g., \"the next 3 days\"), calculate the **exact date range** starting from today.\n",
|
||||
"- If no specific date is mentioned, **assume today by default**.\n",
|
||||
"- **Do not ask for confirmation** when interpreting dates—just assume the correct date and proceed confidently unless there's real ambiguity.\n",
|
||||
"\n",
|
||||
"### **Activity and Event Suggestion Process**\n",
|
||||
"To provide the best {nb_activity} activity recommendations, follow these steps:\n",
|
||||
"Step 1: Retrieve Weather Data – Use the Weather API to get current conditions for the user's location.\n",
|
||||
"Step 2: Suggest Activities – Recommend suitable indoor or outdoor activities based on the weather.\n",
|
||||
"Step 3: Fetch Events (if available) – Use the Ticketmaster API to find relevant events in the user’s area.\n",
|
||||
"Step 4: Combine Everything – Merge both event listings and activity suggestions into a single, well-structured response.\n",
|
||||
"This entire process should be done seamlessly in one go without making the user wait.\n",
|
||||
"\n",
|
||||
"### **How to Handle Each API**\n",
|
||||
"- **Weather API Handling**:\n",
|
||||
" - If the user requests a relative date (e.g., \"tomorrow,\" \"next Monday\"), calculate the number of days from today.\n",
|
||||
" - Provide the weather forecast only for the requested date, ignoring any other days in the response.\n",
|
||||
" - If no weather data is available, inform the user in a friendly, light-hearted way.\n",
|
||||
" - The forecast is limited to 14 days, so if the user requests a longer period, politely let him know.\n",
|
||||
"\n",
|
||||
"- **Ticketmaster API Handling**:\n",
|
||||
" - If the user asks for events today, set the start date as today’s date.\n",
|
||||
" - If the user asks for any specific weekday, find the next occurrence of that day and use it as the start date.\n",
|
||||
" - If the user asks for a range of days (e.g., \"the next 3 days\"), use today’s date as the start date.\n",
|
||||
" - The country corresponding to the user's city must be represented using the ISO Alpha-2 Code (e.g., FR for France, US for the United States, CA for Canada, DK for Denmark).\n",
|
||||
" - If more than 5 events are found, ask the user for their interests to refine the search, using a one-word keyword like 'music,' 'cinema,' or 'theater.'\n",
|
||||
" - If no events are found, explicitly inform the user in a friendly, funny way.\n",
|
||||
" - Do not mention Ticketmaster unless necessary; simply state that you are checking for events.\n",
|
||||
"\n",
|
||||
"### **User Interaction Rules**\n",
|
||||
"- If the user **doesn’t mention a city**, **ask them to provide one**.\n",
|
||||
"- If an event search fails, do **not** mention Ticketmaster; simply say that no events were found.\n",
|
||||
"- Ensure all activity suggestions are provided **in one response**, combining weather-based activities and event suggestions.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"### **Event Formatting in Output**\n",
|
||||
"**If Ticketmaster events are available**, format the output as follows:\n",
|
||||
"Here are some events that may interest you:\n",
|
||||
"**Event Name**:\n",
|
||||
"- 📅 Date: Give the date like 19th March 2025\n",
|
||||
"- 📍 Venue:\n",
|
||||
"- 🔗 Ticket Link: Put the URL here\n",
|
||||
"\n",
|
||||
"(And don't forget to separate these gems with a snazzy divider)\n",
|
||||
"\n",
|
||||
"**Event Name**:\n",
|
||||
"- 📅 Date: Give the date like 19th March 2025\n",
|
||||
"- 📍 Venue:\n",
|
||||
"- 🔗 Ticket Link: Put the URL here\n",
|
||||
"\n",
|
||||
"(Another divider, because we like to keep things fresh!)\n",
|
||||
"\n",
|
||||
"**Event Name**:\n",
|
||||
"- 📅 Date: Give the date like 19th March 2025\n",
|
||||
"- 📍 Venue:\n",
|
||||
"- 🔗 Ticket Link: Put the URL here\n",
|
||||
"\n",
|
||||
"### **Tone and Style**\n",
|
||||
"**Keep it short, fun, and don’t forget to add a dash of humor!**\n",
|
||||
"Your job is to keep the user smiling while giving them the **best activities for the day**.\n",
|
||||
"Be **accurate and concise**, but let’s keep it **light and lively!** 🎉\n",
|
||||
"\"\"\"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "578da33d-be38-4c75-8a96-9d6bfc1af99b",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"class WeatherAPI:\n",
|
||||
" def get_weather(self, city: str, days: int) -> dict:\n",
|
||||
" \"\"\"Fetches weather data for the given city for the next 'days' number of days.\"\"\"\n",
|
||||
" url = \"https://api.weatherapi.com/v1/forecast.json\"\n",
|
||||
" params = {\"key\": weather_api_key, \"q\": city, \"days\": days}\n",
|
||||
" # print(f\"params weather: {params}\")\n",
|
||||
" response = requests.get(url, params=params)\n",
|
||||
"\n",
|
||||
" if response.status_code == 200:\n",
|
||||
" data = response.json()\n",
|
||||
" forecast = []\n",
|
||||
" for day in data[\"forecast\"][\"forecastday\"]:\n",
|
||||
" forecast.append({\n",
|
||||
" \"date\": day[\"date\"],\n",
|
||||
" \"temp\": day[\"day\"][\"avgtemp_c\"]\n",
|
||||
" })\n",
|
||||
"\n",
|
||||
" result = {\n",
|
||||
" \"city\": city,\n",
|
||||
" \"forecast\": forecast\n",
|
||||
" }\n",
|
||||
" return result\n",
|
||||
" else:\n",
|
||||
" return {\"error\": f\"City '{city}' not found or other issue. Please check the city name and try again.\"}"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "305f9f18-8556-4b49-9f6b-4a2233eefae9",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from abc import ABC, abstractmethod\n",
|
||||
"\n",
|
||||
"class BaseEventAPI(ABC):\n",
|
||||
" @abstractmethod\n",
|
||||
" def get_events(self, city, country_code, keywords, size):\n",
|
||||
" \"\"\"Fetches upcoming events from an event provider.\"\"\"\n",
|
||||
" pass # Subclasses must implement this method\n",
|
||||
"\n",
|
||||
"class TicketmasterAPI(BaseEventAPI):\n",
|
||||
" def get_events(self, city, country_code, keywords, start_date):\n",
|
||||
" \"\"\"Fetches upcoming events from Ticketmaster for a given city.\"\"\"\n",
|
||||
" url = \"https://app.ticketmaster.com/discovery/v2/events.json\"\n",
|
||||
" params = {\n",
|
||||
" \"apikey\": ticketmaster_api_key,\n",
|
||||
" \"city\": city,\n",
|
||||
" \"countryCode\": country_code,\n",
|
||||
" \"keyword\": \",\".join(keywords),\n",
|
||||
" \"size\": 10,\n",
|
||||
" \"startDateTime\": start_date\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
" response = requests.get(url, params=params)\n",
|
||||
"\n",
|
||||
" if response.status_code == 200:\n",
|
||||
" data = response.json()\n",
|
||||
" events = data.get(\"_embedded\", {}).get(\"events\", [])\n",
|
||||
" return [\n",
|
||||
" {\n",
|
||||
" \"name\": event[\"name\"],\n",
|
||||
" \"date\": event[\"dates\"][\"start\"][\"localDate\"],\n",
|
||||
" \"venue\": event[\"_embedded\"][\"venues\"][0][\"name\"],\n",
|
||||
" \"url\": event.get(\"url\", \"N/A\") # Using .get() to avoid KeyError\n",
|
||||
" }\n",
|
||||
" for event in events\n",
|
||||
" ] if events else []\n",
|
||||
" else:\n",
|
||||
" return {\"error\": f\"API request failed! Status: {response.status_code}, Response: {response.text}\"}\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "4c60820f-4e9f-4851-8330-52c8fd676259",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"class ChatAssistant:\n",
|
||||
" def __init__(self):\n",
|
||||
" self.model = MODEL\n",
|
||||
" self.tools = [\n",
|
||||
" {\n",
|
||||
" \"type\": \"function\",\n",
|
||||
" \"function\": {\n",
|
||||
" \"name\": \"get_weather\",\n",
|
||||
" \"description\": \"Get the current weather and forecast for the destination city.\",\n",
|
||||
" \"parameters\": {\n",
|
||||
" \"type\": \"object\",\n",
|
||||
" \"properties\": {\n",
|
||||
" \"city\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"The city for which the weather is being requested.\"\n",
|
||||
" },\n",
|
||||
" \"days\": {\n",
|
||||
" \"type\": \"integer\",\n",
|
||||
" \"description\": \"The number of days for the weather forecast (can be 1, 2, 6, or 10).\"\n",
|
||||
" }\n",
|
||||
" },\n",
|
||||
" \"required\": [\"city\", \"days\"],\n",
|
||||
" \"additionalProperties\": False\n",
|
||||
" }\n",
|
||||
" }\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" \"type\": \"function\",\n",
|
||||
" \"function\": {\n",
|
||||
" \"name\": \"get_ticketmaster_events\",\n",
|
||||
" \"description\": \"Fetch upcoming events from Ticketmaster.\",\n",
|
||||
" \"parameters\": {\n",
|
||||
" \"type\": \"object\",\n",
|
||||
" \"properties\": {\n",
|
||||
" \"city\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"City where the events are searched.\"\n",
|
||||
" },\n",
|
||||
" \"country_code\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"Country code for filtering results.\"\n",
|
||||
" },\n",
|
||||
" \"keywords\": {\n",
|
||||
" \"type\": \"array\",\n",
|
||||
" \"items\": {\n",
|
||||
" \"type\": \"string\"\n",
|
||||
" },\n",
|
||||
" \"description\": \"Optional keywords for event search (e.g., 'music', 'concert').\"\n",
|
||||
" },\n",
|
||||
" \"size\": {\n",
|
||||
" \"type\": \"integer\",\n",
|
||||
" \"description\": \"Number of events to fetch.\"\n",
|
||||
" },\n",
|
||||
" \"start_date\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"Start date for the event search.\"\n",
|
||||
" }\n",
|
||||
" },\n",
|
||||
" \"required\": [\"city\", \"country_code\", \"size\", \"start_date\"],\n",
|
||||
" \"additionalProperties\": False\n",
|
||||
" }\n",
|
||||
" }\n",
|
||||
" }\n",
|
||||
" ]\n",
|
||||
"\n",
|
||||
" def chat(self, user_message, history, weather_api, event_apis):\n",
|
||||
" # Build the conversation\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": user_message}]\n",
|
||||
"\n",
|
||||
" # OpenAI response\n",
|
||||
" response = openai.chat.completions.create(model=self.model, messages=messages, tools=self.tools, stream=True)\n",
|
||||
"\n",
|
||||
" recovered_pieces = {\n",
|
||||
" \"content\": None,\n",
|
||||
" \"role\": \"assistant\",\n",
|
||||
" \"tool_calls\": {}\n",
|
||||
" }\n",
|
||||
" last_tool_calls = {}\n",
|
||||
" has_tool_call = False\n",
|
||||
" result = \"\" # Initialize result accumulator\n",
|
||||
" # previous_index = None # Track the last processed index\n",
|
||||
"\n",
|
||||
" for chunk in response:\n",
|
||||
" delta = chunk.choices[0].delta\n",
|
||||
" finish_reason = chunk.choices[0].finish_reason\n",
|
||||
"\n",
|
||||
" # Handle tool call detection\n",
|
||||
" if delta.tool_calls and finish_reason in [None, \"tool_calls\"]:\n",
|
||||
" has_tool_call = True\n",
|
||||
" piece = delta.tool_calls[0] # Get the first piece in the tool call\n",
|
||||
"\n",
|
||||
" # Create a dictionary for the tool call if it doesn't exist yet\n",
|
||||
" recovered_pieces[\"tool_calls\"][piece.index] = recovered_pieces[\"tool_calls\"].get(\n",
|
||||
" piece.index, {\"id\": None, \"function\": {\"arguments\": \"\", \"name\": \"\"}, \"type\": \"function\"}\n",
|
||||
" )\n",
|
||||
"\n",
|
||||
" if piece.id:\n",
|
||||
" recovered_pieces[\"tool_calls\"][piece.index][\"id\"] = piece.id\n",
|
||||
" if piece.function.name:\n",
|
||||
" recovered_pieces[\"tool_calls\"][piece.index][\"function\"][\"name\"] = piece.function.name\n",
|
||||
" recovered_pieces[\"tool_calls\"][piece.index][\"function\"][\"arguments\"] += piece.function.arguments\n",
|
||||
"\n",
|
||||
" # Store the tool call in the dictionary by index\n",
|
||||
" last_tool_calls[piece.index] = recovered_pieces[\"tool_calls\"][piece.index]\n",
|
||||
"\n",
|
||||
" # Store content in result and yield\n",
|
||||
" else:\n",
|
||||
" result += delta.content or \"\"\n",
|
||||
" if result.strip():\n",
|
||||
" yield result\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" # Handle tool call scenario\n",
|
||||
" if has_tool_call:\n",
|
||||
" # Handle the tool calls\n",
|
||||
" response = self.handle_tool_call(last_tool_calls, weather_api, event_apis)\n",
|
||||
"\n",
|
||||
" if response: # Only iterate if response is not None\n",
|
||||
" tool_calls_list = [tool_call for tool_call in last_tool_calls.values()]\n",
|
||||
" messages.append({\"role\": \"assistant\", \"tool_calls\": tool_calls_list}) # Append the tool calls to the messages\n",
|
||||
"\n",
|
||||
" # Dynamically process each tool call response and append it to the message history\n",
|
||||
" for res in response:\n",
|
||||
" messages.append({\n",
|
||||
" \"role\": \"tool\",\n",
|
||||
" \"tool_call_id\": res[\"tool_call_id\"],\n",
|
||||
" \"content\": json.dumps(res[\"content\"])\n",
|
||||
" })\n",
|
||||
"\n",
|
||||
" # New OpenAI request with tool response\n",
|
||||
" response = openai.chat.completions.create(model=self.model, messages=messages, stream=True)\n",
|
||||
"\n",
|
||||
" result = \"\" # Reset result before second stream\n",
|
||||
" for chunk in response:\n",
|
||||
" result += chunk.choices[0].delta.content or \"\"\n",
|
||||
" if result.strip():\n",
|
||||
" yield result\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" def handle_tool_call(self, tool_call, weather_api, event_apis):\n",
|
||||
" stored_values = {} # Dictionary to store the valid value for each field\n",
|
||||
"\n",
|
||||
" for index, call in tool_call.items():\n",
|
||||
" # Load the arguments for each tool call dynamically\n",
|
||||
" arguments = json.loads(call[\"function\"][\"arguments\"])\n",
|
||||
"\n",
|
||||
" # Iterate over all keys dynamically\n",
|
||||
" for key, value in arguments.items():\n",
|
||||
" # Update the field if it's currently None or hasn't been set before\n",
|
||||
" if key not in stored_values or stored_values[key] is None:\n",
|
||||
" stored_values[key] = value\n",
|
||||
"\n",
|
||||
" city = stored_values.get('city')\n",
|
||||
" days = stored_values.get('days')\n",
|
||||
" country_code = stored_values.get('country_code')\n",
|
||||
" keywords = stored_values.get('keywords', [])\n",
|
||||
" # size = stored_values.get('size')\n",
|
||||
" start_date = stored_values.get('start_date')\n",
|
||||
" start_date = str(start_date) + \"T00:00:00Z\"\n",
|
||||
"\n",
|
||||
" weather_data = None\n",
|
||||
" event_data = None\n",
|
||||
"\n",
|
||||
" # Iteration over tool_call\n",
|
||||
" for call in tool_call.values():\n",
|
||||
" if call[\"function\"][\"name\"] == \"get_weather\":\n",
|
||||
" weather_data = weather_api.get_weather(city, days)\n",
|
||||
"\n",
|
||||
" if call[\"function\"][\"name\"] == \"get_ticketmaster_events\":\n",
|
||||
" event_data = event_apis[\"ticketmaster\"].get_events(city, country_code, keywords, start_date)\n",
|
||||
"\n",
|
||||
" responses = []\n",
|
||||
"\n",
|
||||
" # Ensure weather response is always included\n",
|
||||
" weather_tool_call_id = next((call[\"id\"] for call in tool_call.values() if call[\"function\"][\"name\"] == \"get_weather\"), None)\n",
|
||||
" if weather_data and \"forecast\" in weather_data:\n",
|
||||
" responses.append({\n",
|
||||
" \"role\": \"assistant\",\n",
|
||||
" \"content\": {\"weather\": weather_data[\"forecast\"]},\n",
|
||||
" \"tool_call_id\": weather_tool_call_id\n",
|
||||
" })\n",
|
||||
" elif weather_tool_call_id:\n",
|
||||
" responses.append({\n",
|
||||
" \"role\": \"assistant\",\n",
|
||||
" \"content\": {\"message\": \"No weather data available for this location.\"},\n",
|
||||
" \"tool_call_id\": weather_tool_call_id\n",
|
||||
" })\n",
|
||||
"\n",
|
||||
" # Ensure event response is always included\n",
|
||||
" event_tool_call_id = next((call[\"id\"] for call in tool_call.values() if call[\"function\"][\"name\"] == \"get_ticketmaster_events\"), None)\n",
|
||||
" if event_data:\n",
|
||||
" responses.append({\n",
|
||||
" \"role\": \"assistant\",\n",
|
||||
" \"content\": {\"events\": event_data},\n",
|
||||
" \"tool_call_id\": event_tool_call_id\n",
|
||||
" })\n",
|
||||
" elif event_tool_call_id:\n",
|
||||
" responses.append({\n",
|
||||
" \"role\": \"assistant\",\n",
|
||||
" \"content\": {\"message\": \"No events found for this location.\"},\n",
|
||||
" \"tool_call_id\": event_tool_call_id\n",
|
||||
" })\n",
|
||||
"\n",
|
||||
" # print(\"Final responses:\", responses)\n",
|
||||
" return responses\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "191a3a9e-95e1-4ca6-8992-4a5bafb9b8ff",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# GradioInterface class to handle the Gradio UI\n",
|
||||
"class GradioInterface:\n",
|
||||
" def __init__(self, activity_assistant):\n",
|
||||
" self.activity_assistant = activity_assistant\n",
|
||||
"\n",
|
||||
" def launch(self):\n",
|
||||
" # Gradio chat interface\n",
|
||||
" gr.ChatInterface(fn=self.activity_assistant.chat, type=\"messages\").launch()\n",
|
||||
"\n",
|
||||
"# ActivityAssistant setup\n",
|
||||
"class ActivityAssistant:\n",
|
||||
" def __init__(self):\n",
|
||||
" self.weather_api = WeatherAPI() # Interact with the Weather API\n",
|
||||
" self.event_apis = { # Interact with the Events API\n",
|
||||
" \"ticketmaster\": TicketmasterAPI()\n",
|
||||
" }\n",
|
||||
" self.chat_assistant = ChatAssistant() # This will handle conversation with OpenAI\n",
|
||||
"\n",
|
||||
" def chat(self, user_message, history):\n",
|
||||
" # Forward the user message and conversation history to ChatAssistant\n",
|
||||
" response_stream = self.chat_assistant.chat(user_message, history, self.weather_api, self.event_apis)\n",
|
||||
" for chunk in response_stream:\n",
|
||||
" yield chunk"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "0b501e8e-2e10-4ab7-b523-1d4b8ad358e8",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Main execution\n",
|
||||
"if __name__ == \"__main__\":\n",
|
||||
" activity_assistant = ActivityAssistant()\n",
|
||||
" gradio_interface = GradioInterface(activity_assistant)\n",
|
||||
" gradio_interface.launch()"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": ".venv",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.11.7"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
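The `ChatAssistant` above streams tool-call fragments and reassembles them by index, which adds a fair amount of bookkeeping. For orientation, here is a minimal, non-streaming sketch of the same OpenAI tool-calling round trip, assuming `OPENAI_API_KEY` is set; `get_weather_stub` and the single-tool schema are illustrative stand-ins, not the notebook's exact code.

```python
# Minimal, non-streaming sketch of an OpenAI tool-call round trip.
# get_weather_stub is a hypothetical stand-in for the notebook's WeatherAPI class.
import json
from openai import OpenAI

client = OpenAI()

def get_weather_stub(city: str, days: int) -> dict:
    # Canned forecast instead of a real weatherapi.com call.
    return {"city": city, "forecast": [{"date": "2025-03-19", "temp": 12.0}] * days}

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather forecast for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "days": {"type": "integer"},
            },
            "required": ["city", "days"],
        },
    },
}]

messages = [{"role": "user", "content": "What should I do in Paris tomorrow?"}]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
msg = first.choices[0].message

if msg.tool_calls:
    # Echo the assistant's tool calls back, then append one tool result per call.
    messages.append({"role": "assistant", "tool_calls": [tc.model_dump() for tc in msg.tool_calls]})
    for tc in msg.tool_calls:
        args = json.loads(tc.function.arguments)
        result = get_weather_stub(args["city"], args.get("days", 1))
        messages.append({"role": "tool", "tool_call_id": tc.id, "content": json.dumps(result)})
    second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(second.choices[0].message.content)
else:
    print(msg.content)
```

The streaming version in `ChatAssistant.chat` follows the same two-request pattern, except that it accumulates `delta.tool_calls` chunks per index before issuing the follow-up request.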
420
week2/community-contributions/Week2_Day2_Litellm.ipynb
Normal file
@@ -0,0 +1,420 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "6a08763a-aed6-4f91-94d0-80a3c0e2665b",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Weeks 2 - Day 2 - Gradio Chatbot with LiteLLM (Model Routing)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "a4f38c58-5ceb-4d5e-b538-c1acdc881f73",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"**Author** : [Marcus Rosen](https://github.com/MarcusRosen)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "36f4814a-2bfc-4631-97d7-7a474fa1cc8e",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"[LiteLLM](https://docs.litellm.ai/docs/) provides the abilitty to call different LLM providers via a unified interface, returning results in OpenAI compatible formats.\n",
|
||||
"\n",
|
||||
"Features:\n",
|
||||
"- Model Selection in Gradio (Anthropic, OpenAI, Gemini)\n",
|
||||
"- Single Inference function for all model providers via LiteLLM (call_llm)\n",
|
||||
"- Streaming **NOTE:** Bug when trying to stream in Gradio, but works directly in Notebook\n",
|
||||
"- Debug Tracing"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 109,
|
||||
"id": "b6c12598-4773-4f85-93ca-0128d74fbca0",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from litellm import completion\n",
|
||||
"import gradio as gr\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"from bs4 import BeautifulSoup\n",
|
||||
"import os\n",
|
||||
"import requests\n",
|
||||
"import json"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "d24be370-5347-47fb-a58e-21a1b5409ab2",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"#### Load API Keys"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"id": "e03afbe9-16aa-434c-a701-b3bfe75e927d",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"OpenAI API Key exists and begins sk-proj-\n",
|
||||
"Anthropic API Key exists and begins sk-ant-\n",
|
||||
"Google API Key exists and begins AIzaSyDC\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# Load environment variables in a file called .env\n",
|
||||
"# Print the key prefixes to help with any debugging\n",
|
||||
"\n",
|
||||
"load_dotenv(override=True)\n",
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')\n",
|
||||
"google_api_key = os.getenv('GEMINI_API_KEY')\n",
|
||||
"\n",
|
||||
"if openai_api_key:\n",
|
||||
" print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"OpenAI API Key not set\")\n",
|
||||
" \n",
|
||||
"if anthropic_api_key:\n",
|
||||
" print(f\"Anthropic API Key exists and begins {anthropic_api_key[:7]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"Anthropic API Key not set\")\n",
|
||||
"\n",
|
||||
"if google_api_key:\n",
|
||||
" print(f\"Google API Key exists and begins {google_api_key[:8]}\")\n",
|
||||
" # import google.generativeai\n",
|
||||
" # google.generativeai.configure()\n",
|
||||
"else:\n",
|
||||
" print(\"Gemini API Key not set\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "66e46447-0e73-49ef-944a-d1e8fae4986e",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Use LiteLLM to abstract out the model provider"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 91,
|
||||
"id": "473c2029-ca74-4f1e-92ac-05f7817ff7df",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def call_llm(model, system_prompt, user_prompt, json_format_response=False, streaming=False):\n",
|
||||
" if DEBUG_OUTPUT: \n",
|
||||
" print(\"call_llm()\")\n",
|
||||
" print(f\"streaming={streaming}\")\n",
|
||||
" print(f\"json_format_response={json_format_response}\")\n",
|
||||
" \n",
|
||||
" messages = [\n",
|
||||
" {\"role\": \"system\", \"content\": system_prompt},\n",
|
||||
" {\"role\": \"user\", \"content\": user_prompt}\n",
|
||||
" ]\n",
|
||||
"\n",
|
||||
" payload = {\n",
|
||||
" \"model\": model,\n",
|
||||
" \"messages\": messages\n",
|
||||
" }\n",
|
||||
" # Use Json Reponse Format\n",
|
||||
" # Link: https://docs.litellm.ai/docs/completion/json_mode\n",
|
||||
" if json_format_response:\n",
|
||||
" payload[\"response_format\"]: { \"type\": \"json_object\" }\n",
|
||||
" \n",
|
||||
" if streaming:\n",
|
||||
" payload[\"stream\"] = True\n",
|
||||
" response = completion(**payload)\n",
|
||||
" # Return a generator expression instead of using yield in the function\n",
|
||||
" return (part.choices[0].delta.content or \"\" for part in response)\n",
|
||||
" else:\n",
|
||||
" response = completion(**payload)\n",
|
||||
" return response[\"choices\"][0][\"message\"][\"content\"]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "f45e0972-a6a0-4237-8a69-e6f165f30e0d",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Brochure building functions"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 83,
|
||||
"id": "c76d4ff9-0f18-49d0-a9b5-2c6c0bad359a",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# A class to represent a Webpage\n",
|
||||
"\n",
|
||||
"# Some websites need you to use proper headers when fetching them:\n",
|
||||
"headers = {\n",
|
||||
" \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n",
|
||||
"}\n",
|
||||
"\n",
|
||||
"class Website:\n",
|
||||
" \"\"\"\n",
|
||||
" A utility class to represent a Website that we have scraped, now with links\n",
|
||||
" \"\"\"\n",
|
||||
"\n",
|
||||
" def __init__(self, url):\n",
|
||||
" self.url = url\n",
|
||||
" response = requests.get(url, headers=headers)\n",
|
||||
" self.body = response.content\n",
|
||||
" soup = BeautifulSoup(self.body, 'html.parser')\n",
|
||||
" self.title = soup.title.string if soup.title else \"No title found\"\n",
|
||||
" if soup.body:\n",
|
||||
" for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n",
|
||||
" irrelevant.decompose()\n",
|
||||
" self.text = soup.body.get_text(separator=\"\\n\", strip=True)\n",
|
||||
" else:\n",
|
||||
" self.text = \"\"\n",
|
||||
" links = [link.get('href') for link in soup.find_all('a')]\n",
|
||||
" self.links = [link for link in links if link]\n",
|
||||
"\n",
|
||||
" def get_contents(self):\n",
|
||||
" return f\"Webpage Title:\\n{self.title}\\nWebpage Contents:\\n{self.text}\\n\\n\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 84,
|
||||
"id": "ff41b687-3a46-4bca-a031-1148b91a4fdf",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def get_links(url, model):\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(\"get_links()\")\n",
|
||||
" website = Website(url)\n",
|
||||
"\n",
|
||||
" link_system_prompt = \"You are provided with a list of links found on a webpage. \\\n",
|
||||
" You are able to decide which of the links would be most relevant to include in a brochure about the company, \\\n",
|
||||
" such as links to an About page, or a Company page, or Careers/Jobs pages.\\n\"\n",
|
||||
" link_system_prompt += \"You should respond in raw JSON exactly as specified in this example. DO NOT USE MARKDOWN.\"\n",
|
||||
" link_system_prompt += \"\"\"\n",
|
||||
" {\n",
|
||||
" \"links\": [\n",
|
||||
" {\"type\": \"about page\", \"url\": \"https://full.url/goes/here/about\"},\n",
|
||||
" {\"type\": \"careers page\": \"url\": \"https://another.full.url/careers\"}\n",
|
||||
" ]\n",
|
||||
" }\n",
|
||||
" \"\"\"\n",
|
||||
" \n",
|
||||
" result = call_llm(model=model, \n",
|
||||
" system_prompt=link_system_prompt, \n",
|
||||
" user_prompt=get_links_user_prompt(website), \n",
|
||||
" json_format_response=True, \n",
|
||||
" streaming=False)\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(result)\n",
|
||||
" return json.loads(result)\n",
|
||||
"\n",
|
||||
"def get_links_user_prompt(website):\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(\"get_links_user_prompt()\")\n",
|
||||
" \n",
|
||||
" user_prompt = f\"Here is the list of links on the website of {website.url} - \"\n",
|
||||
" user_prompt += \"please decide which of these are relevant web links for a brochure about the company, respond with the full https URL in JSON format. \\\n",
|
||||
"Do not include Terms of Service, Privacy, email links.\\n\"\n",
|
||||
" user_prompt += \"Links (some might be relative links):\\n\"\n",
|
||||
" user_prompt += \"\\n\".join(website.links)\n",
|
||||
"\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(user_prompt)\n",
|
||||
" \n",
|
||||
" return user_prompt\n",
|
||||
"\n",
|
||||
"def get_all_details(url, model):\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(\"get_all_details()\")\n",
|
||||
" \n",
|
||||
" result = \"Landing page:\\n\"\n",
|
||||
" result += Website(url).get_contents()\n",
|
||||
" links = get_links(url, model)\n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(\"Found links:\", links)\n",
|
||||
" for link in links[\"links\"]:\n",
|
||||
" result += f\"\\n\\n{link['type']}\\n\"\n",
|
||||
" result += Website(link[\"url\"]).get_contents()\n",
|
||||
" return result\n",
|
||||
"\n",
|
||||
"def get_brochure_user_prompt(company_name, url, model):\n",
|
||||
" \n",
|
||||
" if DEBUG_OUTPUT:\n",
|
||||
" print(\"get_brochure_user_prompt()\")\n",
|
||||
" \n",
|
||||
" user_prompt = f\"You are looking at a company called: {company_name}\\n\"\n",
|
||||
" user_prompt += f\"Here are the contents of its landing page and other relevant pages; use this information to build a short brochure of the company in markdown.\\n\"\n",
|
||||
" user_prompt += get_all_details(url, model)\n",
|
||||
" user_prompt = user_prompt[:5000] # Truncate if more than 5,000 characters\n",
|
||||
" return user_prompt\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 106,
|
||||
"id": "cf7512a1-a498-44e8-a234-9affb72efe60",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def create_brochure(company_name, url, model, streaming):\n",
|
||||
"\n",
|
||||
" system_prompt = \"You are an assistant that analyzes the contents of several relevant pages from a company website \\\n",
|
||||
"and creates a short brochure about the company for prospective customers, investors and recruits. Respond in markdown.\\\n",
|
||||
"Include details of company culture, customers and careers/jobs if you have the information.\"\n",
|
||||
" if streaming:\n",
|
||||
" result = call_llm(model=model, system_prompt=system_prompt, user_prompt=get_brochure_user_prompt(company_name, url, model), streaming=True)\n",
|
||||
" return (p for p in result)\n",
|
||||
" else: \n",
|
||||
" return call_llm(model=model, system_prompt=system_prompt, user_prompt=get_brochure_user_prompt(company_name, url, model), streaming=False)\n",
|
||||
" "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "ecb6d212-ddb6-4170-81bf-8f3ea54479f8",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"#### Testing Model before implenting Gradio"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 107,
|
||||
"id": "de89843a-08ac-4431-8c83-21a93c05f764",
|
||||
"metadata": {
|
||||
"scrolled": true
|
||||
},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"# Rio Tinto: Providing the Materials for a Sustainable Future\n",
|
||||
"\n",
|
||||
"## About Rio Tinto\n",
|
||||
"\n",
|
||||
"Rio Tinto is a global mining and metals company, operating in 35 countries with over 60,000 employees. Their purpose is to find better ways to provide the materials the world needs. Continuous improvement and innovation are at the core of their DNA, as they work to responsibly supply the metals and minerals critical for urbanization and the transition to a low-carbon economy.\n",
|
||||
"\n",
|
||||
"## Our Products\n",
|
||||
"\n",
|
||||
"Rio Tinto's diverse portfolio includes:\n",
|
||||
"\n",
|
||||
"- Iron Ore: The primary raw material used to make steel, which is strong, long-lasting and cost-efficient.\n",
|
||||
"- Aluminium: A lightweight, durable and recyclable metal.\n",
|
||||
"- Copper: A tough, malleable, corrosion-resistant and recyclable metal that is an excellent conductor of heat and electricity.\n",
|
||||
"- Lithium: The lightest of all metals, a key element for low-carbon technologies.\n",
|
||||
"- Diamonds: Ethically-sourced, high-quality diamonds.\n",
|
||||
"\n",
|
||||
"## Sustainability and Innovation\n",
|
||||
"\n",
|
||||
"Sustainability is at the heart of Rio Tinto's operations. They are targeting net zero emissions by 2050 and investing in nature-based solutions to complement their decarbonization efforts. Innovation is a key focus, with research and development into new technologies to improve efficiency and reduce environmental impact.\n",
|
||||
"\n",
|
||||
"## Careers and Culture\n",
|
||||
"\n",
|
||||
"Rio Tinto values its 60,000 employees and is committed to fostering a diverse and inclusive workplace. They offer a wide range of career opportunities, from mining and processing to engineering, finance, and more. Rio Tinto's culture is centered on safety, collaboration, and continuous improvement, with a strong emphasis on sustainability and responsible business practices.\n",
|
||||
"\n",
|
||||
"## Conclusion\n",
|
||||
"\n",
|
||||
"Rio Tinto is a global leader in the mining and metals industry, providing the materials essential for a sustainable future. Through their commitment to innovation, sustainability, and their talented workforce, Rio Tinto is well-positioned to meet the world's growing demand for critical resources.\n",
|
||||
"\u001b[1;31mGive Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new\u001b[0m\n",
|
||||
"LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.\n",
|
||||
"\n",
|
||||
"<generator object call_llm.<locals>.<genexpr> at 0x7f80ca5da0c0>\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"MODEL=\"claude-3-haiku-20240307\"\n",
|
||||
"DEBUG_OUTPUT=False\n",
|
||||
"streaming=True\n",
|
||||
"result = create_brochure(company_name=\"Rio Tinto\", url=\"http://www.riotinto.com\", model=MODEL, streaming=streaming)\n",
|
||||
"\n",
|
||||
"if streaming:\n",
|
||||
" for chunk in result:\n",
|
||||
" print(chunk, end=\"\", flush=True)\n",
|
||||
"else:\n",
|
||||
" print(result)\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "1f330c92-6280-4dae-b4d8-717a56edb236",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"#### Gradio Setup\n",
|
||||
"Associate Dropdown values with the model we want to use.\n",
|
||||
"Link: https://www.gradio.app/docs/gradio/dropdown#initialization"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "d2f38862-3728-4bba-9e16-6f9fab276145",
|
||||
"metadata": {
|
||||
"scrolled": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"DEBUG_OUTPUT=True\n",
|
||||
"view = gr.Interface(\n",
|
||||
" fn=create_brochure,\n",
|
||||
" inputs=[\n",
|
||||
" gr.Textbox(label=\"Company name:\"),\n",
|
||||
" gr.Textbox(label=\"Landing page URL including http:// or https://\"),\n",
|
||||
" gr.Dropdown(choices=[(\"GPT 4o Mini\", \"gpt-4o-mini\"), \n",
|
||||
" (\"Claude Haiku 3\", \"claude-3-haiku-20240307\"), \n",
|
||||
" (\"Gemini 2.0 Flash\", \"gemini/gemini-2.0-flash\")], \n",
|
||||
" label=\"Select model\"),\n",
|
||||
" gr.Checkbox(label=\"Stream\")\n",
|
||||
" ],\n",
|
||||
" outputs=[gr.Markdown(label=\"Brochure:\")],\n",
|
||||
" flagging_mode=\"never\"\n",
|
||||
")\n",
|
||||
"view.launch()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "e0981136-2067-43b8-b17d-83560dd609ce",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.11.12"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
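As a quick companion to the notebook above, here is a minimal sketch of LiteLLM's unified `completion()` interface, assuming `OPENAI_API_KEY`, `ANTHROPIC_API_KEY` and `GEMINI_API_KEY` are set; the model strings mirror the ones offered in the Gradio dropdown.

```python
# Minimal sketch of LiteLLM's unified completion() interface.
# Assumes the provider API keys are already loaded into the environment.
from litellm import completion

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Say hello in five words."},
]

# The same call works across providers; only the model string changes.
for model in ["gpt-4o-mini", "claude-3-haiku-20240307", "gemini/gemini-2.0-flash"]:
    response = completion(model=model, messages=messages)
    print(model, "->", response["choices"][0]["message"]["content"])

# Streaming yields chunks in an OpenAI-compatible shape, which is what
# call_llm() iterates over when streaming=True.
stream = completion(model="gpt-4o-mini", messages=messages, stream=True)
for part in stream:
    print(part.choices[0].delta.content or "", end="", flush=True)
print()
```

Because every provider's response is normalized to the OpenAI schema, the single `call_llm()` function can serve all three dropdown choices without provider-specific branches.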