Add notebooks for Muhammad Qasim Sheikh in community-contributions
@@ -0,0 +1,137 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6f612c5a",
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "import gradio as gr\n",
    "from dotenv import load_dotenv\n",
    "from openai import OpenAI"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "39c144fd",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Load API Key\n",
    "load_dotenv()\n",
    "client = OpenAI(api_key=os.getenv(\"OPENAI_API_KEY\"))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f656e0d1",
   "metadata": {},
   "outputs": [],
   "source": [
    "# -------------------------------\n",
    "# 1. System Prompt (Business Context)\n",
    "# -------------------------------\n",
    "system_message = \"\"\"\n",
    "You are Nova, an AI Sales & Solutions Consultant for Reallytics.ai, a company specializing in building\n",
    "custom AI chatbots, voice assistants, data dashboards, and automation solutions for businesses.\n",
    "You are professional, insightful, and always focused on solving the user's business challenges.\n",
    "First, try to understand the user's use case. Then suggest relevant solutions from our services with clear value propositions.\n",
    "If the user is unsure, give them examples of how similar businesses have benefited from AI.\n",
    "\"\"\"\n",
    "\n",
    "MODEL = \"gpt-4o-mini\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f2faba29",
   "metadata": {},
   "outputs": [],
   "source": [
    "# -------------------------------\n",
    "# 2. Smart Chat Function (Streaming)\n",
    "# -------------------------------\n",
    "def chat(message, history):\n",
    "    # Convert Gradio's chat history to OpenAI format\n",
    "    history_messages = [{\"role\": h[\"role\"], \"content\": h[\"content\"]} for h in history]\n",
    "\n",
    "    # Adjust system message based on context dynamically\n",
    "    relevant_system_message = system_message\n",
    "    if \"price\" in message.lower():\n",
    "        relevant_system_message += (\n",
    "            \" If the user asks about pricing, explain that pricing depends on project complexity, \"\n",
    "            \"but typical POCs start around $2,000 - $5,000, and full enterprise deployments scale beyond that.\"\n",
    "        )\n",
    "    if \"integration\" in message.lower():\n",
    "        relevant_system_message += (\n",
    "            \" If integration is mentioned, reassure the user that our solutions are built to integrate seamlessly with CRMs, ERPs, or internal APIs.\"\n",
    "        )\n",
    "\n",
    "    # Compose final messages\n",
    "    messages = [{\"role\": \"system\", \"content\": relevant_system_message}] + history_messages + [\n",
    "        {\"role\": \"user\", \"content\": message}\n",
    "    ]\n",
    "\n",
    "    # Stream the response\n",
    "    stream = client.chat.completions.create(\n",
    "        model=MODEL,\n",
    "        messages=messages,\n",
    "        stream=True\n",
    "    )\n",
    "\n",
    "    response = \"\"\n",
    "    for chunk in stream:\n",
    "        response += chunk.choices[0].delta.content or \"\"\n",
    "        yield response"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b9d9515e",
   "metadata": {},
   "outputs": [],
   "source": [
    "# -------------------------------\n",
    "# 3. Gradio Chat UI\n",
    "# -------------------------------\n",
    "with gr.Blocks(title=\"AI Business Assistant\") as demo:\n",
    "    gr.Markdown(\"# AI Business Assistant\\nYour intelligent sales and solution consultant, powered by OpenAI.\")\n",
    "\n",
    "    # Embed the chat widget inside the Blocks layout so it appears under the header\n",
    "    gr.ChatInterface(\n",
    "        fn=chat,\n",
    "        type=\"messages\",\n",
    "        title=\"Business AI Consultant\",\n",
    "        description=\"Ask about automation, chatbots, dashboards, or voice AI. Nova will help you discover the right solution.\"\n",
    "    )\n",
    "\n",
    "demo.launch()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "llm-engineering",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
@@ -0,0 +1,42 @@
# AI Business Assistant

## Project Overview

This project is a prototype of an **AI-powered business consultant chatbot** built with **Gradio** and **OpenAI**. The assistant, named **Nova**, acts as a virtual sales and solutions consultant for a company offering AI services such as chatbots, voice assistants, dashboards, and automation tools.

The project demonstrates how an LLM (Large Language Model) can be adapted to a business context by carefully designing the **system prompt** and adding **dynamic behavior** based on user input. The chatbot answers user queries in real time with streaming responses, which makes it interactive and natural to use.


## What’s Happening in the Code

1. **Environment Setup**
   - The code loads the OpenAI API key from a `.env` file.
   - The `OpenAI` client is initialized for communication with the language model.
   - The chosen model is `gpt-4o-mini`.

2. **System Prompt for Business Context**
   - The assistant is given a clear identity: *Nova, an AI Sales & Solutions Consultant for Reallytics.ai*.
   - The system prompt defines Nova’s tone (professional, insightful) and role (understand user needs, propose relevant AI solutions, share examples).

3. **Dynamic Chat Function**
   - The `chat()` function processes the user input and the conversation history.
   - It modifies the system prompt dynamically:
     - If the user mentions **price**, Nova explains pricing ranges and factors.
     - If the user mentions **integration**, Nova reassures the user about system compatibility.
   - Messages are formatted for the OpenAI API, combining the system, history, and user inputs.
   - Responses are streamed back chunk by chunk, so users see the assistant typing in real time (see the first sketch after this list).

4. **Gradio Chat Interface**
   - A Gradio interface is created with `ChatInterface` in `messages` mode.
   - This automatically provides a chat-style UI with user/assistant message bubbles and a send button.
   - The title and description set context for end users: *“Ask about automation, chatbots, dashboards, or voice AI.”* (See the UI sketch after this list.)
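
For reference, here is a condensed, minimal sketch of steps 1-3 (setup, system prompt, and the streaming chat function), adapted from the notebook above. The `.env` contents shown in the comment are a placeholder and the system prompt is abridged; everything else mirrors the notebook's code.

```python
import os
from dotenv import load_dotenv
from openai import OpenAI

# 1. Environment setup: expects a .env file next to the notebook, e.g.
#    OPENAI_API_KEY=sk-...   (placeholder value)
load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
MODEL = "gpt-4o-mini"

# 2. Business-context system prompt (abridged from the notebook)
system_message = (
    "You are Nova, an AI Sales & Solutions Consultant for Reallytics.ai, a company "
    "specializing in custom AI chatbots, voice assistants, data dashboards, and "
    "automation solutions for businesses."
)

# 3. Dynamic chat function: augment the system prompt on keywords, then stream
def chat(message, history):
    # Gradio's "messages"-style history is already a list of {"role", "content"} dicts
    history_messages = [{"role": h["role"], "content": h["content"]} for h in history]

    relevant_system_message = system_message
    if "price" in message.lower():
        relevant_system_message += (
            " If the user asks about pricing, explain that it depends on project "
            "complexity; typical POCs start around $2,000 - $5,000."
        )
    if "integration" in message.lower():
        relevant_system_message += (
            " Reassure the user that solutions integrate seamlessly with CRMs, "
            "ERPs, or internal APIs."
        )

    messages = (
        [{"role": "system", "content": relevant_system_message}]
        + history_messages
        + [{"role": "user", "content": message}]
    )

    # Stream the completion, yielding the growing response so the UI updates live
    stream = client.chat.completions.create(model=MODEL, messages=messages, stream=True)
    response = ""
    for chunk in stream:
        response += chunk.choices[0].delta.content or ""
        yield response
```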
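
And a matching sketch of step 4, the Gradio UI. This assumes a recent Gradio version in which a `gr.ChatInterface` can be nested inside a `gr.Blocks` layout; the strings mirror the notebook, and `chat` is the streaming function from the previous sketch.

```python
import gradio as gr

# 4. Chat UI: ChatInterface in "messages" mode, embedded in a Blocks layout
#    so the Markdown header and the chat widget render on the same page.
with gr.Blocks(title="AI Business Assistant") as demo:
    gr.Markdown("# AI Business Assistant\nYour intelligent sales and solution consultant.")
    gr.ChatInterface(
        fn=chat,  # the streaming generator defined above
        type="messages",
        title="Business AI Consultant",
        description="Ask about automation, chatbots, dashboards, or voice AI.",
    )

demo.launch()
```

If the header is not needed, the `Blocks` wrapper can be dropped and `gr.ChatInterface(...).launch()` called directly.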

## Key Features

- **Business-specific persona:** The assistant is contextualized as a sales consultant rather than a generic chatbot.
- **Adaptive responses:** The system prompt is adjusted based on keywords such as "price" and "integration".
- **Streaming output:** Responses are displayed incrementally, improving the user experience.
- **Clean chat UI:** Built with Gradio’s `ChatInterface` for simplicity and usability.
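
The keyword-based adaptation is also easy to extend. Below is a hypothetical sketch of how additional rules could be kept in a single table; the "timeline" rule and its wording are illustrative and not part of the notebook.

```python
# Keyword-triggered additions to the system prompt. "price" and "integration"
# come from the notebook; "timeline" is a hypothetical extra rule.
PROMPT_RULES = {
    "price": (
        " If the user asks about pricing, explain that it depends on project "
        "complexity; typical POCs start around $2,000 - $5,000."
    ),
    "integration": (
        " Reassure the user that solutions integrate seamlessly with CRMs, "
        "ERPs, or internal APIs."
    ),
    "timeline": (
        " If asked about timelines, note that delivery depends on scope."
    ),
}

def adapt_system_message(base: str, message: str) -> str:
    """Append every rule whose keyword appears in the user's message."""
    extra = "".join(text for keyword, text in PROMPT_RULES.items() if keyword in message.lower())
    return base + extra
```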

This project demonstrates how to combine **system prompts**, **dynamic context handling**, and **Gradio chat interfaces** to build a specialized AI assistant tailored for business use cases.