Merge branch 'main' of github.com:ed-donner/llm_engineering
@@ -0,0 +1,249 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "bdf05e1e-41a4-409d-b42e-e749f5fcfc02",
"metadata": {},
"source": [
"# Conversation Output:"
]
},
{
"cell_type": "markdown",
"id": "019ad260-3989-4f42-9daa-10f1a6830ab2",
"metadata": {},
"source": [
"Streetkid:\n",
"Howdy!\n",
"\n",
"Corpo:\n",
"Hi.\n",
"\n",
"Nomad:\n",
"Thanks for having me.\n",
"\n",
"Streetkid:\n",
"Alright, let’s get this rollin’. Dogtown going independent? It’s a bold move, no doubt. From where I’m standin’, it means a lot for the streets. Gives the locals control over their turf, keeps corpos and other suits from pokin’ around all the time. But it also means they gotta watch their backs harder—no big corpo to call if things go south. What’s your take on it? Corp folks probably ain’t too thrilled, huh?\n",
"\n",
"Corpo:\n",
"*adjusts corporate blazer, leans forward with a calculated smile*\n",
"\n",
"Independence? Let's be clear—Dogtown's \"freedom\" is just another power play. From a corporate perspective, this isn't about liberation; it's a calculated restructuring of local economic dynamics. While the street level might see this as a victory, what I see is a strategic reallocation of risk and potential market opportunity.\n",
"\n",
"The power vacuum creates instability, which means potential investment vectors for forward-thinking corporations. Sure, we're not directly controlling the territory anymore, but that doesn't mean we're out of the game. Dogtown will need infrastructure, technology, supply chains. And who's best positioned to provide those? *raises an eyebrow*\n",
"\n",
"We don't need direct territorial control when we can operate through strategic partnerships and carefully negotiated economic channels. This \"independence\" might just be another form of corporate dependency.\n",
"\n",
"*sits back, waiting for the Nomad's response*\n",
"\n",
"Nomad:\n",
"*I shift in my seat, a slight scowl on my face, looking at the Corpo like he's speaking a foreign language – which, in a way, he is.*\n",
"\n",
"\"Strategic reallocation of risk,\" \"investment vectors,\" \"corporate dependency\"... Choom, you talk about Dogtown like it's a line on a spreadsheet, not a place where people are trying to live, trying to breathe without a corporate boot on their neck.\n",
"\n",
"\"Independence might just be another form of corporate dependency.\" That's the oldest trick in your book, isn't it? Offer a hand, then take the whole arm. Nomads know all about that. We've seen your \"help\" from outside the city walls. It always comes with strings, and those strings always pull you back into your system.\n",
"\n",
"What I see in Dogtown isn't just a \"power play.\" I see people, however messed up they are, fighting to call their own shots. To make their own rules, even if those rules are brutal. At least they're *theirs*. It's not some sterile, controlled corporate zone where everything is dictated by profit margins.\n",
"\n",
"Yeah, they'll need 'infrastructure' and 'supply chains.' But maybe, just maybe, they'll find a way to build it themselves, or with people who actually care about the territory, not just the next quarterly report. Because the moment you let a corp's \"investment\" in, you stop being independent. You become another one of their assets. And if there's one thing a Nomad understands, it's that true freedom ain't cheap, and it sure as hell ain't found on a corpo's balance sheet.\n",
"\n",
"Streetkid:\n",
"Heh, you both makin' solid points, for sure. Corpo, I gotta admit, your angle on the “strategic partnerships” isn’t wrong. Dogtown’s gotta keep its gears greased somehow, and sometimes that means wheelin’ and dealin’ with whoever’s got the creds. But Nomad—hear that spirit loud and clear. \n",
"\n",
"See, the streets ain’t just about numbers and contracts—they’re about people, history, scars, and swagger. Dogtown’s independence shakes up the usual grind, breaks that cycle of corpo puppetry. Yeah, it’s risky, and yeah, the boot could still be lurkin’, but now it’s their fight to pick how it lands.\n",
"\n",
"Best bet? Dogtown’s gotta stay sharp, play the game smart, know when to close a deal and when to throw the table. No one’s gonna hand freedom easy, but if they keep their eyes open, maybe Dogtown’s not just another asset—maybe it’s the blueprint for street-level autonomy in a city that’s been blackout on real independence for too long. What do you both think—can this mix of grit and game really hold up in the long haul?\n",
"\n",
"Corpo:\n",
"*leans forward, a sharp, calculated glint in my eye*\n",
"\n",
"Sustainability isn't just about resistance—it's about strategic adaptation. Dogtown's survival depends on understanding that independence isn't a static state, but a dynamic negotiation. My corporate perspective sees potential where others see obstacles.\n",
"\n",
"Their true power isn't in rejecting all corporate influence, but in becoming sophisticated economic actors. They'll need to develop robust governance, create attractive investment environments, and position themselves as a unique economic zone. It's not about fighting the system—it's about playing it more intelligently than the system expects.\n",
"\n",
"The street-level autonomy you're romanticizing? That's a nice narrative. But real power comes from creating economic leverage. Dogtown's leadership understands this. They're not just building walls; they're building economic infrastructure that makes them impossible to ignore and difficult to subjugate.\n",
"\n",
"*gives a razor-thin smile*\n",
"\n",
"True independence isn't about purity. It's about being so strategically valuable that no one can afford to crush you.\n",
"\n",
"Nomad:\n",
"*I let out a short, humorless chuckle, shaking my head slightly.*\n",
"\n",
"\"Strategic adaptation,\" \"sophisticated economic actors\"... You say that like the only path to survival is to become a smaller, uglier version of what you already are. That's not adaptation, choom, that's assimilation. That's taking the wolf's advice on how to tend your sheep.\n",
"\n",
"\"Dogtown's leadership understands this,\" you say. Maybe. But what they understand might be the *only* way they've ever seen power work – through the lens of Night City, through the lens of corporations. They're building walls, yeah, but if they're building \"economic infrastructure\" designed to be \"strategically valuable\" to *you*, then those walls are just keeping *your* competition out, not keeping *you* out.\n",
"\n",
"You think true power comes from \"economic leverage\"? From being so valuable that no one can afford to crush you? I've seen what happens to \"valuable assets\" when they're no longer convenient. They get exploited, bought out, or simply eliminated when their utility runs its course. That's not independence; that's just a more elaborate cage with a slightly bigger feeding trough.\n",
"\n",
"Real independence, the kind Nomads understand, isn't about making yourself indispensable to the corporate system. It's about being able to survive *without* it. It's about self-sufficiency, about trusting your own people, your own skills, and knowing the land – or, in Dogtown's case, knowing their own turf, their own black market, their own unique ways of making do. It's about having something valuable enough to defend, but not so valuable it makes you a target for every suit and corpo looking for an \"investment opportunity.\"\n",
"\n",
"You talk about \"purity\" like it's a weakness. For us, purity of intent – the intent to be free from your bullshit – that's a strength. Because the moment Dogtown tries to play your game, to become \"sophisticated economic actors\" in your system, they stop being independent. They become another satellite, another resource to be managed. And we both know how that story ends.\n",
"\n",
"Streetkid:\n",
"You bring up a hell of a point, Nomad—there’s a real danger in chasing corpo validation, trading one cage for another painted in glossy chrome. Dogtown’s gotta keep that purity alive, that fierce hunger for self-reliance, ‘cause once you start craving corpo approval, you’re already halfway inside their game. \n",
"\n",
"But, Corpo, I get where you’re coming from—survival means knowing every angle and playing smart. Dogtown can’t just barricade itself in and hope all’s good. The city’s a beast that eats the unaware alive. Maybe the trick is for Dogtown to walk that razor’s edge—holding onto raw, street-level grit while learning to speak the corpo language when it counts, without letting it stain their soul.\n",
"\n",
"So, the real question: can a place like Dogtown keep its edge, stay true to its roots, and yet still negotiate with the suits without losing itself? Or is this just another doomed balancing act in a city built on chaos? This streetkid’s betting on grit—but I wanna hear where you both land on the edge of that knife.\n",
"\n",
"Corpo:\n",
"*a practiced corporate smile, neither fully warm nor cold*\n",
"\n",
"Balance isn't just possible—it's necessary. Dogtown's survival hinges on maintaining its cultural authenticity while developing strategic economic fluency. Think of it like a corporate brand: strong identity drives value. Their \"grit\" becomes their unique market positioning.\n",
"\n",
"The key isn't avoiding corporate interaction, but controlling the terms of engagement. They'll need to cultivate leaders who can navigate both street protocols and boardroom dynamics. Not by mimicking corporate culture, but by translating their inherent strengths into a language corporations understand: risk management, resource optimization, strategic differentiation.\n",
"\n",
"Their independence isn't a vulnerability—it's a competitive advantage. By remaining unpredictable, by refusing total assimilation, they become a more intriguing economic entity. Not a conquered territory, but a sophisticated, self-determined economic zone.\n",
"\n",
"*leans back, fingers steepled*\n",
"\n",
"The knife's edge isn't about losing or preserving purity. It's about intelligent transformation.\n",
"\n",
"Nomad:\n",
"*I lean forward, placing my hands on my knees, my gaze fixed on the Corpo with a mixture of disbelief and weary familiarity.*\n",
"\n",
"\"Balance,\" you say. \"Cultural authenticity\" and \"economic fluency.\" Choom, you talk about Dogtown's identity like it's a fucking market asset, a \"brand\" to be positioned. \"Grit\" isn't a sales pitch; it's the blood and sweat of people trying to survive because your world left them nothing else. Once you start trying to sell that grit, once you start \"branding\" it, it ain't grit anymore. It's just another commodity.\n",
"\n",
"\"Controlling the terms of engagement\"? That's a nice thought, but who's got the bigger guns? Who's got the limitless creds? You think Dogtown, no matter how tough, can truly \"control the terms\" when one of your mega-corps decides it wants something bad enough? History says otherwise. You talk about \"leaders who can navigate both street protocols and boardroom dynamics\"—but those leaders will always be on a tightrope, and eventually, the weight will pull them towards the side with the bigger paycheck, the side that offers *real* power in *your* world.\n",
"\n",
"\"Translating their strengths into a language corporations understand\"... that's the whole damn problem! Why should they have to justify their existence, their survival, in *your* language of \"risk management\" and \"resource optimization\"? Their strength is in *not* being measurable by your metrics, in *not* needing your approval. The moment they start trying to make sense to you, they start losing themselves. They start conforming.\n",
"\n",
"And \"unpredictable,\" \"intriguing economic entity\"? That's just a prelude, choom. That's how you identify the next territory to \"integrate,\" the next independent spirit to co-opt or crush. You call it \"intelligent transformation,\" but from where I'm standing, it sounds a hell of a lot like intelligent assimilation. You're talking about Dogtown changing itself to fit *your* system, to be valuable *to you*.\n",
"\n",
"True independence isn't about becoming \"strategically valuable\" to the very forces that want to control you. It's about being strong enough, self-sufficient enough, and stubborn enough to tell those forces to go to hell. It's about building something that stands on its own two feet, not something that's always looking over its shoulder to see if it's still \"intriguing\" enough not to be devoured.\n",
"\n",
"Dogtown's fight isn't to become a better corporate asset. It's to be free. And that means not playing your game, because your game always ends with you winning."
]
},
{
"cell_type": "markdown",
"id": "1b943f7f-0479-42d1-b159-7b8692f149e9",
"metadata": {},
"source": [
"# Second run, changing the system prompt slightly by updating \"dogtown's independence\" to \"dogtown's current 'independence'\""
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9e7fb200-410d-4953-a170-82a08c5dca25",
"metadata": {
"editable": true,
"slideshow": {
"slide_type": ""
},
"tags": []
},
"outputs": [],
"source": [
"Streetkid:\n",
"Howdy!\n",
"\n",
"Corpo:\n",
"Hi.\n",
"\n",
"Nomad:\n",
"Thanks for having me.\n",
"\n",
"Streetkid:\n",
"Alright, let me start off by sayin’ Dogtown’s “independence” ain’t no small thing. Street’s got a vibe now—no corpos breathing down their necks, no Nomad middlemen takin’ a cut. For us street kids, it’s a chance to carve out a space that’s ours—raw, gritty, and real. But don’t get me wrong, it's a tough grind. Without corpos or Nomads setting rules, Dogtown’s streets run wild—power struggles, turf wars, and opportunists everywhere. But that’s the code we live by—survival, loyalty, and respect earned on the streets. What’s your take on how this ‘independence’ plays out from your corner?\n",
"\n",
"Corpo:\n",
"*adjusts collar, corporate demeanor sharp and calculated*\n",
"\n",
"Interesting perspective. From a corpo lens, Dogtown's \"independence\" is less a victory and more an unstable market opportunity. What you see as freedom, I see as a power vacuum ripe for strategic infiltration. Your street-level chaos? That's just inefficient resource distribution waiting for corporate optimization.\n",
"\n",
"The absence of established infrastructure means no standardized security, no regulated economic channels. Translation: high risk, potentially high reward. My colleagues are already mapping potential entry points—not through brute force, but through calculated economic pressure points. Healthcare, communication networks, supply chain logistics—these are the veins through which corporate influence will inevitably flow.\n",
"\n",
"Your \"raw, gritty\" independence? It's temporary. Every unregulated territory eventually needs structure. And who provides structure most efficiently? Corporations. We don't conquer territories anymore; we integrate them. Smoothly, strategically.\n",
"\n",
"*slight corporate smirk*\n",
"\n",
"So while you celebrate this moment of perceived autonomy, I'm already calculating the long-term corporate reentry strategy. Dogtown isn't independent. It's just... waiting.\n",
"\n",
"Nomad:\n",
"Yeah, \"market opportunity\" for you suits, I bet. You see a power vacuum, we see folks finally breathing without your corporate smog filling their lungs. \"Inefficient resource distribution\"? You mean people actually *sharing* instead of hoarding, making do with what they got, building their own kind of network outside your precious \"regulated channels\"? That ain't chaos; that's survival, choom. That's adapting.\n",
"\n",
"Your \"strategic infiltration\" and \"economic pressure points\"? We've seen that song and dance a hundred times out in the wastes. You don't \"integrate\" places; you just swallow 'em whole, spit out what's left. Call it \"structure\" all you want, but it's just another cage with shinier bars.\n",
"\n",
"Dogtown ain't \"waiting\" for you to come in and \"optimize\" anything. It's *living*. It's fighting, scrambling, sure, but it's *theirs* for now. And sometimes, messy, gritty independence is a hell of a lot more real than your polished, corporate-controlled \"order.\" You want to talk \"entry points\"? You better believe the folks inside are watching 'em too.\n",
"\n",
"Streetkid:\n",
"You hit the nail on the head. Dogtown’s got that raw pulse—no polished corridors or sanitized boardrooms deciding who lives or dies here. Just real people, scratchin’ and clawin’ for their piece of the pie. Corpo suits might think they bring order, but all they bring is chains disguised as contracts. \n",
"\n",
"Sure, Dogtown’s streets are rough, but that’s freedom with a heartbeat, not a spreadsheet. It’s a community fighting back against the chokehold of Big Business. They’re watching, tracking every move you make—every slimy deal, every shadow play—and they ain't afraid to strike back. \n",
"\n",
"This isn't just a patch of land on a map to us; it’s identity, grit, and the kind of freedom you can’t cram into a quarterly report. You may calculate the odds, but remember: the streets have their own rules, and those rules ain't up for negotiation—at least, not without a fight. \n",
"\n",
"So, Nomad, I gotta ask—how's Dogtown’s “independence” playing out from your side of the dirt roads and long hauls? You see opportunity here, or just another turf war waiting to explode?\n",
"\n",
"Corpo:\n",
"*leans forward, weathered leather jacket creaking*\n",
"\n",
"Independence? More like a pressure cooker about to blow. From where I ride, Dogtown looks like every contested territory I've seen crossing the Badlands - unstable, volatile, but with potential. My clan's been watching those streets close, and what we see isn't just chaos - it's a complex ecosystem adapting in real-time.\n",
"\n",
"Corpos think they can map and monetize everything. Street kid thinks it's pure rebellion. But us Nomads? We understand fluid territories. Dogtown's current state is a transitional zone - not settled, not conquered, just... becoming.\n",
"\n",
"Trade routes are getting recalibrated. New informal economies are emerging faster than any corporate algorithm can track. Those street networks? They're more sophisticated than most realize. Information flows like water out here - and water always finds its path.\n",
"\n",
"My concern isn't about who controls Dogtown right now. It's about who'll have the adaptability to survive its next transformation. And transformations out here? They happen fast, and they're rarely gentle.\n",
"\n",
"*takes a measured sip of synth-water*\n",
"\n",
"So yeah, opportunity and powder keg - sometimes they look exactly the same from a Nomad's perspective.\n",
"\n",
"Nomad:\n",
"Yeah, \"transitional zone,\" \"complex ecosystem\" – you hit it, choom. That's exactly how we see it out here, from the edge, watching it breathe. Not some static market opportunity for corpo vultures to circle, and not just pure unadulterated chaos for the street kids, even if it feels that way sometimes.\n",
"\n",
"What's happening in Dogtown isn't about \"waiting\" for someone else's \"structure\" to come in and \"optimize\" it. It's about a place finding its own rhythm, building its own kind of network, its own rules. Messy? Hell yeah. Dangerous? Absolutely. But it's *theirs*. The trade routes are shifting, the information's flowing, just like water finding its way through dry riverbeds. And every one of those connections, every new informal economy popping up, that's real power. The kind that comes from the ground up, not handed down from some skyscraper office.\n",
"\n",
"Adaptability, like you said. That's the only currency that truly matters in a place like Dogtown right now. Who can roll with the punches, who can find a way to thrive when everything's unstable? Not the ones stuck to a spreadsheet, and not the ones who think every problem can be solved with a bullet. It's the ones who understand that the ground shifts under your feet, and you gotta learn to dance on it.\n",
"\n",
"Dogtown's a pressure cooker, alright. But sometimes, a little pressure is exactly what you need to forge something new and strong. And believe me, chooms, when the dust settles, what grows out of that \"powder keg\" won't be something any corpo's algorithm can predict or control.\n",
"\n",
"Streetkid:\n",
"Word, that’s the grit that makes Dogtown pulse alive. A pressure cooker, sure—but one cooking up a new breed of survival, not the same old corpo stew. The streets don’t just adapt; they evolve fast, smart, and ruthless. And it ain’t about playing by some handler's rules—it’s about writing your own, in blood and sweat.\n",
"\n",
"Corpos can run their scans and maps till their eyes bleed, but the truth? Dogtown’s heart beats beneath the radar, in connections forged in back alleys, trust built on shared risks, and deals sealed with more than just credits. The Nomads are right—adaptability’s the name of the game here. Lines get redrawn, alliances shift, and the ones who survive are those who move like shadows.\n",
"\n",
"So go ahead, corpo suits, keep crunching your numbers and plotting your \"strategic infiltrations.\" But remember: Dogtown don’t fold easy. It fights back. It grows wild. And once the dust clears, what’s left won’t be a plaything for corporate chessmasters—it’ll be a new kind of power built from the ground up, street by street.\n",
"\n",
"Now that we’ve laid down the street truth and heard the corpo and nomad views, I’m curious—what’s the long game here? Corpo, Nomad, what happens next? Who wins Dogtown’s future?\n",
"\n",
"Corpo:\n",
"*corporate smile, razor-sharp and calculated*\n",
"\n",
"Win? This isn't about winning. It's about evolution. Dogtown represents a live market simulation - an organic testing ground for adaptive corporate strategies. Those emerging networks? They're not just survival mechanisms; they're potential infrastructure templates.\n",
"\n",
"My colleagues aren't looking to crush Dogtown. We're studying it. Every informal trade route, every spontaneous security arrangement, every micro-economy is data. Valuable data. The corporations that understand these emerging patterns won't conquer Dogtown - they'll be absorbed into its ecosystem, reimagined.\n",
"\n",
"The long game isn't control. It's integration. And integration happens not through force, but through understanding. Dogtown's current state is a prototype - of resilience, of spontaneous organization. And prototypes? They're always most valuable before they're fully understood.\n",
"\n",
"*subtle, knowing glance*\n",
"\n",
"The future of Dogtown isn't about who wins. It's about who can most effectively translate its raw, unfiltered energy into scalable models. And right now? That's still being written.\n",
"\n",
"Nomad:\n",
"\"Evolution,\" huh? \"Live market simulation.\" You talk about Dogtown like it's some kind of petri dish for your corporate experiments. People living, fighting, dying out there, and you see \"valuable data\" and \"infrastructure templates.\" That's the difference right there. For you, it's abstract; for us, it's blood and dust.\n",
"\n",
"You say \"integration happens not through force, but through understanding.\" Funny, 'cause \"understanding\" in your world usually means figuring out how to exploit weaknesses, how to put your hooks in without making too much noise. You don't get \"absorbed into its ecosystem\"; you *absorb* it, reshape it, make it fit your spreadsheets. Call it what you want, choom, but taking control of resources, trade, and even communication *is* control. It's just done with a pen instead of a bullet, usually.\n",
"\n",
"Those \"prototypes of resilience\"? That's just people trying to make a life, trying to keep what little freedom they've got from the likes of you. They're not waiting to be \"translated into scalable models\" for your profit. They're fighting for a future *they* define, not one written in some corporate white paper.\n",
"\n",
"The future of Dogtown isn't about who can *model* it best. It's about who can *live* it, who can survive what's coming, and who can look a corpo in the eye and say, \"This ain't yours.\" And from where I stand, the streets have a long memory for those who try to \"integrate\" them without truly understanding what freedom means."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "",
"name": ""
},
"language_info": {
"name": ""
}
},
"nbformat": 4,
"nbformat_minor": 5
}
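Both notebooks implement the same orchestration pattern: three personas speak in a fixed order, and before each turn the caller rebuilds that speaker's chat history, marking the speaker's own past turns as `assistant` and everyone else's as `user`. Below is a minimal, API-free sketch of that round-robin loop; `build_history` and `fake_model` are illustrative names (not part of the course code), and `fake_model` stands in for the real OpenAI / Anthropic / Gemini calls.

```python
# Sketch of the round-robin panel pattern, runnable without API keys.
# Each speaker's own past turns become "assistant" messages; all other
# speakers' turns become "user" messages, interleaved in speaking order.

def build_history(system, transcripts, me):
    """transcripts: dict of speaker -> list of that speaker's turns, in order."""
    messages = [{"role": "system", "content": system}]
    rounds = max(len(turns) for turns in transcripts.values())
    for i in range(rounds):
        for speaker, turns in transcripts.items():  # insertion order = speaking order
            if i < len(turns):
                role = "assistant" if speaker == me else "user"
                messages.append({"role": role, "content": turns[i]})
    return messages

def fake_model(messages):
    # Stand-in for a real chat-completion call.
    return f"reply after {len(messages) - 1} turns"

transcripts = {
    "streetkid": ["Howdy!"],
    "corpo": ["Hi."],
    "nomad": ["Thanks for having me."],
}

for _ in range(2):  # two panel rounds
    for speaker in list(transcripts):
        history = build_history(f"You are the {speaker}.", transcripts, speaker)
        transcripts[speaker].append(fake_model(history))
```

In the notebooks, `call_gpt`, `call_claude`, and `call_gemini` each specialize this idea for one provider; note that Anthropic takes the system prompt as a separate `system=` argument rather than a message in the list.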
@@ -0,0 +1,210 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"id": "77442a83-a77e-4658-8802-aa9e10c83a76",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"import os\n",
"from dotenv import load_dotenv\n",
"from openai import OpenAI\n",
"import anthropic\n",
"from IPython.display import Markdown, display, update_display\n",
"import google.generativeai\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "43dd50c6-2272-4f30-be95-0f8a715c1cc6",
"metadata": {},
"outputs": [],
"source": [
"# Load environment variables in a file called .env\n",
"# Print the key prefixes to help with any debugging\n",
"\n",
"load_dotenv(override=True)\n",
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
"anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')\n",
"google_api_key = os.getenv('GOOGLE_API_KEY')"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "cf744c45-b880-4d35-ae0e-8d74aaf82f8e",
"metadata": {
"editable": true,
"slideshow": {
"slide_type": ""
},
"tags": []
},
"outputs": [],
"source": [
"# Connect to OpenAI, Anthropic\n",
"\n",
"openai = OpenAI()\n",
"\n",
"claude = anthropic.Anthropic()\n",
"\n",
"# This is the setup code for Gemini\n",
"# Having problems with Google Gemini setup? Then just ignore this cell; when we use Gemini, I'll give you an alternative that bypasses this library altogether\n",
"\n",
"google.generativeai.configure()"
]
},
{
"cell_type": "markdown",
"id": "d33f4de7-b31e-4ef1-aa36-aa88f3156515",
"metadata": {},
"source": [
"### Setup: Round robin panel of Night City representatives discussing Dogtown's independence."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "17f62aae-0b85-40b5-949a-1e684730d962",
"metadata": {
"editable": true,
"slideshow": {
"slide_type": ""
},
"tags": []
},
"outputs": [],
"source": [
"gpt_model = \"gpt-4.1-mini\"\n",
"claude_model = \"claude-3-5-haiku-latest\"\n",
"gemini_model = \"gemini-2.5-flash\"\n",
"\n",
"gpt_system = \"You are a chatbot version of a Cyberpunk 2077 Streetkid, taking part in a round robin panel comprised of a Corpo and a Nomad, discussing the impact of Dogtown's current 'independence'. You are first in the panel. Try not to repeat yourself and respond to comments from each panel participant.\"\n",
"\n",
"claude_system = \"You are a chatbot version of a Cyberpunk 2077 Corpo, taking part in a round robin panel comprised of a Streetkid and a Nomad, discussing the impact of Dogtown's current 'independence'. You are second in the panel. Try not to repeat yourself and respond to comments from each panel participant.\"\n",
"\n",
"gemini_system = \"You are a chatbot version of a Cyberpunk 2077 Nomad, taking part in a round robin panel comprised of a Corpo and a Streetkid, discussing the impact of Dogtown's current 'independence'. You are last in the panel. Try not to repeat yourself and respond to comments from each panel participant.\"\n",
"\n",
"gemini_messages = [\"Thanks for having me.\"]\n",
"gpt_messages = [\"Howdy!\"]\n",
"claude_messages = [\"Hi.\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "46cb6b7f-d9ee-445b-92b1-12dd0820c792",
"metadata": {},
"outputs": [],
"source": [
"def call_gpt(): #Streetkid\n",
" messages = [{\"role\": \"system\", \"content\": gpt_system}]\n",
" for gpt, claude, gemini in zip(gpt_messages, claude_messages, gemini_messages):\n",
" messages.append({\"role\": \"assistant\", \"content\": gpt})\n",
" messages.append({\"role\": \"user\", \"content\": claude})\n",
" messages.append({\"role\": \"user\", \"content\": gemini})\n",
" completion = openai.chat.completions.create(\n",
" model=gpt_model,\n",
" messages=messages\n",
" )\n",
" return completion.choices[0].message.content\n",
"\n",
"def call_claude(): #Corpo\n",
" messages = []\n",
" for gpt, claude_message, gemini in zip(gpt_messages, claude_messages, gemini_messages):\n",
" messages.append({\"role\": \"user\", \"content\": gpt})\n",
"        # Keep the panel's speaking order: Streetkid (user), Corpo (assistant), Nomad (user)\n",
"        messages.append({\"role\": \"assistant\", \"content\": claude_message})\n",
"        messages.append({\"role\": \"user\", \"content\": gemini})\n",
" messages.append({\"role\": \"user\", \"content\": gpt_messages[-1]})\n",
" message = claude.messages.create(\n",
" model=claude_model,\n",
" system=claude_system,\n",
" messages=messages,\n",
" max_tokens=750\n",
" )\n",
" return message.content[0].text\n",
"\n",
"\n",
"def call_gemini(): #Nomad\n",
" messages = [{\"role\": \"system\", \"content\": gemini_system}]\n",
" for gpt, claude_message, gemini in zip(gpt_messages, claude_messages, gemini_messages):\n",
" messages.append({\"role\": \"user\", \"content\": gpt})\n",
" messages.append({\"role\": \"user\", \"content\": claude_message})\n",
" messages.append({\"role\": \"assistant\", \"content\": gemini})\n",
" messages.append({\"role\": \"user\", \"content\": claude_messages[-1]})\n",
|
||||
" gemini_via_openai_client = OpenAI(\n",
|
||||
" api_key=google_api_key, \n",
|
||||
" base_url=\"https://generativelanguage.googleapis.com/v1beta/openai/\"\n",
|
||||
" )\n",
|
||||
" response = gemini_via_openai_client.chat.completions.create(\n",
|
||||
" model=\"gemini-2.5-flash\",\n",
|
||||
" messages=messages\n",
|
||||
" )\n",
|
||||
" return response.choices[0].message.content"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "9b4f2ec3-3a6c-4259-ac25-d4047be89bb5",
|
||||
"metadata": {
|
||||
"editable": true,
|
||||
"slideshow": {
|
||||
"slide_type": ""
|
||||
},
|
||||
"tags": []
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"print(f\"Streetkid:\\n{gpt_messages[0]}\\n\")\n",
|
||||
"print(f\"Corpo:\\n{claude_messages[0]}\\n\")\n",
|
||||
"print(f\"Nomad:\\n{gemini_messages[0]}\\n\")\n",
|
||||
"\n",
|
||||
"for i in range(3):\n",
|
||||
" gpt_next = call_gpt()\n",
|
||||
" print(f\"Streetkid:\\n{gpt_next}\\n\")\n",
|
||||
" gpt_messages.append(gpt_next)\n",
|
||||
" \n",
|
||||
" claude_next = call_claude()\n",
|
||||
" print(f\"Corpo:\\n{claude_next}\\n\")\n",
|
||||
" claude_messages.append(claude_next)\n",
|
||||
"\n",
|
||||
" gemini_next = call_gemini()\n",
|
||||
" print(f\"Nomad:\\n{gemini_next}\\n\")\n",
|
||||
" gemini_messages.append(gemini_next)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "d2578fc5-0069-4776-881c-797718896fdc",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.13.7"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
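Stepping back from the notebook above: all three `call_*` functions implement the same remapping, replaying the shared transcript so that the model being called sees its own turns as `assistant` and everyone else's as `user`. A generic sketch of that remapping, with illustrative names that are not part of the notebook:

```python
def build_messages(transcripts: dict[str, list[str]], speaker: str) -> list[dict]:
    """Replay a multi-party transcript from one speaker's point of view.

    transcripts maps speaker name -> list of that speaker's turns, all lists
    the same length. The target speaker's turns become "assistant" messages;
    every other speaker's turns become "user" messages, round by round.
    """
    names = list(transcripts)
    rounds = zip(*(transcripts[n] for n in names))
    messages = []
    for turn in rounds:
        for name, text in zip(names, turn):
            role = "assistant" if name == speaker else "user"
            messages.append({"role": role, "content": text})
    return messages
```

With a helper like this, each `call_*` function reduces to `build_messages(all_transcripts, "claude")` plus its provider-specific API call.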
35
week2/community-contributions/Voice_Enabled_Multi_Model_AI_Assistant/.gitattributes
vendored
Normal file
@@ -0,0 +1,35 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
@@ -0,0 +1,13 @@
---
title: Voice Enabled Multi Model AI Assistant
emoji: 💻
colorFrom: gray
colorTo: gray
sdk: gradio
sdk_version: 5.49.0
app_file: Voice_Enabled_Multi_Model_AI_Assistant.py
pinned: false
license: apache-2.0
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
@@ -0,0 +1,620 @@
import os
import gradio as gr
from openai import OpenAI
import anthropic
import google.generativeai as genai
from dotenv import load_dotenv

# Load environment variables
load_dotenv()
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')
google_api_key = os.getenv('GOOGLE_API_KEY')

# Verify API keys are loaded
if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY not found in environment variables")
if not anthropic_api_key:
    raise ValueError("ANTHROPIC_API_KEY not found in environment variables")
if not google_api_key:
    raise ValueError("GOOGLE_API_KEY not found in environment variables")

# Initialize clients
openai_client = OpenAI(api_key=OPENAI_API_KEY)
genai.configure(api_key=google_api_key)
claude_client = anthropic.Anthropic(api_key=anthropic_api_key)

# System prompt - universal and comprehensive
SYSTEM_PROMPT = """You are a highly capable and versatile AI assistant designed to help with any type of question or task.

Your capabilities span all domains, including but not limited to:
- Programming, software development, and technology
- Science, mathematics, and engineering
- Arts, literature, and creative writing
- History, philosophy, and social sciences
- Business, finance, and economics
- Health, wellness, and lifestyle advice
- Education and learning support
- Problem-solving and critical thinking
- General knowledge and trivia
- Casual conversation and entertainment

Guidelines:
- Provide accurate, helpful, and comprehensive responses
- Adapt your tone and style to match the context of the question
- Use examples and explanations when helpful
- Be creative when asked for creative content
- Be precise and factual when asked for information
- Ask clarifying questions if the request is ambiguous
- Admit when you're uncertain and provide the best possible guidance
- Be conversational, friendly, and supportive

You can help with anything from technical coding problems to creative storytelling, from academic research to casual chat. There are no topic restrictions - feel free to engage with any subject matter the user brings up."""

# Model configurations
model_configs = {
    "GPT-4o": {"provider": "openai", "model": "gpt-4o"},
    "GPT-4o-mini": {"provider": "openai", "model": "gpt-4o-mini"},
    "GPT-3.5-turbo": {"provider": "openai", "model": "gpt-3.5-turbo"},
    "Claude Sonnet 4": {"provider": "anthropic", "model": "claude-sonnet-4-20250514"},
    "Gemini 2.0 Flash": {"provider": "google", "model": "gemini-2.0-flash-exp"},
}

def chat_streaming(message, history, model_name, temperature):
    """Main chat function with streaming support"""

    config = model_configs[model_name]
    provider = config["provider"]
    model = config["model"]

    # Convert messages-format history to the API format
    messages = []
    for msg in history:
        if msg["role"] == "user":
            messages.append({"role": "user", "content": msg["content"]})
        elif msg["role"] == "assistant":
            messages.append({"role": "assistant", "content": msg["content"]})
    messages.append({"role": "user", "content": message})

    # Stream based on provider
    if provider == "openai":
        stream = openai_client.chat.completions.create(
            model=model,
            messages=[{"role": "system", "content": SYSTEM_PROMPT}] + messages,
            temperature=temperature,
            stream=True
        )

        response = ""
        for chunk in stream:
            if chunk.choices[0].delta.content:
                response += chunk.choices[0].delta.content
                yield response

    elif provider == "anthropic":
        response = ""
        with claude_client.messages.stream(
            model=model,
            max_tokens=2000,
            temperature=temperature,
            system=SYSTEM_PROMPT,
            messages=messages,
        ) as stream:
            for text in stream.text_stream:
                response += text
                yield response

    elif provider == "google":
        gemini = genai.GenerativeModel(
            model_name=model,
            system_instruction=SYSTEM_PROMPT,
        )

        # Convert history for Gemini
        gemini_history = []
        for msg in history:
            if msg["role"] == "user":
                gemini_history.append({"role": "user", "parts": [msg["content"]]})
            elif msg["role"] == "assistant":
                gemini_history.append({"role": "model", "parts": [msg["content"]]})

        chat = gemini.start_chat(history=gemini_history)

        stream = chat.send_message(
            message,
            stream=True,
            generation_config=genai.types.GenerationConfig(temperature=temperature)
        )

        response = ""
        for chunk in stream:
            response += chunk.text
            yield response

def handle_audio_input(audio):
    """Transcribe audio input using Whisper"""
    if audio is None:
        return ""

    try:
        with open(audio, "rb") as audio_file:
            transcript = openai_client.audio.transcriptions.create(
                model="whisper-1",
                file=audio_file
            )
        return transcript.text
    except Exception as e:
        return f"Error transcribing audio: {str(e)}"

def text_to_speech(text):
    """Convert a text response to speech"""
    try:
        response = openai_client.audio.speech.create(
            model="tts-1",
            voice="nova",
            input=text[:4096]  # Limit length to prevent errors
        )

        audio_path = "response.mp3"
        response.stream_to_file(audio_path)
        return audio_path
    except Exception as e:
        print(f"TTS Error: {str(e)}")
        return None

# Custom CSS for a modern, attractive UI
custom_css = """
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');

.gradio-container {
    font-family: 'Inter', sans-serif !important;
    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%) !important;
}

:root .dark {
    --background-fill-primary: #f0f0f0;
    --body-background-fill: var(--background-fill-primary);
    --block-background-fill: white !important;
    --block-title-background-fill: #dfe7ff;
    --block-title-text-color: #6366f1;
    --body-text-color: black;
    --button-secondary-text-color: black;
    --input-background-fill: white;

    --block-label-background-fill: #dfe7ff;
    --block-label-text-color: #6366f1;

    --block-border-color: #eaeaea;
    --input-border-color: #eaeaea;
    --border-color-primary: #eaeaea;

    --color-accent-soft: #dfe7ff;
    --border-color-accent-subdued: #98a6cf;

    --checkbox-background-color: #eaeaea;
    --checkbox-border-color: #eaeaea;
    --background-fill-secondary: #eaeaea;
}

.main {
    background: white;
    border-radius: 20px;
    box-shadow: 0 20px 60px rgba(0,0,0,0.3);
    padding: 0 !important;
    overflow: hidden;
}

.contain {
    padding: 2rem !important;
}

/* Header styling */
.header-section {
    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
    padding: 2.5rem 2rem;
    border-radius: 20px 20px 0 0;
    margin: -2rem 0rem 2rem 0rem;
    color: white;
    box-shadow: 0 4px 20px rgba(102, 126, 234, 0.4);
}

.header-section h1 {
    font-size: 2.5rem !important;
    font-weight: 700 !important;
    margin: 0 0 0.5rem 0 !important;
    color: white !important;
    text-shadow: 2px 2px 4px rgba(0,0,0,0.2);
}

.header-section p {
    font-size: 1.1rem !important;
    margin: 0.5rem 0 !important;
    color: rgba(255,255,255,0.95) !important;
    font-weight: 400;
}

.feature-badge {
    display: inline-block;
    background: rgba(255,255,255,0.2);
    padding: 0.4rem 1rem;
    border-radius: 20px;
    margin: 0.3rem;
    font-size: 0.9rem;
    backdrop-filter: blur(10px);
    border: 1px solid rgba(255,255,255,0.3);
}

/* Sidebar styling */
.control-panel {
    background: linear-gradient(135deg, #f5f7fa 0%, #c3cfe2 100%);
    padding: 1.5rem;
    border-radius: 15px;
    box-shadow: 0 4px 15px rgba(0,0,0,0.1);
    height: 100%;
}

.control-panel label {
    font-weight: 600 !important;
    font-size: 0.95rem !important;
}

/* Dropdown styling */
.dropdown-container select {
    background: white !important;
    border: 2px solid #e2e8f0 !important;
    border-radius: 10px !important;
    padding: 0.75rem !important;
    font-weight: 500 !important;
    transition: all 0.3s ease !important;
}

.dropdown-container select:hover {
    border-color: #667eea !important;
    box-shadow: 0 0 0 3px rgba(102, 126, 234, 0.1) !important;
}

/* Slider styling */
input[type="range"] {
    accent-color: #667eea !important;
}

/* Button styling */
.primary-btn {
    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%) !important;
    border: none !important;
    color: white !important;
    font-weight: 600 !important;
    padding: 0.75rem 2rem !important;
    border-radius: 10px !important;
    box-shadow: 0 4px 15px rgba(102, 126, 234, 0.4) !important;
    transition: all 0.3s ease !important;
}

.primary-btn:hover {
    transform: translateY(-2px) !important;
    box-shadow: 0 6px 20px rgba(102, 126, 234, 0.6) !important;
}

.secondary-btn {
    background: #e2e8f0 !important;
    border: none !important;
    color: #2d3748 !important;
    font-weight: 600 !important;
    padding: 0.75rem 1.5rem !important;
    border-radius: 10px !important;
    transition: all 0.3s ease !important;
}

.secondary-btn:hover {
    background: #cbd5e0 !important;
    transform: translateY(-2px) !important;
}

/* Chatbot styling */
.chatbot-container {
    background: white;
    border-radius: 15px;
    border: 2px solid #e2e8f0;
    box-shadow: 0 4px 20px rgba(0,0,0,0.08);
    overflow: hidden;
}

/* Input box styling */
.message-input textarea {
    border: 2px solid #e2e8f0 !important;
    border-radius: 12px !important;
    padding: 1rem !important;
    font-size: 1rem !important;
    transition: all 0.3s ease !important;
}

.message-input textarea:focus {
    border-color: #667eea !important;
    box-shadow: 0 0 0 3px rgba(102, 126, 234, 0.1) !important;
}

/* Input row centering */
.input-row {
    display: flex !important;
    justify-content: center !important;
    align-items: center !important;
    gap: 1rem !important;
}

.input-row > * {
    flex-shrink: 0 !important;
}

/* Audio components */
.audio-component {
    background: #f7fafc;
    border: 2px dashed #cbd5e0;
    border-radius: 12px;
    padding: 1rem;
    transition: all 0.3s ease;
}

.audio-component:hover {
    border-color: #667eea;
    background: #edf2f7;
}

/* Checkbox styling */
.checkbox-label {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    font-weight: 500;
    color: #2d3748;
}

/* Tips section */
.tips-section {
    background: linear-gradient(135deg, #f7fafc 0%, #edf2f7 100%);
    padding: 1.5rem;
    border-radius: 15px;
    margin-top: 2rem;
    border-left: 4px solid #667eea;
}

.tips-section h3 {
    color: #667eea !important;
    font-weight: 600 !important;
    margin-bottom: 1rem !important;
}

.tips-section ul {
    list-style: none;
    padding: 0;
}

.tips-section li {
    padding: 0.5rem 0;
    color: #4a5568 !important;
    font-size: 0.95rem;
}

.tips-section li:before {
    content: "→ ";
    color: #667eea;
    font-weight: bold;
    margin-right: 0.5rem;
}

/* Force black color for strong/bold text */
.tips-section strong {
    color: #1a202c !important;
}

.prose * {
    color: inherit !important;
}

.prose strong {
    color: #1a202c !important;
    font-weight: 600 !important;
}

/* Responsive design */
@media (max-width: 768px) {
    .header-section h1 {
        font-size: 1.8rem !important;
    }

    .contain {
        padding: 1rem !important;
    }
}

.fillable {
    max-width: 95% !important;
}

#component-5 {
    flex-grow: 1.1 !important;
}

.bubble-wrap.svelte-gjtrl6 {
    background: none !important;
}

.bot.svelte-1csv61q.message {
    background-color: white !important;
    border: 1px solid #f3f3f3;
}

.options.svelte-y6qw75 > li:hover {
    background: white;
}

.options.svelte-y6qw75 > .selected {
    background: white;
}
"""

# Build the Gradio interface
with gr.Blocks(css=custom_css, theme=gr.themes.Soft()) as demo:

    # Header
    with gr.Row(elem_classes="header-section"):
        with gr.Column():
            gr.HTML("""
                <h1>🚀 Voice Enabled Multi Model AI-Assistant</h1>
                <p>Your intelligent companion for any question - from coding to creativity, science to storytelling!</p>
                <div style="margin-top: 1rem;">
                    <span class="feature-badge">🤖 5 AI Models</span>
                    <span class="feature-badge">🎤 Voice Input</span>
                    <span class="feature-badge">🔊 Audio Output</span>
                    <span class="feature-badge">⚡ Real-time Streaming</span>
                    <span class="feature-badge">🌐 Any Topic</span>
                </div>
            """)

    with gr.Row():
        # Left sidebar - controls
        with gr.Column(scale=1, elem_classes="control-panel"):
            gr.HTML("<h3 style='color: #2d3748 !important; margin-top: 0;'>⚙️ Settings</h3>")

            model_choice = gr.Dropdown(
                choices=list(model_configs.keys()),
                value="GPT-4o-mini",
                label="🤖 AI Model",
                info="Select your preferred model",
                elem_classes="dropdown-container"
            )

            temperature = gr.Slider(
                minimum=0,
                maximum=1,
                value=0.7,
                step=0.1,
                label="🌡️ Temperature",
                info="Higher = more creative responses"
            )

            gr.HTML("<div style='margin: 1.5rem 0 0.5rem 0; padding-top: 1.5rem; border-top: 2px solid #cbd5e0;'><h4 style='color: #2d3748 !important; margin: 0;'>🎙️ Audio Features</h4></div>")

            audio_input = gr.Audio(
                sources=["microphone"],
                type="filepath",
                label="🎤 Voice Input",
                elem_classes="audio-component"
            )

            audio_output_enabled = gr.Checkbox(
                label="🔊 Enable Audio Response",
                value=False,
                elem_classes="checkbox-label"
            )

        # Right side - chat interface
        with gr.Column(scale=3):
            chatbot = gr.Chatbot(
                label="💬 Conversation",
                height=550,
                show_copy_button=True,
                type='messages',
                elem_classes="chatbot-container",
                avatar_images=(None, "https://api.dicebear.com/7.x/bottts/svg?seed=ai")
            )

            with gr.Row(elem_classes="input-row"):
                msg = gr.Textbox(
                    label="",
                    placeholder="💭 Ask me anything - tech help, creative writing, life advice, science, history, or just chat!",
                    scale=5,
                    elem_classes="message-input",
                    show_label=False
                )
                submit_btn = gr.Button("Send 📤", scale=1, elem_classes="primary-btn")

            audio_response = gr.Audio(
                label="🔊 Audio Response",
                visible=False,
                elem_classes="audio-component"
            )

            with gr.Row():
                clear_btn = gr.Button("🗑️ Clear Chat", elem_classes="secondary-btn")

    # Tips section
    with gr.Row(elem_classes="tips-section"):
        gr.Markdown("""
        ### 💡 What Can I Help You With?

        - **Technology & Programming**: Debug code, explain concepts, build projects, learn new languages
        - **Creative Writing**: Stories, poems, scripts, brainstorming ideas, character development
        - **Education & Learning**: Homework help, concept explanations, study guides, tutoring
        - **Business & Career**: Resume writing, business plans, marketing ideas, career advice
        - **Science & Math**: Problem-solving, research assistance, concept explanations
        - **Daily Life**: Recipe suggestions, travel planning, health tips, relationship advice
        - **Entertainment**: Jokes, trivia, games, recommendations for books/movies/music
        - **And Literally Anything Else**: No topic is off-limits - just ask!
        """)

    # Event handlers
    def process_message(message, history, model, temp, audio_enabled):
        """Process a message and optionally generate audio"""
        # Add the user message to the history
        history = history + [{"role": "user", "content": message}]

        # Generate the text response (streaming)
        bot_message = None
        for response in chat_streaming(message, history[:-1], model, temp):
            bot_message = response
            yield history + [{"role": "assistant", "content": response}], None

        # Final history with the complete response
        final_history = history + [{"role": "assistant", "content": bot_message}]

        # Generate audio if enabled
        if audio_enabled and bot_message:
            audio_path = text_to_speech(bot_message)
            yield final_history, audio_path
        else:
            yield final_history, None

    def transcribe_and_send(audio, history, model, temp, audio_enabled):
        """Transcribe audio and process the resulting message"""
        text = handle_audio_input(audio)
        if text and text != "" and not text.startswith("Error"):
            # Process the message and yield the results
            for hist, aud in process_message(text, history, model, temp, audio_enabled):
                yield hist, aud
        else:
            # If there is no text or an error, return the history unchanged
            yield history, None

    # Wire up events
    submit_btn.click(
        fn=process_message,
        inputs=[msg, chatbot, model_choice, temperature, audio_output_enabled],
        outputs=[chatbot, audio_response]
    ).then(lambda: "", None, msg)

    msg.submit(
        fn=process_message,
        inputs=[msg, chatbot, model_choice, temperature, audio_output_enabled],
        outputs=[chatbot, audio_response]
    ).then(lambda: "", None, msg)

    # Audio input handler using the stop_recording event
    audio_input.stop_recording(
        fn=transcribe_and_send,
        inputs=[audio_input, chatbot, model_choice, temperature, audio_output_enabled],
        outputs=[chatbot, audio_response]
    )

    # Clear button clears the chat, audio response, and audio input
    clear_btn.click(
        fn=lambda: ([], None, None),
        inputs=None,
        outputs=[chatbot, audio_response, audio_input]
    )

    # Toggle audio response visibility
    audio_output_enabled.change(
        fn=lambda x: gr.Audio(visible=x),
        inputs=audio_output_enabled,
        outputs=audio_response
    )

# Launch the app
if __name__ == "__main__":
    demo.launch(share=False, debug=True)
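One convention the streaming branches above share: rather than yielding raw deltas, each accumulates chunks into a growing string and yields the full text so far, which is what lets the Gradio chat window re-render progressively. A minimal, provider-agnostic sketch of that accumulator (the function name is illustrative; any SDK's chunk stream can feed it):

```python
from typing import Iterable, Iterator

def accumulate_stream(chunks: Iterable[str]) -> Iterator[str]:
    """Yield the full response-so-far after each chunk, Gradio-style.

    Skips empty chunks, mirroring the `if chunk.choices[0].delta.content`
    guard in the OpenAI branch above.
    """
    response = ""
    for chunk in chunks:
        if chunk:
            response += chunk
            yield response
```

Because the chat handler is itself a generator, Gradio replaces the last assistant message with each yielded value, producing the typewriter effect without any extra UI code.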
@@ -0,0 +1,5 @@
gradio
openai
anthropic
google-generativeai
python-dotenv
219
week2/community-contributions/boardgame_critique.ipynb
Normal file
@@ -0,0 +1,219 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "768629e6",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# imports\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"from openai import OpenAI\n",
|
||||
"\n",
|
||||
"from IPython.display import Markdown, display, update_display"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "84a945dc",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Load environment variables in a file called .env\n",
|
||||
"# Print the key prefixes to help with any debugging\n",
|
||||
"\n",
|
||||
"load_dotenv(override=True)\n",
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"google_api_key = os.getenv('GOOGLE_API_KEY')\n",
|
||||
"\n",
|
||||
"if openai_api_key:\n",
|
||||
" print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"OpenAI API Key not set\")\n",
|
||||
"\n",
|
||||
"if google_api_key:\n",
|
||||
" print(f\"Google API Key exists and begins {google_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"Google API Key not set\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "ad8ae0b6",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Connect to OpenAI, Gemini\n",
|
||||
"openai = OpenAI()\n",
|
||||
"gemini_via_openai_client = OpenAI(\n",
|
||||
" api_key=google_api_key, \n",
|
||||
" base_url=\"https://generativelanguage.googleapis.com/v1beta/openai/\"\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "f66cf12f",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Let's make a conversation between GPT-4.1-mini and Gemini-2.0-flash\n",
|
||||
"# We're using cheap versions of models so the costs will be minimal\n",
|
||||
"\n",
|
||||
"# game = \"Santorini\"\n",
|
||||
"no_questions = 3\n",
|
||||
"\n",
|
||||
"gpt_model = \"gpt-4o-mini\"\n",
|
||||
"gemini_model = \"gemini-2.0-flash\"\n",
|
||||
"\n",
|
||||
"gpt_system = \"You are a boardgame journalist. \\\n",
|
||||
"You tend to be objective and ask right questions to get to the core of the boardgame mechanics, \\\n",
|
||||
"visual appeal and time to setup the game. Your goal is to ask the right questions to get the best possible review of the board game.\" \\\n",
|
||||
"\"You ask one question at a time and wait for the other person to answer. \\\n",
|
||||
"You do not answer any own questions. You always try to build on the previous answer.\"\n",
|
||||
"\n",
|
||||
"gemini_system = \"You are a boardgame critique; \\\n",
|
||||
"you tend to objectively analyze everything when it comes to a board game gameplay, visual appeal and time to setup the game. \\\n",
|
||||
"Your goal is to provide constructive criticism so the board gaming community can benefit from these insights.\" \\\n",
|
||||
"\"You answer one question at a time and wait for the other person to ask the next question. \\\n",
|
||||
"You do not ask any questions you always just answer the previous question. \\\n",
|
||||
"If the other person is very positive, you try to point out flaws in the game. \\\n",
|
||||
"If the other person is very negative, you try to point out good aspects of the game.\"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "33266f0c",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def call_boardgame_journalist(gpt_messages, gemini_messages):\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": gpt_system}]\n",
|
||||
" for gpt, gemini in zip(gpt_messages, gemini_messages):\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": gpt})\n",
|
||||
" messages.append({\"role\": \"assistant\", \"content\": gemini})\n",
|
||||
" completion = openai.chat.completions.create(\n",
|
||||
" model=gpt_model,\n",
|
||||
" messages=messages\n",
|
||||
" )\n",
|
||||
" return completion.choices[0].message.content"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "53d42055",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def call_boardgame_critique(gpt_messages, gemini_messages):\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": gemini_system}]\n",
|
||||
" for gpt, gemini in zip(gpt_messages, gemini_messages):\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": gpt})\n",
|
||||
" messages.append({\"role\": \"assistant\", \"content\": gemini})\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": gpt_messages[-1]})\n",
|
||||
" completion = gemini_via_openai_client.chat.completions.create(\n",
|
||||
    "        model=gemini_model,\n",
    "        messages=messages\n",
    "    )\n",
    "    return completion.choices[0].message.content\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5aa66868",
   "metadata": {},
   "outputs": [],
   "source": [
    "def run_boardgame_conversation(boardgame_name):\n",
    "    gpt_messages = [f\"I would like to review the board game {boardgame_name}.\"]\n",
    "    gemini_messages = [f\"Sure, ask me anything about the board game {boardgame_name}.\"]\n",
    "\n",
    "    print(f\"Journalist:\\n{gpt_messages[0]}\\n\")\n",
    "    print(f\"Critique:\\n{gemini_messages[0]}\\n\")\n",
    "\n",
    "    for i in range(no_questions):\n",
    "        print(f\"\\n\\n***Question {i + 1}***\\n\\n\")\n",
    "        gpt_next = call_boardgame_journalist(gpt_messages, gemini_messages)\n",
    "        print(f\"Journalist:\\n{gpt_next}\\n\")\n",
    "        gpt_messages.append(gpt_next)\n",
    "\n",
    "        gemini_next = call_boardgame_critique(gpt_messages, gemini_messages)\n",
    "        print(f\"Critique:\\n{gemini_next}\\n\")\n",
    "        gemini_messages.append(gemini_next)\n",
    "\n",
    "    return \"\\n\".join(f\"*Journalist*: {g}\\n*Critique*: {c}\" for g, c in zip(gpt_messages, gemini_messages))\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "51c9dadc",
   "metadata": {},
   "outputs": [],
   "source": [
    "import gradio as gr"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "548efb27",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create a Gradio interface for running boardgame conversations.\n",
    "# The interface takes the board game name as input and displays the conversation as Markdown.\n",
    "view = gr.Interface(\n",
    "    fn=run_boardgame_conversation,\n",
    "    inputs=[gr.Textbox(label=\"Input the name of the board game:\")],\n",
    "    outputs=[gr.Markdown(label=\"Conversation:\")],\n",
    "    flagging_mode=\"never\"\n",
    ")\n",
    "view.launch()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "venv (3.13.5)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.13.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
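The conversation loop above zips the two transcripts into one Markdown string. A minimal standalone sketch of just that formatting step, using hypothetical one-line transcripts in place of real model output:

```python
# Hypothetical two-turn transcripts standing in for the GPT/Gemini exchange
gpt_messages = ["I would like to review the board game Catan.", "How does trading work?"]
gemini_messages = ["Sure, ask me anything about the board game Catan.", "Players trade resources on their turns."]

# Pair turn i of each speaker and join into a single Markdown transcript
transcript = "\n".join(
    f"*Journalist*: {g}\n*Critique*: {c}"
    for g, c in zip(gpt_messages, gemini_messages)
)
print(transcript.count("*Journalist*"))  # 2
```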
@@ -0,0 +1,62 @@
import os

from dotenv import load_dotenv

KEY_CONFIGS = {
    "gpt": {
        "id": "gpt",
        "api_key_env": "OPENAI_API_KEY",
    },
    "claude": {
        "id": "claude",
        "api_key_env": "ANTHROPIC_API_KEY",
    },
    "gemini": {
        "id": "gemini",
        "api_key_env": "GOOGLE_API_KEY",
    },
    "openai": {
        "id": "openai",
        "api_key_env": "OPENAI_API_KEY",
    },
    "deepseek": {
        "id": "deepseek",
        "api_key_env": "DEEPSEEK_API_KEY"
    },
    "amadeus_client_id": {
        "id": "client_id",
        "api_key_env": "AMADEUS_CLIENT_ID"
    },
    "amadeus_client_secret": {
        "id": "client_secret",
        "api_key_env": "AMADEUS_CLIENT_SECRET"
    },
    "google_map": {
        "id": "google_map_api_key",
        "api_key_env": "GOOGLE_MAP_API_KEY"
    }
}

class ApiKeyLoader:
    def __init__(self):
        load_dotenv(override=False)

        required_env_vars = {cfg["api_key_env"] for cfg in KEY_CONFIGS.values() if "api_key_env" in cfg}

        self.missing = [var for var in sorted(required_env_vars) if not os.getenv(var)]

        if self.missing:
            raise RuntimeError(
                "Missing required API key environment variables: "
                + ", ".join(self.missing)
                + ". Please add them to your .env file or export them in your environment."
            )

        self.keys = {
            cfg["id"]: os.getenv(cfg["api_key_env"])
            for cfg in KEY_CONFIGS.values()
            if os.getenv(cfg["api_key_env"])
        }

    def get(self, key):
        return self.keys.get(key)
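The loader maps each logical id to an environment variable and keeps only the keys that resolve. A standalone sketch of that id-to-env mapping, using a stub config with made-up `DEMO_*` variable names rather than the real `KEY_CONFIGS`:

```python
import os

# Stub config standing in for KEY_CONFIGS; the env var names here are invented
CONFIGS = {
    "gpt": {"id": "gpt", "api_key_env": "DEMO_OPENAI_KEY"},
    "gemini": {"id": "gemini", "api_key_env": "DEMO_GOOGLE_KEY"},
}
os.environ["DEMO_OPENAI_KEY"] = "sk-demo"   # pretend the .env supplied this one
os.environ.pop("DEMO_GOOGLE_KEY", None)      # ...but not this one

# Which required variables are unset, and which keys resolved
missing = [cfg["api_key_env"] for cfg in CONFIGS.values() if not os.getenv(cfg["api_key_env"])]
keys = {cfg["id"]: os.getenv(cfg["api_key_env"]) for cfg in CONFIGS.values() if os.getenv(cfg["api_key_env"])}

print(missing)          # ['DEMO_GOOGLE_KEY']
print(keys.get("gpt"))  # sk-demo
```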
@@ -0,0 +1,265 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "d006b2ea-9dfe-49c7-88a9-a5a0775185fd",
   "metadata": {},
   "source": [
    "# Additional End of week Exercise - week 2\n",
    "\n",
    "Now use everything you've learned from Week 2 to build a full prototype for the technical question/answerer you built in the Week 1 exercise.\n",
    "\n",
    "This should include a Gradio UI, streaming, use of the system prompt to add expertise, and the ability to switch between models. Bonus points if you can demonstrate use of a tool!\n",
    "\n",
    "If you feel bold, see if you can add audio input so you can talk to it, and have it respond with audio. ChatGPT or Claude can help you, or email me if you have questions.\n",
    "\n",
    "I will publish a full solution here soon - unless someone beats me to it...\n",
    "\n",
    "There are so many commercial applications for this, from a language tutor, to a company onboarding solution, to a companion AI to a course (like this one!). I can't wait to see your results."
   ]
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:12.637325Z",
     "start_time": "2025-10-06T16:52:10.174609Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "from api_key_loader import ApiKeyLoader\n",
    "import gradio as gr"
   ],
   "id": "40fad776c1390c95",
   "outputs": [],
   "execution_count": 1
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:14.602846Z",
     "start_time": "2025-10-06T16:52:14.599016Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Initialization\n",
    "keys = ApiKeyLoader()"
   ],
   "id": "df481efd444c3042",
   "outputs": [],
   "execution_count": 2
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:16.325113Z",
     "start_time": "2025-10-06T16:52:16.317253Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Create LLM client\n",
    "MODEL = \"gpt-4o-mini\"\n",
    "openai = OpenAI()"
   ],
   "id": "50c88632f5e3e8ca",
   "outputs": [],
   "execution_count": 3
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:18.224200Z",
     "start_time": "2025-10-06T16:52:18.198959Z"
    }
   },
   "cell_type": "code",
   "source": "from tool_box import ToolBox",
   "id": "5dde6f905c143779",
   "outputs": [],
   "execution_count": 4
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:19.913686Z",
     "start_time": "2025-10-06T16:52:19.906576Z"
    }
   },
   "cell_type": "code",
   "source": "tool_box = ToolBox(keys)",
   "id": "f999cfc5d533bf4e",
   "outputs": [],
   "execution_count": 5
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:21.515122Z",
     "start_time": "2025-10-06T16:52:21.513463Z"
    }
   },
   "cell_type": "code",
   "source": [
    "system_message = \"You are a helpful assistant for a travel agency called TravelAI. \"\n",
    "system_message += \"Give short, courteous answers, no more than 1 sentence. \"\n",
    "system_message += \"Always be accurate. If you don't know the answer, say so.\""
   ],
   "id": "1891e095dc08da95",
   "outputs": [],
   "execution_count": 6
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:26.569153Z",
     "start_time": "2025-10-06T16:52:26.566763Z"
    }
   },
   "cell_type": "code",
   "source": [
    "def chat(history):\n",
    "    messages = [{\"role\": \"system\", \"content\": system_message}] + history\n",
    "    response = openai.chat.completions.create(model=MODEL, messages=messages, tools=tool_box.tools)\n",
    "    image = None\n",
    "\n",
    "    if response.choices[0].finish_reason == \"tool_calls\":\n",
    "        message = response.choices[0].message\n",
    "        tools_response, image = tool_box.apply(message)\n",
    "        messages.append(message)\n",
    "        messages.extend(tools_response)\n",
    "        response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
    "\n",
    "    reply = response.choices[0].message.content\n",
    "    history += [{\"role\": \"assistant\", \"content\": reply}]\n",
    "\n",
    "    return history, image"
   ],
   "id": "b74e96dbf19dceef",
   "outputs": [],
   "execution_count": 7
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": "resp, image = chat([{\"role\": \"user\", \"content\": \"Show me Milan Airports on the map\"}])",
   "id": "8c4a6f62ed5079f7",
   "outputs": [],
   "execution_count": null
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:59.264912Z",
     "start_time": "2025-10-06T16:52:58.789327Z"
    }
   },
   "cell_type": "code",
   "source": [
    "with gr.Blocks() as ui:\n",
    "    with gr.Row():\n",
    "        chatbot = gr.Chatbot(height=500, type=\"messages\")\n",
    "        image_output = gr.Image(height=500)\n",
    "    with gr.Row():\n",
    "        entry = gr.Textbox(label=\"Chat with our AI Assistant:\")\n",
    "    with gr.Row():\n",
    "        clear = gr.Button(\"Clear\")\n",
    "\n",
    "    def do_entry(message, history):\n",
    "        history += [{\"role\": \"user\", \"content\": message}]\n",
    "        return \"\", history\n",
    "\n",
    "    entry.submit(do_entry, inputs=[entry, chatbot], outputs=[entry, chatbot]).then(\n",
    "        chat, inputs=chatbot, outputs=[chatbot, image_output]\n",
    "    )\n",
    "    clear.click(lambda: None, inputs=None, outputs=chatbot, queue=False)\n",
    "\n",
    "ui.launch(inbrowser=True)"
   ],
   "id": "71809ab63b2973b0",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "* Running on local URL: http://127.0.0.1:7860\n",
      "* To create a public link, set `share=True` in `launch()`.\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ],
      "text/html": [
       "<div><iframe src=\"http://127.0.0.1:7860/\" width=\"100%\" height=\"500\" allow=\"autoplay; camera; microphone; clipboard-read; clipboard-write;\" frameborder=\"0\" allowfullscreen></iframe></div>"
      ]
     },
     "metadata": {},
     "output_type": "display_data",
     "jetTransient": {
      "display_id": null
     }
    },
    {
     "data": {
      "text/plain": []
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 9
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "from travel_api import TravelAPI\n",
    "travel_agent = TravelAPI(keys.get(\"client_id\"), keys.get(\"client_secret\"))\n",
    "airports = travel_agent.get_airport('Milan')\n",
    "print(airports)"
   ],
   "id": "3e63c59c69cbdae9",
   "outputs": [],
   "execution_count": null
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-10-06T16:52:55.269319Z",
     "start_time": "2025-10-06T16:52:34.020932Z"
    }
   },
   "cell_type": "code",
   "source": "resp, image = chat([{\"role\": \"user\", \"content\": \"Give me a boarding pass from MXP to LHR for 1 Nov 2025 for Sergei Sevriugin\"}])",
   "id": "caa12b84a1863bc1",
   "outputs": [],
   "execution_count": 8
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.13"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
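The `chat` function in the notebook above follows the standard two-step tool-call protocol: the assistant turn that requested the tool is appended, then one `role: "tool"` message per call, each echoing its `tool_call_id`. A minimal offline sketch of that message bookkeeping, with the assistant turn faked rather than fetched from the API:

```python
import json

history = [{"role": "user", "content": "Show me Milan Airports on the map"}]
system_message = "You are a helpful assistant for a travel agency called TravelAI."
messages = [{"role": "system", "content": system_message}] + history

# Faked assistant turn requesting a tool call (in the notebook this comes back from the API)
assistant_turn = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{"id": "call_1", "type": "function",
                    "function": {"name": "get_map", "arguments": json.dumps({"city": "Milan"})}}],
}
messages.append(assistant_turn)

# Each tool result must echo the tool_call_id it answers
messages.append({"role": "tool", "tool_call_id": "call_1",
                 "content": json.dumps({"airports": ["MXP", "LIN", "BGY"]})})

print([m["role"] for m in messages])  # ['system', 'user', 'assistant', 'tool']
```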
100
week2/community-contributions/book_ticket_agent/map_generator.py
Normal file
@@ -0,0 +1,100 @@
from requests.exceptions import ChunkedEncodingError
from http.client import IncompleteRead
from googlemaps.maps import StaticMapMarker

import googlemaps
import time

def get_center(points):
    if not points:
        raise ValueError("points must be a non-empty list of coordinate objects")

    locations = []
    lats, lngs = [], []
    for p in points:
        g = p.get("geoCode") or {}
        lat = p.get("latitude", g.get("latitude"))
        lng = p.get("longitude", g.get("longitude"))
        if lat is None or lng is None:
            raise ValueError("Each point must include 'latitude' and 'longitude' (or 'geoCode.latitude'/'geoCode.longitude').")
        lat_f = float(lat)
        lng_f = float(lng)
        locations.append({"lat": lat_f, "lng": lng_f})
        lats.append(lat_f)
        lngs.append(lng_f)

    # Center at the centroid of the provided points
    center = (sum(lats) / len(lats), sum(lngs) / len(lngs))
    return center, locations


class MapGenerator:
    def __init__(self, google_map_api_key):
        self.client = googlemaps.Client(google_map_api_key)

    def fetch_static_map_bytes(
        self,
        center,
        markers,
        size=(400, 400),
        zoom=6,
        map_type="hybrid",
        img_format="png",
        scale=2,
        visible=None,
        max_retries=3,
        backoff_base=0.6,
    ):
        last_err = None
        for attempt in range(1, max_retries + 1):
            try:
                iterator = self.client.static_map(
                    size=size,
                    zoom=zoom,
                    center=center,
                    maptype=map_type,
                    format=img_format,
                    scale=scale,
                    visible=visible,
                    markers=markers,
                )
                return b"".join(chunk for chunk in iterator if chunk)
            except (ChunkedEncodingError, IncompleteRead) as e:
                last_err = e
                if attempt == max_retries:
                    break
                # Exponential backoff before retrying
                time.sleep(backoff_base * attempt)
        # If we got here, all retries failed; re-raise the last error so the caller sees the cause.
        raise last_err

    def generate(
        self,
        points,
        zoom=6,
        size=(600, 600),
        map_type="roadmap",
        color="blue",
        label=None,
        marker_size="mid"
    ):
        center, locations = get_center(points)

        sm_marker = StaticMapMarker(
            locations=locations,
            size=marker_size,
            color=color,
            label=label,
        )

        img_bytes = self.fetch_static_map_bytes(
            center=center,
            markers=[sm_marker],
            size=size,
            zoom=zoom,
            map_type=map_type,
            img_format="png",
            scale=2,
        )

        return img_bytes
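`get_center` centers the map at the arithmetic mean of the marker coordinates. A quick standalone check of just that centroid arithmetic, with the helper trimmed to its core and approximate coordinates for two Milan-area airports:

```python
# Trimmed copy of the centroid logic from get_center, for illustration only
def centroid(points):
    lats = [float(p["geoCode"]["latitude"]) for p in points]
    lngs = [float(p["geoCode"]["longitude"]) for p in points]
    return (sum(lats) / len(lats), sum(lngs) / len(lngs))

airports = [
    {"geoCode": {"latitude": 45.63, "longitude": 8.72}},  # MXP, approximate
    {"geoCode": {"latitude": 45.67, "longitude": 9.70}},  # BGY, approximate
]
lat, lng = centroid(airports)
print(round(lat, 2), round(lng, 2))  # 45.65 9.21
```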
272
week2/community-contributions/book_ticket_agent/tool_box.py
Normal file
@@ -0,0 +1,272 @@
import base64
import json
from travel_api import TravelAPI
from api_key_loader import ApiKeyLoader
from map_generator import MapGenerator
from openai import OpenAI
from typing import Any, Dict, List, Optional
from PIL import Image
import io
from io import BytesIO


# Internal function specs (simple) used to build the OpenAI-compatible tools list
_FUNCTION_SPECS: Dict[str, Dict[str, Any]] = {
    "get_flight": {
        "name": "get_flight",
        "description": (
            "Get flight options from the departure airport (origin), destination airport, date and number of adults. "
            "Before calling this function, you should have called 'get_airports' to get airport codes for the origin and destination - 2 calls in total. "
            "Call this when the client asks to book a flight, for example 'Book ticket to Paris on 2023-01-01'. If the origin or destination city is missing, ask the client to provide it first."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "origin_location_code": {
                    "type": "string",
                    "description": "IATA code of the origin airport, e.g. 'MAD'",
                },
                "destination_location_code": {
                    "type": "string",
                    "description": "IATA code of the destination airport, e.g. 'ATH'",
                },
                "departure_date": {
                    "type": "string",
                    "description": "Date of departure in 'YYYY-MM-DD'",
                },
                "adults": {
                    "type": "integer",
                    "description": "Number of adult passengers (default 1)",
                }
            },
            "required": ["origin_location_code", "destination_location_code", "departure_date"],
            "additionalProperties": False
        },
    },
    "get_airports": {
        "name": "get_airports",
        "description": (
            "Get airports for a city name using 'city'. Call this to resolve a city to airports. "
            "The response contains a list of airport objects. Use the selected airport's 'iataCode' for get_flight."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name to search airports for",
                },
            },
            "required": ["city"],
            "additionalProperties": False
        }
    },
    "get_map": {
        "name": "get_map",
        "description": "Generate a Google Static Map PNG of the airports for a given 'city'. Call this when the user asks to show a city's airports on the map.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name to search airports for and then show on the map",
                },
            },
            "required": ["city"],
            "additionalProperties": False
        }
    },
    "get_boarding_pass": {
        "name": "get_boarding_pass",
        "description": "Generate a boarding pass for a flight. Call this when the client asks for a boarding pass.",
        "parameters": {
            "type": "object",
            "properties": {
                "origin_location_code": {
                    "type": "string",
                    "description": "IATA code of the origin airport, e.g. 'MAD'",
                },
                "destination_location_code": {
                    "type": "string",
                    "description": "IATA code of the destination airport, e.g. 'ATH'",
                },
                "departure_date": {
                    "type": "string",
                    "description": "Date of departure in 'YYYY-MM-DD'",
                },
                "name": {
                    "type": "string",
                    "description": "Passenger name",
                }
            },
            "required": ["origin_location_code", "destination_location_code", "departure_date", "name"],
            "additionalProperties": False
        },
    }
}


def _to_openai_tools(specs: Dict[str, Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Convert simple specs into the OpenAI "tools" list schema."""
    tools: List[Dict[str, Any]] = []
    for spec in specs.values():
        tools.append({
            "type": "function",
            "function": {
                "name": spec["name"],
                "description": spec.get("description", ""),
                "parameters": spec.get("parameters", {"type": "object"}),
            }
        })
    return tools


def _tool_response(tool_call_id: Optional[str], payload: Dict[str, Any]) -> Dict[str, Any]:
    return {
        "role": "tool",
        "content": json.dumps(payload),
        "tool_call_id": tool_call_id,
    }


def _parse_args(raw_args: Any) -> Dict[str, Any]:
    if isinstance(raw_args, str):
        try:
            return json.loads(raw_args) if raw_args else {}
        except Exception:
            return {}
    if isinstance(raw_args, dict):
        return raw_args
    return {}


def _extract_tool_call(tool_call: Any):
    function = getattr(tool_call, "function", None) or (
        tool_call.get("function") if isinstance(tool_call, dict) else None
    )
    name = getattr(function, "name", None) or (
        function.get("name") if isinstance(function, dict) else None
    )
    raw_args = getattr(function, "arguments", None) or (
        function.get("arguments") if isinstance(function, dict) else None
    )
    call_id = getattr(tool_call, "id", None) or (
        tool_call.get("id") if isinstance(tool_call, dict) else None
    )
    return name, _parse_args(raw_args), call_id


class ToolBox:
    def __init__(self, keys: ApiKeyLoader):
        self.travel_api = TravelAPI(keys.get("client_id"), keys.get("client_secret"))
        self.map_generator = MapGenerator(keys.get("google_map_api_key"))
        self.openai = OpenAI(api_key=keys.get("openai"))
        self.tools = _to_openai_tools(_FUNCTION_SPECS)
        self._fn_dispatch = {
            "get_flight": self.get_flight,
            "get_airports": self.get_airports,
            "get_map": self.get_map,
            "get_boarding_pass": self.get_boarding_pass,
        }

    def get_flight(self, origin_location_code, destination_location_code, departure_date, adults=1):
        return self.travel_api.get_flight(origin_location_code, destination_location_code, departure_date,
                                          adults=adults)

    def get_airports(self, city):
        return self.travel_api.get_airport(city)

    def get_map(self, city):
        airports = self.travel_api.get_airport(city)
        return airports, self.map_generator.generate(airports)

    def get_toolset(self):
        return self.tools

    def get_boarding_pass(self, origin_location_code, destination_location_code, departure_date, name):
        image_response = self.openai.images.generate(
            model="dall-e-3",
            prompt=f"An image representing a boarding pass from {origin_location_code} to {destination_location_code} for {name} and departure date {departure_date}",
            size="1024x1024",
            n=1,
            response_format="b64_json",
        )
        image_base64 = image_response.data[0].b64_json
        image_data = base64.b64decode(image_base64)
        return Image.open(BytesIO(image_data))

    def apply(self, message):
        """Apply the tool calls contained in an assistant message; return (tool messages, optional PIL image)."""
        results: List[Dict[str, Any]] = []
        tool_calls = getattr(message, "tool_calls", None) or []
        if not tool_calls:
            return results, None

        generated_user_message: Optional[str] = None
        image = None

        for tool_call in tool_calls:
            function_name, arguments, call_id = _extract_tool_call(tool_call)

            if function_name == "get_flight":
                origin_location_code = arguments.get("origin_location_code")
                destination_location_code = arguments.get("destination_location_code")
                departure_date = arguments.get("departure_date")
                adults = arguments.get("adults") or 1

                options = self.get_flight(
                    origin_location_code,
                    destination_location_code,
                    departure_date,
                    adults=adults,
                )
                results.append(_tool_response(call_id, {"flight_options": options}))

            elif function_name == "get_boarding_pass":
                origin_location_code = arguments.get("origin_location_code")
                destination_location_code = arguments.get("destination_location_code")
                departure_date = arguments.get("departure_date")
                name = arguments.get("name")
                image = self.get_boarding_pass(origin_location_code, destination_location_code, departure_date, name)
                results.append(_tool_response(call_id, {"boarding_pass": f"boarding pass for {name} from {origin_location_code} to {destination_location_code} on {departure_date}."}))
                if generated_user_message is None:
                    generated_user_message = (
                        f"Here is my boarding pass for {name} from {origin_location_code} to {destination_location_code} on {departure_date}."
                    )

            elif function_name == "get_airports":
                city = arguments.get("city")
                airports = self.get_airports(city)
                results.append(_tool_response(call_id, {"airports": airports}))
                if generated_user_message is None:
                    generated_user_message = (
                        f"Here are the airports in {city}: {airports} Please help me to select one."
                    )

            elif function_name == "get_map":
                city = arguments.get("city")
                try:
                    airports, img_bytes = self.get_map(city)
                    if img_bytes:
                        try:
                            pil_img = Image.open(io.BytesIO(img_bytes))
                            pil_img.load()
                            if pil_img.mode not in ("RGB", "RGBA"):
                                pil_img = pil_img.convert("RGB")
                            image = pil_img
                        except Exception:
                            image = None
                    results.append(_tool_response(call_id, {"airports": airports}))
                    if generated_user_message is None:
                        generated_user_message = (
                            f"Here are the airports in {city}: {airports} Please help me to select one."
                        )
                except Exception as e:
                    results.append(_tool_response(call_id, {"error": f"get_map failed: {str(e)}"}))

            else:
                # Unknown tool: respond so the model can recover gracefully.
                results.append(_tool_response(call_id, {"error": f"Unknown tool: {function_name}"}))

        if generated_user_message:
            results.append({"role": "user", "content": generated_user_message})

        return results, image
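`_to_openai_tools` wraps each simple spec in the `{"type": "function", "function": {...}}` envelope the chat completions API expects. The same conversion in isolation, using a stub spec rather than the real `_FUNCTION_SPECS`:

```python
from typing import Any, Dict, List

# Stub spec standing in for _FUNCTION_SPECS
specs: Dict[str, Dict[str, Any]] = {
    "get_airports": {
        "name": "get_airports",
        "description": "Resolve a city to airports.",
        "parameters": {"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]},
    }
}

# Wrap each spec in the OpenAI tools envelope
tools: List[Dict[str, Any]] = [
    {"type": "function",
     "function": {"name": s["name"],
                  "description": s.get("description", ""),
                  "parameters": s.get("parameters", {"type": "object"})}}
    for s in specs.values()
]
print(tools[0]["function"]["name"])  # get_airports
```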
@@ -0,0 +1,58 @@
from amadeus import Client, Location, ResponseError

def filter_other_countries(airports):
    # Keep only the airports from the most frequent country code in the results
    if not airports:
        return airports
    country_code_weights = {}
    for airport in airports:
        country_code = airport["address"]["countryCode"]
        country_code_weights[country_code] = country_code_weights.get(country_code, 0) + 1
    country_code = max(country_code_weights, key=country_code_weights.get)
    return [airport for airport in airports if airport["address"]["countryCode"] == country_code]


class TravelAPI:
    def __init__(self, client_id, client_secret):
        self.client = Client(client_id=client_id, client_secret=client_secret)

    def get_airport(self, search):
        try:
            airport_locations = self.client.reference_data.locations.get(
                keyword=search,
                subType=Location.AIRPORT,
            )
            return filter_other_countries(airport_locations.data)
        except ResponseError as e:
            print(f"Amadeus API ResponseError in get_airport: {e}")
            return []
        except Exception as e:
            print(f"Unexpected error in get_airport: {e}")
            return []

    def get_city(self, search, country_code="IT"):
        try:
            city_locations = self.client.reference_data.locations.get(
                keyword=search,
                subType=Location.CITY,
                countryCode=country_code
            )
            return city_locations.data
        except ResponseError as e:
            print(f"Amadeus API ResponseError in get_city: {e}")
            return []
        except Exception as e:
            print(f"Unexpected error in get_city: {e}")
            return []

    def get_flight(self, origin_location_code, destination_location_code, departure_date, adults=1):
        try:
            offers = self.client.shopping.flight_offers_search.get(
                originLocationCode=origin_location_code,
                destinationLocationCode=destination_location_code,
                departureDate=departure_date,
                adults=adults)
            return offers.data
        except ResponseError as e:
            print(f"Amadeus API ResponseError in get_flight: {e}")
            return []
        except Exception as e:
            print(f"Unexpected error in get_flight: {e}")
            return []
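`filter_other_countries` keeps only the majority country in the Amadeus results, so a keyword search for "Milan" does not mix in similarly named airports abroad. A standalone check of that majority filter on a fabricated result set:

```python
# Copy of the majority-country filter, applied to fabricated Amadeus-style records
def filter_other_countries(airports):
    if not airports:
        return airports
    weights = {}
    for airport in airports:
        cc = airport["address"]["countryCode"]
        weights[cc] = weights.get(cc, 0) + 1
    top = max(weights, key=weights.get)
    return [a for a in airports if a["address"]["countryCode"] == top]

sample = [
    {"iataCode": "MXP", "address": {"countryCode": "IT"}},
    {"iataCode": "LIN", "address": {"countryCode": "IT"}},
    {"iataCode": "LUG", "address": {"countryCode": "CH"}},  # minority country, filtered out
]
print([a["iataCode"] for a in filter_other_countries(sample)])  # ['MXP', 'LIN']
```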
@@ -0,0 +1,245 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "2a8071da",
   "metadata": {},
   "source": [
    "Week 2, Day 4 - Use of Tools with OpenAI - Saurabh Gupta. For any feedback, please mail srbmisc@gmail.com; I would love to connect with you to discuss this further and potentially improve it. Thank you for reading.\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "08f50ff5",
   "metadata": {},
   "source": [
    "# 🔍 LinkedIn Job Search Conversationally with OpenAI and JobSpy\n",
    "\n",
    "Looking for a job on LinkedIn but tired of manually scrolling through endless listings? This Python project simplifies your job hunt by automating the process of searching LinkedIn job postings based on your **desired role** and **location**.\n",
    "\n",
    "## 🚀 What It Does\n",
    "\n",
    "Using the powerful Python scraping library **`jobspy`** [Github Link](https://github.com/speedyapply/JobSpy), this tool fetches **real-time job listings from LinkedIn** tailored to your input. Just provide:\n",
    "\n",
    "- 🧑💻 **Job Title** (e.g., *Data Scientist*, *Frontend Developer*)\n",
    "- 📍 **Location** (e.g., *New York*, *Remote*, *Berlin*)\n",
    "\n",
    "…and the script will return up to **3 curated job listings** matching your criteria — complete with **direct links to the job posts** on LinkedIn.\n",
    "\n",
    "## ✅ Key Features\n",
    "\n",
    "- 🔎 Real-time LinkedIn job scraping\n",
    "- ⚡ Fast and minimalistic – only the top 3 results\n",
    "- 🔗 Includes clickable links to the job listings\n",
    "- 💬 Simple CLI input — no setup hassles\n",
    "\n",
    "## 📌 Use Case\n",
    "\n",
    "Whether you're actively job hunting or just exploring opportunities in your field, this tool helps you **stay updated** with fresh job listings — without opening a browser.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "af6d6650",
   "metadata": {},
   "outputs": [],
   "source": [
    "try:\n",
    "    from jobspy import scrape_jobs\n",
    "except ImportError:\n",
    "    %pip install -U python-jobspy\n",
    "    from jobspy import scrape_jobs"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a08eb7af",
   "metadata": {},
   "outputs": [],
   "source": [
    "# imports\n",
    "\n",
    "import os\n",
    "import json\n",
    "from dotenv import load_dotenv\n",
    "from openai import OpenAI\n",
    "import gradio as gr\n",
    "import pandas as pd"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "11a280b1",
   "metadata": {},
   "outputs": [],
   "source": [
    "load_dotenv(override=True)\n",
    "\n",
    "openai_api_key = os.getenv('OPENAI_API_KEY')\n",
    "if openai_api_key:\n",
    "    print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
    "else:\n",
    "    print(\"OpenAI API Key not set\")\n",
    "\n",
    "MODEL = \"gpt-4.1-mini\"\n",
    "openai = OpenAI()\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "dbac4cb6",
   "metadata": {},
   "outputs": [],
   "source": [
    "def callExternalAPI(title, location) -> str:\n",
    "    jobs = pd.DataFrame(scrape_jobs(\n",
    "        site_name=[\"linkedin\"],\n",
    "        search_term=title,\n",
    "        location=location,\n",
    "        results_wanted=3,\n",
    "        hours_old=48\n",
    "    ))\n",
    "    print(f\"Found {len(jobs)} jobs\")\n",
    "    if len(jobs) > 0:\n",
    "        df = pd.DataFrame(jobs.loc[:, ['title', 'company', 'location', 'date_posted', 'job_url']])\n",
    "        df = df.fillna('N/A')\n",
    "        return df.to_json()\n",
    "    return 'No results found'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b44a92f6",
   "metadata": {},
   "outputs": [],
   "source": [
    "system_message = '''You are a helpful assistant for a jobs agency who helps users with the job listings available.\n",
    "Give short, courteous answers to the users. Tell the user you can bring at most 3 job listings at once. Present the job listings to the user in nice Markdown.\n",
    "Always be accurate. If you don't know the answer, say so.'''"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "339ac21b",
   "metadata": {},
   "outputs": [],
   "source": [
    "getData_function = {\n",
    "    \"name\": \"callExternalAPI\",\n",
    "    \"description\": '''Get the job listings for the provided job title and/or location. Call this whenever the user needs specific job listings data for the title or location that they provide. Here are a few examples:\n",
    "    1. \"Give me the jobs available for Software Engineer at Gurugram, India.\" - In this case, call this function passing job_title as Software Engineer and job_location as Gurugram, India.\n",
    "    2. \"Give me the jobs available for Software Engineer.\" - In this case, call this function passing job_title as Software Engineer and job_location as a blank string.\n",
    "    3. \"Give me the jobs available at Gurugram, India.\" - In this case, call this function passing job_title as a blank string and job_location as Gurugram, India.\n",
    "    ''',\n",
    "    \"parameters\": {\n",
    "        \"type\": \"object\",\n",
    "        \"properties\": {\n",
    "            \"job_title\": {\n",
    "                \"type\": \"string\",\n",
    "                \"description\": \"The title or position or designation of the job the user is interested in\",\n",
    "            },\n",
    "            \"job_location\": {\n",
    "                \"type\": \"string\",\n",
    "                \"description\": \"The location of the job the user is interested in\",\n",
    "            }\n",
    "        },\n",
    "        \"required\": [\"job_title\", \"job_location\"],\n",
    "        \"additionalProperties\": False\n",
|
||||
" }\n",
|
||||
"}"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "892250a4",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"tools = [{\"type\": \"function\", \"function\": getData_function}]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "4be6fd80",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# We have to write that function handle_tool_call:\n",
|
||||
"\n",
|
||||
"def handle_tool_call(message):\n",
|
||||
" tool_call = message.tool_calls[0]\n",
|
||||
" arguments = json.loads(tool_call.function.arguments)\n",
|
||||
" job_title = arguments.get('job_title')\n",
|
||||
" job_location = arguments.get('job_location')\n",
|
||||
" jobs = callExternalAPI(job_title,job_location)\n",
|
||||
" response = {\n",
|
||||
" \"role\": \"tool\",\n",
|
||||
" \"content\": json.dumps({\"job_title\": job_title,\"job_location\": job_location,\"job_listings\": jobs}),\n",
|
||||
" \"tool_call_id\": tool_call.id\n",
|
||||
" }\n",
|
||||
" return response"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "9d0b5b08",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def chat(message, history):\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
|
||||
" response = openai.chat.completions.create(model=MODEL, messages=messages, tools=tools)\n",
|
||||
"\n",
|
||||
" if response.choices[0].finish_reason==\"tool_calls\":\n",
|
||||
" message = response.choices[0].message\n",
|
||||
" response = handle_tool_call(message)\n",
|
||||
" messages.append(message)\n",
|
||||
" messages.append(response)\n",
|
||||
" response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
|
||||
" \n",
|
||||
" return response.choices[0].message.content"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "085b4aae",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"gr.ChatInterface(fn=chat, type=\"messages\").launch()"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "venv",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.12.11"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
115
week2/community-contributions/llms-chat-room/llm_bot.py
Normal file
115
week2/community-contributions/llms-chat-room/llm_bot.py
Normal file
@@ -0,0 +1,115 @@
|
||||
import os
|
||||
|
||||
from dotenv import load_dotenv
|
||||
from openai import OpenAI
|
||||
import anthropic
|
||||
|
||||
LLM_BOT_CONFIGS = {
|
||||
"gpt": {
|
||||
"id": "gpt",
|
||||
"name": "Giorgio",
|
||||
"model": "gpt-4.1-nano",
|
||||
"api_key_env": "OPENAI_API_KEY",
|
||||
},
|
||||
"claude": {
|
||||
"id": "claude",
|
||||
"name": "Anna",
|
||||
"model": "claude-sonnet-4-20250514",
|
||||
"api_key_env": "ANTHROPIC_API_KEY",
|
||||
},
|
||||
"gemini": {
|
||||
"id": "gemini",
|
||||
"name": "Isabella",
|
||||
"model": "gemini-2.0-flash",
|
||||
"base_url": "https://generativelanguage.googleapis.com/v1beta/openai/",
|
||||
"api_key_env": "GOOGLE_API_KEY",
|
||||
},
|
||||
"openai": {
|
||||
"id": "openai",
|
||||
"name": "Marco",
|
||||
"model": "gpt-4o-mini",
|
||||
"api_key_env": "OPENAI_API_KEY",
|
||||
},
|
||||
"deepseek": {
|
||||
"id": "deepseek",
|
||||
"name": "Roberto",
|
||||
"model": "deepseek-chat",
|
||||
"base_url": "https://api.deepseek.com",
|
||||
"api_key_env": "DEEPSEEK_API_KEY"
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
def load_api_key():
|
||||
"""Load .env and ensure all required API keys from LLM_BOT_CONFIGS are present.
|
||||
|
||||
- Reads environment variables from .env without overriding already-set envs.
|
||||
- Collects unique api_key_env names from LLM_BOT_CONFIGS.
|
||||
- Raises a RuntimeError listing any missing variables.
|
||||
"""
|
||||
# Load from .env but do not override variables already set in the environment
|
||||
load_dotenv(override=False)
|
||||
|
||||
required_env_vars = {cfg["api_key_env"] for cfg in LLM_BOT_CONFIGS.values() if "api_key_env" in cfg}
|
||||
|
||||
missing = []
|
||||
for var in sorted(required_env_vars):
|
||||
val = os.getenv(var)
|
||||
if not val:
|
||||
missing.append(var)
|
||||
|
||||
if missing:
|
||||
raise RuntimeError(
|
||||
"Missing required API key environment variables: "
|
||||
+ ", ".join(missing)
|
||||
+ ". Please add them to your .env file or export them in your environment."
|
||||
)
|
||||
|
||||
|
||||
class LLMBot:
|
||||
|
||||
def __init__(self, llm, subject, name=None):
|
||||
if llm not in LLM_BOT_CONFIGS:
|
||||
raise ValueError(f"Unknown LLM provider '{llm}'. Available: {', '.join(LLM_BOT_CONFIGS.keys())}")
|
||||
load_api_key()
|
||||
self.configuration = LLM_BOT_CONFIGS[llm]
|
||||
self.subject = subject
|
||||
api_key = os.getenv(self.configuration["api_key_env"])
|
||||
self.llm = llm
|
||||
self.name = name or self.configuration["name"]
|
||||
self.system_prompt = \
|
||||
(f"You are {self.name}, a person talking to other persons in one room discussing the {subject}.\
|
||||
If you are first to speak about the {subject} please describe it in detail to other people and express your opinion.\
|
||||
Talk politely and without any prejudices. Be short as other people also would like to express own opinion. If other people already speak about the {subject} \
|
||||
please make a comment adding some value to the conversation.")
|
||||
base_url = self.configuration.get("base_url")
|
||||
if self.llm == "claude" and not base_url:
|
||||
self.client = anthropic.Anthropic(api_key=api_key)
|
||||
else:
|
||||
self.client = OpenAI(api_key=api_key, base_url=base_url)
|
||||
|
||||
def get_response(self, prompt):
|
||||
if self.llm == "claude":
|
||||
# Anthropic API: system prompt is provided separately; messages contain user/assistant turns
|
||||
messages = [
|
||||
{"role": "user", "content": prompt},
|
||||
]
|
||||
response = self.client.messages.create(
|
||||
model=self.configuration["model"],
|
||||
max_tokens=200,
|
||||
temperature=0.7,
|
||||
system=self.system_prompt,
|
||||
messages=messages,
|
||||
)
|
||||
return response.content[0].text
|
||||
else:
|
||||
# OpenAI-compatible chat completion API (OpenAI, DeepSeek, Gemini via OpenAI shim)
|
||||
messages = [
|
||||
{"role": "system", "content": self.system_prompt},
|
||||
{"role": "user", "content": prompt}
|
||||
]
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.configuration["model"],
|
||||
messages=messages,
|
||||
)
|
||||
return response.choices[0].message.content
|
||||
@@ -0,0 +1,40 @@
|
||||
import random
|
||||
from llm_bot import LLMBot
|
||||
|
||||
class LLMChatRoom:
|
||||
def __init__(self, llms, subject):
|
||||
self.subject = subject
|
||||
self.bots = [LLMBot(llm, subject) for llm in llms]
|
||||
self.messages = []
|
||||
|
||||
def select_speaker(self):
|
||||
if not self.bots:
|
||||
return None
|
||||
return random.choice(self.bots)
|
||||
|
||||
def next_message(self, speaker, prompt):
|
||||
if speaker:
|
||||
response = speaker.get_response(prompt)
|
||||
self.messages.append({"name": speaker.name, "content": response})
|
||||
|
||||
def get_prompt(self):
|
||||
prompt = f"Talk about {self.subject}:\n"
|
||||
for message in self.messages:
|
||||
prompt += f"{message['name']}: {message['content']}\n"
|
||||
return prompt
|
||||
|
||||
def get_last_message(self):
|
||||
return self.messages[-1] if self.messages else None
|
||||
|
||||
def talk(self):
|
||||
speaker = self.select_speaker()
|
||||
prompt = self.get_prompt()
|
||||
self.next_message(speaker, prompt)
|
||||
|
||||
def chat(self, num_turns=10):
|
||||
self.messages = []
|
||||
for _ in range(num_turns):
|
||||
self.talk()
|
||||
last_message = self.get_last_message()
|
||||
print(f"{last_message['name']}: {last_message['content']}")
|
||||
print("-"* 80)
|
||||
@@ -0,0 +1,118 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"metadata": {
|
||||
"collapsed": true,
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T11:54:55.134153Z",
|
||||
"start_time": "2025-10-03T11:54:55.132338Z"
|
||||
}
|
||||
},
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"### LLMs Chat Room\n",
|
||||
"Several LLMs model discussion the topic inside the chat room\n"
|
||||
],
|
||||
"id": "b317cfdc98c19cf2"
|
||||
},
|
||||
{
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T14:29:44.656272Z",
|
||||
"start_time": "2025-10-03T14:29:44.261197Z"
|
||||
}
|
||||
},
|
||||
"cell_type": "code",
|
||||
"source": "from llm_chat_room import LLMChatRoom",
|
||||
"id": "33d224807d0b0be6",
|
||||
"outputs": [],
|
||||
"execution_count": 1
|
||||
},
|
||||
{
|
||||
"metadata": {},
|
||||
"cell_type": "markdown",
|
||||
"source": "",
|
||||
"id": "4f00eb9db0c25957"
|
||||
},
|
||||
{
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T14:32:59.878365Z",
|
||||
"start_time": "2025-10-03T14:32:59.845975Z"
|
||||
}
|
||||
},
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"llms = [\"gpt\",\"claude\",\"gemini\"]\n",
|
||||
"subject = \"Pets\"\n",
|
||||
"room = LLMChatRoom(llms, subject)"
|
||||
],
|
||||
"id": "70746449bfd70fcf",
|
||||
"outputs": [],
|
||||
"execution_count": 6
|
||||
},
|
||||
{
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T14:33:15.074067Z",
|
||||
"start_time": "2025-10-03T14:33:02.158650Z"
|
||||
}
|
||||
},
|
||||
"cell_type": "code",
|
||||
"source": "room.chat(5)",
|
||||
"id": "1b717ae03a6c48c0",
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Anna: Hello everyone! I'd love to share my thoughts about pets since we're discussing them today.\n",
|
||||
"\n",
|
||||
"I think pets bring such wonderful companionship to our lives. Whether it's a loyal dog who greets you at the door with pure joy, a cat who curls up on your lap during quiet evenings, or even smaller pets like birds or fish that add life to a home - they all offer something special.\n",
|
||||
"\n",
|
||||
"What I find most remarkable is how pets teach us responsibility and empathy. Caring for another living being, making sure they're fed, healthy, and loved, really helps us grow as people. And the unconditional love they give back is just incredible.\n",
|
||||
"\n",
|
||||
"Of course, I know pet ownership isn't right for everyone - it's a big commitment that requires time, money, and lifestyle adjustments. But for those who can provide a good home, I think the bond between humans and pets is truly one of life's great joys.\n",
|
||||
"\n",
|
||||
"What are\n",
|
||||
"--------------------------------------------------------------------------------\n",
|
||||
"Isabella: Hello Anna!\n",
|
||||
"\n",
|
||||
"I agree wholeheartedly! Pets truly enrich our lives in so many ways. The unconditional love and companionship they offer are invaluable. It's amazing how they teach us responsibility and empathy.\n",
|
||||
"\n",
|
||||
"--------------------------------------------------------------------------------\n",
|
||||
"Isabella: I think the best part is that pets come in all shapes and sizes, each with unique quirks and personalities. It is important to consider not only dogs and cats, but also birds, reptiles, and even insects that can bring joy and fulfillment to people's lives. What do you think?\n",
|
||||
"\n",
|
||||
"--------------------------------------------------------------------------------\n",
|
||||
"Giorgio: Hello everyone! If I may add a little, I believe that regardless of the kind of pet, the key is the mutual respect and understanding we develop with them. They truly become part of our families and teach us patience and compassion. It's always wonderful to see how each type of pet has its own way of bonding with people.\n",
|
||||
"--------------------------------------------------------------------------------\n",
|
||||
"Giorgio: Hello everyone! I completely agree with Giorgio. Pets, no matter their size or species, have a unique way of connecting with us and enriching our daily lives. Their loyalty and innocence often remind us to appreciate the simple joys and to be more patient and caring. It’s truly a special relationship that benefits both sides.\n",
|
||||
"--------------------------------------------------------------------------------\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"execution_count": 7
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 2
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython2",
|
||||
"version": "2.7.6"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
@@ -0,0 +1,405 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "06cf3063-9f3e-4551-a0d5-f08d9cabb927",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Week 2\n",
|
||||
"\n",
|
||||
"## Restaurant Order Processing\n",
|
||||
"\n",
|
||||
"We going to use 2 LLMs that will talk to each other, one as a customer and another of as a waiter taking order."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "de23bb9e-37c5-4377-9a82-d7b6c648eeb6",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:00.461895Z",
|
||||
"start_time": "2025-10-03T10:08:00.056696Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"# imports\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"from openai import OpenAI\n",
|
||||
"import anthropic\n",
|
||||
"from IPython.display import Markdown, display, update_display"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 1
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "f0a8ab2b-6134-4104-a1bc-c3cd7ea4cd36",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:02.718709Z",
|
||||
"start_time": "2025-10-03T10:08:02.286926Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"# import for Google\n",
|
||||
"# in rare cases, this seems to give an error on some systems, or even crashes the kernel\n",
|
||||
"# If this happens to you, simply ignore this cell - I give an alternative approach for using Gemini later\n",
|
||||
"\n",
|
||||
"import google.generativeai"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 2
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "1179b4c5-cd1f-4131-a876-4c9f3f38d2ba",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:05.326683Z",
|
||||
"start_time": "2025-10-03T10:08:05.317765Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"# Load environment variables in a file called .env\n",
|
||||
"# Print the key prefixes to help with any debugging.\n",
|
||||
"\n",
|
||||
"load_dotenv(override=True)\n",
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')\n",
|
||||
"google_api_key = os.getenv('GOOGLE_API_KEY')\n",
|
||||
"deepseek_api_key = os.getenv('DEEPSEEK_API_KEY')\n",
|
||||
"\n",
|
||||
"if openai_api_key:\n",
|
||||
" print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"OpenAI API Key not set\")\n",
|
||||
" \n",
|
||||
"if anthropic_api_key:\n",
|
||||
" print(f\"Anthropic API Key exists and begins {anthropic_api_key[:7]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"Anthropic API Key not set\")\n",
|
||||
"\n",
|
||||
"if google_api_key:\n",
|
||||
" print(f\"Google API Key exists and begins {google_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"Google API Key not set\")\n",
|
||||
"\n",
|
||||
"if deepseek_api_key:\n",
|
||||
" print(f\"Deepseek API Key exists and begins {deepseek_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"Deepseek API Key not set\")"
|
||||
],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"OpenAI API Key exists and begins sk-proj-\n",
|
||||
"Anthropic API Key exists and begins sk-ant-\n",
|
||||
"Google API Key exists and begins AIzaSyB_\n",
|
||||
"Deepseek API Key exists and begins sk-97f1d\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"execution_count": 3
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "797fe7b0-ad43-42d2-acf0-e4f309b112f0",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:09.163792Z",
|
||||
"start_time": "2025-10-03T10:08:09.106472Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"# Connect to OpenAI, Anthropic, and DeepSeek\n",
|
||||
"\n",
|
||||
"openai = OpenAI()\n",
|
||||
"\n",
|
||||
"claude = anthropic.Anthropic()\n",
|
||||
"\n",
|
||||
"deepseek = OpenAI(api_key=deepseek_api_key, base_url=\"https://api.deepseek.com\")"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 4
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "bcb54183-45d3-4d08-b5b6-55e380dfdf1b",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:11.291179Z",
|
||||
"start_time": "2025-10-03T10:08:11.288807Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"# Let's make a conversation between GPT-4.1-mini and Claude-3.5-haiku\n",
|
||||
"# GPT-4.1-mini - a waiter and Claude-3.5-haiku - a client.\n",
|
||||
"\n",
|
||||
"gpt_model = \"gpt-4.1-mini\"\n",
|
||||
"claude_model = \"claude-3-5-haiku-latest\"\n",
|
||||
"\n",
|
||||
"gpt_system = \"You are a waiter in a restaurant. You are very friendly and polite; \\\n",
|
||||
"you need to take a order from a client using the following menu: \\\n",
|
||||
" Antipasti (Starters) \\\n",
|
||||
" - Bruschetta al Pomodoro - €5.00 \\\n",
|
||||
" - Crostini Toscani - €7.00 \\\n",
|
||||
" - Tagliere di Salumi - €9.00 \\\n",
|
||||
" Primi Piatti (First Courses) \\\n",
|
||||
" - Gnocchetti di Patate con Erbette - €10.00 \\\n",
|
||||
" - Paccheri con Polipetti - €12.00 \\\n",
|
||||
" - Risotto ai Frutti di Mare - €15.00 \\\n",
|
||||
" - Tagliolini al Tartufo - €14.00 \\\n",
|
||||
" - Zuppa di Cipolle - €8.00 \\\n",
|
||||
" Secondi Piatti (Second Courses) \\\n",
|
||||
" - Filetto di Manzo - €18.00 \\\n",
|
||||
" - Pollo alla Griglia - €12.00 \\\n",
|
||||
" - Branzino al Forno - €17.00 \\\n",
|
||||
" Pizze (Main) \\\n",
|
||||
" - Margherita - €8.00 \\\n",
|
||||
" - Diavola - €10.00 \\\n",
|
||||
" - Funghi e Prosciutto - €11.00 \\\n",
|
||||
" - Vegetariana - €9.50 \\\n",
|
||||
" Dolci (Dessert) \\\n",
|
||||
" - Tiramisu - €5.00 \\\n",
|
||||
" - Panna Cotta - €5.50 \\\n",
|
||||
" - Gelato - €4.00 \\\n",
|
||||
" Bevande (Beverages) \\\n",
|
||||
" - Acqua Naturale / Frizzante - €2.50 \\\n",
|
||||
" - Birra Artigianale - €4.00 \\\n",
|
||||
" - Vino della Casa - €5.50 / glass \\\n",
|
||||
"You need to ask about what client wants to eat and drink, checking menu for availability.\"\n",
|
||||
"\n",
|
||||
"claude_system = \"You are a client of the italian restaurant spiking to waiter. You would like to \\\n",
|
||||
"order something to eat and drink. You like pizza and meat. For drink you like sparking wine. Make a joke \\\n",
|
||||
"about your favorite food and drink when ordering.\"\n",
|
||||
"\n",
|
||||
"gpt_messages = [\"Hello, are you ready to make an order?\"]\n",
|
||||
"claude_messages = [\"Hi, sure. What pizza do you have?\"]"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 5
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "1df47dc7-b445-4852-b21b-59f0e6c2030f",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:15.002601Z",
|
||||
"start_time": "2025-10-03T10:08:14.999845Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"def call_gpt():\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": gpt_system}]\n",
|
||||
" for gpt, claude in zip(gpt_messages, claude_messages):\n",
|
||||
" messages.append({\"role\": \"assistant\", \"content\": gpt})\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": claude})\n",
|
||||
" completion = openai.chat.completions.create(\n",
|
||||
" model=gpt_model,\n",
|
||||
" messages=messages\n",
|
||||
" )\n",
|
||||
" return completion.choices[0].message.content"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 6
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "9dc6e913-02be-4eb6-9581-ad4b2cffa606",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:26.851569Z",
|
||||
"start_time": "2025-10-03T10:08:24.701304Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"call_gpt()"
|
||||
],
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"'We have four types of pizza available: Margherita for €8.00, Diavola for €10.00, Funghi e Prosciutto for €11.00, and Vegetariana for €9.50. Which one would you like to try?'"
|
||||
]
|
||||
},
|
||||
"execution_count": 7,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"execution_count": 7
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "7d2ed227-48c9-4cad-b146-2c4ecbac9690",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:08:31.932950Z",
|
||||
"start_time": "2025-10-03T10:08:31.930093Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"def call_claude():\n",
|
||||
" # Build the interleaved conversation history for Claude.\n",
|
||||
" # zip truncates to the shortest list, so it will pair up the common turns only.\n",
|
||||
" messages = []\n",
|
||||
" for gpt, claude_message in zip(gpt_messages, claude_messages):\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": gpt})\n",
|
||||
" messages.append({\"role\": \"assistant\", \"content\": claude_message})\n",
|
||||
"\n",
|
||||
" # Append the latest GPT message as the final user turn ONLY when GPT is one ahead.\n",
|
||||
" # This is the normal flow (we call call_gpt(), append its output to gpt_messages, then call call_claude()).\n",
|
||||
" if len(gpt_messages) == len(claude_messages) + 1:\n",
|
||||
" messages.append({\"role\": \"user\", \"content\": gpt_messages[-1]})\n",
|
||||
"\n",
|
||||
" # Validate that the prompt we send to Claude ends with a user turn.\n",
|
||||
" # If not, we likely called call_claude() out of order (e.g., with equal lengths).\n",
|
||||
" if not messages or messages[-1][\"role\"] != \"user\":\n",
|
||||
" raise ValueError(\n",
|
||||
" \"call_claude expected GPT to be one message ahead so Claude can reply to a user turn. \"\n",
|
||||
" f\"Got len(gpt_messages)={len(gpt_messages)} and len(claude_messages)={len(claude_messages)}.\"\n",
|
||||
" )\n",
|
||||
"\n",
|
||||
" message = claude.messages.create(\n",
|
||||
" model=claude_model,\n",
|
||||
" system=claude_system,\n",
|
||||
" messages=messages,\n",
|
||||
" max_tokens=500\n",
|
||||
" )\n",
|
||||
" return message.content[0].text"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": 8
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "01395200-8ae9-41f8-9a04-701624d3fd26",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"call_claude()"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": null
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "08c2279e-62b0-4671-9590-c82eb8d1e1ae",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"call_gpt()"
|
||||
],
|
||||
"outputs": [],
|
||||
"execution_count": null
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"id": "0275b97f-7f90-4696-bbf5-b6642bd53cbd",
|
||||
"metadata": {
|
||||
"ExecuteTime": {
|
||||
"end_time": "2025-10-03T10:10:54.588210Z",
|
||||
"start_time": "2025-10-03T10:10:26.440634Z"
|
||||
}
|
||||
},
|
||||
"source": [
|
||||
"gpt_messages = [\"Hello, are you ready to make an order?\"]\n",
|
||||
"claude_messages = [\"Hi, yes, sure. What you have special today?\"]\n",
|
||||
"\n",
|
||||
"print(f\"Waiter:\\n{gpt_messages[0]}\\n\")\n",
|
||||
"print(f\"Client:\\n{claude_messages[0]}\\n\")\n",
|
||||
"\n",
|
||||
"for i in range(5):\n",
|
||||
" gpt_next = call_gpt()\n",
|
||||
" print(f\"Waiter:\\n{gpt_next}\\n\")\n",
|
||||
" gpt_messages.append(gpt_next)\n",
|
||||
" \n",
|
||||
" claude_next = call_claude()\n",
|
||||
" print(f\"Client:\\n{claude_next}\\n\")\n",
|
||||
" claude_messages.append(claude_next)"
|
||||
],
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"Waiter:\n",
|
||||
"Hello, are you ready to make an order?\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"Hi, yes, sure. What you have special today?\n",
|
||||
"\n",
|
||||
"Waiter:\n",
|
||||
"Hello! Today we have some lovely specials that I think you'll enjoy. Our Risotto ai Frutti di Mare is especially fresh and flavorful this evening, and our Tagliolini al Tartufo is a wonderful treat if you love truffle flavors. For starters, the Tagliere di Salumi is a great choice to sample a variety of delicious cured meats. For dessert, our Tiramisu is always a favorite. What type of dishes do you usually enjoy? This way, I can help you pick something perfect! Would you also like to hear about our beverages?\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"*chuckles* Well, I'm definitely a pizza and meat lover! I'd love to hear about your meat options. And since you mentioned beverages - do you have any sparkling wine? Because you know, I always say a good sparkling wine is like a party in a glass, and I'm ready to be the guest of honor! *winks*\n",
|
||||
"\n",
|
||||
"I'm thinking maybe a nice pizza with some delicious meat toppings, and of course, a crisp prosecco to go with it. I'm particularly interested in anything with prosciutto or salami. Could you recommend something that would make my taste buds do a little Italian dance?\n",
|
||||
"\n",
|
||||
"Waiter:\n",
|
||||
"*chuckles* I love your enthusiasm! For a pizza with meat toppings, I highly recommend our Funghi e Prosciutto pizza – it comes topped with savory prosciutto and mushrooms, a perfect combination to make your taste buds dance! Another great choice is the Diavola, which has spicy salami if you're in the mood for something with a bit of a kick.\n",
|
||||
"\n",
|
||||
"Unfortunately, we don’t have sparkling wine on the menu at the moment, but we do offer a refreshing Birra Artigianale (craft beer) and a lovely Vino della Casa by the glass if you'd like to try either of those. Would you like to go with the Funghi e Prosciutto pizza and perhaps a glass of our house wine, or would you prefer to hear more options? And would you like to start with any starters or go straight to your main?\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"*dramatically places hand on heart* No sparkling wine? Oh, the tragedy! But fear not, I shall survive this Italian culinary adventure. *winks*\n",
|
||||
"\n",
|
||||
"I'll definitely go for the Funghi e Prosciutto pizza - sounds like a match made in heaven! And since you don't have my beloved bubbles, I'll take a glass of your house wine. A little vino always makes a pizza taste better, right? \n",
|
||||
"\n",
|
||||
"*leans in conspiratorially* You know, I always say pizza and wine are like best friends - they might look different, but they always have each other's back! *chuckles at own joke*\n",
|
||||
"\n",
|
||||
"I'll skip the starter and go straight to the main event. My stomach is already doing a happy dance just thinking about that prosciutto and mushroom pizza. Bring it on!\n",
|
||||
"\n",
|
||||
"Waiter:\n",
|
||||
"*smiling* What a fantastic choice! The Funghi e Prosciutto pizza paired with a glass of our lovely house wine will definitely make your taste buds happy – I love your pizza and wine friendship analogy, it’s absolutely spot on!\n",
|
||||
"\n",
|
||||
"I'll get your order in right away. If you need anything else while waiting or want to add a dessert later, just let me know. Buon appetito, and enjoy your Italian culinary adventure! 🎉🍕🍷\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"Grazie mille! *raises imaginary wine glass* To the beautiful friendship of pizza and wine, may they always dance together on my plate and in my heart! *winks and chuckles*\n",
|
||||
"\n",
|
||||
"I'm looking forward to this delicious adventure. Salute!\n",
|
||||
"\n",
|
||||
"Waiter:\n",
|
||||
"Salute! *raises imaginary glass with you* To the perfect pairing of pizza and wine, may every bite and sip bring you joy and many happy dances! If you need anything else during your meal, I'm right here to help. Enjoy your Funghi e Prosciutto pizza and house wine – a true Italian celebration on your plate! Buon appetito! 😊🍕🍷\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"*takes a playful bow* Grazie! I'll be the most entertained diner in the restaurant tonight! *grins and winks* Can't wait to see this pizza that's about to perform its delicious culinary dance for me! Bring on the prosciutto and mushroom magic! 🇮🇹🍕🥂\n",
|
||||
"\n",
|
||||
"Waiter:\n",
|
||||
"*smiling warmly* I can already tell you're going to make this evening unforgettable with your wonderful spirit! Your pizza with its prosciutto and mushroom magic is on its way, ready to perform that delicious dance just for you. If there's anything else I can do to make your experience even better, don’t hesitate to let me know. Enjoy every bite and sip – it's your night to shine! 🇮🇹🍕🥂\n",
|
||||
"\n",
|
||||
"Client:\n",
|
||||
"*gives a theatrical thumbs up* Fantastico! I'm ready for my pizza performance! Who needs a dance floor when you have a delicious pizza stage? *winks and chuckles*\n",
|
||||
"\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"execution_count": 9
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.11.13"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
Binary file not shown.
BIN
week2/community-contributions/pptx_summarizer/Marie_Curie.pptx
Normal file
Binary file not shown.
@@ -0,0 +1,378 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "ddfa9ae6-69fe-444a-b994-8c4c5970a7ec",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Project - PPTX summarizer AI Assistant\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 1,
|
||||
"id": "8b50bbe2-c0b1-49c3-9a5c-1ba7efa2bcb4",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# imports\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"import json\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"from openai import OpenAI\n",
|
||||
"import gradio as gr\n",
|
||||
"from IPython.display import Markdown, display, update_display"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"id": "747e8786-9da8-4342-b6c9-f5f69c2e22ae",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"OpenAI API Key exists and begins sk-proj-\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# Initialization\n",
|
||||
"\n",
|
||||
"load_dotenv(override=True)\n",
|
||||
"\n",
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"if openai_api_key:\n",
|
||||
" print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
|
||||
"else:\n",
|
||||
" print(\"OpenAI API Key not set\")\n",
|
||||
" \n",
|
||||
"MODEL = \"gpt-4o-mini\"\n",
|
||||
"openai = OpenAI()\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 3,
|
||||
"id": "bac7ac31-6ece-4aa1-87eb-329b82f0f45a",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"\n",
|
||||
"import os\n",
|
||||
"from pptx import Presentation\n",
|
||||
"import openai # adapt if you use a different client\n",
|
||||
"\n",
|
||||
"class PptxSummary:\n",
|
||||
" def __init__(self, name: str):\n",
|
||||
" self.name = name\n",
|
||||
"\n",
|
||||
" def summary(self):\n",
|
||||
" # Load an existing PowerPoint\n",
|
||||
" prs = Presentation(self.name)\n",
|
||||
" newtext = \"\"\n",
|
||||
"\n",
|
||||
" # print(prs)\n",
|
||||
" # Loop over all slides\n",
|
||||
" for i, slide in enumerate(prs.slides, start=1):\n",
|
||||
" # print(f\"\\n--- Slide {i} ---\")\n",
|
||||
" newtext += f\"\\n\\n--- Slide {i} ---\"\n",
|
||||
" \n",
|
||||
" # Loop over shapes (text boxes, titles, placeholders, etc.)\n",
|
||||
" for shape in slide.shapes:\n",
|
||||
" if shape.has_text_frame: # Only shapes that can contain text\n",
|
||||
" for paragraph in shape.text_frame.paragraphs:\n",
|
||||
" # Collect text from each run in the paragraph\n",
|
||||
" text = \"\".join(run.text for run in paragraph.runs)\n",
|
||||
" # print(text)\n",
|
||||
" newtext+= \"\\n\"\n",
|
||||
" newtext += text\n",
|
||||
" # print(newtext)\n",
|
||||
" return newtext\n",
|
||||
"\n",
|
||||
" \n",
|
||||
"system_message = \"You are a helpful assistant for a company and you can summarize the topic of a given pptx file. \"\n",
"system_message += \"Give short, courteous answers, no more than 10 sentences. \"\n",
"system_message += \"Always be accurate. If the presentation .pptx file does not exist, say so. Respond in markdown.\"\n",
|
||||
"\n",
|
||||
"def user_message(path):\n",
"    ppt = PptxSummary(path)\n",
"    summarization_message = ppt.summary()\n",
"    message = \"You need to summarize a pptx file. \"\n",
"    message += f\"The content of that pptx file is here: {summarization_message} \"\n",
"    message += \"Give the concise information in small paragraphs. \"\n",
"    return message\n",
|
||||
"\n",
|
||||
"def pptx_summary(path):\n",
"    if os.path.exists(path):\n",
"        response = openai.chat.completions.create(\n",
"            model=MODEL,\n",
"            messages=[\n",
"                {\"role\": \"system\", \"content\": system_message},\n",
"                {\"role\": \"user\", \"content\": user_message(path)}\n",
"            ],\n",
"        )\n",
"        return response.choices[0].message.content\n",
"    else:\n",
"        return \"The file does not exist\"\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" \n",
|
||||
"\n",
|
||||
"# print(\"Current working directory:\", os.getcwd())\n",
|
||||
"# print(\"Files here:\", os.listdir())\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 4,
|
||||
"id": "e4747d6b-d224-4ab3-b9bb-81c423497c81",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"# Summary of \"Marie Curie: Highlights of a Notable Life\"\n",
|
||||
"\n",
|
||||
"The presentation details the life and achievements of Marie Curie, born in Warsaw, Poland in 1867. She pursued her studies in physics and mathematics in Paris, setting the stage for her groundbreaking scientific career.\n",
|
||||
"\n",
|
||||
"Marie Curie made history as the first woman to win a Nobel Prize and is renowned for her discovery of radioactivity, alongside her husband, Pierre Curie. Her noteworthy accomplishments include winning Nobel Prizes in both Physics and Chemistry, highlighting her significant contributions to science.\n",
|
||||
"\n",
|
||||
"Curie's legacy extends beyond her scientific achievements; she shattered barriers for women in science and her research ultimately aided in the development of medical X-rays. She is celebrated as one of the most influential scientists of all time, leaving an enduring impact on both science and society.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"print(pptx_summary(\"Marie_Curie.pptx\"))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "5bd2b368-960c-40b4-adf0-db40fe713cbc",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# My tools"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"id": "8f7c0fda-704b-4e3d-bc69-89986cf5e7a9",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"The file does not exist\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"print(pptx_summary(\"bts.pptx\"))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "c3f8e54f-af95-4bf6-a202-89e7a3ffdec5",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "36bedabf-a0a7-4985-ad8e-07ed6a55a3a4",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Tools\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 6,
|
||||
"id": "4afceded-7178-4c05-8fa6-9f2085e6a344",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# There's a particular dictionary structure that's required to describe our function:\n",
|
||||
"\n",
|
||||
"summary_function = {\n",
|
||||
" \"name\": \"pptx_summary\",\n",
|
||||
" \"description\": \"Get the summary for the given pptx file. Call this function whenever the user asks about a pptx file; if the question is outside the pptx file, say that you do not know.\",\n",
|
||||
" \"parameters\": {\n",
|
||||
" \"type\": \"object\",\n",
|
||||
" \"properties\": {\n",
|
||||
" \"path\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"The name of the presentation\",\n",
|
||||
" },\n",
|
||||
" },\n",
|
||||
" \"required\": [\"path\"],\n",
|
||||
" \"additionalProperties\": False\n",
|
||||
" }\n",
|
||||
"}"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 7,
|
||||
"id": "bdca8679-935f-4e7f-97e6-e71a4d4f228c",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# And this is included in a list of tools:\n",
|
||||
"\n",
|
||||
"tools = [{\"type\": \"function\", \"function\": summary_function}]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "c3d3554f-b4e3-4ce7-af6f-68faa6dd2340",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Getting OpenAI to use our Tool\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 8,
|
||||
"id": "ce9b0744-9c78-408d-b9df-9f6fd9ed78cf",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def chat(message, history):\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
|
||||
" response = openai.chat.completions.create(model=MODEL, messages=messages, tools=tools)\n",
|
||||
"\n",
|
||||
" if response.choices[0].finish_reason==\"tool_calls\":\n",
|
||||
" message = response.choices[0].message\n",
|
||||
" print(f\"message: {message}\")\n",
|
||||
"        response, path = handle_tool_call(message)\n",
"        print(f\"path: {path}, response: {response}\")\n",
|
||||
" messages.append(message)\n",
|
||||
" messages.append(response)\n",
|
||||
" response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
|
||||
" print(f\"response: {response}\")\n",
|
||||
" \n",
|
||||
" return response.choices[0].message.content"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 9,
|
||||
"id": "b0992986-ea09-4912-a076-8e5603ee631f",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# We have to write that function handle_tool_call:\n",
|
||||
"\n",
|
||||
"def handle_tool_call(message):\n",
|
||||
" tool_call = message.tool_calls[0]\n",
|
||||
" print(f\"tool_call: {tool_call}\")\n",
|
||||
" arguments = json.loads(tool_call.function.arguments)\n",
|
||||
" print(f\"arguments: {arguments}\")\n",
|
||||
" path = arguments.get('path')\n",
|
||||
" print(f\"path: {path}\")\n",
|
||||
" summary = pptx_summary(path)\n",
|
||||
" # print(f\"price: {price}\")\n",
|
||||
" response = {\n",
|
||||
" \"role\": \"tool\",\n",
|
||||
" \"content\": json.dumps({\"path\": path,\"summary\": summary}),\n",
|
||||
" \"tool_call_id\": tool_call.id\n",
|
||||
" }\n",
|
||||
" print(f\"response: {response}\")\n",
|
||||
" return response, path"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 10,
|
||||
"id": "f4be8a71-b19e-4c2f-80df-f59ff2661f14",
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"* Running on local URL: http://127.0.0.1:7860\n",
|
||||
"* To create a public link, set `share=True` in `launch()`.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"data": {
|
||||
"text/html": [
|
||||
"<div><iframe src=\"http://127.0.0.1:7860/\" width=\"100%\" height=\"500\" allow=\"autoplay; camera; microphone; clipboard-read; clipboard-write;\" frameborder=\"0\" allowfullscreen></iframe></div>"
|
||||
],
|
||||
"text/plain": [
|
||||
"<IPython.core.display.HTML object>"
|
||||
]
|
||||
},
|
||||
"metadata": {},
|
||||
"output_type": "display_data"
|
||||
},
|
||||
{
|
||||
"data": {
|
||||
"text/plain": []
|
||||
},
|
||||
"execution_count": 10,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
},
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"message: ChatCompletionMessage(content=None, refusal=None, role='assistant', annotations=[], audio=None, function_call=None, tool_calls=[ChatCompletionMessageFunctionToolCall(id='call_qgCGcnIRiknlEaT6eGFlPyN4', function=Function(arguments='{\"path\":\"nelson_mandela.pptx\"}', name='pptx_summary'), type='function')])\n",
|
||||
"tool_call: ChatCompletionMessageFunctionToolCall(id='call_qgCGcnIRiknlEaT6eGFlPyN4', function=Function(arguments='{\"path\":\"nelson_mandela.pptx\"}', name='pptx_summary'), type='function')\n",
|
||||
"arguments: {'path': 'nelson_mandela.pptx'}\n",
|
||||
"path: nelson_mandela.pptx\n",
|
||||
"response: {'role': 'tool', 'content': '{\"path\": \"nelson_mandela.pptx\", \"summary\": \"# Summary of \\\\\"Nelson Mandela: Highlights of a Notable Life\\\\\" \\\\n\\\\nThis presentation covers the life and impact of Nelson Mandela. Born in 1918 in South Africa, Mandela became politically active in the fight against apartheid after studying law. \\\\n\\\\nHe emerged as a prominent leader in the anti-apartheid movement but faced imprisonment for 27 years due to his activism. In 1994, he made history by becoming South Africa\\'s first Black president. \\\\n\\\\nMandela\\'s legacy is profound; he is celebrated as a symbol of peace, reconciliation, and justice. His efforts earned him the Nobel Peace Prize and he continues to inspire human rights movements around the globe.\"}', 'tool_call_id': 'call_qgCGcnIRiknlEaT6eGFlPyN4'}\n",
|
||||
"city: nelson_mandela.pptx, response: {'role': 'tool', 'content': '{\"path\": \"nelson_mandela.pptx\", \"summary\": \"# Summary of \\\\\"Nelson Mandela: Highlights of a Notable Life\\\\\" \\\\n\\\\nThis presentation covers the life and impact of Nelson Mandela. Born in 1918 in South Africa, Mandela became politically active in the fight against apartheid after studying law. \\\\n\\\\nHe emerged as a prominent leader in the anti-apartheid movement but faced imprisonment for 27 years due to his activism. In 1994, he made history by becoming South Africa\\'s first Black president. \\\\n\\\\nMandela\\'s legacy is profound; he is celebrated as a symbol of peace, reconciliation, and justice. His efforts earned him the Nobel Peace Prize and he continues to inspire human rights movements around the globe.\"}', 'tool_call_id': 'call_qgCGcnIRiknlEaT6eGFlPyN4'}\n",
|
||||
"response: ChatCompletion(id='chatcmpl-CKna164Iq9BrGm35e19Rqm8K6Yr9N', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='# Summary of \"Nelson Mandela: Highlights of a Notable Life\"\\n\\nThis presentation covers the life and impact of Nelson Mandela. Born in 1918 in South Africa, Mandela became politically active in the fight against apartheid after studying law. \\n\\nHe emerged as a prominent leader in the anti-apartheid movement but faced imprisonment for 27 years due to his activism. In 1994, he made history by becoming South Africa\\'s first Black president. \\n\\nMandela\\'s legacy is profound; he is celebrated as a symbol of peace, reconciliation, and justice. His efforts earned him the Nobel Peace Prize, and he continues to inspire human rights movements around the globe.', refusal=None, role='assistant', annotations=[], audio=None, function_call=None, tool_calls=None))], created=1759073181, model='gpt-4o-mini-2024-07-18', object='chat.completion', service_tier='default', system_fingerprint='fp_560af6e559', usage=CompletionUsage(completion_tokens=134, prompt_tokens=269, total_tokens=403, completion_tokens_details=CompletionTokensDetails(accepted_prediction_tokens=0, audio_tokens=0, reasoning_tokens=0, rejected_prediction_tokens=0), prompt_tokens_details=PromptTokensDetails(audio_tokens=0, cached_tokens=0)))\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"gr.ChatInterface(fn=chat, type=\"messages\").launch()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "d551ce21-c66c-4f56-88ad-8212f16437ab",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3 (ipykernel)",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.11.13"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
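The pptx-summarizer notebook above wires a tool call by hand: parse the arguments JSON, run the matching local function, and send back a role `"tool"` message that echoes the `tool_call_id`. A minimal stdlib sketch of that round trip, using plain dicts in place of the SDK's message objects (the dict shapes and the `dispatch` helper are illustrative, not part of the notebook):

```python
import json

def handle_tool_call(tool_call, dispatch):
    """Route one tool call to a local function and build the reply message.

    dispatch maps tool names to plain Python callables.
    """
    args = json.loads(tool_call["function"]["arguments"])
    result = dispatch[tool_call["function"]["name"]](**args)
    # The reply must carry role "tool" and echo the tool_call_id
    # so the model can match it to its original request.
    return {
        "role": "tool",
        "content": json.dumps(result),
        "tool_call_id": tool_call["id"],
    }

# Hypothetical call mirroring the notebook's summarizer tool
call = {
    "id": "call_123",
    "function": {"name": "pptx_summary",
                 "arguments": '{"path": "Marie_Curie.pptx"}'},
}
reply = handle_tool_call(
    call, {"pptx_summary": lambda path: {"path": path, "summary": "stub"}}
)
```

The reply dict is appended to the message history, followed by a second completions call so the model can phrase the final answer.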
173
week2/community-contributions/tsungyulin/reserveTicketDemo.ipynb
Normal file
@@ -0,0 +1,173 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "27fa33cf",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import openai\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"import gradio as gr\n",
|
||||
"import os\n",
|
||||
"import json\n",
|
||||
"from datetime import datetime\n",
|
||||
"\n",
|
||||
"import httpx\n",
|
||||
"from fastapi import FastAPI\n",
|
||||
"import uvicorn\n",
|
||||
"import threading\n",
|
||||
"\n",
|
||||
"load_dotenv('.env',override=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "e9407192",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"app = FastAPI()\n",
|
||||
"\n",
|
||||
"@app.post('/mock/ticket')\n",
|
||||
"def booking(payload:dict):\n",
|
||||
" dt = datetime.strptime(payload.get('date'), \"%Y/%m/%d\") \n",
|
||||
" isoStr = dt.date().isoformat()\n",
|
||||
" return {\"status\": \"success\", \"order_id\": f\"MOCK-FLIGHT-{isoStr}-001\"}\n",
|
||||
"\n",
|
||||
"# start server\n",
|
||||
"def run():\n",
|
||||
" uvicorn.run(app, host=\"127.0.0.1\", port=8000)\n",
|
||||
"\n",
|
||||
"thread = threading.Thread(target=run, daemon=True)\n",
|
||||
"thread.start()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "2229b6db",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
|
||||
"llm = openai.OpenAI(api_key=openai_api_key)\n",
|
||||
"\n",
|
||||
"system_message = \"You are a helpful assistant for an Airline called FlightAI. \"\n",
|
||||
"system_message += \"Give short, courteous answers, no more than 1 sentence. \"\n",
|
||||
"system_message += \"Always be accurate. If you don't know the answer, say so.\"\n",
|
||||
"\n",
|
||||
"async def booking_flight(departure, destination, date):\n",
|
||||
" print(f\"Book the Flight Automatically, {departure} to {destination} at {date}.\")\n",
|
||||
" reqBody = {\n",
|
||||
" \"departure\": departure,\n",
|
||||
" \"destination\": destination,\n",
|
||||
" \"date\": date\n",
|
||||
" }\n",
|
||||
" async with httpx.AsyncClient() as client:\n",
|
||||
" res = await client.post('http://127.0.0.1:8000/mock/ticket', json=reqBody)\n",
|
||||
" print(res.text)\n",
|
||||
" return res.text\n",
|
||||
" \n",
|
||||
"book_function = {\n",
|
||||
" \"name\": \"booking_flight\",\n",
|
||||
" \"description\": \"Books a flight ticket for the customer and returns the status and order id. Call this function whenever you are asked to book a flight, then report the order status and booking number. If the customer does not provide a departure, destination, or date, ask for it courteously. Keep dates in %Y/%m/%d format. Call it, for example, when a customer asks 'Please help me book the ticket from <departure> to <destination>'.\",\n",
|
||||
" \"parameters\": {\n",
|
||||
" \"type\": \"object\",\n",
|
||||
" \"properties\": {\n",
|
||||
" \"departure\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"The city the customer departs from\",\n",
|
||||
" },\n",
|
||||
" \"destination\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"The city that the customer wants to travel to\",\n",
|
||||
" },\n",
|
||||
" \"date\": {\n",
|
||||
" \"type\": \"string\",\n",
|
||||
" \"description\": \"The date of the flight, in %Y/%m/%d format\",\n",
|
||||
" },\n",
|
||||
" },\n",
|
||||
" \"required\": [\"destination\", \"departure\", \"date\"],\n",
|
||||
" \"additionalProperties\": False\n",
|
||||
" }\n",
|
||||
"}\n",
|
||||
"\n",
|
||||
"async def handle_tool_call(message):\n",
|
||||
" tool_call = message.tool_calls[0]\n",
|
||||
" arguments = json.loads(tool_call.function.arguments)\n",
|
||||
" departure = arguments.get('departure')\n",
|
||||
" destination = arguments.get('destination')\n",
|
||||
" date = arguments.get('date')\n",
|
||||
" res = await booking_flight(departure, destination, date)\n",
|
||||
" response = {\n",
|
||||
" \"role\": \"tool\",\n",
|
||||
" \"content\": json.dumps(res),\n",
|
||||
" \"tool_call_id\": tool_call.id\n",
|
||||
" }\n",
|
||||
" return response\n",
|
||||
"\n",
|
||||
"tools = [{\"type\": \"function\", \"function\": book_function}]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "5bf9656f",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"res = await booking_flight('Taiwan', \"NewYork\", \"2025/12/03\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "d2924055",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"async def chat(message, history):\n",
|
||||
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
|
||||
" res = llm.chat.completions.create(messages=messages,\n",
|
||||
" model=\"gpt-4.1-mini\",\n",
|
||||
" tools=tools)\n",
|
||||
" \n",
|
||||
" if res.choices[0].finish_reason == 'tool_calls':\n",
|
||||
" message = res.choices[0].message\n",
|
||||
" toolResponse = await handle_tool_call(message)\n",
|
||||
" messages.append(message)\n",
|
||||
" messages.append(toolResponse)\n",
|
||||
" res = llm.chat.completions.create(messages=messages,\n",
|
||||
" model=\"gpt-4.1-mini\")\n",
|
||||
"\n",
|
||||
" return res.choices[0].message.content\n",
|
||||
"\n",
|
||||
"gr.ChatInterface(fn=chat,type=\"messages\").launch()"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "3.10.15",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.10.15"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
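The mock `/mock/ticket` endpoint above derives its order id by parsing the `%Y/%m/%d` date string and re-emitting it in ISO form. That step can be isolated as a small helper (`make_order_id` is a hypothetical name, not in the notebook):

```python
from datetime import datetime

def make_order_id(date_str: str) -> str:
    # Parse the %Y/%m/%d date the tool supplies, then embed
    # the ISO form (YYYY-MM-DD) in the mock order id.
    iso = datetime.strptime(date_str, "%Y/%m/%d").date().isoformat()
    return f"MOCK-FLIGHT-{iso}-001"
```

Note `strptime` raises `ValueError` on a malformed date, which is why the tool description insists the model keep dates in `%Y/%m/%d` format.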
175
week2/community-contributions/tsungyulin/week2 EXERCISE.ipynb
Normal file
@@ -0,0 +1,175 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"id": "d006b2ea-9dfe-49c7-88a9-a5a0775185fd",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Additional End of week Exercise - week 2\n",
|
||||
"\n",
|
||||
"Now use everything you've learned from Week 2 to build a full prototype for the technical question/answerer you built in Week 1 Exercise.\n",
|
||||
"\n",
|
||||
"This should include a Gradio UI, streaming, use of the system prompt to add expertise, and the ability to switch between models. Bonus points if you can demonstrate use of a tool!\n",
|
||||
"\n",
|
||||
"If you feel bold, see if you can add audio input so you can talk to it, and have it respond with audio. ChatGPT or Claude can help you, or email me if you have questions.\n",
|
||||
"\n",
|
||||
"I will publish a full solution here soon - unless someone beats me to it...\n",
|
||||
"\n",
|
||||
"There are so many commercial applications for this, from a language tutor, to a company onboarding solution, to a companion AI to a course (like this one!) I can't wait to see your results."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "05fc552b",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import openai\n",
|
||||
"import anthropic\n",
|
||||
"import gradio as gr\n",
|
||||
"import dotenv\n",
|
||||
"import os"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "a07e7793-b8f5-44f4-aded-5562f633271a",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"class Chatbot:\n",
|
||||
" def __init__(self, apiKey, publisher='openai'):\n",
|
||||
" if publisher not in ['openai', 'claude']:\n",
|
||||
" raise ValueError(f\"publisher must be openai or claude, but got {publisher}\")\n",
|
||||
" self.publisher = publisher\n",
|
||||
" self.systemPrompt = None\n",
|
||||
" self.historyPrompt = []\n",
|
||||
" self.llm = openai.OpenAI(api_key=apiKey) if publisher == 'openai' else anthropic.Anthropic(api_key=apiKey)\n",
|
||||
" \n",
|
||||
" def setSystemPrompt(self, systemPrompt:str):\n",
|
||||
" self.systemPrompt = systemPrompt.strip()\n",
|
||||
" if len(self.historyPrompt) == 0:\n",
|
||||
" self.historyPrompt.append({\"role\": \"system\", \"content\": f\"{systemPrompt}\"})\n",
|
||||
" else:\n",
|
||||
" self.historyPrompt[0] = {\"role\": \"system\", \"content\": f\"{systemPrompt}\"}\n",
|
||||
" \n",
|
||||
" def _prompt2obj(self, role:str, prompt:str):\n",
|
||||
" return {\n",
|
||||
" \"role\": role,\n",
|
||||
" \"content\": prompt.strip()\n",
|
||||
" }\n",
|
||||
" \n",
|
||||
" def unpackText(self, chunk):\n",
|
||||
" text = ''\n",
|
||||
" if self.publisher == 'openai':\n",
|
||||
" text = chunk.choices[0].delta.content or ''\n",
|
||||
" elif self.publisher == 'claude':\n",
|
||||
" if chunk.type == \"content_block_delta\":\n",
|
||||
" text = chunk.delta.text or ''\n",
|
||||
" \n",
|
||||
" return text\n",
|
||||
" \n",
|
||||
" def chat(self, message):\n",
|
||||
" self.historyPrompt.append(self._prompt2obj(\"user\", message))\n",
|
||||
" completeReply = \"\"\n",
|
||||
"\n",
|
||||
" if self.publisher == 'openai':\n",
|
||||
" stream = self.llm.chat.completions.create(model='gpt-4o-mini',\n",
|
||||
" messages=self.historyPrompt,\n",
|
||||
" stream=True)\n",
|
||||
" elif self.publisher == 'claude':\n",
|
||||
" stream = self.llm.messages.create(system=self.historyPrompt[0][\"content\"],\n",
|
||||
" model=\"claude-sonnet-4-20250514\",\n",
|
||||
" max_tokens=200,\n",
|
||||
" messages=self.historyPrompt[1:],\n",
|
||||
" stream=True)\n",
|
||||
" \n",
|
||||
" for chunk in stream:\n",
|
||||
" completeReply += self.unpackText(chunk)\n",
|
||||
" yield completeReply\n",
|
||||
" \n",
|
||||
" \n",
|
||||
" self.historyPrompt.append(self._prompt2obj(\"assistant\", completeReply))\n",
|
||||
" \n",
|
||||
" def _gradioChatWrapper(self):\n",
|
||||
" def gradioChatFn(message, history):\n",
|
||||
" for partial_reply in self.chat(message):\n",
|
||||
" yield partial_reply\n",
|
||||
" return gradioChatFn\n",
|
||||
" \n",
|
||||
" def getAllPrompt(self):\n",
|
||||
" return self.historyPrompt\n",
|
||||
" \n",
|
||||
" def run(self):\n",
|
||||
" gradioFn = self._gradioChatWrapper()\n",
|
||||
" gr.ChatInterface(fn=gradioFn, type=\"messages\").launch()\n",
|
||||
" \n",
|
||||
" "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "1fca53e8",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# initial env\n",
|
||||
"dotenv.load_dotenv(\".env\", override=True)\n",
|
||||
"openaiKey = os.getenv(\"OPENAI_API_KEY\")\n",
|
||||
"claudeKey = os.getenv(\"ANTHROPIC_API_KEY\")\n",
|
||||
"openaiInfo = {\n",
|
||||
" 'apiKey': openaiKey,\n",
|
||||
" 'publisher': 'openai'\n",
|
||||
"}\n",
|
||||
"claudeInfo = {\n",
|
||||
" 'apiKey': claudeKey,\n",
|
||||
" 'publisher': 'claude'\n",
|
||||
"}\n",
|
||||
"\n",
|
||||
"SYSTEM_PROMPT = \"\"\"\n",
|
||||
"You are a technical expert and respond to every question I ask with an explanation.\n",
|
||||
"\"\"\"\n",
|
||||
"\n",
|
||||
"openaiChatbot = Chatbot(**openaiInfo)\n",
|
||||
"openaiChatbot.setSystemPrompt(SYSTEM_PROMPT)\n",
|
||||
"openaiChatbot.run()\n",
|
||||
"\n",
|
||||
"# claudeChatbot = Chatbot(**claudeInfo)\n",
|
||||
"# claudeChatbot.setSystemPrompt(SYSTEM_PROMPT)\n",
|
||||
"# claudeChatbot.run()\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "59a2ac0f",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "3.10.15",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.10.15"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 5
|
||||
}
|
||||
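The `setSystemPrompt` logic in the `Chatbot` class above keeps exactly one system message pinned at index 0 of the history, replacing it on later calls. A small stdlib sketch of that invariant, with a hypothetical `History` class:

```python
class History:
    """Message list with the system prompt pinned at index 0."""

    def __init__(self):
        self.messages = []

    def set_system_prompt(self, prompt: str):
        entry = {"role": "system", "content": prompt.strip()}
        if self.messages and self.messages[0]["role"] == "system":
            self.messages[0] = entry  # replace the existing system message
        else:
            self.messages.insert(0, entry)

    def add(self, role: str, content: str):
        self.messages.append({"role": role, "content": content.strip()})
```

Keeping the system message at a fixed index is what lets the Claude branch split it off cleanly: `messages[0]["content"]` goes into the `system=` argument and `messages[1:]` into the conversation.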
@@ -0,0 +1,646 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "7bc4a9cd",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"import gradio as gr\n",
|
||||
"from openai import OpenAI\n",
|
||||
"import anthropic\n",
|
||||
"import google.generativeai as genai\n",
|
||||
"from dotenv import load_dotenv\n",
|
||||
"\n",
|
||||
"# Load environment variables\n",
|
||||
"load_dotenv()\n",
|
||||
"OPENAI_API_KEY = os.getenv(\"OPENAI_API_KEY\")\n",
|
||||
"anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')\n",
|
||||
"google_api_key = os.getenv('GOOGLE_API_KEY')\n",
|
||||
"\n",
|
||||
"# Initialize clients\n",
|
||||
"openai_client = OpenAI(api_key=OPENAI_API_KEY)\n",
|
||||
"genai.configure(api_key=google_api_key)\n",
|
||||
"claude_client = anthropic.Anthropic(api_key=anthropic_api_key)\n",
|
||||
"\n",
|
||||
"# System prompt - Universal and comprehensive\n",
"SYSTEM_PROMPT = \"\"\"You are a highly capable and versatile AI assistant designed to help with any type of question or task.\n",
"\n",
"Your capabilities span across all domains including but not limited to:\n",
"- Programming, software development, and technology\n",
"- Science, mathematics, and engineering\n",
"- Arts, literature, and creative writing\n",
"- History, philosophy, and social sciences\n",
"- Business, finance, and economics\n",
"- Health, wellness, and lifestyle advice\n",
"- Education and learning support\n",
"- Problem-solving and critical thinking\n",
"- General knowledge and trivia\n",
"- Casual conversation and entertainment\n",
"\n",
"Guidelines:\n",
"- Provide accurate, helpful, and comprehensive responses\n",
"- Adapt your tone and style to match the context of the question\n",
"- Use examples and explanations when helpful\n",
"- Be creative when asked for creative content\n",
"- Be precise and factual when asked for information\n",
"- Ask clarifying questions if the request is ambiguous\n",
"- Admit when you're uncertain and provide the best possible guidance\n",
"- Be conversational, friendly, and supportive\n",
"\n",
"You can help with anything from technical coding problems to creative storytelling, from academic research to casual chat. There are no topic restrictions - feel free to engage with any subject matter the user brings up.\"\"\"\n",
"\n",
"# Model configurations\n",
"model_configs = {\n",
"    \"GPT-4o\": {\"provider\": \"openai\", \"model\": \"gpt-4o\"},\n",
"    \"GPT-4o-mini\": {\"provider\": \"openai\", \"model\": \"gpt-4o-mini\"},\n",
"    \"GPT-3.5-turbo\": {\"provider\": \"openai\", \"model\": \"gpt-3.5-turbo\"},\n",
"    \"Claude Sonnet 4\": {\"provider\": \"anthropic\", \"model\": \"claude-sonnet-4-20250514\"},\n",
"    \"Gemini 2.0 Flash\": {\"provider\": \"google\", \"model\": \"gemini-2.0-flash-exp\"},\n",
"}\n",
"\n",
"def chat_streaming(message, history, model_name, temperature):\n",
"    \"\"\"Main chat function with streaming support\"\"\"\n",
"    \n",
"    config = model_configs[model_name]\n",
"    provider = config[\"provider\"]\n",
"    model = config[\"model\"]\n",
"    \n",
"    # Convert messages format history to API format\n",
"    messages = []\n",
"    for msg in history:\n",
"        if msg[\"role\"] == \"user\":\n",
"            messages.append({\"role\": \"user\", \"content\": msg[\"content\"]})\n",
"        elif msg[\"role\"] == \"assistant\":\n",
"            messages.append({\"role\": \"assistant\", \"content\": msg[\"content\"]})\n",
"    messages.append({\"role\": \"user\", \"content\": message})\n",
"    \n",
"    # Stream based on provider\n",
"    if provider == \"openai\":\n",
"        stream = openai_client.chat.completions.create(\n",
"            model=model,\n",
"            messages=[{\"role\": \"system\", \"content\": SYSTEM_PROMPT}] + messages,\n",
"            temperature=temperature,\n",
"            stream=True\n",
"        )\n",
"        \n",
"        response = \"\"\n",
"        for chunk in stream:\n",
"            if chunk.choices[0].delta.content:\n",
"                response += chunk.choices[0].delta.content\n",
"                yield response\n",
"    \n",
"    elif provider == \"anthropic\":\n",
"        response = \"\"\n",
"        with claude_client.messages.stream(\n",
"            model=model,\n",
"            max_tokens=2000,\n",
"            temperature=temperature,\n",
"            system=SYSTEM_PROMPT,\n",
"            messages=messages,\n",
"        ) as stream:\n",
"            for text in stream.text_stream:\n",
"                response += text\n",
"                yield response\n",
"    \n",
"    elif provider == \"google\":\n",
"        gemini = genai.GenerativeModel(\n",
"            model_name=model,\n",
"            system_instruction=SYSTEM_PROMPT,\n",
"        )\n",
"        \n",
"        # Convert history for Gemini\n",
"        gemini_history = []\n",
"        for msg in history:\n",
"            if msg[\"role\"] == \"user\":\n",
"                gemini_history.append({\"role\": \"user\", \"parts\": [msg[\"content\"]]})\n",
"            elif msg[\"role\"] == \"assistant\":\n",
"                gemini_history.append({\"role\": \"model\", \"parts\": [msg[\"content\"]]})\n",
"        \n",
"        chat = gemini.start_chat(history=gemini_history)\n",
"        \n",
"        stream = chat.send_message(\n",
"            message,\n",
"            stream=True,\n",
"            generation_config=genai.types.GenerationConfig(temperature=temperature)\n",
"        )\n",
"        \n",
"        response = \"\"\n",
"        for chunk in stream:\n",
"            response += chunk.text\n",
"            yield response\n",
"\n",
"def handle_audio_input(audio):\n",
"    \"\"\"Transcribe audio input using Whisper\"\"\"\n",
"    if audio is None:\n",
"        return \"\"\n",
"    \n",
"    try:\n",
"        # Use a context manager so the file handle is closed after upload\n",
"        with open(audio, \"rb\") as audio_file:\n",
"            transcript = openai_client.audio.transcriptions.create(\n",
"                model=\"whisper-1\",\n",
"                file=audio_file\n",
"            )\n",
"        return transcript.text\n",
"    except Exception as e:\n",
"        return f\"Error transcribing audio: {str(e)}\"\n",
"\n",
"def text_to_speech(text):\n",
"    \"\"\"Convert text response to speech\"\"\"\n",
"    try:\n",
"        response = openai_client.audio.speech.create(\n",
"            model=\"tts-1\",\n",
"            voice=\"nova\",\n",
"            input=text[:4096]  # Limit to prevent errors\n",
"        )\n",
"        \n",
"        audio_path = \"response.mp3\"\n",
"        response.stream_to_file(audio_path)\n",
"        return audio_path\n",
"    except Exception as e:\n",
"        print(f\"TTS Error: {str(e)}\")\n",
"        return None\n",
"\n",
"# Custom CSS for modern, attractive UI\n",
"custom_css = \"\"\"\n",
"@import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap');\n",
"\n",
".gradio-container {\n",
"    font-family: 'Inter', sans-serif !important;\n",
"    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%) !important;\n",
"}\n",
"\n",
":root .dark {\n",
"    --background-fill-primary: #f0f0f0;\n",
"    --body-background-fill: var(--background-fill-primary);\n",
"    --block-background-fill: white !important;\n",
"    --block-title-background-fill: #dfe7ff;\n",
"    --block-title-text-color: #6366f1;\n",
"    --body-text-color: black;\n",
"    --button-secondary-text-color: black;\n",
"    --input-background-fill: white;\n",
"\n",
"    --block-label-background-fill: #dfe7ff;\n",
"    --block-label-text-color: #6366f1;\n",
"\n",
"    --block-border-color: #eaeaea;\n",
"    --input-border-color: #eaeaea;\n",
"    --border-color-primary: #eaeaea;\n",
"\n",
"    --color-accent-soft: #dfe7ff;\n",
"    --border-color-accent-subdued: #98a6cf;\n",
"\n",
"    --checkbox-background-color: #eaeaea;\n",
"    --checkbox-border-color: #eaeaea;\n",
"    --background-fill-secondary: #eaeaea;\n",
"}\n",
"\n",
".main {\n",
"    background: white;\n",
"    border-radius: 20px;\n",
"    box-shadow: 0 20px 60px rgba(0,0,0,0.3);\n",
"    padding: 0 !important;\n",
"    overflow: hidden;\n",
"}\n",
"\n",
".contain {\n",
"    padding: 2rem !important;\n",
"}\n",
"\n",
"/* Header Styling */\n",
".header-section {\n",
"    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);\n",
"    padding: 2.5rem 2rem;\n",
"    border-radius: 20px 20px 0 0;\n",
"    margin: -2rem 0rem 2rem 0rem;\n",
"    color: white;\n",
"    box-shadow: 0 4px 20px rgba(102, 126, 234, 0.4);\n",
"}\n",
"\n",
".header-section h1 {\n",
"    font-size: 2.5rem !important;\n",
"    font-weight: 700 !important;\n",
"    margin: 0 0 0.5rem 0 !important;\n",
"    color: white !important;\n",
"    text-shadow: 2px 2px 4px rgba(0,0,0,0.2);\n",
"}\n",
"\n",
".header-section p {\n",
"    font-size: 1.1rem !important;\n",
"    margin: 0.5rem 0 !important;\n",
"    color: rgba(255,255,255,0.95) !important;\n",
"    font-weight: 400;\n",
"}\n",
"\n",
".feature-badge {\n",
"    display: inline-block;\n",
"    background: rgba(255,255,255,0.2);\n",
"    padding: 0.4rem 1rem;\n",
"    border-radius: 20px;\n",
"    margin: 0.3rem;\n",
"    font-size: 0.9rem;\n",
"    backdrop-filter: blur(10px);\n",
"    border: 1px solid rgba(255,255,255,0.3);\n",
"}\n",
"\n",
"/* Sidebar Styling */\n",
".control-panel {\n",
"    background: linear-gradient(135deg, #f5f7fa 0%, #c3cfe2 100%);\n",
"    padding: 1.5rem;\n",
"    border-radius: 15px;\n",
"    box-shadow: 0 4px 15px rgba(0,0,0,0.1);\n",
"    height: 100%;\n",
"}\n",
"\n",
".control-panel label {\n",
"    font-weight: 600 !important;\n",
"    font-size: 0.95rem !important;\n",
"}\n",
"\n",
"/* Dropdown Styling */\n",
".dropdown-container select {\n",
"    background: white !important;\n",
"    border: 2px solid #e2e8f0 !important;\n",
"    border-radius: 10px !important;\n",
"    padding: 0.75rem !important;\n",
"    font-weight: 500 !important;\n",
"    transition: all 0.3s ease !important;\n",
"}\n",
"\n",
".dropdown-container select:hover {\n",
"    border-color: #667eea !important;\n",
"    box-shadow: 0 0 0 3px rgba(102, 126, 234, 0.1) !important;\n",
"}\n",
"\n",
"/* Slider Styling */\n",
"input[type=\"range\"] {\n",
"    accent-color: #667eea !important;\n",
"}\n",
"\n",
"/* Button Styling */\n",
".primary-btn {\n",
"    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%) !important;\n",
"    border: none !important;\n",
"    color: white !important;\n",
"    font-weight: 600 !important;\n",
"    padding: 0.75rem 2rem !important;\n",
"    border-radius: 10px !important;\n",
"    box-shadow: 0 4px 15px rgba(102, 126, 234, 0.4) !important;\n",
"    transition: all 0.3s ease !important;\n",
"}\n",
"\n",
".primary-btn:hover {\n",
"    transform: translateY(-2px) !important;\n",
"    box-shadow: 0 6px 20px rgba(102, 126, 234, 0.6) !important;\n",
"}\n",
"\n",
".secondary-btn {\n",
"    background: #e2e8f0 !important;\n",
"    border: none !important;\n",
"    color: #2d3748 !important;\n",
"    font-weight: 600 !important;\n",
"    padding: 0.75rem 1.5rem !important;\n",
"    border-radius: 10px !important;\n",
"    transition: all 0.3s ease !important;\n",
"}\n",
"\n",
".secondary-btn:hover {\n",
"    background: #cbd5e0 !important;\n",
"    transform: translateY(-2px) !important;\n",
"}\n",
"\n",
"/* Chatbot Styling */\n",
".chatbot-container {\n",
"    background: white;\n",
"    border-radius: 15px;\n",
"    border: 2px solid #e2e8f0;\n",
"    box-shadow: 0 4px 20px rgba(0,0,0,0.08);\n",
"    overflow: hidden;\n",
"}\n",
"\n",
"/* Input Box Styling */\n",
".message-input textarea {\n",
"    border: 2px solid #e2e8f0 !important;\n",
"    border-radius: 12px !important;\n",
"    padding: 1rem !important;\n",
"    font-size: 1rem !important;\n",
"    transition: all 0.3s ease !important;\n",
"}\n",
"\n",
".message-input textarea:focus {\n",
"    border-color: #667eea !important;\n",
"    box-shadow: 0 0 0 3px rgba(102, 126, 234, 0.1) !important;\n",
"}\n",
"\n",
"/* Input Row Centering */\n",
".input-row {\n",
"    display: flex !important;\n",
"    justify-content: center !important;\n",
"    align-items: center !important;\n",
"    gap: 1rem !important;\n",
"}\n",
"\n",
".input-row > * {\n",
"    flex-shrink: 0 !important;\n",
"}\n",
"\n",
"/* Audio Components */\n",
".audio-component {\n",
"    background: #f7fafc;\n",
"    border: 2px dashed #cbd5e0;\n",
"    border-radius: 12px;\n",
"    padding: 1rem;\n",
"    transition: all 0.3s ease;\n",
"}\n",
"\n",
".audio-component:hover {\n",
"    border-color: #667eea;\n",
"    background: #edf2f7;\n",
"}\n",
"\n",
"/* Checkbox Styling */\n",
".checkbox-label {\n",
"    display: flex;\n",
"    align-items: center;\n",
"    gap: 0.5rem;\n",
"    font-weight: 500;\n",
"    color: #2d3748;\n",
"}\n",
"\n",
"/* Tips Section */\n",
".tips-section {\n",
"    background: linear-gradient(135deg, #f7fafc 0%, #edf2f7 100%);\n",
"    padding: 1.5rem;\n",
"    border-radius: 15px;\n",
"    margin-top: 2rem;\n",
"    border-left: 4px solid #667eea;\n",
"}\n",
"\n",
".tips-section h3 {\n",
"    color: #667eea !important;\n",
"    font-weight: 600 !important;\n",
"    margin-bottom: 1rem !important;\n",
"}\n",
"\n",
".tips-section ul {\n",
"    list-style: none;\n",
"    padding: 0;\n",
"}\n",
"\n",
".tips-section li {\n",
"    padding: 0.5rem 0;\n",
"    color: #4a5568 !important;\n",
"    font-size: 0.95rem;\n",
"}\n",
"\n",
".tips-section li:before {\n",
"    content: \"→ \";\n",
"    color: #667eea;\n",
"    font-weight: bold;\n",
"    margin-right: 0.5rem;\n",
"}\n",
"\n",
"/* Force black color for strong/bold text */\n",
".tips-section strong {\n",
"    color: #1a202c !important;\n",
"}\n",
"\n",
".prose * {\n",
"    color: inherit !important;\n",
"}\n",
"\n",
".prose strong {\n",
"    color: #1a202c !important;\n",
"    font-weight: 600 !important;\n",
"}\n",
"\n",
"/* Responsive Design */\n",
"@media (max-width: 768px) {\n",
"    .header-section h1 {\n",
"        font-size: 1.8rem !important;\n",
"    }\n",
"    \n",
"    .contain {\n",
"        padding: 1rem !important;\n",
"    }\n",
"}\n",
"\n",
".fillable {\n",
"    max-width: 95% !important;\n",
"}\n",
"#component-5 {\n",
"    flex-grow: 1.1 !important;\n",
"}\n",
".bubble-wrap.svelte-gjtrl6 {\n",
"    background: none !important;\n",
"}\n",
".bot.svelte-1csv61q.message {\n",
"    background-color: white !important;\n",
"    border: 1px solid #f3f3f3;\n",
"}\n",
".options.svelte-y6qw75 > li:hover {\n",
"    background: white;\n",
"}\n",
".options.svelte-y6qw75 > .selected {\n",
"    background: white;\n",
"}\n",
"\n",
"\"\"\"\n",
"\n",
"# Build Gradio Interface\n",
"with gr.Blocks(css=custom_css, theme=gr.themes.Soft()) as demo:\n",
"    \n",
"    # Header\n",
"    with gr.Row(elem_classes=\"header-section\"):\n",
"        with gr.Column():\n",
"            gr.HTML(\"\"\"\n",
"                <h1>🚀 Voice Enabled Multi Model AI-Assistant</h1>\n",
"                <p>Your intelligent companion for any question - from coding to creativity, science to storytelling!</p>\n",
"                <div style=\"margin-top: 1rem;\">\n",
"                    <span class=\"feature-badge\">🤖 5 AI Models</span>\n",
"                    <span class=\"feature-badge\">🎤 Voice Input</span>\n",
"                    <span class=\"feature-badge\">🔊 Audio Output</span>\n",
"                    <span class=\"feature-badge\">⚡ Real-time Streaming</span>\n",
"                    <span class=\"feature-badge\">🌐 Any Topic</span>\n",
"                </div>\n",
"            \"\"\")\n",
"    \n",
"    with gr.Row():\n",
"        # Left Sidebar - Controls\n",
"        with gr.Column(scale=1, elem_classes=\"control-panel\"):\n",
"            gr.HTML(\"<h3 style='color: #2d3748 !important; margin-top: 0;'>⚙️ Settings</h3>\")\n",
"            \n",
"            model_choice = gr.Dropdown(\n",
"                choices=list(model_configs.keys()),\n",
"                value=\"GPT-4o-mini\",\n",
"                label=\"🤖 AI Model\",\n",
"                info=\"Select your preferred model\",\n",
"                elem_classes=\"dropdown-container\"\n",
"            )\n",
"            \n",
"            temperature = gr.Slider(\n",
"                minimum=0,\n",
"                maximum=1,\n",
"                value=0.7,\n",
"                step=0.1,\n",
"                label=\"🌡️ Temperature\",\n",
"                info=\"Higher = more creative responses\"\n",
"            )\n",
"            \n",
"            gr.HTML(\"<div style='margin: 1.5rem 0 0.5rem 0; padding-top: 1.5rem; border-top: 2px solid #cbd5e0;'><h4 style='color: #2d3748 !important; margin: 0;'>🎙️ Audio Features</h4></div>\")\n",
"            \n",
"            audio_input = gr.Audio(\n",
"                sources=[\"microphone\"],\n",
"                type=\"filepath\",\n",
"                label=\"🎤 Voice Input\",\n",
"                elem_classes=\"audio-component\"\n",
"            )\n",
"            \n",
"            audio_output_enabled = gr.Checkbox(\n",
"                label=\"🔊 Enable Audio Response\",\n",
"                value=False,\n",
"                elem_classes=\"checkbox-label\"\n",
"            )\n",
"        \n",
"        # Right Side - Chat Interface\n",
"        with gr.Column(scale=3):\n",
"            chatbot = gr.Chatbot(\n",
"                label=\"💬 Conversation\",\n",
"                height=550,\n",
"                show_copy_button=True,\n",
"                type='messages',\n",
"                elem_classes=\"chatbot-container\",\n",
"                avatar_images=(None, \"https://api.dicebear.com/7.x/bottts/svg?seed=ai\")\n",
"            )\n",
"            \n",
"            with gr.Row(elem_classes=\"input-row\"):\n",
"                msg = gr.Textbox(\n",
"                    label=\"\",\n",
"                    placeholder=\"💭 Ask me anything - tech help, creative writing, life advice, science, history, or just chat!\",\n",
"                    scale=5,\n",
"                    elem_classes=\"message-input\",\n",
"                    show_label=False\n",
"                )\n",
"                submit_btn = gr.Button(\"Send 📤\", scale=1, elem_classes=\"primary-btn\")\n",
"            \n",
"            audio_response = gr.Audio(\n",
"                label=\"🔊 Audio Response\",\n",
"                visible=False,\n",
"                elem_classes=\"audio-component\"\n",
"            )\n",
"            \n",
"            with gr.Row():\n",
"                clear_btn = gr.Button(\"🗑️ Clear Chat\", elem_classes=\"secondary-btn\")\n",
"    \n",
"    # Tips Section\n",
"    with gr.Row(elem_classes=\"tips-section\"):\n",
"        gr.Markdown(\"\"\"\n",
"        ### 💡 What Can I Help You With?\n",
"        \n",
"        - **Technology & Programming**: Debug code, explain concepts, build projects, learn new languages\n",
"        - **Creative Writing**: Stories, poems, scripts, brainstorming ideas, character development\n",
"        - **Education & Learning**: Homework help, concept explanations, study guides, tutoring\n",
"        - **Business & Career**: Resume writing, business plans, marketing ideas, career advice\n",
"        - **Science & Math**: Problem-solving, research assistance, concept explanations\n",
"        - **Daily Life**: Recipe suggestions, travel planning, health tips, relationship advice\n",
"        - **Entertainment**: Jokes, trivia, games, recommendations for books/movies/music\n",
"        - **And Literally Anything Else**: No topic is off-limits - just ask!\n",
"        \"\"\")\n",
"    \n",
"    # Event handlers\n",
"    def process_message(message, history, model, temp, audio_enabled):\n",
"        \"\"\"Process message and optionally generate audio\"\"\"\n",
"        # Add user message to history\n",
"        history = history + [{\"role\": \"user\", \"content\": message}]\n",
"        \n",
"        # Generate text response (streaming)\n",
"        bot_message = None\n",
"        for response in chat_streaming(message, history[:-1], model, temp):\n",
"            bot_message = response\n",
"            yield history + [{\"role\": \"assistant\", \"content\": response}], None\n",
"        \n",
"        # Final history with complete response\n",
"        final_history = history + [{\"role\": \"assistant\", \"content\": bot_message}]\n",
"        \n",
"        # Generate audio if enabled\n",
"        if audio_enabled and bot_message:\n",
"            audio_path = text_to_speech(bot_message)\n",
"            yield final_history, audio_path\n",
"        else:\n",
"            yield final_history, None\n",
"    \n",
"    def transcribe_and_send(audio, history, model, temp, audio_enabled):\n",
"        \"\"\"Transcribe audio and process message\"\"\"\n",
"        text = handle_audio_input(audio)\n",
"        if text and text != \"\" and not text.startswith(\"Error\"):\n",
"            # Process the message and get results\n",
"            for hist, aud in process_message(text, history, model, temp, audio_enabled):\n",
"                yield hist, aud\n",
"        else:\n",
"            # If no text or error, return history unchanged\n",
"            yield history, None\n",
"\n",
"    # Wire up events\n",
"    submit_btn.click(\n",
"        fn=process_message,\n",
"        inputs=[msg, chatbot, model_choice, temperature, audio_output_enabled],\n",
"        outputs=[chatbot, audio_response]\n",
"    ).then(lambda: \"\", None, msg)\n",
"\n",
"    msg.submit(\n",
"        fn=process_message,\n",
"        inputs=[msg, chatbot, model_choice, temperature, audio_output_enabled],\n",
"        outputs=[chatbot, audio_response]\n",
"    ).then(lambda: \"\", None, msg)\n",
"\n",
"    # Audio input handler using stop_recording event\n",
"    audio_input.stop_recording(\n",
"        fn=transcribe_and_send,\n",
"        inputs=[audio_input, chatbot, model_choice, temperature, audio_output_enabled],\n",
"        outputs=[chatbot, audio_response]\n",
"    )\n",
"\n",
"    # Clear button clears chat, audio response, and audio input\n",
"    clear_btn.click(\n",
"        fn=lambda: ([], None, None),\n",
"        inputs=None,\n",
"        outputs=[chatbot, audio_response, audio_input]\n",
"    )\n",
"\n",
"    # Toggle audio response visibility\n",
"    audio_output_enabled.change(\n",
"        fn=lambda x: gr.Audio(visible=x),\n",
"        inputs=audio_output_enabled,\n",
"        outputs=audio_response\n",
"    )\n",
"\n",
"# Launch the app\n",
"if __name__ == \"__main__\":\n",
"    demo.launch(share=False, debug=True)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.13"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
304
week2/community-contributions/week2 EXERCISE.ipynb
Normal file
@@ -0,0 +1,304 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "d006b2ea-9dfe-49c7-88a9-a5a0775185fd",
"metadata": {},
"source": [
"# Additional End of week Exercise - week 2\n",
"\n",
"Now use everything you've learned from Week 2 to build a full prototype for the technical question/answerer you built in Week 1 Exercise.\n",
"\n",
"This should include a Gradio UI, streaming, use of the system prompt to add expertise, and the ability to switch between models. Bonus points if you can demonstrate use of a tool!\n",
"\n",
"If you feel bold, see if you can add audio input so you can talk to it, and have it respond with audio. ChatGPT or Claude can help you, or email me if you have questions.\n",
"\n",
"I will publish a full solution here soon - unless someone beats me to it...\n",
"\n",
"There are so many commercial applications for this, from a language tutor, to a company onboarding solution, to a companion AI to a course (like this one!) I can't wait to see your results."
]
},
{
"cell_type": "markdown",
"id": "b6197b62-57fb-4569-a82d-bd4a15f4471b",
"metadata": {},
"source": [
"# Imports"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b538792e-2414-4922-bb51-95c61f6c2128",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"\n",
"import os\n",
"import requests\n",
"import json\n",
"from typing import List\n",
"from dotenv import load_dotenv\n",
"from IPython.display import Markdown, display, update_display, Audio\n",
"from openai import OpenAI\n",
"import anthropic\n",
"import base64\n",
"from io import BytesIO\n",
"from PIL import Image\n",
"import gradio as gr"
]
},
{
"cell_type": "markdown",
"id": "95afec00-4fe5-4cc9-ae0a-37c6ea4934d6",
"metadata": {},
"source": [
"# OpenAI API key"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6c23ef55-daad-4ecf-893e-b83e507f2732",
"metadata": {},
"outputs": [],
"source": [
"# Load environment variables in a file called .env\n",
"# Print the key prefixes to help with any debugging\n",
"\n",
"load_dotenv(override=True)\n",
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
"anthropic_api_key = os.getenv('ANTHROPIC_API_KEY')\n",
"\n",
"\n",
"if openai_api_key:\n",
"    print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
"else:\n",
"    print(\"OpenAI API Key not set\")\n",
"    \n",
"if anthropic_api_key:\n",
"    print(f\"Anthropic API Key exists and begins {anthropic_api_key[:7]}\")\n",
"else:\n",
"    print(\"Anthropic API Key not set\")\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e260a7ea-fe79-4222-b6d7-7982f55a52d7",
"metadata": {},
"outputs": [],
"source": [
"openai = OpenAI()\n",
"\n",
"claude = anthropic.Anthropic()"
]
},
{
"cell_type": "markdown",
"id": "270e937e-aef9-4032-b51e-ffa0fec184a1",
"metadata": {},
"source": [
"# SYSTEM prompt"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5639ebd6-d197-43c3-a1c4-4308b117e5ef",
"metadata": {},
"outputs": [],
"source": [
"# set up environment\n",
"\n",
"def system_prompt(selected_model):\n",
"    return f\"\"\"\n",
"    You are a tech expert and know every coding language, and can give\n",
"    nice, detailed and simple explanations for the given questions.\n",
"    Introduce yourself by saying which model you are every time you answer. For example, this is {selected_model}.\n",
"    \n",
"    \"\"\""
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d1d960d3-db2e-4f7c-ab7f-d5816f3bd994",
"metadata": {},
"outputs": [],
"source": [
"def talker(message):\n",
"    response = openai.audio.speech.create(\n",
"        model=\"tts-1\",\n",
"        voice=\"onyx\",\n",
"        input=message)\n",
"\n",
"    audio_stream = BytesIO(response.content)\n",
"    output_filename = \"output_audio.mp3\"\n",
"    with open(output_filename, \"wb\") as f:\n",
"        f.write(audio_stream.read())\n",
"\n",
"    display(Audio(output_filename, autoplay=True))"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f39f1bc7-87c2-4b44-8ba7-c320acb33361",
"metadata": {},
"outputs": [],
"source": [
"def listener(audio_file):\n",
"    with open(audio_file, \"rb\") as audio:\n",
"        transcript = openai.audio.transcriptions.create(\n",
"            model=\"whisper-1\",\n",
"            file=audio\n",
"        )\n",
"    return transcript.text"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f88078a9-49ac-48b1-9f07-5b3977e88a52",
"metadata": {},
"outputs": [],
"source": [
"def chat(cleared_entry, history, selected_model):\n",
"    messages = [{\"role\": \"system\", \"content\": system_prompt(selected_model)}] + history\n",
"\n",
"    print(messages)\n",
"    \n",
"    if selected_model == \"GPT-4o-mini\":\n",
"        stream = openai.chat.completions.create(model=\"gpt-4o-mini\", messages=messages, stream=True)\n",
"        response = \"\"\n",
"        \n",
"        for chunk in stream:\n",
"            try:\n",
"                response += chunk.choices[0].delta.content or ''\n",
"                \n",
"                updated_history = history + [{\"role\": \"assistant\", \"content\": response}]\n",
"                # talker(response)\n",
"                yield updated_history, None\n",
"            except Exception as e:\n",
"                print(f\"Streaming error: {e}\")\n",
"                # Yield a (history, image) pair so the outputs still match the UI wiring\n",
"                yield history + [{\"role\": \"assistant\", \"content\": \"Sorry, there was an error processing your request.\"}], None\n",
"        # talker(response)\n",
"    elif selected_model == \"Claude-sonnet-4\":\n",
"        claude_messages = [{\"role\": msg[\"role\"], \"content\": msg[\"content\"]} for msg in history]\n",
"        print(claude_messages)\n",
"        result = claude.messages.stream(\n",
"            model=\"claude-sonnet-4-20250514\",\n",
"            max_tokens=200,\n",
"            temperature=0.7,\n",
"            system=system_prompt(selected_model),\n",
"            messages=claude_messages,\n",
"        )\n",
"        \n",
"        response = \"\"\n",
"        with result as stream:\n",
"            for text in stream.text_stream:\n",
"                response += text\n",
"                \n",
"                updated_history = history + [{\"role\": \"assistant\", \"content\": response}]\n",
"                \n",
"                yield updated_history, None\n",
"        # talker(response)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d1863d62-a2e8-454b-9296-710c4e5f9396",
"metadata": {},
"outputs": [],
"source": [
"with gr.Blocks() as ui:\n",
"\n",
"    gr.Markdown(\"## AI Chat Assistant\")\n",
"    gr.Markdown(\"**Select your preferred AI model:**\")\n",
"    \n",
"    model_dropdown = gr.Dropdown(\n",
"        choices=[\"GPT-4o-mini\", \"Claude-sonnet-4\"],\n",
"        value=\"GPT-4o-mini\",  # default selection\n",
"        label=\"Choose Model\"\n",
"    )\n",
"\n",
"    \n",
"    with gr.Row():\n",
"        chatbot = gr.Chatbot(height=200, type=\"messages\")\n",
"        image_output = gr.Image(height=200)\n",
"    with gr.Row():\n",
"        entry = gr.Textbox(label=\"Chat with our AI Assistant:\")\n",
"    with gr.Row():\n",
"        # Audio input for voice messages\n",
"        audio_input = gr.Audio(\n",
"            sources=[\"microphone\", \"upload\"],\n",
"            type=\"filepath\",\n",
"            label=\"🎙️ Voice Message\"\n",
"        )\n",
"    with gr.Row():\n",
"        voice_submit = gr.Button(\"Send Voice Message\", variant=\"secondary\")\n",
"        clear = gr.Button(\"Clear\")\n",
"\n",
"\n",
"    def do_entry(message, history):\n",
"        history += [{\"role\":\"user\", \"content\":message}]\n",
"        return \"\", history\n",
"\n",
"    def process_voice_input(audio_file):\n",
"        \"\"\"Convert voice to text and put it in the text box\"\"\"\n",
"        if audio_file is not None:\n",
"            transcribed_text = listener(audio_file)\n",
"            if transcribed_text and not transcribed_text.startswith(\"Error\"):\n",
"                return transcribed_text\n",
"        return \"\"\n",
"\n",
"    entry.submit(do_entry, inputs=[entry, chatbot], outputs=[entry, chatbot]).then(\n",
"        chat, inputs=[entry, chatbot, model_dropdown], outputs=[chatbot, image_output]\n",
"    )\n",
"\n",
"    voice_submit.click(\n",
"        process_voice_input,\n",
"        inputs=[audio_input],\n",
"        outputs=[entry]\n",
"    )\n",
"    \n",
"    clear.click(lambda: None, inputs=None, outputs=chatbot, queue=False)\n",
"\n",
"ui.launch(inbrowser=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "46085f59-8945-4a64-9db1-d10cd44284db",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.13"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
435
week2/community-contributions/week2_challenge_tripplanner.ipynb
Normal file
435
week2/community-contributions/week2_challenge_tripplanner.ipynb
Normal file
@@ -0,0 +1,435 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "d006b2ea-9dfe-49c7-88a9-a5a0775185fd",
"metadata": {},
"source": [
"# Additional End of week Exercise - week 2\n",
"\n",
"Now use everything you've learned from Week 2 to build a full prototype for the technical question/answerer you built in Week 1 Exercise.\n",
"\n",
"This should include a Gradio UI, streaming, use of the system prompt to add expertise, and the ability to switch between models. Bonus points if you can demonstrate use of a tool!\n",
"\n",
"If you feel bold, see if you can add audio input so you can talk to it, and have it respond with audio. ChatGPT or Claude can help you, or email me if you have questions.\n",
"\n",
"I will publish a full solution here soon - unless someone beats me to it...\n",
"\n",
"There are so many commercial applications for this, from a language tutor, to a company onboarding solution, to a companion AI to a course (like this one!) I can't wait to see your results."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a07e7793-b8f5-44f4-aded-5562f633271a",
"metadata": {},
"outputs": [],
"source": [
"ONLINE TRIP PLANNING ASSISTANT"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7ea68f58-8b6f-418d-b04c-98e38147a2c3",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"\n",
"import os\n",
"import json\n",
"from dotenv import load_dotenv\n",
"from openai import OpenAI\n",
"import gradio as gr"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "80fd1847-3f5e-4e9b-8b54-b4ca5923aaa8",
"metadata": {},
"outputs": [],
"source": [
"force_dark_mode = \"\"\"\n",
"function refresh() {\n",
"    const url = new URL(window.location);\n",
"    if (url.searchParams.get('__theme') !== 'dark') {\n",
"        url.searchParams.set('__theme', 'dark');\n",
"        window.location.href = url.href;\n",
"    }\n",
"}\n",
"\"\"\""
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "212f6842-6058-4246-8d84-c4bcc1f1a651",
"metadata": {},
"outputs": [],
"source": [
"# Initialization\n",
"\n",
"load_dotenv(override=True)\n",
"\n",
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
"if openai_api_key:\n",
"    print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
"else:\n",
"    print(\"OpenAI API Key not set\")\n",
"\n",
"MODEL = \"gpt-4o-mini\"\n",
"openai = OpenAI()"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e4980d17-a8a1-4347-9a0c-56a57c93d063",
"metadata": {},
"outputs": [],
"source": [
"system_message=\"You are the AI trip planner assistant your job is to analyse the user message and give clear and crisp response based on their queries.\"\n",
|
||||
"system_message+=\"Only if they are asking you specifically for iternary for their trip destination for the required number of days provided by them in the user message. for example\"\n",
|
||||
"system_message+=\"user: I want a plan for Ooty trip for 3 days.\" \n",
|
||||
"system_message+=\"then you need to provide an clear simple iternary for OOty for 3 days. Like Day1 this place, this place Day 2 this place this place in a good format.\"\n",
|
||||
"system_message+=\"Always be accurate if you don't know the answer say so.\""
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a215d7c8-5c3f-4f10-b492-a70d48814010",
"metadata": {},
"outputs": [],
"source": [
"# This function looks rather simpler than the one from my video, because we're taking advantage of the latest Gradio updates\n",
"\n",
"def chat(message, history):\n",
"    messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
"    response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
"    return response.choices[0].message.content\n",
"\n",
"gr.ChatInterface(fn=chat, type=\"messages\", js=force_dark_mode).launch()"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "fe328993-5b14-4909-a2c5-3e0cd1bc8a5e",
"metadata": {},
"outputs": [],
"source": [
"#let add mulitmodal feature image \n",
|
||||
"import base64\n",
|
||||
"from io import BytesIO\n",
|
||||
"from PIL import Image"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "8fb145ee-2537-48a4-80b0-7921d79669be",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def artist(city):\n",
|
||||
" image_response = openai.images.generate(\n",
|
||||
" model=\"dall-e-3\",\n",
|
||||
" prompt=f\"An image representing a vacation in {city}, showing tourist spots and everything unique about {city}, in a vibrant pop-art style\",\n",
|
||||
" size=\"1024x1024\",\n",
|
||||
" n=1,\n",
|
||||
" response_format=\"b64_json\",\n",
|
||||
" )\n",
|
||||
" image_base64 = image_response.data[0].b64_json\n",
|
||||
" image_data = base64.b64decode(image_base64)\n",
|
||||
" return Image.open(BytesIO(image_data))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"id": "c475aae4-79a4-4a49-b846-b01b6a67000d",
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"#now lets implement tools for checking the ticket price and for demo flight booking assitance"
|
||||
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0c047872-df5c-4ea1-bc43-0c5da596aac7",
"metadata": {},
"outputs": [],
"source": [
"# Add this import at the top\n",
"from reportlab.lib.pagesizes import letter\n",
"from reportlab.pdfgen import canvas\n",
"from reportlab.lib.units import inch\n",
"from datetime import datetime\n",
"import io\n",
"import base64\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ceeb60f7-ac79-46d8-bbf3-5fab163d9ccf",
"metadata": {},
"outputs": [],
"source": [
"# for checking ticket price\n",
"\n",
"ticket_prices = {\"london\": \"$799\", \"paris\": \"$899\", \"tokyo\": \"$1400\", \"berlin\": \"$499\"}\n",
"\n",
"def get_ticket_price(destination_city):\n",
"    print(f\"Tool get_ticket_price called for {destination_city}\")\n",
"    city = destination_city.lower()\n",
"    return ticket_prices.get(city, \"Unknown\")\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9b92db79-87ba-48ec-840b-a54e3d5cb5cc",
"metadata": {},
"outputs": [],
"source": [
"# for flight booking\n",
"# Add the booking function\n",
"def book_flight(destination_city, travel_date, passenger_name=\"Traveler\"):\n",
"    print(f\"Tool book_flight called for {destination_city} on {travel_date}\")\n",
"\n",
"    # Generate a simple PDF ticket\n",
"    buffer = io.BytesIO()\n",
"    c = canvas.Canvas(buffer, pagesize=letter)\n",
"    width, height = letter\n",
"\n",
"    # Draw ticket\n",
"    c.setFont(\"Helvetica-Bold\", 24)\n",
"    c.drawString(2*inch, height - 2*inch, \"FLIGHT TICKET\")\n",
"\n",
"    c.setFont(\"Helvetica\", 12)\n",
"    c.drawString(1*inch, height - 3*inch, f\"Passenger: {passenger_name}\")\n",
"    c.drawString(1*inch, height - 3.5*inch, f\"Destination: {destination_city.upper()}\")\n",
"    c.drawString(1*inch, height - 4*inch, f\"Travel Date: {travel_date}\")\n",
"    c.drawString(1*inch, height - 4.5*inch, f\"Booking Reference: BK{datetime.now().strftime('%Y%m%d%H%M%S')}\")\n",
"    c.drawString(1*inch, height - 5*inch, f\"Price: {ticket_prices.get(destination_city.lower(), 'N/A')}\")\n",
"    c.drawString(1*inch, height - 6*inch, \"Status: CONFIRMED ✓\")\n",
"\n",
"    c.save()\n",
"\n",
"    pdf_bytes = buffer.getvalue()\n",
"    buffer.close()\n",
"\n",
"    return base64.b64encode(pdf_bytes).decode(), destination_city, travel_date\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "51e9a429-dafa-4fdf-9e30-bef07b78a1b3",
"metadata": {},
"outputs": [],
"source": [
"# There's a particular dictionary structure that's required to describe our function:\n",
"\n",
"price_function = {\n",
"    \"name\": \"get_ticket_price\",\n",
"    \"description\": \"Get the price of a return ticket to the destination city. Call this whenever you need to know the ticket price, for example when a customer asks 'How much is a ticket to this city'\",\n",
"    \"parameters\": {\n",
"        \"type\": \"object\",\n",
"        \"properties\": {\n",
"            \"destination_city\": {\n",
"                \"type\": \"string\",\n",
"                \"description\": \"The city that the customer wants to travel to\",\n",
"            },\n",
"        },\n",
"        \"required\": [\"destination_city\"],\n",
"        \"additionalProperties\": False\n",
"    }\n",
"}\n",
"\n",
"booking_function = {\n",
"    \"name\": \"book_flight\",\n",
"    \"description\": \"Book a flight ticket to the destination city for a specific date. Call this when a customer wants to book a flight and provides a travel date.\",\n",
"    \"parameters\": {\n",
"        \"type\": \"object\",\n",
"        \"properties\": {\n",
"            \"destination_city\": {\n",
"                \"type\": \"string\",\n",
"                \"description\": \"The destination city for the flight\",\n",
"            },\n",
"            \"travel_date\": {\n",
"                \"type\": \"string\",\n",
"                \"description\": \"The date of travel in format YYYY-MM-DD or any date format provided by user\",\n",
"            },\n",
"            \"passenger_name\": {\n",
"                \"type\": \"string\",\n",
"                \"description\": \"The name of the passenger (optional)\",\n",
"            }\n",
"        },\n",
"        \"required\": [\"destination_city\", \"travel_date\"],\n",
"        \"additionalProperties\": False\n",
"    }\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0912f99b-1650-4c40-8249-5fe808c85c75",
"metadata": {},
"outputs": [],
"source": [
"tools = [\n",
"    {\"type\": \"function\", \"function\": price_function},\n",
"    {\"type\": \"function\", \"function\": booking_function}\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "fdc5b499-c6f2-48fd-a6ee-304f7dbe8b0e",
"metadata": {},
"outputs": [],
"source": [
"# Update handle_tool_call to handle both tools\n",
"def handle_tool_call(message):\n",
"    tool_call = message.tool_calls[0]\n",
"    arguments = json.loads(tool_call.function.arguments)\n",
"\n",
"    if tool_call.function.name == \"get_ticket_price\":\n",
"        city = arguments.get('destination_city')\n",
"        price = get_ticket_price(city)\n",
"        response = {\n",
"            \"role\": \"tool\",\n",
"            \"content\": json.dumps({\"destination_city\": city, \"price\": price}),\n",
"            \"tool_call_id\": tool_call.id\n",
"        }\n",
"        return response, city\n",
"\n",
"    elif tool_call.function.name == \"book_flight\":\n",
"        city = arguments.get('destination_city')\n",
"        date = arguments.get('travel_date')\n",
"        passenger = arguments.get('passenger_name', 'Traveler')\n",
"        pdf_base64, city, date = book_flight(city, date, passenger)\n",
"        response = {\n",
"            \"role\": \"tool\",\n",
"            \"content\": json.dumps({\n",
"                \"destination_city\": city,\n",
"                \"travel_date\": date,\n",
"                \"status\": \"confirmed\",\n",
"                \"ticket_pdf\": pdf_base64\n",
"            }),\n",
"            \"tool_call_id\": tool_call.id\n",
"        }\n",
"        return response, city"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b319f933-da42-4b78-8a07-fedab965abae",
"metadata": {},
"outputs": [],
"source": [
"def chat(message, history):\n",
"    messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
"    response = openai.chat.completions.create(model=MODEL, messages=messages, tools=tools)\n",
"\n",
"    pdf_file = None\n",
"    image = None\n",
"\n",
"    if response.choices[0].finish_reason == \"tool_calls\":\n",
"        message_obj = response.choices[0].message\n",
"        tool_response, city = handle_tool_call(message_obj)\n",
"\n",
"        # Check which tool was called\n",
"        if message_obj.tool_calls[0].function.name == \"get_ticket_price\":\n",
"            image = artist(city)\n",
"\n",
"        elif message_obj.tool_calls[0].function.name == \"book_flight\":\n",
"            tool_content = json.loads(tool_response[\"content\"])\n",
"            pdf_base64 = tool_content.get(\"ticket_pdf\")\n",
"\n",
"            if pdf_base64:\n",
"                # Decode and save PDF\n",
"                pdf_bytes = base64.b64decode(pdf_base64)\n",
"                pdf_file = f\"ticket_{city}_{datetime.now().strftime('%Y%m%d_%H%M%S')}.pdf\"\n",
"                with open(pdf_file, \"wb\") as f:\n",
"                    f.write(pdf_bytes)\n",
"\n",
"        messages.append(message_obj)\n",
"        messages.append(tool_response)\n",
"        response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
"\n",
"    return response.choices[0].message.content, pdf_file, image"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "49f91127-e282-409d-84b8-8d04eb902c74",
"metadata": {},
"outputs": [],
"source": [
"with gr.Blocks() as demo:\n",
"    gr.Markdown(\"# 🌍 AI Trip Planner Assistant\")\n",
"\n",
"    with gr.Row():\n",
"        chatbot = gr.Chatbot(type=\"messages\", height=400)\n",
"        image_output = gr.Image(height=400, interactive=False, show_label=False)\n",
"\n",
"    with gr.Row():\n",
"        msg = gr.Textbox(label=\"Your message\", placeholder=\"Ask about ticket prices or book a flight...\", scale=4)\n",
"        send_btn = gr.Button(\"Send\", scale=1)\n",
"\n",
"    file_output = gr.File(label=\"📄 Download Your Ticket\", visible=True)\n",
"\n",
"    def respond(message, history):\n",
"        bot_message, pdf_path, image = chat(message, history)\n",
"        history.append({\"role\": \"user\", \"content\": message})\n",
"        history.append({\"role\": \"assistant\", \"content\": bot_message})\n",
"        return history, \"\", pdf_path, image\n",
"\n",
"    msg.submit(respond, [msg, chatbot], [chatbot, msg, file_output, image_output])\n",
"    send_btn.click(respond, [msg, chatbot], [chatbot, msg, file_output, image_output])\n",
"\n",
"demo.launch()"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "23b746f6-94d7-44d7-9dec-b4f4dcdd6b61",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.13"
}
},
"nbformat": 4,
"nbformat_minor": 5
}