Package updates, more Ollama, fixes
@@ -104,8 +104,8 @@
    "outputs": [],
    "source": [
     "# import for google\n",
-    "# in rare cases, this seems to give an error on some systems. Please reach out to me if this happens,\n",
-    "# or you can feel free to skip Gemini - it's the lowest priority of the frontier models that we use\n",
+    "# in rare cases, this seems to give an error on some systems, or even crashes the kernel\n",
+    "# If this happens to you, simply ignore this cell - I give an alternative approach for using Gemini later\n",
     "\n",
     "import google.generativeai"
    ]
@@ -148,14 +148,22 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "# Connect to OpenAI, Anthropic and Google\n",
-    "# All 3 APIs are similar\n",
-    "# Having problems with API files? You can use openai = OpenAI(api_key=\"your-key-here\") and same for claude\n",
-    "# Having problems with Google Gemini setup? Then just skip Gemini; you'll get all the experience you need from GPT and Claude.\n",
+    "# Connect to OpenAI, Anthropic\n",
     "\n",
     "openai = OpenAI()\n",
     "\n",
-    "claude = anthropic.Anthropic()\n",
+    "claude = anthropic.Anthropic()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "425ed580-808d-429b-85b0-6cba50ca1d0c",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# This is the setup code for Gemini\n",
+    "# Having problems with Google Gemini setup? Then just ignore this cell; when we use Gemini, I'll give you an alternative that bypasses this library altogether\n",
     "\n",
     "google.generativeai.configure()"
    ]
@@ -308,7 +316,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "# The API for Gemini has a slightly different structure\n",
+    "# The API for Gemini has a slightly different structure.\n",
+    "# I've heard that on some PCs, this Gemini code causes the Kernel to crash.\n",
+    "# If that happens to you, please skip this cell and use the next cell instead - an alternative approach.\n",
     "\n",
     "gemini = google.generativeai.GenerativeModel(\n",
     "    model_name='gemini-1.5-flash',\n",
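The comment added in this hunk notes that the Gemini API has a slightly different structure from OpenAI's chat.completions interface. For orientation, a minimal sketch of the google.generativeai call pattern the cell follows; the system and user prompt strings here are illustrative assumptions by the editor, not text taken from the commit.

import os
import google.generativeai

# Configure the library with the Google API key (read from the environment here)
google.generativeai.configure(api_key=os.getenv("GOOGLE_API_KEY"))

# Unlike the OpenAI client, the model object is created up front with its name and system prompt
gemini = google.generativeai.GenerativeModel(
    model_name="gemini-1.5-flash",
    system_instruction="You are a helpful assistant",  # assumed system prompt
)

# generate_content takes the user prompt directly and returns an object with a .text field
response = gemini.generate_content("Tell a light-hearted joke for an audience of data scientists")  # assumed prompt
print(response.text)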
@@ -318,6 +328,28 @@
     "print(response.text)"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "49009a30-037d-41c8-b874-127f61c4aa3a",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# As an alternative way to use Gemini that bypasses Google's python API library,\n",
+    "# Google has recently released new endpoints that mean you can use Gemini via the client libraries for OpenAI!\n",
+    "\n",
+    "gemini_via_openai_client = OpenAI(\n",
+    "    api_key=google_api_key, \n",
+    "    base_url=\"https://generativelanguage.googleapis.com/v1beta/openai/\"\n",
+    ")\n",
+    "\n",
+    "response = gemini_via_openai_client.chat.completions.create(\n",
+    "    model=\"gemini-1.5-flash\",\n",
+    "    messages=prompts\n",
+    ")\n",
+    "print(response.choices[0].message.content)"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -534,7 +566,7 @@
     "\n",
     "Try creating a 3-way, perhaps bringing Gemini into the conversation! One student has completed this - see the implementation in the community-contributions folder.\n",
     "\n",
-    "Try doing this yourself before you look at the solutions.\n",
+    "Try doing this yourself before you look at the solutions. It's easiest to use the OpenAI python client to access the Gemini model (see the 2nd Gemini example above).\n",
     "\n",
     "## Additional exercise\n",
     "\n",
@@ -584,7 +616,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.11.10"
+   "version": "3.11.11"
   }
  },
  "nbformat": 4,