remove client.close() to allow multiple llm runs
@@ -9,7 +9,7 @@
 "\n",
 "Summarization use-case in which the user provides an article, which the LLM will analyze to suggest an SEO-optimized title.\n",
 "\n",
-"NOTES:\n",
+"**NOTES**:\n",
 "\n",
 "1. This version does NOT support website scrapping. You must copy and paste the required article.\n",
 "2. The following models were configured:\n",
@@ -17,7 +17,21 @@
 " b. Llama llama3.2\n",
 " c. Deepseek deepseek-r1:1.5b\n",
 " It is possible to configure additional models by adding the new model to the MODELS dictionary and its\n",
-" initialization to the CLIENTS dictionary."
+" initialization to the CLIENTS dictionary. Then, call the model with --> ***answer =\n",
+" get_answer('NEW_MODEL')***.\n",
+"3. Users are encouraged to assess and rank the suggested titles using any headline analyzer tool online.\n",
+" Example: https://www.isitwp.com/headline-analyzer/. "
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "e773daa6-d05e-49bf-ad8e-a8ed4882b77e",
+"metadata": {},
+"outputs": [],
+"source": [
+"# Confirming Llama is loaded\n",
+"!ollama pull llama3.2"
 ]
 },
 {
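The notes added above describe the notebook's extension point: register a model name in the MODELS dictionary, register a matching client in the CLIENTS dictionary, then call `get_answer('NEW_MODEL')`. A minimal sketch of that registry pattern follows; the stub client, the `qwen2.5:0.5b` model name, and the Ollama endpoint URL are illustrative assumptions, not the notebook's exact values.

```python
# Sketch of the MODELS/CLIENTS registry the notes describe; the stub client,
# the NEW_MODEL entry, and the endpoint URL are illustrative assumptions.
MODELS = {
    "LLAMA": "llama3.2",
    "DEEPSEEK": "deepseek-r1:1.5b",
}

# In the notebook, CLIENTS maps the same keys to OpenAI-compatible client
# objects; a stub stands in here so the sketch runs without a server.
class StubClient:
    def __init__(self, base_url):
        self.base_url = base_url

CLIENTS = {
    "LLAMA": StubClient("http://localhost:11434/v1"),
    "DEEPSEEK": StubClient("http://localhost:11434/v1"),
}

def get_answer(model: str) -> str:
    """Dispatch to whichever client/model pair is registered under `model`."""
    client = CLIENTS[model]  # raises KeyError for unregistered keys
    # The real notebook calls client.chat.completions.create(...) here.
    return f"{MODELS[model]} via {client.base_url}"

# Adding a model means touching both dictionaries, then calling get_answer:
MODELS["NEW_MODEL"] = "qwen2.5:0.5b"  # hypothetical model name
CLIENTS["NEW_MODEL"] = StubClient("http://localhost:11434/v1")
answer = get_answer("NEW_MODEL")
```

The point of keeping two parallel dictionaries is that `get_answer` needs both a model identifier and a client bound to the right endpoint; a key missing from either one fails fast with a `KeyError`.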
@@ -43,18 +57,11 @@
 "source": [
 "# set environment variables for OpenAi\n",
 "load_dotenv(override=True)\n",
-"api_key = os.getenv('OPENAI_API_KEY')\n"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"id": "e773daa6-d05e-49bf-ad8e-a8ed4882b77e",
-"metadata": {},
-"outputs": [],
-"source": [
-"# Confirming Llama is loaded\n",
-"!ollama pull llama3.2"
+"api_key = os.getenv('OPENAI_API_KEY')\n",
+"\n",
+"# validate API Key\n",
+"if not api_key:\n",
+" raise ValueError(\"No API key was found! Please check the .env file.\")"
 ]
 },
 {
@@ -153,9 +160,6 @@
 " model=MODELS[model],\n",
 " messages=messages\n",
 " )\n",
 "\n",
-" # closing LLM client connection\n",
-" client.close()\n",
-" \n",
 " # return answer\n",
 " return response.choices[0].message.content\n",
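The lines removed above closed the shared client after every request. Because the entries in CLIENTS are constructed once and reused across runs, the second call to `get_answer` would then operate on an already-closed connection, which is the bug the commit title refers to. A stand-in class (not the real OpenAI client, just the same lifecycle) illustrates the failure mode:

```python
# Stand-in illustrating why closing a cached client breaks later runs;
# this mimics the client lifecycle only, not the real OpenAI API.
class CachedClient:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def create(self):
        if self.closed:
            raise RuntimeError("cannot send a request on a closed client")
        return "title suggestion"

client = CachedClient()       # built once, like the entries in CLIENTS
first = client.create()       # first run succeeds
client.close()                # what the removed client.close() did
try:
    second = client.create()  # second run now fails
except RuntimeError as exc:
    second = f"error: {exc}"
```

Dropping the per-call `close()` keeps the cached client usable for every subsequent run; the connection is released when the process (or kernel) exits.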
@@ -199,10 +203,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# get openAi answer\n",
+"# get Llama answer\n",
 "answer = get_answer('LLAMA')\n",
 "\n",
-"# display openAi answer\n",
+"# display Llama answer\n",
 "display(Markdown(f\"### {MODELS['LLAMA']} Answer\\n\\n{answer}\" ))"
 ]
 },
@@ -221,10 +225,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# get openAi answer\n",
+"# get Deepseek answer\n",
 "answer = get_answer('DEEPSEEK')\n",
 "\n",
-"# display openAi answer\n",
+"# display Deepseek answer\n",
 "display(Markdown(f\"### {MODELS['DEEPSEEK']} Answer\\n\\n{answer}\" ))"
 ]
 },
@@ -235,7 +239,7 @@
 "source": [
 "### Suggested future improvements\n",
 "\n",
-"1. Add support for website scrapping to replace copy/pasting of articles.\n",
+"1. Add website scrapping support to replace copy/pasting of articles.\n",
 "2. Improve the system_prompt to provide specific SEO best practices to adopt during the title generation.\n",
 "3. Rephrase the system_prompt to ensure the model provides a single Title (not a list of suggestions). \n",
 "4. Add the logic that would allow each model to assess the recommendations from the different models and \n",
@@ -245,12 +249,10 @@
 {
 "cell_type": "code",
 "execution_count": null,
-"id": "1af8260b-5ba1-4eeb-acd0-02de537b1bf4",
+"id": "cf7403ac-d43b-4493-98bb-6fee94950cb0",
 "metadata": {},
 "outputs": [],
-"source": [
-"S"
-]
+"source": []
 },
 ],
 "metadata": {