{
 "cells": [
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "id": "GHsssBgWM_l0"
   },
   "source": [
    "# 🔍 Predicting Item Prices from Descriptions (Part 8)\n",
    "---\n",
    "- Data Curation & Preprocessing\n",
    "- Model Benchmarking – Traditional ML vs LLMs\n",
    "- E5 Embeddings & RAG\n",
    "- Fine-Tuning GPT-4o Mini\n",
    "- Evaluating LLaMA 3.1 8B Quantized\n",
    "- Fine-Tuning LLaMA 3.1 with QLoRA\n",
    "- Evaluating Fine-Tuned LLaMA\n",
    "- ➡️ Summary & Leaderboard\n",
    "\n",
    "---\n",
    "\n",
    "# 🧪 Part 8: Summary & Leaderboard\n",
    "\n",
"# 🥇 The winner is the LLaMA 3.1 8B (4-bit) fine-tuned on 400K samples \n",
|
||
"\n",
|
||
"LLaMA 3.1 8B (4-bit) fine-tuned on 400K samples is outperforming even the big guy GPT-4o — with the lowest error and highest accuracy (75.6%).\n",
|
||
"\n",
|
||
"RAG + GPT-4o Mini also did well, proving that retrieval adds real value.\n",
|
||
"\n",
|
||
"On the other hand, traditional ML models and even human guesses, gave weaker results and fell behind the top models.\n",
|
||
"\n",
|
||
"💡 As we’ve seen, a **well-tuned open-source small model** can do amazing things on a focused task — sometimes even better than large, closed models.\n",
|
||
"It’s not about size — it’s about fit, focus, and fine-tuning.\n",
|
||
"\n",
    "# ✨ Conclusion\n",
    "What a journey! From classic ML to state-of-the-art LLMs, from embeddings to retrieval and fine-tuning — we explored it all to answer: who predicts prices best?\n",
    "\n",
    "Thanks for following along — see you in the next challenge! 🚀\n",
    "\n",
    "---\n",
    "📢 Find more LLM notebooks on my [GitHub repository](https://github.com/lisekarimi/lexo)"
   ]
  }
 ],
 "metadata": {
  "accelerator": "GPU",
  "colab": {
   "gpuType": "T4",
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}