<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/4acfc5db242b4364aeb074d7ee353ca5&quot; frameborder=&quot;0&quot; width=&quot;1728&quot; height=&quot;1296&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1296</height><width>1728</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1296</thumbnail_height><thumbnail_width>1728</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/4acfc5db242b4364aeb074d7ee353ca5-00001.gif</thumbnail_url><duration>266.88</duration><title>Fine Tuning Snowflake Arctic Embedding Model using LlamaIndex</title><description>In this video, we fine-tune the Snowflake Arctic Embed Large model (334M parameters) on camelid research papers using LlamaIndex, then evaluate it against OpenAI's text-embedding-3-small model, achieving great results compared to OpenAI after fine-tuning.

Github link: https://github.com/rajkstats/AIE2/blob/main/Week%205/Fine_tuning_Arctic_Embedding_Model_using_LlamaIndex_Assignment_Version_RAJK.ipynb

Hugging Face Fine tuned model: https://huggingface.co/rajkstats/snowflake-ft-camelids-l</description></oembed>