{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/4acfc5db242b4364aeb074d7ee353ca5\" frameborder=\"0\" width=\"1728\" height=\"1296\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1296,"width":1728,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1296,"thumbnail_width":1728,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/4acfc5db242b4364aeb074d7ee353ca5-00001.gif","duration":266.88,"title":"Fine Tuning Snowflake Arctic Embedding Model using LlamaIndex","description":"In this video, we fine-tune the Snowflake Arctic Embed Large model (334M params) on camelid research papers using LlamaIndex, evaluate it against the OpenAI text-embedding-3-small model, and see strong results compared to OpenAI after fine-tuning.\n\nGitHub link: https://github.com/rajkstats/AIE2/blob/main/Week%205/Fine_tuning_Arctic_Embedding_Model_using_LlamaIndex_Assignment_Version_RAJK.ipynb\n\nHugging Face fine-tuned model: https://huggingface.co/rajkstats/snowflake-ft-camelids-l"}