<?xml version="1.0" encoding="UTF-8"?>
<oembed>
  <type>video</type>
  <version>1.0</version>
  <html>&lt;iframe src=&quot;https://www.loom.com/embed/cf7cee8151544d6c86467db16ecb858f&quot; frameborder=&quot;0&quot; width=&quot;1280&quot; height=&quot;960&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html>
  <height>960</height>
  <width>1280</width>
  <provider_name>Loom</provider_name>
  <provider_url>https://www.loom.com</provider_url>
  <thumbnail_height>960</thumbnail_height>
  <thumbnail_width>1280</thumbnail_width>
  <thumbnail_url>https://cdn.loom.com/sessions/thumbnails/cf7cee8151544d6c86467db16ecb858f-00001.gif</thumbnail_url>
  <duration>88.557</duration>
  <title>Developing a Custom DistilBERT Model for Semantic Classification</title>
  <description>In this video, I discuss my project of developing a custom DistilBERT model with a 768-dimensional hidden layer for semantic classification tasks. I explain what semantic classification is and how it can be used for sentiment analysis, topic categorization, intent detection, and spam detection. I also share my goal of optimizing DistilBERT's performance for specific needs, along with the tools I used, such as the Intel Developer Cloud Notebook and the Transformers library from Hugging Face. No action is requested of viewers.</description>
</oembed>