Video: Model Selection (Loom, ~9 min)
https://www.loom.com/embed/f279af6c56854f3fad1a082d20f252af

In this video, I walk you through my process for selecting and evaluating a model using XGBoost classification. I discuss the importance of AUC scores, especially in the context of class imbalance, and share my findings from tuning parameters and feature selection. I found that using all features improved our model's performance significantly, achieving a test score of 0.9815. Please review the details and let me know your thoughts on the parameter tuning strategies I employed.
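
For readers who prefer code to video, below is a minimal sketch of the kind of workflow described: an XGBoost classifier scored with ROC AUC (appropriate under class imbalance) and a small hyperparameter search. The synthetic dataset, parameter grid, and values are illustrative assumptions, not the actual data or tuning strategy from the video.

```python
# Sketch only: synthetic imbalanced data and a placeholder parameter grid
# stand in for the real dataset and the tuning discussed in the video.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

# Imbalanced toy data (~10% positive class).
X, y = make_classification(
    n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.2, random_state=42
)

# Score with ROC AUC rather than accuracy, since accuracy is misleading
# when one class dominates.
param_grid = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.05, 0.1],
    "n_estimators": [200, 400],
}
search = GridSearchCV(
    XGBClassifier(eval_metric="auc"),
    param_grid,
    scoring="roc_auc",
    cv=3,
)
search.fit(X_train, y_train)

# Evaluate the best model on the held-out test split.
test_auc = roc_auc_score(
    y_test, search.best_estimator_.predict_proba(X_test)[:, 1]
)
print("best params:", search.best_params_)
print(f"test AUC: {test_auc:.4f}")
```

The same pattern (fit a search on the training split, report AUC on a held-out test split) is how a figure like the 0.9815 test score mentioned above would typically be produced; the exact features and grid used in the video are not reproduced here.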