<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/83397059d6f6492d8b9fe383a6b2043f&quot; frameborder=&quot;0&quot; width=&quot;1680&quot; height=&quot;1260&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1260</height><width>1680</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1260</thumbnail_height><thumbnail_width>1680</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/83397059d6f6492d8b9fe383a6b2043f-bbf67ba608921e3c.gif</thumbnail_url><duration>234.765</duration><title>12-Mastering PyTorch: Advanced Techniques and Custom Architectures</title><description>In this advanced PyTorch tutorial, I dive deeper into concepts like higher-order derivatives, Jacobians, and Hessians, while also showcasing how to define custom gradients using torch.autograd.Function. We will build core operations from scratch, such as 2D convolution and batch normalization, to understand the mechanics underlying nn.Conv2d and nn.BatchNorm2d. Additionally, I will guide you through creating custom layers with nn.Module and constructing advanced architectures like ResNet and transformer blocks. We will also cover custom training loops, including gradient clipping and learning rate scheduling, culminating in a practical demo where we integrate these components into a mini ResNet model for digit classification. I encourage you to follow along and experiment with these concepts to master PyTorch.</description></oembed>