{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/d44040318697420c94f8b20867d673ac\" frameborder=\"0\" width=\"1680\" height=\"1260\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1260,"width":1680,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1260,"thumbnail_width":1680,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/d44040318697420c94f8b20867d673ac-10cdaefe00fea27b.gif","duration":268.825,"title":"5-Understanding Neural Networks: The Role of Calculus in Learning 📈","description":"In this notebook, I explain the calculus concepts that underpin how neural networks learn, focusing on derivatives, gradients, and backpropagation. I walk through the importance of derivatives in measuring output sensitivity to weight changes and how we utilize the chain rule to compute gradients through layers. We explore gradient descent, where we adjust weights to minimize loss, and I demonstrate the complete training loop on the XOR problem, showcasing both the forward and backward passes. I encourage you to observe the visualizations of activation functions, loss surfaces, and the learning process. Overall, this notebook connects key calculus concepts to the functioning of neural networks."}