{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/7474d8c38f9c4c5bb09225761aef0c0d\" frameborder=\"0\" width=\"1680\" height=\"1260\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1260,"width":1680,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1260,"thumbnail_width":1680,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/7474d8c38f9c4c5bb09225761aef0c0d-ee949423959aa363.gif","duration":358.639,"title":"7-Foundations of Probability in Deep Learning 📊","description":"In this video, I discuss the fundamental concepts of probability as they relate to deep learning, covering topics such as sample spaces, conditional probability, and Bayes' theorem. I also explain the significance of random variables, expected value, variance, and key distributions like Bernoulli and Gaussian. Additionally, I delve into information theory, maximum likelihood estimation, and the role of sampling methods in deep learning. I encourage you to implement Softmax, CrossEntropyLoss, and DropoutLayer from scratch as practical exercises to reinforce your understanding. Thank you for your attention!"}