Just getting started and want to know what Backpropagation is in Deep Learning, or in any Neural Network for that matter?
Or are you that “Want to get deeper” type, and want to understand the mathematical concepts and equations behind Backpropagation?
Or does even a ‘simple’ mathematical understanding not satisfy you, and you want to actually derive the mathematics from basic Calculus and dive even deeper?
Or you want to understand how Backpropagation and Feed-Forward mechanisms are connected?
Or you want to understand how Backpropagation changes with the choice of Activation function? For example, how does backpropagation change from an Identity-Activation-based regression problem to a Sigmoid-Activation-based classification problem in Neural Networks and Deep Learning?
Well, then this is the right video for you. This hour-long video ensures that you get a thorough understanding of both the mathematics and the concepts behind Backpropagation, and derives, from Calculus and other interesting approaches, the Backpropagation equations for different types of Activation functions.
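As a small taste of the activation-function point above, here is a hypothetical single-neuron sketch (not taken from the video; the function names and numbers are illustrative). It shows a well-known result: for identity activation with mean-squared-error loss, and for sigmoid activation with binary cross-entropy loss, the output-layer gradient takes the same (y_hat - y) * x form, because the sigmoid's derivative cancels against the cross-entropy term.

```python
import numpy as np

def grad_identity_mse(x, w, y):
    # Regression: identity activation, MSE loss L = 0.5 * (y_hat - y)^2
    # with y_hat = w @ x, so dL/dw = (y_hat - y) * x.
    y_hat = w @ x
    return (y_hat - y) * x

def grad_sigmoid_bce(x, w, y):
    # Classification: sigmoid activation, binary cross-entropy loss.
    # The sigmoid derivative cancels, leaving the same (y_hat - y) * x form.
    y_hat = 1.0 / (1.0 + np.exp(-(w @ x)))
    return (y_hat - y) * x

x = np.array([0.5, -1.2, 2.0])
w = np.array([0.1, 0.4, -0.3])
print(grad_identity_mse(x, w, 1.0))
print(grad_sigmoid_bce(x, w, 1.0))
```

Both gradients share the same structure; only the definition of y_hat differs, which is exactly the kind of connection the video derives in full.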
Links to some of the referred contents in this video:
Machine Learning | Bayesian Linear Regression:
Machine Learning | Regularization – Lasso, Ridge, and OLS Regression | L1, L2 Regularizations:
Machine Learning | Bias Variance Trade-Off: