🎥 Recorded video: Lecture 2
📅 May 5, 6pm
Description
Contents: batch processing of many input samples; efficient implementation in Python; neural networks can approximate arbitrary functions; training a network as (high-dimensional) nonlinear curve fitting; the cost function; stochastic gradient descent; the backpropagation algorithm; a full implementation in Python; relation to physics (path integrals); summary: the basic ingredients of neural networks (and their hyperparameters)
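As a taste of the "batch processing" and "efficient implementation" topics: instead of looping over samples one by one, a whole batch can be pushed through the network with a single matrix product per layer. This is a minimal sketch (not the code provided in the lecture), assuming NumPy and sigmoid activations:

```python
import numpy as np

def forward(batch, weights, biases):
    """Apply each layer to a whole batch of inputs at once.

    batch: array of shape (batch_size, n_inputs); each row is one sample.
    """
    y = batch
    for w, b in zip(weights, biases):
        # One matrix product handles all samples simultaneously;
        # the sigmoid nonlinearity is applied elementwise.
        y = 1.0 / (1.0 + np.exp(-(y @ w + b)))
    return y

# Toy network: 2 inputs -> 3 hidden -> 1 output, random parameters
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 3)), rng.normal(size=(3, 1))]
biases = [np.zeros(3), np.zeros(1)]

out = forward(rng.normal(size=(100, 2)), weights, biases)
print(out.shape)  # (100, 1): one output value per sample in the batch
```

The batch dimension comes first, so the same code works unchanged for a single sample (batch size 1) or thousands of samples, and NumPy's vectorized linear algebra does the heavy lifting.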
After this lecture, you will in principle be able to train a deep neural network on arbitrary tasks (using the pure Python code that we provide in the lecture). But you don't yet know how to choose a smart representation of the data in more complicated cases, how best to set the parameters during training, how to accelerate the gradient descent, etc.
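To make "training as nonlinear curve fitting" concrete, here is a minimal sketch (not the lecture's provided code) of stochastic gradient descent with hand-written backpropagation, assuming NumPy: a two-layer network is fitted to the toy target function sin(x) by repeatedly drawing a random batch, computing gradients of a quadratic cost via the chain rule, and taking a small step downhill.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-layer network: 1 input -> 30 hidden (sigmoid) -> 1 linear output
W1 = rng.normal(size=(1, 30)); b1 = np.zeros(30)
W2 = rng.normal(size=(30, 1)) / np.sqrt(30); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 0.5  # learning rate (a hyperparameter, chosen by hand here)
for step in range(2000):
    # Stochastic: each step uses a fresh random batch of training samples
    x = rng.uniform(-3, 3, size=(20, 1))
    target = np.sin(x)

    # Forward pass (keep intermediate results for backpropagation)
    z1 = x @ W1 + b1
    a1 = sigmoid(z1)
    y = a1 @ W2 + b2          # linear output layer

    # Backward pass: gradients of the quadratic cost C = mean((y-t)^2)/2,
    # propagated from the output layer back through the hidden layer
    delta2 = (y - target) / len(x)
    delta1 = (delta2 @ W2.T) * a1 * (1 - a1)   # chain rule through sigmoid

    # Gradient-descent update of all weights and biases
    W2 -= eta * (a1.T @ delta2); b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * (x.T @ delta1);  b1 -= eta * delta1.sum(axis=0)

# After training, the network approximates sin(x) on [-3, 3]
test_x = np.linspace(-3, 3, 50).reshape(-1, 1)
pred = sigmoid(test_x @ W1 + b1) @ W2 + b2
err = np.mean((pred - np.sin(test_x)) ** 2)
print(err)
```

The network size, learning rate, batch size, and number of steps are exactly the kind of hyperparameters the summary refers to; the lecture discusses how such choices are made in practice.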