MIT Introduction to Deep Learning 6.S191: Lecture 1
*New 2020 Edition*
https://www.youtube.com/watch?v=njKP3FqW3Sk&list=PLtBw6njQRU-rwp5__7C0oIVt26ZgjG9NI&index=2
Foundations of Deep Learning
Lecturer: Alexander Amini
January 2020
For all lectures, slides, and lab materials: http://introtodeeplearning.com
Lecture Outline
0:00 - Introduction
4:14 - Course information
8:10 - Why deep learning?
11:01 - The perceptron
13:07 - Activation functions
15:32 - Perceptron example
18:54 - From perceptrons to neural networks
25:23 - Applying neural networks
28:16 - Loss functions
31:14 - Training and gradient descent
35:13 - Backpropagation
39:25 - Setting the learning rate
43:43 - Batched gradient descent
46:46 - Regularization: dropout and early stopping
51:58 - Summary
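As a taste of the material covered in the perceptron and activation-function segments above, here is a minimal sketch of a single perceptron with a sigmoid activation. The specific weights and bias are hypothetical values chosen purely for illustration, not from the lecture:

```python
import math

def perceptron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, passed through a
    # sigmoid activation to produce a value in (0, 1).
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

# Hypothetical example: two inputs, two weights, one bias.
output = perceptron([1.0, -2.0], [3.0, -1.0], -1.0)
print(round(output, 3))  # → 0.982
```

Stacking layers of such units (and training the weights with gradient descent and backpropagation) gives the neural networks discussed later in the lecture.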
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!