Lectures

Below is a summary of the lecture topics with links to the lecture slides. I will try to make all slides available before each lecture begins. We might vary the order of the lecture topics (this is more likely for the later lectures), but the topics of Lectures 1-5 are fairly set.

Lecture 1

Title:  The Deep Learning Revolution

Date & Time: Tuesday, March 17, 10:00-12:00

Topics covered: 

  • Review of the impact deep networks have had in the application fields of speech, computer vision and NLP. 
  • Review of the course's syllabus.
  • Review of the course's assignments, project and assessment.

Slides: Lecture1.pdf, Lecture1_CourseAdmin.pdf

Videos of the lecture: Part 1, Part 2

Lecture 2

Title:  Learning Linear Binary & Linear Multi-class Classifiers from Labelled Training Data

 (mini-batch gradient descent optimization applied to "Loss + Regularization" cost functions)

Date & Time: Thursday, March 19, 13:00-15:00

Topics covered: 

  • Supervised learning  =  minimizing loss + regularization.
  • Learning linear classifiers.
  • Binary SVM classifiers as an unconstrained optimization problem.
  • Gradient Descent, SGD, mini-batch optimization.  
  • Multi-class classification with one layer networks.
  • Different loss-functions.
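The "loss + regularization" recipe from the bullets above can be sketched in a few lines of NumPy. This is a toy illustration (not the assignment code): the data, hyper-parameters and variable names are all made up. It trains a binary linear SVM with mini-batch SGD on the unconstrained hinge-loss formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: labels in {-1, +1}, (roughly) linearly separable.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

w, b = np.zeros(2), 0.0
lam, lr, batch = 0.01, 0.1, 20    # L2 strength, step size, mini-batch size

for epoch in range(50):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        B = idx[start:start + batch]
        margins = y[B] * (X[B] @ w + b)
        active = margins < 1                     # examples inside the hinge
        # Gradient of  mean hinge loss + (lam/2)*||w||^2  over the mini-batch
        gw = -(y[B][active, None] * X[B][active]).sum(0) / len(B) + lam * w
        gb = -y[B][active].sum() / len(B)
        w -= lr * gw
        b -= lr * gb

accuracy = np.mean(np.sign(X @ w + b) == y)
```

Setting `batch = len(X)` gives plain gradient descent and `batch = 1` gives SGD, which is the whole spectrum discussed in the lecture.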

Slides: Ladda ner Lecture2.pdf

Videos of the lecture: Lecture2_part1.mp4, Lecture2_part2.mp4

Suggested Reading Material: Sections 5.1.4, 5.2, 5.2.2, 5.7.2 from "Deep Learning" by Goodfellow, Bengio and Courville. Link to Chapter 5 of Deep Learning.

Sections 8.1.3 and 8.3.1 from the book give a more detailed description and analysis of mini-batch gradient descent and SGD than the lecture notes. Link to Chapter 8 of Deep Learning.

The suggested readings from chapter 5 should be familiar to those who have already taken courses in ML. Lecture 2 should be more-or-less self-contained. But the reading material should flesh out some of the concepts referred to in the lecture.

 

Lecture 3

Title:  Back Propagation

Date & Time: Friday, March 21, 08:00-10:00

Topics covered: 

  • Chain rule of differentiation.
  • Chain rule for vector inputs and outputs.
  • Computational graphs.
  • Back propagation (in more detail than you probably ever expected!)
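The chain-rule mechanics listed above can be made concrete on a tiny computational graph. This is a hand-rolled sketch (the function and values are made up): the forward pass stores intermediates, the backward pass applies the chain rule node by node, and a numerical gradient checks the result.

```python
import numpy as np

# Tiny computational graph:  f(x, w, b) = (w*x + b)**2
x, w, b = 3.0, -2.0, 1.0

# Forward pass (store every intermediate value)
u = w * x          # u = w*x
v = u + b          # v = u + b
f = v ** 2         # f = v^2

# Backward pass: chain rule, starting from df/df = 1
df_dv = 2 * v          # d(v^2)/dv
df_du = df_dv * 1.0    # d(u+b)/du = 1
df_db = df_dv * 1.0    # d(u+b)/db = 1
df_dw = df_du * x      # d(w*x)/dw = x
df_dx = df_du * w      # d(w*x)/dx = w

# Sanity check against a centered numerical gradient w.r.t. w
eps = 1e-6
num_dw = (((w + eps) * x + b) ** 2 - ((w - eps) * x + b) ** 2) / (2 * eps)
```

Every backprop implementation is this pattern scaled up: each node only needs its local derivative and the gradient flowing in from above.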

Slides: Lecture3.pdf

Videos of the lecture: Lecture3_part1.mp4, Lecture3_part2.mp4

Suggested Reading Material:

Section 6.5 from the deep learning book.

I'm going to go into very explicit detail about the back-propagation algorithm. It was not my original intention to have such an involved description, but condensing the explanation made things less clear. My hope, though, is that everybody will have a good understanding of the theory and the mechanics of the algorithm after this lecture. I go into more specific (though less generic) detail than the deep learning book. So my recommendation is that you read my lecture notes to get a good understanding of the concrete example(s) I explain, and then read the deep learning book for a broader description. Note that Section 6.5 also assumes you know about networks with more than one layer! So it may be better to hold off reading it until after Lecture 4 (where I will talk about k-layer networks, activation functions, etc.).

Lecture 4

Title:  k-layer Neural Networks

Date & Time: Monday, March 23, 13:00-15:00

Topics covered: 

  • k-layer Neural Networks.
  • Activation functions.
  • Backprop for k-layer neural networks.
  • Problem of vanishing and exploding gradients.
  • Importance of careful initialization of network's weight parameters.
  • Batch normalization + backprop with batch normalization.
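The batch-normalization forward pass from the list above fits in a few lines. This is a minimal training-time sketch (function name, shapes and data are my own, and the running averages needed at test time are omitted): each unit's pre-activations are normalized to zero mean and unit variance over the mini-batch, then scaled and shifted by the learned parameters.

```python
import numpy as np

def batchnorm_forward(s, gamma, beta, eps=1e-5):
    """Batch-normalize pre-activations s of shape (n_batch, n_units):
    per-unit zero mean / unit variance over the mini-batch, then the
    learned scale (gamma) and shift (beta)."""
    mu = s.mean(axis=0)
    var = s.var(axis=0)
    s_hat = (s - mu) / np.sqrt(var + eps)
    return gamma * s_hat + beta

rng = np.random.default_rng(1)
s = rng.normal(loc=5.0, scale=3.0, size=(64, 10))   # badly-scaled activations
out = batchnorm_forward(s, gamma=np.ones(10), beta=np.zeros(10))
```

With `gamma = 1`, `beta = 0` the output is exactly the normalized activations; during training these two parameters are learned by backprop like any other weights.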

Slides: Lecture4.pdf

Videos of the lecture: Lecture4_part1.mp4, Lecture4_part2.mp4

Suggested Reading Material:

Section 8.7.1 from the deep learning book has a more subtle description of the benefits of batch normalization and why it works.

 

Lecture 5

Title:  Training & Regularization of Neural Networks

Date & Time: Tuesday, March 24, 08:00-10:00

Topics covered: 

  • The art/science of training neural networks.
  • Hyper-parameter optimisation.
  • Variations of SGD.
  • Regularization via DropOut.
  • Evaluation of the models - ensembles
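Of the regularization techniques listed above, dropout is simple enough to sketch directly. This is a toy illustration with made-up names and data, using the "inverted dropout" convention: each activation is zeroed with probability `p_drop` at training time and the survivors are scaled up, so the network needs no rescaling at test time.

```python
import numpy as np

def dropout_forward(h, p_drop, rng, train=True):
    """Inverted dropout: at training time each activation is zeroed with
    probability p_drop and survivors are scaled by 1/(1 - p_drop), so the
    expected activation matches the test-time (identity) behaviour."""
    if not train:
        return h
    mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
    return h * mask

rng = np.random.default_rng(0)
h = np.ones((1000, 100))                      # dummy hidden activations
out = dropout_forward(h, p_drop=0.5, rng=rng)
```

Dropout can also be read as cheap ensembling: every mini-batch trains a different thinned sub-network, and the test-time network averages over them, which connects to the ensembles bullet above.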

Slides: Lecture5.pdf

Videos of the lecture: Lecture5_part1.mp4, Lecture5_part2.mp4

Suggested Reading Material:

Sections 8.3.1, 8.3.2, 8.3.3 and 8.5 from the deep learning book cover variations of SGD in detail.

 

Lecture 6

Title:  All about Convolutional Networks

Date & Time:  Monday,  March 30, 13:00-15:00

Topics covered: 

  • Details of the convolution layer in Convolutional Networks.
  • Gradient computations for a convolutional layer.
  • Common operations in ConvNets - max-pooling, etc.
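The two core ConvNet operations above can be sketched naively in NumPy. This is a single-channel, single-filter toy (the function names, the example image and the filter are my own, and real libraries use much faster implementations); note that what deep learning calls "convolution" is really cross-correlation.

```python
import numpy as np

def conv2d(X, F):
    """'Valid' 2-D convolution (cross-correlation, as in most DL libraries)
    of a single-channel image X with a single filter F."""
    H, W = X.shape
    fh, fw = F.shape
    out = np.zeros((H - fh + 1, W - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(X[i:i + fh, j:j + fw] * F)
    return out

def max_pool(X, k=2):
    """Non-overlapping k x k max-pooling."""
    H, W = X.shape
    return X[:H - H % k, :W - W % k].reshape(H // k, k, W // k, k).max(axis=(1, 3))

X = np.arange(16.0).reshape(4, 4)       # toy 4x4 "image"
edge = np.array([[1.0, -1.0]])          # horizontal difference filter
feat = conv2d(X, edge)                  # feature map, shape (4, 3)
pooled = max_pool(X)                    # shape (2, 2)
```

Because the same small filter slides over the whole image, the layer has far fewer parameters than a fully connected layer on the same input, which is the efficiency argument made in the lecture.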

Slides: Lecture6.pdf

Videos of the lecture: Lecture6_part1.mp4, Lecture6_part2.mp4

Suggested Reading Material:

  • Sections 9.1 and 9.2 (motivate the benefit of convolutional layers vs fully connected layers) and 9.10 (if you are interested in the neuro-scientific basis for ConvNets). Section 9.3 discusses the pooling operation.
  • Clip from lecture 21 of Robert Sapolsky's course on behavioral biology: 21. Chaos and Reductionism (from 21:40 to 33:05 it focuses on the Hubel-Wiesel experiments I mentioned in Lecture 6). Thanks to Paul Vinell for sending me this link.

 

Lecture 7

Title:  Training & Designing ConvNets

Date & Time: Tuesday,  March 31, 08:00-10:00

Topics covered: 

  • Review of modern top-performing deep ConvNets - AlexNet, VGGNet, GoogLeNet, ResNet.
  • Practicalities of training deep neural networks - data augmentation, transfer learning and stacking convolutional filters.
  • The transposed convolution operation.
  • Bottleneck ConvNets (semantic segmentation, depth estimation).
  • Attempts at visualizing what a deep ConvNet has learnt.
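The transposed convolution in the list above is easiest to see through the matrix view of convolution. A toy 1-D sketch (signal, filter and sizes are made up): if 'valid' convolution of a length-5 signal is `y = C @ x` for a banded matrix `C`, then the transposed convolution maps back from length 3 to length 5 via `C.T`, which is how bottleneck networks upsample.

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0])   # a length-3 filter

# Build the 'valid' convolution matrix C for a length-5 input.
n_in, n_out = 5, 3
C = np.zeros((n_out, n_in))
for i in range(n_out):
    C[i, i:i + len(f)] = f      # each row is the filter, shifted by one

x = np.array([1.0, 0.0, 0.0, 2.0, 0.0])
y = C @ x            # ordinary (downsampling) convolution: length 5 -> 3
up = C.T @ y         # transposed convolution: upsamples back, length 3 -> 5
```

Note `up` is not `x` recovered: the transposed convolution shares `C`'s sparsity pattern (and hence its gradient flow) but is a learned upsampling, not an inverse.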

Slides: Lecture7.pdf

Videos of the lecture: Lecture7.mp4

Lecture 8

Title: Deep Learning Frameworks and Computational Facilities

Date & Time: Monday, Apr 6, 13:00-15:00

Topics covered: 

  • General-purpose software for deep learning: TensorFlow & PyTorch.
  • Fundamental differences between TensorFlow & PyTorch - static vs dynamic computation graphs.
  • How to use the Google Cloud service.

Slides: Lecture: DL frameworks.pdf

Videos of the lecture: Lecture8_part1.mp4, Lecture8_part2.mp4

Lecture 9

Title:  Networks for Sequential Data: RNNs & LSTMs

Date & Time:  Tuesday, April 7, 08:00-10:00

Topics covered: 

  • RNNs.
  • Back-prop for RNNs.
  • RNNs for synthesis problems.
  • Problem of exploding and vanishing gradients in RNNs.
  • LSTMs.
  • ConvNets vs LSTMs for sequential data.
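The vanilla RNN recurrence behind the bullets above is compact enough to sketch. This is a toy forward pass only (weight names, sizes and random data are made up): the same weights `W`, `U`, `b` are applied at every time step, which is also why repeated multiplication by `W` in backprop causes gradients to vanish or explode.

```python
import numpy as np

def rnn_forward(X, h0, W, U, b):
    """Vanilla RNN: h_t = tanh(W h_{t-1} + U x_t + b), applied over a
    sequence X of shape (seq_len, input_dim). Returns all hidden states."""
    hs = [h0]
    for x_t in X:
        hs.append(np.tanh(W @ hs[-1] + U @ x_t + b))
    return np.stack(hs[1:])

rng = np.random.default_rng(0)
m, d, T = 4, 3, 6                         # hidden size, input size, seq length
W = rng.normal(scale=0.5, size=(m, m))    # hidden-to-hidden weights
U = rng.normal(scale=0.5, size=(m, d))    # input-to-hidden weights
b = np.zeros(m)
X = rng.normal(size=(T, d))
H = rnn_forward(X, np.zeros(m), W, U, b)  # shape (T, m)
```

An LSTM replaces the single tanh update with gated cell-state updates precisely to keep a better-conditioned gradient path through time.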

Slides: Lecture9.pdf

Videos of the lecture: Lecture9_part1.mp4, Lecture9_part2.mp4

Lecture 10

Title: Networks for Sequential Data: RNNs & LSTMs Ctd

Date & Time:  Monday, April 20, 10:00-12:00

Topics covered:  

  • Finished the content from the previous lecture
  • Tutorial examples on the board

Slides: Lecture10_Example.pdf

Videos of the lecture:

Lecture 11

Title: How to generate realistic images using deep learning?

Date & Time:  Tuesday,  April 21, 08:00-10:00

Topics covered:  

  • Introduction to (Deep) Generative Modeling 
  • Variational Auto-encoders
  • Generative Adversarial Networks
  • Brief mention of other methods (PixelCNN, Glow)

Slides: Lecture11.pdf

Videos of the lecture: Lecture11.mp4

 

Lecture 12

Title: Self-supervised learning

Date & Time: Monday, April 27, 13:00-15:00

Topics covered:  

  • The next big thing

Slides: Lecture12.pdf

Videos of the lecture:

 

Lecture 13

Title:  Deep learning for translation problems

Date & Time: Tuesday, April 28, 13:00-15:00

Topics covered:  

  • Deep Learning and NLP embeddings
  • Language Translation
    • Supervised Learning
  • Image Captioning
  • Attention Networks

Slides: Lecture13.pdf

Videos of the lecture:

 

Lecture 14

Title:  Transformer Networks & some odds and ends

Date & Time: Monday, May 4, 13:00-15:00

Topics covered:  

  • Transformer networks & self-attention
  • Adversarial examples
  • Some Odds & ends
  • Q & A from students.
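The self-attention operation at the heart of transformer networks can be sketched in a few lines. This is a single-head toy (names, sizes and random projections are my own, and multi-head attention, masking and positional encodings are omitted): each position mixes the value vectors of all positions, weighted by softmaxed query-key similarities.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence X of
    shape (seq_len, d):  softmax(Q K^T / sqrt(d_k)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)             # each row sums to 1
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 5, 8                                       # sequence length, model dim
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(scale=d ** -0.5, size=(d, d)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
```

Unlike the RNNs of Lectures 9-10, every position attends to every other in one step, so the path length between any two positions is constant rather than proportional to their distance.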

Slides: Lecture14.pdf

Videos of the lecture: