Concise Lecture Notes - Lesson 5 | Fastai v3 (2019)

I typed these notes while watching the lecture, for quick revision later. To fully understand them, use them alongside the Jupyter notebooks available here:

Preamble:


Notes:

What happens when we do transfer learning on a resnet-34?

How do we do that?

Embedding matrices:
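The key idea from the lesson is that an embedding lookup is mathematically the same as multiplying a one-hot vector by a weight matrix, just done efficiently as an array index. A minimal pure-Python sketch (all names and numbers are illustrative, not fastai API):

```python
# Sketch: an embedding lookup equals a one-hot vector times the weight matrix.

def one_hot(index, size):
    """Vector of zeros with a 1.0 at `index`."""
    v = [0.0] * size
    v[index] = 1.0
    return v

def matvec(vec, matrix):
    """(1 x n) vector times (n x m) matrix -> list of length m."""
    return [sum(vec[i] * matrix[i][j] for i in range(len(vec)))
            for j in range(len(matrix[0]))]

# Embedding matrix: 4 items, 3 latent factors each (made-up values).
emb = [[0.1, 0.2, 0.3],
       [0.4, 0.5, 0.6],
       [0.7, 0.8, 0.9],
       [1.0, 1.1, 1.2]]

item = 2
# The "slow" way: one-hot times the matrix...
via_matmul = matvec(one_hot(item, 4), emb)
# ...is identical to indexing the row, which is all an Embedding layer does.
via_lookup = emb[item]
```

This is why an `Embedding` layer trains like any linear layer: its "lookup" is just a matrix multiply with a trick for speed.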

Movie rating Collab Filtering example
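The model from the lesson predicts a rating as the dot product of a user's factors and a movie's factors, plus per-user and per-movie biases, squashed through a sigmoid scaled to the rating range. A minimal sketch in plain Python (the factor values are made up; in the lesson this runs in PyTorch and the factors are learned):

```python
import math

# Sketch of the dot-product collaborative-filtering model:
# rating = sigmoid(user . movie + user_bias + movie_bias), scaled to a range.

def predict_rating(user_factors, movie_factors, user_bias, movie_bias,
                   y_min=0.0, y_max=5.5):
    dot = sum(u * m for u, m in zip(user_factors, movie_factors))
    raw = dot + user_bias + movie_bias
    # Scale the sigmoid output into the rating range. The range is taken
    # slightly wider than 0..5 so the sigmoid can actually reach 5.
    return y_min + (y_max - y_min) / (1 + math.exp(-raw))

user = [0.5, -0.2, 0.1]    # illustrative latent factors
movie = [0.3, 0.8, -0.4]
rating = predict_rating(user, movie, user_bias=0.1, movie_bias=0.2)
```

Training then nudges the factors and biases so predictions match the known ratings; similar movies end up with similar factor vectors.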

Why n_factors = 40?

Regularization:

How to penalize complexity using weight decay?

$$w_t = w_{t-1} - lr \cdot \left(\frac{\partial\, mse(m(X, w_{t-1}), y)}{\partial w_{t-1}} + wd \cdot w_{t-1}\right)$$
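The update above can be sketched for a one-parameter model $m(x, w) = w \cdot x$ with an MSE loss; the numbers below are illustrative (the lesson does this with PyTorch tensors):

```python
# One SGD step with weight decay: gradient of MSE plus wd * w.

def sgd_step_with_wd(w, xs, ys, lr=0.1, wd=0.01):
    n = len(xs)
    # d(mse)/dw for mse = mean((w*x - y)^2)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
    # Weight decay adds wd * w to the gradient, shrinking the weight
    # towards zero each step and so penalizing complexity.
    return w - lr * (grad + wd * w)

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # data generated with w = 2.0
w = 0.0
for _ in range(100):
    w = sgd_step_with_wd(w, xs, ys)
```

With `wd > 0` the weight converges to slightly below the data-generating value of 2.0: the decay term trades a tiny bit of training-set fit for smaller weights.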

Adam Optimizer

Loss plot when SGD is used: (figure: SGD loss curve)
Loss plot when Adam is used: (figure: Adam loss curve)
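Adam combines momentum (an exponentially weighted average of gradients) with per-parameter scaling by a running average of squared gradients, plus a bias correction for the early steps. A minimal sketch on a toy 1-D problem (hyperparameter names follow the usual Adam conventions; this is not fastai's implementation):

```python
import math

# Sketch of the Adam update rule on a single scalar parameter.

def adam(grad_fn, w, steps, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g        # momentum: running mean of grads
        v = beta2 * v + (1 - beta2) * g * g    # running mean of squared grads
        m_hat = m / (1 - beta1 ** t)           # bias correction (early steps)
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Minimise (w - 3)^2, whose gradient is 2 * (w - 3).
w = adam(lambda w: 2 * (w - 3), w=0.0, steps=500)
```

Dividing by `sqrt(v_hat)` means parameters with consistently large gradients take smaller steps and vice versa, which is why Adam typically reaches a low loss in far fewer steps than plain SGD.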

Cross Entropy Loss

Binary case (target $y \in \{0,1\}$, predicted probability $p$): $$-{(y\log(p) + (1 - y)\log(1 - p))}$$
Multi-class case ($M$ classes, observation $o$, one-hot target $y_{o,c}$, predicted probability $p_{o,c}$): $$-\sum_{c=1}^{M} y_{o,c}\log(p_{o,c})$$
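Both formulas above can be sketched directly in plain Python; the probabilities here are assumed to already sum to 1 (e.g. they came out of a softmax), and the values are illustrative:

```python
import math

# Sketch of the two cross-entropy formulas above.

def binary_cross_entropy(y, p):
    """y is 0 or 1; p is the predicted probability of class 1."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def cross_entropy(y_onehot, probs):
    """One-hot target vs. predicted probabilities over M classes.
    Only the term for the true class survives the sum."""
    return -sum(y * math.log(p) for y, p in zip(y_onehot, probs) if y > 0)

bce = binary_cross_entropy(1, 0.8)               # -ln(0.8) ~ 0.223
ce = cross_entropy([0, 1, 0], [0.1, 0.7, 0.2])   # -ln(0.7) ~ 0.357
```

Because the target is one-hot, the multi-class loss reduces to minus the log of the probability the model assigned to the correct class, so a confident correct prediction costs almost nothing and a confident wrong one costs a lot.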