Concise Lecture Notes - Lesson 2 | Fastai v3 (2019)

These notes were typed out by me while watching the lecture, for quick revision later on. To fully understand them, they should be used alongside the Jupyter notebooks that are available here:

Preamble:


Notes:

How do you create your own classifier with your own images?

Go through the steps from the first tutorial to create a model
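Those steps can be sketched roughly as below with the fastai v1 API; the path and the one-folder-per-class layout are assumptions about how you arranged your downloaded images, not something fixed by the lesson:

```python
from fastai.vision import *

# Assumed layout: one sub-folder per class under `path`
# (e.g. data/bears/teddy, data/bears/grizzly) — hypothetical names
path = Path('data/bears')
data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224
                                  ).normalize(imagenet_stats)
learn = cnn_learner(data, models.resnet34, metrics=error_rate)
learn.fit_one_cycle(4)    # train the new head for a few epochs
learn.save('stage-1')     # checkpoint the weights
```

`valid_pct=0.2` holds out 20% of the images as a validation set, since a hand-downloaded dataset has no predefined split.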

Cleaning up the dataset:
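fastai v1 ships Jupyter widgets for pruning mislabeled or irrelevant images, ordered by how badly the model got them wrong; a sketch, assuming a trained `learn` object and its data `path` already exist:

```python
from fastai.widgets import DatasetFormatter, ImageCleaner

# Order the training images by loss, so the most suspicious come first
ds, idxs = DatasetFormatter().from_toplosses(learn)
# Opens an interactive widget (in a Jupyter notebook) to delete or relabel;
# it records decisions in a cleaned.csv rather than touching files on disk
ImageCleaner(ds, idxs, path)
```

The model can then be retrained from the cleaned CSV instead of the raw folders.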

Putting the model in production:

How to use a trained model for inference?
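In fastai v1 this is a two-step story: `learn.export()` serializes the transforms, classes, and weights into a single `export.pkl`, and `load_learner` restores it on the serving machine (typically CPU). A sketch, where the folder and image filename are assumptions:

```python
from fastai.vision import load_learner, open_image

# On the training machine, after training: learn.export()
# writes export.pkl into the learner's data path.

# On the server:
learn = load_learner('data/bears')     # folder containing export.pkl
img = open_image('uploaded_bear.jpg')  # hypothetical uploaded image
pred_class, pred_idx, probs = learn.predict(img)
```

For a web app, inference like this usually runs on CPU, one image at a time — GPUs only pay off when you can batch many requests together.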

What to do when we run into a problem:

Generally the issue is:

1. Learning rate
2. Number of epochs

How to tell which one is actually wrong:

  1. If the validation loss explodes to a huge value in the first few epochs, the learning rate is too high
  2. Over the first few epochs, if the error_rate is going down really slowly, then the learning rate is too low
  3. When there are too few epochs, the training loss stays much higher than the validation loss (the model is still underfitting)
  4. When there are too many epochs, the model starts overfitting
  5. The only reliable sign of overfitting is the error rate going down for a while and then starting to go back up again
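To pick a sensible learning rate in the first place, the lesson leans on the learning-rate finder; a sketch, assuming a `learn` object already exists:

```python
learn.lr_find()        # tries increasing learning rates on mini-batches,
                       # recording the loss at each one
learn.recorder.plot()  # pick a rate where the loss is still falling steeply,
                       # well before the point where it blows up
```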

Why are 3e-3 and 3e-4 used as the default learning rates?

Gradient Descent
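A minimal sketch of gradient descent in plain Python (no fastai or PyTorch), fitting a line y = a*x + b by repeatedly stepping against the gradient of the mean-squared error; the data and function names are illustrative:

```python
def fit_line(xs, ys, lr=0.01, epochs=2000):
    """Full-batch gradient descent on MSE for y = a*x + b."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean((a*x + b - y)^2) w.r.t. a and b
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        # Step downhill, scaled by the learning rate
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
a, b = fit_line(xs, ys)      # a ≈ 2, b ≈ 1
```

The learning rate here plays exactly the role discussed above: too large and the steps overshoot and the loss explodes; too small and convergence crawls.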

Notes on pytorch
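The PyTorch notes in the lesson center on tensors and the `@` matrix-product operator; a tiny sketch, assuming PyTorch is installed, expressing the same line y = 2x + 1 as a matrix-vector product:

```python
import torch

x = torch.ones(4, 2)         # a 4x2 tensor of ones
x[:, 0] = torch.arange(4.)   # first column becomes 0, 1, 2, 3
a = torch.tensor([2., 1.])   # line parameters: slope 2, intercept 1
y = x @ a                    # matrix-vector product: y_i = 2*x_i + 1
```

Writing the intercept as a column of ones is the standard trick that lets both parameters live in one tensor `a`.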

Stochastic Gradient Descent
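Stochastic gradient descent differs from the full-batch version above only in that each update uses a single sample (or a mini-batch) rather than the whole dataset; the same toy line-fit in plain Python, with illustrative names:

```python
import random

def sgd_fit_line(xs, ys, lr=0.01, epochs=500, seed=42):
    """SGD on MSE for y = a*x + b: one parameter update per sample."""
    random.seed(seed)
    a, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        random.shuffle(idx)        # visit the samples in a new order each epoch
        for i in idx:              # one (x, y) pair per update step
            err = a * xs[i] + b - ys[i]
            a -= lr * 2 * err * xs[i]   # gradient of this sample's squared error
            b -= lr * 2 * err
    return a, b

a, b = sgd_fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Each step follows a noisy estimate of the true gradient, which is what makes SGD cheap enough to scale to datasets too large to process in one batch.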
