Hypothesis

The hypothesis is that people enjoy commenting 💩…

fast.ai 2020 — Lesson 1

fastai/course-v4

fastai/fastbook

High school math is enough to understand deep learning.

You don't need lots of data: some record-breaking results have been achieved with fewer than 50 items.

You don't need an expensive computer: state-of-the-art results can be achieved for free.

Deep learning is the same as…

fast.ai 2020 — Lesson 8

fastai/fastbook

Language model = a model that tries to predict the next word of a sentence.

A language model works well as the base model for transfer learning because it already knows something about language: it can predict the next word of a sentence.
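To make "predict the next word" concrete, here is a minimal sketch of the idea using a bigram count table. This is illustrative only: the language models fastai fine-tunes are neural networks, not count tables, and the corpus here is made up.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def predict_next(word):
    """Return the word most often seen after `word`."""
    return bigrams[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" (follows "the" twice, vs. once for "mat"/"fish")
```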

The base language model should be…

fast.ai 2020 — Lesson 7

fastai/fastbook

Weight decay (L2 regularization)

The idea is to add the sum of all the weights squared to the loss function. This way the model tries to keep the weights as small as possible because bigger weights will increase the final loss.

loss_with_wd = loss + wd * (parameters**2).sum()
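The one-liner above can be seen in action in a minimal PyTorch sketch (the model, data, and `wd` value here are made up for illustration):

```python
import torch

torch.manual_seed(0)
w = torch.randn(3, requires_grad=True)  # the parameters being penalized
x = torch.randn(5, 3)
y = torch.randn(5)

wd = 0.01
loss = ((x @ w - y) ** 2).mean()         # plain MSE loss
loss_with_wd = loss + wd * (w ** 2).sum()  # add the squared-weight penalty
loss_with_wd.backward()

# w.grad now contains an extra 2 * wd * w term, nudging weights toward zero.
```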

It’s…

fast.ai 2020 — Lesson 6

fastai/fastbook

learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(2, base_lr=0.1)

The learning rate finder helps to pick the best learning rate. The idea is to increase the learning rate after every mini-batch and then plot the loss. A good learning rate is somewhere between the steepest point and the minimum. So for example based…
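The finder loop can be sketched in a few lines of plain PyTorch. This is a simplified illustration on a made-up regression problem; fastai's `learn.lr_find()` additionally smooths the losses and stops automatically when the loss explodes.

```python
import torch

torch.manual_seed(0)
x = torch.randn(256, 1)
y = 3 * x + 0.1 * torch.randn(256, 1)

w = torch.zeros(1, requires_grad=True)
lr, mult = 1e-5, 1.3          # start tiny, grow the rate every mini-batch
lrs, losses = [], []

for i in range(0, 256, 32):   # one pass, mini-batches of 32
    xb, yb = x[i:i+32], y[i:i+32]
    loss = ((xb * w - yb) ** 2).mean()
    loss.backward()
    lrs.append(lr)
    losses.append(loss.item())
    with torch.no_grad():
        w -= lr * w.grad      # SGD step at the current rate
        w.grad.zero_()
    lr *= mult                # exponential learning-rate schedule

# Plotting losses against lrs on a log scale gives the familiar finder curve;
# pick a rate between the steepest descent and the minimum.
```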

fast.ai 2020 — Lesson 4

fastai/fastbook

Create dataset

train_x = torch.cat([stacked_threes, stacked_sevens]).view(-1, 28*28)
train_y = tensor([1]*len(threes) + [0]*len(sevens)).unsqueeze(1)
print(train_x.shape, train_y.shape)
CONSOLE: (torch.Size([12396, 784]), torch.Size([12396, 1]))
dset = list(zip(train_x, train_y))
x, y = dset[0]
print(x.shape, y)
PRINT: (torch.Size([784]), tensor([1]))

Create weights

def init_params(size, variance=1.0):
    return (torch.randn(size) * variance).requires_grad_()

weights = init_params((28*28, 1))
bias = init_params(1)

fast.ai 2020 — Lesson 3

fastai/fastbook 