Learning rate: if it is too small, training takes a long time. If it is too big, accuracy will be low because the steps are too big (the loss can diverge).
Best LR:
lr_find(): at each mini-batch/iteration, multiplicatively increases the learning rate.
learn.sched.plot_lr() (LR vs. iterations) shows this schedule.
learn.sched.plot() (LR vs. loss): as the LR increases, the loss drops quickly at first, but past the
optimal learning rate the decrease flattens out or the loss starts getting worse.
So, pick the learning rate where the loss is still improving.
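The LR-range test described above can be sketched in plain Python. This is a toy illustration on a quadratic loss, not fastai's actual implementation; the function name and all parameters here are made up:

```python
# Sketch of an LR-range test (the idea behind lr_find()): start with a tiny
# learning rate, multiply it by a constant factor every mini-batch/iteration,
# and record (lr, loss) pairs -- the data learn.sched.plot() would show.
def lr_find_sketch(grad_fn, loss_fn, w0, lr_start=1e-5, lr_mult=1.3, steps=50):
    w, lr = w0, lr_start
    history = []
    for _ in range(steps):
        history.append((lr, loss_fn(w)))
        w = w - lr * grad_fn(w)  # one SGD step at the current lr
        lr *= lr_mult            # multiplicatively increase the lr
    return history

# Toy quadratic loss: loss(w) = w**2, gradient = 2*w
hist = lr_find_sketch(lambda w: 2 * w, lambda w: w * w, w0=1.0)
```

Plotting `hist` reproduces the typical shape: the loss keeps improving while the learning rate is small, then blows up once the learning rate gets too large; you would pick a rate from the still-improving region.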
Lesson 3: CNNs
learn.bn_freeze(True): freezes batch normalization statistics - if your data is similar to ImageNet and you are using a deeper model, add this.
Multiply every element of a 3 x 3 matrix (the convolution filter/kernel) with the corresponding element of a 3 x 3 section of the image, then sum the products to get one output value.
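That multiply-and-sum step can be written out in plain Python; the kernel and patch values below are made up for illustration:

```python
# One convolution step: element-wise multiply a 3x3 kernel with a 3x3 patch
# of the image, then sum the nine products to produce a single output value.
def conv_step(patch, kernel):
    return sum(
        patch[i][j] * kernel[i][j]
        for i in range(3)
        for j in range(3)
    )

# Illustrative horizontal-edge kernel and an image patch (made-up values)
kernel = [[ 1,  1,  1],
          [ 0,  0,  0],
          [-1, -1, -1]]
patch  = [[9, 9, 9],
          [5, 5, 5],
          [1, 1, 1]]
print(conv_step(patch, kernel))  # 27 - 3 = 24: strong edge response
```

Sliding the kernel across every 3 x 3 section of the image and repeating this step produces the full output grid for one filter.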
Filter 1
Filter 2
Relu
Max pool (replace every 2x2 part of the grid with its max, halving the size)
Filter 3
Filter 4
Relu
Max pool
…….
Get the probabilities
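The layer sequence above (filter, ReLU, max pool, repeated, then probabilities) can be sketched with toy helpers. Using softmax for the final probability step is an assumption, and all grid values are made up:

```python
import math

def relu(x):
    # ReLU: replace negatives with zero, element-wise
    return [[max(0.0, v) for v in row] for row in x]

def max_pool(x):
    # Replace every 2x2 block of the grid with its max, halving each dimension
    return [
        [max(x[i][j], x[i][j + 1], x[i + 1][j], x[i + 1][j + 1])
         for j in range(0, len(x[0]), 2)]
        for i in range(0, len(x), 2)
    ]

def softmax(scores):
    # Turn raw scores into probabilities that sum to 1
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

grid = [[ 1, -2,  3, 0],
        [-1,  5,  2, 2],
        [ 0,  1, -3, 4],
        [ 2,  2,  1, 1]]
pooled = max_pool(relu(grid))      # 4x4 grid -> 2x2 grid
probs = softmax([2.0, 1.0, 0.1])   # e.g. final class scores -> probabilities
```

In a real CNN each "filter" step is the convolution sketched earlier, applied across the whole image, and the final scores come from fully connected layers before the softmax.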