Edited, memorised or added to reading queue on 01-May-2019 (Wed)


Flashcard 4028512472332

Tags
#deeplearning #fastai #initialization #kaiming #lesson_8
Question

Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification

What is it about?

Answer

This paper introduced a successful method for initializing neural networks that use non-linear activation functions such as ReLU (referred to in the paper as rectifiers).

As the title states, models trained with this initialization surpassed human-level performance on the ImageNet classification task.
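
A minimal sketch of the scheme (now usually called Kaiming or He initialization), assuming a plain NumPy setting; the layer sizes and the kaiming_normal helper name are illustrative, not taken from the paper:

import numpy as np

def kaiming_normal(fan_in, fan_out):
    # Draw weights from N(0, 2 / fan_in), the variance the paper proposes
    # for layers followed by a ReLU-style rectifier.
    std = np.sqrt(2.0 / fan_in)
    return np.random.randn(fan_in, fan_out) * std

# Example: a hypothetical 512 -> 256 fully connected layer feeding a ReLU.
W = kaiming_normal(512, 256)
b = np.zeros(256)  # biases start at zero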



#Parametric-Relu #deeplearning #fastai #initialization #kaiming #lesson_8

PReLU layer

\(f(y_i) = \max(0, y_i) + a_i \min(0, y_i)\)
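
A minimal NumPy sketch of this activation, assuming the channel-wise variant in which each channel \(i\) has its own learnable slope \(a_i\); the shapes and the prelu helper name are illustrative:

import numpy as np

def prelu(y, a):
    # f(y_i) = max(0, y_i) + a_i * min(0, y_i): identity for positive inputs,
    # a learnable slope a_i for negative inputs.
    return np.maximum(0.0, y) + a * np.minimum(0.0, y)

# Example: a batch of 4 samples with 3 channels; the paper initializes the
# slopes to 0.25.
y = np.random.randn(4, 3)
a = np.full(3, 0.25)
out = prelu(y, a)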


Flashcard 4028517977356

Tags
#deeplearning #fastai #has-images #initialization #kaiming #lesson_8


The convergence of a 22-layer large model. We use ReLU as the activation for both cases. Both our initialization (red) and “Xavier” (blue) [7] lead to convergence, but ours starts reducing error earlier.
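
For context on why the red curve pulls ahead: the two schemes differ only in the variance of the initial weights. For a layer with \(n_l\) incoming connections, Xavier initialization (derived for linear or tanh-like units) uses \(\operatorname{Var}(w_l) = 1/n_l\) in its forward-propagation form, whereas the paper's initialization uses \(\operatorname{Var}(w_l) = 2/n_l\); the extra factor of 2 compensates for the ReLU zeroing out half of the activations.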
