AI Paper English F.o.R.

A blog for reading artificial intelligence (AI) papers in English, making full use of the Frame of Reference (F.o.R.) method from the textbook 英語リーディング教本.

Batch Normalization

Batch Normalization | Abstract, Sentence 8

Using an ensemble of batch-normalized networks, we improve upon the best published result on ImageNet classification: reaching 4.9% top-5 validation error (and 4.8% test error), exceeding the accuracy of human raters. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167

Batch Normalization | Abstract, Sentence 7

Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167

Batch Normalization | Abstract, Sentence 6

It also acts as a regularizer, in some cases eliminating the need for Dropout. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167
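
The regularizing effect comes from the mini-batch statistics: each example is normalized with the mean and variance of whichever examples it happens to be batched with, which injects noise into the activations much as Dropout does. Below is a minimal sketch of the two design choices, assuming PyTorch; the layer sizes and names are illustrative, not from the paper.

```python
import torch.nn as nn

# Conventional design: Dropout supplies the regularization.
net_with_dropout = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(256, 10),
)

# Batch-normalized design: the per-mini-batch noise of BatchNorm
# acts as a regularizer, so Dropout can in some cases be removed.
net_with_bn = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalize before the nonlinearity, as in the paper
    nn.ReLU(),
    nn.Linear(256, 10),
)
```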

Batch Normalization | Abstract, Sentence 5

Batch Normalization allows us to use much higher learning rates and be less careful about initialization. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167

Batch Normalization | Abstract, Sentence 4

Our method draws its strength from making normalization a part of the model architecture and performing the normalization for each training mini-batch. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167
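
To make the mechanics concrete, here is a minimal NumPy sketch of the normalization applied to one training mini-batch (γ, β, and ε follow the paper's notation; the shapes and values are illustrative):

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Batch-normalize a mini-batch x of shape (batch, features)."""
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to ~zero mean, unit variance
    return gamma * x_hat + beta            # learned scale and shift

# A mini-batch of 32 examples with 4 features, deliberately off-center.
x = np.random.randn(32, 4) * 3.0 + 5.0
y = batch_norm_train(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # ~0 per feature
print(y.std(axis=0))   # ~1 per feature
```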

Batch Normalization | Abstract, Sentence 3

We refer to this phenomenon as internal covariate shift, and address the problem by normalizing layer inputs. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167
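
The normalization this sentence refers to is the batch normalizing transform, given in Algorithm 1 of the paper: for a mini-batch $\mathcal{B} = \{x_1, \ldots, x_m\}$,

```latex
\mu_{\mathcal{B}} = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad
\sigma_{\mathcal{B}}^{2} = \frac{1}{m}\sum_{i=1}^{m} \left(x_i - \mu_{\mathcal{B}}\right)^{2}, \qquad
\hat{x}_i = \frac{x_i - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^{2} + \epsilon}}, \qquad
y_i = \gamma \hat{x}_i + \beta \equiv \mathrm{BN}_{\gamma,\beta}(x_i).
```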

Batch Normalization | Abstract, Sentence 2

This slows down the training by requiring lower learning rates and careful parameter initialization, and makes it notoriously hard to train models with saturating nonlinearities. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167
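
Concretely, for the sigmoid $\sigma(x) = 1/(1 + e^{-x})$ the gradient is $\sigma'(x) = \sigma(x)\,(1 - \sigma(x))$, which collapses once a layer's inputs drift into the saturated regime:

```latex
\sigma'(0) = 0.25, \qquad
\sigma'(10) = \sigma(10)\,\bigl(1 - \sigma(10)\bigr) \approx 4.5 \times 10^{-5}.
```

A shifting input distribution can push activations into this regime and stall gradient descent, which is why, without normalization, lower learning rates and careful initialization are needed.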

Batch Normalization | Abstract, Sentence 1

Training Deep Neural Networks is complicated by the fact that the distribution of each layer’s inputs changes during training, as the parameters of the previous layers change. Sergey Ioffe, et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" https://arxiv.org/abs/1502.03167
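
A tiny NumPy demonstration of this phenomenon (the two-layer setup and the magnitudes are illustrative, not from the paper): the same input batch yields a different distribution at the second layer's input once the first layer's weights change.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 50))            # a fixed batch of inputs
W1 = rng.normal(scale=0.5, size=(50, 50))  # first-layer weights

h = np.maximum(x @ W1, 0.0)                # the second layer's inputs (ReLU)
print(h.mean(), h.std())                   # distribution before an update

W1 += 0.5 * rng.normal(size=W1.shape)      # simulate a parameter update
h = np.maximum(x @ W1, 0.0)
print(h.mean(), h.std())                   # the distribution has shifted
```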