AI Paper English F.o.R.

A blog that reads papers on artificial intelligence (AI) by making full use of the Frame of Reference (F.o.R.) from 英語リーディング教本 (the English Reading Textbook).

Dropout

Dropout | Abstract, Sentence 10

We show that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.…

Dropout | Abstract, Sentence 9

This significantly reduces overfitting and gives major improvements over other regularization methods. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastav…

Dropout | Abstract, Sentence 8

At test time, it is easy to approximate the effect of averaging the predictions of all these thinned networks by simply using a single unthinned network that has smaller weights. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent…
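
The following NumPy sketch (my own illustration; the names W, h, and p_retain are made up, not from the paper's code) shows the approximation this sentence describes for a toy linear layer: averaging the predictions of many sampled thinned networks comes out essentially the same as running one unthinned network whose weights are scaled down by the retention probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-layer linear "network" y = W @ h with dropout applied to h.
W = rng.normal(size=(3, 5))
h = rng.normal(size=5)
p_retain = 0.5

# Average the predictions of many sampled thinned networks.
preds = []
for _ in range(100_000):
    mask = rng.random(h.shape) < p_retain   # keep each unit with prob p_retain
    preds.append(W @ (h * mask))
mc_average = np.mean(preds, axis=0)

# Single unthinned network whose weights are scaled by p_retain.
scaled = (W * p_retain) @ h

print(mc_average)   # close to `scaled`
print(scaled)       # exact in expectation for a linear layer; only an
                    # approximation once nonlinearities sit between layers
```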

Dropout | Abstract, Sentence 7

During training, dropout samples from an exponential number of different “thinned” networks. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastava14a/sriva…
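
A small illustration of why the count of thinned networks is exponential (my own sketch, not from the paper): each unit can independently be kept or dropped, so even a single layer of n units already yields 2^n distinct sub-networks, all sharing the same weights.

```python
from itertools import product

# Each unit is independently kept (1) or dropped (0), so a layer of
# n_units gives 2 ** n_units distinct thinned sub-networks.
# n_units is an illustrative name.
n_units = 4
masks = list(product([0, 1], repeat=n_units))
print(len(masks))   # 16 == 2 ** 4
print(masks[:3])    # (0, 0, 0, 0), (0, 0, 0, 1), (0, 0, 1, 0)
```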

Dropout | Abstract, Sentence 6

This prevents units from co-adapting too much. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf Training while randomly deleting nodes…

Dropout | Abstract, Sentence 5

The key idea is to randomly drop units (along with their connections) from the neural network during training. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/s…
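
To make the key idea concrete, here is a minimal NumPy sketch of a dropout layer (my own illustration; the function name dropout and the parameter p_retain are not from the paper's code): during training each unit is kept with probability p_retain and zeroed otherwise, and at test time the activations are simply scaled by p_retain, matching the "single unthinned network with smaller weights" of Sentence 8 above.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, p_retain=0.5, training=True):
    """Dropout on a layer's activations h (illustrative helper).

    Training: each unit is kept with probability p_retain and zeroed
    otherwise, i.e. the unit and its outgoing connections are dropped.
    Test: no sampling; activations are scaled by p_retain instead.
    """
    if training:
        mask = rng.random(h.shape) < p_retain   # one Bernoulli(p_retain) draw per unit
        return h * mask
    return h * p_retain

h = np.array([0.7, -1.2, 0.3, 2.0, 0.5])
print(dropout(h))                   # training: roughly half the units are zeroed
print(dropout(h, training=False))   # test: every unit kept, scaled by 0.5
```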

Dropout | Abstract, Sentence 4

Dropout is a technique for addressing this problem. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf While randomly deleting nodes…

Dropout | Abstract, Sentence 3

Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks f…

Dropout | Abstract, Sentence 2

However, overfitting is a serious problem in such networks. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf Randomly deleting nodes…

Dropout | Abstract, Sentence 1

Deep neural nets with a large number of parameters are very powerful machine learning systems. Nitish Srivastava, et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" http://jmlr.org/papers/volume15/srivastava14a/sri…