Parametric Dropout in RNN
Abstract
Deep learning has become an important tool in natural language processing (NLP), and recurrent neural networks (RNNs) in particular excel at processing sequential data. Dropout, as a regularization method, has proven highly effective at alleviating the "overfitting" problem in neural network training. However, dropout in RNNs is usually applied only to the feed-forward connections and rarely to the recurrent connections, because dropout may damage the temporal dependencies that RNNs rely on. In this paper, we propose a parametric dropout algorithm for RNNs that is applied to the recurrent connections and can capture the dependencies and semantic information between words in a sentence. We evaluate our algorithm on two datasets and compare it with the original uniform dropout. The results show that our algorithm outperforms the previous one.
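The abstract does not spell out the algorithm, so the following is only a minimal sketch of what dropout on the recurrent connection with a learnable (parametric) keep probability could look like. The class name ParametricRecurrentDropoutCell, the per-unit keep_logits parameter, and the PyTorch framing are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ParametricRecurrentDropoutCell(nn.Module):
    """Vanilla RNN cell whose recurrent connection passes through a
    dropout mask with learnable, per-unit keep probabilities.
    Illustrative sketch only, not the paper's exact formulation."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.W_ih = nn.Linear(input_size, hidden_size)
        self.W_hh = nn.Linear(hidden_size, hidden_size)
        # Learnable logits giving each hidden unit its own keep
        # probability on the recurrent path (the assumed "parametric"
        # part; standard uniform dropout uses one fixed rate).
        self.keep_logits = nn.Parameter(torch.zeros(hidden_size))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        keep = torch.sigmoid(self.keep_logits)  # per-unit keep probability
        if self.training:
            # Sample a Bernoulli mask and rescale (inverted dropout) so the
            # expected recurrent signal is unchanged; gradients reach
            # keep_logits only through the 1/keep rescaling term.
            mask = torch.bernoulli(keep.expand_as(h)) / keep.clamp(min=1e-6)
            h = h * mask
        return torch.tanh(self.W_ih(x) + self.W_hh(h))

# Minimal usage: one recurrent step on a toy batch.
cell = ParametricRecurrentDropoutCell(input_size=8, hidden_size=16)
x = torch.randn(4, 8)   # batch of 4 input vectors
h = torch.zeros(4, 16)  # initial hidden state
h = cell(x, h)
```

Note that only the hidden state entering the recurrent weight matrix is masked; the feed-forward path is left untouched, matching the abstract's emphasis on applying dropout to the recurrent connection.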
Keywords
Dropout, Parametric, RNN
DOI
10.12783/dtcse/smce2017/12443