Publication
ICASSP 2017
Conference paper

Noisy objective functions based on the f-divergence


Abstract

Dropout, the random dropping out of activations according to a specified rate, is a very simple but effective method to avoid over-fitting of deep neural networks to the training data.
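For context, the dropout mechanism described in the abstract corresponds to the standard "inverted" masking scheme sketched below. This is a generic illustration in plain NumPy (the `dropout` helper, its arguments, and the example values are assumptions for illustration), not the noisy-objective-function formulation proposed in the paper.

```python
import numpy as np

def dropout(activations, rate, training=True, rng=None):
    """Randomly zero each activation with probability `rate` during training,
    rescaling the survivors by 1/(1 - rate) so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob  # Bernoulli(keep_prob) keep-mask
    return activations * mask / keep_prob

# Example (hypothetical values): drop roughly 30% of the units of a hidden-layer activation vector.
h = np.array([0.5, 1.2, -0.3, 0.8])
print(dropout(h, rate=0.3, rng=np.random.default_rng(0)))
```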

Date

16 Jun 2017


