Neural Net Dropout

Published: Oct. 2, 2017, 3:32 a.m.

b"Neural networks are complex models with many parameters and can be prone to overfitting.\\xa0 There's a surprisingly simple way to guard against this: randomly destroy connections between hidden units, also known as dropout.\\xa0 It seems\\xa0counterintuitive that undermining\\xa0the structural integrity of the neural net makes it robust against overfitting, but in the world of neural nets, weirdness is just how things go sometimes.\\n\\nRelevant links: https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf"