Neurons need no adaptation to optimally code arbitrarily complex stimuli

Published: May 22, 2020, 10 p.m.

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.05.21.104638v1?rss=1

Authors: Forkosh, O.

Abstract: Neural networks seem to be able to handle almost any task they face. This feat involves coping efficiently with different data types, at multiple scales, and with varying statistical properties. Here, we show that this so-called optimal coding can occur at the single-neuron level and does not require adaptation. Differentiator neurons, i.e., neurons that spike whenever their input stimulus increases, can capture the arbitrary statistics and scale of practically any stimulus they encounter. We show this optimality both analytically and with simulations, which demonstrate how an ideal neuron can handle drastically different probability distributions. While the mechanism we present is an oversimplification of "real" neurons and does not necessarily capture all neuron types, this is also its strength, since it can function alongside other neuronal goals such as data manipulation and learning. By illustrating the simplicity of the neural response to complex stimuli, this result may also suggest a straightforward way to improve current artificial neural networks.

Copyright belongs to the original authors.
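The differentiator neuron described in the abstract can be sketched in a few lines: the neuron fires whenever its input has risen by a fixed threshold since its last spike, so its spike train tracks increases in the stimulus rather than its absolute level. This is a minimal illustrative sketch, not the paper's actual model; the function name and the threshold-crossing rule are assumptions made here for illustration.

```python
def differentiator_spikes(stimulus, threshold=1.0):
    """Sketch of a differentiator neuron: emit a spike at every time
    step where the input has increased by at least `threshold` since
    the last spike (or since the start of the stimulus).

    stimulus  -- sequence of input values over time
    threshold -- required rise between consecutive spikes (illustrative)
    Returns the list of time indices at which the neuron spiked.
    """
    spikes = []
    reference = stimulus[0]  # level at the last spike
    for t, x in enumerate(stimulus):
        if x - reference >= threshold:
            spikes.append(t)
            reference = x  # reset the reference; no adaptation of threshold
    return spikes


# A steadily rising ramp produces regular spikes; a falling input produces none,
# regardless of the absolute scale of the values.
ramp = [i * 0.5 for i in range(21)]          # rises from 0.0 to 10.0
print(differentiator_spikes(ramp))           # spikes at each unit rise
print(differentiator_spikes([5.0, 4.0, 3.0]))  # no increase, no spikes
```

Note that the spike count depends only on how much the input rises relative to the fixed threshold, not on the stimulus's distribution, which gestures at why such a unit can cope with varying statistics without adapting its parameters.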