Self-Organized Neural Learning of Statistical Inference from High-Dimensional Data
International Joint Conference on Artificial Intelligence (IJCAI-13),
pages 1226--1232,
Aug 2013
With information about the world implicitly embedded in complex, high-dimensional neural population responses, the brain must perform large-scale statistical inference to form hypotheses about the state of the environment. This ability is, in part, acquired after birth, often with very little feedback to guide learning. It is a difficult learning problem, given how little information about the meaning of neural responses is available at birth. In this paper, we address the question of how the brain might solve this problem: we present an unsupervised artificial neural network algorithm that inherits from the self-organizing map (SOM) algorithm the ability to learn a latent variable model from its input. We extend the SOM algorithm so that it learns the distribution of noise in the input and computes probability density functions over the latent variables. The algorithm represents these probability density functions using population codes, and it does so with very few assumptions about the noise distribution. Our simulations indicate that the algorithm can learn to perform similarly to a maximum likelihood estimator, with the added benefits of requiring no a priori knowledge about the input and of computing not only the best hypotheses but also probabilities for alternatives.
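For orientation, the sketch below shows a standard self-organizing map update step, the building block the paper extends. It is not the authors' algorithm: their extension additionally learns the input noise distribution and outputs probability densities over the latent variables as population codes. Grid size, input dimensionality, learning rate, and neighborhood width are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

grid_shape = (10, 10)   # assumed 2-D map of 10x10 units
input_dim = 50          # assumed input dimensionality

# Each map unit holds a weight vector in input space.
weights = rng.random(grid_shape + (input_dim,))

# Map coordinates of each unit, used by the neighborhood function.
coords = np.stack(np.meshgrid(np.arange(grid_shape[0]),
                              np.arange(grid_shape[1]),
                              indexing="ij"), axis=-1)

def som_update(x, weights, learning_rate=0.1, sigma=2.0):
    """One SOM step: find the best-matching unit (BMU) for input x
    and pull the weight vectors of nearby units toward x."""
    # BMU: the unit whose weight vector is closest to the input.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), grid_shape)

    # Gaussian neighborhood on the map, centered on the BMU.
    map_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-map_dist2 / (2 * sigma ** 2))

    # Move weights toward the input, scaled by the neighborhood.
    weights += learning_rate * h[..., None] * (x - weights)
    return bmu

# Usage: present inputs one at a time; in the paper's setting these
# would be noisy, high-dimensional neural population responses.
for _ in range(1000):
    x = rng.random(input_dim)
    som_update(x, weights)
```

After training, the map's units tile the input distribution; the paper's contribution is to go beyond this winner-take-all readout and produce full probability density functions over the latent variables.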
@InProceedings{BW13,
  author    = {Bauer, Johannes and Wermter, Stefan},
  title     = {Self-Organized Neural Learning of Statistical Inference from High-Dimensional Data},
  booktitle = {International Joint Conference on Artificial Intelligence (IJCAI-13)},
  pages     = {1226--1232},
  year      = {2013},
  month     = {Aug},
  publisher = {AAAI Press},
}