TY - JOUR
T1 - Statistical physics and representations in real and artificial neural networks
AU - Cocco, S.
AU - Monasson, R.
AU - Posani, L.
AU - Rosay, S.
AU - Tubiana, J.
N1 - Publisher Copyright:
© 2017 Elsevier B.V.
PY - 2018/8/15
Y1 - 2018/8/15
AB - This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model that captures the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and from (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
KW - Continuous attractors
KW - Machine learning
KW - Neural network
KW - Place cell
KW - Principal component analysis
KW - Restricted Boltzmann Machine
UR - http://www.scopus.com/inward/record.url?scp=85038839266&partnerID=8YFLogxK
DO - 10.1016/j.physa.2017.11.153
M3 - Article
AN - SCOPUS:85038839266
SN - 0378-4371
VL - 504
SP - 45
EP - 76
JO - Physica A: Statistical Mechanics and its Applications
JF - Physica A: Statistical Mechanics and its Applications
ER -