Time | Speaker | Title | Abstract |
---|---|---|---|
14:00 | Michael J. Barber | Neural belief propagation without multiplication | Neural network models can be derived from the hypothesis that populations of neurons perform statistical inference. Such networks can be generated from a broad class of probabilistic models, but often function through the multiplication of neural firing rates. By introducing additional assumptions about the nature of the probabilistic models, we derive a class of neural networks that function only through weighted sums of neural activities. |
14:30 | Arnaud Buhot | Strong and fragile glassy behaviour in kinetically constrained systems | |
15:15 | Coffee break | ||
15:45 | Michael J. Barber | Noise-induced signal enhancement in heterogeneous neural networks | Neural networks can represent complex functions, but are often constructed of very simple units. We investigate the limitations imposed by such a simple unit, the McCulloch-Pitts neuron. We explore the role of stochastic resonance in units of finite precision and show how to construct neural networks that overcome the limitations of single units. |
16:15 | Sebastian Risau-Gusman | Typical properties of Soft Margin Classifiers | |
16:45 | Free discussion | | |
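The 14:00 abstract describes networks that perform inference through weighted sums rather than multiplications of firing rates. The talk's construction is more general, but the core idea can be illustrated with a minimal sketch (all model details here are hypothetical): in a naive-Bayes model with a binary hidden cause and conditionally independent binary observations, working in the log domain turns the product of likelihoods into a single weighted sum followed by a logistic output unit.

```python
import math

# Hypothetical model: hidden cause H in {0, 1}, binary observations x_i
# with P(x_i = 1 | H = 1) = p1[i] and P(x_i = 1 | H = 0) = p0[i].

def posterior_by_multiplication(x, prior, p1, p0):
    """P(H=1 | x) computed the direct way: products of likelihoods."""
    num, den = prior, 1 - prior
    for xi, a, b in zip(x, p1, p0):
        num *= a if xi else (1 - a)
        den *= b if xi else (1 - b)
    return num / (num + den)

def posterior_by_weighted_sum(x, prior, p1, p0):
    """Same posterior from a weighted sum of inputs plus a bias,
    passed through a logistic nonlinearity: the products of
    likelihoods have become additions of log-likelihood weights."""
    bias = math.log(prior / (1 - prior)) + sum(
        math.log((1 - a) / (1 - b)) for a, b in zip(p1, p0))
    weights = [math.log(a / (1 - a)) - math.log(b / (1 - b))
               for a, b in zip(p1, p0)]
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 / (1 + math.exp(-z))
```

Both routes give the same posterior; the second needs only a linear unit with fixed weights, which is the kind of operation biological neurons are plausibly able to compute.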
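The 15:45 abstract invokes stochastic resonance in finite-precision units. A minimal demonstration of the effect (parameters and detection criterion chosen for illustration, not taken from the talk): a McCulloch-Pitts neuron with a threshold above the peak of a weak periodic signal never fires on the signal alone, but an intermediate amount of added noise lets its output track the signal's high half-cycles better than either no noise or very small noise does.

```python
import math
import random

def mcculloch_pitts(x, threshold=1.0):
    """Binary threshold unit: fires (1) iff input exceeds threshold."""
    return 1 if x > threshold else 0

def detection_rate(noise_std, threshold=1.0, steps=10000, seed=0):
    """Fraction of time steps where the unit's output agrees with
    whether a subthreshold sinusoid is in its positive half-cycle."""
    rng = random.Random(seed)
    matches = 0
    for t in range(steps):
        signal = 0.5 * math.sin(2 * math.pi * t / 100)  # peak 0.5 < threshold
        out = mcculloch_pitts(signal + rng.gauss(0, noise_std), threshold)
        matches += (out == (signal > 0))
    return matches / steps
```

With zero noise the unit is silent and the detection rate sits at chance; at an intermediate noise level the rate rises above chance, the signature of stochastic resonance.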