NIPS 2015


Friday 11 December, 8.30 am – 6.30 pm
Palais des Congrès de Montréal, Montréal, Canada

A NIPS Workshop
Part of the NIPS 2015 meeting on
Advances in Neural Information Processing Systems

Invited Speakers – Video Talks & Abstracts

Invited talk 1 – José Bento Ayres Pereira, Boston College
Learning networks of stochastic differential equations

Invited talk 2 – Jakob Macke, Max Planck Institute for Biological Cybernetics
Correlations and signatures of criticality in neural population models

Invited talk 3 –  Andrea Montanari, Department of Electrical Engineering, Stanford University
Information-theoretic bounds on learning network dynamics

Invited talk 4 – Alfredo Braunstein, Department of Applied Science and Technology (DISAT), Politecnico di Torino
Inference problems for irreversible stochastic epidemic models

Invited talk 5 – Ramon Grima, University of Edinburgh
Exact and approximate solutions for spatial stochastic models of chemical systems

Invited talk 6 – Graham Taylor, University of Guelph
Learning Multi-scale Temporal Dynamics with Recurrent Neural Networks

Posters – Video Presentations & Abstracts

Ludovica Bachschmid-Romano, Department of Computer Science, TU Berlin
Inference in kinetic Ising models: mean-field and Bayes estimators

Bhaswar B. Bhattacharya, Department of Statistics, Stanford University
Inference in Ising models (Work with Sumit Mukherjee)

Barbara Bravi, Department of Mathematics, King’s College London
Inference for dynamics of continuous variables: the Extended Plefka expansion with hidden nodes

Sakyasingha Dasgupta, RIKEN Brain Science Institute
Efficient signal processing in random networks that generate variability: A comparison of internally generated and externally induced variability

Caterina De Bacco, University of Paris-Sud 11
A matrix product algorithm for the far-from-equilibrium evolution of dynamical processes on networks

Kenji Doya, Okinawa Institute of Science and Technology
Inference of Neural Circuit Connectivity from High-dimensional Activity Recording Data: A Survey

Alex J. Gibberd & James D. B. Nelson, Department of Statistical Science, University College London
Inference for Piecewise-Constant Gaussian Graphical Models

Daniel Soudry, Department of Statistics, University of British Columbia
Implementing efficient “shotgun” inference of neural connectivity from highly sub-sampled activity data

Organisers: Prof Manfred Opper (Technische Universitaet Berlin, Germany), Prof Yasser Roudi (NTNU, Trondheim, Norway), Prof Peter Sollich (King's College London, UK)

Prof Peter Sollich:

Inference and learning on large graphical models, i.e. large systems of simple probabilistic units linked by a complex network of interactions, is a classical topic in machine learning. Such systems are also an active research topic in the field of statistical physics.
The main interaction between statistical physics and machine learning has so far been in the area of analysing data sets without explicit temporal structure. Here methods of equilibrium statistical physics, developed for studying Boltzmann distributions on networks of nodes with e.g. pairwise interactions, are closely related to graphical model inference techniques; accordingly there has been much cross-fertilisation, leading to both conceptual insights and more efficient algorithms. Models can be learned from recorded experimental or other empirical data, but even when the samples come from e.g. a time series, this temporal aspect of the data is typically ignored.
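As a toy concretisation of the objects just mentioned (the sizes, couplings and fields below are made up for illustration), a Boltzmann distribution over spins with pairwise interactions can be enumerated exactly for a handful of nodes; graphical-model inference techniques approximate precisely these marginals when exhaustive enumeration becomes infeasible:

```python
import itertools

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical small system: P(s) ~ exp(sum_{i<j} J_ij s_i s_j + sum_i h_i s_i)
N = 8                                      # small enough for exact enumeration
J = np.triu(rng.normal(0, 0.3, (N, N)), 1)  # pairwise couplings, i < j only
h = rng.normal(0, 0.5, N)                   # external fields

# Enumerate all 2^N spin configurations s in {-1, +1}^N
states = np.array(list(itertools.product([-1, 1], repeat=N)))
pair_terms = np.einsum('ki,ij,kj->k', states, J, states)
energies = -(pair_terms + states @ h)

weights = np.exp(-energies)
Z = weights.sum()                           # partition function
marginals = (weights[:, None] * states).sum(axis=0) / Z   # exact <s_i>
```

On large networks the partition function Z is intractable, which is where belief propagation and related approximations enter.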
More recently, interest has shifted towards dynamical models. This shift has occurred for two main reasons:

  1. Most of the interesting systems for which statistical analysis techniques are required, e.g. networks of biological neurons, gene regulatory networks, protein-protein interaction networks and stock markets, exhibit very rich temporal or spatiotemporal dynamics; ignoring these by focusing on stationary distributions alone can discard a significant amount of interesting information and may even lead to qualitatively wrong conclusions.
  2. Current technological breakthroughs in collecting data from such complex systems are yielding ever increasing temporal resolution. Combined with strong theoretical methods, this allows in-depth analyses of the fundamental temporal aspects of a system's function. It is widely accepted that these dynamical aspects are crucial for understanding the function of biological and financial systems, warranting the development of techniques for studying them.
  2. Current technological breakthroughs in collecting data from the complex systems referred to above are yielding ever increasing temporal resolution. This in turn allows in depth analyses of the fundamental temporal aspects of the function of the system, if combined with strong theoretical methods. It is widely accepted that these dynamical aspects are crucial for understanding the function of biological and financial systems, warranting the development of techniques for studying them.

In the past, the fields of machine learning and statistical physics have cross-fertilised each other significantly. For example, establishing the relation between loopy belief propagation, message-passing algorithms and the Bethe free energy formulation has stimulated a large amount of research on approximate inference techniques and on the corresponding equilibrium analysis of disordered systems in statistical physics.
The goal of this workshop is to bring together researchers from machine learning and statistical physics to discuss the new challenges originating from dynamical data. Such data are modelled using a variety of approaches: dynamic belief networks, their continuous-time analogues (as often used for disordered spin systems in statistical physics), coupled stochastic differential equations for continuous random variables, and others. The workshop will provide a forum for exploring possible synergies between the inference and learning approaches developed for these various models. The experience of joint advances in the equilibrium domain suggests that there is much unexplored scope for progress on dynamical data.
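To give a feel for the last model class, here is a minimal sketch (network size, coupling statistics and noise level are all arbitrary choices, not taken from any talk) of coupled stochastic differential equations: a network of linearly coupled Ornstein-Uhlenbeck variables simulated with the Euler-Maruyama scheme. Trajectories of this kind are the raw material for the inference questions below.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearly coupled OU network: dx = (-x + J x) dt + sigma dW
N, dt, T = 10, 0.01, 20000                    # nodes, step size, steps
J = rng.normal(0, 0.3 / np.sqrt(N), (N, N))   # weak random couplings
np.fill_diagonal(J, 0)                        # no self-couplings
sigma = 0.5                                   # noise strength

x = np.zeros(N)
traj = np.empty((T, N))
for t in range(T):
    drift = -x + J @ x                        # deterministic part
    # Euler-Maruyama update: noise scales with sqrt(dt)
    x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
    traj[t] = x
```

With the spectral radius of J well below 1, the drift is stable and the trajectory fluctuates around zero; inference then asks, e.g., for J given `traj`, possibly with some nodes unobserved.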

Possible topics to be addressed include:
Inference on state dynamics:
– efficient approximation of dynamics on a given network, filtering, smoothing
– inference with hidden nodes
– existing methods, including dynamical belief propagation and expectation propagation, variational approximations, and mean-field and Plefka approximations; relations between these, their advantages and drawbacks
– alternative approaches
Learning model/network parameters:
– with/without hidden nodes
Learning network structure:
– going beyond correlation information
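To make the last point concrete, here is a hedged sketch (model size, coupling strength and run length are arbitrary choices) of coupling inference for a kinetic Ising model from a simulated time series, using the naive mean-field estimator J ≈ A⁻¹ D C⁻¹. The key point is that it exploits the one-step time-delayed correlations D, not just the equal-time correlations C:

```python
import numpy as np

rng = np.random.default_rng(0)

# Kinetic Ising model with asymmetric couplings, Glauber dynamics:
# P(s_i(t+1) = +1 | s(t)) = 1 / (1 + exp(-2 h_i)),  h = J s(t)
N, T = 20, 20000
J = rng.normal(0, 0.3 / np.sqrt(N), (N, N))   # true couplings (weak regime)

s = rng.choice([-1, 1], size=N)
S = np.empty((T, N))
for t in range(T):
    p_up = 1.0 / (1.0 + np.exp(-2.0 * (J @ s)))
    s = np.where(rng.random(N) < p_up, 1, -1)
    S[t] = s

# Naive mean-field inversion: D = A J C  =>  J_est = A^{-1} D C^{-1},
# with A_ii = 1 - m_i^2, C the equal-time and D the delayed covariance
m = S.mean(axis=0)
dS = S - m
C = dS.T @ dS / T                             # <ds_i(t)   ds_j(t)>
D = dS[1:].T @ dS[:-1] / (T - 1)              # <ds_i(t+1) ds_j(t)>
A_inv = np.diag(1.0 / (1.0 - m ** 2))
J_est = A_inv @ D @ np.linalg.inv(C)

corr = np.corrcoef(J.ravel(), J_est.ravel())[0, 1]
print(f"correlation between true and inferred couplings: {corr:.2f}")
```

In the weak-coupling regime this simple estimator already recovers the couplings well; the approximations listed above (TAP/Plefka, expectation propagation, etc.) refine exactly this kind of inversion at stronger coupling or with hidden nodes.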