Speaker Abstracts and Biographies

Bob Batterman – University of Pittsburgh

Emergence, Autonomy, Multiscale Modeling

Abstract:  Many systems that are complex and heterogeneous at small scales exhibit patterns of behavior at larger scales that appear to be universal.  In other words, systems that differ in their lower-scale makeup exhibit the same upper-scale patterns. I examine some of the methods by which we can understand this kind of universality (renormalization group theory and homogenization).  It is sometimes possible to explain why the upper-scale behaviors display a kind of autonomy that is often, though not always explicitly, associated with the concept of emergence.

Biography: Bob Batterman (PhD, University of Michigan, 1987) is Professor of Philosophy and Chair of the department at the University of Pittsburgh. Prior to his arrival in Pittsburgh, he held the Rotman Canada Research Chair in Philosophy of Science at the University of Western Ontario (2005–2010). Before that he spent 15 years in the Department of Philosophy at Ohio State University. He is a Fellow of the Royal Society of Canada. He is the author of The Devil in the Details: Asymptotic Reasoning in Explanation, Reduction, and Emergence (Oxford, 2002) and the editor of The Oxford Handbook of Philosophy of Physics (Oxford, 2013). His work in philosophy of physics focuses primarily on condensed matter, broadly construed. His research interests include the foundations of statistical physics, dynamical systems and chaos, asymptotic reasoning, mathematical idealizations, the philosophy of applied mathematics, explanation, reduction, and emergence.

Current research supported by the John Templeton Foundation examines multiscale approaches to modeling systems such as steel beams and embryonic development in biology. The focus is on justifying the use of upper-scale/continuum models via the mathematics of homogenization, Gamma-convergence, and the renormalization group.


Glen Evenbly – University of California, Irvine

Coarse-graining transformations for many-body systems: new approaches using tensor networks

Abstract:  Discerning the large-scale collective behavior of a many-body system from a microscopic description (specified as an interaction between the constituent particles) remains a central task in many areas of contemporary physics, and has long posed a formidable challenge. It is due to the difficulty of this so-called many-body problem that certain collective phenomena, such as high-temperature superconductivity, remain poorly understood.

In recent times, coarse-graining transformations have emerged as a powerful tool for addressing many-body systems. These methods involve building an effective description of the system, one that captures the desired long-range physics, in terms of a few coarse-grained degrees of freedom by successively identifying and removing short-ranged degrees of freedom.

In this talk I will introduce coarse-graining transformations in the context of classical statistical and quantum many-body systems on the lattice, focusing on newly developed methods based on tensor networks.
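As a deliberately minimal illustration of the coarse-graining idea (not taken from the talk), the sketch below decimates the zero-field 1D Ising chain by repeatedly contracting its 2×2 transfer matrix with itself: each step halves the number of degrees of freedom while the free energy per site flows to its exact value. The tensor-network methods discussed in the talk generalize this elementary step to two dimensions and to quantum systems, using truncated singular value decompositions to keep only the most important coarse-grained degrees of freedom. The inverse temperature and number of steps below are assumed for illustration.

```python
import numpy as np

beta = 0.7                                   # inverse temperature (illustrative value)
s = np.array([1.0, -1.0])                    # spin values

# Transfer "tensor" of the zero-field 1D Ising chain: T[s, s'] = exp(beta * s * s')
T = np.exp(beta * np.outer(s, s))

log_C = 0.0                                  # log of the accumulated normalization
n_sites = 1
for step in range(40):
    T = T @ T                                # coarse-graining step: decimate every other site
    n_sites *= 2                             # one matrix now stands for twice as many sites
    c = np.abs(T).max()
    T /= c                                   # rescale so entries stay O(1); bookkeep the factor
    log_C = 2.0 * log_C + np.log(c)

f_rg = -log_C / (beta * n_sites)             # free energy per site from the coarse-graining flow
f_exact = -np.log(2.0 * np.cosh(beta)) / beta
print(f"coarse-grained f = {f_rg:.10f}   exact f = {f_exact:.10f}")
```

After a few dozen doublings the coarse-grained estimate agrees with the exact 1D result to high precision; the 2D tensor-network algorithms in the talk follow the same logic but must truncate the growing bond dimension at each step.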

Biography:  Glen Evenbly was born in New Zealand and obtained his B.Sc. in Physics at the University of Auckland, New Zealand, in 2004. He performed his Ph.D. research at the University of Queensland with Professor Guifre Vidal, and received his Ph.D. in Physics in 2010. Afterwards, he was awarded the Sherman Fairchild Prize Postdoctoral Scholarship in Theoretical Physics at the California Institute of Technology (2011–2014), working in Professor John Preskill’s group. He is currently a Simons Foundation postdoctoral fellow working with Steven White at the University of California, Irvine.

Evenbly’s research is focused on the development and implementation of tensor network approaches for the efficient simulation of quantum many-body systems. In particular, he has made significant contributions to the development of the multi-scale entanglement renormalization ansatz (MERA) and its application to the study of many-body systems at criticality.


Kevin Brown – University of Connecticut

Emergent Behavior in Systems Biology and Cognitive Science?

Abstract:  I will discuss possible connections to emergence in three specific areas.  One is the parameter-space geometry of large nonlinear models with many poorly determined parameters, as is common in models arising in computational cell biology.  Another is collective behavior in computational studies of small populations of neuron-like units.  A third comes from studies of human cognition, particularly mind-wandering and the default mode network of human brain activity.  I also hope to indicate that while “more is different” may be an important mantra, in some problems different is also different and may be the more salient explanation.  I will briefly reference some history of earlier frameworks for emergence, and connect it to a debate in the theory of consciousness about the importance and utility of emergent explanations for complex phenomena.

Biography: Kevin Scott Brown is an Assistant Professor in the Department of Biomedical Engineering, with joint appointments in the Departments of Physics, Chemical and Biomolecular Engineering, and Marine Sciences, at the University of Connecticut, where he is also a member of the Institute for Systems Genomics. He is a complex systems scientist who studies complex biological systems, particularly those arising in systems biology and systems neuroscience, employing methodology from dynamical systems, network theory, Bayesian and nonparametric statistics, computational biology, and statistical signal processing. His work has focused heavily on inverse problems: inferring network and model structures from cellular time-series measurements, protein sequences, and high-dimensional brain data. His work is strongly connected to data, and he continues to have many productive collaborations with experimentalists. After receiving his Ph.D. from Cornell University and before taking his present position, he was a Helen Hay Whitney Foundation Fellow in Molecular and Cellular Biology with Andrew W. Murray at Harvard University, a Postdoctoral Fellow with Jean M. Carlson and then a Project Scientist in the Department of Physics and the Institute for Collaborative Biotechnologies at the University of California, Santa Barbara, and a Research Assistant Professor in Chemical, Materials and Biomolecular Engineering and the Department of Marine Sciences at the University of Connecticut.


Mark Transtrum – Brigham Young University

Parameter Identifiability and Emergent Theories in Physics, Biology, and Beyond                                   

Abstract:  I discuss the relationship between parameter identifiability and emergent theories in science. The success of science is due in large part to the hierarchical nature of physical theories. These effective theories model natural phenomena as if the physics at macroscopic length scales were almost independent of the underlying, shorter-length-scale details. Using the Fisher Information Matrix, I show that observations of system behavior at long length scales are often insufficient to accurately identify most microscopic parameters. Such non-identifiable parameters are observed in many diverse areas of science for which effective theories have historically been difficult to find. Interpreting a model as a manifold of predictions in data space, I show how effective models can be systematically derived from microscopic first principles for a variety of complex systems in physics, biology, and other fields.
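A minimal numerical sketch of the kind of analysis behind this claim is given below; it uses a toy sum-of-exponentials model with assumed decay rates and observation times, not any of the specific systems discussed in the talk. The eigenvalues of the Fisher Information Matrix built from the model's Jacobian typically span many orders of magnitude, so that only a few stiff parameter combinations are identifiable from data.

```python
import numpy as np

# Toy model: y(t, theta) = sum_k exp(-theta_k * t), a classic example of a
# "sloppy" model. Decay rates and observation times below are assumed values.
def predictions(theta, t):
    return np.exp(-np.outer(t, theta)).sum(axis=1)

theta0 = np.array([0.5, 1.0, 2.0, 4.0])      # assumed "true" decay rates
t = np.linspace(0.0, 5.0, 30)                # observation times

# Jacobian of the predictions with respect to the parameters (central differences)
eps = 1e-6
J = np.zeros((len(t), len(theta0)))
for k in range(len(theta0)):
    d = np.zeros_like(theta0)
    d[k] = eps
    J[:, k] = (predictions(theta0 + d, t) - predictions(theta0 - d, t)) / (2 * eps)

# Fisher Information Matrix for unit-variance Gaussian noise: FIM = J^T J
fim = J.T @ J
eigvals = np.sort(np.linalg.eigvalsh(fim))[::-1]
print("FIM eigenvalues:", eigvals)
print("orders of magnitude spanned:", np.log10(eigvals[0] / eigvals[-1]))
```

The widely spread eigenvalue spectrum is the signature of non-identifiability: data constrain only a few stiff directions in parameter space, which is what makes low-dimensional effective descriptions possible.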

Biography:  Mark K. Transtrum completed his undergraduate studies in physics and mathematics at Brigham Young University in Provo, Utah. He did his graduate work at Cornell University under the supervision of Professor James Sethna. After receiving his Ph.D. in Physics in 2011, he worked for two years as a postdoctoral fellow at the M.D. Anderson Cancer Center in Houston, Texas, with Peng Qiu. Transtrum returned to Brigham Young University as a faculty member in Physics & Astronomy in 2013.

Transtrum’s research explores how our mathematical representations of physical systems vary depending on the observations we make about the system. Much of his work uses methods from the field of Information Geometry, an approach to statistics that combines information theory and differential geometry. His contributions have helped to explain why reduced representations of complex processes are possible, and he has also developed improved methods for data fitting and for exploring the behavior space of complex models. Most recently, he has explored new model reduction methods as a way of identifying and removing irrelevant details from complicated models.


Richard Nisbett – University of Michigan

From Ecology to Cognition

Abstract:  Ecologies that support large permanent farms encourage complex, interdependent social relations in the service of optimal use of land. Other occupations, such as open-sea fishing, that require a number of people working together also encourage interdependent social relations. Ecologies such as steppes, mountains, and dry plains, which are best suited for herding, support independent cultures. Interdependent social relations direct attention outward because of the need to coordinate behavior with other group members. Consequently, interdependent peoples see much more context than independent peoples and are much more aware of relationships, even among inanimate aspects of the environment. Independent peoples focus more on objects (including people) that they wish to manipulate or influence and have less need to focus on context. Independent peoples use attributes of objects (including people) to categorize them, and they attend to the rules governing the behavior of the objects.

Herdsmen are susceptible to losing their livelihood in an instant. Theft of a herd, or even of a single animal, can be economically devastating. Typically herdsmen live in low population-density areas where there is no effective policing. Hence, in the words of the North Carolina Piedmont maxim: “Every man a sheriff on his own hearth.” To discourage anyone from taking advantage of them, herdsmen dwell in a “culture of honor”: any insult or slight is greeted with violence or the threat of it. Cf. the American inner city.

Biography: Richard Nisbett is the Theodore M. Newcomb Distinguished University Professor at the University of Michigan, where he is Co-Director of the Culture and Cognition Program. He studies how laypeople reason and make inferences about the world. He has shown both that inferences can be seriously flawed and that they are surprisingly subject to correction by training in probabilistic reasoning, methodological rules, and cost-benefit analysis. Other work compares East Asians with Westerners. He finds that Westerners reason analytically, emphasizing rules and categorization, whereas East Asians reason holistically, focusing broadly on context and attending to similarities and relationships. His most recent work is on the nature of intelligence and its modifiability, showing that traditional views of intelligence are far too pessimistic about how much intelligence can be improved. He has also studied the “culture of honor” of the U.S. South, which lies behind the greater violence of that region.


Deborah Gordon – Stanford University

The Ecology of Collective Behavior

Abstract: Ants are an amazingly diverse group of more than 14,000 species that live in every conceivable habitat on earth. Like many distributed systems, both natural and engineered, ant colonies operate without any central control. No ant can assess what needs to be done. Each ant responds to its interactions with other ants nearby, and in the aggregate these dynamical networks of interaction regulate colony behavior and provide a filter to screen out enemies. I will describe some of the algorithms that ant colonies of different species use to solve particular ecological problems. An ecological perspective can show us how the processes that produce collective behavior are shaped by the dynamics of the environment in which they evolve.
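To give a feel for the kind of algorithm involved, here is a hypothetical toy model, loosely inspired by the "Anternet" analogy for harvester-ant foraging regulation, in which the decision to leave the nest depends only on recent interactions with returning foragers. All rules and parameter values are invented for illustration and are not the species-specific algorithms of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model of interaction-based foraging regulation: an ant leaves
# the nest with a probability that grows with the number of foragers that returned
# recently. No ant senses the global food availability, yet the colony-level
# foraging rate ends up tracking it. All parameters below are invented.
food_availability = 0.8        # assumed chance that a trip succeeds quickly
trip_length = 10               # time steps a quick, successful trip takes
outstanding = []               # scheduled return times of foragers currently outside
recent_returns = 0.0           # exponentially decaying count of recent returns

for t in range(500):
    returns = sum(1 for t_ret in outstanding if t_ret == t)
    outstanding = [t_ret for t_ret in outstanding if t_ret > t]
    recent_returns = 0.9 * recent_returns + returns

    # each of 20 waiting ants departs with a probability set by recent interactions
    p_leave = min(0.5, 0.02 + 0.05 * recent_returns)
    for _ in range(rng.binomial(20, p_leave)):
        delay = trip_length if rng.random() < food_availability else 4 * trip_length
        outstanding.append(t + delay)

    if t % 100 == 0:
        print(f"t={t:3d}  foragers out: {len(outstanding)}")
```

When food is plentiful, returns come quickly, departures ramp up, and the number of active foragers settles at a high level; when food is scarce, the same local rule throttles foraging, with no individual ant ever measuring the environment globally.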

Biography: Deborah M. Gordon is a Professor in the Department of Biology at Stanford University. She studies how colonies work without central control using networks of simple interactions, and how these networks evolve in relation to changing environments. Her work extends to analogies with other biological systems and collaborations with engineers and computer scientists, such as the “Anternet”, comparing the regulation of foraging in desert harvester ants to the protocol used to regulate data traffic in the internet. Her projects include a long-term study of a population of harvester ant colonies in Arizona, studies of the invasive Argentine ant in northern California, and ant-plant mutualisms in Central America. She received her PhD from Duke University, then joined the Harvard Society of Fellows, and did postdoctoral research at Oxford and the University of London. Awards include a Guggenheim Fellowship and the Gores Teaching Award from Stanford. She is the author of two books, Ant Encounters (Primers in Complex Systems, Princeton Univ Press) and Ants at Work (Norton), and recent talks include TED 2014 (http://www.stanford.edu/~dmgordon/).


Hugues Bersini – Université Libre de Bruxelles

Biological Emergence is Stronger

Abstract:  In my talk, I will discuss and illustrate the three necessary ingredients which together could allow a collective phenomenon to be labelled as “emergent”. First, the phenomenon, as usual, requires a group of natural objects entering into non-linear relationships and potentially entailing the existence of various semantic descriptions depending on the human scale of observation. Second, this phenomenon has to be observed by a mechanical observer rather than a human one, an observer with the natural capacity for temporal and/or spatial integration. Finally, for this natural observer to detect and select the collective phenomenon, it must do so on account of the adaptive value for which the phenomenon is responsible. The necessity for such a teleological characterization and the presence of natural selection drive us to defend, with many authors, the idea that truly emergent phenomena might only belong to biology. Following a brief philosophical plea, we present a simple and illustrative computer thought experiment in which a society of agents evolves a stigmergic collective behavior as an outcome of its greater adaptive value. The three ingredients are illustrated and discussed within this experimental context. Including the mechanical observer and the equally natural selection to which the phenomenon is submitted should underlie the necessary de-subjectivation that strengthens any scientific endeavour. I shall finally show why the short paths taken by ant colonies, the collective flight of birds, the robotic swarms studied in my lab, and the maximal consumption of nutrients by cellular metabolism can all be seen as emergent in this stronger sense.
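As a pointer to what such a stigmergic experiment can look like (a generic, hypothetical sketch, not the thought experiment presented in the talk), the few lines below implement the classic two-path pheromone scenario: ants choose a path in proportion to its pheromone level, deposit more pheromone on shorter trips, and the pheromone evaporates. The colony ends up concentrated on the short path even though no individual ant ever compares the two lengths.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stigmergy sketch: two paths of different (assumed) lengths, one
# shared pheromone value per path. Path choice is proportional to pheromone;
# deposits scale inversely with trip length; evaporation forgets old choices.
lengths = np.array([1.0, 2.0])        # short and long path (assumed values)
pheromone = np.array([1.0, 1.0])      # start unbiased
evaporation = 0.02

for step in range(2000):
    probs = pheromone / pheromone.sum()
    choice = rng.choice(2, p=probs)               # one ant picks a path
    pheromone *= (1.0 - evaporation)              # evaporation
    pheromone[choice] += 1.0 / lengths[choice]    # shorter trips deposit more per unit time

print("fraction of pheromone on the short path:", pheromone[0] / pheromone.sum())
```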

Biography: Hugues Bersini was born on 19 January 1961 and lives in Brussels. He holds an MS degree (1983) and a Ph.D. in engineering (1989), both from the Université Libre de Bruxelles (ULB). After being supported as a researcher by an EEC grant from the JRC-CEE in Ispra (1984–1987), he became a member of the IRIDIA laboratory (the AI laboratory of ULB), which he now heads with Marco Dorigo. He has held a faculty position at ULB since 1992 and is now a full professor, teaching computer science, Web technology, business intelligence, programming, and AI. He has been a partner in various industrial projects and EEC ESPRIT projects involving adaptive fuzzy or neuro controllers, optimization algorithms, and data mining. Over the last 20 years, he has published about 300 papers on research covering cognitive science, AI for process control, connectionism, fuzzy control, lazy learning for modelling and control, reinforcement learning, biological networks, neural networks for medical applications, frustration in complex systems, chaos, computational chemistry, object-oriented technologies, immune engineering, and epistemology. He is often invited to give tutorials on neural networks, object orientation, and the behaviour of complex systems, and he is a pioneer in the exploitation of biological metaphors (such as the immune system) for engineering and cognitive science.

He co-organized at ULB the second conference on Parallel Problem Solving from Nature (PPSN) and the second European Conference on Artificial Life (ECAL), and co-organized the three European Workshops on Reinforcement Learning. During the 1996 Conference on Evolutionary Computation, he co-organized the first International Competition on Evolutionary Optimization algorithms, and he co-organized the second such competition, held at Indiana University. He has organized several tributes to one of his main scientific inspirations, Francisco Varela, the most recent of which took place in Paris. He organized the 5th International Conference on Artificial Immune Systems (ICARIS) and the 11th European Conference on Artificial Life (ECAL 2011). He was the coordinator of the FAMIMO LTR European project on fuzzy control for multi-input multi-output processes and was involved in two ESPRIT projects, NEMORETS and METHODS, both covering the exploitation of evolutionary and optimization algorithms for industrial design.

He is the author of eleven French books. Several cover basic computer science: “Tout ce que vous avez toujours voulu savoir sur l’informatique” (Best Of), “L’Orienté Objet” (Eyrolles), “Les fondements de l’informatique” (Vuibert), and “De L’Intelligence Humaine à L’Intelligence Artificielle” (Ellipses); three cover complex systems: “Des réseaux et des sciences” and “Comment définir le vivant” (Vuibert) and “Qu’est-ce l’Emergence” (Ellipses). Recently he published “Informatique et Cinéma” (Ellipses), “Haro sur la compétition” (PUF), and “Le Tamagotchi de Mme Yen” and “Le dernier fado de l’androide” (Le Pommier). He teaches AI and object-oriented programming (C++, Java, .NET, UML, Django/Python, design patterns) to university students (Solvay and Polytechnic Schools) and to industry, and consults for companies on object-oriented technologies, data mining, and business intelligence.

These days he is working on data mining and data warehousing of genomic data for the In Silico project. In recent years, three spin-offs have been created from research done at IRIDIA: Cluepoints, Tevizz, and In Silico DB.


David Schwab – Northwestern University

Statistical Physics of Statistical Inference with Hidden Variables

Abstract:  I will discuss two related problems where statistical physics provides a new lens through which to understand statistical inference. (1) Recently it has become possible to directly measure the simultaneous activity of large populations of neurons, allowing one to study the properties of the collective neural code. When the observed response distributions are translated into the language of statistical physics, these systems appear poised near a unique critical point, where the extensive parts of the entropy and energy are exactly equal. Here we present analytical arguments and numerical simulations showing that such critical behavior naturally arises in systems with unobserved random variables, such as a common input stimulus to a neural population, that affect the observed degrees of freedom. (2) Next we turn to deep learning, a popular technique in machine learning whose recent performance on tasks such as visual object recognition rivals that of humans. We present recent work relating greedy training of deep belief networks to a form of variational real-space renormalization. This connection may help explain how deep networks automatically learn relevant features from data and extract independent factors of variation.
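The first point admits a compact illustration. The sketch below, an assumed toy setup rather than the talk's actual models or data, samples conditionally independent binary "neurons" driven by a fluctuating hidden input; once the hidden variable is marginalized out, the empirical distribution over activity patterns becomes broad and roughly Zipf-like, which is the hallmark of the apparent criticality discussed above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy illustration (assumed setup): N binary "neurons" are conditionally independent
# given a fluctuating hidden input h. Marginalizing over h couples them and yields a
# broad, approximately Zipf-like distribution over observed activity patterns.
N, samples = 10, 200_000
h = rng.normal(0.0, 2.0, size=samples)                 # hidden common input
p_spike = 1.0 / (1.0 + np.exp(-h))                     # per-neuron spike probability
spikes = rng.random((samples, N)) < p_spike[:, None]   # conditionally independent spins

# Empirical distribution over the 2^N activity patterns, sorted by frequency
codes = spikes.astype(int) @ (1 << np.arange(N))       # encode each pattern as an integer
counts = np.sort(np.bincount(codes, minlength=2**N))[::-1]
freq = counts[counts > 0] / samples
rank = np.arange(1, len(freq) + 1)

# Zipf's law corresponds to a slope near -1 on a log-log rank-frequency plot
slope = np.polyfit(np.log(rank), np.log(freq), 1)[0]
print("rank-frequency log-log slope:", slope)
```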

Biography: David J. Schwab is an Assistant Professor in the Department of Physics and Astronomy at Northwestern University. He completed his undergraduate studies in physics and mathematics at Cornell and received his PhD in Physics from UCLA, where he studied biophysics with Robijn Bruinsma and condensed matter theory with Sudip Chakravarty. Subsequently, he was a postdoctoral fellow and lecturer in Physics at Princeton University, where he worked in the biophysics theory group led by William Bialek and Ned Wingreen.

Schwab’s current research focuses on collective computation in living systems. In neuroscience, he studies how groups of neurons encode the information present in their inputs, and what principles underlie the computations they perform on these signals. In cell biology, he investigates how populations of cells communicate and make decisions in response to environmental cues. Much of his recent work is at the interface between statistical inference and statistical physics. In particular, he has provided a novel mechanism that may explain recent observations of criticality in large biological datasets, and has shown that classical coarse graining concepts such as the variational renormalization group directly map to certain forms of deep learning.


Daniel Cox – University of California, Davis; Co-Director ICAM

Toward a Theory of Emergence: Integrative Discussion

Abstract:  A key goal of ICAM since its inception has been to understand how, in a broad variety of problems both in and out of equilibrium, we can produce models with few parameters or organizing principles that properly and economically describe a broad range of emergent phenomena. Within the context of the renormalization group (RG) as developed by Ken Wilson, Leo Kadanoff, and others, we have a quantitative handle on how this works for equilibrium condensed matter phases as well as for quantum field theories, where a change in length scale allows one to tune to “fixed points” in which a few degrees of freedom quantitatively govern the physical phenomena over a broad range of parameters.  These fixed points define “effective field theories” for phases of matter or over ranges of scale in particle physics.

In recent years in the physical sciences, as we have witnessed at this meeting, a variety of approaches appear to be converging on RG-like concepts.  The information geometry derived from the Fisher information matrix is demonstrating how few-parameter models can emerge from complex, non-equilibrium systems.  The extraordinary pattern recognition made possible by deep learning algorithms is, looked at the right way, merely a sequence of RG transformations.  The new quantum many-body approaches engendered by tensor network/density matrix renormalization group (DMRG) methods are illustrating how RG-like transformations can yield precise few-parameter descriptions of complex low-temperature phases of matter.  Although these remain scientific adventures whose practitioners form somewhat disjoint sets, they point toward the ICAM dream of a truly quantitative, broadly applicable theory of emergence.

It seems to me that there can be much new science in fruitful dialogues between the subfields.  For example, there is clearly a superficial similarity between the convergence of Fisher matrix eigenvalues and the density matrix eigenvalues of the DMRG: is there anything like a correspondence principle that can be employed in this context to advance both approaches?  Is the deep-learning structure used to mimic the brain actually realized in the stacked layers of neurons in the brain, and if so, is an RG-like organization a product of evolutionary adaptation to the critical structure of natural sights and sounds? Can we find “protectorates” of a few parameters in emergent social structures?

In this closing session I will briefly propose and encourage discussion among the speakers and audience about possible visions of the synthesis of such a theory of emergence, the scientific opportunities it presents, and whether existing scientific bodies might have the means to support an exploration of these ideas.

Biography: Daniel L. Cox is a Distinguished Professor of Physics at the University of California, Davis. In his research career he has carried out theoretical investigations of heavy fermion and high-temperature superconductors, quantum impurity models, and, in recent years, problems at the interface between biology and physics, including the electronic properties of DNA and protein aggregation phenomena, especially in “mad cow” and related diseases. He is passionately devoted to advancing the science that characterizes emergent phenomena in complex matter such as biomolecules, exotic superconductors, and soft materials.

He received his BS in Physics from the University of Washington in 1979 and his PhD in theoretical physics from Cornell University in 1985. After a postdoctoral stint at the University of California, San Diego, he joined the physics department faculty at Ohio State University in 1986 and moved to the faculty at UC Davis in 1997. Cox has been the recipient of an A.P. Sloan Fellowship (1988) and a National Science Foundation Presidential Young Investigator Award (1988), was a Troisième Cycle Lecturer at the University of Lausanne in 1996, and won a J.S. Guggenheim Memorial Fellowship in 2004. He is the co-director for international matters of the Institute for Complex Adaptive Matter and has served on the American Physical Society Panel on Public Affairs (2001–2004), as a member-at-large of the executive committee of the APS Division of Condensed Matter Physics (1999–2001) and of the Division of Biological Physics (2008–present), and as Chair of the Division of Biological Physics (2013).

In his spare time, Daniel loves to play with his children, run, hike, and write poetry.