Speaker: Erik Hoel (Columbia University)

* This talk was cancelled because of inclement weather

Abstract: It has long been thought that while higher-scale (macro) descriptions of systems may be useful to observers, they are at best a compression and at worst leave out critical information. However, the theory of causal emergence shows that a macro model of a system (a map) can be more informative than a fully detailed model of the system (the territory). While causal emergence may at first glance seem counterintuitive or impossible, this paper grounds the phenomenon in a classic concept from information theory: Shannon's discovery of the channel capacity. I show that systems have a particular causal capacity, and that different causal models of those systems take advantage of that capacity to varying degrees. For some systems, only macroscale causal models use the full causal capacity. Such macroscale causal models can either be coarse-grainings, or can leave variables and states out of the model (exogenous) in various ways, which can improve the model's efficacy and its informativeness via the same mathematical principles by which error-correcting codes take advantage of an information channel's capacity. This provides a general framework for understanding why universal reductionism fails: the causal structure of some systems cannot be fully captured by even the most detailed microscopic model.
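A minimal sketch of the kind of calculation behind the abstract's claim, using the effective information (EI) measure from Hoel's earlier causal-emergence work: EI is the mutual information between a uniform intervention distribution over a system's states and the resulting effect distribution. The specific transition matrices below are illustrative assumptions, not examples from the talk; they show a noisy micro-level model scoring lower EI than its deterministic coarse-graining.

```python
import math

def effective_information(tpm):
    """Effective information (EI) of a transition probability matrix:
    the mean KL divergence (in bits) of each row from the average row,
    i.e. the mutual information under a uniform intervention distribution."""
    n = len(tpm)
    avg = [sum(row[j] for row in tpm) / n for j in range(n)]
    ei = 0.0
    for row in tpm:
        for p, q in zip(row, avg):
            if p > 0:
                ei += (p / n) * math.log2(p / q)
    return ei

# Hypothetical 4-state micro model: states 0-2 hop among one another
# at random, while state 3 is absorbing. Dynamics are noisy at this scale.
micro = [
    [1/3, 1/3, 1/3, 0],
    [1/3, 1/3, 1/3, 0],
    [1/3, 1/3, 1/3, 0],
    [0,   0,   0,   1],
]

# Coarse-graining {0,1,2} -> A and {3} -> B yields deterministic macro dynamics.
macro = [
    [1, 0],
    [0, 1],
]

print(f"EI(micro) = {effective_information(micro):.3f} bits")  # ≈ 0.811
print(f"EI(macro) = {effective_information(macro):.3f} bits")  # = 1.000
```

Here the macro model's EI exceeds the micro model's, even though the macro model discards detail: grouping the noisy states acts like an error-correcting code over the system's causal channel, in the sense the abstract describes.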

Thursday, February 9, 12:30 p.m.
Institute for Advanced Study, 1 Einstein Drive, West Building Seminar Room, 2nd floor
Hosted by the Program in Interdisciplinary Studies