Venue

Co-located with CCS 2018, the Conference on Complex Systems.

 Vellidio Convention Center
Leof. Stratou 3, Thessaloniki 546 39, Greece

Important dates

 7 Jun Call for abstracts
 26 Jun Deadline for abstract submission
 28 Jun Notification of acceptance
 30 Jun CCS Early Bird Registration deadline
 29 Jul Notification of acceptance (only for submissions received after 30 June)
 15 Aug CCS Standard Registration deadline
 26 Sep Satellite event

Submit an abstract

Only contributions submitted through the EasyChair system will be considered.
Authors of accepted contributions should register for the main conference.

Invited Speakers

Kavan Modi
Monash University

Andrew Garner
Nanyang Technological University

Samir Suweis
University of Padua

Paul Riechers
UC Davis

Alec Boyd
UC Davis

Nora Tischler
Griffith University

Program

14:30 Information-based fitness and the emergence of criticality in living systems

Samir Suweis

Recently, evidence has been mounting that biological systems might operate at the borderline between order and disorder, i.e., near a critical point. In this talk we will present a general mathematical framework for understanding this common pattern, explaining the possible origin and role of criticality in living adaptive and evolutionary systems. We rationalize this apparently ubiquitous criticality in terms of adaptive and evolutionary functional advantages. We provide a theoretical framework based on information theory and genetic algorithms, which shows that the optimal response to broadly different changing environments occurs in systems organizing spontaneously to the vicinity of a critical point. In particular, criticality turns out to be the evolutionary stable outcome of a community of individuals aimed at communicating with each other to create a collective entity.
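As a rough illustration of this information-based fitness idea, here is a minimal numerical sketch (my own, not the speakers' model): an agent's internal distribution is scored by exp(-KL(environment || agent)) and averaged over a broad family of environments. The tiny fully connected Ising family, the coupling grids, and the omission of the genetic-algorithm loop are all simplifying assumptions made for illustration.

```python
import numpy as np
from itertools import product

N = 6  # number of binary units; tiny so all 2^N states can be enumerated exactly

def boltzmann(beta):
    """Fully connected Ising-like family: p(s) proportional to exp((beta/N) * sum_{i<j} s_i s_j)."""
    states = np.array(list(product([-1, 1], repeat=N)))
    m = states.sum(axis=1)                    # total magnetization of each state
    log_w = (beta / (2 * N)) * (m**2 - N)     # (beta/N) * sum_{i<j} s_i s_j
    w = np.exp(log_w)
    return w / w.sum()

def kl(p, q):
    """Kullback-Leibler divergence in bits (both distributions are strictly positive here)."""
    return float(np.sum(p * np.log2(p / q)))

# "Environments" are drawn from a broad range of couplings the agent must cope with.
env_betas = np.linspace(0.0, 3.0, 31)
agent_betas = np.linspace(0.0, 3.0, 61)
env_dists = [boltzmann(b) for b in env_betas]

# Information-based fitness of an agent against one environment: exp(-KL(env || agent)),
# averaged over the whole family of environments.
def avg_fitness(agent_beta):
    p_agent = boltzmann(agent_beta)
    return np.mean([np.exp(-kl(p_env, p_agent)) for p_env in env_dists])

scores = [avg_fitness(b) for b in agent_betas]
best = agent_betas[int(np.argmax(scores))]
print(f"agent coupling with highest average fitness over broad environments: beta ~ {best:.2f}")
```

In the full framework a population of such agents evolves under this fitness via a genetic algorithm; the sketch only evaluates the fitness landscape directly.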

15:00 Evolutionary Organization of Information Processing - Learning from Economics and Biology


Carl Henning Reschke

15:20 Efficient Generation of Trade Spaces for Complexity Allocation in Complex Systems

Farouk Bonilla, Shahram Sarkani and Thomas Mazzuchi

15:40 Detecting Information Processing? The Case of Structured White Noise


Paul Riechers

Can model-free signal-analysis methods detect information processing and thus complex structure in the world around us? As an obvious candidate, power spectra are a common and convenient way to analyze signals in many disciplines of science and engineering. Their structure not only shows the prominence of various signal frequencies but also hints at mechanisms of correlation, resonance, and broader behavior. Here, however, we show that power spectra can hide all structure about arbitrarily complex processes, conveying only a flat power spectrum---the renowned signature of structureless white noise. Indeed, we argue that the most insightful signals from complex systems will have large beyond-pairwise correlations that evade power spectra. To offer more than a word of warning, we give three more constructive results:

  1. We characterize the minimal generative structure implied by any power spectrum.
  2. We show how to construct arbitrarily complex processes with flat power spectra.
  3. We suggest more sophisticated tools (and introduce the excess entropy spectrum) to detect computational structure.
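To make the warning concrete, here is a small self-contained illustration (my own example, not one of the constructions from the talk): a maximal-length LFSR sequence is fully deterministic once its 16-bit seed is known, yet over one full period its power spectrum is flat, so spectral analysis alone cannot distinguish it from white noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def lfsr_bits(n, seed=None):
    """Bits from a maximal-length 16-bit LFSR (primitive polynomial x^16 + x^14 + x^13 + x^11 + 1)."""
    bits = list(seed if seed is not None else [1] + [0] * 15)   # any nonzero seed works
    while len(bits) < n:
        bits.append(bits[-2] ^ bits[-3] ^ bits[-5] ^ bits[-16])
    return np.array(bits[:n])

N = 2**16 - 1                              # one full period of the LFSR
structured = 2 * lfsr_bits(N) - 1          # deterministic, highly structured, +/-1 valued
noise = 2 * rng.integers(0, 2, N) - 1      # genuine iid white noise, +/-1 valued

def flatness(x):
    """Relative spread (std/mean) of the periodogram away from DC: 0 is perfectly flat; iid noise gives about 1."""
    p = np.abs(np.fft.rfft(x - x.mean()))**2
    return p[1:].std() / p[1:].mean()

print("spectral spread  LFSR:", round(flatness(structured), 2), " iid:", round(flatness(noise), 2))

# The structure only shows up beyond pairwise statistics: the previous 16 bits
# predict the next LFSR bit perfectly, but say nothing about the iid bits.
def predictability(x):
    b = (x + 1) // 2
    pred = b[:-16] ^ b[11:-5] ^ b[13:-3] ^ b[14:-2]
    return np.mean(pred == b[16:])

print("next-bit prediction accuracy  LFSR:", predictability(structured), " iid:", round(predictability(noise), 3))
```

The prediction test is only a stand-in for the "more sophisticated tools" of point 3: the structure lives in statistics that the power spectrum simply does not see.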

16:00 BREAK

16:30 (Quantum-) informational approaches to causality, non-Markovianity, and interventions

Kavan Modi

TBD

17:00 Quantum Measurement as a Statistically Driven Reversible Bifurcation Process

Kristian Lindgren

17:20 Physical realization of a quantum memory advantage in the simulation of stochastic processes

Nora Tischler

Technologies that exploit quantum mechanical effects promise to enhance applications in a number of different areas. One celebrated example is the speed-up in factoring numbers that can be obtained through Shor’s algorithm. Recently, a new task has emerged for which quantum information science provides an advantage: the simulation of stochastic, i.e. partially random, processes.
Stochastic process models are used to describe a wide range of natural and social phenomena, including, for example, the weather and the stock market. The simulation of such processes provides valuable information about the dynamics of complex systems. However, for highly complex processes, a large amount of information about the system's past needs to be stored in order to simulate its future—a quantity formally measured by its statistical complexity. This translates to a large memory requirement, which may limit the feasibility of such a simulation. Here, quantum mechanics promises an advantage: simulators based on quantum information processing can outperform classical simulators by reducing the memory requirements below the ultimate classical limits.
The first physical demonstrations of this recently proposed quantum memory advantage have been achieved in the Quantum Optics and Information Laboratory at Griffith University, Australia. In a series of four experiments, we have, among other things, explored how the relative simplicity of two stochastic processes can depend on the classical or quantum nature of the information processing, and have demonstrated that quantum resources allow storing information about the past of the process in a lower-dimensional memory system. Our experiments employ single particles of light as the information carrier. In this talk, I will introduce quantum information processing with light and provide an overview of these physics experiments.
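For concreteness, the sketch below reproduces the textbook "perturbed coin" comparison between classical and quantum memory requirements. This is an illustrative example from the literature, not necessarily one of the processes realized in these experiments; the encoding states and the value p = 0.9 are assumptions made for illustration.

```python
import numpy as np

# Perturbed coin process: two causal states A and B; from A the process emits 0
# with probability p and stays in A, emits 1 with probability 1-p and moves to B,
# and symmetrically from B.
p = 0.9

# Classical simulator: must store which causal state it is in; the stationary
# distribution over {A, B} is uniform, so the statistical complexity is 1 bit.
C_mu = 1.0

# Quantum simulator: encodes the causal states in non-orthogonal pure states
# |sigma_A> = sqrt(p)|0> + sqrt(1-p)|1>,  |sigma_B> = sqrt(1-p)|0> + sqrt(p)|1>.
sigma_A = np.array([np.sqrt(p), np.sqrt(1 - p)])
sigma_B = np.array([np.sqrt(1 - p), np.sqrt(p)])
rho = 0.5 * np.outer(sigma_A, sigma_A) + 0.5 * np.outer(sigma_B, sigma_B)

# Quantum memory requirement: von Neumann entropy of the memory state.
evals = np.linalg.eigvalsh(rho)
C_q = float(-sum(v * np.log2(v) for v in evals if v > 1e-12))

print(f"classical memory: {C_mu:.3f} bits, quantum memory: {C_q:.3f} bits")
```

Because the two encoding states overlap, the quantum memory entropy comes out below 1 bit for any p other than 1/2, which is the kind of memory advantage the talk describes.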

17:40 Thermodynamics of Modularity: Structural Costs Beyond the Landauer Bound


Alec Boyd

Complex computations typically occur via the composition of modular units, such as the universal logic gates found in logical circuits. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex global information processing is more easily and flexibly implemented via a series of simpler, localized information processing operations that only control and change local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity--costs that arise directly from the operation of localized processing and that go beyond Landauer's dissipation bound for erasing information. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the amount of additional work required to perform the computational task modularly, measuring a structural energy cost. It determines the thermodynamic efficiency of different modular implementations of the same computation, and so it has immediate consequences for the architecture of physically embedded transducers, which are information processing agents. Constructively, we show how to circumvent modularity dissipation by designing agents that capture the information reservoir's global correlations and patterns. We prove that these agents, when acting as pattern generators or extractors, must match the complexity of their environment to minimize the modularity dissipation. Thus, there are routes to thermodynamic efficiency by optimizing the modular architecture of computations.
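One compact way to transcribe the bound described above is the following. The symbols are my own shorthand (W for the work production of the modular operation, Delta F^neq_loc and Delta F^neq_glob for the marginal and global changes in nonequilibrium free energy, Sigma_mod for the modularity dissipation), and the sign conventions should be checked against the original work.

```latex
% Hedged transcription of the bound stated in the abstract; notation and sign
% conventions are my own shorthand, not taken verbatim from the talk.
\begin{align}
  \langle W \rangle &\le -\Delta F^{\mathrm{neq}}_{\mathrm{loc}}
    && \text{(modular work production is bounded by the local free-energy change),} \\
  \Sigma_{\mathrm{mod}} &\ge \Delta F^{\mathrm{neq}}_{\mathrm{loc}}
    - \Delta F^{\mathrm{neq}}_{\mathrm{glob}} \;\ge\; 0
    && \text{(irretrievable dissipation from ignoring global correlations).}
\end{align}
```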

18:00 Thermodynamics of complexity


Andrew Garner

TBD

About IPCS

The Information Processing in Complex Systems (IPCS) satellite meeting has been organized since 2012 as part of the Conference on Complex Systems.
Our goal is to provide a forum for researchers who follow an information-theoretic approach to the analysis of complex systems. Here they can present recent achievements and discuss promising hypotheses and further research directions, combining both classical and quantum information approaches.

Background


All systems in nature can be considered from the perspective that they process information.
Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred. Indeed, bits of information about the state of one element travel, imperfectly, to the state of another element, shaping its new state. This storage and transfer of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions.
Mapping out exactly how these bits of information percolate through the system reveals fundamental insights into how the parts act in concert to produce the properties of the system as a whole.
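As a toy quantification of this imperfect transfer (my own illustrative sketch, not part of the workshop material), the snippet below lets one element copy another through a bit-flip channel and counts the bits that actually arrive as mutual information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Element Y copies the state of element X, but the interaction is noisy:
# each bit is flipped with probability eps.  The information that actually
# arrives in Y's new state is the mutual information I(X ; Y') in bits.
eps = 0.1
n = 200_000
x = rng.integers(0, 2, n)                  # states of element X
flip = rng.random(n) < eps
y_new = np.where(flip, 1 - x, x)           # Y's new state: an imperfect copy of X

# Empirical joint distribution and mutual information.
joint = np.zeros((2, 2))
np.add.at(joint, (x, y_new), 1)
joint /= n
px, py = joint.sum(axis=1), joint.sum(axis=0)
mi = sum(joint[a, b] * np.log2(joint[a, b] / (px[a] * py[b]))
         for a in range(2) for b in range(2) if joint[a, b] > 0)

print(f"bits transferred per interaction: {mi:.3f} (a perfect copy would transfer 1 bit)")
```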

A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, properties that describe and compare the dynamics of diverse complex systems ranging from social interactions to brain networks, from financial markets to biomedicine. Each possible combination of dynamical rules and interaction topology, with its disparate semantics, could be translated into the language of information processing, which in turn would provide a lingua franca for complex systems.

Organizers

Past Editions

 IPCS12 Brussels, Belgium (2012)
 IPCS13 Barcelona, Spain (2013)
 IPCS14 Lucca, Italy (2014)
 IPCS15 Tempe, Arizona (2015)
 IPCS16 Amsterdam, Netherlands (2016)
 IPCS17 Cancun, Mexico (2017)