Palma de Mallorca | October 20, 2022
Information Processing in Complex Systems
#IPCS22

Venue

Part of the CCS 2022 conference.

 Sala Granados I, Auditòrium de Palma
Av. de Gabriel Roca, 18, 07014 Palma, Illes Balears, Spain

Important dates

 July 4 Call for abstracts
 July 25 CCS Early Bird registration deadline
 Aug. 28 Extended deadline for abstract submission
 Sept. 5 Notification of acceptance
 Sept. 16 CCS Standard registration deadline
 Oct. 20 IPCS2022 satellite event

Submit an abstract

Only contributions submitted through the EasyChair system will be considered.

Authors of accepted contributions must also register for the main conference.


Invited Speakers

Leonardo Banchi
University of Florence

Felix Binder
Trinity College Dublin

Jayne Thompson
Horizon Quantum Computing

Daniele Marinazzo
Ghent University

Pedro Mediano
University of Cambridge

Program of IPCS2022

Session 1: Dimension reduction and generalization

10:00-10:30 Generalization in quantum machine learning: a quantum information standpoint

Leonardo Banchi

Quantum classification and hypothesis testing (state and channel discrimination) are two tightly related subjects, the main difference being that the former is data driven: how to assign to quantum states ρ(x) the corresponding class c (or hypothesis) is learnt from examples during training, where x can be either tunable experimental parameters or classical data “embedded” into quantum states. Does the model generalize? This is the main question in any data-driven strategy: can it predict the correct class even for previously unseen states? Here we establish a link between quantum classification and quantum information theory, by showing that the accuracy and generalization capability of quantum classifiers depend on the (Rényi) mutual informations I(C:Q) and I₂(X:Q) between the quantum state space Q and the classical parameter space X or class space C. Based on the above characterization, we then show how different properties of Q affect classification accuracy and generalization, such as the dimension of the Hilbert space, the amount of noise, and the amount of neglected information from X via, e.g., pooling layers. Moreover, we introduce a quantum version of the information bottleneck principle that allows us to explore the various trade-offs between accuracy and generalization. Finally, in order to check our theoretical predictions, we study the classification of the quantum phases of an Ising spin chain, and we propose the variational quantum information bottleneck method to optimize quantum embeddings of classical data to favor generalization.
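
As a side illustration of the quantities involved (not part of the talk), for a classical-quantum ensemble {p_x, ρ(x)} the mutual information I(X:Q) equals the Holevo quantity S(Σ_x p_x ρ(x)) − Σ_x p_x S(ρ(x)). Below is a minimal numpy sketch with an assumed toy single-qubit embedding; the states, probabilities, and noise level are illustrative, not from the paper.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def cq_mutual_information(probs, states):
    """I(X:Q) of a classical-quantum state, i.e. the Holevo quantity
    S(sum_x p_x rho_x) - sum_x p_x S(rho_x)."""
    rho_avg = sum(p * rho for p, rho in zip(probs, states))
    return von_neumann_entropy(rho_avg) - sum(
        p * von_neumann_entropy(rho) for p, rho in zip(probs, states))

def noisy_qubit(theta, noise=0.1):
    """Hypothetical embedding: angle theta on the Bloch sphere,
    mixed with depolarizing noise."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    pure = np.outer(psi, psi.conj())
    return (1 - noise) * pure + noise * np.eye(2) / 2

probs = [0.5, 0.5]
states = [noisy_qubit(0.0), noisy_qubit(np.pi / 2)]  # non-orthogonal classes
print(f"I(X:Q) = {cq_mutual_information(probs, states):.3f} bits")
```

With orthogonal, noiseless embeddings the sketch returns the full 1 bit; non-orthogonality and noise push I(X:Q) down, which is the kind of dependence on the state space Q that the abstract links to accuracy and generalization.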

10:30-11:00 Quantum adaptive agents using less memory

Jayne Thompson

Central to the success of adaptive systems is their ability to interpret signals from their environment and respond accordingly -- they act as agents interacting with their surroundings. Such agents typically perform better when able to execute increasingly complex strategies. This comes with a cost: the more information the agent must recall from its past experiences, the more memory it will need. Here we investigate the power of agents capable of quantum information processing. We uncover the most general form a quantum agent need adopt to maximise memory compression advantages, and provide a systematic means of encoding their memory states. We show these encodings can exhibit extremely favourable scaling advantages relative to memory-minimal classical agents, particularly when information must be retained about events increasingly far into the past.

11:00-11:30 Coffee break

Session 2: Information decomposition and causality

11:30-12:00 Information decomposition as a link between biological and artificial brains

Pedro Mediano

One of the key principles of information processing is that it is substrate-independent, i.e. the same computation can be implemented by multiple systems obeying different physical laws. In this talk, I will illustrate how the principles of information decomposition (in particular, metrics of synergy and redundancy) can provide such a substrate-independent description of computation in complex systems by linking biological and artificial brains. First, I will show results from fMRI data indicating that regions of the brain responsible for high-level cognitive processes are synergy-rich, while areas responsible for sensory input and motor output are redundancy-rich. Then, I will show results from a study of artificial neural networks, in which synergy increases as networks learn novel tasks, possibly aiding the generalization of learned representations, while redundancy helps the network withstand random perturbations. Together, these results suggest different functional roles of synergy and redundancy for computation in complex systems.
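
The two extremes of this decomposition are easy to reproduce numerically: an XOR target is purely synergistic (neither source alone is informative, while the pair is fully informative), and a shared signal is purely redundant. A minimal sketch with plug-in mutual information estimates; the distributions are illustrative toy choices, not the fMRI or network analyses of the talk.

```python
import numpy as np
from collections import Counter

def mi(xs, ys):
    """Plug-in mutual information I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * np.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

rng = np.random.default_rng(0)
n = 100_000
x1, x2 = rng.integers(0, 2, n), rng.integers(0, 2, n)

# Synergy: with an XOR target, each source alone carries ~0 bits,
# but the pair carries the full 1 bit.
y = x1 ^ x2
print(mi(x1, y), mi(x2, y), mi(list(zip(x1, x2)), y))

# Redundancy: both sources are copies of one shared signal, so each
# alone already carries the full 1 bit about the target.
s1 = s2 = rng.integers(0, 2, n)
z = s1
print(mi(s1, z), mi(s2, z), mi(list(zip(s1, s2)), z))
```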

12:00-12:30 Lead/lag directionality is not generally equivalent to causality: A cautionary tale for PSI compared to CMI

Andreu Arinyo i Prats

The application of causal techniques to neural imaging of the brain has grown over the years to include a wide and diverse family of methods applicable to EEG data. Moreover, interest has grown in the analysis of cross-frequency phase and amplitude correlations and their directionality, which in some cases yield contradictory results on directionality between high-frequency and low-frequency data. What has been lacking, however, is a comparison of two widely available methods that estimate directionality in cross-frequency EEG analysis: Conditional Mutual Information (CMI) and the Phase Slope Index (PSI). We show that these methods can yield differing interpretations of the same data depending on the phase lag.
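
For concreteness, here is a sketch of the phase slope index in the spirit of Nolte et al. (2008): the imaginary part of conj(C(f))·C(f+δf) summed over a frequency band, where C is the complex coherency. The toy signals, band, and segment length are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import csd, welch

def phase_slope_index(x, y, fs, fmin, fmax, nperseg=256):
    """Phase Slope Index over [fmin, fmax]: Im(sum_f C*(f) C(f+df)),
    with C the complex coherency.  With the sign convention below,
    positive values suggest x leads y (check against the toy test)."""
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, sxx = welch(x, fs=fs, nperseg=nperseg)
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    # scipy's csd returns conj(X)*Y; conjugate to match the X*conj(Y)
    # convention of Nolte et al. (2008).
    coh = np.conj(sxy) / np.sqrt(sxx * syy)
    band = coh[(f >= fmin) & (f <= fmax)]
    return float(np.imag(np.sum(np.conj(band[:-1]) * band[1:])))

# Toy test: y is a delayed, noisy copy of x, so x should lead y.
rng = np.random.default_rng(1)
fs, lag, n = 200, 5, 20_000
x = rng.standard_normal(n)
y = np.roll(x, lag) + 0.5 * rng.standard_normal(n)
print(phase_slope_index(x, y, fs, fmin=5, fmax=45))   # > 0
print(phase_slope_index(y, x, fs, fmin=5, fmax=45))   # < 0
```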

12:30-13:00 A network-based approach to identifying synergistic triplets in high-dimensional data

Jie Li, Stavroula Tassi, Rick Quax

Networks are complex systems characterized by different types of relationships between their interacting parts. These relationships are often computed as pairwise correlations of high-dimensional data in order to identify patterns. However, mounting evidence suggests that higher-order (synergistic) interactions are also important, and capturing them remains an open task that makes the analysis and modeling of these complex systems challenging: synergistic interactions are missed almost completely when calculating only pairwise correlations. The problem stems from the lack of an agreed-upon measure that directly quantifies information synergy, along with a combinatorial explosion that leads to long calculations. In this paper we propose an efficient method to identify triplets of variables with high synergistic information in large-scale datasets, operationalized as low O-information. The algorithm exploits a theoretical relationship between the mean conditional entropy of pairs of variables and information synergy, reducing the computational complexity from cubic to quadratic. We test our method on a wide-reaching medical dataset and find consistently that the top predicted triplets overlap about 75% with the true top synergistic triplets, as determined by direct calculation of the O-information. We conclude that including synergistic interactions may profoundly change how conclusions are drawn from network analysis, and our method is an important step toward making this feasible.
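
For a triplet, the O-information reduces to Ω(X1,X2,X3) = H(X1) + H(X2) + H(X3) + H(X1,X2,X3) − H(X1,X2) − H(X1,X3) − H(X2,X3), with negative values indicating synergy dominance. A minimal plug-in sketch of this direct calculation, which the authors' screening algorithm is designed to avoid repeating over all O(n³) triplets; the toy data are illustrative.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Plug-in joint entropy H(cols) in bits from discrete samples."""
    counts = np.array(list(Counter(zip(*cols)).values()))
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def o_information(x1, x2, x3):
    """O-information of a triplet; negative values indicate the triplet
    is dominated by synergy, positive values by redundancy."""
    return (entropy(x1) + entropy(x2) + entropy(x3)
            + entropy(x1, x2, x3)
            - entropy(x1, x2) - entropy(x1, x3) - entropy(x2, x3))

rng = np.random.default_rng(0)
a, b = rng.integers(0, 2, 50_000), rng.integers(0, 2, 50_000)
print(o_information(a, b, a ^ b))   # XOR triplet: ~ -1 bit (synergy)
print(o_information(a, a, a))       # three copies: ~ +1 bit (redundancy)
```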

13:00-14:30 Lunch break

14:30-15:15 Plenary Session

Session 3: Complex processes

15:15-15:45 Parameter estimation for complex processes

Felix Binder

Many real-world tasks include some kind of parameter estimation, i.e., determination of a parameter encoded in a probability distribution. Often, such probability distributions arise from stochastic processes for which complexity manifests in the form of temporal correlations. For a stationary stochastic process this means that the random variables that constitute it are identically distributed but not independent. The memory complexity underlying these correlations may appear as an advantage or as a disadvantage for parameter estimation compared to the memoryless case. Here, we illustrate this effect with suitable examples and present a fundamental bound, which is asymptotically linear in the number of outcomes. We then apply our results to the case of thermometry on a spin chain.
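
As a purely classical baseline (an assumption of this sketch, not the quantum setting of the talk), a memoryless process makes the Fisher information exactly linear in the number of outcomes, F_n = n·F_1; temporal correlations change how a process approaches this asymptotically linear behaviour. A quick numerical check for Bernoulli outcomes:

```python
import numpy as np

# For n i.i.d. Bernoulli(theta) outcomes the Fisher information is
# F_n = n / (theta * (1 - theta)): exactly linear in n.
theta, n, trials = 0.3, 1_000, 20_000
analytic = n / (theta * (1 - theta))

# Monte Carlo check: the variance of the score (d/dtheta of the
# log-likelihood) equals the Fisher information.
rng = np.random.default_rng(0)
k = rng.binomial(n, theta, size=trials)     # successes per experiment
score = k / theta - (n - k) / (1 - theta)
print(np.var(score), analytic)              # ~4761.9 vs 4761.9
```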

15:45-16:15 Quantum dimension reduction for stochastic simulation

Thomas Joseph Elliott

Simulating quantum dynamics on a classical computer bears a resource cost that grows exponentially with the size of the system, and even the simplest of quantum systems often exhibit seemingly complex behaviours. This apparent problem can be recast as a positive: complex classical systems can be simulated efficiently on simple quantum computers. In this talk I will discuss the application of quantum technologies to the modelling of stochastic processes, for which quantum simulators can operate with lower memory dimension than any classical alternative, i.e., a quantum dimension reduction. I will highlight examples of quantitative scaling divergences in modelling highly non-Markovian processes, wherein the provably-memory-minimal classical simulator must store diverging amounts of information with increasing precision, while arbitrary precision can be achieved with a finite-sized quantum simulator. I will further describe how a lossy quantum dimension reduction can be used for high-fidelity, low-memory-cost stochastic simulation in settings where no exact quantum dimension reduction is possible. I will also discuss recent work on the experimental implementation of such quantum dimension reduction.

16:15-16:45 Coffee break

Session 4: Higher order interactions and information decomposition

16:45-17:15 Higher order informational interactions in systems close to transition

Daniele Marinazzo

Information transfer is crucial to understanding the dynamics of complex systems. Most approaches have so far considered pairwise interactions, overlooking the fact that two or more variables can share information about the rest of the system. Considering higher-order interactions allows for less biased estimates of pairwise and conditioned interactions, and discloses new properties of the system. This view is particularly relevant when the system undergoes a transition.

17:15-17:45 Information Dynamics and Stability in the Tangled Nature Model of Evolutionary Ecology

Hardik Rajpal, Clemens Von Stengel, Pedro A.M. Mediano, Fernando Rosas and Henrik Jeldtoft Jensen

The tangled nature model has been a rich resource for studying metastability, species abundance, adaptability, and mass extinctions in closed ecosystems. In this study, we analyse the model from an information-theoretic perspective to study species-environment relationships and the emergence of higher-order cliques of species. To this end we rely on two novel developments in the field of information theory: Phi-Information Decomposition (ϕ-ID) and Information Individuality. The ϕ-ID framework enables us to decompose the information shared between species and their environment into components such as self-prediction, information transfer, and redundant and synergistic information propagation, enabling us to study how these aspects of information propagation vary as we make the system incrementally more unstable. By focusing on the self-information dimension of ϕ-ID, we can look for optimally self-predictive groups of species, known as information individuals. This notion defines the boundary between an individual and its environment to lie outside the maximally self-informative sub-system. We leverage this definition to identify the emergence of higher-level structures in which groups of species act as an individual in the fitness landscape. These stable groups emerge only for an intermediate range of instability in the system, which enables them to remain adaptable under stochastic fluctuations.

17:45-18:15 Two orders for decomposing multivariate information

Conor Finn

The partial information decomposition (PID) of Williams and Beer provides a general framework for decomposing the information provided by a set of source variables about a target variable. For instance, when we have two source variables, the total information provided by these sources about the target can be decomposed into four components, namely (i) the unique information provided by the first source, (ii) the unique information provided by the second source, (iii) the shared information which is redundantly provided by both sources, and (iv) the synergistic information which is only attainable from simultaneous knowledge of both sources together. The PID framework is based upon three axioms regarding the behaviour of a measure of redundant or intersection information. From these axioms, Williams and Beer derived a general structure for shared multivariate information called the redundant information lattice. These three axioms, however, do not uniquely determine a particular measure of redundant information. Consequently, over the past decade, there has been much debate as to the exact measure of redundant information that should be used. Many proposals are based on new axioms or differing operational interpretations, and several of these new measures have subsequently been determined to be incompatible with the original formulation. At present, there is no clear consensus as to how exactly one should proceed when evaluating such a decomposition. In this talk, I will discuss a novel method for decomposing the information provided by a set of source variables about a target variable. This method is based upon three axioms for a measure of the total marginal or union information which are analogous to the original Williams and Beer axioms. From these axioms I derive a second structure that can also be used to decompose multivariate information, which I call the total marginal information lattice. Finally, I will explore the implications of demanding consistency between these two decompositions, and in particular how requiring this consistency constrains the possible measures of redundant information, or equivalently, the possible measures of total marginal information. This approach is similar to that of Kolchinsky, who also considered two orders but did not demand that they produce the same decomposition.
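
For reference, the original Williams-Beer redundancy measure I_min is straightforward to evaluate on small discrete systems: it averages, over target states, the minimum specific information that any single source provides, and the other three components follow by subtraction. A sketch under these definitions (function names and the test distribution are illustrative):

```python
import numpy as np

def specific_info(p_xy, y):
    """Specific information I(Y=y ; X) = sum_x p(x|y) log2(p(y|x)/p(y))
    for a joint distribution p_xy[x, y]."""
    p_y = p_xy.sum(axis=0)[y]
    p_x = p_xy.sum(axis=1)
    px_y = p_xy[:, y] / p_y                  # p(x|y)
    mask = px_y > 0
    py_x = p_xy[mask, y] / p_x[mask]         # p(y|x) where relevant
    return float(np.sum(px_y[mask] * np.log2(py_x / p_y)))

def wb_pid(p):
    """Williams-Beer PID of I(X1,X2 ; Y) from p[x1, x2, y], using the
    I_min redundancy: returns (redundant, unique1, unique2, synergistic)."""
    p1y = p.sum(axis=1)                      # p(x1, y)
    p2y = p.sum(axis=0)                      # p(x2, y)
    p12y = p.reshape(-1, p.shape[2])         # p((x1,x2), y)
    p_y = p.sum(axis=(0, 1))
    mi = lambda pxy: sum(p_y[y] * specific_info(pxy, y)
                         for y in range(len(p_y)))
    i1, i2, i12 = mi(p1y), mi(p2y), mi(p12y)
    red = sum(p_y[y] * min(specific_info(p1y, y), specific_info(p2y, y))
              for y in range(len(p_y)))
    return red, i1 - red, i2 - red, i12 - i1 - i2 + red

# XOR example: Y = X1 ^ X2 with uniform inputs. All 1 bit of
# I(X1,X2 ; Y) should be synergistic: expect ~ (0, 0, 0, 1).
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25
print(wb_pid(p))
```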

About IPCS

The Information Processing in Complex Systems (IPCS) satellite meeting has been organized since 2012 as part of the Conference on Complex Systems.

Our goal is to provide a forum for researchers who follow an information-theoretic approach for the analysis of complex systems. Here they can present recent achievements and discuss promising hypotheses and further research directions, combining both classical and quantum information approaches.

Background

All systems in nature can be considered from the perspective that they process information.

Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred: bits of information about the state of one element travel, imperfectly, to the state of the other element, shaping its new state. This storage and transfer of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system reveals fundamental insights into how the parts act in concert to produce the properties of the system.
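
One standard way to make this notion of imperfect transfer concrete is transfer entropy, TE(X→Y) = I(Y_{t+1} ; X_t | Y_t): the information the past of one element carries about the next state of another, beyond that element's own past. A minimal plug-in estimate on a toy coupled binary process; the coupling model is an illustrative assumption.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) = I(Y_t+1 ; X_t | Y_t)
    in bits, estimated from paired discrete time series."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_prev, x_prev)
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), c in c_xyz.items():
        p_joint = c / n
        p_full = c / c_yz[(yp, xp)]              # p(y_next | y_prev, x_prev)
        p_self = c_yy[(yn, yp)] / c_y[yp]        # p(y_next | y_prev)
        te += p_joint * np.log2(p_full / p_self)
    return te

# Toy coupled process: y copies x's previous state, corrupted by noise.
rng = np.random.default_rng(0)
T, eps = 100_000, 0.1
x = rng.integers(0, 2, T)
noise = rng.random(T) < eps
y = np.where(noise, rng.integers(0, 2, T), np.roll(x, 1))
print(transfer_entropy(x, y))   # > 0: information flows from x to y
print(transfer_entropy(y, x))   # ~ 0: x ignores y
```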

A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, properties that describe and compare the dynamics of systems as diverse as social interactions, brain networks, financial markets, and biomedicine. Each possible combination of rules of dynamics and topology of interactions, however disparate its semantics, could be translated into the language of information processing, which in turn would provide a lingua franca for complex systems.

Organizers

Past Editions

 IPCS12 Brussels, Belgium (2012)
 IPCS13 Barcelona, Spain (2013)
 IPCS14 Lucca, Italy (2014)
 IPCS15 Tempe, Arizona (2015)
 IPCS16 Amsterdam, Netherlands (2016)
 IPCS17 Cancun, Mexico (2017)
 IPCS18 Thessaloniki, Greece (2018)
 IPCS19 Singapore (2019)