Venue
Co-located with CCS 2018.
Vellidio Convention Center
Leof. Stratou 3, Thessaloniki 546 39, Greece
7 Jun Call for abstracts
26 Jun Deadline for abstract submission
28 Jun Notification of acceptance
30 Jun CCS early-bird registration deadline
29 Jul Notification of acceptance (only for submissions received after 30 June)
15 Aug CCS standard registration deadline
26 Sep Satellite event
Only contributions submitted through the EasyChair system will be considered.
Authors of accepted contributions should register for the main conference.
UC Davis
UC Davis
Griffith University
Samir Suweis
Recently, evidence has been mounting that biological systems might operate at the borderline between order and disorder, i.e., near a critical point. In this talk we will present a general mathematical framework for understanding this common pattern, explaining the possible origin and role of criticality in living adaptive and evolutionary systems. We rationalize this apparently ubiquitous criticality in terms of adaptive and evolutionary functional advantages. We provide a theoretical framework, based on information theory and genetic algorithms, which shows that the optimal response to broadly different changing environments occurs in systems that organize spontaneously to the vicinity of a critical point. In particular, criticality turns out to be the evolutionarily stable outcome of a community of individuals aimed at communicating with each other to create a collective entity.
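As a rough intuition for why sensitivity favors criticality, consider the following toy sketch (an editorial illustration, not the talk's model; the Curie-Weiss system and all parameters are our own choices): for a Boltzmann distribution, the Fisher information with respect to inverse temperature equals the energy variance, which peaks near the critical coupling of a fully connected Ising system and sharpens with system size.

```python
import numpy as np
from math import comb

# Toy sketch (not the talk's model): in a fully connected (Curie-Weiss) Ising
# system, the Fisher information of the Boltzmann distribution with respect to
# inverse temperature equals Var(E).  It peaks near the mean-field critical
# coupling beta*J = 1, so a system tuned for maximal sensitivity to its
# environment sits near criticality.

def energy_variance(N, J, beta):
    M = 2 * np.arange(N + 1) - N                 # magnetization with k up-spins
    E = -J / (2 * N) * (M**2 - N)                # energy of each magnetization sector
    logw = np.array([np.log(comb(N, k)) for k in range(N + 1)]) - beta * E
    w = np.exp(logw - logw.max())                # numerically stable Boltzmann weights
    p = w / w.sum()
    mean_E = (p * E).sum()
    return (p * E**2).sum() - mean_E**2

J = 1.0
betas = np.linspace(0.2, 2.0, 181)
for N in (16, 64, 256):
    fisher = [energy_variance(N, J, b) for b in betas]
    print(f"N={N:4d}: Fisher information peaks at beta*J = "
          f"{betas[int(np.argmax(fisher))]:.2f}")
```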
Carl Henning Reschke
This paper argues that parallel concepts and models from economics, complexity theory, and developmental biology, which explain the buildup of complex systems, should be applied in the analysis and explanation of complex phenomena. They hint at modular, hierarchical structures necessary for the coordination and control of information processing, such that layered, modular outcomes can arise. The growing complexity and relative transaction costs of information processing put pressure on 'systems' to organize interactions in aggregated, modular, hierarchically organized function complexes. However, in the social arena, the lowered transaction costs of information and communication technologies allow for the organization of information processing outside of the traditional organization form, the firm. Nevertheless, large-scale web-based forms of interaction also require routinized rules and procedures based on norms and perpetuated as 'traditions' in hierarchical structures. The concepts and examples are used to advance the view that there is a formal framework that can help in designing 'principles' governing construction rules in the 'computational' social and biological aggregation processes giving rise to complex organizational forms. In line with this reasoning, we will argue for an approach to complex organizations based on an evolutionary model developed in early evo-devo biology by Rupert Riedl (1978), which stresses the importance of the modular and hierarchical organization of information, on Robert Coase's (1937) transaction cost approach to organizations, and on Ernst Mach's process and functional-relations view of natural science (Mach 1908), formalizing the blind watchmaker argument made by Herbert Simon (Simon 1962, 1970).
Farouk Bonilla, Shahram Sarkani and Thomas Mazzuchi
Cancelled
Paul Riechers
Can model-free signal-analysis methods detect information processing and thus complex structure in the world around us?
As an obvious candidate, power spectra are a common and convenient way to analyze signals in many disciplines of science and engineering. Their structure not only shows the prominence of various signal frequencies but also hints at mechanisms of correlation, resonance, and broader behavior. However, here we show that power spectra can nevertheless hide all structure of arbitrarily complex processes, conveying only a flat power spectrum, the renowned signature of structureless white noise. Indeed, we argue that the most insightful signals from complex systems will have large beyond-pairwise correlations that evade power spectra. To offer more than a word of warning, we give three more constructive results:
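The warning can be demonstrated in a few lines. In the Python sketch below (our toy process, not one from the talk), a sequence built from triples (a, b, a·b) of random signs has all pairwise correlations equal to zero, so its power spectrum is as flat as that of white noise, yet a three-point statistic exposes its hidden structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy process (editorial example): emit iid signs a, b followed by their
# product a*b.  Every pair of symbols is uncorrelated, so the power spectrum
# is flat -- yet the three-point statistic s[3k]*s[3k+1]*s[3k+2] equals +1
# deterministically, revealing structure that the spectrum cannot see.

n_triples = 100_000
a = rng.choice([-1, 1], n_triples)
b = rng.choice([-1, 1], n_triples)
s = np.stack([a, b, a * b], axis=1).ravel().astype(float)

noise = rng.choice([-1.0, 1.0], s.size)        # genuine white noise, for contrast

def mean_spectrum(x, n_seg=100):
    segs = x.reshape(n_seg, -1)                # average periodogram over segments
    return np.mean(np.abs(np.fft.rfft(segs, axis=1))**2, axis=0) / segs.shape[1]

for name, x in [("structured", s), ("white noise", noise)]:
    spec = mean_spectrum(x)[1:]                # drop the DC bin
    triple = np.mean(x[0::3] * x[1::3] * x[2::3])
    print(f"{name:11s}: spectrum mean={spec.mean():.3f} "
          f"std={spec.std():.3f}  three-point={triple:+.3f}")
```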
Kavan Modi
TBD
Kristian Lindgren
We take the perspective that the underlying processes that determine the time evolution of physical systems can be viewed as reversible computational processes. The reversibility is of course consistent with both classical and quantum mechanical time evolution. This implies that an information perspective can be used in order to understand some phenomena in nature. The reversibility that is present on the microscopic level implies that there is an information quantity that is conserved in the time evolution. There is an apparent contradiction between information conservation on the microscopic level and two fundamental phenomena in physics: the entropy increase towards equilibrium and the measurement process in quantum mechanics selecting one eigenstate out of several possibilities. These are phenomena in which information is destroyed and created, respectively. The information processing perspective guides us to search for explanatory mechanisms that take the lost information and the added information, respectively, explicitly into account. Such a perspective has the potential to demystify and clarify the second law of thermodynamics and the quantum measurement process.
In a closed thermodynamic system, entropy increases and information is destroyed as the system approaches equilibrium. Still, a microscopically conserved information quantity may be consistent with an approach towards equilibrium if some of that information is dissipated over increasing distances in a way that makes it inaccessible to physical processes. This was recently illustrated and mathematically explored in a dynamic spin model.
In this talk I will focus on how a mechanism could work that explains how a definite measurement can result from the interaction between a smaller quantum system and a larger measurement device. The process turns out to be a bifurcation process that is driven by detailed information in the state of the measurement device. With this approach, information is indeed conserved, no measurement postulate is needed, and parallel-world interpretations are unnecessary.
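A minimal way to see how exactly reversible microscopic dynamics can coexist with growing local entropy is a second-order reversible cellular automaton. The Python sketch below is our illustration (not the spin model mentioned in the abstract): the update is exactly invertible, so microscopic information is conserved, yet the empirical block entropy of the configuration grows from a low-entropy seed.

```python
import numpy as np

# Editorial sketch: a second-order reversible cellular automaton.  The update
#   x[t+1] = f(x[t]) XOR x[t-1]
# is exactly invertible for any local rule f, conserving microscopic
# information, while coarse (block-entropy) disorder still grows.

def step(curr, prev):
    f = np.roll(curr, 1) ^ np.roll(curr, -1)   # local rule: XOR of the two neighbors
    return f ^ prev                            # second-order update => reversible

def block_entropy(x, k=8):
    """Shannon entropy (bits) of the empirical distribution of length-k blocks."""
    words = sum(np.roll(x, -j) << j for j in range(k))
    _, counts = np.unique(words, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

N, T = 2048, 200
prev = np.zeros(N, dtype=np.int64)             # x[0]: all zeros
curr = np.zeros(N, dtype=np.int64)
curr[N // 2] = 1                               # x[1]: a single seed (low entropy)
x0, x1 = prev.copy(), curr.copy()

for t in range(T):
    prev, curr = curr, step(curr, prev)
    if (t + 1) % 50 == 0:
        print(f"t={t+1:3d}: block entropy = {block_entropy(curr):.2f} bits")

# Reversibility: running the same rule on the time-reversed pair recovers
# the initial condition exactly, so no microscopic information was lost.
u, v = curr, prev
for _ in range(T):
    u, v = v, step(v, u)
assert np.array_equal(u, x1) and np.array_equal(v, x0)
print("time-reversed evolution recovers the initial condition")
```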
Nora Tischler
Technologies that exploit quantum mechanical effects promise to enhance applications in a number of different areas. One celebrated example is the speed-up in factoring numbers that can be obtained through Shor’s algorithm. Recently, a new task has emerged for which quantum information science provides an advantage: the simulation of stochastic, i.e. partially random, processes.
Stochastic process models are used to describe a wide range of natural and social phenomena, including, for example, the weather and the stock market. The simulation of such processes provides valuable information about the dynamics of complex systems. However, for highly complex processes, a large amount of information about the system's past needs to be stored in order to simulate its future, a quantity formally measured by the process's statistical complexity. This translates to a large memory requirement, which may limit the feasibility of such a simulation. Here, quantum mechanics promises an advantage: simulators based on quantum information processing can outperform classical simulators by reducing the memory requirements below the ultimate classical limits.
The first physical demonstrations of this recently proposed quantum memory advantage have been achieved in the Quantum Optics and Information Laboratory at Griffith University, Australia. In a series of four experiments, we have, among other things, explored how the relative simplicity of two stochastic processes can depend on the classical or quantum nature of the information processing, and have demonstrated that quantum resources allow storing information about the past of the process in a lower-dimensional memory system. Our experiments employ single particles of light as the information carrier. In this talk, I will introduce quantum information processing with light and provide an overview of these physics experiments.
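The construction behind this advantage can be made concrete with the textbook "perturbed coin" example. The Python sketch below is our illustration, not the experiments described above: the process and the choice p = 0.4 are ours, and the encoding follows the general recipe of Gu et al. (Nat. Commun. 2012) of mapping causal states to non-orthogonal quantum states.

```python
import numpy as np

# Editorial sketch of the "perturbed coin": a coin is flipped with probability
# p each step and its face is emitted.  A classical simulator must store the
# full causal state (1 bit); encoding the two causal states in non-orthogonal
# quantum states lowers the required memory below that classical limit.

def shannon(probs):
    probs = np.asarray(probs, dtype=float).ravel()
    probs = probs[probs > 0]
    return -(probs * np.log2(probs)).sum()

p = 0.4
# Classical statistical complexity: entropy of the stationary causal-state mix.
C_mu = shannon([0.5, 0.5])

# Quantum encoding of the two causal states (general recipe of Gu et al. 2012):
s0 = np.array([np.sqrt(1 - p), np.sqrt(p)])
s1 = np.array([np.sqrt(p), np.sqrt(1 - p)])
rho = 0.5 * (np.outer(s0, s0) + np.outer(s1, s1))
C_q = shannon(np.linalg.eigvalsh(rho))         # von Neumann entropy of the memory

print(f"classical memory C_mu = {C_mu:.3f} bits")
print(f"quantum   memory C_q  = {C_q:.3f} bits")  # < 1 bit for any 0 < p < 1/2
```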
Alec Boyd
Complex computations typically occur via the composition of modular units, such as the universal logic gates found in logical circuits. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex global information processing is more easily and flexibly implemented via a series of simpler, localized information processing operations that only control and change local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity: costs that arise directly from the operation of localized processing and that go beyond Landauer's dissipation bound for erasing information. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the amount of additional work required to perform the computational task modularly, measuring a structural energy cost. It determines the thermodynamic efficiency of different modular implementations of the same computation, and so it has immediate consequences for the architecture of physically embedded transducers, which are information processing agents. Constructively, we show how to circumvent modularity dissipation by designing agents that capture the information reservoir's global correlations and patterns. We prove that these agents, when acting as pattern generators or extractors, must match the complexity of their environment to minimize the modularity dissipation. Thus, there are routes to thermodynamic efficiency by optimizing the modular architecture of computations.
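To make the bound concrete, here is a minimal numerical sketch (our toy example, not the talk's derivation): erasing one bit of a correlated pair with a device that sees only that bit must pay for the marginal entropy H(X), while a correlation-aware global protocol pays only the conditional entropy H(X|Y); the gap, kT ln 2 · I(X;Y), lower-bounds the modularity dissipation.

```python
import numpy as np

# Editorial toy example: erase bit X of a correlated pair (X, Y).
# Local (modular) Landauer bound:   W >= kT ln2 * H(X)
# Global (correlation-aware) bound: W >= kT ln2 * H(X|Y)
# The unavoidable gap is kT ln2 * I(X;Y), the modularity dissipation.

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Joint distribution of two bits that agree 90% of the time (our choice).
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])   # rows: X, columns: Y

H_X  = H(joint.sum(axis=1))
H_Y  = H(joint.sum(axis=0))
H_XY = H(joint)
I_XY = H_X + H_Y - H_XY

kT_ln2 = 1.0                        # report work in units of kT ln 2
print(f"local  (modular) bound:     {kT_ln2 * H_X:.3f}")          # H(X)
print(f"global (correlated) bound:  {kT_ln2 * (H_XY - H_Y):.3f}")  # H(X|Y)
print(f"modularity dissipation  >=  {kT_ln2 * I_XY:.3f}")          # I(X;Y)
```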
Andrew Garner
TBD
The Information Processing in Complex Systems (IPCS) satellite meeting has been organized since 2012 during the Conference on Complex Systems.
Our goal is to provide a forum for researchers who follow an information-theoretic approach for the analysis of complex systems.
Here they can present recent achievements and discuss promising hypotheses and further research directions, combining both classical and quantum information approaches.
All systems in nature can be considered from the perspective that they process information.
Information is registered in the state of a system and its elements, implicitly and invisibly.
As elements interact, information is transferred. Indeed, bits of information about the state of one element will travel, imperfectly, to the state of the other element, forming its new state.
This storage and transfer of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise.
From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions.
Mapping out exactly how these bits of information percolate through the system reveals fundamental insights into how the parts orchestrate to produce the properties of the system.
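One concrete way to map such transfer is transfer entropy. The sketch below is a minimal editorial example (the coupled processes and noise level are our own choices): it estimates how many bits flow from a driving process Y into X, and confirms that essentially none flow back.

```python
import numpy as np

# Editorial example: transfer entropy counts the bits of the next state of X
# explained by Y's current state beyond what X's own past already stores.

rng = np.random.default_rng(1)
T = 200_000
y = rng.integers(0, 2, T)                          # driver process: iid coin flips
noisy = rng.random(T) < 0.1                        # 10% of updates are corrupted
x = np.empty(T, dtype=np.int64)
x[0] = 0
x[1:] = np.where(noisy[1:], rng.integers(0, 2, T - 1), y[:-1])  # X copies Y, imperfectly

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def transfer_entropy(src, dst):
    """Estimate T_{src->dst} = I(dst_next; src_prev | dst_prev) from counts."""
    trip = dst[1:] * 4 + src[:-1] * 2 + dst[:-1]
    p = (np.bincount(trip, minlength=8) / trip.size).reshape(2, 2, 2)
    # I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
    return (entropy(p.sum(axis=1)) + entropy(p.sum(axis=0))
            - entropy(p) - entropy(p.sum(axis=(0, 1))))

print(f"T(Y->X) = {transfer_entropy(y, x):.3f} bits")  # substantial: Y drives X
print(f"T(X->Y) = {transfer_entropy(x, y):.3f} bits")  # ~0: no influence back
```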
A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interactions to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, could be translated into the language of information processing, which in turn would provide a lingua franca for complex systems.
Assistant Professor in the Complexity Institute at the School of Physical and Mathematical Sciences, Nanyang Technological University