One of the most important questions in neuroscience is how the orchestrated activity of multiple brain areas gives rise to brain function: how do the coordinated neural dynamics of millions of neurons produce cognitive function?
Our work at the Information Theory & Complex Systems team focuses on theoretical foundations of the emergence of multivariate complex phenomena.
Aiming for a deeper understanding of neuronal interdependencies and their dynamics, our team's research builds on differential-geometric methods as well as stochastic thermodynamics.

#Information Theory
#Information Geometry


(1) A Generalized Maximum Entropy Principle (MEP)

In the search for a statistical description of a complex system for which complete knowledge is unavailable, such as the brain, a deeper grasp of the maximum entropy principle (MEP) is essential. The MEP is a fundamental framework at the core of complexity science, bridging statistical mechanics, complex systems analysis, and Bayesian statistics. Unfortunately, the functional form of Shannon's entropy restricts the possible outcomes of the MEP, and hence extensions of it are highly desirable. Rooted in the differential-geometric representation of information theory, we have put forward an extension of the MEP based on the fundamental properties of curved statistical manifolds [1]. In this way, we highlighted the special geometrical properties of the Rényi entropy that make it stand apart from other measures of entropy, setting solid mathematical foundations for the numerous recent applications of this entropy and its associated divergence.
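As a minimal illustration of why the Rényi entropy is a natural one-parameter extension of Shannon's, the sketch below (an assumption-laden toy, not the method of [1]) computes both for a discrete distribution and checks that the Rényi entropy recovers the Shannon entropy in the limit α → 1:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 log 0 is taken as 0
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha).

    For alpha -> 1 this reduces to the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(shannon_entropy(p))          # ~1.2130 nats
print(renyi_entropy(p, 0.999))     # approaches the Shannon value
print(renyi_entropy(p, 2.0))       # collision entropy, smaller than H(p)
```

The family is monotonically non-increasing in α, so the α = 2 (collision) entropy bounds the Shannon entropy from below; different α values weight common versus rare events differently, which is one reason the Rényi family appears across so many applications.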

(2) Toward a new understanding of consciousness based on information theory

Information processing in neural systems can be described and analyzed at multiple spatiotemporal scales. However, only information processed at specific scales appears to be available for conscious awareness. We do not have direct experience of information at the scale of individual neurons, which is noisy and highly stochastic. Neither do we have direct experience of more macro-scale interactions, such as interpersonal communications. To solve this mysterious scale problem of consciousness, we proposed a new theory of consciousness based on information theory: the Information Closure Theory of Consciousness (ICT) [2]. With parsimonious definitions and a hypothesis, ICT provides explanations and predictions of various phenomena associated with consciousness. ICT further ties together a number of scientific theories of consciousness. Most importantly, ICT demonstrates that information can be the common language between consciousness and physical reality.
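The notion of a scale being informationally closed can be made concrete in a toy setting. The sketch below is a hypothetical illustration (not the formalism of [2], and all names are our own): a 4-state micro Markov chain is coarse-grained into a 2-state macro variable; because micro states within each macro block share identical transition rows (lumpability), knowing the micro state adds no predictive information about the macro future, i.e. I(Y_{t+1}; X_t | Y_t) = 0:

```python
import numpy as np

# Micro transition matrix over 4 states. Rows within each macro block
# {0,1} and {2,3} are identical, so the coarse-graining is lumpable.
T = np.array([
    [0.6, 0.2, 0.1, 0.1],
    [0.6, 0.2, 0.1, 0.1],
    [0.1, 0.1, 0.4, 0.4],
    [0.1, 0.1, 0.4, 0.4],
])
macro = np.array([0, 0, 1, 1])  # coarse-graining map X -> Y

# Stationary distribution pi of the micro chain (left eigenvector of T)
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# P(Y_{t+1}=y | X_t=x): sum micro transition probabilities per macro block
P_y_given_x = np.stack([T[:, macro == y].sum(axis=1) for y in (0, 1)], axis=1)

def cond_entropy(rows, weights):
    """H(Y'|Z) from conditional rows P(Y'|Z=z) and weights P(Z=z)."""
    H = 0.0
    for w, row in zip(weights, rows):
        nz = row[row > 0]
        H -= w * np.sum(nz * np.log(nz))
    return H

# Predict the macro future from the micro state: H(Y_{t+1} | X_t)
H_given_micro = cond_entropy(P_y_given_x, pi)

# Predict the macro future from the macro state alone: H(Y_{t+1} | Y_t)
pi_y = np.array([pi[macro == y].sum() for y in (0, 1)])
P_y_given_y = np.stack([
    pi[macro == y] @ P_y_given_x[macro == y] / pi_y[y] for y in (0, 1)
])
H_given_macro = cond_entropy(P_y_given_y, pi_y)

# I(Y_{t+1}; X_t | Y_t) = H(Y'|Y) - H(Y'|X) vanishes at a closed level
print(H_given_macro - H_given_micro)  # ~0.0
```

In this toy, the macro level is self-predicting: conditioning on the fine-grained state buys nothing beyond the coarse-grained one, which is the kind of non-trivial closure ICT associates with the scale of conscious experience.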


Pablo Morales, Ph.D.
Chief Researcher
Pablo is a Chief Researcher at Araya. He received his Ph.D. in Theoretical Particle Physics at The University of Tokyo in 2018. He is a former Japan Society for the Promotion of Science (JSPS) fellow and MEXT scholar. His current research interests include information geometry and complex systems applied to theoretical neuroscience.
Shun-ichi Amari, Ph.D.
Research Advisor
Dr. Shun-ichi Amari is a mathematical neuroscientist. He is a professor emeritus at the University of Tokyo and an honorary science advisor at RIKEN. He initiated information geometry, which has been widely applied in fields such as statistics, signal processing, information theory, machine learning, and more. He is also one of the pioneers of the mathematical theory of neural networks. He is currently working on integrated information theory and the Wasserstein distance using information geometry.