
Publication

Equivalence of information production and generalised entropies in complex processes

Complex systems with strong correlations and fat-tailed distribution functions have been argued to be incompatible with the Boltzmann-Gibbs entropy framework, and alternatives, the so-called generalised entropies, have been proposed and studied.

Here we show that this perceived incompatibility is in fact a misconception. For a broad class of processes, Boltzmann entropy, the logarithm of the multiplicity, remains the valid entropy concept. However, for non-i.i.d. processes, Boltzmann entropy is not of the Shannon form, $-k\sum_i p_i \log p_i$, but takes the shape of generalised entropies.
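To see why the Shannon form is tied to the i.i.d. case, recall the textbook Stirling argument (a standard sketch, not reproduced from the paper): for an i.i.d. process with histogram $(k_1,\dots,k_W)$ over $W$ states and $N=\sum_i k_i$ samples, the per-sample log multiplicity converges to the Shannon entropy.

```latex
% Multinomial multiplicity of a histogram (k_1, ..., k_W), with N = k_1 + ... + k_W:
%   M = N! / (k_1! \cdots k_W!)
% Applying Stirling's approximation, log k! ~ k log k - k:
\frac{1}{N}\log M
  = \frac{1}{N}\Bigl(\log N! - \sum_{i=1}^{W}\log k_i!\Bigr)
  \;\xrightarrow[N\to\infty]{}\;
  -\sum_{i=1}^{W} p_i \log p_i ,
\qquad p_i = \frac{k_i}{N}.
```

For correlated processes the multiplicity is no longer multinomial, so the same limit yields a different, generalised functional.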

We derive this result for all processes that can be mapped asymptotically and reversibly to adjoint representations in which the processes are i.i.d. In these representations the information production is given by the Shannon entropy; over the original sampling space, this yields functionals identical to generalised entropies.
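As a rough illustration of the idea, consider a toy correlated process in which each i.i.d. draw is repeated r times. The sketch below is a minimal caricature of a reversible mapping to an i.i.d. representation, with all names and parameters hypothetical; it is not the paper's general construction. It shows how the naive Shannon functional over the original sampling space misjudges the information production, while the per-symbol rate computed in the i.i.d. representation does not.

```python
import math
import random

def correlated_sequence(p, r, n_blocks, rng):
    """Toy correlated process: draw i.i.d. symbols from p, repeat each r times."""
    draws = rng.choices(range(len(p)), weights=p, k=n_blocks)
    return [s for s in draws for _ in range(r)]

def adjoint_representation(seq, r):
    """Reversible map: keep one symbol per block; the resulting sequence is i.i.d."""
    return seq[::r]

def shannon_entropy(seq):
    """Empirical Shannon entropy -sum_i p_i log p_i of a symbol sequence."""
    n = len(seq)
    counts = {s: seq.count(s) for s in set(seq)}
    return -sum((k / n) * math.log(k / n) for k in counts.values())

rng = random.Random(42)
p, r = [0.7, 0.2, 0.1], 4               # hypothetical marginal and block length
seq = correlated_sequence(p, r, 50_000, rng)

# Naive Shannon functional over the original sampling space: it ignores the
# correlations and overstates the information produced per symbol.
print(shannon_entropy(seq))                                 # ~ H(p) = 0.802

# Information production computed in the adjoint (i.i.d.) representation,
# expressed per symbol of the original process: a different functional of p.
print(shannon_entropy(adjoint_representation(seq, r)) / r)  # ~ H(p)/r = 0.200
```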

The problem of constructing adequate, context-sensitive entropy functionals can therefore be translated into the much simpler problem of finding adjoint representations. The method provides a comprehensive framework for a statistical physics of strongly correlated systems and complex processes.

R. Hanel, S. Thurner, Equivalence of information production and generalised entropies in complex processes, PLOS ONE 18(9) (2023) e0290695.
