List of accepted submissions

  • Open access
  • 136 Reads
Clausius Relation for Active Particles

Many kinds of active particles, such as bacteria or active colloids, move in a thermostatted fluid by means of self-propulsion. Energy injected by such a non-equilibrium force is eventually dissipated as heat in the thermostat. Since thermal fluctuations are much faster and weaker than self-propulsion forces, they are often neglected, blurring the identification of dissipated heat in theoretical models. For the same reason, some freedom -- or arbitrariness -- appears when defining entropy production. Recently, three different recipes for defining heat and entropy production have been proposed for the same model, in which the role of self-propulsion is played by a Gaussian coloured noise. Here we compare these proposals and discuss their relation and physical meaning. One of them takes into account the heat exchanged with a non-equilibrium active bath: such an “active heat” satisfies the original Clausius relation and can be experimentally verified.
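
For context, a minimal sketch (assuming an active Ornstein-Uhlenbeck particle in a harmonic trap, one common reading of "self-propulsion played by a Gaussian coloured noise"; all names and parameter values are illustrative, not the abstract's actual model):

```python
import numpy as np

# Sketch of an active Ornstein-Uhlenbeck particle in a harmonic trap
# U(x) = k*x**2/2 (an assumption; the abstract does not fix the model).
# The self-propulsion v is Gaussian coloured noise with correlation
# time tau; all parameter values are illustrative.
rng = np.random.default_rng(0)
k, tau, D, dt, steps = 1.0, 1.0, 1.0, 1e-3, 100_000

x, v = 0.0, 0.0
injected = 0.0                      # work done on x by the active force
for _ in range(steps):
    dx = (-k * x + v) * dt          # overdamped dynamics, unit mobility
    injected += v * dx              # active-work increment v * dx
    x += dx
    v += -v / tau * dt + np.sqrt(2.0 * D * dt) / tau * rng.standard_normal()

# In steady state the injected power is dissipated as heat in the
# thermostat; the competing recipes differ in how this heat enters
# the entropy production.
print("mean injected power:", injected / (steps * dt))
```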

  • Open access
  • 238 Reads
Theory and Practice of Permutation Entropy

Permutation entropy is a relatively new and promising concept for measuring the complexity of time series and of the systems behind them, with applications in various fields. The idea of permutation entropy is to quantify the amount of up and down in a time series on the basis of the distribution of ordinal patterns in it. Although metric features are neglected to a great extent, this approach often preserves more information than expected at first glance.
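
To make the idea concrete, here is a minimal sketch of the standard construction (not from the talk; the function name and defaults are illustrative):

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalised permutation entropy (a minimal sketch).

    Each window of `order` consecutive values is mapped to its ordinal
    pattern (the permutation that sorts it); the Shannon entropy of the
    pattern histogram is normalised by log2(order!), so 0 indicates a
    fully regular series and 1 a maximally irregular one.
    """
    patterns = Counter(
        tuple(sorted(range(order), key=lambda i: series[t + i]))
        for t in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum(c / total * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(order))

print(permutation_entropy([math.sin(0.1 * t) for t in range(1000)]))  # low
print(permutation_entropy([random.random() for _ in range(1000)]))    # near 1
```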

The aim of the talk is to explain why permutation entropy is interesting for complex data analysis and to discuss perspectives, challenges and limits of its application. Our discussion emphasizes the viewpoint of symbolic dynamics.

  • Open access
  • 149 Reads
Pointwise Information Decomposition Using the Specificity and Ambiguity Lattices

Multivariate information theory has long been problematic. Recently, the partial information decomposition (PID) of Williams and Beer has provided a promising axiomatic framework which clarifies the general structure of multivariate information. However, PID lacks the necessary measure of redundant information required to complete the framework; despite much recent research, no well-accepted measure of redundant information has emerged that is applicable to more than two sources and respects the locality of information. In this paper, we introduce a new framework based upon the axiomatic approach taken in PID but which aims to decompose multivariate information on a local or pointwise scale. It is shown that in order to identify when information from two sources is indeed the same information, one must consider decomposing the local mutual information into its two directed components, the specificity and the ambiguity. Based upon the axiomatic approach taken in PID, we decompose these two components separately, resulting in two lattices -- the specificity and ambiguity lattices. This Specificity and Ambiguity decomposition retains the appealing multivariate structure provided by PID, but applying this notion on a much more granular level enables the decomposition to identify when information is the same information. This last point is justified by providing an operational interpretation of redundancy in terms of Kelly gambling. Applying the decomposition to canonical examples from the PID literature demonstrates the unique ability to provide a pointwise decomposition, and the fact that the Specificity and Ambiguity decomposition possesses the much sought-after target chain rule. Finally, interpreting these results sheds light on why defining a redundancy measure for PID has proven to be so difficult -- one lattice is not enough.
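
The full lattice construction is not reproduced here, but the split the abstract refers to is standard information theory: the local (pointwise) mutual information i(s;t) = h(s) - h(s|t) separates into the specificity h(s) and the ambiguity h(s|t). A minimal sketch with an illustrative toy distribution:

```python
import math

# Split of local mutual information into its two directed parts:
# specificity h(s) = -log2 p(s) and ambiguity h(s|t) = -log2 p(s|t),
# with i(s;t) = h(s) - h(s|t). The toy distribution below is purely
# illustrative; the paper's lattice construction is not reproduced.
def pointwise_components(p_joint, p_s, p_t, s, t):
    specificity = -math.log2(p_s[s])
    ambiguity = -math.log2(p_joint[(s, t)] / p_t[t])
    return specificity, ambiguity, specificity - ambiguity

p_joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_s = {0: 0.5, 1: 0.5}
p_t = {0: 0.5, 1: 0.5}
print(pointwise_components(p_joint, p_s, p_t, 0, 0))
# -> specificity 1.0 bit, ambiguity ~0.32 bit, local MI ~0.68 bit
```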

  • Open access
  • 224 Reads
On the roles of energy and entropy in thermodynamics by Ingo Müller and Wolf Weiss

The first dependable thermodynamic laws -- in the early 19th century -- were those of Fourier and Navier-Stokes for heat conduction and viscous friction. They were purely phenomenological and did not require the understanding of the nature of heat, let alone the concepts of energy and entropy.

Those concepts emerged in the works of Mayer, Joule, Helmholtz, Clausius and Boltzmann in the latter half of the 19th century and -- apart from making energy conversion more efficient -- they had a deep impact on natural philosophy: It was recognized that nature is governed by a universal conflict between determinacy, which forces energy into a minimum, and stochasticity, by which entropy tends to a maximum. The outcome of the competition is determined by temperature, so that minimal energy requires cold systems and maximal entropy occurs in hot systems. Thus phenomena like phase transitions, planetary atmospheres, osmosis, phase diagrams and chemical reactions were all understood in terms of the universal competition of energy and entropy. Mixtures, alloys and solutions were systematically incorporated into thermodynamics by Gibbs. And so the early 20th century saw the development of a second vast field of industrial application -- apart from energy conversion -- the rectification of naturally occurring mixtures like natural gas and mineral oil.

Through the work of Eckart, the phenomenological laws of Fourier, Navier-Stokes and Fick -- the latter for diffusion -- received a systematic derivation in terms of the Gibbs equation for the entropy. Thus equilibrium thermodynamics and the thermodynamics of irreversible processes had come to a kind of conclusion by the mid 20th century. At the same time the limits of those fields were recognized: they required fairly dense matter, which means that they could not describe rapidly changing processes or processes with steep gradients.

Some wishful thinking occurred concerning stationary processes: the hypothesis of Onsager about the mean regression of fluctuations, and Prigogine's principle of minimal entropy production.

Boltzmann's kinetic theory of gases provides macroscopic balance laws in its hierarchy of moments of the distribution function. These have furnished the blueprint for extended thermodynamics, which is capable of describing rapid changes and steep gradients if enough moments are taken into account. The theory assumes that all field equations are balance laws with local and instantaneous constitutive relations for fluxes and productions. From this follows a system of quasi-linear field equations, and the entropy principle ensures that the system is symmetric hyperbolic if written in the proper variables. That property guarantees

  • existence and uniqueness of solutions,
  • continuous dependence of solutions on the initial data and
  • finite speeds of characteristic waves.

That is to say: The entropy principle ensures well-posedness of initial value problems.
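
In symbols, a schematic sketch in standard extended-thermodynamics notation (an assumption for illustration; the abstract itself displays no formulas):

```latex
% Schematic sketch, standard extended-thermodynamics notation (an
% assumption; the abstract displays no formulas). The field equations
% are a hierarchy of balance laws for densities F^0_A with fluxes F^i_A
% and productions P_A,
\begin{equation}
  \partial_t F^0_A(\mathbf{u}) + \partial_i F^i_A(\mathbf{u}) = P_A(\mathbf{u}),
\end{equation}
% supplemented by an entropy balance with non-negative production,
\begin{equation}
  \partial_t h^0(\mathbf{u}) + \partial_i h^i(\mathbf{u}) = \Sigma(\mathbf{u}) \ge 0.
\end{equation}
% The entropy principle singles out a "main field" of variables in which
% the quasi-linear system becomes symmetric hyperbolic, yielding the
% well-posedness properties listed above.
```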

Extended thermodynamics permits the calculation of light scattering spectra down to very rarefied gases. Moreover, it offers a deep insight into the structure of shocks and, above all, it reveals the very narrow limitations of linear irreversible thermodynamics based on the laws of Fourier, Fick and Navier-Stokes. Thus, for instance, extended thermodynamics shows that a gas in the gap between two cylinders cannot rotate rigidly if heat conduction occurs between the inner and outer surface. Moreover, the temperature field in the gap does not obey the Fourier law, even if there is no rotation.

  • Open access
  • 129 Reads
The role of information in complex systems -- Self-organisation in stem cells and glass formers

Shannon entropy is a measure of disorder. It is a function of a probability distribution without having any physical meaning by itself. And yet, it can give insight into physical processes, in particular when a system is undergoing change. The process of self-organisation is ubiquitous in complex systems. I will give two examples of natural systems which undergo self-organisation: a glass forming material and a differentiating stem cell. The point of transition in each case is detected with tools from information theory.

I will begin by highlighting the importance of disorder and order in the natural dynamics of complex systems and give the basics of information theory needed to follow this talk. I will then briefly explain the experimental set-up for each system before discussing the information-theoretic analysis of the data and the role of information in the transition from disorder to order.
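
As a flavour of the kind of estimate involved, a minimal sketch (not from the talk) of plug-in Shannon entropy and mutual information from empirical counts:

```python
import math
from collections import Counter

# Plug-in estimates of Shannon entropy and mutual information from
# paired observations -- the kind of quantity used to detect a
# transition from disorder to order in data. Toy data only.
def entropy(samples):
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

xs = [0, 0, 1, 1, 0, 1, 0, 1]
ys = [0, 0, 1, 1, 0, 1, 1, 0]   # partially correlated with xs
print(mutual_information(xs, ys))  # > 0 bits of shared information
```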

The talk is based on the following papers:
1. Ladyman, J., Lambert, J. & Wiesner, K. What is a complex system? European Journal for Philosophy of Science 3, 33–67 (2013), also available at http://philsci-archive.pitt.edu/8496/
2. Dunleavy, A. J., Wiesner, K., Yamamoto, R. & Royall, C. P. Mutual information reveals multiple structural relaxation mechanisms in a model glass former. Nat Commun 6 (2015).
3. Wiesner, K., Teles, J., Hartnor, M. & Petersen, C. Hematopoietic stem cells -- Entropic landscape of differentiation. arXiv:q-bio (November 2017).

  • Open access
  • 168 Reads
Relating Fisher information and thermodynamic cost of near-equilibrium computation

A recently developed framework of information dynamics systematically studies the phenomenon of computation and information processing in complex systems, relating it to critical phenomena, e.g., phase transitions and the edge of chaos. It has been conjectured that at the edge of chaos the distributed computation exhibits a high level of complexity. However, it remains unclear how such complexity is related to physical fluxes which are observed and studied during phase transitions. We consider several examples of near-equilibrium computation (e.g., random Boolean networks and collective motion) as thermodynamic phenomena.  Specifically, we consider the dynamical model of collective motion which undergoes a kinetic phase transition over parameters that control the particles’ alignment: from a “disordered motion” phase, in which particles keep changing direction but occupy a fairly stable collective space, to a “coherent motion” phase, in which particles cohesively move towards a common direction. The control parameters that we consider are the alignment strength among particles and the number of nearest neighbours affecting a particle’s alignment.  This analysis allows us to contrast Fisher information with the curvature of the system's entropy. During the phase transition, where the configuration entropy of the system decreases, the sensitivity of the distributed computation diverges. Overall, the comparison highlights the balancing role of the sensitivity and the uncertainty of computation when the system fluctuates near equilibrium, quantifying the thermodynamic cost of this computation.
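
As a hedged illustration of the central quantity (using a toy family rather than the paper's collective-motion model), the Fisher information of a parameterised distribution can be estimated by finite differences over the control parameter:

```python
import numpy as np
from math import comb, exp

# Fisher information of a parameterised distribution p(x | theta),
# estimated with a central finite difference in the control parameter.
# The binomial toy family below is an assumption for illustration; in
# the collective-motion setting x would be a binned order parameter and
# theta the alignment strength.
def fisher_information(p_of_theta, theta, eps=1e-5):
    p = p_of_theta(theta)
    dp = (p_of_theta(theta + eps) - p_of_theta(theta - eps)) / (2 * eps)
    return np.sum(dp ** 2 / p)

def toy_distribution(theta, n=20):
    """Binomial over 0..n with success probability sigmoid(theta)."""
    q = 1.0 / (1.0 + exp(-theta))
    k = np.arange(n + 1)
    binom = np.array([comb(n, int(i)) for i in k], dtype=float)
    return binom * q ** k * (1.0 - q) ** (n - k)

# For this family the exact value is n*q*(1-q), about 4.70 at theta = 0.5.
print(fisher_information(toy_distribution, theta=0.5))
```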

  • Open access
  • 198 Reads
Using Entropy to Unify Mechanics and Thermodynamics

The field of classical mechanics is based on Sir Isaac Newton’s work in “The Principia,” published in 1687. In this work, Newton introduced the world to three universal laws of motion, which describe the relationships between any object, the forces acting upon it, and the object’s resulting motion. These three laws make up the foundation of classical mechanics, and all subsequent theories of mechanics are derived from them. But Newtonian mechanics still cannot account for the past, present or future of any aspect of a physical body or its governing equations.

Around 1850, Rudolf Clausius and William Thomson (Kelvin) formulated both the First and Second Laws of Thermodynamics. Because the field of thermodynamics governs the past, present and future of all physical bodies, the aging process and life span of any physical body can be modeled in accordance with the laws of thermodynamics. Still, thermodynamics alone cannot convey the response of a physical body to an external force at any given moment -- something the equations of classical mechanics are able to achieve.

Being able to accurately predict the life span of physical bodies, both living and non-living, has been one of humankind’s eternal endeavors. Over the last 150 years, many unsuccessful attempts were made to unify the fields of classical mechanics and thermodynamics in order to create a generalized and consistent theory of the life-span evolution of inorganic and organic systems. The objective has been to map out the aging process of a physical body using classical mechanics equilibrium equations while also predicting its life span. Most past attempts were based solely on physical experiments, which would first reveal the aging rate and life span of a physical body; the experimental data were later used to create a life-span expectancy model by curve fitting.

Professor Basaran will report a new unified mechanics theory that can predict the aging and life span of any physical body based purely on mathematical calculations, without the need for prior life-span degradation testing or curve-fitted phenomenological damage mechanics models. The entropy generation rate is used as the metric for the aging process and as the link between Newtonian mechanics and thermodynamics.


  • Open access
  • 123 Reads
Pattern recognition in nuclear fusion data by means of geometric methods in probabilistic spaces

Statistics and machine learning algorithms increasingly have to be able to handle complex data, involving higher-level descriptors (features) of the characteristics of an object or system. Such descriptors are usually inspired by knowledge of the intrinsic structure of the object, e.g. a covariance matrix modeling variability and correlation between pixels in an image, or by physical understanding of the system, such as a probability distribution of a flow characteristic in a turbulent flow. In such cases, each data point does not simply represent a set of numbers (coordinates in a vector space), but has substructure of its own, representing more complex notions like a matrix, a probability distribution, a function, a shape, etc. In this talk, I will discuss several applications from the field of nuclear fusion plasma physics, wherein we characterize fluctuation and measurement uncertainty by probability distributions. We employ a metric on the Riemannian space of Gaussian probability distributions to discriminate between various types of plasma instabilities and classify them. Furthermore, we describe a new, very robust regression technique, called geodesic least squares regression, for estimating relations between plasma quantities that are affected by a considerable amount of fluctuation or measurement uncertainty.
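
As an illustration (the closed form below is standard, though the authors' own implementation is not shown in the abstract), the Fisher-Rao geodesic distance between two univariate Gaussians follows from the hyperbolic geometry of the Gaussian family:

```python
import math

# Fisher-Rao geodesic distance between univariate Gaussians
# N(mu1, sigma1^2) and N(mu2, sigma2^2). Under the Fisher metric
# ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2 the family is a (scaled)
# hyperbolic half-plane, giving the closed form below.
def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
    num = (mu1 - mu2) ** 2 / 2 + (sigma1 - sigma2) ** 2
    return math.sqrt(2) * math.acosh(1 + num / (2 * sigma1 * sigma2))

# Equal means, different spreads: the distributions are still far apart.
print(fisher_rao_distance(0.0, 1.0, 0.0, 3.0))
```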

  • Open access
  • 81 Reads
On a general definition of conditional Rényi entropies

In recent decades, different definitions of the conditional Rényi entropy (CRE) have been introduced. Thus, Arimoto proposed a definition that found an application in information theory, Jizba and Arimitsu proposed a definition that found an application in time series analysis, and Renner-Wolf, Hayashi and Cachin proposed definitions that are suitable for cryptographic applications. However, there is still no commonly accepted definition, nor a general treatment of CREs, which can essentially and intuitively be represented as the average uncertainty about a random variable X if a random variable Y is given. In this paper we fill the gap and propose a three-parameter CRE, which contains all of the previous definitions as special cases that can be obtained by a proper choice of the parameters. Moreover, it satisfies all of the properties that are simultaneously satisfied by the previous definitions, so that it can successfully be used in the aforementioned applications. Thus, we show that the proposed CRE is positive, continuous, symmetric, permutation invariant, equal to the Rényi entropy for independent X and Y, equal to zero for X = Y, and monotonic. In addition, as an example of further usage, we discuss the properties of the generalized mutual information, which is defined using the proposed CRE.
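
As one concrete special case (Arimoto's definition, which the abstract cites; the paper's three-parameter family is not reproduced here), a minimal sketch:

```python
import math

# Arimoto's conditional Renyi entropy H_alpha(X|Y), one of the special
# cases cited in the abstract (the paper's three-parameter family is
# not reproduced here). p_joint maps (x, y) to probability; alpha > 0,
# alpha != 1.
def arimoto_cre(p_joint, alpha):
    total = 0.0
    for y in {yy for (_, yy) in p_joint}:
        p_y = sum(p for (_, yy), p in p_joint.items() if yy == y)
        inner = sum((p / p_y) ** alpha
                    for (_, yy), p in p_joint.items() if yy == y)
        total += p_y * inner ** (1 / alpha)
    return alpha / (1 - alpha) * math.log2(total)

# X uniform and independent of Y: reduces to the Renyi entropy of X (1 bit).
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(arimoto_cre(p, alpha=2.0))
```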

  • Open access
  • 36 Reads
Characterization of Some Dynamic Network Models

Dynamic Random Network Models are presented as a mathematical framework for modelling and analyzing the time evolution of complex networks. Such a framework allows the analysis over time of several characterizing network features, such as link density, clustering coefficient and degree distribution, as well as entropy-based complexity measures, providing new insight into the evolution of random networks. Some simple dynamic models are analyzed with the aim of providing several basic reference evolution behaviors. Inference issues from real data are also discussed, together with simulation examples, to illustrate the applicability of the proposed framework.
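
As a toy illustration of the kind of analysis described (an assumed model, not one of the paper's), one can track an entropy-based measure, here the degree-distribution entropy, as a random graph evolves:

```python
import math
import random
from collections import Counter

# Toy dynamic model (an assumption, not one of the paper's models): a
# graph on n_nodes gains one uniformly chosen new edge per step; we
# track the Shannon entropy of its degree distribution over time.
def degree_entropy(degrees):
    n = len(degrees)
    return -sum(c / n * math.log2(c / n) for c in Counter(degrees).values())

n_nodes, steps = 50, 200
edges, degrees = set(), [0] * n_nodes
random.seed(1)
for t in range(steps):
    u, v = random.sample(range(n_nodes), 2)
    if (min(u, v), max(u, v)) not in edges:
        edges.add((min(u, v), max(u, v)))
        degrees[u] += 1
        degrees[v] += 1
    if (t + 1) % 50 == 0:
        print(t + 1, round(degree_entropy(degrees), 3))
```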
