
List of accepted submissions

 
 
 
  • Open access
  • 62 Reads
The fundamental diagram in vehicular traffic

The study of traffic flow on highways remains a major challenge, particularly when the vehicular density is high enough to produce congestion. Such phenomena have been described through several approaches, ranging from phenomenological models to descriptions of individual driver behavior. In this work we start from a generalization of the Prigogine-Herman-Boltzmann (PHB) kinetic equation that accounts for the vehicles' sizes, analogous to the Enskog generalization that accounts for the finite size of molecules in a moderately dense gas. Our main goal is the derivation of a fundamental diagram relating flux to density in the homogeneous steady state of traffic. The conditions satisfied by the distribution function are given, and their numerical solution allows the construction of the corresponding fundamental diagram. This derivation yields some of the threshold values that separate the free and congested regimes. The model results are contrasted with recent empirical data and show excellent agreement.
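The flux-density relation of the abstract is derived from the PHB kinetic equation; as a minimal stand-in (not the authors' model), the classical Greenshields closure already produces a fundamental diagram with a threshold density separating free and congested flow. All parameter values below are illustrative:

```python
import numpy as np

def greenshields_flux(rho, v_free=30.0, rho_max=0.15):
    """Flux q = rho * v(rho) with the linear Greenshields speed-density law.

    rho     : vehicle density (vehicles/m)
    v_free  : free-flow speed (m/s)
    rho_max : jam density at which flow stops (vehicles/m)
    """
    rho = np.asarray(rho, dtype=float)
    return rho * v_free * (1.0 - rho / rho_max)

# The maximum of q(rho) marks the free/congested threshold;
# for this closure it sits exactly at rho_max / 2.
rho = np.linspace(0.0, 0.15, 301)
q = greenshields_flux(rho)
rho_crit = rho[np.argmax(q)]
```

A kinetic derivation like the one in the talk replaces this ad hoc closure with a flux computed from the steady-state distribution function.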

  • Open access
  • 119 Reads
Entropy production of reaction-diffusion systems under confinement

Diffusion processes confined to a channel that is much longer along one coordinate than along the others have been studied by projecting the diffusion equation onto one dimension. This yields the so-called Fick-Jacobs equation, which introduces a position-dependent effective diffusion coefficient. Several approaches have been used to propose position-dependent diffusion coefficients, and it has been found that the coefficient depends on the channel's width function as well as on the geometric properties of its midline, such as its curvature and torsion. Within this approach we study the entropy production of a reaction-diffusion process of two species in a two-dimensional channel. Recently, it has been shown that the Turing instability conditions, the range of unstable modes for pattern formation, and the spatial structure of the patterns themselves can be modified through the geometric parameters of the confinement. In this contribution, the effect of the confinement on entropy production is analyzed and characterized in terms of the geometry of the corresponding channel.
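As a sketch of the projection scheme described above, one widely used closure for the position-dependent coefficient in two dimensions is the Reguera-Rubí form D(x) = D0 / (1 + w'(x)^2)^(1/3), where w(x) is the channel half-width. The width function and parameters below are illustrative, not those of the talk:

```python
import numpy as np

def effective_diffusion_2d(x, half_width, D0=1.0):
    """Fick-Jacobs effective diffusion coefficient in a 2D channel,
    with the Reguera-Rubi closure  D(x) = D0 / (1 + w'(x)**2)**(1/3),
    where w(x) is the channel half-width (derivative taken numerically)."""
    w = half_width(x)
    dw = np.gradient(w, x)
    return D0 / (1.0 + dw**2) ** (1.0 / 3.0)

# Example: sinusoidally corrugated channel; D(x) dips where the wall is steep
x = np.linspace(0.0, 2.0 * np.pi, 1001)
D = effective_diffusion_2d(x, lambda s: 1.0 + 0.3 * np.sin(s))
```

For a flat channel the closure returns the bare coefficient D0, as expected.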

  • Open access
  • 57 Reads
Entropic dynamics on Gibbs statistical manifolds

In a modern approach to statistical physics [Jaynes: Phys. Rev. 106, 620, 1957], Gibbs distributions appear naturally as the solution to a well-posed optimization problem: maximizing entropy under a set of expected-value constraints. Generally in physics, the space in which a canonical distribution is defined is the space of microstates, and the set of expected values defines the macrostates, which can be used as coordinates in a space of such canonical distributions (a statistical manifold). In the field of Information Geometry [Amari and Nagaoka: Methods of Information Geometry, American Mathematical Soc., 2007][Ruppeiner: Rev. Mod. Phys. 68, 313, 1996], these distributions have deeply interesting geometrical properties: their metric tensor is a covariance matrix, and important thermodynamical objects, such as the free energy, appear naturally. This work aims to provide a systematic way to create dynamical systems in a space of canonical distributions. These dynamics are derived as an application of entropic methods of inference, i.e., a form of entropic dynamics [Caticha: Entropy 17, 6110, 2015]. As an interesting result, the average motion in such a dynamical process reduces to the Onsager relations [Onsager: Phys. Rev. 37, 405, 1931], derived from purely probabilistic - not intrinsically thermodynamical - arguments. This can give new insight into fields such as critical phenomena and renormalization groups, and can also address statistical problems in which the microstate dynamics is not well defined, as in economics and ecology.
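The statement that the metric tensor of a Gibbs manifold is a covariance matrix can be checked numerically on a one-parameter example: there the Fisher metric is Var(E), and it equals -d<E>/dbeta. The toy energy levels below are illustrative:

```python
import numpy as np

def canonical(energies, beta):
    """Gibbs distribution p_i proportional to exp(-beta * E_i)."""
    w = np.exp(-beta * np.asarray(energies, dtype=float))
    return w / w.sum()

def fisher_metric(energies, beta):
    """For a one-parameter Gibbs manifold the Fisher metric is Var(E)."""
    E = np.asarray(energies, dtype=float)
    p = canonical(E, beta)
    mean = p @ E
    return p @ (E - mean) ** 2

# Fluctuation identity on the manifold: d<E>/dbeta = -Var(E)
E = [0.0, 1.0, 2.0, 5.0]
beta, h = 0.7, 1e-5
mean_at = lambda b: canonical(E, b) @ np.asarray(E)
dmean_numeric = (mean_at(beta + h) - mean_at(beta - h)) / (2.0 * h)
```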

  • Open access
  • 106 Reads
Quantum chaos and quantum randomness—paradigms of quantum entropy production

Quantum chaos and quantum measurement have one constitutive feature in common: they capture information at the smallest scales and lift it to macroscopic observability, thus generating classical facts. Fundamental bounds on the information content of closed quantum systems with finite-dimensional Hilbert space restrict their entropy production to a finite timescale. Only in open systems, where fresh entropy infiltrates from the environment, does quantum dynamics (partially) recover sustained entropy production, as in classical chaos.

This interpretation opens a novel perspective on randomness in quantum measurement, where a macroscopic apparatus observes a quantum system. Notably in spin measurements, the results involve an element of fundamental unpredictability. The analogy with quantum chaos suggests that random outcomes of quantum measurements could, in a similar manner, reveal the entropy generated through the coupling to a macroscopic environment. Such a coupling is required anyway to explain a crucial feature of quantum measurement that becomes manifest in the collapse of the wavepacket: decoherence. However, the subsequent step from a set of probabilities to specific individual measurement outcomes (the “second collapse”) still evades a proper understanding in terms of microscopic models. Could it be explained by the exchange of entropy between the macroscopic apparatus and the measured system?

I explore this hypothesis in the case of spin measurements. The model of quantum measurement proposed by Zurek and others is combined with a unitary approach to decoherence using heat baths that comprise only a finite number N of modes, as recently proposed in quantum chemistry and quantum optics. For N >> 1, the dynamics of the measured spin is expected to exhibit episodes, of increasing length, of significant spin polarization in either direction, alternating with spin flips, determined by the initial condition of the apparatus. I present preliminary analytical and numerical results that support this expectation.

  • Open access
  • 99 Reads
Max Entropy through Natural Interactions

The Principle of Maximum Entropy suggests that when one tries to predict the shape of a distribution, then among all available distributions one should choose the one that maximizes the entropy under a few chosen constraints expressing one's limited knowledge of the situation. This principle has deep meaning for human and non-human organisms alike, but it is hard to imagine how it takes place in natural environments under bounded rationality [1]. The context of the current talk is the way natural cognitive processes may be modeled through entropy and bounded rationality (e.g., [2, 3]). More specifically, we present a novel idea [4] describing the way in which the entropy of a predicted distribution increases through a structured process of natural interaction that builds on three principles only: Zipf's [5] principle of least effort, Laplace's principle of indifference, and the Copernican principle, which suggests that no observer occupies a special place in the universe. This process will be presented, illustrated, and supported by previously unpublished simulations.

References

[1] Simon, H. A. (1957). Models of man. New York: Wiley.

[2] Neuman, Y., & Vilenchik, D. (2019). Modeling small systems through the relative entropy lattice. IEEE Access, 7, 43591-43597.

[3] Neuman, Y., Cohen, Y., & Tamir, B. (2021). Short-term prediction through ordinal patterns. Royal Society Open Science, 8(1), 201011.

[4] Neuman, Y. (2021, forthcoming). How small social systems work: From soccer teams to families and jazz trios. N.Y.: Springer.

[5] Zipf, G. K. (2016). Human behavior and the principle of least effort: An introduction to human ecology. N.Y.: Ravenio Books.
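The constrained maximization behind the principle can be sketched for a discrete distribution with a single mean constraint (the classic "Brandeis dice" setup, not the interaction process of the talk). The solution is an exponential family whose Lagrange multiplier is found here by bisection:

```python
import numpy as np

def maxent_discrete(values, target_mean, lam_lo=-50.0, lam_hi=50.0):
    """Maximum-entropy distribution over discrete `values` with a fixed mean.

    The solution is the exponential family p_i proportional to exp(-lam * x_i);
    the Lagrange multiplier lam is found by bisection on the constraint.
    """
    x = np.asarray(values, dtype=float)

    def mean_at(lam):
        w = np.exp(-lam * (x - x.mean()))  # centred for numerical stability
        p = w / w.sum()
        return p @ x

    for _ in range(200):  # mean_at is decreasing in lam
        lam = 0.5 * (lam_lo + lam_hi)
        if mean_at(lam) > target_mean:
            lam_lo = lam
        else:
            lam_hi = lam
    w = np.exp(-lam * (x - x.mean()))
    return w / w.sum()

# A six-sided die constrained to have mean 4.5 instead of the uniform 3.5
p = maxent_discrete(np.arange(1, 7), 4.5)
```

With the mean pushed above 3.5, the probability mass shifts monotonically toward the larger faces, exactly as the exponential form dictates.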

  • Open access
  • 27 Reads
Network Analysis of Multivariate Transfer Entropy of Cryptocurrencies in Times of Turbulence

We investigate the effects of the recent financial turbulence of 2020 on the cryptocurrency market, taking into account the hourly price and volume of transactions from December 2019 to April 2020. The data were subdivided into time frames, and for each one we analyzed the directed network generated by the estimation of multivariate transfer entropy. The approach followed here is based on a greedy algorithm and multiple hypothesis testing. We then explored the clustering coefficient and the degree distributions of nodes for each subperiod. We find that the clustering coefficient increases dramatically in March, coinciding with the most severe fall of the recent worldwide stock-market crash. Further, the log-likelihood in all cases favored a power-law distribution, with a higher estimated exponent during the period of major financial contraction. Our results suggest that the financial turbulence induces a higher flow of information in the cryptocurrency market, in the sense of a higher clustering coefficient and complexity of the network. Hence, the complex properties of the multivariate transfer entropy network may provide early-warning signals of increasing systematic risk in turbulent times for cryptocurrency markets.
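The multivariate, greedy estimation used in the study is involved; a minimal bivariate plug-in estimator already illustrates how transfer entropy detects directed information flow between two series. The series and parameters below are synthetic, not market data:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of the bivariate transfer entropy T(X -> Y) in bits,
    after binarizing both series at their medians:

        T = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]

    A toy estimator, not the greedy multivariate procedure of the study.
    """
    xs = (x > np.median(x)).astype(int)
    ys = (y > np.median(y)).astype(int)
    n = len(xs) - 1
    triples = Counter(zip(ys[1:], ys[:-1], xs[:-1]))
    pairs_yy = Counter(zip(ys[1:], ys[:-1]))
    pairs_yx = Counter(zip(ys[:-1], xs[:-1]))
    singles = Counter(ys[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * np.log2(p_cond_full / p_cond_hist)
    return te

# Synthetic pair in which X drives Y with a one-step lag
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.1 * rng.normal(size=5000)
```

The estimator assigns a large value in the causal direction X to Y and a value near zero in the reverse direction, which is the asymmetry the network edges encode.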

  • Open access
  • 54 Reads
Entropy measurements with infrared sensors

Infrared sensors (IRS) have long been used for non-contact body-temperature measurement, as in thermal cameras. These sensors are based on thermopiles that measure heat flows, which are converted to temperature through the sensor's calibration curve. In this contribution we combine the physical heat measurements with the inferred temperature of the emitter to characterize the emitter's radiated entropy reaching the sensor. We implemented a data-acquisition system based on an Arduino UNO microcontroller. Several IRS, each including both a thermopile and a thermistor for an independent ambient-temperature reading, were tested. As thermal emitters we used resistors whose power dissipation could be selected by controlling their current and voltage. We investigated the entropy transferred from the emitter to the IRS as a function of the emitter-sensor distance and of the emitter's dissipated power. The measurement system thus manages the emitter temperature and power dissipation, the heat reaching the sensor, and the temperature of the sensor. From these data, we were able to monitor non-contact radiated entropy. The setup was used to characterize resistor ageing, since resistor ageing is related to entropy production. Accelerated degradation tests were carried out: emitter resistors were submitted to dissipation powers well beyond their nominal power rating to speed up their degradation mechanisms. We were thus able to monitor resistor degradation with a non-contact sensor by evaluating the entropy reaching the sensor.
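A back-of-the-envelope model for the radiated entropy monitored by such a setup treats the emitter as a blackbody, for which the entropy flux is 4/3 of the Clausius ratio q/T. The temperatures and geometry below are illustrative, not the experimental values:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_entropy_flux(T, area=1.0, emissivity=1.0):
    """Entropy carried away per second by thermal radiation from a surface
    at temperature T (kelvin), in W/K. For blackbody radiation the entropy
    flux is (4/3) * sigma * T**3 per unit area, i.e. 4/3 of the Clausius
    ratio q/T of the radiated heat flux q = sigma * T**4."""
    return (4.0 / 3.0) * emissivity * SIGMA * T**3 * area

# An overdriven resistor at 450 K radiates markedly more entropy than at 350 K
s_nominal = radiated_entropy_flux(350.0)
s_overdriven = radiated_entropy_flux(450.0)
```

The cubic dependence on temperature is what makes the overdriven, ageing resistors of the accelerated tests stand out to the sensor.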

  • Open access
  • 35 Reads
Causal Entropy-Complexity Plane with Multivariate Probability Distribution

We propose a multivariate causality entropy-complexity plane by using a multivariate probability distribution function (PDF) and adapting the Normalized Permutation Entropy H and Statistical Complexity C.

Alongside the standard embedding dimension D, which yields ordinal patterns drawn from D! possibilities, we considered sub-patterns of lower embedding dimensions that capture more information in phase space, since ordinal patterns alone do not always detect subtle changes in the time series. This level of detail can capture features of the system's probability distribution that are not discriminated by measures of randomness, such as entropy; hence the multivariate causality entropy-complexity plane is adopted.

In this application, the plane was used to analyze complex time series, and the results indicate robustness in distinguishing between chaotic systems, even after the insertion of noise into each of the chaotic time series that define the system. Thus, the multivariate plane is better suited to extracting the structure of a system and thereby characterizing it.
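The normalized permutation entropy H underlying the plane can be sketched as follows; this is the standard univariate Bandt-Pompe estimator, not the authors' multivariate adaptation:

```python
import numpy as np
from math import factorial

def permutation_entropy(series, D=3):
    """Normalized Bandt-Pompe permutation entropy for embedding dimension D.

    Each window of D consecutive values is mapped to its ordinal pattern
    (the argsort permutation); H is the Shannon entropy of the pattern
    histogram, normalized by log(D!) so that 0 <= H <= 1.
    """
    x = np.asarray(series, dtype=float)
    counts = {}
    for i in range(len(x) - D + 1):
        pattern = tuple(np.argsort(x[i : i + D]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / np.log(factorial(D)))

# White noise is close to maximally random; a monotone ramp is fully ordered
rng = np.random.default_rng(1)
h_noise = permutation_entropy(rng.normal(size=10000), D=3)
h_ramp = permutation_entropy(np.arange(100.0), D=3)
```

The statistical complexity C combines H with a disequilibrium term, placing each series as a point in the (H, C) plane.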

  • Open access
  • 59 Reads
Thermodynamics of systems with emergent molecule structures

Boltzmann entropy is defined as the logarithm of the state multiplicity. For multinomial multiplicities, it reduces to the ordinary Boltzmann-Gibbs-Shannon entropy; for non-multinomial systems, however, we obtain different expressions. This is the case for complex systems, particularly systems with emergent structures. Probably the most prominent examples are chemical reactions with long-range interactions, i.e., where every particle can interact with every other. Based on the original ideas of L. Boltzmann, we calculate the entropy of a system with emergent molecule states. It turns out that the corresponding entropy is the Boltzmann-Gibbs entropy plus a correction that can be interpreted as a structural entropic force. The corresponding thermodynamics is an alternative to the grand-canonical ensemble that correctly counts the number of states. We demonstrate this approach on several examples, including chemical reactions of the type 2X <-> X2, phase transitions in a magnetic gas, and the fully connected Ising model. For the fully connected Ising model, the presence of molecule states shifts the Curie temperature down and changes the order of the phase transition from second to first. For systems with short-range interactions, we recover the ordinary Boltzmann-Gibbs entropy and derive the well-known relation between chemical potential and concentration.
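The non-multinomial counting behind a 2X <-> X2 system can be illustrated directly (a textbook pairing count, not the authors' full derivation): the multiplicity of a macrostate with k dimers among n labeled particles is W = n!/((n-2k)! k! 2^k), and S = ln W:

```python
from math import factorial, log

def pairing_multiplicity(n, k):
    """Number of microstates with k dimers X2 among n labeled particles X:

        W(n, k) = n! / ((n - 2k)! * k! * 2**k)

    i.e. choose the paired particles, then discard the ordering of the
    pairs and of the two partners within each pair."""
    return factorial(n) // (factorial(n - 2 * k) * factorial(k) * 2**k)

def boltzmann_entropy(n, k):
    """Boltzmann entropy S = ln W of the k-dimer macrostate (k_B = 1)."""
    return log(pairing_multiplicity(n, k))

# For 10 particles the multiplicity peaks at an intermediate dimer number;
# the deviation from a multinomial count is the structural correction
W = [pairing_multiplicity(10, k) for k in range(6)]
```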

  • Open access
  • 73 Reads
Information routing in proteins: the case of a therapeutic antibody

Internal dynamics is the link between structure and biological function in proteins [1]. It has been shown not only that low-frequency dynamics is essential for a protein to function [2], but also that a correlation exists between a protein's activity and its specific dynamical properties [3]. Propagation of information between two or more distant sites on the protein network allows concerted, large-scale conformational changes to take place, triggering biological responses as a consequence.
In this work, we aim at identifying patterns of information routing within the therapeutic antibody pembrolizumab [4], i.e., the communication channels that emerge from the underlying topology and drive the observed correlated motions. Specifically, we focus on the mutual information (MI) of the displacements of atomic positions, as computed from atomistic molecular dynamics simulations, both in the presence and in the absence of the bound antigen. MI is used to build network models of the antibody for each of the conformational clusters emerging from the simulations; these networks are then interpreted in the light of a graph-theoretical approach, to couple chemical detail and large-scale dynamics.
Unveiling inter-residue communication pathways may find application not only in biotechnological manipulation for improved therapeutic agents, but also in the design of simplified, multi-resolution antibody models that, by describing channels of information transfer at an appropriately high-resolution level, facilitate dynamical investigation at a lower computational cost [5].

[1] Berendsen, H. J., Hayward, S. (2000). Curr. Opin. Struct. Biol., 10(2), 165-169
[2] Yang, L. Q., et al. (2014). J. Biomol. Struct. Dyn., 32(3), 372-393.
[3] Hensen, U., et al. (2012). PloS one, 7(5), e33931.
[4] Scapin, G., et al. (2015). Nat. Struct. Biol., 22(12), 953-958.
[5] Diggins IV, P., et al. (2018). J. Chem. Theory Comput., 15(1), 648-664.
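A minimal, histogram-based stand-in for the MI edge weights described above; the real analysis uses atomic displacements from MD simulations, whereas the data here are synthetic:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Histogram (plug-in) estimate of the mutual information I(A;B) in nats,
    a crude stand-in for the MI between atomic displacements that weights
    the edges of a residue-interaction network."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    pa = p.sum(axis=1, keepdims=True)
    pb = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (pa @ pb)[mask])).sum())

# Correlated 'displacements' share information; independent ones do not
rng = np.random.default_rng(2)
u = rng.normal(size=20000)
mi_corr = mutual_information(u, u + 0.3 * rng.normal(size=20000))
mi_ind = mutual_information(u, rng.normal(size=20000))
```

Thresholding such pairwise MI values is one simple way to turn correlated motions into the edges of a communication network.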
