List of accepted submissions

 
 
 
  • Open access
  • 91 Reads
Evaluation of the performance of permutation entropy variants for classifying auditory evoked potentials

The aim of the present work was to investigate the performance of entropic tools for classifying subjects with normal hearing and subjects with pathologies in the auditory pathway, using short-latency records of auditory evoked potentials. To accomplish this objective, three measures were computed: traditional permutation entropy, weighted permutation entropy, and a modified version of the original permutation entropy that corrects the count of missing or forbidden patterns.

The database used consisted of two age groups: minors, aged from a few months to four years, and adults, aged over 18 years.

For both minors and adults, thirty samples were randomly selected from normal-hearing and from hearing-impaired subjects.

Once the different entropy variants had been calculated, the difference in mean values was analyzed using statistical tests to check whether it was significant. The differences between the means of the groups with and without pathology were found to be significant at the 99.9% level.

Finally, the most appropriate entropy was chosen based on the specificity, sensitivity, precision and area under the receiver operating characteristic (ROC) curve. The results showed that weighted permutation entropy was more appropriate for differentiating healthy and pathological records in the case of minors, and the modified permutation entropy in the case of older adults.
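These ordinal-pattern measures are simple to state in code. The following is a minimal Python sketch of traditional permutation entropy (a generic illustration, not the authors' implementation; the weighted variant additionally weights each window, e.g. by its variance, and the modified variant corrects the count for missing or forbidden patterns, as described above):

```python
from collections import Counter
from math import factorial, log

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of the series x, with
    embedding dimension m and time delay tau."""
    n = len(x) - (m - 1) * tau  # number of embedded windows
    counts = Counter()
    for i in range(n):
        window = [x[i + j * tau] for j in range(m)]
        # The ordinal pattern of a window is its argsort.
        counts[tuple(sorted(range(m), key=window.__getitem__))] += 1
    # Shannon entropy over observed patterns, normalized by log(m!)
    h = -sum((c / n) * log(c / n) for c in counts.values())
    return h / log(factorial(m))
```

A monotone series yields entropy 0 (a single pattern), while a fully irregular series approaches 1 after normalization by log(m!).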

  • Open access
  • 120 Reads
Design and characterization of Cr29.7Co29.7Ni35.4Al4.0Ti1.2 precipitation hardened high entropy alloy

In 2004 a new class of metallic alloys, called high entropy alloys (HEAs), was introduced in the literature. The main concept behind these alloys is that they have multiple principal elements instead of only one or two. Therefore, they can exist over a vast compositional landscape, most of which is yet to be explored. Early studies on HEAs concentrated on searching for single-phase alloys with optimized properties, mainly those with a face-centered cubic (FCC) structure. Recently, in the so-called “second generation” of HEAs, the research focus has broadened to multi-phase alloys, in particular alloys with an FCC matrix plus L12 precipitates, or a BCC matrix with B2 precipitates, to compete with traditional superalloys. In the present work, the CALPHAD method (via Pandat® software) was used to aid the design of a new precipitation-hardened HEA, namely Cr29.7Co29.7Ni35.4Al4.0Ti1.2 (at.%), with an FCC matrix and L12 precipitates. The alloy was produced, solution-treated, cold-rolled and annealed; it was then cut into pieces and aged at 850 °C for different times. The resulting microstructure consisted of spherical precipitates distributed uniformly in the FCC matrix, confirming the accuracy of the thermodynamic calculations. The improvement in mechanical properties from introducing the ordered precipitates, assessed by microhardness testing, showed promising mechanical behavior of the newly designed alloy.

  • Open access
  • 145 Reads
Entropy: from Thermodynamics to Information Theory

Entropy is a concept that dates back to the 19th century. It was associated with the heat used by a thermal machine to perform work in the context of the Industrial Revolution. The 20th century saw an unprecedented scientific revolution, and one of the essential innovations of that time was Information Theory, which also has a concept of entropy. A natural question arises: ‘what is the difference, if any, between the entropies used in each field?’ The concept is often misused, generating misconceptions about the theme. There have been attempts to reconcile the entropy of thermodynamics with that of information theory. The most common approach is defining entropy as “disorder”; however, this is not a good analogy, since “order” is a subjective human concept and “disorder” is not a measure that can always be obtained with entropy. Another way is relating complexity to entropy. Computer science and statistics have approached the problem of complexity by means of Kolmogorov complexity. Again, the abstraction level of the concept can lead researchers from other areas, such as biologists and chemists (for whom the study of complexity plays an essential role), to misunderstand it. In this paper, the historical background for the evolution of “entropy” is presented, together with mathematical proofs and logical arguments for the concept’s interconnection across various areas of science.

  • Open access
  • 103 Reads
The minimum entropy production principle and heat transport in solids with internal structure
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

Variational principles have a long history in the study of the time evolution of dissipative systems. There is a wide variety of formulations of such principles, some of which are ad hoc techniques like doubling of the dynamic variables, restricted variations, etc. There also exist principles of a fundamental character, like the minimum entropy production principle (MEPP), which refers to the stationary state eventually reached by a system after it has been taken out of equilibrium. Whether the MEPP has general applicability or rather limited validity has been much discussed. In particular, it has been concluded that for systems with constant phenomenological Onsager coefficients the entropy production can only decrease in time until a minimum is reached in the stationary state, although this is still under debate. In this work heat transport in non-homogeneous solids is considered. We study the case of solids with internal structure within the framework of a two-temperature description. The internal structure is introduced into the model through the dependence of the thermal conductivity on position. The time evolution equations are obtained both through the usual methods of irreversible thermodynamics and from the MEPP. We find that in our approach the two sets of evolution equations coincide and that, without imposing any restriction on the phenomenological coefficients other than those coming from the internal structuring of the solid, the appropriate temperature profiles are obtained. We exemplify this finding with the case of pure aluminum subjected to a heat pulse.

  • Open access
  • 65 Reads
Superresolved light microscopy information on the structure of the stained dental tissue section obtained by point divergence gain analysis

Light microscopy is an indispensable tool for understanding the internal structure and chemical composition of materials. It has limits of resolution [1], which have been analysed mostly from the point of view of the ability of the microscope user to distinguish two objects unambiguously.

We have developed a new variable, the point divergence gain (PDG), which enables us to find centroids of the imaging function in any expectable context. In the standard terminology of light microscopy, irrespective of the nature of the light-matter interactions, we may create a 3D superresolved map of the interior of a dense semi-transparent material. We have demonstrated this ability on the structure of a living cell [2]. Here we show the ability of PDG-based superlocalisation on a histological sample stained by methods dating back to the 1770s.

The localization of the elementary centroid of the absorbing object was achieved with a precision of 78×78×5 nm³. Comparison of subsequent images shows that these localisations are unique, i.e. they do not repeat from one image to the next. This indicates that the elementary coloured objects are of macromolecular size. The coloured objects are grouped into structures which may be identified as histologically relevant elements, but in this case we also understand their internal structure.

Besides the technical description of the results, we compare the PDG-based results with standard terms of light microscopy, such as resolution and depth of focus, and demonstrate their proper definitions based on the theory of information.

[1] de Villiers G., Roy Pike E. (2016) The Limits of Resolution. CRC Press, Taylor & Francis.

[2] Rychtáriková R. et al. (2017) Ultramicroscopy 179, 1–14.

  • Open access
  • 71 Reads
Thermodynamics Beyond Molecules: Statistical Mechanics of Probability Distributions and Stochastic Processes

Statistical mechanics has a universal appeal that extends beyond molecular systems, and yet, as its tools are being transplanted to fields outside physics, the fundamental questions, what is thermodynamics and how it may be applied outside the realm of physical particles, have remained unanswered. We answer these questions here: Statistical mechanics in its most general form is variational calculus applied to probability distributions and by extension to stochastic processes in general; as a mathematical theory, it is independent of physical hypotheses but provides the means to incorporate our knowledge and model assumptions about the particular problem. The fundamental ensemble is a microcanonical space of probability distributions sampled via a bias functional that establishes a probability measure on this space. The maximization of this measure expresses the most probable distribution via a set of parameters (microcanonical partition function, canonical partition function and generalized temperature) that are connected through a set of mathematical relationships that we recognize as the familiar equations of thermodynamics. Any distribution in this space may be endowed with the status of the most probable distribution under an appropriately constructed bias functional. Entropy, Kullback-Leibler divergence and the second law have simple interpretations in this theory. We obtain statistical mechanics as a special application to molecular systems and make contact with Information Theory and Bayesian inference. We use numerical examples to demonstrate the thermodynamic treatment of generic probability distributions, present a thermodynamic algorithm (the cluster ensemble) to sample arbitrary distributions with positive argument by analogy to reacting particles and discuss the extension of statistical mechanics to stochastic processes in general.
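As a toy numerical illustration of this variational picture (our own generic sketch, not the paper's formalism), the most probable distribution over a set of discrete states subject to a mean-value constraint is the canonical exponential form, with the generalized temperature entering as a Lagrange multiplier found numerically:

```python
from math import exp

def canonical_distribution(energies, mean_energy, lo=-50.0, hi=50.0):
    """Maximum-entropy (canonical) distribution over discrete states
    with the given energies, constrained to a target mean energy.
    The Lagrange multiplier beta is found by bisection."""
    def mean_at(beta):
        w = [exp(-beta * e) for e in energies]
        z = sum(w)  # canonical partition function
        return sum(e * wi for e, wi in zip(energies, w)) / z
    # mean_at(beta) decreases monotonically with beta, so bisect.
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_at(mid) > mean_energy:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]
```

For example, three states with energies 0, 1, 2 and target mean 1 give the uniform distribution (beta = 0), while a lower target mean tilts the weight toward the low-energy state (beta > 0).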

  • Open access
  • 71 Reads
On the estimation of the probability of cardiovascular and cerebrovascular events in hypertensive patients using nonlinear analysis, time and frequency domain methods


Applications of entropy-based parameters to the time series generated by physiological systems of the human body are revealing hidden information of interest, which is helping to improve the diagnosis and treatment of several diseases [2]. In this context, a widely analysed physiological time series is heart rate variability, obtained from the surface electrocardiogram (ECG) [1]. For example, its entropy-based analysis has allowed the detection of abnormalities associated with lethal and non-lethal cardiac arrhythmias. In addition to providing information about the instantaneous heart rhythm, it is now accepted that this time series reflects the behavior of the autonomic nervous system.

In this study, heart rate variability is analysed to anticipate the risk of occurrence of an adverse cardiovascular event (i.e., myocardial infarction, stroke, syncopal event, etc.) in hypertensive patients; hypertension is the most prevalent pathology in developed countries [3]. Briefly, several entropy-based measures have been applied to the heart rate time series derived from 24 hour-length ECG signals acquired from 139 patients, who were followed for 1 year. These parameters include sample entropies, fuzzy entropies, symbolic entropies, etc.

Entropy measures based on different approaches, such as irregularity, symbolization and ordinal pattern quantification, were analyzed. Although these indices alone were only able to discern hypertensive patients who suffered an event within the follow-up time from those who did not with 70% accuracy, their combination with other common time and frequency domain parameters estimated from heart rate variability improved diagnostic accuracy to about 80% [4].
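One of the irregularity measures mentioned above, sample entropy, can be sketched in a brute-force way as follows (a generic illustration for short series, not the study's implementation, which would use an optimized routine):

```python
from math import log

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of the series x; the tolerance
    is r times the series' standard deviation."""
    n = len(x)
    mean = sum(x) / n
    tol = r * (sum((v - mean) ** 2 for v in x) / n) ** 0.5

    def matches(length):
        # Pairs of templates of the given length whose pointwise
        # (Chebyshev) distance stays within the tolerance.
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if all(abs(x[i + k] - x[j + k]) <= tol for k in range(length)):
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    # SampEn is -ln of the conditional probability that sequences
    # matching for m points also match for m + 1 points.
    return -log(a / b) if a and b else float("inf")
```

A strictly periodic series gives a value near zero, while an irregular heart rate series gives a larger value; this is the sense in which the measure quantifies irregularity.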

In conclusion, the nonlinear analysis of heart rate variability provided by entropy-based measures supplies information complementary to that of linear indices for identifying hypertensive patients at high risk of developing future cardiovascular events, which can lead to imminent death.

[1] Goldberger A.L., Peng C.K., and Lipsitz L.A., What is physiologic complexity and how does it change with aging and disease? Neurobiol Aging 23: 23-26, 2002.

[2] Goldberger A.L., Non-linear dynamics for clinicians: Chaos theory, fractals, and complexity at the bedside. Lancet 347: 1312-1314, 1996.

[3] Spasic S.Z., and Kesic S., Nonlinearity in living systems: Theoretical and practical perspectives on metrics of physiological signal complexity. Front Physiol 10: 298, 2019.

[4] Solaro N., Malacarne M., Pagani M., and Lucini D., Cardiac baroreflex, HRV, and statistics: An interdisciplinary approach in hypertension. Front Physiol 10: 478, 2019.

  • Open access
  • 48 Reads
Different scenarios leading to hyperchaos development in radiophysical generators

Chaos is a typical attribute of nonlinear dynamical systems in various fields of science and technology. One of the conventional indicators of chaotic dynamics is the largest Lyapunov exponent. For a flow, chaos occurs when the spectrum of Lyapunov exponents contains one positive, one zero and at least one negative exponent. Using the full spectrum of Lyapunov exponents it is also possible to identify hyperchaos, when the spectrum contains two or more positive Lyapunov exponents.
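The sign-based classification described above can be expressed directly in code (a small illustrative helper of our own; the exponents themselves must come from a separate numerical computation along the trajectory):

```python
def classify_flow(lyapunov_spectrum, eps=1e-6):
    """Classify a flow's attractor from the signs of its Lyapunov
    exponents; eps is the threshold for treating an exponent as zero."""
    positive = sum(1 for l in lyapunov_spectrum if l > eps)
    if positive >= 2:
        return "hyperchaos"   # two or more positive exponents
    if positive == 1:
        return "chaos"        # one positive, one zero, rest negative
    if any(abs(l) <= eps for l in lyapunov_spectrum):
        return "regular (periodic/quasiperiodic)"
    return "fixed point"      # all exponents negative
```

For example, a spectrum (+, 0, −) classifies as chaos and (+, +, 0, −) as hyperchaos, matching the criteria stated above.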

In this work we describe two scenarios leading to the occurrence of hyperchaos, using the examples of the modified Anishchenko-Astakhov generator and coupled generators of quasiperiodic oscillations. The first is a new scenario associated with the appearance of a Shilnikov attractor, in which a saddle-focus with a two-dimensional unstable manifold arises via a secondary Neimark-Sacker bifurcation and is absorbed by the chaotic attractor. For this scenario we present a cascade of secondary Neimark-Sacker bifurcations corresponding to a hierarchy of Shilnikov attractors and leading to hyperchaos. The second scenario, described previously, is associated with a cascade of period-doubling bifurcations of saddle cycles with two-dimensional unstable manifolds. Both scenarios will be presented for radiophysical generators.

The reported study was funded by RFBR according to the research project № 19-31-60030.

  • Open access
  • 87 Reads
Quantal Response Statistical Equilibrium: A New Class of Maximum Entropy Distributions

The principle of maximum entropy has been applied fruitfully to many economic and social situations. One limitation of such applications, however, is the tendency to neglect, in the formal modeling, the joint determination of social outcomes and the actions that shape them, despite the fact that both dimensions of the problem are typically articulated in the theoretical exposition. The fact that closed-form marginal distributions are good model candidates for many economic variables in statistical equilibrium, based on fit, has the unfortunate consequence of leading to economic rationalization based on the mathematical necessity of the constraints. From the Principle of Maximum Entropy perspective, the use of closed-form distributions is premature. While there may be good reasons to rely on such distributions for modeling statistical equilibrium in some situations, in general this limits inference to an arbitrary subset of models.

This research explores an alternative approach to modeling economic outcomes based on the Principle of Maximum Entropy called the quantal response statistical equilibrium (QRSE) model of social interactions. The QRSE model provides a behavioral foundation for the formation of aggregate economic outcomes in social systems characterized by negative feedbacks. It can approximate a wide range of commonly encountered theoretical distributions that have been identified as economic statistical equilibrium and displays qualitatively similar behavior to the Subbotin and Asymmetric Subbotin distributions that range from the Laplace to the Normal distribution in the limit. Asymmetry in the frequency distributions of economic outcomes arises from the unfulfilled expectations of entropy-constrained decision makers and asymmetric impacts of actions. The logic of the model is demonstrated in an application to US stock market data, firm profit rate data, and the distribution of income from a classical perspective.

  • Open access
  • 111 Reads
Kullback-Leibler Divergence of a Freely Cooling Granular Gas of Inelastic Hard Disks and Spheres

The velocity distribution function (fHCS) of a granular gas modeled by inelastic hard d-spheres in the Homogeneous Cooling State (HCS) is still unknown. The typical approach describes deviations from a Maxwellian distribution (fM) at the system temperature by means of an infinite expansion in terms of Sonine polynomials. In the quest to find the Lyapunov functional related to this system, the Kullback-Leibler divergences DKL[f||fHCS] and DKL[f||fM] of the time-dependent velocity distribution function (f) with respect to the HCS and Maxwellian distributions, respectively, are proposed and studied. Kinetic theory results for inelastic hard disks and spheres [1] are supported by Molecular Dynamics (MD) simulations. Whereas DKL[f||fM] may present a non-monotonic behavior with time, it is observed that DKL[f||fHCS] seems to be a valid candidate for a Lyapunov functional, as proposed in [2]. Interestingly, DKL[fHCS||fM] exhibits a non-monotonic dependence on the coefficient of restitution. Moreover, for a more complete description of the problem, the fourth- and sixth-order cumulants are revisited, with MD simulation results compared against kinetic-theory predictions for a wide range of values of the coefficient of restitution. Finally, in some simulations, after a first freely cooling period, a sort of Maxwell's demon acts by reversing the instantaneous velocity of each particle in an attempt to return to the initial configuration. Although an initial ordering is apparently recovered by the elastic system (Loschmidt's paradox), the complete reverted evolution is flatly rejected in the inelastic case, since time reversal symmetry and detailed balance are broken there, both results being in accordance with [3].

[1] A. Santos and J. M. Montanero, Granul. Matter 11, 157 (2009).

[2] M. I. G. de Soria, P. Maynar, S. Mischler, C. Mouhot, T. Rey, and E. Trizac, J. Stat. Mech. P11009 (2015), 10.1088/1742-5468/2015/11/p11009.

[3] J. Orban and A. Bellemans, Phys. Lett. A 24, 620 (1967).
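In practice, divergences such as DKL[f||fM] are estimated from binned simulation data. A minimal discrete sketch (our illustration, not the authors' estimator) is:

```python
from math import log

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D_KL(p || q) between two
    normalized histograms over the same bins; q must be nonzero
    wherever p is. Bins with p == 0 contribute nothing."""
    return sum(pk * log(pk / qk) for pk, qk in zip(p, q) if pk > 0)
```

Here p would be the velocity histogram measured in the MD simulation and q the corresponding binned reference distribution (fM or fHCS). By Gibbs' inequality the result is nonnegative and vanishes only when the two histograms coincide, which is what makes DKL a candidate Lyapunov functional.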
