
List of accepted submissions

  • Open access
  • 81 Reads
Efficiency of an arrangement in series of irreversible thermal engines working at maximum power
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

Within the context of finite-time thermodynamics, several regimes of performance have been used to study the well-known Curzon-Ahlborn (CA) heat engine model [1-5]. The optimal performance and the effects on the environment have also been studied, in order to find the best approximation to real heat engines.

In this work we present a model of an arrangement in series of irreversible Carnot heat engines, consisting of k reservoirs connected in series; this heat engine model operates under three different regimes of performance: maximum power output, maximum ecological function [6] and maximum efficient power [7]. We first consider three reservoirs and calculate the corresponding efficiency. For the maximum power output regime we calculate the efficiency for the generalization to k reservoirs, obtaining an efficiency expression similar to that of Curzon and Ahlborn, with the irreversibilities taken into account by an irreversibility parameter R. Finally, we compare the efficiencies obtained under the three regimes of performance.
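
As a minimal numerical sketch of the quantities involved (an illustration, not the paper's derivation): the Curzon-Ahlborn efficiency at maximum power is eta = 1 - sqrt(Tc/Th), irreversibility is often folded in through a parameter R as eta = 1 - sqrt(R*Tc/Th) in irreversible-Carnot models (the exact form used in the talk may differ), and efficiencies of engines coupled in series compose multiplicatively:

```python
import math

def eta_ca(t_hot, t_cold, r=1.0):
    """Curzon-Ahlborn-type efficiency at maximum power.

    r is an irreversibility parameter; r = 1 recovers the
    endoreversible result eta = 1 - sqrt(Tc/Th).
    """
    return 1.0 - math.sqrt(r * t_cold / t_hot)

def eta_series(temps, r=1.0):
    """Overall efficiency of engines coupled in series across the
    reservoir temperatures temps[0] > temps[1] > ... > temps[k-1].

    Efficiencies compose as 1 - eta_tot = prod_i (1 - eta_i), so
    for r = 1 the chain reduces to 1 - sqrt(temps[-1] / temps[0]).
    """
    survival = 1.0
    for th, tc in zip(temps, temps[1:]):
        survival *= 1.0 - eta_ca(th, tc, r)
    return 1.0 - survival

# Three reservoirs, reversible limit: matches 1 - sqrt(300/600)
print(eta_series([600.0, 450.0, 300.0]))   # ~0.2929
```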

  • Open access
  • 45 Reads
Do Entropic Biodiversity Methods Outcompete Alternatives?

The journal ‘Entropy’ is full of elegant entropy/information approaches for all aspects of biology and medicine, from molecular interactions to landscape diversity. But do these methods work as well as, or better than, alternative methods? Often this is not evaluated by either simulation or empirical data, with some exceptions (e.g., Sherwin et al. 2017); here I concentrate on two newer evaluations:

1. Assessing frequency differentiation between groups, times or locations: I focus on the very popular Bray-Curtis measure and its entropic competitors.
I assess Bray-Curtis’ fit to several groups of criteria. First, it meets many basic requirements for any diversity measure. Secondly, I examine its independence from confounding effects. Finally, I look at its sensitivity to natural changes, threats, or management actions that affect underlying dispersal, adaptation, random change, or generation of novelty. If we can forecast a measure such as Bray-Curtis under various conditions, then we can evaluate the effects of past events and forecast the effects of future management. I show that Bray-Curtis can be forecast from underlying biological processes, but that the forecasting ability is improved by converting it into closely related entropy/information measures (a comparison of the two kinds of measure is sketched in the code after this list).

2. Incorporating functional differences between the variants (e.g., DNA sequence, expression, morphology) into biodiversity measures.
Past methods have met counterintuitive stumbling blocks, such as negative diversity, and apparent differentiation that depends heavily on within-group variability. Attempts to minimise these problems have resulted in measures with poor sensitivity to the very functional differences between variants that the methods were supposed to incorporate! A novel approach, based on three related entropy measures, avoids these counterintuitive problems (Chao et al. 2019).
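
As an illustration of the two kinds of measure being compared (a hypothetical toy example, not the talk's analysis), the following sketch computes Bray-Curtis dissimilarity next to an entropy-based counterpart, the Jensen-Shannon divergence:

```python
import numpy as np

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.abs(x - y).sum() / (x + y).sum()

def jensen_shannon(x, y):
    """Jensen-Shannon divergence (bits) between the relative
    abundance distributions of x and y: an entropy/information
    alternative to Bray-Curtis."""
    p = np.asarray(x, float); p = p / p.sum()
    q = np.asarray(y, float); q = q / q.sum()
    m = 0.5 * (p + q)

    def h(d):  # Shannon entropy in bits, ignoring zero entries
        d = d[d > 0]
        return -(d * np.log2(d)).sum()

    return h(m) - 0.5 * (h(p) + h(q))

site_a = [12, 0, 5, 30]   # hypothetical species counts at two sites
site_b = [10, 4, 0, 25]
print(bray_curtis(site_a, site_b), jensen_shannon(site_a, site_b))
```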

Sherwin W et al. 2017. Trends Ecol. Evol. 32:948.
Chao A et al. 2019. Ecol. Monogr. doi.org/10.1002/ecm.1343.

  • Open access
  • 90 Reads
Simulation Studies of Entropy-Driven Crystallization in Athermal Chain Packings in the Bulk and Under Confinement

We present results from extensive off-lattice simulations of packings of flexible linear chains of hard spheres in the bulk and under confinement. We employ a Monte Carlo scheme, built around advanced chain-connectivity-altering moves, for the short- and long-range equilibration even of very long and deeply entangled systems, at very high concentrations near the maximally random jammed (MRJ) state and under extreme confinement [1]. The local environment and its similarity to specific crystal structures are gauged through the characteristic crystallographic element (CCE) norm metric [2], which is able to distinguish between different competing crystal structures. The established crystal morphologies range from random hexagonal close-packed ones, with a single stacking direction or varied stacking directions, to pure face-centred cubic (fcc) and hexagonal close-packed (hcp) crystals. We explain how the total entropy of the system increases as the local environment of the crystal phase becomes more symmetric and spherical. This entropic effect leads to the observed transition from the initial amorphous to the final crystal phase [3]. By extending the simulations to trillions of steps, crystal perfection is observed in accordance with Ostwald's rule of stages in crystal polymorphism.

In general, bond tangency of successive monomers along the chain backbone, or the corresponding gaps, profoundly affects the ability of chains to crystallize [4]. Based on these findings and on simple geometric arguments, we explain the role of rigid and flexible constraints in the packing behavior (crystal nucleation and growth) of general atomic and particulate systems.

  1. Ramos, P. M.; Karayiannis, N. C.; Laso, M., J. Comput. Phys. 2018, 375, 918-934.
  2. Karayiannis, N. C.; Foteinopoulou, K.; Laso, M., J. Chem. Phys. 2009, 130 (7).
  3. Karayiannis, N. C.; Foteinopoulou, K.; Laso, M., Phys. Rev. Lett. 2009, 103 (4).
  4. Karayiannis, N. C.; Foteinopoulou, K.; Laso, M., Soft Matter 2015, 11 (9), 1688-1700.
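
As a toy illustration of the elementary hard-sphere Monte Carlo move underlying such simulations (the actual scheme relies on advanced chain-connectivity-altering moves far beyond this sketch; the box size, sphere diameter and lattice initialization below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 1.0           # hard-sphere diameter
BOX = 10.0            # cubic box edge (periodic boundaries)

def overlaps(pos, i, trial):
    """True if sphere i, moved to 'trial', overlaps any other
    sphere, using the minimum-image convention."""
    d = pos - trial
    d -= BOX * np.round(d / BOX)
    r2 = (d * d).sum(axis=1)
    r2[i] = np.inf                  # ignore self
    return (r2 < SIGMA * SIGMA).any()

def mc_sweep(pos, max_disp=0.1):
    """One sweep of single-sphere displacement moves: for hard
    spheres the Metropolis rule reduces to 'accept iff no overlap'."""
    accepted = 0
    for i in range(len(pos)):
        trial = (pos[i] + rng.uniform(-max_disp, max_disp, 3)) % BOX
        if not overlaps(pos, i, trial):
            pos[i] = trial
            accepted += 1
    return accepted / len(pos)

# start from a dilute cubic lattice so there are no initial overlaps
g = np.arange(5) * (BOX / 5)
pos = np.array([(x, y, z) for x in g for y in g for z in g])
print(mc_sweep(pos))   # acceptance fraction of one sweep
```
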
  • Open access
  • 41 Reads
Developing an information theory of quantitative genetics

David J. Galas, James Kunert-Graf and Nikita A. Sakhanenko

Pacific Northwest Research Institute, Seattle, Washington 98122, USA

Quantitative genetics has evolved dramatically in the century since its foundation, and the modern proliferation of genetic data, both in quantity and in type, now enables new kinds of analysis beyond the scope of its theoretical foundations. We have begun laying the foundations of an alternative formulation of quantitative genetics based on information theory, since it can provide sensitive and unbiased measures of statistical dependencies among variables, as well as a natural mathematical language for an alternative description of quantitative genetics. After all, genetics is fundamentally the science of information transfer between generations. Earlier work has applied information theory to descriptions of evolution and some aspects of population genetics. In our previous work we examined the information content of discrete functions, which are useful in describing genetic relations, and applied this formalism to the analysis of genetic data. We describe a set of relationships that both unifies the information measures for these discrete functions and uses them to express key genetic relationships in genotype and phenotype data. We present information-based measures of the genetic quantities of penetrance, heritability and degrees of statistical epistasis. We analyze two- and three-variable dependencies for independently segregating variants, which captures a range of phenomena including genetic interactions and two-phenotype pleiotropy. Note, however, that this formalism applies naturally to multi-variable interactions and higher-order complex dependencies as well, and can be extended to account for population structure, genetic linkage and non-randomly segregating markers. We discuss our progress towards laying the groundwork for a full formulation of quantitative genetics based on information theory.
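
As a small illustration of the kind of measure involved (a generic plug-in estimator on hypothetical data, not the authors' formalism), the mutual information between a discrete genotype and a discrete phenotype can be estimated directly from paired observations:

```python
import numpy as np
from collections import Counter

def mutual_information(genotypes, phenotypes):
    """Plug-in estimate of I(G; P) in bits from paired discrete
    observations: I = sum p(g,p) * log2[ p(g,p) / (p(g) p(p)) ]."""
    n = len(genotypes)
    pg = Counter(genotypes)
    pp = Counter(phenotypes)
    pgp = Counter(zip(genotypes, phenotypes))
    mi = 0.0
    for (g, p), c in pgp.items():
        pj = c / n
        mi += pj * np.log2(pj * n * n / (pg[g] * pp[p]))
    return mi

# Toy example: a fully penetrant biallelic variant, so
# I(G;P) = H(P) = H(1/3, 2/3) ~ 0.918 bits
g = [0, 0, 1, 1, 2, 2] * 50
p = ['low', 'low', 'low', 'low', 'high', 'high'] * 50
print(mutual_information(g, p))
```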

  • Open access
  • 57 Reads
A Novel Technique for Obtaining the Approximated ISI at the Receiver for a 16QAM Signal Sent via an FIR Channel, Based Only on the Received Information and Statistical Techniques

Let us consider for a moment the digital communication case where, during transmission, a source signal undergoes a convolutive distortion between its symbols and the channel impulse response. This distortion is referred to as inter-symbol interference (ISI); it is harmful and presents a major difficulty in the recovery process. A single-input-multiple-output (SIMO) channel is obtained by using an array of antennas at the receiver, where the same information is transmitted through different sub-channels; all received sequences are distinctly distorted versions of the same message. Up to now, the ISI level of each sub-channel has been unknown to the receiver. Thus, even when one or more sub-channels caused heavy ISI, the information from all the sub-channels was still considered at the receiver. Obviously, if we knew the approximated ISI of each sub-channel, we could use at the receiver only those sub-channels with the lowest ISI levels and obtain improved system performance. In this talk we present a systematic way of obtaining the approximated ISI of each sub-channel, modeled as a finite-impulse-response (FIR) channel with real-valued coefficients, for the transmission of a 16QAM source signal (16-point quadrature amplitude modulation, using ±{1, 3} levels for the in-phase and quadrature components). The approximated ISI is based on the maximum entropy density approximation technique, on the Edgeworth expansion up to order six, on the Laplace integral method and on the generalized Gaussian distribution (GGD). Although the approximated ISI was derived for the noiseless case, it was tested successfully for signal-to-noise ratios (SNR) of 30 dB and 20 dB as well.
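
For orientation, a minimal sketch of the setting (the channel taps below are hypothetical, and the snippet summarizes ISI with a classical tap-based residual-ISI figure rather than the maximum entropy/Edgeworth/Laplace/GGD machinery of the talk):

```python
import numpy as np

rng = np.random.default_rng(1)

# 16QAM alphabet: levels +/-1, +/-3 on each of the I and Q components
levels = np.array([-3, -1, 1, 3])
symbols = rng.choice(levels, 1000) + 1j * rng.choice(levels, 1000)

# Hypothetical real-valued FIR sub-channel taps
h = np.array([0.1, 1.0, 0.25, -0.1])

received = np.convolve(symbols, h)    # convolutive distortion (ISI)

def residual_isi(taps):
    """Classical residual-ISI figure: energy of all taps other than
    the strongest one, relative to the strongest tap's energy."""
    e = taps.astype(float) ** 2
    peak = np.argmax(e)
    return (e.sum() - e[peak]) / e[peak]

print(residual_isi(h))   # ISI level of this sub-channel (~0.08)
```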

  • Open access
  • 45 Reads
On the CTW-based Entropy Estimator

Estimating the entropy of a sequence of discrete observations is a problem that arises in many different fields. It has numerous applications, notably in neuroscience, where entropy has been adopted as the main measure of the amount of information transmitted between neurons (see Gao et al. 2008 and references therein). Gao et al. 2008 conducted a thorough comparison of the performance of the most popular and effective entropy estimators. They showed that the context-tree weighting (CTW) based estimator, which uses the probability estimate produced by the CTW lossless compression algorithm of Willems et al. 1995, repeatedly and consistently provides the most accurate results. The motivation for using the CTW probability to estimate the entropy is the well-known Shannon-McMillan-Breiman (SMB) result. However, the CTW probability is the result of a “twice universal” approach, meaning it is a weighted combination of the estimated probabilities of the sequence over all possible bounded-memory tree models (up to a predetermined maximum memory).

Motivated by this, we examine the CTW-based estimator from the viewpoint of the CTW algorithm's redundancy analysis (Willems et al. 1995). We define the SMB entropy as the normalized logarithm of the true probability, assuming a specific model for the source. We consider this finite-length quantity to be the best possible estimator of the entropy given a specific model. By defining a random variable distributed over all possible bounded-memory tree models, we extend this definition and define the conditional SMB entropy. We bound the overestimation of the CTW-based estimator relative to both the conditional SMB entropy and the SMB entropy of a specific model. In both cases we show that the overestimation approaches zero as O(log T / T), where T is the length of the sequence.
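
For orientation, a minimal sketch of the normalized log-loss idea behind such estimators, using the Krichevsky-Trofimov (KT) sequential probability for a binary sequence; KT is the per-context estimator that CTW mixes over all bounded-memory tree models, and the full CTW mixture is not reproduced here:

```python
import math
import random

def kt_entropy_rate(bits):
    """Entropy estimate -log2 P_KT(x) / T for a binary sequence,
    where P_KT is the Krichevsky-Trofimov sequential estimator:
    P(next = b | counts) = (count_b + 1/2) / (total + 1)."""
    counts = [0, 0]
    log_prob = 0.0
    for b in bits:
        p = (counts[b] + 0.5) / (counts[0] + counts[1] + 1.0)
        log_prob += math.log2(p)
        counts[b] += 1
    return -log_prob / len(bits)

# An i.i.d. Bernoulli(0.2) source has entropy ~0.722 bits/symbol;
# the estimate approaches it at a rate of order O(log T / T).
random.seed(0)
x = [1 if random.random() < 0.2 else 0 for _ in range(100000)]
print(kt_entropy_rate(x))
```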

  • Open access
  • 94 Reads
Evaluating spatial and temporal fragmentation of a categorical variable using new metrics based on entropy: example of vegetation land cover

Associated with climate change and/or land-use pressure, forest fragmentation is a spatio-temporal shrinking process that reduces the sizes of forest patches. It breaks up forest patches, increasing their number before the small ones progressively disappear. Fragmentation can be assessed spatially, as the current status of the fragmented spatial configuration, and temporally, as the speed of the fragmentation process itself. Among the different patch-based landscape metrics used as indicative measures of fragmentation, the Shannon entropy of the observed spatial distribution of categories has been of particular interest. Based on a recently suggested spatio-temporal entropy framework focusing on patch size and shape distributions, this paper shows how to derive useful fragmentation metrics at local and global levels, spatially, temporally or both. Moreover, it shows that using fully symmetric approaches between space, time and category within this framework can lead to more sensitive fragmentation metrics, as well as providing a complementary local approach for cartographic representation. Land cover data simulated with land surface modelling out to a 2100 horizon are used to illustrate the proposed fragmentation metrics.
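
To fix ideas, a sketch of the Shannon-entropy ingredient such metrics build on, reduced here to a patch-size distribution (an illustrative simplification with made-up numbers; the paper's spatio-temporal framework is richer):

```python
import numpy as np

def patch_size_entropy(patch_sizes):
    """Shannon entropy (bits) of the patch-size distribution: the
    area-weighted probability that a random forest cell falls in a
    given patch. More, smaller patches -> higher entropy."""
    sizes = np.asarray(patch_sizes, float)
    p = sizes / sizes.sum()
    return -(p * np.log2(p)).sum()

# One intact patch vs. the same total area split into fragments
print(patch_size_entropy([100]))                     # 0.0
print(patch_size_entropy([40, 25, 15, 10, 5, 5]))    # ~2.2
```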

  • Open access
  • 62 Reads
Estimating differential entropy using recursive copula splitting

We present a new method for estimating the Shannon differential entropy of multidimensional random variables using independent samples. The method is based on decomposing the distribution into a product of the marginal distributions and the joint dependency, also known as the copula. The entropy of the marginals is estimated using one-dimensional methods. The entropy of the copula, which always has compact support, is estimated recursively by splitting the data along statistically dependent dimensions. The method can be applied to distributions with compact or non-compact support, which is imperative when the support is not known or is of mixed type (in different dimensions). At high dimensions (larger than 20), numerical examples demonstrate that our method is not only more accurate but also significantly more efficient than existing approaches. We apply the new method to estimate the entropy of several statistical physics models exhibiting out-of-equilibrium dynamics. The models show a phase transition in which the structure becomes hyperuniform. We show that the phase transition can be detected by studying the entropy of particle configurations.
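
The decomposition at the heart of the method can be sketched in two dimensions: rank-transform each coordinate to obtain the empirical copula, then H(X, Y) = H(X) + H(Y) + H(copula). A minimal illustration with plain histogram estimators (the actual method splits the copula recursively and scales to high dimensions):

```python
import numpy as np

def entropy_1d(x, bins=50):
    """Histogram estimate of 1-D differential entropy (nats)."""
    p, edges = np.histogram(x, bins=bins, density=True)
    w = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

def copula_entropy_2d(x, y, bins=20):
    """Entropy of the empirical copula: map each margin to its
    ranks (uniform on [0, 1]) and estimate the joint density there.
    This term equals minus the mutual information, so it is <= 0."""
    u = np.argsort(np.argsort(x)) / (len(x) - 1.0)
    v = np.argsort(np.argsort(y)) / (len(y) - 1.0)
    p, _, _ = np.histogram2d(u, v, bins=bins, density=True)
    cell = (1.0 / bins) ** 2
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask])) * cell

rng = np.random.default_rng(0)
x = rng.normal(size=20000)
y = 0.8 * x + 0.6 * rng.normal(size=20000)   # correlation rho = 0.8
h_joint = entropy_1d(x) + entropy_1d(y) + copula_entropy_2d(x, y)
print(h_joint)   # analytic: log(2*pi*e) + 0.5*log(1 - 0.8**2) ~ 2.33 nats
```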

  • Open access
  • 90 Reads
Multivariate Symmetrical Uncertainty as a measure for interaction in categorical patterned datasets

Interaction between three or more variables is often found in statistical models where the response variable is numeric. Techniques like regression or analysis of variance can show an interaction as a composite-variable term in the model, and their algorithms include calculations to determine the size of the interaction. However, there is a lack of methods to appropriately detect and measure interactions when the variables are a mix of numerical and categorical ones.

In this work, we present a way of measuring interactions between n categorical variables for the case of samples with patterned records. In these datasets, only some of all the possible attribute-value combinations are present. We explore various datasets using the Multivariate Symmetrical Uncertainty (MSU), a recently developed entropy-based correlation measure. MSU is unbiased for representative samples, and it detects linear and non-linear associations between any mix of categorical and discretized numerical variables.

More precisely, we explore the behavior of a number of known 3-variable record structures such as XOR, AND, OR, NAND and others, plus their extensions to more variables. Simulations using different sampling scenarios on each record structure show that every n-variable pattern possesses a characteristic minimum value M_L and a characteristic maximum value M_U for the MSU correlation.
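
To see why pairwise measures are not enough here, consider total correlation, the multivariate generalization of mutual information that SU-style measures normalize (generic information measures are sketched below; the exact MSU normalization follows the authors' definition). For an XOR pattern, every pairwise association vanishes while the three-way term carries a full bit:

```python
import numpy as np
from collections import Counter
from itertools import product

def entropy(columns):
    """Shannon entropy (bits) of the joint distribution of the
    given list of discrete columns."""
    n = len(columns[0])
    counts = Counter(zip(*columns))
    return -sum(c / n * np.log2(c / n) for c in counts.values())

def total_correlation(columns):
    """C = sum_i H(X_i) - H(X_1, ..., X_n): zero iff all the
    variables are mutually independent."""
    return sum(entropy([c]) for c in columns) - entropy(columns)

def symmetrical_uncertainty(a, b):
    """Pairwise SU(A, B) = 2 I(A; B) / (H(A) + H(B)), in [0, 1]."""
    ha, hb = entropy([a]), entropy([b])
    return 2 * (ha + hb - entropy([a, b])) / (ha + hb)

# XOR pattern: z = x ^ y over all equally frequent combinations
rows = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)] * 25
x, y, z = map(list, zip(*rows))

print(symmetrical_uncertainty(x, z))    # 0: no pairwise association
print(total_correlation([x, y, z]))     # 1 bit: purely 3-way interaction
```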

It is observed that the M_L value, attained when the pattern occurs in a certain combination of frequencies, hints that interaction is intrinsically expressed by this minimum value. Other sampling scenarios, which result in higher MSU values, carry this intrinsic interaction due to the pattern itself, plus additional correlation due to extra occurrences of some configurations.

This method of quantifying n-way categorical interactions opens up new questions about the behavior of datasets that exhibit multivariate correlation, for example semi-patterned and non-patterned datasets.

  • Open access
  • 44 Reads
The Entropy of Supermassive Black Holes during Their Evaporation Time
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

Is it possible to quantify, in General Relativity (GR), the entropy generated by supermassive black holes (BHs) during their evaporation time, given the intrinsic Hawking radiation at infinity which, although insignificant, is important through its effects on the thermal quantum atmosphere?

The purpose was to develop a formula that allows us to measure the entropy generated during the evaporation time of different types of BHs: i. the remnant BHs of the binary black hole (BBH) mergers GW150914, GW151226 and LVT151012 detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO), and ii. Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman BHs; and thus to quantify in GR the "insignificant" quantum effects involved, in order to support the validity of the generalized second law (GSL), which directly links the laws of black hole mechanics to the ordinary laws of thermodynamics, as a starting point for unifying quantum effects with GR. This formula could have some relationship with the detection of the image of the shadow of a BH's event horizon.

The formula was developed through dimensional analysis, using the constants of nature and the possible evaporation time of a black hole, taking into account its distance from the Earth, to quantify the entropy generated during that time. The stress-energy tensor was calculated with the four metrics to obtain the material content and apply the proposed formula.

The entropy generated during the evaporation time of BHs proved to be insignificant, and their temperature is barely above absolute zero; however, the calculation of this type of entropy allows us to argue for the importance of the quantum effects of Hawking radiation discussed by authors who have studied quantum effects with arguments fundamentally based on the presence of the thermal atmosphere surrounding the black hole.
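
For scale, the standard Schwarzschild quantities that frame this discussion can be evaluated directly (textbook Bekenstein-Hawking formulas, not the author's proposed formula; the remnant mass below is an approximate value for GW150914):

```python
import math

# Physical constants (SI)
G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
M_SUN = 1.989e30

def hawking_temperature(m):
    """Hawking temperature: T = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8 * math.pi * G * m * k_B)

def bh_entropy(m):
    """Bekenstein-Hawking entropy S = k_B c^3 A / (4 G hbar),
    with horizon area A = 16 pi G^2 M^2 / c^4."""
    area = 16 * math.pi * (G * m) ** 2 / c**4
    return k_B * c**3 * area / (4 * G * hbar)

def evaporation_time(m):
    """Page-type evaporation estimate t = 5120 pi G^2 M^3 / (hbar c^4)."""
    return 5120 * math.pi * G**2 * m**3 / (hbar * c**4)

m = 62 * M_SUN     # roughly the GW150914 remnant mass
print(hawking_temperature(m))   # ~1e-9 K: barely above absolute zero
print(bh_entropy(m), evaporation_time(m))
```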
