List of accepted submissions

  • Open access
  • 46 Reads
Temperature-dependent energy levels and the second law formulation for engineering devices

We give an overview, within the scope of the author's knowledge, of the fundamental importance of the temperature dependence of energy levels in materials for quantifying the heat dissipated in thermodynamically operated devices. After a brief review of the relevant studies of temperature dependence in semiconductors, we point out that the entropy produced, or heat dissipated, in devices based on thermoelectric effects such as the Peltier, Seebeck and Thomson effects can be reformulated by including the effect of temperature-dependent energy levels in the materials (semiconductors). A statistical-mechanical account of the origin of the temperature-dependent energy levels is also given, based on a general ensemble theory.
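
As a rough illustration of how temperature-dependent levels enter the statistical-mechanical bookkeeping (a hypothetical two-level example, not the author's formulation), the Python sketch below evaluates Boltzmann occupations for levels $\epsilon_i(T)$ and shows that the temperature derivative of the mean energy contains an explicit $\langle d\epsilon/dT \rangle$ contribution in addition to the usual occupation-redistribution term.

```python
import numpy as np

# Hypothetical two-level system whose upper level shifts linearly with temperature.
kB = 8.617e-5          # Boltzmann constant, eV/K
eps0 = 0.0             # ground level, eV (fixed)
eps1_0 = 0.10          # upper level at T = 0, eV
alpha = 2.0e-5         # assumed linear shift of the upper level, eV/K

def levels(T):
    return np.array([eps0, eps1_0 + alpha * T])

def mean_energy(T):
    eps = levels(T)
    w = np.exp(-eps / (kB * T))
    p = w / w.sum()
    return (p * eps).sum(), p

T, dT = 300.0, 0.01
U1, p = mean_energy(T)
U2, _ = mean_energy(T + dT)

total = (U2 - U1) / dT                            # full d<eps>/dT
explicit = (p * np.array([0.0, alpha])).sum()     # <d eps/dT> term from the level shift
print(f"d<eps>/dT total           : {total:.3e} eV/K")
print(f"explicit <d eps/dT> term  : {explicit:.3e} eV/K")
print(f"occupation-redistribution : {total - explicit:.3e} eV/K")
```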

  • Open access
  • 83 Reads
Maximum Entropy Analysis of Flow Networks with Nonlinear Constraints

The concept of a flow network - a set of nodes connected by flow paths - encompasses many different disciplines, including electrical, pipe flow, transportation, chemical reaction, ecological, epidemiological, economic and human social networks. Over the past two years, we have developed a maximum entropy (MaxEnt) method to infer the stationary state of a flow network, subject to “observable” constraints on expectations of various parameters, “physical” constraints such as conservation (Kirchhoff's) laws and frictional properties, and “graphical” constraints due to uncertainty in the network structure itself. The method enables the probabilistic prediction of physical parameters and (if necessary) the graphical properties of the network, when there is insufficient information to obtain a closed-form solution. A number of analytical, semi-analytical and numerical tools have been developed for the handling of nonlinear constraints, and for extracting analytical and/or numerical solutions. The method is demonstrated by application to the analysis of (i) a 1123-node, 1140-pipe urban water distribution network; (ii) a 327-node urban electrical power network with distributed sources; and (iii) an urban road network.
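
As a minimal illustration of the MaxEnt machinery described above (a single "observable" constraint on one flow variable, not the authors' full formulation with physical and graphical constraints), the Python sketch below recovers the exponential-family MaxEnt state by solving for the Lagrange multiplier; the flow states and target value are invented for the example.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical discretized flow states for a single pipe (candidate flow rates).
flows = np.linspace(0.0, 10.0, 21)
target_mean = 4.0   # assumed "observable" constraint: E[Q] = 4

# The MaxEnt solution has the exponential form p_i ∝ exp(-lam * q_i);
# the multiplier lam is chosen so that the constraint E[Q] = target_mean holds.
def mean_flow(lam):
    w = np.exp(-lam * flows)
    p = w / w.sum()
    return p @ flows

lam = brentq(lambda l: mean_flow(l) - target_mean, -5.0, 5.0)
p = np.exp(-lam * flows)
p /= p.sum()

print(f"Lagrange multiplier: {lam:.4f}")
print(f"implied mean flow  : {p @ flows:.4f}")
print(f"entropy            : {-(p * np.log(p)).sum():.4f} nats")
```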

  • Open access
  • 68 Reads
Entropy production for the terminal orientation of a half cylinder in a flow

The terminal orientation of a rigid body is a classic example of a system out of thermodynamic equilibrium and a perfect testing ground for the validity of the maximum entropy production principle (MEPP). A freely falling body in a quiescent fluid generates fluid flow around the body, resulting in dissipative losses. Thus far, dynamical equations have been employed to derive the equilibrium states of such falling bodies, but they are far too complex and become analytically intractable when inertial effects come into play. At that stage, our only recourse is to rely on numerical techniques, which can be computationally expensive. In our past work, we have found the MEPP to be a reliable tool for predicting the mechanical equilibrium states of freely falling, highly symmetric bodies such as cylinders, spheroids and toroidal bodies. We have been able to show that the MEPP correctly selects the stable equilibrium in cases when the system is slightly out of thermodynamic equilibrium. In the current paper, we expand our analysis to examine bodies with fewer symmetries than previously reported, for instance a half-cylinder. Using two-dimensional numerical studies at Reynolds numbers substantially greater than zero, we examine the validity of the MEPP. Does the principle still hold when a sedimenting body is no longer isotropic or lacks three planes of symmetry? In addition, we examine the relation between entropy production and dynamical quantities such as the drag force to find possible qualitative relations between them.
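
As a back-of-the-envelope illustration of the link between entropy production and drag mentioned above (not the model used in the paper), the sketch below estimates the entropy production rate of a body settling at terminal speed as the drag power divided by the fluid temperature; all values are hypothetical.

```python
# At steady state the drag power F_D * U is dissipated into the fluid at
# temperature T_inf, so dS/dt ~ F_D * U / T_inf.  All numbers are hypothetical.
rho_body, rho_fluid = 2500.0, 1000.0   # kg/m^3
volume = 1.0e-6                         # m^3
g = 9.81                                # m/s^2
U = 0.05                                # terminal speed, m/s (assumed)
T_inf = 293.0                           # fluid temperature, K

# At terminal velocity the drag balances the net weight.
F_drag = (rho_body - rho_fluid) * volume * g
entropy_rate = F_drag * U / T_inf
print(f"drag force        : {F_drag:.4f} N")
print(f"entropy production: {entropy_rate:.3e} W/K")
```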

  • Open access
  • 54 Reads
Toward improved understanding of the physical meaning of entropy in classical thermodynamics

This year marks the 150th anniversary of the concept of entropy, introduced into thermodynamics by Rudolf Clausius. Despite its central role in the mathematical formulation of the Second Law and most of classical thermodynamics, its physical meaning continues to be elusive and confusing. This is particularly the case when one invokes the connection between the classical thermodynamics of a system and the statistical behavior of its constituent microscopic particles.

This paper sketches Clausius' approach to its definition and offers a modified mathematical definition that is still in the spirit of Clausius' derivation. In the modified version, the differential of specific entropy appears as a non-dimensional energy term that captures the invigoration or reduction of microscopic motion upon addition or withdrawal of heat from the system. It is also argued that heat transfer is a better thermodynamic model process for illustrating the concept of entropy than the canonical heat engines and refrigerators, which are not relevant to new areas of thermodynamics (e.g. the thermodynamics of biological systems). In this light, it is emphasized that entropy changes, as invoked in the Second Law, are necessarily related to the non-equilibrium interactions of two or more systems that might initially have been in thermal equilibrium but at different temperatures. The overall direction of entropy increase indicates the direction of naturally occurring heat transfer processes in an isolated system of internally interacting (non-isolated) subsystems.

We discuss the implications of the proposed modification for the interpretation of entropy in statistical thermodynamics, as well as for the formulation of the most common thermodynamic potentials.
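
To make the heat-transfer illustration concrete, here is a minimal sketch (in Python, with arbitrarily chosen numbers) of the textbook calculation it refers to: a quantity of heat Q passes from a hot reservoir to a cold one, each reservoir's entropy changes by ±Q/T, and the total entropy of the isolated pair increases.

```python
# Entropy change when heat Q flows from a hot body to a cold body,
# each treated as a reservoir at fixed temperature.  Values are arbitrary.
Q = 100.0        # heat transferred, J
T_hot = 400.0    # K
T_cold = 300.0   # K

dS_hot = -Q / T_hot          # hot body loses entropy
dS_cold = +Q / T_cold        # cold body gains entropy
dS_total = dS_hot + dS_cold  # isolated composite system

print(f"dS_hot   = {dS_hot:+.4f} J/K")
print(f"dS_cold  = {dS_cold:+.4f} J/K")
print(f"dS_total = {dS_total:+.4f} J/K  (> 0, heat flows hot -> cold)")
```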

  • Open access
  • 96 Reads
Can we describe the evolution of the cosmic event horizon with the maximum entropy production principle?

The universe is dominated by a non-zero energy of the vacuum ($\rho_\Lambda$) that is making the expansion of the universe accelerate. This acceleration produces a cosmic event horizon with associated entropy $S_{CEH} \sim \rho_\Lambda^{-1}$. When this entropy is included in the entropy budget of the universe, it dominates the entropy of the next largest reservoir, supermassive black holes, by 19 orders of magnitude: $10^{122}\,k \gg 10^{103}\,k$. Here we address the issue of how one might apply the maximum entropy production principle (MEPP) to the processes responsible for the production of the cosmic event horizon entropy.
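
For orientation only, the $\sim 10^{122}\,k$ figure can be recovered from the horizon-entropy estimate $S = k A / (4 l_p^2)$ applied to a horizon of radius $c/H_0$; the Python sketch below is just this order-of-magnitude check with assumed present-day values, not the calculation used in the paper.

```python
import math

# Order-of-magnitude check of the cosmic event horizon entropy, treating the
# horizon as a de Sitter-like horizon of radius c/H0 and using S = k*A/(4*l_p^2).
c = 2.998e8          # m/s
G = 6.674e-11        # m^3 kg^-1 s^-2
hbar = 1.055e-34     # J s
H0 = 2.3e-18         # Hubble rate, 1/s (~70 km/s/Mpc)

R = c / H0                       # horizon radius, m
A = 4.0 * math.pi * R**2         # horizon area, m^2
lp2 = hbar * G / c**3            # Planck length squared, m^2
S_over_k = A / (4.0 * lp2)       # entropy in units of Boltzmann's constant

print(f"S_CEH / k ~ {S_over_k:.1e}")   # ~1e122, consistent with the abstract
```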

  • Open access
  • 82 Reads
Shannon’s entropy usage as statistic

The distribution of measured data is important in applied statistics for conducting an appropriate statistical analysis. Different statistics are used to assess a general null hypothesis (H0): the data follow a specific distribution. Shannon’s entropy (H1) is introduced here as such a statistic, and its performance is evaluated against the Anderson-Darling (AD), Kolmogorov-Smirnov (KS), Cramér-von Mises (CM), Kuiper V (KV), and Watson U2 (WU) statistics.

A contingency table covering four continuous distributions (error function, generalized extreme value, normal, and lognormal), six statistics (including Shannon’s entropy as a statistic), and fifty datasets of active chemical compounds with sample sizes from 14 to 1714 was constructed. Fisher's combined probability test was applied to obtain an overall p-value from the different tests bearing upon the same null hypothesis for each dataset. Two scenarios were analyzed, one without (Scenario 1: AD & KS & CM & KV & WU) and one with (Scenario 2: AD & KS & CM & KV & WU & H1) inclusion of Shannon’s entropy as a statistic.

One hundred and sixty-eight rows of cases were valid and included in the analysis. The number of H0 rejections varied from 0 to 14:

Distribution                 AD  KS  CM  KV  WU  H1  Scenario 1  Scenario 2
Error function                1   3   2  10   8   0          10          10
Generalized extreme value     2   1   0   9   7   0           9           9
Lognormal                     0   3   0  14  12   1          16          14
Normal                        1   5   1  12  11   0          12          12
Shannon’s entropy (H1) was the statistic with the smallest number of rejections. The overall combined test gave identical results for the error function, generalized extreme value, and normal distributions whether Shannon’s statistic was included (Scenario 2) or not (Scenario 1). In the case of the lognormal distribution, inclusion of Shannon’s statistic decreased the number of rejections from 16 to 14.
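
For readers who want to reproduce the scenario comparison mechanically, the sketch below uses a synthetic dataset rather than the authors' chemical-compound data: it obtains p-values from standard goodness-of-fit tests, adds a placeholder entropy-based p-value estimated by simulation (the abstract does not specify the null distribution of H1), and combines them with Fisher's method under the two scenarios.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=0.5, size=200)   # synthetic dataset

# Fit the hypothesized distribution and get p-values from standard GoF tests.
shape, loc, scale = stats.lognorm.fit(data, floc=0)
fitted = stats.lognorm(shape, loc=loc, scale=scale)
p_ks = stats.kstest(data, fitted.cdf).pvalue
p_cm = stats.cramervonmises(data, fitted.cdf).pvalue

def fisher_combined(pvalues):
    """Fisher's method: -2 * sum(log p) ~ chi-squared with 2k degrees of freedom."""
    pvalues = np.asarray(pvalues)
    statistic = -2.0 * np.log(pvalues).sum()
    return stats.chi2.sf(statistic, df=2 * pvalues.size)

def entropy_pvalue(sample, dist, n_sim=500, bins=20):
    """Placeholder p-value for a histogram-entropy statistic, obtained by
    simulating the statistic's null distribution from the fitted model."""
    def hist_entropy(x):
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log(p)).sum()
    h_obs = hist_entropy(sample)
    h_sim = np.array([hist_entropy(dist.rvs(size=sample.size, random_state=rng))
                      for _ in range(n_sim)])
    p_two_sided = 2.0 * min((h_sim <= h_obs).mean(), (h_sim >= h_obs).mean())
    return float(min(1.0, max(p_two_sided, 1.0 / n_sim)))

p_h1 = entropy_pvalue(data, fitted)
print("Scenario 1 (KS & CM):     ", fisher_combined([p_ks, p_cm]))
print("Scenario 2 (KS & CM & H1):", fisher_combined([p_ks, p_cm, p_h1]))
```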

  • Open access
  • 100 Reads
Local effect of asymmetry deviations from Gaussianity using information-based measures

In this paper local sensitivity measures are proposed to evaluate deviations from multivariate normality caused by asymmetry; the model we use to regulate asymmetry is the multivariate skew-normal distribution, because it reflects the deviation in a very tractable way. The paper also examines the connection between local sensitivity and Mardia’s and Malkovich-Afifi’s skewness indices. Once the local sensitivity measures have been introduced, we study the effect of local perturbations in asymmetry on the conditional distributions; this issue has important implications because there are many procedures in statistics and other fields where the output depends on the distribution of some variables for known values of the others. The proposed measures use the Kullback-Leibler divergence to evaluate dissimilarities between probability distributions, in order to assess deviations from normality in the joint distribution as well as in the marginal and conditional distributions. The results are illustrated with some examples.
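
As a toy, univariate version of the kind of comparison described above (not the authors' multivariate measures), the sketch below estimates by Monte Carlo the Kullback-Leibler divergence between a skew-normal distribution and a moment-matched normal distribution, showing how the divergence grows with the skewness parameter.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def kl_skewnorm_vs_normal(alpha, n=200_000):
    """Monte Carlo estimate of KL( SN(alpha) || N(mu, sigma) ), where the
    normal is matched to the skew-normal's mean and standard deviation."""
    sn = stats.skewnorm(alpha)
    x = sn.rvs(size=n, random_state=rng)
    normal = stats.norm(loc=sn.mean(), scale=sn.std())
    return np.mean(sn.logpdf(x) - normal.logpdf(x))

for alpha in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(f"alpha = {alpha:4.1f}   KL ~ {kl_skewnorm_vs_normal(alpha):.4f}")
```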

  • Open access
  • 97 Reads
Asymptotic behaviour of the weighted Shannon differential entropy in a Bayesian problem

Consider a Bayesian problem of success probability estimation in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the weighted differential entropy for the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Suppose one wishes to know, with high precision, whether the coin is approximately fair, while for large $n$ one is interested in the true frequency. In other words, the statistical decision is particularly sensitive in a small neighbourhood of the particular value $\gamma = 1/2$. For this purpose the concept of weighted differential entropy is used. It is shown that when $x$ is a fixed proportion of $n$, after an appropriate normalization the limiting distribution is Gaussian, and the standard differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. We also find that the weight in the suggested form does not change the asymptotic form of the Shannon and Rényi differential entropies, but changes the constants.
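
To see the unweighted part of this asymptotics numerically, the following sketch (a simplified check, not the paper's weighted-entropy analysis) evaluates the differential entropy of the Beta$(x+1, n-x+1)$ posterior for a fixed success proportion and compares it with the entropy of the matching Gaussian approximation as $n$ grows.

```python
import numpy as np
from scipy.special import betaln, digamma

def beta_entropy(a, b):
    """Differential entropy of Beta(a, b):
    ln B(a,b) - (a-1)*psi(a) - (b-1)*psi(b) + (a+b-2)*psi(a+b)."""
    return (betaln(a, b) - (a - 1) * digamma(a) - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

theta = 0.5   # assumed fixed success proportion x/n

for n in [10, 100, 1000, 10000]:
    x = int(theta * n)
    a, b = x + 1, n - x + 1                                  # posterior under a uniform prior
    sigma = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))    # posterior standard deviation
    print(f"n = {n:6d}   h(Beta posterior) = {beta_entropy(a, b):+.4f}   "
          f"h(Gaussian approx) = {gaussian_entropy(sigma):+.4f}")
```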

  • Open access
  • 56 Reads
On the Daróczy-Tsallis capacities of discrete channels

There has been extensive work in the past on generalized entropies and generalized channel capacities. One of the first contributions was by Daróczy, who introduced a new parameterized generalization of the Shannon entropy that reduces to the Shannon case when the parameter is set to one. A variant of this entropy, with a different normalization constant, was later proposed by Tsallis, who set it up as a basis for non-extensive statistical mechanics. Based on the generalized entropy, Daróczy introduced a generalized mutual information which shares several important properties with the Shannon case, such as symmetry with respect to the input/output channel distributions, non-negativity (if the parameter is greater than one) and obeying the chain rule. Daróczy also introduced a generalized channel capacity as the maximum of the generalized mutual information and derived expressions for the capacities of the symmetric channel and, as a special case, the binary symmetric channel.

In this paper we provide new expressions for the Daróczy capacities of the weakly symmetric channel, the binary erasure channel and the Z-channel, extending the previous work by Daróczy. As in the Shannon case, the capacity of the weakly symmetric channel is expressed as the source entropy reduced by the entropy of a row of the transition matrix (scaled by an appropriate constant), the capacity of the binary erasure channel is expressed as the q-average number of bits which can be recovered after transmission, while the capacity of the Z-channel is expressed in terms of the q-logarithm and q-exponential of the generalized binary entropy function. All the expressions are general and can be applied directly to the Tsallis entropy, reducing to the Shannon capacity results in the limiting case where the parameter tends to one.
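
As a small numerical companion (the q-logarithm and entropy below follow the common Tsallis conventions, which may differ from the paper's normalization), the sketch computes the q-entropy of a binary source and checks that it reduces to the Shannon entropy as the parameter q tends to one.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm; reduces to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    """S_q(p) = sum_i p_i * q_log(1/p_i); reduces to the Shannon entropy (in nats) as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * q_log(1.0 / p, q)))

p = [0.9, 0.1]   # hypothetical binary source
for q in [0.5, 0.9, 0.99, 1.0, 1.5, 2.0]:
    print(f"q = {q:4.2f}   S_q = {tsallis_entropy(p, q):.6f}")
print("Shannon entropy (nats):", -sum(pi * np.log(pi) for pi in p))
```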

  • Open access
  • 104 Reads
Entropy Measures in Finance and Risk Neutral Densities

The application of entropy in finance can be regarded as an extension of information entropy and probability entropy. It can be an important tool in various financial methods such as the measurement of risk, portfolio selection, option pricing and asset pricing. A typical example in the field of option pricing is the Entropy Pricing Theory (EPT) introduced by Les Gulko [1996]. The Black-Scholes model [1973] embodies the idea of no arbitrage, which implies the existence of universal risk-neutral probabilities, but unfortunately it does not guarantee their uniqueness. In a second step, the parameterization of these risk-neutral probabilities requires a stochastic-calculus framework; for example, the Black-Scholes framework is driven by geometric Brownian motion (GBM). The existence of risk-neutral probabilities in the field of option pricing, and their uniqueness, is therefore vital. The Shannon entropy can be used to evaluate the entropy of a probability density distribution around some points, but specific events, for example deviations from the mean or sudden news driving stock returns up (or down), require additional information, and this concept of entropy can be generalized. If we want to compare the entropies of two distributions with respect to these two events, i.e. deviation from the mean and sudden news, then the Shannon entropy [1964] implicitly assumes a certain exchange, a compromise between contributions from the tail and from the main mass of the distribution. It is important to control this trade-off explicitly. Entropy measures that depend on powers of the probability, for example those of Tsallis [1988], Kaniadakis [2001], Ubriaco [2009], Shafee [2007] and Rényi [1961], provide such control.

In this article we use entropy measures that depend on powers of the probability. We propose some entropy maximization problems in order to obtain the risk-neutral densities, and we also present the European call and put options in this framework.
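
To give a flavour of the kind of entropy-maximization problem involved, here is a generic sketch using Shannon entropy over a discretized terminal-price grid (the paper works with power-type entropy measures instead): it finds the maximum-entropy density whose mean equals the forward price and prices a European call and put against it. All market parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical market inputs.
S0, r, T = 100.0, 0.02, 1.0           # spot, risk-free rate, maturity
K = 105.0                             # call/put strike
grid = np.linspace(40.0, 200.0, 81)   # discretized terminal prices S_T

forward = S0 * np.exp(r * T)

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},         # density normalization
    {"type": "eq", "fun": lambda p: p @ grid - forward},    # martingale constraint (mean = forward)
]
p0 = np.full(grid.size, 1.0 / grid.size)
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * grid.size, constraints=constraints,
               options={"maxiter": 500})
q = res.x                                                   # risk-neutral weights

call = np.exp(-r * T) * q @ np.maximum(grid - K, 0.0)
put = np.exp(-r * T) * q @ np.maximum(K - grid, 0.0)
print(f"MaxEnt call price: {call:.4f}")
print(f"MaxEnt put price : {put:.4f}")
print(f"put-call parity check: {call - put:.4f} vs {S0 - K*np.exp(-r*T):.4f}")
```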
