
List of accepted submissions

Optimization Performance of Irreversible Refrigerators Based on Evolutionary Algorithm
In earlier works by the authors, the performance of refrigeration systems was analyzed in terms of power input, refrigeration load, and coefficient of performance (COP). In this article, a new function called the "Coefficient of Performance Exergy" (COPE) is introduced. Two objective functions, the coefficient of performance exergy and the exergy destruction, are optimized simultaneously using the multi-objective optimization algorithm NSGA-II: COPE is maximized and exergy destruction is minimized in order to obtain the best performance. Decision making is carried out with two methods, LINMAP and TOPSIS. Finally, an error analysis of the optimized values shows that the LINMAP method is preferable to the TOPSIS method.
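As a rough illustration of the TOPSIS decision-making step that picks one point from a Pareto front, here is a minimal sketch; the Pareto points, weights, and criterion directions below are invented for illustration and are not the authors' data or implementation:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    matrix  : list of rows, one per alternative (Pareto point)
    weights : importance of each criterion
    benefit : True if the criterion is maximized, False if minimized
    """
    ncrit = len(weights)
    # vector-normalize each column, then apply the weight
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    v = [[w * row[j] / n for j, (w, n) in enumerate(zip(weights, norms))]
         for row in matrix]
    # ideal best / worst value per criterion
    best = [max(col) if b else min(col) for col, b in zip(zip(*v), benefit)]
    worst = [min(col) if b else max(col) for col, b in zip(zip(*v), benefit)]
    scores = []
    for row in v:
        d_best = math.dist(row, best)
        d_worst = math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Hypothetical Pareto front: (COPE to maximize, exergy destruction to minimize)
front = [[0.80, 5.0], [0.70, 3.0], [0.55, 2.0]]
scores = topsis(front, weights=[0.5, 0.5], benefit=[True, False])
best_index = max(range(len(scores)), key=scores.__getitem__)
```

With equal weights, the closeness score trades off the two criteria, and the chosen point need not be the one with the largest COPE alone.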
Maximum Entropy Approach for Reconstructing Bivariate Probability Distributions
The main purpose of this study is to provide a useful algorithm that combines the Maximum Entropy Method (MEM) with a computational method to predict the unique form of bivariate probability distributions. The new algorithm provides reasonable estimates of target distributions that have maximum entropy. The MEM is a powerful tool for reconstructing distributions from many types of data. In this study, we apply this technique to estimate important bivariate distributions that are widely used in industrial and engineering fields, especially in cybernetics and internet systems. To examine the effectiveness of our algorithm, several simulation studies were conducted. The method provides a unique solution for finding a probability distribution based on the given information. Owing to its simple and accurate mathematical formulation and its use of presence-only data, MEM has become a well-suited method for many kinds of distribution modeling.
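A minimal sketch of the maximum-entropy idea in the bivariate discrete case (not the authors' algorithm; the marginals and perturbation below are invented): among all joint distributions consistent with given marginals, the product distribution attains the maximum entropy, and any correlated joint with the same marginals has strictly lower entropy.

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a flat list of probabilities."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Target marginals for a 2x2 bivariate distribution
px = [0.6, 0.4]
py = [0.3, 0.7]

# Maximum-entropy joint consistent with these marginals: the product form
maxent = [px[i] * py[j] for i in range(2) for j in range(2)]

# A correlated joint with the same marginals (perturb the diagonal)
eps = 0.05
correlated = [px[0] * py[0] + eps, px[0] * py[1] - eps,
              px[1] * py[0] - eps, px[1] * py[1] + eps]

assert entropy(maxent) > entropy(correlated)
```

The entropy gap between the two joints is exactly the mutual information introduced by the correlation, which is why the product form is the unique maximizer under marginal constraints.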
A Note on Bound for Jensen-Shannon Divergence by Jeffreys
The Jensen-Shannon divergence JS(p;q) is a similarity measure between two probability distributions p and q, and is currently used in a variety of disciplines. In this presentation, we provide a lower bound on the Jensen-Shannon divergence in terms of the Jeffreys J-divergence when p_i ≥ q_i is satisfied. In Lin's original paper, the upper bound in terms of the J-divergence was one quarter of it; recently, a sharper bound was reported by Crooks. We discuss upper bounds given by transcendental functions of the Jeffreys divergence by comparing their values for a binary distribution.
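A quick numerical check of the quantities involved, using the standard definitions (Crooks' sharper bound is not reproduced here; the binary distributions are arbitrary examples):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return (kl(p, m) + kl(q, m)) / 2

def jeffreys(p, q):
    """Jeffreys J-divergence: symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

# A binary distribution pair, as in the abstract's comparison
p = [0.8, 0.2]
q = [0.4, 0.6]

# Lin's classical upper bound: JS(p;q) <= J(p;q) / 4
assert js(p, q) <= jeffreys(p, q) / 4
```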
Towards the Development of a Universal Expression for the Configurational Entropy of Mixing
Several decades ago, the development of analytical expressions for the configurational entropy of mixing was an active field of research. Empirical or theoretical expressions and methods were deduced in each field of condensed matter. Several examples can be found in the literature: i) the expressions of Flory and Huggins for linear polymer solutions, ii) the Cluster Variation Method (CVM) and Cluster Site Approximation (CSA) for studying order-disorder transitions and phase equilibria in alloys, and iii) the expression of Gibbs and Di Marzio for glasses, to cite only the best-known expression in each field. Each model has its own area of research and applications; for example, CVM cannot be applied to polymer solutions, and Flory's expression is not suitable for studying order-disorder transitions in alloys. However, the traditional methodology, based on counting the number of configurations, faces severe restrictions when accurate and general expressions are sought for complex systems such as interstitial solid solutions or liquids and amorphous materials. Developing a common model for all the methods cited above just by counting configurations is impractical. However, a recent formalism for computing the configurational entropy, based on identifying energetically independent complexes within the mixture and calculating their respective probabilities, opens new possibilities for taking such a proposal seriously. The importance of this idea is not only academic in nature, i.e., the development of a unified description for all states of matter. It originates in the need for general expressions with the same level of accuracy for each state of matter. Indeed, it would be desirable to describe the liquid, glass, and solid states with the same model and level of accuracy, in order to obtain a precise description of their physical properties and phase diagrams.
This work shows that it is possible to develop such a model, although a major theoretical and computational effort will be required. The main requirement for achieving this goal is to find a complex (a cluster of atoms) suitable for describing, simultaneously, the structural features of liquids, glasses and solids. The first steps towards such a formulation are discussed in this work, building on several inspiring previous works on hard-sphere systems, metallic glasses, the CVM method, and a recently deduced analytical expression for interstitial solutions. The methodology presented here is based on identifying, through a careful analysis of the main physical features of the system, the energetically independent complexes in the mixture, and on calculating their corresponding probabilities. The examples presented show that accurate and general expressions for the configurational entropy of mixing can be developed, even for systems with no translational symmetry.
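For reference, two of the classic closed-form expressions mentioned above can be evaluated directly; this is a textbook sketch (standard formulas, not the unified model proposed in the abstract):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(x):
    """Ideal configurational entropy of mixing per mole of a binary
    solution: dS = -R * (x ln x + (1-x) ln(1-x))."""
    return -R * (x * math.log(x) + (1 - x) * math.log(1 - x))

def flory_huggins_entropy(n1, n2, r):
    """Flory-Huggins mixing entropy (in units of R) for n1 moles of
    solvent and n2 moles of polymer with r segments per chain:
    dS/R = -(n1 ln phi1 + n2 ln phi2), with volume fractions phi."""
    phi1 = n1 / (n1 + r * n2)
    phi2 = r * n2 / (n1 + r * n2)
    return -(n1 * math.log(phi1) + n2 * math.log(phi2))

# The ideal expression peaks at the equimolar composition
assert ideal_mixing_entropy(0.5) > ideal_mixing_entropy(0.3)
```

The contrast between the two functions illustrates the abstract's point: each classic expression encodes structural assumptions (point-like atoms vs. chain connectivity) that restrict it to its own class of systems.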
Energy-Driven Competitive Mechanism of Entropy Change in a Multi-Molecular System
The origin of life and its evolving dynamics are tightly related to the molecular dynamics of initially 'small' molecules in the presence of an energy source. Observationally, life is a non-stop growth of living matter with a steady increase of its complexity. This dynamics is sometimes framed in terms of entropy change. The point of this note is to discuss the limits of entropy-based analyses of life dynamics, including the limits of the statistical approach to molecular dynamics in an energy-rich multi-component system. The statistical picture introduces many 'particles' and their degrees of freedom (DoF); in equilibrium, energy is distributed equally over the DoF and the entropy is maximal. Meanwhile, real 'particles' have an internal structure with discrete energy levels and related states. Thus, the excitation of molecules and the reactivities that drive attachment and growth must be taken into account to see how growth increases the number of DoF, an option that used to lie beyond the statistical context. We therefore use a more precise quantum approach. Our analysis indicates a preferential selection of the larger (more complex) particles, which sample the environment to find and attach a matched partner. The larger particles better reuse (rather than lose) the captured energy, while the smaller ones lose energy and end up used as building components. The net result is an energy-driven, non-stop competitive growth of increasingly complex particles by selective molecular sampling, for as long as the energy source and the building components remain available. This is a 'side effect' of the primary process of competitive redistribution of incoming energy into an increasingly large number of DoF: the energy loss due to dissipation becomes minimal and the energy reuse maximal. The growth increases the number of DoF, making it easier for energy to spread. Statistically, this may be viewed as an overall increase of entropy.
However, due to the permanence of energy gradients and the unlimited availability of 'materials', the total equipartition may take a 'long' time - 4.5 Byr and still going... Meanwhile, for a separate species and over an 'in-between' time, the energy gradient is still in the process of spreading and remains steadily away from equilibrium, so the entropy steadily decreases.
Neural Wave Interference in Inhibition-Stabilized Networks
We study the spatiotemporal dynamics of activation in chains of inhibition-stabilized neural networks with nearest-neighbor coupling. We identify the regions of parameter space where the network behavior is stable with respect to corrugation perturbations, allowing us to investigate the dynamical regimes associated with these regions. The neuronal excitation generated by local stimuli in such networks propagates across space and time, forming spatiotemporal waves that affect the excitation generated by inputs separated in space and time. These interactions form characteristic interference patterns, manifested as intrinsic preferences of the network for specific spatial and temporal frequencies of luminance modulation in the stimulus, and for specific stimulus velocities. Notably, the interference leads to characteristic contextual ("lateral") interactions between stimuli, which were previously attributed to distinct specialized mechanisms.
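A minimal Euler-integration sketch of the setting (all parameters invented, not the paper's model): a chain of linear excitatory-inhibitory units in the inhibition-stabilized regime (w_ee > 1, so each excitatory subnetwork alone is unstable and is stabilized by feedback inhibition), with nearest-neighbor excitatory coupling. A brief local pulse at one end propagates along the chain as a decaying wave.

```python
# Chain of N linear E-I units; hypothetical parameters chosen to be stable
N, dt, steps = 10, 0.02, 1000
w_ee, w_ei, w_ie, w_ii, c = 2.0, 2.5, 2.5, 2.0, 0.2

E = [0.0] * N          # excitatory rates (deviation from baseline)
I = [0.0] * N          # inhibitory rates
reached = [0.0] * N    # peak excitation seen at each node

for step in range(steps):
    t = step * dt
    # brief external pulse applied only to node 0
    drive = [1.0 if (k == 0 and t < 1.0) else 0.0 for k in range(N)]
    newE, newI = E[:], I[:]
    for k in range(N):
        neighbors = (E[k - 1] if k > 0 else 0.0) + (E[k + 1] if k < N - 1 else 0.0)
        dE = -E[k] + w_ee * E[k] - w_ei * I[k] + c * neighbors + drive[k]
        dI = -I[k] + w_ie * E[k] - w_ii * I[k]
        newE[k] = E[k] + dt * dE
        newI[k] = I[k] + dt * dI
    E, I = newE, newI
    for k in range(N):
        reached[k] = max(reached[k], E[k])

# The pulse reaches distant nodes with attenuated amplitude
assert reached[0] > reached[5] > 0.0
```

Even this toy version shows the key ingredient the abstract builds on: locally injected excitation does not stay local, so responses to spatially and temporally separated inputs can interfere.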
Information Geometry in Gaussian Random Fields: Searching for an Arrow of Time in Complex Systems
Random fields are characterized by intricate non-linear relationships between their elements over time. But what is a reasonable intrinsic definition of time in such complex systems? Here, we discuss the problem of characterizing the notion of time in isotropic pairwise Gaussian random fields. In particular, we are interested in the behavior of these fields as the temperature deviates from infinity. Our investigation focuses on the relation between entropy and Fisher information, through the definition of the Fisher curve. The results suggest the emergence of an arrow of time as a consequence of asymmetric geometric deformations in the random field model's metric tensor. In terms of information geometry, taking a random field from a lower-entropy state A to a higher-entropy state B and then bringing it back to A induces a natural, intrinsic one-way direction of evolution. In practical terms, the forward and backward trajectories in the information space differ, suggesting that the deformations induced by the metric tensor on the parametric space (manifold) are not reversible for positive and negative displacements along the inverse-temperature direction. In other words, there is only one possible orientation in which to move through different entropic states along the Fisher curve.
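As a univariate stand-in for the entropy/Fisher-information interplay (not the paper's random-field computation; the Gaussian model here is a textbook simplification), the two quantities move in opposite directions as the variance parameter is displaced:

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy (nats) of N(mu, sigma^2): 0.5*ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def fisher_info_mean(sigma):
    """Fisher information of N(mu, sigma^2) with respect to mu: 1/sigma^2."""
    return 1.0 / sigma ** 2

# Moving toward a higher-entropy state lowers the Fisher information,
# so trajectories through (entropy, Fisher information) space are directed.
assert gaussian_entropy(2.0) > gaussian_entropy(1.0)
assert fisher_info_mean(2.0) < fisher_info_mean(1.0)
```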
Some New Insights Into Information Decomposition in Complex Systems Based on Common Information
We take a closer look at the structure of the bivariate dependency induced by a pair of predictor random variables (X, Y) trying to encode a target random variable Z. The information that the pair (X, Y) contains about the target Z can have aspects of redundant information (contained identically in both X and Y), of unique information (contained exclusively in either X or Y), and of synergistic information (available only when X and Y are taken together). Williams and Beer proposed such a decomposition for the general case of K predictors, to specify how the total information about the target is shared across the singleton predictors and their overlapping or disjoint coalitions. However, effecting a non-negative decomposition is known to be a surprisingly difficult problem, even for the case of K = 3. In particular, it is not always possible to attribute operational significance to all the atoms induced by the decomposition. What operational questions should an ideal measure of redundant or unique information answer? In this paper, we seek to demonstrate the richness of this question through the lens of network information theory. We evaluate a recently proposed measure of redundancy based on Gács and Körner's common information (Griffith et al., Entropy, vol. 16, no. 4, pp. 1985–2000, 2014) and show that the measure, in spite of its elegance, is degenerate for most non-trivial distributions. We show that Wyner's common information also fails to capture the notion of redundancy, as it violates an intuitive monotonically non-increasing property. We further show why a combinatorial dual of Gács and Körner's common information is unremarkable as a measure of unique information. We identify a set of conditions under which a conditional version of Gács and Körner's common information is an ideal measure of unique information. Finally, we show how the notions of approximately sufficient statistics and the conditional information bottleneck can be used to quantify unique information.
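The textbook example of purely synergistic information is Z = X xor Y with independent fair bits: neither predictor alone carries any information about the target, yet together they determine it completely. A quick check (standard example, not a computation from the paper):

```python
import math
from itertools import product

def mutual_info(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Joint distribution of (X, Y, Z) with Z = X xor Y, X and Y fair and independent
pxyz = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

i_x_z = mutual_info({(x, z): p for (x, y, z), p in pxyz.items()})
i_y_z = mutual_info({(y, z): p for (x, y, z), p in pxyz.items()})
i_xy_z = mutual_info({((x, y), z): p for (x, y, z), p in pxyz.items()})
```

Here i_x_z and i_y_z are both zero while i_xy_z is one full bit, so the whole bit is synergistic: no decomposition can assign it to redundancy or to either predictor's unique information.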
An Open Logic Approach to EPM
Recently, the intrinsic Self-Reflexive Functional Logical Closure of the Elementary Pragmatic Model (EPM) contributed to finding an original solution to the dreadful double-bind problem in classic information and algorithmic theory. EPM is a highly versatile didactic tool, and new application areas are envisaged continuously. In turn, this new awareness has allowed us to enlarge our panorama for understanding neurocognitive system behaviour, and to develop information conservation and regeneration systems in a numeric self-reflexive/reflective evolutive reference framework. Unfortunately, a logically closed model cannot cope with ontological uncertainty by itself; it needs a complementary logical-aperture operational support extension. We apply our Anticipatory Learning System (ALS) approach to EPM. In this way, it is possible to use two coupled irreducible information management subsystems, based on the following ideal coupled irreducible asymptotic dichotomy: an "Information Reliable Predictability" subsystem and an "Information Reliable Unpredictability" subsystem. To behave realistically, the overall system must guarantee both Logical Closure and Logical Aperture, both fed by environmental "noise" (or rather, by what human beings call "noise"). In this way, a natural operating point can emerge as a new Trans-disciplinary Reality Level, out of the interaction of the two complementary irreducible information management subsystems. Building on this idea, it is possible to create an Evolutive Elementary Pragmatic Model (E2PM) able to profit from both the classic EPM intrinsic Self-Reflexive Functional Logical Closure and a new numeric Self-Reflective Functional Logical Aperture. EPM can be thought of as a reliable starting subsystem for initializing a process of continuous self-organizing and self-logic learning refinement.
Through hypercube geometric algebra, we propose a notation that goes beyond a mere difference of format and is constructed to facilitate inferences on either a diagrammatic representation or a lexical one. The latter, in particular, allows operations on complex propositions within hypercubes of more than three dimensions, which are difficult to visualize mentally.
Variations of Neighbor Diversity for Fraudster Detection in Online Auction
Inflated reputation fraud is a serious problem in online auctions. Recently, neighbor diversity based on Shannon entropy has been proposed as an effective feature for discerning fraudsters from normal users. In the literature, there exist many different methods for quantifying diversity, which raises the problem of finding the method best suited to calculating neighbor diversity for fraudster detection. In this study, we collect four different methods of quantifying diversity and apply them to calculate neighbor diversity. We then use these various neighbor diversities for fraudster detection. Our experimental results on a dataset collected from a real-world auction website show that, although these diversities are calculated differently, their performance in fraudster detection is similar.
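A minimal sketch of the Shannon-entropy variant of neighbor diversity (the user IDs below are hypothetical; the paper's other three diversity measures are not reproduced): a fraudster trading repeatedly within a small ring of accomplices yields a low score, while a normal user trading with many distinct partners yields a high one.

```python
import math
from collections import Counter

def neighbor_diversity(neighbors):
    """Shannon entropy (bits) of a user's transaction-neighbor distribution."""
    counts = Counter(neighbors)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical transaction histories (lists of counterparty IDs)
ring_user = ["acc1", "acc1", "acc1", "acc2", "acc2", "acc2"]
normal_user = ["u1", "u2", "u3", "u4", "u5", "u6"]

assert neighbor_diversity(ring_user) < neighbor_diversity(normal_user)
```

Both users here have six transactions, so the feature separates them purely by how the transactions are spread over counterparties, not by volume.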