
List of accepted submissions

 
 
NEAR ZERO ENERGY BUILDINGS ENTROPY PERFORMANCE

Buildings are one of the main areas of energy consumption in Europe. At the European level, recent agreements and guidelines focus on optimizing energy efficiency, and the architectural paradigm is evolving towards the nearly Zero Energy Building (nZEB). The aim of this contribution is to evaluate building energy performance in terms of entropy production and entropy flows. We designed an nZEB at La Roca del Vallès, close to Barcelona (Spain). Special attention was given to a thermal envelope adapted to the location’s climate, a highly efficient radiant floor with aerothermal equipment for both heating and cooling, and highly energy-efficient appliances and lighting, while keeping comfort conditions. The energy balance was investigated with the Cypetherm software for all energy sources and all flows exchanged with the environment. From this energy balance, and taking into account the temperatures of the building and of the environment, we investigated the entropy balance of the building in terms of the Gouy-Stodola theorem, considering entropy generation due to thermal exchanges, electrical consumption and occupancy, along with thermal flows through the envelope. We simulated the nZEB entropy performance under different efficiency designs and irradiance levels in order to establish guidelines relating entropy production to energy efficiency in buildings. As expected, the energy and entropy balances over a whole year are zero. However, entropy generation is strongly affected by the efficiency of the building along with solar irradiance. Moreover, the standard models used in architecture for building energy characterization are found to be insufficient to describe internal entropy changes and, thus, building aging. These results should help to relate energy efficiency to building life cycle analysis.
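
For reference, the Gouy-Stodola theorem invoked above links the work lost to irreversibility with the entropy generated. In a schematic building-level balance (our notation, with $T_0$ the outdoor reference temperature and $\dot{Q}_k$ the heat flows crossing the envelope at temperatures $T_k$):

$$ \frac{dS}{dt} \;=\; \sum_k \frac{\dot{Q}_k}{T_k} \;+\; \dot{S}_{\mathrm{gen}}, \qquad \dot{W}_{\mathrm{lost}} \;=\; T_0\,\dot{S}_{\mathrm{gen}} \;\geq\; 0. $$

Over a full year of cyclic operation the storage term $dS/dt$ integrates to zero, which is why the annual balances vanish while $\dot{S}_{\mathrm{gen}}$ still discriminates between designs.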

BATTERY CHARGE AND DISCHARGE STRATEGIES IN TERMS OF ENTROPY PRODUCTION

At present, batteries are the bottleneck of electric mobility, in terms of both cost and management. In this contribution, we aim to evaluate optimal battery charging in terms of entropy production and relate it to charging time and cost, with the aim of taking advantage of the vehicle-to-grid configuration. Li-ion batteries are usually charged using a constant-current constant-voltage (CC-CV) strategy. The shift between CC and CV occurs when the battery reaches the maximum charging voltage; in terms of energy efficiency, this is not the most efficient approach. We developed MATLAB® scripts to simulate the performance of a real cell during charging by shifting from constant current to constant voltage at different values of the battery state of charge b. We carried out a variational analysis of the charging/discharging processes in order to obtain their entropy performance, so that it could be related to optimal energy efficiency in terms of b and in terms of cost/benefit according to current Spanish electricity market prices. Finally, these magnitudes were studied at different charge rates and at different battery aging levels. We find that entropy is a valuable magnitude for characterizing battery charge/discharge processes and for finding the optimal strategy in terms of cost, time and energy.
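
The following minimal sketch (in Python rather than the authors' MATLAB scripts) illustrates the kind of simulation described: a CC-CV charge whose CC-to-CV shift occurs at a chosen state of charge b, with the entropy production charged to Joule heating via the Gouy-Stodola relation. The linear open-circuit-voltage curve, the cell parameters and the rule used for the CV setpoint are illustrative assumptions, not the authors' cell model.

```python
import numpy as np

Q_NOM = 3.0 * 3600   # cell capacity [C] (3 Ah) -- illustrative value
R_INT = 0.05         # internal resistance [ohm]
I_CC = 1.5           # CC-phase current [A] (0.5C)
T0 = 298.15          # ambient/cell temperature [K], assumed constant
I_CUT = 0.05         # CV-phase cutoff current [A]
DT = 1.0             # time step [s]

def ocv(soc):
    """Toy linear open-circuit-voltage curve [V] (assumption)."""
    return 3.0 + 1.2 * soc

def charge(b):
    """Charge from empty; at SOC = b hold the terminal voltage reached at
    the switch point and let the current decay (CV phase)."""
    v_hold = ocv(b) + I_CC * R_INT
    soc = t = s_gen = 0.0
    i = I_CC
    while i > I_CUT and soc < 1.0:
        i = I_CC if soc < b else (v_hold - ocv(soc)) / R_INT
        soc += i * DT / Q_NOM
        s_gen += R_INT * i**2 / T0 * DT   # Gouy-Stodola: T0*S_gen = lost work
        t += DT
    return t / 60.0, soc, s_gen

for b in (0.80, 0.90, 0.95):
    t_min, soc, s = charge(b)
    print(f"b={b:.2f}: {t_min:6.1f} min, final SOC {soc:.3f}, S_gen {s:.2f} J/K")
```

Shifting to CV earlier shortens the high-current phase and lowers the entropy produced, at the price of a lower final state of charge; sweeping b exposes exactly the cost/time/energy trade-off discussed above.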

Sample entropy approach to the examination of cardio-respiratory coupling in response to cardiac resynchronization therapy

Cardiac resynchronization therapy (CRT) is a well-established therapy for symptomatic patients with heart failure and reduced left ventricular ejection fraction. It is known that patients with heart failure have altered cardio-respiratory interactions, but it has not been examined whether resynchronization therapy leads to changes in the coupling of the cardiac and respiratory rhythms, and whether the success of this therapy restores cardio-respiratory interactions. In these patients, in addition to sinus rhythm, different types of arrhythmias usually appear, which limits the application of linear methods to the analysis of interbeat interval time series. These series should therefore be analyzed with non-linear techniques, and we applied the sample entropy approach.

Twenty minutes of ECG (RR intervals) and of the respiratory signal were recorded simultaneously in 47 patients with heart failure and a CRT indication. The interbeat interval time series of patients with sinus rhythm, sinus rhythm with ventricular extrasystoles, and atrial fibrillation were analyzed. Sample entropy values were calculated from the RR interval time series (SampEnRR) and the respiratory signal time series (SampEnResp) to assess their complexity/regularity, as well as the cross sample entropy (CrossSampEn) to estimate their asynchrony. Measurements were performed before (baseline) and approximately 9 months after CRT implantation (follow-up). At follow-up, patients were divided into two groups, responders and non-responders, according to the response to CRT, which was assessed from changes in specific clinical parameters.
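
For readers unfamiliar with the measure, sample entropy is SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates lying within a tolerance of r times the series' standard deviation and A counts the same for length m+1, excluding self-matches; CrossSampEn is analogous, with templates matched across the two series. A minimal generic implementation (not the authors' code):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series via the Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def n_matches(mm):
        # all overlapping templates of length mm
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(t) - 1):  # count pairs, excluding self-matches
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count
    b, a = n_matches(m), n_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```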

In both groups, there was no difference in SampEnRR between baseline and follow-up. However, in the non-responders group, a significant increase was found in SampEnResp and CrossSampEn (p < 0.05). Responders to CRT showed a significant decrease in heart rate and breathing frequency, while non-responders showed only a significant decrease in heart rate. Our results indicate that in non-responders to CRT the respiratory rhythm does not adapt to changes in cardiac dynamics, resulting in a loss of their synchrony.

Epoch-based Entropy: A Statistical EEG Marker for Alzheimer’s Disease Detection

The utility of electroencephalography (EEG) in Alzheimer’s disease (AD) research has been demonstrated over several decades in numerous studies. EEG markers have been employed successfully to investigate AD-related alterations, by comparing EEG recordings of AD patients to those of healthy subjects.

It is widely accepted that AD leads to a reduction in the complexity of EEG signals and to changes in EEG synchrony. These modifications have been used as discriminative features for AD diagnosis through several complexity measures, especially entropy-based ones, and synchrony measures. These measures usually suffer from two main drawbacks: first, they are computed on whole EEG sequences, without addressing the problem of EEG non-stationarity; second, they do not consider the EEG signal as a multidimensional time series, since the prevailing paradigms extract information from EEG signals by averaging them over channels.

We propose a new EEG marker based on an entropy measure, termed epoch-based entropy. This measure quantifies the information content, or disorder, of EEG signals at both the temporal and the spatial level, using local density estimation by a Hidden Markov Model on inter-channel stationary epochs.
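
As a rough illustration of the idea (not the authors' implementation), an HMM fitted to the multichannel epochs can serve as a local density model, and an entropy-like quantity can be read off as the average negative log-likelihood per sample. The sketch below assumes the hmmlearn package and a list of epochs, each an array of shape (samples, channels):

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed dependency

def epoch_based_entropy_sketch(epochs, n_states=3):
    """Fit a Gaussian HMM over stationary epochs and return the average
    negative log-likelihood per sample as an entropy estimate."""
    X = np.vstack(epochs)                  # stack epochs along time
    lengths = [len(e) for e in epochs]     # epoch boundaries for the HMM
    hmm = GaussianHMM(n_components=n_states, covariance_type="diag",
                      n_iter=100, random_state=0)
    hmm.fit(X, lengths)
    return -hmm.score(X, lengths) / len(X)
```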

We investigated the classification performance of this EEG marker, its robustness to noise, and its sensitivity to the sampling frequency and to variations of the hyperparameters. We showed that the measure is effective for AD detection, since the statistical modelling of the multidimensional EEG signal makes it possible to characterize the information content induced by the coupling of neural activity across EEG recording sites.

Houmani N, Vialatte F, Gallego-Jutglà E, Dreyfus G, et al. Diagnosis of Alzheimer's disease with electroencephalography in a differential framework. PLoS ONE. 2018;13(3):e0193607.

From Network Reconstruction to Network Econometrics: unbiased estimation of average effects

The International Trade Network (ITN) can be thought of as a set of countries (nodes) that are linked to one another by trade relationships (links). Such a network can be described at the topological level, discerning the partners of each country, and at the weighted level, where the interest shifts to the trade volume among partners.

In Economics, the focus has traditionally been on the estimation of trade volumes and marginal effects, at times overlooking the importance of the network structure. A correct estimation of this structure increases the precision of the parameter estimates and hence of the estimated trade volumes. Even when Econometric models can reproduce the expected number of zeros, they cannot reproduce the topological statistics of the ITN.

An alternative way of studying the ITN structure has been advanced by the Network Science community and consists in a constrained maximization of the Graph Entropy. A recent development is the so-called Enhanced Gravity Model (EGM), in which the Lagrange Multipliers are expressed in terms of the rescaled GDPs. The EGM has been shown to accurately predict both topological and weighted statistics. Here we rephrase the EGM as an Econometric model at different degrees of topological detail. Such a model can be used for the unbiased estimation of covariate effects on trade volumes, taking into account the possible bias induced by the underlying topological structure.
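
Schematically, in the standard fitness-model form (our notation, assumed here for illustration), the entropy-constrained topology assigns each pair of countries $i, j$ a connection probability

$$ p_{ij} \;=\; \frac{z\,\omega_i\,\omega_j}{1 + z\,\omega_i\,\omega_j}, $$

where $\omega_i$ is the rescaled GDP of country $i$ and $z$ is fixed by matching the expected number of links to the observed one; expected trade volumes conditional on a link then follow a gravity-like specification.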

Our results confirm that Network Econometric Models replicate the principal network statistics and attain the smallest AIC and BIC scores compared with the Econometric models commonly used in International Trade, such as the Poisson, Negative Binomial and Zero-Inflated models.

Our contribution sheds light on how International Trade Econometrics can be updated and completed with Network Regression Models derived from a constrained Maximum Entropy principle.

Analysing discursive communities and semantic networks on Twitter: an entropy-based approach

At the intersection between the social sciences and network theory, the aim of this presentation is to illustrate an entropy-based, data-driven approach for inferring communities of Twitter users and examining their interactions within a specific discussion. Among the various kinds of interaction, retweets were chosen as a particularly insightful relational mechanism featured by Twitter. Our approach is based on Shannon entropy maximization under certain constraints, which guarantees that the procedure is unbiased and suitable for application to any properly defined Twitter discussion.
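
Concretely (notation assumed for illustration), maximizing the Shannon entropy of the ensemble of bipartite user-content networks under degree constraints yields independent link probabilities of the fitness form

$$ p_{i\alpha} \;=\; \frac{x_i\,y_\alpha}{1 + x_i\,y_\alpha}, $$

with the Lagrange multipliers $x_i$ (users) and $y_\alpha$ (retweeted accounts) fixed by the observed degrees. This is the Bipartite Configuration Model used below as the null model against which retweet similarities are validated.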

One of the main results of our analysis is the operational definition of "discursive communities" as groups of users who share significantly similar retweeting patterns. By comparing the observed bipartite network of users' retweeting activities with the outcome of a properly defined benchmark model, i.e. the Bipartite Configuration Model (BiCM), our inference method validates the similarity in the online activity of Twitter users who share the same contents, published by the accounts of public figures, in order to infer the presence of user communities with supposedly common features. The discursive communities recovered by our method display a coherent picture of their users' behavior in terms of retweeting and mentioning activities; moreover, these communities are consistent with the political coalitions and are sensitive to the dynamics characterizing the relationships within and between these coalitions.

A second core result of our analysis concerns the study of the mechanisms that shape the Twitter discussions characterizing the aforementioned discursive communities. By monitoring, on a daily and a monthly basis, the structural evolution of the community-generated semantic networks, we observe that the users in each community are characterized by significantly different online behaviors, thus inducing semantic networks with diverse topological structures.

Assessing Transfer Entropy in cardiovascular and respiratory time series: A VARFI approach

In the study of complex biomedical systems represented by multivariate stochastic processes, such as the cardiovascular and respiratory systems, an issue of great relevance is the description of system dynamics spanning multiple temporal scales. Recently, the quantification of multiscale complexity based on linear parametric models incorporating autoregressive coefficients and fractional integration, which together encompass short-term dynamics and long-range correlations, was extended to multivariate time series. Within this Vector AutoRegressive Fractionally Integrated (VARFI) framework, formalized for Gaussian processes, in this work we propose to estimate the Transfer Entropy, or equivalently Granger Causality, in the cardiovascular and respiratory systems. This makes it possible to quantify the information flow and assess directed interactions while accounting for the simultaneous presence of short-term dynamics and long-range correlations. The proposed approach is first tested on simulations of benchmark VARFI processes, for which the transfer entropy can be computed from the known model parameters. It is then applied to experimental data consisting of heart period, systolic arterial pressure and respiration time series measured in healthy subjects monitored at rest and during mental and postural stress. Both the simulations and the real-data analysis reveal that the proposed method highlights the dependence of the information transfer on the balance between short-term dynamics and long-range correlations in coupled dynamical systems.
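
For jointly Gaussian processes, transfer entropy coincides with half the log-ratio of the residual variances of the restricted and full linear models. A minimal sketch of this estimator (a plain least-squares VAR of order p, not the VARFI estimator of the abstract, which additionally handles fractional integration):

```python
import numpy as np

def gaussian_te(x, y, p=5):
    """Transfer entropy y -> x under Gaussianity:
    TE = 0.5 * ln(var_restricted / var_full)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    x_lags = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    y_lags = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    target = x[p:]
    def resid_var(design):
        design = np.column_stack([np.ones(len(target)), design])
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return np.var(target - design @ beta)
    v_restricted = resid_var(x_lags)                   # past of x only
    v_full = resid_var(np.hstack([x_lags, y_lags]))    # past of x and y
    return 0.5 * np.log(v_restricted / v_full)
```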

Maximum Entropy Method Applied to Time-series Data in Real-time Time-Dependent Density Functional Theory

Real-time time-dependent density functional theory (TDDFT) is expected to be a key technique for calculating optical spectra. We solve the time-dependent equation, keeping track of the dipole moment as time-series data. In the traditional optical analysis, the oscillator strength distribution is related to the imaginary part of the polarizability, which is usually calculated by the Fourier transform (FT) of the dipole moment. This works well only if the time series is sufficiently long.

To obtain spectra of fairly high resolution from a relatively small amount of time-series data, we turn to the maximum entropy method (MEM). Our analysis shows that even a simple MEM provides the oscillator strength distribution at high resolution with as little as half of the evolution time required by a simple FT. In practical optical analysis, we are most interested in the lower-energy region near the band gap, in order to obtain photo-absorption and emission spectra; however, a sufficiently long time evolution is still required in the calculation.
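
A minimal sketch of the basic MEM analysis (Burg's autoregressive method; the damped two-mode signal below stands in for a dipole-moment trace, and all parameters are illustrative, not TDDFT output):

```python
import numpy as np

def burg(x, order):
    """Burg's method: AR coefficients a (a[0] = 1) and noise power e."""
    x = np.asarray(x, dtype=float)
    f, b = x[1:].copy(), x[:-1].copy()          # forward/backward errors
    a, e = np.array([1.0]), np.dot(x, x) / len(x)
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        e *= 1.0 - k * k
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, e

def mem_psd(x, order, n_freq=2048):
    """MEM power spectrum P(w) = e / |A(e^{iw})|^2 on [0, pi)."""
    a, e = burg(x, order)
    w = np.linspace(0.0, np.pi, n_freq, endpoint=False)
    A = np.exp(-1j * np.outer(w, np.arange(len(a)))) @ a
    return w, e / np.abs(A) ** 2

dt, n = 0.1, 400                     # deliberately short evolution time
t = dt * np.arange(n)
dipole = np.exp(-0.002 * t) * (np.sin(0.8 * t) + 0.5 * np.sin(0.9 * t))

w_mem, p_mem = mem_psd(dipole, order=40)
p_fft = np.abs(np.fft.rfft(dipole)) ** 2   # FT baseline on the same data
# With only 400 steps the FT bins (spacing 2*pi/400 rad/sample) cannot
# separate the modes at 0.08 and 0.09 rad/sample; the MEM spectrum can.
```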

As a further improvement of the MEM analysis, we propose using a concatenated data set made by repeating the raw data several times. In this process, we also introduce an appropriate phase for the target peak frequency to reduce the side effects of the artificial periodicity. With this procedure, we can effectively take into account the information from large lags of the autocorrelation, which represents the interesting signal in the lower-energy region.

We compared the results of our improved MEM and of the FT applied to time-series data in the spectral analysis of small-to-medium size molecules. The MEM spectra are clearly resolved: our new technique provides higher resolution in fewer time steps than the FT.

Work sharing as a metric and productivity indicator for administrative workflows

A mathematical method was developed to generate an indicator of the productivity of complex administrative workflows. The indicator provides a rough but stable characterization of productivity as a state of endosectoral (among internal sector agents) and exosectoral (among external sector agents) sharing of administrative services. It can be used by public and private managers to measure human resource efficiency in proportion to the work requests/inputs of all administrative services occurring in a given workflow.

Administrative workflow events are modeled as a nonlinear dynamical system with a randomly ordered or disordered growth rate of information processing. On this basis, a method is proposed for large-scale administrative systems, defined as structures of hybrid (continuous or discrete) system variables, iterated and composed of events attracted to fixed points. For all admissible metric-space solutions, modeling the variables from the viewpoint of Lyapunov exponential stability allows the projection of system performance, that is, of the relationship between the number of agents and the number of administrative services within an administrative workflow environment.
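
As a generic illustration of the stability criterion invoked here (the map below is a standard logistic iteration chosen for the example, not the authors' workflow model), the largest Lyapunov exponent of an iterated map can be estimated as the average of log|f'(x)| along an orbit; a negative value signals exponentially stable, fixed-point-attracted dynamics:

```python
import numpy as np

# Hypothetical stand-in for an iterated workflow-load map: the logistic map.
# r < 3 yields a stable fixed point (negative Lyapunov exponent); larger r
# can yield chaotic, "disordered" growth of information processing.

def lyapunov_logistic(r, x0=0.3, n_iter=5000, n_burn=500):
    """Largest Lyapunov exponent of x -> r*x*(1-x), averaged along an orbit."""
    x, acc = x0, 0.0
    for i in range(n_burn + n_iter):
        x = r * x * (1.0 - x)
        if i >= n_burn:
            acc += np.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
    return acc / n_iter

for r in (2.8, 3.5, 3.9):
    print(f"r={r}: lambda = {lyapunov_logistic(r):+.3f}")
```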

This definition differs from traditional or specialized key performance indicators (KPIs), such as working hours, medical certificates and per capita productivity, among others; the proposed methodology is only suitable for large-scale administrative systems with a wide range of activities and a large number of agents performing them.

For system management purposes, the discretization, computerization and monitoring of the administrative system make it possible to predict and validate the exponential function as a valid indicator for nonlinear systems in which the results are determined by the magnitude of the work-sharing effect.

A kinetic model for pedestrian evacuation in a corridor with an aggressive sparse countercurrent

The modeling of pedestrian flow is a relevant topic which can yield valuable information for urban planning as well as for improving evacuation strategies. Most works in this field rely heavily on numerics, while theoretical predictions are scarce. In the present work we propose a simple yet rich model for bidirectional pedestrian flow in a one-dimensional evacuation scenario, where a dense crowd of passive walkers exits the building while a sparse group of aggressive individuals attempts to re-enter. The model is based on a kinetic theory treatment with Boltzmann-like equations, considering a two-moment approach for the transport equations. The corresponding system is linearly analyzed in order to identify stability regions where the flow towards the exit is uninterrupted, provided the countercurrent is aggressive enough. The criterion for the onset of congestion, and thus the relevant parameters needed to avoid it, are obtained in a purely analytical fashion based on statistical physics.
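
Schematically, a two-moment closure of Boltzmann-like kinetic equations yields balance laws for the density $\rho_s(x,t)$ and mean velocity $u_s(x,t)$ of each species $s$ (the outgoing crowd and the counterflowing group). The generic form below is given for orientation only, with pressure-like terms $P_s$ from the velocity fluctuations and interaction terms $F_s$ encoding encounters between the two species (our notation, not the authors' exact equations):

$$ \partial_t \rho_s + \partial_x(\rho_s u_s) = 0, \qquad \partial_t(\rho_s u_s) + \partial_x\left(\rho_s u_s^2 + P_s\right) = F_s[\rho_1, \rho_2, u_1, u_2]. $$

Linearizing around a uniform bidirectional state and requiring perturbations to decay delimits the free-flow (stability) region referred to in the abstract.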
