About 160 years ago, the concept of entropy was introduced in thermodynamics by Rudolf Clausius. Since then, it has been continually extended, interpreted, and applied by researchers in many scientific fields, such as general physics, information theory, chaos theory, data mining, and mathematical linguistics. Based on the original concept of entropy, many variants have been proposed. This paper presents a universe of entropies, which aims to review the entropies that have been applied to time series. The purpose is to answer important open research questions such as: How did each entropy emerge? What is the mathematical definition of each variant of entropy? How are entropies related to each other? What are the scientific fields in which each entropy is most applied? In answering these questions, we describe in depth the relationships among the most widely applied entropies in time series across different scientific fields, establishing a basis for researchers to choose the variant of entropy most suitable for their data.
The number of citations over the past fifteen years of each paper proposing a new entropy was assessed. The Shannon/differential, Tsallis, sample, permutation, and approximate entropies were the most cited. Based on the ten Scopus categories with the largest number of records, the areas in which entropies are most applied are computer science, physics, mathematics, and engineering. Among this top ten, the application area with the fewest citations of papers proposing new entropies is the medical category.
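To make the comparison concrete, the sketch below computes two of the most cited variants, Shannon entropy (over a binned amplitude distribution) and permutation entropy (Bandt and Pompe's ordinal-pattern approach), for a toy time series. The series values, the bin count, and the embedding dimension `m` are illustrative choices, not values taken from the reviewed papers.

```python
import math
from collections import Counter

def shannon_entropy(series, bins=4):
    # Discretize a real-valued series into equal-width bins,
    # then apply Shannon's formula H = -sum p * log2(p).
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0
    labels = [min(int((x - lo) / width), bins - 1) for x in series]
    counts = Counter(labels)
    n = len(series)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def permutation_entropy(series, m=3):
    # Count ordinal patterns of length m and compute the Shannon
    # entropy of the pattern distribution (permutation entropy).
    patterns = Counter(
        tuple(sorted(range(m), key=lambda i: series[t + i]))
        for t in range(len(series) - m + 1)
    )
    n = sum(patterns.values())
    return -sum(c / n * math.log2(c / n) for c in patterns.values())

x = [4, 7, 9, 10, 6, 11, 3]  # toy time series
print(round(shannon_entropy(x), 3))
print(round(permutation_entropy(x, m=3), 3))  # → 1.522
```

The two measures answer different questions: Shannon entropy here reflects the spread of amplitudes, while permutation entropy reflects the diversity of local orderings, which is why the suitable variant depends on the data at hand.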