Machine-learning applications are increasingly affected by data unavailability, complexity, and drift, driven by massive and rapid changes in data Volume, Velocity, and Variety (3V). Recent advances in deep learning address these challenges, respectively, through generative modeling, nonlinear abstraction, and adaptive learning. Deep learning aims to learn representations that provide a consistent abstraction of the original feature space, making it more meaningful and less complex. However, data complexity caused by distortions such as high noise levels remains difficult to overcome. In this context, recurrent expansion (RE) algorithms were recently introduced to explore deeper representations than ordinary deep networks, further improving feature mapping. Unlike traditional deep learning, which extracts meaningful representations by abstracting inputs, RE merges entire deep networks into subsequent ones consecutively, allowing the Inputs, Maps, and estimated Targets (IMTs) to serve as primary sources of learning: three sources of information that together reveal how they interact within a deep network. RE also makes it possible to study the IMTs of several networks and learn significant features, improving accuracy with each round. This paper presents a general overview of RE, its main learning rules, advantages, disadvantages, and limits, while reviewing the relevant state of the art and some illustrative examples.
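To make the IMT idea concrete, the following is a minimal sketch of a recurrent-expansion loop, not the authors' implementation. It assumes scikit-learn's MLPRegressor as the base deep network, treats the first hidden layer's activations as the "Maps", and feeds the concatenated Inputs, Maps, and estimated Targets of each round into the next network:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # Inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # noisy target

def hidden_map(model, Z):
    # First hidden-layer activations (ReLU): the "Maps" component of IMT.
    return np.maximum(0, Z @ model.coefs_[0] + model.intercepts_[0])

Z = X  # round 0 learns from the raw inputs only
for r in range(3):
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=r)
    net.fit(Z, y)
    y_hat = net.predict(Z)                         # estimated Targets
    # The next round's network learns from Inputs + Maps + estimated Targets.
    Z = np.hstack([X, hidden_map(net, Z), y_hat[:, None]])
```

Each pass enlarges the learning space with the previous network's internal representation and its predictions, which is the sense in which RE explores a space "deeper" than a single deep network.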
What are Recurrent Expansion Algorithms? Exploring a New Space Deeper than Deep Learning
Published: 28 April 2023 by MDPI in The 1st International Online Conference on Mathematics and Applications, session Mathematics and Computer Science
Keywords: Recurrent expansion; deep learning; machine learning
Supplementary material: paper and poster available.