What are Recurrent Expansion Algorithms? Exploring a New Space Deeper than Deep Learning
1  University of Batna 2, Laboratory of Automation and Manufacturing Engineering, 05000 Batna, Algeria
2  University of Brest, UMR CNRS 6027 IRDL, 29238 Brest, France
3  Logistics Engineering College, Shanghai Maritime University, Shanghai 201306, China
Academic Editor: Marjan Mernik

Abstract:

Machine-learning applications nowadays often suffer from data unavailability, complexity, and drift resulting from massive and rapid changes in data Volume, Velocity, and Variety (3V). Recent advances in deep learning have brought many improvements to the field by providing generative modeling, nonlinear abstractions, and adaptive learning to address these challenges, respectively. In fact, deep learning aims to learn representations that provide a consistent abstraction of the original feature space, making it more meaningful and less complex. However, data complexity related to different distortions, such as higher levels of noise, remains difficult to overcome. In this context, recurrent expansion (RE) algorithms have recently been introduced to explore deeper representations than ordinary deep networks, providing further improvement in feature mapping. Unlike traditional deep learning, which extracts meaningful representations through input abstraction, RE allows entire deep networks to be merged into another one consecutively, enabling the exploration of Inputs, Maps, and estimated Targets (IMTs) as primary sources of learning: a total of three sources of information that reveal how they interact within a deep network. Moreover, RE makes it possible to study the IMTs of several networks and to learn significant features, improving accuracy with each round. Accordingly, this paper presents a general overview of RE, its main learning rules, advantages, disadvantages, and limits, while reviewing an important body of the state of the art and providing some illustrative examples.
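To make the round-by-round RE loop described above more concrete, below is a minimal, hypothetical Python sketch: each round trains a fresh learner on the concatenation of the original Inputs, the previous round's feature Maps, and its estimated Targets (IMTs). The paper does not prescribe this implementation; the per-round "deep network" is replaced here by a simple random-feature layer with a ridge readout, and the function train_round, the toy data, and all parameters are illustrative assumptions chosen only to keep the sketch short and runnable.

import numpy as np

rng = np.random.default_rng(0)

def train_round(Z, y, n_hidden=64, reg=1e-2):
    # Surrogate "network": random hidden layer + ridge readout (stand-in for a deep net).
    W = rng.normal(size=(Z.shape[1], n_hidden))
    M = np.tanh(Z @ W)                              # feature Map ("M" in IMT)
    A = np.hstack([M, np.ones((len(M), 1))])        # add a bias column
    w = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ y)
    y_hat = A @ w                                   # estimated Targets ("T" in IMT)
    return M, y_hat

# Toy regression data (purely illustrative).
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

Z = X                                               # round 1 sees only the Inputs
for r in range(1, 6):                               # a few RE rounds
    M, y_hat = train_round(Z, y)
    print(f"round {r}: training MSE = {np.mean((y - y_hat) ** 2):.4f}")
    Z = np.hstack([X, M, y_hat[:, None]])           # IMT: Inputs + Maps + estimated Targets

Note that feeding training-set predictions back as features in this naive way can overfit; the learning rules discussed in the paper govern how IMTs are combined across rounds, which this toy loop does not capture.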

Keywords: Recurrent expansion; deep learning; machine learning
Comments on this paper
Nor-El-Houda Beghersa
I think it contains very important results.

Aitouche Samia
I am not much of a specialist, but I found the paper very instructive and well organized.

Rafik BENSAADI
A new convergence approach is to be encouraged.
The checking/high ratio validation requires deeper mathematical analysis, and thus more contributions.

Wei Hong Lim
Inspiring work that brings significant progress to the field of study. Congratulations to the authors for the excellent work done. Looking forward to seeing more impactful work from the authors.



 
 