"Fractional case of Composition of Activationfunctions and the reduction to finite domain" GEORGE ANASTASSIOU, UNIVERSITY OF MEMPHIS, USA
1  Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, USA
Academic Editor: Carlo Cattani

Abstract:

This work takes up the determination of the rate of fractional pointwise, uniform, and L_p (p ≥ 1) convergence to the unit operator of the "normalized cusp neural network operators". The cusp is a compact-support activation function, obtained as the composition of two general activation functions whose domain is the whole real line. These convergences are quantified via the moduli of continuity of the engaged right and left fractional derivatives, in the form of Jackson-type inequalities.
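The abstract does not display the operators themselves; in this line of work the normalized operators typically take the following form (a sketch assuming the standard squashing-type construction, with b the cusp and 0 < α < 1 a rate parameter, not necessarily the paper's exact definition):

```latex
% Typical normalized neural network operator (assumed form)
\[
  \bigl(F_n f\bigr)(x)
  = \frac{\displaystyle\sum_{k=-n^2}^{n^2}
      f\!\left(\tfrac{k}{n}\right)
      b\!\left(n^{1-\alpha}\bigl(x - \tfrac{k}{n}\bigr)\right)}
         {\displaystyle\sum_{k=-n^2}^{n^2}
      b\!\left(n^{1-\alpha}\bigl(x - \tfrac{k}{n}\bigr)\right)},
  \qquad 0 < \alpha < 1,
\]
% with a Jackson-type estimate of the general shape
\[
  \bigl|(F_n f)(x) - f(x)\bigr|
  \le C \,\omega_1\!\left(D^{\nu} f,\; n^{-\alpha}\right),
\]
% where \omega_1 is the modulus of continuity and D^{\nu} stands for the
% engaged right/left fractional derivatives.
```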
Composing activation functions yields more flexible and powerful neural networks, and it introduces, for the first time, the reduction of infinite domains to a single domain of compact support. In mathematical neural network approximation, AMS MathSciNet lists no articles related to the composition of activation functions. In this sense, the article is seminal.
By composing activation functions we obtain an activation function of compact support, even though the initial activation functions had an infinite domain: the whole real line. The resulting activation function is an open cusp of compact support [-1, 1]. The activation functions involved are very general, and the constructed neural network operators resemble the squashing operators, as do the produced quantitative results. As a result, the derived convergence inequalities are much simpler and cleaner. A complete paper will follow.
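As a concrete numerical illustration (a sketch only: the hat function below is a hypothetical stand-in for the paper's cusp, and the operator form is the standard normalized construction rather than the article's exact definition), one can observe convergence to the unit operator as n grows:

```python
import numpy as np

def cusp(x):
    # Hypothetical stand-in cusp: hat function with compact support [-1, 1].
    return np.maximum(0.0, 1.0 - np.abs(x))

def normalized_cusp_operator(f, x, n, alpha=0.5):
    # Assumed normalized operator: weighted average of the samples f(k/n)
    # with weights b(n^(1-alpha) * (x - k/n)), k = -n^2, ..., n^2.
    k = np.arange(-n * n, n * n + 1)
    nodes = k / n
    w = cusp(n ** (1.0 - alpha) * (x - nodes))
    s = w.sum()
    if s == 0.0:
        return 0.0  # x outside the operator's effective support
    return float(np.dot(w, f(nodes)) / s)

# The pointwise error |F_n(f, x) - f(x)| shrinks as n increases.
for n in (10, 100, 1000):
    err = abs(normalized_cusp_operator(np.sin, 0.3, n) - np.sin(0.3))
    print(n, err)
```

The error decays because the cusp's compact support localizes the average to nodes within distance n^(alpha - 1) of x, mirroring the role of the moduli of continuity in the Jackson-type bounds.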

Keywords: neural network approximation; fractional calculus

 
 