Full Description
This book presents the novel idea of moving from the main tools of neural networks, the activation functions, to approximations of convolution integrals and singular integrals. This is a rare case of applied mathematics being employed to treat theoretical problems.
The authors also introduce and use symmetrized neural network operators, which achieve remarkably fast rates of convergence.
The authors use a great variety of activation functions. All work presented in this book is original and is given at a very general level, so as to cover a maximum number of different kinds of neural networks: ordinary, fractional, and stochastic approximations are treated, in univariate, fractional, and multivariate settings. Iterated-sequential multi-layer approximations are also studied.
Contents
Degree of Approximation by Parametrized Logistic Activated Convolution Operators.- Approximation by Parametrized Logistic Activated Multivariate Convolution Operators.- Degree of Approximation by Symmetrized and Perturbed Hyperbolic Tangent Activated Convolution Operators.- Approximation by Symmetrized and Perturbed Hyperbolic Tangent Activated Multivariate Convolution Operators.- Symmetrized and Perturbed Hyperbolic Tangent Neural Network Multivariate Approximation over Infinite Domains.
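The operators named in these chapters all follow a common pattern: an activation function is turned into a bell-shaped density kernel, which then weights function samples in a quasi-interpolation operator. The following is a minimal numerical sketch of that idea only, assuming a standard hyperbolic tangent activation and the generic kernel Ψ(x) = ¼(tanh(x+1) − tanh(x−1)); the specific parametrized, symmetrized, and perturbed variants studied in the book differ in their details.

```python
import math

def psi(x):
    # Illustrative density kernel derived from the tanh activation.
    # Sums to 1 over integer shifts (a partition of unity), so the
    # operator below reproduces constants exactly.
    return 0.25 * (math.tanh(x + 1.0) - math.tanh(x - 1.0))

def nn_operator(f, x, n, window=50):
    # Quasi-interpolation neural network operator:
    #   F_n(f)(x) = sum_k f(k/n) * psi(n*x - k) / sum_k psi(n*x - k)
    # Since psi decays exponentially, only nodes k near n*x matter,
    # so the infinite sum is truncated to a finite window.
    k0 = int(round(n * x))
    num = den = 0.0
    for k in range(k0 - window, k0 + window + 1):
        w = psi(n * x - k)
        num += f(k / n) * w
        den += w
    return num / den

# Usage: the approximation error shrinks as n grows.
for n in (10, 100, 1000):
    err = max(abs(nn_operator(math.sin, t / 10, n) - math.sin(t / 10))
              for t in range(11))
    print(n, err)
```

Because the kernel is even, the first moments cancel and the error decays rapidly in n; this is the kind of convergence behavior the book quantifies precisely for its more elaborate operator families.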



