Stochastic One-Step Training for Feedforward Artificial Neural Networks
Overview
abstract
This paper studies a fast, non-iterative method for training feedforward neural networks in which the weights of the hidden layer are assigned randomly and the weights of the output layer are fitted by linear regression. The method addresses two problems of traditional training: training time and finding an optimal structure. While traditional iterative training methods require long periods to train a single structure, the proposed method trains a structure in a single step. By scanning the number of neurons in the hidden layer, many structures can be trained in a short time, making it possible to obtain an optimal topology. A quality-control criterion for the predictions, based on the coefficient of determination, is proposed that guarantees short training times and an optimal number of hidden neurons for a given problem. The feasibility of the proposed method is tested by comparing its performance against the built-in functions of the artificial neural networks toolbox in Matlab®, where it proves superior in both approximation quality and training time. A rigorous study and analysis are performed for the regression of data simulated from two different surfaces with a specified noise level and different network topologies. The resulting processing time is at least 150 times shorter for the proposed training than for the iterative training that Matlab uses, yielding well-founded learning rules. A novel "amputated matrix" scheme is proposed that breaks the paradigm of how multiple-output systems are trained and improves prediction quality with no detriment to training time. © 2020, Springer Science+Business Media, LLC, part of Springer Nature.
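The core idea of the abstract — random hidden-layer weights, output weights fitted in one least-squares step, and a scan over hidden-layer sizes gated by a coefficient-of-determination threshold — can be sketched as follows. This is an illustrative sketch only: the activation function (tanh), the test surface, the scan range, and the R² threshold of 0.95 are assumptions, not details taken from the paper.

```python
import numpy as np

def one_step_train(X, Y, n_hidden, rng):
    """One-step training of a single-hidden-layer feedforward network:
    hidden weights/biases are random, output weights come from a
    linear-regression (least-squares) fit. Sketch of the scheme the
    abstract describes; implementation details are assumptions."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random hidden weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)     # output weights in one step
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def r_squared(y_true, y_pred):
    """Coefficient of determination used as the quality-control criterion."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]            # a simple smooth surface (assumed)
y = y + rng.normal(0.0, 0.05, size=y.shape)      # additive noise

# Scan hidden-layer sizes; keep the smallest topology whose R^2
# clears the quality-control threshold (threshold value assumed).
best = None
for n_hidden in range(5, 60, 5):
    W, b, beta = one_step_train(X, y, n_hidden, rng)
    r2 = r_squared(y, predict(X, W, b, beta))
    if r2 >= 0.95:
        best = (n_hidden, r2)
        break
```

Because each candidate topology is fitted by a single linear solve rather than an iterative optimization, the whole scan costs roughly one matrix factorization per hidden-layer size, which is what makes the structure search practical.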
Research
keywords
Constructive networks; Cross-validation; Feedforward neural network; Multiple responses; Single-hidden layer feedforward network; Training; Iterative methods; Multilayer neural networks; Quality control; Stochastic systems; Structural optimization; Topology; Approximation quality; Coefficient of determination; Feed-forward artificial neural networks; Multiple output systems; Optimal structures; Optimal topologies; Quality of predictions; Regression adjustment; Feedforward neural networks