Sadiq, A. and Yahya, N. (2021) Fractional Stochastic Gradient Descent Based Learning Algorithm For Multi-layer Perceptron Neural Networks. In: UNSPECIFIED.
Full text not available from this repository.

Abstract
Neural networks are indispensable tools in adaptive signal processing. The multi-layer perceptron (MLP) is one of the most widely used neural network architectures, and its performance depends strongly on the optimization of the learning parameters. In this study, we propose a learning algorithm for training MLP models. Conventionally, the back-propagation learning algorithm (BP-MLP) is used; it is a form of stochastic gradient descent whose performance is governed by the eigenvalue spread of the input signal correlation matrix. To accelerate convergence, we design an update that combines integer-order and fractional-order gradient terms. The proposed fractional back-propagation multi-layer perceptron (FBP-MLP) method is based on fractional calculus and uses a fractional-power gradient, which provides complementary information about the cost function and thereby speeds up convergence. To validate this claim, we implemented a leukemia cancer classification task and compared our method with the standard BP-MLP method. The proposed FBP-MLP method outperformed the conventional BP-MLP algorithm in both convergence rate and test accuracy. © 2021 IEEE.
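The abstract describes the FBP-MLP update only qualitatively (a conventional gradient step plus a fractional-order term). Since the full text is not available from this record, the sketch below is only an illustration of that general idea, assuming a Caputo-style approximation in which the fractional "gradient" of the cost with respect to a weight is proportional to the ordinary gradient scaled by |w|^(1-ν)/Γ(2-ν). The names `eta_int`, `eta_frac`, and `nu` are illustrative and not taken from the paper.

```python
import numpy as np
from math import gamma

def fractional_update(w, grad, eta_int=0.01, eta_frac=0.01, nu=0.5, eps=1e-8):
    """One weight update combining an integer-order gradient step with a
    fractional-order term (sketch only; not the paper's exact formulation)."""
    # Caputo-style approximation of the fractional gradient of the cost:
    # grad * |w|**(1 - nu) / Gamma(2 - nu), with eps guarding |w| ~ 0.
    frac_term = grad * (np.abs(w) + eps) ** (1.0 - nu) / gamma(2.0 - nu)
    return w - eta_int * grad - eta_frac * frac_term

# Toy usage: one update on a random weight matrix with a stand-in gradient
# (in practice, grad would be dJ/dW obtained from back-propagation).
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
G = rng.standard_normal((4, 3))
W_new = fractional_update(W, G)
```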
| Item Type: | Conference or Workshop Item (UNSPECIFIED) |
|---|---|
| Impact Factor: | cited By 0 |
| Uncontrolled Keywords: | Backpropagation; Calculations; Cost functions; Diseases; Gradient methods; Multilayer neural networks; Network architecture; Network layers; Signal processing, Back Propagation; Fractional calculus; Indispensable tools; Multi-layer perceptron; Multilayer perceptrons neural networks (MLPs); Multilayers perceptrons; Neural-networks; Performance; Stochastic gradient descent; Stochastics, Stochastic systems |
| Depositing User: | Ms Sharifah Fahimah Saiyed Yeop |
| Date Deposited: | 25 Mar 2022 01:12 |
| Last Modified: | 25 Mar 2022 01:12 |
| URI: | http://scholars.utp.edu.my/id/eprint/29220 |