Hyperparameter Optimization of Evolving Spiking Neural Network for Time-Series Classification

Ibad, T. and Abdulkadir, S.J. and Aziz, N. and Ragab, M.G. and Al-Tashi, Q. (2022) Hyperparameter Optimization of Evolving Spiking Neural Network for Time-Series Classification. New Generation Computing, 40 (1). pp. 377-397.

Full text not available from this repository.
Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

Spiking neural networks (SNNs) are the third generation of artificial neural networks, inspired by a brain-like computational model in which neural information is encoded and processed through precisely timed spike trains. The evolving spiking neural network (eSNN) is an enhanced version of the SNN, motivated by the principles of the Evolving Connectionist System (ECoS), and is a relatively new classifier in the neural information processing area. The performance of eSNN is highly influenced by the values of its significant hyperparameters: the modulation factor (mod), threshold factor (c), and similarity factor (sim). In contrast to manual tuning of hyperparameters, automated tuning is more reliable. Therefore, this research presents an optimizer-based eSNN architecture intended to solve the problem of selecting optimal hyperparameter values for eSNN. The proposed model is named eSNN-SSA, where SSA stands for the salp swarm algorithm, a metaheuristic optimization technique integrated with the eSNN architecture. For the integration of eSNN-SSA, Thorpe's standard model of eSNN is used with population rate encoding. To examine the performance of eSNN-SSA, various benchmark data sets from the UCR/UEA time-series classification repository are utilized. The experimental results show that the salp swarm algorithm plays an effective role in improving the flexibility of eSNN. The proposed eSNN-SSA offers a way to overcome the disadvantage of eSNN in determining the best number of pre-synaptic neurons for time-series classification problems. The classification accuracies obtained by eSNN-SSA on the Spoken Arabic Digits, Articulatory Word Recognition, Character Trajectories, Wafer, and GunPoint datasets were 0.96, 0.97, 0.94, 1.0, and 0.94, respectively. The proposed approach also outperformed the standard eSNN in terms of time complexity. © 2022, Ohmsha, Ltd. and Springer Japan KK, part of Springer Nature.
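To illustrate the kind of search the abstract describes, below is a minimal sketch of the standard salp swarm algorithm (leader/follower chain update with a decaying coefficient) applied to the three eSNN hyperparameters (mod, c, sim). The objective function here is a hypothetical placeholder standing in for training and validating an eSNN, and the (0, 1) search ranges are assumptions, not values taken from the paper.

```python
import numpy as np

# Hypothetical stand-in for training an eSNN with hyperparameters
# (modulation factor `mod`, threshold factor `c`, similarity factor `sim`)
# and returning validation accuracy. Replace with a real eSNN pipeline.
def evaluate_esnn(params):
    mod, c, sim = params
    # Dummy smooth objective with an arbitrary optimum; for illustration only.
    return -((mod - 0.9) ** 2 + (c - 0.7) ** 2 + (sim - 0.6) ** 2)

def salp_swarm_search(objective, lb, ub, n_salps=20, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = len(lb)

    # Initialise the salp chain uniformly inside the bounds.
    positions = rng.uniform(lb, ub, size=(n_salps, dim))
    fitness = np.array([objective(p) for p in positions])

    best_idx = int(np.argmax(fitness))
    food, food_fit = positions[best_idx].copy(), fitness[best_idx]

    for t in range(n_iters):
        # Exploration/exploitation coefficient decays over iterations.
        c1 = 2.0 * np.exp(-((4.0 * (t + 1) / n_iters) ** 2))

        for i in range(n_salps):
            if i == 0:
                # Leader salp moves around the food source (best solution so far).
                c2 = rng.uniform(0.0, 1.0, dim)
                c3 = rng.uniform(0.0, 1.0, dim)
                step = c1 * ((ub - lb) * c2 + lb)
                positions[i] = np.where(c3 >= 0.5, food + step, food - step)
            else:
                # Follower salps move to the midpoint of themselves and the salp ahead.
                positions[i] = (positions[i] + positions[i - 1]) / 2.0

            positions[i] = np.clip(positions[i], lb, ub)
            fit = objective(positions[i])
            if fit > food_fit:
                food, food_fit = positions[i].copy(), fit

    return food, food_fit

if __name__ == "__main__":
    # Assumed (0, 1) ranges for mod, c and sim; adjust to the paper's settings.
    best, best_fit = salp_swarm_search(evaluate_esnn, lb=[0.01] * 3, ub=[0.99] * 3)
    print("best (mod, c, sim):", best, "objective:", best_fit)
```

In practice the objective would wrap a full eSNN train/evaluate cycle on a validation split, so each fitness evaluation is expensive; the decaying c1 coefficient shifts the chain from exploring the bounds toward refining around the best hyperparameter set found.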

Item Type: Article
Impact Factor: cited By 1
Uncontrolled Keywords: Benchmarking; Classification (of information); Encoding (symbols); Network architecture; Optimization; Swarm intelligence; Time series, ESNN; Hyper-parameter; Hyper-parameter optimizations; Neural-networks; Optimisations; Performance; Salp swarms; Swarm algorithms; Third generation; Time series classifications, Neural networks
Depositing User: Mr Ahmad Suhairi Mohamed Lazim
Date Deposited: 20 Dec 2022 04:01
Last Modified: 20 Dec 2022 04:01
URI: http://scholars.utp.edu.my/id/eprint/33979
