Abstract
The logic dendritic neuron model (LDNM), which is inspired by biological neurons, has emerged as a novel machine learning model in recent years. However, recent studies have shown that the classification performance of the LDNM is restricted by the backpropagation (BP) algorithm. In this study, we attempt to use a heuristic algorithm called the gradient-based optimizer (GBO) to train the LDNM. First, we describe the architecture of the LDNM. Then, we propose specific neuronal structure pruning mechanisms to simplify the LDNM after training. Next, we describe how to apply the GBO to train the LDNM. Finally, seven datasets are used to determine experimentally whether the GBO is a suitable training method for the LDNM. To evaluate its performance, the GBO algorithm is compared with the BP algorithm and four other heuristic algorithms. In addition, the LDNM trained by the GBO algorithm is compared with five other classifiers. The experimental results show that the LDNM trained by the GBO algorithm achieves favorable classification performance on several metrics. These results indicate that choosing a suitable training method is an effective way to improve the performance of the LDNM.
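To make the training setup concrete, the sketch below shows a generic dendritic neuron model (sigmoidal synapses, multiplicative dendritic branches, a summing membrane, and a sigmoid soma) whose flattened parameter vector is optimized by a simple population-based search. This is a minimal illustration under assumptions: the layer equations, parameter names, and the stand-in search loop are not taken from the paper, and the actual GBO update rules and LDNM pruning mechanisms described in the article are not reproduced here.

```python
import numpy as np

def dnm_forward(params, X, n_branches, k=5.0):
    """Generic dendritic neuron model (assumed structure, not the paper's exact LDNM):
    synaptic layer -> multiplicative dendritic layer -> membrane -> sigmoid soma."""
    n_features = X.shape[1]
    w, q = params.reshape(2, n_branches, n_features)      # synaptic weights and thresholds
    # Synaptic layer: sigmoid of (w * x - q) for every (branch, feature) pair
    syn = 1.0 / (1.0 + np.exp(-k * (X[:, None, :] * w - q)))
    dend = syn.prod(axis=2)                                # dendritic layer: product over features
    v = dend.sum(axis=1)                                   # membrane: sum over branches
    return 1.0 / (1.0 + np.exp(-k * (v - 0.5)))            # soma output in (0, 1)

def evaluate(params, X, y, n_branches):
    """Fitness: mean squared error between soma output and binary labels."""
    return np.mean((dnm_forward(params, X, n_branches) - y) ** 2)

def train_population_search(X, y, n_branches=4, pop_size=30, iters=200, seed=0):
    """Population-based search used here only as a stand-in for the GBO,
    whose update equations are not given in the abstract."""
    rng = np.random.default_rng(seed)
    dim = 2 * n_branches * X.shape[1]
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
    fit = np.array([evaluate(p, X, y, n_branches) for p in pop])
    for _ in range(iters):
        best = pop[fit.argmin()]
        # Pull each candidate toward the current best and add a small random perturbation
        trial = (pop + rng.uniform(0.0, 1.0, (pop_size, 1)) * (best - pop)
                 + 0.1 * rng.standard_normal(pop.shape))
        trial_fit = np.array([evaluate(p, X, y, n_branches) for p in trial])
        improved = trial_fit < fit
        pop[improved], fit[improved] = trial[improved], trial_fit[improved]
    return pop[fit.argmin()], fit.min()
```

Because the whole model is evaluated only through its fitness value, any population-based optimizer can be swapped into the loop above without computing gradients, which is the general property that allows heuristic algorithms such as the GBO to replace BP for training.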
Funder
Natural Science Foundation of Jiangsu Province of China
National Natural Science Foundation of China
Japan Science and Technology Agency SPRING
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering