Author:
Dai Wei, Ning Chuanfeng, Pei Shiyu, Zhu Song, Wang Xuesong
Abstract
As a randomized learner model, stochastic configuration networks (SCNs) are distinctive in that the random weights and biases are assigned through a supervisory mechanism to ensure universal approximation and fast learning. However, this randomness makes SCNs prone to generating approximately linearly correlated hidden nodes that are redundant and of low quality, resulting in a non-compact network structure. In light of a fundamental principle of machine learning, namely that a model with fewer parameters generalizes better, this paper proposes the orthogonal SCN, termed OSCN, which filters out low-quality hidden nodes to reduce the network structure by incorporating the Gram–Schmidt orthogonalization technique. The universal approximation property of OSCN and an adaptive setting for the key construction parameters are presented in detail. In addition, an incremental updating scheme is developed to determine the output weights dynamically, improving computational efficiency. Finally, experimental results on two numerical examples and several real-world regression and classification datasets substantiate the effectiveness and feasibility of the proposed approach.
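To illustrate the filtering idea described above, the following is a minimal sketch (not the paper's actual OSCN algorithm) of how Gram–Schmidt orthogonalization can reject candidate hidden nodes whose output vectors are nearly linear combinations of already accepted ones. The function name `filter_nodes`, the tolerance `tol`, and the greedy acceptance rule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def filter_nodes(H, tol=1e-6):
    """Greedy Gram-Schmidt filter (illustrative sketch, not the paper's
    exact procedure): keep a candidate hidden-node output vector (a
    column of H) only if its residual, after removing components along
    previously kept nodes, is non-negligible; otherwise the candidate
    is nearly linearly dependent and is rejected as low quality."""
    basis = []  # orthonormal residual directions of the kept nodes
    kept = []   # indices of accepted candidate nodes
    for j in range(H.shape[1]):
        v = H[:, j].astype(float).copy()
        for q in basis:
            v -= (q @ v) * q          # subtract projection onto kept directions
        norm = np.linalg.norm(v)
        if norm > tol:                # candidate contributes new information
            basis.append(v / norm)
            kept.append(j)
    return kept

# Column 2 is an exact linear combination of columns 0 and 1,
# so it is rejected as redundant.
H = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
print(filter_nodes(H))  # -> [0, 1]
```

In this toy case the third node's output lies in the span of the first two, so its Gram–Schmidt residual vanishes and the node is discarded, yielding a more compact set of hidden nodes.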
Funder
National Natural Science Foundation of China
Natural Science Foundation of Jiangsu Province
Open Project Foundation of State Key Laboratory of Synthetical Automation for Process Industries
Postgraduate Research & Practice Innovation Program of Jiangsu Province
Publisher
Springer Science and Business Media LLC