Towards Compact Broad Learning System by Combined Sparse Regularization

Author:

Miao Jianyu (1,2,3), Yang Tiejun (1,2,3), Jin Jun-Wei (1,2,3), Sun Lijun (1,4,5), Niu Lingfeng (6,7), Shi Yong (6,8,9)

Affiliation:

1. Key Laboratory of Grain Information Processing and Control (HAUT), Ministry of Education, Zhengzhou 450001, P. R. China

2. Henan Key Laboratory of Grain Photoelectric Detection and Control (HAUT), Zhengzhou 450001, P. R. China

3. College of Artificial Intelligence and Big Data, Henan University of Technology, Zhengzhou 450001, P. R. China

4. Henan Key Laboratory of Grain Photoelectric Detection and Control (HAUT), Zhengzhou 450001, P. R. China

5. College of Information Science and Engineering, Henan University of Technology, Zhengzhou 450001, P. R. China

6. Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing 100190, P. R. China

7. School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, P. R. China

8. School of Electrical and Information Engineering, Southwest Minzu University, Chengdu 610041, P. R. China

9. College of Information Science and Technology, University of Nebraska at Omaha, NE 68182, USA

Abstract

Broad Learning System (BLS) has proven to be one of the most important techniques for classification and regression in machine learning and data mining. BLS directly collects all the features from the feature and enhancement nodes as the input of the output layer, which ignores the vast amount of redundant information among these nodes and usually leads to inefficiency and overfitting. To resolve this issue, we propose a sparse regularization-based compact broad learning system (CBLS) framework, which can simultaneously remove redundant nodes and weights. To be more specific, we use group sparse regularization based on the $$\ell_{2,1}$$ norm to promote competition between different nodes and thereby remove redundant nodes, and a class of nonconvex sparsity regularization to promote competition between different weights and thereby remove redundant weights. To optimize the resulting problem of the proposed CBLS, we develop an efficient alternating optimization algorithm based on the proximal gradient method and analyze its computational complexity. Finally, extensive experiments on the classification task are conducted on public benchmark datasets to verify the effectiveness and superiority of the proposed CBLS.
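As a point of reference (a standard sketch, not taken from the paper itself): the row-wise $$\ell_{2,1}$$ group regularizer named in the abstract is commonly defined as below, and its proximal operator, group soft-thresholding, is the closed-form update that a proximal-gradient-based alternating algorithm would apply at each iteration; the paper's exact grouping of nodes and its choice of nonconvex weight-level penalty may differ.

$$\|W\|_{2,1} = \sum_{i=1}^{m} \|w_i\|_2, \qquad \big[\operatorname{prox}_{\lambda\|\cdot\|_{2,1}}(V)\big]_i = \max\!\left(1 - \frac{\lambda}{\|v_i\|_2},\, 0\right) v_i$$

Here $$w_i$$ (resp. $$v_i$$) denotes the $$i$$-th row of $$W$$ (resp. $$V$$), each row collecting the output weights attached to one feature or enhancement node; rows shrunk exactly to zero correspond to nodes that can be pruned from the network.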

Funder

National Natural Science Foundation of China

Key R&D and Promotion Projects of Henan Province

High-level Talent Fund Project of Henan University of Technology

Natural Science Project of Henan Education Department

Research Platform of Grain Information Processing Center of Henan University of Technology

Publisher

World Scientific Pub Co Pte Ltd

Subject

Computer Science (miscellaneous)

Cited by 4 articles.

1. Generalized sparse and outlier-robust broad learning systems for multi-dimensional output problems;Information Sciences;2024-08

2. A pruning extreme learning machine with $$L_{2, 1/2}$$ regularization for multi-dimensional output problems;International Journal of Machine Learning and Cybernetics;2023-08-05

3. RCBLS: An Outlier-Robust Broad Learning Framework with Compact Structure;Electronics;2023-07-18

4. English Online Learning Platform Based on Mobile Android App;2022 Second International Conference on Advanced Technologies in Intelligent Control, Environment, Computing & Communication Engineering (ICATIECE);2022-12-16
