A shared-subspace learning framework for multi-label classification

Authors:

Shuiwang Ji¹, Lei Tang¹, Shipeng Yu², Jieping Ye¹

Affiliation:

1. Arizona State University, Tempe, AZ

2. Siemens Medical Solutions

Abstract

Multi-label problems arise in various domains such as multi-topic document categorization, protein function prediction, and automatic image annotation. One natural way to deal with such problems is to construct a binary classifier for each label, resulting in a set of independent binary classification problems. Since multiple labels share the same input space, and the semantics conveyed by different labels are usually correlated, it is essential to exploit the correlation information contained in different labels. In this paper, we consider a general framework for extracting shared structures in multi-label classification. In this framework, a common subspace is assumed to be shared among multiple labels. We show that the optimal solution to the proposed formulation can be obtained by solving a generalized eigenvalue problem, though the problem is nonconvex. For high-dimensional problems, direct computation of the solution is expensive, and we develop an efficient algorithm for this case. One appealing feature of the proposed framework is that it includes several well-known algorithms as special cases, thus elucidating their intrinsic relationships. We further show that the proposed framework can be extended to the kernel-induced feature space. We have conducted extensive experiments on multi-topic web page categorization and automatic gene expression pattern image annotation tasks, and results demonstrate the effectiveness of the proposed formulation in comparison with several representative algorithms.
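The abstract only outlines the approach, so here is a minimal, non-authoritative NumPy sketch of the shared-subspace idea for intuition. It assumes per-label linear predictors of the form f_l(x) = w_l'x + v_l'(Theta x) with an orthonormal shared subspace Theta (Theta Theta' = I), a squared loss with a ridge penalty, and an ASO-style alternating update in which Theta is refreshed from an SVD of the stacked per-label weights. This alternation stands in for, and is not, the closed-form generalized-eigenvalue solution the paper derives; the function name shared_subspace_fit and the parameters alpha, n_components, and n_iters are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def shared_subspace_fit(X, Y, n_components=2, alpha=1.0, n_iters=20):
    """Illustrative alternating scheme for a shared-subspace multi-label model.

    Sketch assumptions (not the paper's algorithm):
      - per-label predictor f_l(x) = w_l^T x + v_l^T (Theta x), with Theta an
        (n_components x d) matrix whose rows are orthonormal;
      - squared loss with ridge penalty alpha on the stacked coefficients;
      - Theta refreshed from an SVD of the combined per-label weights
        (ASO-style), instead of the generalized eigenvalue solution.
    X: (n, d) feature matrix;  Y: (n, k) 0/1 label matrix.
    """
    n, d = X.shape
    k = Y.shape[1]
    rng = np.random.default_rng(0)
    # Random orthonormal rows as the initial shared subspace.
    Theta = np.linalg.qr(rng.standard_normal((d, n_components)))[0].T
    W = np.zeros((d, k))
    V = np.zeros((n_components, k))
    for _ in range(n_iters):
        # Step 1: with Theta fixed, fit each label's (w_l, v_l) by ridge
        # regression on the augmented features [x, Theta x].
        Z = np.hstack([X, X @ Theta.T])                      # (n, d + r)
        A = Z.T @ Z + alpha * np.eye(d + n_components)
        coef = np.linalg.solve(A, Z.T @ Y)                   # (d + r, k)
        W, V = coef[:d], coef[d:]
        # Step 2: with the predictors fixed, refresh Theta from the top
        # left singular vectors of the combined weight matrix.
        U, _, _ = np.linalg.svd(W + Theta.T @ V, full_matrices=False)
        Theta = U[:, :n_components].T
    return W, V, Theta

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    X = np.random.default_rng(1).standard_normal((100, 20))
    Y = (X[:, :3] @ np.ones((3, 4)) + 0.1 > 0).astype(float)
    W, V, Theta = shared_subspace_fit(X, Y, n_components=2)
    scores = X @ W + (X @ Theta.T) @ V   # per-label decision values
    print(scores.shape)                  # (100, 4)
```

On real data, alpha and n_components would of course be tuned by cross-validation; the point of the sketch is only the structure the abstract describes: label-specific weights plus a low-dimensional component shared across all labels through Theta.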

Funders

Division of Information and Intelligent Systems

NGA

Division of Computing and Communication Foundations

National Institutes of Health

Publisher

Association for Computing Machinery (ACM)

Subject

General Computer Science

Cited by 117 articles. Citing articles include:

1. Optimal performance of Binary Relevance CNN in targeted multi-label text classification. Knowledge-Based Systems, 2024-01.

2. Multi-label Feature Selection with Adaptive Subspace Learning. Lecture Notes in Computer Science, 2024.

3. A survey on multi-label feature selection from perspectives of label fusion. Information Fusion, 2023-12.

4. Latent Topic-Aware Multioutput Learning. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2023-12.

5. Integrating Global and Local Feature Selection for Multi-Label Learning. ACM Transactions on Knowledge Discovery from Data, 2023-02-20.
