One-Step Clustering with Adaptively Local Kernels and a Neighborhood Kernel

Authors:

Chen Cuiling 1, Hu Zhijun 2, Xiao Hongbin 1, Ma Junbo 1,3, Li Zhi 1

Affiliation:

1. School of Computer Science and Engineering, Guangxi Normal University, 15 Yucai Road, Guilin 541004, China

2. School of Mathematics and Statistics, Guangxi Normal University, 15 Yucai Road, Guilin 541004, China

3. Department of Psychiatry, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA

Abstract

Among multiple kernel clustering (MKC) methods, some adopt a neighborhood kernel as the optimal kernel, while others use local base kernels to generate an optimal kernel. However, these two strategies have not been combined to leverage their respective advantages, which limits the quality of the optimal kernel. Furthermore, most existing MKC methods require a two-step clustering strategy: they first learn an indicator matrix and then execute clustering on it, which does not guarantee the optimality of the final results. To overcome these drawbacks, one-step clustering with adaptively local kernels and a neighborhood kernel (OSC-ALK-ONK) is proposed in this paper, in which the two strategies are combined to produce an optimal kernel. In particular, the neighborhood kernel improves the expressive capability of the optimal kernel and enlarges its search range, while local base kernels reduce the redundancy of the base kernels and promote their diversity; accordingly, the quality of the optimal kernel is enhanced. Further, a soft block diagonal (BD) regularizer is utilized to encourage the indicator matrix to be block diagonal, which makes it possible to obtain explicit clustering results directly and achieve one-step clustering, thus overcoming the disadvantage of the two-step strategy. Extensive experiments on eight data sets and comparisons with six clustering methods show that OSC-ALK-ONK is effective.
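The pipeline the abstract describes can be pictured with a brief, hedged sketch: several base kernels are localized to each sample's neighbors, combined with weights, and a surrogate kernel is derived in the neighborhood of that combination before clusters are read off. The Python sketch below only illustrates this general idea and is not the authors' OSC-ALK-ONK optimization; the RBF bandwidths, the k-nearest-neighbor localization, the PSD projection used as a stand-in for the neighborhood kernel, and the final spectral step are all assumptions introduced for demonstration.

```python
# Minimal illustrative sketch (NOT the OSC-ALK-ONK algorithm): localized base
# kernels -> weighted combination -> a crude "neighborhood" surrogate -> clustering.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.cluster import SpectralClustering


def local_base_kernels(X, gammas, k=10):
    """Build RBF base kernels and restrict each one to k-nearest-neighbor support."""
    n = X.shape[0]
    kernels = []
    for g in gammas:
        K = rbf_kernel(X, gamma=g)
        # keep only each sample's k largest similarities (a "local" kernel)
        mask = np.zeros_like(K, dtype=bool)
        idx = np.argsort(-K, axis=1)[:, :k]
        mask[np.repeat(np.arange(n), k), idx.ravel()] = True
        kernels.append(np.where(mask | mask.T, K, 0.0))  # symmetric local support
    return kernels


def psd_projection(K):
    """Project a symmetric matrix onto the PSD cone (neighborhood-kernel stand-in)."""
    K = (K + K.T) / 2.0
    w, V = np.linalg.eigh(K)
    return (V * np.clip(w, 0.0, None)) @ V.T


# toy data: three Gaussian blobs
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

base = local_base_kernels(X, gammas=[0.05, 0.1, 0.5], k=15)
weights = np.ones(len(base)) / len(base)              # uniform weights for this sketch
K_comb = sum(w * K for w, K in zip(weights, base))    # weighted kernel combination
K_opt = psd_projection(K_comb)                        # search "near" the combined kernel

# read off clusters; spectral clustering needs a nonnegative affinity matrix
A = np.clip(K_opt, 0.0, None)
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(A)
print("cluster sizes:", np.bincount(labels))
```

In the actual method, by contrast, the kernel weights, the neighborhood kernel, and the block-diagonal indicator matrix are learned jointly in a single optimization, so clustering results are obtained directly without a separate spectral or k-means step.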

Funder

Research fund of Guangxi Key Lab of Multi-source Information Mining and Security

Guangxi Natural Science Foundation

Publisher

MDPI AG

Subject

General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)

