Affiliation:
1. Graduate School of Informatics, Kyoto University, Kyoto 606-8501, Japan, and CREST, JST, Kyoto 606-8501, Japan
2. Graduate School of Informatics, Kyoto University, Kyoto 606-8501, Japan
Abstract
The self-organizing map (SOM) is an unsupervised learning method, as well as a type of nonlinear principal component analysis, that forms a topologically ordered mapping from a high-dimensional data space to a low-dimensional representation space. It has recently found wide application in areas such as the visualization, classification, and mining of various data. However, when the data sets to be processed are very large, a great deal of time is often required to train the map, which restricts the range of putative applications. One of the major culprits of this slow ordering is that a kind of topological defect (e.g., a kink in one dimension or a twist in two dimensions) is created in the map during training. Once such a defect appears, the ordered map cannot be obtained until the defect is eliminated, and the number of iterations this requires is typically several times larger than in the absence of the defect. To overcome this weakness, we propose that an asymmetric neighborhood function be used in the SOM algorithm. Compared with the commonly used symmetric neighborhood function, we found that an asymmetric neighborhood function accelerates the ordering process of the SOM algorithm, although the asymmetry tends to distort the resulting ordered map. We demonstrate that this distortion can be suppressed by improving the asymmetric-neighborhood-function SOM algorithm. The number of learning steps required for perfect ordering in the one-dimensional SOM is shown numerically to be reduced from O(N³) to O(N²) with an asymmetric neighborhood function, even when the improved algorithm is used to obtain a final map free of distortion.
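The abstract describes the standard SOM learning rule, in which each reference vector is pulled toward the input in proportion to a neighborhood function centered on the winning unit, and proposes making that neighborhood function asymmetric about the winner. The sketch below is a minimal one-dimensional illustration, not the authors' exact formulation: the asymmetry is modeled here, purely as an assumption, by a Gaussian whose width is stretched by a factor beta on one side of the winner.

import numpy as np

def asymmetric_neighborhood(dist, sigma, beta):
    # Gaussian neighborhood whose width differs on the two sides of the
    # winner (an illustrative, hypothetical form): width sigma*beta for
    # units with non-negative lattice offset, width sigma otherwise.
    width = np.where(dist >= 0, sigma * beta, sigma)
    return np.exp(-dist**2 / (2.0 * width**2))

def train_1d_som(data, n_units=50, n_steps=20000, alpha=0.1,
                 sigma=5.0, beta=3.0, seed=None):
    # Minimal 1-D SOM with scalar inputs; reference values start at random.
    rng = np.random.default_rng(seed)
    w = rng.random(n_units)                  # reference vectors (scalars)
    idx = np.arange(n_units)                 # lattice positions of the units
    for _ in range(n_steps):
        x = rng.choice(data)                 # random training sample
        winner = np.argmin(np.abs(w - x))    # best-matching unit
        h = asymmetric_neighborhood(idx - winner, sigma, beta)
        w += alpha * h * (x - w)             # standard SOM update
    return w

if __name__ == "__main__":
    data = np.random.default_rng(0).random(10000)   # uniform samples on [0, 1)
    w = train_1d_som(data, seed=1)
    # A perfectly ordered 1-D map has monotonic reference values.
    print("ordered:", bool(np.all(np.diff(w) > 0) or np.all(np.diff(w) < 0)))

Monotonicity of the reference values is the usual test of perfect ordering for a one-dimensional map; the paper's comparison concerns how many update steps such ordering takes with a symmetric versus an asymmetric neighborhood function, and how the resulting distortion can be corrected.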
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
16 articles.