Affiliation:
1. Department of Computer Science and Information Engineering, Tamkang University, New Taipei City, Taiwan, R.O.C.
2. Department of Information Management, Chihlee Institute of Technology, New Taipei City, Taiwan, R.O.C.
Abstract
In this paper, we propose an improved version of the neighbor embedding super-resolution (SR) algorithm proposed by Chang et al. [Super-resolution through neighbor embedding, in Proc. 2004 IEEE Computer Society Conf. Computer Vision and Pattern Recognition (CVPR), Vol. 1 (2004), pp. 275–282]. The neighbor embedding SR algorithm requires intensive computation to find the K nearest neighbors of an input patch in a huge set of training samples. We tackle this problem by clustering the training samples into a number of clusters: for an input patch, we first find the nearest cluster center and then search for the K nearest neighbors only within the corresponding cluster. In contrast to Chang’s method, which uses the Euclidean distance to find the K nearest neighbors of a low-resolution patch, we define a similarity function and use it to find the K most similar neighbors of a low-resolution patch. We then use locally linear embedding (LLE) [S. T. Roweis and L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science 290(5500) (2000) 2323–2326] to find the optimal coefficients with which the linear combination of the K most similar neighbors best approximates the input patch. These coefficients are then used to form a linear combination of the K high-frequency patches corresponding to the K low-resolution neighbors. The resulting high-frequency patch is added to the enlarged (up-sampled) version of the input patch. Experimental results show that the proposed clustering scheme efficiently reduces computational time without significantly affecting performance.
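The following is a minimal sketch of the pipeline described in the abstract, under stated assumptions: k-means is used as the clustering step and cosine similarity as a stand-in for the paper's (unspecified) similarity function, and all names (build_clusters, reconstruct_patch, lle_weights, n_clusters, K) are illustrative rather than taken from the paper.

# Sketch of the clustered neighbor-embedding SR step described in the abstract.
# Assumptions: k-means clustering, cosine similarity as the similarity function,
# and illustrative function/parameter names not taken from the paper.
import numpy as np
from sklearn.cluster import KMeans

def build_clusters(lr_patches, n_clusters=32, seed=0):
    """Cluster the low-resolution training patches (rows of lr_patches)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    km.fit(lr_patches)
    return km

def lle_weights(query, neighbors, reg=1e-3):
    """Solve for LLE reconstruction weights of `query` from `neighbors` (K x d)."""
    diff = neighbors - query                                # K x d
    gram = diff @ diff.T                                    # local Gram matrix (K x K)
    gram += reg * np.trace(gram) * np.eye(len(neighbors))   # regularization for stability
    w = np.linalg.solve(gram, np.ones(len(neighbors)))
    return w / w.sum()                                      # weights constrained to sum to one

def reconstruct_patch(lr_query, kmeans, lr_train, hf_train, K=5):
    """Estimate the high-frequency patch for one low-resolution input patch."""
    # 1. Find the nearest cluster center and restrict the search to that cluster.
    label = kmeans.predict(lr_query[None, :])[0]
    idx = np.where(kmeans.labels_ == label)[0]

    # 2. Pick the K most similar neighbors inside the cluster
    #    (cosine similarity is an assumed stand-in here).
    cand = lr_train[idx]
    sims = cand @ lr_query / (np.linalg.norm(cand, axis=1) * np.linalg.norm(lr_query) + 1e-12)
    nn = idx[np.argsort(-sims)[:K]]

    # 3. LLE weights computed on the low-resolution neighbors ...
    w = lle_weights(lr_query, lr_train[nn])

    # 4. ... are applied to the corresponding high-frequency training patches;
    #    the result is added to the up-sampled input patch elsewhere in the pipeline.
    return w @ hf_train[nn]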
Publisher
World Scientific Pub Co Pte Lt
Subject
Artificial Intelligence, Computer Vision and Pattern Recognition, Software
Cited by
1 article.