Abstract
Many problems in multidimensional data analysis involve the optimization of quadratic functions, due to the common assumption of normally distributed errors, together with the prevalence of linear and bilinear models. By present standards, the resulting optimization problems are of moderate complexity, frequently involving the search for eigenvectors and eigenvalues, or projections of vectors on subspaces. Even in fairly complicated situations, such as for example generalized canonical correlation analysis with optimal scaling of the variables (Van der Burg, De Leeuw, and Verdegaal, 1988), it is often possible, by partitioning the parameter space into convenient regions, to split the problem into a connected series of simpler subproblems so that monotonic convergence to at least a local minimum remains guaranteed. This approach is called NIPALS (Wold, 1966), for Nonlinear Iterative PArtial Least Squares, or ALS (De Leeuw, Young, and Takane, 1976), for Alternating Least Squares, and is strongly related to the Gauss-Seidel and block decomposition (or relaxation) methods, which are well known in numerical analysis for iteratively solving linear systems (e.g., Burden and Faires, 1985).
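As a rough illustration of the alternating-least-squares idea described above (not taken from the paper itself), consider fitting a bilinear model X ≈ ABᵀ: with either block of parameters held fixed, the quadratic loss is minimized in closed form over the other block, so the loss can never increase across iterations. The following minimal Python sketch assumes a simple low-rank least-squares setting; the function name and setup are illustrative only.

```python
import numpy as np

def als_low_rank(X, rank, n_iter=100, seed=0):
    """Illustrative alternating least squares for X ~ A @ B.T.

    Each half-step minimizes ||X - A B^T||_F^2 over one block
    (A or B) with the other fixed, so the loss is monotonically
    non-increasing -- the block-relaxation scheme the abstract
    refers to.  Setup and names are assumptions for illustration.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    A = rng.standard_normal((n, rank))
    B = rng.standard_normal((m, rank))
    for _ in range(n_iter):
        # Solve for A with B fixed: an ordinary linear least-squares problem.
        A = X @ B @ np.linalg.pinv(B.T @ B)
        # Solve for B with A fixed.
        B = X.T @ A @ np.linalg.pinv(A.T @ A)
    return A, B

# Usage on a synthetic low-rank matrix: the residual shrinks monotonically.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 20))
A, B = als_low_rank(X, rank=3)
print(np.linalg.norm(X - A @ B.T))
```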
Publisher
Oxford University Press