Affiliation:
1. College of Mathematics and Information Science and Yantai Key Laboratory of Big Data Modeling and Intelligent Computing, Shandong Technology and Business University, Yantai 264005, China
2. College of Business and Economics, Shanghai Business School, Shanghai 201400, China
Abstract
This article explores the mathematical and statistical performance of, and the connections between, two well-known classes of estimators of unknown parameter matrices in the context of a multivariate general linear model (MGLM) for regression: ordinary least-squares estimators (OLSEs) and best linear unbiased estimators (BLUEs), which are defined under two different optimality criteria. Tian and Zhang [38] (On connections among OLSEs and BLUEs of whole and partial parameters under a general linear model, Stat. Probabil. Lett., 112 (2016), 105–112) collected a series of existing and novel identifying conditions for OLSEs to be BLUEs under general linear models. In this paper, we show how to extend such results to multivariate general linear models. We give a direct algebraic procedure for deriving explicit formulas to calculate the OLSEs and BLUEs of the parameter spaces in a given MGLM, discuss the relationships between the OLSEs and BLUEs of parameter matrices in the MGLM, establish a number of algebraic equalities related to the equivalence of OLSEs and BLUEs, and provide various intrinsic statistical interpretations of this equivalence using matrix analysis tools concerning ranks, ranges, and generalized inverses of matrices.
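For orientation, the following sketch records a standard MGLM setup and the classical OLSE–BLUE equivalence criterion that results of this kind extend; the notation below is assumed for illustration only and need not match the paper's exact conventions.

\[
  \mathbf{Y} = \mathbf{X}\boldsymbol{\Theta} + \boldsymbol{\mathcal{E}}, \qquad
  \mathrm{E}\{\operatorname{vec}(\boldsymbol{\mathcal{E}})\} = \mathbf{0}, \qquad
  \operatorname{Cov}\{\operatorname{vec}(\boldsymbol{\mathcal{E}})\} = \boldsymbol{\Sigma} \otimes \mathbf{V},
\]
where \(\mathbf{Y} \in \mathbb{R}^{n \times p}\) is observable, \(\mathbf{X} \in \mathbb{R}^{n \times k}\) is a known design matrix, and \(\boldsymbol{\Theta} \in \mathbb{R}^{k \times p}\) is the unknown parameter matrix. The OLSE of \(\mathbf{X}\boldsymbol{\Theta}\) admits the projector form
\[
  \operatorname{OLSE}(\mathbf{X}\boldsymbol{\Theta}) = \mathbf{X}(\mathbf{X}^{\prime}\mathbf{X})^{-}\mathbf{X}^{\prime}\mathbf{Y} = \mathbf{P}_{\mathbf{X}}\mathbf{Y},
\]
with \((\cdot)^{-}\) a generalized inverse and \(\mathbf{P}_{\mathbf{X}}\) the orthogonal projector onto the column space \(\mathscr{C}(\mathbf{X})\). For the univariate general linear model \(\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}\) with \(\operatorname{Cov}(\boldsymbol{\varepsilon}) = \sigma^{2}\boldsymbol{\Sigma}\), as treated in [38], the classical Rao–Zyskind criterion reads
\[
  \operatorname{OLSE}(\mathbf{X}\boldsymbol{\beta}) = \operatorname{BLUE}(\mathbf{X}\boldsymbol{\beta})
  \iff \mathscr{C}(\boldsymbol{\Sigma}\mathbf{X}) \subseteq \mathscr{C}(\mathbf{X}).
\]
The paper's contribution lies in establishing analogous characterizations for the parameter matrices of the MGLM.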
Publisher
American Institute of Mathematical Sciences (AIMS)
References (39 articles)
1. I. S. Alalouf, G. P. H. Styan, Characterizations of estimability in the general linear model, Ann. Statist., 7 (1979), 194–200. http://dx.doi.org/10.1214/aos/1176344564
2. T. W. Anderson, An introduction to multivariate statistical analysis, 2nd ed., New York: Wiley, 1984.
3. A. Basilevsky, Applied matrix algebra in the statistical sciences, New York: Dover Publications, 2013.
4. D. Bertsimas, M. S. Copenhaver, Characterization of the equivalence of robustification and regularization in linear and matrix regression, Eur. J. Oper. Res., 270 (2018), 931–942. https://dx.doi.org/10.1016/j.ejor.2017.03.051
5. N. H. Bingham, W. J. Krzanowski, Linear algebra and multivariate analysis in statistics: development and interconnections in the twentieth century, British Journal for the History of Mathematics, 37 (2022), 43–63. http://dx.doi.org/10.1080/26375451.2022.2045811