Dimension Reduction in Multivariate Linear Regression
We define some elementary terminology: vector space, linear combination, linearly independent and dependent sets of vectors, basis of a vector space, and direct sum of subspaces. This theory helps us reduce the dimension of a given vector space, and we apply it to multivariate linear regression analysis. Dimension reduction not only simplifies computation and eases interpretation, but also reduces estimation error. Cook et al. (2010) developed the envelope model for the same reason. The main objective in that model is to decompose the covariance matrix into the sum of two matrices whose column spaces, respectively, contain and are orthogonal to the subspace containing the mean; in other words, to break the covariance matrix apart according to a direct sum of subspaces.
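The decomposition described above can be sketched numerically. The following is a minimal illustration (an assumed toy construction, not the estimator of Cook et al.): when a subspace E reduces a covariance matrix Sigma, projecting with P onto E and with Q = I - P onto its orthogonal complement splits Sigma exactly into two pieces whose column spaces lie in E and in its complement.

```python
import numpy as np

rng = np.random.default_rng(0)

p, u = 5, 2                          # ambient dimension, dimension of E
G, _ = np.linalg.qr(rng.standard_normal((p, p)))
G1, G0 = G[:, :u], G[:, u:]          # orthonormal bases of E and its complement

# Construct Sigma so that E reduces it (block-diagonal in the basis G).
Sigma = (G1 @ np.diag([4.0, 3.0]) @ G1.T
         + G0 @ np.diag([1.0, 0.5, 0.25]) @ G0.T)

P = G1 @ G1.T                        # orthogonal projection onto E
Q = np.eye(p) - P                    # projection onto the orthogonal complement

# The two summands have column spaces inside E and its complement, and
# their sum recovers Sigma exactly because E is a reducing subspace.
decomp = P @ Sigma @ P + Q @ Sigma @ Q
assert np.allclose(decomp, Sigma)
assert np.allclose(P @ Sigma @ Q, 0.0)   # no cross term
```

For a generic Sigma and an arbitrary subspace the cross terms P Sigma Q and Q Sigma P do not vanish; the envelope construction seeks the smallest reducing subspace containing the mean subspace, which is what makes this exact direct-sum split possible.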
(1) Anderson, T.W. (1958). An Introduction to Multivariate Statistical Analysis, Second Edition. Wiley, New York.
(2) Cook, R.D. (2007). Fisher Lecture: Dimension Reduction in Regression (with discussion). Statistical Science 22, 1-26.
(3) Cook, R.D., Li, B. and Chiaromonte, F. (2010). Envelope Models for Parsimonious and Efficient Multivariate Linear Regression. Statistica Sinica 20, 927-1010.
(4) Cook, R.D., Li, B. and Chiaromonte, F. (2007). Dimension Reduction Without Matrix Inversion. Biometrika 94, 569-584.
(5) Halmos, P.R. (1974, 1987). Finite-Dimensional Vector Spaces. Springer-Verlag, New York.
(6) Handbook of Linear Algebra (2007). Edited by Leslie Hogben; associate editors Richard Brualdi, Anne Greenbaum, Roy Mathias. Chapman & Hall/CRC, Taylor & Francis Group, Boca Raton.
(7) Mal'cev, A.I. (1963). Foundations of Linear Algebra. Translated from the Russian by Thomas Craig Brown, edited by J.B. Roberts. W.H. Freeman and Company, San Francisco and London.