Publication: JASA (paper)
A Theorem on Least Squares and Vector Correlation in Multivariate Linear Regression
Abstract
This paper shows that the least-squares estimate of the matrix of coefficients in a multivariate linear regression model maximizes the squared vector correlation coefficient between the dependent variables and a linear transformation of the explanatory variables. © Taylor & Francis Group, LLC.
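The abstract compresses the theorem into a single sentence; as a reading aid, the sketch below states it formally under assumptions that are not spelled out in the abstract itself: centered data, a full-rank design matrix, and the determinant-ratio (product of squared canonical correlations) definition of the squared vector correlation. The paper's own notation and definition may differ.

```latex
% Minimal formal sketch of the stated result, under assumed conventions
% (centered data, full-rank X, determinant-ratio definition of the
% squared vector correlation); the paper's own setup may differ.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $Y \in \mathbb{R}^{n \times p}$ collect the (centered) dependent variables
and $X \in \mathbb{R}^{n \times q}$ the (centered) explanatory variables. For a
coefficient matrix $B \in \mathbb{R}^{q \times p}$ write $Z = XB$ and define the
sample squared vector correlation
\begin{equation*}
  \rho_V^2(Y, Z) = \frac{\det(Y^\top Z)^2}{\det(Y^\top Y)\,\det(Z^\top Z)} .
\end{equation*}
With $\widehat{B} = (X^\top X)^{-1} X^\top Y$ the least-squares estimate and
$P_X = X (X^\top X)^{-1} X^\top$ the projection onto the column space of $X$,
\begin{equation*}
  \rho_V^2\bigl(Y, X\widehat{B}\bigr)
  = \max_{B}\, \rho_V^2(Y, XB)
  = \frac{\det\bigl(Y^\top P_X Y\bigr)}{\det(Y^\top Y)} ,
\end{equation*}
so no other linear transformation $XB$ of the explanatory variables attains a
larger squared vector correlation with $Y$.

\end{document}
```

One way to see the maximization (not necessarily the paper's own argument) is the generalized Cauchy–Schwarz inequality $\det(A^\top B)^2 \le \det(A^\top A)\det(B^\top B)$ applied with $A = P_X Y$ and $B = XB$, since every $XB$ lies in the column space of $X$.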