## Dr. Mohamed Ibrahim Mohamed Al-Hur

Assistant Professor of Statistics, Community College

## First Paper

Predicting Multivariate Responses in Multiple Linear Regression: Case of One Different Independent Variable.

Abstract

In studying the problem of predicting several response variables from the same set of independent variables, the use of correlations between the response variables to improve predictive accuracy is considered and compared with the usual procedure of running a separate regression of each response variable on the common set of predictor variables. Breiman and Friedman (1997) introduced a procedure called the curds and whey method. It can substantially reduce prediction error when the responses are correlated, while maintaining accuracy even when they are uncorrelated. They applied the procedure in the setting where every response variable depends on the same full set of explanatory variables. In this paper we apply the method in the setting where each response variable depends on one distinct independent variable.
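The setting studied here, in which each response has its own single predictor but the responses remain linked through correlated errors, can be illustrated with a small simulation. This is a sketch only; the coefficient values, error covariance, and sample size below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 200, 3                      # observations, number of responses

# Each response y_j depends on its own single predictor x_j (the setting of
# this paper), but the error terms are correlated across responses.
X = rng.normal(size=(n, q))        # column j is the predictor for response j
B = np.diag([1.5, -2.0, 0.8])      # diagonal: y_j uses only x_j
err_cov = 0.5 * np.ones((q, q)) + 0.5 * np.eye(q)   # correlated noise
E = rng.multivariate_normal(np.zeros(q), err_cov, size=n)
Y = X @ B + E

# Usual procedure: a separate least-squares regression of each response on
# its own predictor, ignoring the correlation between the responses.
coefs = np.array([np.polyfit(X[:, j], Y[:, j], 1)[0] for j in range(q)])
print(coefs)                       # slope estimates, close to diag(B)
```

The separate regressions recover the individual slopes but make no use of the error correlation, which is exactly the information the curds and whey method exploits.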

Keywords: curds and whey method; canonical regression; cross-validation.

1. Introduction

The idea of predicting several quantities from a common set of predictor variables has many applications and has been studied by many authors (van der Merwe and Zidek, 1980, 1989; Breiman and Friedman, 1997; Srivastava and Solanky, 2003). The major paper by Breiman and Friedman (1997) is considered one of the basic papers in this area; it concentrates on taking advantage of the correlations between response variables to improve predictive accuracy over the usual procedures used for this problem.
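A minimal numpy sketch of the population-level curds and whey shrinkage described by Breiman and Friedman (1997): OLS predictions are transformed to the canonical co-ordinates of the responses, each co-ordinate is shrunk by a factor d_i = c_i^2 / (c_i^2 + r(1 - c_i^2)) with r = p/n and c_i the canonical correlations, and the result is mapped back. The function name and simulated data are our own; the published procedure additionally replaces these population factors with cross-validated estimates.

```python
import numpy as np

def curds_and_whey(X, Y):
    """Shrink the OLS predictions of several responses in the canonical
    co-ordinate system of Y (population-level shrinkage factors)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)

    # least-squares fit of every response on the full set of predictors
    B_ols, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
    Y_ols = Xc @ B_ols

    # squared canonical correlations c2 and response transform T solve
    # (Y'Y)^{-1} Y'X (X'X)^{-1} X'Y t = c2 t; note Y'Y_ols = Y'X(X'X)^{-1}X'Y
    M = np.linalg.solve(Yc.T @ Yc, Yc.T @ Y_ols)
    c2, T = np.linalg.eig(M)
    c2, T = np.real(c2), np.real(T)

    # shrink each canonical co-ordinate: d_i = c_i^2 / (c_i^2 + r (1 - c_i^2))
    r = p / n
    d = c2 / (c2 + r * (1.0 - c2))

    # shrink in canonical co-ordinates, then map back: Y_cw = Y_ols T D T^{-1}
    return Y.mean(axis=0) + Y_ols @ T @ np.diag(d) @ np.linalg.inv(T)

# illustrative data: 3 correlated responses, 4 shared predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
E = rng.multivariate_normal(np.zeros(3),
                            0.5 * np.ones((3, 3)) + 0.5 * np.eye(3), size=100)
Y = X @ rng.normal(size=(4, 3)) + E
Y_cw = curds_and_whey(X, Y)
```

When the responses are uncorrelated the canonical structure carries little information and the shrinkage factors stay close to their separate-regression values, which is why the method loses little accuracy in that case.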

5.2 The Conclusion

## REFERENCES

Anderson, T.W., (1957): An Introduction to Multivariate Analysis. New York: Wiley.

Breiman, L. and Friedman, J.H., (1997): Predicting multivariate responses in multiple linear regression. J. R. Statist. Soc. B, 59, 3-54.

Brown, P.J. and Zidek, J.V., (1980): Adaptive multivariate ridge regression. Ann. Statist., 8, 64-74.

Brown, P.J. and Zidek, J.V.,  (1982): Multivariate regression shrinkage estimators with unknown covariance matrix. Scand. J. Statist., 9, 209-215.

Copas, J.B., (1983): Regression, prediction and shrinkage (with discussion). J. R. Statist. Soc. B, 45, 311-354.

Copas, J.B., (1987): Cross-validation shrinkage of regression predictors. J. R. Statist. Soc. B, 49, 175-183.

Craven, P. and Wahba, G., (1979): Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numer. Math., 31, 317-403.

Garthwaite, P.H., (1994): An interpretation of partial least squares. J. Am. Statist. Ass., 89, 122-127.

Hoerl, A.E. and Kennard, R.W., (1970): Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 8, 27-51.

Izenman, A.J., (1975): Reduced-rank regression for the multivariate linear model. J. Multiv. Anal., 5, 248-264.

Massey, W.F., (1965): Principal components regression in exploratory statistical research. J. Am. Statist. Ass., 60, 234-246.

Srivastava, M.S. and Solanky, T.K.S., (2003): Predicting multivariate response in linear regression model. Communications in Statistics - Simulation and Computation, 32, 389-409.

Van der Merwe, A. and Zidek, J.V., (1980): Multivariate regression analysis and canonical variates. Can. J. Statist., 8, 27-39.

Skagerberg, B., MacGregor, J. and Kiparissides, C., (1992): Multivariate data analysis applied to low-density polyethylene reactors. Chemometr. Intell. Lab. Syst., 14, 341-356.

Stone, M., (1974): Cross-validatory choice and assessment of statistical predictions (with discussion). J. R. Statist. Soc. B., 36, 111-147.

Zidek, J., (1978): Deriving unbiased risk estimators of multinormal mean and regression coefficient estimators using zonal polynomials. Ann. Statist., 6, 769-782.
