Career, Family And Living For The Lord
-
A Twenty-Five Year History

by James Thomas Lee, Jr. 12/25/97 Copyrighted 1995 by James Thomas Lee, Jr. Copyright Number: XXx xxx-xxx

Appendices

Appendix B. Multiple Linear Regression {249 words}

1. For multiple linear regression, the following sum-of-squared-deviations function is defined: f = SUM[(y(i) - m1*x1(i) - m2*x2(i) - b)**2]

2. Just as with the simpler linear regression discussed above, multiple linear regression seeks the optimum values of m1, m2, and b that minimize the total variance between the observed and estimated values of y(i). Therefore, consistent with the former case, we compute Partial(f) / Partial(m1), Partial(f) / Partial(m2), and Partial(f) / Partial(b), and set each to zero. This yields a system of three linear equations in the three unknowns m1, m2, and b, which we then solve by again using the concepts of matrix theory.

3. Proceeding in the same manner as before, we obtain the following solutions:


              G (DF - E**2)  +  H (EC - BF)  +  J (BE - CD)
 m1  =    -----------------------------------------------------  ,
              A (DF - E**2)  +  B (EC - BF)  +  C (BE - CD)


              G (CE - BF)  +  H (AF - C**2)  +  J (BC - AE)
 m2  =    -----------------------------------------------------  ,
              B (CE - BF)  +  D (AF - C**2)  +  E (BC - AE)


              G (BE - CD)  +  H (BC - AE)  +  J (AD - B**2)
 b   =    -----------------------------------------------------  ,  where
              C (BE - CD)  +  E (BC - AE)  +  F (AD - B**2)


        A = SUM(x1(i)**2) , B = SUM(x1(i)*x2(i)) , C = SUM(x1(i)) , D = SUM(x2(i)**2) ,
        E = SUM(x2(i)) , F = n , G = SUM(x1(i)*y(i)) , H = SUM(x2(i)*y(i)) , J = SUM(y(i)).
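The closed-form solutions above can be sketched in a few lines of code. The following is a minimal pure-Python illustration (the function name and the sample data are hypothetical, chosen only so the fit can be checked by hand); each sum and ratio follows the formulas in this appendix exactly.

```python
def multi_linear_fit(x1, x2, y):
    """Fit y = m1*x1 + m2*x2 + b using the closed-form solutions above."""
    n = len(y)
    # The nine sums defined in the appendix.
    A = sum(v * v for v in x1)                # SUM(x1(i)**2)
    B = sum(u * v for u, v in zip(x1, x2))    # SUM(x1(i)*x2(i))
    C = sum(x1)                               # SUM(x1(i))
    D = sum(v * v for v in x2)                # SUM(x2(i)**2)
    E = sum(x2)                               # SUM(x2(i))
    F = n
    G = sum(u * v for u, v in zip(x1, y))     # SUM(x1(i)*y(i))
    H = sum(u * v for u, v in zip(x2, y))     # SUM(x2(i)*y(i))
    J = sum(y)                                # SUM(y(i))

    # Each ratio is written exactly as in the appendix; the three
    # denominators are different expansions of the same determinant.
    m1 = (G*(D*F - E*E) + H*(E*C - B*F) + J*(B*E - C*D)) / \
         (A*(D*F - E*E) + B*(E*C - B*F) + C*(B*E - C*D))
    m2 = (G*(C*E - B*F) + H*(A*F - C*C) + J*(B*C - A*E)) / \
         (B*(C*E - B*F) + D*(A*F - C*C) + E*(B*C - A*E))
    b  = (G*(B*E - C*D) + H*(B*C - A*E) + J*(A*D - B*B)) / \
         (C*(B*E - C*D) + E*(B*C - A*E) + F*(A*D - B*B))
    return m1, m2, b

# The y values here were generated from y = 2*x1 + 3*x2 + 1,
# so the fit recovers m1 = 2, m2 = 3, b = 1.
m1, m2, b = multi_linear_fit([1, 2, 3, 4], [2, 1, 4, 3], [9, 8, 19, 18])
```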

Appendix C. Polynomial Least Squares Regression


Send email to: tlee6040@aol.com