How to Solve Linear Regression

Sometimes after plotting a set of data, a linear relationship appears to exist between the dependent variable and the independent variables. In many cases, researchers want to solve the linear regression problem to find the linear function that best relates the dependent and independent variables. The standard way to solve linear regression is the method of least squares. To use least squares to derive the linear regression solution, you should have a solid background in linear algebra or matrix algebra.

Instructions

    • 1

      Label your data as "X" and "y." The matrix "X" holds your data, with one row per observation and one column per independent variable (plus a leading column of ones if the model includes an intercept), while the vector "y" holds the observed values of the dependent variable.

    • 2

      Set up the residual sum of squares function. Introduce a vector of unknowns, "beta," representing the coefficients of the linear regression function. The residual sum of squares is RSS(beta) = t(y - Xbeta)(y - Xbeta), where "t()" is the transpose function, which gives the transpose of a matrix (switching rows for columns).

    • 3

      Take the first derivative of the residual sum of squares with respect to "beta," using standard matrix calculus. Expanding RSS(beta) = t(y)y - 2t(beta)t(X)y + t(beta)t(X)Xbeta and differentiating term by term gives -2t(X)y + 2t(X)Xbeta, which factors as -2t(X)(y - Xbeta).

    • 4

      Set the derivative equal to zero. This yields the equation -2t(X)(y - Xbeta) = 0. Dividing both sides by -2 removes the constant, leaving t(X)(y - Xbeta) = 0, or equivalently the "normal equations" t(X)Xbeta = t(X)y.

    • 5

      Solve the equation for beta. Matrix algebra reveals that the solution is beta = inv[t(X)X]t(X)y, where "inv()" gives the inverse of a matrix. This expression lets you compute beta as a vector of numbers from your data, provided t(X)X is invertible, which holds whenever the columns of X are linearly independent. Call this estimate "betahat."

    • 6

      Write the linear regression equation. The fitted linear regression equation is yhat = Xbetahat. In this equation "X" is not your original data matrix, but a matrix holding new values of the independent variables; substituting new observations into the equation yields predicted values of the dependent variable, as shown in the sketch after these steps.
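
The steps above translate directly into a few lines of NumPy. The sketch below is a minimal illustration, not a reference implementation: the x and y values are invented placeholders, and it solves the normal equations with np.linalg.solve rather than forming inv[t(X)X] explicitly, which is the numerically safer route. Everything else follows the steps verbatim.

```python
import numpy as np

# Hypothetical example data (invented for illustration): five observations
# of one independent variable. A column of ones is prepended so the model
# fits an intercept as well as a slope.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([np.ones_like(x), x])   # step 1: data matrix X
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])    # step 1: response vector y

# Step 2: residual sum of squares, RSS(beta) = t(y - Xbeta)(y - Xbeta).
def rss(beta):
    r = y - X @ beta
    return r @ r

# Step 3: the matrix-calculus derivative of RSS, -2 t(X)(y - Xbeta).
def rss_gradient(beta):
    return -2.0 * X.T @ (y - X @ beta)

# Steps 4-5: setting the derivative to zero gives the normal equations
# t(X)Xbeta = t(X)y. np.linalg.solve solves them directly, which avoids
# computing inv[t(X)X] explicitly.
betahat = np.linalg.solve(X.T @ X, X.T @ y)

# Sanity check: at betahat the gradient should be (numerically) zero.
assert np.allclose(rss_gradient(betahat), 0.0, atol=1e-8)

# Step 6: use the fitted equation yhat = Xbetahat to predict new values.
X_new = np.column_stack([np.ones(2), np.array([6.0, 7.0])])
print("betahat:", betahat)              # estimated intercept and slope
print("RSS at betahat:", rss(betahat))
print("predictions:", X_new @ betahat)
```

In practice, np.linalg.lstsq(X, y, rcond=None) computes the same betahat through a more robust matrix factorization and is preferable when t(X)X is nearly singular.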
