Least squares regression calculates the statistical parameters of a predictive model in which the value of a dependent variable y is estimated from known values of a set of j independent variables x(i). The mathematics behind the j x-coefficients minimizes the sum of the squared differences between the n observed y values and the values the model predicts, yielding an intercept a and a set of coefficients b(i) such that y = b(1)x(1) + ... + b(j)x(j) + a provides the best predictive model.
Instructions
1. Obtain n, the sample size: the number of observed data points used to calculate the least squares coefficients of the model.
2. Calculate the sum of the n observed values of each of the j variables x(i), and the sum of the n observed y values, and call them Sum(X(i)) and Sum(Y).
3. Square each of the j sums Sum(X(i)) and call the results (Sum(X(i)))^2.
4. Calculate the sum of the squares of the n observed values of each x(i) and call it Sum(x(i)^2). Note the difference from step 3: here each value is squared before summing, rather than the sum being squared.
5. Calculate the sum of the cross products of each observed x(i) value with its paired observed y value, and call it Sum(x(i)y).
6. Calculate each of the j regression coefficients b(i) using the following formula:
b(i) = [n·Sum(x(i)y) - Sum(X(i))·Sum(Y)] / [n·Sum(x(i)^2) - (Sum(X(i)))^2].
7. Calculate the intercept a, using the fact that the fitted model passes through the sample means:
a = [Sum(Y) - b(1)·Sum(X(1)) - ... - b(j)·Sum(X(j))] / n.
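The steps above can be sketched in plain Python. The function name and argument layout here are illustrative, and when j > 1 this computes each b(i) with the per-variable formula from the instructions, not a full multiple regression solve:

```python
def least_squares(xs, y):
    """Fit y = b(1)x(1) + ... + b(j)x(j) + a by the step-by-step sums.

    xs: list of j lists, each holding the n observed values of x(i)
    y:  list of the n observed y values
    Returns (bs, a): the list of coefficients b(i) and the intercept a.
    """
    n = len(y)                 # step 1: the sample size
    sum_y = sum(y)             # step 2: Sum(Y)
    bs = []
    for x in xs:
        sum_x = sum(x)                             # step 2: Sum(X(i))
        sum_x2 = sum(v * v for v in x)             # step 4: Sum(x(i)^2)
        sum_xy = sum(v * w for v, w in zip(x, y))  # step 5: Sum(x(i)y)
        # step 6: the coefficient formula; (Sum(X(i)))^2 is step 3
        b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
        bs.append(b)
    # step 7: the intercept from the sample means
    a = (sum_y - sum(b * sum(x) for b, x in zip(bs, xs))) / n
    return bs, a
```

For example, the points (1, 3), (2, 5), (3, 7), (4, 9) lie exactly on y = 2x + 1, so `least_squares([[1, 2, 3, 4]], [3, 5, 7, 9])` returns a coefficient of 2 and an intercept of 1.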