# AP Statistics Curriculum 2007 GLM Regress



## Revision as of 04:05, 18 February 2008


## General Advance-Placement (AP) Statistics Curriculum - Regression

As we discussed in the Correlation section, many applications involve the analysis of relationships between two or more variables involved in the process of interest. Suppose we have bivariate data (*X* and *Y*) of a process and we are interested in determining the linear relation between X and Y (e.g., determining a straight line that best fits the pairs of data (*X*, *Y*)). A linear relationship between *X* and *Y* gives us the power to make predictions - i.e., given a value of *X*, predict a corresponding *Y* response. Note that in this design, the data consist of paired observations (*X*, *Y*) - for example, the Longitude and Latitude of the SOCR Earthquake dataset.

### Lines in 2D

There are 3 types of lines in 2D planes: vertical lines, horizontal lines, and oblique lines. In general, the mathematical representation of lines in 2D is given by equations of the form *aX* + *bY* = *c*, most frequently expressed as *Y* = *aX* + *b*, provided the line is not vertical.

Recall that there is a one-to-one correspondence between any line in 2D and (linear) equations of the form *aX* + *bY* = *c*:

- If the line is **vertical** (*X*_{1} = *X*_{2}): *X* = *X*_{1};
- If the line is **horizontal** (*Y*_{1} = *Y*_{2}): *Y* = *Y*_{1};
- Otherwise (**oblique** line): $$\frac{Y - Y_1}{Y_2 - Y_1} = \frac{X - X_1}{X_2 - X_1},$$ for $X_1 \neq X_2$ and $Y_1 \neq Y_2$,

where (*X*_{1}, *Y*_{1}) and (*X*_{2}, *Y*_{2}) are two points on the line of interest (2 distinct points in 2D determine a unique line).
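The two-point construction above can be sketched in code; this is a minimal illustration, and the helper name `line_through` is hypothetical, not from the article:

```python
# Recover the slope and intercept of the line through two distinct points
# in 2D, in the slope-intercept form Y = a*X + b (non-vertical lines only).

def line_through(p1, p2):
    """Return (a, b) of Y = a*X + b through points p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2:
        # a vertical line X = x1 has no slope-intercept form
        raise ValueError("vertical line: X = %r" % x1)
    a = (y2 - y1) / (x2 - x1)   # slope
    b = y1 - a * x1             # intercept
    return a, b

# The two example lines below: Y = 2X + 1 and Y = -3X - 5
print(line_through((0, 1), (1, 3)))     # → (2.0, 1.0)
print(line_through((0, -5), (1, -8)))   # → (-3.0, -5.0)
```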

- Try drawing the following lines manually and using this applet:

- Y=2X+1
- Y=-3X-5

### Linear Modeling - Regression

There are two contexts for regression:

- Y is an observed variable and X is specified by the researcher - e.g., Y is hair growth after X months, for individuals at certain dose levels of hair growth cream.

- X and Y are both observed variables - e.g., Height (Y) and weight (X) for 20 randomly selected individuals from the population.

Suppose we have *n* pairs of observations (*x*_{i}, *y*_{i}), for *i* = 1, ..., *n*, of the same process. If a scatterplot of the data suggests a general linear trend, it would be reasonable to fit a line to the data. The main question is: how do we determine the best line?

#### Airfare Example

We can see from the scatterplot that greater distance is associated with higher airfare. In other words, airports that are farther from Baltimore tend to have more expensive airfares. To decide on the best-fitting line, we use the **least-squares method** to fit the least squares (regression) line.

#### Estimating the Best Linear Fit

The parameters of the linear regression line, *Y* = *a* + *bX*, can be estimated using Least Squares. This method finds the line that minimizes the sum of the squares of the regression **residuals**, $\sum_{i=1}^n \hat{\varepsilon}_i^2 = \sum_{i=1}^n (y_i - \hat{y}_i)^2$, where $y_i$ and $\hat{y}_i$ are the observed and the predicted values of *Y* for $x_i$, respectively.

The minimization problem can be solved using calculus, by finding the first-order partial derivatives and setting them equal to zero. The solution gives the slope and y-intercept of the regression line:

- Regression line slope: $$b = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}$$

- Y-intercept: $$a = \bar{y} - b\bar{x}$$

- Properties of the least-squares line:
  - The line goes through the point $(\bar{x}, \bar{y})$.
  - The sum of the residuals is equal to zero.
  - The estimates are unbiased (their expected values are equal to the true slope and intercept values).
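The least-squares estimates and the residual property above can be sketched in Python; `least_squares` is a hypothetical helper written for illustration, not code from the article:

```python
# Compute the least-squares slope and intercept for paired data,
# using the closed-form solution derived above.

def least_squares(xs, ys):
    """Return (a, b) of the fitted line Y = a + b*X."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # slope: b = sum((x - x̄)(y - ȳ)) / sum((x - x̄)^2)
    b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
        / sum((x - x_bar) ** 2 for x in xs)
    a = y_bar - b * x_bar   # the line passes through (x̄, ȳ)
    return a, b

# Noise-free data on the line Y = 2X + 1 recovers it exactly
xs, ys = [0, 1, 2], [1, 3, 5]
a, b = least_squares(xs, ys)
print(a, b)                                          # → 1.0 2.0
# the residuals sum to zero, as the properties state
print(sum(y - (a + b * x) for x, y in zip(xs, ys)))  # → 0.0
```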

### Regression Coefficients Inference

If the error terms are Normally distributed, the estimate of the slope coefficient has a Normal distribution with mean equal to **b** and *standard error* given by:

$$ s_{\hat{b}} = \sqrt{ \frac{1}{N-2} \cdot \frac{\sum_{i=1}^N \hat{\varepsilon}_i^2}{\sum_{i=1}^N (x_i - \bar{x})^2} }. $$

A confidence interval for *b* can be created using a T-distribution with N-2 degrees of freedom:

$$ b \pm t_{(\alpha/2,\, N-2)} \, s_{\hat{b}}. $$

### Hands-on Example

Suppose we have a sample of 3 points {(1, -1), (2, 4), (6, 3)}. The mean of X is 3 and the mean of Y is 2. The slope coefficient estimate is given by:

$$ b = \frac{\sum_{i=1}^3 (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^3 (x_i - \bar{x})^2} = \frac{(1-3)(-1-2) + (2-3)(4-2) + (6-3)(3-2)}{(1-3)^2 + (2-3)^2 + (6-3)^2} = \frac{7}{14} = 0.5. $$

Using the fitted line $\hat{y} = 0.5 + 0.5x$ (intercept $a = \bar{y} - b\bar{x} = 0.5$), the residual sum of squares is $10.5$, so the standard error of the slope coefficient (*b*) is $s_{\hat{b}} = \sqrt{10.5 / ((3-2) \times 14)} = 0.866$. A 95% confidence interval is given by:

- CI(b): [0.5 - 0.866 x 12.7062, 0.5 + 0.866 x 12.7062] = [-10.504, 11.504].
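The numbers in this example can be checked with a short script; this is an illustrative sketch, not part of the original page (the t-quantile 12.7062 is the value quoted in the interval above):

```python
# Verify the hands-on example: slope, standard error, and 95% CI
# for the 3-point sample {(1,-1), (2,4), (6,3)}.
import math

pts = [(1, -1), (2, 4), (6, 3)]
n = len(pts)
x_bar = sum(x for x, _ in pts) / n          # 3.0
y_bar = sum(y for _, y in pts) / n          # 2.0

sxx = sum((x - x_bar) ** 2 for x, _ in pts)                 # 14.0
b = sum((x - x_bar) * (y - y_bar) for x, y in pts) / sxx    # 0.5
a = y_bar - b * x_bar                                       # 0.5

# residual sum of squares around the fitted line y = a + b*x
rss = sum((y - (a + b * x)) ** 2 for x, y in pts)           # 10.5
se_b = math.sqrt(rss / (n - 2) / sxx)                       # ~0.866

t = 12.7062   # t-quantile for a 95% CI with N-2 = 1 df (from a t-table)
ci = (b - t * se_b, b + t * se_b)
print(round(b, 3), round(se_b, 3), [round(v, 3) for v in ci])
# → 0.5 0.866 [-10.504, 11.504]
```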

### Earthquake Example

Use the SOCR Earthquake Dataset to determine the best linear fit between the Longitude and the [Latitude](http://nationalatlas.gov/articles/mapping/a_latlong.html) of the California earthquakes since 1900. What is the interpretation of this regression line (San Andreas Fault)? You can see the SOCR Geomap of these Earthquakes.

### References

- SOCR Home page: http://www.socr.ucla.edu
