# Lecture 22: Residuals, Multicollinearity, Inference

## Least Squares Regression

• orthogonality: two vectors are orthogonal when their dot product is 0
• the residual vector Y − X theta hat is orthogonal to every column of X, so X^T (Y − X theta hat) = 0 (the normal equations)
• if X^T X is full rank, invert it for the solution: theta hat = (X^T X)^{-1} X^T Y
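A minimal numpy sketch of the normal-equation solution above; the data, seed, and coefficient values are made-up assumptions for illustration:

```python
import numpy as np

# Hypothetical data: n = 5 observations of one feature.
rng = np.random.default_rng(0)
x = np.arange(5.0)
Y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=5)

# Design matrix X: first column is all 1s for the intercept.
X = np.column_stack([np.ones(5), x])

# If X^T X is full rank, theta_hat = (X^T X)^{-1} X^T Y (normal equations).
theta_hat = np.linalg.inv(X.T @ X) @ X.T @ Y

# In practice np.linalg.lstsq is preferred over an explicit inverse.
theta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(theta_hat)  # close to [2, 3]
```

Both routes give the same answer here; `lstsq` is numerically safer when `X^T X` is close to singular.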

## A Regression Model

• X is the design matrix; its first column is all 1s (for the intercept)
• theta is the parameter vector (intercept and slopes)

## Residuals

• residual e = Y − Y hat: the difference between the observed response and the fitted (estimated) value

## Separating Signal and Noise

• two decompositions of Y: true signal + noise (the model) and prediction + residual (the fit)

## Residuals Sum to Zero

• when the model includes an intercept, the residuals sum to zero

• the average of the fitted values equals the average of the observed responses

• the fitted values are orthogonal to the residuals
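The three facts above can be checked numerically; a sketch on synthetic data (all numbers and the seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.uniform(0, 10, size=n)
Y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])   # intercept column included
theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ theta_hat
e = Y - Y_hat                          # residuals

print(e.sum())                 # ~0: residuals sum to zero (intercept in model)
print(Y_hat.mean(), Y.mean())  # averages match
print(e @ Y_hat)               # ~0: residuals orthogonal to fitted values
```

All three properties follow from the normal equations `X^T e = 0`: the first row of `X` being all 1s forces the residual sum to zero, and `Y_hat = X theta_hat` lies in the column space of `X`, which `e` is orthogonal to.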

## Multiple R^2, Observed Responses, and Fitted Values

• Multiple R^2, also called the coefficient of determination

• R^2 = variance of the fitted values / variance of the observed responses
• interpreted as the "percent of variance explained by the model"
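A quick sketch of computing R^2 both ways on synthetic data (the data and seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x = rng.normal(size=n)
Y = 0.5 + 1.5 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ theta_hat

# Multiple R^2: variance of fitted values over variance of observed responses.
r2 = np.var(Y_hat) / np.var(Y)

# Equivalent form: squared correlation between Y and Y_hat.
r2_corr = np.corrcoef(Y, Y_hat)[0, 1] ** 2
print(r2)
```

The two forms agree for least squares with an intercept, which is why R^2 is read as the fraction of the response's variance captured by the fit.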

## Collinearity and the Meaning of Slope

• slope on x_1: the change in y per unit change in x_1, holding all other variables constant

• collinearity: when one covariate can be predicted (nearly) by a linear function of the others; then "holding all other variables constant" is not meaningful and the individual slopes are unstable
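One way to see collinearity numerically is through the condition number of the design matrix; a sketch with two nearly identical covariates (the construction is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-3, size=n)  # x2 is almost a linear function of x1

X = np.column_stack([np.ones(n), x1, x2])

# A large condition number signals collinearity: X^T X is nearly singular,
# so individual slope estimates are poorly determined even though the
# combined prediction x1*a + x2*b can still fit well.
print(np.linalg.cond(X))
```

With independent covariates the condition number would be modest; here it blows up because the columns for `x1` and `x2` point in almost the same direction.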

## Inference and Assumptions of Randomness

• the model can be expressed as Y = intercept + weighted sum of the features + error

• we have to estimate the weights (slopes) from data

• how do we test the hypothesis that theta_1 = 0 (i.e., that x_1 has no effect)?
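One classical route to that test is a t statistic for the slope; a sketch under the usual normal-error assumptions, with made-up data and seed:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 60
x = rng.normal(size=n)
Y = 2.0 + 0.8 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
e = Y - X @ theta_hat

# Estimate the noise variance with n - p degrees of freedom (p = 2 params).
p = X.shape[1]
sigma2_hat = e @ e / (n - p)

# Var(theta_hat) = sigma^2 (X^T X)^{-1}; the slope's standard error is the
# square root of the corresponding diagonal entry.
cov = sigma2_hat * np.linalg.inv(X.T @ X)
se_slope = np.sqrt(cov[1, 1])

# t statistic for H0: theta_1 = 0; compare against a t distribution
# with n - p degrees of freedom.
t_stat = theta_hat[1] / se_slope
print(t_stat)
```

A |t| well above ~2 is evidence against theta_1 = 0 at the usual 5% level; libraries like statsmodels report this same statistic in their OLS summaries.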

## Confidence Intervals for True Slope

• two routes: bootstrap the slope estimate, or build a confidence interval from normal theory; if 0 falls outside the interval, reject theta_1 = 0
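The bootstrap route can be sketched directly: resample (x_i, Y_i) pairs with replacement, refit, and take percentiles of the refitted slopes (the data, seed, and replicate count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 80
x = rng.uniform(0, 5, size=n)
Y = 1.0 + 2.0 * x + rng.normal(size=n)

def fit_slope(x, Y):
    X = np.column_stack([np.ones(len(x)), x])
    theta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return theta[1]

# Bootstrap: resample rows with replacement, refit, collect slopes.
boot_slopes = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boot_slopes.append(fit_slope(x[idx], Y[idx]))

# 95% percentile confidence interval for the true slope.
lo, hi = np.percentile(boot_slopes, [2.5, 97.5])
print(lo, hi)
```

If 0 lies outside `[lo, hi]`, the data are inconsistent with theta_1 = 0; the bootstrap avoids the normality assumption that the t-based interval relies on.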