Projections

To fully understand the linear model, we will develop a geometric interpretation.

  • When $\langle x, z \rangle = 0$, then $\cos(\theta) = 0$ and $x$, $z$ are orthogonal.

  • Given a subspace $S \subseteq \mathbb{R}^n$, any vector $y \in \mathbb{R}^n$ can be written as $y = \hat{y} + y^{\perp}$, where $\hat{y} \in S$ and $y^{\perp} \in S^{\perp}$

  • $\hat{y}$ is the projection of $y$ onto $S$

  • There exist matrices $P$ and $M$ such that $Py = \hat{y}$ and $My = y^{\perp}$

How do we compute $P$ and $M$?

When $S$ is the column space of a matrix $X$ with full column rank:

$$P = X(X'X)^{-1}X'$$ $$M = I - P = I - X(X'X)^{-1}X'$$
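To see where this formula comes from: $\hat{y} \in S$ means $\hat{y} = Xb$ for some coefficient vector $b$, and $y - \hat{y} \in S^{\perp}$ requires $X'(y - Xb) = 0$. Solving,

$$X'(y - Xb) = 0 \implies b = (X'X)^{-1}X'y \implies \hat{y} = X(X'X)^{-1}X'y = Py$$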

$PX = X$

$MX = 0$

$P$ and $M$ are symmetric and idempotent. $P$ and $M$ are orthogonal to each other: $PM = 0$.
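A minimal numerical sketch of these properties, assuming NumPy and a randomly generated design matrix (both illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.standard_normal((n, k))       # any full-column-rank n x k matrix

P = X @ np.linalg.inv(X.T @ X) @ X.T  # projector onto the column space of X
M = np.eye(n) - P                     # projector onto the orthogonal complement

assert np.allclose(P @ X, X)                             # PX = X
assert np.allclose(M @ X, 0)                             # MX = 0
assert np.allclose(P, P.T) and np.allclose(M, M.T)       # symmetric
assert np.allclose(P @ P, P) and np.allclose(M @ M, M)   # idempotent
assert np.allclose(P @ M, 0)                             # orthogonal: PM = 0
```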

Let's now connect this to regression.

$y = \hat{y} + y^{\perp}$ $\implies \hat{y}'y^{\perp} = 0$: the two components are orthogonal. The population analogue of this condition is $E(\hat{y}\,y^{\perp}) = 0$.

Connection to BLP

  • Recall that for BLP: $y = x' \beta + e$ and $E(xe) = 0$

  • The BLP is equivalent to a projection of the dependent variable $y$ onto the linear space spanned by the independent variables $x$! The error term is orthogonal to this space.

For OLS:

$\hat{\beta} = (X'X)^{-1}X'y$

$\hat{y} = Py$

$\hat{e} = My$

Least Squares as a Linear Projection!
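As a check, the following sketch (again assuming NumPy, with simulated data) computes the OLS quantities directly and confirms they equal the $P$ and $M$ projections of $y$; the last assertion is the sample analogue of the BLP condition $E(xe) = 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # intercept + one regressor
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(n)      # simulated outcome

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # (X'X)^{-1} X'y
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

y_hat = X @ beta_hat   # fitted values
e_hat = y - y_hat      # residuals

assert np.allclose(y_hat, P @ y)    # y_hat = Py
assert np.allclose(e_hat, M @ y)    # e_hat = My
assert np.allclose(X.T @ e_hat, 0)  # residuals orthogonal to the regressors
```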