Lesson 13: Weighted Least Squares & Robust Regression


So far we have used ordinary least squares to estimate the regression line. However, features of the data (such as nonconstant variance or outliers) may call for a different estimation method. This lesson introduces some of the other available methods for estimating regression lines. To help with the discussions in this lesson, recall that the ordinary least squares estimate is

\[\begin{align*} \hat{\beta}_{\textrm{OLS}}&=\arg\min_{\beta}\sum_{i=1}^{n}\epsilon_{i}^{2} \\ &=(\textbf{X}^{\textrm{T}}\textbf{X})^{-1}\textbf{X}^{\textrm{T}}\textbf{Y} \end{align*}\]

Because alternative estimators will be introduced in this lesson, the ordinary least squares estimate is written here as \(\hat{\beta}_{\textrm{OLS}}\) instead of \(\textbf{b}\).
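As a quick illustration of the closed-form expression above, here is a minimal sketch in Python with NumPy; the simulated data and variable names are invented for the example and are not part of the lesson's datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations of one predictor, with known coefficients.
n = 50
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=n)

# Design matrix X with a leading column of ones for the intercept.
X = np.column_stack([np.ones(n), x])

# OLS estimate from the normal equations: solve (X'X) beta = X'Y,
# which is numerically preferable to forming the inverse explicitly.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's built-in least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_ols)    # roughly [2.0, 0.5]
print(beta_lstsq)  # agrees with beta_ols
```

The weighted and robust estimators discussed later in this lesson modify this objective (for example, by weighting the squared residuals), so the OLS computation above serves as the baseline for comparison.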