Week 6: Matrix Decomposition

Stat 431



Time Estimates:
     Videos: 10 min
     Readings: 30 min
     Activities: 60 min
     Check-ins: 2


Matrix Decomposition

Recall the following expression you used last week to compute the coefficient estimates for multiple regression:

\(\hat{\beta} = (X'X)^{-1} X'Y\)

There were some very nice and simple functions for performing these operations in R. What you may not have noticed, though, is that computing the inverse of \(X'X\) can be quite costly. In particular, if we have a large number of predictors, then \(X'X\) is very large, and it can be computationally infeasible to invert without taking special care. What is this special care, you ask? Stay awhile and listen!
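
As a quick refresher, here is a minimal sketch of that direct computation in R. The simulated X and y below are just stand-ins for real data; the solve() call is where the expensive inversion happens.

     # Direct computation of the coefficient estimates, with simulated data
     set.seed(431)
     n <- 100
     X <- cbind(1, matrix(rnorm(n * 3), nrow = n))   # intercept + 3 predictors
     y <- as.numeric(X %*% c(2, 1, -1, 0.5) + rnorm(n))

     # Coefficient estimates via the normal equations: (X'X)^{-1} X'Y
     beta_hat <- solve(t(X) %*% X) %*% t(X) %*% y
     beta_hat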

One way to ease the task of computing coefficient estimates in regression is to decompose the \(X\) matrix into more manageable pieces (in this case, “more manageable” means “easier to invert”). There are multiple ways to decompose a matrix depending on its properties, but the one we’ll explore here is called QR-decomposition.

Once again, we’re not going to dwell too much on the mathematical derivations, but now that you’ve gained some experience with matrix operations in R, you should try to understand the general ideas here. You are not responsible for knowing the calculus and linear algebra used in the video.


Required Video: QR-Decomposition


Notice that this video didn’t actually involve any R!


Check-In 1: QR-Decomposition


  1. The \(x\) in the video corresponds to what in our multiple regression?
  • The data matrix, \(X\)
  • The coefficient vector, \(\beta\)
  • The response vector, \(Y\)
  2. One requirement for the decomposition to work is that the columns of \(A\) must be linearly independent. This is already a requirement/assumption of multiple regression.
  • True
  • False
  • Maybe
  3. The end of the video implies that a different way to compute \(\hat{\beta}\) is
  • \(R^{-1}Q'b\)
  • \(Q'b\)
  • \(Q'Q\)

Canvas Link     

Because \(R\) is upper triangular, it is much easier to invert, so this QR-decomposition should be able to help us quite a bit!
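
If you’d like to see where that last check-in answer comes from, substitute \(X = QR\) into the normal equations and use the fact that \(Q'Q = I\):

\[\hat{\beta} = (R'Q'QR)^{-1}R'Q'Y = (R'R)^{-1}R'Q'Y = R^{-1}(R')^{-1}R'Q'Y = R^{-1}Q'Y\]

The last step works because the linear independence assumption guarantees that \(R\) is invertible.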

You might still be wondering, though, how we actually compute/obtain \(Q\) and \(R\). Without going further into the mathematics behind it, check out the following reading for how to do it in R.


Required Reading: QR-Decomposition in R


Notes:

  • Please also read through the documentation for the qr() functions and the other code used in this reading. There are multiple ways to employ these functions to help us with regression!

Check-In 2: QR-Decomposition in R


  1. What does the qr() function return?
  • The \(Q\) matrix
  • The \(R\) matrix
  • A list which includes the \(Q\) and \(R\) matrices
  2. Which function does this reading use to take advantage of \(R\) being upper triangular?
  • uppertri_solve()
  • solve()
  • backsolve()
  • easy_invert()

Canvas Link
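
Once you’ve finished the check-in, here is a minimal sketch of the full QR workflow in R, assuming the same kind of simulated X and y as above; the reading may organize the steps a bit differently.

     # QR-based regression: decompose X, then back-substitute
     set.seed(431)
     n <- 100
     X <- cbind(1, matrix(rnorm(n * 3), nrow = n))   # intercept + 3 predictors
     y <- as.numeric(X %*% c(2, 1, -1, 0.5) + rnorm(n))

     decomp <- qr(X)      # compact representation of the decomposition
     Q <- qr.Q(decomp)    # the Q matrix (orthonormal columns, so Q'Q = I)
     R <- qr.R(decomp)    # the R matrix (upper triangular)

     # Solve R beta = Q'Y by back-substitution; no explicit inverse needed
     beta_qr <- backsolve(R, t(Q) %*% y)

     # Compare to the direct formula and to lm()
     beta_direct <- solve(t(X) %*% X) %*% t(X) %*% y
     cbind(beta_qr, beta_direct, coef(lm(y ~ X - 1)))

As a shortcut, qr.coef(decomp, y) wraps the extraction and back-substitution steps into a single call.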