Videos: 30 min

Readings: 20 min

Activities: 40 min

Check-ins: 1

A **scalar** is a single number by itself.

In statistics, a scalar will usually be:

- A single data observation
- A parameter
- A summary statistic

In R, a scalar is just a single number; more precisely, it is a length-one "atomic vector":

`## [1] 5`
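For example, a single number typed at the R console already behaves as a length-one vector (a minimal sketch; the `[1]` in the printout is R labeling the position of the first element):

```r
x <- 5
x                # R prints even a lone number as a vector
## [1] 5
is.vector(x)     # a scalar is just a vector of length one
## [1] TRUE
```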

A **vector** is a set of scalars put together.

In statistics, a vector might be a set of *samples* from a single variable or a set of *observations of many variables* from a single sample.

In R, we use vectors all the time:

`## [1] 1 2 3`

`## [1] 5.1 4.9 4.7 4.6 5.0 5.4`
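One way to build vectors like the two shown above (the second matches the first six sepal lengths in R's built-in `iris` data):

```r
c(1, 2, 3)                 # combine scalars into a vector with c()
## [1] 1 2 3
head(iris$Sepal.Length)    # the first few observations of one variable
## [1] 5.1 4.9 4.7 4.6 5.0 5.4
```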

A **matrix** is a two-dimensional set of scalars; or equivalently, many vectors put together.

In statistics, a matrix usually represents observations from *one or more variables* (columns) for *many samples* (rows).

The **dimension of a matrix** (\(m \times n\)) is the number of rows (\(m\)) by the number of columns (\(n\)). The elements of the matrix are often written as \(a_{ij}\), as in \[ {\bf A} \, = \, \left( \matrix{ a_{11} & a_{12} \\ a_{21} & a_{22} } \right) \]

You will sometimes hear a \(1 \times n\) matrix called a *row vector* and an \(m \times 1\) matrix called a *column vector*.

Careful - in R, a *matrix* and a *vector* are different object types and sometimes behave differently!

`## [1] 1 2 3`

```
## [,1]
## [1,] 1
## [2,] 2
## [3,] 3
```

```
## [,1] [,2] [,3]
## [1,] 1 2 3
```

`## [1] "numeric"`

`## [1] "matrix"`

`## [1] "matrix"`
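Code along these lines reproduces the comparison above (one caveat: in R 4.0 and later, `class()` on a matrix returns `"matrix" "array"` rather than just `"matrix"`):

```r
v <- c(1, 2, 3)               # a plain vector
m_col <- matrix(v, ncol = 1)  # a 3 x 1 column matrix
m_row <- matrix(v, nrow = 1)  # a 1 x 3 row matrix
v
m_col
m_row
class(v)                      # vectors and matrices are different object types
## [1] "numeric"
class(m_col)
```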

A *square* matrix has the same number of rows as columns. A *diagonal* matrix has all zeros except on the diagonal. One special *square, diagonal* matrix is the **identity matrix**:

\[{\bf I_n} = \left( \matrix{1 & 0 & \ldots & 0 \\ 0 & 1 & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & 1} \right)\]

```
## [,1] [,2] [,3]
## [1,] 1 0 0
## [2,] 0 1 0
## [3,] 0 0 1
```
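In R, `diag()` with a single whole number builds an identity matrix of that size:

```r
diag(3)    # the 3 x 3 identity matrix
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    0    1    0
## [3,]    0    0    1
```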

For our purposes, you’ll need to understand the basics of doing math with matrices. Here are a few quick tutorial (or refresher) options:

The main thing to make sure you understand is how **matrix multiplication** (`%*%` in R) is different from **elementwise multiplication** (`*` in R).

```
## [,1] [,2]
## [1,] 5 -3
## [2,] -1 2
```

```
## [,1] [,2]
## [1,] 1 3
## [2,] 2 4
```

```
## [,1] [,2]
## [1,] 5 -9
## [2,] -2 8
```

```
## [,1] [,2]
## [1,] -1 3
## [2,] 3 5
```

```
## [,1] [,2]
## [1,] 2 3
## [2,] 6 2
```
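Reading the two matrices off the first two printouts above, the three products can be reproduced like this (note that R fills matrices column by column):

```r
A <- matrix(c(5, -1, -3, 2), nrow = 2)
B <- matrix(c(1, 2, 3, 4), nrow = 2)
A * B      # elementwise: each entry of A times the matching entry of B
A %*% B    # matrix multiplication: rows of A times columns of B
B %*% A    # a different matrix -- order matters!
```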

In particular, note that **order matters** and that not all matrices can be multiplied together!

`## Error in A * C: non-conformable arrays`

`## Error in A %*% C: non-conformable arguments`

```
## [,1] [,2]
## [1,] 26 -17
```
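The errors and the final product above are consistent with a \(1 \times 2\) matrix `C` (its values inferred from the printed result):

```r
C <- matrix(c(5, -1), nrow = 1)   # 1 x 2; values chosen to match the output above
# A * C    # error: elementwise multiplication needs identical dimensions
# A %*% C  # error: (2 x 2) %*% (1 x 2) is not conformable
C %*% A    # (1 x 2) %*% (2 x 2) conforms, giving a 1 x 2 result
##      [,1] [,2]
## [1,]   26  -17
```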

There are a few simple special terms you should know about.

A **square root** of a matrix is a matrix that, when multiplied by itself, gives back the original:

\[{\bf A}^{1/2} {\bf A}^{1/2} = {\bf A}\]

Note that not every matrix has a valid square root! (Also be careful: R’s `sqrt()` takes the square root of each *element*, which is not the same thing as a matrix square root.)

`## Warning in sqrt(A): NaNs produced`

```
## [,1] [,2]
## [1,] 2.2 NaN
## [2,] NaN 1.4
```

```
## [,1] [,2]
## [1,] 1.0 1.7
## [2,] 1.4 2.0
```

`## Warning in sqrt(C): NaNs produced`

```
## [,1] [,2]
## [1,] 2.2 NaN
```
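The outputs above come from elementwise `sqrt()`, which returns `NaN` wherever an entry is negative. For an actual matrix square root (an \(X\) with \(X X\) equal to the original), extra tools are needed; `sqrtm()` from the `expm` package is one commonly used option (an assumption worth checking for your use case):

```r
sqrt(B)                      # elementwise square roots, not a matrix square root
# install.packages("expm")   # if needed
# expm::sqrtm(B)             # an X such that X %*% X is (approximately) B
```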

The **transpose** of a matrix is the “tilted” or “mirror imaged” version, with rows and columns swapped:

\[M = \left( \matrix{8 & 3 \\ 4 & 1 \\ 2 & 3} \right)\]

\[M' \text{ or } M^t = \left( \matrix{8 & 4 & 2 \\ 3 & 1 & 3} \right)\]

```
## [,1] [,2]
## [1,] 8 3
## [2,] 4 1
## [3,] 2 3
```

```
## [,1] [,2] [,3]
## [1,] 8 4 2
## [2,] 3 1 3
```
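In R, `t()` computes the transpose, matching the two printouts above:

```r
M <- matrix(c(8, 4, 2, 3, 1, 3), ncol = 2)  # the 3 x 2 matrix from the example
M
t(M)    # now 2 x 3: rows and columns swapped
```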

There are many situations (especially in statistics) where we want to multiply a matrix by its transpose. This is called a **crossproduct**.

```
## [,1] [,2]
## [1,] 84 34
## [2,] 34 19
```

```
## [,1] [,2]
## [1,] 84 34
## [2,] 34 19
```

```
## [,1] [,2] [,3]
## [1,] 73 35 25
## [2,] 35 17 11
## [3,] 25 11 13
```

```
## [,1] [,2] [,3]
## [1,] 73 35 25
## [2,] 35 17 11
## [3,] 25 11 13
```
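Each pair of identical outputs above reflects the two equivalent ways of computing a crossproduct in R (`crossprod()` and `tcrossprod()` are convenience shortcuts for the explicit products):

```r
crossprod(M)    # same as t(M) %*% M : a 2 x 2 result
t(M) %*% M
tcrossprod(M)   # same as M %*% t(M) : a 3 x 3 result
M %*% t(M)
```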

The **inverse** of a matrix is the thing that we can multiply it by to create the *identity* matrix.

\[ {\bf A}^{-1} {\bf A} = {\bf I}\]

Once again - not all matrices have a valid inverse! In particular, only square matrices can possibly have inverses (although being square is no guarantee by itself).

```
## [,1] [,2]
## [1,] 0.29 0.43
## [2,] 0.14 0.71
```

```
## [,1] [,2]
## [1,] 1 0
## [2,] 0 1
```

`## Error in solve.default(M): 'a' (3 x 2) must be square`
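In R, `solve()` with a single matrix argument returns the inverse, consistent with the outputs above:

```r
A_inv <- solve(A)        # inverse of the square 2 x 2 matrix A
round(A_inv, 2)
round(A_inv %*% A, 2)    # multiplying back recovers the identity
# solve(M)               # error: the 3 x 2 matrix M is not square
```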

Every *square* matrix has a **determinant**: a single number with some mathematical importance to the matrix. (In fact, a square matrix has an inverse exactly when its determinant is nonzero.) You don’t need to know anything about the underlying math, just how to find it with R:

`## [1] 7`

`## Error in determinant.matrix(x, logarithm = TRUE, ...): 'x' must be a square matrix`
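Both outputs above come from `det()`, which only accepts square matrices:

```r
det(A)    # determinant of the square 2 x 2 matrix A
## [1] 7
# det(M)  # error: M is 3 x 2, not square
```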

Lastly, the **eigenvalues** and **eigenvectors** of a matrix are certain numbers and vectors with special properties. The video in a moment will give you some intuition behind these. We won’t concern ourselves with the math, but you should know how to find these with R:

```
## eigen() decomposition
## $values
## [1] 5.8 1.2
##
## $vectors
## [,1] [,2]
## [1,] 0.97 0.62
## [2,] -0.26 0.78
```

`## [1] 0.97 -0.26`

`## [1] 5.8`
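The decomposition printed above matches `eigen()` applied to the \(2 \times 2\) matrix `A`; the individual pieces can then be pulled out with `$`:

```r
e <- eigen(A)
e$values          # the eigenvalues, largest first
e$vectors[, 1]    # the eigenvector paired with the first eigenvalue
e$values[1]       # the first eigenvalue on its own
```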

Be careful… you saw this coming… not all matrices have eigenvalues and eigenvectors! In particular, `eigen()` requires a square matrix.

`## Error in eigen(M): non-square matrix in 'eigen'`

At this repository, you will find a puzzle using matrix calculations.

The puzzle will ask you to perform a series of steps on a mystery image. If you do the steps correctly, the image will be transformed into something recognizable.

Once you discover the person in the image, you are done!