Matrix Multiplication, the human way!

While doing copious calculations by hand in preparation for an exam, I came to realize that there is an alternative way of interpreting matrix multiplication. This new insight allows me to instantly guess the following product without ever doing any numerical multiplication:
\[\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6 \\
7 & -8 & 0
\end{bmatrix}
\begin{bmatrix}
0 & 0 & 0 \\
0& 1 & 0 \\
1& 0 & 0
\end{bmatrix}
=
\begin{bmatrix}
3 & 2 & 0 \\
6 & 5 & 0 \\
0 & -8 & 0
\end{bmatrix}\]

Was there a way to have known that the first column of the product would be the third column of the first matrix?

MATRIX TIMES VECTOR

Observation: If $A$ has $n$ columns and $v$ is an $n$-vector, then

\[Av=\begin{bmatrix}
c & c & c & \cdots \\
o & o & o & \cdots \\
l & l & l & \cdots \\
u & u & u & \cdots \\
m & m & m & \cdots \\
n & n & n & \cdots \\
\end{bmatrix}
\begin{bmatrix}
a  \\
b \\
c \\
\vdots
\end{bmatrix}
=
a* \begin{bmatrix}
1^{st}  \\
c \\
o \\
l \\
u \\
m \\
n
\end{bmatrix}
+b*  \begin{bmatrix}
2^{nd}  \\
c \\
o \\
l \\
u \\
m \\
n
\end{bmatrix}
+c* \begin{bmatrix}
3^{rd}  \\
c \\
o \\
l \\
u \\
m \\
n
\end{bmatrix}+ \cdots \]

That is, the product is a linear combination of the columns with coefficients coming from the vector.
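For those who like to check such identities on a computer, here is a minimal NumPy sketch (the random matrix and vector are just placeholders; any sizes that match will do):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 4)).astype(float)  # any matrix with n columns
v = rng.integers(-5, 6, size=4).astype(float)        # any n-vector

# Conventional matrix-vector product
conventional = A @ v

# Linear combination of the columns of A, with coefficients taken from v
column_wise = sum(v[j] * A[:, j] for j in range(A.shape[1]))

assert np.allclose(conventional, column_wise)
```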

Examples:

1.

\[\begin{bmatrix}
1 & -5 & 3 \\
0 & 0.5 & 2\\
1 & -8 & 0
\end{bmatrix}
\begin{bmatrix}
2\\
-1 \\
0
\end{bmatrix}
= 2*\begin{bmatrix} 1^{st}\\ \text{col} \end{bmatrix} + (-1)*\begin{bmatrix} 2^{nd}\\ \text{col} \end{bmatrix} + 0\]
\[=
2 \begin{bmatrix}
1 \\
0 \\
1
\end{bmatrix} -\begin{bmatrix}
-5 \\
0.5 \\
-8
\end{bmatrix}+0 = \begin{bmatrix}
7 \\
-0.5 \\
10
\end{bmatrix}\]

2. What vector should we multiply a matrix A by so that the product is the second column of A?

Answer: \[ \begin{bmatrix}
0 \\
1 \\
0 \\
\vdots
\end{bmatrix}. \]
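In other words, multiplying by the second standard basis vector reads off the second column. A one-line NumPy check, using the matrix from Example 1 above:

```python
import numpy as np

A = np.array([[1, -5, 3],
              [0, 0.5, 2],
              [1, -8, 0]])
e2 = np.array([0, 1, 0])  # second standard basis vector

assert np.allclose(A @ e2, A[:, 1])  # picks out the second column of A
```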

It is not only the practical calculations that benefit from this new method. Some lemmas and theorems of linear algebra become more obvious and accessible. For example:

$Ax=0$ has a nontrivial solution iff the columns of $A$ are linearly dependent (as vectors).

The dimension of $\{Ax : x \text{ ranging over all column vectors}\}$ equals the number of linearly independent columns of $A$.

Given $A$ and $b$, the equation $Av=b$ has a solution iff $b$ can be written as a linear combination of the columns of $A$.
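As a quick numerical illustration of the first two facts, here is a small NumPy sketch with a made-up matrix whose first and third columns coincide:

```python
import numpy as np

A = np.array([[1., 2., 1.],
              [0., 1., 0.],
              [3., 0., 3.]])  # columns 1 and 3 are equal, hence dependent

# The dimension of {Ax} is the rank: the number of independent columns.
print(np.linalg.matrix_rank(A))  # 2, not 3

# So Ax = 0 has a nontrivial solution; x = (1, 0, -1) works,
# since it takes column 1 minus column 3.
x = np.array([1., 0., -1.])
print(A @ x)  # [0. 0. 0.]
```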

 

MATRIX TIMES MATRIX

In computing

\[AB=\begin{bmatrix}
c & c & c & \cdots \\
o & o & o & \cdots \\
l & l & l & \cdots \\
u & u & u & \cdots \\
m & m & m & \cdots \\
n & n & n & \cdots \\
\end{bmatrix}
\begin{bmatrix}
c & c & c & \cdots \\
o & o & o & \cdots \\
l & l & l & \cdots \\
u & u & u & \cdots \\
m & m & m & \cdots \\
n & n & n & \cdots \\
\end{bmatrix}
, \ \]

to get the j-th column of the product, ignore all columns of B except the j-th and compute A*[column j of B] using the method above. Do this for every column and you will have the product AB.
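Here is a short NumPy sketch of this column-at-a-time procedure (the matrices are random placeholders); it agrees with NumPy's built-in product:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(3, 3)).astype(float)
B = rng.integers(-3, 4, size=(3, 3)).astype(float)

# Column j of AB is A times column j of B.
AB_column_wise = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

assert np.allclose(AB_column_wise, A @ B)
```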

Examples:

1.

\[\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 1 \\
0& 1 & 1 \\
2& 3 & 0
\end{bmatrix} \]

The first column will be

\[ \begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix}
1  \\
0 \\
2
\end{bmatrix} =
1*\begin{bmatrix}
1  \\
4 \\
7
\end{bmatrix} + 2* \begin{bmatrix}
3  \\
-2 \\
0
\end{bmatrix} = \begin{bmatrix}
7\\
0 \\
7
\end{bmatrix} \]

The second column will be

\[ \begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix}
0  \\
1 \\
3
\end{bmatrix} =
1*\begin{bmatrix}
2  \\
5 \\
-4
\end{bmatrix} + 3* \begin{bmatrix}
3  \\
-2 \\
0
\end{bmatrix} = \begin{bmatrix}
11 \\
-1 \\
-4
\end{bmatrix} \]

Column 3 is computed similarly, and the full result is:

\[\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 1 \\
0& 1 & 1 \\
2& 3 & 0
\end{bmatrix} = \begin{bmatrix}
7 & 11  & 3 \\
0 & -1 & 9 \\
7 & -4 & 3
\end{bmatrix} \]

Note: Most of the calculations above can be done in one's head; I have written them out only to explain the procedure.
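If you want a computer to double-check the result, a quick NumPy sanity check is:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, -2],
              [7, -4, 0]])
B = np.array([[1, 0, 1],
              [0, 1, 1],
              [2, 3, 0]])

print(A @ B)  # matches the matrix computed above
```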

2. Find a matrix B such that right-multiplying  \[A= \begin{bmatrix}
2 & 33  & 0 \\
0 & -1 & 9 \\
4  & -14 & 13
\end{bmatrix} \]  by B interchanges columns 2 and 3 of A.

Answer:  \[ B=\begin{bmatrix}
1 & 0 &  0 \\
0 & 0 & 1 \\
0 & 1 &   0
\end{bmatrix} \]
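A short NumPy check that this B really does the job (just a sanity check of the answer above):

```python
import numpy as np

A = np.array([[2, 33, 0],
              [0, -1, 9],
              [4, -14, 13]])
B = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])

# Column 2 of AB is A @ [0, 0, 1], i.e. column 3 of A, and vice versa.
print(A @ B)  # columns 2 and 3 of A are interchanged
```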

3. Find the 3rd column of the product

\[\begin{bmatrix}
10 & 2 & 3 \\
4 & 0 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 1 \\
0& 1 & 1 \\
-2& 0 & -1
\end{bmatrix} \]

Answer:

The third column of AB is A*[third column of B]. So

\[ \begin{bmatrix}
10 & 2 & 3 \\
4 & 0 & -2 \\
7 & -4 & 0
\end{bmatrix}
\begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix}= \begin{bmatrix} 9 \\ 6 \\ 3 \end{bmatrix} \]

(Please verify!)
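Or let NumPy do the verification (a quick sketch; both lines below give the same column):

```python
import numpy as np

A = np.array([[10, 2, 3],
              [4, 0, -2],
              [7, -4, 0]])
B = np.array([[1, 0, 1],
              [0, 1, 1],
              [-2, 0, -1]])

print((A @ B)[:, 2])  # third column of the full product
print(A @ B[:, 2])    # A times the third column of B; same vector [9 6 3]
```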

 

4. Is the following true or false? “If we change the entries in the second column of B, then in the product AB only the entries in the second column may change; the entries in all other columns will remain the same.”

Answer: True. (Column j of AB depends only on A and on column j of B.)

 

Whether or not you are convinced that computing AB one column at a time is more practical than the conventional entry-by-entry multiplication, one cannot ignore its usefulness in helping one “see” the linear algebra behind matrices and their multiplication. So, the next time you encounter a sparse matrix, save yourself tens of multiplications by zero!

HOMEWORK

Can you guess and prove a row-wise version of the above?

 

About Behnam Esmayli

I started my PhD in Mathematics at Pitt in Fall 2015. I have grown a passion for metric spaces -- a set together with a distance function satisfying the triangle inequality -- simple and beautiful! When equipped with other structures, such as a measure, these spaces become extremely fun to play with!