
Section 2 Linear transformations

Subsection 2.1 Definition of a linear transformation

A function \(L\colon \R^n\to \R^m\) is called a linear transformation (or a linear mapping, or simply a linear map) if
  1. \(L(\mathbf{x}+\mathbf{y}) = L(\mathbf{x}) + L(\mathbf{y})\text{,}\) and
  2. \(L(\alpha\mathbf{x}) = \alpha L(\mathbf{x})\)
for all vectors \(\mathbf{x},\mathbf{y}\) in \(\R^n\) and real numbers \(\alpha\text{.}\) Properties 1 and 2 are called linearity properties. We say that \(L\) preserves, or respects, the vector operations of addition and scaling. When \(L\) is a linear transformation, it is common practice to omit the parentheses and write \(L\mathbf{x}\) instead of \(L(\mathbf{x})\text{.}\) In the case \(m=n\text{,}\) a linear transformation \(L\colon \R^n\to \R^n\) is also called a linear operator on \(\R^n\text{.}\)
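For example, the map \(L\colon \R^2\to\R^2\) defined by \(L(x_1,x_2)=(x_1+2x_2,\,3x_1)\) is a linear transformation: a direct computation gives \(L(\mathbf{x}+\mathbf{y}) = \bigl((x_1+y_1)+2(x_2+y_2),\, 3(x_1+y_1)\bigr) = L\mathbf{x}+L\mathbf{y}\) and \(L(\alpha\mathbf{x}) = (\alpha x_1+2\alpha x_2,\, 3\alpha x_1) = \alpha L\mathbf{x}\text{.}\) By contrast, the map \(T(x_1,x_2)=(x_1+1,\,x_2)\) is not linear, since \(T(2,0)=(3,0)\) while \(2\,T(1,0)=(4,0)\text{.}\)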

Subsection 2.2 Why linear maps are easy to compute

Given a vector \(\mathbf{x}=(x_1,\ldots,x_n) = \sum_{j=1}^n x_j\mathbf{e}_j\text{,}\) written in terms of the standard basis vectors \(\mathbf{e}_1,\ldots,\mathbf{e}_n\text{,}\) and a linear map \(L\colon \R^n\to\R^m\text{,}\) we have
\begin{align} L\mathbf{x} \amp = L\left(\sum_{j=1}^n x_j\mathbf{e}_j\right)\tag{2.1}\\ \amp = \sum_{j=1}^n L(x_j\mathbf{e}_j)\tag{2.2}\\ \amp = \sum_{j=1}^n x_j L\mathbf{e}_j\tag{2.3} \end{align}
In this calculation, (2.2) follows from linearity property 1 and (2.3) from property 2. A consequence of equation
\begin{equation} L\mathbf{x} = \sum_j x_j L\mathbf{e}_j\tag{2.4} \end{equation}
is that a linear map is determined by its values on the standard basis vectors \(\mathbf{e}_1, \mathbf{e}_2,\ldots,\mathbf{e}_n\text{.}\) In words, the value of \(L\) on the vector \(\mathbf{x}\) is a linear combination of the vectors \(L\mathbf{e}_j\text{,}\) with the coefficient \(x_j\) for the vector \(L\mathbf{e}_j\text{.}\) Given the vectors \(L\mathbf{e}_j\text{,}\) computing the value of a linear map requires only scalar multiplication and vector addition.
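To illustrate, suppose \(L\colon \R^2\to\R^3\) is a linear map with \(L\mathbf{e}_1=(1,0,2)\) and \(L\mathbf{e}_2=(0,1,-1)\text{.}\) Then \(L(3,5) = 3\,L\mathbf{e}_1 + 5\,L\mathbf{e}_2 = 3(1,0,2) + 5(0,1,-1) = (3,5,1)\text{,}\) and no other information about \(L\) is needed to evaluate it at any vector in \(\R^2\text{.}\)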

Subsection 2.3 Operations on linear transformations

Let \(L, L'\colon \R^n\to \R^m\) and \(M\colon \R^p\to \R^n\) be linear transformations, and let \(\alpha\) be a real number. Linear transformations \(\alpha L\colon \R^n\to \R^m\text{,}\) \(L+L'\colon \R^n \to \R^m\text{,}\) and \(LM\colon \R^p\to \R^m\) are defined as follows.
\begin{align} (\alpha L)\mathbf{x} \amp = \alpha (L\mathbf{x})\amp \text{(scalar multiplication)}\tag{2.5}\\ (L+L')\mathbf{x} \amp = L\mathbf{x} + L'\mathbf{x}\amp \text{(addition)}\tag{2.6}\\ (LM)\mathbf{x} \amp = L(M\mathbf{x})\amp \text{(composition)}\tag{2.7} \end{align}
Note that \(LM\) is the same thing as \(L\circ M\text{,}\) the ordinary composition of functions. It is conventional to omit the composition symbol in the context of linear transformations.
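For example, suppose \(L, L'\colon \R^2\to\R^2\) are linear operators with \(L\mathbf{e}_1=(1,1)\text{,}\) \(L\mathbf{e}_2=(0,2)\text{,}\) \(L'\mathbf{e}_1=(2,0)\text{,}\) and \(L'\mathbf{e}_2=(1,1)\text{.}\) Then \((3L+L')\mathbf{e}_1 = 3(1,1)+(2,0) = (5,3)\text{,}\) and \((LL')\mathbf{e}_1 = L(L'\mathbf{e}_1) = L(2,0) = 2\,L\mathbf{e}_1 = (2,2)\text{.}\) By (2.4), the values on \(\mathbf{e}_1\) and \(\mathbf{e}_2\) determine \(3L+L'\) and \(LL'\) completely.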

Exercises 2.4 Exercises

1.

Let \(L\colon \R^2\to\R^2\) be a linear map such that \(L\mathbf{e}_1=(2,3)\) and \(L\mathbf{e}_2=(-1,-2)\text{.}\) Find \(L(1,2)\text{.}\)
Answer.
\(L(1,2)=(0,-1)\)

2.

Let \(L\colon \R^3\to\R\) be a linear map. Find \(L{\mathbf k}\) if \(L{\mathbf i}=2\text{,}\) \(L{\mathbf j}=-1\text{,}\) and \(L(1,2,3)=0\text{.}\)
Answer.
\(L{\mathbf k} = 0\)

3.

Show that the two linearity properties in the definition of a linear transformation given in Subsection 2.1 are equivalent to the single property
\begin{equation} L(a\mathbf{x}+b\mathbf{y}) = a L\mathbf{x} + bL\mathbf{y}\tag{2.8} \end{equation}
for all vectors \(\mathbf{x},\mathbf{y}\) and scalars \(a,b\text{.}\)

5.

The dot product has the following properties, which resemble the properties in the definition of a linear map.
\begin{align} \mathbf{u}\cdot (\mathbf{v}+\mathbf{w}) \amp = \mathbf{u}\cdot \mathbf{v} + \mathbf{u}\cdot \mathbf{w}\tag{2.9}\\ \mathbf{u}\cdot (\alpha \mathbf{v}) \amp = (\alpha \mathbf{u})\cdot \mathbf{v}= \alpha (\mathbf{u}\cdot\mathbf{v})\tag{2.10} \end{align}
for all \(\mathbf{u},\mathbf{v},\mathbf{w}\) in \(\R^n\) and scalars \(\alpha\text{.}\) Show that these properties hold.

6.

Prove that the composition of two linear maps is a linear map.