# Row and column vectors

*A matrix consisting of a single row or column.*

In linear algebra, a column vector with $\displaystyle{ m }$ elements is an $\displaystyle{ m \times 1 }$ matrix[1] consisting of a single column of $\displaystyle{ m }$ entries, for example, $\displaystyle{ \boldsymbol{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}. }$

Similarly, a row vector is a $\displaystyle{ 1 \times n }$ matrix for some $\displaystyle{ n }$, consisting of a single row of $\displaystyle{ n }$ entries, $\displaystyle{ \boldsymbol a = \begin{bmatrix} a_1 & a_2 & \dots & a_n \end{bmatrix}. }$ (Throughout this article, boldface is used for both row and column vectors.)

The transpose (indicated by T) of any row vector is a column vector, and the transpose of any column vector is a row vector: $\displaystyle{ \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix} }$ and $\displaystyle{ \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}. }$
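As a concrete check of the transpose relationship, here is a short sketch using NumPy (the library choice is an illustration, not part of the article); column vectors are represented as m × 1 arrays so that transposition behaves exactly as in the matrix definition above:

```python
import numpy as np

# A column vector as an m x 1 matrix (m = 3 here).
x_col = np.array([[1], [2], [3]])   # shape (3, 1)

# Its transpose is a 1 x 3 row vector, and transposing again recovers
# the original column vector.
x_row = x_col.T                     # shape (1, 3)
assert x_row.shape == (1, 3)
assert np.array_equal(x_row.T, x_col)
```

Note that a flat NumPy array of shape `(3,)` is neither a row nor a column in this sense; keeping the explicit 2-D shapes preserves the matrix semantics.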

The set of all row vectors with n entries in a given field (such as the real numbers) forms an n-dimensional vector space; similarly, the set of all column vectors with m entries forms an m-dimensional vector space.

The space of row vectors with n entries can be regarded as the dual space of the space of column vectors with n entries, since any linear functional on the space of column vectors can be represented as the left-multiplication of a unique row vector.
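The duality described above can be made concrete: a row vector acts as a linear functional on column vectors by left-multiplication. A minimal NumPy sketch (the specific numbers are illustrative):

```python
import numpy as np

# A row vector acts as a linear functional on column vectors by
# left-multiplication: f(x) = a @ x, a 1 x 1 matrix holding the scalar.
a = np.array([[2, -1, 3]])           # 1 x 3 row vector (the functional)
x = np.array([[1], [4], [2]])        # 3 x 1 column vector
fx = a @ x                           # 2*1 + (-1)*4 + 3*2 = 4
assert fx.shape == (1, 1)
assert fx[0, 0] == 4
```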

## Notation

To simplify writing column vectors in line with other text, they are sometimes written as row vectors with the transpose operation applied to them:

$\displaystyle{ \boldsymbol{x} = \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T} }$

or

$\displaystyle{ \boldsymbol{x} = \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}^{\rm T} }$

Some authors also use the convention of writing both column vectors and row vectors as rows, but separating row vector elements with commas and column vector elements with semicolons (see alternative notation 2 in the table below).

| | Row vector | Column vector |
| --- | --- | --- |
| Standard matrix notation (array spaces, no commas, transpose signs) | $\displaystyle{ \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix} }$ | $\displaystyle{ \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix} \text{ or } \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T} }$ |
| Alternative notation 1 (commas, transpose signs) | $\displaystyle{ \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix} }$ | $\displaystyle{ \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}^{\rm T} }$ |
| Alternative notation 2 (commas and semicolons, no transpose signs) | $\displaystyle{ \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix} }$ | $\displaystyle{ \begin{bmatrix} x_1; x_2; \dots; x_m \end{bmatrix} }$ |

## Operations

Matrix multiplication pairs each row vector of one matrix with each column vector of another: entry (i, j) of the product is the product of row i of the first matrix with column j of the second.
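This row-by-column rule can be checked numerically; the following NumPy sketch (with arbitrary illustrative entries) verifies one entry by hand:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

C = A @ B
# Entry (0, 1) of C is row 0 of A times column 1 of B.
assert C[0, 1] == 1*6 + 2*8          # 22
assert np.array_equal(C, [[19, 22], [43, 50]])
```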

The dot product of two column vectors a, b, considered as elements of a coordinate space, is equal to the matrix product of the transpose of a with b,

$\displaystyle{ \mathbf{a} \cdot \mathbf{b} = \mathbf{a}^\intercal \mathbf{b} = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} = a_1 b_1 + \cdots + a_n b_n \,. }$

By the symmetry of the dot product, the dot product of two column vectors a, b is also equal to the matrix product of the transpose of b with a,

$\displaystyle{ \mathbf{b} \cdot \mathbf{a} = \mathbf{b}^\intercal \mathbf{a} = \begin{bmatrix} b_1 & \cdots & b_n \end{bmatrix}\begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = a_1 b_1 + \cdots + a_n b_n\,. }$
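Both matrix forms of the dot product can be confirmed directly; a minimal NumPy sketch (example values are illustrative):

```python
import numpy as np

a = np.array([[1], [2], [3]])        # column vectors (n x 1)
b = np.array([[4], [5], [6]])

# a^T b and b^T a are 1 x 1 matrices holding the same scalar,
# reflecting the symmetry of the dot product.
atb = a.T @ b
bta = b.T @ a
assert atb[0, 0] == bta[0, 0] == 1*4 + 2*5 + 3*6   # 32
```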

The matrix product of a column vector and a row vector gives the outer product of two vectors a and b, an example of the more general tensor product. The matrix product of the column vector representation of a and the row vector representation of b gives the components of their dyadic product,

$\displaystyle{ \mathbf{a} \otimes \mathbf{b} = \mathbf{a} \mathbf{b}^\intercal = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix}\begin{bmatrix} b_1 & b_2 & b_3 \end{bmatrix} = \begin{bmatrix} a_1 b_1 & a_1 b_2 & a_1 b_3 \\ a_2 b_1 & a_2 b_2 & a_2 b_3 \\ a_3 b_1 & a_3 b_2 & a_3 b_3 \\ \end{bmatrix} \,, }$

which is the transpose of the matrix product of the column vector representation of b and the row vector representation of a,

$\displaystyle{ \mathbf{b} \otimes \mathbf{a} = \mathbf{b} \mathbf{a}^\intercal = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}\begin{bmatrix} a_1 & a_2 & a_3 \end{bmatrix} = \begin{bmatrix} b_1 a_1 & b_1 a_2 & b_1 a_3 \\ b_2 a_1 & b_2 a_2 & b_2 a_3 \\ b_3 a_1 & b_3 a_2 & b_3 a_3 \\ \end{bmatrix} \,. }$
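The outer products and their transpose relationship can likewise be checked in NumPy (a sketch with illustrative values):

```python
import numpy as np

a = np.array([[1], [2], [3]])        # 3 x 1 column vectors
b = np.array([[4], [5], [6]])

# a b^T is a 3 x 3 matrix with entries a_i * b_j.
ab = a @ b.T
ba = b @ a.T
assert ab.shape == (3, 3)
assert ab[0, 2] == 1 * 6             # (a_1)(b_3)
# b a^T is the transpose of a b^T, as stated above.
assert np.array_equal(ba, ab.T)
```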

## Matrix transformations

Main page: Transformation matrix

An n × n matrix M can represent a linear map and act on row and column vectors as the linear map's transformation matrix. For a row vector v, the product vM is another row vector p:

$\displaystyle{ \mathbf{v} M = \mathbf{p} \,. }$

Another n × n matrix Q can act on p,

$\displaystyle{ \mathbf{p} Q = \mathbf{t} \,. }$

Then one can write t = pQ = vMQ, so the matrix product transformation MQ maps v directly to t. Continuing with row vectors, each further matrix transformation of n-space is applied to the right of the previous output.
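The composition rule for the row-vector convention can be verified numerically; in this NumPy sketch, M and Q are arbitrary illustrative transformations (a coordinate swap and an axis scaling):

```python
import numpy as np

v = np.array([[1, 2]])               # 1 x n row vector (n = 2)
M = np.array([[0, 1],
              [1, 0]])               # swap the two coordinates
Q = np.array([[2, 0],
              [0, 3]])               # scale the axes

# Applying M then Q on the right: t = (v M) Q = v (M Q).
t_stepwise = (v @ M) @ Q
t_composed = v @ (M @ Q)
assert np.array_equal(t_stepwise, t_composed)
```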

When a column vector is transformed to another column vector under an n × n matrix action, the operation occurs to the left,

$\displaystyle{ \mathbf{p}^\mathrm{T} = M \mathbf{v}^\mathrm{T} \,,\quad \mathbf{t}^\mathrm{T} = Q \mathbf{p}^\mathrm{T}, }$

leading to the algebraic expression $\displaystyle{ Q M \mathbf{v}^\mathrm{T} }$ for the composed output from the input $\displaystyle{ \mathbf{v}^\mathrm{T} }$. When column vectors are used as input, the matrix transformations therefore accumulate to the left.
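The left-accumulation in the column-vector convention can be checked with the same illustrative transformations as above (a NumPy sketch; M and Q are assumptions of the example):

```python
import numpy as np

v = np.array([[1, 2]])               # row vector; v.T is the column form
M = np.array([[0, 1],
              [1, 0]])               # swap the two coordinates
Q = np.array([[2, 0],
              [0, 3]])               # scale the axes

# With column vectors the matrices act on the left, so the composite
# transformation is Q M (reversed order relative to the row convention):
t_col = Q @ (M @ v.T)
assert np.array_equal(t_col, (Q @ M) @ v.T)
```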