In linear algebra, a Euclidean subspace (or subspace of R^{n}) is a set of vectors that is closed under addition and scalar multiplication. Geometrically, a subspace is a flat in n-dimensional Euclidean space that passes through the origin. Examples of subspaces include the solution set to a homogeneous system of linear equations, the subset of Euclidean space described by a system of homogeneous linear parametric equations, the span of a collection of vectors, and the null space, column space, and row space of a matrix.^{[1]}
In abstract linear algebra, Euclidean subspaces are important examples of vector spaces. In this context, a Euclidean subspace is simply a linear subspace of a Euclidean space.
In mathematics, R^{n} denotes the set of all vectors with n real components:

R^{n} = { (x_{1}, x_{2}, ..., x_{n}) : x_{1}, x_{2}, ..., x_{n} ∈ R }
Here the word vector refers to any ordered list of numbers. Vectors can be written either as ordered tuples, such as (x_{1}, x_{2}, ..., x_{n}), or as columns of numbers.
Geometrically, we regard vectors with n components as points in an n-dimensional space. That is, we identify the set R^{n} with n-dimensional Euclidean space. Any subset of R^{n} can be thought of as a geometric object (namely, the object consisting of all the points in the subset). In this mode of thought, a line in three-dimensional space is the same as the set of points on the line, and is therefore just a subset of R^{3}.
A Euclidean subspace is a subset S of R^{n} with the following properties:

1. The zero vector 0 is an element of S.
2. If u and v are elements of S, then u + v is an element of S (S is closed under addition).
3. If v is an element of S and c is a scalar, then cv is an element of S (S is closed under scalar multiplication).
There are several common variations on these requirements, all of which are logically equivalent to the list above.^{[4]} ^{[5]}
Because subspaces are closed under both addition and scalar multiplication, any linear combination of vectors from a subspace is again in the subspace. That is, if v_{1}, v_{2}, ..., v_{k} are elements of a subspace S, and c_{1}, c_{2}, ..., c_{k} are scalars, then

c_{1}v_{1} + c_{2}v_{2} + ... + c_{k}v_{k}

is again an element of S.
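As an illustrative sketch of this closure property (the vectors and scalars below are arbitrary choices, not taken from the text), a linear combination in R^{3} can be computed componentwise:

```python
# Sketch: forming the linear combination c1*v1 + c2*v2 + c3*v3 in R^3.
# The specific vectors and scalars are arbitrary illustrative choices.

def linear_combination(scalars, vectors):
    """Return the sum of c*v over paired scalars and vectors, componentwise."""
    n = len(vectors[0])
    result = [0] * n
    for c, v in zip(scalars, vectors):
        for i in range(n):
            result[i] += c * v[i]
    return result

v1, v2, v3 = [1, 0, 2], [0, 1, -1], [3, 1, 0]
c = [2, -1, 1]
print(linear_combination(c, [v1, v2, v3]))  # [5, 0, 5]
```

If v_{1}, v_{2}, v_{3} all lie in a subspace S, then by closure the resulting vector lies in S as well.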
Geometrically, a subspace of R^{n} is simply a flat through the origin, i.e. a copy of a lower-dimensional (or equidimensional) Euclidean space sitting in n dimensions. For example, there are four different types of subspaces in R^{3}:

1. The zero subspace {(0, 0, 0)}.
2. Lines through the origin.
3. Planes through the origin.
4. The whole space R^{3}.
In n-dimensional space, there are subspaces of every dimension from 0 to n.
The geometric dimension of a subspace is the same as the number of vectors required for a basis (see below).
The solution set to any homogeneous system of linear equations with n variables is a subspace of R^{n}:

a_{11}x_{1} + a_{12}x_{2} + ... + a_{1n}x_{n} = 0
a_{21}x_{1} + a_{22}x_{2} + ... + a_{2n}x_{n} = 0
...
a_{m1}x_{1} + a_{m2}x_{2} + ... + a_{mn}x_{n} = 0
For example, the set of all vectors (x, y, z) satisfying the equations
is a one-dimensional subspace of R^{3}. More generally, the solution set of a homogeneous system with coefficient matrix A is the null space of A; if the system consists of m independent equations in k variables, this subspace of R^{k} has dimension k − m.
In linear algebra, a homogeneous system of linear equations can be written as a single matrix equation:

Ax = 0
The set of solutions to this equation is known as the null space of the matrix. For example, the subspace of R^{3} described above is the null space of the matrix
Every subspace of R^{n} can be described as the null space of some matrix (see algorithms, below).
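The standard procedure for finding such a null space can be sketched as follows (a generic illustration using exact rational arithmetic, not an algorithm specified in the text): row-reduce the matrix to reduced row echelon form, then read off one basis vector per free (non-pivot) column.

```python
from fractions import Fraction

def rref(matrix):
    """Reduced row echelon form via Gauss-Jordan elimination, with exact arithmetic."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivots, r = [], 0
    for c in range(cols):
        if r == rows:
            break
        pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column: it is a free column
        m[r], m[pivot] = m[pivot], m[r]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]          # scale pivot row to leading 1
        for i in range(rows):
            if i != r and m[i][c] != 0:          # clear the rest of the column
                factor = m[i][c]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def null_space_basis(matrix):
    """One basis vector of the null space per free variable."""
    m, pivots = rref(matrix)
    cols = len(matrix[0])
    basis = []
    for f in (c for c in range(cols) if c not in pivots):
        v = [Fraction(0)] * cols
        v[f] = Fraction(1)                       # set the free variable to 1
        for r, p in enumerate(pivots):
            v[p] = -m[r][f]                      # solve for the pivot variables
        basis.append(v)
    return basis

# Example matrix (an arbitrary rank-1 choice): null space has dimension 2.
A = [[1, 2, 3], [2, 4, 6]]
print([[int(x) for x in v] for v in null_space_basis(A)])  # [[-2, 1, 0], [-3, 0, 1]]
```

Each returned vector satisfies Av = 0, and together they span the null space.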
The subset of R^{n} described by a system of homogeneous linear parametric equations is a subspace:

x_{1} = a_{11}t_{1} + a_{12}t_{2} + ... + a_{1k}t_{k}
x_{2} = a_{21}t_{1} + a_{22}t_{2} + ... + a_{2k}t_{k}
...
x_{n} = a_{n1}t_{1} + a_{n2}t_{2} + ... + a_{nk}t_{k}
For example, the set of all vectors (x, y, z) parameterized by the equations
is a two-dimensional subspace of R^{3}.
In linear algebra, the system of parametric equations can be written as a single vector equation:

(x, y, z) = t_{1}(2, 5, 1) + t_{2}(3, −4, 2)
The expression on the right is called a linear combination of the vectors (2, 5, 1) and (3, −4, 2). These two vectors are said to span the resulting subspace.
In general, a linear combination of vectors v_{1}, v_{2}, ..., v_{k} is any vector of the form

t_{1}v_{1} + t_{2}v_{2} + ... + t_{k}v_{k}
The set of all possible linear combinations is called the span:

Span(v_{1}, ..., v_{k}) = { t_{1}v_{1} + ... + t_{k}v_{k} : t_{1}, ..., t_{k} ∈ R }
If the vectors v_{1}, ..., v_{k} have n components, then their span is a subspace of R^{n}. Geometrically, the span is the flat through the origin in n-dimensional space determined by the points v_{1}, ..., v_{k}.
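One hedged way to test span membership in the special case of two independent vectors in R^{3} (an illustrative device, not a method described in the text): a vector w lies in the plane spanned by v_{1} and v_{2} exactly when the 3 × 3 determinant with rows v_{1}, v_{2}, w vanishes.

```python
def det3(a, b, c):
    """3x3 determinant with rows a, b, c (cofactor expansion along the first row)."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def in_span_of_two(w, v1, v2):
    """True iff w lies in the plane spanned by independent v1, v2 in R^3."""
    return det3(v1, v2, w) == 0

# Arbitrary independent vectors chosen for illustration.
v1, v2 = (1, 0, 2), (0, 1, -1)
print(in_span_of_two((2, 3, 1), v1, v2))   # True: (2, 3, 1) = 2*v1 + 3*v2
print(in_span_of_two((0, 0, 1), v1, v2))   # False: not in the plane
```

For spans of more vectors, or vectors in higher dimensions, the general tool is row reduction rather than a single determinant.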
A system of linear parametric equations can also be written as a single matrix equation:

x = At

where A is the matrix whose columns are the spanning vectors and t is the vector of parameters.
In this case, the subspace consists of all possible values of the vector x. In linear algebra, this subspace is known as the column space (or image) of the matrix A. It is precisely the subspace of R^{n} spanned by the column vectors of A.
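To see why the set of all products At is exactly the span of the columns, note that a matrix-vector product is a linear combination of the columns of A with the entries of t as coefficients. A small sketch, using the columns (2, 5, 1) and (3, −4, 2) from the example above:

```python
def matvec(A, t):
    """Matrix-vector product: a linear combination of the columns of A."""
    return [sum(row[j] * t[j] for j in range(len(t))) for row in A]

# A has columns (2, 5, 1) and (3, -4, 2), as in the example above.
A = [[2, 3], [5, -4], [1, 2]]
print(matvec(A, [1, 0]))  # [2, 5, 1]: the first column
print(matvec(A, [2, 1]))  # [7, 6, 4]: the combination 2*(2, 5, 1) + (3, -4, 2)
```

Ranging over all parameter vectors t therefore sweeps out precisely the column space of A.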
The row space of a matrix is the subspace spanned by its row vectors. The row space is interesting because it is the orthogonal complement of the null space (see below).
In general, a subspace of R^{n} determined by k parameters (or spanned by k vectors) has dimension k. However, there are exceptions to this rule. For example, the subspace of R^{3} spanned by the three vectors (1, 0, 0), (0, 0, 1), and (2, 0, 3) is just the xz-plane, with each point on the plane described by infinitely many different values of t_{1}, t_{2}, t_{3}.
In general, vectors v_{1}, ..., v_{k} are called linearly independent if

t_{1}v_{1} + t_{2}v_{2} + ... + t_{k}v_{k} ≠ u_{1}v_{1} + u_{2}v_{2} + ... + u_{k}v_{k}

whenever (t_{1}, t_{2}, ..., t_{k}) ≠ (u_{1}, u_{2}, ..., u_{k}).^{[6]} If v_{1}, ..., v_{k} are linearly independent, then the coordinates t_{1}, ..., t_{k} for a vector in the span are uniquely determined.
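Linear independence can be checked in practice by computing a rank: k vectors are independent exactly when the matrix with those vectors as rows has rank k. The sketch below (a generic floating-point illustration, not an algorithm prescribed by the text) applies this to the xz-plane example above.

```python
def rank(matrix, tol=1e-12):
    """Rank via Gaussian elimination with partial pivoting (floating point)."""
    m = [[float(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    r = 0
    for c in range(cols):
        if r == rows:
            break
        pivot = max(range(r, rows), key=lambda i: abs(m[i][c]))
        if abs(m[pivot][c]) < tol:
            continue  # effectively zero column below row r
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(r + 1, rows):
            factor = m[i][c] / m[r][c]
            m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors are linearly independent iff the rank equals their count."""
    return rank(vectors) == len(vectors)

# The xz-plane example above: (2, 0, 3) = 2*(1, 0, 0) + 3*(0, 0, 1), so dependent.
print(independent([(1, 0, 0), (0, 0, 1), (2, 0, 3)]))  # False
print(independent([(1, 0, 0), (0, 0, 1)]))             # True
```

The rank also gives the dimension of the span: here it is 2, confirming that the three vectors span only a plane.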
A basis for a subspace S is a set of linearly independent vectors whose span is S. The number of elements in a basis is always equal to the geometric dimension of the subspace. Any spanning set for a subspace can be changed into a basis by removing redundant vectors (see algorithms, below).
Most algorithms for dealing with subspaces involve row reduction. This is the process of applying elementary row operations to a matrix until it reaches either row echelon form or reduced row echelon form. Row reduction has the following important properties:

1. The reduced matrix has the same null space as the original.
2. Row reduction does not change the span of the row vectors, i.e. the reduced matrix has the same row space as the original.
3. Row reduction does not affect the linear dependence relationships between the column vectors.
See the article on row space for an example.
If we instead put the matrix A into reduced row echelon form, then the resulting basis for the row space is uniquely determined. This provides an algorithm for checking whether two row spaces are equal and, by extension, whether two subspaces of R^{n} are equal.
See the article on column space for an example.
This produces a basis for the column space that is a subset of the original column vectors. It works because the columns with pivots are a basis for the column space of the echelon form, and row reduction does not change the linear dependence relationships between the columns.
If the final column of the reduced row echelon form contains a pivot, then the input vector v does not lie in S.
See the article on null space for an example.
If U and V are subspaces of R^{n}, their intersection is also a subspace:

U ∩ V = { x ∈ R^{n} : x ∈ U and x ∈ V }
The dimension of the intersection satisfies the inequality

dim(U) + dim(V) − n ≤ dim(U ∩ V) ≤ min(dim(U), dim(V))
The minimum is the most general case^{[7]}, and the maximum only occurs when one subspace is contained in the other. For example, the intersection of two-dimensional subspaces in R^{3} has dimension one or two (with two only possible if they are the same plane). The intersection of three-dimensional subspaces in R^{5} has dimension one, two, or three, with most pairs intersecting along a line.
The codimension of a subspace U in R^{n} is the difference n − dim(U). Using codimension, the inequality above can be written

codim(U ∩ V) ≤ codim(U) + codim(V)
If U and V are subspaces of R^{n}, their sum is the subspace

U + V = { u + v : u ∈ U and v ∈ V }
For example, the sum of two distinct lines through the origin is the plane that contains them both. The dimension of the sum satisfies the inequality

max(dim(U), dim(V)) ≤ dim(U + V) ≤ dim(U) + dim(V)
Here the minimum only occurs if one subspace is contained in the other, while the maximum is the most general case.^{[8]} The dimensions of the intersection and the sum are related:

dim(U ∩ V) + dim(U + V) = dim(U) + dim(V)

For example, if U and V are distinct planes through the origin in R^{3}, they meet in a line and their sum is all of R^{3}, so dim(U ∩ V) + dim(U + V) = 1 + 3 = 2 + 2.
The orthogonal complement of a subspace U is the subspace

U^{⊥} = { x ∈ R^{n} : x · u = 0 for every u ∈ U }
Here x · u denotes the dot product of x and u. For example, if U is a plane through the origin in R^{3}, then U^{⊥} is the line perpendicular to this plane at the origin.
If b_{1}, b_{2}, ..., b_{k} is a basis for U, then a vector x is in the orthogonal complement of U if and only if it is orthogonal to each b_{i}. It follows that the null space of a matrix is the orthogonal complement of the row space.
The dimension of a subspace and the dimension of its orthogonal complement are related by the equation

dim(U) + dim(U^{⊥}) = n
That is, the dimension of U^{⊥} is equal to the codimension of U. The intersection of U and U^{⊥} is the origin, and the sum of U and U^{⊥} is all of R^{n}.
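In R^{3} the complement of a plane can be computed directly (a hedged sketch; the plane here is an arbitrary choice): if U is spanned by b_{1} and b_{2}, then the cross product b_{1} × b_{2} is orthogonal to both basis vectors, so it is a direction vector for the line U^{⊥}.

```python
def cross(a, b):
    """Cross product of two vectors in R^3; orthogonal to both inputs."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

# U = span of b1, b2: an arbitrary plane through the origin in R^3.
b1, b2 = (1, 0, 2), (0, 1, -1)
n = cross(b1, b2)
print(n)                       # (-2, 1, 1): a direction vector for the line U-perp
print(dot(n, b1), dot(n, b2))  # 0 0: orthogonal to each basis vector of U
```

Consistent with the dimension formula, dim(U) + dim(U^{⊥}) = 2 + 1 = 3 here. In higher dimensions, U^{⊥} is found as the null space of the matrix whose rows are a basis of U.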
Orthogonal complements satisfy a version of De Morgan's laws:

(U + V)^{⊥} = U^{⊥} ∩ V^{⊥}   and   (U ∩ V)^{⊥} = U^{⊥} + V^{⊥}
In fact, the collection of subspaces of R^{n} satisfies most of the axioms for a Boolean algebra, with intersection as AND, sum as OR, and orthogonal complement as NOT; however, the distributive law fails in general (for example, for three distinct lines through the origin in a common plane), so the subspaces of R^{n} form a complemented lattice rather than a Boolean algebra.
