- 1-to-1
- A linear transformation, T, is 1-to-1 if each vector in the range of T has at most a single preimage. Thus, for any vector `w`, the equation T(`x`) = `w` can be solved by at most a single value of `x`.

The linear transformation T is 1-to-1 if and only if the null space of its corresponding matrix contains only the zero vector. Equivalently, a linear transformation is 1-to-1 if and only if its corresponding matrix has no non-pivot columns (every column is a pivot column).
- Basis
- A basis of a vector space is an ordered set
of vectors which is linearly independent
and spans the vector space.

Note that every basis of a vector space has the same number of vectors, which is called the dimension of the vector space.

*Examples:*
- R^{3} has bases {[1;0;0],[0;1;0],[0;0;1]} (the standard basis) and {[2;1;0],[1;1;1],[1;2;1]}
- P_{3} has bases {1,x,x^{2},x^{3}} (the standard basis) and {1,x,2x^{2}-1,4x^{3}-3x} (the Chebyshev basis)

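Whether a given set of three vectors is a basis of R^{3} can be tested numerically: three vectors form a basis exactly when the matrix having them as columns has non-zero determinant. A small pure-Python sketch (the helper `det3` is only for this illustration):

```python
# Check that {[2;1;0], [1;1;1], [1;2;1]} is a basis of R^3:
# three vectors in R^3 form a basis iff the matrix having them
# as columns is invertible, i.e. has non-zero determinant.

def det3(m):
    """Determinant of a 3x3 matrix (list of rows), by cofactor expansion."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Columns of M are the candidate basis vectors.
M = [[2, 1, 1],
     [1, 1, 2],
     [0, 1, 1]]

print(det3(M))  # -2, non-zero, so the columns are independent and span R^3
```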
- Codomain
- The codomain of a linear transformation is the vector space which contains the vectors resulting from the transformation's action. Thus, if T(v) = w, then v is a vector in the domain and w is a vector in the range, which in turn is contained in the codomain.

*Examples:*
- The codomain of the transformation T:R^{3}→R^{5} is R^{5}
- The matrix A=[1,2;2,1;1,1] (three rows and two columns) induces a linear map from R^{2} to R^{3}, with codomain R^{3}

- Column Space
- The column space of a matrix is the subspace of the
codomain which is spanned
by the columns of the matrix.

The dimension of the column space is called the rank of the matrix, and is equal to the dimension of the row space.

*Synonyms:* If a linear transformation T is represented by a matrix A, then the range of T is equal to the column space of A.
- Consistent
- A system of linear equations is consistent if it has a solution.

Note that a homogeneous system Ax=0 will always be consistent as it always has the solution x=0 (the trivial solution). - Dimension
- The dimension of a vector space is the
number of vectors in any basis of the space.

*Examples:*
- dim(R^{3}) = 3, since the standard basis {[1;0;0],[0;1;0],[0;0;1]} has three vectors
- dim(P_{3}) = 4, since the standard basis {1,x,x^{2},x^{3}} has four vectors

- Domain
- The domain of a linear transformation is the vector space on which the transformation acts. Thus, if T(v) = w, then v is a vector in the domain and w is a vector in the range, which in turn is contained in the codomain.

*Examples:*
- The domain of the transformation T:R^{3}→R^{5} is R^{3}
- The matrix A=[1,2;2,1;1,1] (three rows and two columns) induces a linear map from R^{2} to R^{3}, with domain R^{2}
- Echelon Form
- A matrix is in echelon form if all non-zero rows are above any all-zero rows, and the leading non-zero entry of each row lies strictly to the right of the leading entry of the row above it. Every matrix can be brought to echelon form by Gauss Reduction.

*Synonyms:* row echelon form

*Related:*
- Field
- A field is an algebraic structure with addition, "+", and multiplication, "·", (and subtraction and division) satisfying certain rules. A vector space is defined with scalars chosen from a particular field. The rules which define a field F are: for all `a`, `b` and `c` ∈ F
- `a`+`b` ∈ F (Closure of Addition)
- `a`+`b` = `b`+`a` (Commutativity of Addition)
- `a`+(`b`+`c`) = (`a`+`b`)+`c` (Associativity of Addition)
- There is some 0 ∈ F such that 0+`a` = `a` (Additive Identity)
- For any `a` ∈ F there is some `b` ∈ F such that `a`+`b` = 0 (Additive Inverse) [We call this element -`a`]
- `a`·`b` ∈ F (Closure of Multiplication)
- `a`·`b` = `b`·`a` (Commutativity of Multiplication)
- `a`·(`b`·`c`) = (`a`·`b`)·`c` (Associativity of Multiplication)
- There is some 1 ∈ F such that 1·`a` = `a` (Multiplicative Identity)
- For any non-zero `a` ∈ F there is some `b` ∈ F such that `a`·`b` = 1 (Multiplicative Inverse) [We call this element `a`^{-1}]
- `a`·(`b`+`c`) = `a`·`b`+`a`·`c` and (`a`+`b`)·`c` = `a`·`c`+`b`·`c` (Distributive Laws)

*Examples:*
- R (real numbers), C (complex numbers) and Q (rational numbers) are fields
- Z_{2}, the set {0,1} with addition and multiplication mod 2 (i.e. 1+1=0), is a field. (For any prime integer `p`, the set {0,1,...,`p`-1} with addition and multiplication mod `p` is also a field, called Z_{p}.)

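The claim that Z_{p} is a field can be spot-checked by brute force for a small prime: the only axiom that is not automatic for modular arithmetic is the existence of multiplicative inverses. A minimal pure-Python sketch, checking Z_{5}:

```python
# Brute-force check that Z_5 = {0,1,2,3,4} with arithmetic mod 5
# satisfies the one non-obvious field axiom: every non-zero element
# has a multiplicative inverse.

p = 5

# For each non-zero a, find some b with a*b = 1 (mod p).
inverses = {}
for a in range(1, p):
    for b in range(p):
        if (a * b) % p == 1:
            inverses[a] = b
            break

print(inverses)  # every non-zero element of Z_5 appears as a key
```

Running the same loop with a composite modulus such as 4 would leave 2 without an inverse, which is why the prime hypothesis is needed.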
- Gauss Reduction
- Gauss Reduction is the process of applying elementary row operations (swapping two rows, scaling a row by a non-zero scalar, and adding a multiple of one row to another) to bring a matrix to echelon form.

*Synonyms:* Gauss Reduction is also known as row reduction or just reduction.

*Related:*
- Homogeneous
- A homogeneous system of linear equations is a system of linear equations without constant terms. A homogeneous matrix-vector equation has the form Ax=0.

Note that a homogeneous linear system is always consistent, as it always has the solution x=0 (the trivial solution). - Image
- The image of a vector, `v`, under a given linear transformation, T, is the result of applying the linear transformation to the vector, T(`v`). The set of the images of all vectors in the domain is called the range of T, which in turn is contained in the codomain.

*Examples:*
- The image of the vector [1;2] under the matrix A=[1,2;2,1;1,1] is the vector [5;4;3]
- The image of the polynomial `x`^{2}+3`x`+5 ∈ P_{3}^{R}[`x`] under the linear operator **D** (differentiation) is the polynomial 2`x`+3

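The first example above is simply a matrix-vector product: each entry of the image is the dot product of a row of A with the input vector. A minimal pure-Python sketch (the helper `image` is only for this illustration):

```python
# Compute the image of v = [1;2] under A = [1,2;2,1;1,1]:
# each entry of A*v is the dot product of a row of A with v.

def image(A, v):
    """Image of vector v under the matrix A (the product A*v)."""
    return [sum(a_ij * v_j for a_ij, v_j in zip(row, v)) for row in A]

A = [[1, 2],
     [2, 1],
     [1, 1]]
v = [1, 2]

print(image(A, v))  # [5, 4, 3]
```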
- Linear Combination
- A linear combination of the vectors `v`_{1},...,`v`_{n} is any vector of the form `c`_{1}`v`_{1}+...+`c`_{n}`v`_{n}, where the `c`_{i} are scalars.

*Synonyms:*

*Related:*
- Linearly Dependent
- A set of vectors is linearly dependent if some vector in the set can be written as a linear combination of the others; equivalently, if some non-trivial linear combination of the vectors equals the zero vector.

*Synonyms:*

*Related:*
- Linearly Independent
- A set of vectors is linearly independent if it is not linearly dependent; that is, the only linear combination of the vectors equal to the zero vector is the trivial one (all coefficients zero).

*Synonyms:*

*Related:*
- Nullity
- The nullity of a matrix A is the dimension of the null space or kernel. More generally, this applies to linear maps as well as matrices.

The rank-nullity theorem states that the sum of the rank and the nullity of a matrix is equal to the number of columns in the matrix (the dimension of the domain of the matrix).

*Examples:*
- The null space of the matrix A=[1,0;0,1] (the 2x2 identity matrix) is only the zero vector: null(A) = {0}, so it has nullity 0.
- The null space of the matrix B=[0,0;0,0] (the 2x2 zero matrix) is all of the domain: null(B) = R^{2}, so B has nullity 2 = dim(R^{2}).
- The null space of the 2x2 matrix C=[1,1;1,1] is the 1-dim subspace of R^{2} generated by the vector [1;-1]: null(C) = {[`a`;-`a`]}. Thus the nullity of C is 1.
- The kernel (aka null space) of the linear operator **D** (differentiation) when applied to the space of cubic polynomials P_{3}[`x`] is the set of all constant polynomials. The dimension of the space of all constant polynomials is 1, so the nullity of **D**:P_{3}[`x`]→P_{2}[`x`] is 1.

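The rank-nullity theorem can be checked for the matrix C above. The sketch below computes the rank by Gaussian elimination over the rationals; the helper `rank` is only for this illustration:

```python
# Verify the rank-nullity theorem for C = [1,1;1,1]:
# rank + nullity = number of columns.

from fractions import Fraction

def rank(A):
    """Rank of a matrix (list of rows), via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(A), len(A[0])
    r = 0  # number of pivot rows found so far
    for c in range(cols):
        if r == rows:
            break
        # Find a row at or below position r with a non-zero entry in column c.
        pivot = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        # Eliminate the entries below the pivot.
        for i in range(r + 1, rows):
            factor = A[i][c] / A[r][c]
            A[i] = [a - factor * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

C = [[1, 1],
     [1, 1]]
nullity = len(C[0]) - rank(C)  # rank-nullity: nullity = columns - rank
print(rank(C), nullity)        # 1 1
```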
- Null Space
- The null space of a matrix A is the set of solutions to the
homogeneous equation Ax=0.

The zero vector is always in the null space and the null space is a subspace of the domain space.

More generally, the null space of any linear map `T` is the space of all inputs `x` such that `T`(`x`) = 0. The dimension of the null space is sometimes called the nullity of the matrix or linear map.

*Examples:*
- The null space of the matrix A=[1,0;0,1] (the 2x2 identity matrix) is only the zero vector: null(A) = {0}.
- The null space of the matrix B=[0,0;0,0] (the 2x2 zero matrix) is all of the domain: null(B) = R^{2}.
- The null space of the 2x2 matrix C=[1,1;1,1] is the 1-dim subspace of R^{2} generated by the vector [1;-1]: null(C) = {[`a`;-`a`]}.
- The null space of the 2x4 matrix D=[1,1,1,1;1,2,3,4] is the 2-dim subspace of R^{4} generated by the vectors [1;-2;1;0] and [2;-3;0;1]: null(D) = {[`z`+2`w`;-2`z`-3`w`;`z`;`w`]} = 〈[1;-2;1;0],[2;-3;0;1]〉.

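Membership in a null space is easy to verify: a vector lies in null(D) exactly when D maps it to the zero vector. A minimal pure-Python check of the two claimed generators:

```python
# Verify that [1;-2;1;0] and [2;-3;0;1] lie in the null space of
# D = [1,1,1,1;1,2,3,4], i.e. that D maps each of them to zero.

def matvec(A, v):
    """Product A*v for a matrix given as a list of rows."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

D = [[1, 1, 1, 1],
     [1, 2, 3, 4]]

print(matvec(D, [1, -2, 1, 0]))  # [0, 0]
print(matvec(D, [2, -3, 0, 1]))  # [0, 0]
```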
*Synonyms:* kernel
- onto
- A linear transformation, T, is onto if its range is all of its codomain, not merely a subspace. Thus, for any vector `w`, the equation T(`x`) = `w` has at least one solution `x` (is consistent).

The linear transformation T is onto if and only if the columns of its corresponding matrix span the codomain. Equivalently, a linear transformation is onto if and only if its corresponding matrix has a pivot position in every row.
- Pivot
- Pivot has two meanings:
- A pivot of a matrix which has been reduced to echelon form is the leading non-zero element in each row. (We also refer to the pivots of a matrix which has not been reduced, referring implicitly to its echelon form.)
- When performing Gaussian Reduction on a matrix, reducing it to echelon form or reduced echelon form, the act of pivoting is to perform a row swap (partial pivoting) or both a row and a column swap (total pivoting), in order to bring as large an element as possible into the pivot position. This can reduce accumulated error when performing numerical operations on large matrices.
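The second meaning can be sketched in a few lines: before eliminating below each pivot, swap up the row whose entry in the pivot column is largest in absolute value. A minimal pure-Python illustration (a sketch, not a production solver; the helper name is hypothetical):

```python
# Gaussian elimination with partial pivoting: before eliminating in
# column c, swap up the row with the largest absolute entry in that
# column, to reduce rounding error in floating-point arithmetic.

def echelon_partial_pivot(A):
    """Reduce a matrix (list of rows of floats) to echelon form, pivoting by size."""
    A = [row[:] for row in A]
    rows, cols = len(A), len(A[0])
    r = 0
    for c in range(cols):
        if r == rows:
            break
        # Partial pivoting: choose the row with the largest |entry| in column c.
        p = max(range(r, rows), key=lambda i: abs(A[i][c]))
        if A[p][c] == 0:
            continue
        A[r], A[p] = A[p], A[r]
        # Eliminate the entries below the pivot.
        for i in range(r + 1, rows):
            factor = A[i][c] / A[r][c]
            A[i] = [a - factor * b for a, b in zip(A[i], A[r])]
        r += 1
    return A

# The pivot for the first column is 3 (row 2), not 1 (row 1).
print(echelon_partial_pivot([[1.0, 2.0], [3.0, 4.0]]))
```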

*Synonyms:*

*Related:*
- Range
- The range of a linear transformation, T, is the set of all possible values of T(`v`). Thus, if T(v) = w, then v is a vector in the domain and w is its image in the range, which in turn is a subspace of the codomain.

*Examples:*
- The range of the transformation T:R^{3}→R^{5} is a subspace of R^{5} (but not all of R^{5})
- The matrix A=[1,2;2,1;1,1] (three rows and two columns) induces a linear map from R^{2} to R^{3}, whose range is the 2-dim subspace of R^{3} spanned by the columns [1;2;1] and [2;1;1]

*Synonyms:* If a linear transformation T is represented by a matrix A, then the range of T is equal to the column space of A.
- Rank
- The rank of a matrix is the dimension of the row space, which is equal to the dimension of the column space. The rank of a linear transformation is the dimension of the range.

The rank of a matrix is the number of pivots in that matrix.

The rank-nullity theorem states that the sum of the rank and the nullity of a matrix is equal to the number of columns in the matrix (the dimension of the domain of the matrix).

*Synonyms:*

*Related:*
- Reduced Echelon Form

- A matrix is in reduced echelon form if it is in echelon form, each pivot equals 1, and each pivot is the only non-zero entry in its column. The reduced echelon form of a matrix is unique.

*Synonyms:* reduced row echelon form (RREF)

*Related:*
- Row Space
- The row space of a matrix is the subspace of the
domain which is spanned
by the rows of the matrix.

The dimension of the row space is called the rank of the matrix, and is equal to the dimension of the column space.

*Synonyms:*
- Scalar
- A scalar is an element of the field of scalars for a vector space. The specification of a vector space includes specifying a field of scalars. The rules of scalar multiplication for that vector space apply to scalars from that field.

*Examples:*
- R^{3} has real numbers for scalars
- C has complex numbers for scalars when viewed as a vector space over the complexes, but it can also be defined as a vector space with real scalars (a vector space "over the reals"), in which case a basis would be {1, *i*}.

*Synonyms:*

*Related:*
- Span
- The span of a set of vectors is the set of all linear combinations of those vectors. The span of any set of vectors in a vector space is a subspace of that space.

*Synonyms:*

*Related:*
- Subspace
- A subspace of a vector space V is a subset of V which is itself a vector space under the operations of V; equivalently, a subset containing the zero vector which is closed under addition and scalar multiplication.

*Synonyms:*

*Related:*
- Trivial Solution
- The trivial solution of a homogeneous system Ax=0 is the solution x=0. A homogeneous system is always consistent because it always has the trivial solution.

*Synonyms:*

*Related:*
- Vector Space
- A vector space is a set, V, together with a choice of a field of scalars, F, and operations of addition of vectors and multiplication by a scalar such that for any vectors `u`, `v` and `w` ∈ V and scalars `s` and `t` ∈ F the following properties hold:
- `v`+`w` ∈ V (Closure of Addition)
- `v`+`w` = `w`+`v` (Commutativity of Addition)
- `u`+(`v`+`w`) = (`u`+`v`)+`w` (Associativity of Addition)
- There is a vector 0 ∈ V such that `v`+0 = `v` for all `v` ∈ V (Additive Identity)
- For any vector `v` ∈ V there is some vector `w` ∈ V such that `v`+`w` = 0 (Additive Inverse) [We call this element -`v`]
- `s``v` ∈ V (Closure of Scalar Multiplication)
- (`s`·`t`)`v` = `s`(`t``v`) (Associativity of Scalar Multiplication)
- For 1 ∈ F (the multiplicative identity of F) we have 1`v` = `v` (Multiplicative Identity)
- `s`(`v`+`w`) = `s``v`+`s``w` and (`s`+`t`)`v` = `s``v`+`t``v` (Distributive Laws)

*Examples:*
- R^{3} is the set {[`a`;`b`;`c`] where `a`,`b`,`c` ∈ R} with scalar multiplication by real numbers.
- C^{2} is the set {[`a`;`b`] where `a`,`b` ∈ C} with scalar multiplication by complex numbers.
- M_{2,2}(R) is the set of 2x2 matrices with real entries, with matrix addition and scalar multiplication by real numbers.
- P_{3}^{R}[`x`] is the set of cubic (or less) degree polynomials in `x` with real coefficients, {`a`+`b``x`+`c``x`^{2}+`d``x`^{3}, where `a`,`b`,`c`,`d` ∈ R}, with addition of polynomials and scalar multiplication by real numbers.

*Synonyms:*

*Related:*

Robert Campbell, campbell@math.umbc.edu

October 17, 2004