
Representing Compositions of Linear Transformations as Matrices

Let S: U → V and T: V → W be linear transformations, and let α = {u_1, ..., u_n}, β = {v_1, ..., v_m}, and γ = {w_1, ..., w_p} be finite ordered bases of U, V, and W respectively (n, m, p ∈ Z+). Let A = [T]_β^γ be the matrix of T relative to β and γ, and let B = [S]_α^β be the matrix of S relative to α and β. Then the product AB = [T∘S]_α^γ, the matrix of the composition T∘S: U → W relative to α and γ.
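The identity above can be checked numerically. The maps below are hypothetical examples (not from the post): S: R^2 → R^3 and T: R^3 → R^2, both written in standard bases, so that the product of their matrices should agree with applying the two maps in sequence.

```python
# Minimal sketch: the matrix of a composition is the product of the matrices.
# B = [S] is 3x2 (S: R^2 -> R^3), A = [T] is 2x3 (T: R^3 -> R^2); both are
# made-up examples in standard bases.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def apply(A, x):
    return [sum(A[i][k] * x[k] for k in range(len(x))) for i in range(len(A))]

B = [[1, 0], [2, 1], [0, 3]]   # B = [S], 3x2
A = [[1, 1, 0], [0, 1, 1]]     # A = [T], 2x3

AB = matmul(A, B)              # candidate for [T∘S], 2x2

x = [5, -2]
print(apply(A, apply(B, x)))   # T(S(x)) computed step by step -> [13, 2]
print(apply(AB, x))            # (AB)x in one step              -> [13, 2]
```

Both computations give the same vector, as the theorem predicts.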

Proving Properties of Compositions of Linear Transformations

Theorem: Let V be a vector space, and let R, S, T ∈ L(V) be linear transformations on V. Then:
(a) R(S + T) = RS + RT and (S + T)R = SR + TR
(b) R(ST) = (RS)T
(c) RI = IR = R
(d) c(ST) = (cS)T = S(cT) ∀ c ∈ R

Linear Dependence and Independence

Theorem: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_1 is linearly dependent, then S_2 is linearly dependent.
Corollary: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_2 is linearly independent, then S_1 is linearly independent.
Proof: Suppose S_1 ⊆ S_2 ⊆ V and S_1 is linearly dependent. Then there exist vectors u_1, ..., u_n in S_1 and scalars a_1, ..., a_n in R, with at least one a_i ≠ 0 (1 ≤ i ≤ n), such that a_1u_1 + ... + a_nu_n = 0. Since S_1 ⊆ S_2, each u_i also lies in S_2, so the same nontrivial relation shows that S_2 is linearly dependent. ∎
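The theorem can be illustrated computationally: a finite list of vectors is linearly dependent exactly when the rank of the matrix whose rows are those vectors is less than the number of vectors. The sets S1 and S2 below are made-up examples with S1 ⊆ S2:

```python
# Sketch: dependence test via row-reduction rank (Gauss-Jordan elimination).
def rank(rows):
    rows = [r[:] for r in rows]           # work on a copy
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][c]) > 1e-12), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][c]) > 1e-12:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def dependent(vectors):
    return rank(vectors) < len(vectors)

S1 = [[1, 2, 0], [2, 4, 0]]           # dependent: second row = 2 * first
S2 = S1 + [[0, 0, 1]]                 # a superset of S1
print(dependent(S1), dependent(S2))   # True True, as the theorem predicts
```

Adding vectors to a dependent set can never remove the dependence relation, which is exactly what the proof argues.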

Span(S) is the set of all 2 x 2 Symmetric Matrices

Let W_2 be the set of all 2 x 2 real symmetric matrices, i.e. all A in M_(2 x 2)(R) with A_ij = A_ji for all 1 ≤ i, j ≤ 2.
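To see why a spanning set for W_2 exists, note that any symmetric matrix [[a, b], [b, c]] is determined by three entries. Assuming the spanning set S = {E11, E22, E12 + E21} (a standard choice, not stated explicitly above), the decomposition can be verified directly:

```python
# Sketch: every 2x2 real symmetric matrix [[a, b], [b, c]] equals
# a*E11 + c*E22 + b*(E12 + E21), so these three matrices span W_2.
E11  = [[1, 0], [0, 0]]
E22  = [[0, 0], [0, 1]]
Esym = [[0, 1], [1, 0]]   # E12 + E21

def combo(a, b, c):
    basis = [(a, E11), (c, E22), (b, Esym)]
    return [[sum(k * M[i][j] for k, M in basis) for j in range(2)] for i in range(2)]

A = [[4, -7], [-7, 9]]        # an arbitrary symmetric matrix
print(combo(4, -7, 9) == A)   # True
```

Since the three matrices are also linearly independent, they in fact form a basis of W_2, so dim(W_2) = 3.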

Symmetric Matrices as a Subspace of all Square Matrices

Definition: A subspace W of a vector space V over R is a subset of V that is itself closed under addition and scalar multiplication. Equivalently, W is nonempty and for all x, y in W and any c in R, cx + y is in W.
Let W_n be the subset of M_(n x n)(R) consisting of all A with A_ij = A_ji for all 1 ≤ i, j ≤ n (i.e., the set of all real symmetric n x n matrices).
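The subspace criterion above (cx + y stays in W) can be checked for W_n directly: if A and B are symmetric, so is cA + B, since (cA + B)_ij = cA_ij + B_ij = cA_ji + B_ji = (cA + B)_ji. The matrices below are arbitrary symmetric examples:

```python
# Sketch: verifying the closure condition cA + B for symmetric matrices.
def is_symmetric(A):
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

def lin_comb(c, A, B):
    n = len(A)
    return [[c * A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

A = [[1, 2], [2, 3]]     # arbitrary symmetric examples
B = [[0, 5], [5, -1]]
C = lin_comb(4, A, B)    # 4A + B

print(is_symmetric(A), is_symmetric(B), is_symmetric(C))   # True True True
```

Together with the fact that the zero matrix is symmetric, this is exactly what the subspace criterion requires, so W_n is a subspace of M_(n x n)(R).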