Let S: U -> V and T: V -> W be linear transformations, with respective bases of U, V, and W given as α = {u_1,...,u_n}, β = {v_1,...,v_m}, and γ = {w_1,...,w_p}, where n, m, p are positive integers. Let A = [T][β,γ] be the p x m matrix of T relative to β and γ, and let B = [S][α,β] be the m x n matrix of S relative to α and β. Then the product of these two matrices is the matrix of the composition: AB = [TS][α,γ].
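A quick numeric sanity check of this identity (not a proof), assuming U = R^3, V = R^2, and W = R^4 with their standard bases, so that S and T are identified with their matrices; numpy and the particular random entries are incidental choices:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((2, 3))   # B = [S][α,β], the 2 x 3 matrix of S: R^3 -> R^2
A = rng.standard_normal((4, 2))   # A = [T][β,γ], the 4 x 2 matrix of T: R^2 -> R^4

x = rng.standard_normal(3)        # coordinates of an arbitrary u in U

# Applying S and then T to x agrees with applying the single matrix AB to x.
assert np.allclose(A @ (B @ x), (A @ B) @ x)
print((A @ B).shape)              # (4, 3): AB maps R^3 -> R^4, just as TS: U -> W
```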
Theorem: Let V be a vector space, and let R, S, T ∈ L(V), the space of linear transformations from V to itself. Then:
(a) R(S + T) = RS + RT and (S + T)R = SR + TR
(b) R(ST) = (RS)T
(c) RI = IR = R
(d) c(ST) = (cS)T = S(cT) ∀ c ∈ R
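Each identity can be spot-checked numerically. The sketch below assumes V = R^3 and identifies each operator with its 3 x 3 matrix in the standard basis; the random entries and the scalar c are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
R, S, T = (rng.standard_normal((3, 3)) for _ in range(3))
I = np.eye(3)   # the identity operator on R^3
c = 2.5

assert np.allclose(R @ (S + T), R @ S + R @ T)          # (a) left distributivity
assert np.allclose((S + T) @ R, S @ R + T @ R)          # (a) right distributivity
assert np.allclose(R @ (S @ T), (R @ S) @ T)            # (b) associativity
assert np.allclose(R @ I, R) and np.allclose(I @ R, R)  # (c) identity
assert np.allclose(c * (S @ T), (c * S) @ T)            # (d)
assert np.allclose(c * (S @ T), S @ (c * T))            # (d)
```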
Theorem: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_1 is linearly dependent, then S_2 is linearly dependent.
Corollary: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_2 is linearly independent, then S_1 is linearly independent.
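A concrete illustration in R^3 (the specific vectors are arbitrary choices): a finite set of column vectors is linearly dependent exactly when the matrix having them as columns has rank less than the number of vectors, so a rank check exhibits the theorem on an explicit pair S_1 ⊆ S_2:

```python
import numpy as np

u1 = np.array([1.0, 2.0, 0.0])
u2 = 3 * u1                       # forces dependence: 3*u1 - u2 = 0
S1 = np.column_stack([u1, u2])
S2 = np.column_stack([u1, u2, np.array([0.0, 0.0, 1.0])])  # S1 ⊆ S2

print(np.linalg.matrix_rank(S1) < S1.shape[1])  # True: S_1 is dependent
print(np.linalg.matrix_rank(S2) < S2.shape[1])  # True: so is the superset S_2
```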
-----------------
Theorem: If S_1 ⊆ S_2 ⊆ V and S_1 is linearly dependent, then S_2 is linearly dependent.

Proof: Assume S_1 is linearly dependent. Then there exist vectors u_1,...,u_n in S_1 and scalars a_1,...,a_n in R, with at least one a_i ≠ 0 (1 ≤ i ≤ n), such that

a_1 u_1 + a_2 u_2 + ... + a_n u_n = 0.

Since S_1 ⊆ S_2, each u_i also lies in S_2, so this same nontrivial linear combination shows that S_2 is linearly dependent.
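The proof's witness can be made concrete (the vectors below are arbitrary choices): the nonzero coefficients giving a relation among vectors of S_1 extend by zero coefficients to a relation over S_2:

```python
import numpy as np

u1 = np.array([1.0, 2.0, 0.0])
u2 = 3 * u1
v  = np.array([0.0, 0.0, 1.0])      # extra vector making S_2 a strict superset

a = np.array([3.0, -1.0])           # a_1 u_1 + a_2 u_2 = 0, with not all a_i zero
assert np.allclose(a[0] * u1 + a[1] * u2, 0)

a_ext = np.array([3.0, -1.0, 0.0])  # extend with coefficient 0 for v
assert np.allclose(a_ext[0] * u1 + a_ext[1] * u2 + a_ext[2] * v, 0)
```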
Let W_2 be the set of all A in M_(2 x 2)(R) such that A_ij = A_ji for all 1 ≤ i, j ≤ 2 (i.e. the set of all 2 x 2 real symmetric matrices).
Theorem: A subset W of a vector space V over R is a subspace of V if and only if W is nonempty and closed under addition and scalar multiplication. That is, for all x, y in W and every c in R, cx + y is in W.
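As an illustration of this test applied to W_2 from above (is_symmetric is a hypothetical helper defined here, and the matrices x and y are arbitrary symmetric examples):

```python
import numpy as np

def is_symmetric(A):
    # A is symmetric exactly when it equals its transpose.
    return np.allclose(A, A.T)

x = np.array([[1.0, 2.0], [2.0, 5.0]])    # x in W_2
y = np.array([[0.0, -3.0], [-3.0, 4.0]])  # y in W_2
c = -1.5

assert is_symmetric(c * x + y)            # closed under cx + y
assert is_symmetric(np.zeros((2, 2)))     # the zero matrix is in W_2, so W_2 is nonempty
```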
-----------------
Let W_n be the subset of M_(n x n)(R) consisting of all A with A_ij = A_ji for all 1 ≤ i, j ≤ n (i.e. the set of all real symmetric n x n matrices).
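The same cx + y test applies to W_n for every n, since (cA + B)^T = cA^T + B^T = cA + B whenever A and B are symmetric. A sketch of the check at a few arbitrary sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
for n in (2, 3, 5):
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2                # symmetrize: A is in W_n
    B = rng.standard_normal((n, n))
    B = (B + B.T) / 2                # B is in W_n
    c = rng.standard_normal()
    assert np.allclose(c * A + B, (c * A + B).T)  # cA + B is again in W_n
```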