
Μηδείς αγεωμέτρητος εισίτω μοι την στέγην ("Let no one ignorant of geometry enter my roof")

Representing Compositions of Linear Transformations as Matrices

Let S: U → V and T: V → W be linear transformations, and let α = {u_1, ..., u_n}, β = {v_1, ..., v_m}, and γ = {w_1, ..., w_p} be ordered bases of U, V, and W respectively, with n, m, p ∈ Z⁺ finite. Write A = [T]_β^γ and B = [S]_α^β for the matrix representations of T and S with respect to these bases. Then the product AB = [TS]_α^γ, the matrix representation of the composition TS: U → W.
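To make the statement concrete, here is a minimal numeric sketch in Python (using NumPy), identifying U, V, and W with R^2, R^3, and R^2 under their standard bases; the dimensions and matrix entries below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

B = rng.integers(-5, 5, size=(3, 2))   # B = [S]_alpha^beta : R^2 -> R^3
A = rng.integers(-5, 5, size=(2, 3))   # A = [T]_beta^gamma : R^3 -> R^2

u = rng.integers(-5, 5, size=2)        # coordinates of an arbitrary u in U

# Applying T after S to u, versus applying the single matrix AB:
lhs = A @ (B @ u)      # T(S(u))
rhs = (A @ B) @ u      # [TS]_alpha^gamma applied to u

assert np.array_equal(lhs, rhs)
print(A @ B)           # the matrix [TS]_alpha^gamma
```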

Proving Properties of Compositions of Linear Transformations

Theorem: Let V be a vector space, and let R, S, T ∈ L(V) be linear transformations on V. Then:
(a) R(S + T) = RS + RT and (S + T)R = SR + TR
(b) R(ST) = (RS)T
(c) RI = IR = R
(d) c(ST) = (cS)T = S(cT) for every scalar c
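Since for finite-dimensional V the elements of L(V) correspond to square matrices, with composition corresponding to matrix multiplication, properties (a) through (d) can be spot-checked numerically. The following Python sketch does so with random 4×4 matrices; the size and the scalar are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
R, S, T = (rng.standard_normal((4, 4)) for _ in range(3))
I = np.eye(4)
c = 2.5

assert np.allclose(R @ (S + T), R @ S + R @ T)          # (a) left distributivity
assert np.allclose((S + T) @ R, S @ R + T @ R)          # (a) right distributivity
assert np.allclose(R @ (S @ T), (R @ S) @ T)            # (b) associativity
assert np.allclose(R @ I, R) and np.allclose(I @ R, R)  # (c) identity
assert np.allclose(c * (S @ T), (c * S) @ T)            # (d)
assert np.allclose(c * (S @ T), S @ (c * T))            # (d)
```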

Three Animal Fables

A tortoise and a mallard are walking along a cobblestone road in the old town. The mallard turns to the tortoise and asks, "So, how am I to come out of my shell?" "I'd prefer to duck that question entirely!" replies the tortoise.

When are Squares Triangles?

This is based upon the final part of a discussion for children given by Rav Ginzburgh on the relationship of square and triangular numbers in the structure of Torah, and upon his paper, "When Two Triangles Make a Square."
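Assuming the identity in view is the classic one suggested by the paper's title, namely that two consecutive triangular numbers sum to a perfect square, T(n-1) + T(n) = n², here is a short Python check (the attribution of this exact identity to the paper is my assumption):

```python
# Two consecutive triangular numbers make a square: T(n-1) + T(n) = n^2.
def triangle(n: int) -> int:
    """n-th triangular number: 1 + 2 + ... + n."""
    return n * (n + 1) // 2

for n in range(1, 11):
    assert triangle(n - 1) + triangle(n) == n ** 2
    print(f"T({n-1}) + T({n}) = {triangle(n-1)} + {triangle(n)} = {n}^2")
```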

Linear Dependence and Independence

Theorem: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_1 is linearly dependent, then S_2 is linearly dependent.
Corollary: Let V be a vector space, and let S_1 ⊆ S_2 ⊆ V. If S_2 is linearly independent, then S_1 is linearly independent. (This is the contrapositive of the theorem.)
----------------- 
Proof: Suppose S_1 ⊆ S_2 ⊆ V and S_1 is linearly dependent. Then there exist vectors u_1, ..., u_n in S_1 and scalars a_1, ..., a_n in R, with at least one a_i ≠ 0, 1 ≤ i ≤ n, such that

a_1u_1 + a_2u_2 + ... + a_nu_n = 0.

Since S_1 ⊆ S_2, the vectors u_1, ..., u_n also belong to S_2, so this same nontrivial linear combination witnesses the linear dependence of S_2. ∎
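As a concrete illustration of the proof, the following Python sketch (with hypothetical vectors in V = R^3) exhibits a dependent set S_1 whose nontrivial relation carries over unchanged to a superset S_2:

```python
import numpy as np

# S1 is linearly dependent: its third vector is the sum of the first two.
u1, u2, u3 = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 1, 0])
extra = np.array([0, 0, 1])

S1 = [u1, u2, u3]           # dependent: 1*u1 + 1*u2 - 1*u3 = 0
S2 = S1 + [extra]           # S1 is a subset of S2

def is_dependent(vectors) -> bool:
    """A finite set is dependent iff its matrix has rank < number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) < len(vectors)

assert is_dependent(S1) and is_dependent(S2)
# The same coefficients (1, 1, -1), padded with 0 for the extra vector,
# witness the dependence of S2:
assert np.array_equal(1*u1 + 1*u2 - 1*u3 + 0*extra, np.zeros(3))
```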