Let V and W be finite-dimensional vector spaces with ordered bases beta and lambda, respectively, and let T, U: V -> W be linear transformations. Prove that [T+U] = [T] + [U].
To prove that [T+U] = [T]+[U], where [T], [U], and [T+U] denote the matrices of the linear transformations T, U, and T+U with respect to the ordered bases beta and lambda, it suffices to show that both matrices act identically on every coordinate vector [v]_beta: a matrix is determined by its action on the standard basis vectors, so equal actions force the entries to agree.
Let beta = {v1, v2, …, vn} and lambda = {w1, w2, …, wm}.
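Recall how these matrices are defined (standard background, restated here because the column argument at the end relies on it): the j-th column of [T] consists of the lambda-coordinates of T(vj). In symbols,

$$T(v_j) = \sum_{i=1}^{m} [T]_{ij}\, w_i, \qquad 1 \le j \le n,$$

and likewise for [U] and [T+U]. Equivalently, [T][v]_beta = [Tv]_lambda for all v in V, which is the identity used in the computation below.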
Then, for any v in V, we have:
[T+U]([v]_beta) = [(T+U)(v)]_lambda (by definition of [T+U])
= [Tv + Uv]_lambda (by definition of T+U)
= [Tv]_lambda + [Uv]_lambda (since the coordinate map [.]_lambda is linear)
= [T]([v]_beta) + [U]([v]_beta) (by definition of [T] and [U])
= ([T]+[U])([v]_beta) (since matrix multiplication distributes over matrix addition)
Since this holds for every v in V, we may take v = vj; then [vj]_beta is the j-th standard basis vector, so the j-th columns of [T+U] and [T]+[U] agree for each j = 1, …, n. Therefore [T+U] = [T]+[U], as desired.
In other words, the matrix of a sum of linear transformations is the sum of their matrices.
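As a numerical sanity check (not a substitute for the proof), here is a minimal NumPy sketch that builds each matrix column by column, exactly as in the argument above, and verifies the identity for two concrete maps from R^3 to R^2. The helper matrix_of and the example matrices A and B are illustrative choices, not part of the original problem.

```python
import numpy as np

def matrix_of(f, n, m):
    """Matrix of a linear map f: R^n -> R^m with respect to the
    standard bases: column j is f applied to the j-th basis vector."""
    cols = [f(np.eye(n)[:, j]) for j in range(n)]
    return np.column_stack(cols)

# Two example linear maps from R^3 to R^2, given by matrices A and B.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, -2.0]])

T = lambda v: A @ v
U = lambda v: B @ v
T_plus_U = lambda v: T(v) + U(v)   # the pointwise sum (T+U)(v) = Tv + Uv

M_T = matrix_of(T, 3, 2)
M_U = matrix_of(U, 3, 2)
M_sum = matrix_of(T_plus_U, 3, 2)

# [T+U] should equal [T] + [U] entry by entry.
assert np.allclose(M_sum, M_T + M_U)
print(M_sum)
```

Because matrix_of computes column j as the image of the j-th basis vector, the passing assertion is exactly the column-by-column agreement established in the proof.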