18.013A, Chapter 3, Section 3.5


Prove: any $k+1$ $k$-vectors are linearly dependent. (You can do it by using mathematical induction.)
Solution:
Suppose we have $k+1$ $k$-vectors. Each such vector has $k$ components. We assume as an induction hypothesis that any $k$ $(k-1)$-vectors are linearly dependent, which means that there must be a linear dependence among any $k$ $(k-1)$-component vectors.
(This hypothesis is trivial for $k=2$: it asserts that any two $1$-component vectors are linearly dependent.)
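To spell the base case out (a detail not written in the original): any two $1$-vectors $(a)$ and $(b)$ satisfy

```latex
b\,(a) - a\,(b) \;=\; (ba - ab) \;=\; (0),
```

and if $a = b = 0$ these coefficients are both zero, but then $1\cdot(a) + 0\cdot(b) = (0)$ is a nontrivial dependence instead.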
We notice that a linear combination of linear combinations is a linear combination. This means that if we can take our $k+1$ $k$-vectors and produce $k$ distinct linear combinations of them that are each $(k-1)$-vectors, we can use the induction hypothesis to give us a linear dependence among these, which will produce a linear dependence among our original vectors.
Notice also that $k$-vectors all of whose last components are $0$ can be considered to be $(k-1)$-vectors.
So we pick one of our vectors that has a nonzero $k$th component, say the $k$th. (If no vector has a nonzero $k$th component, then we really have $(k-1)$-component vectors, and the induction hypothesis, applied to any $k$ of them, gives a dependence immediately.) Now we subtract enough of this vector from each of the others to make the resulting $k$th components all zero.
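In symbols (the notation $v_i$, $v_{i,k}$ is introduced here, not in the original): if $v_p$ is the chosen vector, with $k$th component $v_{p,k} \neq 0$, then each other vector $v_i$ is replaced by

```latex
w_i \;=\; v_i \;-\; \frac{v_{i,k}}{v_{p,k}}\, v_p ,
```

whose $k$th component is $v_{i,k} - (v_{i,k}/v_{p,k})\,v_{p,k} = 0$, so each $w_i$ may be regarded as a $(k-1)$-vector.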
Now we have our $k$ $(k-1)$-vectors, and by the induction hypothesis we can find a linear combination of them that is $0$. Since each of these $(k-1)$-vectors is itself a linear combination of the original vectors, this gives a linear combination of the original vectors which is $0$, and we are done.
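Written out (with the hypothetical notation $w_i = v_i - (v_{i,k}/v_{p,k})\,v_p$ for the reduced vectors and $v_p$ for the chosen one): if $\sum_{i \neq p} c_i w_i = 0$ is the dependence supplied by the induction hypothesis, substituting back gives

```latex
\sum_{i \neq p} c_i\, v_i \;-\; \Bigl(\,\sum_{i \neq p} c_i\, \frac{v_{i,k}}{v_{p,k}}\Bigr) v_p \;=\; 0,
```

a dependence among the original $k+1$ vectors, with the chosen vector picking up the bracketed coefficient.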
We have to verify that the new linear combination cannot have all-zero coefficients when the one obtained from the induction hypothesis did not. This holds because each of our vectors except for the $k$th occurs in exactly one of the combinations to which the induction hypothesis was applied: any nonzero coefficient in the linear combination of linear combinations gives rise to a nonzero coefficient on the corresponding original vector, and we are really done.
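The inductive argument is constructive, so it can be turned into a procedure. Here is a minimal sketch in Python (not part of the original text) that, given $k+1$ vectors of $k$ exact rational components each, follows the proof's steps — eliminate the last component against a chosen pivot, recurse on the resulting $(k-1)$-vectors, then substitute back — to produce a nontrivial dependence:

```python
from fractions import Fraction

def dependence(vectors):
    """Return coefficients c_0..c_k, not all zero, with
    sum(c_i * vectors[i]) == 0, for k+1 vectors of k Fraction
    components each, following the inductive elimination argument."""
    k = len(vectors[0])
    assert len(vectors) == k + 1
    if k == 1:
        a, b = vectors[0][0], vectors[1][0]
        # base case: b*(a) - a*(b) = (0); if both are zero, (1, 0) works
        return [Fraction(1), Fraction(0)] if a == 0 and b == 0 else [b, -a]
    # pick a vector with nonzero kth (last) component, if any
    p = next((i for i, v in enumerate(vectors) if v[-1] != 0), None)
    if p is None:
        # every kth component is zero: these are really (k-1)-vectors;
        # apply the hypothesis to the first k of them
        c = dependence([v[:-1] for v in vectors[:k]])
        return c + [Fraction(0)]
    pivot = vectors[p]
    rest = [v for i, v in enumerate(vectors) if i != p]
    # subtract enough of the pivot from each other vector to zero out
    # its kth component, then drop that (now zero) component
    ratios = [v[-1] / pivot[-1] for v in rest]
    reduced = [[v[j] - r * pivot[j] for j in range(k - 1)]
               for v, r in zip(rest, ratios)]
    c = dependence(reduced)  # k vectors, k-1 components each: recurse
    # substitute back: sum c_i (v_i - r_i * pivot) = 0, so the pivot
    # picks up coefficient -sum(c_i * r_i)
    coeffs, it = [], iter(c)
    for i in range(k + 1):
        coeffs.append(-sum(ci * r for ci, r in zip(c, ratios))
                      if i == p else next(it))
    return coeffs
```

As the last paragraph of the proof promises, the coefficients returned for the non-pivot vectors are exactly those supplied by the recursive call, so they cannot all be zero.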
