Complete the vectors (1, 2, -1, 2, 3)^T, (2, 2, 1, 5, 5)^T, (-1, -4, 4, -7, -11)^T to a basis of R^5.
Hint: Form the matrix with rows (1,2,-1,2,3), (2,2,1,5,5), and (-1,-4,4,-7,-11) and compute an echelon form.
Please be as detailed as possible. Thanks!
Update: Thanks, but I'm not entirely certain I follow. Could you work through the steps? I don't understand the ultimate goal, or even how you intend to get there. Sorry, I am not the best at this subject.
Answers & Comments
Verified answer
Define S as the span of the given 3 vectors. The idea here is to find a basis for S^{perp}, the space of all vectors perpendicular to S. Then a basis for S, together with a basis for S^{perp}, will always give you a basis for the full vector space.
You can define A as the matrix with columns given by the 3 known vectors. Then S = Col(A) and S^{perp} = Col(A)^{perp} = Null(A^T). So you can just compute a basis for Null(A^T), which is what the problem is hinting at.
***
EDIT: No, I am saying the same thing as the hint. Compute a basis for Null(A^T). Recall A^T is the transpose of A, treating all columns of A as rows.
However, I think you are being much too procedurally driven here. The point is that you need to find the set of all vectors x that are orthogonal to all of the given vectors c_1, c_2, c_3. So, if we think of these vectors as column vectors, orthogonality requires c_1^T x = 0, c_2^T x = 0, c_3^T x = 0. You can equivalently write that as a matrix equation C x = 0, where the rows of C are c_1^T, c_2^T, c_3^T. So just think about which vectors are taking inner products with which other vectors.
***
You want to find all vectors x that solve Cx = 0, where:
C =
[1 2 -1 2 3]
[2 2 1 5 5]
[-1 -4 4 -7 -11]
To do that, use Gaussian elimination. Since C has rank 3, you should get 2 linearly independent vectors that way.
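If you want to check your hand computation, here is a minimal sketch (assuming SymPy is available) that computes a basis for Null(C) exactly:

```python
# Sketch: compute a basis for Null(C), i.e. all x with C x = 0.
import sympy as sp

C = sp.Matrix([
    [ 1,  2, -1,  2,   3],
    [ 2,  2,  1,  5,   5],
    [-1, -4,  4, -7, -11],
])

basis = C.nullspace()  # list of column vectors spanning Null(C)
for v in basis:
    print(v.T)
```

SymPy works over the rationals here, so the printed vectors are exact; since rank(C) = 3, the nullspace has dimension 5 - 3 = 2.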
***
Another unexplained thumbs down. Any explanation?
I just gave the other answerer a thumbs up as a correct alternative: his approach does not compute vectors orthogonal to the first 3, but they are still linearly independent from the first 3. The reason "why" his way works is that row reductions on C produce new rows that span the same space as the original rows of C. I am surprised that he seems to ignore my own answer and does not provide any thumbs up for it...? =) Clearly Cx = 0 if and only if x is orthogonal to all rows of C. Computing a basis for Null(C) is standard, and I assumed it was in your "box of tools."
Let's do what the hint says:
[ 1 2 -1 2 3]
[ 2 2 1 5 5]
[-1 -4 4 -7 -11]
~
[1 2 -1 2 3]
[0 -2 3 1 -1]
[0 -2 3 -5 -8]
~
[1 2 -1 2 3]
[0 2 -3 -1 1]
[0 0 0 -6 -7]
You can see we have rows with leading entries (pivots) in the first, second, and fourth columns. So to extend this to a basis for R^5 we can simply add the vectors (0, 0, 1, 0, 0)^T and (0, 0, 0, 0, 1)^T, which supply pivots in the missing third and fifth columns. If we add these to the matrix above, we get something row-equivalent to the identity matrix, i.e. a basis for R^5.
So the full basis will be
{(1, 2, -1, 2, 3)^T, (2, 2, 1, 5, 5)^T, (-1, -4, 4, -7, -11)^T, (0, 0, 1, 0, 0)^T, (0, 0, 0, 0, 1)^T}.
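As a sanity check, this can be verified mechanically: a set of 5 vectors is a basis of R^5 exactly when the 5x5 matrix formed from them has rank 5 (equivalently, nonzero determinant). A sketch, again assuming SymPy:

```python
# Sketch: confirm the five vectors form a basis of R^5 by checking
# that the 5x5 matrix with those vectors as rows has full rank.
import sympy as sp

B = sp.Matrix([
    [ 1,  2, -1,  2,   3],
    [ 2,  2,  1,  5,   5],
    [-1, -4,  4, -7, -11],
    [ 0,  0,  1,  0,   0],
    [ 0,  0,  0,  0,   1],
])

print(B.rank())  # rank 5 means the rows are linearly independent, hence a basis
```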