Not able to correct the sign of computed SVD.
I am trying to check MATLAB's SVD by correcting the signs of the computed factors.
Step 1: I create a matrix A from random U, S and V'.
Step 2: I compute svd(A) and obtain U2, S2 and V2.
Step 3: I compare the computed U2, V2 and A2 with the original U, V and A by taking the norm of their differences (N1, N2, N3).
Step 4: Correction step: I multiply the computed U2 and V2 by the appropriate sign changes so that they return to the original sign convention.
Step 5: I again compare them with the original U, V and A in N4, N5 and N6.
The U matrix gets corrected, but the V and A matrices still show errors of high order. Can anyone help me with this?
Thanks!
U = rand(50);                  % Step 1: random U (50 x 50)
V = rand(50);                  % random V
d = rand(50,1);                % random singular values
s = sort(d);                   % sorted copy (not used below)
S = diag(d);
A = U*S*V';                    % build A from U, S, V
[U2,S2,V2] = svd(A);           % Step 2: compute the SVD of A
A2 = U2*S2*V2';
N1 = norm(U2-U);               % Step 3: compare computed and original factors
N2 = norm(V2-V);
N3 = norm(A2-A);
D  = U2'*U;                    % Step 4: alignment of computed and original left vectors
D2 = V2'*V;                    % alignment of computed and original right vectors
A3 = (U2*D)*S2*(D2*V2)';
N4 = norm(U2*D-U);             % Step 5: compare corrected factors with originals
N5 = norm(D2*V2-V);
N6 = norm(A3-A);
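As a quick diagnostic (added here for illustration; these checks are not in the original post), the correction step implicitly assumes that D = U2'*U is a diagonal sign matrix, which can only happen when U has orthonormal columns; a matrix from rand(50) does not, and this can be checked directly:
orthErrU = norm(U'*U - eye(50));    % zero only if U has orthonormal columns
orthErrV = norm(V'*V - eye(50));    % same check for V
offDiagD = norm(D - diag(diag(D))); % zero only if D is actually diagonal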
5 Comments
Torsten on 10 Oct 2022
Did you ask Google? There are so many hits for "matlab & generate unitary matrix".
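Along the lines of that hint, here is a minimal sketch (the qr-based construction and the descending sort are assumptions, not from the original post): if U and V are made orthogonal before forming A, and the singular values are distinct and sorted in descending order to match svd's convention, then U2 and V2 can differ from U and V only by per-column sign flips, and applying the same flips to both leaves the product unchanged.
[U,~] = qr(rand(50));                  % random orthogonal U
[V,~] = qr(rand(50));                  % random orthogonal V
S = diag(sort(rand(50,1),'descend'));  % distinct singular values, descending
A = U*S*V';
[U2,S2,V2] = svd(A);
D = diag(sign(diag(U2'*U)));           % per-column sign flips
N4 = norm(U2*D - U);                   % should be small (near machine precision)
N5 = norm(V2*D - V);
N6 = norm((U2*D)*S2*(V2*D)' - A);
Because D is diagonal with entries of plus or minus one, it commutes with the diagonal S2, so (U2*D)*S2*(V2*D)' equals U2*S2*V2' and the reconstruction of A is unaffected by the sign correction.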
Answers (0)