Properties of the SVD of a Hermitian matrix not holding at single precision

Greetings,
I came across something I did not expect and I was hoping for some insight.
I am working on a project that involves dozens of very large, dense Hermitian matrices. To reduce the amount of memory my code uses, I am storing these matrices in single precision, and I am also attempting to use low-rank approximations of them during execution.
Since these matrices are hermitian, I am anticipating the following to be true:
  1. I should be able to express these matrices as A = V*S*V', where S and V come from the singular value decomposition of A ( [~, S, V] = svd(A) ).
  2. For a rank-k approximation of A,
Ak = V(:,1:k) * S(1:k,1:k) * V(:,1:k)';
the Frobenius norm of A - Ak should be equal to the (k+1)-th singular value (that is, norm(A - Ak) == S(k+1,k+1) ).
What I am finding is that (2) does not hold if I run the svd() function on matrix A while it is in single precision. However, if I convert the arrays to double before running the SVD, (2) holds. It also holds if I convert S and V back to single precision after running the SVD.
Is this really a precision issue? The smallest singular value of my matrix is 1e-6, so I wasn't expecting a single-precision array to be a problem. Any insight would be greatly appreciated.
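A minimal sketch of this comparison, with a random Hermitian positive semi-definite test matrix standing in for the real data (n, k, and the construction of A below are illustrative assumptions):
% Build a random Hermitian PSD test matrix and store it in single precision
n = 500;  k = 50;
B = randn(n) + 1i*randn(n);
A = B*B';  A = single((A + A')/2);
% Rank-k reconstruction from the single-precision SVD
[~, S, V] = svd(A);
Ak = V(:,1:k) * S(1:k,1:k) * V(:,1:k)';
fprintf('single: norm(A-Ak) = %g, S(k+1,k+1) = %g\n', norm(A - Ak), S(k+1,k+1));
% Same check with the decomposition done in double precision
Ad = double(A);
[~, Sd, Vd] = svd(Ad);
Akd = Vd(:,1:k) * Sd(1:k,1:k) * Vd(:,1:k)';
fprintf('double: norm(A-Ak) = %g, S(k+1,k+1) = %g\n', norm(Ad - Akd), Sd(k+1,k+1));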
Thank you in advance!
  8 Comments
David Goodmanson on 25 Sep 2020
Hi Michael,
now that the problem is well defined, it appears from your results that you have a prima facie case that single precision is simply not enough precision.
I would not put that much weight on the 1e-6 singular value by itself. The question is: 1e-6 compared to what? If the matrix has a large condition number, it can easily happen that single precision does not get the job done.
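A rough way to see the scale involved (illustrative only; the test matrix here is random and normalized so that its largest singular value is 1):
% Singular values near or below eps('single')*sigma_1 cannot be trusted
% once the matrix is stored in single precision.
n = 300;
B = randn(n) + 1i*randn(n);
A = B*B';  A = (A + A')/2;  A = A/norm(A);   % Hermitian PSD, sigma_1 = 1
s = svd(single(A));
fprintf('eps(single)*sigma_1 = %g\n', eps('single')*s(1));
% If sigma_1 is of order 1, a 1e-6 singular value sits only ~10x above
% this round-off floor.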
Bruno Luong on 25 Sep 2020 (edited 25 Sep 2020)
Stupid question, but do you make sure A is numerically Hermitian by running
A = 0.5*(A + A');
before anything else?
As David rightly points out, the equality holds for the spectral norm, not the Frobenius norm: for the best rank-k approximation, norm(A - Ak, 2) equals the (k+1)-th singular value, while norm(A - Ak, 'fro') is sqrt(S(k+1,k+1)^2 + ... + S(n,n)^2), so the latter has little meaningful interpretation here.
Also, don't expect SVD to return U == V. Even for a Hermitian matrix A there is always an arbitrary phase shift (a sign, in the real case) in the SVD, so that
U*diag(exp(1i*theta)) == V
with theta an arbitrary vector of (real) phases.
To get S and V such that V*S*V' == A, run
[V, S] = eig(A)
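A small example of this sign behavior (the 3x3 symmetric matrix below is chosen arbitrarily; it has one negative eigenvalue):
A = [ 2  1  0
      1 -3  1
      0  1  1 ];
A = 0.5*(A + A');            % enforce exact symmetry
[U, S, V] = svd(A);
disp(round(U'*V))            % diagonal of +/-1: U and V differ by signs
disp(norm(A - V*S*V'))       % not small: V*S*V' loses the eigenvalue signs
[V2, D] = eig(A);
disp(norm(A - V2*D*V2'))     % ~ round-off: eig reconstructs A directly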


Accepted Answer

Christine Tobler on 28 Sep 2020
I'd suggest just calling EIG,
[V, S] = eig(A)
instead of calling SVD. If A is (exactly!) symmetric on input, this will return S and V such that V*S*V' == A, and you can check whether A is numerically S.P.D. by seeing if the diagonal of S contains any negative numbers (if A is symmetric positive semi-definite, there are likely to be some values that are negative at round-off level). With the SVD, such a small value would be returned as a positive singular value, and the associated vectors in U and V would have opposite signs.
If there are values on S's diagonal that are negative at round-off level, you can decide whether you have to stop the algorithm, or whether it's all right to just set these diagonal values to zero.
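One possible way to apply this advice (the function name, tolerance, and rank below are illustrative choices, not part of the answer):
function Ak = lowRankHermitian(A, k)
    % Rank-k approximation of a numerically Hermitian PSD matrix via EIG.
    A = 0.5*(A + A');                        % make A exactly Hermitian
    [V, d] = eig(A, 'vector');               % eigenvalues as a vector
    tol = 10 * eps(class(A)) * max(abs(d));  % assumed round-off threshold
    if any(d < -tol)
        error('A has significantly negative eigenvalues; not numerically PSD.');
    end
    d(d < 0) = 0;                            % zero eigenvalues that are negative at round-off
    [d, idx] = sort(d, 'descend');           % largest eigenvalues first
    V = V(:, idx);
    Ak = V(:,1:k) * diag(d(1:k)) * V(:,1:k)';
end
Storing only V(:,1:k) and d(1:k), rather than the full Ak, is what actually saves memory.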
