How to add a total variation term to my tomographic reconstruction problem?

Hi there,
I have a very simple model for tomographic reconstruction:
arg min_x ||Ax - y||^2
where A is the system matrix and y is the vector of measured projections.
I can solve this with MATLAB's Quasi-Newton solver (fminunc) or with the lsqr function; however, how can I add a total variation penalty to the objective function?
arg min_x ||Ax - y||^2 + TV(x)
Is there any existing implementation in MATLAB I can use?
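For reference, here is the unpenalized problem as I currently solve it; the sizes and the random matrix below are just toy placeholders, not my real data:

```matlab
% Toy stand-in for the unpenalized problem arg min_x ||A*x - y||^2.
% n, A, xt are placeholders for illustration only.
n  = 16;                        % image stored as an n^2-by-1 vector
A  = sprand(3*n^2, n^2, 0.05);  % placeholder sparse system matrix
xt = rand(n^2, 1);              % placeholder ground-truth image
y  = A * xt;                    % noise-free projections
x  = lsqr(A, y, 1e-6, 200);     % least-squares reconstruction
```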
Thanks very much.
Best regards, Aaronne

Answers (1)

Bjorn Gustavsson on 1 Jun 2011
Wouldn't TV(x) be something similar to:
[D1,D2,D3] = gradient(x);
G = sqrt(D1.^2 + D2.^2 + D3.^2);
TV = sum(G(:));
(Note the (:) — for a 3-D array, sum alone only reduces the first dimension.) If you're using fminunc or one of the other general optimizers, this should be fairly straightforward to add. But wouldn't you get approximately the same effect with:
D = del2(x);
TV = sum(abs(D(:)));
It is just a smoothing term, after all.
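To attach it to the data term, something along these lines should work; this is only a sketch, where A, y, lambda (the penalty weight), and sz (the volume size) are assumptions, and x is stored as a vector and reshaped inside:

```matlab
function C = tvCost(x, A, y, lambda, sz)
% Data term plus isotropic TV penalty, for use with fminunc.
% A, y, lambda, and sz are hypothetical inputs for illustration.
v = reshape(x, sz);                 % vector -> 3-D volume
[D1, D2, D3] = gradient(v);
G = sqrt(D1.^2 + D2.^2 + D3.^2);    % per-voxel gradient magnitude
C = sum((A*x - y).^2) + lambda * sum(G(:));
end
```

Then, for example: xhat = fminunc(@(x) tvCost(x, A, y, 0.1, sz), x0);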
  2 Comments
Aaronne on 2 Jun 2011
Hi Bjorn,
Thanks a lot for your reply. Yes, essentially I would like to add a smoothing term to my cost function.
From my understanding of your post, there are two ways to construct this TV term:
1.
[D1,D2,D3] = gradient(x);
G = sqrt(D1.^2 + D2.^2 + D3.^2);
TV = sum(G(:));
2.
D = del2(x);
TV = sum(abs(D(:)));
For example, if my cost function is:
C = sum(sum(sum((Ax-y).^2)));
should I just add the TV term to my cost function directly, like:
C_TV = sum(sum(sum((Ax-y).^2))) + TV;
If I use 'fminunc' to perform the optimisation, I want to provide the gradient as well. The gradient of my cost function C is straightforward, but how do I get the gradient of TV, say TV'?
Cheers,
Aaronne
Bjorn Gustavsson on 3 Jun 2011
Well, I guess version 1 is close enough to the definition of TV I found, but I don't think there is any practical difference between using that as a smoother and the del2 version. Perhaps it would be wiser to use del2(x).^2 instead of the abs.
If you need the gradient, then either you do the full tedious expansion yourself, or (hopefully) you can use the "Adaptive Robust Numerical Differentiation" package from the File Exchange: http://www.mathworks.com/matlabcentral/fileexchange/13490-adaptive-robust-numerical-differentiation
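Alternatively, a common trick is to smooth the TV term with a small constant so it becomes differentiable everywhere; the gradient then has the closed form -div(grad(v)/M) with M = sqrt(|grad(v)|^2 + eps2). A rough 2-D sketch follows — eps2 is an assumption (e.g. 1e-6), and reusing gradient() for the divergence is only an approximation of the exact adjoint of the forward differences:

```matlab
function [C, g] = tvSmooth(v, eps2)
% Smoothed isotropic TV of a 2-D image v and its gradient.
% eps2 is a small smoothing constant (an assumption, e.g. 1e-6).
[Dx, Dy] = gradient(v);
M = sqrt(Dx.^2 + Dy.^2 + eps2);     % smoothed gradient magnitude
C = sum(M(:));                      % TV_eps(v)
[Dxx, ~] = gradient(Dx ./ M);       % d/dx of normalized x-gradient
[~, Dyy] = gradient(Dy ./ M);       % d/dy of normalized y-gradient
g = -(Dxx + Dyy);                   % approx. -div(grad(v) ./ M)
end
```

You can sanity-check g against a finite-difference perturbation of C before trusting it inside fminunc.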

