LSQ of exponential function without linearizing
Hi. I have a problem solving the inverse problem of this form: y = a*exp(b*x1)*exp(c*x2)
I need to solve for a, b, c given data y, x1, x2. I believe this requires an iterative LSQ approach with damping via the Levenberg-Marquardt algorithm. However, I find that the solution is highly dependent on the starting point. y is on the order of 10^10, while x1 and x2 are in the range [0:1]. I expect b and c to be in the range [-2:2], meaning that "a" is much bigger, in the range [10^9:10^10].
Do you have any good advice on how to make the inverse method stable? Standardization / normalization?
Best regards, Commat
3 Comments
What do
>> cond([x1(:), x2(:)])
and
>> cond([x1(:), x2(:), ones(size(x1(:)))])
give you?
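For context, a sketch of what this diagnostic looks like (using made-up uniform data, not the poster's): a small condition number means the columns of the design matrix are nearly independent, so the linear solve suggested below is numerically safe.

```matlab
% Synthetic illustration only -- x1, x2 here are made-up predictors in [0,1],
% mimicking the ranges described in the question.
x1 = rand(100,1);
x2 = rand(100,1);
A  = [x1(:), x2(:), ones(size(x1(:)))];   % design matrix of the linearized model
cond(A)    % a value near 1-10 indicates a well-conditioned least-squares problem
```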
Answers (1)
I don't know exactly what you mean by "linearizing". If you take the log of both sides of your model equation, you obtain an equation that is linear in b, c, and log(a):
log(y(:)) = x1(:)*b + x2(:)*c + log(a)
Your result for cond([x1(:), x2(:), ones(size(x1(:)))]) says you should be able to get a pretty stable solution using mldivide,
params=[x1(:), x2(:), ones(size(x1(:)))]\log(y(:));
a=exp(params(3));
b=params(1);
c=params(2);
These kinds of transformations don't always play well when you have additive measurement noise, but the above should at least give a good initial guess [a0, b0, c0] for an iterative method.
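As a sketch of that refinement step (assuming the Optimization Toolbox's lsqcurvefit is available; a0, b0, c0 are the estimates from the linear solve above):

```matlab
% Refine the log-linear estimate with a nonlinear least-squares fit.
% Parameter vector p = [a; b; c]; X = [x1(:), x2(:)] holds the predictors.
model = @(p,X) p(1)*exp(X(:,1)*p(2)).*exp(X(:,2)*p(3));
p0    = [a0; b0; c0];                        % initial guess from the linear fit
opts  = optimoptions('lsqcurvefit','Algorithm','levenberg-marquardt');
p     = lsqcurvefit(model, p0, [x1(:),x2(:)], y(:), [], [], opts);
a = p(1);  b = p(2);  c = p(3);
```

Starting Levenberg-Marquardt from this guess, rather than an arbitrary point, is what should remove the sensitivity to the starting point described in the question.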
Without actual data to run, I can only guess what might be wrong with the iterative method you're using. However, I would recommend normalizing your data to smaller, more manageable numbers, e.g. y = y/10^10. This should have no effect other than to re-express a in different units. You might also try using FMINSPLEAS, which can take advantage of the fact that y is linear w.r.t. a.
flist={@(bc,x) exp(x*bc(:))};
[bc,a] = fminspleas(flist,[b0,c0],[x1(:),x2(:)],y(:)/1e10);
a=a*1e10;
b=bc(1);
c=bc(2);
4 Comments
commat on 5 Mar 2014
Matt J on 6 Mar 2014
I'm afraid I didn't understand that remark too well. Are you saying it's unclear why you get different results for different initial guesses? The least-squares cost function can, in general, have local minima, so naturally the result you get can be sub-optimal if you guess badly.
commat on 6 Mar 2014
Matt J on 6 Mar 2014
Yes, it should, but I still don't understand what you say you're witnessing. You think it starts to converge to the global solution, then jumps to another sub-optimal one? What is the evidence of that?