Fitting data to an equation with a complex part
Hi,
I hope you are all well.
I have experimental data for a diaphragm displacement and want to fit it to the following equation. The experimental data is displacement (H in meters) and frequency (w in rad/s).
I have working code, but the result changes a lot with the initial guess. I need help making it more robust.
x(1) is alpha (the numerator) and x(2) is the damping. omega_r is the resonant frequency and is known. Gamma should be between 0.04 and 0.07; I am interested in gamma, the damping term in the denominator.
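For reference, the equation (as reconstructed from the code below) is

$$H(\omega) = \frac{\alpha}{\omega_r^{2} - \omega^{2} + i\,2\gamma\,\omega_r\,\omega}$$

where alpha corresponds to x(1) and gamma to x(2).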
% Non-linear fit of the complex frequency-response model
fun = @(x,omega) x(1)./(omega_r^2 - omega.^2 + 1i*(2*x(2)*omega_r.*omega));   % x(1) = alpha, x(2) = gamma
x0 = [-35.9811 + 1i*23.8154, 0.06];   % initial guess: complex alpha, then gamma
opts = optimoptions(@lsqcurvefit,'Display','off','Algorithm','trust-region-reflective');
[vestimated,resnorm] = lsqcurvefit(fun,x0,omega,H,[],[],opts);   % omega, H: measured data
Looking forward to your suggestions.
Best regards,
Baris
Edit: x0 is changed.
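One possible workaround (a sketch, assuming the data vectors omega and H and the known omega_r are already in the workspace): lsqcurvefit works with real-valued parameters and residuals, so a complex entry in x0 can make the fit behave unpredictably. Splitting alpha into its real and imaginary parts and stacking the real and imaginary parts of H keeps everything real, and the known range of gamma can be enforced through bounds:
% Sketch: real-valued reformulation so lsqcurvefit sees only real parameters
% p(1) = real(alpha), p(2) = imag(alpha), p(3) = gamma
model = @(p,w) (p(1) + 1i*p(2)) ./ (omega_r^2 - w.^2 + 1i*2*p(3)*omega_r.*w);
funRI = @(p,w) [real(model(p,w)); imag(model(p,w))];   % stacked real/imag model values
Hri   = [real(H(:)); imag(H(:))];                      % stacked measured data
p0 = [-35.9811, 23.8154, 0.06];     % real(alpha), imag(alpha), gamma
lb = [-Inf, -Inf, 0.04];            % gamma expected between 0.04 and 0.07
ub = [ Inf,  Inf, 0.07];
opts = optimoptions(@lsqcurvefit,'Display','off');
[pest,resnorm] = lsqcurvefit(funRI,p0,omega(:),Hri,lb,ub,opts);
gamma_est = pest(3);
With all parameters real, the bounds on gamma also shrink the search space, which usually reduces the dependence on the initial guess.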
Accepted Answer
Alex Sha
on 11 Apr 2020
lsqcurvefit is sensitive to initial start values because it uses a local optimization algorithm, the same as cftool. MATLAB's Global Optimization Toolbox does provide GA, which uses a global search algorithm, but unfortunately the results from GA are often not as good as expected.
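If the Global Optimization Toolbox is available, an alternative to GA is MultiStart, which reruns lsqcurvefit from many start points and keeps the best local solution. A sketch, assuming the real-parameter variables funRI, Hri, p0, lb, ub defined as in the sketch in the question above:
% Sketch: multi-start local optimization (requires Global Optimization Toolbox)
problem = createOptimProblem('lsqcurvefit', ...
    'objective', funRI, 'x0', p0, 'xdata', omega(:), 'ydata', Hri, ...
    'lb', lb, 'ub', ub, ...
    'options', optimoptions(@lsqcurvefit,'Display','off'));
ms = MultiStart;
[pbest, fbest] = run(ms, problem, 50);   % try 50 start points, keep the best fit
gamma_est = pbest(3);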
J. Alex Lee
on 13 Apr 2020
I don't know that it's helpful to try to characterize the methods' sensitivities; it's really the nature of the optimization problem itself that matters. If your optimization landscape is bumpy, as Alex Sha says, methods like Levenberg-Marquardt and trust-region may only find local bumps, depending on how much control you have over the size of the search space. If you have a very smooth optimization landscape, you might be fine. On the other hand, if your optimization landscape is smooth but flat, most algorithms will have trouble because no set of parameters will look any better than the next.
This is why I recommended characterizing your specific problem to the extent possible by studying norm(res) (and its gradient) as a function of your fit parameters.
It looks like you have a 4-parameter fit (2 real parts and 2 imaginary parts). But depending on how the model works (since you only care about the real part of H), it's possible that you're over-specifying, in which case your sensitivity might come from multiple combinations of your 4 parameters producing the same real part of the model.
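One way to do that check (a sketch, assuming the real-parameter model funRI, the stacked data Hri, and a fitted parameter vector pest as in the earlier sketches): hold alpha fixed and plot the residual norm over a sweep of gamma to see whether the landscape around the solution is bumpy, flat, or well defined.
% Sketch: residual norm as a function of gamma with alpha held fixed
gammas = linspace(0.03, 0.08, 200);
rnorm  = zeros(size(gammas));
for k = 1:numel(gammas)
    p = [pest(1), pest(2), gammas(k)];        % keep the fitted real/imag alpha
    rnorm(k) = norm(funRI(p, omega(:)) - Hri);
end
plot(gammas, rnorm)
xlabel('\gamma'), ylabel('||residual||')
title('Residual norm vs. damping ratio \gamma')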
More Answers (1)
Baris Gungordu
on 17 Apr 2020
Alex Sha
on 17 Apr 2020
Hi, if everything is kept as before except the data, then the result will be:
Root of Mean Square Error (RMSE): 0.478025304038979
Sum of Squared Residual: 133.220275528809
Correlation Coef. (R): 0.994221627528188
R-Square: 0.9884766446448
Parameter        Best Estimate
-----------      ------------------
x1.realpart       828282884.340485
x1.imagpart      -136314338.186314
x2.realpart       0.681537210289169
x2.imagpart       9.3260946979766