Calculating model error
I am trying to calculate model error by calculating the model covariance matrix.
My data consists of a time series measurement, and I am fitting the data to a multi-exponential decay function:
Sum_i( M_i * exp(-t/A_i) ), where the A_i are a predetermined range of values.
The way I have been trying to go about this is by calculating the matrix, G:
for i = 1:length(A)
    for j = 1:length(t)
        G(j,i) = exp(-t(j)/A(i));
    end
end
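For reference, the same matrix can be built in one vectorized step; a minimal sketch, assuming t is a vector of measurement times, A is the vector of predetermined decay constants, and y (a name introduced here for illustration) holds the measured signal:
t = t(:);                       % force a column vector of time points
G = exp(-t * (1 ./ A(:)'));     % G(j,i) = exp(-t(j)/A(i)) via an outer product
M = G \ y(:);                   % least-squares estimate of the amplitudes M_i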
Then I calculate the covariance matrix with the variance of the data, dvar:
CovM = dvar^2*inv(G'*G);
MATLAB doesn't like me using inv here; it tells me to use:
CovM = dvar^2/(G'*G);
What I get is a CovM that is nearly singular and has a very small correlation coefficient.
What am I doing wrong here? I am pretty sure my data is not the issue, as the device I used is brand new.
1 Comment
Andrew Newell
on 10 Apr 2011
One way you can get singularity is by having too many terms in your fit. Try using just one or two first, and then add more terms.
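For example, the conditioning of the fit can be checked directly as terms are added; a minimal sketch, assuming t and A are defined as in the question:
for k = 1:length(A)
    Ak = A(1:k);                        % keep only the first k decay constants
    Gk = exp(-t(:) * (1 ./ Ak(:)'));    % design matrix with k exponential terms
    fprintf('%d terms: cond(G''*G) = %g\n', k, cond(Gk'*Gk));
end
If the condition number jumps by several orders of magnitude when a term is added, that term is the one pushing the normal equations toward singularity.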
Answers (1)
John D'Errico
on 6 Oct 2020
Even though you may THINK your data is good, the problem surely lies in your model. That it is a new device is not even relevant here. Sums of exponential models are notoriously poorly conditioned when you have too many terms. Worse, if you choose poor starting values for the parameters, you will have issues with the fit as well, again resulting in a singular matrix.
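As a rough illustration (a sketch with made-up decay constants, not your data), two exponentials with nearly equal time constants already give a badly conditioned normal-equations matrix:
t  = (0:0.1:10)';               % example time grid
G1 = exp(-t * (1 ./ [1 5]));    % well-separated decay constants
G2 = exp(-t * (1 ./ [1 1.1]));  % nearly equal decay constants
cond(G1'*G1)                    % modest condition number
cond(G2'*G2)                    % orders of magnitude larger -> nearly singular
The closer any two of your A values are, the closer the corresponding columns of G are to being linearly dependent, and the worse inv(G'*G) behaves.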
0 Comments