Error in evaluating a polynomial model as a function of its variables

Hi, I used the polyfitn function on my data with 8 independent variables and one dependent variable. I would like to use the resulting model to fill in missing values. So I ran polyfitn on the data I have, and after getting the model I ran polyvaln, but I am getting this error:
"Error using polyvaln (line 39) Size of indepvar array and this model are inconsistent."
The polynomial model looks like this:
p =
ModelTerms: [495x8 double]
Coefficients: [1x495 double]
ParameterVar: [1x495 double]
ParameterStd: [1x495 double]
DoF: 879
p: [1x495 double]
R2: 0.7664
AdjustedR2: 0.6351
RMSE: 8.0472
VarNames: {'X1' 'X2' 'X3' 'X4' 'X5' 'X6' 'X7' 'X8'}
Any help would be appreciated.

Answers (2)

the cyclist on 24 Feb 2016
Edited: the cyclist on 24 Feb 2016
Are these functions from this FEX submission?
Have you carefully read the documentation of these functions, and are you certain you are calling them correctly? John D'Errico's submissions are typically flawless.
Almost certainly, you have some kind of dimension mismatch. It looks like you should be calling polyvaln like this:
ypred = polyvaln(p,X)
where X is an N-by-8 numeric array (if I understand the syntax correctly).
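For illustration, a tiny self-contained sketch with made-up data (assuming I have the polyfitn/polyvaln syntax right); the only point is that the array passed to polyvaln must have the same number of columns, 8 here, as the data used in the fit:
X = rand(100,8);                  % 100 observations, 8 independent variables
y = sum(X,2) + 0.1*randn(100,1);  % some dependent variable
p = polyfitn(X,y,2);              % quadratic model in all 8 variables
Xnew  = rand(5,8);                % new points must also have 8 columns
ypred = polyvaln(p,Xnew);         % fine: size(Xnew,2) matches the model
% polyvaln(p,rand(5,9)) would presumably throw the "inconsistent" error above.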
May I suggest you post your code and a small sample that exhibits the problem? Otherwise, we'll just be guessing at the solution.
  4 Comments
Rita on 24 Feb 2016
Thank you so much, John. The last column of ig should be removed; "ig" has 8 independent parameters (sorry, my mistake). Now polyvaln works perfectly. But when I compare the values from polyvaln, called "ypred", with the observed dependent variable "fg", the R-squared is 0.0044. (I would like to use polyfitn to fit my data so that I can estimate missing dependent values from the independent variables.)
ypred = polyvaln(p,ig);
R = regression(ypred',fg');% =0.0661
Any advice would be greatly appreciated.
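(A side note on the numbers: regression, presumably the Neural Network Toolbox function, returns the correlation coefficient R rather than R-squared, so the 0.0661 it reports squares to roughly the 0.0044 quoted above. A small sketch of computing R-squared directly, assuming ypred and fg are column vectors of equal length:)
SSres = sum((fg - ypred).^2);
SStot = sum((fg - mean(fg)).^2);
Rsq   = 1 - SSres/SStot    % coefficient of determination
Rsq_c = corr(ypred,fg)^2;  % squared correlation, i.e. what squaring regression()'s R gives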
John D'Errico on 24 Feb 2016
Edited: John D'Errico on 24 Feb 2016
Now that I have your data, I see that this is a classically nasty sort of problem. I'm not surprised; 8 independent variables are difficult to work with.
The problem is, MANY of those coefficients in the model generated by polyfitn are worthless for prediction. So let's take a look at whether polyfitn thinks those terms are useful in the model, or whether they are just dead wood.
mdl = polyfitn(x,y,4);   % full 4th-order model in all 8 variables (495 terms)
hist(mdl.p,100)          % distribution of the per-coefficient p-values
This is a simple scheme, but terms with a low value of p can be deemed statistically unlikely to be zero, while a large value of p indicates a term that MAY be unnecessary in the model. So in fact, many of those terms are not useful as predictors. The problem is, if they are in the model, they still do SOMETHING to your predictive ability away from the actual data points.
Essentially, I think you are over-fitting the data. What happens is those useless terms now do squirrelly things to the predictor between the data points.
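Purely as an illustration of acting on that p field (the stepwise approach below is more principled), one crude option is to drop the high-p terms and refit on the surviving exponent rows; a sketch, assuming (as in the snippet further down) that the constant is the last row of ModelTerms, with 0.05 as an arbitrary cutoff:
keep       = mdl.p < 0.05;                          % logical mask over the 495 terms
keep(end)  = true;                                  % always retain the constant term
mdl_pruned = polyfitn(x,y,mdl.ModelTerms(keep,:));  % refit with only the surviving terms
mdl_pruned.R2                                       % compare against the full model's R2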
E = mdl.ModelTerms;
nt = size(E,1)
nt =
495
size(x)
ans =
1374 8
To estimate 495 coefficients from 1374 data points is pushing the limits of what can be done. A stepwise regression might help to resolve which of those terms actually have any predictive utility. Luckily, stepwise is in the Statistics Toolbox.
% Build the design matrix: one column per polynomial term. The constant term
% (the last row of E) is excluded, since stepwise fits an intercept itself.
A = zeros(1374,494);
for i = 1:(nt-1)
    A(:,i) = prod(bsxfun(@power,x,E(i,:)),2);   % evaluate term i at every data point
end
stepwise(A,y,[])
This generated a set of 7 predictors that seem to be clearly significant, with a final R^2 of roughly 0.11. Not a terribly good fit. Pushing stepwise a bit harder, by including 101 terms plus a constant, I can get the R^2 up to 0.49.
stepwise(A,y,[],.1,.2)
stats
stats =
intercept: -6.9961
rmse: 12.333
rsq: 0.49196
adjrsq: 0.45162
fstat: 12.195
pval: 5.1172e-127
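(For a scripted alternative to the stepwise GUI, stepwisefit in the same toolbox should select a similar set of terms and returns the selection directly; in1 below is the index vector of kept terms, presumably how the in1 used in the next snippet was obtained from the GUI. A sketch:)
[b,se,pv,inmodel,st] = stepwisefit(A,y,'penter',0.1,'premove',0.2,'display','off');
in1 = find(inmodel);   % column indices of A (i.e. rows of E) kept in the model
st.rmse                % in-sample RMSE of the selected linear model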
As a check to see if polyfitn agrees:
mdlterms = E([in1,495],:);   % in1: indices of the terms kept by stepwise; row 495 is the constant term
p102 = polyfitn(x,y,mdlterms)
p102 =
ModelTerms: [102x8 double]
Coefficients: [1x102 double]
ParameterVar: [1x102 double]
ParameterStd: [1x102 double]
DoF: 1272
p: [1x102 double]
R2: 0.49196
AdjustedR2: 0.45162
RMSE: 11.867
VarNames: {'' '' '' '' '' '' '' ''}
However, I cannot test this model to see how well it would do, since you did not include fg in the test2.mat file. Admittedly, I don't expect it to do terribly well.
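(For what it's worth, a quick way to gauge the over-fitting on the data I do have is a simple hold-out split; a rough sketch, assuming y is a column vector and using cvpartition from the Statistics Toolbox:)
cv     = cvpartition(size(x,1),'HoldOut',0.3);  % 70/30 train/test split
itr    = training(cv); ite = test(cv);          % logical row indices
ptrain = polyfitn(x(itr,:),y(itr),mdlterms);    % refit the 102-term model on the training rows
yhat   = polyvaln(ptrain,x(ite,:));             % predict the held-out rows
R2out  = 1 - sum((y(ite)-yhat).^2)/sum((y(ite)-mean(y(ite))).^2)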



Rita on 25 Feb 2016
Edited: Rita on 14 Apr 2016
Thank you so much, John, for your comprehensive answer. The R-squared has now increased to 0.1967. What if I removed some of the unimportant variables from the 8 independent parameters using stepwise (based on their importance), then used polyfitn and optimized the number of terms again? Thanks again for your help.
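(For concreteness, a rough sketch of that idea, reusing E, in1, x, and y from John's comment above; the order 3 is just a placeholder to be tuned:)
Esel    = E([in1,495],:);       % exponent rows of the selected terms, including the constant
usedvar = find(any(Esel>0,1))   % which of the 8 variables have a nonzero exponent anywhere
p_small = polyfitn(x(:,usedvar),y,3);   % refit on just those variables, then re-run term selection
p_small.R2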
