How to make the trend line identify x as the dependent variable
Hello
My code is:
data = [12;8;7;10;16;15;12;12;20;19;19;17;20;16];
depth=[3.225;4.725;6.225;7.725;9.225;10.725;12.225;13.725;15.225;16.725;18.225;19.725;21.225;22.725];
figure
plot(data,depth,'or')
set(gca, 'YDir', 'reverse')
ylabel('D (m)')
xlabel('N')
The resulting quadratic trendline equation is y = -0.0038x^2 + 1.21x - 3.69, where x relates to N.
However, the x values (N) are actually the dependent variable. I want the regression (least-squares method) to treat the y values (D) as the independent variable. How can I get the correct equation, of the form x = ...y^2 + ...y - ...?
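For reference, a quadratic trendline in this orientation (D as a function of N) can be computed with polyfit, which should give coefficients close to the rounded values quoted above:
p = polyfit(data, depth, 2)   % D = p(1)*N^2 + p(2)*N + p(3)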
Answers (1)
Mathieu NOE
on 7 Jun 2023
Hello
Basically, you permute the x and y data and then you get the new result.
In the equation output you then have to permute the x and y names,
so that y = 5.0288 + 1.0288*x - 0.018926*x^2 becomes x = 5.0288 + 1.0288*y - 0.018926*y^2.
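In other words, you can do the swap directly with polyfit (a minimal sketch, assuming the data and depth vectors from your post):
p = polyfit(depth, data, 2);                              % depth is now the independent variable
fprintf('x = %g + %g*y + %g*y^2\n', p(3), p(2), p(1))     % polyfit returns the highest power first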
If you want the code to produce this output directly, simply change these three lines in the function "poly_equation":
eqn = " y = "+a_hat(1);
eqn = eqn+str+a_hat(i)+"*x";
eqn = eqn+str+a_hat(i)+"*x^"+(i-1)+" ";
into
eqn = " x = "+a_hat(1);
eqn = eqn+str+a_hat(i)+"*y";
eqn = eqn+str+a_hat(i)+"*y^"+(i-1)+" ";
% what you did
% x = [12;8;7;10;16;15;12;12;20;19;19;17;20;16]; % data
% y=[3.225;4.725;6.225;7.725;9.225;10.725;12.225;13.725;15.225;16.725;18.225;19.725;21.225;22.725]; % depth
% what you want
y = [12;8;7;10;16;15;12;12;20;19;19;17;20;16];
x=[3.225;4.725;6.225;7.725;9.225;10.725;12.225;13.725;15.225;16.725;18.225;19.725;21.225;22.725];
% Fit a polynomial p of degree "degree" to the (x,y) data:
degree = 2;
p = polyfit(x,y,degree);
% Evaluate the fitted polynomial p and plot:
f = polyval(p,x);
eqn = poly_equation(flip(p)); % polynomial equation (string)
Rsquared = my_Rsquared_coeff(y,f); % correlation coefficient
figure(1);plot(x,y,'*',x,f,'-')
legend('data',eqn)
title(['Data fit , R² = ' num2str(Rsquared)]);
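% Example use of the reversed fit: predict N at a given depth
% (the query depth of 10 m below is just an illustrative value)
d_query = 10;                     % depth in meters
N_pred = polyval(p, d_query)      % predicted N at that depth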
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Rsquared = my_Rsquared_coeff(data,data_fit)
% R^2 (coefficient of determination) computation
% The total sum of squares
sum_of_squares = sum((data-mean(data)).^2);
% The sum of squares of residuals, also called the residual sum of squares:
sum_of_squares_of_residuals = sum((data-data_fit).^2);
% the coefficient of determination is then
Rsquared = 1 - sum_of_squares_of_residuals/sum_of_squares;
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function eqn = poly_equation(a_hat)
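% builds the equation string from coefficients given in ascending power order
% (a_hat(1) is the constant term, hence the flip(p) at the call site above)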
eqn = " y = "+a_hat(1);
for i = 2:(length(a_hat))
if sign(a_hat(i))>0
str = " + ";
else
str = " ";
end
if i == 2
% eqn = eqn+" + "+a_hat(i)+"*x";
eqn = eqn+str+a_hat(i)+"*x";
else
% eqn = eqn+" + "+a_hat(i)+"*x^"+(i-1)+" ";
eqn = eqn+str+a_hat(i)+"*x^"+(i-1)+" ";
end
end
eqn = eqn+" ";
end