How to set up an optimization problem to minimize the sum of squared residuals using the Genetic Algorithm?

Hello, my name is Victor Assis and I am a student from Brazil. I have been working hard on a problem that I could not quite solve by myself, and I need your help. This is the deal:
I have to fit an equation to a data set, but the equation has both linear and nonlinear parameters. My idea is simply to use OLS, with a slight difference: I want to anchor the values of my linear parameters to the values of the nonlinear ones. To do this I set up the classical OLS problem and computed the partial derivatives with respect to the linear parameters. Once I had those, I used them as a constraint (in practice I just substituted the resulting expressions back into the sum of squared residuals).
The problem I am facing is that I don't understand how to set this up in MATLAB. When I try to write an objective function, I don't understand how to define which inputs are the parameters and which are the data set I am going to use.
I don't know if I made myself clear, but I will write out the equation I am trying to fit to my data set here just in case:
y = A + B*(tc-t)^z + C*(tc-t)^z*cos(w*log(tc-t) + phi)
Linear parameters that will be anchored: A, B, C
Parameters estimated by the genetic algorithm: tc, z, w, phi
Data set used: y, t (both Nx1 column vectors)
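To make what I mean by anchoring concrete, here is a rough sketch (the trial values for tc, z, w and phi below are made up, not my real estimates):
% rough sketch of the anchoring step -- the trial nonlinear values are made up
tc = max(t) + 1;  z = 0.5;  w = 6;  phi = 1;   % trial values for the nonlinear parameters
f1 = (tc - t).^z;                              % regressor multiplying B
f2 = (tc - t).^z .* cos(w*log(tc - t) + phi);  % regressor multiplying C
X   = [ones(size(t))  f1  f2];                 % design matrix for [A; B; C]
abc = X \ y;                                   % anchored (least-squares) values of A, B, C
SS  = sum((y - X*abc).^2);                     % sum of squared residuals for these trial values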
I appreciate your help. Thank you very much.

Accepted Answer

Star Strider
Star Strider on 3 May 2014
To parameterise your function for use with the MATLAB regression functions, write it as:
% b(1) = tc, b(2) = z, b(3) = w, b(4) = phi
yfit = @(b,t,A,B,C) A + B.*(b(1)-t).^b(2) + (C.*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
or, if you put A, B, and C inside the function rather than passing them as arguments, the function becomes:
yfit = @(b,t) A + B.*(b(1)-t).^b(2) + (C.*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
The nonlinear regression functions (nlinfit, lsqcurvefit) will compute the Jacobian for you. If you use a genetic algorithm, you will not need the Jacobian, because it does not use one.
See the documentation for the various functions for details on using them. You may have to experiment with different choices of initial parameter estimates in order for your regression to converge.
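As a rough sketch of how such a call might look with the second form of yfit above (the starting guess b0 is arbitrary and would have to be chosen for your data):
% rough sketch -- t, y are your data, yfit is the second anonymous function above
b0 = [max(t)+1; 0.5; 6; 0];          % arbitrary starting guesses for [tc z w phi]
b  = lsqcurvefit(yfit, b0, t, y);    % Optimization Toolbox
% b = nlinfit(t, y, yfit, b0);       % alternative, Statistics Toolbox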
  8 comments
Victor Assis
Victor Assis on 4 May 2014
I do have the Global Optimization Toolbox for MATLAB, and I am trying to use it for the genetic algorithm.
If I use the SS function as the objective, together with the second yfit function you wrote in your previous comment, how am I going to anchor the A, B, C parameters to the estimated values of b (the vector of nonlinear parameters)?
Thank you again, Star Strider, you are actually saving my life.
Star Strider
Star Strider on 4 May 2014
Edited: Star Strider on 4 May 2014
Good point!
% b(1) = tc, b(2) = z, b(3) = w, b(4) = phi, b(5) = A, b(6) = B, b(7) = C
yfit = @(b,t) b(5) + b(6).*(b(1)-t).^b(2) + (b(7).*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
That should work. The SS function doesn’t change.
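As a minimal sketch of how the ga call could then look (the bounds below are placeholders only, and tc has to exceed max(t) so that log(tc-t) is defined for every data point):
% minimal sketch -- the bounds are placeholders, not recommendations
SS = @(b) sum((y - yfit(b,t)).^2);                 % sum of squared residuals
lb = [max(t)+0.01  -2   0  -pi  -10  -10  -10];    % lower bounds for [tc z w phi A B C]
ub = [max(t)+10     2  20   pi   10   10   10];    % upper bounds
bhat = ga(SS, 7, [], [], [], [], lb, ub);          % Global Optimization Toolbox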
My pleasure! I’m glad I could help.


More Answers (1)

Victor Assis
Victor Assis on 4 May 2014
Thank you very much, Star Strider. If there is anything you might need help with in the future, I will be glad to return the favor!
