Error in lsqnonlin() - "Failure in initial objective function evaluation."
I'm building a script to run a non-linear least squares estimation. After prepping my data, I was able to generate the correct results into an array with the following loop:
for x = 1:1000
fun = @(x)((leis_psi_minus(x)*(mu*cons_phi(x)+(1-mu)*poll_negphi(x)))/(cons_phi_minus(x)*(rho*leis_psi(x)+(1-rho)*poll_negpsi(x))));
w2(x) = fun(x);
end
where leis_psi_minus, cons_phi, poll_negphi, cons_phi_minus, leis_psi, poll_negpsi are all arrays of data. But when I try to do lsqnonlin(fun,x0), I get:
Error in lsqnonlin (line 196) initVals.F = feval(funfcn{3},xCurrent,varargin{:});
Caused by: Failure in initial objective function evaluation. LSQNONLIN cannot continue.
I haven't used lsqnonlin before, so I'm trying to research if I need to adjust options, but any insight or advice would be greatly appreciated!
Answers (1)
Star Strider
on 13 Aug 2016
For the optimisation functions, the argument of the objective function is the vector of parameters you want to optimise. (Your anonymous function objective function will get your data vectors from your workspace. It is not necessary to include them as arguments.)
What function did you start with, and what do you want to do with it?
10 comments
Philip Newell
on 13 Aug 2016
Star Strider
on 13 Aug 2016
I’m not following.
My observation is that ‘x’ in your function should be the vector of the parameters you want to optimise. You’re treating your data arrays as functions (it seems to me at least), and that’s confusing MATLAB.
MATLAB wants something like this:
t = ... ; % Vector Of Independent Variables
y = ... ; % Vector Of Dependent Variables
f = @(x) y(:) - (x(1).*t(:) + x(2)); % Residual vector (lsqnonlin squares and sums it internally)
This is just an example and is not what you’re doing. It illustrates what I am doing my best to communicate.
Philip Newell
on 14 Aug 2016
Edited: Star Strider
on 14 Aug 2016
Star Strider
on 14 Aug 2016
My pleasure!
You need to completely vectorise your function, using element-wise operators for all multiplication and division (and exponentiation, although I don’t see any of those in your code), unless you intend to do matrix operations:
fun = @(x) (leis_psi_minus(:).*(x(1).*cons_phi(:) + (1-x(1)).*poll_negphi(:))) ./ ...
           (cons_phi_minus(:).*(x(2).*leis_psi(:) + (1-x(2)).*poll_negpsi(:)));
See if using that version improves your result.
That should do what you want. If it doesn’t, I’ll help as much as I can to get your code to work.
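Putting the pieces together, a minimal sketch of the full fit might look like the following. It assumes the data vectors already exist in the workspace, that `w2` holds the observed values you want the model to match (so the objective returns a residual vector, model minus data), and that the two parameters `mu` and `rho` are shares bounded between 0 and 1 — adjust the bounds if that assumption is wrong:

```matlab
% Hedged sketch: estimate mu = x(1) and rho = x(2) with lsqnonlin.
% w2 is assumed to be the vector of observed values being fitted.
fun = @(x) (leis_psi_minus(:).*(x(1).*cons_phi(:) + (1-x(1)).*poll_negphi(:))) ./ ...
           (cons_phi_minus(:).*(x(2).*leis_psi(:) + (1-x(2)).*poll_negpsi(:))) - w2(:);

x0 = [0.5; 0.5];            % initial guesses for mu and rho
lb = [0; 0];  ub = [1; 1];  % assumed: both parameters are shares in [0,1]

[x_est, resnorm] = lsqnonlin(fun, x0, lb, ub);
```

Note the `- w2(:)` term: lsqnonlin minimises the sum of squares of whatever the objective returns, so the objective should return residuals, not the model values alone.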
Philip Newell
on 14 Aug 2016
Edited: Philip Newell
on 14 Aug 2016
Star Strider
on 14 Aug 2016
You need a vector of two values for ‘x0’, since you have two parameters.
Other than that, run your function with the initial value of ‘x0’:
x0 = [0.5; 0.5];
test_call = fun(x0)
and see what it returns.
Philip Newell
on 14 Aug 2016
Edited: Philip Newell
on 14 Aug 2016
Star Strider
on 14 Aug 2016
Possibly.
I’m flying blind here, so you will have to experiment to get your function to work with lsqnonlin.
I’m out of ideas.
Philip Newell
on 15 Aug 2016
Star Strider
on 16 Aug 2016
Delete the observations with NaN values. That’s what the Statistics and Machine Learning Toolbox functions do.
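Deleting NaN observations can be sketched as follows, assuming all six data vectors (and `w2`) have the same length, with one observation per row:

```matlab
% Sketch: keep only observations where no data vector has a NaN.
data = [leis_psi_minus(:), cons_phi(:), poll_negphi(:), ...
        cons_phi_minus(:), leis_psi(:), poll_negpsi(:), w2(:)];
keep = ~any(isnan(data), 2);   % logical index of rows with no NaN
data = data(keep, :);          % complete observations only
```

Deleting whole rows (rather than individual values) keeps the vectors aligned, which the element-wise objective function requires.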
Providing your own Jacobian can speed the regression considerably for large data sets. I don't know whether it improves the ability of the regression to find the global minimum when there are many local minima. The Jacobian is not supplied as a separate function: you return it as a second output from your objective function, as described in the lsqnonlin documentation for fun.
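A hedged sketch of returning the Jacobian from the objective function, using abbreviated names for the data vectors from the question (`lpm` for `leis_psi_minus`, and so on) and assuming `w2` holds the observed values:

```matlab
% Sketch: objective returning residuals F and, on request, Jacobian J.
function [F, J] = objfcn(x, w2, lpm, cp, pnp, cpm, lp, pnps)
num = lpm .* (x(1).*cp + (1 - x(1)).*pnp);   % numerator of the model
den = cpm .* (x(2).*lp + (1 - x(2)).*pnps);  % denominator of the model
F = num./den - w2;                           % residual vector
if nargout > 1                               % Jacobian only when requested
    J = [ lpm.*(cp - pnp)./den, ...          % dF/dx(1)
         -num.*cpm.*(lp - pnps)./den.^2 ];   % dF/dx(2), by the quotient rule
end
end
```

To have lsqnonlin use it, set `optimoptions('lsqnonlin','SpecifyObjectiveGradient',true)` (in releases around this thread's date, the equivalent option name was `'Jacobian','on'`) and pass the options as the fifth argument.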
You can only fit one objective function at a time, but you can fit as many models (objective functions) as you want. There are ways to compare two models to see which is the best fit, but that requires that both models describe the actual process that created your data with reasonable accuracy. (When I did that, I used the likelihood ratio test to determine if there was a significant difference between them. You have to determine if that is appropriate for your application.) Comparing multiple models (more than two) requires that you take the statistics of multiple comparisons into consideration. This varies with the test you’re doing, so I’ll leave that to you to research.