
Neural network for curve fitting (estimating function parameters)

Jakub Lukes on 6 Feb 2023
Hi, I have a question regarding Neural network fitting.
I have curves where I know the functional form but not the parameters (say, for example, y = a*x^2 + b*x + c), and the data contain some noise.
I need to estimate the parameters a, b, c. I currently do this by minimizing the squared error with fminsearch, but it sometimes has trouble fitting the curve properly (my real data are quite noisy and the curve has 6 parameters). So I wanted to try a different approach, and neural fitting looks quite promising. I have never done anything with neural networks, though, so I wanted to ask whether one could be used to estimate the parameters of a curve (I have a lot of simulated data where I know the parameters, which could be used for training). Also, in the Neural Net Fitting app, should I use the values of y as input and the values of a, b, c as targets?
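For concreteness, the approach described above amounts to something like the following sketch (Python/SciPy rather than MATLAB, with made-up synthetic data; Nelder-Mead is the same derivative-free simplex method that fminsearch uses):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# True parameters and noisy synthetic data for y = a*x^2 + b*x + c
a_true, b_true, c_true = 2.0, -1.0, 0.5
x = np.linspace(-3, 3, 100)
y = a_true * x**2 + b_true * x + c_true + 0.2 * rng.standard_normal(x.size)

# Sum-of-squared-errors objective, minimized with Nelder-Mead
# (the simplex method behind MATLAB's fminsearch)
def sse(p):
    a, b, c = p
    return np.sum((a * x**2 + b * x + c - y) ** 2)

res = minimize(sse, x0=[1.0, 0.0, 0.0], method="Nelder-Mead")
print(res.x)  # estimates close to (2.0, -1.0, 0.5)
```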
Thank you

Answers (1)

Bjorn Gustavsson on 6 Feb 2023
In any type of parameter-fitting problem you face the curse of dimensionality in one form or another, because the number of dimensions in your search space increases with the number of parameters. Evaluating your error function for all combinations of three values per parameter takes 3 evaluations for 1 parameter, 9 for 2, and 729 for 6. In addition, the error function typically has a more complex shape to search through, often with different regions of attraction to different local minima. The search therefore becomes increasingly difficult the more parameters you fit. One additional stumbling block is a redundant (or nearly redundant) parameter, for example a model function of the form y = a*x^2 + (b + c)*x + d (only for illustration purposes), where the coefficient of the linear term contains a redundant parameter; this often makes the optimization waste time trying in vain to find optimal values for those two parameters.
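To make the redundancy concrete, here is a small sketch (Python, hypothetical numbers) of a model whose linear coefficient is a sum b + c: two different parameter vectors with the same sum produce exactly the same curve, so the error function has a flat valley and no unique minimizer:

```python
import numpy as np

x = np.linspace(-3, 3, 50)

# Redundant parameterization: only the sum b + c is identifiable
def model(a, b, c, d):
    return a * x**2 + (b + c) * x + d

# Same b + c, very different (b, c) -- identical curves, identical residuals
y1 = model(2.0, 3.0, -1.0, 0.5)   # b + c = 2
y2 = model(2.0, 10.0, -8.0, 0.5)  # b + c = 2
print(np.allclose(y1, y2))  # True
```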
Trying to solve these types of issues by turning to a neural network looks to me like swapping the black-box machinery designed for exactly these problems for an even more general black box, which seems the wrong way to go.
My suggestion is instead that you take a more considered approach to the fitting:
1. Make sure your parameters are truly non-redundant (this is a pitfall that stumped me a couple of times before I realized I couldn't be that lazy).
2. Try to constrain the search space: positivity constraints, ranges for any of the parameters; anything that reduces the parameter space should help. If you don't have the Optimization Toolbox you can still use fminsearchbnd or minimize from the File Exchange to do constrained optimization.
3. Try lsqnonlin instead of fminsearch; it is often more efficient. The only change needed is to convert your error function from returning the sum of squared residuals to a residual function returning the individual residuals.
4. Definitely try multi-start optimization, to avoid getting trapped in a local minimum.
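Points 3 and 4 can be sketched as follows (Python/SciPy for illustration; least_squares plays the role of lsqnonlin here, and the data, bounds, and number of restarts are made up):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Noisy synthetic data for y = a*x^2 + b*x + c
x = np.linspace(-2, 2, 80)
y = 1.5 * x**2 - 0.7 * x + 0.3 + 0.1 * rng.standard_normal(x.size)

# Point 3: return the *vector* of residuals, not their squared sum
def residuals(p):
    a, b, c = p
    return a * x**2 + b * x + c - y

# Point 4: multi-start -- run the local solver from several random
# starting points (within bounds, cf. point 2) and keep the best fit
best = None
for _ in range(10):
    p0 = rng.uniform(-5, 5, size=3)
    fit = least_squares(residuals, p0,
                        bounds=([-10, -10, -10], [10, 10, 10]))
    if best is None or fit.cost < best.cost:
        best = fit
print(best.x)  # close to (1.5, -0.7, 0.3)
```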
HTH
  2 comments
Jakub Lukes on 6 Feb 2023
Ok, thank you for your suggestion.
As for point 1) I am pretty sure that none of the parameters I am trying to fit is redundant.
2) I already do this to some extent, but I will look at it more deeply.
3) Ok, I will try it, maybe it will give better results.
4) I have already implemented multi-start optimization.
Bjorn Gustavsson on 7 Feb 2023
On point 1: sometimes I've had problems where one or a few parameters were nearly redundant, or only marginally possible to estimate. For example, when fitting a two-sided Gaussian with different widths, the two widths and x0 can make the difference in widths close to redundant in some cases. For such cases I've had success with stepwise sub-space fitting: first fit an ordinary Gaussian, then use those parameters as the initial guess for the two-sided Gaussian fit, which can then be more sensibly constrained. Maybe this situation applies to your case.
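The stepwise sub-space idea can be sketched like this (Python/SciPy for illustration; the particular two-sided Gaussian parameterization and the numbers are assumptions, not from the thread):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)

# Noisy data from a two-sided Gaussian: width wl left of x0, wr right of x0
def two_sided_gauss(p, x):
    A, x0, wl, wr = p
    w = np.where(x < x0, wl, wr)
    return A * np.exp(-((x - x0) / w) ** 2)

x = np.linspace(-5, 5, 200)
y = two_sided_gauss([1.0, 0.3, 0.8, 1.4], x) + 0.03 * rng.standard_normal(x.size)

# Step 1: fit an ordinary (symmetric) Gaussian in the smaller sub-space
def sym_resid(p):
    A, x0, w = p
    return A * np.exp(-((x - x0) / w) ** 2) - y

sym = least_squares(sym_resid, [0.5, 0.0, 1.0]).x

# Step 2: use the symmetric fit as the initial guess for the two-sided fit,
# starting both widths at the fitted symmetric width
full = least_squares(lambda p: two_sided_gauss(p, x) - y,
                     [sym[0], sym[1], sym[2], sym[2]])
print(full.x)  # close to (1.0, 0.3, 0.8, 1.4)
```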


Category: Statistics and Machine Learning Toolbox
Release: R2020b
