
Curve Fitting Techniques

4 views (last 30 days)
Clement Wong on 26 Jul 2011
Hello everyone,
I have a project I'm working on which requires that I search a 3-parameter space for a best-fit curve. Unfortunately, the curve cannot be described by an explicit function. To generate the best fit, the process I have been using involves varying the 3 parameters, generating a test curve from the parameters, subtracting my experimental data, and then computing an RMS value and searching for its lowest value.
I'm wondering if there is a better way to do this, since my current method is a "brute force" method, where I search large sections of parameter space. This ends up taking hours to finish solving (reaching a stable minimum for RMS). For example, I know there is a built-in least-squares fit in MATLAB, but it requires that you provide a function with a Jacobian. Is there any similar process for non-explicit functions?

Accepted Answer

Bjorn Gustavsson on 26 Jul 2011
fminsearch (and functions that use fminsearch, such as John d'Errico's fminsearchbnd and others on the File Exchange) does not need explicit derivatives. As long as you can calculate the curve from your parameters, you should be able to run a least-squares-type minimization with those tools.
HTH
  2 comments
Clement Wong on 26 Jul 2011
I'm not sure that will work either. I'm not looking for the minimum of my function, I'm looking for a best fit. A crude example of what I'm trying to do is fitting A*sin(w*t+c*x), where A, w, and c could all change. I have empirical data, and I'm trying to find the best values for the 3 parameters to fit my function.
In a little more detail, my function can only be arrived at by solving a system of 12 equations. However, the final form cannot be explicitly stated, so, unlike the above example, I would not have a "sin(x)" function. When solving the system, I need to plug in test values for each of the parameters, solve, and then compare the resulting values to each value in my empirical data set. Then I find the RMS of the comparison.
Bjorn Gustavsson on 27 Jul 2011
Sure you're looking for a minimum: the minimum of your _final_ function in the fitting of f(p,x,t) = p(1)*sin(p(2)*t+p(3)*x) to your empirical data (Y). What you do then is (in very mixed notation!):
min_p sum((f(p,x,t)-Y).^2)
That is a minimization. In MATLAB this is easily done:
p_optimal = fminsearch(@(p) sum((f_of_12eqs(p,x,t)-Y).^2),p0)
If you can automate the solving of your 12 equations, and those solutions are continuous and piecewise smooth in the parameters p, then it is possible to make that into a function that will give you Ymodel as a function of p, given x and t - and you're good to go.
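Putting the pieces of this recipe together, a minimal runnable sketch might look like the following. It uses the thread's crude A*sin(w*t+c*x) example as a stand-in for the real model; in practice `modelfun` would be replaced by a function that solves the 12-equation system for a given p (the names `modelfun`, `rss`, and the data below are illustrative, not from the original posts):

```matlab
% Synthetic "empirical" data for the example model f(p,x,t) = p(1)*sin(p(2)*t + p(3)*x)
t = linspace(0, 10, 200);                 % sample times
x = 1;                                    % fixed spatial coordinate for this sketch
ptrue = [2, 1.5, 0.7];                    % parameters used to generate the data
Y = ptrue(1)*sin(ptrue(2)*t + ptrue(3)*x) + 0.05*randn(size(t));

% Stand-in for the real model: swap in the solver for your 12-equation system here.
modelfun = @(p) p(1)*sin(p(2)*t + p(3)*x);

% Objective: sum of squared residuals between model and data.
rss = @(p) sum((modelfun(p) - Y).^2);

p0 = [1, 1, 1];                           % initial guess
p_optimal = fminsearch(rss, p0);          % derivative-free minimization
```

Since fminsearch is a local (Nelder-Mead) method, a reasonable initial guess p0 matters; running it from a few different starting points is a cheap guard against landing in a poor local minimum.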


More Answers (1)

Caleb Downard on 8 May 2020
Super late on this, but the regress function could work. It performs regression analysis on data sets, so it's a good way to find fitting constants. I'm not sure how it would work on trig functions because I'm not a math guy. I use it for analyzing data sets with unknown fitting constants. I just used it to fit a function of the form y = m1*x1 + m2*x2 + c, where x1 and x2 were two different arrays of data.
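For the linear form mentioned here, a short sketch of how regress (from the Statistics and Machine Learning Toolbox) could be set up; the data and coefficient values below are made up for illustration:

```matlab
% Example data for y = m1*x1 + m2*x2 + c
x1 = (1:50)';
x2 = rand(50, 1);
y  = 3*x1 - 2*x2 + 5 + 0.1*randn(50, 1);

% Design matrix: one column per predictor, plus a column of ones for the intercept c.
X = [x1, x2, ones(50, 1)];

b = regress(y, X);   % b(1) ~ m1, b(2) ~ m2, b(3) ~ c
```

Note that regress only handles models that are linear in the coefficients, which is why it fits this example but not the A*sin(w*t+c*x) problem from the original question.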

Categories

Learn more about Fit Postprocessing in Help Center and File Exchange

