I have a database P with columns X, Y and Z (file above):
N = 150; % Number Of Points Desired
xv = linspace(min(P(:,1)), max(P(:,1)), N);
yv = linspace(min(P(:,2)), max(P(:,2)), N);
[X,Y] = ndgrid(xv, yv);
Z = griddata(P(:,1), P(:,2), P(:,3), X, Y);
contourf(X, Y, Z, 35)
Zi = interp2(P(:,1), P(:,2), P(:,3), X, Y) ;
With the code above, I get the following plot:
How to make contour lines neater and smoother?

Accepted Answer

Star Strider on 20 Oct 2022
Edited: Star Strider on 20 Oct 2022
I dislike going to external sites.
Increasing ‘N’ would be the option I would use, initially experimenting with 250 and perhaps 500 or greater, depending on the available memory and the result provided. That should increase the resolution of the vectors, and therefore the resolution of the matrices derived from them.
EDIT — (20 Oct 2022 at 15:45)
Using the supplied data —
LD = load(websave('P','https://www.mathworks.com/matlabcentral/answers/uploaded_files/1163343/P.mat'));
P = LD.P;
N = 500; % Number Of Points Desired
xv = linspace(min(P(:,1)), max(P(:,1)), N);
yv = linspace(min(P(:,2)), max(P(:,2)), N);
[X,Y] = ndgrid(xv, yv);
Z = griddata(P(:,1), P(:,2), P(:,3), X, Y);
Warning: Duplicate data points have been detected and removed - corresponding values have been averaged.
figure
contourf(X, Y, Z, 35)
I looked at the first column of ‘P’ to see if reshaping would work. It will not, because the first indices of the different unique values of ‘P(:,1)’ do not have constant distances between them.
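That claim can be verified with a small check (hypothetical, not from the original answer), reusing `P` from the code above — reshaping only works if each unique value of `P(:,1)` begins at a constant stride:

```matlab
% Hypothetical check, assuming P is loaded as in the code above:
% reshape requires the first-occurrence indices of the unique
% values of P(:,1) to be evenly spaced (a regular grid layout)
[~, firstIdx] = unique(P(:,1), 'first');   % first index of each unique x value
strides = diff(sort(firstIdx));            % gaps between those indices
canReshape = numel(unique(strides)) == 1   % true only if all gaps are equal
```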
The data simply do not appear to be smooth (i.e. may have limited precision), and that is reflected in the contours even with a relatively high precision in the interpolating matrices.
This is likely as good as it gets.

18 comments

Andrew Sol on 20 Oct 2022
  1. "I dislike going to external sites. " - I am attaching a file with data to insert into your version of Matlab
  2. "Increasing ‘N’ would be the option I would use, initially experimenting with 250 and perhaps 500 or greater, depending on the available memory and the result provided. That should increase the resolution of the vectors, and therefore the resolution of the matrices derived from them. " - I increased N to 2500. This did not give a significant result (in fact, it stopped changing significantly after N = 1500).
Andrew Sol on 20 Oct 2022
Edited: Andrew Sol on 20 Oct 2022
I have read your addendum and agree with your points. Yes, reshaping is difficult to apply here due to the irregularity of the first column. Perhaps the data could be "smoothed" a little, column by column, using some kind of filter?
I doubt that any sort of smoothing would help, specifically because the data are not consistently sampled: they are neither regularly spaced nor sampled in any consistent manner. It would be difficult to use any sort of smoothing function (such as the Savitzky-Golay filter) with them for those reasons.
If they are measured data, sampling them with greater precision might be worthwhile in a subsequent experiment. I have no idea what the data represent, so I have no idea how to create them with greater precision.
The default interpolation method is 'linear' so experimenting with the 'natural' and 'cubic' methods —
LD = load(websave('P','https://www.mathworks.com/matlabcentral/answers/uploaded_files/1163343/P.mat'));
P = LD.P;
N = 750; % Number Of Points Desired
xv = linspace(min(P(:,1)), max(P(:,1)), N);
yv = linspace(min(P(:,2)), max(P(:,2)), N);
[X,Y] = ndgrid(xv, yv);
Z = griddata(P(:,1), P(:,2), P(:,3), X, Y, 'natural');
Warning: Duplicate data points have been detected and removed - corresponding values have been averaged.
figure
contourf(X, Y, Z, 35)
title('''natural'' Method')
Z = griddata(P(:,1), P(:,2), P(:,3), X, Y, 'cubic');
Warning: Duplicate x-y data points detected: using average values for duplicate points.
figure
contourf(X, Y, Z, 35)
title('''cubic'' Method')
Even increasing ‘N’ further and using different ‘method’ options fails to make any meaningful changes in the result.
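For repeated interpolation over the same scattered set, scatteredInterpolant (the documented alternative to griddata for repeated queries) offers the same methods; a sketch reusing P, X, and Y from the code above:

```matlab
% Sketch with scatteredInterpolant, assuming P, X, Y from the code above;
% 'natural' interpolation, no extrapolation outside the convex hull
F = scatteredInterpolant(P(:,1), P(:,2), P(:,3), 'natural', 'none');
Z = F(X, Y);                 % F can be re-evaluated on new grids cheaply
figure
contourf(X, Y, Z, 35)
```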
1. The data were collected as follows: an inequality is given, and the space of its solutions is filled with uniformly distributed parameter values for which the inequality is satisfied.
2. The outline is necessary because the boundaries of this contour will be used in tasks of orienting the robot in a complex environment.
3. I decided to try the smoothdata command with different methods. The Savitzky-Golay filter turned out to be the fastest and best. But I filtered only the first two columns of data and did not touch the 3rd one. It seemed to me that this might be enough. But the plot is really quite strange :)))
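What that comment describes might look like the following sketch (the 'sgolay' window length here is a guess, not a value from the thread):

```matlab
% Sketch of the smoothing described above, assuming P from the earlier code;
% the window length 25 is a guess and would need tuning
P(:,1) = smoothdata(P(:,1), 'sgolay', 25);   % smooth the x column
P(:,2) = smoothdata(P(:,2), 'sgolay', 25);   % smooth the y column
% P(:,3) is left untouched, as the comment describes
```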
Star Strider on 20 Oct 2022
For the reasons I mentioned, I really do not recommend using any smoothing functions on these data, simply because the data are not regular in any sense of that term. (I also have no idea what the columns represent.)
I am not certain how you calculated the data, so I cannot suggest any better ways of dealing with it. I also do not understand ‘uniformly distributed parameter values’. If they are regularly spaced over a specific range (for example using linspace to create them), then the data might be more regular, and the precision of the results could be increased to produce a smoother result. If they are the result of rand or similar functions, this could be the reason for the irregularities. The contours appear essentially to be symmetric about zero on the y-axis.
Andrew Sol on 20 Oct 2022
I will try to increase the number of points and see how it affects the smoothness of the data.
Star Strider on 20 Oct 2022
That, and increasing the precision of the result (although control over that may be limited).
I have no idea what you’re doing, so my ability to help with this is limited.
Andrew Sol on 21 Oct 2022
I seem to have found the reason why the contour lines are "noisy". Once again, I'll describe the setup. There is some space in which the robot moves. The range of position change along the x-coordinate = [-4;4], and the range of position change along the y-coordinate = [-5;5]. We generate a huge number of uniformly distributed points from these ranges and see if the robot is in or out of the valid area. An invalid position is marked on the contour plot as a "white" area. So that's where the "noise" comes from. When the value of the acceptable-location criterion was calculated, the ceil command was applied to this value. That's it. I don't know how I missed it, but the fact remains. When I removed it (and increased the number of contour lines), the graph began to look like this:
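A tiny illustration (not from the thread) of why ceil produced the noise: it quantises a smooth criterion onto integer levels, so contour boundaries jump in steps rather than varying continuously:

```matlab
% ceil collapses a smooth criterion onto discrete integer levels
x = linspace(0, 2, 9);
smoothVal = x.^2;          % 0 0.0625 0.25 ... 4 (continuous values)
steppedVal = ceil(x.^2)    % 0 1 1 1 1 2 3 4 4 (staircase values)
```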
But some problems still remain. They stem from the fact that you need a lot of points to make a nice graph, and collecting them by hand is extremely difficult. Here is a graph with a much smaller (by several orders of magnitude) number of measurements.
It would be nice to somehow "smooth" it. Eliminate "creases" on the borders and fill in the "voids" within it. Perhaps surface approximation will help here?
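One hypothetical way to try the surface approximation mentioned above (requires the Curve Fitting Toolbox; reuses P, X, Y from the earlier code):

```matlab
% Hypothetical LOWESS surface fit (Curve Fitting Toolbox), assuming
% P, X, Y from the earlier code; locally weighted regression smooths
% creases and interpolates across small voids inside the data
sf = fit([P(:,1), P(:,2)], P(:,3), 'lowess');
Zs = sf(X, Y);               % evaluate the fitted surface on the grid
figure
contourf(X, Y, Zs, 35)
```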
Andrew Sol on 21 Oct 2022
Edited: Andrew Sol on 21 Oct 2022
With the help of this command, with a small number of measurements, it was possible to obtain a more or less smooth graph. But I did not understand how to set the line thickness in this command.
If you use
[cout, h] = contourspline(appropriate parameters);
then
set(h, 'LineWidth', THICKNESS);
Andrew Sol on 21 Oct 2022
@Walter Roberson Yes, it works
Star Strider on 21 Oct 2022
The ceil function will certainly limit the precision of the results, and limited precision is the problem I mentioned earlier. I had no idea you were using it.
I didn’t mention smoothing the contour lines (although that’s certainly an option), because that will not solve the underlying problem.
Andrew Sol on 21 Oct 2022
Yes, deleting this function call certainly improved the smoothness of the graphs. At first, I used it to limit the number of digits of a number and free up a small amount of memory for other modules, because I used a computer with limited resources. Now this is not necessary. How should I proceed?
I doubt that using ceil would change the memory requirements —
v1 = pi;
v2 = ceil(pi);
whos v1 v2
  Name      Size            Bytes  Class     Attributes

  v1        1x1                 8  double
  v2        1x1                 8  double
If you want to use less memory, the single precision option (or even integers, such as int8) is possible —
v1 = single(pi);
v2 = int16(pi);
v3 = int8(pi);
whos v1 v2 v3
  Name      Size            Bytes  Class     Attributes

  v1        1x1                 4  single
  v2        1x1                 2  int16
  v3        1x1                 1  int8
however, many MATLAB functions require double precision, so this could involve converting from single (or integer) to double and back again as necessary, and likely also multiplying and dividing by powers of 10 to retain at least some decimal precision in the integers, considerably slowing the code.
Since I have no idea what your code does, I can’t suggest specific methods to make it more efficient or less memory intensive.
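The round trip described above might look like this sketch (the scale factor is illustrative, not from the thread):

```matlab
% Sketch of the store-compactly / compute-in-double round trip;
% the scale factor 100 (two decimal places) is illustrative
scale = 100;
stored = int16(round(pi * scale));    % compact storage: 314
recovered = double(stored) / scale;   % back in double: 3.1400
```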
In any event, keeping as much precision as possible in the interim calculations will create a better final result.
Andrew Sol on 21 Oct 2022
"In any event, keeping as much precision as possible in the interim calculations will create a better final result. " this wording is fine with me.
Star Strider on 21 Oct 2022
My pleasure!
If my Answer helped you solve your problem, please Accept it!
Andrew Sol on 21 Oct 2022
And thank you for not passing by!
Star Strider on 21 Oct 2022
As always, my pleasure!
This was an interesting problem!


More Answers (0)

Version

R2019a
