Match the coordinate systems of "triangulate" and "reconstructScene" with "disparitySGM"

Hello,
I have an image pair of a grid-ruled sheet of paper and the corresponding stereoParams, obtained from checkerboard calibration with the Stereo Camera Calibrator using default settings. I apply the following two processing pipelines to the image pair (a minimal sketch of both is given below the list):
Processing A: rectifyStereoImages (with stereoParams), disparitySGM, reconstructScene (with stereoParams)
Processing B: undistortImage on each image individually (with stereoParams.CameraParameters1 or CameraParameters2), detection of the line intersections in both images, then triangulate (with stereoParams) to get the 3-D locations of the intersections
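In code, the two pipelines look roughly like this (a sketch; I1/I2 are the raw images, matchedPoints1/matchedPoints2 are placeholders for the detected intersections, and the detection itself is application-specific and omitted):

% Processing A: dense point cloud (rectify -> disparity -> 3-D points)
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
disparityMap = disparitySGM(im2gray(J1), im2gray(J2));
xyzPoints = reconstructScene(disparityMap, stereoParams);   % H-by-W-by-3, calibration units

% Processing B: sparse features (undistort -> detect line intersections -> triangulate)
U1 = undistortImage(I1, stereoParams.CameraParameters1);
U2 = undistortImage(I2, stereoParams.CameraParameters2);
% matchedPoints1/matchedPoints2 are M-by-2 pixel coordinates of the same
% line intersections detected in U1 and U2 (detection code omitted).
worldPoints = triangulate(matchedPoints1, matchedPoints2, stereoParams);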
I wish to display the point cloud resulting from Processing A and the 3-D feature locations from Processing B in the same 3-D plot, such that they coincide. But I find that they do not coincide; there is an apparent rotation about the origin of the coordinate system (the optical center of camera 1) between the two. I show here the output of showExtrinsics(stereoParams) together with the point cloud (jet colormap in y) and the triangulated points (green). The triangulated points are where I expected them to be, while the point cloud lies to the right of the area I calibrated, off the plane of symmetry between the cameras. In the experiment, both cameras' optical axes point at the center of the sheet of paper.
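The combined display comes from code roughly like this (a sketch, reusing xyzPoints and worldPoints from the sketch above; the jet colormap in y is omitted for brevity):

showExtrinsics(stereoParams);
hold on
pcshow(reshape(xyzPoints, [], 3));                                  % dense cloud from Processing A
plot3(worldPoints(:,1), worldPoints(:,2), worldPoints(:,3), 'g.');  % triangulated points from Processing B
hold off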
When I apply the rotation matrix inv(stereoParams.RotationOfCamera2)^0.5 to the point cloud, the point cloud almost coincides with the feature locations, but not to my satisfaction. Here I show a close-up (colormap in depth, triangulated points in orange):
Note that I get a similar result when I compute the Euler angles of the original rotation matrix, multiply them by -0.5, and convert them back into a rotation matrix.
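The workaround is essentially this (a sketch; the post-multiplication assumes the row-vector convention of the R2020b RotationOfCamera2 property and may need a transpose depending on how you apply it):

R = inv(stereoParams.RotationOfCamera2)^0.5;   % matrix square root of the inverse camera-2 rotation
pts = reshape(xyzPoints, [], 3);               % flatten the H-by-W-by-3 output of reconstructScene
ptsRotated = pts * R;                          % "half-undo" the rotation (post-multiply row vectors)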
Now I would like to understand what exactly the output coordinate systems of reconstructScene and triangulate are, including their orientation. And maybe somebody can confirm or refute my assumption that both methods should yield matching results.

Accepted Answer

Qu Cao on 17 August 2022
The point cloud generated by reconstructScene is in the rectified camera 1 coordinate system.
Starting in R2022a, you can use the additional output R1 of the rectifyStereoImages function to convert the reconstructed point cloud from the rectified camera 1 coordinate system to the original, unrectified camera 1 coordinate system, which is the one the triangulate function uses.
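A sketch of that conversion step, assuming xyzPoints is the H-by-W-by-3 output of reconstructScene and R1 is the rectification rotation returned by rectifyStereoImages in R2022a or later (check that release's documentation for where R1 appears in the output list; the multiplication convention below is an assumption to verify on your own data):

pts = reshape(xyzPoints, [], 3);   % reconstructed points, rectified camera-1 frame
ptsCam1 = pts * R1;                % rotate back into the unrectified camera-1 frame (triangulate's frame)
% Depending on the row/column convention in your release, pts * R1.' may be needed instead.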
1 comment
Holger Mettelsiefen on 17 August 2022
Thank you! When I rotate the point cloud with R1, it becomes coincident with the feature locations.


Version: R2020b
