Feature-Based Map Building from Lidar Data
This example demonstrates how to process 3-D lidar data from a sensor mounted on a vehicle to progressively build a map. Such a map is suitable for automated driving workflows such as localization and navigation, and can be used to localize a vehicle to within a few centimeters.
Overview
There are different ways to register point clouds. The typical approach uses the complete point cloud for registration; the Build a Map from Lidar Data (Automated Driving Toolbox) example uses that approach for map building. This example instead builds the map using distinctive features extracted from each point cloud.
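For comparison, a whole-cloud registration step might look like the following sketch. This is not part of this example's workflow; it assumes two consecutive scans are available as pointCloud objects named fixed and moving, and uses the pcregistericp function.

% Sketch (for comparison only): register two consecutive scans using the
% complete point clouds. pcregistericp estimates the rigid motion of the
% moving scan relative to the fixed scan, and pctransform brings the
% moving scan into the fixed scan's frame.
tformICP = pcregistericp(moving,fixed);
movingAligned = pctransform(moving,tformICP);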
In this example, you will learn how to:
Load and visualize recorded driving data.
Build a map using lidar scans.
Load Recorded Driving Data
The data used in this example represents approximately 100 seconds of lidar, GPS, and IMU data. The data is saved in separate MAT-files as timetable objects. Download the lidar data MAT-file from the repository and load it into the MATLAB® workspace.
Note: This download can take a few minutes.
baseDownloadURL = ['https://github.com/mathworks/udacity-self-driving-data' ...
    '-subset/raw/master/drive_segment_11_18_16/'];
dataFolder = fullfile(tempdir,'drive_segment_11_18_16',filesep);
options = weboptions(Timeout=Inf);

lidarFileName = dataFolder + "lidarPointClouds.mat";

% Check whether the folder and data file already exist or not
folderExists = exist(dataFolder,'dir');
matfilesExist = exist(lidarFileName,'file');

% Create a new folder if it does not exist
if ~folderExists
    mkdir(dataFolder);
end

% Download the lidar data if it does not exist
if ~matfilesExist
    disp('Downloading lidarPointClouds.mat (613 MB)...');
    websave(lidarFileName,baseDownloadURL + "lidarPointClouds.mat",options);
end
Load the point cloud data saved from a Velodyne® HDL32E lidar sensor. Each lidar scan is stored as a 3-D point cloud using the pointCloud object. This object internally organizes the data using a K-d tree data structure for faster search. The timestamp associated with each lidar scan is recorded in the Time variable of the timetable object.
% Load lidar data from MAT-file
data = load(lidarFileName);
lidarPointClouds = data.lidarPointClouds;

% Display first few rows of lidar data
head(lidarPointClouds)
         Time          PointCloud  
    _____________    ______________
    23:46:10.5115    1×1 pointCloud
    23:46:10.6115    1×1 pointCloud
    23:46:10.7116    1×1 pointCloud
    23:46:10.8117    1×1 pointCloud
    23:46:10.9118    1×1 pointCloud
    23:46:11.0119    1×1 pointCloud
    23:46:11.1120    1×1 pointCloud
    23:46:11.2120    1×1 pointCloud
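The K-d tree makes neighbor queries on each scan fast. As a small illustration (a sketch, assuming the data has been loaded as above), query the five nearest neighbors of a point using the findNearestNeighbors function:

% Sketch: query the K nearest neighbors of a point. The pointCloud object
% uses its internal K-d tree to answer the query efficiently.
ptCloud = lidarPointClouds.PointCloud(1);   % first recorded scan
queryPoint = [10 0 0];                      % an arbitrary query point, in meters
K = 5;
[nnIndices,nnDists] = findNearestNeighbors(ptCloud,queryPoint,K);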
Visualize Driving Data
To understand what the scene contains, visualize the recorded lidar data using the pcplayer object.
% Specify limits for the player
xlimits = [-45 45]; % meters
ylimits = [-45 45];
zlimits = [-10 20];

% Create a pcplayer to visualize streaming point clouds from lidar sensor
lidarPlayer = pcplayer(xlimits,ylimits,zlimits);

% Customize player axes labels
xlabel(lidarPlayer.Axes,'X (m)');
ylabel(lidarPlayer.Axes,'Y (m)');
zlabel(lidarPlayer.Axes,'Z (m)');

title(lidarPlayer.Axes,'Lidar Sensor Data');

% Loop over and visualize the data
for l = 1:height(lidarPointClouds)
    % Extract point cloud
    ptCloud = lidarPointClouds.PointCloud(l);

    % Update lidar display
    view(lidarPlayer,ptCloud);
end
Use Recorded Lidar Data to Build a Map
Lidar sensors can be used to build centimeter-accurate maps, which can later be used for in-vehicle localization. A typical approach to building such a map is to align successive lidar scans obtained from a moving vehicle and combine them into a single, large point cloud. The rest of this example explores this approach.
Preprocessing
To demonstrate the registration workflow, take two point clouds corresponding to nearby lidar scans. Use every tenth scan to speed up processing and to accumulate enough motion between scans.
rng('default');
skipFrames = 10;
frameNum = 100;
fixed = lidarPointClouds.PointCloud(frameNum);
moving = lidarPointClouds.PointCloud(frameNum+skipFrames);
Process the point clouds to retain structures that are distinctive. These steps are executed using the helperProcessPointCloud function (a sketch of a comparable helper follows the list below):
Detect and remove the ground plane.
Detect and remove the ego vehicle.
These steps are described in more detail in the Ground Plane and Obstacle Detection Using Lidar (Automated Driving Toolbox) example.
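The helper function is not listed in this section. A minimal sketch of comparable preprocessing, assuming a RANSAC plane fit for the ground and a simple radius threshold around the sensor for the ego vehicle (the shipped helperProcessPointCloud function may differ):

function ptCloudOut = helperProcessPointCloudSketch(ptCloudIn)
% Sketch of the preprocessing steps (assumed implementation).

% Fit and remove the ground plane: treat points within 25 cm of a plane
% roughly perpendicular to the Z-axis as ground.
maxDistance = 0.25;           % meters
referenceVector = [0 0 1];
[~,groundIdx] = pcfitplane(ptCloudIn,maxDistance,referenceVector);

% Remove points close to the sensor origin, which belong to the ego vehicle.
egoRadius = 3.5;              % meters, assumed vehicle extent
egoIdx = findNeighborsInRadius(ptCloudIn,[0 0 0],egoRadius);

% Keep everything that is neither ground nor ego vehicle.
keepIdx = setdiff((1:ptCloudIn.Count)',union(groundIdx,egoIdx));
ptCloudOut = select(ptCloudIn,keepIdx);
end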
fixedProcessed = helperProcessPointCloud(fixed);
movingProcessed = helperProcessPointCloud(moving);
Display the initial and processed point clouds in a top view. Magenta points were removed during processing; these points correspond to the ground plane and the ego vehicle.
hFigFixed = figure;
axFixed = axes(Parent=hFigFixed,Color=[0 0 0]);

% Display the initial and processed point clouds in top view
pcshowpair(fixed,fixedProcessed,Parent=axFixed);
title(axFixed,'Segmented Ground and Ego Vehicle');
axis on;

view(axFixed,2);
Downsample the point clouds to improve registration accuracy and algorithm speed.
gridStep = 0.5;
fixedDownsampled = pcdownsample(fixedProcessed,gridAverage=gridStep);
movingDownsampled = pcdownsample(movingProcessed,gridAverage=gridStep);
Feature-Based Registration
Align and combine successive lidar scans using feature-based registration as follows:
Extract Fast Point Feature Histogram (FPFH) descriptors from each scan using the extractFPFHFeatures function.
Identify point correspondences by comparing the descriptors using the pcmatchfeatures function.
Estimate the rigid transformation from the point correspondences using the estgeotform3d function.
Align and merge the point cloud with the reference point cloud using the estimated transformation and the pcalign function.
% Extract FPFH features for each point cloud
neighbors = 40;
[fixedFeature,fixedValidInds] = extractFPFHFeatures(fixedDownsampled, ...
    NumNeighbors=neighbors);
[movingFeature,movingValidInds] = extractFPFHFeatures(movingDownsampled, ...
    NumNeighbors=neighbors);

fixedValidPts = select(fixedDownsampled,fixedValidInds);
movingValidPts = select(movingDownsampled,movingValidInds);

% Identify the point correspondences
method = 'Exhaustive';
threshold = 1;
ratio = 0.96;
indexPairs = pcmatchfeatures(movingFeature,fixedFeature,movingValidPts, ...
    fixedValidPts,Method=method,MatchThreshold=threshold, ...
    RejectRatio=ratio);
matchedFixedPts = select(fixedValidPts,indexPairs(:,2));
matchedMovingPts = select(movingValidPts,indexPairs(:,1));

% Estimate the rigid transformation of the moving point cloud with respect
% to the reference point cloud
maxDist = 2;
maxNumTrials = 3000;
tform = estgeotform3d(matchedMovingPts.Location, ...
    matchedFixedPts.Location,"rigid",MaxDistance=maxDist, ...
    MaxNumTrials=maxNumTrials);

% Transform the moving point cloud to the reference frame to visualize the
% alignment before and after registration
movingReg = pctransform(movingProcessed,tform);

% Moving and fixed point clouds are shown in magenta and green, respectively
hFigAlign = figure;

axAlign1 = subplot(1,2,1,Color=[0 0 0],Parent=hFigAlign);
pcshowpair(movingProcessed,fixedProcessed,Parent=axAlign1);
title(axAlign1,'Before Registration');
axis on;
view(axAlign1,2);

axAlign2 = subplot(1,2,2,Color=[0 0 0],Parent=hFigAlign);
pcshowpair(movingReg,fixedProcessed,Parent=axAlign2);
title(axAlign2,'After Registration');
axis on;
view(axAlign2,2);
% Align and merge the point clouds
alignGridStep = 1;
ptCloudAccum = pcalign([fixedProcessed movingProcessed], ...
    [rigidtform3d tform],alignGridStep);

% Visualize the accumulated point cloud
hFigAccum = figure;
axAccum = axes(Parent=hFigAccum,Color=[0 0 0]);

pcshow(ptCloudAccum,Parent=axAccum);
title(axAccum,'Accumulated Point Cloud');
axis on;
view(axAccum,2);
Map Generation
Apply preprocessing and feature-based registration steps in a loop over the entire sequence of recorded data. The result is a map of the environment traversed by the vehicle.
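Each pairwise registration produces the transformation of the current scan relative to the previous scan. To place every scan in the frame of the first scan, the loop below composes these pairwise transformations by multiplying their homogeneous matrices. A minimal sketch of that bookkeeping, using a hypothetical pairwise transform:

% Sketch of pose composition (hypothetical values). tformStep maps the
% current scan into the previous scan's frame, and accumTform maps the
% previous scan into the first scan's frame. Multiplying the homogeneous
% matrices maps the current scan directly into the first scan's frame.
accumTform = rigidtform3d;                   % identity: first scan defines the map frame
tformStep = rigidtform3d([5 0 0],[1 0 0]);   % assumed small rotation and 1 m translation
accumTform = rigidtform3d(accumTform.A*tformStep.A);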
rng('default');

numFrames = height(lidarPointClouds);

accumTform = rigidtform3d;
pointCloudMap = pointCloud(zeros(0,0,3));

% Specify limits for the player
xlimits = [-200 250]; % meters
ylimits = [-150 500];
zlimits = [-100 100];

% Create a pcplayer to visualize the map
mapPlayer = pcplayer(xlimits,ylimits,zlimits);
title(mapPlayer.Axes,'Accumulated Map');
mapPlayer.Axes.View = [0 90];

% Loop over the entire data to generate the map
for n = 1:skipFrames:numFrames-skipFrames

    % Get the nth point cloud
    ptCloud = lidarPointClouds.PointCloud(n);

    % Segment the ground and remove the ego vehicle
    ptProcessed = helperProcessPointCloud(ptCloud);

    % Downsample the point cloud for speed of operation
    ptDownsampled = pcdownsample(ptProcessed,gridAverage=gridStep);

    % Extract FPFH features from the point cloud
    [ptFeature,ptValidInds] = extractFPFHFeatures(ptDownsampled, ...
        NumNeighbors=neighbors);
    ptValidPts = select(ptDownsampled,ptValidInds);

    if n == 1
        moving = ptValidPts;
        movingFeature = ptFeature;
        pointCloudMap = ptValidPts;
    else
        fixed = moving;
        fixedFeature = movingFeature;
        moving = ptValidPts;
        movingFeature = ptFeature;

        % Match the features to find correspondences
        indexPairs = pcmatchfeatures(movingFeature,fixedFeature,moving, ...
            fixed,Method=method,MatchThreshold=threshold, ...
            RejectRatio=ratio);
        matchedFixedPts = select(fixed,indexPairs(:,2));
        matchedMovingPts = select(moving,indexPairs(:,1));

        % Register the moving point cloud with respect to the reference point cloud
        tform = estgeotform3d(matchedMovingPts.Location, ...
            matchedFixedPts.Location,"rigid",MaxDistance=maxDist, ...
            MaxNumTrials=maxNumTrials);

        % Compute the accumulated transformation to the original reference frame
        accumTform = rigidtform3d(accumTform.A*tform.A);

        % Align and merge the moving point cloud into the accumulated map
        pointCloudMap = pcalign([pointCloudMap moving], ...
            [rigidtform3d accumTform],alignGridStep);
    end

    % Update the map display
    view(mapPlayer,pointCloudMap);
end
Functions
pcdownsample | extractFPFHFeatures | pcmatchfeatures | estgeotform3d | pctransform | pcalign
Objects
pointCloud | pcplayer
Related Topics
Build a Map from Lidar Data (Automated Driving Toolbox)
Ground Plane and Obstacle Detection Using Lidar (Automated Driving Toolbox)