
checkStatus
Check status of visual SLAM object

Since R2023b



    status = checkStatus(vslam) returns the current status of the visual SLAM object. Because the object processes frames on separate threads, the frame it is currently processing might differ from the most recently added frame.
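    As a minimal sketch of the call pattern, you can query the status of a newly created monovslam object. The intrinsic parameter values below are illustrative placeholders, not values from a calibrated camera.

```matlab
% Create a monocular visual SLAM object (placeholder intrinsics)
intrinsics = cameraIntrinsics([535.4 539.2],[320.1 247.6],[480 640]);
vslam = monovslam(intrinsics);

% Query the current status; the result is a TrackingLost,
% TrackingSuccessful, or FrequentKeyFrames enumeration
status = checkStatus(vslam);
disp(status)
```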


    Examples

    Perform monocular visual simultaneous localization and mapping (vSLAM) using the data from the TUM RGB-D Benchmark. You can download the data to a temporary directory using a web browser or by running this code:

    baseDownloadURL = ""; 
    dataFolder = fullfile(tempdir,"tum_rgbd_dataset",filesep); 
    options = weboptions(Timeout=Inf);
    tgzFileName = dataFolder+"fr3_office.tgz";
    folderExists = exist(dataFolder,"dir");

    % Create a folder in a temporary directory to save the downloaded file
    if ~folderExists  
        mkdir(dataFolder)
        disp("Downloading fr3_office.tgz (1.38 GB). This download can take a few minutes.") 
        websave(tgzFileName,baseDownloadURL,options);

        % Extract contents of the downloaded file
        disp("Extracting fr3_office.tgz (1.38 GB) ...") 
        untar(tgzFileName,dataFolder);
    end

    Create an imageDatastore object to store all the RGB images.

    imageFolder = dataFolder+"rgbd_dataset_freiburg3_long_office_household/rgb/";
    imds = imageDatastore(imageFolder);

    Specify your camera intrinsic parameters, and use them to create a monocular visual SLAM object.

    intrinsics = cameraIntrinsics([535.4 539.2],[320.1 247.6],[480 640]);
    vslam = monovslam(intrinsics);

    Process each image frame, and visualize the camera poses and 3-D map points.

    for i = 1:numel(imds.Files)
        addFrame(vslam,readimage(imds,i))

        if hasNewKeyFrame(vslam)
            % Query 3-D map points and camera poses
            xyzPoints = mapPoints(vslam);
            [camPoses,viewIds] = poses(vslam);

            % Display 3-D map points and camera trajectory
            plot(vslam);
        end

        % Get current status of system
        status = checkStatus(vslam);
    end

    Figure: 3-D map points and camera trajectory.

    Note that the monovslam object runs several algorithm parts on separate threads, which can introduce latency in the processing of image frames added by using the addFrame function.

    % Plot intermediate results and wait until all images are processed
    while ~isDone(vslam)
        if hasNewKeyFrame(vslam)
            plot(vslam);
        end
    end

    After all the images are processed, you can collect the final 3-D map points and camera poses for further analysis.

    xyzPoints = mapPoints(vslam);
    [camPoses,viewIds] = poses(vslam);

    % Reset the system
    reset(vslam)

    Input Arguments


    vslam
    Visual SLAM object, specified as a monovslam object.

    Output Arguments


    status
    Current status of the visual SLAM object, returned as a TrackingLost, TrackingSuccessful, or FrequentKeyFrames enumeration. This list describes these enumerations.

    TrackingLost: Tracking is lost. The number of tracked feature points in the frame currently being processed is less than the lower limit of the TrackFeatureRange property. This indicates that the image does not contain enough features, or that the camera is moving too fast. To improve tracking, you can increase the upper limit of the TrackFeatureRange property and decrease the SkipMaxFrames property to add key frames more frequently.

    TrackingSuccessful: Tracking is successful. The number of tracked feature points in the frame currently being processed is between the lower and upper limit values of the TrackFeatureRange property.

    FrequentKeyFrames: Tracking adds key frames too frequently. The number of tracked feature points in the frame currently being processed is greater than the upper limit of the TrackFeatureRange property.
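    The returned enumeration can be compared against its member names, so you can branch on the status inside a processing loop. This is a hedged sketch, assuming vslam is an existing monovslam object; the displayed messages are illustrative, not part of the API.

```matlab
% React to the current tracking status (sketch)
status = checkStatus(vslam);
switch status
    case "TrackingLost"
        % Too few tracked features: the scene may lack texture,
        % or the camera may be moving too fast
        disp("Tracking lost")
    case "FrequentKeyFrames"
        % Too many tracked features: key frames are being added
        % too often; consider adjusting SkipMaxFrames
        disp("Key frames added too frequently")
    otherwise % TrackingSuccessful
        disp("Tracking OK")
end
```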

    Version History

    Introduced in R2023b

    See Also