estimateGravityRotation

Estimate gravity rotation using factor graph optimization

Since R2023a

    Description

    [gRot,info] = estimateGravityRotation(poses,gyroscopeReadings,accelerometerReadings) estimates the gravity rotation, using factor graph optimization, that helps transform the input poses to the local navigation reference frame of an inertial measurement unit (IMU). gyroscopeReadings and accelerometerReadings are the gyroscope and accelerometer readings of the IMU, respectively. The function also returns info, which contains validation metrics for the estimate, as well as velocity and bias estimates that you can use to update a factor graph. For more information about gravity rotation and validating the estimates, see Gravity Rotation and Pose Scale.

    Note

    Input poses must be in the initial IMU reference frame unless you specify the SensorTransform name-value argument, in which case the poses can be in a different frame. This function assumes that the input poses are at real-world scale. If your input poses are at an unknown scale, you can estimate both the pose scale and the gravity rotation by using the estimateGravityRotationAndPoseScale function.

    [gRot,info] = estimateGravityRotation(___,Name=Value) specifies options using one or more name-value arguments, in addition to the input arguments from the previous syntax, to improve the estimation process. For example, FixPoses=true specifies that the function treats the input poses as fixed parameters during optimization.

    Examples

    Specify input poses in the first camera pose reference frame.

    poses = [0.1 0 0 0.7071 0 0 0.7071; ...
             0.1 0.4755 -0.1545 0.7071 0 0 0.7071];

    Specify 10 gyroscope and accelerometer readings between consecutive camera frames.

    accelReadings = repmat([97.9887 -3.0315 -22.0285],10,1);
    gyroReadings = zeros(10,3);

    Specify IMU parameters.

    params = factorIMUParameters(SampleRate=100, ...
                                 ReferenceFrame="NED");

    Specify a transformation, consisting of a 3-D translation and rotation, that transforms input poses from the initial camera pose reference frame to the initial IMU pose reference frame. The initial sensor reference frame has its origin at the first pose of that sensor.

    sensorTransform = se3(eul2rotm([-pi/2 0 0]),[0 0.1 0]);

    Specify factor graph solver options.

    opts = factorGraphSolverOptions(MaxIterations=50);

    Estimate gravity rotation using IMU measurements between camera pose estimates.

    [gRot,solutionInfo] = estimateGravityRotation(poses, ...
                          {gyroReadings},{accelReadings}, ...
                          IMUParameters=params, ...
                          SensorTransform=sensorTransform, ...
                          SolverOptions=opts)
    gRot = 3×3
    
        0.0058   -0.6775   -0.7355
        0.1023    0.7320   -0.6736
        0.9947   -0.0713    0.0735
    
    
    solutionInfo = struct with fields:
                        Status: SUCCESS
             TranslationErrors: [0.0088 0.0092 5.9933e-04]
                RotationErrors: [0.0092 0.0151 0.0481]
        VelocityInIMUReference: [2×3 double]
                          Bias: [2×6 double]
            PoseToIMUTransform: [1×1 se3]
           PosesInIMUReference: [2×7 double]
                   InitialCost: 2.3123e+03
                     FinalCost: 19.0067
                  FixedNodeIDs: [0 2 3 6]
              IsSolutionUsable: 1
            NumSuccessfulSteps: 23
          NumUnsuccessfulSteps: 18
              OptimizedNodeIDs: [1 4 5 7 8]
               TerminationType: 0
                     TotalTime: 0.0029
    
    

    Use the gravity rotation to transform the gravity vector from the local navigation frame to the initial camera pose reference frame.

    % Gravity direction in the NED frame is along the Z-axis.
    gravityDirectionNED = [0; 0; 1];
    % Gravity direction in pose reference frame.
    gravityDirection = gRot*gravityDirectionNED
    gravityDirection = 3×1
    
       -0.7355
       -0.6736
        0.0735
    
    
    % Gravity vector in pose reference frame.
    gravityMagnitude = 9.81;
    gravity = gravityDirection*gravityMagnitude
    gravity = 3×1
    
       -7.2149
       -6.6076
        0.7214
    
    

    Input Arguments

    poses

    Camera or lidar poses, with the translation component of each pose in meters, estimated by stereo-visual-inertial or lidar-inertial systems, respectively, specified as one of these pose types:

    • N-by-7 matrix, where each row is of the form [x y z qw qx qy qz]. Each row defines the xyz-position, in meters, and quaternion orientation, [qw qx qy qz].

    • Array of se3 objects.

    • Camera pose table returned by the poses (Computer Vision Toolbox) function of the imageviewset (Computer Vision Toolbox) object.

    • Array of rigidtform3d (Image Processing Toolbox) objects.
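
    For example, this minimal sketch (variable names are illustrative) shows one possible way to convert an N-by-7 pose matrix into the equivalent array of se3 objects:

    % Two poses in [x y z qw qx qy qz] form.
    posesMat = [0.1 0 0 0.7071 0 0 0.7071; ...
                0.1 0.4755 -0.1545 0.7071 0 0 0.7071];
    % Equivalent se3 array. quat2rotm expects [qw qx qy qz] rows and
    % returns a 3-by-3-by-N array of rotation matrices.
    posesSE3 = se3(quat2rotm(posesMat(:,4:7)),posesMat(:,1:3));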

    gyroscopeReadings

    Gyroscope readings between consecutive camera views or poses, specified as an (N-1)-element cell array of M-by-3 matrices, in radians per second. N is the total number of camera views or poses specified in poses. M is the number of gyroscope readings between consecutive camera views, and the three columns of gyroscopeReadings represent the [x y z] measurements between the camera views or poses.

    Note that there can be a different number of readings between consecutive camera views.

    Data Types: cell

    accelerometerReadings

    Accelerometer readings between consecutive camera views or poses, specified as an (N-1)-element cell array of M-by-3 matrices, in meters per second squared. N is the total number of camera views or poses specified in poses. M is the number of accelerometer readings between consecutive camera views, and the three columns of accelerometerReadings represent the [x y z] measurements between the camera views or poses.

    Note that there can be a different number of readings between consecutive camera views.

    Data Types: cell
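
    For example, this minimal sketch (the readings are placeholders) packs readings for N = 3 poses, where the two segments between consecutive poses contain different numbers of samples:

    % Three poses -> two cells per sensor. M can differ per segment.
    gyroSeg1  = zeros(12,3);               % rad/s
    gyroSeg2  = zeros(9,3);
    accelSeg1 = repmat([0 0 -9.81],12,1);  % m/s^2, placeholder readings
    accelSeg2 = repmat([0 0 -9.81],9,1);
    gyroReadings  = {gyroSeg1,gyroSeg2};
    accelReadings = {accelSeg1,accelSeg2};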

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Example: estimateGravityRotation(poses,gyroscopeReadings,accelerometerReadings,IMUParameters=factorIMUParameters(SampleRate=100)) estimates the gravity rotation based on an IMU with a sample rate of 100 Hz.

    IMUParameters

    IMU parameters, specified as a factorIMUParameters object.

    Each IMU can have different characteristics, which you specify by using its IMU parameters. The default parameter values of the factorIMUParameters object specify identity noise covariances, which are significantly larger than typical IMU covariances in practice, so estimating and specifying the IMU parameters is important for accurate estimation of the gravity rotation. If the noise parameters of the IMU (AccelerometerNoise, AccelerometerBiasNoise, GyroscopeNoise, and GyroscopeBiasNoise) are unknown, use Allan variance analysis of the IMU measurements to estimate them. For an example of how to use the Allan variance to determine noise parameters, see Inertial Sensor Noise Analysis Using Allan Variance.

    Example: IMUParameters=factorIMUParameters(SampleRate=100)
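
    For instance, this sketch specifies noise covariances that you might obtain from a datasheet or from Allan variance analysis (the numeric values here are placeholders, not recommendations):

    params = factorIMUParameters(SampleRate=100, ...
        GyroscopeNoise=1e-4*eye(3), ...          % placeholder covariances
        GyroscopeBiasNoise=1e-8*eye(3), ...
        AccelerometerNoise=1e-3*eye(3), ...
        AccelerometerBiasNoise=1e-6*eye(3), ...
        ReferenceFrame="NED");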

    SolverOptions

    Solver options, specified as a factorGraphSolverOptions object.

    Example: SolverOptions=factorGraphSolverOptions(MaxIterations=50)

    SensorTransform

    Transformation from the input pose reference frame to the initial IMU sensor reference frame, specified as an se3 object.

    The transformation between the input pose reference frame and the initial IMU frame consists of a 3-D translation and rotation. For example, if the input poses are camera poses in the initial camera sensor reference frame, then the sensor transform rotates and translates a pose or a point from the initial camera sensor reference frame to the initial IMU sensor reference frame. The origin of the initial sensor reference frame is at the first pose of that sensor.

    In some cases, the camera and IMU might be rigidly attached to a common base reference frame, and poses are computed in the base reference frame. In this case, the sensor transform specifies the transformation from the base reference frame to the IMU reference frame. If the sensor transform is unknown, you can estimate it by using extrinsic calibration. For an example that shows the process of extrinsic calibration between a camera and an IMU to estimate an SE(3) homogeneous transformation, see Estimate Camera-to-IMU Transformation Using Extrinsic Calibration.

    Example: SensorTransform=se3(eul2rotm([-pi/2 0 0]),[0 0.1 0])
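
    As a minimal sketch of this mapping applied to a point (the extrinsics and point are illustrative):

    % Illustrative camera-to-IMU extrinsics.
    tform = se3(eul2rotm([-pi/2 0 0]),[0 0.1 0]);
    % A point expressed in the initial camera frame maps into the
    % initial IMU frame by a rotation followed by a translation.
    pCam = [0.5 0.2 1.0];
    pIMU = transform(tform,pCam);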

    FixPoses

    Since R2025a

    Fixed input poses during estimation, specified as a logical true or false. When set to true, the function treats the input poses as fixed parameters during optimization. When set to false, the function treats the input poses as adjustable parameters that it can refine during optimization, but only within a small neighborhood of their specified values. Not fixing the poses gives the estimator more freedom and generally achieves better results. However, if you expect the input poses to be highly accurate, fixing them is computationally less expensive and can result in better accuracy.
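
    Example: FixPoses=true specifies that the input poses remain fixed during optimization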

    InitialVelocity

    Since R2025a

    Initial velocity guess for the IMU, specified as a three-element row vector. By default, the initial velocity is unknown. Specifying the initial velocity helps with bias estimation. You can compute the initial velocity from GPS measurements.

    Example: InitialVelocity=[0 0 0] specifies the initial velocity in all directions as 0

    Data Types: double

    PredictionThreshold

    Since R2025a

    IMU pose prediction threshold, specified as a two-element row vector of the form [rotationErrorThreshold translationErrorThreshold]. If any rotation error in the RotationErrors field or any translation error in the TranslationErrors field of info exceeds rotationErrorThreshold or translationErrorThreshold, respectively, the Status field of the info structure indicates a failure due to a bad prediction.

    The rotation error and translation error measure how well the IMU pose predictions align with the input poses after estimation. This argument defines the maximum acceptable rotation and translation errors, in radians and meters, respectively.

    The rotation error is the difference between the relative rotation of a pair of successive input poses computed directly from the poses and the relative rotation predicted from the IMU measurements.

    The translation error is the difference between the relative translation of a pair of successive input poses computed directly from the poses and the relative translation predicted from the IMU measurements.

    During estimation, optimization minimizes these errors. Lower errors indicate better alignment between the IMU pose predictions and the input poses, and thus better estimates. If you expect the input poses to be accurate, you can specify smaller prediction thresholds, as they require IMU pose predictions to align more closely with the input poses. Use larger thresholds when you expect the input poses to be less accurate.
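
    Example: PredictionThreshold=[0.05 0.1] sets the maximum acceptable rotation error to 0.05 radians and the maximum acceptable translation error to 0.1 meters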

    Data Types: double

    Output Arguments

    gRot

    Gravity rotation, returned as a 3-by-3 matrix, se3 object, or rigidtform3d (Image Processing Toolbox) object, matching the input pose type. It contains the rotation required to transform the gravity vector from the local navigation reference frame of the IMU (NED or ENU) to the input pose reference frame.

    info

    Estimation diagnostics and optimization results, returned as a structure with these fields:
    Status

    Status of checks, returned as an enumeration with one of these values.

    SUCCESS

    No failures.

    FAILURE_BAD_PREDICTION

    Prediction error is above the specified threshold. This indicates that one or more rotation errors in the RotationErrors field, or one or more translation errors in the TranslationErrors field, has exceeded the rotationErrorThreshold or translationErrorThreshold value, respectively, of the PredictionThreshold argument.

    FAILURE_BAD_BIAS

    Bias variation from the initial value is outside the bounds computed from the provided bias covariance values.

    FAILURE_BAD_PREDICTION_BIAS

    Prediction error is above the specified threshold, and bias variation from the initial value is outside the expected bounds.

    FAILURE_BAD_OPTIMIZATION

    Optimization encountered errors, and estimates are not usable.

    FAILURE_NO_CONVERGENCE

    Initialization optimization did not converge in the specified number of iterations. Increase the maximum number of iterations.

    TranslationErrors

    Translation errors in the IMU pose prediction, returned as an (N-1)-by-3 matrix. Each row, of the form [x y z], is the difference between the relative translation of a pair of successive input poses computed directly from the poses and the relative translation predicted from the IMU measurements. A lower translation error indicates that the relative translation predicted using the IMU measurements is close to the visual odometry estimate.

    RotationErrors

    Rotation errors in the IMU pose prediction, returned as an (N-1)-by-3 matrix. Each row, of the form [x y z], is the difference between the relative rotation of a pair of successive input poses computed directly from the poses and the relative rotation predicted from the IMU measurements. A lower rotation error indicates that the relative rotation predicted using the IMU measurements is close to the visual odometry estimate.

    InitialCost

    Initial cost of the nonlinear least-squares problem formulated by the factor graph before optimization.

    FinalCost

    Final cost of the nonlinear least-squares problem formulated by the factor graph after optimization.

    Note

    Cost is the sum of error terms, known as residuals, where each residual is a function of a subset of factor measurements.

    NumSuccessfulSteps

    Number of iterations in which the solver decreases the cost. This value includes the initialization iteration at 0 in addition to the minimizer iterations.

    NumUnsuccessfulSteps

    Number of iterations that are numerically invalid or in which the solver does not decrease the cost.

    TotalTime

    Total solver optimization time in seconds.

    TerminationType

    Termination type, returned as an integer in the range [0, 2]:

    • 0 — Solver found a solution that meets the convergence criterion and decreases the cost after optimization.

    • 1 — Solver could not find a solution that meets the convergence criterion after running for the maximum number of iterations.

    • 2 — Solver terminated due to an error.

    IsSolutionUsable

    Solution is usable if 1 (true), not usable if 0 (false).

    PosesInIMUReference

    Optimized poses transformed into the IMU navigation reference frame using the estimated gravity rotation, returned in the same format as the input poses.

    VelocityInIMUReference

    Velocity estimates, returned as an N-by-3 numeric matrix, representing velocities in the IMU navigation reference frame at the input pose sampling times. You can specify these values as initial guesses for newly created velocity nodes in the visual-inertial factor graph by using the nodeState function, as shown in the sketch after these field descriptions.

    Bias

    Bias estimates at the input pose sampling times, returned as an N-by-6 numeric matrix. You can specify these values as initial guesses for newly created bias nodes in the visual-inertial factor graph, by using the nodeState function.

    PoseToIMUTransform

    Transformation from the input pose reference frame to the IMU navigation reference frame, returned as an se3 object. This is the inverse of the gravity rotation.

    Data Types: struct
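
    For example, this minimal sketch seeds a factor graph with the returned estimates, assuming that fg is a factorGraph object and that velIDs and biasIDs are the IDs of the corresponding IMU velocity and bias nodes (all three names are hypothetical):

    % Use the estimates as initial guesses for newly created velocity
    % and bias nodes before optimizing the visual-inertial factor graph.
    nodeState(fg,velIDs,solutionInfo.VelocityInIMUReference);
    nodeState(fg,biasIDs,solutionInfo.Bias);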



    Extended Capabilities

    C/C++ Code Generation
    Generate C and C++ code using MATLAB® Coder™.

    Version History

    Introduced in R2023a
