Use imufilter to calculate rotation matrix to earth frame

Yunda Liu on 10 May 2023
Edited: Yunda Liu on 1 June 2023
Hi,
I have attempted to use imufilter to fuse local accelerometer and gyroscope data in order to calculate the rotation matrix. According to the documentation, if the reference frame is specified as "NED", it should be possible to obtain the rotation matrix relative to the earth frame. However, I am unsure whether there are any specific requirements for how the local frame of the IMU sensor must be defined. Is there any detailed documentation on how to achieve this?

Accepted Answer

William Rose on 11 May 2023
Edited: William Rose on 11 May 2023
[Edit: correct the sign in one element of one of the rotInit matrices, below.]
You have (linear) acceleration (in the body frame) and gyro data (angular velocity in the body frame), at M instants. You will do
fuse = imufilter('SampleRate',Fs);
and
q = fuse(accelerometerReadings,gyroscopeReadings);
to get q, an M-by-1 vector of quaternions.
Or you can do
fuse = imufilter('SampleRate',Fs,'OrientationFormat','Rotation matrix');
rotMat = fuse(accelerometerReadings,gyroscopeReadings);
to obtain a 3-by-3-by-M array of M rotation matrices.
The man page says "imufilter fusion correctly estimates the change in orientation from an assumed north-facing initial orientation." This means the output, q or rotMat, represents the orientations of the body, relative to its initial orientation, at M instants. The manual then says "However, the device's x-axis was pointing southward when recorded. To correctly estimate the orientation relative to the true initial orientation or relative to NED, use ahrsfilter." But the man page for ahrsfilter() indicates that ahrsfilter() expects magnetometer data, which you do not have.
If you use the default reference frame, the NED frame, then you are assuming the global x,y,z axes point north, east, down respectively. The accelerometer and gyro data come from a sensor with a right-handed x,y,z set of axes. Suppose you know the initial orientation of the sensor relative to the global frame. Then you can translate the rotMat matrices from the "initial orientation of the body" frame to the global frame by pre-multiplying each 3x3 matrix in rotMat by rotInit, which is a 3x3 matrix representing the initial orientation of the body in the global frame.
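For concreteness, the pre-multiplication described above would look like this in code (a sketch only, not checked against the toolbox's conventions; rotMat is assumed to be the 3-by-3-by-M array from imufilter, and rotInit one of the matrices defined below):
rotGlobal = zeros(size(rotMat));                 % 3x3xM rotation matrices in the global (NED) frame
for k = 1:size(rotMat,3)
    rotGlobal(:,:,k) = rotInit*rotMat(:,:,k);    % pre-multiply each matrix by the initial orientation
end
% or, in one call (R2020b or later): rotGlobal = pagemtimes(rotInit, rotMat);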
If the initial orientation of the body is aligned with the NED axes (i.e. body x,y,z axes point north, east, and down respectively), then
rotInit=eye(3);
If the body initially is oriented with X pointing east, Y pointing south, and Z pointing down, then
x0=[0;1;0]; y0=[-1;0;0]; z0=[0;0;1];
rotInit=[x0,y0,z0];
If the body initially is oriented with X pointing south, Y pointing west, and Z pointing down, then
x0=[-1;0;0]; y0=[0;-1;0]; z0=[0;0;1];
rotInit=[x0,y0,z0];
If the body initially is oriented with X pointing northeast and 45 deg up, Y pointing southeast and horizontal, and Z pointing wherever, then
x0=[0.5;0.5;-sqrt(2)/2]; y0=[-sqrt(2)/2;sqrt(2)/2;0]; z0=cross(x0,y0);
rotInit=[x0,y0,z0];
I have not used the Sensor Fusion and Tracking Toolbox, so my code above is not guaranteed. Do lots of checks with relatively simple simulated input data to make sure the routines behave as expected. I have used MATLAB and LabVIEW to fuse linear and rotational acceleration data, to estimate head accelerations (rotational and linear) and orientation during soccer heading, and the results were published in several papers. So I do know something about this topic, but I am not familiar with these routines. I don't think this toolbox existed when I did that work. I would have used it.
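As one example of such a check (a sketch, assuming the toolbox's imuSensor object with its default, noise-free settings): feed the filter readings from a motionless, NED-aligned sensor and confirm that the estimated rotation matrix stays close to the identity.
Fs = 100; N = 200;
imu = imuSensor('accel-gyro','SampleRate',Fs);     % ideal accelerometer + gyroscope model
[accSim, gyroSim] = imu(zeros(N,3), zeros(N,3));   % no linear acceleration, no rotation
fuse = imufilter('SampleRate',Fs,'OrientationFormat','Rotation matrix');
rotMat = fuse(accSim, gyroSim);
disp(rotMat(:,:,end))                              % should be close to eye(3)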
The interesting challenge with the soccer data is that the sensor is on the edge of the head (strapped to the occiput with an elastic headband). We wanted to estimate linear and rotational accelerations in the middle of the brain. That means we must translate the measured data from the edge to the center. A purely rotational acceleration at the edge will generate linear plus rotational acceleration in the center, and vice versa. The equations are complicated and are best done with quaternions, which were not well-supported in Labview at the time.
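For reference, the relation being described is the standard rigid-body transport of acceleration: with r the vector from the sensor to the point of interest (fixed in the body), omega the angular velocity, and alpha the angular acceleration, all expressed in the same frame as the measured acceleration, the acceleration at the offset point is a_sensor + alpha x r + omega x (omega x r). In MATLAB (illustrative names only, a sketch):
% acceleration at a point offset by r on the same rigid body (all vectors 1x3, same frame)
transportAcc = @(a_sensor, omega, alpha, r) ...
    a_sensor + cross(alpha, r) + cross(omega, cross(omega, r));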
  5 comments
William Rose on 14 May 2023
@Paul, your concerns are very reasonable. As I said above, I have not used MATLAB's Sensor Fusion and Tracking Toolbox. One thing I have learned from collecting and analyzing linear and rotational acceleration in soccer heading and figure skating is that I must test my data acquisition methods and my code extensively with controlled and simulated inputs, to make sure I get the expected results. The scripts for testing are longer, and take more time, than the scripts being tested. I recommend 3D visualization of the output. I have made plenty of errors along the way. I do my best to identify and eliminate the errors through careful testing.
Yunda Liu on 1 June 2023
Edited: Yunda Liu on 1 June 2023
Hi,
I used an IMU to collect acceleration and gyroscope data in the sensor's local coordinate system. The local acceleration and gyroscope data were low-pass filtered with a cut-off frequency of 10 Hz. Then I used imufilter to calculate the rotation matrices and transformed the local acceleration. The code is as follows.
% Read csv file
data = readmatrix(data_path);
timestamp = data(:, 1);
acc = data(:, 2:4);   % acc size: N x 3
gyro = data(:, 5:7);  % gyro size: N x 3
% Calculate orientation
fuse = imufilter('SampleRate', 50, 'OrientationFormat', 'Rotation matrix');
orientation = fuse(acc, gyro);  % orientation size: 3 x 3 x N
% Transform acc to the earth frame
[rows, cols] = size(acc);
earth_acc = zeros(rows, cols);
for n = 1:rows
    earth_acc(n, :) = acc(n, :) * orientation(:, :, n);
end
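(The low-pass filtering step mentioned above is not shown in the snippet; a minimal sketch of that step, assuming a 4th-order zero-phase Butterworth filter at 10 Hz and the 50 Hz sample rate used above:)
fc = 10; fs = 50;                % cut-off frequency and sample rate in Hz
[b, a] = butter(4, fc/(fs/2));   % 4th-order Butterworth low-pass
acc  = filtfilt(b, a, acc);      % zero-phase filtering, applied column-wise
gyro = filtfilt(b, a, gyro);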
Then the transformed acceleration seems to be incorrect. To be more specific, the local acceleration looks like the following picture, where the X-axis is the frame index and the Y-axis is acceleration (m/s^2).
However, the transformed acceleration looks like the following picture. To me, it just looks like noise.
Can you please help me figure out what is going on with the code? Thank you!

