Sensor Fusion for Orientation Estimation
From the series: Perception
Sensors are a key component of an autonomous system, helping it understand and interact with its surroundings. In this video, Roberto Valenti joins Connell D'Souza to demonstrate using Sensor Fusion and Tracking Toolbox™ to perform sensor fusion of inertial sensor data for orientation estimation. This is a common and important application for teams participating in maritime and aerial vehicle competitions.
First, Connell and Roberto introduce common inertial sensors, such as inertial measurement units (IMUs) and magnetic, angular rate, and gravity (MARG) sensors, and explain why sensor fusion is needed to make sense of the data these sensors produce.
Roberto then uses MATLAB Mobile™ to stream and log accelerometer, gyroscope, and magnetometer data from his cell phone to MATLAB® and performs sensor fusion on this data to estimate orientation in only a few lines of code. The imufilter and ahrsfilter functions shown in the video implement Kalman filter-based fusion algorithms. The fused orientation estimates are compared with the orientation values streamed directly from the cell phone to check the accuracy of the estimation.
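The workflow described above can be sketched in a few lines of MATLAB. This is a minimal illustration, not the exact code from the video: the synthetic sensor arrays stand in for data logged from MATLAB Mobile, and the 100 Hz sample rate and magnetic field values are assumptions.

```matlab
% Minimal sketch of IMU/MARG fusion with Sensor Fusion and Tracking Toolbox.
% Synthetic data below stands in for samples streamed from MATLAB Mobile;
% the sample rate and field values are assumptions for illustration.
Fs = 100;                            % assumed sample rate (Hz)
N  = 200;                            % number of samples
accel = repmat([0 0 9.81], N, 1);    % stationary device: gravity only (m/s^2)
gyro  = zeros(N, 3);                 % no rotation (rad/s)
mag   = repmat([19 5 -48], N, 1);    % assumed local magnetic field (uT)

% Accelerometer + gyroscope only (6-axis IMU): imufilter
fuse6 = imufilter('SampleRate', Fs);
q6 = fuse6(accel, gyro);             % returns a quaternion per sample

% Add the magnetometer (9-axis MARG) to recover heading: ahrsfilter
fuse9 = ahrsfilter('SampleRate', Fs);
q9 = fuse9(accel, gyro, mag);

% Convert quaternions to Euler angles (degrees) for comparison with
% the orientation values streamed from the phone.
eul = eulerd(q9, 'ZYX', 'frame');
```

With real logged data, the accelerometer, gyroscope, and magnetometer arrays would replace the synthetic ones, and the resulting Euler angles can be plotted against the phone's own orientation output to judge the accuracy of the fusion.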