Sensor Fusion and Tracking Toolbox
Design and simulate multisensor tracking and positioning systems
Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for the design, simulation, and analysis of systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.
The toolbox includes multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that let you evaluate fusion architectures using real and synthetic data. With Sensor Fusion and Tracking Toolbox you can import and define scenarios and trajectories, stream signals, and generate synthetic data for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors. You can also evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots.
For simulation acceleration or desktop prototyping, the toolbox supports C code generation.
Trajectory and Scenario Generation
Generate ground-truth waypoint-based and rate-based trajectories and scenarios. Model platforms and targets for tracking scenarios.
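The toolbox itself is a MATLAB/Simulink product; as a language-neutral sketch of what a waypoint-based trajectory reduces to, here is a minimal Python routine that linearly interpolates a position between timestamped waypoints. The function name and list-based interface are illustrative assumptions, not toolbox API.

```python
from bisect import bisect_right

def interpolate_waypoints(times, waypoints, t):
    """Linearly interpolate a 3-D position along timestamped waypoints."""
    if t <= times[0]:
        return list(waypoints[0])
    if t >= times[-1]:
        return list(waypoints[-1])
    i = bisect_right(times, t) - 1                 # segment containing t
    a = (t - times[i]) / (times[i + 1] - times[i])  # interpolation fraction
    return [p + a * (q - p) for p, q in zip(waypoints[i], waypoints[i + 1])]
```

A rate-based trajectory would instead integrate velocity and angular-rate profiles between samples; the waypoint form above is the simpler of the two.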
Generate Object Poses
Define and convert the true position, velocity, and orientation of objects in different reference frames.
Create Tracking Scenarios
Model platforms such as aircraft, ground vehicles, or ships. Platforms can be stationary or in motion, carry sensors and emitters, and have aspect-dependent signatures that reflect signals.

Rotations, Orientation, and Quaternions
Represent orientation and rotation using quaternions, Euler angles, rotation matrices, and rotation vectors. Define sensor orientation with respect to body frame.
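To make the quaternion representation concrete, here is a minimal Python sketch of the standard conversion from a unit quaternion (w, x, y, z) to a 3x3 rotation matrix. The function name is illustrative; the toolbox provides its own MATLAB quaternion type for this.

```python
def quat_to_rotmat(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]
```

For example, the quaternion for a 90-degree rotation about z maps the x-axis onto the y-axis, which the matrix reproduces.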
Sensor Models
Simulate measurements from IMUs (accelerometer, gyroscope, magnetometer), GPS receivers, radar, sonar, and IR sensors under different environmental conditions.
Inertial and GPS Sensors
Model IMUs (inertial measurement units), GPS (Global Positioning System) receivers, and INS (inertial navigation systems). Tune environmental parameters, such as temperature, and the noise properties of the models to mimic real-world conditions.
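The essence of an inertial sensor model is corrupting a true signal with deterministic and stochastic error terms. Here is a minimal Python sketch of a gyroscope model with a constant bias and additive white noise; the function name and two-term error model are simplifying assumptions (real IMU models add scale factor, misalignment, random walk, and temperature effects).

```python
import random

def simulate_gyro(true_rates, bias, noise_std, seed=1):
    """Gyro model: measured rate = true rate + constant bias + white noise."""
    rng = random.Random(seed)  # seeded for reproducible noise
    return [w + bias + rng.gauss(0.0, noise_std) for w in true_rates]
```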
Active Sensors
Model radar and sonar sensors and emitters to generate detections of targets. Simulate mechanical and electronic scans in azimuth, elevation, or both.
Passive Sensors
Model RWR (radar warning receiver), ESM (electronic support measures), passive sonar, and infrared sensors to generate angle-only detections for use in tracking scenarios. Define emitters and channel properties to model interference.
Inertial Sensor Fusion
Estimate orientation and position over time with algorithms that are optimized for different sensor configurations, output requirements, and motion constraints.
Fuse accelerometer and magnetometer readings to simulate an electronic compass (eCompass). Fuse accelerometer, gyroscope, and magnetometer readings with an attitude and heading reference system (AHRS) filter.
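As a language-neutral illustration of the eCompass idea, here is a minimal Python sketch that estimates roll and pitch from the accelerometer, tilt-compensates the magnetometer reading, and returns a heading. The axis convention (x forward, y right, heading positive clockwise from magnetic north) and the function name are assumptions; the toolbox's own filters handle conventions and filtering far more completely.

```python
import math

def ecompass_heading(ax, ay, az, mx, my, mz):
    """Tilt-compensated magnetic heading (radians) from accel + mag readings."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetic field vector into the horizontal plane.
    mxh = (mx * math.cos(pitch)
           + my * math.sin(roll) * math.sin(pitch)
           + mz * math.cos(roll) * math.sin(pitch))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-myh, mxh)
```

With the device level and yawed 45 degrees from magnetic north, the horizontal field components give back a 45-degree heading.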
Estimate pose with and without nonholonomic heading constraints using inertial sensors and GPS. Determine pose without GPS by fusing inertial sensors with altimeters or visual odometry.
Estimation Filters
Use Kalman, particle, and multiple-model filters for different motion and measurement models.
Filters for Object Tracking
Estimate object states using linear, extended, and unscented Kalman filters for linear and non-linear motion and measurement models. Use Gaussian-sum and particle filters for non-linear, non-Gaussian state estimation, including tracking with range-only or angle-only measurements. Improve tracking of maneuvering targets with interacting multiple model (IMM) filters.
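The linear Kalman filter underlying all of these variants is just a predict/update cycle on a state vector and covariance. Here is a self-contained Python sketch for a 1-D constant-velocity target with scalar position measurements; the helper names and plain-list matrix code are illustrative only.

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def transpose(A):
    return [list(col) for col in zip(*A)]

def kf_predict(x, P, F, Q):
    """Time update: propagate state and covariance through the motion model."""
    return mat_mul(F, x), mat_add(mat_mul(mat_mul(F, P), transpose(F)), Q)

def kf_update(x, P, z, H, R):
    """Measurement update for a scalar measurement z (H is 1 x n, R scalar)."""
    S = mat_mul(mat_mul(H, P), transpose(H))[0][0] + R      # innovation covariance
    K = [[row[0] / S] for row in mat_mul(P, transpose(H))]  # Kalman gain (n x 1)
    innov = z - mat_mul(H, x)[0][0]
    x = [[xi[0] + K[i][0] * innov] for i, xi in enumerate(x)]
    IKH = [[(1.0 if i == j else 0.0) - K[i][0] * H[0][j]
            for j in range(len(x))] for i in range(len(x))]
    return x, mat_mul(IKH, P)

# Demo: track a target moving at 1 m/s using noise-free position measurements.
F = [[1.0, 1.0], [0.0, 1.0]]          # constant-velocity transition, dt = 1 s
Q = [[0.01, 0.0], [0.0, 0.01]]        # small process noise
H = [[1.0, 0.0]]                      # measure position only
x, P = [[0.0], [0.0]], [[100.0, 0.0], [0.0, 100.0]]
for k in range(1, 11):
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, float(k), H, 1.0)
```

Extended and unscented variants replace the linear F and H with linearizations or sigma-point propagations of non-linear models, but the predict/update structure is the same.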
Motion and Measurement Models
Configure tracking filters with constant velocity, constant acceleration, constant turn, and custom motion models in Cartesian, spherical, and modified spherical coordinate systems. Define position and velocity, range-angle, angle-only, or custom measurement models.
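A measurement model is just a function from the filter's state to the quantity a sensor reports. As a minimal Python sketch (function name assumed), here is a range-azimuth model mapping a Cartesian [x, y, vx, vy] state to what a 2-D range-bearing sensor would measure:

```python
import math

def measure_range_bearing(state):
    """Map a Cartesian state [x, y, vx, vy] to a (range, azimuth) pair."""
    x, y = state[0], state[1]
    return math.hypot(x, y), math.atan2(y, x)
```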
Multi-Object Trackers
Create multi-object trackers that fuse information from various sensors. Maintain single or multiple hypotheses about the objects they track.
Integrate estimation filters, assignment algorithms, and track management logic into multi-object trackers to fuse detections into tracks. Use a multiple hypothesis tracker (MHT) in challenging scenarios such as tracking closely spaced targets under ambiguity.
Data Association and Track Management
Find the best or k-best solutions to the global nearest neighbor (GNN) assignment problem. Solve the S-D assignment problem. Assign detections to tracks, or tracks to tracks. Confirm and delete tracks based on recent track history or on track score.
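The GNN assignment problem picks the one-to-one pairing of tracks and detections that minimizes total cost. Here is a deliberately naive Python sketch that enumerates all pairings; it is exponential in the number of tracks, so production trackers use polynomial algorithms such as Munkres (Hungarian) or Jonker-Volgenant instead. The function name is an assumption, not toolbox API.

```python
from itertools import permutations

def gnn_assign(cost):
    """Exhaustive GNN: cost[i][j] is the cost of pairing track i with
    detection j (square matrix). Returns (assignment, total cost)."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if c < best_cost:
            best_perm, best_cost = list(perm), c
    return best_perm, best_cost
```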
Track Detection Fusion
Fuse track states and state covariances. Statically fuse synchronous detections, including triangulation of angle-only detections from passive sensors.
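Triangulating angle-only detections amounts to intersecting bearing rays from multiple passive sensors. Here is a minimal Python sketch for the two-sensor, 2-D case (bearings measured from the x-axis; function name and conventions are assumptions), solving the small linear system with Cramer's rule:

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Intersect two 2-D bearing rays to localize an emitter, or None if
    the bearings are parallel (no unique intersection)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (2x2 system, Cramer's rule).
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    if abs(det) < 1e-12:
        return None
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (-bx * d2[1] + d2[0] * by) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With noisy bearings from more than two sensors, a least-squares intersection replaces the exact solve, but the geometry is the same.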
Visualization and Analytics
Analyze and compare the performance of inertial filters and multi-object tracking systems.
Plot object orientation and velocity, ground-truth trajectories, sensor measurements, and tracks in 3D. Plot detection and track uncertainties. Visualize track IDs with history trails.
Sensor and Track Metrics
Generate track establishment, maintenance, and deletion metrics including track length, track breaks, and track ID swaps. Estimate track accuracy with position, velocity, acceleration, and yaw rate root-mean square error (RMSE), along with average normalized estimation error squared (ANEES). Analyze inertial sensor noise using Allan variance.
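Allan variance characterizes inertial sensor noise by comparing averages of successive data clusters at different averaging times. Here is a minimal Python sketch using non-overlapping clusters (the simplest variant; overlapping estimators are more statistically efficient). The function name is illustrative.

```python
def allan_variance(samples, m):
    """Allan variance at cluster size m (averaging time = m * sample period),
    using non-overlapping cluster means."""
    means = [sum(samples[i:i + m]) / m
             for i in range(0, len(samples) - m + 1, m)]
    diffs = [(b - a) ** 2 for a, b in zip(means, means[1:])]
    return 0.5 * sum(diffs) / len(diffs)
```

Sweeping m and plotting the Allan deviation against averaging time on log-log axes reveals the white-noise and bias-instability regions of a gyro or accelerometer.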
Perform track-to-track fusion and architect decentralized tracking systems.
Gaussian Mixture PHD Tracker
Track point objects and extended objects with designated shapes.
Evaluate tracker performance against ground truth using the optimal subpattern assignment (OSPA) metric.
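OSPA scores a set of tracks against a set of truths by combining per-object localization error (capped at a cutoff c) with a cardinality penalty for missed or spurious objects. Here is a brute-force Python sketch for small sets (function name and defaults assumed; practical implementations use an optimal assignment solver rather than enumeration):

```python
import math
from itertools import permutations

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between point sets X and Y, cutoff c, order p."""
    if len(X) > len(Y):
        X, Y = Y, X                     # ensure X is the smaller set
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0
    best = min(                          # best assignment of X into Y
        sum(min(math.dist(X[i], Y[perm[i]]), c) ** p for i in range(m))
        for perm in permutations(range(n), m)
    )
    return ((best + c ** p * (n - m)) / n) ** (1.0 / p)
```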
Tracker Simulink Blocks
Track objects using trackerJPDA Simulink blocks.
AHRS Simulink Block
Estimate orientation based on accelerometer, gyroscope, and magnetometer sensor data.
ENU Reference Frame
Use ENU (east-north-up) reference frame for inertial sensor fusion workflows