Sensor Fusion and Tracking Toolbox

Design, simulate, and test multisensor tracking and positioning systems

Sensor Fusion and Tracking Toolbox™ includes tools for designing, simulating, validating, and deploying systems that fuse data from multiple sensors to maintain situational awareness and localization. Reference examples provide a starting point for multi-object tracking and sensor fusion development for surveillance and autonomous systems, including airborne, spaceborne, ground-based, shipborne, and underwater systems.

You can fuse data from real-world sensors such as active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. To further test your tracking algorithms, you can use the included simulation environment and sensor models. The toolbox also provides multi-object trackers and estimation filters, along with track performance metrics such as OSPA and GOSPA for evaluating and validating fusion architectures.

For simulation acceleration, rapid prototyping, or deployment, the toolbox supports C/C++ code generation.

Get Started

Learn the basics of Sensor Fusion and Tracking Toolbox

Applications

Examples for autonomous system tracking, surveillance system tracking, localization, and hardware connectivity

Orientation, Position, and Coordinate Systems

Quaternions, Euler angles, rotation matrices, and conversions
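
The snippet below is a minimal sketch of a typical conversion round trip, assuming the toolbox quaternion object and its rotmat and eulerd methods: Euler angles to a quaternion, then to a rotation matrix and back to Euler angles in degrees.

    eul = [pi/4 0 0];                     % yaw, pitch, roll (rad), ZYX sequence
    q   = quaternion(eul, 'euler', 'ZYX', 'frame');
    R   = rotmat(q, 'frame');             % 3-by-3 rotation matrix
    ang = eulerd(q, 'ZYX', 'frame');      % Euler angles back, in degrees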

Data Import and Preparation

Import real-world and simulated tracking data; convert data units, formats, and coordinate systems
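
As a sketch of the preparation step, the lines below wrap one imported position measurement in the objectDetection format that the trackers consume; the time, measurement, and noise values are placeholders.

    time = 2.5;                           % detection time (s)
    meas = [10; -5; 0];                   % position measurement (m)
    det  = objectDetection(time, meas, ...
        'MeasurementNoise', 0.5*eye(3), ...   % placeholder measurement covariance
        'SensorIndex', 1);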

Trajectory and Scenario Generation

Ground-truth waypoint- and rate-based trajectories and scenarios
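
A minimal sketch of a waypoint-based ground-truth trajectory attached to a platform in a tracking scenario, assuming the trackingScenario, platform, and waypointTrajectory interfaces; the waypoints and times are illustrative.

    scenario = trackingScenario('UpdateRate', 10);
    plat = platform(scenario);
    plat.Trajectory = waypointTrajectory( ...
        [0 0 0; 100 0 0; 100 100 0], ...  % waypoints (m)
        [0 10 20]);                       % times of arrival (s)
    while advance(scenario)
        p = pose(plat);                   % ground-truth position and orientation
    end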

Sensor Models

IMU, GPS, radar, ESM, and EO/IR
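
The sketch below simulates accelerometer and gyroscope readings for a slowly rotating body, assuming the imuSensor model with its default noise parameters; the motion inputs are placeholders.

    imu = imuSensor('accel-gyro', 'SampleRate', 100);
    acc    = zeros(100, 3);               % linear acceleration input (m/s^2)
    angvel = repmat([0 0 0.5], 100, 1);   % angular velocity input (rad/s)
    [accelReadings, gyroReadings] = imu(acc, angvel);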

Inertial Sensor Fusion

IMU and GPS sensor fusion to determine orientation and position
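
As a sketch of the orientation workflow, the lines below fuse accelerometer and gyroscope readings with the imufilter object; the readings here are placeholders rather than real sensor data.

    fuse = imufilter('SampleRate', 100);
    accelReadings = repmat([0 0 9.81], 100, 1);   % placeholder accelerometer data (m/s^2)
    gyroReadings  = zeros(100, 3);                % placeholder gyroscope data (rad/s)
    orientation   = fuse(accelReadings, gyroReadings);   % orientation as quaternions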

Estimation Filters

Kalman and particle filters, linearization functions, and motion models
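
A minimal sketch of one predict/correct cycle with a constant-velocity extended Kalman filter, assuming the trackingEKF object and the constvel/cvmeas motion and measurement models; the state, noise, and measurement values are placeholders.

    ekf = trackingEKF(@constvel, @cvmeas, zeros(6,1), ...
        'MeasurementNoise', eye(3));
    predict(ekf, 0.1);                    % predict the state 0.1 s ahead
    correct(ekf, [1; 0.5; 0]);            % correct with a position measurement (m)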

Multi-Object Trackers

Multi-sensor multi-object trackers, data association, and track fusion
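
The sketch below runs a single update of a global nearest neighbor tracker on one detection, assuming the trackerGNN object, objectDetection, and the initcvekf filter initialization function.

    tracker = trackerGNN('FilterInitializationFcn', @initcvekf);
    det = objectDetection(0, [10; -5; 0], 'MeasurementNoise', 0.5*eye(3));
    confirmedTracks = tracker({det}, 0);  % may be empty until the track confirms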

Visualization and Analytics

Multi-object theater plots, detections and object tracks, and track metrics
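
As a sketch, the lines below place one detection on a theater plot and construct a GOSPA metric object for scoring tracks against truth; the detection position is a placeholder, and the tracks and truths would come from a scenario run such as those above.

    tp = theaterPlot('XLimits', [-50 50], 'YLimits', [-50 50]);
    dp = detectionPlotter(tp, 'DisplayName', 'Detections');
    plotDetection(dp, [10 -5 0]);         % one detection position (m)

    gospa = trackGOSPAMetric;             % default GOSPA settings
    % score = gospa(tracks, truths);      % tracks from a tracker, truths from a scenario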