Sensor Fusion and Tracking Toolbox

Design, simulate, and test multisensor tracking and positioning systems

Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for designing, simulating, and testing systems that fuse data from multiple sensors to maintain situational awareness and localization. Reference examples provide a starting point for multi-object tracking and sensor fusion development for surveillance and autonomous systems, including airborne, spaceborne, ground-based, shipborne, and underwater systems.

You can fuse data from real-world sensors, including active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. You can also generate synthetic data from virtual sensors to test your algorithms under different scenarios. The toolbox includes multi-object trackers and estimation filters for evaluating architectures that combine grid-level, detection-level, and object- or track-level fusion. It also provides metrics, including OSPA and GOSPA, for validating performance against ground truth scenes.

For simulation acceleration or rapid prototyping, the toolbox supports C code generation.

Tracking for Surveillance Systems

Track targets in surveillance regions using data from active and passive sensors mounted on stationary and moving platforms.

Airspace Surveillance

Track multiple objects using data from active and passive sensors such as radar, ADS-B, and EO/IR sensors. Customize trackers to handle maneuvering objects.

Tracking aircraft with Earth-centered scenarios.

Space Surveillance

Track multiple spaceborne objects using data from radar sensors to generate space situational awareness. You can configure the trackers to use a Keplerian motion model or other orbital models.

Tracking space debris using Keplerian motion models.
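A Keplerian motion model treats the tracked object as a point mass orbiting under Earth's gravity alone. A minimal pure-Python sketch of that idea, a fixed-step RK4 two-body propagator (illustrative only, not the toolbox's implementation; function names are invented):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def two_body_accel(pos):
    """Point-mass (Keplerian) gravitational acceleration at position pos (m)."""
    r = math.sqrt(sum(c * c for c in pos))
    k = -MU_EARTH / r**3
    return [k * c for c in pos]

def propagate(pos, vel, dt, steps):
    """Propagate position/velocity with fixed-step RK4 under two-body dynamics."""
    for _ in range(steps):
        def deriv(p, v):
            return v, two_body_accel(p)
        k1p, k1v = deriv(pos, vel)
        k2p, k2v = deriv([p + dt / 2 * d for p, d in zip(pos, k1p)],
                         [v + dt / 2 * d for v, d in zip(vel, k1v)])
        k3p, k3v = deriv([p + dt / 2 * d for p, d in zip(pos, k2p)],
                         [v + dt / 2 * d for v, d in zip(vel, k2v)])
        k4p, k4v = deriv([p + dt * d for p, d in zip(pos, k3p)],
                         [v + dt * d for v, d in zip(vel, k3v)])
        pos = [p + dt / 6 * (a + 2 * b + 2 * c + d)
               for p, a, b, c, d in zip(pos, k1p, k2p, k3p, k4p)]
        vel = [v + dt / 6 * (a + 2 * b + 2 * c + d)
               for v, a, b, c, d in zip(vel, k1v, k2v, k3v, k4v)]
    return pos, vel
```

For a circular orbit at a 7000 km radius, the propagated radius stays essentially constant, which is a quick sanity check on the dynamics.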

Ground and Maritime Surveillance

Track extended objects in ground-based and maritime applications using high-resolution radar and lidar sensors.

Extended object tracking with lidar.

Tracking for Autonomous Systems

Improve perception systems in autonomous vehicles by tracking extended objects using camera, radar, and lidar data. Fuse point clouds, detections, and tracks from multiple sensors to estimate the position, kinematics, extent, and orientation of these objects.

Single Sensor Tracking

Model and simulate multi-object trackers to perform the processing required in smart sensors. This includes transforming raw data into object track lists.

Track objects using a 3D bounding box generated from a lidar point cloud.

Centralized Fusion

Track extended objects with a centralized tracker that fuses data from multiple sensors and sensor modalities. Use a probability hypothesis density (PHD) tracker to estimate the kinematics of moving objects, along with the objects’ dimensions and orientation. For complex urban environments, implement a random finite set (RFS) grid-based tracker to track each grid cell’s occupancy as well as its kinematics.

Using a dynamic occupancy grid map in urban driving scenes.
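A grid-based tracker maintains a per-cell occupancy estimate that is refined with every new scan. The core of that update can be sketched in a few lines of pure Python using Bayesian log-odds (an illustrative sketch, not the toolbox's RFS grid tracker; the inverse-sensor-model probability `p_meas` is a stand-in):

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def update_cell(logodds, p_meas):
    """Bayesian log-odds update of one grid cell, given the probability
    p_meas that the cell is occupied according to the latest measurement."""
    return logodds + logit(p_meas)

def prob(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))
```

Starting from the uninformed prior (log-odds 0, probability 0.5), repeated measurements that each suggest 70% occupancy quickly drive the cell's probability above 0.9.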

Track-Level Fusion

Fuse tracks from multiple tracking sources to provide a more comprehensive estimate of the environment. Evaluate track-to-track fusion architectures in systems with bandwidth constraints and systems that employ rumor control to eliminate stale results.

Track-level fusion with lidar and radar sensors.

Multi-Object Tracking

Integrate and configure Kalman and particle filters, data association algorithms, and multisensor multi-object trackers. Maintain single or multiple hypotheses about the tracked objects.

Estimation Filters and Data Association

Estimate object states using a rich library of estimation filters, including linear and nonlinear Kalman filters, multimodel filters, and particle filters. Find the best or k-best solutions to the 2D assignment problem or the S-D assignment problem. Assign detections to detections, detections to tracks, or tracks to tracks.

Range-only tracking with non-Gaussian filters.
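At its core, a linear Kalman filter alternates a prediction step that inflates uncertainty with an update step that blends in each measurement according to the Kalman gain. A minimal scalar sketch, assuming a constant state (illustrative only, not the toolbox's filter API):

```python
def kf_predict(x, P, q):
    """Predict a constant scalar state: the estimate is unchanged,
    but the variance grows by the process noise q."""
    return x, P + q

def kf_update(x, P, z, r):
    """Fuse measurement z (variance r) into the estimate via the Kalman gain."""
    K = P / (P + r)            # gain: how much to trust the measurement
    return x + K * (z - x), (1.0 - K) * P
```

Running the filter over a handful of noisy measurements of a constant pulls the estimate toward the true value while shrinking its variance.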

Multi-Object Trackers

Integrate estimation filters, assignment algorithms, and track management logic into multi-object trackers to fuse detections into tracks. Convert your sensor data into a detection format and use a global nearest neighbor (GNN) tracker for simple scenarios. Easily switch to a joint probabilistic data association (JPDA) tracker, a multiple hypothesis tracker (MHT), or a PHD tracker for challenging scenarios such as tracking closely spaced targets where measurement ambiguities exist.

Track closely spaced targets where measurement ambiguities exist.
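The association step of a GNN tracker can be sketched as a search over feasible detection-to-track assignments, with coarse gating pruning pairs that are too far apart. A brute-force pure-Python sketch for scalar states (a real tracker uses an efficient 2D assignment solver instead of enumerating permutations):

```python
import itertools
import math

def gnn_assign(tracks, detections, gate):
    """Global nearest neighbor association: choose the assignment of
    detections to tracks with minimum total distance. Pairs whose
    distance exceeds the gate are forbidden. Assumes at least as many
    detections as tracks; returns (assignment, total_cost)."""
    best, best_cost = None, math.inf
    n = len(tracks)
    for perm in itertools.permutations(range(len(detections)), n):
        cost = 0.0
        for t, d in enumerate(perm):
            c = abs(tracks[t] - detections[d])
            if c > gate:          # coarse gating: prune infeasible pairs
                cost = math.inf
                break
            cost += c
        if cost < best_cost:
            best, best_cost = list(perm), cost
    return best, best_cost
```

With two tracks, three detections, and a tight gate, the gate immediately rules out the distant detection and the remaining pairing is unambiguous.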

Extended Object and Grid-Based Trackers

Use a PHD tracker to track the kinematics, size, and orientation of extended objects. Using high-resolution sensor data such as lidar and radar point clouds, apply grid-based RFS trackers to estimate the dynamic characteristics of grid cells in complex urban environments.

Extended object tracking with size and orientation estimation.

Centralized and Decentralized Tracking

Build centralized and decentralized tracking architectures that fuse sensor reports within communication bandwidth limitations. Use different methods for state and state covariance fusion.

Track-Level Fusion

Fuse tracks generated by tracking sensors or other track-to-track fusion objects. Architect decentralized tracking systems under bandwidth constraints, and reduce rumor propagation to eliminate stale tracker results.

Track-to-track fusion between two vehicles.
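The simplest track-to-track state fusion weights each incoming estimate by its inverse covariance. A scalar sketch, assuming the two estimation errors are uncorrelated (in practice, cross-correlation between trackers must be accounted for, for example with covariance intersection):

```python
def fuse_tracks(x1, P1, x2, P2):
    """Information-weighted fusion of two scalar track estimates
    (state x, variance P), assuming uncorrelated estimation errors."""
    P = 1.0 / (1.0 / P1 + 1.0 / P2)   # fused variance never exceeds either input
    x = P * (x1 / P1 + x2 / P2)       # inverse-variance weighted mean
    return x, P
```

Two equally confident estimates fuse to their midpoint with half the variance; an estimate with a smaller variance pulls the fused state toward itself.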

Fusion Architectures

Explore tracker architectures and evaluate design trade-offs among track-to-track fusion, central-level tracking, and hybrid tracking architectures. Use static (detection) fusion to combine detections from angle-only and range-only sensors such as IR, ESM, or bistatic radars.

Track using distributed synchronous passive sensors.

Tracking Scenario Simulation

Generate sensor reports to test tracking systems. Define multiplatform scenarios and generate motion profiles for each platform using waypoint-based and kinematics-based trajectories. Attach sensor models and signatures to each platform and simulate their reports statistically. Use a simulated ground truth in Monte Carlo simulations to verify and validate tracking systems.
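A waypoint-based trajectory can be sketched as piecewise-linear interpolation between timed waypoints, yielding a position and velocity for any query time (illustrative only; the toolbox also supports smoother kinematics-based motion profiles):

```python
def waypoint_state(waypoints, times, t):
    """Position and velocity at time t along a piecewise-linear trajectory.
    waypoints is a list of coordinate lists; times the matching timestamps."""
    for i in range(len(times) - 1):
        if times[i] <= t <= times[i + 1]:
            dt = times[i + 1] - times[i]
            a = (t - times[i]) / dt          # fraction of the segment covered
            vel = [(q - p) / dt for p, q in zip(waypoints[i], waypoints[i + 1])]
            pos = [p + a * (q - p) for p, q in zip(waypoints[i], waypoints[i + 1])]
            return pos, vel
    raise ValueError("t outside trajectory time span")
```

Querying a two-segment path mid-segment returns the interpolated position and the segment's constant velocity.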

Object Trajectory and Pose Generation

Define scenarios interactively with the Tracking Scenario Designer app and generate MATLAB scripts that define and convert the true position, velocity, and orientation of objects in different reference frames.

Active and Passive Sensor Models

Model active sensors (including radar, sonar, and lidar) to generate detections of objects. Simulate mechanical and electronic scans across azimuth, elevation, or both. Model radar warning receiver (RWR), electronic support measure (ESM), passive sonar, and infrared sensors to generate angle-only detections for use in tracking scenarios. Model multistatic radar and sonar systems with emitters and sensors.
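A passive sensor such as an ESM receiver or IR camera reports only the direction to a target. Before measurement noise is added, the geometry behind such an angle-only detection reduces to two arctangents (a sketch with invented function names; azimuth measured in the x-y plane from +x, elevation from that plane):

```python
import math

def angle_only_detection(sensor_pos, target_pos):
    """Azimuth and elevation (radians) of a target as seen by a passive
    sensor, from the line-of-sight vector between the two positions."""
    dx, dy, dz = (t - s for s, t in zip(sensor_pos, target_pos))
    az = math.atan2(dy, dx)                      # bearing in the x-y plane
    el = math.atan2(dz, math.hypot(dx, dy))      # angle above that plane
    return az, el
```

A target offset equally along x, y, and the matching vertical distance sits at 45 degrees in both angles, which makes a convenient check.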

Monte Carlo Simulations

Perform Monte Carlo simulations using different random noise values. Perturb ground truth and sensor configurations to increase testing robustness.

Perturbing trajectories and sensors to generate test data.

Localization for Tracking Platforms

Perform IMU, GPS, and altimeter sensor fusion to determine orientation and position over time and enable tracking with moving platforms. Estimate orientation and position for inertial navigation systems (INS) with algorithms optimized for different sensor configurations, output requirements, and motion constraints.

INS Sensor Models

Model inertial measurement unit (IMU), GPS, altimeter, and INS sensors. Tune environmental parameters, such as temperature, and noise properties of the models to emulate real-world environments.

Model IMU and GPS sensors to test inertial fusion algorithms.
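A minimal gyroscope measurement model adds a constant bias and zero-mean white noise to the true angular rate (a sketch only; real IMU models also include scale factor, axis misalignment, random walk, and temperature effects):

```python
import random

def gyro_model(true_rate, bias, noise_std, rng):
    """One gyroscope sample: true angular rate plus a constant bias
    plus zero-mean Gaussian white noise drawn from rng."""
    return true_rate + bias + rng.gauss(0.0, noise_std)
```

With the noise turned off the model reduces to rate plus bias, and averaging many noisy samples of a stationary sensor recovers the bias, which is how a simple bias calibration works.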

Orientation Estimation

Fuse accelerometer and magnetometer readings to simulate an electronic compass (eCompass). Fuse accelerometer, gyroscope, and magnetometer readings with an attitude and heading reference system (AHRS) filter to estimate orientation.

Estimating the orientation of a platform by fusing inertial sensors.
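The eCompass computation uses the accelerometer to estimate tilt, rotates the magnetometer reading into the horizontal plane, and takes the heading from the result. A sketch assuming x-forward, y-right, z-down body axes and an accelerometer that reads +1 g on z when level (sign conventions vary between devices):

```python
import math

def ecompass_heading(accel, mag):
    """Tilt-compensated compass heading (radians, clockwise from north)
    from body-frame accelerometer and magnetometer readings."""
    ax, ay, az = accel
    mx, my, mz = mag
    roll = math.atan2(ay, az)                        # tilt about x
    pitch = math.atan2(-ax, math.hypot(ay, az))      # tilt about y
    # rotate the magnetometer reading into the horizontal plane
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-my_h, mx_h)
```

For a level device, roll and pitch vanish and the heading comes straight from the horizontal magnetometer components.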

Pose Estimation

Estimate pose with and without nonholonomic heading constraints using inertial sensors and GPS. Determine pose without GPS by fusing inertial sensors with altimeters or visual odometry.

Visual-inertial odometry using fused IMU and camera data.

Visualization and Analytics

Analyze and evaluate the performance of tracking systems against ground truth.

Scenario Visualization

Plot the orientation and velocity of objects, ground truth trajectories, sensor measurements, and tracks in 3D. Plot detection and track uncertainties. Visualize track IDs with history trails.

Theater plot of a multiplatform scenario.

Sensor and Track Metrics

Generate track establishment, maintenance, and deletion metrics including track length, track breaks, and track ID swaps. Estimate track accuracy with position, velocity, acceleration, and yaw rate root-mean-square error (RMSE) or average normalized estimation error squared (ANEES). Use integrated OSPA and GOSPA metrics to summarize performance in a single score. Analyze inertial sensor noise using Allan variance.

Integrated tracking metrics to evaluate tracker performance against ground truth.
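The OSPA metric combines per-object localization error with a cardinality penalty in a single score: each error is capped at a cutoff c, every missed or false track is charged c, and an order p weights large errors. A brute-force sketch for scalar states (illustrative; practical implementations solve the assignment efficiently):

```python
import itertools
import math

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between point sets X and Y (lists of scalars)
    with cutoff c and order p."""
    if len(X) > len(Y):
        X, Y = Y, X                   # ensure X is the smaller set
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0                    # both sets empty
    best = math.inf
    for perm in itertools.permutations(range(n), m):
        cost = sum(min(abs(X[i] - Y[j]), c) ** p
                   for i, j in zip(range(m), perm))
        best = min(best, cost)
    # unassigned points in the larger set each cost the full cutoff c
    return ((best + (n - m) * c ** p) / n) ** (1.0 / p)
```

Two near-matching sets score close to zero, while an extra false track raises the score toward the cutoff regardless of how well the matched object is localized.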

Tuning Filters and Trackers

Tune parameters of multi-object trackers such as the assignment threshold, filter initialization function, and confirmation and deletion thresholds to maximize performance. Compare results across trackers and tracker configurations. Automatically tune INS filters to optimize noise parameters.

Tracking point targets in dense clutter with a GM-PHD tracker.

Algorithm Acceleration and Code Generation

Speed up simulations by applying coarse gating, generating C/C++ and MEX code, or using pools of workers.

Code Generation

Generate C/C++ and MEX code for simulation acceleration or desktop prototyping using MATLAB Coder™. Apply cost calculation thresholds to reduce time spent on calculating the assignment cost.

Tracking thousands of targets with generated code for fastest simulation time.
