Sensor Fusion and Tracking Toolbox
Design, simulate, and test multisensor tracking and positioning systems
Sensor Fusion and Tracking Toolbox includes algorithms and tools for designing, simulating, and testing systems that fuse data from multiple sensors to maintain situational awareness and localization. Reference examples provide a starting point for multi-object tracking and sensor fusion development for surveillance and autonomous systems, including airborne, spaceborne, ground-based, shipborne, and underwater systems.
You can fuse data from real-world sensors, including active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. You can also generate synthetic data from virtual sensors to test your algorithms under different scenarios. The toolbox includes multi-object trackers and estimation filters for evaluating architectures that combine grid-level, detection-level, and object- or track-level fusion. It also provides metrics, including OSPA and GOSPA, for validating performance against ground truth scenes.
For simulation acceleration or rapid prototyping, the toolbox supports C and C++ code generation.
Simulate scenarios and track objects for inertial navigation, autonomous systems, and surveillance applications.
Define multiplatform scenarios, then assign motion profiles and attach sensor models to each platform. Simulate these scenarios and dynamically visualize the platform trajectories, sensor coverages, and object detections.
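The toolbox defines scenarios in MATLAB, which is not shown here; as a language-neutral illustration, the following Python/NumPy sketch simulates one platform on a 2-D constant-velocity motion profile and generates noisy position detections from a hypothetical attached sensor (all names and noise settings are assumptions, not toolbox APIs):

```python
import numpy as np

def simulate_platform(x0, v, dt, n_steps, meas_sigma, rng):
    """Propagate a 2-D constant-velocity platform and return the true
    trajectory plus noisy position detections (a stand-in for a
    sensor model attached to the platform)."""
    truth = np.array([x0 + v * k * dt for k in range(n_steps)])
    detections = truth + rng.normal(0.0, meas_sigma, truth.shape)
    return truth, detections

rng = np.random.default_rng(0)
truth, dets = simulate_platform(x0=np.array([0.0, 0.0]),
                                v=np.array([10.0, 5.0]),
                                dt=1.0, n_steps=20,
                                meas_sigma=2.0, rng=rng)
```

A full scenario would repeat this for several platforms, each with its own motion profile and sensor models, and feed the detections to a tracker.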
Use various estimation filters, such as Kalman filters, multi-model filters, and particle filters, to estimate object states. Each filter is optimized for a specific scenario, such as a linear or nonlinear motion model or incomplete observability.
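The simplest of these estimators, the linear Kalman filter, can be sketched in a few lines. This is a generic textbook predict/update cycle in Python/NumPy, not the toolbox's implementation; the 1-D constant-velocity setup below (matrices F, Q, H, R) is an assumed example:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement z.
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Example: track a 1-D target moving at 2 m/s; state is [position, velocity].
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion
Q = 1e-4 * np.eye(2)                       # small process noise
H = np.array([[1.0, 0.0]])                 # position-only sensor
R = np.array([[0.25]])                     # measurement noise covariance

x, P = np.zeros(2), 10.0 * np.eye(2)
for k in range(1, 30):
    z = np.array([2.0 * k * dt])           # noiseless position measurements
    x, P = kalman_step(x, P, z, F, Q, H, R)
```

With consistent motion and measurement models, the velocity estimate converges toward the true 2 m/s even though only position is measured.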
Use multi-object, multi-sensor trackers that integrate filters, data association, and track management. Choose from trackers based on single-hypothesis or multiple-hypothesis data association, joint probabilistic data association, random finite sets, or occupancy grids.
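The data-association step at the heart of these trackers can be illustrated with global nearest neighbor (GNN): assign detections to tracks so the total assignment cost is minimized, rejecting pairs beyond a distance gate. A minimal Python/SciPy sketch, with Euclidean distance standing in for a statistical gating distance (function names and the gate value are assumptions):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(tracks, detections, gate):
    """Global nearest neighbor association: minimize total Euclidean
    distance between track and detection positions, then discard any
    assigned pair farther apart than `gate`."""
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)       # optimal assignment
    pairs = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]
    unassigned = sorted(set(range(len(detections))) - {c for _, c in pairs})
    return pairs, unassigned

tracks = np.array([[0.0, 0.0], [10.0, 10.0]])
dets = np.array([[10.5, 10.0], [0.2, -0.1], [50.0, 50.0]])
pairs, unassigned = associate(tracks, dets, gate=2.0)
```

In a full tracker, unassigned detections seed tentative new tracks and tracks that repeatedly miss assignments are deleted, which is the track-management role mentioned above.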
Explore centralized or decentralized multi-object tracking architectures and evaluate design trade-offs among track-to-track fusion, central-level tracking, and hybrid tracking architectures for various tracking applications.
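In a track-to-track fusion architecture, each node runs its own tracker and a fusion node combines the resulting track estimates. One widely used fusion rule when the cross-correlation between tracks is unknown is covariance intersection; the sketch below is that generic rule in Python/NumPy (a fixed weight `w` is assumed for simplicity; in practice `w` is often optimized), not the toolbox's fusion algorithm:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, w):
    """Fuse two track estimates (x1, P1) and (x2, P2) with unknown
    cross-correlation using covariance intersection, weight w in (0, 1)."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    Pf = np.linalg.inv(w * P1i + (1.0 - w) * P2i)  # fused covariance
    xf = Pf @ (w * P1i @ x1 + (1.0 - w) * P2i @ x2)  # fused state
    return xf, Pf

# Two 1-D tracks of the same object from different sensors.
xf, Pf = covariance_intersection(np.array([0.0]), np.array([[2.0]]),
                                 np.array([2.0]), np.array([[2.0]]),
                                 w=0.5)
```

With equal covariances and equal weight, the fused state lands midway between the two estimates, and the fused covariance stays conservative rather than shrinking as a naive independent-tracks fusion would.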
Analyze and evaluate the performance of tracking systems against ground truth using various tracking metrics. Visualize ground truth, sensor coverages, detections, and tracks on a map or in a MATLAB figure.
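The OSPA metric mentioned earlier combines a localization error over the optimally assigned track/truth pairs with a cardinality penalty for missed or false tracks. A minimal Python/SciPy sketch of the standard OSPA definition over point sets (cutoff `c` and order `p` values are assumed defaults, not toolbox settings):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between two point sets X (m x d) and Y (n x d),
    with cutoff c and order p."""
    m, n = len(X), len(Y)
    if m == 0 and n == 0:
        return 0.0
    if m > n:                       # ensure the smaller set is X
        X, Y, m, n = Y, X, n, m
    d = np.minimum(np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2), c)
    r, cols = linear_sum_assignment(d)      # optimal pairing
    loc = np.sum(d[r, cols] ** p)           # localization component
    card = (c ** p) * (n - m)               # cardinality penalty
    return ((loc + card) / n) ** (1.0 / p)

truth = np.array([[0.0, 0.0]])
tracks = np.array([[0.0, 0.0], [3.0, 4.0]])   # one match plus one false track
err = ospa(truth, tracks)
```

A perfect track set gives an OSPA of 0; each missed or false track pushes the metric toward the cutoff `c`. GOSPA refines this decomposition but follows the same assignment-based structure.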
Deploy algorithms to hardware targets by automatically generating C/C++ code from fusion and tracking algorithms. Deploy the generated code to low-cost embedded hardware with limited memory and strictly single-precision arithmetic.