    Generate Scenarios from Recorded Sensor Data

    In automated driving applications, scenario generation is the process of building virtual scenarios from real-world vehicle data recorded from GPS, IMU, lidar, and camera sensors.

    The Scenario Builder support package is an add-on to Automated Driving Toolbox™ that provides functions and tools to automatically generate scenarios from both raw real-world vehicle data and processed object-list data from perception modules. The resulting scenarios can be exported to ASAM OpenSCENARIO® v1.x and v2.0 formats.

    The generated scenes and scenarios can be used for designing and testing automated driving applications.

    You can get the Scenario Builder support package here: Scenario Builder for Automated Driving Toolbox

    Published: 30 Aug 2023

    Creating realistic and challenging scenarios is important for simulating automated driving applications. With the aid of recorded sensor data, such as camera, LiDAR, and radar, we can create virtual environments for simulation. In this video, we will focus on generating scenes and scenarios from recorded sensor data for scenario-based testing.

    The Scenario Builder support package, an add-on to the Automated Driving Toolbox, helps you generate realistic scenes and scenarios from sensor data recorded from vehicles. Using data from sensors like LiDAR, camera, IMU, GPS, and radar, you can analyze and extract crucial information such as roads, lanes, and static objects. With reconstructed actor trajectories and roadside objects, you can generate scenarios that mimic real-world conditions, which can also be exported to ASAM OpenSCENARIO formats.
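
    As a rough illustration of the end product, the sketch below assembles a small drivingScenario programmatically and exports it to OpenSCENARIO. It uses the core Automated Driving Toolbox API rather than the Scenario Builder support package's automated workflow, and the road geometry, waypoints, speed, and file name are placeholder assumptions.

    % Minimal drivingScenario assembled by hand (placeholder values), then
    % exported to ASAM OpenSCENARIO.
    scenario = drivingScenario;

    % Straight two-lane road defined by its center line (meters).
    roadCenters = [0 0 0; 100 0 0];
    road(scenario, roadCenters, 'Lanes', lanespec(2));

    % Ego vehicle following simple waypoints at a constant speed (m/s).
    egoVehicle = vehicle(scenario, 'ClassID', 1);
    trajectory(egoVehicle, [5 -1.8 0; 95 -1.8 0], 15);

    % Export to OpenSCENARIO (requires a release that supports this export format).
    export(scenario, 'OpenSCENARIO', 'recordedScenario.xosc');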

    Now, let's dive into an example of how you can bring virtual scenarios to life using recorded sensor data. In this demonstration, we used camera data to extract lanes, raw LiDAR data to extract vehicles, and labeled LiDAR data to extract roadside objects such as trees and buildings. The workflow involves the following steps: first, capturing precise sensor data from an actual driving scenario; followed by ego actor localization to accurately extract the vehicle trajectory; then extracting roads with lanes and reconstructing roadside objects; and finally reconstructing non-ego actors and their trajectories. These steps culminate in a virtual scenario that mirrors the real-world scenario.
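
    To make the starting point concrete, the following sketch loads recorded sensor data for the steps discussed next. The file names, the Velodyne device model, and the table column layout are hypothetical stand-ins for your own recordings.

    % Load recorded sensor data for the steps that follow (placeholder files).
    lidarReader = velodyneFileReader('egoDrive.pcap', 'HDL32E');   % raw lidar sweeps
    gpsData     = readtable('egoDriveGPS.csv');                    % lat, lon, alt columns
    imuData     = readtable('egoDriveIMU.csv');                    % accelerometer and gyro readings

    % Inspect one lidar frame to confirm the recording loaded correctly.
    ptCloud = readFrame(lidarReader, 1);
    pcshow(ptCloud);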

    We can also export the scenario to OpenSCENARIO 1.x or 2.0 formats. Let's look into each of these individual steps in detail. Ego localization ensures precise localization of the ego vehicle using GPS data. Here, we can also fuse GPS with IMU data to reduce drift in the orientation of the ego vehicle. Additionally, leveraging lane information, we can correct multi-lane drift caused by bias in the GPS readings.
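
    A minimal localization sketch, continuing from the loading step above: it converts the GPS fixes to local Cartesian coordinates and smooths them into an ego trajectory. The column names, reference origin, and smoothing window are assumptions, and GPS/IMU fusion itself (for example with an INS filter) is only noted in a comment.

    % Convert GPS readings to local Cartesian coordinates (first fix as origin).
    origin = [gpsData.lat(1), gpsData.lon(1), gpsData.alt(1)];
    [xEast, yNorth, zUp] = latlon2local(gpsData.lat, gpsData.lon, gpsData.alt, origin);

    % Light smoothing to suppress GPS jitter before using the path as ego waypoints.
    egoWaypoints = smoothdata([xEast, yNorth, zUp], 'movmean', 5);

    % Approximate heading from consecutive positions; fusing the IMU gyro data
    % (for example with an INS filter) can further reduce orientation drift.
    egoYaw = atan2d(gradient(egoWaypoints(:,2)), gradient(egoWaypoints(:,1)));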

    Road reconstruction involves extracting essential information, such as lanes and road boundaries, from LiDAR data or from a combination of GPS and camera data. We can also reconstruct scenarios with lane additions and drop-offs, road curvature, and junctions.
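
    As a sketch of this step, the code below lays a road along the localized ego path from the previous snippet and attaches a lane specification. The lane count and width are assumed values; in the actual workflow they would come from camera or LiDAR lane detections.

    % Lay a road along the smoothed ego path (subsampled) with an assumed lane layout.
    roadCenters = egoWaypoints(1:10:end, :);
    ls = lanespec(2, 'Width', 3.5);              % two lanes, 3.5 m wide (assumed)

    scenario = drivingScenario;
    road(scenario, roadCenters, 'Lanes', ls);
    plot(scenario);                              % quick visual check of the reconstructed road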

    In the roadside object reconstruction step, we can use labeled LiDAR data to reconstruct static roadside objects such as trees and buildings. In the absence of labeled data, the Scenario Builder can extract trees and buildings from raw radar data. Additionally, it can use a combination of camera and other sensor data to approximate the scene with roadside objects.
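
    A rough sketch of that idea using core toolbox functions (not the support package's own extraction routines): cluster a point cloud and place a static actor at each sizable cluster. Ground removal is omitted, and the clustering threshold, minimum cluster size, and actor dimensions are illustrative assumptions.

    % Cluster the lidar points and approximate each cluster as a static actor.
    [labels, numClusters] = pcsegdist(ptCloud, 0.8);     % Euclidean clustering, 0.8 m threshold

    for i = 1:numClusters
        clusterPts = select(ptCloud, find(labels == i));
        if clusterPts.Count < 50                         % skip tiny clusters (likely noise)
            continue
        end
        center    = double(mean(clusterPts.Location, 1));
        footprint = clusterPts.XLimits(2) - clusterPts.XLimits(1);

        % Static (non-moving) actor standing in for a tree, pole, or building.
        actor(scenario, 'Position', center, ...
            'Length', max(footprint, 1), 'Width', max(footprint, 1), ...
            'Height', max(clusterPts.ZLimits(2) - clusterPts.ZLimits(1), 0.5));
    end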

    Target actor trajectory reconstruction focuses on extracting non-ego actors, such as vehicles, from sensors like camera, LiDAR, radar, or any combination of them, or even from processed object lists. LiDAR sensor data enables the extraction of objects from all sides of the vehicle, while radar sensor data can capture farther objects. Camera sensor data helps identify object classes such as cars, trucks, and more.
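
    To close the loop, the sketch below turns a tracked vehicle's positions into a scenario actor with its own trajectory and replays the result. The waypoints and speed are placeholder values standing in for tracker or object-list output.

    % Add a target vehicle whose waypoints come from tracked detections (placeholders here).
    targetWaypoints = [20 1.8 0; 60 1.8 0; 100 1.8 0];   % tracked positions in the local map frame
    targetSpeed     = 12;                                % estimated speed, m/s

    targetVehicle = vehicle(scenario, 'ClassID', 1);     % ClassID 1 = car
    trajectory(targetVehicle, targetWaypoints, targetSpeed);

    % Step through the reconstructed scenario.
    while advance(scenario)
        pause(0.01);
    end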

    We encourage you to try the Scenario Builder workflow with your own data set. For more information, visit the File Exchange page and download the support package. You can access a number of shipping examples to kick-start your autonomous driving development. Thank you.