
    Synthesize Sensors with Unreal Engine Driving Simulation

    Automated Driving Toolbox™ provides blocks for visualizing sensors in a simulation environment that uses Unreal Engine® from Epic Games®. Learn how to simulate a driving scenario in a prebuilt scene and capture data from the scene using a fisheye camera sensor. Use the model in this video to learn the basics of configuring and simulating scenes, vehicles, and sensors.

    Published: 19 Oct 2021

Hey, everyone. This is Pitambar from MathWorks. In this video, I'm going to show you how to simulate driving scenarios with Unreal Engine, MATLAB, and Simulink. Previously, I did two videos on how to simulate driving scenarios in the cuboid world. This time, I'll be covering the same topic, except with 3D simulation in Unreal Engine.

    To follow along with this example, you'll need Automated Driving Toolbox, Computer Vision Toolbox, and Simulink. Let's get into it.

    First, we'll go through some of the tools that are available to you. Let's open up Simulink, and click on the Library Browser. Click the drop down menu in Automated Driving Toolbox, and then click on Simulation 3D.
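By the way, if you prefer the command line, you can get to the same place programmatically. Here's a minimal sketch; the library name drivingsim3d is my assumption of where the Simulation 3D blocks live, so check your installation if it differs.

% Open the Simulink Library Browser (same as clicking the toolstrip icon).
slLibraryBrowser;

% Or open the Simulation 3D block library directly.
% Assumption: the Automated Driving Toolbox Simulation 3D blocks live in
% the 'drivingsim3d' library; verify the name in your installation.
open_system('drivingsim3d');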

Here, you'll see simulation blocks that allow you to configure prebuilt scenes in the simulation environment; place and move vehicles within those scenes; and set up and simulate camera, radar, and lidar sensors on the vehicle. These simulation blocks provide some of the tools for testing and visualizing path planning, vehicle control, and perception algorithms.

Now let's talk about scenes. To co-simulate with Unreal Engine, we use the Simulation 3D Scene Configuration block. With this block, you can choose from a set of prebuilt scenes where you can test your algorithms. For example, let's select this Straight Road scene. Now when I run the simulation, we will see the scene in Unreal Engine.
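If you'd rather set this up from a script, here's a rough sketch of adding the block and picking a scene programmatically. The block library path and the 'SceneDesc' parameter name are assumptions based on typical block naming, so confirm them with get_param on the block in your release.

% Create a new model and add the scene configuration block.
mdl = 'sceneDemo';
new_system(mdl);
open_system(mdl);

% Assumption: the library path and parameter name below may differ by release.
blk = [mdl '/Simulation 3D Scene Configuration'];
add_block('drivingsim3d/Simulation 3D Scene Configuration', blk);

% Pick one of the prebuilt scenes, for example the straight road.
set_param(blk, 'SceneDesc', 'Straight road');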

    Now let's take a look at the vehicles. For this, we will need to use the Simulation 3D Vehicle with Ground Following block. Using this block, you can control the movement of the vehicle by supplying the X, Y, and Yaw values that define its position and orientation at each time step. If we run this, we'll see the vehicle on our straight road.
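Since the block just consumes X, Y, and Yaw signals, one simple way to drive it is to build a trajectory in MATLAB and feed it in, for example through From Workspace blocks. Here's a minimal sketch; the variable names are placeholders I picked for this example.

% Build a 10-second straight-line trajectory for the vehicle.
t   = (0:0.1:10)';          % time vector in seconds
x   = 5 * t;                % move forward along X at 5 m/s
y   = zeros(size(t));       % stay on the lane center
yaw = zeros(size(t));       % keep the heading aligned with the road (degrees)

% Package the signals as timeseries so From Workspace blocks can feed
% the X, Y, and Yaw inports of the vehicle block.
xTs   = timeseries(x,   t, 'Name', 'X');
yTs   = timeseries(y,   t, 'Name', 'Y');
yawTs = timeseries(yaw, t, 'Name', 'Yaw');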

There are also several sensor models that you can use with Unreal Engine, including camera, radar, and lidar. You can double-click a sensor model block to view and edit its parameters. For example, here we can change the mounting location of our Simulation 3D Camera, as well as the sensor parameters.
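You can also read and change these parameters from the command line instead of the dialog. The sketch below is only illustrative: the block path is a placeholder, and the commented-out parameter name is hypothetical, so list the real names with get_param first.

% Placeholder block path; point this at the camera block in your model.
blk = 'mySim3DModel/Simulation 3D Camera';

% List the parameters the block dialog exposes.
params = get_param(blk, 'DialogParameters');
disp(fieldnames(params));

% Hypothetical example of changing a mounting parameter once you know its name:
% set_param(blk, 'MountLocation', 'Front bumper');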

    Communication between Automated Driving Toolbox and Unreal Engine happens in the following order. First, the Simulation 3D Vehicle with Ground Following block initializes the vehicles and sends their X, Y, and Yaw signal data to the Simulation 3D Scene Configuration block. The Simulation 3D Scene Configuration block receives the vehicle data and sends it to the sensor blocks. Finally, the sensor blocks receive the vehicle data and use it to accurately locate and visualize the vehicles.

This flow of information is helpful when you're trying to debug simulation issues. So let's put all of this information together using a documentation example where we simulate a simple driving scenario and sensor model in the Unreal Engine environment.

I'm going to open up the model, which looks like this. Now that you know about the Simulation 3D blocks, you should be able to understand this model. On the top here, we have our Simulation 3D Scene Configuration block. We are using the Large Parking Lot scene, which we'll see during simulation. We've also set the weather to have some rain.

To the right, we have a fisheye camera mounted onto the car. The output of the sensor will be shown in the Video Viewer when we simulate. Underneath, we have two vehicles: our Ego vehicle, which is following our Target vehicle. Let's go ahead and run the simulation to see it in action. We saw the simulation from that view, but we can also view the output of the fisheye camera using the Video Viewer.
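If you also want to look at the fisheye frames in MATLAB after the run, one option is to log the camera's Image output, for example with a To Workspace block, and step through the frames. The model and variable names below are placeholders for this sketch, and I'm assuming the default timeseries logging format.

% Assumption: the camera's Image output is logged by a To Workspace block
% named 'camFrames' in a model called 'myFisheyeModel' (placeholder names).
out = sim('myFisheyeModel');        % run the simulation from MATLAB
frames = out.camFrames.Data;        % H-by-W-by-3-by-N with timeseries logging

% Step through the captured frames.
for k = 1:size(frames, 4)
    imshow(frames(:,:,:,k));
    drawnow;
end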

So that is a very simple example of simulating with Unreal Engine and Simulink. Now, in this example, we used a prebuilt 3D scene that ships with Automated Driving Toolbox. However, if you want to create custom scenes, you have two main options.

The first is to install the support package that you see on the screen. This allows you to customize scenes in the Unreal Editor and use them in Simulink. For more information on this, we have a series of videos available that can help you get started.

The other option is to use a different MathWorks product called RoadRunner. RoadRunner is a graphical scene editing tool that allows you to export directly to Unreal Engine and several other simulators. If this is something you're interested in, we have several videos on the RoadRunner product page that show you how to use it.

To summarize: to simulate with prebuilt scenes, you can use the Simulink blocks from Automated Driving Toolbox. But if you want to create custom scenes, you have two options: either the support package I just showed, or RoadRunner.

So hopefully this helps you get started with 3D simulation using Unreal Engine, MATLAB, and Simulink. Some of the resources that I mentioned are linked from the MathWorks website where this video is located, but that's all for now. Thanks for watching.
