Lane-Following Control with Monocular Camera Perception

This example enables you to simulate the combined effects of a vision-based perception algorithm with a lane-following controller.

In this example, you:

  1. Integrate a Simulink®-based control algorithm, a MATLAB®-based vision perception algorithm, and a photorealistic environment based on the Unreal Engine®

  2. Explore the effects of a lane detector on lane-following control

  3. Explore the effect of a vehicle detector on spacing control

Introduction

Control algorithms used in automated driving applications are often designed and tested with Simulink models for closed-loop simulation. Closed-loop simulation involves modeling vehicle dynamics and probabilistic sensor models in a simplified virtual environment. For more information on this workflow, see Lane Following Control with Sensor Fusion and Lane Detection. In that example, the controller is partitioned as a model reference that supports simulation and code generation.

Perception algorithms in automated driving applications are often designed and tested with MATLAB code using recorded data. For example, vision-based perception algorithms are often designed and tested against recorded video. For more information on this workflow, see Visual Perception Using Monocular Camera (Automated Driving Toolbox). In that example, recorded video is used to test a vision processing algorithm partitioned as a MATLAB class.

System engineers often want to explore how the combination of perception and controller algorithms affects system performance. Testing a controller together with a vision-based perception algorithm requires a photorealistic simulation environment. In this example, you enable system-level simulation through integration with the Unreal Engine. The 3-D visualization engine requires a Windows® 64-bit platform.

if ~ispc
    error('Vehicle Dynamics Blockset 3D visualization engine requires a Windows 64-bit platform');
end

Add the example file folder to the MATLAB® path.

addpath(fullfile(matlabroot,'examples','mpc','main'));

To ensure reproducibility of the simulation results, set the random seed.

rng(0)

Explore Integration of Controller and Perception Algorithms

Open the system-level simulation model, which integrates the control and perception algorithms.

open_system('LaneFollowingWithMonoCameraTestBench')

The test bench model integrates the control algorithm by referencing the LFRefMdl Simulink model from Lane Following Control with Sensor Fusion and Lane Detection. The vehicle dynamics are also consistent with that example.

The test bench model integrates a lane boundary and vehicle detection algorithm using the Vision Perception Algorithm block. The Vision Perception Algorithm block is based on the Visual Perception Using Monocular Camera (Automated Driving Toolbox) example. The primary difference from that example is the addition of a wrapper function, monoSensorWrapper.m, that packs the output data into the buses required by LFRefMdl.

The test bench model integrates the Unreal Engine using the techniques described in Scene Interrogation with Camera and Ray Tracing Reference Application (Vehicle Dynamics Blockset), which enable control of the ego vehicle pose and target vehicle pose. The target vehicle pose follows a predetermined path. An ideal camera from the Unreal Engine is used as input to the perception algorithm.

The test bench model does not include a sensor model for radar detections. Therefore, the Zero Radar Detections block outputs a bus consistent with zero radar detections.
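The actual bus layouts are defined by LFRefMdl, but conceptually a wrapper like monoSensorWrapper.m copies the detector's lane-boundary parameters into a struct whose fields mirror the controller's lane-sensor bus. The following is an illustrative sketch only; the field names and parameter ordering are assumptions, not the shipped bus definition:

```matlab
% Illustrative sketch: pack detected lane-boundary parameters into a
% struct shaped like a Simulink lane-sensor bus. Field names here are
% assumptions for illustration; the real layout is defined by LFRefMdl.
function laneBus = packLaneBus(leftBoundary, rightBoundary)
    laneBus.Left.Curvature      = leftBoundary.Parameters(1);
    laneBus.Left.HeadingAngle   = leftBoundary.Parameters(2);
    laneBus.Left.LateralOffset  = leftBoundary.Parameters(3);
    laneBus.Right.Curvature     = rightBoundary.Parameters(1);
    laneBus.Right.HeadingAngle  = rightBoundary.Parameters(2);
    laneBus.Right.LateralOffset = rightBoundary.Parameters(3);
end
```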

Simulate Lane Following Scenario

To see the effects of lane detection independent of vehicle occlusion, disable the target vehicle, then run the simulation.

set_param('LaneFollowingWithMonoCameraTestBench/Target Vehicle Pose',...
          'EnableTargetVehicle','off')
sim('LaneFollowingWithMonoCameraTestBench')
   Assuming no disturbance added to measured output channel #3.
-->Assuming output disturbance added to measured output channel #2 is integrated white noise.
   Assuming no disturbance added to measured output channel #1.
-->Assuming output disturbance added to measured output channel #4 is integrated white noise.
-->The "Model.Noise" property of the "mpc" object is empty. Assuming white noise on each measured output channel.

Plot the lateral controller performance results.

plotLFLateralResults(logsout)

For this simulation, the:

  • Detected lane boundary lateral offsets plot shows the lateral offsets of the detected left and right lane boundaries. The detected values are close to the ground truth; the lane is 4 meters wide.

  • Lateral deviation plot shows the lateral deviation of the ego vehicle from the centerline of the lane. The lateral deviation is close to 0, which implies that the ego vehicle closely follows the centerline.

  • Relative yaw angle plot shows the relative yaw angle between the ego vehicle and the centerline of the lane. The relative yaw angle is very close to 0, which implies that the heading of the ego vehicle closely matches the yaw angle of the centerline.

  • Steering angle plot shows the steering angle of the ego vehicle. The steering angle reaches steady state within 2 seconds, and its trajectory is smooth.

The perception algorithm detects the lane boundaries, and the ego vehicle closely follows the center of the lane.
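To see how the detected boundary offsets relate to lateral deviation, note that if the left and right boundaries are detected at signed lateral offsets (positive to the left), the deviation from the lane center is approximately their mean. The sketch below uses assumed example values and ignores the curvature and heading terms of the boundary model:

```matlab
% Illustrative sketch: approximate lateral deviation from detected
% left/right lane-boundary lateral offsets (signed, in meters).
% The numeric values are assumptions for illustration.
cLeft  =  2.05;   % example detected offset to the left boundary (m)
cRight = -1.95;   % example detected offset to the right boundary (m)
lateralDeviation = (cLeft + cRight)/2;   % 0.05 m from the lane center
laneWidth        = cLeft - cRight;       % 4 m, consistent with the scene
```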

Simulate Lane Following with Spacing Control Scenario

Integration with the Unreal Engine lets you control the position of a target vehicle on the road. To see the effects of lane detection with spacing control, enable the target vehicle, run the system simulation, and plot the results. The simulation runs slower than real time primarily due to processing of the image by the perception algorithm.

set_param('LaneFollowingWithMonoCameraTestBench/Target Vehicle Pose','EnableTargetVehicle','on')
sim('LaneFollowingWithMonoCameraTestBench')
   Assuming no disturbance added to measured output channel #3.
-->Assuming output disturbance added to measured output channel #2 is integrated white noise.
   Assuming no disturbance added to measured output channel #1.
-->Assuming output disturbance added to measured output channel #4 is integrated white noise.
-->The "Model.Noise" property of the "mpc" object is empty. Assuming white noise on each measured output channel.

Plot the longitudinal controller performance results.

plotLFLongitudinalResults(logsout,time_gap,default_spacing)

For this simulation, the:

  • Detected relative distance to target plot shows the detected relative distance between the lead vehicle and the ego vehicle. The detected distance values are noisy.

  • Velocity plot shows the target and ego vehicle velocities. Because the relative-distance detections are noisy, the detected lead vehicle velocity is also noisy, but its average is near the ground truth of 10 m/s. To maintain a safe distance from the lead vehicle, the controller for the ego vehicle follows either the set velocity or the lead vehicle velocity.

  • Distance between ego and target vehicles plot shows the distance between the lead vehicle and the ego vehicle. Most of the time, the relative distance is greater than the safe distance.

  • Acceleration plot shows the ego vehicle acceleration. When the relative distance is less than the safe distance, the controller slows down using maximum braking (for example, from 4 seconds to almost 8 seconds).
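The safe distance in these plots comes from the controller's spacing policy. Assuming the standard time-gap policy used by MPC-based spacing controllers (the time_gap and default_spacing variables passed to plotLFLongitudinalResults suggest this form), the safe distance grows linearly with ego velocity. The numeric values below are assumptions for illustration:

```matlab
% Illustrative sketch of a time-gap spacing policy:
%   d_safe = default_spacing + time_gap * v_ego
% Variable names match those passed to plotLFLongitudinalResults;
% the numeric values are assumptions for illustration.
default_spacing = 10;    % standstill spacing (m)
time_gap        = 1.5;   % desired time gap (s)
v_ego           = 10;    % ego velocity (m/s)
d_safe = default_spacing + time_gap * v_ego;   % 25 m at 10 m/s
```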

Plot the lateral controller performance results.

plotLFLateralResults(logsout)

Similar to the case when the target vehicle is not enabled, the ego vehicle has satisfactory lateral performance. The addition of a target vehicle slightly affects the detection of lane boundaries in some conditions. In the Detected lane boundary lateral offsets plot, the spikes in the lane detections are due to the increased noise introduced by adding the target vehicle. Given the noisier measurements, the lateral deviation (second plot) and relative yaw angle (third plot) are comparatively less accurate. The controller performance (bottom plot) is satisfactory in this case as well. In real-world applications, a tracker is often added to the lane detections to compensate for detection noise.
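A full tracker is beyond the scope of this example, but even a simple first-order (exponential) filter applied to the detected lateral offsets illustrates the idea of smoothing detection noise. This sketch is not part of the shipped example, and the synthetic data is an assumption:

```matlab
% Illustrative sketch: exponential smoothing of noisy lane-offset
% detections. alpha near 1 trusts new detections more; alpha near 0
% smooths more heavily (at the cost of added lag).
detectedOffsets = 2 + 0.1*randn(1,100);  % synthetic noisy offsets (m)
alpha = 0.3;
filtered = zeros(size(detectedOffsets));
filtered(1) = detectedOffsets(1);
for k = 2:numel(detectedOffsets)
    filtered(k) = alpha*detectedOffsets(k) + (1-alpha)*filtered(k-1);
end
```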

Conclusion

This example shows how to enable system-level simulation of controls and vision-based perception algorithms by integrating a Simulink-based control algorithm (using a reference model) with a lane boundary and vehicle detection algorithm using the Vision Perception Algorithm block. The example also examines the system-level effects of combining these algorithms by simulating the vehicle dynamics and perception in a photorealistic environment using the Unreal Engine.

Remove the example file folder from the MATLAB path.

rmpath(fullfile(matlabroot,'examples','mpc','main'));

See Also

Blocks

Related Topics