Simulation 3D Fisheye Camera
Libraries:
UAV Toolbox /
Simulation 3D
Automated Driving Toolbox /
Simulation 3D
Simulink 3D Animation /
Simulation 3D /
Sensors
Description
Note
Simulating models with the Simulation 3D Fisheye Camera block requires Computer Vision Toolbox™.
The Simulation 3D Fisheye Camera block provides an interface to a camera with a fisheye lens in a 3D simulation environment. This environment is rendered using the Unreal Engine® from Epic Games®. The sensor is based on the fisheye camera model proposed by Scaramuzza [1]. This model supports a field of view of up to 195 degrees. The block outputs an image with the specified camera distortion and size. You can also output the location and orientation of the camera in the world coordinate system of the scene.
If you set Sample time to -1, the block uses the sample time specified in the Simulation 3D Scene Configuration block. To use this sensor, you must include a Simulation 3D Scene Configuration block in your model.
Tip
The Simulation 3D Scene Configuration block must execute before the Simulation 3D Fisheye Camera block. That way, the Unreal Engine 3D visualization environment prepares the data before the Simulation 3D Fisheye Camera block receives it. To check the block execution order, right-click the blocks and select Properties. On the General tab, confirm these Priority settings:
- Simulation 3D Scene Configuration — 0
- Simulation 3D Fisheye Camera — 1
For more information about execution order, see Control and Display Execution Order (Simulink).
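You can also check or set these priorities programmatically. The following is a minimal sketch assuming a model named myModel that contains blocks with the default library names; adjust the paths to match your model.

```matlab
% Assumed model and block names -- replace with your own.
mdl = "myModel";
sceneCfg = mdl + "/Simulation 3D Scene Configuration";
fisheyeCam = mdl + "/Simulation 3D Fisheye Camera";

% Inspect the current execution priorities.
get_param(sceneCfg, "Priority")
get_param(fisheyeCam, "Priority")

% Set the recommended priorities so the scene configuration block runs first.
set_param(sceneCfg, "Priority", "0");
set_param(fisheyeCam, "Priority", "1");
```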
Examples
Simulate Simple Driving Scenario and Sensor in Unreal Engine Environment
Learn the basics of configuring and simulating scenes, vehicles, and sensors in a virtual environment rendered using the Unreal Engine from Epic Games.
(Automated Driving Toolbox)
Ports
Input
Rel Translation — Relative translation of sensor from mounting point (m)
[0 0 0] (default) | real-valued 1-by-3 vector of the form [X Y Z]
Relative translation of the sensor from its mounting point on the vehicle, in meters, specified as a real-valued 1-by-3 vector of the form [X Y Z].
Dependencies
To enable this port, select the Input parameter next to the Relative translation [X, Y, Z] (m) parameter. When you select Input, the Relative translation [X, Y, Z] (m) parameter specifies the initial relative translation and the Rel Translation port specifies the relative translation during simulation. For more details, see Sensor Position Transformation.
Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64
Rel Rotation — Relative rotation of sensor from mounting point (deg)
[0 0 0] (default) | real-valued 1-by-3 vector of the form [Roll Pitch Yaw]
Relative rotation of the sensor from its mounting point on the vehicle, in degrees, specified as a real-valued 1-by-3 vector of the form [Roll Pitch Yaw].
Dependencies
To enable this port, select the Input parameter next to the Relative rotation [Roll, Pitch, Yaw] (deg) parameter. When you select Input, the Relative rotation [Roll, Pitch, Yaw] (deg) parameter specifies the initial relative rotation and the Rel Rotation port specifies the relative rotation during simulation. For more details, see Sensor Position Transformation.
Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64
Output
Image — 3D output camera image
m-by-n-by-3 array of RGB triplet values
3D output camera image, returned as an m-by-n-by-3 array of RGB triplet values. m is the vertical resolution of the image, and n is the horizontal resolution of the image.
Data Types: int8 | uint8
Translation — Sensor location
real-valued 1-by-3 vector
Sensor location along the X-axis, Y-axis, and Z-axis of the scene. The Translation values are in the world coordinates of the scene. In this coordinate system, the Z-axis points up from the ground. Units are in meters.
Dependencies
To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).
Data Types: double
Rotation — Sensor orientation
real-valued 1-by-3 vector
Roll, pitch, and yaw sensor orientation about the X-axis, Y-axis, and Z-axis of the scene. The Rotation values are in the world coordinates of the scene. These values are positive in the clockwise direction when looking in the positive directions of these axes. Units are in radians.
Dependencies
To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).
Data Types: double
Parameters
Mounting
Sensor identifier — Unique sensor identifier
1 (default) | positive integer
Specify the unique identifier of the sensor. In a multisensor system, the sensor identifier enables you to distinguish between sensors. When you add a new sensor block to your model, the Sensor identifier of that block is N + 1, where N is the highest Sensor identifier value among the existing sensor blocks in the model.
Example: 2
Parent name — Name of parent vehicle
Scene Origin (default) | vehicle name
Name of the parent to which the sensor is mounted, specified as Scene Origin or as the name of a vehicle in your model. The vehicle names that you can select correspond to the Name parameters of the simulation 3D vehicle blocks in your model. If you select Scene Origin, the block places the sensor at the scene origin.
Example: SimulinkVehicle1
Mounting location — Sensor mounting location
Origin (default) | Front bumper | Rear bumper | Right mirror | Left mirror | Rearview mirror | Hood center | Roof center | ...
Sensor mounting location.
When Parent name is Scene Origin, the block mounts the sensor to the origin of the scene. You can set the Mounting location to Origin only. During simulation, the sensor remains stationary.
When Parent name is the name of a vehicle, the block mounts the sensor to one of the predefined mounting locations described in the table. During simulation, the sensor travels with the vehicle.
| Vehicle Mounting Location | Description | Orientation Relative to Vehicle Origin [Roll, Pitch, Yaw] (deg) |
| --- | --- | --- |
| Origin | Forward-facing sensor mounted to the vehicle origin, which is on the ground and at the geometric center of the vehicle (see Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox (Automated Driving Toolbox)) | [0, 0, 0] |
| Front bumper | Forward-facing sensor mounted to the front bumper | [0, 0, 0] |
| Rear bumper | Backward-facing sensor mounted to the rear bumper | [0, 0, 180] |
| Right mirror | Downward-facing sensor mounted to the right side-view mirror | [0, –90, 0] |
| Left mirror | Downward-facing sensor mounted to the left side-view mirror | [0, –90, 0] |
| Rearview mirror | Forward-facing sensor mounted to the rearview mirror, inside the vehicle | [0, 0, 0] |
| Hood center | Forward-facing sensor mounted to the center of the hood | [0, 0, 0] |
| Roof center | Forward-facing sensor mounted to the center of the roof | [0, 0, 0] |
Roll, pitch, and yaw are clockwise-positive when looking in the positive direction of the X-axis, Y-axis, and Z-axis, respectively. When looking at a vehicle from above, the yaw angle (the orientation angle) is counterclockwise-positive because you are looking in the negative direction of the Z-axis.
The X-Y-Z mounting location of the sensor relative to the vehicle depends on the vehicle type. To specify the vehicle type, use the Type parameter of the Simulation 3D Vehicle with Ground Following block to which you mount the sensor. To obtain the X-Y-Z mounting locations for a vehicle type, see the reference page for that vehicle.
To determine the location of the sensor in world coordinates, open the sensor block. Then, on the Ground Truth tab, select the Output location (m) and orientation (rad) parameter and inspect the data from the Translation output port.
Specify offset — Specify offset from mounting location
off (default) | on
Select this parameter to specify an offset from the mounting location by using the Relative translation [X, Y, Z] (m) and Relative rotation [Roll, Pitch, Yaw] (deg) parameters.
Relative translation [X, Y, Z] (m) — Translation offset relative to mounting location
[0, 0, 0] (default) | real-valued 1-by-3 vector
Translation offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [X, Y, Z]. Units are in meters.
If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then X, Y, and Z are in the vehicle coordinate system, where:
The X-axis points forward from the vehicle.
The Y-axis points to the left of the vehicle, as viewed when looking in the forward direction of the vehicle.
The Z-axis points up.
The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.
If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then X, Y, and Z are in the world coordinates of the scene.
For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox (Automated Driving Toolbox).
Example: [0,0,0.01]
Adjust Relative Translation During Simulation
To adjust the relative translation of the sensor during simulation, enable the Rel Translation input port by selecting the Input parameter next to the Relative translation [X, Y, Z] (m) parameter. When you enable the Rel Translation port, the Relative translation [X, Y, Z] (m) parameter specifies the initial relative translation of the sensor and the Rel Translation port specifies the relative translation of the sensor during simulation. For more details about the relative translation and rotation of this sensor, see Sensor Position Transformation.
Dependencies
To enable this parameter, select Specify offset.
Relative rotation [Roll, Pitch, Yaw] (deg) — Rotational offset relative to mounting location
[0, 0, 0] (default) | real-valued 1-by-3 vector
Rotational offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [Roll, Pitch, Yaw]. Roll, pitch, and yaw are the angles of rotation about the X-, Y-, and Z-axes, respectively. Units are in degrees.
If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then X, Y, and Z are in the vehicle coordinate system, where:
The X-axis points forward from the vehicle.
The Y-axis points to the left of the vehicle, as viewed when looking in the forward direction of the vehicle.
The Z-axis points up.
Roll, pitch, and yaw are clockwise-positive when looking in the forward direction of the X-axis, Y-axis, and Z-axis, respectively. If you view a scene from a 2D top-down perspective, then the yaw angle (also called the orientation angle) is counterclockwise-positive because you are viewing the scene in the negative direction of the Z-axis.
The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.
If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then X, Y, and Z are in the world coordinates of the scene.
For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in Automated Driving Toolbox (Automated Driving Toolbox).
Example: [0,0,10]
Adjust Relative Rotation During Simulation
To adjust the relative rotation of the sensor during simulation, enable the Rel Rotation input port by selecting the Input parameter next to the Relative rotation [Roll, Pitch, Yaw] (deg) parameter. When you enable the Rel Rotation port, the Relative rotation [Roll, Pitch, Yaw] (deg) parameter specifies the initial relative rotation of the sensor and the Rel Rotation port specifies the relative rotation of the sensor during simulation. For more details about the relative translation and rotation of this sensor, see Sensor Position Transformation.
Dependencies
To enable this parameter, select Specify offset.
Sample time — Sample time
-1 (default) | positive scalar
Sample time of the block, in seconds, specified as a positive scalar. The 3D simulation environment frame rate is the inverse of the sample time.
If you set the sample time to -1, the block inherits its sample time from the Simulation 3D Scene Configuration block.
Parameters
These intrinsic camera parameters are equivalent to the properties of a fisheyeIntrinsics (Computer Vision Toolbox) object. To obtain the intrinsic parameters for your camera, use the Camera Calibrator app.
For details about the fisheye camera calibration process, see Using the Single Camera Calibrator App (Computer Vision Toolbox) and Fisheye Calibration Basics (Computer Vision Toolbox).
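If you already have calibration images, you can also estimate the intrinsic parameters programmatically with Computer Vision Toolbox functions and then copy the values into this block. The following is a minimal sketch that assumes a folder of checkerboard calibration images named fisheyeCalibrationImages and a 25 mm checkerboard square size; adjust both for your setup.

```matlab
% Calibration images of a checkerboard pattern (assumed folder name).
images = imageDatastore("fisheyeCalibrationImages");
[imagePoints, boardSize] = detectCheckerboardPoints(images.Files);

% World coordinates of the checkerboard corners (25 mm squares, assumed).
squareSize = 25; % millimeters
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Estimate the fisheye camera parameters.
I = readimage(images, 1);
imageSize = [size(I, 1), size(I, 2)];
params = estimateFisheyeParameters(imagePoints, worldPoints, imageSize);

% Intrinsic values that correspond to the block parameters.
intrinsics = params.Intrinsics;
intrinsics.MappingCoefficients   % Mapping coefficients
intrinsics.DistortionCenter      % Distortion center (pixels)
intrinsics.StretchMatrix         % Stretch matrix
intrinsics.ImageSize             % Image size (pixels)
```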
Distortion center (pixels) — Center of distortion
[640, 360] (default) | real-valued 1-by-2 vector
Center of distortion, specified as a real-valued 1-by-2 vector. Units are in pixels.
Image size (pixels) — Image size produced by camera
[720, 1280] (default) | real-valued 1-by-2 vector of positive integers
Image size produced by the camera, specified as a real-valued 1-by-2 vector of positive integers of the form [mrows,ncols]. Units are in pixels.
Mapping coefficients — Polynomial coefficients for projection function
[320, 0, 0, 0] (default) | real-valued 1-by-4 vector
Polynomial coefficients for the projection function described by Scaramuzza's Taylor model [1], specified as a real-valued 1-by-4 vector of the form [a0 a2 a3 a4].
Example: [320, -0.001, 0, 0]
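For reference, in the commonly cited form of the Scaramuzza model, the mapping coefficients define a polynomial that relates a point on the sensor plane to the direction of the incoming ray. This is a minimal MATLAB sketch of that relationship, for illustration only; the coefficient and point values are assumptions.

```matlab
% Example mapping coefficients [a0 a2 a3 a4] (illustrative values).
a = [320, -0.001, 0, 0];

% Point on the sensor plane, relative to the distortion center (pixels).
u = 100;
v = 50;
rho = hypot(u, v);

% Scaramuzza polynomial: f(rho) = a0 + a2*rho^2 + a3*rho^3 + a4*rho^4.
f = a(1) + a(2)*rho^2 + a(3)*rho^3 + a(4)*rho^4;

% Direction of the incoming ray that projects to this sensor-plane point.
ray = [u, v, f];
ray = ray / norm(ray)
```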
Stretch matrix — Transforms point from sensor plane to camera plane
[1, 0; 0, 1] (default) | real-valued 2-by-2 matrix
Stretch matrix, which transforms a point from the sensor plane to a pixel in the camera image plane. This matrix accounts for the misalignment that occurs during the digitization process when the lens is not parallel to the sensor.
Example: [0, 1; 0, 1]
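As an illustration of how the stretch matrix and distortion center relate a sensor-plane point to an image pixel, here is a minimal sketch assuming the standard form of the model; the point and parameter values are assumptions, not output from the block.

```matlab
% Stretch matrix and distortion center (illustrative values).
stretchMatrix = [1, 0; 0, 1];
distortionCenter = [640, 360];   % [cx, cy], in pixels

% Point on the sensor plane, relative to the distortion center.
sensorPoint = [100; 50];

% Map the sensor-plane point to a pixel in the camera image plane.
pixel = stretchMatrix * sensorPoint + distortionCenter(:)
```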
Ground Truth
Output location (m) and orientation (rad) — Output location and orientation of sensor
off (default) | on
Select this parameter to output the translation and rotation of the sensor at the Translation and Rotation ports, respectively.
Tips
To visualize the camera images that are output by the Image port, use a Video Viewer (Computer Vision Toolbox) or To Video Display (Computer Vision Toolbox) block.
Because the Unreal Engine can take a long time to start up between simulations, consider logging the signals that the sensors output. You can then use this data to develop perception algorithms in MATLAB®. See Mark Signals for Logging (Simulink).
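You can also mark the camera output signal for logging programmatically. The following is a minimal sketch assuming a model named myModel with the block at its default name; adjust the names for your model, and note that the logged data appears in logsout only when signal logging is enabled in the model configuration.

```matlab
% Assumed model and block names -- replace with your own.
mdl = "myModel";
camBlock = mdl + "/Simulation 3D Fisheye Camera";

% Mark the Image output signal for logging with a custom name.
ph = get_param(camBlock, "PortHandles");
set_param(ph.Outport(1), "DataLogging", "on");
set_param(ph.Outport(1), "DataLoggingNameMode", "Custom");
set_param(ph.Outport(1), "DataLoggingName", "fisheyeImage");

% After simulation, retrieve the logged images from the simulation output.
out = sim(mdl);
imgLog = out.logsout.get("fisheyeImage");
```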
Algorithms
Sensor Position Transformation
At each simulation time step, the sensor block transforms the position (translation and rotation) of the sensor by using this equation:
T_Vehicle + T_Mount + T_Offset + T_Port
This equation contains these steps:
Take the world coordinate position of the vehicle to which the sensor is mounted (T_Vehicle).
Transform the sensor to the mounting position specified by the Mounting location parameter (T_Mount).
Transform the sensor to the position specified by the Relative translation [X, Y, Z] (m) and Relative rotation [Roll, Pitch, Yaw] (deg) parameters, if enabled (T_Offset).
To enable these parameters, select the Specify offset parameter.
Transform the sensor from the offset position to the position specified by the Rel Translation and Rel Rotation ports (T_Port).
To enable these ports, select the Input parameters corresponding to the relative translation and rotation parameters.
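The following is a minimal MATLAB sketch of how such a sequence of translations and rotations can be composed with homogeneous transforms. It is illustrative only: the ZYX (yaw-pitch-roll) rotation order, the composition by matrix multiplication, and all numeric values are assumptions, not the block's internal implementation.

```matlab
% Elementary rotation matrices (angles in radians).
Rx = @(a) [1 0 0; 0 cos(a) -sin(a); 0 sin(a) cos(a)];
Ry = @(a) [cos(a) 0 sin(a); 0 1 0; -sin(a) 0 cos(a)];
Rz = @(a) [cos(a) -sin(a) 0; sin(a) cos(a) 0; 0 0 1];

% Homogeneous transform from a translation (m) and [roll pitch yaw] (rad).
T = @(t, rpy) [Rz(rpy(3))*Ry(rpy(2))*Rx(rpy(1)), t(:); 0 0 0 1];

% Example transforms for each step (illustrative values).
T_vehicle = T([10 5 0],    [0 0 pi/4]);  % vehicle pose in world coordinates
T_mount   = T([1.9 0 1.1], [0 0 0]);     % Mounting location on the vehicle
T_offset  = T([0 0 0.01],  [0 0 0]);     % Specify offset parameters
T_port    = T([0 0 0],     [0 0 0]);     % Rel Translation / Rel Rotation ports

% Apply the transformations in sequence to get the sensor pose in the world.
T_sensor = T_vehicle * T_mount * T_offset * T_port;

sensorPosition = T_sensor(1:3, 4)   % sensor location in world coordinates (m)
```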
References
[1] Scaramuzza, D., A. Martinelli, and R. Siegwart. "A Toolbox for Easily Calibrating Omnidirectional Cameras." Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS 2006). Beijing, China, October 7–15, 2006.
Version History
Introduced in R2024a
See Also
Blocks
Apps
- Camera Calibrator (Computer Vision Toolbox)
Objects
- fisheyeIntrinsics (Computer Vision Toolbox)