Chapter 2
Working with Hardware and Sensors
Developing an autonomous mobile robot (AMR) involves:
1. Simulating and building an underlying hardware platform
2. Adding sensors and computing boards to the hardware platform
If you want to add autonomy to an existing mobile robot platform, you can start with step 2.
Building the Hardware Platform for Mobile Robots
The hardware platform of a mobile robot is a complex assembly that consists not only of mechanical components but also of sensors and embedded boards. The platform may include lidar sensors and electronic circuits with microcontrollers. Simulating a model of the platform before building the physical hardware can reduce failure risks and development costs.
Model-Based Design is an effective method for hardware platform development. This approach lets you create a simulation model of the target platform that represents all the components and their dynamic constraints. Simulating the target behavior in various real-world situations reduces development time and cost. Simulink has served engineers for many years as a de facto standard environment for Model-Based Design and can be leveraged efficiently to develop mobile robot platforms.
When developing hardware from scratch, the architecture must be defined and analyzed to develop appropriate specifications, and the software design must be configured in parallel with the mechanical design of the hardware. The system can be represented as an architectural model described as components and interfaces, and you can use System Composer™ and Requirements Toolbox™ to create or import such descriptive architecture models. You can also validate system requirements and verify system architectures using simulation-based tests. With Model-Based Design in Simulink, you can translate and refine requirements into architectures with components ready for simulation and implementation. You can also populate an architecture model from the architectural elements of Simulink designs or C/C++ code. With these architecture models you can analyze requirements, capture properties via stereotyping, perform trade studies, and produce specifications.
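As an illustration, a descriptive architecture model can also be built programmatically with the System Composer MATLAB API. The following is a minimal sketch; the model, component, and port names are illustrative assumptions, not part of any specific product example.

```matlab
% Minimal sketch: create a descriptive architecture model with the
% System Composer MATLAB API. Model, component, and port names are
% illustrative assumptions.
model = systemcomposer.createModel("amrArchitecture");
arch  = get(model,"Architecture");

% Top-level components of the mobile robot platform
comps = addComponent(arch,["SensorSuite","Controller","DriveUnit"]);

% Add matching ports, then connect the components
addPort(comps(1).Architecture,"SensorData","out");
addPort(comps(2).Architecture,"SensorData","in");
addPort(comps(2).Architecture,"MotorCmd","out");
addPort(comps(3).Architecture,"MotorCmd","in");
connect(comps(1),comps(2));   % SensorSuite -> Controller
connect(comps(2),comps(3));   % Controller -> DriveUnit

save(model)   % writes amrArchitecture.slx for further editing
```

From here, the saved model can be opened in System Composer to attach requirements, apply stereotypes, and refine components toward simulation.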
Once you have defined and analyzed the specifications, the next step is to create the physical model to be controlled. You can use Simscape™ in Simulink to represent the physical model of the robot, including Simscape Multibody™ for mechanical components and Simscape Electrical™ for electrical components. Simscape supports intuitive modeling with physical components, as well as modeling based on mathematical formulas such as transfer functions and on experimental data. Simscape Multibody lets you reuse existing CAD models through its CAD import capability, so you can import mechanical models from CAD tools such as SolidWorks®, Autodesk Inventor®, and PTC® Creo™.
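As a sketch, a CAD-exported multibody description or a URDF file can be imported into Simscape Multibody from the MATLAB command line; the file names below are hypothetical.

```matlab
% Minimal sketch: import a mechanical model into Simscape Multibody.
% "amr_platform.xml" is a hypothetical multibody description file
% exported from a CAD tool (e.g., via the Simscape Multibody exporter
% for SolidWorks); URDF robot descriptions import the same way.
smimport("amr_platform.xml");
smimport("amr_platform.urdf");
```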
After simulating and developing the physical model, the next step is to define a control model. Depending on the kinematic motion model of the robot, several components (e.g., wheel encoders) need to be included in the control model. With Simulink, you can simulate typical AMR motion models, including differential drive and Ackermann steering. Control System Toolbox™ and Simulink Control Design™ help with control model development for transfer function-based design and PID control. Stateflow® is ideal for writing logic such as state transitions and flowcharts. With these tools, you can accelerate development verification with simulations that integrate physical models and control models.
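As an illustration, here is a minimal sketch of simulating a differential-drive motion model and tuning a PID controller from the MATLAB command line; the wheel geometry and the first-order motor model are assumed values, not measurements from a specific platform.

```matlab
% Minimal sketch: simulate a differential-drive motion model
% (Robotics System Toolbox). Wheel radius and track width are assumed.
kin = differentialDriveKinematics( ...
    "WheelRadius",0.05, ...                    % m
    "TrackWidth",0.30, ...                     % m
    "VehicleInputs","VehicleSpeedHeadingRate");

initialState = [0 0 0];                        % [x y theta]
cmds = [0.5 0.2];                              % forward speed, yaw rate
[~,pose] = ode45(@(t,y) derivative(kin,y,cmds),0:0.05:5,initialState);
plot(pose(:,1),pose(:,2)); xlabel("x (m)"); ylabel("y (m)");

% Sketch of PID design (Control System Toolbox), assuming a simple
% first-order motor model identified from the physical model.
motor = tf(1,[0.5 1]);                         % K/(tau*s + 1), assumed
C  = pidtune(motor,"PID");                     % automatically tuned PID
cl = feedback(C*motor,1);                      % closed loop
step(cl)                                       % verify step response
```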
In addition, you can use automatic code generation to generate C code from the control model for rapid control prototyping (RCP). RCP lets you verify the control algorithm on the actual machine without first preparing an implementation model for the embedded board, such as performing fixed-point conversion.
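For instance, assuming a hypothetical controller model named amrController that is already configured with a code generation target, the build can be started from the command line:

```matlab
% Minimal sketch: generate C code from a controller model for RCP.
% "amrController" is a hypothetical Simulink model configured with a
% code generation target (e.g., ert.tlc).
slbuild("amrController");
```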
Model-Based Design using Simulink accelerates development verification and lets you simulate and build a mobile robot platform efficiently and effectively.
Collecting Sensor Data
After building the hardware platform, the next step is placing a group of sensors on the mobile platform or vehicle to collect sensor data. The sensor group can include multiple sensor devices such as cameras, lidar, radar, ultrasonic sensors, global positioning systems (GPS), and/or inertial measurement units (IMUs). These sensors provide information such as distance to objects in the AMR’s vicinity. Accuracy is crucial, and therefore the sensors need to be properly calibrated. Data streams from multiple sensors are fused to improve estimations, and this requires calibration of the sensors’ coordinate systems, as well as time synchronization.
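As a small illustration of such fusion, the sketch below combines accelerometer and gyroscope readings into an orientation estimate with imufilter; the sample rate and the synthetic stationary IMU data are assumptions.

```matlab
% Minimal sketch: fuse IMU readings into an orientation estimate with
% imufilter (Sensor Fusion and Tracking Toolbox or Navigation Toolbox).
% The synthetic data models a stationary IMU; real data must already
% be calibrated and time-synchronized.
accelData = repmat([0 0 9.81],200,1);   % gravity only (m/s^2)
gyroData  = zeros(200,3);               % no rotation (rad/s)

fuse = imufilter("SampleRate",100);     % Hz, assumed IMU rate
[orientation,angularVelocity] = fuse(accelData,gyroData);
```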
Calibrating Coordinate Systems for Multiple Sensors
When fusing data from multiple sensors, you need to integrate the position and orientation of each sensor to recognize the environment. For example, performing position estimation with a monocular camera requires intrinsic parameters (focal length, optical center, etc.), extrinsic parameters (position and orientation), and distortion coefficients. Here, calibration refers to the process of estimating and correcting these camera parameters. Computer Vision Toolbox™ provides apps for calibrating pinhole and fisheye cameras. Lidar Toolbox™ supports lidar-camera cross-calibration for workflows that combine computer vision and lidar processing.
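A minimal calibration sketch with Computer Vision Toolbox, assuming a folder of checkerboard photos and a 25 mm square size:

```matlab
% Minimal sketch: estimate intrinsics and distortion coefficients from
% checkerboard images (Computer Vision Toolbox). The "calib" folder
% and the 25 mm square size are assumptions.
images = imageDatastore("calib");
[imagePoints,boardSize] = detectCheckerboardPoints(images.Files);
worldPoints = generateCheckerboardPoints(boardSize,25);   % mm

I = readimage(images,1);
params = estimateCameraParameters(imagePoints,worldPoints, ...
    "ImageSize",[size(I,1) size(I,2)]);
undistorted = undistortImage(I,params);   % apply the correction
```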
In addition, when sensors are mounted at multiple positions on the platform, they must be calibrated relative to one another. To combine the advantages of different sensors, such as a monocular camera for object detection and a point cloud sensor for distance estimation, you need to register the point cloud data obtained from lidar or RGB-D cameras to the coordinate system of the monocular camera.
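As a sketch, once the lidar-to-camera transform is known (Lidar Toolbox provides estimateLidarCameraTransform for checkerboard-based estimation), lidar points can be projected into the image plane; ptCloud, intrinsics, and tform below are assumed to come from earlier calibration steps.

```matlab
% Minimal sketch: project lidar points into the camera image
% (Lidar Toolbox). ptCloud is a pointCloud object, intrinsics the
% calibrated camera intrinsics, and tform the rigid lidar-to-camera
% transform estimated during calibration (all assumed available).
imagePts = projectLidarPointsOnImage(ptCloud,intrinsics,tform);
```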
Time Synchronization Among Sensors
Another important task when integrating multiple sensors is time synchronization. Sensor types differ in timing parameters such as sampling rate, phase, and jitter. For accurate position estimation of surrounding objects, you need to know these parameters and synchronize the time stamps.
You can use middleware such as the Robot Operating System (ROS) to integrate sensor data arriving at different rates. Many sensor vendors offer ROS drivers, so you can obtain sensor data simply by connecting and subscribing to multiple sensors. The topics that each ROS node publishes are time-stamped, which helps manage time synchronization. ROS also provides the rosbag utility, which lets you record sensor data for later replay and analysis. Using ROS Toolbox, time-stamped data recorded with rosbag can be played back in MATLAB and Simulink for time synchronization.
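A minimal sketch of reading back time-stamped messages from a bag file with ROS Toolbox; the file name and topic names are hypothetical:

```matlab
% Minimal sketch: read time-stamped sensor data from a recorded bag
% file (ROS Toolbox). "sensors.bag" and the topics are hypothetical.
bag  = rosbag("sensors.bag");
scan = select(bag,"Topic","/scan");        % lidar scans
imu  = select(bag,"Topic","/imu/data");    % IMU samples

scanMsgs = readMessages(scan,"DataFormat","struct");
imuMsgs  = readMessages(imu,"DataFormat","struct");

% Receive times for aligning the two streams
scanTimes = scan.MessageList.Time;
imuTimes  = imu.MessageList.Time;
```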