UAV Code Development with Simulink for Student Competitions - MATLAB
    Video length is 1:01:37

    UAV Code Development with Simulink for Student Competitions

    Overview

    Join us for this webinar and learn how to develop and deploy UAV code using Simulink for Pixhawk Flight Controllers and Flight Computers, including Raspberry Pi and Jetson GPUs. This session will also cover the integration and testing of these systems, equipping you with the skills needed to excel in student aerospace competitions.

    Highlights

    In this webinar, you will learn how to develop and deploy code for UAVs and test it in a virtual environment. We will focus on the following topics during the webinar:

    • Code generation and deployment on Pixhawk Flight Controllers

    • CUDA code generation and deployment for Jetson GPUs as a Flight Computer

    • Integration and testing of Flight Controller and Flight Computer codes in a given scenario

    About the Presenters

    Khushin Lakhara is currently working as an Aerospace Engineer in the Education Programs team at MathWorks. He develops technical tools and content to design aircraft, UAVs, and flight controllers for aerospace student competitions. He received his B.Tech. and M. Tech. degrees in Aerospace Engineering (UAVs) from the Defense Institute of Advanced Technology, India. Previously, he has designed UAVs, developed the Flight Controller for Combat UAVs, and published research articles in the field of high-power electric propulsion. His major areas of interest are drone design, flight controller development, and mission simulation.

    Abhishek Shankar is working as a Robotics Engineer in the Student Programs team at MathWorks. He supports robotics student competitions and develops content and tools for perception, control, and state estimation for robots. He received his bachelor's degree in Electrical Engineering from Amrita School of Engineering and his master's in Robotics from Wayne State University. His areas of interest are surgical and marine robotics.

     

    Recorded: 13 Nov 2024

    So before we start, I'd like you to ask yourself some questions. Does your competition or your project demand your UAV to perform waypoint following or avoid obstacles while trying to get from A to B? Then this webinar is for you. My name is Abhishek Shankar. And this is Khushin Lakhara with me. And we're going to be talking about how we can use Simulink to develop your code and deploy it into a PX4 and a Jetson Nano.

    In this webinar, we will focus on preparing a virtual environment for testing your UAV code. Virtual environments are important because they let you analyze and test your code before you go out into the real world. Then we'll talk about code generation and deployment to flight controllers, particularly the Pixhawk.

    We'll also talk about code generation and deployment to flight computers, the Jetson and Raspberry Pi. We'll focus more on the Jetson in this webinar. But we'll give you all the resources you need to do the same for the Raspberry Pi as well. Finally, we'll look at some learning resources that you can go through to get better at your projects and compete well in your competitions. Now I'll hand it over to Khushin to start with the virtual environment.

    Thanks, Abhishek. Hello, everyone. I hope you all are excited for today's webinar. Welcome to the webinar on UAV code development with Simulink for student competitions. If you are participating in a competition such as GoAERO, AIAA DBF, SAE Aero Design, or the US Challenge, then this webinar is going to be very useful for you, because we are going to cover a lot of content, along with demos, on how you can develop and deploy code on flight controller and flight computer hardware. We are also going to talk about how you can test this code.

    So before we get into the technical nitty-gritty of generating code for the flight controller and flight computer, we will start our session by understanding how to prepare the virtual environment for integrating and testing this UAV code.

    To understand how we can build a virtual environment, let's take the example of how we fly a drone in the actual environment. If I'm a pilot and I have a drone, I would be watching over it, and I'd have the RC controller with which I control my UAV. It's manual control, where I, the pilot, command the hardware, visualize its motion, and adjust my controls again.

    But for the competitions, when we are developing such UAVs, we may have hardware to control them. It may be Pixhawk hardware, which gives us support in terms of UAV control capabilities. If we have this hardware along with our UAV and want to recreate the setup in a virtual environment, what are we going to need? Let's explore.

    So first, to visualize all of this, we need a scenario simulation where we can render the environment and see how our drone or UAV is flying. We can do this scenario simulation with Simulink. Moving forward, in order to replace the actual UAV hardware, we need to model our UAV plant in Simulink.

    Once we do the plant modeling in Simulink, the last element is the pilot. How are we going to replace the pilot? There are multiple options: if you want to fly using an RC controller, you can do that. But if you are preparing for autonomous competitions, you may need a ground control station, such as QGroundControl.

    So these are the four basic elements of a virtual environment: we need to model our plant, we need a scenario simulation, we need the controller hardware, and, for planning and executing the mission, we should also have a ground control station.

    Once we have these pieces, the next question is, how are we going to integrate all of them? How will these platforms interact with each other? For that purpose, we can use MAVLink connectivity. MAVLink stands for Micro Air Vehicle Link, and Simulink provides the capabilities for MAVLink connectivity.

    So in Simulink, you can use the MAVLink Bridge Source and MAVLink Bridge Sink blocks to share the data. Using MAVLink, we can send and receive data from the plant. This MAVLink link is connected to the PX4 hardware over serial communication, where it writes and reads the data. And finally, over UDP/TCP communication, it interacts with the mission planner and the flight computer hardware.
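    The same MAVLink connectivity can also be exercised from the MATLAB command line with UAV Toolbox. Below is a minimal sketch, assuming the standard common.xml dialect and the default QGC UDP port 14550; adjust both to match your own HITL setup.

```matlab
% Connect to a PX4 autopilot over MAVLink using UAV Toolbox.
% The dialect file and port number are assumptions -- match your setup.
dialect = mavlinkdialect("common.xml");       % standard MAVLink dialect
mav     = mavlinkio(dialect);                 % MAVLink I/O object
connect(mav, "UDP", "LocalPort", 14550);      % listen on the shared UDP port

% Register ourselves (system ID 1, component ID 1) and send a HEARTBEAT
% so other MAVLink nodes on the network can see us
client = mavlinkclient(mav, 1, 1);
hb = createmsg(dialect, "HEARTBEAT");
sendmsg(mav, hb);

disconnect(mav);                              % release the connection
```

    The same Bridge Source/Sink blocks in Simulink wrap this send/receive pattern for you, so in the HITL models you only pick the topic and port.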

    Once we have this communication setup ready, we are actually ready to test our UAV code. Now that we understand what goes on behind testing this code in a virtual environment, let's move forward and explore how we can generate code for flight controllers such as PX4 hardware.

    In this section, we are going to have a short demo where we will fly a waypoint-following mission. Let's start our discussion with plant modeling. This webinar is focused specifically on UAV code development, so if you want to learn about modeling the plant and scenario simulation, we already have a recorded webinar available. You can refer to that webinar to learn more about how to model and simulate aerospace vehicles.

    That webinar was dedicated to aerospace design competitions, so it will be very relevant to the student competition you are participating in. You can find the link for the video in the chat. Throughout the session, we will be discussing a lot of resources, and rest assured, we will share all of them in the chat. After the webinar, you will also receive all of these resources by email.

    Moving forward, we have to generate the code for the flight controller. But before we generate the code, we have to model the flight controller in Simulink. If you want to learn flight controller modeling in Simulink, we have a video series available, the Drone Simulation and Control series, where we talk extensively about developing the simulation and architecture of a flight controller. Please check out that series.

    Once your flight controller has been modeled in Simulink, we move forward, generate the code, and deploy it to the Pixhawk hardware. Here, I want to hear from the teams: how many of you are actually generating code for Pixhawk hardware? Please let us know in the chat.

    Khushin, I have a question.

    Yes?

    So Pixhawk comes with its own stock controllers.

    Yeah.

    So what is the advantage of developing your own controller instead of using the stock controllers in Pixhawk?

    Oh yeah, Abhishek, I think you are pointing the students to a way where they can directly use the stock flight controller without modeling it. But if you talk about the benefits of flight controller modeling, think about a customized vehicle that you are developing for specific capabilities, like NASA did.

    And in industry, when a lot of vehicles are developed, the teams prefer to develop their own controller because it comes with its own benefits. Custom controllers help us build accurate controllers for customized vehicles. They also let us model and simulate the complete vehicle in Simulink. And with both the vehicle and the flight controller modeled in Simulink, we have complete control over our design process.

    Moving forward, you can deploy your controller on physical hardware and achieve specific capabilities. These benefits are what motivate us to develop our own controller and deploy it on the hardware.

    OK.

    I see the poll is-- yeah. Thanks, Abhishek. I see the poll is going on, and many people have answered. They are using Pixhawk hardware as a flight controller. A few people, I can see, are using a Raspberry Pi as well, and very few are using an Arduino.

    So in this section, we are going to discuss how you can develop code for the Pixhawk flight controllers. We also have support for Raspberry Pi code generation and deployment, and for Arduino as well. We'll be talking about Raspberry Pi code generation in a coming section.

    So let's move forward and explore how we can develop the flight code. But before we get into flight code development, one thing we need to understand is that the flight controller code, or the flight controller architecture, is just a small part of the complete UAV software, what we call the flight code. The flight code also comprises the code for the various sensors, acquiring information, doing state estimation, managing your battery, Bluetooth connectivity, internal messaging, motor control, and a lot of other things.

    So now the question is, how do we generate such code and integrate it with the existing hardware so that everything works perfectly and we do not mess up the other parts? For this purpose, we need to develop the code in such a way that it gets exactly the right inputs and produces exactly the right outputs.

    For example, if you are developing a controller, it will occupy just a small part, but it may be exchanging information with the state estimation, data logging, or fault protection as well. So we have to develop the code in such a way that it goes onto the hardware and integrates very well.

    And for that purpose, Simulink provides very good capabilities, so you don't need to worry about code optimization and integration. But even before we go there, the next question that comes to mind is, how do we build the flight code in the first place?

    The traditional way to build flight code is to write your C code manually, compile it, and then execute it on the hardware. I want to ask the audience: what do you think are the possible challenges with this method? Please let us know in the chat.

    Just from my own experience, I feel writing manual C code is more error prone. You have to write multiple functions, make sure all of them work, and then compile and deploy. So it's more susceptible to human error.

    Exactly, it will be more susceptible to human error. And also, one thing I see here is that, as the person writing all the code, it is going to be a lot of hard work. It will also take a long time to complete your project.

    Yeah, yeah. It's not a fast process.

    Yeah, exactly. So let's hear from the audience: what do you think the possible challenges with this approach are? One, as Abhishek suggested, it can be prone to errors, because humans are developing the code. Second, as we mentioned, because the code is developed over a long time, it may delay the overall workflow and our projects.

    Another issue we see with this process is that it can be difficult to tune the controller and interpret the impact. Consider a situation where you change the parameters and now want to see their effect. Every time, you make the changes in the code and then deploy to the hardware. If the performance is not adequate, you come back, make more changes, and deploy again: a lot of back and forth. Also, you may--

    Khushin, just to interrupt you for a second, we do have a couple of audience comments out saying that debugging is an issue.

    Oh yeah.

    I think you were just going to touch on it with traceability. But yeah, one of them said debugging will be an issue. And some said you might not get all the libraries you need to write your own code.

    Oh wow, I think our audience is well aware of the challenges. That's really good. Yes, visibility issues will be there: if you're making changes and want to see which part of the code is causing which issue, you won't easily be able to tell. So debugging, or traceability, is a big issue.

    Moving forward, the other possible issues are that, along with debugging, the code may not be optimized, and your hardware may not be adequate for it. Sometimes these boards are just toys, and if the code is not optimized, we may end up burning out our hardware.

    So what is the solution? Let's check out option two: the Model-Based Design approach with Simulink, where you first model your vehicle and flight controller in Simulink, then generate the code using Simulink's capabilities, and deploy it on the hardware.

    A benefit of this process is that as you make changes in the model, you can always see what impact they will have. We also have a method called Monitor and Tune, where you can continuously change the tunable parameters and see their impact on performance, which makes traceability easier, makes tuning faster, and helps you develop optimized code.

    Now, with this understanding, let's move on to a demo on flight controller deployment. But before the demo, I want to discuss the example we are going to use. We have a couple of examples already available in the MATLAB documentation.

    These are hardware-in-the-loop simulation examples, because for all the demos, our hardware will be connected to our system, and all the calculations will run on the hardware. You can go through these examples; the link is in the chat.

    For the first one, we are going to discuss the example on scenario simulation and flight visualization with PX4 hardware-in-the-loop and UAV dynamics in Simulink. Let me show you what this model looks like. Let's go to MATLAB.

    Here, my MATLAB project is open. This project comes from that same example. As this is a PX4-based demo, you will need the PX4 support package, which you can install from Add-Ons: just go to Add-Ons and search for the PX4 support package. Oh, this is-- OK, we got it.

    If you search for UAV Toolbox PX4, you will see the UAV Toolbox Support Package for PX4 Autopilots. This support package should be installed to explore all the capabilities of PX4 hardware-related simulation and code generation. I'll open the autopilot controller model that we have already developed. As we discussed earlier, for flight controller deployment we need to share the data with the other systems and handle the communication.

    If you see here, we have this estimated-output block. If we get into it, you will see that we are receiving data from the vehicle, from the hardware. We process this information and feed it to the controller and navigation.

    Before we share this data with the hardware for the actual controller, we need to make sure that we are collecting the data from the right topics and sending it to the right topics. For this purpose, we can use the uORB Read blocks. Here, you see the uORB Read and uORB Write blocks; you can access all of them from the support package's uORB library, which gives you all the related blocks.

    Moving forward, let's get into this position and attitude controller model and see what's inside. You can see the complete controller is modeled here: there is a position and rate controller, along with an attitude controller. We are not going to discuss the controller modeling itself, but we have already shared the video link where you can learn more about it.

    Let me quickly move to the demo and show you how we can generate the code and deploy it on the hardware. This is the same model; it's just a recorded video, because code generation can take a lot of time, so we have recorded it for you.

    To do this, you open Model Settings from the Modeling tab and go to the Hardware Implementation pane. There, we select the appropriate hardware board. Once the boards are loaded, you can see there are a number of PX4-supported boards available. We are going to select PX4 Cube Orange+, because we are developing the code for this hardware.

    Once we select it, all the related parameters for this board are configured. Also, as this is going to be a HITL simulation and we have to use MAVLink connectivity as well, we turn on those features. Once these are set, we are ready to generate the code. We will also generate a code generation report to inspect the final code. Once you generate the code, you will see a Hardware tab come up. And in the Hardware tab-- my bad.

    So once we select the hardware and set up the code generation, the hardware board is automatically selected, and we see the Connected Input/Output mode appear here. It actually provides us two modes.

    One is to run the code on your hardware: your code will be deployed on the board, and you can use all its capabilities. The other mode is to run on the host system with external input/output, where we just share the sensor data.

    Moving forward, for running on the board, there are two options: one is Build & Deploy, and the second is Monitor & Tune. Here, we are using Build & Deploy, and you can see that code generation has started; we are generating the code for the PX4 hardware.

    Once the code is generated, the code generation report comes up, which gives us all the details: the generated executable, how much space it is going to take, and related code metrics, so you can check how optimized your code is. Below, we also see the generated code itself. This is complete C code, which is editable, so if you want to edit the code and do something on your own, you are free to do that.
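    The board selection and build steps shown in the UI can also be scripted, which is handy when you rebuild often. A minimal sketch follows; the model name and the exact board string are assumptions, so copy the string your support package lists under Hardware Implementation.

```matlab
% Script the same steps shown in the UI. Model name and board string
% below are placeholders -- use the names from your own setup.
mdl = "px4demo_AttitudeControl";                      % hypothetical model
open_system(mdl);

set_param(mdl, "HardwareBoard", "PX4 Cube Orange+");  % select target board
set_param(mdl, "GenerateReport", "on");               % code generation report

slbuild(mdl);   % generate and build, as the Build & Deploy button does
```

    Scripting this makes the build reproducible for the whole team, instead of depending on everyone clicking through the same dialogs.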

    Let's move forward and see which boards are supported. We have a variety of boards covered by the PX4 support package, so you can use it with any of them. Now that you have learned how to develop code for a PX4 flight controller, let's see how we fly the final waypoint mission with the full UAV simulation.

    For this purpose, we need to set up the ground control station. Complete documentation is available for this, so you can configure QGC as per the documentation. The link is available in the chat.

    Once you set up your mission planner, we are good to go with the final HITL simulation. This is the final HITL simulation model. You can see that one of the Simulink models has the Unreal Engine visualization. We need to provide the same UDP communication port in both models: one is the UAV visualization model, and the other is the UAV dynamics model.

    Once we do that, the data transfer works correctly. Now we run the simulation, and we get this photorealistic visualization. Alongside it, the UAV dynamics has been modeled, and it shares its data over the UDP protocol. So once you start the UAV dynamics, you can upload your mission in the mission planner, start the mission, fly it, and visualize how your UAV performs.
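    The two models exchange data over a shared UDP port, which is exactly the pattern the UDP Send/Receive blocks implement. The same idea can be sketched in a few lines of MATLAB with udpport; port 25000 is an assumption, so use whatever port both of your models share.

```matlab
% Minimal loopback illustrating the UDP link between the two models.
% Port 25000 is an assumption -- match the port both models use.
rx = udpport("datagram", "LocalPort", 25000);    % receiver side
tx = udpport("datagram");                        % sender side

pose = single([1.5 2.0 -10.0]);                  % e.g. an x, y, z position
write(tx, pose, "single", "127.0.0.1", 25000);   % send to the shared port

dg = read(rx, 1, "single");                      % read one datagram back
received = dg.Data;                              % the same three values

clear tx rx                                      % close the sockets
```

    If the two sides disagree on the port number or the data type, the receive side either sees nothing or decodes garbage, which is why the models must use identical settings.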

    That's how we develop the HITL flight simulations. Now that we have understood that, let's move to the next section, which is about code generation and deployment to flight computers. But before that, I request the audience: if you have any questions, please post them in the Q&A. At the end of the webinar, we will take all the questions and answer them.

    So let's move forward and understand how we use flight computers and how we generate code for them. This section is accompanied by a demo based on simulating a UAV on a waypoint-follow mission with obstacle avoidance. For the obstacle avoidance, we are going to use Jetson GPU hardware, which will be integrated into the setup. It will help us process the depth data and provide the appropriate information for obstacle avoidance.

    In the existing architecture, when we bring in this hardware, it is set up over the UDP protocol alongside QGC. That completes the architecture. But here, I request you all to let us know what hardware you are using as a flight computer. Are you using a Raspberry Pi? Are you using Jetson GPUs? Or are you using something else?

    Yeah. And, Khushin, while they take the poll, I have a question.

    Yeah?

    So do we need a separate flight computer, or can we do the same navigation and path planning in flight controller itself?

    Oh, so you want to just use one hardware and do everything?

    Yeah, UAV weight is a constraint, right? We don't want to make it too heavy. So can we reduce it?

    Oh, that makes sense. But let's see-- can we do it? This question brings to mind: what are flight computers, and what are flight controllers? As you said, do we need both? And in parallel, you may wonder, can't one piece of hardware do it all?

    Audience, if you are thinking along the same lines, please let us know in the chat whether they are the same or different, and whether we really need both. To understand all of this, we need to get into the details of what the flight controller and the flight computer are, and why each is used.

    If you look at the flight control software architecture, we find that the UAV actuators get their commands from the motor mixer algorithm, and the motor mixer gets its commands from the controller. The major purpose of the controller is to reject perturbations and execute the appropriate control.

    You can see that it has the appropriate loops for position control, yaw-pitch-roll control, and altitude control; the controller takes care of those. Along with this, there is the guidance block, which provides guidance to the system. For autonomous algorithms, if we need to make decisions, identify something, or compute something for our next move, all of that is done by the guidance block.
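    To make the motor mixer stage above concrete, here is a simplified sketch for an X-configuration quadrotor. The gains and motor-sign convention are illustrative assumptions; real mixers follow the specific frame and motor layout of the vehicle.

```matlab
% Simplified motor mixer for an X-configuration quadrotor.
% Inputs: total thrust plus roll/pitch/yaw torque commands from the
% controller. The signs follow one common layout -- check your frame.
thrust = 0.6;  roll = 0.05;  pitch = -0.02;  yaw = 0.01;

mix = [ 1  -1   1   1;     % motor 1
        1   1  -1   1;     % motor 2
        1   1   1  -1;     % motor 3
        1  -1  -1  -1 ];   % motor 4

motorCmd = mix * [thrust; roll; pitch; yaw];

% Saturate to the actuator range [0, 1] before sending to the ESCs
motorCmd = min(max(motorCmd, 0), 1);
```

    The point is that the mixer is pure arithmetic running at a high rate on the flight controller, while the guidance block upstream can involve much heavier computation.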

    Now, what hardware do we use for each of these? As we discussed, the guidance block takes care of identification and tracking. Pixhawk controllers are mostly used as flight controllers, and for flight computers, most teams use Jetson GPUs or a Raspberry Pi.

    These flight computer and flight controller boards work hand in hand: the flight computer generates the data used to decide the controls, and the flight controller generates the appropriate commands to execute the control. That's why we need both. I hope that clarifies your question, Abhishek.

    Yep, that answers the question. Thank you.

    Thanks, Abhishek. But now I have a question for the audience. If you need to track some object during the mission, where will you deploy your object-tracking algorithm? It may be a bullseye, a buoy, or some specific target you need to identify. Where do you think it should go?

    Hamja said it goes on the flight computer? Yes, Hamja, fastest finger first-- you're right. It goes on the flight computer, because flight computer hardware is more capable of heavy computation. That's why most teams use Jetson GPUs: if you are using a neural network or AI model, you need high compute, and for that purpose we use the Jetson GPUs.

    Moving forward, let's discuss how we do code generation and deployment to the NVIDIA Jetson. I'll request Abhishek to take us through this, as he is the expert here. Over to you, Abhishek.

    Thank you, Khushin. Right, so for code generation for deployment to NVIDIA-- sorry?

    Abhishek, sorry to interrupt. Would you like me to control the slides, or would you like to take the access?

    Yeah, I'll take access.

    Thank you.

    Just a second.

    All right. So thank you, Khushin, for passing it on to me. Let's talk about flight computers. In this webinar, we'll be focusing on the NVIDIA Jetson, because we have seen it being used a lot for competitions and also for personal projects.

    With MATLAB, you can develop path-planning algorithms. You can give waypoints to the UAV and have it go from point A to B. You can also process a lot of sensor data, such as LIDAR or depth data from cameras, and with that data you can do object detection: you can detect obstacles or particular points you want to go to. And you can compute paths from A to B, choosing the most efficient one.

    Now, since we'll be deploying this to the drone, we need something that can process all this data in real time. Because drones move at high speeds, you want the data to be processed as it comes in, without lagging behind the UAV itself. For that, we have the Jetson, which has much faster compute than a Raspberry Pi, for example, and also has CUDA cores that let the Jetson's GPU do the computations faster.
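    On the MATLAB side, targeting those CUDA cores is done through GPU Coder. A minimal sketch is below; the function name myDetector, the input size, and the build directory are illustrative assumptions, not part of the demo shown here.

```matlab
% Generate CUDA code for a MATLAB function and build it on a Jetson.
% 'myDetector', the image size, and the build folder are placeholders.
cfg = coder.gpuConfig("exe");                          % CUDA executable target
cfg.GenerateExampleMain = "GenerateCodeAndCompile";    % auto-generate a main()
cfg.Hardware = coder.hardware("NVIDIA Jetson");        % cross-compile for Jetson
cfg.Hardware.BuildDir = "~/remoteBuildDir";            % build folder on board

% Generates the CUDA code, copies it over SSH, and builds it on the Jetson
codegen -config cfg myDetector -args {zeros(480,640,3,"uint8")}
```

    The -args entry tells the coder the type and size of the input the deployed function will receive, here a 640x480 RGB frame.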

    So with MATLAB and Simulink, you can deploy these as an executable or a static or dynamic library, which you can integrate with your own code base. Or you can also interface with other middleware, like ROS, for example, and deploy that as a ROS node to the Jetson. And now we look at a small video of how you can generate code and also communicate with the NVIDIA Jetson.

    So the first thing to do, we're going to be running the first model, which is the visualization, which will start the Unreal Engine for the photorealistic simulation. So here, you can see the 3D view is starting up. And you have the drone with all the visualization set up. And we'll be streaming this data to the Jetson so the Jetson can do all the depth processing.

    So this is the camera view that you see out here. And this view is being sent to the Jetson. So now we'll open the next model, which will be our algorithm, the algorithm that goes on board the Jetson.

    In this model, before we start anything, let's go into the Modeling section. We need to make sure that we choose the right hardware. Since this is a Jetson, if you go into Hardware Implementation, you have a bunch of different boards to choose from, and you select the right one. You also give the device address, username, and password; this lets Simulink SSH into the system and transfer the code that you want to deploy.

    And then you go to the Hardware tab. And there is something called Monitor and Tune. Let me pause here for a second. So Monitor and Tune is something that Khushin has touched on before. But what it lets you do is it lets you transfer the code to your hardware. But you are still able to tune the code with MATLAB.

    So you have your drone flying with the Jetson on it, and you can sit at the ground station and change the tuning parameters. Whenever you simulate something in a virtual environment and then transfer it to the real world, there is always some gap to address, some extra tuning to do. MATLAB helps you do this with the Monitor and Tune button: you watch the drone in real time and adjust the parameters until it behaves the way you want it to.
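    Under the hood, Monitor and Tune runs the model in Simulink's external mode, and the same session can be driven programmatically. The model name below is a placeholder.

```matlab
% Monitor and Tune is Simulink's external mode: the generated code runs
% on the target while parameters stay tunable from the host.
% The model name is a placeholder.
mdl = "uavObstacleAvoidance";
set_param(mdl, "SimulationMode", "external");    % run on hardware, tune from host
set_param(mdl, "SimulationCommand", "start");    % same as the Monitor & Tune button

% ...adjust tunable gains while it runs, then stop the session:
set_param(mdl, "SimulationCommand", "stop");
```

    This is why parameter changes take effect immediately on the board: the host keeps a live link to the running executable instead of rebuilding it.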

    Abhishek, I have one more question here. Can we pause it?

    Yeah, Khushin.

    So we discussed Monitor and Tune. And if you remember, for the PX4 flight controller, we did Build, Deploy & Start. So can you let us and the audience know: in the design process, when should we finally go for Build, Deploy & Start?

    Yeah, that's a good question. So Monitor and Tune-- as the name suggests, you'll be monitoring the system and tuning it. So it's not final. There are still some adjustments that you want to do. Make sure the drone is doing what you want it to do. It's not wobbling too much. It's not-- it's pretty stable. So these are things that you can tune on the fly.

    And when you have achieved an acceptable performance level, that's when you go for Build and Deploy. In that case, you're completely closing MATLAB. You are deploying the code generated from Simulink and sending it to the Jetson. You don't need Simulink anymore; it's all running on the hardware. And that's the tuned model, so the drone is going to perform as you would expect. I hope that answers your question.

    Yeah. Thanks, Abhishek.

    Yeah. So let me play the video further. I am using Monitor and Tune because the model is not at the level where it can be deployed yet, and also to show the depth data that is coming out of the Jetson. So while the depth data loads, let me also go over the details you need to get right when you're sending and receiving data from the Jetson.

    So in the first model, we are streaming depth data to the Jetson. In that case-- let me also open the other one. The important thing here is to make sure that your IP address and your port numbers are the same. You want to be sending to the same port number that you are receiving from. And the compression format should also be the same, so the Jetson knows what kind of data to expect and can decode it accordingly.
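
    As a rough MATLAB-side illustration of why the port numbers have to match, here is a sketch using the `udpport` interface (assuming Instrument Control Toolbox; the IP address and port are placeholders, and the actual example uses Simulink blocks rather than this code):

```matlab
% Sender side: write datagrams to the receiver's IP address and port.
tx = udpport("datagram");
write(tx, uint8(1:10), "uint8", "192.168.1.10", 25000);

% Receiver side (on the Jetson): listen on the SAME local port, 25000.
% If the sender targets a different port, the data never arrives.
rx = udpport("datagram", "LocalPort", 25000);
data = read(rx, 1, "uint8");
```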

    So once you get all these correct, you should be able to see the depth data that is being streamed to the Jetson. As you can see, the camera data is down here. This is the Unreal environment. And this is the depth data that is being streamed to the Jetson. Let me also quickly open MATLAB to show you all the different functions that you have for the Jetson.

    So if you look at the library browser in Simulink, you can see that for audio and video, we have different blocks that you can use. You can capture audio, receive video, and also read and send video. These are the blocks we're using in this model: we are sending video from Unreal to the Jetson, and we are reading it again to run our obstacle avoidance algorithm.

    And we also have other GPU read/write blocks, and some communication blocks, such as CAN, to talk to the Jetson. And we have some other sensors that you can use, such as the IMUs that you are integrating with the Jetson. You also have MQTT for Internet of Things applications.

    So these are some options that you get when you use the Jetson support package that comes with Simulink. Now let's look at how you can use this data that you're streaming to the Jetson to simulate obstacle avoidance. Let me again open MATLAB before I show the video. This is the obstacle avoidance model we have. Let me make it full screen.

    So the first two sections here, the MAVLink decoding and encoding, are for getting data from MAVLink and sending it back. But the main algorithm is in this obstacle avoidance block. When you go into it, you get all these data: the waypoints that you want to go to, the current position you are in, the position as x, y, z values, and also the orientation, which comes as a quaternion, so you have w, x, y, and z.

    So these go into your obstacle avoidance block, which is a block built into UAV Toolbox. If you just search for obstacle avoidance, you will see that there is a built-in block with UAV Toolbox. It takes in the current position, the current orientation, the target position that you want to go to, and the obstacle points.

    So these obstacle points-- before I go to the obstacle points, let's just see what this block does. It takes all these inputs and gives you the desired direction, the desired yaw, and the current status. This goes into your flight controller, which, in this case, is a Pixhawk.
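
    For those who prefer scripting, UAV Toolbox also ships a MATLAB System object, `controllerVFH3D`, that implements the same 3D obstacle-avoidance idea. This is only an illustrative sketch with made-up input values; check the UAV Toolbox documentation for the exact interface of the block or object you are using:

```matlab
% 3D vector field histogram controller from UAV Toolbox.
vfh = controllerVFH3D;

position       = [0; 0; -5];         % current NED position [m]
orientation    = [1; 0; 0; 0];       % attitude quaternion [w x y z]
obstaclePoints = [2 0 -5; 2 1 -5];   % sensed obstacle points [m]
targetPosition = [10; 0; -5];        % where we want to go [m]

% Outputs: a desired direction to steer toward, a desired yaw, and a
% status flag, which then feed the flight controller.
[desiredDir, desiredYaw, status] = vfh(position, orientation, ...
    obstaclePoints, targetPosition);
```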

    Now, to look at where these obstacle points come from: the depth data that you're streaming to the Jetson is being converted to a point cloud. Just looking at a depth image, you cannot really gauge distance; you can only tell relatively which object is closer and which is further. But with this function, you can convert it to a point cloud that gives you a good estimate of the distance to obstacles.
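
    The depth-image-to-point-cloud step can be done in MATLAB with `pcfromdepth` (Computer Vision Toolbox, R2022b or later). This is only a sketch; the intrinsics and scale factor below are placeholder values that depend on your depth camera:

```matlab
% Camera intrinsics -- placeholder values; use your depth camera's
% calibration (focal length, principal point, image size in pixels).
intrinsics = cameraIntrinsics([525 525], [320 240], [480 640]);

% depthImage is a 480-by-640 depth map; the scale factor converts the
% stored values to meters (placeholder: 1000 for millimeter depth).
depthImage = 2000 * ones(480, 640, 'uint16');   % dummy 2 m everywhere
ptCloud = pcfromdepth(depthImage, 1000, intrinsics);

% ptCloud.Location now holds x,y,z points that can be passed as the
% obstacle points to the obstacle avoidance algorithm.
```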

    So this is then, again, fed to the obstacle avoidance algorithm. And that gives you the waypoints and the desired yaw and pitch that the drone needs to follow, which are sent to the Pixhawk flight controller. So now let's go back to the presentation. And you can see it in action here.

    So, Abhishek, here I see two sub-windows. One of them is the camera view, right?

    Yeah.

    And the second one that we see in black and white?

    So that one is the depth data that we are getting. The darker it is, the closer the obstacle is; the lighter gray it is, the further away the obstacle is. Here, you can see that when the tree comes closer, it becomes darker and darker. And then you can see the drone avoiding the tree by moving slightly left. Finally, it goes to the landing location and comes in here. There's also another example that flies much closer to the ground, so you can see more of the obstacles and see that the drone is changing its trajectory to avoid all the obstacles coming at it.

    So with Simulink, you can quickly do obstacle avoidance. There's already a built-in block; all you have to do is make sure that the data going into the block are in the right format. So you need point clouds; the waypoints where you want the drone to go, which, as Khushin mentioned earlier, you give from QGC through a waypoint follower; and depth data, which is important because you need some way to sense the environment. In this case, we're using a depth camera.

    So if you want to learn more about how you can generate code and deploy it for the Jetson, here are some resources that we have. Some of our colleagues have recorded great videos on deploying not only a regular obstacle avoidance algorithm, but also deep learning algorithms, such as YOLO, so you can use the GPU in the Jetson to speed up your inference steps.

    Now, before we move further, I also want to touch a little bit on the Raspberry Pi, because I know some of you are also using a Raspberry Pi for your obstacle avoidance or as your flight computer. MATLAB and Simulink definitely have support for the Raspberry Pi. We have a support package that supports the latest version, the Raspberry Pi 5. And we also have all these libraries that you can use to interface with the Pi.

    So you can use communication libraries, such as SPI and I2C. You can also read and write GPIO data, and read data from all the sensors you are interfacing with the Raspberry Pi. And of course, MATLAB Coder and Simulink Coder let you deploy the algorithms that you develop as standalone executable code on the Pi. So once code generation is completed, you can close Simulink and MATLAB and just have everything run on the Raspberry Pi.
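
    For interactive work before deployment, the MATLAB Support Package for Raspberry Pi exposes these interfaces through a hardware object. A minimal sketch, assuming the support package is installed; the IP address, credentials, and pin numbers are placeholders:

```matlab
% Connect to the Raspberry Pi over the network (placeholder credentials).
r = raspi('192.168.1.5', 'pi', 'raspberry');

% Digital GPIO: drive pin 17 high and read the level on pin 18.
writeDigitalPin(r, 17, 1);
level = readDigitalPin(r, 18);

% The same object also exposes the I2C and SPI buses, e.g. scanning
% for attached I2C devices on bus 'i2c-1'.
addresses = scanI2CBus(r, 'i2c-1');
```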

    And the coder automatically optimizes your code so it is faster on the Raspberry Pi. Because MATLAB and Simulink already have information on the architecture of the Pi, they know how to optimize the code so it runs faster.

    Since we don't have a lot of time to go into code generation for the Raspberry Pi, I have some resources listed here. Again, my colleagues have recorded wonderful videos that show how you can deploy code from either Simulink or MATLAB to the Raspberry Pi, and how you can focus more on algorithm development rather than low-level code implementation. And now let me give it back to Khushin to summarize the rest of our webinar.

    Thanks, Abhishek. Just allow me a minute to share my screen. Thanks, Abhishek, for letting us know how we can develop code for the NVIDIA Jetson GPUs, and also for discussing the Raspberry Pi.

    So with your session, I got the understanding that we already have a number of tools to develop and deploy our code for the various hardware our audience is using. So far in this session, we discussed how to create the virtual environment and how to develop code for the flight controllers and flight computers.

    With this understanding, we move to the next section, which is all about learning resources, because this webinar alone is not enough, and you may want to learn more about this content.

    So just to recap, there are a lot of tools we discussed. For your understanding: if you want to develop code for PX4 autopilot hardware, you need the UAV Toolbox support package. This is one of the add-ons that you have to get.

    Similarly, for the NVIDIA Jetson, you will need the Jetson support package or GPU Coder. And for the Raspberry Pi, you need the Raspberry Pi support package. So make sure that when you are working on your projects, all these support packages are already installed, so that you can utilize the full capabilities of MATLAB and Simulink.

    If you are new to MATLAB and Simulink, we have MATLAB Academy, where a number of Onramp courses are available, starting from MATLAB and Simulink up to machine learning and advanced workflows. You can learn from these, and you can also earn certificates by completing these courses. These courses are completely free.

    Moving forward, if you are looking for dedicated learning resources for your competition, we support more than 50 competitions, and the list is available on our student programs web page; you can find the link in the chat. There you can find the specific resources for your competition. Also, on these pages, you can request a complimentary software license. We provide all teams complimentary MATLAB and Simulink software licenses.

    Yeah, and that license includes everything that you would need. The support packages will have to be downloaded separately, but all the toolboxes that you would need for your competition will be included. And all of them are free, including the support packages.

    Yes. Yes. But with that, we also have a lot of tutorial videos available, from aerospace to robotics to automotive, if that's what you are working on. A number of tutorial videos have been developed by our colleagues in collaboration with student competition teams like yours. So please go through these resources and ramp up on the specific capabilities you want to learn.

    Specifically related to code generation, we have this code generation training available as part of the student video tutorials. You can go through it and learn the fundamental tools of code generation for any hardware. After that, you can move to the hardware-specific code generation content, which is more about optimizing the code and using it in the best way.

    Finally, we have the student community, where more than 50 videos are available for aerospace and other competitions. We also have two Facebook groups, Robotics Arena and Racing Lounge, with more than 19,000 members. Please feel free to join.

    Also, while working on your projects, you may need some technical support. So always feel free to write to us at these two email IDs, Robotics Arena and Racing Lounge, at mathworks.com, and someone from our team will help you resolve your doubts.

    And finally, we have a lot of content available in the form of blogs and videos on the Student Lounge and File Exchange. So please check out this content. And if you have any questions, post them on MATLAB Answers and tag Robotics Arena and Racing Lounge, so the right people can answer your doubts specifically related to the competitions.

    And one exciting announcement: we are also running the Simulink Student Challenge. For this challenge, whatever Simulink project you are working on, either for a competition or your academic work, you can record a video, upload it on YouTube, and submit it as an entry. So please sign up for this competition by December 12, which is the last date. The winning teams will get cash prizes of $1,000, $500, and $300 USD.

    And now we are open for Q&A.

    Yeah, questions. So we do have some questions. If you want to take the first one: Manav is asking, is this recording going to be shared? And if yes, where will they get access to the recording?

    Yeah, so thanks, Manav. This session is recorded, and within two weeks, you will be getting the recording and the technical resources at the email ID that you used to register for this webinar. So rest assured, all the learning resources, along with the session recording, are going to be available.

    Perfect. We also have Kevin asking where he can find the complete tutorial for doing autonomous flight using PX4 and Simulink. I think we have that link in the chat?

    Yes. Yes, Abhishek. We have shared the links. So, Kevin, for your information, depending on what final mission you want to implement, your hardware requirements change. And as per your hardware requirements, we have examples.

    So, for example, if you just want to do a waypoint follower, we already have examples and resources available for following a waypoint mission. But if you want to go for a complex mission, such as obstacle avoidance, we have examples for that available as well.

    But the difference between the two is that the first works only with the PX4 hardware, and the second works with NVIDIA Jetson GPU support. So you have to identify what hardware you are using and what you finally want to achieve. Accordingly, you can find the resources for deploying and integrating the code for the various hardware, whether it is a flight controller or a flight computer.

    So I can take the next one. Hamza is asking, how reliable and ready is the generated code to be deployed directly to the hardware? What we always say is: if it works in Simulink or MATLAB, the generated code will work on the hardware. However you have developed your algorithm in Simulink or MATLAB, if it works in the simulation environment--

    You can also use Monitor and Tune, as I mentioned, so that the tuning parameters can be adjusted to get the desired performance. Once that is done and you do code generation, it will work exactly as it did in MATLAB or Simulink. So it's very reliable if it works in your simulation environment.

    And the other one is, what is the approach taken for integration of Unreal Engine with the Jetson? So I am not sure what exactly you mean by this, Varun. If you want to run Unreal Engine on the Jetson, then I would say that's a bad idea. You want to use the Jetson for your real-world application, which is processing data in the real world and doing real-time object detection or obstacle avoidance.

    But you can run Unreal on your desktop and then stream that data to the Jetson for processing. For that, we have the Jetson support package, which gives you the option to stream video data from your simulation environment to the Jetson, so that it can do all the heavy processing.

    Abhishek, I would like to add a point here.

    Yeah?

    This question is mainly about how we make the NVIDIA Jetson and Unreal Engine interact. When we talk about Unreal Engine, it is mainly used for visualization. And in Simulink, we have direct support in the form of the Sim 3D viewer, where Unreal Engine visualization scenarios can be imported and you can directly visualize the scenarios.

    So what are we doing? If you look at the slide, you'll see that we are connecting over MAVLink and using UDP/TCP communication. Finally, the scenario simulation is connected with the plant model.

    But this is a specific example where we are running two pieces of hardware together; it's a hardware-in-the-loop simulation. So two separate MATLAB instances are working, and that's why we are using this UDP/TCP communication. But if you simply want to run the simulation for your plant model, you can do the same thing using just the Sim 3D viewer blocks in your same model.

    But if we talk about how it shares the data with the NVIDIA Jetson: in one of the slides, we saw the Jetson alongside QGC. That data transfer and integration works over MAVLink connectivity, and we especially use the UDP protocol for that.

    Yeah.

    Thank you.

    The next one is about MATLAB Academy certifications for aerospace. The question is, are these certifications completely free? Yes. The Onramp courses are completely free. If you go to the advanced courses, depending on what the program is, they may be paid courses.

    But MATLAB Academy provides a lot of courses for students that are free. And also, along with the student competition license, we are coming with the MATLAB training suite. From there, you can access a lot of free Onramp and advanced tutorial programs.

    So the last question is from Tarini. It is: in the HITL setup, when using QGC to give the RC input, it takes garbage values and does not follow the input. I don't know if it is easy to debug without looking at the system, but I'll let you, Khushin, take a crack at this.

    Yeah. Yeah, Tarini. So as Abhishek said, we may need to get into the details, because these systems are connected by MAVLink connectivity. Sometimes the port from which you are receiving the data and the port to which you are sending the data have some issues.

    So you would be receiving the garbage values from some other port, and that can create an issue. But it is difficult to answer here without looking at the actual problem, the actual model. So for all such queries, you can reach out to us at roboticsarena@mathworks.com, and someone from our team can help you with that.

    Yeah, feel free to reach out to us with any technical questions, and we'll usually get back to you within a day or two. We can also go into the details that way and then give our best effort in answering or solving your problems.

    Also, Abhishek, I see there are some questions or points mentioned in the chat as well.

    OK, I think we have time to take one more. Let me see.

    Yeah, we are--

    We are at time. So let me see what the question is. Again, if you have a question, it would be easier if you posted it in the Q&A section.

    So yeah, the one question I see here that I can answer is whether an FTDI cable is required for doing the HITL scenario simulation. For all these scenario simulations, when you go to the examples, they actually show you how your hardware should be connected. So depending on the simulation, you have to check how the connection should be. Please check the hardware setup section of these examples, and you will find all the details there.

    So for example, for the obstacle avoidance system, we have these connections here, and you can see which protocol is used and how these pieces of hardware are connected with each other.

    Perfect. I think with that, we are at time. So thank you all for joining. I hope you learned something new, or that we were able to help you along in your student competition progress or your personal projects. Again, if you have any more questions, feel free to reach out to us at Robotics Arena or Racing Lounge at mathworks.com.

    Thanks a lot, everyone. Thanks, Abhishek. Thanks, team.

    Thank you, Khushin.
