
    Bringing the iCub Humanoid Towards Real-World Applications

    Dr. Daniele Pucci, Italian Institute of Technology

    The iCub project was launched in 2004 as part of the RobotCub European Project, whose main aim was to study embodied cognition: the theory that an organism develops cognitive skills as it interacts with its environment. The main outcome of the iCub project is a one-meter-tall, 53-degree-of-freedom humanoid currently being developed at the Italian Institute of Technology (IIT). Over the years, the iCub robot has been used as a research platform for diverse fields of science ranging from neuroscience to engineering. This presentation focuses on the work that the Artificial and Mechanical Intelligence research lab at IIT is carrying out with the iCub along three axes: physical human-robot collaboration, avatar systems, and aerial humanoid robotics (jet-powered humanoid robots). The presentation shows how MATLAB® and Simulink® are fundamental for research and development on the iCub in control, planning, estimation, and artificial intelligence, and how these tools can be beneficial for humanoid robotics as a whole.

    Published: 7 May 2023

    [AUDIO LOGO]

    Hello, everyone. My name is Daniele Pucci, and I lead the Artificial and Mechanical Intelligence research line at the Italian Institute of Technology. We basically work on humanoid robotics and artificial intelligence.

    More specifically, we are working on iCub, a long-standing humanoid robot developed here at IIT. And today, I'll be talking about how we're trying to evolve the iCub humanoid robot towards real applications: our effort in taking the long-standing iCub platform from a child-sized robot to a platform that can actually do something.

    But before telling you about our efforts in this direction, I wanted to mention why we actually embarked upon the journey of making a humanoid robot that can actually do something. I'll be quick on this, but in order to do so, we have to take a look at the current trends.

    So if you look at some data concerning the average age of the European population, we understand that the next years will be challenging. By 2050, we will have about 130 million people aged 65 or older, which means that all these people, including ourselves, will need to be taken care of, and also monitored by new devices and equipment in the places that surround us. And these figures become a little more worrying if you associate them with the current trends in musculoskeletal diseases, namely back pain.

    So if you take a look at the current trends in musculoskeletal diseases, especially in industry, we understand that these diseases have a large impact on the industrial field, and more precisely on the workers. In the coming years, we will have a somewhat older working population, and most likely the impact of these diseases will also translate into lower productivity. These are challenges that really call for close attention and for new solutions in terms of robotics and artificial intelligence.

    Analogously, other worrying data concern the effects of global warming and climate change, which in turn translate into natural disasters. Every year, thousands of people are killed by natural disasters. And alongside this, there is also the risk of biological disasters. The recent pandemic clearly taught us that the current state of technology is not mature enough to allow human beings to operate remotely. So right now, we really need technologies that allow the human being to operate remotely, and that decouple the operator from the local environment at risk.

    So, in our opinion, all these challenges call for new tools. And in our opinion, these tools take the shape of humanoid robots. We view humanoid robots, let's say, as the hammer of the future, if you want.

    And our starting point is the iCub humanoid robot. The iCub is a robot developed here at the Italian Institute of Technology. The first idea was born around 2005, and the first version of the robot arrived around 2009, thanks to the incredible contributions of Giorgio Metta, Giulio Sandini, and David Vernon, who led the iCub project (RobotCub, more precisely). They were incredibly successful, in that iCub became a robot child that is one of the best-known platforms for embodied cognition. And right now, that robot child needs to grow up in order to help us face the next challenges. This is basically part of our mission as a research team.

    So our mission is basically twofold. The first part is to do research on topics that can allow the evolution of the iCub humanoid robot towards real applications. And the second is to transfer these technologies into concrete applications, also via collaborations with industry.

    And the directions along which we are trying to make iCub grow are basically three. The first one is human-robot collaboration. So we want to understand how to make the robot collaborate physically with human beings.

    The second one is avatar systems. We want the robot to act like an avatar, so it can operate remotely when the human cannot be present at the remote location. And the third one is aerial humanoid robotics: we are the only lab in the world trying to make flying humanoid robots. It is yet another challenging direction along which iCub has to walk.

    So when you work on humanoid robotics and try to get closer to applications, you clearly cannot resort to a single tool or a single programming language. You can imagine that an ecosystem working on humanoid robotics and artificial intelligence combines several tools and methods, among which the MathWorks tools are fundamental for us when we try to achieve two specific objectives. The first one is lowering the barriers for non-expert coders. Most of the time, we receive people without a background in software engineering, such as mechanical engineers or people from other backgrounds, who might have real difficulties starting to code in C or C++. In this case, the MathWorks tools are really, let's say, an incredible catalyzer for us, and they enable young researchers to be productive from the beginning of their experience.

    The second objective for which we use the MathWorks tools is fast analysis, rapid prototyping, and deployment. When you try to increase the complexity of solutions, like a humanoid robot that has to operate in our environment, you have to create architectures. But then you also have to quickly test ideas that you really don't know will work.

    So when trying to test these ideas, we have been developing tools and methods around the MathWorks tools that allow us to be really fast in prototyping and deploying ideas, which might later be translated into other programming languages, either via code generation or by hand. These two objectives are really achieved most of the time thanks to the MathWorks tools. So let's see how we use these tools to make iCub evolve towards applications.
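
    For the code-generation path just mentioned, the workflow can be as simple as building a Simulink model whose code-generation target has already been configured; a one-line sketch, where the model name is purely illustrative:

        % Generate C/C++ code from a Simulink controller model (the model
        % name here is illustrative); the generated code can then be
        % integrated into the robot's C++ software stack.
        slbuild('balancingController');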

    Clearly, you need, first of all, a system representation. So whether you are dealing with a humanoid robot or, let's say, a representation of the human that has to interact with the robot, you need a representation of the system at hand. And for us, the system is almost always subject to holonomic constraints, which lead to differential-algebraic equations.

    And we have been developing a lot of models for floating-base systems that can encompass both humans and humanoid robots. Our long-term objective is basically to view the humanoid robot as a simplified, constrained version of the human being. So we have been putting a lot of effort into developing models for both the human and the humanoid robot, in order to have the same system representation.
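
    To make this concrete, a standard textbook form of the equations of motion of a floating-base system subject to holonomic constraints (generic notation, not necessarily the exact one used in our papers) is

        M(q)\dot{\nu} + C(q,\nu)\,\nu + g(q) = B\tau + J_c(q)^\top f, \qquad J_c(q)\,\nu = 0

    where q collects the base pose and the joint positions, \nu the base and joint velocities, M is the mass matrix, C the Coriolis matrix, g the gravity term, B a selector matrix (the floating base is unactuated), \tau the joint torques, f the constraint forces, and J_c the Jacobian of the holonomic constraints. Both the human and the humanoid robot can be written in exactly this form, which is what makes a unified representation possible.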

    And once you get the math done, you would like to have these tools ready to use. So we first coded C++ libraries that translate the math of floating-base systems into code. And then we have been bringing these libraries into the MathWorks world in two main ways. The first one is bindings: MATLAB bindings generated via SWIG are really a nice way to automatically obtain functions that we can readily use from MATLAB.
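
    To give a feeling for what calling such SWIG-generated bindings from MATLAB looks like, here is a small sketch; the class and method names follow the style of the iDynTree bindings but are indicative only, so they should be checked against the actual binding documentation:

        % Load a robot description and query its dynamics through the C++
        % library exposed to MATLAB via SWIG (names are indicative only).
        loader = iDynTree.ModelLoader();
        loader.loadModelFromFile('model.urdf');

        kinDyn = iDynTree.KinDynComputations();
        kinDyn.loadRobotModel(loader.model());

        % Once the robot state is set, quantities such as the mass matrix or
        % the Jacobians of the constraint frames can be read back in MATLAB
        % and used directly for control design.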

    The second one is that we have also been developing Simulink libraries that wrap the C++ libraries. In particular, we have been working on BlockFactory, which is a really nice and tiny framework that wraps our C++ libraries, but can also wrap other libraries. It is, in general, meant for dataflow programming.

    And then, on top of BlockFactory, we have been coding the WB-Toolbox, a Simulink library dedicated to exposing the iDynTree functionalities. So let's say that this approach allows us to have a system representation. And then what we want to do is basically simulate the system and also control the system. These are the other two objectives that we have been partially achieving with the MathWorks tools.

    And how did we do it? Well, starting from the work that I just presented, we also coded pure MATLAB and Simulink libraries: no longer libraries that wrap C++ libraries, but pure MATLAB and Simulink libraries developed in the MathWorks ecosystem. The first one is the MATLAB whole-body simulator, a simulator for floating-base systems subject to constraints.

    So we have also been able to encode the information about the contacts of the floating-base system. The second one is whole-body controllers, a Simulink library that we developed. This library allows us to rapidly prototype controllers.

    All this has also been reported in two nice publications that I wanted to advertise here, in case you want to know the details. And what are the outputs you can achieve? This video shows a nice implementation of the WB-Toolbox and the whole-body controllers. On the bottom right-hand side, you have the simulator, so the floating-base simulator that we developed. And on the top, you have the controller, which is basically a QP, or quadratic programming, based controller that automatically generates joint torques so that the robot keeps balancing.

    The really nice thing is that exactly the same Simulink controller, instead of sending commands to the simulator, can send commands to the robot, and so it can actually be used on the real robot. So we have these high-performance balancing controllers for the iCub which run completely in Simulink, after being prototyped using the MATLAB-based simulator. And in this case, the code was automatically generated from the Simulink blocks.
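
    To give an idea of what the core of such a controller looks like, here is a minimal MATLAB sketch of the quadratic program solved at every control step. The matrices are toy placeholders for the momentum, postural, and contact tasks actually used; quadprog is the Optimization Toolbox solver:

        % Decision variable x = [joint torques; contact wrenches].
        nJoints = 23;                         % actuated joints (illustrative)
        nWrench = 12;                         % two 6D contact wrenches
        n       = nJoints + nWrench;

        H   = eye(n);                         % task + regularization weights (placeholder)
        c   = zeros(n, 1);
        Aeq = [zeros(6, nJoints), repmat(eye(6), 1, 2)];  % toy momentum/dynamics equality
        beq = zeros(6, 1);                    % e.g. desired rate of change of momentum
        lb  = -100 * ones(n, 1);              % simple torque/wrench bounds
        ub  =  100 * ones(n, 1);

        x       = quadprog(H, c, [], [], Aeq, beq, lb, ub);
        tauStar = x(1:nJoints);               % torques sent to the simulator or robot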

    So basically, these demonstrations show specific instances of trajectory generation and control. And now what we are trying to do is add a loop on top of this by using a parameterization of the underlying system. So you see this pi appearing around the diagram: this pi is basically composed of parameters of the system, like densities and lengths. And then you can further optimize these parameters using an additional block.
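
    A minimal sketch of such an outer loop, assuming a hypothetical simulateAndScore function that builds the model for a given parameter vector (the pi in the diagram), runs the whole-body controller in the MATLAB simulator, and returns a scalar cost:

        % Hardware/control co-design outer loop over the parameters pi
        % (e.g. link lengths and densities). simulateAndScore is a
        % hypothetical user function; the numbers below are illustrative.
        p0 = [0.25; 0.30; 1000];              % initial lengths [m] and density [kg/m^3]
        lb = [0.20; 0.25;  800];              % lower bounds
        ub = [0.35; 0.40; 1500];              % upper bounds

        pStar = fmincon(@(p) simulateAndScore(p), p0, [], [], [], [], lb, ub);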

    So this approach is then used to make iCub evolve. And one direction, as I was mentioning before, is physical human-robot collaboration. In this context, we are about to present a new iCub, which is called ergoCub. I will not be talking about ergoCub in this presentation, but it will be coming out in the next months.

    The ergoCub project involves INAIL, the Italian national institute for insurance against workplace accidents. And the overall problem for INAIL is how to reduce the impact of musculoskeletal diseases, because they have to pay compensation after a claim is filed.

    So if you cannot work, you file a claim with INAIL, and then they have to provide economic support. So there is a really big concern at INAIL: how can we achieve so-called prevention by design using new technologies, in the form of humanoid robots and wearables? We would like the workers of the future to wear smart clothes that send data to humanoid robots, which can intervene when the so-called biomechanical risk for the human is too high.

    And the ergoCub project is addressing this in two validation scenarios. One is health care, where nurses, when they have to move patients, are often subject to musculoskeletal diseases. The other one is industry-like environments, where moving payloads also induces musculoskeletal diseases.

    In this research axis, we have been using iCub 3, which is a new version of the iCub: it is taller and optimized for manipulation and locomotion tasks. And the nice thing is that iCub has evolved thanks to the approach that I showed you before. I cannot give you all the details, but I just wanted to mention that, thanks to the overall approach of parameterizing the underlying system and finding optimal parameters, you can really make dynamic behaviors emerge from the system.

    But in this specific video, there is no interaction between the robot and the human. So what we have been doing is add human collaboration to the classical control architectures for humanoid robots. And the problem basically has to be decomposed.

    First of all, if the robot has to react to a human being, it needs to sense the human. So we have been working on wearable sensors. And then we have been working on models of human behavior, because the overall control architecture for the robot has to, on the one hand, perceive the human, but on the other hand also have models of the human.

    So I cannot give you all the details; I really invite you to take a look at the publications. But the overall idea is that the robot looks at the human and says: well, how does the human move? Since the human looks like me, it has to move like me. So the balancing controllers of the robot are made similar to those of the human being.

    And interestingly, this architecture can be mapped one-to-one to a Simulink block, thanks to the libraries I showed you before. So in the same Simulink block, we have the robot model, which is a floating-base system representing the robot, and we also have the floating-base system representing the human being. So we can extract, online at about 500 Hz, quantities of both the human and the robot at the same time. And then we can synthesize controllers on top of this.

    What is the output? Well, the outputs are something like this. You have the robot that collaborates physically with a human being.

    And the human is also wearing our own wearable sensors, which we call iFeel; it's basically a motion tracking and force tracking system for the human. And then, online, Simulink generates joint torques for the robot according to the joint stresses of the human. So it tries to minimize the joint stresses of the human in collaborative lifting tasks.

    Interestingly, you can substitute the human with another humanoid robot, and everything works, almost magically, because the approach basically sees floating-base systems that exchange forces. So you can generate joint torques for two humanoid robots at the same time. In this case, the overall objective was to lift a box.

    But the planned trajectory had to minimize the energy consumption of the green robot. So we automatically have trajectory generation methods that can generate minimum-energy trajectories for one of the two systems. And once again, it all runs in Simulink.

    And this, I have to say, was really, really fast to do. If we had had to do it in other programming languages, it would have been really complex, because you have to combine different models of humans and humanoid robots, and it would have taken much, much longer.

    So the second research axis we are evolving iCub for is aerial humanoid robotics. In this case, what we are trying to do is make the iCub fly. You might be wondering why we want a flying humanoid robot. Well, we have to get back to the initial slides.

    Take the response to natural disasters: in this scenario, you have floods and fire. It is already difficult to make a humanoid robot work in the lab, so you can imagine that passing through floods and fire would be much more difficult.

    So we need a platform that can take off, avoid floods and fire, land, inspect, open doors, and then, once the inspection is finished, move to another building looking for survivors. So we need platforms that combine a degree of terrestrial locomotion, like walking, with manipulation and aerial locomotion, so flying.

    And the idea, back then, was simple: we take the humanoid robot, which already has manipulation and locomotion, we equip it with jets, and that's it, we have the flying humanoid robot. This idea took a while to implement, and there are actually several reasons why it is difficult.

    The first problem, clearly, is the design of the robot itself. You have to imagine that jet turbines emit gas at supersonic speed and at around 800 degrees Celsius. So you have dynamic pressure that breaks everything and temperatures that melt most things. So we had to develop a new version of the iCub, which we call iRonCub, that could withstand the temperature and the dynamic pressure. We basically had to redesign the robot in order to equip it with jets.

    Once the design was done, we had to modify the previous control architecture to take the jets into account. The overall issue is that we wanted the control and planning of the robot to generate not only the robot joint torques, but also the throttles of the jets. Clearly, the first question is: how do we deal with the model of the jet turbines?

    So once again, in Simulink, we have been developing models that could read data from a test bench, and then we could process the data for system identification purposes. And as I was mentioning, here you start seeing some flames, because jets are not easy to deal with. So the burden in terms of architectural complexity, from the systems perspective, is something that you cannot really neglect, and Simulink in this case greatly lowers the complexity of fast prototyping. We really had to understand what was and wasn't working on the jet turbines.
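
    As an example of what this kind of processing can look like in MATLAB, here is a sketch that fits a low-order transfer function from throttle command to thrust using the System Identification Toolbox; the synthetic data below stands in for the real test-bench logs:

        Ts = 0.01;                            % sampling time [s]
        u  = randn(1000, 1);                  % throttle log (placeholder data)
        y  = filter(0.05, [1 -0.95], u);      % thrust log (placeholder data)

        data   = iddata(y, u, Ts);            % package input/output data
        sysHat = tfest(data, 2);              % fit a 2nd-order transfer function
        compare(data, sysHat);                % check the fit against the data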

    Right now, what we are doing is using the Deep Learning Toolbox, because although the system identification techniques proved to be successful, they are not really accurate when you also want a model that takes temperature and pressure into account. The model of the turbine is complex, so we are currently working on feedforward neural networks in order to have a better model for the turbine. And once again, this is done thanks to the MathWorks tools.
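
    A sketch of this with the Deep Learning Toolbox, assuming input features such as throttle, rotor speed, temperature, and pressure, with thrust as the output (the feature choice is an assumption, and the random data stands in for the real test-bench logs):

        N = 2000;
        X = rand(4, N);                       % features: throttle, RPM, temperature, pressure
        T = sum(X, 1) + 0.01 * randn(1, N);   % measured thrust (placeholder data)

        net = feedforwardnet([16 16]);        % two hidden layers
        net = train(net, X, T);               % train on the test-bench data
        thrustHat = net(X);                   % turbine model predictions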

    So once we got the model of the turbine, we then had to develop the architecture. And the architecture is built on a library that is made completely in Simulink. We developed this library to provide a certain number of utilities for the jet engine dynamics, not to mention the visualizer itself. We were then able to synthesize quadratic-programming-based controllers that could run online, generating not only the joint torques for the robot, but also the jet throttles.

    And once again, everything runs in Simulink, both the simulator and the controller. The simulator can also be attached to Gazebo, another simulator that roboticists know pretty well. And exactly the same controller works here; only the API changes. So in this case, we were still using the same Simulink controller, this time controlling not our MATLAB-based simulator but the simulation in Gazebo.

    And right now, what we're trying to do is test the controller on the real robot. So this is the iRonCub during ignition. And the ignition part is also handled in Simulink.

    And we have been able to achieve the so-called flight zero, which is the unloading of the forces. There are videos that I cannot show you right now, but we think that the first flight is going to come soon.

    So, just to wrap up, we started with the question of when and why to use the MathWorks tools when working on humanoid robotics and AI. For us, MathWorks was fundamental in lowering the barriers for non-expert coders, and also for fast prototyping and rapid deployment, which are fundamental activities when you don't really know yet how to solve the problems.

    And right now, there are many problems in robotics that are still open. So thank you very much. And in case you have questions, I'm here.

    [AUDIO LOGO]