Smart Factory Summit - MATLAB
    Video length is 2:12:58

    Smart Factory Summit

    Anup Wadhwa, Director, Automation Industry Association (AIA)
    Prof. Sunil Jha, Director IITD-AIA Foundation for Smart Manufacturing
    Shireesh H, Practice Head – Digital Transformation & Industry 4.0, Tech Mahindra

    Overview

    The industrial world is changing with the emergence of the smart factory. Industry 4.0 technologies such as Industrial IoT, big data analytics, artificial intelligence, and robotics are enabling manufacturers to have a sustainable and profitable future. It is a digital transformation journey that needs vision, commitment, and value creation along the way. Each organization is different, and so is its journey. MathWorks and FSM will be hosting this event to bring together experts from industry and academia to showcase technology and workflows that will help you go through the Fourth Industrial Revolution and generate value from your digital transformation.

    Recorded: 20 Oct 2021

    Welcome to the Smart Factory Summit. MathWorks and FSM will be hosting this event to showcase technology and workflows that will help you go through the Fourth Industrial Revolution. FSM, the foundation for smart manufacturing, is an initiative at the Indian Institute of Technology, Delhi, with collaboration from the Automation Industry Association and the Indian Ministry of Heavy Industries and Public Enterprises.

    FSM's charter is to promote the adoption of Industry 4.0 solutions in the Indian manufacturing sector. To begin this event, I welcome Mr. Prashant Rao, who heads the Application Engineering Team at MathWorks India. And it is over to you, Prashant.

    Good morning and good evening, good day to everybody, all the attendees of this session. As you heard, my name is Prashant Rao. I head Application Engineering for MathWorks in India, and I extend a warm welcome to you to the Smart Factory Summit. This summit is a culmination of two different things, actually. One is the start of an engagement that we have got.

    We've actually been partners with FSM since mid or the latter part of 2020, but due to the pandemic issues, we weren't able to do a physical event to announce that partnership. So we're using this opportunity to do that on the one side. This also has given us some time to prepare some technical demonstrations that we can talk about today as well.

    The other part is that it's the culmination of a series of webinars we have had over the past few weeks around the smart factory and enabling AI in smart factories. We have already had three sessions in this series, and in case you were not able to attend any of them and you are interested in taking part, we have the recordings of these sessions available.

    So please feel free-- I'm leaving this slide up for a few seconds-- to scan the QR link on the right-hand side, or look up the Smart Factory Webinar Series on the MathWorks home page, and you'll be able to find the other sessions as well. Today is the culmination of the series, in that sense, where we have the Smart Factory Summit with a series of speakers.

    We'll have Mr. Anup Wadhwa joining me in the introduction to the session. He's the Director of Automation Industry Association in India. We have Professor Sunil Jha, who will deliver the keynote address. Professor Sunil Jha is Director of the Foundation for Smart Manufacturing. He is also a Professor at IIT Delhi in the Mechanical Engineering Department.

    We have Philip Wallner joining us from MathWorks in Germany. Philip is our Industrial Automation and Machinery Industry Marketing Manager, and he'll be sharing a few thoughts from his side as well. And last, and absolutely not least, is Mr. Shireesh. He's the Practice Head of Digital Transformation and Industry 4.0 at Tech Mahindra. He'll be sharing some insight on how he is looking at Industry 4.0 in the context of Tech Mahindra's work.

    We have a technical demo team comprising team members from FSM, Vinshu and Omkar, and a team from MathWorks as well, a few application engineers from the team. We have Peeyush, Ramanuja, Rahul, and Shripad, who will be working behind the scenes. Sometimes you'll hear them, but they've done a lot of work behind the scenes as well. Right, moving on to the agenda for today.

    We have a brief introductory note. After a few minutes, I'll hand it over to Mr. Anup Wadhwa for a few thoughts and an introduction from his end. We will then have a keynote address from Professor Sunil Jha. We will move on to Philip Wallner, who will talk about the factory of the future, and then the industry presentation mentioned earlier, by Mr. Shireesh, around the role of digital twins in Industry 4.0: Tech Mahindra's point of view and experiences.

    We have time at that point in time for a Q&A. Please post questions in the chat. We may be able to take them and curate them from the chat and answer them during that Q&A session as well. At times, we may answer the questions directly in the chat as well. We have a five-minute stretch break. We have two videos that will be running in the break, but this also gives you an opportunity to get some coffee, refresh yourself.

    And then we have a technical demonstration session where we'll showcase, through three different demonstrations, how we can transform from a traditional to a smart factory setup. We have demonstrations around building a smart lathe machine, digital twin modeling of a machine, and virtual commissioning with PLC integration. So let me spend a few minutes talking about your hosts, specifically now about MathWorks.

    MathWorks is a leader in the 2021 Gartner Magic Quadrant for Data Science and Machine Learning for the second year in a row, and this is on the basis of our completeness of vision and the ability to execute on this vision. The difference is that we understand that AI success demands more than just an effective model or an algorithm. It's about delivering a full product or a service to the market that is based on AI, and we approach this problem by enabling engineers and scientists to incorporate AI across their entire system design work.

    A few more words about MathWorks itself. You may know some of our products. MATLAB is a programming environment for algorithm development, data analysis, visualization, and numeric computation. And Simulink is a graphical environment for designing, modeling, simulating, testing, and verifying systems. The work that you do in MATLAB and Simulink can be deployed on any kind of system: embedded systems, desktop systems, the cloud, and many other options.

    We have a whole set of 100-odd different products for specialized tasks. Here is an example of a product known as Computer Vision Toolbox that is specifically for computer vision tasks. And these products span several domains across engineering and science.

    Talking a bit about digital transformation in manufacturing: we hear this term a lot, and the way we understand it is as data integration between assets, the edge, OT-- operational technology-- and IT systems. Whereas these systems used to be encapsulated in their own silos, now, with the communication technology at hand and the technology that enables us to sync these different systems, we can actually send data seamlessly across this entire family of systems.

    What does this enable us to do? On the one side, we can have integrated operations. That means we know at the IT level, or at the organizational level, about processes that are happening in a specific asset or in a specific production environment. But the feedback loop is also of value, possibly of more value: we can then optimize and manage processes at the factory level or at the asset level, across the entire organization.

    So optimizations can occur throughout the organization, taking inputs and data from across the organization to enable that optimization. And finally, when applied completely, this allows organizations to use business insight to drive and optimize manufacturing processes. This also works vice versa: you can use manufacturing processes to optimize your business decisions as well. So overall, organizations find great value in being able to enable digital transformation across the workforce.

    What does it mean when we look at it from a domain perspective, from a domain-expert perspective, especially when we're talking about engineers and scientists? We believe that engineers and scientists should be enabled to do the work demanded of them in the best possible manner, and we provide tools that enable domain experts to do their best work. Moving from traditional, first-principles-based approaches to system design and development, and now incorporating some of the newer data-driven techniques-- big data, AI, analytics, and cloud integration-- our tools enable domain experts to leverage AI-based applications and AI-based technology across their entire workflow.

    In terms of the partnership with FSM, what we expect and what we do, we think there are three key pillars to this. Basically, we would like to enable and accelerate the adoption of smart manufacturing methodologies in the market, and we can do this by focusing on the technologies themselves, which is part of what you will see today, but also by building competency, helping people adopt these methodologies through domain expertise, empowerment, and collaboration.

    So we are focused on that as well, through sessions like this, and on helping organizations set up the processes they would need to be able to develop or leverage smart manufacturing techniques. Whether it is setting up processes for model development, test and validation, or incorporating AI and these technologies across an entire enterprise, basically enabling the entire digital thread, we will be very, very happy to help organizations do this.

    I will hand it over to Mr. Anup Wadhwa for the rest of the introduction. Anup Wadhwa is an alumnus of IIT Delhi. He served in systems engineering and special projects for BHEL and Rockwell Automation as well as SAMTEL. As director of AIA, he conceptualized and steered the industry partnership with IIT Delhi and the Department of Heavy Industries to set up the IITD-AIA Foundation for Smart Manufacturing, FSM.

    Anup G. is an evangelist of the Samarth Udyog Ecosystem that integrates the efforts of different agencies in government and academia. He has served on Government and UNIDO Expert Committees related to country-relevant initiatives and Industry 4.0 with special focus on interdisciplinary cooperation. Anup G., I'll hand it over to you to take us through the rest of this introduction. Thank you very much.

    Thank you, Prashant, for that introduction. And also, thank you, Sunita, for sharing a brief on the IIT-Delhi-AIA Foundation for Smart Manufacturing. First of all, I would like to place on record my sincere thanks to you, Prashant, and the entire team of The MathWorks for being a very active participant-- I would say more a partnership approach, which your team has demonstrated-- and I'm very sure that you know, we would take this forward with greater speed now that we are getting ready for physical engagements.

    Having said that, I would like to share with our audience a little bit about the genesis of Samarth Udyog and our foundation as a hub for Samarth Udyog. A few years back, Prime Minister Modi was the chief guest at the Hanover Fair, where India was also a partner country. That was the time he was taken to a booth and shown a bit about Industry 4.0. He was very fascinated, and he foresaw that a country's economic development was very closely linked with how we would absorb and implement this new technology.

    So he came back and set things, you know, in motion: Who are the agencies? What can they do? How can India ramp up absorption of this technology? In our meetings at the Automation Industry Association, we began to discuss the implications of Industry 4.0, and we had several players from Germany, America, and Japan on our board. And you know, they had a lot to share about how their respective parent organizations were at the forefront.

    One message was very, very clear: there was no single company claiming the leadership position, and there were a lot of cooperative efforts going on in these countries to establish the foundation for Industry 4.0. I'd like to draw the attention of the participants to this slide. As you can see, there are two colors splashed: the blocks in green, and the darker blocks.

    Now, if you look at the blocks in green, they represent the established entities of an automation or automated manufacturing ecosystem. So you have the automation players, the engineering process contractors, the machine builders, the line builders, control integrators, panel builders, and distributors. This group of players was, at that time, already adopting well-documented engineering practices. They were very well aware of global standards, and buyers and consultants were also pretty clear about how to tackle the acceptance and testing of physical assets, including the automation system.

    So that was one established ecosystem. On the other hand, if you see the dark blocks, there was this emergence of the software-based industry players. So we had the industrial IoT solution providers, we had the cybersecurity providers, you had the IT Infra providers, and generally, we had a great buzz about cloud computing and internet-based solutions as the next big thing happening for the industrial world.

    Now the dilemma was not about whether or not the IT and OT systems would integrate. The key concern was how we, as a nation of industry practitioners, of educators, of skill providers, of standards agencies, would understand, absorb, educate on, and implement Industry 4.0 solutions going forward. So with that challenge, the Indian government took a leadership position and-- can I have the next slide-- we all realized that even the advanced countries had a mission statement-- they had a flag-- and they were putting in all their energy in a very well-coordinated manner.

    So Industry 4.0 was obviously pioneered by Germany, but then you had China, you had the Americans with the AMP, and you had the Indian government giving a new twist to our mission, calling it Samarth Udyog. That was the genesis of the Foundation for Smart Manufacturing, and I'll talk about that in a minute. Can I have the next slide?

    So players from the automation industry, consulting, and the software industry came together. And along with IIT-Delhi's gracious hosting of the facility, we looked at several capabilities. You can see the slide. We have the IIoT systems, the collaborative robots, smart sensors, rapid prototyping, AR, simulation, remote maintenance, safety, and cybersecurity. A lot of this is a work in progress, but all these pieces are being knitted together very well by Professor Sunil Jha, who is the chief guest for today's function.

    You will get to hear from him about how he's going about actually building this facility and creating an infrastructure that is available for industry and academia. As I hand the baton back to Prashant, I'd like to extend a very warm invitation to all the guests on today's show. Please come to see this facility and help us develop more and more use cases for you and for India, so that collectively, this can truly become a facility of national importance and relevance. Thank you so much, and I also look forward to Professor Jha and the other esteemed speakers and their valuable contributions.

    Thank you very much, Anup G., for your introduction. I will hand it back to Sunita, as we move through the logistics of switching presenters, to introduce Professor Sunil Jha at this point in time.

    Thank you, Mr. Anup Wadhwa and Mr. Prashant Rao, for this insightful opening address. I now welcome Professor Sunil Jha to deliver the keynote address. As director of the IIT-Delhi-AIA Foundation for Smart Manufacturing, Dr. Jha is closely assisting the manufacturing industry in absorbing new technologies to make it ready for the next Industrial Revolution. And it is over to you, Professor Jha.

    Thank you so much. Very good morning. Let me share my slides, and then we can begin. I hope my slides are visible to you.

    Yes sir.

    OK. Thank you so much for this nice introduction at the start of the session today. As you are already aware, this is a series of talks on smart manufacturing initiated by MathWorks. And today, we are having this summit, where we'll discuss some of the methodologies of implementing smart manufacturing, the kinds of solutions we have developed, and the kind of infrastructure and facility we have created together.

    So I will take you through a very quick glance at what kind of facility we have and what exactly we are doing, because in a short time of 20 minutes, it is very difficult to go deeper into the technologies. But I will at least give you the idea of implementing smart manufacturing. Our foundation is certainly committed to developing solutions, doing research and development together with the leading industries in this domain, and bringing out solutions that are more adoptable by Indian industry.

    So, as we all heard, this is a common engineering facility center, as briefed by Mr. Anup. As a part of this mission by the Government of India, IIT-Delhi has created this common engineering facility center with the Automation Industry Association. So we bring together the strengths that lie with the industry. And as you all understand, we now have partners working with us from all across the world.

    These are MNCs that are contributing and sharing their ideas, thought processes, and new technology developments with us. At the same time, as an academic institution, we also have the required specializations at IIT-Delhi, which are certainly helping us develop solutions for smart manufacturing. So with this initiative, certainly supported by the government and financially supported by the industry and the government, we have been able to build a world-class facility at IIT-Delhi to do research in this area and develop solutions for the industry.

    So let me quickly brief you about the different stages in implementing smart manufacturing, because that's how we think about the implementation. As we all understand, the basic foundation is built by the automation technologies. That is how we look at it: sensing, computation, and actuation have to be integrated. If there are legacy machines, old machines, we need to integrate these basic ingredients, all the elements of automation, into the system. After that, we would like to build connectivity with these assets. That is the prerequisite for the implementation of smart manufacturing.

    On top of that, many solutions and services can be built. The journey of smart manufacturing implementation certainly starts after these are in place. There are multiple technologies that fall into the four blocks shown on the right: how you can bring visibility into the organization, how transparency can be built across the different departments and with customers, how the predictive capability of an organization can be built up, and, certainly, how we can move toward more self-organized systems. The system should be able to adapt to changes so that decision-making can happen. That last stage of smart manufacturing implementation is adaptability.

    So when we talk about implementation across these different stages of smart manufacturing, we are talking about these key smart manufacturing technologies. When we started this journey of understanding smart manufacturing technologies, we were certainly familiar with the manufacturing domain and the automation technologies available as part of Industry 3.0 developments. And certainly, we look forward to more cyber-physical integration with the technologies that are already available as part of Industry 3.0.

    So we planned to do a lot of experimentation around these technologies. Certainly, you understand these technologies better when you experiment with them. So we created some basic infrastructure to learn and understand these technologies and to create a platform where people from academics and people from industry can come together and do a lot of experimentation. These days, you may have heard a lot about tinkering labs; similar initiatives exist for basic science learning and STEM learning.

    This is one such infrastructure, available for doing a lot of experiments in the domain of smart manufacturing. We created this basic cyber-physical assembly line in our lab, an assembly line for electromechanical components. Starting from the basic ordering process, from a mobile application or web application, you can select the component to be assembled and place the order. You will get information on how much time the assembly operation will take and when your product will be ready for delivery.
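    The order flow just described (select a component, place the order, get an estimated completion time) can be sketched in a few lines. This is a minimal illustrative model in Python; the station names and cycle times are invented for illustration, not taken from the FSM line:

```python
from dataclasses import dataclass, field
from itertools import count

# Hypothetical station cycle times in seconds -- illustrative values only.
STATION_TIMES = {"pick": 12, "assemble": 45, "test": 30, "label": 8}

@dataclass
class AssemblyLine:
    """Toy model of the order flow: each order queues behind earlier ones."""
    backlog_s: int = 0                              # queued work, in seconds
    _ids: count = field(default_factory=lambda: count(1))

    def place_order(self, product: str) -> dict:
        cycle = sum(STATION_TIMES.values())         # one pass through all stations
        eta = self.backlog_s + cycle                # wait for queue, then own cycle
        self.backlog_s += cycle
        return {"order_id": next(self._ids), "product": product, "eta_s": eta}

line = AssemblyLine()
print(line.place_order("relay-module"))   # first order: eta_s = 95
print(line.place_order("relay-module"))   # second order waits behind the first
```

The point is only the pattern: the application computes an estimate from the current backlog at order time, exactly the information the mobile or web front end reports back to the customer.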

    So this assembly line was created by mixing Industry 3.0 and 4.0 technologies in one initiative: the automation stations were designed, developed, and integrated here, with backend communication and IIoT smart sensors integrated. So the whole technology landscape I showed in the previous slide is put together in this basic system. And, intentionally, we chose to put heterogeneous hardware in place so that a parallel can be drawn with manufacturing industries, because there you will find heterogeneous automation hardware that has to operate together.

    Then we also added a state-of-the-art, multi-process robotic cell to the same facility. On one side, we have more than 12, even 15, axes of motion being executed by three different controllers with three different HMI interfaces, with networking between the systems. In this multi-process robotic cell, a single six-degrees-of-freedom robot is integrated with the automation technologies in place. This small, three-by-three-meter footprint is capable of demonstrating a complete flexible assembly line where the robot does the pick-and-place, with functional testing stations integrated as part of it.

    Starting from the basic raw material to the final, finished product being labeled and dispatched, that is the capability demonstrated in this cyber-physical assembly line. We designed this multi-process robotic cell with the capability to assemble multiple types of components: orders can be customized, the routing of the product can be changed on the fly, and finally a finished good can be delivered. A lot of learning happened as part of this development, in terms of how robotics can be integrated with automation and how IoT data can be used for controlling and monitoring the cell as a system.

    So here is a quick inside view of the facility. There is an incoming conveyor from which the raw material is picked up by the robot. It is a ceiling-mounted robot, which does the pick-and-place so the assembly operation can start. We have assembly stations; over here, we have a functional testing station, and we have a nutrunner as well as the screen arrangement inside the cell itself.

    We designed this system, and some indigenous development also happened as part of this activity; we have the final packaging and labeling system integrated into it. We have also implemented the capability for a PLC to control the robot, both motion control and robot programming, through one single interface. So that is also implemented in this case.

    Now we are also extending this. This is one part of the lab, and we are extending a similar facility toward creating a smart factory, a kind of micro production system, where we will connect some CNC machines and older infrastructure, like a lathe machine, with a state-of-the-art robotic welding cell that we are designing and putting in place. All of these are connected by a mobile robot, which is responsible for transporting and inspecting material all across the floor of the smart factory.

    We are also building applications for the smart factory so that supervisors, the plant manager, and the maintenance engineer can get real-time information about the production facility. A certain framework needs to be prepared: how the data can be put together, how this information can be exchanged while keeping the data on premises, and how the solutions can be delivered to the various stakeholders. We are designing all these applications for the smart factory.

    The next development is creating a robotic welding cell. The idea is to demonstrate how a flexible manufacturing cell can be designed and created, one that can be made available as a service, because most of the time we see that manufacturing infrastructure lies idle when it is not being utilized for making your own products. So we are designing this cell so that it can be available as a service. You can have a web interface or mobile interface through which you define your product requirements-- in this case, a welding requirement-- and upload your data to the cell; the cell will schedule the order, then finally proceed with the welding operation and do the right quality testing.

    All of this is being integrated into this robotic welding cell. Throughout the process, we will also demonstrate how automatic identification of a product can be done, because each product welded in this cell is actually unique. It is not a repetitive welding operation, as you see in most mass-production activities. A particular job is to be welded in a particular fashion, and its data is made available before the job is submitted to the cell.

    So we designed this architecture, where the customer web application captures all the required data through the cloud interface. The necessary data is then delivered to the local production server, and from there the production activity starts. The information is passed to the master PLC, because here we have multiple controllers to coordinate. The master PLC coordinates control between the welding power source and the multiple robot controllers, because this cell has two robots: one is the welding robot, the other is the material-handling robot. At the same time, we will also be integrating the welding inspection system inside the facility.
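    The chain just described (customer web application, cloud interface, local production server, master PLC fanning out to the welding power source and the robot controllers) can be mocked as a simple message flow. This is a sketch only; every class and function name here is a hypothetical stand-in, not FSM's actual software:

```python
# Illustrative message flow: web app -> cloud -> production server -> master PLC.

class MasterPLC:
    """Stand-in for the master PLC that sequences the sub-controllers."""
    def __init__(self):
        self.log = []

    def execute(self, job):
        # The master PLC coordinates the devices in order.
        for device in ("handling_robot", "welding_power_source",
                       "welding_robot", "inspection_system"):
            self.log.append((device, job["job_id"]))
        return {"job_id": job["job_id"], "status": "welded"}

class ProductionServer:
    """Local server receiving job data pulled down from the cloud."""
    def __init__(self, plc):
        self.plc = plc
        self.queue = []

    def receive_from_cloud(self, job):
        self.queue.append(job)

    def run_next(self):
        return self.plc.execute(self.queue.pop(0))

def submit_job(server, job_id, weld_spec):
    # Stand-in for the customer web application plus cloud interface.
    server.receive_from_cloud({"job_id": job_id, "spec": weld_spec})

plc = MasterPLC()
server = ProductionServer(plc)
submit_job(server, "J-001", {"seam": "butt", "length_mm": 120})
print(server.run_next())   # {'job_id': 'J-001', 'status': 'welded'}
```

The design point the architecture makes is the single coordination point: only the master PLC talks to the power source and robot controllers, so the upstream layers never need device-level protocols.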

    So this cell is under creation; we are almost in the final finishing stage of this robotic welding cell. And certainly, once we allow access to visitors, you are most welcome to come and witness this facility. We have also created enough infrastructure for training, because that is a very important part of our learning: whatever time we spent understanding these technologies over the last three years, we look forward to making the same infrastructure available to all learners from industry and academia.

    So we have this smart manufacturing training system designed and created, based on our understanding of how learning can happen for smart manufacturing technologies. These are made accessible remotely as well, so that a lot of experiments can be done by students and by people from industry from a remote location. They can try out their own concepts and implement them on this facility.

    We are also integrating multiple technologies to bring out more demonstrations and use cases here. That is how the smart manufacturing trainer kit works: it is integrated with electromechanical hardware to demonstrate the various industrial communication protocols, and with edge hardware on which vision processing can be done. Certainly, solutions built with toolboxes-- Prashant was mentioning the Computer Vision Toolbox-- can be deployed onto this edge hardware, and data can then be exchanged over OPC UA with the industrial hardware.
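    As a rough illustration of that edge workflow (run a vision check on the edge device, then publish the result where industrial hardware can read it), the sketch below uses a plain dictionary as a stand-in for an OPC UA server's node space. A real deployment would use an actual OPC UA stack and a trained vision model; the brightness check and tag names here are invented for illustration:

```python
def vision_check(frame, threshold=200):
    """Toy 'defect detector': count pixels brighter than the threshold."""
    defects = sum(1 for px in frame if px > threshold)
    return {"defect_pixels": defects, "pass": defects == 0}

class TagServer:
    """Dictionary stand-in for OPC UA nodes such as 'Station1.Pass'."""
    def __init__(self):
        self.nodes = {}
    def write(self, node_id, value):
        self.nodes[node_id] = value
    def read(self, node_id):
        return self.nodes[node_id]

server = TagServer()
frame = [120, 130, 250, 90]            # fake grayscale pixels; one is too bright
result = vision_check(frame)
server.write("Station1.DefectPixels", result["defect_pixels"])
server.write("Station1.Pass", result["pass"])
print(server.read("Station1.Pass"))    # False -> part flagged for review
```

The pattern is what matters: the edge device does the heavy processing locally and exposes only small, named values, which is exactly what downstream PLCs or dashboards poll.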

    All these experiments can be done in this environment, and certainly we can bring in technologies like augmented reality to generate an augmented view of the information on the physical system. You can certainly implement checks on quality parameters with the various sensors. We are also working on developing solutions like an expert maintenance system, where multiple technologies are put together in this environment to build a system that can help us identify and analyze the system behavior and find faults. And once you are able to locate the fault, you can use technologies like AR to locate and replace the faulty part as a part of the maintenance activity.

    We are also working on integrating multiple solutions available around robotics, such as the robotics toolbox available from MathWorks and open-source ROS integration. All of that is an ongoing process we are working on. Putting open-source technology alongside commercial technologies to make the solutions available and viable for the industry: that is the idea behind this implementation.

    We are also integrating the digital twin part here. We will try to build solutions like this, where you can give goals to the robot, the planning is done with open-source libraries like MoveIt, the robot motions can be controlled, and the final solution can be deployed onto the robot controllers and the extended controller.
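    At its core, the goal-to-motion idea is producing a joint-space trajectory from a start configuration to a goal. The sketch below reduces that to straight-line joint interpolation; a real planner such as MoveIt additionally handles collision checking, inverse kinematics, and joint limits, so this shows only the shape of a planner's output, not the planning algorithm itself:

```python
def plan_joint_path(start, goal, steps):
    """Linearly interpolate each joint angle from start to goal."""
    assert len(start) == len(goal)
    path = []
    for i in range(steps + 1):
        t = i / steps                                  # 0.0 .. 1.0
        path.append([s + t * (g - s) for s, g in zip(start, goal)])
    return path

# 3-joint example (radians); the values are illustrative, not from a real robot.
path = plan_joint_path(start=[0.0, 0.0, 0.0], goal=[1.0, -0.5, 0.25], steps=4)
print(path[0])    # [0.0, 0.0, 0.0]
print(path[-1])   # [1.0, -0.5, 0.25]
```

A controller would then stream each waypoint to the drives at a fixed rate, which is the handoff point between the planning layer and the robot controller mentioned above.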

    We are also working on developing solutions for conventional machines. That is what we are going to demonstrate in the later part of this summit itself. The idea is to take the same technologies we are using for state-of-the-art equipment, deploy those solutions on legacy machines, and convert the whole infrastructure into a smart infrastructure in the facility. We have already done the desired integration in terms of the sensors and the controllers in place.

    The idea is to monitor and control product quality on manually operated machines, monitor the machine health, and finally provide guidance to the operator who is manually running the machine. That kind of solution lets us support people in production, not just automated control, because most of our manufacturing facilities have this kind of infrastructure. We will demonstrate that integration to you in a later part of this event.
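
    As a rough illustration of the kind of product-quality monitoring on a manually operated machine described here, a simple statistical process control check can flag when measurements drift out of control. This Python sketch is not FSM's actual implementation; the part dimensions, baseline data, and 3-sigma rule are illustrative assumptions.

    ```python
    # Minimal SPC (Shewhart) check: flag measurements outside mean +/- 3 sigma.
    # Illustrative sketch only; the baseline data and tolerances are made up.
    from statistics import mean, stdev

    def control_limits(baseline):
        """Compute lower/upper 3-sigma control limits from in-control data."""
        m = mean(baseline)
        s = stdev(baseline)
        return m - 3 * s, m + 3 * s

    def out_of_control(measurements, lcl, ucl):
        """Return indices of measurements violating the control limits."""
        return [i for i, x in enumerate(measurements) if not (lcl <= x <= ucl)]

    # Baseline diameters (mm) recorded while the process was known to be stable.
    baseline = [10.01, 9.99, 10.02, 10.00, 9.98, 10.01, 10.00, 9.99]
    lcl, ucl = control_limits(baseline)

    # New parts measured by the operator; the last one has drifted.
    new_parts = [10.00, 10.01, 10.15]
    alarms = out_of_control(new_parts, lcl, ucl)
    print(alarms)  # index 2 exceeds the upper control limit
    ```

    An operator-guidance layer would then translate such an alarm into a concrete instruction, for instance to re-check tool wear before the next part.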

    We are also working on creating digital twins of machine tools; that is how the second demonstration is linked with this. We have certain machines at IIT Delhi, and we are trying to build digital twins of these machines. The idea is to implement simulations of the physical machine tools, represent the machine tools with different models, and, on the other side, implement the monitoring and control technologies together.

    The idea is that once we have this fleet-based knowledge system put together, with a simulation on one end and, on the other, the physical machine running under its local controller, the two systems will exchange information between themselves. The real machine data is used to improve the accuracy of the simulation model, while a simplified version of the simulation is used to monitor and control the machine activity. That is how the digital twin implementation will be demonstrated in the next part.
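
    The two-way exchange described here, where real machine data improves the simulation while the simulation monitors the machine, can be shown with a toy calibration loop. This Python sketch is a heavy simplification under invented assumptions: the "model" is a single friction coefficient updated from measured spindle power, not an actual machine-tool simulation.

    ```python
    # Toy digital twin sync: measured data nudges a model parameter toward
    # reality (calibration), and the calibrated model predicts/monitors.
    # Purely illustrative; real twins use far richer physics models.

    class SpindleTwin:
        def __init__(self, friction=0.10, gain=0.2):
            self.friction = friction  # model parameter to be calibrated
            self.gain = gain          # learning rate for calibration updates

        def predict_power(self, speed_rpm):
            """Simplified model: power losses proportional to speed."""
            return self.friction * speed_rpm

        def assimilate(self, speed_rpm, measured_power):
            """Use field data to reduce model error (exponential update)."""
            implied = measured_power / speed_rpm
            self.friction += self.gain * (implied - self.friction)

    twin = SpindleTwin()
    # Stream of (speed, measured power) samples from the physical machine,
    # consistent with a true friction coefficient of 0.15.
    for speed, power in [(1000, 150.0), (1200, 180.0), (800, 120.0)]:
        twin.assimilate(speed, power)

    residual = abs(twin.predict_power(1000) - 150.0)
    print(round(twin.friction, 4), round(residual, 2))
    ```

    Each sample moves the model parameter closer to the value implied by the measurements, so the prediction error shrinks, which is the essence of keeping a twin faithful to its physical counterpart.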

    We are also running various training programs, starting from basic awareness through deep dives into the technologies and how to do the implementation. We have designed our own courses, which run over a period of a month and offer a much deeper engagement in terms of learning the technologies. These are the training programs that FSM is conducting, so those who are interested can certainly join; watch for announcements in this direction.

    Thanks, all. I think that's what I have for you today. The idea is that we would like to demystify the technologies involved in smart manufacturing, and we are open to all sorts of demonstrations, prototyping, and research, all put together in this foundation. So thank you so much. Over to you, Sunita.

    Thank you very much, Professor Jha, for this enlightening session on implementing smart manufacturing, and especially the exciting sections on the cyber-physical assembly line and factory. We wish FSM all the very best in supporting the industry in adopting smart manufacturing initiatives. Just a reminder for our audience: if you have any questions for our speakers during the event, please do put them into the Q&A box-- or click on the bottom right of the screen to bring up the Q&A box-- and our speakers will answer them shortly.

    I now welcome Mr. Philipp Wallner, industry manager for industrial automation and machinery at The MathWorks, to take us through the session on the factory of the future. Welcome, Mr. Philipp, and it's over to you.

    So thank you, Sunita. Let me share from my end. Please let me know if you can see my screen.

    Yes, Philipp.

    Thank you.

    Thank you.

    So hello and good morning. My name is Philipp Wallner, industry manager at The MathWorks. I'm working out of Munich, Germany, and it's an honor for me to speak at today's Smart Factory Summit. The title of my presentation is The Factory of the Future, and in the next 20 minutes or so, I want to talk about how the systematic use of models and data supports your digital transformation on the way to the factory of the future.

    Well, in my talk today, I will do this by highlighting two specific practical examples of companies that are already on their digital transformation path and are extensively using models and data for that. But first, let me give you a brief overview of The MathWorks and who we are. We heard a bit from Prashant Rao already, but just to give you a bit more background: The MathWorks was founded in 1984. Since then, we have grown to a team of more than 5,000 employees who support our customers worldwide, and more than half of our employees are developing our products, specifically our core platforms MATLAB and Simulink.

    MATLAB, as we already heard, is used for developing data analytics algorithms and functionality for artificial intelligence, and Simulink is a graphical environment for developing and testing mechatronic systems. And on top of these two platforms, we by now have more than 100 tools for industry-specific workflows: for instance, for code generation for industrial controllers, as we will see later on in my presentation, for predictive maintenance, or for industrial communication through OPC UA, as we just heard from Professor Jha. OPC UA is pretty commonly used for connected systems in the manufacturing industry.
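
    OPC UA, mentioned here as the common protocol for connected manufacturing systems, organizes machine data as a server-side address space of nodes that clients browse and read. To stay self-contained, the Python sketch below fakes the server with a plain dictionary; the node identifiers are invented, and a real client would use an OPC UA stack (for example, an open-source library such as python-opcua/asyncua) to connect over the network instead.

    ```python
    # Stand-in for an OPC UA server's address space: node-id -> current value.
    # In a real system these values would be read over opc.tcp:// from a PLC;
    # the node ids below are invented for illustration.
    fake_address_space = {
        "ns=2;s=Machine.SpindleSpeed": 1450.0,   # rpm
        "ns=2;s=Machine.ToolTemperature": 62.5,  # degrees C
        "ns=2;s=Machine.State": "RUNNING",
    }

    def read_node(node_id):
        """Mimic an OPC UA read service call against the fake address space."""
        if node_id not in fake_address_space:
            raise KeyError(f"BadNodeIdUnknown: {node_id}")
        return fake_address_space[node_id]

    # A monitoring application polls the nodes it cares about.
    speed = read_node("ns=2;s=Machine.SpindleSpeed")
    state = read_node("ns=2;s=Machine.State")
    print(state, speed)
    ```

    The key idea the sketch captures is that clients address data by structured node identifiers rather than raw registers, which is what makes OPC UA data self-describing across vendors.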

    Our insights and our expertise on digital transformation come from our work with many customers and companies across multiple industries. Some of these industries come from very electromechanical roots and are now shifting more and more into software-defined and more autonomous systems. Other industries don't have any physical products at all, so they are leaders in the area of using big data analytics and integrating with IT systems.

    But all of the companies in these industries are undergoing a digital transformation journey at the moment. And while of course there are multiple differences between these industries, we also see a lot of commonalities. So what does digital transformation really mean for the manufacturing industry and for the factory of the future? What we keep hearing from customers in the manufacturing industry is the common theme of mass customization: how can individualized goods be developed and manufactured with the cost benefits of mass production? And this is really at the core of Industry 4.0.

    I quoted a recent article from Rainer Brehm of Siemens, and he describes mass customization as producing what matters: reducing waste by only producing what the market and the customer really require and demand. So what does that mean for the factory of the future? How can the factory of the future really be prepared for mass customization? Mass customization, in our experience and our customers' experience, requires connected, flexible, and autonomous production.

    And that means that the systems that are used in the factory are becoming more capable, they are more software-driven in order to be able to quickly adapt to new situations, to new products, to new parameters. And that actually means that these systems themselves-- the machines, the equipment themselves-- have to be connected, flexible, and autonomous. That sounds good but also comes with some challenges.

    Recent studies show that growing design complexity is among the top challenges that engineering teams in the manufacturing industry see. And if you look at the right-hand side, at this study from McKinsey, this is specifically true for the software parts: the machine application that runs on the machine, on industrial controllers or other software platforms that are controlling the machine. What you can see here is that software complexity is growing much faster than software development productivity.

    So how can this be addressed? For this, I want to bring up the first practical industry example that I have here. In this video, you can see the robot box. The robot box is a pick-and-place system developed by Krones. Krones is a worldwide industry leader in the bottle-filling industry; they deliver bottle-filling machines and bottle-filling lines, and their R&D is mainly located in Germany. And as you can see here, what the robot box does is pick up containers of bottles and react flexibly to the demands that their customers have.

    And how does Krones really address the complexity of a system like the robot box? What they do is they build models. On the one hand, as you can see on the left-hand side, they build models of the entire system, really high-level models, to test the functionality and to do virtual commissioning. And then they also take specific mechatronic components and develop subsystem models in tools like Simulink and Simscape, where they can really dive deeply into questions like what forces occur on a gripper system, like on this robot box, or what maximum torques you need on your motors and gears.

    And actually, these system models and the subsystem component models are two aspects of the total lifecycle of the equipment, of the manufacturing machine. But as we said before, digital transformation comes from using models over the entire lifecycle. So these models are then connected to requirements documents and to use cases, to really make sure that what is modeled and developed addresses the needs and the requirements for these machines, for this equipment.

    And then once you have the models and have tested the models, implementation comes more or less automatically. What tools like Simulink or MATLAB offer is generation of executable code, be it C/C++ or IEC 61131-3 code that runs on industrial controllers, or executable code in the form of Docker containers that then runs on industrial servers or in the cloud; we will see more about that later on. The same models are then used for verification and validation, not only on the physical system itself, but actually much earlier in the design process, with the virtual representation, the models of the system and of the software. This really leads to a shift left of the V&V process, to identify and fix errors much earlier in the design process.

    But of course, the systematic use of models doesn't end here. What we see then is that these models are enhanced and updated with data from the field, with operation data, and then developed into specific digital twins for an asset in operation, for instance for predictive maintenance or for operation optimization. So many of our customers are using models in MATLAB and Simulink for this workflow, which we call Model-Based Design. But they are actually asking for more. What they really want is to add additional functionality, specifically additional software functionality, to their machines in [AUDIO OUT] and to feed operation data and measurement data back into their design process, into their development teams.
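
    One common way operation data from the field feeds a digital twin for predictive maintenance, as described here, is trending a degradation indicator and extrapolating to a failure threshold. The following Python sketch fits a linear trend to invented vibration readings; it is a didactic simplification, not the MathWorks predictive maintenance tooling referenced in the talk.

    ```python
    # Fit vibration RMS vs. operating hours with least squares, then solve
    # for when the trend crosses a failure threshold (remaining useful life).
    # Data and threshold are fabricated for illustration.

    def linear_fit(xs, ys):
        """Ordinary least-squares line: returns (slope, intercept)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
                / sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    hours = [0, 100, 200, 300, 400]
    vib_rms = [1.0, 1.2, 1.4, 1.6, 1.8]   # mm/s, steadily degrading
    threshold = 3.0                        # alarm level, assumed from a datasheet

    slope, intercept = linear_fit(hours, vib_rms)
    hours_at_threshold = (threshold - intercept) / slope
    rul = hours_at_threshold - hours[-1]
    print(round(rul))  # estimated hours of useful life remaining
    ```

    Real predictive maintenance models are usually nonlinear and probabilistic, but the workflow is the same: field data updates the model, and the model in turn schedules maintenance.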

    So in other words, I would say what they want to offer to their customers is the perpetually upgradable machine. And well, how do we get there? To explain that further, it makes sense to take this flat view of the lifecycle and rearrange it a bit: split it into the development phase on the one hand and the operations phase on the other, and then connect these two with continuous delivery and continuous feedback.

    And that ends up in this picture. I'm sure this picture is familiar to most of you from agile development. This is actually DevOps, a set of practices that is typically used for software-only systems in IT. But we believe that if you combine DevOps with models, make it model DevOps, then this is a very valid and very beneficial way of developing entire mechatronic systems, and not only the software that runs on them.

    So let's switch gears a bit and come to the second example that I mentioned. Atlas Copco is a manufacturer of air compressor systems, and they use models in MATLAB and Simulink for designing the functionality on their air compressors, for instance for automatic adaptation to the specific environment and operating points of the compressor in operation. So a digital twin is designed based on these models. But then they also connect these models and integrate them into their manufacturing environment, connecting them to the CNC machines where the individual pieces for the compressors are manufactured; so the digital twin is produced.

    But they also use these models and integrate them with the configuration tools that their sales engineers use with Atlas Copco's customers, to identify the right-sized and rightly parameterized air compressor; so the digital twin is configured. And last but not least, Atlas Copco has more than 100,000 machines in the field from which they continuously get operation data as feedback, and they feed this back into the models that they run on their servers. They then use the data, connected with these models, for predictive maintenance, for optimizing maintenance intervals, and for identifying and solving issues in the field; so the digital twin is maintained. In summary, I would say what Atlas Copco does is use models and data systematically over the entire lifecycle, with different digital twins of the air compressors, and that results in more flexible air compressors for their customers, higher quality of the systems, and lower maintenance costs.

    So for the last part of my presentation, let us step back to the model DevOps image again. Here, I want to dive a bit deeper into the build piece of this process. What is behind this build piece? When developing models in environments like MATLAB and Simulink, you simulate the behavior on your desktop computer. But then, at the end of the day, you need to deploy the models somewhere on a production system. So what we offer here is deployment on various different targets, from embedded and real-time systems to enterprise systems and the cloud.

    And I want to dive a bit deeper specifically into code generation for industrial controllers and for PLCs, because as we just saw in Professor Jha's talk, PLCs are omnipresent in the manufacturing industry; they are typically the devices where the software runs that automates the factory or the equipment. So what does that mean? What does code generation for industrial controllers, specifically for PLCs, look like? Well, first of all, the basis again is models, models in MATLAB and in Simulink.

    These models could be, for instance, controls models like a PID controller or a state machine. This could also be AI functionality, like for predictive maintenance, or for reinforcement learning when automatically teaching a robot that needs to adapt flexibly to its environment, as we saw in the robot box example earlier on. It could be optimization algorithms, for instance for path planning. And of course, it could also be a digital twin, the model of the plant itself, of the mechatronic system itself, that runs in parallel, runs in real time, and is connected to the physical plant.
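
    A discrete PID controller of the kind mentioned here is small enough to sketch directly. The Python below uses invented gains and a toy first-order plant; in the workflow the talk describes, such logic would be modeled in Simulink and generated as C/C++ or IEC 61131-3 code rather than hand-written.

    ```python
    # Positional-form discrete PID with a simple output clamp, regulating
    # a first-order plant. Gains and plant dynamics are illustrative only.

    class PID:
        def __init__(self, kp, ki, kd, dt, out_limit):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0
            self.out_limit = out_limit

        def step(self, setpoint, measurement):
            """One sample period: compute the clamped control output."""
            err = setpoint - measurement
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            u = self.kp * err + self.ki * self.integral + self.kd * deriv
            return max(-self.out_limit, min(self.out_limit, u))

    # First-order plant x' = -x + u, simulated with Euler steps.
    dt = 0.01
    pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=dt, out_limit=10.0)
    x = 0.0
    for _ in range(2000):
        u = pid.step(1.0, x)   # regulate plant output to setpoint 1.0
        x += (-x + u) * dt
    print(round(x, 3))
    ```

    The integral term drives the steady-state error to zero; on a real PLC the same loop would run in the cyclic task at a fixed scan time matching `dt`.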

    These models are simulated in the desktop environment. And then what we can do is generate real-time code, C/C++ or IEC 61131-3 structured text or ladder diagram, automatically integrate this generated code into your hardware vendor's IDE, and then compile it there and deploy it on the respective piece of hardware, on the respective PLC or industrial PC. Well, that comes with a couple of different benefits. Obviously, this really streamlines the process of developing functionality and bringing that functionality into operation, and of course it also shortens the whole cycle, which allows for more innovation at a faster pace.

    Secondly, code generation helps eliminate manual coding errors. And the third benefit, which companies in the manufacturing industry typically like a lot, is that it makes you more independent of a specific hardware platform. You develop your functionality once, test it once in simulation, and then use code generation for targeting your respective PLC platform or controls platform. That, for instance, also allows you to easily go for a second-source hardware vendor strategy.

    And for these workflows, we are working very closely with all major vendors of industrial controllers and PLCs, as you can see here in this table. I also wanted to elaborate a bit on who is using code generation for industrial controllers today. It's a pretty broad range; a lot of our customers in the manufacturing industry are using these workflows today.

    Some of them, we have official references with, like for instance here, IMA in Italy. They have developed machine learning models, or deploy machine learning models, for predictive maintenance on the PLCs on their equipment. And ENGEL or Metso develop controls functionality that they deploy on the industrial controllers. Vestas is using code generation for their wind energy power plants. We already talked about Krones using code generation for their pick-and-place robots. And we also see that system integrators, like Vintecc in Europe, are using model-based design and automatic code generation for PLCs and other industrial controllers heavily in their customer projects.

    And that actually brings me to my summary slide. I hope I could show that the systematic use of models and data throughout the entire lifecycle helps you on your digital transformation journey to the factory of the future. We also saw that deployment on PLCs and industrial controllers can help you shorten your design cycles and allows for new functionality that can then be deployed on your perpetually upgradable machines. And last but not least, you don't have to go on this journey on your own. The MathWorks has a lot of experience working with industry leaders in this field. We have experts all over the world, and we are ready to support you with your factory of the future projects. So just approach us and reach out to us. Thanks a lot for your attention.

    Thank you, Mr. Philipp, for this informative talk and for joining us so early in your day from Germany. It was good to learn about the insightful aspects of the factory of the future, and especially about the perpetually upgradable machine. We now welcome Mr. Shireesh, Practice Head, Digital Transformation and Industry 4.0 at Tech Mahindra. Mr. Shireesh will speak on the role of the digital twin in Industry 4.0. Welcome, Mr. Shireesh, and it's over to you.

    Thanks Sunita. I hope I'm audible and I'm sharing my screen. Just a moment for that.

    Yes, please. Thank you.

    So while my screen sharing comes up, I thank Mr. Anup for sharing the government's focus and the initiatives it has taken, and Professor Jha for the impressive lab, infrastructure, and facility that they have set up. Honestly speaking, I envy the students and the beneficiaries of the facility. And Mr. Philipp, thanks for giving us the industry view on how manufacturing is evolving, right?

    So with that, I would want to start my part of the presentation. This is what I have on the agenda: a very brief look at who we are and what we do, and then getting into Tech Mahindra's point of view on the digital twin, what our experiences have been so far, and how we see it going forward. And finally, we're going to see a small video demo of an end-to-end digital twin technology demonstration that we have built.

    Talking about us, we are part of the larger Mahindra and Mahindra Group of companies, with 22 billion in consolidated revenue, and Tech Mahindra itself is five-plus billion in revenue. I represent a group called Integrated Engineering Services, which brings the engineering and the IT business together. As part of our digital offerings, this is our digital footprint, and we contribute heavily on the smart cities and telecom side: all track-and-trace use cases, connected products, connected vehicles. When we say connected vehicles, anything that moves gets classified out here, be it automotive, transportation, or for that matter the aerospace side of it.

    And then we have a dedicated practice for the connected factory and Industry 4.0, and this is where we deal with discrete manufacturing as well as process manufacturing. In terms of our offerings, we have end-to-end offerings all the way from consulting to, wherever need be, sensorization, gateway development, edge computing, AI/ML, application development, and cloud engineering, right? And we have built appropriate infrastructure to facilitate all of the development in these areas.

    Our digital engineering offerings span three major buckets: smart products, smart manufacturing, especially Industry 4.0, and smart aftermarket services. On the smart products side, we help our customers make their products smart, or build products that are smart and capable of catering to the demands of the digital transformation we're talking about. On the smart manufacturing side, it is about the supply chain, optimizing production, and bringing efficiencies into operations. I would say almost 60% of our digital offerings fall in this space.

    And on the digital aftermarket side, it is predominantly the analytics, AI- and ML-based services and the asset monitoring kinds of services that we offer. This is the laundry list on the manufacturing side, be it condition-based monitoring, predictive maintenance, prescriptive maintenance, AR/VR, track and trace, digital twin, or digital thread. Of course, digital twin and digital thread cut across all three buckets, but yes, they are something else we offer our customers.

    So with that brief introduction, I would want to get into the topic of the digital twin, our point of view, and our experiences in this area. Today, we see an organization as a function of a three-dimensional space. One dimension of the organization is that it procures parts or raw materials from suppliers and then puts them together, or processes them, to supply finished goods to customers.

    The second arm of the three dimensions is product conceptualization to product realization. And the third dimension of this three-dimensional space is shop floor visibility. We understand that currently there are a lot of discrete IT/OT systems that are prevalent, which encapsulate and hold information within themselves, so this information is currently available in silos. Enabling information exchange between these discrete IT/OT systems, we believe, would help bring out more meaningful insights to drive efficiencies and optimize the whole process.

    And we believe that the digital twin and the digital thread are the two technologies that are going to enable us to achieve this. The digital thread provides the infrastructure necessary to connect the data points and bring the data together across these various discrete IT/OT systems and across the functions, while the digital twin helps bring out more meaningful insights into the effectiveness of the current processes and drives more optimization across the value chain.

    Having said that, today my discussion is more focused on the digital twin. That said, we have also built digital thread solutions; if that is of interest, people can reach out to me and I should be able to provide you more information on that, OK? Now let's take a glimpse of what industry experts feel about the digital twin. This is a collation from various analysts who have talked to industry experts and SMEs.

    On the bottom left, what you see is the SMEs' opinion on the functions of an organization that the digital twin would benefit: 76% of them say it's going to help bring efficiencies across maintenance, repair, and operations; 69% say it's going to help bring optimizations into manufacturing; 62% say it's going to aid quality control; and so on and so forth.

    On the bottom right is what the SMEs feel are the essential or key components required to build and realize a digital twin: 70% say physical assets are necessary, 55% say a live data set is something that would be needed, 49% say an offline data set, and so on and so forth.

    Having said that, the definition, the structure, and the benefits of the digital twin are still evolving; every organization has their own understanding of what the digital twin means to them. We at Tech Mahindra classify digital twins into four categories: product twin, process twin, supply chain twin, and customer twin. Say I'm an OEM manufacturing a product that is going to be deployed across different geolocations, and I want to get the data from these machines that are already deployed and utilize this information to bring optimizations or efficiencies into the design itself. Then I'm looking at a product twin.

    The key, salient feature that classifies a product twin is its need for design information; it would need the technicalities of the machine's construction and its functions. We'll get into that a bit later in the deck. Then we have the second category, which is the process twin. This is where, say, for example, if I want to understand the utilization of my machines, the effectiveness of my processes on the shop floor, or the quality delivered or derived from these processes, then I typically would be looking at a process twin.

    Again, there is a significant overlap between the process twin and the product twin, especially when we talk about process industries such as vaccine manufacturing, build manufacturing, or any chemical processes, and so on. But in most cases the process twin would rarely use the design information of the machine itself, right? Then we have the third category, the supply chain twin. This is more for localization use cases: if I want to know the whereabouts of my movable assets, I would go for a supply chain twin.

    And then we have the customer twin. This is a unique categorization, I would say. Say, for example, I am the owner of a car and I want to know the performance of my car with respect to its benchmark specifications; then I would be looking at a customer twin. And we at Tech Mahindra work on all four of these kinds of twins.

    Digging a bit deeper, this is how we see a typical digital twin solution. The bubbles that you see in the center of the circle are, most of the time, point solutions. But when multiple point solutions get clubbed together to offer a consolidated solution, it qualifies to be called a digital twin. That's our opinion.

    At the first level of the circle, you would see quality analytics, real-time insights, process visualization, and so on. The next level of the circle could be a combination of multiple first-level bubbles; say, for example, condition-based monitoring is a combination of real-time insights and process visualization. Predictive analytics is a combination of process visualization and [AUDIO OUT]. So if you have a unified solution that offers multiple of these benefits or point solutions, then it gets classified as a digital twin.

    And when we talk about a digital twin, the typical digital twin takes real-time data from the equipment or sensors, real-time data from the process, and real-time data from the machines. There is data from the simulation side of it as well; it also takes data from the quality side, and there is offline data that is also used for its consumption.

    And the benefits typically offered by a digital twin would be insights into the performance or effectiveness of the quality and the processes of the product. Then you have real-time insights into the performance of the product, forecasting and prediction, and, in some cases, it gets into the space of recommending the next best action. This is where, in certain [AUDIO OUT], it would aid the user with recommendations as to what would be the more optimal decision to take; in other words, it would help them make appropriate decisions more quickly.

    And then, of course, there is perceivable visualization, along with alerts and notifications that help a user understand it more from a human perspective. Listed below are the typical enablers that are seen as essential to build a digital twin. At the same time, when we talk about the modeling side of it, we at Tech Mahindra talk about four kinds of models that go into building a digital twin.

    One is the structural model; the structural model typically comes from CAD tools, 3D drawings, and so on. The second one is the functional model; this is where MATLAB and Simulink kinds of tools come into the picture, where we have a complete modeling of the functionality of the whole process or of the equipment itself. The third one is the reliability or performance model; this is where the twin would consume data from reliability analysis, FMEA kinds of processes, or in some cases an FEA analysis as well, mathematical models, and so on.

    OK, and the last one is the data science model. Pure ML- and vision-based analytics and AI deep learning all fall into this category. The typical digital twin would consume data from various such sources, and which data source is accounted for, for which functionality of the digital twin, is a matter of a more discrete decision.

    Having said that, we at Tech Mahindra, as a technology demonstrator, have built an end-to-end digital twin solution for a micro factory facility that we have in one of our labs in Bangalore. We have a lab called the Factory of the Future Lab, and we have set up a micro factory facility in that particular lab and built this digital twin for that facility, right? The facility consists of a four-station conveyor system, a robotic arm, and all the typical stuff that goes into making a discrete manufacturing facility. We work with cordless power tools, with RFID readers, barcode readers, Andon interfaces, and stack lights all available on this particular system, OK?

    Having said that, what the solution offers is remote monitoring and control of this machine. It provides real-time insights into the health, performance, and KPIs for the machine, and not only for the machine but also for most of the major subsystems on the machine itself, right? We have coupled it with a condition-based monitoring and predictive analytics solution. It also comes with interactive 3D- and AR-based visualization; I like to coin the term "perceivable visualization" for it.
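
    Condition-based monitoring of the kind mentioned here, real-time insights plus an alarm when health degrades, can be reduced to a rolling statistic compared against a limit. This Python sketch uses an invented bearing-temperature stream and thresholds; the actual Tech Mahindra solution described in the talk is of course far richer.

    ```python
    # Rolling-mean condition monitor: alarm when the smoothed signal
    # exceeds a warning limit. Signal and limits are fabricated.
    from collections import deque

    def monitor(stream, window=3, limit=75.0):
        """Yield (index, rolling_mean, alarm) for each incoming sample."""
        buf = deque(maxlen=window)
        for i, sample in enumerate(stream):
            buf.append(sample)
            avg = sum(buf) / len(buf)
            yield i, avg, avg > limit

    # Bearing temperature (deg C); heats up abnormally toward the end.
    temps = [70, 71, 70, 72, 74, 78, 82]
    alarms = [i for i, avg, alarm in monitor(temps) if alarm]
    print(alarms)
    ```

    Smoothing over a window keeps single noisy samples from tripping the alarm, which is why condition monitors rarely compare raw readings directly against a limit.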

    So I'm going to be showing a small video demo of this, and you'll get to see what I mean by perceivable visualization. Then we have repair and maintenance assistance that we have built on HoloLens, which would aid the operator or maintenance person in a mixed reality environment with step-by-step instructions to carry out the appropriate maintenance SOP, right? In terms of the technology stack, the bottom-most picture represents the various entities of the layers that go into making the digital twin. At the bottom, mostly, we have the sensors, the machines, the devices, and so on.

    On top of it, we have the PLC network, and for our purposes we have gone with a multi-vendor OEM configuration for the PLC network. Then we have the SCADA system on top of it, which is more for local visualization. And on top of that, we have the IoT platform. In this particular case, we have gone with PTC ThingWorx -- and again, being SIs, we are pretty much agnostic to the IoT platform, though ThingWorx has been the choice for this particular solution, right?

    And on top of that, we have the visualization layer, which we have built with the Unity engine, hosting the application on a tablet -- an iPad-class device -- for this particular solution. Of course, in the backend, we have interfaces to all the models that I talked about earlier: the structural model, the functional model, the reliability model, and the data science model.

    So with this, I want to switch to a crisp video demo, after which we will be open for questions and answers. Can somebody confirm whether you're able to see my video, please?

    Yes, Shireesh. We can see the video.

    Thanks. OK, so let me briefly explain what you're going to be seeing out here. This is a split screen: on the right side, you see the real asset -- the four conveyor stations, the robotic arm, and so on. On the left side, we have the digital twin application that we have hosted on an iPad, right?

    One would be able to launch the digital twin by scanning the QR code for that particular system. And this is the interactive visualization, the perceivable visualization, that I was talking about. As the various tabs come up, I will walk through each of them and the features they offer.

    Clicking the first tab brings you here. On the top, you have the metrics related to the utilization of the machine -- the idle time, the run time, and all the information that helps assess utilization. On the right side, you have the status of the machine: whether there are any fault alerts, whether the machine is in the idle state or running state, and so on. Then you have the OEE and operational metrics as well. One is able to start and stop the process automation right from the twin itself, using these two buttons out here.

    So I want to draw your attention to this portion of the machine, OK? What we have done is incorporate condition-based monitoring. The pallet is expected to move from point A to point B within a certain duration. If for any reason the pallet falls behind, the CBM kicks in and says there has been a process violation, specifically in this region, and it tries to draw the attention of the operator or maintenance person to take a look at it.
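
The transit-time check described here amounts to a small watchdog. This is only an illustrative Python sketch of the idea, not the actual PLC/ThingWorx implementation; the time limit, function name, and status strings are all assumptions:

```python
from typing import Optional

# Hypothetical transit-time limit: the pallet must travel from A to B within this window.
MAX_TRANSIT_SECONDS = 8.0

def check_transit(entered_a_at: float, reached_b_at: Optional[float], now: float) -> str:
    """Flag a process violation if the pallet has not reached B in time."""
    if reached_b_at is not None and reached_b_at - entered_a_at <= MAX_TRANSIT_SECONDS:
        return "OK"
    if reached_b_at is None and now - entered_a_at <= MAX_TRANSIT_SECONDS:
        return "IN_TRANSIT"
    return "PROCESS_VIOLATION"
```

A "PROCESS_VIOLATION" result is what would trigger the highlighted alert drawn on the twin.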

    [CONVEYOR RUMBLES]

    On the second tab, we have a dashboard for the smart tools. On the top, we have the metrics for the utilization of the smart tools; on the right side, the connectivity and health status of the smart tools; and we also have the job configuration identified for each tool. The tools also have geofencing and geotagging features. So if you want to restrict the utilization of a tool to a specific area or region, we can do that, beyond which the tool would be non-functional.
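
The geofencing behavior described above reduces to a point-in-region check. A minimal Python sketch, assuming a rectangular work area with made-up coordinates (the real feature lives in the IoT platform and tool firmware, not shown here):

```python
# Hypothetical geofence: the tool only operates inside this rectangular work area
# (coordinates in metres in a shop-floor frame; all values are illustrative).
WORK_AREA = {"x_min": 0.0, "x_max": 4.0, "y_min": 0.0, "y_max": 2.5}

def tool_enabled(x: float, y: float, area: dict = WORK_AREA) -> bool:
    """Return True if the smart tool is inside its permitted region, else disable it."""
    return area["x_min"] <= x <= area["x_max"] and area["y_min"] <= y <= area["y_max"]
```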

    As and when operations are performed on the tool, we get to see the status of whether it was a good one or a bad one, over here.

    [MUSIC PLAYING]

    And we also get to see the historic quality of the runs -- how many of them have been good and how many have been bad. For traceability purposes, we also have job ID versus torque. So if somebody wants to go back and look into what kind of torque was applied to what kind of job, for a root cause analysis, they have it handy, right?

    [MUSIC PLAYING]

    Then the interesting feature is the major component view. This is where we get to see all the major subsystems and information related to their health status and running status in one single view. We have hot spots on each of these major subsystems; clicking on a hot spot takes you to that particular subsystem, and the user is presented with the KPIs for that subsystem. In this case, it's the motor of the conveyor system, and we get to see the current status of the motor: the current torque, voltage values, power frequency, and so on.

    And some of the parameters are controllable and configurable right from the twin -- like, say, for example, the RPM. You can set an RPM right from the digital twin itself, right? Then we have a vibration sensor mounted on this particular motor, and clicking on the dashboard view for it takes you to this particular screen.

    This is where we have built the ML-based forecast and prediction side of it. What you'll see here is the historic vibration levels -- the red portion of the graph -- and the conical portion of the graph is the ML-based forecast and prediction. If I have a limit identified for the vibration level, the intersection point between the limit and the conical graph lets me know how much time is left before the asset would require any sort of maintenance. While this isn't completely implemented, we are currently also working on a physics-based model.
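
The idea of intersecting a forecast with a vibration limit can be illustrated with a simple trend extrapolation. This Python sketch uses a linear least-squares fit as a stand-in for the ML forecast described in the demo:

```python
def fit_trend(times, values):
    """Ordinary least-squares line fit: returns (slope, intercept)."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    slope = sum((t - mt) * (v - mv) for t, v in zip(times, values)) \
        / sum((t - mt) ** 2 for t in times)
    return slope, mv - slope * mt

def time_to_limit(times, values, limit):
    """Extrapolate the trend to the alarm limit; None if the level is not rising."""
    slope, intercept = fit_trend(times, values)
    if slope <= 0:
        return None
    t_cross = (limit - intercept) / slope
    return max(0.0, t_cross - times[-1])
```

For example, vibration samples rising one unit per hour and currently at 4, with a limit of 10, give about six hours until the limit is crossed.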

    In the physics-based model, what we are trying to do is this: from the FEA analysis, we know what the stress points on this particular motor are. From the same FEA analysis, for the observed vibration level, we can determine the stress levels experienced at each of the stress points. And from the fatigue analysis, we can understand how long the motor would be able to sustain these stress levels before it breaks down. That way, we are also getting at the physics-based remaining [AUDIO OUT] useful life of the asset.
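
A fatigue-based remaining-life estimate of the kind described often ends in a cumulative-damage calculation. A hedged Python sketch using a Basquin S-N curve and Miner's rule -- the material coefficients (`sigma_f`, `b`) are entirely illustrative, not taken from the actual FEA work:

```python
def cycles_to_failure(stress_amplitude: float, sigma_f: float = 900.0, b: float = -0.1) -> float:
    """Basquin S-N relation: N = (sigma_a / sigma_f) ** (1 / b). Coefficients illustrative."""
    return (stress_amplitude / sigma_f) ** (1.0 / b)

def miner_damage(stress_blocks) -> float:
    """Miner's rule: damage = sum(n_i / N_i) over (stress, cycles) blocks; failure near 1.0."""
    return sum(n / cycles_to_failure(s) for s, n in stress_blocks)
```

A damage value of 0.5 would mean roughly half the fatigue life has been consumed at those stress levels.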

    We also have details on whether there have been any overload conditions in the past 24 hours, and data on the utilization of the subsystem for the past 30 days -- and for the equipment as a whole as well, right? Then, if there are any scenarios which require intervention, we have HoloLens-based repair and maintenance assistance, which is completely hands-free: step-by-step instructions are provided to the operator, user, or maintenance person.

    The SOP for this can be launched directly from the digital twin itself, right? So if a maintenance person or operator selects a particular maintenance activity based on the recommendation from the digital twin, it launches this repair and maintenance assistance on the HoloLens in a mixed reality environment, and the HoloLens helps him or her go through the maintenance procedure. We get to see that out here: these guide lines show you what your instructions are on the real asset, right?

    And the last feature we have on this is to understand the performance, behavior, or current status of the machine without needing to be present in front of it. One can virtually bring up the machine onto any surface. Say I'm in a remote place, in my conference room, and I want to bring up this particular machine virtually--

    [MUSIC PLAYING]

    --and I can walk around this machine as if it were physically present in the conference room. As and when I pass by the major subsystems on this virtual machine, it presents me with the KPIs for that particular subsystem.

    This is what we have built, and this is a technology demonstrator. There's a lot of interesting work that we are doing in this space as we speak. As I said, the digital twin is still in its infancy; the industry is catching up, but there's a lot of ground to cover, and I'm sure initiatives like this and the infrastructure that Professor Jha has put together will help cover the miles we have to go. With that, I conclude my session, and we are open for questions and answers. Thank you.

    Thank you very much Mr. Shireesh for this exciting session with the captivating video demo on the digital twin as a replica of the physical world, taking the real-time data from the ecosystem to give real-time insights into the performance alongside recommendations on optimal decisions. Thank you to all our esteemed speakers for the time and effort to share your thoughts and experiences.

    Coming back, we now move to the technology demonstration section to see for ourselves the steps taken for the transformation from a traditional to a smart factory. I now hand over to Professor Jha and our technical demo team to take us through this session. And it is over to you, Professor Jha.

    Thanks. I think we have come to the last part of this event. It has been a really exciting one, and we look forward to your interest and participation. The idea was that throughout the presentations and discussion, you might have seen the impact of the different technologies that have been explained -- and, I think, the wonderful digital implementation that Shireesh has explained, and what Philipp has mentioned about the Factory of the Future.

    All of that basically starts with the basic building blocks. So we have brought out these three technology demonstrations, which are relevant and related to whatever has been discussed till now. The first project is about the building of a smart lathe machine -- whatever I was explaining in the Q&A, we'll explain how that has been done with this simple lathe machine, which is, again, a 20- to 30-year-old machine.

    Then we'll also be discussing the digital twin modeling of a machine. The idea is that a digital twin is certainly a very complex system to model, and there are a lot of steps involved before you really reap the advantages of the digital twin. So we'll be giving you a quick glimpse of the individual building blocks -- the steps, basically -- required to start building a digital twin of a machine.

    So this demonstration is about how you can start building a digital twin and what kind of tools are being used; the complete flow is explained through that. And lastly, we'll be discussing, or demonstrating, virtual commissioning with PLC integration. These are the three demonstrations we'll take up in this session. Let me start with the first one: how a legacy machine can be converted into a smart lathe machine. I will quickly explain the basic architecture over here, then I will take you live to the lab, and my colleague Vishnu will give you the live demonstration of the system.

    So in this lathe machine, the basic idea is to capture the critical parameters. What are the critical parameters? Certainly, the machining parameters are critical for the product that is coming out of the machine, and there are certain parameters which are critical for monitoring the machine's health. From that perspective, we have integrated various sensors into the lathe machine. As you can see in this picture, we have integrated a measurement for the spindle RPM, a displacement sensor for the depth-of-cut measurement, a displacement sensor for feed measurement, a vibration sensor, and a temperature sensor. All of these go into this machine.

    Then, on top of that, we have a controller with an HMI interface available on the machine. This is a low-end controller that has been added to this machine so that all the analog inputs and digital inputs come into the controller, which then transmits them to the IoT system. Currently, this lathe machine is connected over Ethernet so that the data can be captured at the receiving end, on the IoT server side.

    We also integrated an energy meter so that we can monitor the various energy-related parameters of the machine. And we have a provision to connect an RFID reader for authentication. This machine is now in our own central workshop, but ideally we want authentication to be given to an operator so that he can operate the machine during a specified time. So that authentication is also integrated into this.

    When it comes to the network -- how you establish connectivity with this smart lathe -- once you have the sensors and controller in place on the lathe machine, you bring the data onto the network. We have three different layers of communication established here. First of all, certainly, the control level: the PLC monitors all the parameters and pushes the data to the IoT systems, where you can either run a manufacturing execution system at the plant level or exchange information through an ERP system at the management level. So you will be running various services on different servers in the plant.

    In the current system, we have brought all the connectivity from the sensors to the PLC. From there, through Modbus TCP, we take the data to a KepServer, which is a data aggregator -- a software application that captures the data over industrial communication protocols. So from the PLC to the KepServer, we take the data over Modbus TCP, and then we share it on a more open communication protocol, which is OPC UA, basically.

    So we have connected this data source to the MATLAB web app through OPC UA, and this data is then used for processing as well as visualization of the information. Once we bring that data into the MATLAB web app, we run the applications for live monitoring. At the same time, we can also build predictive maintenance into the same applications, to even estimate the remaining useful life of a bearing of the machine itself.

    So these are some of the apps that have been built on the data captured from the machine. You can have a login, and then you have a dashboard for energy monitoring. You also have a maintenance dashboard, the monitoring of the machine status, and the operator's or supervisor's dashboard. With this, I'll now request my colleague Vishnu Kumar, who is in the workshop, to connect, and then we'll follow up with the next demonstrations. I will stop sharing here and request Vishnu to please get connected to this stream. Over to you, Vishnu.

    Am I audible? Yeah, you are audible, and you can share your video and then you can explain.

    Yeah. So hi, everyone. My name is Vishnu Kumar. I'm an automation engineer at FSM. Today, I would like to demonstrate the smart lathe machine. If you can see here, this is the HMI PLC that was mentioned -- the PLC and HMI are integrated into a single module. And below that, we have this energy meter, which monitors the voltage, current, and other parameters related to the power.

    Then we come to this displacement sensor. It is an analog sensor, and it is used to calculate the depth of cut using a target base that is kept below it. Apart from this, we also have one more displacement sensor, with which we trace out the feed rate. This is the particular target base that you see: it travels along this sensor so that we get the coordinates, and we can then put in the programming part and calculate the feed.

    Apart from this, we also have this proximity sensor, which is used to calculate the speed of the chuck. Basically, we have a target here; whenever it comes in contact with the sensor, the sensor triggers. From that, the number of rotations is counted and the RPM is generated. We also have a temperature sensor to monitor the spindle temperature. And for operator safety, we have this Allen key holder: the Allen key, which is used for mounting the product, is kept inside it, and whenever the Allen key is removed, the machine goes into stop mode.
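
The RPM computation from the proximity sensor reduces to counting pulses over a time window. A minimal Python sketch of that arithmetic (the actual logic runs on the PLC; `targets_per_rev` is an assumption for a single target on the chuck):

```python
def rpm_from_pulses(pulse_count: int, window_seconds: float, targets_per_rev: int = 1) -> float:
    """Spindle speed from proximity-sensor pulses counted over a time window."""
    revolutions = pulse_count / targets_per_rev
    return revolutions * 60.0 / window_seconds
```

For example, 10 pulses in 2 seconds with one target per revolution gives 300 RPM.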

    So while a person is operating -- say, for example, I am at the machine and working on it -- if someone tries to pull the Allen key from its place and tighten it while the machine is on, the machine automatically switches off. This is the interlock that we have created. The basic purpose is machine performance and product quality, and we are also giving some guidelines on how to work so that we can trace the root cause of any problem.

    So now, I will switch my video to my PC so that I can show you the dashboard that we have created, which is the second part of this demonstration. I guess my screen is visible? Yes, Vishnu, we can see it, yeah.

    OK, so this is the supervisor or operator dashboard that we have created. Currently, what you're seeing is the speed, depth of cut, and feed rate -- these are the cutting parameters. Then the Allen key holder: whether the key is present or not is being detected. The temperature value of the spindle is also being generated, and all of these are getting plotted. Right now, I have turned off the motor; I'll turn it on so that you will be able to see the speed graph coming in as well.

    Apart from this, we also have the OEE, availability, performance, and quality of this machine. Whatever product we are finishing or producing is being monitored; depending upon the performance of the machine, these things are calculated and displayed here. We can also see a bar graph displaying the quality and machine availability for the last two days and the present day.
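
The OEE figure on such a dashboard is conventionally the product of availability, performance, and quality. A Python sketch of that standard calculation (the dashboard itself is a MATLAB web app; this only shows the arithmetic, with illustrative inputs):

```python
def availability(run_time: float, planned_time: float) -> float:
    """Fraction of planned production time the machine actually ran."""
    return run_time / planned_time

def performance(actual_output: float, ideal_output: float) -> float:
    """Actual output relative to the ideal output at rated speed."""
    return actual_output / ideal_output

def quality(good_parts: int, total_parts: int) -> float:
    """Fraction of produced parts that are good."""
    return good_parts / total_parts

def oee(a: float, p: float, q: float) -> float:
    """Overall Equipment Effectiveness: the product of the three factors, each in [0, 1]."""
    return a * p * q
```

For example, 7 of 8 planned hours run, 90 of 100 ideal parts produced, and 95 of 100 parts good gives an OEE of about 0.75.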

    Moving on to the next screen: this is the energy monitoring screen. Right now, the motor is running, so we are getting different plots for the power factor and the apparent power in kilovolt-amperes, and then for each phase current, each phase voltage, the line-to-line voltage, and the energy consumed. These are the raw data coming from the machine, used for the calculations and displayed in the form of plots.

    Moving on to the third dashboard, the maintenance dashboard. Whenever the maintenance engineer comes and does some planned maintenance, that is logged, so that we can see when the last maintenance was done and what the issue was for which we logged the downtime. Here you can see the average voltage and current. This comes through the internet, so the maintenance person, even if he is in a remote area, can see these details.

    Then the maintenance log: whenever he comes for a scheduled maintenance, we log that data and it appears here. Whenever he does some maintenance, that data is recorded, and the time since the last maintenance is shown in the form of a dial gauge.

    Apart from this, we are also recording the downtime -- how much time the machine is down. That is calculated here. Whenever a downtime log is created, the number of minutes lost in that particular downtime is recorded, and that is shown here in the form of a dial gauge.

    Apart from this, for whatever date we enter here for a particular downtime, the time from that date until the current date -- how much time we have used the machine, or how much time has passed since the failure happened -- is calculated. These are the historic downtimes for each month, from August and September; now we are in October, so currently we have only one downtime, which is shown here as two hours. So that's all from my side for the demonstration.

    Thank you, Vishnu. Thanks for this excellent demonstration that you have given. The idea is that once you have enough physical sensors in place and connected, we get the data, and the various services are built on it. We have used this MATLAB web server, basically, to host all these applications, and they can be accessed by different role holders, as demonstrated here.

    So we'll now move to the second demonstration, which is on the creation of a digital twin. I will request Rahul and Omkar to give the demonstrations. First, they will explain how the basic building blocks have been built, and then Omkar will demonstrate all the steps and show it on the live machine as well. Over to you, Rahul.

    Yeah, thank you Professor Jha. Let me share my screen and please let me know when you can see it. Hope you see it.

    Yes, Rahul. We can see that. Yeah. Yeah, it's visible.

    OK, so welcome everyone. Let me talk about how we started this digital twin approach and what methods we used here. A digital twin is essentially a digital or virtual replica of a physical asset -- that's what we have seen in the earlier presentations. The definition of a digital twin can differ for different personas, whether it is for a process or for a piece of machinery.

    Similarly, when it comes to creating digital twins, there can be multiple approaches. Based on what kind of information you have about the system, you can take a different approach to digital modeling. It could be physics-based modeling, where you derive mathematical equations based on the flow of energy and mass, conservation laws, and so on, and create those models using a simulation tool.

    It could also be a data-driven modeling technique, where you don't have much insight into what is going on in your system, but you have a bunch of sensors connected to your setup and you are collecting or logging data from those sensors. You can leverage those data to estimate some of the states using a Kalman filter, or develop a system identification-based model to predict some of the states.

    And the third one is using AI-based models: if you want to use your gathered data to predict the remaining useful life of your asset or to classify some of the problems, then you can definitely go ahead with an AI-based modeling technique. In this particular case, since the problem statement was clearly known to us -- we knew what kind of actuators (electromagnetic actuators) and setup were being used in the system -- we took the physics-based approach.

    When it comes to creating physics-based or physical models, in The MathWorks product family we have a lot of products, because such models are essentially the backbone of any control system design or model-based design philosophy. We can develop these models using basic Simulink blocks, or we can take advantage of Simscape, which is our physical modeling tool. In this particular demonstration, we have used Simscape, because with Simscape we can develop models based on the physical architecture of a system.

    And if the model being developed reflects the system architecture, it is easier to understand, modify, and reuse, right? So what did we do? We developed the actuator part, which is an AC servo motor driving the entire mechanical assembly. This is how the actual model looks. Once the model is ready, we need to parameterize it based on the data sheet specifications provided by the manufacturer, right?

    So I can easily parameterize my motor based on the parameters available in the data sheet. I can also model the effects of the hardware implementation in the form of the thermal characteristics of the motor, the speed controller, or the inverter and rectifier logic here, right? Now, parameterizing from the data sheet sounds pretty simple when I say it, but there are certain parameters which may not be directly available in the data sheet -- or even if they are available, they sometimes come with some tolerance.

    So, OK, the motor inductance is some value x, but it has plus/minus 10% tolerance, so we don't know exactly what parameter to enter in order to make this simulation model an exact digital twin of our physical asset, right? In that case, we can take advantage of built-in optimization algorithms. Here, you can see that when we parameterized the motor based on the data sheet specifications and some unknown parameters, the model output was the one shown in blue. When you compare it with the experimental data, they do not match, right? And hence it is not a true digital twin of our actual setup, because the simulation produces a different output than the actual asset for the same input.

    So what did we do here? We used an optimization algorithm to minimize the difference between the simulation output and the experimental output by changing some of the parameters -- some motor parameters and also controller parameters. When we did this tuning, we were able to minimize the difference, and you can see here that it updated the parameters to match the simulation output with the experimental output.
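
The tuning loop described here -- minimize the simulation-vs-experiment error over a few parameters -- can be sketched generically. A toy Python version that fits the time constant of a first-order motor speed response; the model, the candidate grid, and the steady-state speed are all stand-ins for the actual Simscape model and MATLAB optimization workflow:

```python
import math

def simulate_speed(tau: float, times, w_ss: float = 100.0):
    """First-order speed response w(t) = w_ss * (1 - exp(-t/tau)); a stand-in model."""
    return [w_ss * (1.0 - math.exp(-t / tau)) for t in times]

def fit_tau(times, measured, candidates):
    """Pick the time constant that minimises the squared error vs the experiment."""
    def sse(tau):
        sim = simulate_speed(tau, times)
        return sum((s - m) ** 2 for s, m in zip(sim, measured))
    return min(candidates, key=sse)
```

In practice a gradient-based or global optimizer replaces the candidate grid, but the objective -- squared error between simulated and measured responses -- is the same.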

    Now, in the actual setup, we don't have only three, four, or five parameters; there can be many. That was the situation for us as well: we had close to 10-12 parameters, and we didn't know which parameters to tune. If we bombard our optimization algorithm with all the parameters, the optimization process is going to take more time, and there is no guarantee that it will converge.

    So we also took advantage of sensitivity analysis, where we identified which parameters are critical when it comes to modeling the asset -- which parameters can influence the output speed of the motor. That we identified using sensitivity analysis, which essentially uses Monte Carlo simulation: it varies the different parameters, and at the end of the simulations, it tells you which parameters are critical, so that out of those 12, I can pick the top three or four parameters for my tuning process.
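
The Monte Carlo sensitivity step can be illustrated with a one-at-a-time perturbation study. A Python sketch with a toy speed model; the parameter names, spreads, and model are invented for illustration (the real workflow uses MATLAB's sensitivity analysis tooling):

```python
import random

def motor_speed(params: dict) -> float:
    """Toy steady-state speed model: strongly sensitive to 'kt', barely to 'friction'."""
    return 100.0 * params["kt"] - 0.01 * params["friction"]

def rank_sensitivity(nominal: dict, spread: float = 0.1, runs: int = 2000, seed: int = 0):
    """Vary each parameter +/-spread one at a time; rank by induced output variation."""
    rng = random.Random(seed)
    scores = {}
    for name in nominal:
        outputs = []
        for _ in range(runs):
            p = dict(nominal)
            p[name] *= 1 + rng.uniform(-spread, spread)
            outputs.append(motor_speed(p))
        scores[name] = max(outputs) - min(outputs)
    return sorted(scores, key=scores.get, reverse=True)
```

The top-ranked names are the ones worth handing to the optimizer; the rest can stay at their data-sheet values.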

    The remaining part was integrating this electromechanical actuator with the actual mechanical system. Since the team had already developed the CAD model of the system, we used a workflow where we imported the CAD model directly into Simulink and created an overall, complete package using a Simscape model and a Simscape Multibody model. How we used all these different pieces to come up with the final product is what Omkar is going to talk about. Omkar?

    Thank you, Rahul sir. The basics of the digital twin have already been communicated by the experts and Prof. Jha. I will be talking more about the demo part. Can you please share the screen now, please?

    Yeah, sure. Avik, can you please pass on the presenter right to me? Thank you.

    OK, thank you. So I will be talking about developing the digital twin of a machine, built as a demonstration. This is the brief overall process of the digital twin model that we have built. We have the physical station; the CAD model of the physical station is converted from the SolidWorks .stp file to an XML format so that it can be imported into MATLAB Simscape Multibody. Then, with the help of OPC, we get the real-time data from the physical station into our Simulink model. We then tune the parameters of the physical model, calibrate the PID controller, and accordingly get the output data of the position that we are trying to model versus time.

    Can you go on to the next slide? Yeah, so this is our physical station that we are modeling. Here we are modeling one part -- you can see the x, y, z axes here. Right now we are modeling the y-axis, the horizontal one, for the digital twin process.

    The next slide. So this is the physical model that we have prepared. In the physical model, you can see on the top left the OPC blocks, which we are using to connect the real-time parameters from the physical station to our Simulink model. Then we have the AC servo motor and drive, where the rotary motion of the motor is converted into linear motion with the help of a lead screw.

    Then that linear motion is sensed with the position sensor, and the sensed output we are getting is plotted against the real-time output for comparison purposes. Next slide.

    So this is our OPC UA block. We have developed a MATLAB script that connects to the machine via the server, takes the relevant information from the machine into the model, and feeds it through a MATLAB function for plotting the data and driving the simulation model. Next slide.

    And this is our AC servo motor and drive. We have a controller where we get the target position that we want, and the drive is then used to control the speed of the motor, so that, based on the required linear position, the rotary position of the motor can be calculated; and we have a speed sensor for feedback control. Here we have two controllers: one is the position controller and the other is the speed controller, based on the parameters that we defined, and a PI controller -- that is, a proportional-integral controller -- is used for controlling the position and speed.
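
The position/speed cascade with PI loops described here can be sketched generically. An illustrative Python sketch of one step of a discrete cascade (the actual controllers live inside the Simulink model; the class, gains, and sample time here are all made up):

```python
class PI:
    """Discrete proportional-integral controller; gains and sample time are illustrative."""
    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error: float) -> float:
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def cascade_step(pos_ref: float, pos: float, speed: float,
                 pos_ctrl: PI, speed_ctrl: PI) -> float:
    """One step of the cascade: the position loop commands a speed,
    and the speed loop turns the speed error into a drive command."""
    speed_ref = pos_ctrl.update(pos_ref - pos)
    return speed_ctrl.update(speed_ref - speed)
```

The outer (position) loop is typically tuned much slower than the inner (speed) loop so the cascade stays stable.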

    OK, then we have the drive and motor. In the motor, we have the real-time parameters of our machine -- the voltage, torque, and top-speed characteristics. And then, with the help of the drive and controller, we have tuned the motor so that it mimics the real motor.

    I guess you missed one slide. Yeah, this is the linear position sensor, where we sense the linear position and prepare the simulated linear position output for comparison purposes. Next slide. OK, I'll need the control.

    So you can see here, we have these screens that are visible here where we have--

    Omkar, we are not able to see your screen.

    Just a second again. Let me try to share it again.

    Yes.

    OK, is it visible now, sir?

    Yeah, now it is coming up.

    Thank you. Thank you.

    Yeah, it is visible. Yeah.

    So here, you can see we have three screens. One is the CAD model, another is the graph where the actual position coming from OPC and the position from the Simulink model are being plotted, and then there is the real-time motion. So here you can see the machine is moving the slide in real time. We can record this motion, visualize it in Simscape Multibody, and at the same time see it in graph form, where the data coming from the real-time OPC and the output coming from our Simulink model are plotted together.

    So you can see that when the motion is in the positive, forward direction, the graph goes upward, and as it moves in the reverse direction, the graph moves the opposite way. So this is the position in mm versus time graph that we have. And you can see the real-time machine HMI we have.

    Thanks, Omkar, I think that's a nice demonstration. So the idea is basically that in the two views that Omkar was explaining, you can see that on one side we have the physical machine, where the actual machine is being commanded, and from there we are getting real-time data. On the other side, you see the simulation part, where we are using our own controller, and whatever the model is predicting is being plotted. So the model output is plotted along with the real-time data.

    Now, as Rahul already explained, this motor model is tuned, so you will see that the two curves come out almost overlapped. Once we are able to tune the model, later on I can use only the Simscape model to predict the next position. So the real-time data is coming in, and the model is utilizing that information to compare and then update the model parameters. Thanks to Omkar and Rahul for this nice demonstration.

    Now we'll go to the third demonstration, which is again a very important one, regarding industrial controllers: basically, how you can do virtual commissioning on an industrial controller using MATLAB. So over to you, Ramanuja.

    Yup. Thank you, Professor Jha. So let me share my screen first. OK, is my screen visible? Can someone confirm?

    Yes, we can see that.

    OK, thank you. So my presentation is going to be on the virtual commissioning side. Rahul just spoke about how MATLAB can be used for developing physical models and then developing controllers, so now I'll be talking about how to commission those controllers on the actual setup. First, I want to talk about the traditional approach. Traditionally, how it works is that-- so here you see some industrial robot, which is interfaced to a PLC.

    Typically a PLC programmer writes the code and downloads it to the PLC itself. On the day of commissioning, they go and integrate with the hardware and check if everything works fine. Now, this is the traditional approach, and it has some downsides. One is that in any commissioning activity, there are delays incurred during development and construction of the machinery.

    And commissioning comes in the last stage, so typically less time is left for it, and you're not able to absorb the delays created upstream because the software still needs testing. So these are some real challenges a commissioning engineer might face. The concept more industries are taking up is to go ahead with virtual commissioning. How virtual commissioning works is that you make a simulation model of your plant, and then you test your control algorithm against that simulation model instead of disturbing the actual plant.

    Now, how does it help? Basically, it helps you test your algorithm earlier, so any software errors you will be able to identify much before the actual commissioning, and it also reduces the commissioning time; you might just need to integrate and do some tuning at the end. So this approach is good: it reduces your commissioning time and ensures that you commission quality software. But the downside, again, is that you have to build a plant model just to test your control logic, and this model would be a fairly detailed model.
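    The core idea, exercising the control algorithm against a plant model instead of the real plant, can be sketched as a simple loop. Everything here is a toy stand-in: the first-order plant, the proportional controller, and the requirement check are all assumed for illustration.

    ```python
    # Sketch of virtual commissioning: run the controller under test against
    # a simulated plant and check requirements before touching hardware.
    # Plant, controller, and tolerance values are illustrative assumptions.

    def plant_step(state, command, dt=0.01):
        """Toy first-order plant: the output chases the drive command."""
        return state + (command - state) * dt * 5.0

    def controller(setpoint, measurement):
        """Toy proportional-only controller under test."""
        return 2.0 * (setpoint - measurement)

    # Lock-step simulation loop standing in for the co-simulation.
    state, setpoint = 0.0, 1.0
    for _ in range(5000):
        state = plant_step(state, controller(setpoint, state))

    # A requirement check that would otherwise wait until commissioning day:
    # this proportional-only controller settles with a steady-state offset.
    steady_error = abs(setpoint - state)
    ```

    In this toy run the simulation exposes a steady-state offset of about one third of the set point, exactly the kind of software defect virtual commissioning lets you catch and fix long before the actual commissioning date.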

    So now what can we do? Can we get more value out of this model we create? That is where another concept comes in: the model-based design methodology that industry follows. In this approach, we also make a simulation model of the control algorithm. So there is a model of the plant and a simulation model of the algorithm, and when you have a simulation model of the entire system, you could, on top of that, optimize the control algorithm, try different algorithms, and see which works well.

    But you could also do a few more things. You could optimize the mechatronic design: you could see what the proper sizing of the motors would be, or the proper link lengths of the robot. There are many analyses we could do, and overall you could develop a more robust, better-performing product. That's one advantage. Model-based design also advocates that once you are confident with the algorithm, you can generate code out of it. For PLCs we have structured text and ladder logic; we can generate the equivalent code out of the algorithm, and that reduces development time in all aspects.

    Once the code is generated, you can again go through virtual commissioning, verify it, and make sure the algorithm works fine. So this progression from traditional commissioning to virtual commissioning to model-based design is what my presentation will be about. The demo model I have here is a simplified pick-and-place robot. It's very simple, but it's good for showcasing the workflow, right?

    So we have the simulation model of the robot, we have the model of the controls, and then we have an interface block for OPC UA communication. What we do is desktop simulation to verify the model works fine; then, once we're confident in the control logic, we generate the PLC code and deploy it to the Siemens hardware. In this demonstration, I will use a PLC emulator, PLCSIM, and deploy the code into that, right?

    And once that is done, I can do a co-simulation. The PLC has an OPC server built into it, so I can perform co-simulation between that OPC server on the PLC and the Simulink model. That's essentially what I'll be showing in this demonstration. So first, I spoke about building models of the plant. Here is this pick-and-place robot in action, which you just saw. This model is a complete dynamic simulation model: you can input data into it, you can put multi-domain characteristics into it. In the previous demonstration, Rahul spoke about how we can do those things.

    But the point is that when you do all those things, when you have a simulation model, you get the benefit of iterating in desktop simulation, so you can catch errors much earlier. In the simulation world, you capture everything much earlier and refine your model, and when it comes to the actual mechanical prototype, you can eliminate a lot of mistakes. That's one benefit.

    The other benefit is that simulation models also let you size actuators. For example, this robot has several motors driving it. What rating should the servo motors be? That question can be answered by simulation models. And finally, once the plant model is ready, you can use it to actually test your software. So this is how the model looks. We have three different motors which drive the joints. If we go into the subsystems, this is how the motor model looks: it's a simplified DC motor model. At the top, you can see that there's an electrical portion to it, and then you have the mechanical portion after that.
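    A simplified DC motor model of this shape, an electrical armature circuit feeding a mechanical inertia, can be sketched as a pair of coupled first-order equations. All parameter values here are illustrative assumptions, not the values used in the demonstration.

    ```python
    # Sketch of a simplified DC motor model like the one described: an
    # electrical portion (armature circuit) feeding a mechanical portion
    # (rotor inertia plus friction). Parameter values are illustrative.
    R, L_ind = 1.0, 0.001        # armature resistance [ohm], inductance [H]
    Kt, Ke = 0.1, 0.1            # torque and back-EMF constants
    J, b = 1e-4, 1e-5            # rotor inertia [kg m^2], viscous friction

    def motor_step(i, omega, V, dt=1e-5):
        """One Euler step: electrical dynamics, then mechanical dynamics."""
        di = (V - R * i - Ke * omega) / L_ind      # armature current
        domega = (Kt * i - b * omega) / J          # rotor speed
        return i + di * dt, omega + domega * dt

    i, omega = 0.0, 0.0
    for _ in range(100000):      # 1 s at 10 us steps, constant 12 V applied
        i, omega = motor_step(i, omega, 12.0)
    # At no load, the speed settles near V / (Ke + R*b/Kt) once transients decay.
    ```

    In the Simulink model this split shows up visually: the electrical portion on top drives the torque, and the mechanical portion converts that torque into joint motion.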

    Now, the mechanical portions are put in place with the 3D joints, and that's what the bottom model shows you. So this is the 3D model of the robot. You can see that we have a lot of body elements, which characterize the inertia of the objects. We have rigid body transforms, and then there are joints, which give you some degree of motion between the body parts. These joints are actuated by the DC motors: the output of each DC motor is connected to a joint, so you actually see the mechatronic simulation in action.
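    One of the benefits mentioned earlier, using the simulation to size the servo motors, can be sketched from a motion profile like the one the joints follow. The inertia, friction, and trapezoidal profile below are made-up placeholder values.

    ```python
    # Sketch of actuator sizing from a simulated motion profile:
    # required torque = inertia * angular acceleration + friction torque.
    # Inertia, friction, and the trapezoidal profile are assumed values.
    J = 0.02      # joint inertia reflected to the motor shaft [kg m^2]
    b = 0.005     # viscous friction coefficient [N m s/rad]

    def accel_profile(t):
        """Trapezoidal move: accelerate 0.2 s, cruise, decelerate 0.2 s."""
        if t < 0.2:
            return 50.0    # rad/s^2
        if t < 0.8:
            return 0.0
        if t < 1.0:
            return -50.0
        return 0.0

    dt, omega, peak_torque = 0.001, 0.0, 0.0
    for k in range(1000):
        a = accel_profile(k * dt)
        omega += a * dt
        torque = J * a + b * omega          # inertial plus friction torque
        peak_torque = max(peak_torque, abs(torque))

    # peak_torque bounds the required motor rating (plus a safety margin).
    ```

    Running the plant model through its worst-case trajectory and reading off the peak torque is, in essence, how the simulation answers "what rating should the servo motor be?"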

    So that is the physical model. Now, coming to the controls model: we have three different motors, and these motors need to be controlled by some sort of PID control to make sure each follows a set point. So, for servo control, we have three different PID controllers, and these controllers are tuned to make sure the robot acts per the desired specification. But what gives the commands to the PIDs? Those would be the set points, and they depend on the waypoints of the robot, how we want the robot gripper to traverse.

    Those waypoints give the set points to the PIDs, and that's where supervisory logic comes into the picture. This is a simple example, so you can see that we have different states of the robot. For example, the entry state here is to take the gripper to the target. You see some set points over here and some temporal conditions, like: after 2.5 seconds, go to this position; after another 2.5 seconds, move out of that position. So you take the gripper to the target position, you pick the object, you take the gripper back to the destination, and then you drop the object, and you repeat the process over and over again. This is a simple automation example, but this is how a typical robot controller might look, maybe at a bigger scale: different states of operation, operating continuously.
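    The supervisory logic just described can be sketched as a timed state machine. The four states and the 2.5-second dwell times mirror the description; the coordinate values are made-up placeholders.

    ```python
    # Sketch of the pick-and-place supervisor as a timed state machine.
    # States and 2.5 s dwell times follow the description; the gripper
    # coordinates are made-up placeholder values.
    STATES = ["move_to_target", "pick", "move_to_destination", "drop"]
    SETPOINTS = {
        "move_to_target":      (0.4, 0.0, 0.1),   # gripper x, y, z (assumed)
        "pick":                (0.4, 0.0, 0.0),
        "move_to_destination": (0.0, 0.4, 0.1),
        "drop":                (0.0, 0.4, 0.0),
    }
    DWELL = 2.5  # seconds per state, as in the temporal conditions

    def supervisor(t):
        """Return the (state, set point) active at time t; cycle repeats."""
        idx = int(t // DWELL) % len(STATES)
        return STATES[idx], SETPOINTS[STATES[idx]]
    ```

    The supervisor's output feeds the PID set points each cycle, which is exactly the layering in the demo: waypoints at the top, state logic in the middle, servo loops at the bottom.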

    Once you make this control logic and develop the PIDs, the next thing is to test whether the control logic works, and that's where we perform the dynamic simulation. Here, you can see the robot actually moving through the different operations, like picking the object, going to the place position, and dropping it. And as it moves, the state machine here highlights the active state, so I can see the correlation between the control logic and what the robot is doing. This helps me understand how the controller works, and if any errors occur, I can track them down much more easily.

    So this is how I verify that my control logic is working properly, and through this visualization, or through some plots, I can verify that the transient response is also good. Once I'm confident that the desktop simulation performs well and my algorithm is good, the next step is to deploy that algorithm to the PLC. That is where I convert the algorithm, the state machine I showed you, into a language the PLC understands: in this case, structured text.

    It's pretty simple. You just right-click on the block, select PLC Code, and generate code for the subsystem, and you get code like the one you see on the left. This is the structured text equivalent of the model I was showing you, and you can use it to actually deploy to the PLC. Once you generate the structured text, the next step is to integrate it with the IDE. In this case, I'm working with the Siemens TIA Portal, and over there I have imported the structured text. You can see a function block over here, right?

    So this function block basically encapsulates my model. You can see that it has interfaces which take the feedback for the motor positions, it has outputs which give the drive commands for the motors, and it has one more input for on/off. I have built the ladder logic with this block. Next, I would deploy this into a PLC. In this case, I don't have a physical PLC, so I will deploy it to the simulator, PLCSIM, which is also provided by Siemens. This PLC has an OPC server in it, and I can see the address of the PLC: you can see it's 172.168.0.1.

    Now my next step is to connect to this PLC's OPC server, and that's where the MATLAB APIs for connectivity help, as we also saw in the previous demonstration. What do I do? I need to replace the simulated controller model with an actual interface to the PLC, and that interface is over OPC. So I switch the model over here, and if I open this block, it contains the APIs. You can see that we are connecting to the server, and we are tapping into the tags for all the motor positions and the drive outputs.
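    The shape of this co-simulation exchange can be sketched as a lock-step loop. In the real setup the reads and writes would be OPC UA client calls against the PLC's server; here a plain dictionary stands in for the tag server, and the tag names are made up.

    ```python
    # Sketch of the co-simulation exchange between a plant model and a PLC.
    # A dict stands in for the PLC's OPC UA tag server; in the real setup
    # these reads/writes would be OPC UA client calls. Tag names are made up.
    tags = {"theta1": 0.0, "drive1": 0.0}   # stand-in for the OPC UA server

    def plc_scan(tags):
        """Stand-in for the PLC program: simple proportional drive command."""
        setpoint = 1.0
        tags["drive1"] = 0.5 * (setpoint - tags["theta1"])

    def plant_step(tags, dt=0.01):
        """Plant model integrates the drive command into a joint angle."""
        tags["theta1"] += tags["drive1"] * dt

    for _ in range(2000):       # lock-step co-simulation loop
        plc_scan(tags)          # PLC reads feedback, writes drive output
        plant_step(tags)        # plant reads drive output, updates feedback
    ```

    Each iteration mirrors one exchange in the demonstration: the PLC side reads the joint feedback and writes a drive command, and the Simulink plant side reads the command and advances the mechanics one step.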

    Once we tap into the tags, once I've defined what is input and what is output, the model is set. Now I have to simulate it and see what happens. So here is the model, and you can see the OPC UA block, which is getting highlighted right now. When the model simulates, we are co-simulating with the PLC. Let's go to the TIA Portal now and see. Here we have the program loaded, and you can see all the variables mentioned, with their values. Everything is reset to 0 right now because nothing is running. And you can see what these values correspond to: theta1, theta2, theta3 are the motor position feedbacks.

    At the bottom left, you can see PLCSIM. The PLC is now ready to be operated, and the only thing left is to turn on the robot along with the simulation. So let's turn on the robot. Initially, the robot is in the off position, and you see a lot of these values close to 0, on the order of 1e-11 or 1e-6, essentially zero. At around 5 seconds, the robot turns on, and when it turns on, you can see the robot functioning in the same manner as in the desktop simulation, right?

    But now the difference is that this time the robot is being controlled from the PLC emulator. So this makes sure that the PLC code we have actually works properly: we test the PLC code against the plant model, and if that combination works fine, the code should also work fine on the day of commissioning. That's the confidence I get. In this case, I have used a PLC simulator, but you could also connect to an actual PLC; PLCs do have OPC servers, so you can connect to that and do the simulation with the PLC and the mechatronic model over here.

    So that is my demonstration. What I have tried to show is two steps. Compared to traditional commissioning, one is that you can perform virtual commissioning: basically, you replace the actual plant with the plant model, take your PLC code, integrate it with the plant model, and see how it works. You get more confident that your code works fine, and on the day of commissioning you are more confident that things will go fine. So that's virtual commissioning.

    The other approach is model-based design, where you actually have a controls model, everything in simulation. Once you are confident with the controller in simulation, you generate the PLC code out of the controls model, and this code you can, again, subject to virtual commissioning to see if it works fine with the PLC in the loop. So these are the different steps you could follow. Following these steps can reduce the overall development time, and you can be more confident before you commission that the system is robust and works fine.

    So these are the benefits you get, and that's basically what I wanted to present today. As a conclusion, we saw three different demonstrations today. The first one was the smart lathe machine; if you want any clarification on that, the contact details are over here. Vishnu and Peeyush worked on that. The second demonstration was on the digital twin modeling of the motion; Omkar and Rahul worked on this, and you have their contact details. And the final demonstration, from me, was the virtual commissioning with PLC integration. Any questions you have on this, you can reach out to me afterwards.

    So you can take a snapshot and note down the email address and contact details of the person you want to connect with. And finally, I would like to close with this slide. Prashant opened with this slide, and I'll close with it here. What we saw today at the Summit had a precursor: we had webinars where we discussed how smart factories can be designed and operationalized.

    Those recordings are available. You can take a screenshot of this QR code and access those recordings, so please do that if you missed these webinars. I hope those webinars will give you insight into how to design and operationalize your smart factory. With that, today we had this Smart Factory Summit, and I hope you found these demonstrations useful. I'll pass on to Sunita to take it up further. Thank you all.

    Thank you, Professor Jha, Ramanuja, and team for taking us through this excellent demo session. And sorry we went a bit over time; that would be because of the exciting sessions, from the workshop to the demos. To our audience: we have also added the contacts of the team members whom you can reach out to. If you have any questions, please also reach out to the FSM team, or to MathWorks, if you need help in your journey to enable the smart factory, and we will try our best to do all we can to help. Thank you very much for your time and attention. We do hope you found the session useful. Have a great rest of the day, and goodbye.