Verification of Avionics Systems Using Simulink Test and Simulink Real-Time
Maciej Stefaniak, GE Aerospace
The current market situation forces companies to search for methods to accelerate the development of new avionics systems in accordance with guidelines and industry standards (including ARP4754A and DO-178C) while maintaining high quality and cost competitiveness. In this talk, hear about challenges related to the automation of the system verification process of avionics devices, solutions using Simulink Test™ and Simulink Real-Time™, and future development directions of the test environment.
Published: 7 May 2023
Good day, everyone. My name is Maciej Stefaniak. Today, I would like to talk about our approach to verification of avionics systems using Simulink Test and Simulink Real-Time. First, I will start with a short introduction of GE Aerospace. Then we will move to the objectives we wanted to satisfy when designing a test system. Later, I will show you how Simulink Test and Simulink Real-Time help us satisfy those objectives. And at the end, I will briefly talk about plans for the future.
My name is Maciej Stefaniak. I am an advanced lead engineer and have been working at GE Aerospace for more than five years.
At GE Aerospace, we invent the future of flight, lift people up, and bring them home safely. We see how aviation matters to the world, connecting families, economies, and ideas. We see how GE Aerospace technology enables flight: our engines, our avionics, and our systems power aircraft around the world. We have a strong legacy as a world leader in propulsion technology, builder of America's first jet engine and of the world's most-produced, most powerful, and largest jet engines.
These are the five key industry areas GE Aerospace is focused on: commercial engines and services, military engines and services, aviation systems, which I am representing today, turboprop engines, and additive manufacturing. We have around 45,000 employees in locations all around the world, including Poland, which I am proudly representing today.
Let's move now to the main part of the presentation and talk about the objectives and challenges we faced when we started development of a new test system. Then I will show you how MathWorks tools help us meet those objectives.
Just to give you a bit of an overview of the aviation industry: safety always comes first. This is a highly regulated industry. Every new aircraft goes through a certification process, and our responsibility is to provide evidence of compliance with industry standards, like ARP4754A for systems, ARP4761 for safety, DO-254 for hardware development, and DO-178C for software development. Every requirement needs to be verified. This approach is called requirements-based testing.
It all started around 2016 when, recognizing the limitations of the test system we had, we began looking for a new one. This slide shows the set of high-level objectives we wanted to achieve. Let's go through each of them to define how we understand them.
This is the main goal of automation: reducing the impact of human error, which occurs for multiple reasons, tiredness, for example. Also, the quality of verification can vary between individuals.
We are looking for commonality. The test system should be applicable to a whole group of systems rather than a single product. In the end, every test system needs to be qualified according to the DO-330 standard, so reuse of components like models, code, and documentation is essential for us.
We don't want to reinvent the wheel for every new project. We want each product to contribute to the development of the testing framework, adding new components that were not available before. With this approach, we expect to reduce the total cost of development, even though the cost of the first project may be higher compared to the traditional approach. I can tell you right now that after all these years, this is not just an estimate or a projection. It has been proven to work, as we have successfully deployed this framework to several projects so far.
We need real-time execution with accurate and precise sample time representation. Our test system needs to run at least 10 times faster than the unit under test, and we would like both the real-time model and the test to be executed on the real-time target.
We would like to perform verification in open and closed loop. We need support for a variety of interfaces, analog and digital, but also communication protocols like serial, Ethernet, and CAN, including their aviation variants. We are also looking for third-party test and measurement equipment integration. Model-in-the-loop and hardware-in-the-loop testing shall share a common test. This is a key requirement if we want to reduce the impact of test rig availability. With this approach, we can develop tests in simulation and move to hardware testing when they are mature enough, reducing the need for physical test rig time to a minimum.
Training and onboarding of the verification team is always a cost to the project. Verification teams should focus on the requirements and on the test, not the tool itself. We need an easy and intuitive verification tool. We would like to use predefined signals, organized in dropdown lists, which should reduce the probability of errors during test development. We also need an efficient way of analyzing results and debugging after test execution.
We need support for the certification process. The main goal of certification is to deliver evidence of compliance. A key term here is traceability, which the ARP standard defines as the recorded relationship established between two or more elements of the development process. In our case, these are usually requirements, models, and test cases.
The challenge we face is the complexity and scale of our requirements. We are talking about hundreds of requirements and hundreds of test cases that we need to develop and manage. We want to generate a coverage matrix that includes requirements, models, and test cases, but also test case results, to monitor development progress. As I mentioned before, every requirement needs to be verified. But before we do that formally, requirements need to reach the proper level of maturity through requirements validation.
Why is it so important to validate requirements? This is the V diagram describing the process we use to develop the majority of our products. The orange curve shows the percentage of faults introduced in each phase of the project, the green curve shows the percentage of faults being resolved, while the red curve shows the normalized cost of fault resolution across project execution. As we can easily see, most faults are introduced at an early stage but detected much later, usually during integration. Solving requirement-related faults during integration requires much more work and adds cost to the project.
That is why it is so crucial to find faults immediately after they are introduced, within the same phase of the project. Let's see how Simulink Test and Simulink Real-Time help us satisfy our objectives.
We use MATLAB and Simulink as the programming environment, Speedgoat as the hardware platform, and Simulink Test for test development and management. Key features of the Speedgoat target PC are real-time operation using Simulink Real-Time, a vast range of supported interfaces from analog and digital to communication protocols, and seamless integration with Simulink Test. We have also had very good cooperation and support from Speedgoat so far.
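To give a feel for how this fits together in practice, here is a minimal sketch, not our production code, of how a run on a Speedgoat target can be scripted with the Simulink Real-Time API; the target and model names are placeholders.

```matlab
% Minimal sketch: deploy and run a model on a Speedgoat target with the
% Simulink Real-Time API. 'TargetPC1' and 'uut_test_model' are placeholders.
tg = slrealtime('TargetPC1');   % handle to the target computer
connect(tg);                    % establish the link to the target

slbuild('uut_test_model');      % build the real-time application
load(tg, 'uut_test_model');     % download it to the target

start(tg);                      % begin real-time execution
pause(10);                      % let the test scenario play out
stop(tg);                       % halt execution on the target
```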
In the test system, we define two roles. The verification engineer is the main test system user, focused on the requirements and the unit under test but with limited understanding of MATLAB and Simulink; this person should use predefined templates and predefined signals to develop test cases. The test system engineer is the person responsible for tailoring the test system framework to product-specific needs; in contrast to the verification engineer, the test system engineer has to have a very good understanding of MATLAB, Simulink, the Speedgoat platform, and the interface signals.
This is the context of use of our test system. Based on the project interface control document, we create input and output signal definitions, which are later used by the verification engineer to author test sequences and test cases, executed either in simulation or in real-time mode. After execution, a test report, test results, and a test procedure are created. The procedure is effectively an export of the test sequence and test case into a Word document. All items on the right side are customer deliverables.
Let's go now through the Simulink Test component hierarchy to understand how each item contributes to the verification process. The test model is driven by the test sequence; together they make up the test harness. The test case defines how the test harness is executed. And on top of that, there is the Test Manager, which governs the execution of tests and generates results and reports after execution.
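To make the hierarchy concrete, here is a minimal sketch of how these pieces can be created programmatically with the Simulink Test API; all file, model, and harness names are made up for illustration.

```matlab
% Sketch of the component hierarchy, scripted via the Simulink Test API.
tf = sltest.testmanager.TestFile('MyProductTests');    % top-level test file
ts = createTestSuite(tf, 'Nominal Operation');         % groups related cases
tc = createTestCase(ts, 'simulation', 'REQ_001_case'); % one executable case

% Point the test case at the model and at the harness that pairs the
% test model with its driving test sequence.
setProperty(tc, 'Model', 'uut_test_model');
setProperty(tc, 'HarnessName', 'req_001_harness');

results = sltest.testmanager.run;  % Test Manager runs and collects results
```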
This is what the test model looks like: a really generic approach, with one input and one output. A Simulink bus, defined in Excel templates, allows for customization to project needs. The system under test could be a simulation model, a model executed on the Speedgoat target, or even multiple simulation models, as long as they use the same interface. This approach is possible because most of our projects use a standardized interface control document.
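As an illustration of that single-bus interface, the sketch below shows one way the bus could be built programmatically from rows of an Excel interface template; the file, column, and bus names are hypothetical.

```matlab
% Sketch: build the shared interface bus from an Excel signal template.
ifaceTable = readtable('icd_signals.xlsx');  % one row per interface signal

elems = Simulink.BusElement.empty;
for k = 1:height(ifaceTable)
    e = Simulink.BusElement;
    e.Name     = ifaceTable.SignalName{k};   % e.g. 'cmd_enable'
    e.DataType = ifaceTable.DataType{k};     % e.g. 'boolean', 'single'
    elems(k) = e;
end

UutInputBus = Simulink.Bus;                   % the single input bus
UutInputBus.Elements = elems;
assignin('base', 'UutInputBus', UutInputBus); % make it visible to the model
```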
The test sequence is the place where we develop our main testing algorithm. Effectively, it is a state machine: in a state, we define the values of the input signals; we transition to the next step after a specific amount of time or a specific event; and in the next step, using the output signals, we perform verification using verify statements.
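Schematically, a pair of steps in such a sequence looks like the fragment below, written in the Test Sequence action and transition syntax; the signal names, timing, and requirement ID are invented for the example.

```matlab
% Sketch of two Test Sequence steps. Each step sets inputs, the
% transition advances the state machine, and verify() records a verdict.

% Step: ApplyCommand
cmd_enable = 1;                 % drive the UUT input signal
% Transition: after(5, sec) --> CheckResponse

% Step: CheckResponse
verify(status_out == 1, ...
    'MyTests:REQ_001', ...
    'UUT did not report enabled status within 5 s');
```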
Not everything should be automated. For this purpose, with the help of the MathWorks team, we designed three types of pop-up windows that allow us to stop execution and take a specific action. We can perform actions like shorting a signal to ground and pressing OK to continue. We can perform a specific verification using an external oscilloscope and provide the result of the evaluation back to the test sequence. We can read a value from external equipment and feed it back to the test sequence for evaluation. The advantage of this solution is that the pop-up windows are embedded within the test sequence, and the verification engineer has full control over what will be displayed, and when, during test execution.
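One plausible way to implement such a window, sketched under the assumption that the sequence can call out to an ordinary MATLAB dialog during execution, is a helper like the following; the function name and prompt are hypothetical.

```matlab
% Sketch: pause the test and ask the operator for a pass/fail verdict,
% e.g. after a manual oscilloscope measurement.
function passed = operatorScopeCheck(promptText)
    choice = questdlg(promptText, 'Manual verification step', ...
                      'Pass', 'Fail', 'Pass');  % blocks until answered
    passed = strcmp(choice, 'Pass');            % result fed back to verify()
end
```

Inside the sequence, the operator's verdict can then be checked with something like verify(operatorScopeCheck('Check ripple < 50 mV'), 'MyTests:manual_check', 'Operator reported failure').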
The test harness, as I said before, combines the test sequence and the test model. The test sequence drives the input signals of the system under test, and the output of the system under test is fed back to the test sequence for evaluation using verify statements. Both input and output signals are logged.
The Test Manager has two tabs: Test Browser, and Results and Artifacts. The Test Browser is the main editor for test cases, and test cases define how the test harnesses will be executed.
So we define the execution mode, either simulation or real time. We can link to requirements. We can add a test case description. We can specify which test harness should be executed. We can change the test harness parameters with parameter overrides. We can change the standard way of execution by adding custom actions and callbacks. We can also define testing vectors using the iterations section. And in custom criteria, with MATLAB unit test framework capabilities and other MATLAB functions, we have the ability to do very powerful results post-processing.
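As one example of these capabilities, here is a sketch of scripted iterations that sweep a test vector over several values; tc is the test case from the earlier sketch, and the variable name is a placeholder.

```matlab
% Sketch: define testing vectors programmatically via iterations.
for setpoint = [0 25 50 100]
    iter = sltestiteration;                 % one iteration per test vector
    setVariable(iter, 'Name', 'cmd_setpoint', ...
                'Source', 'base workspace', 'Value', setpoint);
    addIteration(tc, iter);                 % attach it to the test case
end
```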
In Results and Artifacts, we have access to the verify statements and logged signals. We have multiple plots, markers, and zooming features that allow for very convenient debugging and failure root cause analysis.
Regarding requirements management, we have two approaches. One is to use Simulink Requirements, export the requirements to Simulink Test, and fully manage the requirements within Simulink Requirements. This makes linking between requirements, models, test cases, and test steps straightforward, and there is a built-in function to generate the coverage matrix across the different components.
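A sketch of how that linking and matrix generation can be scripted is shown below; the requirement file, the requirement ID, and the reuse of tc from the earlier sketch are all illustrative.

```matlab
% Sketch: link a requirement to a test case and produce the matrix.
reqSet = slreq.load('UutRequirements.slreqx');  % load the requirement set
req = find(reqSet, 'Type', 'Requirement', 'Id', 'REQ_001');

slreq.createLink(req, tc);          % requirement-to-test-case trace link

slreq.generateTraceabilityMatrix(); % built-in coverage/traceability matrix
```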
On the other hand, we can integrate the Test Manager with a quality management system (QMS), with traceability handled at the QMS level. We only need the Test Manager to be executed by the QMS, and all results are later fed back to the QMS to generate the coverage information. This solution requires more work and more knowledge if we want to perform a successful integration.
Let's get back to requirements validation and see how a working test system helps us make our work much more efficient. Initially, we had three independent, sequential stages of work: requirements refinement, requirements modeling, and requirements verification. Because each stage was formalized and relied on the output of the previous stage, errors found during modeling or during verification could only be corrected in the next cycle, causing unnecessary rework.
So we moved to one continuous development stage encompassing the previous three: requirements development, requirements modeling, and verification. Work was done iteratively until the team decided that all artifacts, like requirements, model, and tests, had achieved the required level of maturity. Despite looking complicated, this allows us to deliver much faster: we change requirements once, rather than multiple times, and get instant confirmation through modeling and testing that the requirements consistently meet our expectations.
To finish, quickly, our plans for the future: we would like to move to a full CI/CD (continuous integration, continuous development) workflow, also extending the scope of our toolboxes with Simulink Design Verifier. We would also like to extend the coverage matrix capabilities, so it can be generated with actual test results to monitor project progress live. Integration with a QMS is still planned; so far we have performed only initial feasibility studies.
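For the CI direction, a minimal sketch of an entry-point script, assuming it is invoked headlessly (for example with matlab -batch), could look like this; the file names and reporting options are placeholders.

```matlab
% Sketch: CI entry point that runs the test file, publishes a report,
% and fails the build if any test case failed.
sltest.testmanager.load('MyProductTests.mldatx');
rs  = sltest.testmanager.run;          % result set for the whole run
tfr = getTestFileResults(rs);          % per-test-file summaries

sltest.testmanager.report(rs, 'test_report.pdf', ...
    'IncludeTestResults', 0);          % 0 = include all results

assert(sum([tfr.NumFailed]) == 0, ...
    '%d test case(s) failed', sum([tfr.NumFailed]));
```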
We would also like to integrate with external third-party test and measurement equipment using Ethernet protocols like TCP/IP. We have already received some solutions from Speedgoat that look very promising and could definitely extend our testing capabilities. We are also in the process of migrating our test system to the latest version of MATLAB, where we would like to use the QNX real-time operating system on Speedgoat.
Over all these years, we have established very good cooperation with MathWorks, allowing us to report issues back and quickly get fixes, but also to discuss the improvements we need from the tools to make our work more efficient. This is something we definitely want to continue in the future.
This is all I wanted to share with you today. Thank you for your attention, and have a good day.