
Validating Design Requirements Using Model Coverage

By Uttara Kumar and Nishaat Vasi, MathWorks


It’s a big day for your team. The system you designed is ready to be deployed on the hardware for integration testing. However, there is a last-minute glitch: the system exhibits unexpected behavior in which the speed threshold criterion is not satisfied in the production code. How could this have happened? The team followed the standard verification process:

  1. Review the system and software design requirements.
  2. Formulate the algorithm in Simulink.
  3. Verify the design using functional test cases.
  4. Verify the generated code and confirm adequate code coverage.

So, why didn’t functional testing predict this unexpected behavior? And more importantly, how much “testing” would have been enough?

Testing functional correctness, that is, verifying only the input-output behavior of the implementation model, does not guarantee design correctness. Functional tests are essentially derived from requirements, and those requirements could be incomplete, incorrect, or over-specified. This makes it difficult for functional verification techniques to detect errors that stem from flaws in the requirements themselves.

Structural verification techniques such as model coverage identify unexecuted or unused simulation pathways in the model. By investigating these untested pathways, you can detect potential design errors and validate requirements. Model coverage measurement is useful for applications that must comply with DO-178C, ISO 26262, and IEC 61508 standards.

This article outlines a workflow for testing a cruise controller component using model coverage analysis. The model consists of a PI controller that computes the throttle output based on the difference between the actual speed and the target speed (Figure 1).

Figure 1. Controller design for a cruise control system.

Setting up the Test Harness

We generate a harness model for our controller using Simulink Verification and Validation™ (Figure 2); at R2017b, this functionality transitioned to Simulink Requirements™, Simulink Check™, and Simulink Coverage™.

Figure 2. Harness model components generated with Simulink Verification and Validation.

The Signal Builder block in the harness model contains test vectors that characterize the input scenarios under which we want to test our design. We could create these test vectors manually within the Signal Builder block, but we want to reuse the requirements-based test cases that were used to test the design for functional correctness. To do this, we simply specify the Excel file that contains these test cases and import them into the Signal Builder block (Figure 3).

Figure 3. Interface for importing existing test cases into the Signal Builder block. Each imported signal group represents a unique test case.
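
If you prefer to script this import step, a minimal sketch along the following lines can read the spreadsheet and append each scenario as a signal group in the Signal Builder block. The file name, column names, and block path here are illustrative assumptions, not part of the original example.

% Sketch: import requirements-based test cases from a spreadsheet into the
% Signal Builder block of the harness model (names below are assumed).
harnessBlock = 'cruise_harness/Inputs';        % path to the Signal Builder block
T = readtable('cruise_test_cases.xlsx');       % assumed columns: Time, TargetSpeed, ActualSpeed

time = T.Time;                                 % common time vector for this scenario
data = [T.TargetSpeed, T.ActualSpeed];         % one column per input signal

% Append the imported scenario as a new signal group (one group per test case)
signalbuilder(harnessBlock, 'append', time, data, ...
    {'TargetSpeed', 'ActualSpeed'}, 'Imported functional test');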

Analyzing Model Coverage

Do the imported test cases, which verified the functional correctness of the design, also sufficiently test its structure? And to what extent do these test cases exercise the logical pathways of the design?

Model coverage analysis will help us answer these questions. We simulate the design under test (DUT), which in our case is the cruise controller, with all the test cases in the Signal Builder block. We then analyze the model for various types of coverage metrics, including condition, decision, and modified condition/decision coverage.1
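
This step can also be scripted with the coverage command-line API. The following is a minimal sketch, assuming the harness model is named cruise_harness; to accumulate coverage across every signal group in the Signal Builder block, you can repeat the simulation per group and sum the resulting cvdata objects.

% Sketch: record decision, condition, and MC/DC coverage for one simulation
% of the harness (model name is assumed).
testObj = cvtest('cruise_harness');
testObj.settings.decision  = 1;   % decision coverage
testObj.settings.condition = 1;   % condition coverage
testObj.settings.mcdc      = 1;   % modified condition/decision coverage

covData = cvsim(testObj);         % simulate and collect model coverage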

At the end of the simulation, Simulink Verification and Validation generates an HTML report containing detailed information on the coverage metrics for the different model elements in the DUT. The Summary section of the report provides overall coverage metrics, while the Details section includes coverage metrics for each individual design element. Using the report, we can see which logical pathways in our design were not tested for all possible combinations during simulation (Figure 4). We can use the hyperlinks in the report to identify the relevant block in the model.

Figure 4. HTML report generated for model coverage analysis.
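
Assuming coverage data was collected as in the sketch above, the HTML report can also be produced from the command line:

% Sketch: generate the HTML coverage report from the collected coverage data
cvhtml('cruise_coverage_report', covData);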

As an alternative to the report, we can view the coverage results in the model itself. Each model element is color-coded based on the coverage it received. Green indicates elements that were fully covered or tested by the existing test cases, and red indicates elements that received incomplete structural coverage.
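
The same coverage data can be painted onto the model programmatically; a one-line sketch, reusing the covData object from the earlier sketch:

% Sketch: color the model elements according to the coverage they received
cvmodelview(covData);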

In our example, we clearly see that certain model elements did not receive full coverage (Figure 5). The functional tests that demonstrated functional correctness therefore did not fully test the design.

Figure 5. Model coverage results displayed on the model.

We have two options for creating test cases to account for the missing coverage in the design: create the test cases manually, or generate them automatically using Simulink Design Verifier™.

Extending Test Cases to Increase Model Coverage

With Simulink Design Verifier we can leverage existing test cases to generate tests for the missing coverage. We log the existing test cases from the harness and save them in a MAT file. We then use the Simulink Design Verifier options to extend the existing test cases (Figure 6).

Figure 6. Simulink Design Verifier interface showing options for test case generation.
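
A minimal scripted version of this step might look as follows; the model, harness, and file names are assumptions for illustration, and the property names follow the sldvoptions test-generation settings.

% Sketch: log the existing functional test cases and extend them with
% Simulink Design Verifier (model and file names are assumed).
loggedData = sldvlogsignals('cruise_harness');     % capture the existing test inputs
save('existing_tests.mat', 'loggedData');

opts = sldvoptions;
opts.Mode                    = 'TestGeneration';
opts.ModelCoverageObjectives = 'MCDC';             % target MC/DC coverage
opts.ExtendExistingTests     = 'on';
opts.ExistingTestFile        = 'existing_tests.mat';
opts.SaveHarnessModel        = 'on';               % emit a harness with the new tests

[status, files] = sldvrun('cruise_controller', opts);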

At the end of the test case generation process, Simulink Design Verifier generates a separate harness model with the newly generated test cases within the Signal Builder block. We combine all the test cases for our DUT into a single Signal Builder block (Figure 7), and then we simulate the DUT using these test cases.

Figure 7. Existing functional test cases and test cases generated by Simulink Design Verifier.
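
Merging the test cases can also be done from the command line; a sketch, with assumed harness model names:

% Sketch: merge the Simulink Design Verifier harness into the original harness
% so that all test cases live in one Signal Builder block (names are assumed).
sldvmergeharness('cruise_harness', {'cruise_controller_sldvharness'});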

Just by visually inspecting the coverage results on the DUT, we see that the automatically generated test cases have increased the extent of structural coverage for our design. However, one Switch block was not fully covered by any of the test cases (Figure 8).

Figure 8. Model coverage results for the cruise controller shown on the model indicate incomplete structural coverage (shown in red) for the Switch block.

Investigating Untested Design Elements

We start our investigation by examining the Simulink Design Verifier report generated at the end of the test generation process. The report shows which generated test case satisfies which coverage objective. We see that the objectives associated with the Switch block are proven unsatisfiable (Figure 9). The Switch block passes through either its first or its third input based on the value of the second input, which acts as the trigger for this block. The report indicates that no test case can make the Switch block in our design pass its first input. What is causing this?

Figure 9. Simulink Design Verifier report for test generation showing objectives proven unsatisfiable.
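
To confirm the pass-through condition on the block itself, we can query its dialog parameters; a sketch with an assumed block path:

% Sketch: inspect how the Switch block decides between its first and third inputs
criteria  = get_param('cruise_controller/Switch', 'Criteria');   % e.g., 'u2 >= Threshold'
threshold = get_param('cruise_controller/Switch', 'Threshold');  % e.g., '150'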

Using the requirements traceability feature in Simulink Verification and Validation, we trace the Switch block back to the related requirement to understand the motivation behind the Switch block in the design (Figure 10).

The Switch block’s criterion for passing its first input is that the speed at the second input exceed 150 m/s. However, given the dynamics of our system, and as a consequence of other design requirements, the speed in our design is limited to between 0 and 100 m/s. The speed can therefore never exceed the specified threshold, and the associated coverage objectives can never be satisfied.

Figure 10. Left: The Switch block in the Simulink model. Right: The requirement associated with the Switch block.
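
If you want to follow the same trace from the MATLAB command line, the Requirements Management Interface offers a navigation call; the block path and link index below are assumptions.

% Sketch: navigate from the Switch block to its first linked requirement
rmi('view', 'cruise_controller/Switch', 1);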

Next Steps

After reevaluating the system requirements with our team, we revised the speed threshold limit to 75 m/s instead of 150 m/s to better represent the system we are designing. Fixing these errors in our requirements, and making the corresponding changes to the controller model, will enable us to achieve our goal of 100% model coverage (Figure 11). We can now reuse the test vectors formulated during this structural verification process for equivalence and regression testing and for obtaining code coverage metrics.

Figure 11. 100% model coverage results displayed on the model.
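
A sketch of the corresponding model update, assuming the Switch block compares its second input against a Threshold parameter and using an illustrative block path:

% Sketch: apply the revised speed threshold from the corrected requirement
set_param('cruise_controller/Switch', 'Threshold', '75');   % was 150 m/s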

In summary, structural verification with model coverage analysis helped uncover a problem with the requirements for a design that had otherwise passed functional verification. In our example, we used model coverage and automatic test generation on a relatively simple controller design. For a larger design with more inputs and complex logic in the form of Stateflow charts, for example, the number of possible parallel simulation pathways and interactions is much higher. Testing for structural coverage is even more important there because as design complexity increases, the likelihood of manually created test cases exercising all these pathways decreases. Improvements to the design structure can only be made after an assessment of the current structure, and early verification techniques such as model coverage provide this much-needed assessment.

1. Hayhurst, K., Veerhusen, D., Chilenski, J., and Rierson, L. A Practical Tutorial on Modified Condition/Decision Coverage. NASA/TM-2001-210876.

Published 2014 - 92223v00
