Model Testing Metrics
The Model Testing Dashboard collects metric data from the model design and testing artifacts in a project, such as requirements, models, and test results. Use the metric data to assess the status and quality of your model testing. Each metric in the dashboard measures a different aspect of the quality of the testing of your model and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. Use the widgets in the Model Testing Dashboard to see high-level metric results and testing gaps, as described in Explore Status and Quality of Testing Activities Using Model Testing Dashboard.
Alternatively, you can use the API functions to collect metric results programmatically. When using the API, use the metric IDs to refer to each metric. See Collect Metrics on Model Testing Artifacts Programmatically for an example of how to collect these metrics programmatically. You can use the function getAvailableMetricIds to return a list of available metric identifiers.
The following sections describe the model testing metrics.
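As a sketch of the programmatic workflow (assuming a project containing the model testing artifacts is already open), you can create a metric engine, list the available metric IDs, and collect results:

```matlab
% Create a metric engine for the current project (requires the project
% that contains the model testing artifacts to be open).
metricEngine = metric.Engine();

% List the metric identifiers that the engine can collect.
ids = getAvailableMetricIds(metricEngine)

% Collect results for every available metric. This can take some time
% because it loads the models, test files, and test results.
execute(metricEngine, ids);
```

The collected results can then be retrieved with `getMetrics`, as shown in the linked example.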
Metrics for Requirements Linked to Tests
The metrics associated with the Requirements Linked to Tests section of the dashboard include:
Metric | Description |
---|---|
RequirementWithTestCase | Determine whether a requirement is linked to test cases. |
RequirementWithTestCasePercentage | Calculate the percentage of requirements that are linked to test cases. |
RequirementWithTestCaseDistribution | Distribution of the number of requirements linked to test cases compared to the number of requirements that are missing test cases. |
TestCasesPerRequirement | Count the number of test cases linked to each requirement. |
TestCasesPerRequirementDistribution | Distribution of the number of test cases linked to each requirement. |
For more information, see Requirements Linked to Tests.
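For example, a minimal sketch of collecting the requirement-traceability metrics above and reading the results (the `Artifacts` and `Value` property names follow the `metric.Result` API):

```matlab
metricEngine = metric.Engine();

% Collect only the requirements-linked-to-tests metrics.
execute(metricEngine, ["RequirementWithTestCase", ...
                       "TestCasesPerRequirement"]);

% Read back the per-requirement results.
results = getMetrics(metricEngine, "TestCasesPerRequirement");
for n = 1:numel(results)
    fprintf("%s: %d linked test case(s)\n", ...
        results(n).Artifacts(1).Name, results(n).Value);
end
```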
Metrics for Tests Linked to Requirements
The metrics associated with the Tests Linked to Requirements section of the dashboard include:
Metric | Description |
---|---|
TestCaseWithRequirement | Determine whether a test case is linked to requirements. |
TestCaseWithRequirementPercentage | Calculate the fraction of test cases that are linked to requirements. |
TestCaseWithRequirementDistribution | Distribution of the number of test cases linked to requirements compared to the number of test cases that are missing links to requirements. |
RequirementsPerTestCase | Count the number of requirements linked to each test case. |
RequirementsPerTestCaseDistribution | Distribution of the number of requirements linked to each test case. |
For more information, see Tests Linked to Requirements.
Metrics for Test Case Breakdown
The metrics associated with the Test Case Breakdown section of the dashboard include:
Metric | Description |
---|---|
TestCaseType | Return the type of the test case. |
TestCaseTypeDistribution | Distribution of the types of the test cases for the unit. |
TestCaseTag | Return the tags for a test case. |
TestCaseTagDistribution | Distribution of the tags of the test cases for the unit. |
For more information, see Test Case Breakdown.
Metrics for Model Test Status
The metrics associated with the Model Test Status section of the dashboard include:
Metric | Description |
---|---|
TestCaseStatus | Return the status of the test case result. |
TestCaseStatusPercentage | Calculate the fraction of test cases that passed. |
TestCaseStatusDistribution | Distribution of the statuses of the test case results for the unit. |
TestCaseVerificationStatus | Determine whether a test case has pass/fail criteria such as verify statements, verification blocks, custom criteria, and logical or temporal assessments. |
TestCaseVerificationStatusDistribution | Distribution of the number of test cases that do not have pass/fail criteria compared to the number of test cases that do have pass/fail criteria. |
For more information, see Model Test Status.
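To check the overall pass rate programmatically, one possible sketch using the TestCaseStatusPercentage metric:

```matlab
metricEngine = metric.Engine();
execute(metricEngine, "TestCaseStatusPercentage");
result = getMetrics(metricEngine, "TestCaseStatusPercentage");

% The Value is the fraction of test cases that passed (0 through 1).
if ~isempty(result) && result(1).Value < 1
    warning("Only %.0f%% of test cases passed.", 100*result(1).Value);
end
```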
Metrics for Model Coverage for the Unit
The Model Coverage section of the dashboard shows the aggregated coverage for the unit.
The metrics associated with the Model Coverage section of the dashboard include:
Metric | Description |
---|---|
ExecutionCoverageBreakdown | Overall model execution coverage achieved, justified, or missed by the tests in the unit. |
DecisionCoverageBreakdown | Overall model decision coverage achieved, justified, or missed by the tests in the unit. |
ConditionCoverageBreakdown | Overall model condition coverage achieved, justified, or missed by the tests in the unit. |
MCDCCoverageBreakdown | Overall model modified condition and decision coverage (MC/DC) achieved, justified, or missed by the tests in the unit. |
For more information, see Model Coverage for the Unit.
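A sketch of collecting the aggregated coverage metrics and exporting the results, using the `generateReport` method of the metric engine:

```matlab
metricEngine = metric.Engine();
execute(metricEngine, ["ExecutionCoverageBreakdown", ...
                       "DecisionCoverageBreakdown", ...
                       "ConditionCoverageBreakdown", ...
                       "MCDCCoverageBreakdown"]);

% Generate an HTML report that includes the collected coverage results.
generateReport(metricEngine, 'Type', 'html-file');
```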
Metrics for Model Coverage for Each Model in the Unit
When you click on a bar in the Model Coverage bar chart, the Metric Details show the coverage for each model in the unit.
The metrics associated with the model coverage for each model in the unit include:
Metric | Description |
---|---|
ExecutionCoverageFragment | Execution coverage for each model in the unit. |
DecisionCoverageFragment | Decision coverage for each model in the unit. |
ConditionCoverageFragment | Condition coverage for each model in the unit. |
MCDCCoverageFragment | Modified condition/decision coverage (MC/DC) for each model in the unit. |
For more information, see Model Coverage for Each Model in the Unit.
Metrics for Requirements-Based Tests for the Unit
The Achieved Coverage Ratio section of the dashboard shows the sources of achieved coverage for the unit. The Requirements-Based Tests section shows how much of the overall achieved coverage comes from requirements-based tests.
The metrics associated with the Requirements-Based Tests section of the dashboard include:
Metric | Description |
---|---|
RequirementsExecutionCoverageBreakdown | Fraction of the overall achieved execution coverage that comes from requirements-based tests in the unit. |
RequirementsDecisionCoverageBreakdown | Fraction of the overall achieved decision coverage that comes from requirements-based tests in the unit. |
RequirementsConditionCoverageBreakdown | Fraction of the overall achieved condition coverage that comes from requirements-based tests in the unit. |
RequirementsMCDCCoverageBreakdown | Fraction of the overall achieved MC/DC coverage that comes from requirements-based tests in the unit. |
For more information, see Requirements-Based Tests for the Unit.
Metrics for Requirements-Based Tests for Each Model in the Unit
When you click on a bar in the Requirements-Based Tests section, the Metric Details show the coverage ratio for each model in the unit. For requirements-based tests, the coverage ratio is the percentage of the overall achieved coverage that comes from requirements-based tests.
The metrics associated with requirements-based coverage for each model in the unit include:
Metric | Description |
---|---|
RequirementsExecutionCoverageFragment | Fraction of the overall achieved execution coverage that comes from requirements-based tests in each model in the unit. |
RequirementsDecisionCoverageFragment | Fraction of the overall achieved decision coverage that comes from requirements-based tests in each model in the unit. |
RequirementsConditionCoverageFragment | Fraction of the overall achieved condition coverage that comes from requirements-based tests in each model in the unit. |
RequirementsMCDCCoverageFragment | Fraction of the overall achieved MC/DC coverage that comes from requirements-based tests in each model in the unit. |
For more information, see Requirements-Based Tests for Each Model in the Unit.
Metrics for Unit-Boundary Tests for the Unit
The Achieved Coverage Ratio section of the dashboard shows the sources of achieved coverage for the unit. The Unit-Boundary Tests section shows how much of the overall achieved coverage comes from unit-boundary tests.
The metrics associated with the Unit-Boundary Tests section of the dashboard include:
Metric | Description |
---|---|
UnitBoundaryExecutionCoverageBreakdown | Fraction of the overall achieved execution coverage that comes from unit-boundary tests in the unit. |
UnitBoundaryDecisionCoverageBreakdown | Fraction of the overall achieved decision coverage that comes from unit-boundary tests in the unit. |
UnitBoundaryConditionCoverageBreakdown | Fraction of the overall achieved condition coverage that comes from unit-boundary tests in the unit. |
UnitBoundaryMCDCCoverageBreakdown | Fraction of the overall achieved MC/DC coverage that comes from unit-boundary tests in the unit. |
For more information, see Unit-Boundary Tests for the Unit.
Metrics for Unit-Boundary Tests for Each Model in the Unit
When you click on a bar in the Unit-Boundary Tests section, the Metric Details show the coverage ratio for each model in the unit. For unit-boundary tests, the coverage ratio is the percentage of the overall achieved coverage that comes from unit-boundary tests.
The metrics associated with unit-boundary coverage for each model in the unit include:
Metric | Description |
---|---|
UnitBoundaryExecutionCoverageFragment | Fraction of the overall achieved execution coverage that comes from unit-boundary tests in each model in the unit. |
UnitBoundaryDecisionCoverageFragment | Fraction of the overall achieved decision coverage that comes from unit-boundary tests in each model in the unit. |
UnitBoundaryConditionCoverageFragment | Fraction of the overall achieved condition coverage that comes from unit-boundary tests in each model in the unit. |
UnitBoundaryMCDCCoverageFragment | Fraction of the overall achieved MC/DC coverage that comes from unit-boundary tests in each model in the unit. |
For more information, see Unit-Boundary Tests for Each Model in the Unit.
Requirements Linked to Tests
The metrics associated with the Requirements Linked to Tests section of the dashboard are:
RequirementWithTestCase
Determine whether a requirement is linked to test cases.
Metric Information |
---|
Metric ID: RequirementWithTestCase |
Description: Use this metric to determine whether a requirement is linked to a test case with a link where the Type is set to Verifies.
Collecting data for this metric loads the model file and requires a Requirements Toolbox™ license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementWithTestCasePercentage
Calculate the percentage of requirements that are linked to test cases.
Metric Information |
---|
Metric ID: RequirementWithTestCasePercentage |
Description: This metric counts the fraction of requirements that are linked to at least one test case with a link where the Type is set to Verifies. This metric calculates the results by using the results of the RequirementWithTestCase metric.
Collecting data for this metric loads the model file and requires a Requirements Toolbox license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementWithTestCaseDistribution
Distribution of the number of requirements linked to test cases compared to the number of requirements that are missing test cases.
Metric Information |
---|
Metric ID: RequirementWithTestCaseDistribution |
Description: Use this metric to count the number of requirements that are linked to test cases and the number of requirements that are missing links to test cases. The metric analyzes only requirements where the Type is set to Functional. This metric returns the result as a distribution of the results of the RequirementWithTestCase metric.
Collecting data for this metric loads the model file and requires a Requirements Toolbox license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property.
The first bin includes requirements that are not linked to test cases. The second bin includes requirements that are linked to at least one test case. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCasesPerRequirement
Count the number of test cases linked to each requirement.
Metric Information |
---|
Metric ID: TestCasesPerRequirement |
Description: Use this metric to count the number of test cases linked to each requirement. The metric analyzes only requirements where the Type is set to Functional.
Collecting data for this metric loads the model file and requires a Requirements Toolbox license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCasesPerRequirementDistribution
Distribution of the number of test cases linked to each requirement.
Metric Information |
---|
Metric ID: TestCasesPerRequirementDistribution |
Description: This metric returns a distribution of the number of test cases linked to each requirement. Use this metric to determine if requirements are linked to a disproportionate number of test cases. The metric analyzes only requirements where the Type is set to Functional. This metric returns the result as a distribution of the results of the TestCasesPerRequirement metric.
Collecting data for this metric loads the model file and requires a Requirements Toolbox license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property.
The bins in the result of this metric correspond to the bins 0, 1, 2, 3, and >3 in the Tests per Requirement widget. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Tests Linked to Requirements
The metrics associated with the Tests Linked to Requirements section of the dashboard are:
TestCaseWithRequirement
Determine whether a test case is linked to requirements.
Metric Information |
---|
Metric ID: TestCaseWithRequirement |
Description: Use this metric to determine whether a test case is linked to a requirement with a link where the Type is set to Verifies.
Collecting data for this metric loads the model file and requires a Simulink® Test™ license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseWithRequirementPercentage
Calculate the fraction of test cases that are linked to requirements.
Metric Information |
---|
Metric ID: TestCaseWithRequirementPercentage |
Description: This metric counts the fraction of test cases that are linked to at least one requirement with a link where the Type is set to Verifies. This metric calculates the results by using the results of the TestCaseWithRequirement metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseWithRequirementDistribution
Distribution of the number of test cases linked to requirements compared to the number of test cases that are missing links to requirements.
Metric Information |
---|
Metric ID: TestCaseWithRequirementDistribution |
Description: Use this metric to count the number of test cases that are linked to requirements and the number of test cases that are missing links to requirements. The metric analyzes only test cases that run on the model or subsystems in the unit for which you collect metric data. A test case is linked to a requirement if it has a link where the Type is set to Verifies. This metric returns the result as a distribution of the results of the TestCaseWithRequirement metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property.
The first bin includes test cases that are not linked to requirements. The second bin includes test cases that are linked to at least one requirement. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsPerTestCase
Count the number of requirements linked to each test case.
Metric Information |
---|
Metric ID: RequirementsPerTestCase |
Description: Use this metric to count the number of requirements linked to each test case. The metric analyzes only test cases that run on the model or subsystems in the unit for which you collect metric data. A test case is linked to a requirement if it has a link where the Type is set to Verifies.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsPerTestCaseDistribution
Distribution of the number of requirements linked to each test case.
Metric Information |
---|
Metric ID: RequirementsPerTestCaseDistribution |
Description: This metric returns a distribution of the number of requirements linked to each test case. Use this metric to determine if test cases are linked to a disproportionate number of requirements. The metric analyzes only test cases that run on the model or subsystems in the unit for which you collect metric data. A test case is linked to a requirement if it has a link where the Type is set to Verifies. This metric returns the result as a distribution of the results of the RequirementsPerTestCase metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property.
The bins in the result of this metric correspond to the bins 0, 1, 2, 3, and >3 in the Requirements per Test widget. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Test Case Breakdown
The metrics associated with the Test Case Breakdown section of the dashboard are:
TestCaseType
Return the type of the test case.
Metric Information |
---|
Metric ID: TestCaseType |
Description: This metric returns the type of the test case. A test case is either a baseline, equivalence, or simulation test.
Collecting data for this metric loads the model file and test files and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric includes only test cases in the project that test the model or subsystems in the unit for which you collect metric data. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseTypeDistribution
Distribution of the types of the test cases for the unit.
Metric Information |
---|
Metric ID: TestCaseTypeDistribution |
Description: This metric returns a distribution of the types of test cases that run on the unit. A test case is either a baseline, equivalence, or simulation test. Use this metric to determine if there is a disproportionate number of test cases of one type. This metric returns the result as a distribution of the results of the TestCaseType metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric includes only test cases in the project that test the model or subsystems in the unit for which you collect metric data. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseTag
Return the tags for a test case.
Metric Information |
---|
Metric ID: TestCaseTag |
Description: This metric returns the tags for a test case. You can add custom tags to a test case by using the Test Manager.
Collecting data for this metric loads the model file and test files and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric includes only test cases in the project that test the model or subsystems in the unit for which you collect metric data. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseTagDistribution
Distribution of the tags of the test cases for the unit.
Metric Information |
---|
Metric ID: TestCaseTagDistribution |
Description: This metric returns a distribution of the tags on the test cases that run on the unit. For a test case, you can specify custom tags in a comma-separated list in the Test Manager. Use this metric to determine if there is a disproportionate number of test cases that have a particular tag. This metric returns the result as a distribution of the results of the TestCaseTag metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric includes only test cases in the project that test the model or subsystems in the unit for which you collect metric data. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Model Test Status
The metrics associated with the Model Test Status section of the dashboard are:
TestCaseStatus
Return the status of the test case result.
Metric Information |
---|
Metric ID: TestCaseStatus |
Description: This metric returns the status of the test case result. A test status is passed, failed, disabled, or untested.
Collecting data for this metric loads the model file and test result files and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseStatusPercentage
Calculate the fraction of test cases that passed.
Metric Information |
---|
Metric ID: TestCaseStatusPercentage |
Description: This metric counts the fraction of test cases that passed in the test results. This metric calculates the results by using the results of the TestCaseStatus metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseStatusDistribution
Distribution of the statuses of the test case results for the unit.
Metric Information |
---|
Metric ID: TestCaseStatusDistribution |
Description: This metric returns a distribution of the status of the results of test cases that run on the unit. A test status is passed, failed, disabled, or untested. This metric returns the result as a distribution of the results of the TestCaseStatus metric.
Collecting data for this metric loads the model file and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseVerificationStatus
Determine whether a test case has pass/fail criteria such as verify statements, verification blocks, custom criteria, and logical or temporal assessments.
Metric Information |
---|
Metric ID: TestCaseVerificationStatus |
Description: Use this metric to determine whether a test case has pass/fail criteria. A test case has pass/fail criteria if it has at least one of the following: verify statements, verification blocks, custom criteria, or logical or temporal assessments.
Collecting data for this metric loads the model file and test result files and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
TestCaseVerificationStatusDistribution
Distribution of the number of test cases that do not have pass/fail criteria compared to the number of test cases that do have pass/fail criteria.
Metric Information |
---|
Metric ID: TestCaseVerificationStatusDistribution |
Description: Use this metric to count the number of test cases that do not have pass/fail criteria and the number of test cases that do have pass/fail criteria. A test case has pass/fail criteria if it has at least one of the following: verify statements, verification blocks, custom criteria, or logical or temporal assessments. This metric returns the result as a distribution of the results of the TestCaseVerificationStatus metric.
Collecting data for this metric loads the model file and test files and requires a Simulink Test license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Model Coverage for the Unit
The Model Coverage section of the dashboard shows the aggregated coverage for the unit.
The metrics associated with the Model Coverage section of the dashboard are:
ExecutionCoverageBreakdown for the execution coverage
DecisionCoverageBreakdown for the decision coverage
ConditionCoverageBreakdown for the condition coverage
MCDCCoverageBreakdown for the MC/DC coverage
ExecutionCoverageBreakdown
Overall model execution coverage achieved, justified, or missed by the tests in the unit.
Metric Information |
---|
Metric ID: ExecutionCoverageBreakdown |
Description: This metric returns the model execution coverage measured in the test results, aggregated across the unit. The metric result includes the percentage of execution coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of execution coverage missed by the tests.
Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage™ license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
DecisionCoverageBreakdown
Overall model decision coverage achieved, justified, or missed by the tests in the unit.
Metric Information |
---|
Metric ID: DecisionCoverageBreakdown |
Description: This metric returns the model decision coverage measured in the test results, aggregated across the unit. The metric result includes the percentage of decision coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of decision coverage missed by the tests.
Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
ConditionCoverageBreakdown
Overall model condition coverage achieved, justified, or missed by the tests in the unit.
Metric Information |
---|
Metric ID: ConditionCoverageBreakdown |
Description: This metric returns the model condition coverage measured in the test results, aggregated across the unit. The metric result includes the percentage of condition coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of condition coverage missed by the tests.
Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
MCDCCoverageBreakdown
Overall model modified condition and decision coverage (MC/DC) achieved, justified, or missed by the tests in the unit.
Metric Information |
---|
Metric ID: MCDCCoverageBreakdown |
Description: This metric returns the modified condition and decision coverage (MC/DC) measured in the test results, aggregated across the unit. The metric result includes the percentage of MC/DC coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of MC/DC coverage missed by the tests.
Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: The default compliance thresholds for this metric are:
|
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Model Coverage for Each Model in the Unit
When you click on a bar in the Model Coverage bar chart, the Metric Details show the coverage for each model in the unit.
The metrics associated with the model coverage for each model in the unit are:
ExecutionCoverageFragment for the execution coverage
DecisionCoverageFragment for the decision coverage
ConditionCoverageFragment for the condition coverage
MCDCCoverageFragment for the MC/DC coverage
ExecutionCoverageFragment
Execution coverage for each model in the unit.
Metric Information |
---|
Metric ID: ExecutionCoverageFragment |
Description: This metric returns the model execution coverage measured in the test results for each model in the unit. The metric result includes the percentage of execution coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of execution coverage missed by the tests.
Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the metric value in the Value property. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric:
|
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
DecisionCoverageFragment
Decision coverage for each model in the unit.
Metric Information |
---|
Metric ID: DecisionCoverageFragment |
Description: This metric returns the model decision coverage measured in the test results for each model in the unit. The metric result includes the percentage of decision coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of decision coverage missed by the tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the percentages of decision coverage achieved, justified, and missed for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
ConditionCoverageFragment
Condition coverage for each model in the unit.
Metric Information |
---|
Metric ID: ConditionCoverageFragment |
Description: This metric returns the model condition coverage measured in the test results for each model in the unit. The metric result includes the percentage of condition coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of condition coverage missed by the tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the percentages of condition coverage achieved, justified, and missed for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
MCDCCoverageFragment
Modified condition/decision coverage (MC/DC) for each model in the unit.
Metric Information |
---|
Metric ID: MCDCCoverageFragment |
Description: This metric returns the modified condition/decision coverage (MC/DC) measured in the test results for each model in the unit. The metric result includes the percentage of MC/DC coverage achieved by the test cases, the percentage of coverage justified in coverage filters, and the percentage of MC/DC coverage missed by the tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the percentages of MC/DC coverage achieved, justified, and missed for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Requirements-Based Tests for the Unit
The Achieved Coverage Ratio section of the dashboard shows the sources of achieved coverage for the unit. The Requirements-Based Tests section shows how much of the overall achieved coverage comes from requirements-based tests.
The metrics associated with the Requirements-Based Tests section of the dashboard are:
RequirementsExecutionCoverageBreakdown for the requirements-based execution coverage
RequirementsDecisionCoverageBreakdown for the requirements-based decision coverage
RequirementsConditionCoverageBreakdown for the requirements-based condition coverage
RequirementsMCDCCoverageBreakdown for the requirements-based MC/DC coverage
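The breakdown metrics above follow the same metric.Engine workflow. This sketch assumes the project is already open; getAvailableMetricIds (mentioned earlier in this page) is used to confirm the IDs before executing them.

```matlab
% Sketch: check how much achieved coverage comes from requirements-based tests.
% Assumes the project with the unit under test is already open.
metricEngine = metric.Engine();

% Confirm these metric IDs exist in your installation before executing them
available = getAvailableMetricIds(metricEngine);

breakdownIds = ["RequirementsExecutionCoverageBreakdown", ...
                "RequirementsDecisionCoverageBreakdown", ...
                "RequirementsConditionCoverageBreakdown", ...
                "RequirementsMCDCCoverageBreakdown"];

% Loads models and test results; requires a Simulink Coverage license
execute(metricEngine, intersect(breakdownIds, available));

results = getMetricResults(metricEngine, "RequirementsExecutionCoverageBreakdown");
for n = 1:numel(results)
    disp(results(n).Value)   % achieved coverage attributable to requirements-based tests
end
```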
RequirementsExecutionCoverageBreakdown
Fraction of the overall achieved execution coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsExecutionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved execution coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved execution coverage that comes from requirements-based tests. |
Compliance Thresholds: This metric has default compliance thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsDecisionCoverageBreakdown
Fraction of the overall achieved decision coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsDecisionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved decision coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved decision coverage that comes from requirements-based tests. |
Compliance Thresholds: This metric has default compliance thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsConditionCoverageBreakdown
Fraction of the overall achieved condition coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsConditionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved condition coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved condition coverage that comes from requirements-based tests. |
Compliance Thresholds: This metric has default compliance thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsMCDCCoverageBreakdown
Fraction of the overall achieved MC/DC coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsMCDCCoverageBreakdown |
Description: This metric returns the fraction of overall achieved MC/DC coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved MC/DC coverage that comes from requirements-based tests. |
Compliance Thresholds: This metric has default compliance thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Requirements-Based Tests for Each Model in the Unit
When you click a bar in the Requirements-Based Tests section, the Metric Details section shows the coverage ratio for each model in the unit. For requirements-based tests, the coverage ratio is the percentage of the overall achieved coverage that comes from requirements-based tests. Requirements-based tests are test cases that are linked to at least one requirement in the project.
The metrics associated with requirements-based coverage for each model in the unit are:
RequirementsExecutionCoverageFragment for the requirements-based execution coverage
RequirementsDecisionCoverageFragment for the requirements-based decision coverage
RequirementsConditionCoverageFragment for the requirements-based condition coverage
RequirementsMCDCCoverageFragment for the requirements-based MC/DC coverage
RequirementsExecutionCoverageFragment
Fraction of the overall achieved execution coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsExecutionCoverageFragment |
Description: This metric returns the fraction of overall achieved execution coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved execution coverage that comes from requirements-based tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsDecisionCoverageFragment
Fraction of the overall achieved decision coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsDecisionCoverageFragment |
Description: This metric returns the fraction of overall achieved decision coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved decision coverage that comes from requirements-based tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsConditionCoverageFragment
Fraction of the overall achieved condition coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsConditionCoverageFragment |
Description: This metric returns the fraction of overall achieved condition coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved condition coverage that comes from requirements-based tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
RequirementsMCDCCoverageFragment
Fraction of the overall achieved modified condition and decision (MC/DC) coverage that comes from requirements-based tests.
Metric Information |
---|
Metric ID: RequirementsMCDCCoverageFragment |
Description: This metric returns the fraction of overall achieved MC/DC coverage that comes from requirements-based tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved MC/DC coverage that comes from requirements-based tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Unit-Boundary Tests for the Unit
The Achieved Coverage Ratio section of the dashboard shows the sources of achieved coverage for the unit. The Unit-Boundary Tests section shows how much of the overall achieved coverage comes from unit-boundary tests.
The metrics associated with the Unit-Boundary Tests section of the dashboard are:
UnitBoundaryExecutionCoverageBreakdown for the unit-boundary execution coverage
UnitBoundaryDecisionCoverageBreakdown for the unit-boundary decision coverage
UnitBoundaryConditionCoverageBreakdown for the unit-boundary condition coverage
UnitBoundaryMCDCCoverageBreakdown for the unit-boundary MC/DC coverage
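The unit-boundary breakdown metrics can be collected the same way and the results exported to a report. This is a sketch, assuming an open project; the generateReport options and output location shown here are examples, so verify them against the documentation for your release.

```matlab
% Sketch: collect the unit-boundary coverage breakdown metrics and save a report.
% Assumes the project with the unit under test is already open.
metricEngine = metric.Engine();

% Loads models and test results; requires a Simulink Coverage license
execute(metricEngine, ["UnitBoundaryExecutionCoverageBreakdown", ...
                       "UnitBoundaryDecisionCoverageBreakdown", ...
                       "UnitBoundaryConditionCoverageBreakdown", ...
                       "UnitBoundaryMCDCCoverageBreakdown"]);

% Generate an HTML report of the collected results
% (the report type and location are example values)
generateReport(metricEngine, "Type", "html-file", ...
               "Location", fullfile(pwd, "UnitBoundaryCoverage"));
```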
UnitBoundaryExecutionCoverageBreakdown
Fraction of the overall achieved execution coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryExecutionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved execution coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved execution coverage that comes from unit-boundary tests. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryDecisionCoverageBreakdown
Fraction of the overall achieved decision coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryDecisionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved decision coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved decision coverage that comes from unit-boundary tests. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryConditionCoverageBreakdown
Fraction of the overall achieved condition coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryConditionCoverageBreakdown |
Description: This metric returns the fraction of overall achieved condition coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved condition coverage that comes from unit-boundary tests. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryMCDCCoverageBreakdown
Fraction of the overall achieved MC/DC coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryMCDCCoverageBreakdown |
Description: This metric returns the fraction of overall achieved MC/DC coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved MC/DC coverage that comes from unit-boundary tests. |
Compliance Thresholds: This metric does not have predefined thresholds. Consequently, the compliance threshold overlay icon appears when you click Uncategorized in the Overlays section of the toolstrip. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
Unit-Boundary Tests for Each Model in the Unit
When you click a bar in the Unit-Boundary Tests section, the Metric Details section shows the coverage ratio for each model in the unit. For unit-boundary tests, the coverage ratio is the percentage of the overall achieved coverage that comes from unit-boundary tests. Unit-boundary tests are test cases that test the whole unit, not just lower-level subsystems of the unit.
The metrics associated with unit-boundary coverage for each model in the unit are:
UnitBoundaryExecutionCoverageFragment for the unit-boundary execution coverage
UnitBoundaryDecisionCoverageFragment for the unit-boundary decision coverage
UnitBoundaryConditionCoverageFragment for the unit-boundary condition coverage
UnitBoundaryMCDCCoverageFragment for the unit-boundary MC/DC coverage
UnitBoundaryExecutionCoverageFragment
Fraction of the overall achieved execution coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryExecutionCoverageFragment |
Description: This metric returns the fraction of overall achieved execution coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved execution coverage that comes from unit-boundary tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryDecisionCoverageFragment
Fraction of the overall achieved decision coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryDecisionCoverageFragment |
Description: This metric returns the fraction of overall achieved decision coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved decision coverage that comes from unit-boundary tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryConditionCoverageFragment
Fraction of the overall achieved condition coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryConditionCoverageFragment |
Description: This metric returns the fraction of overall achieved condition coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved condition coverage that comes from unit-boundary tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |
UnitBoundaryMCDCCoverageFragment
Fraction of the overall achieved MC/DC coverage that comes from unit-boundary tests.
Metric Information |
---|
Metric ID: UnitBoundaryMCDCCoverageFragment |
Description: This metric returns the fraction of overall achieved MC/DC coverage that comes from unit-boundary tests. Collecting data for this metric loads the model file and test results files and requires a Simulink Coverage license. |
Results: For this metric, instances of metric.Result return the fraction of overall achieved MC/DC coverage that comes from unit-boundary tests for each model in the unit. |
Compliance Thresholds: This metric does not have predefined thresholds. |
Capabilities and Limitations: The metric does not analyze coverage from test cases that run in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode. |
See Also: For an example of collecting metrics programmatically, see Collect Metrics on Model Testing Artifacts Programmatically. |