metric.Result
Metric data for specified metric algorithm

Description
A metric.Result object contains the metric data for a specified metric algorithm that traces to the specified unit or component.
Creation
Description
metric_result = metric.Result creates a handle to a metric result object.
Alternatively, if you collect results by executing a metric.Engine object, using the getMetrics function on the engine object returns the collected metric.Result objects in an array.
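For instance, this minimal sketch collects results with an engine and returns them as an array of metric.Result objects. It uses the metric identifier 'RequirementsPerTestCase' from the examples below; substitute any metric identifier available in your project:

metric_engine = metric.Engine();
execute(metric_engine,'RequirementsPerTestCase');
results = getMetrics(metric_engine,'RequirementsPerTestCase');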
Properties
MetricID — Metric identifier
string
Metric identifier for the metric algorithm that calculated the results, returned as a string.
Example: 'TestCasesPerRequirementDistribution'
Artifacts — Project artifacts
structure | array of structures
Project artifacts for which the metric is calculated, returned as a structure or an array of structures. For each artifact that the metric analyzed, the returned structure contains these fields:
UUID — Unique identifier of the artifact.
Name — Name of the artifact.
Type — Type of artifact.
ParentUUID — Unique identifier of the file that contains the artifact.
ParentName — Name of the file that contains the artifact.
ParentType — Type of file that contains the artifact.
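As a sketch, assuming results is an array of metric.Result objects returned by getMetrics, you can list the artifacts that each result analyzed:

for n = 1:length(results)
    for k = 1:length(results(n).Artifacts)
        % Each Artifacts element has Name, Type, and parent file information
        disp([results(n).Artifacts(k).Type,': ',results(n).Artifacts(k).Name])
    end
end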
Value — Result value
integer | string | double vector | structure
Value of the metric result for the specified algorithm and artifacts, returned as an integer, string, double vector, or structure. For a list of model testing metrics and their result values, see Model Testing Metrics.
Scope — Scope of metric results
structure
Scope of the metric results, returned as a structure. The scope is the unit or component for which the metric collected results. The structure contains these fields:
UUID — Unique identifier of the unit or component.
Name — Name of the unit or component.
Type — Type of unit or component.
ParentUUID — Unique identifier of the file that contains the unit or component.
ParentName — Name of the file that contains the unit or component.
ParentType — Type of file that contains the unit or component.
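For example, assuming result is a single metric.Result object, you can report the unit or component that the result traces to, along with its containing file:

% Scope identifies the unit or component for this result
disp(['Scope: ',result.Scope.Name,' (',result.Scope.Type,')'])
disp(['Contained in: ',result.Scope.ParentName])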
UserData — User data
string
User data provided by the metric algorithm, returned as a string.
Examples
Collect Metric Data on Design Artifacts in a Project
Use a metric.Engine object to collect metric data on the design artifacts in a project.
Open the project. At the command line, type dashboardCCProjectStart.
dashboardCCProjectStart
Create a metric.Engine object for the project.
metric_engine = metric.Engine();
Collect results for the metric "slcomp.OverallCyclomaticComplexity" by executing the metric engine. For more information on the metric, see Model Maintainability Metrics.
execute(metric_engine,'slcomp.OverallCyclomaticComplexity');
Use the getMetrics function to access the results. Assign the array of result objects to the results variable.
results = getMetrics(metric_engine,'slcomp.OverallCyclomaticComplexity');
Access the metric results data by using the properties of the metric.Result objects in the results array.
for n = 1:length(results)
    disp(['Model: ',results(n).Scope.Name])
    disp(['   Overall Design Cyclomatic Complexity: ',num2str(results(n).Value)])
end
Model: db_Controller
   Overall Design Cyclomatic Complexity: 1
Model: db_LightControl
   Overall Design Cyclomatic Complexity: 4
Model: db_ThrottleController
   Overall Design Cyclomatic Complexity: 4
Model: db_ControlMode
   Overall Design Cyclomatic Complexity: 22
Model: db_DriverSwRequest
   Overall Design Cyclomatic Complexity: 9
For more information on how to collect metrics for design artifacts, see Collect Model Maintainability Metrics Programmatically.
Collect Metric Data on Testing Artifacts in a Project
Collect metric data on the requirements-based testing artifacts in a project. Then, access the data by using the metric.Result objects.
Open the project. At the command line, type dashboardCCProjectStart.
dashboardCCProjectStart
Create a metric.Engine object for the project.
metric_engine = metric.Engine();
Update the trace information for metric_engine to ensure that the artifact information is up to date.
updateArtifacts(metric_engine)
Collect results for the metric 'RequirementsPerTestCase' by using the execute function on the metric.Engine object.
execute(metric_engine,'RequirementsPerTestCase');
Use the getMetrics function to access the results. Assign the array of result objects to the results variable.
results = getMetrics(metric_engine,'RequirementsPerTestCase');
Access the metric results data by using the properties of the metric.Result objects in the array.
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['   Number of Requirements: ',num2str(results(n).Value)])
end
Version History
Introduced in R2020b