Resolve Missing Artifacts, Links, and Results
Issue
The dashboards analyze artifacts—models, requirements, tests, code, and results—that are part of the model design and testing workflow for software units. If an artifact or a link between artifacts is not part of the workflow, it might not appear in the dashboard or contribute to the analysis results. Additionally, some artifacts and links are not supported by the dashboard. If you expect a link or artifact to appear in the dashboard and it does not, try one of these solutions.
Possible Solutions
Try these solutions when you begin troubleshooting artifacts in the dashboard:
Save changes to your artifact files.
Check that your artifacts are saved in the project. The dashboard does not analyze files that are not saved in the project.
If your project contains a referenced project, check that the referenced project has a unique project name. The dashboard only analyzes referenced projects that have unique project names.
Check that your artifacts are on the MATLAB® search path before you open the dashboard. When you change the MATLAB search path, the traceability information in the Artifacts panel is not updated. Do not change the search path while the dashboard is open.
Open the Artifact Issues pane and address errors or warnings. The Artifact Issues icon in the dashboard toolstrip changes based on the severity of the artifact issues in the project. For more information, see View Artifact Issues in Project.
Use the dashboard to re-trace the artifacts and re-collect metric results.
Note
Artifacts shown in the following folders in the Artifacts panel are not directly related to the unit and do not contribute to the metric results shown in the dashboard for the unit. If you expect an artifact to contribute to the metric results, check that the artifact is not in one of these folders:
Functional Requirements > Upstream
Tests > Others
Test Results > Others
Trace Issues
Depending on the type of artifact or analysis issue that you are troubleshooting, try one of these solutions.
Enable Artifact Tracing for the Project
As you edit and save the artifacts in your project, the dashboard needs to track these changes to enable artifact tracing and to detect stale results.
By default, the dashboard requests that you enable artifact tracing the first time you open a project in the dashboard. Click Enable and Continue to allow the dashboard to track tool outputs, such as test results from Simulink® Test™, and detect outdated metric results.
You can also enable artifact tracing from the Startup and Shutdown settings of the project. In the Startup and Shutdown settings for your project, select Track tool outputs to detect outdated results. For more information on the tool outputs and outdated metric results, see Digital Thread.
Cache Folder Artifact Tracking
By default, projects use the same root folder for both the simulation cache folder and the code generation folder. If possible, use different root folders for the simulation cache folder and code generation folder in your project. When you specify different root folders, the dashboard no longer needs to track changes to the simulation cache folder.
To view the cache folder settings for your project, on the Project tab, in the Environment section, click Details. The Project Details dialog shows the root folders specified for the Simulation cache folder and Code generation folder.
Change tracking depends only on the project settings. Customizations made outside the project settings do not affect change tracking. For example, the dashboard does not check root folders that you specify by using Simulink.fileGenControl.
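If you prefer to check these settings from the command line, the sketch below reads the root folders through the project API. This is a minimal sketch that assumes a project is open and that the SimulinkCacheFolder and SimulinkCodeGenFolder properties of the project object hold the folders shown in the Project Details dialog; the folder names in the commented lines are placeholders.

proj = currentProject;                       % handle to the currently open project
fprintf('Simulation cache folder: %s\n', proj.SimulinkCacheFolder);
fprintf('Code generation folder:  %s\n', proj.SimulinkCodeGenFolder);
% To separate the root folders, you could set them to different values,
% for example (placeholder folder names):
% proj.SimulinkCacheFolder   = "work/cache";
% proj.SimulinkCodeGenFolder = "work/codegen";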
Project Requires Analysis by the Dashboard
The first time that you open the dashboard for the project, the dashboard identifies the artifacts in the project and collects traceability information. The dashboard must perform this first-time setup to establish the traceability data before it can monitor the artifacts. If you cancel the first-time setup, the artifacts in the project appear in the Unanalyzed folder in the Artifacts panel. To trace the unanalyzed artifacts, click Collect > Trace Artifacts.
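If you prefer to run the traceability analysis programmatically, the sketch below uses the metric engine API that ships with Simulink Check, as I understand it; verify the exact function names (metric.Engine, updateArtifacts, getAvailableMetricIds, execute) against the documentation for your release. The sketch assumes the project is already open.

metricEngine = metric.Engine();                   % metric engine for the open project
updateArtifacts(metricEngine);                    % trace the artifacts in the project
% Optionally collect metric results after tracing:
metricIds = getAvailableMetricIds(metricEngine);  % metrics available in your installation
execute(metricEngine, metricIds);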
Incorrect List of Models in Project Panel
The Project panel shows the models in your project that are either units or components. Models are organized under the components that reference them, according to the model reference hierarchy. If the list of units and components does not show the expected hierarchy of your models, try one of these solutions.
Check that your units and components are labeled. Label the units and components in your project and configure the dashboard to recognize the labeled models. Note that if a unit references one or more other models, the referenced models appear in the Design folder under the unit. For more information about labeling models and configuring the dashboard, see Categorize Models in a Hierarchy as Components or Units. If you have Observer models, check that they are not labeled as units, because the dashboard includes any Observer model that matches the label requirements as a unit.
Check that your model was saved in a supported release. The dashboard does not support models that were saved in a release before R2012b.
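To check the release in which a model was last saved without opening the model, one option is Simulink.MDLInfo, which reads the model file metadata. In this sketch, myModel is a placeholder model name.

info = Simulink.MDLInfo('myModel');          % 'myModel' is a placeholder model name
fprintf('%s was last saved in %s\n', info.FileName, info.ReleaseName);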
Block Skipped During Artifact Analysis
If a block has a mask and the mask hides the content of the block, the dashboard excludes the block from artifact analysis.
Check that your custom libraries do not contain blocks with self-modifiable masks. The dashboard does not analyze blocks that contain self-modifiable masks. Self-modifiable masks can change the structural content of a block, which is incompatible with artifact traceability analysis.
Library Missing from Artifacts Panel
Check that the library does not use a library forwarding table. The dashboard does not support library forwarding tables.
Requirement Missing from Artifacts Panel
If a requirement is missing from the Artifacts panel, try one of these solutions.
Check that the requirement is a functional requirement. In the Requirements Editor, click the requirement in the left pane. In the right pane, in the Properties section, set Type to Functional. Because the Model Testing Dashboard reports on requirements-based unit testing, only functional requirements appear in the Artifacts panel and are analyzed by the dashboard.
Check that the requirement is saved in a supported requirements file. The requirement must be saved in a requirements file that has the .slreqx extension.
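You can also check the requirement type programmatically with the Requirements Toolbox API. This is a minimal sketch under the assumption that slreq.load, slreq.find, and the requirement Type property behave as described in the Requirements Toolbox documentation; myRequirements.slreqx is a placeholder file name.

slreq.load('myRequirements.slreqx');          % load the requirement set
reqs = slreq.find('Type', 'Requirement');     % all loaded requirement objects
for k = 1:numel(reqs)
    if ~strcmp(reqs(k).Type, 'Functional')    % requirement Type property
        fprintf('%s has Type %s\n', reqs(k).Id, reqs(k).Type);
    end
end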
Test Missing from Artifacts Panel
Check that the test is supported by the dashboards. The Model Testing, SIL Code Testing, and PIL Code Testing dashboards do not support MATLAB-based Simulink tests.
Test Harness Missing from Artifacts Panel
Check that the test harness is not an internal test harness for a System Composer™ architecture model. The dashboard does not support internal test harnesses for System Composer architecture models. If your model already uses internal test harnesses, you can convert the internal test harnesses to externally stored test harnesses. Navigate to the top of the main model and open Simulink Test. On the Tests tab, click Manage Test Harnesses > Convert to External Harnesses. Click OK to convert the affected test harnesses. External test harnesses for System Composer architecture models appear in the Artifacts panel in the subfolder Tests > Test Harnesses.
Check that the test harness is not on a subsystem inside a library block instance. If a test harness is on a subsystem inside a library block instance in a model, the dashboard cannot perform artifact traceability analysis on the test harness because this relationship is incompatible with artifact traceability analysis. To enable artifact traceability analysis, move the test harness to the library.
Test Result Missing from Artifacts Panel
Check that either:
The result is saved in a test results file. Save test results by exporting them from the Test Manager.
You collected the results during the current project session and have not closed them. When you collect test results and do not export them, the dashboard recognizes the temporary results in the Test Manager. The dashboard stops recognizing the temporary results when you close the project, close the test results set, or export the test results to a results file.
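To make temporary results durable, export them to a results file so that the dashboard can trace them in later sessions. This is a minimal sketch, assuming a test file named myTests.mldatx and an export file named myResults.mldatx, and that the Test Manager API functions sltest.testmanager.load, sltest.testmanager.run, and sltest.testmanager.exportResults are available in your release.

sltest.testmanager.load('myTests.mldatx');    % placeholder test file name
results = sltest.testmanager.run;             % run the loaded test files
sltest.testmanager.exportResults(results, 'myResults.mldatx');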
External MATLAB Code Missing from Artifacts Panel
The Artifacts panel shows external MATLAB code that the dashboard traced to the units and components in your project. If you expect external MATLAB code to appear in the dashboard and it does not, check whether the code uses one of these unsupported constructs:
A class method does not appear in the Artifacts panel if the method is:
A nonstatic method that you call using dot notation. The dashboard shows the associated class definition in the Artifacts panel.
A nonstatic method that you call using function notation. The dashboard shows the associated class definition in the Artifacts panel.
A static method that you call from a Simulink model using dot notation. The dashboard shows the associated class definition in the Artifacts panel.
A superclass method. The dashboard shows the associated superclass definition in the Artifacts panel.
Defined in a separate file from the class definition file. Methods declared in separate files are not supported. For the dashboard to identify a method, you must declare a method in the class definition file. For example, if you have a class folder containing a class definition file and separate method files, the method files are not supported by the dashboard. The dashboard shows the associated class definition in the Design folder.
A class constructor does not appear in the Artifacts panel if the constructor is a superclass constructor. The dashboard shows the associated superclass definition in the Design folder, but not the constructor itself.
A class property does not appear in the Artifacts panel if the property is called from Simulink or Stateflow®. The dashboard shows the associated class definition in the Artifacts panel.
An enumeration class does not appear in the Artifacts panel. For example, if you use an Enumerated Constant block in Simulink, the dashboard does not show the MATLAB class that defines the enum type.
Check that methods and local functions do not have the same name. If a class file contains a method and a local function that have the same name, calls that use dot notation call the method in the class definition, and calls that use function notation call the local function in the class file.
For example, if you have a class file containing the method myAction and the local function myAction, the code obj.myAction calls the method and the code myAction(obj) calls the local function.

classdef Class
    methods
        function myAction(~) % method in the class
            disp("Called method in the class.");
        end
        function myCall(obj)
            obj.myAction(); % dot notation calls the method in the class
            myAction(obj);  % function notation calls the local function
        end
    end
end

function myAction(x) % local function
    disp("Called local function");
end
Artifact Returns a Warning
Check the details of the warning by clicking the Artifact Issues button in the toolstrip.
Artifact Returns an Error
Check the details of the error by clicking the Artifact Issues button in the toolstrip.
If the dashboard returns an error in the Artifact Issues tab, the metric data shown by the dashboard widgets may be incomplete. Errors indicate that the dashboard may not have been able to properly trace artifacts, analyze artifacts, or collect metrics.
Before using the metrics results shown in the dashboard, resolve the reported errors and retrace the artifacts.
Fix ambiguous links. Check that the links in your project define unambiguous relationships between project artifacts.
In requirements-based testing, projects often contain links between software requirements and:
the design artifacts that implement the requirements
the tests that test the implemented requirements
the higher-level system requirements
The links in your project help to define the relationships between artifacts. The dashboard uses a digital thread to capture the traceability relationships between the artifacts in your project. To maintain the traceability relationships, the dashboard returns an error when the links to project artifacts are ambiguous. Ambiguous links are not supported in the dashboard.
If one of these conditions is met, the dashboard cannot establish unambiguous traceability:
A link set shadows another loaded link set of the same name.
A requirement set shadows another loaded requirement set of the same name.
A link is not on the project path or is only temporarily on the project path.
A link is not portable.
To avoid links that are not portable:
Do not set the preference for the link path format to an absolute path. Absolute paths are not portable. For information on how to set the preference for the path format of links, see rmipref (Requirements Toolbox) and Document Path Storage (Requirements Toolbox).
When you identify the source artifact of a link set, use the default link file name and location. Link source remapping persists in the MATLAB preferences directory and is not portable. For more information, see Requirements Link Storage (Requirements Toolbox).
Use the details and suggested actions in the dashboard error messages to fix the ambiguous links.
If you link a requirement to a MATLAB function, make sure you link to the first line of the function definition. For more information, see Link Requirements to MATLAB or Plain Text Code (Requirements Toolbox).
For more information on traceability relationships and the digital thread, see Digital Thread.
Trace Issues
If an artifact appears in the Trace Issues folder when you expect it to trace to a unit, depending on the type of artifact that is untraced, try one of these solutions.
Fix an untraced requirement. Check that the requirement traces to the unit using an implementation link.
The requirement must link to the model or to a library subsystem used by the model with a link where the Type is set to Implements.
Requirements-based testing verifies that your model fulfills the functional requirements that it implements. Because the Model Testing Dashboard reports on requirements-based testing quality, it analyzes only requirements that are specified as functional requirements and implemented in the unit. For each unit, the dashboard shows the functional requirements that are implemented in the unit in the folder Functional Requirements > Implemented.
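To confirm which link types are loaded for your artifacts, you can inspect the links programmatically. This is a minimal sketch under the assumption that the requirement and link sets for the unit are already loaded and that slreq.find and the link Type property behave as in the Requirements Toolbox API.

links = slreq.find('Type', 'Link');           % all loaded link objects
for k = 1:numel(links)
    fprintf('Link %d has Type %s\n', k, links(k).Type);
end
% Only links whose type is Implements (the API value may read Implement) and
% that point from the unit design to the requirement are traced by the dashboard.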
Check that the requirement does not use an unsupported link. The Model Testing Dashboard does not trace these links:
Downstream links. The Model Testing Dashboard traces only Implements links that link directly from the unit design to the requirement. Requirements that are not directly linked appear in the folder Functional Requirements > Upstream.
Embedded links, which are requirements files that are saved directly in the model file.
Links to requirements that are saved externally and linked using the Requirements Management Interface (RMI).
Links to custom requirements that you defined by using stereotypes.
Links inside:
Requirement Table blocks
Test Sequence blocks
Test Assessment blocks
Links in deprecated requirement files, which have the extension .req. To analyze requirement links in the dashboard, save the links in an .slmx file or create them in the requirements file (.slreqx) that has the requirements.
Links to models for which the model file extension changed. If a requirement is linked to a model with the file extension .slx, but the model file extension is changed to .mdl, the dashboard lists the requirement link as unresolved. Modify the requirement link to reference the expected model file and re-save the requirement link.
Symbolic file links in a project, such as shortcuts.
Links to modeling elements that are not supported by the Model Testing Dashboard, such as library forwarding tables.
Custom link types that you defined by using stereotypes.
A requirement link to a justification. If a requirement is linked with a justification and not linked to a test, it appears as unlinked in the metric results.
A requirement link to a test harness.
For requirement links to data dictionary entries, the dashboard traces from the requirement to the data dictionary file associated with the data dictionary entry.
If a requirement links to a range of MATLAB code that contains multiple code constructs, the dashboard resolves the link to the first code construct that appears in the range. For example, if the linked line range contains MATLAB code for two functions, the dashboard generates a warning and resolves the requirement link to the first function.
Fix an untraced requirement link set. Check that the requirement link set does not use the legacy Requirements Management Interface (RMI) format. To allow the dashboard to analyze your requirement link set, pass your requirement link set as the input argument to the function slreq.refreshSourceArtifactPath.
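For example, a minimal sketch, assuming the link set is stored in a file named myModel.slmx and that slreq.load returns the link set object to pass to slreq.refreshSourceArtifactPath:

linkSet = slreq.load('myModel.slmx');          % placeholder link set file name
slreq.refreshSourceArtifactPath(linkSet);      % refresh the legacy source artifact path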
Fix an untraced design artifact. Check that the design artifact does not rely on a model callback to be linked with the model. The dashboards do not execute model loading callbacks when they load the models for analysis. If a model relies on a callback to link a data dictionary, the data dictionary will not be linked when the dashboards run the traceability analysis.
Fix an untraced test. Check that the test runs on the model or runs on an atomic subsystem in the model by using a test harness.
Fix an untraced test result. Check that the project and test are set up correctly and re-run your tests. If one of these conditions is met when you run your test, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the unit:
No project is loaded.
Artifact tracing is not enabled for the project. If artifact tracing is not enabled, the dashboard cannot track changes or trace from the tests to the generated test results. For more information, see Enable Artifact Tracing for the Project.
You do not have a Simulink Check™ license.
The test file is stored outside the project.
The test file has unsaved changes.
The tested model has unsaved changes.
The test file returns an error during traceability analysis.
The tested model returns an error during traceability analysis.
The test result comes from a test that is not supported by the dashboards, such as a MATLAB-based Simulink test.
Check that the results and environment are set up correctly and re-export your test results. If one of these conditions is met when you export your test results, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the unit:
No project is loaded.
Artifact tracing is not enabled for the project. For more information, see Enable Artifact Tracing for the Project.
You do not have a Simulink Check license.
The test result file returns an error during traceability analysis.
Metric Does Not Report Results for Requirement, Test, or Test Result
If an artifact traces to one of your units but does not appear in the metric results for that unit, depending on the type of artifact, try one of these solutions.
Fix a requirement that does not produce metric results. Check that the requirement directly links to the model with a link where the Type is set to Implements. The dashboard metrics analyze only implemented functional requirements. For each unit, the implemented functional requirements appear in the folder Functional Requirements > Implemented. Upstream requirements appear in the folder Functional Requirements > Upstream, but do not contribute to the metric results because upstream requirements are only indirectly or transitively linked to the implemented requirements.
Fix a test that does not produce metric results. Check that the test directly tests either the entire unit or atomic subsystems in the model. The dashboard metrics analyze only unit tests. For each unit, the unit tests appear in the folder Tests > Unit Tests. Tests that are not unit tests appear in the folder Tests > Others and do not contribute to the metric results because they do not directly test the unit or atomic subsystems in the model. For example, the dashboard does not consider tests on a library or tests on a virtual subsystem to be unit tests.
Fix a test result that does not produce metric results. Check that the results meet these criteria:
The results are the most recent results generated from the tests.
The results are from unit tests which appear in the folder Tests > Unit Tests. Tests in the folder Tests > Others do not contribute to the metric results.
The dashboard can only isolate outdated results to individual test cases or test suites if the test cases or test suites have revision numbers. If a test case or test suite was saved in a release that did not save revision numbers, use the function sltest.testmanager.refreshTestRevisions on the test file to refresh the revision information.
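For example, a minimal sketch, assuming a test file named myTests.mldatx and that sltest.testmanager.refreshTestRevisions accepts the test file path as input:

sltest.testmanager.refreshTestRevisions('myTests.mldatx');  % placeholder test file name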
The coverage metrics do not aggregate coverage for external C code, such as S-functions and C Caller blocks, and the code coverage metrics do not include coverage results for shared code files.
For each unit, the test results that produce metric results appear in the Test Results folder in the subfolders Model, SIL, and PIL. The test results in the folder Test Results > Others do not contribute to the metric results.
Fix a test that does not produce simulation test result analysis metric results. Check that the test is a unit test and produces simulation results. For each unit, the metrics analyze the test results in the folder Test Results > Model. The test results in the folder Test Results > Others are results that are not model, software-in-the-loop (SIL), or processor-in-the-loop (PIL) results, are not from unit tests, or are only reports. The metrics in the Simulation Test Result Analysis section count results from only simulation tests, whereas the metrics in the Test Analysis section count all unit tests.
If a test is not counted in the metrics in the Simulation Test Result Analysis section, check that the test meets these criteria for being a simulation test:
The simulation mode is Normal, Accelerator, or Rapid Accelerator. If the test uses iterations to set a different simulation mode after testing one of these modes, the test is still considered a simulation test.
The test is not a real-time test.
If the test is an equivalence test, the first simulation meets one of the first two criteria.
If the test contains multiple iterations, the test or at least one iteration meets one of the first two criteria.
Metric Results Show Unexpected Model or Code Coverage
Note that the model coverage metrics do not scope coverage to requirements. If you select the Scope coverage results to linked requirements check box in your test results, the dashboard ignores that selection and does not scope the model coverage metrics results that appear in the dashboard. For information on the Scope coverage results to linked requirements option, see Scoping Coverage for Requirements-Based Tests (Simulink Test).
Fix inconsistent model and code coverage from inlined external MATLAB functions. By default, the coverage metrics include external MATLAB function coverage in the overall unit coverage.
If you have external MATLAB functions in your project, either:
Place the coder.inline('never') directive inside the function and use a project label to categorize the M file as a unit.
Place the coder.inline('always') directive inside the function, but do not use a project label to categorize the M file as a unit.
For information on the coder.inline directive, see coder.inline. If possible, avoid using coder.inline('default'). coder.inline('default') uses internal heuristics to determine whether to inline the function, which can produce inconsistent coverage metric results in the dashboard.
Typically, you use a project label to categorize a model as a unit or component in the dashboard. When you add your unit label to an external MATLAB function, the function does not appear in the Project panel, but the dashboard is able to exclude the function coverage from the overall unit coverage. For information on how to use project labels to categorize units and components, see Categorize Models in a Hierarchy as Components or Units.
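For example, here is a minimal sketch of an external MATLAB function that is never inlined; myExternalFilter is a placeholder function name, and you would also apply your unit project label to this file.

function y = myExternalFilter(u)
% Placeholder external MATLAB function called from the design.
% coder.inline('never') prevents inlining so that the coverage for this
% function stays attributable to this file in the generated code.
coder.inline('never');
y = 0.5 * u;
end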
Metric Result Shows a Missing Link or Artifact
The metric results do not count all types of traceability links. If a metric shows that a test or requirement is missing links when you expect it to be linked, try one of these solutions.
Fix missing model coverage in test results. If the model coverage metrics report coverage gaps that you do not expect, re-run the tests and re-collect the metric results for the new test results. The Model Testing Dashboard might show coverage gaps if:
You change the test results file or the coverage filter file after you collect the metrics, including if you re-import the test results file after you make changes.
You collect accumulated coverage results and make changes to the model file after running one or more tests.