
Resolve Missing Artifacts, Links, and Results in the Model Testing Dashboard

Issue

The Model Testing Dashboard analyzes artifacts—models, requirements, tests, and results—that are part of the requirements-based testing workflow for software unit models. If an artifact or a link between artifacts is not part of the requirements-based testing workflow, it might not appear in the Model Testing Dashboard or contribute to the analysis results. Additionally, some artifacts and links are not supported by the Model Testing Dashboard. If you expect a link or artifact to appear in the dashboard and it does not, try one of these solutions.

Possible Solutions

Try these solutions when you begin troubleshooting artifacts in the Model Testing Dashboard:

  • Save changes to your artifact files.

  • Check that your artifacts are saved in the project. The Model Testing Dashboard does not analyze files that are not saved in the project.

  • Check that your artifacts are not in a referenced project. The Model Testing Dashboard does not analyze files in referenced projects.

  • Check that your artifacts are on the MATLAB search path before you open the dashboard. The traceability information in the Artifacts pane does not update when you change the MATLAB search path, so do not change the search path while the dashboard is open.

  • Open the Diagnostics pane and address errors or warnings.

  • Use the dashboard to re-trace the artifacts and re-collect metric results. A programmatic sketch of this step follows this list.
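
If you prefer to refresh the analysis from the command line, the following is a minimal sketch using the metric.Engine API. It assumes the project is already open, that Simulink Check is installed, and that collecting every available metric (rather than a specific subset) is acceptable.

    % Re-trace the project artifacts and re-collect metric results
    % programmatically. Requires an open project and a Simulink Check license.
    metric_engine = metric.Engine();

    % Refresh the traceability information for the artifacts in the project.
    updateArtifacts(metric_engine);

    % Collect results for all available metrics.
    execute(metric_engine, getAvailableMetricIds(metric_engine));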

Depending on the type of artifact or analysis issue that you are troubleshooting, try one of these solutions.

Project Requires Analysis by the Dashboard

Before you begin testing your models, open the Model Testing Dashboard and run the initial traceability analysis. Open the project and, on the Project tab, click Model Testing Dashboard. Click Trace Artifacts to trace the artifacts or click Trace and Collect All to trace the artifacts and collect metric results. The dashboard must perform this initial analysis to establish the traceability data before it can monitor the artifacts. If you do not run the initial artifact analysis, the artifacts in the project appear in the Unanalyzed folder in the Artifacts pane.
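
For example, you can open the project and the dashboard from the command line; this is a minimal sketch in which the project file name is hypothetical and the modelTestingDashboard command assumes Simulink Check is installed.

    % Open the project, then open the Model Testing Dashboard for it
    % (the project file name is hypothetical).
    openProject("MyModelTestingProject.prj");
    modelTestingDashboard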

Check that the project is not empty when you analyze it.  Before you run the initial artifact analysis, save at least one model in the project. If you analyze the project when it is empty, the Model Testing Dashboard does not establish the initial traceability data and you must analyze the project again before the dashboard starts monitoring the artifacts.

Incorrect List of Models in Artifacts Pane

The Artifacts pane shows the models in your project that are either unit models or component models. Models are organized under the components that reference them according to the model reference hierarchy. If the list of unit and component models does not show the expected hierarchy of your models, try one of these solutions.

Check that your unit and component models are labeled.  Label the unit and component models in your project and configure the Model Testing Dashboard to recognize the labeled models. Note that if a unit model references one or more other models, the referenced models appear in the Design folder under the unit model. For more information about labeling models and configuring the dashboard, see Categorize Models in a Hierarchy as Components or Units. If you have observer models, check that they are not labeled as units; the dashboard includes observer models as units if they match the label criteria.
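
As a sketch, you can apply labels with the project API. The category name "Classification", the label names, and the file path below are placeholders; you still need to point the dashboard configuration at whichever category and labels you choose.

    proj = currentProject;

    % Create a label category and labels for classifying models. The names
    % below are placeholders; use the category and labels that your
    % dashboard configuration expects.
    category = createCategory(proj, "Classification");
    createLabel(category, "Unit");
    createLabel(category, "Component");

    % Label a unit model (the file path is hypothetical).
    unitFile = findFile(proj, "models/myUnitModel.slx");
    addLabel(unitFile, "Classification", "Unit");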

Check that your model was saved in a supported release.  The Model Testing Dashboard does not support models that were saved in releases before R2012b.

Block Skipped during Artifact Analysis

Check that your custom libraries do not contain blocks with self-modifiable masks. The Model Testing Dashboard does not analyze blocks that contain self-modifiable masks. Self-modifiable masks can change the structural content of a block, which is incompatible with artifact traceability analysis.

Library Missing from Artifacts Pane

Check that the library does not use a library forwarding table. The Model Testing Dashboard does not support library forwarding tables.

Requirement Missing from Artifacts Pane

If a requirement is missing from the Artifacts pane, try one of these solutions.

Check that the requirement is a functional requirement.  Verify that the requirement is configured as a functional requirement. In the Requirements Editor, click the requirement in the left pane. In the Properties section of the right pane, set Type to Functional. Because the Model Testing Dashboard reports on requirements-based unit testing, only functional requirements appear in the Artifacts pane and are analyzed by the dashboard.
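
You can also make this change programmatically. The sketch below uses hypothetical file and summary names and assumes that the Type property of slreq.Requirement objects is settable in your release.

    % Load the requirement set and set the requirement type to Functional
    % (the file name and summary text are hypothetical).
    reqSet = slreq.load("myRequirements.slreqx");
    req = find(reqSet, "Type", "Requirement", "Summary", "Control cruise speed");
    req.Type = "Functional";
    save(reqSet);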

Check that the requirement is saved in a supported requirements file.  Verify that the requirement is saved in a requirements file that has the .slreqx extension.

Test Case Missing from Artifacts Pane

Check that the test case is supported by the Model Testing Dashboard. The Model Testing Dashboard does not support MATLAB-based Simulink tests.

Test Result Missing from Artifacts Pane

Check that either:

  • The result is saved in a test results file. Save test results by exporting them from the Test Manager, as shown in the sketch after this list.

  • You collected the results during the current project session and have not closed them. When you collect test results and do not export them, the dashboard recognizes the temporary results in the Test Manager, denoted by the temporary test results icon. The dashboard stops recognizing the temporary results when you close the project, close the test results set, or export the test results to a results file.
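
As a sketch of the export step mentioned in the first item, assuming a Simulink Test file and an output path that are both hypothetical:

    % Load the test file, run it, and export the results to a results file
    % that the dashboard can trace (file paths are hypothetical).
    sltest.testmanager.load("tests/myUnitTests.mldatx");
    resultObj = sltest.testmanager.run;
    sltest.testmanager.exportResults(resultObj, "results/myUnitResults.mldatx");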

Artifact Returns a Warning

Check the details of the warning by opening the Diagnostics pane.

Artifact Returns an Error

Check the details of the error by opening the Diagnostics pane.

Untraced Artifacts

If an artifact appears in the Untraced folder when you expect it to trace to a unit model, depending on the type of artifact that is untraced, try one of these solutions.

Fix an untraced requirement.  Check that the requirement traces to the unit model using an implementation link. The requirement and its links must meet one of these criteria (a sketch for creating an implementation link follows these criteria):

  • The requirement is linked to the model or to a library subsystem used by the model with a link where the Type is set to Implements.

  • The requirement is the child of a container requirement that is linked to the model or to a library subsystem used by the model with a link where the Type is set to Implements.

  • The requirement traces to the model through a combination of the previous two criteria. For example, a requirement that is under a container requirement that links to another requirement, which links to the model.

Requirements-based testing verifies that your model fulfills the functional requirements that it implements. Because the Model Testing Dashboard reports on requirements-based testing quality, it analyzes only requirements that are specified as functional requirements and implemented in the model.
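
A sketch of creating such a link programmatically is shown below. The model name, requirements file, and requirement summary are placeholders; passing the model name as the link source and setting the link Type explicitly are assumptions, so adjust for your setup (for example, you can pass a block path such as gcb instead).

    % Create an implementation link from the unit model to a requirement
    % (the model name, file name, and summary text are hypothetical).
    reqSet = slreq.load("myRequirements.slreqx");
    req = find(reqSet, "Type", "Requirement", "Summary", "Control cruise speed");

    load_system("myUnitModel");
    link = slreq.createLink("myUnitModel", req);  % model is the link source
    link.Type = "Implement";                      % make it an implementation link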

Check that the requirement does not use an unsupported link. The Model Testing Dashboard does not trace these links:

  • Downstream links. The Model Testing Dashboard traces only links from lower-level requirements to higher-level requirements. Check the direction of the link by using the Requirements Editor.

  • Embedded links, which are requirements files that are saved directly in the model file.

  • Links to requirements that are saved externally and linked using the Requirements Management Interface (RMI).

  • Links to and from data dictionaries.

  • Links to MATLAB code files.

  • Links to MATLAB Function blocks if you do not have a Stateflow® license. Analyzing MATLAB Function blocks requires a Stateflow license.

  • Links to some Stateflow elements.

  • Links in deprecated requirement files, which have the extension .req. To analyze requirement links in the dashboard, save the links in an .slmx file or create them in the requirements file (.slreqx) that contains the requirements.

  • Links to System Composer™ architecture models.

  • Symbolic file links in a project, such as shortcuts.

  • Links to modeling elements that are not supported by the Model Testing Dashboard, such as library forwarding tables.

Fix an untraced design artifact.  Check that the design artifact does not rely on a model callback to be linked with the model. The Model Testing Dashboard does not execute model loading callbacks when it loads the models for analysis. If a model relies on a callback to link a data dictionary, the data dictionary will not be linked when the dashboard runs the traceability analysis.
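
For example, instead of attaching the dictionary in a loading callback, you might link it explicitly in the model configuration; the model and dictionary names in this sketch are hypothetical.

    % Attach the data dictionary to the model explicitly so the link does
    % not depend on a loading callback (names are hypothetical).
    load_system("myUnitModel");
    set_param("myUnitModel", "DataDictionary", "myUnitData.sldd");
    save_system("myUnitModel");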

Fix an untraced test case.  Check that the test case runs on the model or runs on a subsystem in the model by using a test harness.

Fix an untraced test result.  Check that the project and test case are set up correctly and re-run your tests. If one of these conditions is met when you run your test case, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the unit:

  • No project is loaded.

  • The dashboard was not opened at least once for the project. If the dashboard has not established the initial traceability data when you run test cases, the dashboard cannot trace the generated results to the test cases that generated them.

  • You do not have a Simulink® Check™ license.

  • The test file is stored outside the project.

  • The test file has unsaved changes.

  • The tested model has unsaved changes.

  • The test file returns an error during traceability analysis.

  • The tested model returns an error during traceability analysis.

  • The test result comes from a test case that is not supported by the Model Testing Dashboard, such as a MATLAB-based Simulink test.

Check that the results and environment are set up correctly and re-export your test results. If one of these conditions is met when you export your test results, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the unit (see the prerequisite checks sketched after this list):

  • No project is loaded.

  • The dashboard was not opened at least once for the project.

  • You do not have a Simulink Check license.

  • The test result file returns an error during traceability analysis.
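
A short sketch of checking some of these prerequisites before re-running or re-exporting tests is shown below. The license feature name and the model name are assumptions.

    % Confirm that a project is loaded before running or exporting tests.
    assert(~isempty(matlab.project.rootProject), "Open the project first.");

    % Check for a Simulink Check license (the feature name is an assumption).
    assert(license('test', 'Simulink_Check') == 1, ...
        "A Simulink Check license is required for traceability analysis.");

    % Save pending changes in the tested model (the model name is hypothetical).
    if bdIsLoaded("myUnitModel") && bdIsDirty("myUnitModel")
        save_system("myUnitModel");
    end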

Metric Does Not Report Results for Requirement, Test Case, or Test Result

If an artifact traces to one of your unit models but does not appear in the metric results for that unit, depending on the type of artifact, try one of these solutions.

Fix a requirement that does not produce metric results.  Check that the requirement directly links to the model with a link where the Type is set to Implements. A requirement that traces to the model indirectly appears in the Artifacts pane but is not analyzed by the dashboard metrics because the metrics analyze only requirements that are directly implemented by the model.

Fix a test case that does not produce metric results.  Check that the test case runs on the whole model. Some test cases that trace to the model are not analyzed by the dashboard metrics because the metrics analyze only test cases that test the whole model. For example, tests that run on subsystem test harnesses trace to the unit model and appear in the Artifacts pane, but do not contribute to the metric results.

Fix a test result that does not produce metric results.  Check that the results meet these criteria:

  • The results are the most recent results generated from the test cases.

  • The results are from test cases that run on the whole model. For example, the metrics do not analyze test results from a test case that runs on a subsystem test harness.

If a test case includes multiple iterations, the metric results reflect the status of the whole test case and do not show individual iteration results.

Fix a test that does not produce test result analysis metric results.  The metrics in the Simulation Test Result Analysis section count results from only simulation tests, whereas the metrics in the Test Case Analysis section count all tests that run the whole model. If a test is not counted in the metrics in the Simulation Test Result Analysis section, check that the test case meets these criteria for being a simulation test:

  • The simulation mode is Normal, Accelerator, or Rapid Accelerator. If the test uses iterations to set a different simulation mode after testing one of these modes, the test is still considered a simulation test.

  • The test is not a real-time test.

  • If the test is an equivalence test, the first simulation meets one of the first two criteria.

  • If the test contains multiple iterations, the test case or at least one iteration meets one of the first two criteria.

Metric Result Shows a Missing Link or Artifact

The metric results do not count all types of traceability links. If a metric shows that a test case or requirement is missing links when you expect it to be linked, try one of these solutions.

Fix a link that is not counted in traceability results.  If there is a link between a requirement and a test case, but the traceability metrics show that the test or requirement is unlinked, check if the link is supported by the dashboard metrics. The metrics do not support these links:

  • A requirement link to a justification. If a requirement is linked with a justification and not linked to a test case, it appears as unlinked in the metric results.

  • A requirement link to an internal test harness.

Fix missing model coverage in test results.  If the model coverage metrics report coverage gaps that you do not expect, re-run the test cases and re-collect the metric results for the new test results. The Model Testing Dashboard might show coverage gaps if:

  • You change the test results file or the coverage filter file after you collect the metrics, including if you re-import the test results file after you make changes.

  • You collect accumulated coverage results and make changes to the model file after running one or more tests.
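
If you script this workflow, the sketch below re-runs the tests, exports fresh results, and re-collects metric results. The file path is a placeholder, and the metric selection collects every available metric rather than only the coverage metrics.

    % Re-run the tests and export fresh results (the file path is hypothetical).
    resultObj = sltest.testmanager.run;
    sltest.testmanager.exportResults(resultObj, "results/latestResults.mldatx");

    % Re-collect metric results so the coverage metrics analyze the new results.
    metric_engine = metric.Engine();
    execute(metric_engine, getAvailableMetricIds(metric_engine));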
