
Use Specification Models for Requirements-Based Testing

Since R2022b

This example shows how to use a specification model to model and test formal requirements on a model of an aircraft autopilot controller. The specification model uses two Requirements Table blocks to model the required inputs and outputs of the aircraft autopilot controller model. You analyze the Requirements Table block for modeling problems, generate tests from the specification model, and then run those tests on the aircraft autopilot controller model. The model that you test is the design model.

For more information on how to define and configure Requirements Table blocks, see Use a Requirements Table Block to Create Formal Requirements and Configure Properties of Formal Requirements.

View the High-Level Requirements

Open the requirements set, AP_Controller_Reqs, in the Requirements Editor.

slreq.open("AP_Controller_Reqs");

The high-level requirements specify the outputs of the model and the autopilot controller mode. Each requirement description uses high-level language that you can use to explicitly define the logic needed in the formal requirements.

This image shows the requirements set, AP_Controller_Reqs, in the Requirements Editor.

View the First Iteration of the Specification Model

Open the specification model, spec_model_partial.

spec_model = "spec_model_partial";
open_system(spec_model); 

The model contains two Requirements Table blocks that define the formal requirements that translate the high-level requirements into testable logical expressions. The block AP_Mode_Determination specifies the formal requirements for the autopilot controller mode, and the block Cmd_Determination specifies the outputs of the controller.

This image shows the specification model, spec_model_partial. The model has two Requirements Table blocks that are connected together.

To view the formal requirements, inspect each Requirements Table block.

Requirements Table Block for Controller Mode

Open AP_Mode_Determination. The block specifies the formal requirements for the autopilot controller mode. To determine the output data Mode, AP_Mode_Determination specifies three requirements by using two input data:

  • AP_Engage_Switch — The autopilot engage switch

  • HDG_Engage_Switch — The heading engage switch

Each requirement uses a combination of the inputs to specify a unique output value for Mode.

This image shows the table for AP_Mode_Determination. The block specifies the formal requirements for the autopilot controller mode.
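For orientation only, the logic that these three requirements are assumed to encode can be sketched as an ordinary MATLAB function. The mode names and the exact precondition structure are illustrative assumptions, not values taken from the block.

function Mode = apModeSketch(AP_Engage_Switch, HDG_Engage_Switch)
% Illustrative sketch of the three mode requirements; mode names are assumed
if ~AP_Engage_Switch
    Mode = "OFF";        % requirement 1: autopilot mode is OFF
elseif ~HDG_Engage_Switch
    Mode = "ROLL_HOLD";  % autopilot engaged, heading switch off
else
    Mode = "HDG_HOLD";   % both switches engaged
end
end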

Requirements Table Block for Controller Commands

Open Cmd_Determination. Cmd_Determination specifies the requirements for the aileron command and roll reference command. Cmd_Determination uses four input data:

  • Mode — The AP_Mode_Determination output, Mode

  • Roll_Ref_TK — The setting of the roll reference target knob

  • Roll_Angle_Phi — The actual aircraft roll angle

  • HDG_Ref_TK — The setting of the heading reference target knob

The block uses these input data to determine the controller output data:

  • Roll_Ref_Cmd — Roll reference command

  • Ail_Cmd — Aileron command

This image shows the table for Cmd_Determination. Cmd_Determination specifies the requirements for the aileron command and roll reference command.

In this example, the expressions use constant data to define the ranges of values for Roll_Ref_TK and Roll_Angle_Phi. You can also parameterize the values or use literal values. See Define Data in Requirements Table Blocks. To view these values, open the Symbols pane. In the Modeling tab, in the Design Data section, click Symbols Pane.

In addition to requirements, Cmd_Determination also defines the assumptions for the design. See Add Assumptions to Requirements. In this example, the assumptions constrain the values of the roll angle and the roll reference target knob based on their physical limitations. The roll angle cannot exceed 180 degrees or fall below -180 degrees, and the roll reference target knob cannot exceed 30 degrees or fall below -30 degrees. To view the assumptions, click the Assumptions tab in the table.

This image shows the assumptions used in Cmd_Determination.

You can also specify data range limitations in the Minimum and Maximum properties of the data, or explicitly restrict the signal range with blocks in the model.
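As a rough, runnable illustration of these range constraints, assuming both inputs are scalar doubles in degrees:

function checkPhysicalLimits(Roll_Angle_Phi, Roll_Ref_TK)
% Illustrative equivalent of the assumptions; mirrors, not replaces, the block
assert(Roll_Angle_Phi >= -180 && Roll_Angle_Phi <= 180, ...
    "Roll angle must stay within [-180, 180] degrees");
assert(Roll_Ref_TK >= -30 && Roll_Ref_TK <= 30, ...
    "Roll reference target knob must stay within [-30, 30] degrees");
end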

Analyze the Requirements Table

Next, analyze the Requirements Table block. As a best practice, confirm that the formal requirements are complete, consistent, and correspond to the high-level requirements before you generate tests from the specification model.

In the Analyze section, click Analyze Table. See Analyze Requirements Table Blocks for Modeling Problems.

The analysis detects an incompleteness issue.

The requirement set in Cmd_Determination is missing the formal requirement that corresponds to the third bullet of requirement 3.

This image shows the description for the 3rd high-level requirement. The missing requirement from Cmd_Determination is in the red box.

Open Cmd_Determination in the model spec_model_final to view the updated requirement set. The additional requirement has the index 2.2.4.

spec_model = "spec_model_final";
load_system(spec_model);
open_system(spec_model + "/Cmd_Determination");

This image shows the updated formal requirements of Cmd_Determination. The new requirement is in a red box.

Link High-Level and Formal Requirements

Loading the specification model loads the formal requirements in the Requirements Editor. Closing the specification model also closes the associated requirement set. When developing your formal requirements, link each formal requirement to the corresponding high-level requirement so that you can trace the requirements in the specification model. In this example, linking the requirements does not affect test generation or test results.

To link the first formal requirement to the corresponding high-level requirement:

  1. In spec_model_final, expand the requirement set named Table1.

  2. Right-click the formal requirement with the Index of 1 and select Select for Linking with Requirement.

  3. Expand the AP_Controller_Reqs requirement set.

  4. Right-click the requirement with an ID of 1 and click Create a link from "1: Autopilot mode is OFF" to "1: High Level: Autopilot Con...".

The link type defaults to Related to. For more information on link types, see Link Types.
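If you prefer to script the link instead, a minimal sketch using the Requirements Toolbox API follows. The property filters, and whether the formal requirement from the Requirements Table block can be located this way in your release, are assumptions; adjust the names to match what the Requirements Editor shows.

% Sketch of creating the same link programmatically (filter values are assumptions)
hlSet  = slreq.find("Type","ReqSet","Name","AP_Controller_Reqs");
hlReq  = find(hlSet,"Id","1");                 % high-level requirement 1
fmlReq = slreq.find("Type","Requirement","Summary","Autopilot mode is OFF");
slreq.createLink(fmlReq, hlReq);               % link type defaults to "Related to"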

Generate Tests from the Updated Model

Simulink® Design Verifier™ automatically creates test objectives from the requirements defined in Requirements Table blocks. To generate tests, use the Configuration Parameters window or specify the tests programmatically. See Model Coverage Objectives for Test Generation (Simulink Design Verifier). Choose the coverage objective based on whether you want to minimize the number of generated tests or improve test granularity and traceability.
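If you want finer-grained tests, you could request MCDC objectives instead; valid ModelCoverageObjectives values include "Decision", "ConditionDecision", and "MCDC". For example:

% Request MCDC objectives for finer-grained tests (more tests, better traceability)
optsMCDC = sldvoptions;
optsMCDC.Mode = "TestGeneration";
optsMCDC.ModelCoverageObjectives = "MCDC";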

In this example, generate tests with decision coverage and save the output to a MAT-file.

opts = sldvoptions; 
opts.Mode = "TestGeneration"; 
opts.ModelCoverageObjectives = "Decision"; 
[~,files] = sldvrun(spec_model,opts,true);

Simulink Design Verifier generates the test objectives and the tests from the requirements. In this version of the specification model, the test objectives are satisfied.

This image shows the results of generating the tests on the updated specification model. The test objectives are satisfied.
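To inspect the generated test cases without opening the Simulink Design Verifier results window, you can load the saved MAT-file. The field access below reflects the documented sldvData structure and is intended as a sketch.

% Load the generated data and count the test cases (assumes the standard
% sldvData structure saved by sldvrun)
S = load(files.DataFile);
numTestCases = numel(S.sldvData.TestCases)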

Run the Tests on the Design Model

After you create tests that satisfy the test objectives, you can run the tests on the design model. In this example, the design model is the model for the aircraft autopilot controller, sldvexRollApController.

Before you run tests on the design model, you must interface the specification model with the design model. Typically, the specification model does not produce or use the same signals as the design model. These differences can be simple or abstract. For example, the design model might use different input and output signal types than the specification model, or you may want to compare a scalar output from the design model against a range in the specification model. As a result, you need to construct an interface between the design model and the specification model.

Interface the Design Model with the Specification Model

In this example, the inputs of the specification model spec_model_final and the design model sldvexRollApController can interface directly, but one of the outputs differs. spec_model_final represents the aileron command as a range of values, but the aileron command produced by sldvexRollApController is a scalar double. The interface uses a MATLAB Function block to compare the aileron command values and then verifies both outputs with Assertion blocks. Open the model, spec_model_test_interface, to view the interface.

test_interface = "spec_model_test_interface";
open_system(test_interface);

This image shows the model, spec_model_test_interface.

The MATLAB Function block compares the two signals by using this code:

function y = fcn(design_val, spec_val)
% Compare the scalar aileron command from the design model (design_val)
% against the enumerated specification value (spec_val)
switch spec_val
    case Ail_Cmd.All
        y = true;                  % any aileron command satisfies the requirement
    case Ail_Cmd.Zero
        y = (design_val == 0);     % aileron command must be exactly zero
    otherwise
        y = false;
end

Run the Updated Tests on the Design Model

To test and verify the design model, create a harness model that contains:

  • The specification model

  • The design model

  • The test interface and verification model

In the harness model, connect the models together. Then run the tests on the design model and verify that the outputs correspond to the requirements in the harness model.

To view the harness model, open the model, sldvexDesignHarnessFinal.

harness_model = "sldvexDesignHarnessFinal";
open_system(harness_model);

As with the interface model, not all design model inputs directly correspond to specification model inputs. In this example, the harness model prepares the design model for testing with the five inputs specified by the specification model.

This image shows the harness model, sldvexDesignHarnessFinal.

Run the updated tests on the design model from within the harness model. Use the sldvruntest (Simulink Design Verifier) function to run the tests and save the results. If you have Simulink Coverage™, you can view the results of the tests from the output of sldvruntest in a coverage report. View the coverage report by using the cvhtml (Simulink Coverage) function.

cvopts = sldvruntestopts;
cvopts.coverageEnabled = true;
[finalData, finalCov] = sldvruntest(...
    harness_model,files.DataFile,cvopts);
cvhtml("finalCov",finalCov);

In the coverage report, click the sldvexRollApController link. The summary shows that full coverage is achieved in the design model, sldvexRollApController.

This image shows the coverage report after running the tests from within the harness model. The report shows that full coverage is achieved on the design model.
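If you want the coverage numbers without opening the HTML report, the sketch below aggregates the per-test cvdata results and queries decision coverage. It assumes finalCov is an array of cvdata objects, as returned by sldvruntest with coverage enabled.

% Combine per-test coverage data and compute the decision coverage percentage
aggCov = finalCov(1);
for k = 2:numel(finalCov)
    aggCov = aggCov + finalCov(k);    % cvdata objects can be combined with +
end
decCov = decisioninfo(aggCov, harness_model);   % [covered total] decision outcomes
decisionPct = 100 * decCov(1) / decCov(2)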

bdclose("all");
slreq.clear;
