Analyze Code and Test Software-in-the-Loop
Code Analysis and Software-in-the-Loop Testing Overview
You can analyze code to detect errors, check standards compliance, and evaluate key metrics such as length and cyclomatic complexity. For handwritten code, you typically check for run-time errors with static code analysis, run test cases that evaluate the code against requirements, and measure code coverage. Based on the results, you refine the code and add tests.
In this example, you generate code and demonstrate that the code execution produces results equivalent to the model by using the same test cases and baseline results. Then you compare the code coverage to the model coverage. Based on the test results, you add tests and modify the model, then regenerate the code.
Analyze Code for Defects, Metrics, and MISRA C:2012
This workflow describes how to check whether your model produces MISRA® C:2012 compliant code and how to check your generated code for code metrics and defects. To produce more MISRA compliant code from your model, you use the Code Generation Advisor and Model Advisor. To check whether the code is MISRA compliant, you use the Polyspace® MISRA C:2012 checker and report generation capabilities. For this example, you use the model simulinkCruiseErrorAndStandardsExample.
Open the project.
path = fullfile(matlabroot,'toolbox','shared','examples',...
    'verification','src','cruise');
run(fullfile(path,'slVerificationCruiseStart'))
From the project, open the model simulinkCruiseErrorAndStandardsExample.
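If you prefer to work from the MATLAB command line, you can open the model directly once the project is loaded. This is a minimal sketch using the standard open_system function; it assumes the project setup above has already run so the model is on the path.

```matlab
% Open the example model from the command line
% (assumes the project from the previous step is already loaded)
open_system('simulinkCruiseErrorAndStandardsExample')
```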
Run Code Generator Checks
Before you generate code from your model, use the Code Generation Advisor to check your model so that it generates code more compliant with MISRA C and more compatible with Polyspace.
Right-click Compute target speed and select C/C++ Code > Code Generation Advisor.
Select the Code Generation Advisor folder. In the right pane, move Polyspace to Selected objectives - prioritized. The MISRA C:2012 guidelines objective is already selected.
Click Run Selected Checks.
The Code Generation Advisor checks whether the model includes blocks or configuration settings that are not recommended for MISRA C:2012 compliance and Polyspace code analysis. For this model, the check for incompatible blocks passes, but some configuration settings are incompatible with MISRA compliance and Polyspace checking.
Click the check that did not pass. Accept the parameter changes by selecting Modify Parameters.
Rerun the check by selecting Run This Check.
Run Model Advisor Checks
Before you generate code from your model, use the Model Advisor to check your model for MISRA C and Polyspace compliance.
At the bottom of the Code Generation Advisor window, select Model Advisor.
Under the By Task folder, select the Modeling Standards for MISRA C:2012 advisor checks.
Click Run Checks and review the results.
If any of the tasks fail, make the suggested modifications and rerun the checks until the MISRA modeling guidelines pass.
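As an alternative to the menu navigation above, you can open the Model Advisor from the command line with the modeladvisor function, scoped to the subsystem under test. This is a sketch; the check selection still happens in the Model Advisor window.

```matlab
% Open the Model Advisor on the subsystem; then select the
% "Modeling Standards for MISRA C:2012" checks in the UI and run them
modeladvisor('simulinkCruiseErrorAndStandardsExample/Compute target speed')
```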
Generate and Analyze Code
After you have done the model compliance checking, you can generate the code. With Polyspace, you can check your code for compliance with MISRA C:2012 and generate reports to demonstrate compliance with MISRA C:2012.
In the Simulink® editor, right-click Compute target speed and select C/C++ Code > Build This Subsystem.
Use the default settings for the tunable parameters and select Build.
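The build step can also be started programmatically. The sketch below uses slbuild on the subsystem path, which is an approximate command-line equivalent of Build This Subsystem; unlike the menu command, it does not prompt for tunable-parameter settings, so it assumes the defaults are acceptable.

```matlab
% Generate code for the subsystem from the command line
% (approximate equivalent of C/C++ Code > Build This Subsystem,
% assuming default tunable-parameter settings)
slbuild('simulinkCruiseErrorAndStandardsExample/Compute target speed')
```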
After the code is generated, in the Simulink Editor, right-click Compute target speed and select Polyspace > Options.
Click Configure to choose more advanced Polyspace analysis options in the Polyspace configuration window.
On the left pane, click Coding Standards & Code Metrics, then select Calculate Code Metrics to enable code metric calculations for your generated code.
Save and close the Polyspace configuration window.
From your model, right-click Compute target speed and select Polyspace > Verify > Code Generated For Selected Subsystem.
Polyspace Bug Finder™ analyzes the generated code for a subset of MISRA checks. You can see the progress of the analysis in the MATLAB® Command Window. After the analysis finishes, the Polyspace environment opens.
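The steps above can also be driven from the command line through the Polyspace-Simulink integration. The sketch below uses pslinkoptions and pslinkrun; exact option fields can vary by release, so treat the property name here as illustrative and check your installed documentation.

```matlab
% Sketch: configure and launch the Polyspace analysis programmatically
% (requires Polyspace Bug Finder and its Simulink integration)
opts = pslinkoptions('simulinkCruiseErrorAndStandardsExample');
opts.VerificationMode = 'BugFinder';   % analyze generated code with Bug Finder
pslinkrun('simulinkCruiseErrorAndStandardsExample', opts)
```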
The Polyspace environment shows you the results of the static code analysis.
Expand the tree for rule 8.7 and click through the different results.
Rule 8.7 states that functions and objects should not be defined with external linkage if they are referenced in only one translation unit. As you click through the 8.7 violations, you can see that these results refer to variables that other components also use, such as CruiseOnOff. You can annotate your code or your model to justify every result. Because this model is a unit in a larger program, you can also change the configuration of the analysis to check only a subset of MISRA rules.
In your model, right-click Compute target speed and select Polyspace > Options.
Set the Settings from option to Project configuration to choose a subset of MISRA rules in the Polyspace configuration.
In the Polyspace window, on the left pane, click Coding Standards & Code Metrics. Then select Check MISRA C:2012 and, from the drop-down list, select single-unit-rules. Now Polyspace checks only the MISRA C:2012 rules that are applicable to a single unit.
Save and close the Polyspace configuration window.
Rerun the analysis with the new configuration.
Polyspace reported the earlier violations because it analyzed the model as a standalone program. With the checks limited to the single-unit subset, Polyspace finds only two violations.
When you integrate this model with its parent model, you can add the rest of the MISRA C:2012 rules.
To demonstrate compliance with MISRA C:2012 and report on your generated code metrics, you must export your results. If you want to generate a report every time you run an analysis, see Generate report (Polyspace Bug Finder).
If they are not open already, open your results in the Polyspace environment.
From the toolbar, select Reporting > Run Report.
Select BugFinderSummary as your report type.
Click Run Report.
The report is saved in the same folder as your results.
To open the report, select Reporting > Open Report.
Test Code Against Model Using Software-in-the-Loop Testing
You previously showed that the model functionality meets its requirements by running test cases based on those requirements. Now run the same test cases on the generated code to show that the code produces equivalent results and fulfills the requirements. Then compare the code coverage to the model coverage to see the extent to which the tests exercised the generated code.
In MATLAB, in the project window, open the tests folder, then open SILTests.mldatx. The file opens in the Test Manager.
Review the test case. On the Test Browser pane, navigate to SIL Equivalence Test Case. This equivalence test case runs two simulations for the simulinkCruiseErrorAndStandardsExample model using a test harness.
Simulation 1 is a model simulation in normal mode.
Simulation 2 is a software-in-the-loop (SIL) simulation. For the SIL simulation, the test case runs the code generated from the model instead of running the model.
The equivalence test logs one output signal and compares the results from the simulations. The test case also collects coverage measurements for both simulations.
Run the equivalence test. Select the test case and click Run.
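The test can also be loaded and run from the command line through the Simulink Test programmatic interface. A minimal sketch, assuming the current folder (or path) contains the test file:

```matlab
% Load the test file and run it outside the Test Manager UI
% (requires Simulink Test)
tf = sltest.testmanager.load('SILTests.mldatx');
results = sltest.testmanager.run;   % returns a results set object
```

After the run finishes, you can inspect the returned results set programmatically or review the same run in the Test Manager, as described in the next step.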
Review the results in the Test Manager. In the Results and Artifacts pane, select SIL Equivalence Test Case to see the test results. The test case passed and the results show that the code produced the same results as the model for this test case.
Expand the Coverage Results section of the results. The coverage measurements show the extent to which the test case exercised the model and the code. When you run multiple test cases, you can view aggregated coverage measurements in the results for the whole run. Use the coverage results to add tests and meet coverage requirements, as shown in Perform Functional Testing and Analyze Test Coverage (Simulink Check).
You can also test the generated code on your target hardware by running a processor-in-the-loop (PIL) simulation. By adding a PIL simulation to your test cases, you can compare the test results and coverage results from your model to the results from the generated code as it runs on the target hardware. For more information, see Code Verification Through Software-in-the-Loop and Processor-in-the-Loop Execution (Embedded Coder).