
generateReport

Generate report file containing metric results

    Description

    reportFile = generateReport(metricEngine,Dashboard="ModelMaintainability") generates a model maintainability report for the metric engine.

    reportFile = generateReport(metricEngine,Dashboard="ModelUnitTesting") generates a model testing report for the metric engine.

    reportFile = generateReport(metricEngine,Dashboard="ModelUnitSILTesting") generates a SIL code testing report for the metric engine.

    reportFile = generateReport(metricEngine,Dashboard="ProjectModelTesting") generates a project model testing report for the metric engine.

    reportFile = generateReport(___,Name=Value) specifies options using one or more name-value arguments. For example, Type="html-file" generates an HTML report.

    Note that the Fixed-Point Designer™ documentation describes a different function that is also named generateReport (Fixed-Point Designer).


    Examples

    Generate Model Maintainability Report

    Analyze the maintainability of artifacts in a project and generate a report file that contains the results.

    Open the project that you want to analyze. For this example, in the MATLAB® Command Window, enter:

    openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
    openProject("cc_CruiseControl");

    Create a metric.Engine object for the project.

    metric_engine = metric.Engine();

    Create a list of the metric identifiers for the Model Maintainability Dashboard by specifying the dashboard type as "ModelMaintainability".

    metric_ids = getAvailableMetricIds(metric_engine,...
    App="DashboardApp",...
    Dashboard="ModelMaintainability");

    Before you generate the report, collect metric results for the metric engine by using the execute function on the list of metric identifiers.

    execute(metric_engine,metric_ids);

    Generate an HTML report named maintainabilityResults in the current folder, which is the root folder of the project.

    reportLocation = fullfile(pwd,"maintainabilityResults.html");
    generateReport(metric_engine,Dashboard="ModelMaintainability",...
    Type="html-file",Location=reportLocation);

    The report opens automatically. To prevent the report from opening automatically, specify LaunchReport as false when you call the generateReport function.

    To open the table of contents and navigate to results for each unit, click the menu icon in the top-left corner of the report.

    Generate Model Testing Report

    Analyze the testing artifacts in a project and generate a report file that contains the results.

    Open the project that you want to analyze. For this example, in the MATLAB Command Window, enter:

    openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
    openProject("cc_CruiseControl");

    Create a metric.Engine object for the project.

    metric_engine = metric.Engine();

    Update the trace information for metric_engine to ensure that the artifact information is up to date.

    updateArtifacts(metric_engine)

    Create a list of the available metric identifiers for the Model Testing Dashboard by specifying the dashboard type as "ModelUnitTesting".

    metric_ids = getAvailableMetricIds(metric_engine,...
    App="DashboardApp",...
    Dashboard="ModelUnitTesting");

    Before you generate the report, collect metric results for the metric engine by using the execute function on the list of metric identifiers.

    execute(metric_engine,metric_ids);

    Generate a model testing report. By default, the report generates as a PDF in the root folder of the project.

    generateReport(metric_engine,Dashboard="ModelUnitTesting");

    The report opens automatically. To prevent the report from opening automatically, specify LaunchReport as false when you call the generateReport function.

    For each unit in the report, there is an artifact summary table that displays the number of artifacts in the requirements, design, and tests.

    Input Arguments


    metricEngine — Metric engine object
    metric.Engine object

    Metric engine object for which you collected metric results, specified as a metric.Engine object.

    Name-Value Arguments


    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

    Example: Type="html-file"

    LaunchReport — Option to automatically open generated report
    true (default) | false

    Option to automatically open the generated report, specified as true or false.

    Example: false

    Data Types: logical
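
    For example, this sketch generates a report without opening it; it assumes a metric.Engine object, metric_engine, for which you have already collected results:

    ```matlab
    % Generate a model testing report but suppress the automatic launch.
    generateReport(metric_engine,Dashboard="ModelUnitTesting",...
        LaunchReport=false);
    ```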

    Location — Full file name for generated report
    character vector | string scalar

    Full file name for the generated report, specified as a character vector or string scalar. Use Location to specify both the folder and the name of the report file.

    By default, the report name is the dashboard identifier, followed by an underscore, followed by the project name, and the report is generated in the root folder of the project.

    Example: "C:\MyProject\Reports\MyResults.html"
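
    For instance, this sketch writes an HTML report to a specific folder; the path is hypothetical, and metric_engine is assumed to already hold collected results:

    ```matlab
    % Hypothetical output path; replace with a folder on your system.
    reportLocation = fullfile("C:\MyProject\Reports","MyResults.html");
    generateReport(metric_engine,Dashboard="ModelMaintainability",...
        Type="html-file",Location=reportLocation);
    ```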

    Type — File type for generated report
    "pdf" (default) | "html-file"

    File type for the generated report, specified as "pdf" or "html-file".

    Example: html-file

    Output Arguments


    reportFile — Full file name of generated report
    character vector

    Full file name of the generated report, returned as a character vector.
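
    Capturing the return value lets you reopen the report later. This sketch uses the standard MATLAB web function and assumes a metric.Engine object, metric_engine, with collected results:

    ```matlab
    % Capture the full path of the generated report.
    reportFile = generateReport(metric_engine,Dashboard="ModelUnitTesting",...
        LaunchReport=false);
    % Reopen the report later in the MATLAB browser.
    web(reportFile);
    ```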

    Alternative Functionality

    App

    You can use the dashboard user interface to generate a report.

    To open the dashboard user interface, use one of these approaches:

    • In the Command Window, enter:

      modelDesignDashboard
      This command opens the Model Maintainability Dashboard.

    • In the Command Window, enter:

      modelTestingDashboard
      This command opens the Model Testing Dashboard.

    On the dashboard toolstrip, click the Report button. In the Create Metric Result Report dialog box that opens, click Create to generate the report.

    For an example of how to use the dashboard user interface, see Monitor Design Complexity Using Model Maintainability Dashboard.

    Version History

    Introduced in R2021a
