# Testing Related Release Reporting

## Background

As a product is released to production, one of the activities that demonstrates our commitment to quality to users and regulatory bodies is the creation of test reports aligned with the FDA's 21 CFR Part 820, ISO 13485, and ISO 9001. More information can be found in DOC000689 Design Controls: Top Level Procedure.

Ensuring product quality has two main parts. One is validation, which ensures the system is built with the expected functionality; the other is verification, which ensures that functionality was implemented correctly. An example to illustrate the difference: suppose our system includes login functionality. Validation ensures that there is a way to log in to the system, whereas verification confirms that the user receives an error message when a login attempt fails.

For more information about how this is done at Medtronic, review DOC000560 Design Controls: Design Validation and DOC000663 Design Controls: Design Verification.

## Types of Reports

  1. Test Run Results Report is a list of all test runs, including the test that was executed, whether it passed or failed, and the environment in which the test was run.
  2. Test to Requirement Traceability Report maps each of the requirements (or use cases or user stories) for a given product to the associated tests. This report identifies any gaps where requirements have not been tested.
  3. Test Protocol Report is a list of every test for a product, including a reference to the automation class and function, or all the steps of the test if it is executed manually.
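
To make the traceability gap in report 2 concrete, here is a minimal sketch in Python. The function name and the requirement/test IDs are hypothetical, purely for illustration; they do not come from any tool named in this document.

```python
def find_untested_requirements(requirement_ids, tests):
    """Identify traceability gaps: requirements with no associated test.

    requirement_ids: iterable of requirement (or user story) IDs.
    tests: mapping of test case ID -> list of requirement IDs it covers.
    """
    covered = {req for reqs in tests.values() for req in reqs}
    return sorted(r for r in requirement_ids if r not in covered)

# Hypothetical sample data for illustration only.
gaps = find_untested_requirements(
    ["REQ-1", "REQ-2", "REQ-3"],
    {"TC-10": ["REQ-1"], "TC-11": ["REQ-1", "REQ-3"]},
)
print(gaps)  # ['REQ-2'] -> REQ-2 has not been tested
```

Any requirement ID returned here would appear as a gap row in the Test to Requirement Traceability Report.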

## Populating Test Reports - Today

Today many teams use a combination of Octopus and Test Suites (made up of Test Plans and Test Cases) to run tests and place the builds on the TFS Build Shelf. Since the URL of the Build Shelf is known, the test case can pull the build down to the test agent and run the tests.

The testing tools used today, along with a bit of automation, enable tracking of the information required for the reports: test runs (which are made up of test plans, suites, and cases), the results of each test, which requirement or user story the test covers, the test agent used, and (with a little work) information about the build and the automation of the test runs.
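
One way to picture the information tracked per test run is as a single record type. This is a hypothetical sketch; the field names are ours for illustration, not fields defined by TFS or any other tool above.

```python
from dataclasses import dataclass

@dataclass
class TestRunRecord:
    """One row of report data per test run (hypothetical field names)."""
    test_case_id: str    # the test that was executed
    test_plan: str       # plan the run belongs to
    test_suite: str      # suite the run belongs to
    outcome: str         # e.g. "Passed" or "Failed"
    requirement_id: str  # requirement or user story the test covers
    test_agent: str      # agent/environment the test ran on
    build_number: str    # build pulled from the Build Shelf
```

A list of such records is enough to populate all three report types described above.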

It should be noted that many teams use different tools and create these reports in slightly different ways. Some teams use third-party tools, such as Selenium and Enterprise Architect, for requirements, test suites, or both. Often the information for a report is gathered and/or reformatted manually before being uploaded to MAP Agile for formal documentation routing.

## Populating the Reports - Future Direction

As we get up and running with Azure DevOps, much of the testing-related data will likely be gathered in a similar fashion as today. One of the easiest ways to pull data from Azure DevOps is to create work item queries that can be exported into Excel as a first step in report creation, as described here.
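
For teams that prefer scripting the query step, the same flat work item query can be issued against the Azure DevOps WIQL endpoint (`POST .../_apis/wit/wiql`). The sketch below uses only the Python standard library; the organization, project, and personal access token values are placeholders you would supply, and the WIQL text is just an example query.

```python
import base64
import json
import urllib.request

def build_wiql_request(org, project):
    """Build the URL and body for a flat WIQL work item query."""
    url = f"https://dev.azure.com/{org}/{project}/_apis/wit/wiql?api-version=6.0"
    wiql = (
        "SELECT [System.Id], [System.Title], [System.State] "
        "FROM WorkItems "
        "WHERE [System.WorkItemType] = 'Test Case' "
        "ORDER BY [System.Id]"
    )
    return url, {"query": wiql}

def run_query(org, project, pat):
    """POST the query, authenticating with a personal access token (PAT)."""
    url, body = build_wiql_request(org, project)
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["workItems"]
```

The returned work item references can then be exported to Excel or fed into whatever report-assembly step a team already uses.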

As we learn more about what information is important to product teams and what is common across teams, we can capture this data from Azure Pipelines, put it into a data store of some sort and present it in Power BI reports. Another approach is to use the Azure DevOps Services REST API to retrieve this information.
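
As an illustration of the REST API approach, the Test Runs list endpoint (`GET .../_apis/test/runs`) returns run summaries that can be flattened into Test Run Results Report rows. A minimal sketch, assuming the documented response shape; the sample response data is made up for illustration.

```python
def summarize_test_runs(response):
    """Flatten a parsed Test Runs list response into report rows."""
    rows = []
    for run in response.get("value", []):
        rows.append({
            "run_id": run.get("id"),
            "name": run.get("name"),
            "state": run.get("state"),
            "passed": run.get("passedTests", 0),
            "total": run.get("totalTests", 0),
        })
    return rows

# Made-up sample response for illustration only.
sample = {
    "value": [
        {"id": 7, "name": "Nightly regression", "state": "Completed",
         "passedTests": 98, "totalTests": 100},
    ]
}
print(summarize_test_runs(sample))
```

Rows in this shape could be loaded into the data store and surfaced in the Power BI reports described above.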

Additionally, there are other groups and teams working on various aspects of ensuring we release quality products, including the Systems group and the Flyers team, which is creating a STEAM product for end-to-end testing. The STEAM product and the voice of the Systems and Quality teams will also be key to this piece of our DevOps transformation.