Within a software development organization, whether for embedded code or a desktop application, there are distinct roles: the controls engineer, the system architect, and the quality engineer. Depending on the size of the development team, some of these roles may be filled by a single person.
Analysis versus testing
During the development phase of a project, the controls engineer should perform analysis tasks on the model. These analysis tasks enable the controls engineer to determine if the algorithm they are developing is functionally correct and in compliance with the requirements.
These analysis tasks are commonly performed in an informal fashion: engineers simulate a model and then inspect graphs of the outputs to determine whether they have correctly implemented the algorithm.
The differentiating word in this description is informal. When comparing analysis with testing, we see that testing (either verification or validation) requires a formalized, “locked-down” framework. How, then, can informal analysis be used during formal testing?
Transitioning from analysis to testing
Ideally, the transition from informal analysis to functional testing would flow seamlessly. However, it is often the case that the work done in the analysis phase is thrown away in the transition to the testing phase. This is understandable in a non-MBD environment, but with the single-truth approach of MBD, the analysis results should not be thrown away. This is where the idea of “golden data” comes into use. Golden data is a set of data, both inputs and outputs, that an experienced engineer verifies as meeting the requirements of the algorithm.
Enabling the use of golden data to create test cases
The easiest way to enable the use of golden data is to give controls engineers a simple interface through which they can supply the analysis data set along with the information that transforms it into a testing data set.
Analysis data is transformed into test data by providing a method for “locking down” the results. To lock down the data, the controls engineer needs to provide the test engineer with information on what is expected from the analysis data. This information could include the following types of golden data tests.
- Strict: The output data from testing must match the golden output data exactly. This is normally done for Boolean or integer outputs.
- Tolerance: The output data from testing must match the golden output data within some bounded tolerance. Tolerances can be absolute or percentage-based. Note that special care needs to be taken with values around zero for percentage-based tolerances.
- Temporal: The output from testing must match the golden output data within some time duration. These tests can also include tolerance and strict conditions.
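The three comparison types above can be sketched as simple check functions. This is a minimal illustration, not a prescribed implementation; the function names and signatures are my own, and the temporal check assumes uniform sampling and only allows the output to lag (not lead) the golden data.

```python
import numpy as np

def check_strict(actual, golden):
    """Strict comparison: every sample must match exactly.
    Suitable for Boolean or integer outputs."""
    return np.array_equal(np.asarray(actual), np.asarray(golden))

def check_tolerance(actual, golden, abs_tol=None, pct_tol=None):
    """Tolerance comparison with absolute and/or percentage bounds.
    The percentage check guards the denominator so golden values
    near zero do not blow up the relative error."""
    actual = np.asarray(actual, dtype=float)
    golden = np.asarray(golden, dtype=float)
    err = np.abs(actual - golden)
    ok = np.ones_like(err, dtype=bool)
    if abs_tol is not None:
        ok &= err <= abs_tol
    if pct_tol is not None:
        denom = np.maximum(np.abs(golden), 1e-12)  # avoid divide-by-zero
        ok &= (err / denom) <= pct_tol / 100.0
    return bool(ok.all())

def check_temporal(actual, golden, t, max_delay, abs_tol=0.0):
    """Temporal comparison: the output may lag the golden data by up
    to max_delay seconds; each candidate shift is then checked
    against an absolute tolerance (simplified to lag-only)."""
    actual = np.asarray(actual, dtype=float)
    golden = np.asarray(golden, dtype=float)
    dt = t[1] - t[0]  # assumes a uniform time vector
    max_shift = int(round(max_delay / dt))
    for shift in range(max_shift + 1):
        a = actual[shift:]
        if np.all(np.abs(a - golden[: len(a)]) <= abs_tol):
            return True
    return False
```

In practice the testing environment would pick the check function from the metadata the controls engineer attached to the golden data set.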
In addition to the type of golden data test to run, the controls engineer should include information on which requirements the test maps onto.
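One way to bundle the locked-down data, the check type, and the requirement mapping is a small record type. The field names here are illustrative, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class GoldenDataTest:
    """One locked-down analysis data set plus the information the
    test engineer needs to turn it into a formal test."""
    name: str
    inputs: dict           # signal name -> input samples
    golden_outputs: dict   # signal name -> expected samples
    check: str             # "strict", "tolerance", or "temporal"
    abs_tol: float = 0.0
    requirements: list = field(default_factory=list)  # e.g. ["REQ-042"]
```

Keeping the requirement IDs on the record lets the test manager report coverage per requirement, not just per test.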
Formal tests in support of development
In the same way that golden data can support testing, formal testing can support the controls engineers by informing them of the constraints that the requirements place on their design. This can only be achieved if the tests are easy for the controls engineers to run.
What is “user-friendly?”
User-friendly interfaces for testing are defined by the following characteristics:
- Data is accepted in “natural” format: Any formatting or interpolation of the data is performed by the testing environment.
- Test results are presented in “human readable” format: The results from the tests should be provided both in a summary format (pass/fail) and with detailed data, such as graphs and tabular data.
- Selection and execution of tests should be simple: Tests should be launchable from a user interface that provides a list of the valid tests and enables the running of tests in either single or batch modes.
- Test files should be automatically associated: The management of test data (inputs and results) should be handled by the test manager.
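A batch runner satisfying the last two points might look like the sketch below. The interface is hypothetical: `execute(name)` stands in for whatever actually runs one test, and is assumed to return a pass/fail flag plus detailed data that could later be rendered as graphs or tables.

```python
def run_tests(test_names, execute):
    """Minimal batch test runner: runs each named test, prints a
    human-readable pass/fail summary, and returns the detailed
    results keyed by test name for later inspection."""
    results = {}
    for name in test_names:
        passed, details = execute(name)
        results[name] = {"passed": passed, "details": details}
        print(f"[{'PASS' if passed else 'FAIL'}] {name}")
    n_passed = sum(r["passed"] for r in results.values())
    print(f"{n_passed}/{len(results)} tests passed")
    return results
```

Single-test mode is just the batch mode with a one-element list, which keeps the interface uniform for the controls engineer.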
Final thoughts
This blog post has described how information should be shared and how tests should be run. In an upcoming post, I will cover the basics of modular test design.