Formal Testing of Requirements
February 8, 2016 - Rebecca Haag, Systems Engineer II; and Gary Seim, Principal Systems Engineer, Retired
Requirements translate customer needs into a contract that designers can use to drive development of the product, and they directly affect a project's schedule, budget, and quality. Once requirements have been elicited, captured, agreed to, and managed; a detailed design has been developed and implemented; and integration testing is complete, the last phase of the development effort is reached: Formal Testing (Figure A).
In formal testing, the design undergoes both verification and validation. The terms represent two related but distinct activities.
- Verification is the process of making sure that the design has been implemented correctly – i.e. that the system and subsystem requirements have been met in the design.
- Validation is the process of making sure that the design meets the need – i.e. that the stakeholder requirements have been met by the final design.
Verification: Was the product developed correctly?
Validation: Was the correct product developed?
Minnetronix focuses on verification and typically defers validation efforts to its customers, who have easier access to the system users. Properly executed verification can have a positive impact on the project outcome and provides necessary information for regulatory submissions. This paper explores the details of verification in formal testing, including the importance of preparing for verification, the need for a verification dry run, and how to execute a formal verification.
Preparation for Verification
Verification test preparation should begin early, with test engineers at the table throughout the development process. It is important to ensure that the project is testable and verifiable and that the protocols are appropriately designed.
As the requirements are being documented, a quick check for testability involves asking the simple question ‘How will this requirement be tested?’ This check helps to refine the requirements and makes them easier to understand.
One characteristic of good requirements is that they are verifiable, i.e. that their implementation can be quantitatively evaluated. The test protocols provide the step-by-step instructions to complete this quantitative evaluation, together with the acceptance criteria that show whether each requirement is met. The level of detail needed in the test protocols reflects the amount of detail included in the requirements.
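To make the idea of a quantitative acceptance criterion concrete, the small sketch below encodes one as a pass/fail check. The requirement wording, the nominal value, the tolerance, and the measured value are all hypothetical examples, not drawn from any real protocol.

```python
# Illustrative sketch: a quantitative acceptance criterion as a pass/fail check.
# All values here are hypothetical, invented for illustration.

def meets_criterion(measured: float, nominal: float, tolerance: float) -> bool:
    """Return True when the measured value falls within nominal +/- tolerance."""
    return abs(measured - nominal) <= tolerance

# Hypothetical requirement: "The pump shall deliver 100 +/- 5 mL/min."
measurement_ml_per_min = 103.2  # value recorded during the test step
print("PASS" if meets_criterion(measurement_ml_per_min, 100.0, 5.0) else "FAIL")
```

A requirement that cannot be reduced to a check of this kind (an objective measurement compared against stated limits) is a signal that the requirement itself needs refinement.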
Each protocol may verify more than one requirement. The total number of requirements verified in each protocol is a matter of preference; however, it generally makes sense to group related requirements together in the same protocol. Test protocols should be traced to the requirements they verify. By setting up traceability early on and leveraging the traceability that was created within the requirements cascade, the impact of requirements changes on test protocols can be tracked clearly.
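A traceability table of this kind can be sketched as a simple mapping from protocols to the requirements they verify; inverting it immediately reveals holes in coverage. The protocol and requirement IDs below are invented for illustration.

```python
# Sketch of a protocol-to-requirement traceability table.
# All IDs are hypothetical, invented for illustration.

trace = {
    "TP-001": ["SYS-001", "SYS-002"],  # related requirements grouped in one protocol
    "TP-002": ["SYS-003"],
}
all_requirements = {"SYS-001", "SYS-002", "SYS-003", "SYS-004"}

# A requirement that appears in no protocol is a hole in the verification plan.
covered = {req for reqs in trace.values() for req in reqs}
untested = sorted(all_requirements - covered)
print("Requirements without a protocol:", untested)
```

In practice this mapping usually lives in a requirements management tool rather than in code, but the underlying structure and the coverage check are the same.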
The Minnetronix verification team is involved in the early requirements reviews to ensure the requirements are testable and verifiable from the beginning. The team begins to write test protocols once the requirements have been initially agreed to by the appropriate stakeholders (through a design review meeting) and an initial architecture has been defined. Minnetronix creates the design and test protocols in a highly collaborative environment: the design team and the quality engineers work closely during the development effort to ensure appropriate testability. By starting work on the protocols as soon as possible, issues such as holes, conflicts, and redundancies within the requirements cascade can be identified and addressed early in the development process. Correcting issues as early as possible lessens the impact on project costs and timelines.
Verification Dry Run
Once initial prototypes are available, the protocols may be dry run: effectively a rehearsal of the verification testing that evaluates how well the test protocol describes the steps to be completed. The dry run can also make an initial evaluation of whether the requirements have been implemented correctly. Dry runs can identify issues with the test protocol (e.g. steps that cannot be completed as described) or issues with the design (e.g. features that have not been implemented as described). Finding and addressing these issues before a formal verification run is preferable; correcting them later in the project is more difficult and expensive than during the development effort.
After the completion of dry run testing, the protocols are updated to capture any applicable lessons learned. The updated verification protocols are reviewed with the appropriate stakeholders (design engineers, customer representatives, etc.) before a formal test effort begins. This serves as another checkpoint to ensure that the requirements have been well written and are being interpreted correctly.
Minnetronix strongly advocates for inclusion of dry run testing within a project schedule. The additional investment of time and effort is well worth it, as it allows issues to be identified prior to design completion and formal verification.
Executing Formal Verification
Formal verification testing can begin once the verification protocols are written, reviewed, and released, and a final prototype is available. If the test protocols were extensively dry run before formal testing began, little new information should be learned at this point. However, it is always possible that some issues may arise. These issues can be addressed in a variety of ways, ranging from accepting the behavior as-is to making design changes.
Any design changes that occur after the beginning of formal verification (including changes made to address test failures) may necessitate regression testing. Planning for a regression test involves evaluating the design changes and assessing their impact across the requirements, the design, and the verification protocols. These design changes may be limited in scope, such that only a handful of tests need to be re-run, or they may impact enough of the design that re-executing the entire suite of tests is preferable. The traceability set up early on in the protocol writing helps immensely in evaluating the impact of design changes.
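The regression-scoping step described above can be sketched with the same kind of traceability table: given the set of requirements touched by a design change, collect every protocol that verifies any of them. The IDs and the `protocols_to_rerun` helper are hypothetical, shown only to illustrate the mechanics.

```python
# Sketch: using traceability to scope a regression run.
# All IDs and the helper function are hypothetical.

trace = {
    "TP-001": ["SYS-001", "SYS-002"],
    "TP-002": ["SYS-003"],
    "TP-003": ["SYS-002", "SYS-004"],
}

def protocols_to_rerun(changed_reqs, trace):
    """Any protocol tracing to a changed requirement must be re-executed."""
    return sorted(p for p, reqs in trace.items() if set(reqs) & set(changed_reqs))

# A change touching SYS-002 pulls in every protocol that verifies it.
print(protocols_to_rerun({"SYS-002"}, trace))
```

When the resulting list covers most of the suite, re-executing the full set of protocols may be simpler and less error-prone than arguing for a partial run.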
The outcome of the verification phase is a set of documentation showing the final version of the requirements, along with reports showing that:
- All requirements were met (ideal), or
- Some requirements were met, with rationales for the acceptability of those that were not (less ideal).
Minnetronix's experience in managing formal verification efforts helps ensure a smooth testing effort, which in turn results in more predictable project costs and timelines.
Verification is an important phase in the design process because it can have a positive impact on the project and provides necessary information for regulatory submissions. Minnetronix firmly believes that the probability of project success and regulatory approval is greatly improved by having the test engineers at the table throughout the development lifecycle. By including test engineers early, they can better understand the product and design, begin preparing for the verification phase early, and help to informally test the design at integration points. All of this helps to identify necessary changes as early in the development as possible.
Minnetronix seamlessly integrates quality assurance into the development process. Although this can increase the upfront investment (by having one more function at the table), the overall project cost decreases. Building testability in from the beginning helps ensure issues are identified and addressed early in the process rather than being caught only at the last minute, ultimately saving the customer time and money.