Series Background: In this series of posts, I’ve been using Medical Device Development (as Regulated by U.S. FDA via CFR 820.30 and international standard IEC62304) as an exemplar for suggesting ways to develop high quality software in regulated (and other high assurance, high economic cost of failure) environments in an agile manner. This series is sponsored, in part, by Rally Software Development.
In the last post, I introduced a suggested process model that teams could use to visualize and reason about how to apply iterative, incremental, and agile methods in the development of such systems. The graphic is reproduced here:
High Assurance Agile Software Development Lifecycle Model
I described the iterative and incremental nature of the model, and noted that we’ll need a better understanding of Verification, Validation and Traceability to further understand it.
The terms verification and validation (often abbreviated V&V, or SV&V for software) are frequently used interchangeably in the industry, most commonly as shorthand for “assured testing” practices. In point of fact, the words have different meanings, and lumping them together as V&V can obscure the constructs entirely.
One of the reasons we’ve picked an exemplar for our discussion is to provide a more definitive basis for the method, grounded in at least one known, public example: in our case the development of medical devices containing software. So we’ll return to these roots for definitions of these terms, and build from there. In an earlier post, I described the chain of regulations covering medical devices marketed in the US, and summarized with the graphic below:
Chain of regulations governing medical device software in the US
While 21 CFR 820.30 is the governing regulation, it’s surprisingly short and not very explanatory. To address this, FDA produced the document on the bottom right (General Principles of Software Validation; Final Guidance for Industry and FDA Staff) to provide guidance to those who operate under the 820.30 mandate.
This document provides some meaningful definitions, and we’ll start with the most critical ones that are relevant to our model:
“Software verification provides objective evidence that the design outputs of a particular phase of the software development life cycle meet all of the specified requirements for that phase.
Software verification looks for consistency, completeness, and correctness of the software and its supporting documentation, as it is being developed, and provides support for a subsequent conclusion that software is validated. Software testing is one of many verification activities intended to confirm that software development output meets its input requirements. Other verification activities include various static and dynamic analyses, code and document inspections, walkthroughs, and other techniques.”
Clearly this definition goes beyond testing, taking a more “white box” approach to assuring that each development activity meets the requirements imposed by the prior activity. In our yet-to-be-described agile, high assurance practices, continuous (and, where possible, automated) verification activities will play a central role. Fortunately, we’ll discover that many of these practices (hierarchical requirements, small units of functionality, unit testing, pair/peer review, acceptance testing, etc.) are part of standard agile, high quality hygiene, so we’ll be able to “rigorously apply” existing practices, rather than invent new ones.
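As a concrete illustration of what “continuous, automated verification with traceable evidence” can look like in practice, here is a minimal sketch in which each automated unit test is tagged with the requirement it verifies. The requirement IDs (e.g. `SRS-042`), the `verifies` decorator, and the dose-calculation function are all hypothetical, invented for illustration; they are not from the FDA guidance or any real device project.

```python
# Hypothetical sketch: tagging automated tests with requirement IDs so each
# verification activity yields objective evidence traceable to a requirement.

def verifies(req_id):
    """Decorator that records which requirement a test verifies."""
    def wrap(fn):
        fn.requirement = req_id  # attach the ID for later trace reporting
        return fn
    return wrap

def compute_dose(weight_kg, mg_per_kg, max_mg):
    """Illustrative device function: weight-based dose, capped at a maximum."""
    return min(weight_kg * mg_per_kg, max_mg)

@verifies("SRS-042")
def test_dose_is_capped_at_maximum():
    assert compute_dose(weight_kg=120, mg_per_kg=10, max_mg=800) == 800

@verifies("SRS-043")
def test_dose_scales_with_weight():
    assert compute_dose(weight_kg=50, mg_per_kg=10, max_mg=800) == 500
```

Run under a test runner on every build, such tagged tests produce a continuously refreshed verification record: each passing test is a piece of objective evidence tied to a specific requirement, rather than an anonymous green checkmark.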
“…FDA considers… software validation to be confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.”
“In practice, software validation activities may occur both during, as well as at the end of the software development life cycle to ensure that all requirements have been fulfilled. Since software is usually part of a larger hardware system, the validation of software typically includes evidence that all software requirements have been implemented correctly and completely and are traceable to system requirements. A conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle. Testing of device software functionality in a simulated use environment, and user site testing are typically included as components of an overall design validation program for a software automated device.”
In addition to these definitions, this document also introduces and defines a few other requirements that will be imposed on the process:
Software Requirements Specification
“A documented software requirements specification provides a baseline for both validation and verification. The software validation process cannot be completed without an established software requirements specification (Ref: 21 CFR 820.3(z) and (aa) and 820.30(f) and (g)).”
In agile, we don’t often create these in a formal way; instead, the backlog, the collection of user stories and acceptance criteria, the test cases, and the code itself document the requirements. But in this context, it is 100% clear that we will need to rigorously develop and maintain a software requirements specification as part of our high assurance, but still largely agile, practices.
This document goes on to describe traceability and traceability analysis as one of the primary mechanisms to assure that verification and validation are complete and consistent. However, it doesn’t define traceability; for that we refer to the FDA Glossary of Computer Systems Software Development Terminology, where we find the IEEE definitions:
traceability. (IEEE) (1) The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See: consistency. (2) The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies.
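The IEEE definition above can be made concrete with a small sketch of a traceability analysis: each artifact records the predecessor artifact that justifies it, and the analysis reports any element that cannot establish its “reason for existing.” All artifact IDs here are hypothetical, invented purely to illustrate the predecessor-successor structure.

```python
# Hypothetical sketch of a traceability analysis: each artifact points at the
# predecessor it traces to; artifacts with no predecessor are flagged for review.

links = {
    # artifact id -> predecessor it traces to (None = untraced)
    "DES-07":  "SRS-042",   # design element satisfies a software requirement
    "CODE-31": "DES-07",    # code unit implements a design element
    "TEST-99": "SRS-042",   # test verifies a software requirement
    "CODE-55": None,        # orphan: nothing establishes its reason to exist
}

def untraced(links):
    """Return artifacts with no predecessor -- candidates for review."""
    return sorted(a for a, pred in links.items() if pred is None)

print(untraced(links))  # ['CODE-55']
```

In a real project the links would be maintained in a requirements management tool rather than a dictionary, but the analysis is the same: follow each element back along its predecessor chain, and anything that dead-ends short of a requirement is a traceability gap.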
With these activity/artifact/process definitions and the lifecycle graphic behind us, we can go on to a more meaningful elaboration of the model. We’ll do that in the next few posts.