Software Verification and Validation in High Assurance Agile Development: Definitions

Series Background: In this series of posts, I’ve been using Medical Device Development (as regulated by the U.S. FDA via 21 CFR 820.30 and the international standard IEC 62304) as an exemplar for suggesting ways to develop high quality software in regulated (and other high assurance, high economic cost of failure) environments in an agile manner. This series is sponsored, in part, by Rally Software Development.


In the last post, I introduced a suggested process model that teams could use to visualize and reason about how to apply iterative, incremental, and agile methods in the development of such systems. The graphic is reproduced here:

High Assurance Agile Software Development Lifecycle Model

I described the iterative and incremental nature of the model, and noted that we’ll need a better understanding of Verification, Validation and Traceability to further understand it.

The terms verification and validation (also abbreviated V&V, or SV&V) are often used interchangeably in the industry. The most common interpretation is that they translate to “assured testing” practices. In point of fact, the words have different meanings, and lumping them together as V&V can obscure the distinction entirely.

One of the reasons we’ve picked an exemplar for our discussion is to provide a more definitive basis for the method, grounded in at least one known, public example: in our case the development of medical devices containing software. So we’ll return to these roots for definitions of these terms, and build from there. In an earlier post, I described the chain of regulations covering medical devices marketed in the US, and summarized with the graphic below:

Chain of regulations governing medical device software in the US

While 21 CFR 820.30 contains the governing regulatory requirements, it’s surprisingly short and not very explanatory. To address this, the FDA produced the document on the bottom right (General Principles of Software Validation; Final Guidance for Industry and FDA Staff) to provide guidance to those who operate under the 820.30 mandate.

This document provides some meaningful definitions, and we’ll start with the most critical ones that are relevant to our model:


“Software verification provides objective evidence that the design outputs of a particular phase of the software development life cycle meet all of the specified requirements for that phase.

Software verification looks for consistency, completeness, and correctness of the software and its supporting documentation, as it is being developed, and provides support for a subsequent conclusion that software is validated. Software testing is one of many verification activities intended to confirm that software development output meets its input requirements. Other verification activities include various static and dynamic analyses, code and document inspections, walkthroughs, and other techniques.”

Clearly this definition takes us beyond testing, and takes a more “white box” look at assuring that each development activity meets the requirements imposed by the prior activity. In our yet-to-be-described agile, high assurance practices, continuous (and where possible, automated) verification activities will play a central role. Fortunately, we’ll discover that many of these practices (hierarchical requirements, small units of functionality, unit testing, pair/peer review, acceptance testing, etc.) are part of standard agile, high quality hygiene, so we’ll be able to “rigorously apply” existing practices, rather than invent new ones.
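As a minimal sketch of what continuous, automated verification could look like in practice (the requirement IDs, function names, and device logic here are all hypothetical, not from the guidance), each automated test can be tagged with the requirement it verifies, so that every CI run produces objective evidence that a development output meets its input requirements:

```python
# Illustrative sketch: tagging automated tests with the requirement they verify,
# so a passing test run doubles as objective verification evidence.

REQUIREMENTS = {"SRS-101": "Dose shall never exceed the configured maximum."}

def verifies(req_id):
    """Decorator tagging a test with the requirement it provides evidence for."""
    def wrap(fn):
        fn.verifies = req_id
        return fn
    return wrap

def compute_dose(requested, maximum):
    """Hypothetical device logic under test: clamp the requested dose."""
    return min(requested, maximum)

@verifies("SRS-101")
def test_dose_is_clamped():
    assert compute_dose(12.0, maximum=10.0) == 10.0

if __name__ == "__main__":
    test_dose_is_clamped()
    print("verified:", test_dose_is_clamped.verifies)
```

The tag is what later makes traceability analysis mechanical: a tool can walk the test suite and report which requirements have verifying evidence and which do not.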


“…FDA considers… software validation to be confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.”

“In practice, software validation activities may occur both during, as well as at the end of the software development life cycle to ensure that all requirements have been fulfilled. Since software is usually part of a larger hardware system, the validation of software typically includes evidence that all software requirements have been implemented correctly and completely and are traceable to system requirements. A conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle. Testing of device software functionality in a simulated use environment, and user site testing are typically included as components of an overall design validation program for a software automated device.”

In addition to these definitions, this document also introduces and defines a few other requirements that will be imposed on the process:

Software Requirements Specification

“A documented software requirements specification provides a baseline for both validation and verification. The software validation process cannot be completed without an established software requirements specification (Ref: 21 CFR 820.3(z) and (aa) and 820.30(f) and (g)).”

In agile, we don’t often create these in a formal way; instead, we use the backlog, the collection of user stories and acceptance criteria, test cases, and the code itself to document requirements. In this context, however, it is 100% clear that we will need to rigorously develop and maintain a software requirements specification as part of our high assurance, but still largely agile, practices.
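One way to reconcile the two worlds is to treat the backlog as the source of truth and derive the specification baseline from it. The sketch below is purely illustrative (the story fields, IDs, and format are assumptions, not anything the guidance prescribes): each user story carries a requirement ID and acceptance criteria, and a small script renders them into an SRS baseline document.

```python
# Hypothetical sketch: deriving a lightweight SRS baseline from backlog stories.
# Story and requirement IDs below are invented for illustration.

stories = [
    {"id": "US-7", "requirement": "SRS-101",
     "text": "As a clinician, I can set a maximum dose for the device.",
     "acceptance": ["Requested doses above the maximum are rejected."]},
]

def render_srs(stories):
    """Render backlog stories into a plain-text SRS baseline."""
    lines = ["Software Requirements Specification (generated baseline)"]
    for s in stories:
        lines.append(f"{s['requirement']} (from {s['id']}): {s['text']}")
        for criterion in s["acceptance"]:
            lines.append(f"  Acceptance: {criterion}")
    return "\n".join(lines)

print(render_srs(stories))
```

Because the document is generated, re-baselining the SRS after each iteration is cheap, which keeps the “established software requirements specification” current without abandoning backlog-driven work.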


This document goes on to describe traceability and traceability analysis as one of the primary mechanisms for assuring that verification and validation are complete and consistent. However, it doesn’t define traceability; for that, we refer to the FDA Glossary of Computer Systems Software Development Terminology, where we find the IEEE definitions:

traceability. (IEEE) (1) The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See: consistency. (2) The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies.
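IEEE definition (1) above lends itself to a simple mechanical check. As an illustrative sketch (all requirement and test IDs are hypothetical), a traceability analysis can verify that every requirement traces forward to at least one test, and that every test traces back to a known requirement:

```python
# Illustrative traceability analysis: find requirements with no verifying
# test, and tests that reference an unknown requirement.

requirements = {"SRS-101", "SRS-102"}
test_traces = {"test_dose_is_clamped": "SRS-101",
               "test_alarm_sounds": "SRS-102"}

def trace_gaps(requirements, test_traces):
    """Return (untested requirements, orphaned tests)."""
    covered = set(test_traces.values())
    untested = requirements - covered
    orphans = {t for t, req in test_traces.items() if req not in requirements}
    return untested, orphans

untested, orphans = trace_gaps(requirements, test_traces)
assert not untested and not orphans  # trace is complete and consistent
```

Run continuously, a check like this turns traceability analysis from a periodic documentation chore into a standing build gate.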

Next Steps

With these activity/artifact/process definitions and the lifecycle graphic behind us, we can go on to a more meaningful elaboration of the model. We’ll do that in the next few posts.


4 thoughts on “Software Verification and Validation in High Assurance Agile Development: Definitions”

  1. In reviewing your diagram for the lifecycle process model, it depicts validation after the iterations. In the FDA guidance, section 4 clearly states: “Preparation for software validation should begin early, i.e., during design and development planning and design input. The final conclusion that the software is validated should be based on evidence collected from planned efforts conducted throughout the software lifecycle.”

    In my experience, validation done at the end of a cycle is less effective. Validation, being ‘documented evidence’, must occur throughout the process. I feel that your diagram depicts this incorrectly. Your discussion of traceability, approved specifications, and verification are all part of the big “Validation” picture, so I would recommend reconsidering that diagram.

    I am interested to continue reading your post for how the agile iterations can help produce the ‘specification, design and documented verification’ deliverables anticipated by the FDA.

  2. Thank you for the post, Eve. I do agree with your statement that preparation should begin early, and the guidance does state that you must have a validation plan in place (section 4 says this can take many forms – I recommend lightweight and adaptable). The model above is not depicting that we wait until the end of the development lifecycle to batch process validation, but rather that we build and verify for 6 to 8 weeks (3 to 4 iterations) and then validate that increment. So there is a method to our madness.

    Regarding Validation, Section 3.2 of the guidance reads:

    “A documented requirements specification represents the user’s needs and intended uses from which the product is developed. A primary goal of software validation is to then demonstrate that all completed software products comply with all documented software and system requirements. The correctness and completeness of both the system requirements and the software requirements should be addressed as part of the design validation process for the device.”

    This leads me to believe that we can define, build, and verify a user story (requirement) and validate its correctness and completeness at the end of an 8-week cycle. Is it possible that we are even doing lightweight validation in the form of a product demonstration at the end of every 2-week increment (i.e., the iteration demo)?

  3. Although it doesn’t seem to be explicitly stated as a necessary traceability step (unless I missed it), it would seem that in order to satisfy the goals of verification and validation, we would at some point need to ensure that the requirements specification is still representative of the user needs and intended uses.

    This would be an additional opportunity to ensure traceability – between the user and the requirements which are then the foundation for validation and verification. It appears that the feedback loops of Agile practices would provide this opportunity at regular intervals, thereby increasing consistency and building a more robust baseline for validation and verification. Without that, you could be performing that work based on faulty requirements. A strong argument, in my mind for the application of Agile in this domain.

  4. In a previous life, I did independent validation and verification (IV&V) for defense software systems. The work was grueling and it started long before we even moved out of the requirements phase. It would have helped tremendously to have had a tool such as Rally to help with our traceability issues in managing the requirements alone. I also wish we had used Agile to begin verification and validation sooner as the requirements moved from phase to phase. Instead, we had to do complete “white box” eyeballing of requirements hoping that there was completeness and consistency.

    Isn’t it wonderful that we are honestly moving away from that world? Thanks for the post!
