Computer Systems Validation Part 7 – Development

In this article, I will discuss how systems are developed, and how a developer/vendor and the client need to work together to best meet the requirements.

I will cover good and maybe not-so-good development processes, and what you, the customer, should expect from a vendor while they conduct the development.


System development is not something that we only need to consider for GAMP Level 5 systems.

We have seen previously how important this is for bespoke systems (as per the base of the ‘V’ model for that level). But how a system is developed initially, and how it continues to be developed post-implementation, can be critical to overall project success.

One point here that I cannot escape from is that at times this post will sound like a ‘Part 2’ of the previous entry on Vendors. But the thorny issue of vendors unfortunately cannot be ignored when discussing development.

Note also, I will touch on internal development. I am planning a specific article on this in the future, especially regarding in-house applications developed using tools such as MS Excel and MS Access, so I will not go into too much detail on these this time.

(as a sneak-preview for that one, if you are considering using Excel for GxP processes, I would strongly recommend you re-consider…)

Feeding the URS into Good Design

I have talked previously about the importance of the URS, and we are now at the point where the URS needs to be used in anger. Hopefully, the requirements list is in a good place and (I know this sounds obvious) even available and approved.

I can recount a number of examples where a developer was provided with an early draft URS document, the development process started, the URS was subsequently bounced around with significant changes from the draft, and the developer (understandably) was unhappy that the goalposts had shifted.

So now we are back in the hands of the vendor, as we need them to produce a detailed document explaining exactly how they are going to meet the requirements.

For reference, I would call this document the Design Specification (DS), but that is just a label – I have heard it called Product Design Specification, Functional Design Specification, Design Brief etc.

This stage of the process is often undervalued or even overlooked entirely – usually because the vendor is unable or unwilling to generate the document, or refuses outright. I have found that the conversation will often start with “what sort of document would you like?” from the developer.

In an ideal world of course, the developer would provide a detailed line by line discourse on exactly how each requirement would be developed and met. This makes the Design Qualification (DQ) rather straightforward.

The reality will lean more towards “it will do what you want it to” without going into any detail on how this bold statement might actually be realised. This makes the DQ anything but straightforward.

So, what do you do if you have little or nothing to go on to provide assurance that your requirements will be met? The simple answer is that you will need to rely on other things to support you, such as a good quality systems audit, past experience with the vendor, existing software created by the vendor, or the development method (see below).

And of course, a big touch of hoping for the best…

But let us hope you have good assurance and confidence in the design of the application. Now what?

Development Methods

Appendix D4 of the GAMP Guide states that the development needs to produce something that will:

  • Meet requirements
  • Be reliable
  • Be robust
  • Be maintainable
  • Be capable of handling error conditions
  • Be well designed
  • Be sufficiently commented
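Several of these expectations – handling error conditions and being sufficiently commented in particular – are easy to picture in code. As a minimal illustration only (the function, its input format and its rules are entirely hypothetical, not taken from the Guide):

```python
def parse_batch_weight(raw: str) -> float:
    """Convert an operator-entered weight (kg) to a float.

    Illustrative sketch: shows explicit error handling and commenting,
    two of the GAMP Appendix D4 expectations listed above.
    """
    cleaned = raw.strip()
    if not cleaned:
        # Reject empty input rather than silently returning 0.0
        raise ValueError("weight entry is empty")
    try:
        weight = float(cleaned)
    except ValueError:
        # Surface a clear, specific error for the calling layer to log
        raise ValueError(f"weight entry is not numeric: {raw!r}")
    if weight <= 0:
        # A batch weight of zero or less is an error condition, not data
        raise ValueError(f"weight must be positive, got {weight}")
    return weight
```

The point is not the function itself but that every failure path is deliberate, visible, and explained – the opposite of code that quietly swallows bad input.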

There are many development methods used by the software industry, and these can include Agile, Sprint/Scrum, Rapid, Waterfall, DevOps etc. (note – the last of these is used on systems for regular and routine updates and deployment). There are others.

I will not go into detail on what all of these are – feel free to search for them online elsewhere – and the developer may have different ways of working on the Software Development Life Cycle (SDLC). But typically, it will be one of two ways for new systems:

The Waterfall

This is where you give your user requirements to the vendor, and they go off for a set period of time (maybe even months) and then return with something they believe is what you have asked for.

The Waterfall SDLC is broken down into six stages and is so called because each step follows on from the last in a linear manner like a waterfall over a series of cliff edges:

  • Requirements
    • By gathering the requirements from the customer here, the developer in theory will need no further input past this point. So, the developer gets the URS and off they go
  • Design
    • This is where design choices are made regarding how the requirements will be realised
  • Development
    • Where the application is developed and coded

(I am actually going to interrupt the description of Waterfall at this point as hopefully you have noticed that this process by now does not sit that well with the whole “marrying the URS with a good set of Design Specification documents” approach. And you would of course be correct – design choices made at this time may well be too late)

  • Testing
    • Where the developer tests and may require a feedback loop with the previous step if issues are identified
  • Deployment
    • Where the system is deployed for the customer
  • Maintenance
    • Where the system goes live and is maintained by the vendor post-implementation
    • Of course, if the system is not what the customer wants at this time it is probably too late…
Agile Development

Considered a much more modern and effective approach to the SDLC, the Agile method can only succeed with the collaboration of the customer and the developer.

Design choices are discussed and agreed upon by both parties before development begins using the URS as the starting point, and this is reviewed as an ongoing process during the whole SDLC.

Agile uses a series of “Sprints” where a small number of the requirements are developed and then shown to the customer at regular meetings. This allows the customer to monitor progress and provide feedback on the suitability and effectiveness of each element. Where changes / fixes are required, these can be made quickly and more cost-effectively, ensuring the customer gets what they want in the way that they want it.

This process is then repeated for each system element in a continual loop.

When each sprint is started, the plan for that step is agreed and the scope defined in cooperation between the developer and the customer. The developer then goes off to code and test the described element before the next meeting, where the development is reported, demonstrated, and hopefully accepted; the process is then repeated.

These meetings will typically be weekly or fortnightly and involve the vendor and the customer SMEs – meaning that any issues / deviations from the original requirements can be identified and addressed quickly.

Open and clear communication with the developer is key to achieving your requirements. I cannot state this enough.

Neither of the above is wrong as such, but I would advise caution when choosing a developer that uses Waterfall. The reason is that Waterfall is considered an outdated method: it typically will not involve the customer until late in the process and can often result in the original requirements getting lost along the way.

It is also often slow and can add costly delays to projects when the customer has to go back to the developer repeatedly once they realise their requirements have not been met.

But beware, there are plenty of examples of vendors supplying the healthcare industries that still work in this way.

Much better, then, if the vendor says they use an Agile method, where the development is broken down into those smaller steps and then reviewed/tested at each stage, with the customer able to ensure that the project is correct and that the goals are being met right from the outset.

I have discussed this in general previously, but looking specifically at the base of our GAMP Level 5 V Model, there are clear expectations on how the developer should code and test the system.

By adding the customer into the feedback loop, we can see the premise of the Agile methodology fits well with the guidance expectations as the visibility for the customer can be much greater.

Therefore, my advice would be to ask the vendor what development methods they use and how they manage the SDLC, before entering into any agreement with them.

If they say “Waterfall” then seriously consider using an alternative source.

The importance of Inputs and Outputs

You may need a system that requires specific inputs and outputs, and the availability of these will be crucial for the development and testing process, especially when equipment is involved.

For instance, if you need a system that connects and needs to seamlessly communicate with some specific hardware you already possess, it is important that the developer also has access to that equipment (or at least a model that mimics it).

I remember all too well a case where a developer of some shop-floor equipment control software did not have access to the equipment that we (the customer) possessed. While the finished application ultimately worked up to a point, it did not integrate seamlessly with the hardware, data capture timings were out of sync with the equipment, and significant issues were observed both downstream in the material flow and ultimately with the final product.

The message here, therefore, is to collaborate with the vendor so that they are fully aware of what data is coming into the system, in what format, and what needs to go back out of the system again.

This also includes data transfer speeds. If your connected system requires data to be bounced around at a certain minimum rate, make sure the developer knows this!

Note – as a quick aside on this type of scenario – experience has shown that anti-virus software on a system may well interrupt data transfer between systems. Ensure this is taken into account!
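As a rough sketch of how such a minimum-rate requirement could be checked during integration testing (the `send` callable, the payload, and the threshold are all illustrative assumptions, not a real equipment API):

```python
import time

# Required minimum in bytes/second - agree this figure with the vendor
MIN_RATE = 1_000_000


def measure_throughput(send, payload: bytes, repeats: int = 5) -> float:
    """Return the observed transfer rate in bytes/second.

    `send` is a hypothetical callable that pushes `payload` to the
    connected equipment and blocks until it is acknowledged.
    """
    start = time.perf_counter()
    for _ in range(repeats):
        send(payload)
    elapsed = time.perf_counter() - start
    return (len(payload) * repeats) / elapsed


# Example check against the agreed minimum (plc_link is hypothetical):
# rate = measure_throughput(plc_link.send, sample_record)
# assert rate >= MIN_RATE, f"link too slow: {rate:.0f} B/s"
```

Even a crude check like this, agreed between both parties early on, is far cheaper than discovering out-of-sync data capture after go-live.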

Structured and Standard Approaches

This is maybe another area to discuss with the developer – what standards do they use for coding and formatting, agreed programming conventions etc.?

By using standard approaches, not only does it mean that all developers involved are “singing from the same hymn sheet,” but also in the event that the main coder moves on from the vendor’s organisation, the code can be easily understood and developed further by someone else.

This last point above is critical for in-house developed applications and ongoing system maintainability.

I have seen it time and time again where an individual has developed a system (using tools such as MS Access, Excel, SharePoint etc) with some back-end coding, and when they leave and the application invariably has an issue, there is no one remaining able to fix the problem – as no one understands what the original person has done. This has the potential to have a significant GxP and/or business impact and any instances should be captured on the company’s risk register together with the associated risk control measures.

Note – as I stated at the start of this entry this does lead onto an argument that organisations should avoid these types of in-house applications altogether where they can. But I realise this is not always possible. I will discuss in-house developed applications in a later post.

Testing

So how should the developer perform the testing? In many ways, that is up to them. If we have a level of assurance that the developer tests thoroughly, then when they are happy that the development is correct, we can trust that this is indeed the case.

In terms of what we need to receive, I have seen companies that have provided test summary reports, quite often generated by in-house test administration software. Even better was one company I have been involved with that had its own automated regression testing reporting tool, so as each sprint added a new element to the system, not only was that element tested, but everything else was re-tested as well to ensure that the new element hadn’t impacted the existing parts.

If a developer does not have automated tools, then a manually written set of reports with screenshots will suffice.
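To show the shape of such an accumulating regression suite – each sprint registers its cases, and every run repeats all of them – here is an illustrative sketch (not any specific vendor's tool; the function under test is invented):

```python
# Hypothetical function under test, delivered in sprint 1
def validate_user(name: str) -> bool:
    return bool(name.strip())


REGRESSION_SUITE = []


def regression_case(fn):
    """Register a case so every later run repeats it (regression coverage)."""
    REGRESSION_SUITE.append(fn)
    return fn


@regression_case
def sprint1_rejects_blank_user():
    assert validate_user("") is False


@regression_case
def sprint1_accepts_named_user():
    assert validate_user("qa.lead") is True


def run_all() -> dict:
    """Re-run every registered case and summarise, report-style."""
    results = {}
    for case in REGRESSION_SUITE:
        try:
            case()
            results[case.__name__] = "PASS"
        except AssertionError:
            results[case.__name__] = "FAIL"
    return results
```

Because sprint 2's cases would be appended to the same suite, a new feature cannot silently break an earlier one without the summary report flagging it.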

Next on the list is where the developer provides a summary of what has been tested and confirmation that the testing was successful. Again, if we have assurance, this can be acceptable.

Then there is the situation where the developer provides very little. In this instance, and having explored all possible options, following the principles of Quality Risk Management you will have to perform greater levels of testing to provide the assurance that the vendor has not.

This is why it is important to get clarity on the documentation that the vendor will provide during the selection process.

GAMP Levels 1, 3 and 4 Applications

Although all of the above applies to bespoke systems, the principles also apply to the development of other systems.

Obviously, the level of developer interaction with the customer may well be less, especially where the application is intended for a wider commercial market.

However, there would be an expectation that the developer still employs good practices, still uses standard coding conventions, and fully tests and reports (even if only internally) on the successes and/or failures of that testing.

System Updates

Depending on the type of system you have, and the scope of the update, it may well be that the Agile approach is only applicable where the customer requires major system enhancements.

Usually, the developer is more likely to use a Waterfall method for smaller updates, and this by and large will work quite well as long as a feedback loop with the customer is in place.

But where applications are in a constant cycle of development and updates (a good example of this would be FMD safety features serialisation management, where changing legislation worldwide required constant updates to the system), a slightly different approach can be used called DevOps.

DevOps tends to be used for constant rapid development and deployment, and the model itself is usually drawn as a figure-of-eight infinity loop.

This works similarly to Agile development, but with no involvement from the customer – the developer keeps everything internal, in a constant cycle of Plan –> Develop –> Release.

The issue this causes for the regulated user is that these updates keep coming through constantly, while the regulations expect each change to be assessed, tested and controlled – yet the updates are out of the customer’s control…

I have seen some novel solutions to this particular problem, from ignoring the updates entirely and then assessing them at the next scheduled Life Cycle Review, all the way through to validating every single one. But I personally believe that the following is a pragmatic solution:

Namely, a formal Risk Assessment must be performed on every update against the current validated system. This is unavoidable.

But where there is no impact on the validated or key-business elements of the system I would suggest simply ending there – i.e. sign off the Risk Assessment as ‘No Further Action Required’ and log the new version number details. That way you will have the Release Notes and a documented assessment. The potential cumulative impact of multiple “no further action required” changes can be assessed during the next lifecycle review.

Where there is an impact, the outcome of the Risk Assessment will hopefully suggest that a very narrow-scope qualification would be required.

This approach can be justified if the developer provides the aforementioned automated regression test report (or similar) – i.e. a report showing that the system has been thoroughly positive- and regression-tested internally using standard automated tools. Of course, a vendor might well charge a premium for access to this report, but paying for it is usually a much more cost-effective solution than the alternative…
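The triage described above can be sketched as simple decision logic (the parameter names and outcome strings are my own illustration, not a prescribed GAMP scheme):

```python
def assess_update(impacts_validated: bool,
                  impacts_key_business: bool,
                  vendor_test_report: bool) -> str:
    """Suggest an outcome for the formal Risk Assessment of a vendor update."""
    if not impacts_validated and not impacts_key_business:
        # Sign off as 'No Further Action Required' and log the new version;
        # cumulative impact is picked up at the next lifecycle review
        return "no further action - log version and release notes"
    if vendor_test_report:
        # The vendor's automated regression report supports a narrow scope
        return "narrow-scope qualification"
    # Without vendor assurance, the customer must test more widely themselves
    return "wider qualification by the customer"
```

The value of writing the triage down like this is consistency: every update gets the same documented questions, so an inspector can follow the reasoning from Release Notes to outcome.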

Vendor Mindset

Finally in this blog I’ll touch on vendor mindset. In many ways this follows on from the previous post on vendors, but has more to do with how vendors carry out their development and the mindset they bring to it – and indeed to their customers.

Having a vendor with a good customer relationship during development is vital, and this should be reflected in how “comfortable” you feel with that developer. Although bad vibes can sometimes be misplaced, silence, broken promises and missed deadlines due to development issues can have serious impacts on the project.

It pains me to recall one particular vendor who not only used the Waterfall method, but also outright refused to engage in any discourse whatsoever during the development cycle until they had completed their side of things. And when the delivered system did not meet the requirements and was consequently pulled at the last minute because it wasn’t fit for purpose, they became rather upset about it all.

When challenged on this to make further changes, the vendor basically replied, “we’ve done what you requested – tough if you don’t like it.” Again, it is critical to have a robust vendor selection process to minimise the risks of this type of issue from occurring and to have regular reviews during the project.

Conclusion

In summary, it is important that the system is developed using a modern, standard approach, with full transparency and open communication between the developer and the end users.

Be incredibly careful when dealing with the more old-fashioned developers; ultimately, remember that you are the customer – make sure the developer does things the way you need them to as much as possible, and build this into your selection process.

Ultimately, focus on the risk at all times and plan accordingly.

In the next article in this series, I will discuss the Validation Plan. What are good things to put in it, what can be left out, and how to ensure it comes across as “inspector friendly.”

To find out how we can help your organisation with your Computer Systems Validation and Data Integrity needs please get in touch at consultancy@jensongroup.com or to read more about our CSV services, please click here.


Neil Rudd
June 13th, 2024