Computer Systems Validation Part 2 – Why Validate?

In my earlier post, I suggested that the management teams of many organisations will direct validation activities to be performed for the sole reason that a regulator tells them to.

Meeting a regulatory requirement is of course just that, namely a requirement, and so this must always be considered the bottom line in any discussion. But validation should be seen as much more than just ticking a box so that the next auditor through the door is “happy.”

There are three main reasons why we should validate our systems:

1. Regulatory Requirements

I will start with this one as the requirements of the regulator are immutable.

I first became aware of CSV as a unique entity in its own right around the mid-1990s – which ties in very much with a greater regulatory focus in the area at the time.

Certainly, there was previously an expectation that organisations should validate systems, but with precious little practical guidance or understanding of exactly how to do so and what this actually meant – for both organisations and, crucially, for auditors – the area quickly became a minefield for everyone involved. As a result, the maxim “more is more” very much became the standard, where organisations would produce veritable forests’ worth of paperwork, qualifying every last character on screen and printout, and verifying each keystroke positively and negatively to the nth degree.

I remember clearly during a regulatory inspection around that time where an auditor requested the validation package for a system, and I literally wheeled over thirty large folders of paperwork balanced carefully on a trolley into the room. The poor auditor’s face quickly drained of colour with a look of abject terror as I placed the first few folders on the desk and sat down to go through them all.

The happy outcome there was that the auditor – clearly mindful of their time, patience and limited knowledge of the subject matter – had a cursory look at the first document in the first file, replaced it quietly and said “Yes, I think everything is in order there, thank you”, at which point I carefully wheeled them all back out again.


Although this seems laughable now, similar situations were occurring across the industry, where large system implementations would face considerable delays because validation efforts routinely generated tens of thousands of pages of printouts and took a couple of years (or longer) to complete. Not only did these efforts rack up considerable time, costs and resources, but by the time the system actually went into use, it would already be two years old and in need of updating.

Thankfully, slowly but surely, the industry woke up to this thorny problem and set about rationalising the approach based on risk, and standardising the scope of the work required, so that these massive undertakings became less burdensome for everyone involved.

Having said that, no two electronic systems are alike, and even now it can be very easy to lose sight of the actual purpose of the exercise: namely, to show that the system is fit for its intended use.

Note – The words “fit,” “intended” and “use” each, in their own way, carry an awful lot of meaning in wider day-to-day system use – and I will come back to that later.

With this in mind, it is easy to understand management’s underlying distrust of CSV and a reluctance to invest in newer technologies. This may, in part, also have contributed to validation matters typically not being considered until quite late in a project (something that still happens today in a lot of organisations), or to an assumption that it will magically get done by… someone.

In some cases, CSV is not performed at all, and companies still rely on manual paper-based and spreadsheet processes to perform, monitor and record each and every activity.

It should be noted, however, that this situation can often be forced upon a company – especially a smaller one that does not have the capacity, knowledge or resources to validate its systems – even though ultimately doing so would prove a cost saving over the medium to longer term.

There are three main regulations in the industry that, from my perspective (being based in the UK), are most relevant: the UK’s GMP requirements, the EU’s EudraLex Volume 4 Annex 11 (Computerised Systems), and the US FDA’s 21 CFR Part 11 (Electronic Records; Electronic Signatures).

So, what do the regulations say? Bar a few wording differences, the main regulations pretty much state the same thing.

It is my feeling that they work in slightly different ways – namely that the UK and European guidelines start from system/application lifecycle management, whereas the US guidance starts from the records themselves, emphasising record integrity and control (note – I am reluctant to use the terms “top-down” and “bottom-up”, as these can have other meanings in IT system implementation).

Neither of these is incorrect by the way, and each leads to the same outcomes through similar processes.

As an interesting side note, many vendors will offer applications that are described as “21 CFR Part 11 Compliant” – but this does not mean the application is unsuitable for use outside of US markets. The term seems to be used across the global industry as a generic label for any system that meets healthcare companies’ regulatory requirements.

Often, challenging vendors by pointing out that you are not in the US and asking for confirmation of compliance with Annex 11 instead will result in “umms and ahhs” from the vendor representative (I will revisit the subject of vendors in a later article).

So other than pretty much stating WHAT must be done, the regulations do not really provide details on HOW to adhere to them. For that, we need the other industry standard guidelines, namely the PIC/S guide to Good Practices for Computerised Systems in Regulated “GxP” Environments and the ISPE’s GAMP 5 guide.

Of these, the former provides descriptions of the steps required to validate, and the latter a more detailed conceptualisation of those steps.

In many ways, and especially as the PIC/S guide is nearly 20 years old (albeit due to be updated soon), the industry tends to treat the GAMP 5 guidance (or “GAMP Guide” for short) as the de facto standard, and I will refer to this document throughout these blogs.1

Note – there are also a number of ISO standards that are important, and I will discuss those as and when they become relevant in the discussion.

What neither of the guideline documents does, however, is discuss the day-to-day reality of CSV: what to be mindful of, what works, what does not, and the pragmatic ways in which the effort can be minimised while guaranteeing compliance and maximising the outcomes.

2. Risk Based Assurance

Of equal weight in this discussion is keeping in mind that qualifying a system is ultimately about ensuring the system manages the residual risks to patient safety – very much the approach described in the UK, EU and ICH guidelines.

I have heard an anecdote about a major healthcare company that suffered significant delays to a system implementation because the system’s User Interface (UI) background and text fonts were not the standard company configurations originally stated in the User Requirements Specification (URS).

This forced the developer to go back and change everything, causing additional delays and costs.

Putting aside the fact that this issue should have been picked up very early with the vendor had good development practices been observed, the organisation had ultimately lost sight of the system’s purpose and ended up focussing on an irrelevance.

Therefore, to meet the regulatory requirements as a minimum, we need to focus more on what is important, following the principles of Quality Risk Management (and ultimately potential patient impacts) and focus less on… pretty much everything else.

That said, simple risk-based criteria can be used not only for defining the scope of GxP-related qualification activity focus areas, but also for defining the critical business criteria (see the Business Benefits section below).

Electronic systems have an advantage in this area over other types of validation, namely that typical risks inherent for one electronic system are almost exactly the same as the risks for pretty much every other electronic system.

Consequently, we can maintain a standard set of tests for these common risk areas, recorded as required and performed by default, without needing to work through the same risk criteria each and every time a new system or system update comes along.
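To make this concrete, here is a minimal, purely illustrative Python sketch of the idea: a standard test set that always applies for the common risk areas, combined with a simple severity × likelihood classification (in the spirit of Quality Risk Management) for whatever requirements are left. Every name, score and threshold here is an assumption for illustration only – nothing is prescribed by the regulations or by GAMP.

```python
from dataclasses import dataclass

# Common risk areas shared by most electronic systems; each maps to a
# standard test executed by default for every new system or update.
# (Illustrative names only - real test sets come from your own procedures.)
STANDARD_RISK_TESTS = {
    "access_control": "Verify user roles and permissions",
    "audit_trail": "Verify audit trail captures create/modify/delete",
    "backup_restore": "Verify backup and restore of records",
    "data_integrity": "Verify records are attributable and unalterable",
}

@dataclass
class Requirement:
    ref: str
    description: str
    severity: int    # 1 (low) .. 3 (high) potential patient/product impact
    likelihood: int  # 1 (low) .. 3 (high) chance of the failure occurring

def risk_priority(req: Requirement) -> str:
    """Simple severity x likelihood classification (illustrative thresholds)."""
    score = req.severity * req.likelihood
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

def plan_tests(requirements: list[Requirement]) -> dict:
    """Standard tests always apply; bespoke effort focuses on what is left."""
    plan = {"standard": list(STANDARD_RISK_TESTS.values()), "bespoke": []}
    for req in requirements:
        priority = risk_priority(req)
        if priority != "low":
            plan["bespoke"].append((req.ref, priority))
    return plan
```

The point of the sketch is the shape of the process, not the numbers: the standard set is recorded once and reused, so the risk assessment effort is spent only on the system-specific requirements that actually carry patient or business impact.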

I will go into these in much more detail in a forthcoming post on Risk Assessments.

Therefore, we only need to focus the risk assessment on what is left, and this can include the key non-GxP business processes.

3. Business Benefits

If we accept that the compliance box will be ticked and that the risk-based approach can reduce overall effort, we can then promote the business benefits of following a correct, controlled and validated lifecycle process.

The history of the healthcare industry is littered with delays, massive over-spending, and indeed complete failures when it comes to electronic system implementation. Talk to anyone who has been involved long enough in the industry and I can guarantee they will all be able to provide a first-hand example of such an experience (and probably more than one).

Failures can be due to a range (or combination) of different reasons, but they will almost always boil down to a small number of general categories. I will call these rather cornily:

The 4 Horsemen of the Validation Apocalypse

I will cover each in much more detail in later posts, but for now, I will give you an overview:


User Requirements need to be developed with the right people, and then they need to be locked down once agreed with no further changes allowed. Too often the requirements are vague, incomplete, incorrect in places, or indeed non-existent until a vendor writes them for the organisation (and this last one is all too common, unfortunately).

And then there is the issue of scope drift – where an initial URS is agreed upon and then someone inevitably comes along and says the dreaded words “Ooooh, can we get it to do this as well?”


Understandably, businesses need to make money to survive. And in general, the management of those businesses will work to the principle of “minimise costs and maximise profits.” Nothing wrong with that per se.

But management will often look only at the initial cost (i.e., the application/hosting) based on a small number of potential solutions, and make top-down decisions in isolation from those who need to be involved, namely the subject matter experts, IT professionals and, ultimately, the users.

There is also a very real danger that management will not understand or allocate the right resources – i.e., time, people and tools – as these are either not considered at all, treated as an unbudgeted afterthought, or represent additional costs they will not provide for fully.


Vendors can be your best friends or your worst enemies. A well-run supplier with good quality systems and an understanding of healthcare industry CSV requirements absolutely should be the first requirement for every new system.

This statement is so important I will state it a second time… A well-run supplier with good quality systems and an understanding of healthcare industry CSV requirements absolutely should be the first requirement for every new system. I wish I could italicise and make bold every word of that statement.

Unfortunately, the above is very often not the case.


For those of you reading this outside of the UK: there is a long-running television quiz show in the UK called Mastermind, in which contestants are asked questions for two minutes on a specialist subject of their choosing, followed by two minutes of general knowledge questions.

One of the things the programme is known for is that when the two minutes are up a bleeper goes off, and the quizmaster says the words “I’ve started so I’ll finish” – meaning the contestant gets the chance to answer the question that the bleeper cut off mid-flow.

And it is this idea of “I’ve started so I’ll finish” where the problems can come with electronic system implementation – i.e., teams will keep ploughing on with a project no matter how badly it is going, even when everyone involved knows full well it is doomed to continued problems, delays or outright failure.

The thing about these Horsemen is that they can ALL be avoided by proper lifecycle management and good validation processes. All of them, every time. 100% guaranteed.

And good practices can be applied not only to GxP systems but also to non-GxP systems – the potential pitfalls and tangible benefits remain the same.

That is not to say there will not be problems along the way – of course the unexpected will always happen – but the process will be controlled, and the impact of the 4 Horsemen understood and accounted for.

By minimising the potential for failure using good practice and quality risk management principles, the business, and ultimately patients, will always benefit.

So, to bring all this back to the initial question of “Why Validate?,” maybe the question should be “Why Would We Not Validate?”

In the next entry in the series, I will look at who needs to be involved with validation, right from the critical start of the process, through to the completion. Project and System Ownership can be a very real problem.

Footnote: 1 – For reference, I am not in any way affiliated with ISPE, and have not been part of the GAMP 5 guidelines development.

To find out how we can help your organisation with your Computer Systems Validation and Data Integrity needs please get in touch at consultancy@jensongroup.com or to read more about our CSV services, please click here.


Neil Rudd
November 30th, 2023