Continued Process Verification: Bioprocess’ Ship of Theseus

  • What is continued process verification?
  • Why should manufacturers establish continued process verification programs?
  • Points to consider for continued process verification
  • Workflow to set up a continued process verification program

There you stand: your bioprocess has been developed. Your DoEs have been executed and analyzed. Your process validation is done (three batches, most likely) and the report is out. The FDA has given you the go-ahead. You even have moderately data-based release criteria. Time to celebrate and leave the rest to the Quality and Operations departments?

Not quite.

[Image: Élisée Reclus, Public domain, via Wikimedia Commons]

Bioprocesses change over time. Improvement after improvement. Band-aid after band-aid. Shutdown after shutdown. GMP bioprocesses are a veritable Ship of Theseus, changing and renewing piece by piece: at what point is it still the same ship? And from the authorities’ perspective, how can we know that our process maintains its original validated state, even years after launch?

The FDA’s modern answer to this is Continued Process Verification (CPV), Stage 3 of the process validation methodology, which seeks to answer this question of continued control of the process. CPV is the last, lengthiest, and most underestimated part of Quality by Design (QbD) process development. It is often thrown in at the last second, understaffed by process engineering departments, and underappreciated by quality departments (at least until the Product Quality Review).

Fortunately, the groundwork for CPV is straightforward. At its most basic, we must continuously verify the CQAs, CPPs, and other relevant process information over time to ensure maintenance of the validated state. This includes not only checking acceptance criteria, but also carefully monitoring and trending these attributes and parameters using the correct statistical applications. And while most basic software can provide the groundwork for these approaches, there are endless opportunities to improve and develop new methods to ensure the robust continuation of your process’s validated state.

Where to look first? First & foremost, the FDA Validation Guidelines give us the main rundown:

“Process Validation is defined as the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality product.” (FDA process validation guideline)

Otherwise stated, CPV is a framework for collecting and analyzing data to ensure that drug manufacturing processes remain in a constant state of control. This state of control is very similar to the goal of process validation itself (proving control of the process), which is why the FDA has included it in its overall validation approach.

“An ongoing program to collect and analyze product and process data that relate to product quality must be established (§ 211.180(e)). The data collected should include relevant process trends and quality of incoming materials or components, in-process material, and finished products.” (FDA process validation guideline)

There are also numerous papers online describing approaches to CPV, as well as a couple of well-executed case studies.

As a quick recap of the overall validation strategy: validation is put into practice in three stages.

CPV is the hand-off from process characterization (Stage 1) and performance qualification (Stage 2). However, this is not an entirely linear scheme: information from both Stage 1 and Stage 2 can and should be included in CPV (more on that later).

What is CPV practically?

CPV can take a number of forms, but at its core it is a study plan with a list of parameters and attributes that must be regularly controlled and observed throughout the lifecycle of the GMP bioprocess. Acceptance-criteria excursions, biases, and trends must be reviewed at regular intervals with experts from engineering, manufacturing, and quality to ensure that the process stays under control, in as close to real time as possible.

Most commonly, these parameters and attributes are observed in a control chart over intervals of time that correspond best to the frequency of manufacturing (not too often, nor too seldom).

[Figure: Example of a control chart in Exputec inCyght® Bioprocess Software]
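To illustrate how such chart limits are typically derived, here is a minimal sketch (not Exputec-specific code) of individuals-chart limits using the standard moving-range estimate of sigma; the titre values below are hypothetical:

```python
import statistics

def i_chart_limits(values, d2=1.128):
    """Centre line and 3-sigma limits for an individuals (I) chart.

    Sigma is estimated from the average moving range divided by the
    d2 constant for subgroups of size 2 (1.128), the usual I-MR approach.
    """
    center = statistics.fmean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = statistics.fmean(moving_ranges) / d2
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Hypothetical product titres (g/L) from consecutive commercial batches
titres = [5.1, 5.3, 4.9, 5.2, 5.0, 5.4, 5.1, 4.8, 5.2, 5.0]
lcl, cl, ucl = i_chart_limits(titres)
```

Each new batch value is then plotted against these limits; any point outside them is a signal worth investigating.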

Acceptance-criteria issues are the most visible here, but any number of warning signs can be observed. For example, the Western Electric rules provide a comprehensive guide to detecting different trends and biases.

More recently, trending of Cpk values (process capability indices) has become common. This is effectively monitoring the probabilistic failure rate of your quality attributes based on their distribution and your specifications.
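As a rough sketch of the underlying calculation (the purity values and specification limits below are hypothetical):

```python
import statistics

def cpk(values, lsl, usl):
    """Cpk: distance from the process mean to the nearer specification
    limit, expressed in units of three sample standard deviations."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sd)

# Hypothetical purity results (%) against specification limits of 95-100 %
purity = [97.0, 97.5, 96.8, 97.2, 97.4, 97.1]
capability = cpk(purity, lsl=95.0, usl=100.0)
```

A Cpk around 1.33 or higher is commonly read as a capable process; trending the value batch over batch shows whether that capability is eroding.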

Fundamentally, CPV must be customized to your process, but there are a number of helpful documents and case studies to assist you. Also, Exputec is happy to assist if you’re looking for some help.

Why should manufacturers establish continued process verification programs?

Reason 1: Achieve compliance. Not to be too subtle here, but you have to. The FDA requires it and will issue 483s if there is insufficient proof of adequate monitoring of bioprocesses. As recently as 2014-2015, this was the 10th most common 483 observation. Since the FDA sees this as part of validation, they will not skimp on reviewing your CPV plan at the GMP audit.

Reason 2: Avoid discarded batches. Failed commercial batches in biologics manufacturing cost companies millions of dollars. A sound continued process verification program with the right control strategy in place can prevent failed batches in commercial manufacturing. Even when a full batch cannot be saved, being able to abort in order to investigate and solve the problem before continuing production can be a huge benefit to the process and to overall fulfillment.

Reason 3: Mitigate vulnerabilities. A CPV plan detects long-term trends and biases early. Warning limits, trend-detection software, and even multivariate control of process behavior can alert engineers to problems long before the first OOS result, whether from degrading raw materials (e.g. gels or media) or cell-line degradation over generations.

Points to consider for continued process verification

  • The sources of the data to be used in CPV

As stated before, all prior information is helpful in establishing a CPV plan. The most common sources, however, are:

  1. Process development data: Basic information about the behavior of the process. Read more here: “Best practices for bioprocess data analytics”.
  2. Process characterization data: Advanced process models to determine the statistical sophistication of the CPV plan based on process behavior. Read more here: “What is a bioprocess scale down model” and “What is process characterization?”
  3. Risk assessments: Which parameters and attributes (CPPs, CQAs, etc.) are critical and therefore must be monitored.
  4. Commercial-scale batches and process: Scale up the models and determine the frequency of production, which will in turn determine the frequency of monitoring.
  5. PPQ data: Verification of process knowledge and determination of acceptance criteria.
  • Frequency of Monitoring

The production schedule will determine how often the process should be monitored. A process running twice a year will not need to be monitored weekly; a process running 100 times per year cannot afford to wait until the quarterly quality review. Finding the perfect fit may be difficult, but it will set the balance between overreacting and underreacting to potential issues.

  • Statistical sophistication required

Your data science group may find it fun to perform multivariate data analysis on every feasible parameter, but do you need this? Is a simple control chart enough? Do you have a large enough data set to employ the powerful but information-hungry Cpk value? These questions are important. Bottom line: do not assume that more sophistication is automatically better.

  • What are your limits and what limits matter?

Example: Imagine you decide to activate all possible warning limits on each CQA (specifically, all of the Western Electric rules). Within 2-3 weeks, your engineers will have impossibly high stacks of investigations sitting on their desks, most of whose underlying signals were likely caused by natural variation. In frustration, you might cancel the program entirely and rebuild it from the ground up, with a harsh but useful lesson learned: overly conservative monitoring does not mean tighter control of the process. Finding the right balance between seeing no signals and seeing too many is a tightrope walk, and the stakes are just as high.
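To make the trade-off concrete, here is a minimal sketch of two of the Western Electric rules; the function name and readings are illustrative, not a production implementation:

```python
def western_electric_alarms(values, center, sigma):
    """Flag indices violating two of the Western Electric rules:
    Rule 1: one point beyond 3 sigma of the centre line.
    Rule 4: eight consecutive points on the same side of the centre line.
    """
    alarms = set()
    for i, v in enumerate(values):
        if abs(v - center) > 3 * sigma:
            alarms.add(i)          # Rule 1 violation
        if i >= 7:
            window = values[i - 7:i + 1]
            if all(w > center for w in window) or all(w < center for w in window):
                alarms.add(i)      # Rule 4 violation
    return sorted(alarms)

# One outlier, followed by a sustained run above the centre line
readings = [-3.5, 0.2, 0.1, 0.3, 0.4, 0.2, 0.5, 0.1, 0.6]
alarms = western_electric_alarms(readings, center=0.0, sigma=1.0)
# Flags index 0 (beyond 3 sigma) and index 8 (eight points above centre)
```

Note that the run rule fires on the second example even though every individual point is well within limits; enabling all eight rules at once multiplies such signals, which is exactly how investigation stacks pile up.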

Basic thoughts on a workflow to set up a continued process verification program


“We recommend that a statistician or person with adequate training in statistical process control techniques develop the data collection plan and statistical methods and procedures used in measuring and evaluating process stability and process capability.” (FDA process validation guideline)

We would go even further.

Process engineers, data scientists, manufacturing, and quality should all be involved in setting up a good CPV plan. While the statistician can help establish the rules, process understanding and quality requirements come from all over the plant. Each of these departments contributes, and without them key points may be overlooked.


  • Establish the parameters: Start with the CQAs; these are the closest to sure things. Then move on to those CPPs which may be monitored and can indicate variation in the process. Add to this the step yields, cell counts, and other in-process performance indicators that do not directly impact quality, but strongly imply whether the process is performing adequately. Finally, add attributes of the raw materials which can bring insight into upcoming shifts and trends.
  • Establish the tools: What monitoring program and software do you have available? Do you have graph paper and a guy with a calculator? Or do you have a fully automated data platform, effortlessly performing multivariate data analysis and sparse-recovery regressions?
    Our recommendation: only monitor attributes where the data is present in electronic form – and make sure your critical attributes exist in this form. Your engineers will go mad collecting data if they have to search batch records every time. Get your data into an electronic source as quickly as possible. Only then do you have the luxury of considering your statistical software choices. We recommend inCyght for both.
  • Establish the statistical approach: lastly, determine which stat applications you will use. Don’t be afraid to ask for help from your data science or stats group. And if not, call us. We are available and happy to help!
  • Implement the plan and ensure participation: Make sure the right people are actually performing the monitoring plan regularly. Like diet and exercise, CPV works best as a routine lifestyle, not as a few weeks of effort after Christmas. Your quality department may not love it at first, but when the annual product review’s technical chapters are neatly delivered with minimal effort, they will thank you.
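The steps above can be sketched as a simple plan structure; all names, data sources, and limits below are hypothetical placeholders, not a prescribed schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MonitoredAttribute:
    name: str
    source: str                      # e.g. "LIMS", "process historian"
    lsl: Optional[float] = None      # lower specification limit, if any
    usl: Optional[float] = None      # upper specification limit, if any
    rules: List[str] = field(default_factory=lambda: ["WE-1"])

@dataclass
class CPVPlan:
    product: str
    review_frequency: str            # matched to the manufacturing cadence
    attributes: List[MonitoredAttribute] = field(default_factory=list)

# Hypothetical plan: CQAs first, then performance indicators
plan = CPVPlan(
    product="mAb-X",
    review_frequency="monthly",
    attributes=[
        MonitoredAttribute("titre_g_per_l", "process historian", lsl=4.0),
        MonitoredAttribute("purity_pct", "LIMS", lsl=95.0, usl=100.0),
    ],
)
```

Keeping the plan in a structured, electronic form like this is what makes the later steps (automated charting, rule evaluation, review reports) routine rather than heroic.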


When should you start?

Yesterday! It is truly never too early to be thinking about monitoring. Have your developers already thinking about sensor and lab technology that will deliver fast electronic data with minimal variation. And have your engineers ready to go, so that they are monitoring from day one.

Data integrity: well-validated data management and statistical tools

For more information on implementation strategies for bioprocess data management and statistical software, see our blogpost on bioprocess software implementation strategies.

How does Exputec support leading biopharmaceutical companies to realize continued process verification?

Exputec provides statistical consulting and software to solve challenges for biopharmaceutical companies.

Exputec consulting and statistical services streamline process validation through all stages and smooth interactions with regulatory authorities such as the FDA and EMA by adopting statistical best practices.

inCyght®, Part 11 compliant data management and data analytics software, manages the constant stream of process and quality data in one intuitive software environment. inCyght is a one-stop shop for setting up CPV efficiently and in line with current regulatory best practices.