Method Validation in Pharmaceutical Analysis

In 2002, the FDA began an initiative entitled “Pharmaceutical Quality for the 21st Century.” This initiative identified a number of problems in the pharmaceutical industry: pharmaceutical manufacturing processes often had low efficiencies in comparison to other industry sectors, with significant levels of waste and rework; the reasons for manufacturing failures were not always understood; the uptake of new technologies was slower than in other sectors; and manufacturing cycle times and costs were high. In September 2004, the FDA published a report, “Pharmaceutical cGMPs for the 21st Century – A Risk-Based Approach,” which made a series of recommendations aimed at encouraging the early adoption of new technological advances, facilitating the application of modern quality management techniques, encouraging the adoption of risk-based approaches, and ensuring that regulatory review and inspection policies were consistent, coordinated, and based on state-of-the-art pharmaceutical science. In October 2005, Janet Woodcock of the FDA described the desired state of the pharmaceutical industry as a maximally efficient, agile, flexible pharmaceutical manufacturing sector that reliably produces high-quality drug products without extensive regulatory oversight. Between 2005 and 2012, the International Conference on Harmonisation (ICH) developed a series of guidances (ICH Q8, Q9, Q10, and Q11) intended to modernize the pharmaceutical industry's approach to quality management and to embed more scientific and risk-based approaches to pharmaceutical development and manufacturing. This new paradigm was based on a philosophy of “Quality by Design” (QbD). ICH Q8, Q9, Q10, and Q11 described how systematic approaches to process understanding and control of risk, coupled with the implementation of effective quality management systems, could deliver more robust manufacturing processes.
A critical enabler to ensuring that manufacturing processes consistently produce products that are fit for patients and consumers is the analytical data that allows an understanding of the process and confirms the quality of the product produced. Many of the problems and issues with pharmaceutical manufacturing processes uncovered via the FDA's “Pharmaceutical Quality for the 21st Century” initiative were also true for the analytical methods used by the industry. Uptake of new analytical technologies was slow, repeat occurrences of out-of-specification results due to laboratory errors were common, and levels of waste and rework were high. Clearly, analytical testing is simply a “process” in the same way that manufacturing is a process – the difference being that the output of a manufacturing process is a product, while the output from an analytical measurement is data. It follows, therefore, that it should be possible to apply the QbD principles described in the ICH Q8–Q11 guidances to enhance the understanding, control, and performance of analytical methods.
In the second edition of Method Validation in Pharmaceutical Analysis, the editors have included chapters written by subject matter experts that illustrate how QbD principles can be applied to analytical methods. These include the following: how an analytical target profile (ATP) can be established to predefine the objectives for the quality of the data that the method is required to produce (which parallels the concept of a quality target product profile, QTPP, used to define the quality of product a manufacturing process needs to produce); how the lifecycle approach to process validation developed for manufacturing processes can also be applied to analytical methods; and how effective change and knowledge management processes throughout the lifecycle are equally important for analytical methods as they are for manufacturing processes.
The concepts described in this book reflect modern quality management practices and include approaches used widely in other industries (e.g., measurement uncertainty). The establishment of “fit-for-purpose” criteria in an ATP will facilitate a more scientific and risk-based approach to method validation activities, ensuring efficient use of resources focused on the areas of highest risk, and will bring the pharmaceutical industry in line with other science-based industries. Ultimately, this will help promote regulatory as well as business excellence and public health through better understanding and control of the measurement of the quality of pharmaceutical products.

Analytical Validation within the Pharmaceutical Lifecycle

The concept of validation in the pharmaceutical industry was first proposed by two Food and Drug Administration (FDA) officials, Ted Byers and Bud Loftus, in the mid-1970s in order to improve the quality of pharmaceutical products [1]. Validation of processes is now a regulatory requirement and is described in general and specific terms in the FDA's Code of Federal Regulations – 21 CFR Parts 210 and 211 – as well as in the EMA's Good Manufacturing Practices (GMP) Guide, Annex 15. The 1987 FDA guide to process validation [2] defined validation as “establishing documented evidence that provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes.” While the first validation activities were focused on the processes involved in making pharmaceutical products, the concept of validation quickly spread to associated processes, including the analytical methods used to test the products.
Regulatory guidance on how analytical methods should be validated has also existed for some time [3]; however, it was not until the establishment of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) in 1990 that there was a forum for dialogue between regulatory authorities and industry, and one of the first topics addressed within the Quality section was analytical procedure validation. The ICH was very helpful in harmonizing terms and definitions [4a] as well as in determining the basic requirements [4b]. Of course, due to the nature of the harmonization process, there were some compromises and inconsistencies.
Table 1.1 shows the ICH view on the required validation characteristics for the various types of analytical procedures.
The recognition that the pharmaceutical industry's manufacturing performance was not as state of the art as that of other industries [5–7] has resulted in unprecedented efforts over the last 15 years to modernize pharmaceutical development and manufacturing. In August 2002, the FDA announced a significant new initiative, “Pharmaceutical Quality for the 21st Century.”

Analytical Instrument Qualification

Results generated using analytical procedures provide the basis for key decisions regarding compliance with regulatory, compendial, and manufacturing limits. A high degree of confidence is needed that the analytical procedure will generate reportable results that meet requirements under all conditions of use as the procedure progresses through the life cycle. Application of quality risk management (QRM) concepts and tools (International Conference on Harmonisation, ICH, Q9) can be useful in providing a mechanism for achieving this. The analytical laboratory may be seen as a manufacturing process converting samples into information. This conversion process may be illustrated as the data-to-information transformation shown in Figure 2.1. Assuming sample relevance, the conversion rests upon data integrity. Data integrity is predicated upon the assurance that the instruments and systems employed as part of the analytical procedure are in a state of control. A state of control is established for instruments by calibration and qualification activities, and for software applications by validation.

Establishment of Measurement Requirements - Analytical Target Profile and Decision Rules

ICH Q8, Q9, and Q10 guidance documents describe concepts for obtaining process understanding. This is done through adopting a systematic and scientific approach to development and then implementing controls based on this enhanced understanding. An analytical procedure can be considered a process, and these concepts can be applied to it. The aim is to deliver an analytical procedure with improved robustness in comparison to traditional approaches. The analytical procedure is then directly aligned with and fully supports the intended use of the reported result.
An integrated quality by design (QbD) approach includes defining the requirements for an analytical procedure based on the intended use of the reported result produced by the procedure; defining the critical quality attribute to be controlled; developing and validating the procedure accordingly; and keeping it suitable throughout its lifecycle [1].
In line with the concept, described in ICH Q8, of defining a quality target product profile as the starting point for developing a process using QbD principles, the establishment of the requirements for the performance of an analytical procedure is the first step of the analytical lifecycle (see Figure 1.2). The required performance is defined in an analytical target profile (ATP). The ATP describes the target measurement uncertainty, which is the maximum acceptable uncertainty in the reportable result that must be achieved by the analytical procedure. The ATP is focused on defining the acceptable quality of the reportable result and is independent of any specific analytical procedure. This chapter will describe decision rules, discuss the target measurement uncertainty that becomes part of the ATP, and draw on approaches described in consensus standards documents such as ASTM [2], Eurachem guidance [3], the Guide to the Expression of Uncertainty in Measurement (GUM) [4], and ASME [5], as well as approaches described in the pharmacopoeias, such as the performance-based concept in the USP's (United States Pharmacopoeia's) medicines compendia or the guidance on validation of bioassay methods in the USP–NF (The National Formulary).
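A decision rule of the kind described in the Eurachem and ASME guidance can be sketched in a few lines. The example below is purely illustrative and not taken from the book: it applies a simple guard-banded acceptance rule, tightening a two-sided specification by the expanded measurement uncertainty U so that a result is accepted only when conformity is demonstrated with high confidence. The function name and the numeric limits are assumptions.

```python
# Hypothetical sketch of a guard-banded decision rule, assuming a
# two-sided specification and a known expanded uncertainty U.
def accept(result: float, lower: float, upper: float, expanded_u: float) -> bool:
    """Accept only if the result lies inside the specification limits
    tightened by the expanded uncertainty (the guard band)."""
    return (lower + expanded_u) <= result <= (upper - expanded_u)

# Illustrative assay specification 95.0-105.0 % label claim, U = 1.0 %
print(accept(100.2, 95.0, 105.0, 1.0))  # True  (well inside the guarded zone)
print(accept(104.5, 95.0, 105.0, 1.0))  # False (inside spec, but within the guard band)
```

A stricter or more permissive rule simply changes how the guard band is constructed; the key point is that the decision rule, and hence the target measurement uncertainty, must be fixed before the procedure is developed.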

Establishment of Measurement Requirements - Performance-Based Specifications

The majority of the chapters in this book provide insight into the active validation of an analytical procedure. This chapter focuses on what should be considered before formal validation begins, and whether it is needed at all. The focus here is not upon the act of validation, but rather on the determination of how a procedure should be validated. Recent publications suggest that formal validation is an ongoing, lifecycle-long effort [1]. The lifecycle approach requires the definition of an analytical target profile (ATP). The measurement system described in this chapter may be considered a generic ATP. While this approach holds tremendous promise, it cannot be implemented without a greater knowledge of what a valid procedure is and how it can be measured. Further, it relies upon the analyst to critically develop the measurement requirements to support the approach. It requires statistical analyses and out-of-the-box thinking. This chapter focuses on a series of approaches to define these measurement requirements. These are not absolute requirements, but they form a strong foundation on which to build one's lifecycle approaches. If, however, a user needs to consider validation as a one-off activity, then the approaches included herein provide a novel and effective approach that will lead to greater confidence in the capabilities of the procedure.
Validation is a term of art that is open to interpretation by the user. Terms of art are extremely helpful in discussions among peers having similar perspectives, but they require careful definition when used with a general audience. For the purposes of this chapter, the term “validation” will have the following definition: “a rigorously controlled series of experiments designed to demonstrate that a given measurement procedure will produce data that is fit for its intended purpose.” This definition, or variations of it, has been presented previously, and the last half of the definition (data … purpose) is the most critical for this discussion.

Method Performance Characteristics

The following sections discuss parameters and calculations that describe the performance of analytical procedures. Terminology and orientation with respect to the (validation) characteristics are taken from the ICH validation guideline [1], but they are integrated into the modern lifecycle concept. The selection and discussion of these parameters and calculations reflect the experience of the author and are primarily based on practical considerations. Their relevance will vary with the individual analytical application and with the issue to be investigated. This chapter is not intended to replace statistical textbooks, but the author has tried to provide sufficient background information – always with the practical analytical application in mind – in order to make it easier for the reader to decide which parameters and tests are relevant and useful in his/her specific case.
According to the analytical target profile (ATP) concept, the performance criteria which are relevant for the intended measurement of the respective critical quality attribute (such as the content of the active, an impurity, or water in a specified material) are defined a priori, that is, from the objective of the measurement and (as far as possible) independent of any specific analytical procedure (see Chapter 3). Consequently, two categories of performance characteristics can be defined:

Method Design and Understanding

Once the business purpose of a measurement has been defined, together with the specific performance criteria, in an analytical target profile (ATP, as discussed in Chapters 1 and 3), subsequent activities involve selection of a suitable analytical measurement technique followed by development and optimization of method conditions. The goal of this process is a well-understood method that is adequately controlled and capable of reliably producing data compliant with the ATP in any applicable environment during the product lifecycle. This process has also been documented extensively elsewhere [1–5].
For the purposes of this chapter, the early and late product development phases leading into the commercial environment will be the context for the strategies proposed for method selection, development, and optimization. During these drug development phases, analytical methods can change considerably, and so method selection, development, and optimization activities should never be viewed as a one-time activity but rather as an ongoing process. Each time a change occurs – for example, when the synthetic route or formulation process is modified, or when new knowledge is gained from development work (e.g., stability studies) or manufacturing campaigns – an assessment needs to be made to evaluate the potential impact of the change on the capability of the method to produce data compliant with its performance requirements. The outcome of such assessments may result in further experimentation and optimization of the method and/or changes to the target method conditions or method operating ranges. As such, application of quality by design (QbD) tools in concert with an appropriate knowledge management infrastructure is key to ensuring optimum analytical resource efficiency. It is imperative, therefore, that knowledge is captured dynamically through this process in a vehicle that is accessible across organizational boundaries.

Method Performance Qualification

During Stage 1, critical method variables are identified and optimized, using risk management tools and experimental investigations such as design of experiments (DOE), and their acceptable ranges are defined (see Chapter 6). These are initial inputs to the analytical control strategy, which plays a key role in ensuring that the analytical target profile (ATP) is realized throughout the lifecycle.
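As a minimal illustration of the DOE idea, the sketch below enumerates a two-level full-factorial design for three hypothetical chromatographic method variables. The factor names and levels are invented for illustration and are not taken from the text; a real Stage 1 study would choose factors from the risk assessment and would usually use a dedicated DOE package.

```python
# Illustrative sketch only: a two-level full-factorial design for three
# hypothetical method variables (names and levels are assumptions).
from itertools import product

factors = {
    "column_temp_C": (25, 40),
    "flow_mL_min": (0.8, 1.2),
    "mobile_phase_pH": (2.5, 3.5),
}

# Every combination of low/high levels: 2**3 = 8 experimental runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(runs))  # 8
for run in runs:
    print(run)
```

Each run would then be executed and a response (e.g., resolution or recovery) modeled against the factors to locate acceptable operating ranges.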
The objective of Stage 2, Method Performance Qualification, is to confirm that the analytical procedure is capable of delivering reportable results that consistently meet the performance criteria defined in the ATP when operated in its routine environment. This qualification must be performed before any “productive,” that is, Good Manufacturing Practice (GMP), application of the analytical procedure, such as process attribute monitoring, batch release, stability studies, and so on. These applications may occur in the same laboratory where the analytical procedure was developed (e.g., R&D, contract laboratory) or in another laboratory (e.g., quality control (QC)). Thus, Stage 2 can be compared to the traditional validation exercise, but it also includes other activities needed to implement a new analytical procedure in a laboratory, such as implementation of compendial procedures (see Section 7.4) or traditional transfer (see Section 7.5). However, the stage approach better reflects the iterative nature of the whole process. The qualification report, which approves the suitability of the analytical procedure for routine use, compiles information from Stage 1, such as the establishment of the Method Design Space (or Method Operable Design Region) and parts of the analytical control strategy, as well as from Stage 2, where the format of the reportable result is defined and the analytical control strategy is finalized. All these activities follow the predefined objectives of the measurement requirements, that is, the ATP.
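One way the format of the reportable result connects to the ATP can be shown with a standard result: if the reportable result is defined as the mean of n independent determinations, its standard uncertainty shrinks with the square root of n. The sketch below is a minimal illustration under that assumption; the function name and the numeric values are invented.

```python
# Minimal sketch: standard uncertainty of a reportable result defined as
# the mean of n independent determinations (standard statistical result,
# not a prescription from the text).
import math

def uncertainty_of_mean(single_det_sd: float, n: int) -> float:
    """Standard uncertainty of the mean of n independent determinations."""
    return single_det_sd / math.sqrt(n)

sd = 1.2  # illustrative standard deviation of a single determination (% of label)
for n in (1, 2, 3, 6):
    print(n, round(uncertainty_of_mean(sd, n), 2))
```

Choosing n so that the resulting uncertainty falls below the ATP's target measurement uncertainty is one simple way of fixing the reportable-result format during Stage 2.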

Continued Method Performance Verification

Analytical procedures used in the routine analysis of pharmaceuticals are a critical element of the overall quality assurance system that ensures patients receive products that are safe and efficacious. It is essential, therefore, that the data generated by an analytical procedure is fit for its intended purpose at all times during its lifecycle of use. In order to achieve this, it is important to have mechanisms for monitoring and controlling the performance of the procedure during its routine use, as well as systems for detecting and addressing unplanned departures from the analytical procedure as designed. This requires the effective implementation of systems for
routine monitoring of the performance of the analytical procedure
investigating and addressing aberrant data 
controlling changes made to the analytical procedure.
This chapter describes practices that can be adopted in each of these three areas.
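Routine monitoring of the first kind is often implemented with control charts. The sketch below applies Shewhart-style 3-sigma limits to a series of hypothetical system-suitability results; the data, limits, and variable names are invented for illustration and are not drawn from the text.

```python
# Hedged sketch of routine performance monitoring: Shewhart-style 3-sigma
# control limits applied to invented system-suitability results.
import statistics

def control_limits(history):
    """Lower and upper 3-sigma control limits from historical results."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)
    return mean - 3 * sd, mean + 3 * sd

history = [99.8, 100.1, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1]
lcl, ucl = control_limits(history)

new_result = 101.9  # deliberately aberrant value for the example
if not (lcl <= new_result <= ucl):
    print(f"Out of control: {new_result} outside ({lcl:.2f}, {ucl:.2f})")
```

A point outside the limits would trigger the second system above, an investigation of the aberrant result, and possibly the third, a controlled change to the procedure.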

