Data integrity is currently the hottest topic in the pharmaceutical industry and will remain so for the foreseeable future. Since the start of 2015, regulatory authorities around the world have issued a number of guidance documents on data integrity and data governance, and two industry guidance documents have also been published. However, these documents are vague and most do not contain detailed examples or advice to help regulated laboratories implement policies, procedures and processes to ensure integrity: they outline what needs to be done, but often in insufficient detail for effective implementation. Nor has there been a detailed focus on data integrity from the perspective of a regulated analytical laboratory. Hence the rationale for writing this book.

The aim of this book is to provide practical and detailed advice on how to implement data integrity and data governance for regulated analytical laboratories working in the pharmaceutical and allied industries. Although the main thrust of the book is for chemical laboratories some microbiological analysis will also be discussed.

This book is written for analytical scientists, laboratory managers and supervisors and quality assurance personnel working in regulated laboratories in and for the pharmaceutical industry and who are involved with data integrity and data governance programmes. Where networked systems are discussed IT professionals may also find useful information here.

This book comprises 24 chapters divided into five sections, as follows:

  1. How to Use this Book is covered in this chapter.

  2. The Regulatory Environment is discussed in Chapters 2 and 3.

  3. Data Governance is presented and explained in Chapters 4 to 10 as well as Chapter 19.

  4. Data Integrity is covered in Chapters 11 to 18.

  5. Quality Assurance Oversight and Outsourcing are discussed in Chapters 20–24.

The detailed structure of the book and the constituent chapters are shown in Figure 1.1 and discussed in more detail in the following sections of this chapter. It is important to understand that data integrity and data governance is a complex subject that is not just focused on the accuracy, completeness and correctness of numbers generated in a regulated laboratory.

Figure 1.1

Outline structure of this book.


The majority of chapters in this book are structured and written in the same way:

  • A chapter starts with a brief overview of why the chapter is important within the overall context of data integrity and data governance.

  • This is followed by a section on the regulatory requirements or regulatory guidance relevant to the chapter, thereby establishing the regulatory rationale for its topic.

  • Where appropriate, there is also the business rationale for the tasks contained in the chapter.

  • Then there is a discussion of how to achieve the objective of each chapter. For example, if you are assessing a process or a computerised system the chapter discusses how this can be achieved and how to avoid some of the common pitfalls.

  • Each chapter finishes with a list of the references used.

As many people working in the pharmaceutical industry do not read the applicable regulations or guidance documents, the intention of this approach is to put the regulatory and business rationale for performing a task at the reader's fingertips. It also allows an individual chapter to stand alone if a quick reference to a specific data integrity or data governance topic is all that is required. Overall, the aim is to give any reader the practical basis and confidence to implement or perform any of the topics covered by this book.

This topic is covered in two introductory chapters.

  • Chapter 2, entitled How Did We Get Here?, provides the background to data integrity in regulated laboratories of the pharmaceutical industry over the past quarter of a century. The story starts with the Barr Laboratories court case and ruling in 1993 1–3  and then moves on to the Able Laboratories fraud case in 2005.4  The latter case triggered regulatory authorities to change their inspection focus from paper to electronic records and consequently they discovered many more cases of data falsification and poor data management practices in companies throughout the world. The key compliance issues, derived mainly from FDA warning letters for paper records and computerised systems, are used to highlight areas of regulatory concern for laboratory operations.

  • Chapter 3, The Regulators' Responses, outlines how the various regulatory agencies have responded to the increased occurrence of data falsification and poor data integrity practices by issuing guidance documents and, where necessary, updating regulations. This chapter looks at the various guidance documents issued. The first was the FDA's Inspection of Pharmaceutical Quality Control Laboratories, issued in 1993 following the Barr Laboratories case.5  Since 2015, many data integrity guidance documents have been issued by regulatory agencies and industry bodies.6–12 

Data governance is a relatively new concept to the pharmaceutical industry and the term comprises several interlinked topics that are discussed over the course of eight chapters.

  • Chapter 4, entitled What is Data Governance?, sets out the strands of data governance that are discussed in more detail in the rest of the book. Data governance is not new and has been used outside the pharmaceutical industry for at least 15 years. We will compare and contrast DG definitions from inside and outside the industry. From the data governance definitions and input from the various regulatory and industry guidance documents we will draw out the key features of a programme of work: roles and responsibilities, DI policy and training, culture and ethics, open and honest approach to work as examples of a corporate DI policy. Are there areas in an organisation, e.g. research, where data integrity should not apply?

  • Chapter 5 presents a Data Integrity Model13  consisting of four layers that describes data integrity within a pharmaceutical company from a GMP perspective covering production, quality control and quality assurance; however, this book focuses on an analytical laboratory with quality assurance oversight. The four layers of the Data Integrity Model comprise:

    • Foundation: data governance and setting the right culture and ethos for data integrity.

    • Level 1: Ensuring the right system and right instrument for the job.

    • Level 2: Developing and validating the right analytical procedure for the job.

    • Level 3: Applying all lower layers of the model to ensure the right approach for the right reportable result.

The Data Integrity Model will be used throughout this book to demonstrate how the layers of the model interact together.

In addition, there is a comparison of the Data Management Maturity (DMM) Model from CMMI Institute14  and Data Integrity Maturity Model from GAMP Guide on Records and Data Integrity.15 

  • Chapter 6 focuses on all those involved with data governance and outlines the key Roles and Responsibilities of a Data Governance Programme. Here, the various roles, from the boardroom to the laboratory bench, are presented and the responsibilities of each one discussed. The roles of process owner and system owner presented in EU GMP Annex 11 16  will be mapped to the data governance roles so that the roles are aligned with the regulations. We will also see how a corporate data integrity programme impacts a regulated laboratory.

  • Chapter 7 discusses Data Integrity Policies, Procedures and Training. Data integrity policies and associated procedures must be present throughout an organisation. These range from an overall data integrity policy to good documentation practices for paper and computerised systems, interpretation of laboratory data and second person review. Coupled with the policies and procedures, there must be effective staff training and, where necessary, evidence of the effectiveness of that training. Typical poor documentation practices and record keeping failures are highlighted.

  • Chapter 8 covers the necessity of Establishing and Maintaining an Open Culture for Data Integrity. It explores what an open culture is, as well as the role of senior and laboratory management in setting expectations and maintaining a culture that ensures data integrity. This includes ways of balancing workload pressure that could otherwise result in staff cutting corners and compromising data integrity. This is the most difficult part of the Data Integrity Model, as it requires management leadership and constant nurturing.

  • Chapter 9 presents An Analytical Data Life Cycle from acquisition through processing, retention and finally destruction at the end of the retention period. The currently published data life cycles are inadequate because they are generic and do not cover analytical procedures in any detail. Presented here is a flexible analytical data life cycle, from sampling to reporting, that can adapt to any analysis from simple to complex. There are potential problems, as some analyses do not generate the objective evidence that is essential to demonstrate that an activity took place and to enable second person review. The key phases of the life cycle are sampling, sample preparation, acquisition and interpretation, and these are where poor practices and falsification can occur, often without detection. To reiterate, an analytical data life cycle model needs to be flexible, as not all analytical procedures require all phases.

  • Chapter 10 concerns the Assessment and Remediation of Laboratory Processes and Systems. Management needs to have inventories of computerised systems and paper processes available and to have them assessed for data integrity risks. The priority of assessment should be based on the criticality of the process and the records produced in it. Data process mapping is one methodology for assessing a process or system that highlights the data generated and the record vulnerabilities for paper processes and for hybrid and electronic systems. From this will come short-term quick wins for immediate implementation as well as long-term solution options offering business benefit.

  • Chapter 19 focuses on Quality Metrics for Data Integrity. Although placed towards the back of the book, this is a data governance topic, as the metrics are a major input into the management review of the data governance of an organisation. It is important to understand the data integrity issues before developing metrics to monitor the programmes. There are two main areas for quality metrics: the first is assessing the progress of the overall data integrity assessment and remediation programme; the second is monitoring day-to-day routine analysis to identify areas of concern, setting key performance indicators (KPIs) that may trigger further investigation of a process or system.
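The day-to-day KPI idea in the bullet above can be sketched in a few lines of code. This is a hypothetical illustration only: the record structure, field names and the 5% trigger level are assumptions made for the example, not values taken from this book or from any guidance document.

```python
from dataclasses import dataclass

@dataclass
class RunRecord:
    """Hypothetical record of one reportable result."""
    sample_id: str
    within_spec: bool  # False means an out of specification (OOS) result

def oos_rate(records: list[RunRecord]) -> float:
    """Fraction of reportable results that are out of specification."""
    if not records:
        return 0.0
    return sum(1 for r in records if not r.within_spec) / len(records)

# Assumed KPI trigger: flag for investigation if more than 5% of results are OOS.
OOS_KPI_THRESHOLD = 0.05

records = [
    RunRecord("S-001", True),
    RunRecord("S-002", True),
    RunRecord("S-003", False),
    RunRecord("S-004", True),
]
needs_investigation = oos_rate(records) > OOS_KPI_THRESHOLD
```

In practice such a metric would be trended over time and fed into the management review discussed above, rather than evaluated on a single batch of results.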

Moving from the data governance umbrella, we come down into the detail of data integrity with eight chapters looking at the three top layers of the data integrity model.

  • Chapter 11 looks at Data Integrity and Paper: Blank Forms and Instrument Log Books. The use of blank forms for documenting analytical records is examined. An analysis based on observation will be discussed; the main data integrity issue with such a process is whether there is objective evidence available for second person review. Many manual processes in a laboratory, such as sample preparation, lack evidence of activities other than a trained analyst following a procedure. In addition, there are now stringent regulatory control requirements for master templates and blank forms, which will be outlined to demonstrate data integrity. Instrument log books are key records for data integrity, but are they completed correctly, and if an instrument data system is used, why can the data system not generate the log automatically?

  • Chapter 12 discusses The Hybrid System Problem. A hybrid computerised system is one that creates electronic records with signed paper printouts, with the problem that two incompatible record formats must be managed and synchronised. What can software suppliers do to ensure data integrity? Many software applications used in the laboratory were designed before data integrity became a major issue. We also discuss the most common hybrid system, the spreadsheet, and present what to do to ensure data integrity. The WHO regulatory guidance that “use of hybrid systems is discouraged”9  will be presented as an introduction to Chapter 13.

  • Chapter 13 presents a contentious topic: Get Rid of Paper: Why Electronic Processes are Better for Data Integrity. Moving to electronic working is the goal of a long-term solution for existing systems and paper-based processes. If resources are used to update and validate new systems to improve data integrity, there must be substantial business benefits. This business opportunity will be compared with the approaches taken in two other major programmes of work for the pharmaceutical industry: the Year 2000 and Part 11 remediation projects, as discussed in Section 1.9 of this chapter.

  • Chapter 14 focuses on Data Integrity Centric Computer System Validation and Analytical Instrument Qualification. It examines the traditional approach of top-down computer validation: improving the process and then configuring the application to match the new process. However, this can leave the records generated by the system vulnerable, especially if access to data files via the operating system is available. Therefore, a combination of top-down validation and a bottom-up approach that identifies record vulnerabilities and mitigates them is essential. The integrated approach of analytical instrument qualification and computerised system validation in the 2018 17  version of USP <1058> will be discussed, as will the data integrity issues of allowing a service provider access to your computerised systems.

  • Chapter 15 presents Validation of Analytical Procedures, which discusses the new life cycle approach proposed in the publication of a draft version of USP <1220> by the United States Pharmacopoeia (USP).18  An omission of ICH Q2(R1)19  is that method development is not mentioned. In contrast, the draft USP <1220> describes a three-stage process: the aims of the method are defined with an Analytical Target Profile (ATP), and the method design is based on this. Once developed, the method is validated and then deployed for use with continuous monitoring of its performance.

  • Chapter 16 discusses Performing the Analysis. When an analysis is performed, the lower levels of the data integrity model (Foundation, Level 1 and Level 2) must work effectively so that the work is carried out correctly at Level 3 to ensure data integrity. We will look at analytical procedures requiring observation alone; sample preparation followed by instrumental analysis; instrumental analysis with data interpretation; and finally a method involving sample preparation, instrumental analysis with data interpretation and calculation of the reportable result. Data integrity requires that out of specification results are documented; this provides a lead-in to the next chapter.

  • Chapter 17 is entitled Second Person Review. The second person review is important for ensuring data integrity and is the second line of defence (the first being the performer who does the work). A reviewer needs to ensure that all elements of the procedure have been followed and to check for falsification or hidden records. Review of paper, hybrid and electronic records will be discussed, as well as risk-based audit trail review, including review by exception.

  • Chapter 18 looks at Record Retention. It presents and discusses options for the retention of records: migration, museum, virtualisation or a read-only database. It also discusses data standards: their history and the current efforts to make them a way forward for the future. Some data integrity guidance documents require that dynamic data are not converted to static data.8,20 

The final five chapters present and discuss quality oversight within a regulated laboratory and also when outsourcing analysis to a Contract Manufacturing Organisation (CMO) or Contract Research Laboratory (CRO):

  • Chapter 20 is entitled Raising Data Integrity Concerns and discusses how data integrity concerns can be raised by an employee. This is different from the open culture discussed in Chapter 8 and is focused on how a concern can be raised and investigated by a company in a way that protects the confidentiality of all concerned. The policy includes non-retaliation against individuals either raising or being the subject of a data integrity concern.

  • Chapter 21 discusses Quality Assurance Oversight for Data Integrity and the role of the Quality Assurance department in self-inspections and risk-based data integrity audits. The chapter discusses the potential overlap between data integrity audits and computer system periodic reviews, and asks: what do I audit in a data integrity audit? This chapter is linked with Chapter 24, where several data integrity audit aide memoires are presented based on the data integrity model presented in Chapter 5.

  • Chapter 22 is focused on How to Conduct a Data Integrity Investigation. It begins with the regulations and guidance covering data integrity investigations from various regulatory authorities, including the FDA. There is a spectrum of issues, from mistakes through to deliberate falsification, that will be discussed before focusing on falsification of data. Using a case study, the reader is taken through how to conduct a data integrity investigation, including writing the report and the corrective and preventative actions to be taken. One important aspect is understanding the material impact of the data violation and deciding whether to report the violation to the regulators.

  • Chapter 23 looks at Data Integrity and Outsourcing. The pharmaceutical supply chain is globally outsourced, with Active Pharmaceutical Ingredient (API) manufacturers, excipient suppliers, contract manufacturers and contract laboratories being used extensively. This chapter takes the salient points from the preceding chapters and applies them to assessing and monitoring any supplier of a chemical, product or service.

  • Chapter 24 entitled Data Integrity Audit Aide Memoires provides several aide memoires based on the Data Integrity Model described in Chapter 5. Many of these aide memoires are unusual as they are based on figures rather than a checklist of questions that will constrain an auditor. As such, they are intended for use by experienced auditors and referenced during an audit.

Although appearing premature, it is appropriate to introduce here the Data Integrity Model that will be discussed in more detail in Chapter 5. The model consists of a foundation and three levels that can describe data integrity and data governance in a single diagram. The model also includes production, analytical development/quality control and quality assurance. Shown in Figure 1.2 is the portion of the Data Integrity Model covered in this book (Analytical Development/Quality Control and Quality Assurance) with the chapters of the book mapped against it. You will see that one of the main areas of focus is on the Foundation level as it is winning hearts and minds that allows an organisation to change its culture.

Figure 1.2

Mapping this book to the Data Integrity Model.


All the data governance and data integrity topics covered in this book must not exist in isolation but be integrated within the Pharmaceutical Quality System (PQS) of an organisation. For example, in Chapter 19 data integrity quality metrics are discussed that will be fed into a management review of not just data integrity but of the whole PQS. Similarly, the policies and procedures for data integrity together with the associated training will be integrated into the normal staff training covering both the scientific and compliance aspects of their position descriptions.

By taking this approach, data integrity coupled with the development of an open culture described in Chapter 8, becomes integrated into the normal operation of a regulated laboratory.

There is not a specific chapter on risk management in this book. This is deliberate: risk management is integrated into the chapters throughout the book, either implicitly or explicitly. For example, in the analytical data life cycle we look at the criticality of data generated through different mechanisms. The risks posed by using hybrid systems and standalone workstations are discussed in Chapters 12 and 10, respectively, and data process mapping in Chapter 9 on the analytical data life cycle is used to identify risks and vulnerabilities of the records produced during an analytical procedure. In summary, we should not think of risk management as a separate subject; it must be integrated into normal work and become second nature to all involved.

The FDA GMP regulations, 21 CFR 211, are entitled Current Good Manufacturing Practice Regulations for Finished Pharmaceuticals.21  Typically, this is shortened to GMP or sometimes cGMP. What does the word “current” mean in the context of the US GMP regulations? To understand, we need to go back to the intent of the regulation when it was first promulgated in 1978.21  This will be discussed in more detail in Chapter 13, where the advantages of electronic working over paper are discussed.

When you look at EudraLex volume 4 to see the various parts, chapters and annexes of EU GMP regulations there is nothing in them that is the equivalent to the “current” of cGMP described in the previous section. However, buried in the overarching and incredibly dry European Union directive 2001/83/EC on the Community code for medicinal products for human use there is Article 23 that states22 :

  1. After a marketing authorisation has been granted, the marketing authorisation holder shall, in respect of the methods of manufacture and control provided for in Article 8(3)(d) and (h), take account of scientific and technical progress and introduce any changes that may be required to enable the medicinal product to be manufactured and checked by means of generally accepted scientific methods.

Note the phrase “take account of scientific and technical progress and introduce any changes… and checked by generally accepted scientific methods”. This is the European equivalent of cGMP. The 2015 MHRA GMP data integrity guidance document7  notes that:

  • Manufacturers and analytical laboratories should be aware that reverting from automated/computerised to manual/paper-based systems will not in itself remove the need for data integrity controls.

Therefore, ignoring computerised systems and going back to paper records alone may be a failure to comply with Article 23 22  and hence a GMP non-compliance.

The starting point in a discussion of data integrity must be to define what we mean by the term. There are several definitions from regulatory agencies as well as government bodies.

Table 1.1 shows six definitions of either integrity or data integrity, including two from the FDA. I have deliberately listed all these definitions in Table 1.1 to illustrate that different organisations or even different divisions of the same regulatory organisation can have different definitions for the same term.

Table 1.1

Data integrity and integrity definitions

Source | Term | Definition
FDA | Data integrity | The degree to which a collection of data are complete, consistent and accurate.20 
FDA | Data integrity | The completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).23 
IEEE | Integrity | The degree to which a system or component prevents unauthorized access to, or modification of, computer programs or data.24 
MHRA | Data integrity | The extent to which all data are complete, consistent and accurate throughout the data lifecycle.7 
NIST | Data integrity | The property that data has not been altered in an unauthorized manner. Data integrity covers data in storage, during processing, and while in transit.25 
WHO | Data integrity | The degree to which a collection of data are complete, consistent and accurate throughout the data life cycle. The collected data should be attributable, legible, contemporaneously recorded, original or a true copy and accurate.9 

From these various definitions of integrity and data integrity, is there common ground that explains what data integrity is? Let us reconcile and combine these different definitions into a single approach for data integrity:

  • Data must be complete, consistent and accurate.

  • Data have a life cycle.

  • Data must not have been improperly modified.

  • If using a computerised system, then software should prevent unauthorised modification of data.

  • The software should not be improperly modified (such as modifying the system configuration to turn the audit trail off and then on again to make unauthorised changes to data or deletions).

  • To have integrity a record needs to meet the ALCOA criteria (we will discuss what ALCOA means in Section 1.5.3).

The first three and sixth bullet points hold for manual processes as well as computerised systems; the fourth and fifth points apply to both hybrid and electronic computerised systems.
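As a minimal sketch of one technical control behind the points above, a cryptographic checksum recorded when data are acquired can later detect that a data file has been modified. This is an illustration only (the file name and content are invented for the example); it complements, and does not replace, audit trails and access controls:

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_integrity(path: Path, recorded_digest: str) -> bool:
    """True if the file still matches the digest recorded at acquisition."""
    return file_sha256(path) == recorded_digest

# Hypothetical data file and content, used only for this illustration.
data_file = Path("result_S-001.txt")
data_file.write_text("assay result: 99.8% w/w\n")
recorded = file_sha256(data_file)

assert verify_integrity(data_file, recorded)        # unchanged: check passes
data_file.write_text("assay result: 100.2% w/w\n")  # simulated unauthorised edit
assert not verify_integrity(data_file, recorded)    # modification is detected
data_file.unlink()
```

A laboratory system would store such digests with the audit trail so that a second person review can confirm files have not been altered since acquisition.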

Although the focus of these definitions is on the data, metadata and records created and updated during an analysis, data integrity is more than just data and numbers. It also involves data governance, which is discussed in Chapters 4–10 and 19.

Any record that is generated during regulated laboratory analysis needs to have its data integrity assured, as discussed above. This raises the question: how does an analytical scientist assess the data integrity of regulatory records? Table 1.2 presents the ALCOA+ criteria as a means of assessing the integrity of data.
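Some of the ALCOA+ criteria lend themselves to simple automated checks on electronic records. The sketch below is hypothetical (the record fields and checks are assumptions made for illustration): it tests that each record has an identified originator (attributable), readable content (legible), and that time stamps run in the expected order (consistent).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class LabRecord:
    """Hypothetical electronic laboratory record."""
    originator: str       # attributable: who created the record
    created_at: datetime  # contemporaneous: when it was recorded
    content: str          # legible: the readable entry itself

def check_alcoa_basics(records: list[LabRecord]) -> list[str]:
    """Return a list of findings; an empty list means the basic checks pass."""
    findings = []
    for i, rec in enumerate(records):
        if not rec.originator:
            findings.append(f"record {i}: no originator (not attributable)")
        if not rec.content:
            findings.append(f"record {i}: empty content (not legible)")
    # consistent: time stamps must be in the expected (ascending) order
    for prev, curr in zip(records, records[1:]):
        if curr.created_at < prev.created_at:
            findings.append("time stamps out of sequence (not consistent)")
    return findings

t0 = datetime(2024, 1, 15, 9, 0)
good = [
    LabRecord("analyst_1", t0, "weighed 100.2 mg"),
    LabRecord("analyst_1", t0 + timedelta(minutes=5), "diluted to 100 mL"),
]
assert check_alcoa_basics(good) == []
```

Such checks can only ever support, not replace, second person review: criteria such as original and accurate require human judgement against the procedure being followed.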

Table 1.2

ALCOA+ criteria for data integrity

Criterion | Meaning
Attributable 
  • Attributable means information is captured in the record so that it is uniquely identified as executed by the originator of the data (e.g. a person or a computer system).9 

  • Attributable to the person generating the data.7 

  • Who acquired the data originally or performed an action subsequently to it and when?

 
Legible 
  • The terms legible and traceable and permanent refer to the requirements that data are readable, understandable, and allow a clear picture of the sequencing of steps or events in the record so that all GXP activities conducted can be fully reconstructed by the people reviewing these records at any point during the records retention period set by the applicable GXP.9  See also consistent and enduring.

  • Legible.7 

  • Can you read the data together with any metadata or all written entries on paper?

  • Can you read and understand all audit trail entries?

 
Contemporaneous 
  • Contemporaneous data are data recorded at the time they are generated or observed.9 

  • Documented (on paper or electronically) at the time of an activity.

 
Original 
  • Original record: Data as the file or format in which it was originally generated, preserving the integrity (accuracy, completeness, content and meaning) of the record, e.g. original paper record of manual observation, or electronic raw data file from a computerised system.10 

  • True copy: An exact verified copy of an original record.7 

  • Original data include the first or source capture of data or information and all subsequent data required to fully reconstruct the conduct of the GXP activity. The GXP requirements for original data include the following:

    • original data should be reviewed;

    • original data and/or true and verified copies that preserve the content and meaning of the original data should be retained;

    • as such, original records should be complete, enduring and readily retrievable and readable throughout the records retention period.

  • Written observation or printout or a certified copy thereof.

  • Electronic record including metadata of an activity.

 
Accurate 
  • Accurate.7 

  • The term “accurate” means data are correct, truthful, complete, valid and reliable.9 

  • No errors in the original observation(s).

  • No editing without documented amendments/audit trail entries by authorised personnel.

 
Complete 
  • All data from an analysis including any data generated before a problem is observed, data generated after repeating part or all of the work or reanalysis performed on the sample.

  • For hybrid systems, the paper output must be linked to the underlying electronic records used to produce it.

 
Consistent 
    • All elements of the analysis, such as the sequence of events, follow on and data files are date-stamped (all processes) and time-stamped (when using hybrid or electronic systems) in the expected order.

 
Enduring 
  • Recorded on authorised media, e.g. laboratory notebooks, numbered worksheets for which there is accountability or electronic media.

  • Not recorded on the backs of envelopes, laboratory coat sleeves, cigarette packets or Post-it notes.

 
Available 
  • The complete collection of records can be accessed or retrieved for review and audit or inspection over the lifetime of the record.

 

As a quick introduction to the basic criteria of laboratory data integrity, we need to explain the acronym ALCOA mentioned in the previous section. The term was developed in the 1980s by an FDA GLP inspector as a means of teaching his colleagues about the integrity of data and records generated during non-clinical toxicology studies. The five ALCOA terms are:

  • Attributable;

  • Legible;

  • Contemporaneous;

  • Original;

  • Accurate.

Implicit in the ALCOA requirements is that records should be complete, consistent, enduring and available, according to the WHO Guidance on Good Records Management.9 

In addition, there are four more terms that were added by an EMA document on clinical trial electronic source data26 :

  • Complete;

  • Consistent;

  • Enduring;

  • Available.

These nine terms are collectively called ALCOA+ or sometimes ALCOA-plus and are detailed in Table 1.2. Although many of the regulatory guidance documents talk only about ALCOA and claim that the four additional terms can be derived from the original five, this book uses the ALCOA+ criteria for data integrity, as there are differences.

Analytical scientists need to understand these criteria and apply them in their respective analytical methods. The WHO guidance provides the best explanation of the ALCOA principles: Appendix 1 gives definitions of each of the five ALCOA terms, with side-by-side tables of expectations for paper and electronic records, followed by special risk considerations for each term.9 It is a very useful source of information about the ALCOA requirements for the integrity of records and helps in understanding the issues associated with each term.
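The ALCOA+ criteria lend themselves to a simple checklist against any laboratory record. As a purely illustrative sketch, not taken from any guidance document, the class and field names below are my own assumptions; a record and a basic gap check might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative only: field names and checks are assumptions, not a
# schema from any regulation or guidance document.
@dataclass
class LabRecord:
    value: str                      # the observation or result
    recorded_by: str                # Attributable: who created the entry
    recorded_at: datetime           # Contemporaneous: when it was recorded
    source: str                     # Original: e.g. "electronic raw data file"
    readable: bool = True           # Legible
    amendments: list = field(default_factory=list)  # Accurate: audit-trailed edits

def alcoa_gaps(rec: LabRecord) -> list:
    """Return the basic ALCOA criteria the record fails to meet."""
    gaps = []
    if not rec.recorded_by:
        gaps.append("attributable")
    if not rec.readable:
        gaps.append("legible")
    if rec.recorded_at > datetime.now(timezone.utc):
        gaps.append("contemporaneous")  # future-dated entries are suspect
    if not rec.source:
        gaps.append("original")
    return gaps

rec = LabRecord("98.3", recorded_by="", recorded_at=datetime.now(timezone.utc),
                source="CDS result file")
print(alcoa_gaps(rec))  # ['attributable']
```

A real data governance programme would apply such checks procedurally and through validated systems, not ad hoc scripts; the point is only that each criterion is a concrete, testable property of a record.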

Although this book is focused on data governance and data integrity, it is important to understand the difference between data integrity and data quality. Data integrity definitions have been presented in Table 1.2 and discussed in Sections 1.5.1 and 1.5.2. Data quality is defined by MHRA as:

  • The assurance that data produced is exactly what was intended to be produced and fit for its intended purpose. This incorporates ALCOA.8 

In summary, data quality can be considered as “can I use the data” and data integrity as “can I trust the numbers”. Therefore, data integrity is an important component of data quality. This is shown diagrammatically in Figure 1.3. In the centre of the figure is the analytical process from the sample to the reportable result. An analysis can be conducted using one of the following options:

  • Paper records that are signed by the analyst and the reviewer.

  • Analysis by an analytical instrument controlled by a computer application that creates electronic records, may perform calculations and generates a reportable result that is printed out and the paper signed (a hybrid computer system or simply a hybrid).

  • Analysis by an analytical instrument controlled by a computer application that creates electronic records, may perform calculations and generates a reportable result, and the report is electronically signed.
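The hybrid option above relies on the paper output being linked to the underlying electronic records. One simple way to make that link verifiable is to print a checksum of the raw data file on the signed report; a minimal sketch, where the file name and report footer are hypothetical:

```python
import hashlib

# Sketch of linking a paper printout to its underlying electronic record
# by printing a checksum of the raw data file on the report. The file
# name and footer text are hypothetical, not from any guidance document.
def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. footer = f"Raw data: run_0042.dat  SHA-256: {file_sha256('run_0042.dat')}"
```

A second person reviewer could then recompute the digest of the archived electronic file and confirm it matches the printout, demonstrating that the paper and the electronic record are the same data.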

Figure 1.3 Difference between Data Quality and Data Integrity.

Regardless of the option used for a test, contextual metadata is generated that is essential for interpretation of the reportable result and puts the value in context, as we shall see in the next section.

As a sample is analysed, additional data will be gathered; these are called metadata. The term was originally derived from computerised systems and has sometimes been defined as data about data, which is not the most helpful definition. Although many data integrity guidances use the term metadata, which is defined in the glossary at the front of this book, I will use the term contextual metadata, as it reminds us that these are data that put the reportable result, or its equivalent, into context.

To illustrate this, Table 1.3 shows the reportable result for a chromatographic analysis of a drug substance in a finished product or even the plasma concentration of a drug from a clinical study and some of the contextual metadata associated with the separation. As can be seen from the table a single result requires a large volume of contextual metadata that is generated throughout the analytical process including the chromatographic run. Figure 1.4 shows the same information graphically. In addition, the figure also indicates the scope of second person review that needs to be carried out that will be discussed in Chapter 17 and the scope of raw data and complete data that is discussed in Chapter 7.

Table 1.3

A reportable result and the contextual metadata for a chromatographic analysis

Reportable result | Phase of analysis | Data and contextual metadata
98.3 Sample management 
  • Sampling plan and sampling information

  • Sample transport and sample management, as appropriate

  • Sample number and identity

  • Batch/study number of sample(s)

  • Initials/signatures of the analyst(s) and reviewer

 
Overall analysis 
  • Start and end dates and times of the analysis

  • Dates and times of any reanalysis

 
Sample preparation 
  • Sample weights on balance printout

  • Analytical balance used and calibration records. Balance log book completed

  • Sample preparation records including pipettes and glassware used

  • Initials/signatures of the analyst(s) and reviewer

 
Instrumental analysis 
  • Instrument identity and current qualification status

  • Column identity

  • Instrument control method and any changes made

  • Data acquisition method and any changes made

  • Sequence file of the order of injections, e.g. system suitability samples, blanks, quality controls, standards, samples, injection volumes, purities, dilutions, correction factors

  • Integration method: Automatic integration of peak areas and any manual changes (if allowed)

  • System Suitability Test results versus acceptance criteria

  • Calculation of individual results and the reportable result

  • Result units (percent or concentration)

  • Completed instrument log book

  • Initials/signatures of the analyst(s) and reviewer

 
Reporting 
  • Printouts with the date and time of printing linked to the underlying electronic records and with handwritten signatures and dates, OR an electronic report electronically signed with the date and time of generation and of signing

  • Audit trail entries of all activities involved with the analysis

  • Initials/signatures of the analyst(s) and reviewer

 
Figure 1.4 Reportable result and the supporting contextual metadata.

Note that Table 1.3 only refers to the chromatographic analysis and it does not consider the following:

  • sampling process;

  • sample transport to the laboratory;

  • sample management in the laboratory;

  • reagent and standard preparation used in the analysis;

  • preparation of the samples for analysis;

  • identification of any instruments (analytical balance, electronic pipettes), glassware (volumetric flasks, pipettes) and equipment (sample thief) used in the above processes;

  • log books for instruments and equipment used.

These activities will create additional data and contextual metadata as part of the overall analytical procedure that will need to be considered for data retention and assessment of data integrity. All together they support the overall quality of the reportable result.
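The completeness requirement in Table 1.3 can be pictured as a check that each phase of the analysis carries its contextual metadata. The sketch below is purely illustrative: the phase and field names loosely follow Table 1.3 but are my own assumptions, not a standard schema.

```python
# Illustrative check that a reportable result carries the contextual
# metadata for each phase of analysis. Phase and field names loosely
# follow Table 1.3 but are assumptions, not a standard schema.
REQUIRED = {
    "sample_management": ["sample_id", "batch_number"],
    "sample_preparation": ["balance_id", "sample_weight"],
    "instrumental_analysis": ["instrument_id", "column_id", "sequence_file"],
    "reporting": ["audit_trail", "reviewer"],
}

def missing_metadata(record: dict) -> dict:
    """Map each phase to the required metadata fields absent from the record."""
    return {
        phase: [f for f in fields if f not in record.get(phase, {})]
        for phase, fields in REQUIRED.items()
        if any(f not in record.get(phase, {}) for f in fields)
    }

result = {
    "reportable_result": 98.3,
    "sample_management": {"sample_id": "S-123", "batch_number": "B-456"},
    "sample_preparation": {"balance_id": "BAL-01", "sample_weight": 100.2},
    "instrumental_analysis": {"instrument_id": "HPLC-07", "column_id": "C18-032"},
    "reporting": {"audit_trail": [], "reviewer": "second.person"},
}
print(missing_metadata(result))  # {'instrumental_analysis': ['sequence_file']}
```

In practice this completeness check is performed by the second person reviewer and by validated systems, but the principle is the same: a reportable result without its contextual metadata is incomplete.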

Data integrity provides assurance that the analytical work in the laboratory, from sampling to the calculation of the reportable result, has been carried out correctly. Selecting three items from the definitions in Section 1.5.1, this means that the analysis is:

  • Complete.

  • Consistent.

  • Accurate.

In short, can I trust the analysis and the reportable result(s)? If you cannot trust the analysis, then you cannot take a decision based on poor, incomplete or falsified data. Therefore, data integrity comes before data quality; you cannot make a quality decision without the integrity of the underlying analysis. It is the role of the analytical scientist and the second person reviewer to ensure data integrity but that also requires that the instruments, computer systems, analytical procedures, laboratory environment and organisation within which they work are set up to ensure that they are not pressured to cut corners or falsify data.

The focus of this book is on data integrity and the overall data governance in the context of a regulated analytical laboratory; however, in focusing on data integrity we must not lose sight of data quality.

Once the integrity of the data is assured, we turn to data quality. In part, this focuses on the laboratory's ability to deliver the results to the sample submitter and meet their requirements for the analysis, such as:

  • precision and accuracy or measurement uncertainty required;

  • reportable value format;

  • supporting information for the analysis;

  • speed of analysis (e.g. releasing a production batch or turnaround of analysis from a clinical trial).

Data quality is a term that means can I use the data to make a decision, e.g. for batch release?

In 2016, the FDA issued a proposed update to the Good Laboratory Practice regulations under the title GLP Quality System.27  One of the proposed changes was a new section 58.180 on Data Quality and Integrity:

  • (a) All data generated during the conduct of a nonclinical laboratory study must be accurate, legible, contemporaneous, original, and attributable (ALCOA). Also, data must be credible, internally consistent, and corroborated.

  • (b) All data must be recorded indelibly, directly, and promptly to a permanent medium at the time of observation and must identify unambiguously the person entering the data. Any change to any entry must be made so as not to obscure the original entry, must indicate the reason for such change, must indicate when the change was made, and must identify who made the change. When data are either captured or maintained, or both captured and maintained electronically, these requirements are fulfilled by the use of an electronic records system fully compliant with applicable regulations.

  • (c) All data accrued as required in paragraphs (a) and (b) of this section must be included in the final study report.

If these proposals become law unchanged, then for the first time there will be an unambiguous requirement that data have integrity, that changes to data are transparent, and that all data generated during a study are reported without omission.
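The behaviour required by paragraph (b), that a change never obscures the original entry and always records who, when and why, is essentially an append-only history. A minimal sketch, where the class and method names are my own assumptions rather than anything from the proposed rule:

```python
from datetime import datetime, timezone

# Minimal sketch of the proposed 58.180(b) behaviour: amendments never
# overwrite an entry; each change records who, when and why, so the
# original value stays visible. Names are assumptions, not a real API.
class AuditedEntry:
    def __init__(self, value, user):
        self._history = [(value, user, datetime.now(timezone.utc),
                          "initial entry")]

    def amend(self, new_value, user, reason):
        """Append a change; the reason for the change is mandatory."""
        if not reason:
            raise ValueError("a reason for the change is required")
        self._history.append((new_value, user, datetime.now(timezone.utc),
                              reason))

    @property
    def current(self):
        return self._history[-1][0]

    @property
    def original(self):
        return self._history[0][0]

entry = AuditedEntry(98.3, "analyst1")
entry.amend(98.4, "analyst1", "transcription error corrected")
print(entry.original, entry.current)  # 98.3 98.4
```

The same principle applies on paper: a single line through the old value, the new value alongside, initialled, dated and with a reason, so that the original entry remains readable.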

A data integrity programme of work will require changes within an organisation: to processes, to systems and to the attitudes of staff. Some changes can be achieved without involving regulatory authorities, but when change must be made it is important to understand the difference between continual and continuous improvement. The ISO 9001 standard referred to continuous improvement until the 2015 version,28 where this changed to continual improvement in Section 10.3. ISO 9001 now aligns with EU GMP Chapter 1,29 derived from ICH Q10,30 which also uses the phrase continual improvement in Section 1.4:29 

  • (xi) Continual improvement is facilitated through the implementation of quality improvements appropriate to the current level of process and product knowledge.

The question arises: do continuous and continual have the same meaning? This matters because for some readers whose first language is not English there is sometimes no word for continual, only continuous, and therefore it may appear that there is no difference between the words. There is some overlap in meaning between the two, but they are not completely synonymous. Both can mean roughly “without interruption”, although this is where continuous is used much more. Continual, by contrast, usually means “happening frequently, with intervals between”.

This difference is very important in the context of the pharmaceutical industry as it is regulated and change cannot simply occur, especially if a change impacts a marketing authorisation. Change control, impact analysis, risk assessment and informing the regulatory authorities are the reasons for continual change being enshrined in the regulations.

The FDA, MHRA and WHO data integrity guidance documents8,9,20  refer to data as either static or dynamic; the definitions from these publications are presented in Table 1.4.

Table 1.4

Definitions of static and dynamic records from the FDA, MHRA and WHO guidances

Record type | Definition
Static record 
  • A static record format, such as a paper or pdf record, is one that is fixed and allows little or no interaction between the user and the record content. For example, once printed or converted to static pdfs, chromatography records lose the capability of being reprocessed or enabling more detailed viewing of baselines.9 

  • A static (record) is used to indicate a fixed-data document such as a paper record or an electronic image.20 

 
Dynamic record 
  • Records in dynamic format, such as electronic records, that allow for an interactive relationship between the user and the record content. For example, electronic records in database formats allow the user to track, trend and query data; chromatography records maintained as electronic records allow the user (with proper access permissions) to reprocess the data and expand the baseline to view the integration more clearly.8,9 

  • Dynamic record means that the record format allows interaction between the user and the record content. For example, a dynamic chromatographic record may allow the user to change the baseline and reprocess chromatographic data so that the resulting peaks may appear smaller or larger. It also may allow the user to modify formulas or entries in a spreadsheet used to compute test results or other information such as calculated yield.20 

  • Information that is originally captured in a dynamic state should remain available in that state.8 

 

Static records and dynamic records are very common in regulated laboratories. For example, an analytical balance with a printer will only produce a static record via the printer. However, if the balance is connected to an instrument data system, ELN or LIMS and the weight and other information are collected electronically, then the weight becomes a dynamic record, as it is capable of being manipulated and transformed later in the analytical process. Most instrument data systems used in laboratories produce dynamic records, e.g. chromatography data systems, spectrometry data systems, etc.

In the US GMP regulations there are requirements for production information in 21 CFR 211.188 and complete data in 21 CFR 211.194(a).31  Most production records are static in that they are typically temperatures, pressures, rotation speeds and relative humidity measurements that do not need to be interpreted. Although there are static records in the laboratory, as discussed above, such as weights, pH values and colours, most laboratory records are dynamic, as they must be interpreted to obtain a reportable result. Hence the requirements for production information and laboratory data typically cover static and dynamic records respectively.
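The distinction can be made concrete with a toy example. The numbers and the simple trapezoidal integration below are purely illustrative, but they show why a chromatographic data file is dynamic (it can be reprocessed) while its printed report is static (a fixed rendering):

```python
# Illustrative contrast between a dynamic record (raw data that can be
# reprocessed) and its static export (a fixed rendering). The signal
# values and flat-baseline integration are purely made up.
raw_signal = [0.0, 0.1, 1.2, 3.5, 1.1, 0.2, 0.0]   # dynamic: detector points

def peak_area(signal, baseline=0.0):
    """Trapezoidal area of the signal above a flat baseline."""
    above = [max(y - baseline, 0.0) for y in signal]
    return sum((a + b) / 2 for a, b in zip(above, above[1:]))

# The static report is just fixed text; nothing can be recomputed from it.
static_report = f"Peak area: {peak_area(raw_signal):.2f}"

# The dynamic record can still be reprocessed with a different baseline,
# giving a different area; the static report cannot.
reprocessed = peak_area(raw_signal, baseline=0.1)
```

This is why the guidances insist that information originally captured in a dynamic state should remain available in that state: once only the static report is retained, the ability to review or challenge the processing is lost.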

As you read this book it should become apparent that there are several important data integrity concepts that I will discuss now.

The first concept, which should be apparent just from reading this chapter and looking at the structure in Figure 1.1, is that data integrity is more than just the numbers generated in a regulated laboratory. Although numbers and data, including their associated integrity, are important, there is much, much more to consider. A holistic approach to data integrity and data governance:

  • Starts at the top and works throughout an organisation: from the boardroom to the laboratory bench.

  • Management leads; the rest follow.

  • Integrity is integral within the overall pharmaceutical management system or quality management system.

  • Organisational culture must be open and honest, so that mistakes can be admitted and turned to the advantage of continual improvement.

  • Instruments must be qualified and computerised systems validated.

  • Analytical procedures must be validated.

Have we mentioned numbers yet? No! All the above items, and more, need to be in place to ensure that when analytical work is carried out the numbers generated have the integrity and quality required.

The second concept is that quality is not owned by the Quality Assurance Department: quality is everybody's job. All staff should ensure that they are adequately trained and follow the applicable procedures when they work. If they make a mistake, the work should stop and the issue be discussed with a supervisor to determine how to resolve it. It may be that a procedure is too complex and needs to be revised, or that the analyst needs more training. Regardless of the outcome, it is the analytical scientists working in regulated environments who must own quality. It is their job and their responsibility. This approach must be supported by management behaviours that ensure there is no pressure to get work out of the door, as this could lead to short cuts being taken.

Quality starts in the laboratory. Quality Assurance staff are not there to identify errors: that is a laboratory function of the two most important people in any analysis, the performer of a test and the reviewer of the work undertaken. Data integrity and data quality are their responsibility. Quality Assurance provides advice, ensures that work is compliant with the regulations and provides quality oversight via audits and data integrity investigations.

In some people's minds data integrity is a second attempt to get compliance with 21 CFR 11 or Annex 11 right. It must be understood that data integrity covers much, much more than Part 11/Annex 11 compliance. It includes:

  • Management attitudes and leadership.

  • Definition and allocation of roles and responsibilities for data integrity.

  • Development of an open culture.

  • Establishment and maintenance of data integrity policies and procedures.

  • Assessment and remediation of manual, hybrid and electronic processes.

In contrast, 21 CFR 11 is focused just on computerised systems.

Some attendees at data integrity training courses I have participated in have stated that their management thinks data integrity is a computer problem and has left it to IT to resolve. This approach indicates that senior management has no clue: data integrity impacts paper processes as well as computerised ones. Indeed, as we shall see later in this book, control of blank paper forms, without a computer in sight, is a key regulatory issue.

Equally concerning is the perception that data integrity is a laboratory problem. This is not even close! As we shall see in Chapters 4 and 5, EVERYBODY in an organisation from the Boardroom to the laboratory bench is involved in data integrity.

A further misconception is that data integrity does not apply to research data. Whoever thinks this is naïve, stupid or both: if data are submitted to the FDA in a licencing application (e.g. an IND or NDA), then the 21 CFR 11 regulations apply to those data, as noted under the Scope of the regulation in 11.1(b)32 :

  • This part applies to records in electronic form that are created, modified, maintained, archived, retrieved, or transmitted, under any records requirements set forth in agency regulations.

  • This part also applies to electronic records submitted to the agency under requirements of the Federal Food, Drug, and Cosmetic Act and the Public Health Service Act, even if such records are not specifically identified in agency regulations.

If research data are contained in a New Drug Application, then they can be subject to inspection during a Pre-Approval Inspection under CPG 7346.832 33  that is discussed in Chapter 3.

For those with short memories, data integrity is the third major IT systems improvement programme to face the pharmaceutical industry in the past 20 years, the other two being the Year 2000 (Y2K) and electronic records/signatures (Part 11) assessment and remediation programmes. Is the pharmaceutical industry going to make the same mistakes again? Let us explore this question.

  • The Y2K programme was simply replacing applications and operating systems with versions that could handle dates after 31st December 1999. Typically, it was a case of updating, rather than process improvement, to complete the work before the deadline; this was a technical project with a fixed due date.

  • In contrast, a 21 CFR 11 assessment and remediation programme was an opportunity to upgrade and provide substantial business benefit by changing the business process to use electronic signatures and eliminate paper. However, not many laboratories took advantage of this approach: most simply replaced non-compliant software with technically compliant software and continued to use hybrid systems.

  • Data integrity and data governance is a programme of work, not a project. It covers management, culture, policies, procedures, training, instruments, computerised systems, analytical procedures and analysis with quality oversight. From this list it should be very apparent that the programme of work is much greater than just the computer assessment and remediation of the Y2K and 21 CFR 11 projects.

Is the industry going to repeat the Part 11 behaviour?

Read the various guidance documents7–9,11,12,20  and you will see the storm clouds on the horizon: tight and bureaucratic control of blank forms and discouragement of hybrid systems. The message is clear: get rid of paper or control it rigorously. The cost of electronic management is steady or declining, BUT the cost of using paper records is rising. Consider not just the physical storage cost but also the time to access reports and trend data, and you will realise that paper is expensive and labour intensive. Unfortunately, the management of many organisations is short-sighted, focusing only on fixing immediate problems and ignoring the big picture.

Working electronically is the only long term viable option for the pharmaceutical industry.

An alternative view of data integrity remediation is to see it as a second chance to get Part 11 right by looking at the intent rather than the letter of the regulation. Seen in this way, the industry can both align with the regulators and position itself for improvements in both productivity and data integrity in its processes.

However, many organisations complain that this will cost money. Yes, but what is the effect on the organisation's cash flow if every batch can be released a few days earlier? Do the sums and then put in your project proposals.

1.
Barr Laboratories: Court Decision Strengthens FDA's Regulatory Power
,
1993
, Available from: https://www.fda.gov/Drugs/DevelopmentApprovalProcess/Manufacturing/ucm212214.htm
2.
C. L.
Burgess
,
Issues related to United States versus Barr Laboratories Inc.
, in
Development and Validation of Analytical Methods
, ed. C. L. Riley and T. W. Rosanske,
Pergamon Press
,
Oxford
,
1996
, p. 352
3.
R. J.
Davis
,
Judge Wolin's interpretations of current good manufacturing practice issues contained in the Court's ruling in the United States versus Barr Laboratories
, in
Development and Validation of Analytical Methods
, ed. C. L. Riley and T. W. Rosanske,
Pergamon Press
,
Oxford
,
1996
, p. 352
5.
Inspection of Pharmaceutical Quality Control Laboratories
,
Food and Drug Administration
,
Rockville, MD
,
1993
6.
MHRA GMP Data Integrity Definitions and Guidance for Industry 1st Edition
,
Medicines and Healthcare Products Regulatory Agency
,
London
,
2015
7.
MHRA GMP Data Integrity Definitions and Guidance for Industry 2nd Edition
,
Medicines and Healthcare Products Regulatory Agency
,
London
,
2015
8.
MHRA GXP Data Integrity Guidance and Definitions
,
Medicines and Healthcare Products Regulatory Agency
,
London
,
2018
9.
WHO Technical Report Series No. 996 Annex 5 Guidance on Good Data and Records Management Practices
,
World Health Organisation
,
Geneva
,
2016
10.
WHO Handbook Good Laboratory Practices (GLP) Quality Practices for Regulated Non-Clinical Research and Development Second Edition
,
World Health Organisation
,
Geneva
,
2009
11.
PIC/S PI-041 Draft Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments
,
Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-Operation Scheme
,
Geneva
,
2016
12.
EMA Questions and Answers: Good Manufacturing Practice: Data Integrity
,
2016
, Available from: http://www.ema.europa.eu/ema/index.jsp?curl=pages/regulation/general/gmp_q_a.jsp&mid=WC0b01ac058006e06c#section9
13.
R. D.
McDowall
,
Validation of Chromatography Data Systems: Ensuring Data Integrity, Meeting Business and Regulatory Requirements Second Edition
,
Royal Society of Chemistry
,
Cambridge
,
2017
14.
M.
Mecca
,
R.
Young
and
J.
Halcomb
,
Data Management Maturity (DMM) Model
,
CMMI Institute
,
Pittsburgh, PA
,
2014
15.
GAMP Guide Records and Data Integrity
,
International Society for Pharmaceutical Engineering
,
Tampa, FL
,
2017
16.
EudraLex – Volume 4 Good Manufacturing Practice (GMP) Guidelines, Annex 11 Computerised Systems
,
European Commission
,
Brussels
,
2011
17.
USP 41 General Chapter <1058> Analytical Instrument Qualification
,
United States Pharmacopoeia Convention
,
Rockville, MD
,
2018
18.
Martin
G. P.
,
et al., Stimuli to the revision process: proposed new USP general chapter: the analytical procedure lifecycle <1220>
,
Pharmacopeial Forum
,
2017
, vol.
43
1
19. ICH Q2(R1) Validation of Analytical Procedures: Text and Methodology, International Conference on Harmonisation, Geneva, 2005.
20. FDA Draft Guidance for Industry: Data Integrity and Compliance with CGMP, Food and Drug Administration, Silver Spring, MD, 2016.
21. Part 211 – Current Good Manufacturing Practice for Finished Pharmaceuticals, Fed. Regist., 1978, 43(190), 45014–45089.
22. Directive 2001/83/EC of the European Parliament and of the Council of 6 November 2001 on the Community code relating to medicinal products for human use, OJ L 311, 28 Nov 2001, 311, 67.
23. FDA Glossary of Computerized System and Software Development Terminology, 1995, Available from: http://www.fda.gov/iceci/inspections/inspectionguides/ucm074875.htm#_top
24. IEEE Standard 610.12-1990 – Glossary of Software Engineering Terminology (replaced by ISO 24765:2010), Institute of Electrical and Electronics Engineers, Piscataway, NJ, 1990.
25. SP 800-33: Underlying Technical Models for Information Technology Security, National Institute of Standards and Technology, Gaithersburg, MD, 2001.
26. Reflection Paper on Expectations for Electronic Source Data and Data Transcribed to Electronic Data Collection Tools in Clinical Trials, European Medicines Agency, London, 2010.
27. 21 CFR Parts 16 and 58 Good Laboratory Practice for Nonclinical Laboratory Studies; Proposed Rule, Fed. Regist., 2016, 81(164), 58342–58380.
28. ISO 9001:2015 Quality Management Systems – Requirements, International Standards Organisation, Geneva, 2015.
29. EudraLex – Volume 4, Good Manufacturing Practice (GMP) Guidelines, Chapter 1: Pharmaceutical Quality System, European Commission, Brussels, 2013.
30. ICH Q10 Pharmaceutical Quality Systems, International Conference on Harmonisation, Geneva, 2008.
31. 21 CFR 211 Current Good Manufacturing Practice for Finished Pharmaceutical Products, Food and Drug Administration, Silver Spring, MD, 2008.
32. 21 CFR 11 Electronic Records; Electronic Signatures, Final Rule, in Title 21, Food and Drug Administration, Washington, DC, 1997.
33. FDA Compliance Program Guide CPG 7346.832 Pre-Approval Inspections, Food and Drug Administration, Silver Spring, MD, 2010.
