Annex 11: Progress in EU Computer Systems Guidelines

Article

Pharmaceutical Technology Europe

06-01-2011
Volume 23
Issue 6

In January 2011, a new version of Annex 11 was released by the European Commission along with a revision of Chapter 4 of its GMP on documentation to reflect the actualities of electronic record keeping, all of which come into full effect in June 2011.

In 1991, the Pharmaceutical Inspection Convention (PIC) created a document defining its requirements for computer systems. This document was given the name Annex 5 in the PIC GMP. In 1992, Annex 5 was incorporated as Annex 11 in the EU GMP, and it later became part of the GLP and GCP requirements in Europe (1). Since 1992, computer systems and applications have increased in complexity to such an extent that, although the main principles of Annex 11 are still valid, the scope and content of the present annex are considered no longer suitable to meet the needs of either the pharmaceutical industry or inspectors (2).

Eudralex Volume 4, Annex 11, which refers specifically to computer systems, provides guidance for the interpretation of the principles of GMP for all EU members (3, 4). Annex 11 is found in Volume 4 of “The rules governing medicinal products in the European Union.” Volume 4 covers the interpretation of the principles and guidelines of GMP regulated activities.

In January 2011, a new version of Annex 11 was released by the European Commission, along with a revision of Chapter 4 of its GMP guide on documentation to reflect the actualities of electronic record keeping; both come into full effect in June 2011. The revised Annex 11 adopts a risk-based approach and is largely aligned with current industry good practices for computer systems. The document is structured as a main principle followed by 17 clauses.

Major changes in Annex 11 are:

  • Formalization of risk management in both computer validation and change control.
  • Traceability throughout a life cycle moves from a regulatory expectation to a regulatory requirement for the first time.
  • New requirements for the need to keep and manage all electronic records.
  • Extensive expansion of the life cycle validation phase.

This paper examines the Annex 11 main directive, its principles and four main clauses: risk management, requirements management, e-records management, and validation. It provides recommendations for implementing these clauses. Some descriptions are based on the listed guidelines, with judicious editing where necessary to fit the context of this paper.

Main Directive and Principles

Similar to the US FDA Compliance Policy Guide (CPG) 7132a.11, computer systems performing regulated operations in the manufacturing of medicinal products for human use are regarded as equipment (5). Every time the expression “equipment” is used in the GMP, the requirement also applies to the associated computer systems (6).

Premises and equipment

Computer hardware must be properly specified to meet the requirements for its intended use, and the amount of data it must handle (7). The environmental controls, electrical requirements, electromagnetic "noise" control, and others should be considered when determining a location for computer hardware. The location of the hardware must allow access for maintenance, as required. There must be a program detailing the maintenance of the computer system (i.e., hardware maintenance manual). The maintenance of the computer, including periodic scheduled maintenance and breakdown maintenance, must be documented. There must be a system to control changes to the hardware. Changes must only be made by authorized individuals following an appropriate review and approval of the change.

Design documentation, including as-built drawings, should be maintained for computers, infrastructure and instrumentation (8, 9). There must be documented verification of the inputs and outputs (I/Os) for accuracy and the computer infrastructure must be qualified (10). In addition to the verification of I/Os checks during the qualification of computer hardware, I/Os checks must be verified periodically covering the data/control/monitoring interfaces between the system and equipment.

Annex 11 is governed by three main principles. The first principle is:

“This annex applies to all forms of computerised systems used as part of a GMP regulated activities. A computerised system is a set of software and hardware components, which together fulfill certain functionalities.”

It is interesting to note how this differs from the Good Automated Manufacturing Practice (GAMP) definition of a computer system, which includes the people, all software (applications, system-level software and documentation), hardware, operating procedures, and peripheral equipment operated by the computer performing specific, defined roles within a given environment (11).

The second principle in Annex 11 is:

“The application should be validated; IT infrastructure should be qualified.”

Computer systems require a written validation process; the depth and scope of this validation depend on the complexity and criticality of the computer application (12, 13, 15). The principle states that validation is associated with processes and that qualification is associated with equipment. The scope of validation is further discussed in Annex 11-4.

The third principle in Annex 11 is:

“Where a computerised system replaces a manual operation, there should be no resultant decrease in product quality, process control or quality assurance. There should be no increase in the overall risk of the process.”

This third principle relates to the regulator's expectations regarding the implementation of computer systems. Prior to converting a process from manual to automated control, or introducing a new automated operation, it is important that project staff consider any quality assurance and safety issues as part of an impact assessment of the risks associated with the process. Risk-reduction measures may need to be incorporated into the system's design and operation. Additional risks to the quality of the related products/materials should not be introduced as a result of reducing manual involvement in the process. As part of the process risk assessment, the manual process should be addressed and, if applicable, improvements to the process should be introduced. The automation should make the process easier and reduce execution time. The use of a computer system does not reduce any requirements that would be expected for a manual system in terms of data control and security.

Quality System for Computer Systems

Paragraph 4.5 in Annex 11 is a decisive principle and probably the most important:

“4.5. The regulated user should take all reasonable steps to ensure that the system has been developed in accordance with an appropriate quality management system.”

It refers to the need to ensure that computer systems are produced under a quality system that incorporates the applicable system development life cycle model. The common goals of a quality system are understanding and meeting customers' needs and ensuring that adequate quality standards are maintained. The components of the quality system for computer systems are the controlled process, the computer system, operating procedures, and documentation.

Analysis of Main Clauses

In addition to the principles described above, Annex 11 contains a total of 17 clauses. The four main clauses are risk management, requirements management, e-records management, and validation.

Risk Management

“Risk management should be applied throughout the life-cycle of the computerised system taking into account patient safety, data integrity and product quality. As part of a risk management system, decisions on the extent of validation and data integrity controls should be based on a justified and documented risk assessment of the computerised system.”

There are many techniques for implementing a risk management process, but they generally include risk assessment, risk mitigation, and risk evaluation; one such method is outlined here.

A detailed risk assessment should be performed, building on the initial risk assessment from the concept phase of the computer system. This risk assessment process weighs risks associated with processes and functions defined in the draft requirements specification (RS) (15). Risks found during the assessment may add requirements that need to be part of the RS. Risk assessment activities to consider are: identification of the processes/functions/transactions (as appropriate); analysis of risk scenarios, effects for each event, likelihood of events, severity of impact, likelihood of detection; a plan for the reduction or elimination of those risks. Reduction or elimination of those risks is performed during the system development life cycle (SDLC). Based on the risks identified, planning of the design validation, design verification, and qualification testing should begin. The test plan and test cases should be developed accordingly.
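One common way to weigh likelihood, severity of impact, and likelihood of detection is an FMEA-style risk priority number. The sketch below is illustrative only: the scoring scales, thresholds and rigor levels are invented examples, not values prescribed by Annex 11, which leaves the method to the regulated user.

```python
# Illustrative FMEA-style risk scoring for computer system functions.
# Scales and thresholds are hypothetical examples, not regulatory values.

def risk_priority(severity: int, likelihood: int, detectability: int) -> int:
    """Each factor scored 1 (low) to 5 (high); detectability: 5 = hard to detect."""
    for factor in (severity, likelihood, detectability):
        if not 1 <= factor <= 5:
            raise ValueError("scores must be between 1 and 5")
    return severity * likelihood * detectability

def validation_rigor(rpn: int) -> str:
    """Map a risk priority number to an (example) level of validation effort."""
    if rpn >= 60:
        return "full life-cycle validation with extensive testing"
    if rpn >= 20:
        return "targeted validation of the affected functions"
    return "standard verification per procedure"

# Example: a batch-record calculation with high impact, moderate likelihood
# and poor detectability.
rpn = risk_priority(severity=5, likelihood=3, detectability=4)
print(rpn, "->", validation_rigor(rpn))  # 60 -> full life-cycle validation ...
```

The point of such a scheme is that the documented score, not habit, justifies the extent of validation, as clause 1 requires.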

Strategies for mitigation of the identified risks may include modifying the process or system design, modification of the project approach or structure, or modification of the validation and testing approach.

During the risk evaluation, processes, systems, and/or functions should be assessed considering how possible hazards and potential harms arising from these hazards may be controlled or mitigated. For some processes, systems, and/or functions a detailed assessment should be performed.

To gain the most benefit from risk management, integration with the system life cycle (SLC) management and risk management activities should be achieved. Based on the intended use and the risk associated with the computer system to be implemented, the computer system developer/integrator should determine the specific approach, the combination of techniques to be used, and the level of effort to be applied.

EU Annex 20 on risk management provides an approach to computer systems and computer controlled equipment risk management. According to Annex 20, risk management should be applied to select the design of computer hardware and software (e.g., modular, structured, fault tolerance) and to determine the extent of validation (e.g., identification of critical performance parameters, selection of the requirements and design, code review, the extent of testing and test methods, reliability of electronic records and signatures).

Requirements Management

Requirements management is only one paragraph in Annex 11, yet it is a very critical recommendation and several processes are required to fulfill it.

“4.4. User Requirements Specifications should describe the required functions of the computerised system and be based on documented risk assessment and GMP impact. User requirements should be traceable throughout the life-cycle.”

This clause establishes the expectation of the EU regulator on how to manage, through the SLC, requirements and traceability for the operational and non-operational computer system functions required by the users, applicable regulations, company standards, and product, process and safety considerations (16). These operational and non-operational functions must be managed based on a risk assessment.

A requirement is a need or expectation that is stated, generally implied or obligatory. ‘Generally implied’ means that it is custom or common practice for the organization, its customers, and other interested parties, that the need or expectation under consideration is implied.

The term ‘requirement’ defines a bounded characterisation of the scope of the system. It contains the information essential to support the operation/operators. Some of these requirements include product requirement, quality management requirement, customer requirement, functional capacity, execution capability, operational usability, information needed to support validation, installation and commissioning, SLC documentation required, user's manuals, training, maintenance manual, system maintenance, system test plan, acceptance criteria and regulatory compliance (i.e., EU Annex 11).

Management starts with requirements gathering and concludes when all requirements have been verified or tested. Requirements can be generated by different interested parties. The scope of the system generates many of the requirements for the operation to be supported, and these requirements are typically provided by the system owner. Many computer system developments fail because of poor requirements gathering, which negatively affects subsequent development activities and associated work products.

The RS is the deliverable in which all requirements are identified. It describes what the system is supposed to do from process, safety, user and compliance perspectives. The RS deliverable may be used as a framework to select the supplier/integrator and to develop the PQ protocol.

The RS deliverable should include an overview of the process in order to familiarize the application software developer with the user, process and data acquisition requirements of the system, and any special considerations for the project. The system functionality must be well defined at the outset in order to provide the prospective supplier/integrator with enough information to provide a detailed and meaningful quotation. Specifically, on data acquisition systems, the RS deliverable must include definitions of the data to be collected, how the data will be used, how it will be stored and retention requirements, data security requirements and where each operation will be completed.

The RS deliverable addresses:

  • the scope of the system and strategic objectives
  • the problem to be solved
  • process overview, sequencing requirements, operational checks
  • sufficient information to enable the supplier/integrator to work on a solution to the problem (e.g. device driven sequencing, the methods required of the presentation of data, data security, data backup, data and status reporting and trending).
  • redundancy and error-detection protocol
  • operating environment
  • interfaces (e.g., to field devices, data acquisition systems, reports and HMI), I/O lists, communications protocols and data link requirements
  • information gained from operators and supervisors on the system design requirements and expectations in order to influence how the system is designed and operated
  • type of control and process operations to be performed
  • data storage requirements
  • transaction/data timing requirements and considerations
  • regulatory requirements
  • preliminary evaluation of the technology
  • feasibility studies and preliminary risk assessment
  • safety and security considerations
  • security and other requirements
  • nonfunctional requirements (e.g., SLC development standards, programming language standards, program naming convention standards).

Each requirement in the RS deliverable must be unambiguous, verifiable, traceable, modifiable, usable, consistent (individual requirements must not conflict with each other) and complete.

The RS deliverable is confirmed in test and/or operating environments during the PQ. This must include verification of the procedural controls associated with the system that were identified in the RS deliverable. Some requirements may be verified before reaching the qualification phase of the formal validation process.

After the approval of the requirements document, requirements traceability begins. Traceability is a sub-discipline of requirements management within software development and systems engineering. It documents the life of a requirement and provides bi-directional traceability between the various associated requirements, enabling users to find the origin of each requirement and track every change made to it. For this purpose, it may be necessary to document every change made to the requirement. Traceability is an essential aspect of the verification activities in the SLC, and provides important input into design reviews.
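In its simplest form, bi-directional traceability is a matrix mapping each requirement to the tests that verify it and each test back to its requirement. The sketch below uses invented requirement and test-case identifiers purely for illustration; real traceability is usually maintained in a requirements management tool.

```python
# Minimal sketch of a bi-directional requirements traceability matrix.
# Requirement and test-case identifiers are invented for illustration.

requirement_to_tests = {
    "URS-001": ["TC-010", "TC-011"],   # second-operator data entry check
    "URS-002": ["TC-020"],             # audit trail captures deletions
    "URS-003": [],                     # not yet covered by any test
}

# Forward traceability: every requirement should map to at least one test.
uncovered = [req for req, tests in requirement_to_tests.items() if not tests]

# Backward traceability: every test should trace back to a requirement.
test_to_requirement = {
    test: req
    for req, tests in requirement_to_tests.items()
    for test in tests
}

print("uncovered requirements:", uncovered)                 # ['URS-003']
print("TC-011 traces to:", test_to_requirement["TC-011"])   # URS-001
```

Checking the matrix in both directions exposes both untested requirements and orphan tests that verify nothing the users asked for.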

The above represent my recommendations, but there are many ways to implement the management of computer system requirements.

E-records Management

Applicable from creation to eventual disposal, records management may include classifying, storing, securing, and destruction (or in some cases, archival preservation) of records. The ISO 15489:2001 standard defines records management as "The field of management responsible for the efficient and systematic control of the creation, receipt, maintenance, use and disposition of records, including the processes for capturing and maintaining evidence of and information about business activities and transactions in the form of records".

The general principles of records management apply to records in any format. Digital records (almost always referred to as electronic records or e-records) raise specific issues. It is more difficult to ensure that the content, context and structure of records is preserved and protected when the records do not have a physical existence. Undoubtedly the area in which most Annex 11 recommendations are made is e-records and how to manage them.

Annex 11 e-records management recommendations can be categorized according to the e-record life cycle concept, which is essential to understanding any discussion of the controls necessary to ensure the authenticity and reliability of records. Such a life cycle may be characterized as periods of creation, access and use, and destruction.

Period of Creation.

“4.8. If data are transferred to another data format or system, validation should include checks that data are not altered in value and/or meaning during this migration process."

The migration of e-records to another format or system must be verified to confirm that data are not altered in value or meaning.
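A common way to verify that nothing changed in value during a migration is to compare record counts and content checksums before and after the transfer. The following is a minimal sketch under the assumption that records can be serialized deterministically; the lot data are invented examples.

```python
# Hypothetical sketch of an Annex 11-4.8 migration check: compare record
# counts and an order-independent content checksum before and after transfer.
import hashlib

def dataset_fingerprint(records):
    """Return (count, checksum); order-independent via sorted row digests."""
    digests = sorted(hashlib.sha256(repr(r).encode()).hexdigest() for r in records)
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(records), combined

source = [("LOT-001", 12.5), ("LOT-002", 13.1)]
migrated = [("LOT-002", 13.1), ("LOT-001", 12.5)]  # same data, different order

assert dataset_fingerprint(source) == dataset_fingerprint(migrated)
print("migration check passed: values and counts preserved")
```

A checksum comparison catches altered values; verifying that *meaning* is preserved (units, field semantics, code lists) still requires a documented mapping review.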

“6. Accuracy Checks - For critical data entered manually, there should be an additional check on the accuracy of the data. This check may be done by a second operator or by validated electronic means. The criticality and the potential consequences of erroneous or incorrectly entered data to a system should be covered by risk management.”

For electronic records, regulated users should define which data are to be used as raw data (17). Where applicable, there should be special procedures for critical data entry requiring a second check, for example the data entry and check of a manufacturing formula, or the keying-in of laboratory data and results from paper records. A second authorized person may verify data entered via the keyboard, with the verifier's name, identification, and the time and date logged. The inclusion and use of an audit trail (refer to Annex 11-9) to capture the range of changes possibly impacting the data may facilitate this check.

When automated equipment is used as described under US FDA 21 CFR 211.68(c), featuring direct data capture linked to other databases and intelligent peripherals, the verification by a second individual is not necessary. For example, firms may omit the second person component in weight check operations if scales are connected to a computer system performing checks on component quality control release status and proper identification of containers. The computer system must be validated, registering the raw materials identification, lot number and expiry date, and integrated with the recorded accurate weight data.
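The manual second check described above can be sketched in code: the system accepts a critical value only when a second, different operator confirms it, and it logs who entered and who verified. The function and field values below are hypothetical illustrations, not a prescribed implementation.

```python
# Illustrative Annex 11-6 double-check for critical manual data: two
# independent entries by two different operators must match.
from datetime import datetime, timezone

def verify_critical_entry(first_entry: str, second_entry: str,
                          first_operator: str, second_operator: str) -> dict:
    """Return an accepted record, or raise if the check fails."""
    if first_operator == second_operator:
        raise ValueError("second check must be made by a different operator")
    if first_entry != second_entry:
        raise ValueError(f"entries differ: {first_entry!r} vs {second_entry!r}")
    return {
        "value": first_entry,
        "entered_by": first_operator,
        "verified_by": second_operator,
        "verified_at": datetime.now(timezone.utc).isoformat(),
    }

record = verify_critical_entry("12.50 kg", "12.50 kg", "op_ana", "op_luis")
print(record["value"], "verified by", record["verified_by"])
```

As the clause allows, the same check could instead be performed by validated electronic means, e.g. re-reading the value from the instrument itself.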

Period of Access and Use.

“5. Data - Computerised systems exchanging data electronically with other systems should include appropriate built-in checks for the correct and secure entry and processing of data, in order to minimize the risks.”

Based on the complexity and reliability of computer systems, there must be procedural controls and technologies to ensure the accuracy and security of computer system I/Os and electronic records. The US FDA Compliance Policy Guide (CPG) 425.400 (formerly 7132a.07), “I/O Checking,” establishes that computer I/Os are to be tested for data accuracy as part of the computer system qualification and, after the qualification, as part of the computer system’s on-going performance evaluation procedure. The use of input edits may mitigate the need for extensive I/O checks (18).

The objective of the I/O checks is to develop a method to prevent inaccurate data inputs and outputs. I/Os should be monitored to ensure the process remains within the established parameters. When monitoring data on quality characteristics demonstrates negative tendencies, the cause should be investigated, corrective action taken and revalidation considered.

Input edits can also be misused to fabricate information and give the erroneous impression that a process is under control. Any such error overrides must be documented during design.
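An "input edit" in this sense is a field-level check applied at entry time, so that out-of-range or malformed values are rejected rather than silently accepted into the record. The field names and limits below are invented for illustration only.

```python
# Sketch of input edits: per-field validity checks applied on data entry.
# Field names and limits are hypothetical examples.

INPUT_EDITS = {
    "tablet_weight_mg": lambda v: 95.0 <= v <= 105.0,
    "batch_id": lambda v: isinstance(v, str) and v.startswith("B") and len(v) == 7,
}

def check_input(field: str, value) -> bool:
    """Return True if the value passes the edit defined for this field."""
    edit = INPUT_EDITS.get(field)
    if edit is None:
        raise KeyError(f"no input edit defined for {field!r}")
    return bool(edit(value))

print(check_input("tablet_weight_mg", 101.3))  # True
print(check_input("tablet_weight_mg", 180.0))  # False: out of range
print(check_input("batch_id", "B202411"))      # True
```

Because such edits silently shape what data can enter the system, the limits themselves, and any mechanism for overriding them, belong in the design documentation and the I/O qualification.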

“7.1. Data Storage - Data should be secured by both physical and electronic means against damage. Stored data should be checked for accessibility, readability and accuracy. Access to data should be ensured throughout the retention period.”

“7.2. Data Storage - Regular back-ups of all relevant data should be done. Integrity and accuracy of backup data and the ability to restore the data should be checked during validation and monitored periodically.”

Computer system electronic records must be controlled, including record retention, backup and security. Computer systems must also have adequate controls to prevent unauthorized access or changes to e-records, inadvertent erasures, or loss. The validated backup procedure, including storage facilities and media, should assure the integrity and availability of e-records and audit trail records. The frequency of backups depends on the computer system's functions and the risk assessment of a loss of e-records.

Procedures for regular testing of backup and disaster recovery, including a test plan, should be in place. A log of backup testing, including the date of testing and the results, should be maintained, and a record of the rectification of any errors should be kept. The physical security of the system should also be adequate to minimize the possibility of unauthorized access, willful or accidental damage by personnel, or loss of e-records.
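One simple technical control behind the "integrity and accuracy of backup data" requirement is to record a checksum when the backup is taken and verify it during periodic monitoring and before any restore. The sketch below simulates one backup cycle with a temporary file; file names and contents are invented.

```python
# Illustrative Annex 11-7.2 backup integrity check: record a checksum at
# backup time and verify the copy against it later.
import hashlib
import os
import tempfile

def file_sha256(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "records.db")
    backup = os.path.join(d, "records.db.bak")
    with open(original, "wb") as f:
        f.write(b"lot=LOT-001;weight=12.5\n")

    checksum_at_backup = file_sha256(original)   # recorded in the backup log
    with open(original, "rb") as src, open(backup, "wb") as dst:
        dst.write(src.read())

    # Periodic monitoring: the backup must still match the recorded checksum.
    assert file_sha256(backup) == checksum_at_backup
    print("backup verified against recorded checksum")
```

A checksum proves the media still hold what was written; the ability to actually restore and read the data in the current system configuration must still be exercised per the test plan.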

Regular training in all relevant security and backup procedures for the personnel providing security and performing backups is crucial. Before hardware and/or software is exchanged, a change control mechanism should be used to check that the e-records concerned can also be managed in the new configuration. Should an unavoidable change in the hardware and/or software mean that the stored e-records cannot be managed in the new configuration, then one of the following procedures should be applied:

  • the e-records in the format concerned should be converted into a format that can be printed in the new configuration
  • the components of the old hardware and/or software configuration required for printing should be retained. In this case it should be guaranteed that a suitable alternative system is available in case the retained system fails.
  • the e-record is transferred to another medium.

The electronically stored e-records should be checked regularly for availability and integrity.

Appropriate controls for electronic documents such as templates, forms and master documents should be implemented. Appropriate controls should be in place to ensure the integrity of the record throughout the retention period.

Additional references associated with this principle can be found at: Article 9 Section 2, Commission Directives 2003/94/EC; PIC/S PI 011-3; EudraLex - Volume 4 Good manufacturing practice (GMP) Guidelines, Part I - Basic Requirements for Medicinal Products, Chapter 4 – Documentation; and US FDA 21 CFR 211.68 and 21 CFR Part 11.10(c); 11.10(d); 11.10(e); 11.10(g); 11.10(h); 11.30.

“9. Audit Trails - Consideration should be given, based on a risk assessment, to building into the system the creation of a record of all GMP-relevant changes and deletions (a system generated "audit trail"). For change or deletion of GMP-relevant data the reason should be documented. Audit trails need to be available and convertible to a generally intelligible form and regularly reviewed.”

Audit trails are control mechanisms generated by computer systems that allow all data entered and further processed by the system to be traced back to the original e-record. If an e-record needs to be changed, a second person should approve the change along with the reason. Audit trail records should be reviewed regularly, and the date and time stamps in the audit trail must be synchronized to a trusted date and time service.

One of the key controls for audit trails is the link between the electronic record and the audit trail. Audit trails can be part of the record that has been modified or a stand-alone record linked to the modified record. It must not be possible to modify audit trails; access rights to audit trail information must be limited to print and/or read only. The combination of authentication, digital certificates, encryption and access control lists provides the technical mechanisms needed to control access to audit trail files.
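One way systems make audit trails tamper-evident is a hash chain: each entry includes the hash of the previous one, so any after-the-fact edit breaks the chain. The sketch below is a minimal, hypothetical illustration of that idea, not a description of any particular product's audit trail.

```python
# Minimal sketch of an append-only audit trail with a hash chain, so that
# modification of any past entry is detectable. Illustrative only.
import hashlib
import json
from datetime import datetime, timezone

audit_trail = []

def append_audit_entry(record_id, field, old, new, user, reason):
    """Append one change record, chained to the previous entry's hash."""
    prev_hash = audit_trail[-1]["hash"] if audit_trail else "0" * 64
    entry = {
        "record_id": record_id, "field": field, "old": old, "new": new,
        "user": user, "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_trail.append(entry)

def trail_is_intact() -> bool:
    """Re-derive every hash; any edited entry breaks the chain."""
    prev = "0" * 64
    for e in audit_trail:
        body = {k: v for k, v in e.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev_hash"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

append_audit_entry("LOT-001", "weight", "12.5", "12.7", "op_ana", "re-weighed")
print(trail_is_intact())  # True
```

Note that each entry carries the identity, time stamp and documented reason the clause asks for; the hash chain only adds the technical guarantee that entries cannot be silently rewritten.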

“12.4. Management systems for data and for documents should be designed to record the identity of operators entering, changing, confirming or deleting data including date and time.”

Computer systems must have adequate controls to prevent unauthorized access or changes to e-records, inadvertent erasures, or loss. Procedures should ensure that:

  • Access rights for all operators are clearly defined and controlled, including physical and logical access.
  • Basic rules exist and are documented to ensure security related to personal passwords or pass cards and related system/e-records security requirements are not reduced or negated.
  • Correct authority and responsibilities are assigned to the correct organizational level.
  • Identification code and password issuance is periodically checked, recalled or revised.
  • Loss management exists to electronically invalidate lost, stolen or potentially compromised passwords. The system should be capable of enforcing regular changes of passwords.
  • Procedures identify prohibited passwords.
  • An audit log of breaches of password security should be kept and measures should be in place to address such breaches.
  • The system should enforce access revocation after a specified number of unsuccessful logon attempts.
  • Original information and e-records can be recovered in a validated manner following backup, media transfer, transcription, archiving, or system failure.
  • Attempted breaches of security safeguards should be recorded and investigated.

Some equipment, such as standalone computer systems, dedicated operator equipment interfaces and instruments, may lack logical access controls (e.g., passwords). These should be listed, justified and subjected to procedural controls.
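The lockout control listed above, revoking access after a specified number of unsuccessful logon attempts, can be sketched as follows. The threshold of three attempts and the class design are illustrative assumptions, not values set by Annex 11.

```python
# Sketch of access revocation after repeated failed logons.
# MAX_FAILED_ATTEMPTS is an example value, not a regulatory requirement.

MAX_FAILED_ATTEMPTS = 3

class Account:
    def __init__(self, user_id: str, password: str):
        self.user_id = user_id
        self._password = password
        self.failed_attempts = 0
        self.locked = False

    def log_on(self, password: str) -> bool:
        if self.locked:
            return False  # access already revoked; requires administrator reset
        if password == self._password:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILED_ATTEMPTS:
            self.locked = True  # record and investigate per procedure
        return False

acct = Account("op_ana", "s3cret")
for attempt in ("guess1", "guess2", "guess3"):
    acct.log_on(attempt)
print(acct.locked)            # True: access revoked after three failures
print(acct.log_on("s3cret"))  # False: correct password no longer sufficient
```

In line with the bullet list above, each lockout event would also be written to the security breach log for investigation.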

“17. Archiving - Data may be archived. This data should be checked for accessibility, readability and integrity. If relevant changes are to be made to the system (e.g. computer equipment or programs), then the ability to retrieve the data should be ensured and tested.”

The archived records need to be trustworthy and reliable, as well as accessible, no matter where they are stored. The party having primary responsibility for record retention under the predicate regulations is the party held responsible for the adequacy of archiving.

Validation

The validation of a computer system may at first appear to be a daunting task, particularly for a person without computer experience. In fact, such a person might be considered the ideal system auditor. Armed with the right tools and questions, there is no reason why any experienced auditor cannot validate a computer system (19).

As the motor vehicle industry discovered in the 1970s, trying to retrofit quality through end-of-line inspection does not work. So it is with computer systems. If the environment in which a computer system is created is not controlled, the likelihood of producing a quality system is very low. Poor quality equates to high risk in the pharmaceutical industry. If we accept that an uncontrolled environment is likely to produce a poor-quality system, then the place to start our validation is the environment. In the information services industry, companies with a controlled software development environment refer to this as their quality management system (QMS).

The Food and Drug Administration (FDA) Guideline on Principles of Process Validation defines validation as "Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes."

Using this as our guide, the process of validation should start with the environment in which a software system is produced. This, in turn, will lead us to documented evidence that a particular piece of software will consistently produce pre-determined results.

Annex 11 and Validation. Computer system validation is the formal assessment and reporting of quality and performance measures for all the life-cycle stages of software and system development, its implementation, qualification and acceptance, operation, modification, re-qualification, maintenance and retirement. The user must have a high level of confidence in the integrity of both the processes executed within the controlling computer system and those processes controlled by and/or linked to the computer system, within the prescribed operating environment.

Validation is part of the project phase. The validation clauses of Annex 11 read:

“4.1 The validation documentation and reports should cover the relevant steps of the life cycle. Manufacturers should be able to justify their standards, protocols, acceptance criteria, procedures and records based on their risk assessment.”

“4.2 Validation documentation should include change control records (if applicable) and reports on any deviations observed during the validation process.”

“4.3 An up to date listing of all relevant systems and their GMP functionality (inventory) should be available. For critical systems an up to date system description detailing the physical and logical arrangements, data flows and interfaces with other systems or processes, any hardware and software prerequisites, and security measures should be available.”

“4.4 User Requirements Specifications should describe the required functions of the computerised system and be based on documented risk assessment and GMP impact. User requirements should be traceable throughout the life cycle.”

“4.5 The regulated user should take all reasonable steps to ensure that the system has been developed in accordance with an appropriate quality management system. The supplier should be assessed appropriately.”

“4.6 For the validation of bespoke or customised computerised systems there should be a process in place that ensures the formal assessment and reporting of quality and performance measures for all the life-cycle stages of the system.”

“4.7 Evidence of appropriate test methods and test scenarios should be demonstrated. Particularly, system (process) parameter limits, data limits and error handling should be considered. Automated testing tools and test environments should have documented assessments for their adequacy.”

A procedure must be developed to delineate the normal path of the computer system validation activities. The procedure must address validation projects based on criticality and complexity. Based on these criteria, the procedure should define the documentation requirements. A validation plan may be used for any deviation from the standard computer validation project.

A crucial process during computer system development is configuration management, which establishes and maintains consistency of a system or product's performance and functional/physical attributes throughout its lifetime within its requirements, design, and operational information.

Establishing intended use and proper performance of the computer system is another key concept of validation. At the beginning of the life cycle it is essential to establish the computer system's intended use, which is one of the factors that determines the appropriate level of application validation. Commercially available software that has been qualified by the vendor/supplier does not require the same level of testing. Proper performance relates to the general principle of validation (20): planned and expected performance is based upon predetermined design specifications and, consequently, intended use.

All computer systems used to automate any regulated function must be validated for intended use. This also applies to any software used to automate design, testing, component acceptance for medical devices, manufacturing, labeling, packaging, distribution, complaint handling, or to automate any other aspect of the quality system.

In addition, computer systems used to create, modify and maintain electronic records, and used to manage electronic signatures are also subject to the validation requirements. Such computer systems must be validated to ensure accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records.
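
One common technique for discerning altered records (not mandated by Annex 11, but widely used) is to store a cryptographic checksum with each record when it is created and re-verify it on each read. A minimal sketch using Python's standard `hashlib`, with a hypothetical record format:

```python
import hashlib

# Illustrative sketch: detecting an altered electronic record via a SHA-256
# checksum computed at record creation and re-verified on each read.

def seal(record: str) -> str:
    """Return the checksum stored alongside the record at creation time."""
    return hashlib.sha256(record.encode()).hexdigest()

def is_intact(record: str, stored_checksum: str) -> bool:
    """Re-compute the checksum and compare it with the stored value."""
    return seal(record) == stored_checksum

original = "Batch 1234; assay 99.2%; released by QP"
checksum = seal(original)

assert is_intact(original, checksum)                              # unaltered record
assert not is_intact(original.replace("99.2", "89.2"), checksum)  # tampering detected
```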

Software for the above applications may be developed in-house or under contract. However, software is frequently purchased off-the-shelf for a particular intended use. All production and/or quality system software, even if purchased off-the-shelf, should have documented requirements that fully define its intended use, as well as information against which test results and other evidence can be compared, to show that the software is validated for its intended use.

Appropriate installation and operational qualifications should demonstrate the suitability of computer hardware and software to perform assigned tasks.

Other Aspects of Annex 11

In addition to the Annex's increased scope, the revisions also affect the following areas:

  • Personnel
  • Suppliers and service providers
  • Printouts
  • Periodic review
  • Security
  • Incident management
  • Electronic signatures
  • Batch release
  • Business Continuity

The above areas are not discussed in this paper.

Conclusion

Annex 11 was revised in response to the increased use and complexity of computer systems. It defines EU requirements and applies to all forms of computer systems used as part of GMP regulated activities. The new Annex 11 remains concise yet offers more practical and precise specifications that can be used to ensure that computer systems will be produced under a quality system.

Consistent with current industry practices, risk management (assessment, mitigation and evaluation) applicable to computer systems performing regulated operations takes center stage in Annex 11 and impacts all sections.

The importance of requirements management in successfully implementing or maintaining a computer system is stressed by integrating a system life cycle (SLC) traceability management process into the risk management process.

To ensure accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records, electronic records management is also emphasized in the EU regulatory specification.

The validation process, as a means to establish the quality of the computer system throughout the SLC, is another key element of Annex 11.

Notes and references

1. Segalstad, S. H., “Pharmaceutical Computer Systems Validation: A Practical Approach for Validating LIMS and Other Manufacturing Systems,” European Pharmaceutical Review, November 1997.

2. Application - Software installed on a defined platform/hardware providing specific functionality. Annex 11

3. Annex 11 Volume 4 of the Rules Governing Medicinal Products in the European Community, Computerized Systems.

4. Computer System - a system including the input of data, electronic processing and the output of information to be used either for reporting or automatic control. Eudralex Volume IV, Glossary

5. Note that a CPG, as described in 21 CFR 10.85, is considered an advisory opinion directed to FDA inspectors. These guides are the mechanisms the FDA uses to disseminate policy statements within the Agency and to the public.

6. The equivalence of software and records in this CPG has been superseded by the approach taken in the Guidance on Part 11 Scope and Application. The equivalence of equipment and computer systems still stands. In the next revision of 21 CFR Part 11, this CPG may be formally withdrawn. The predicate regulations contain implied requirements for computers.

7. Must, or the terms "required" or "shall", mean that the definition is an absolute requirement of the specification.

8. Should, or the adjective "recommended", mean that in certain circumstances valid reasons may exist to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course.

9. Infrastructure - the hardware and software such as networking software and operation systems, which makes it possible for the application to function. Annex 11.

10. Qualification - action of proving that any equipment works correctly and actually leads to the expected results. Eudralex Volume IV, Glossary.

11. GAMP Guide for Validation of Automated Systems in Pharmaceutical Manufacture, Version 5.0, Good Automated Manufacturing Practice (GAMP) Forum, International Society for Pharmaceutical Engineering, Tampa, FL, 2008.

12. Validation - action of proving, in accordance with the principles of Good Manufacturing Practice, that any procedure, process, equipment, material, activity or system actually leads to the expected results. Eudralex Volume IV, Glossary.

13. Process - set of interrelated or interacting activities that transform inputs into outputs. Note: Inputs to a process are generally outputs of other processes. ISO 9001, Quality Management Systems Requirements.

14. Critical data - can be defined as data related to patient safety, data integrity, and product quality.

15. Specification - document stating requirements. ISO 9001, Quality Management Systems Requirements.

16. Establish means define, document (in writing or electronically), and implement.

17. Raw data - All data on which quality decisions are based should be defined as raw data. Eudralex Volume IV, Glossary.

18. Edits - software may be written in such a manner as to reject or alter certain input or output information that does not conform to a pre-determined criterion or otherwise does not fall within certain pre-established limits. Edits can be a useful way of minimizing errors and/or rejecting erroneous entries. Edits can also be used to falsify information and give the erroneous impression that a process is under control.

19. P. Hill, "An Introduction to Computer System Validation," a seminar held in Sydney, Australia, 18-20 September 1996; Inspection of Computer Systems, published by the Secretariat to the PIC and PIC/S.

20. Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, and Center for Devices and Radiological Health, Food and Drug Administration, "Guideline on General Principles of Process Validation," U.S. FDA, Rockville, MD, May 1987.

Acknowledgement

I would like to express my gratitude to Ludwig Huber, Siegfried Schmitt, David Stokes, Siôn Wayne and Siri H. Segalstad who provided recommendations to improve this article.

Disclaimers

The information contained in this article is provided in good faith and reflects the personal views of the author. These views do not necessarily reflect the perspective of the publisher of this article. No liability can be accepted in any way. The information provided does not constitute legal advice.

Dedication

This article is dedicated to my grandson, Mikhail Lopez…the Jr.
