Sponsors should consider best practices for maintaining data generated during sample analysis and instrument maintenance.
This article summarizes a pharmaceutical industry consensus viewpoint on the current regulations regarding data integrity as applied to analytical test data generated to support regulatory activities. In particular, the focus will be on data generated by external personnel (e.g., a third-party laboratory or instrument service technician), hereafter referred to as a vendor.
There are considerations for data integrity (1–4) when data are generated by a vendor. For example, analytical instrumentation (e.g., chromatographic, spectroscopic, spectrophotometric, thermogravimetric, electrochemical, or microscopy instrumentation) utilized in the pharmaceutical industry generates data not only during analytical sample analysis but also during routine or preventive maintenance, such as instrument calibration or qualification, and during troubleshooting associated with repairs. For preventive maintenance or repairs, data may be generated by company personnel, hereafter referred to as the sponsor, or by vendors. The authors will refer to vendor-generated data, whether generated using the sponsor’s instrumentation or the vendor’s own instrumentation, as “outsourced data”.
Although data generated during routine testing may follow well-established processes that ensure data integrity, pharmaceutical companies rely on many vendors, as well as in-house personnel, for instrument support. As a result, the maintenance/calibration/qualification process may be fully paper based, fully paperless, or a combination of the two (i.e., hybrid documentation). As such, the approaches taken to ensure data integrity during these activities may vary and should be assessed on a case-by-case basis. In addition, the possible modes of remediation may evolve as instrument manufacturers and testing/maintenance vendors evolve their approaches and capabilities.
Within this article, data are defined as electronic and/or paper records generated during GxP testing (e.g., release or stability testing) as well as during analytical instrument calibration and/or qualification activities. Outsourced (or third-party) data are data generated on qualified/validated instruments by outside personnel. These include:
All expectations should be clearly described in the written agreement (e.g., maintenance contract, quality agreement) in any of these situations.
Data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA). Data management, retention, and archiving should be managed based on documented risk, including the format of the data. If electronic, dynamic data are available, these data or a complete, certified true copy thereof should be maintained.
Dynamic data are formatted to allow for interaction between the user and the record content. With dynamic data, a user may be able to reprocess using different parameters or modify formulas/entries that will alter a calculation result. If it is not possible to maintain the electronic, dynamic data record, a complete static representation of all data including metadata, audit trails, etc., must be maintained. Static data (data that are fixed and allow little or no interaction between the user and the record content) may allow for a more streamlined approach than that required for dynamic data. For instance, because there is no need to monitor changes in data processing or in reported results, in many cases it may be appropriate for the third party to supply only a printed or static electronic report (e.g., a PDF file) of the reported data for archive and retention. For data that were generated in static format, a static archived representation is appropriate if it is a true copy (including all relevant metadata) of the data. In all cases, the original data must be completely reconstructable.
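As one illustration (not part of the regulatory texts cited here), verification that a static archived file is byte-identical to its source can be automated by comparing cryptographic checksums. The sketch below, a minimal example in Python, assumes hypothetical file paths; in practice such a check would supplement, not replace, a documented true-copy procedure covering metadata and context.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large records do not need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_true_copy(original: Path, archived: Path) -> bool:
    """A static archived file is byte-identical to the original
    if and only if the two digests match."""
    return sha256_of(original) == sha256_of(archived)
```

Recording the digest of the original alongside the archived copy also allows the copy to be re-verified later, after the original has been handed off or retired.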
The various scenarios under which outsourced data may be generated and managed are summarized in Table I and are further discussed in the following scenarios. In each case, key requirements and points to be considered are given. For all three scenarios, the vendor should be approved by the sponsor and there should be an agreement in place for services provided. The necessary details of the agreement vary depending on the service provided and the scenario.
Scenario A: Vendor-generated data on sponsor instruments using the information technology (IT) infrastructure of the sponsor.
In cases where data are acquired and stored by the vendor through the sponsor’s standard workflows and on the sponsor’s technology infrastructure, the sponsor’s standard data integrity policies and procedures should apply. The accounts and roles utilized by the vendor should be unique and configured to ensure attributability and be specific to the work being performed. The sponsor should ensure that the vendor possesses the appropriate training required for access per the sponsor’s applicable standards and retain documentation of this assurance.
Scenario B: Vendor-generated data on sponsor instruments using the IT infrastructure of the sponsor and non-standard processes.
This scenario pertains to situations where the instrument hardware, firmware, or software is being accessed or utilized in a non-routine manner (different than it would typically be used to collect or process data). Examples could include using diagnostic mode or calibration mode, accessing the software outside of the network, saving in a different file location, or using a hybrid paper/electronic process. A risk assessment and mitigation strategy should be considered to ensure compliance with ALCOA principles.
It is recognized that some of the previously mentioned examples may require non-standard access. The accounts and roles utilized by the vendor should be unique and configured to ensure attributability and be specific to the work being performed. The sponsor should ensure (through a written agreement, direct training, or other written process) that the vendor has appropriate technical and good documentation training per the sponsor’s applicable standards and retain this documentation. Data review should follow the sponsor’s standard practices and may include a defined, documented risk-based approach for reviews of vendor-generated data. System level reviews should include reviews of vendor account access. Vendor-generated data on sponsor instruments should be in alignment with the sponsor’s standards for segregation of duties. Controls should be in place to ensure data integrity is appropriately managed. In particular:
Scenario C: Vendor-generated data stored on the IT infrastructure of the vendor.
One of the critical elements of data integrity for data generated by a third party is where the raw and reported data are stored and to whom the data are accessible. Regardless, data should be retained and archived in accordance with a quality agreement (or equivalent) and in compliance with regulatory requirements. These written agreements should establish sponsor expectations and vendor responsibilities related to data integrity controls for good manufacturing practice (GMP) or good laboratory practice (GLP) records, as well as how communication and auditing of such records should take place. The agreement should ensure the following for the vendor:
The sponsor should agree to regularly update, review, and communicate the following to the vendor:
The sponsor should clearly communicate that sponsor audits of the vendor will include a focus on data integrity elements and practices, and the sponsor should ensure that audits include assessment and evaluation of data integrity controls in place.
The written agreement should additionally delineate the record retention responsibilities of the two parties and any handoffs between vendor and sponsor at specific milestones. In cases where data are collected using software that the sponsor does not have, the vendor should retain e-records and the software necessary to make them human-readable (including metadata).
A primary concern is that of security of the raw data. Data stored on the IT infrastructure of a third party are inherently less under the control and protection of the sponsor. In cases where the third party retains the original data, it is critical that appropriate expectations and responsibilities are clearly defined in a quality agreement (or equivalent). These agreements should include the following considerations:
Additional considerations for data generated during instrument calibration, qualification, repair, and troubleshooting (applicable to Scenario A, B, or C).
Data integrity controls for analytical instruments and equipment may be challenged during internal or regulatory authority audits and inspections. It is a regulatory expectation (1–4) that the integrity of supporting instrument data is robust and that risks to these data have been adequately mitigated. Some examples where data integrity controls (including appropriate change control) are needed include raw data from calibration and qualification activities, control of standards (e.g., reference standards, external calibrated test probes, etc.) used during calibration, and management of internal and vendor documentation.
Several aspects of the data lifecycle are needed to ensure the accuracy of analytical data. Requirements include demonstration that an instrument can produce accurate results and is under adequate system controls. Adequate system controls relate to documented procedures that support initial qualification, periodic calibration and maintenance, instrument repair and troubleshooting (including change control), and periodic review of the calibration/qualification status of the instrument.
Analytical instrument qualification and/or initial calibration is critical to ensure data generated on an instrument are accurate (ALCOA). Data generated using systems other than the one under qualification or calibration (e.g., a thermocouple used to calibrate a chromatography column heater) should be subject to the same controls as other data. In a scenario where initial validation/qualification is performed prior to mitigation of known data integrity concerns, the potential impact of those concerns on the qualification testing should be considered. In general, data generated during calibration, qualification/validation, or maintenance activities should comply with all the foregoing requirements. However, there may be scenarios where data are generated outside of the normal workflow. In these cases, deviations from the normal workflow should be evaluated for risk and documented appropriately, particularly for risks that may impact the attributability or accuracy of the resultant data.
While all three scenarios, as well as many permutations of them, are possible and may be managed, Scenarios A (vendor-generated data on sponsor instruments using IT infrastructure of the sponsor) and C (vendor-generated data stored on the IT infrastructure of the vendor) are strongly preferred from a compliance and risk perspective. If Scenario B (vendor-generated data on sponsor instruments using the IT infrastructure of the sponsor and non-standard processes) is employed, a risk assessment and mitigation strategy should be considered to ensure compliance with ALCOA principles. In any case, the considerations described in this article should be evaluated to achieve compliance.
This manuscript was developed with the support of the International Consortium for Innovation and Quality in Pharmaceutical Development (IQ, www.iqconsortium.org). IQ is a not-for-profit organization of pharmaceutical and biotechnology companies with a mission of advancing science and technology to augment the capability of member companies to develop transformational solutions that benefit patients, regulators, and the broader research and development community.
1. MHRA, ‘GXP’ Data Integrity Guidance and Definitions (London, UK, March 2018).
2. FDA, Guidance for Industry: Data Integrity and Compliance With Drug CGMP Questions and Answers (Rockville, MD, December 2018).
3. World Health Organization, Guideline on Data Integrity (Geneva, Switzerland, October 2019).
4. Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme (PIC/S), Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments (Geneva, Switzerland, November 2018).
Thomas Cullen*, thomas.f.cullen@abbvie.com, and Cliff Mitchell work in Analytical Research and Development at AbbVie Inc. (North Chicago, IL); Julie Lippke and Joseph Mongillo both work in Analytical Research and Development at Pfizer Inc. (Groton, CT); Koottala S. Ramaswamy works in Global Development Quality at Merck & Co., Inc. (West Point, PA); and Thomas Purdue works in Quality at Boehringer Ingelheim Pharmaceuticals Inc. (Ridgefield, CT); all authors are members of the IQ Consortium.
*To whom all correspondence should be addressed
Pharmaceutical Technology
Supplement: Outsourcing Resources
August 2021
Pages: s16-s20
When referring to this article, please cite it as T. Cullen, C. Mitchell, J. Lippke, J. Mongillo, K.S. Ramaswamy, and T. Purdue, “Data Integrity Considerations for Vendor-Generated Data Associated with Analytical Testing,” Pharmaceutical Technology Outsourcing Resources (August 2021).