Data Integrity Challenges in Manufacturing

Publication
Article
Pharmaceutical Technology, July 2, 2016
Volume 40
Issue 7
Pages: 50–54

Designing systems using the principles of good documentation practice, including validated audit trails, is a key piece of a manufacturing data integrity program.

Data integrity, which refers to the completeness, consistency, and accuracy of data, is a key part of CGMP compliance for drugs, said FDA in its April 2016 draft guidance (1). The agency said at a 2014 conference that it anticipated more enforcement actions related to data integrity, including warning letters, product seizures, import alerts, and broader injunctions (2), and indeed, several warning letters issued since then have focused on data integrity.

Regulatory and industry organizations have tried to spell out clearly what data integrity means, and several regulatory bodies beyond FDA have published guidance on the topic. The United Kingdom's Medicines and Healthcare products Regulatory Agency (MHRA) published a guidance for industry defining data integrity in March 2015 (3). The United States Pharmacopeial Convention (USP) proposed a new General Chapter <1029> on good documentation practices (4). And in June 2016, the World Health Organization (WHO) published a guidance on good data and record management practices (5).

Data integrity should be thought of as a whole system, says Rebecca Brewer, strategic practice lead for Quality Executive Partners. “All components of the system (organization, culture, and oversight; training and performance management; data management; physical controls; and documentation practices) must be working effectively to provide the highest level of assurance of data integrity.”

"Management must align expectations with the capability of a process, site, or even a person," adds Monica Cahilly, president of Green Mountain Quality Assurance. "If the infrastructure or the resources aren't there-for example, to achieve a certain throughput-errors may result and there may be a greater risk for falsification of data to try to meet targets. Establishing and staying within the boundaries of a design space that yield a safe and effective product is fundamental to meaningful data integrity and data governance programs."

"In large-molecule production, with all the complexities of this technology compared to small molecule, companies must be mindful of what targets can be realistically achieved given the variability of the technology. Saying we can hit a target that we can not is a mistake," says Cahilly. Regulatory guidance documents are beginning to acknowledge this with more realistic targets specific to large-molecule testing. For example, a 2013 FDA draft guidance on bioanalytical methods (6), which revises a 2001 guidance, gives broader acceptance criteria (e.g., for accuracy and precision) for ligand-binding assays, notes Cahilly. 

As companies work to improve data integrity, computerized systems and electronic records are playing a key role. The International Society for Pharmaceutical Engineering (ISPE) GAMP Community of Practice started a data integrity special interest group (SIG) in January 2014 due to the high interest in this area, says Michael Rutherford, the GAMP global chair and sponsor of the SIG. The group has published concept papers and offered education sessions to help members get a handle on data integrity, including challenges with electronic records.

Paper and electronic records
Recent attention to electronic records is primarily because these systems have not yet been as closely examined as traditional paper systems, says Cahilly. But whether records are paper or electronic doesn't really matter, say experts; the core principles of ALCOA+ (data must be attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available) apply to both.

Brewer explains some of the concerns with electronic records.

PharmTech: Part of ALCOA+ is ensuring an available and enduring record. What risks are associated with this aspect of electronic records?

Brewer: The systems of today may not be fully compatible with the systems of tomorrow, and it is a constant struggle to ensure that not only the data, but also the metadata, are available in a readable, retrievable format for the duration of the retention period of that data. Although batch release data may have a limited shelf life in terms of archival, data in support of development, technology transfer, stability, validation, and continued process verification may be required for the full lifetime of the product itself.

Mitigation of these risks depends on effective planning for obsolescence. At the time of computer system/automation design, it is not too early to begin thinking about how records will be maintained in an accessible format after the system has reached obsolescence.

“Manual systems most commonly suffer from failure to be a contemporaneous record, and may not be original, accurate, and complete,” notes Brewer. “Since the inception of GMPs, the industry has been emphasizing the importance of good documentation practices, yet today we still see occasional ‘pencil whipping’ of records, where an employee finishes a series of tasks and then signs for all of them (rather than completing the entries as the tasks were accomplished) or occasions where one employee signs for another employee’s activities.” Automated, electronic systems can be better if they restrict access and ensure that entries are attributable.

Making sure records are original is key in both paper and electronic versions. "There is a trash can on the computer just like a trash can in the lab," notes Lorrie Schuessler, a co-leader of the GAMP SIG. "Controls must be in place for computers to prevent deleting or renaming files or changing a record's date. Inspectors are trained to detect the specific ways that data changes can be covered up in computer systems just as they look for fraud in paper records."
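
One illustration of such a detection control, offered here as a rough sketch rather than a practice described by the experts quoted, is a checksum manifest: by hashing every file in a data folder and comparing snapshots over time, deleted, renamed, or altered files become visible. The folder paths below are hypothetical.

```python
import hashlib
from pathlib import Path

def snapshot(folder: str) -> dict[str, str]:
    """Map each file's relative path to its SHA-256 digest."""
    digests = {}
    for path in Path(folder).rglob("*"):
        if path.is_file():
            digests[str(path.relative_to(folder))] = hashlib.sha256(
                path.read_bytes()).hexdigest()
    return digests

def detect_changes(before: dict[str, str], after: dict[str, str]) -> list[str]:
    """Compare two snapshots and report deletions, additions/renames, and edits."""
    findings = []
    for name in sorted(before.keys() - after.keys()):
        findings.append(f"missing or renamed: {name}")
    for name in sorted(after.keys() - before.keys()):
        findings.append(f"new or renamed: {name}")
    for name in sorted(before.keys() & after.keys()):
        if before[name] != after[name]:
            findings.append(f"content changed: {name}")
    return findings

# Hypothetical usage: snapshot the raw-data folder at archival time, then
# re-run periodically and review any findings as potential integrity events.
baseline = snapshot("/data/instrument_a/raw")
current = snapshot("/data/instrument_a/raw")
print(detect_changes(baseline, current))
```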

"Sometimes people change records intentionally, but sometimes they may just be trying to make it less sloppy, for example," notes Rutherford. "Any time you have a human in the process, we make mistakes. Someone might not strictly follow procedures, and there might not be strict enough controls that force good data integrity."

One challenging area is having controls to prevent 'orphan data', which are results that are acquired but not reported or reviewed. "A good example of orphan data is when there are 'test' injections performed under the auspices of an investigation or in preparation for running a sample on a chromatography system, where these samples are never included in the formal investigation or run report," explains Brewer. "These unreported sample results can, intentionally or unintentionally, prevent failing data from coming to light or serve as a way in which an operator can change the operating conditions to ensure a good result. Management oversight should include a combination of policy, procedure, training, monitoring, and metrics. 'Trust but verify' should be the watchwords for the program, with frequent inspection to confirm that no orphan data exist and that all operations comply with the intended policies and procedures."
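
As a rough sketch of one monitoring control for orphan data, the Python example below cross-checks raw acquisition files against the result identifiers that made it into a formal report; any file with no matching reported identifier is surfaced for review. The directory layout, file extension, and identifier scheme are hypothetical assumptions, not the structure of any particular chromatography data system.

```python
from pathlib import Path

def find_orphan_data(acquired_dir: str, reported_ids: set[str]) -> list[str]:
    """Return acquisition files whose identifiers never appear in a formal report."""
    orphans = []
    for raw_file in Path(acquired_dir).glob("*.raw"):
        # Assumption: the file stem carries the sample/injection identifier.
        if raw_file.stem not in reported_ids:
            orphans.append(raw_file.name)
    return sorted(orphans)

# Hypothetical usage: compare the instrument's data folder for a run against
# the identifiers pulled from the batch record or investigation report.
reported = {"INJ-0001", "INJ-0002", "INJ-0004"}
for name in find_orphan_data("/data/hplc/run_42", reported):
    print(f"Unreported acquisition found: {name}")
```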

Self-recorded data (i.e., data that are not captured directly from a networked automation system) is another concern. "In these cases, it is important to think carefully about when the 'four eyes' principle should be employed and when a 'second check' is required," suggests Brewer.

A special concern for electronic data is security: changes should be made only by authorized personnel, and these changes should be recorded. "Making a change on paper is more obvious, because you can easily initial, date, and note a reason for the change," notes Rutherford, but with electronic records, systems need to be set up to control changes. "Shared accounts or roles are common in manufacturing control systems, but this is a challenge for data integrity because, for example, six different engineers could change a parameter. Regulatory agencies would prefer not to have shared accounts. But for situations where they are necessary, such as for running a test over a 24-hour period, there must be other ways of showing the integrity of the data."

"Ideally there would be a technological solution to the problem of having shared accounts by keeping track of who to attribute data to," adds Schuessler. "If an electronic solution isn't available, sometimes the way to deal with this situation is a paper log."

"Whether you use a computer or paper, you can have data integrity issues," concludes Rutherford. "The key is to manage and control the risk so it doesn’t affect patient safety. Doing this includes having a quality culture, proper procedures, and making sure people are reviewing data properly and catching problems before they impact product quality and ultimately patient safety."

Audit trails
A data integrity program should include audit trail review, in which changes to critical data points are examined. FDA's data integrity guidance promotes a risk-based approach to reviewing the content of the original electronic record, with a focus on changes to critical data, explains Cahilly. It is important to understand that the entire electronic record is considered the original, even if only a subset of it is printed. "Regulators and quality units are now starting to understand where to find meaningful data and metadata and make more informed decisions about whether products are safe and effective," says Cahilly. "The challenge is to facilitate an efficient review by thinking through what is critical when you’re validating the system."

In addition, people need to be trained to review audit trails to find problems in electronic data, says Cahilly. "On paper, reviewers are already trained to look for cross-outs and focus on the ones that may represent significant changes that could affect process, or method, or product, for example. In computer audit trails and metadata, reviewers would also look at audit trails and other meaningful metadata to determine whether a change to data was appropriate and properly investigated, if required. A review that is risk-based requires process understanding and thus would focus on changes to data potentially impactful to process rather than those of indirect or no impact. For example, was a datapoint changed, or was the change to correct a misspelling? Audit trails in process control systems in manufacturing may track alterations to recipe parameters, some of which may be significant. A focus on prevention makes less work in detection. For example, by securing the recipe to prevent alteration of significant parameters, there will be less metadata to review.”
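
What a risk-based filter might look like in code is sketched below: only audit trail entries that changed a field classified as critical during validation are queued for review, so a corrected misspelling in a comment field does not receive the same scrutiny as a changed result. The field names and entry structure are hypothetical, not any vendor's audit trail format.

```python
# Fields classified as critical during system validation (hypothetical names).
CRITICAL_FIELDS = {"assay_result", "recipe_setpoint", "sample_weight"}

def critical_changes(audit_entries: list[dict]) -> list[dict]:
    """Return only entries where a critical field's value actually changed."""
    return [
        entry for entry in audit_entries
        if entry["field"] in CRITICAL_FIELDS
        and entry["old_value"] != entry["new_value"]
    ]

# Example: the spelling fix is filtered out; the changed result is kept for review.
entries = [
    {"field": "comment", "old_value": "stanard prep", "new_value": "standard prep"},
    {"field": "assay_result", "old_value": "98.7", "new_value": "99.4"},
]
for entry in critical_changes(entries):
    print(entry)
```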

Electronic records can be an advantage in the review process, because the data are more accessible than with paper. However, "Reviewing all data all the time is impossible," notes Rutherford. "Review by exception allows you to focus on what is most critical. Computerized systems do this well by flagging unusual conditions to be reviewed."  A manufacturing execution system (MES) can flag when numbers are modified, for example, or when set-up parameters are out of specification. "If there are too many flags, they may be ignored," warns Rutherford. "It gets back to understanding the process and knowing which ones are important to flag."
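
A review-by-exception check on set-up parameters could be as simple as the sketch below, which flags any parameter that falls outside its specified range and stays silent otherwise. The parameter names and limits are hypothetical; in practice an MES would hold this logic and its master data internally.

```python
# Hypothetical specification limits for set-up parameters.
SPEC_LIMITS = {
    "granulation_temp_c": (20.0, 25.0),
    "blend_time_min": (10.0, 15.0),
}

def flag_out_of_spec(setup: dict[str, float]) -> list[str]:
    """Return a readable flag for every set-up parameter outside its limits."""
    flags = []
    for name, value in setup.items():
        low, high = SPEC_LIMITS.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            flags.append(f"{name}={value} outside [{low}, {high}]")
    return flags

# Only exceptions are surfaced for review; in-spec parameters generate no flag.
print(flag_out_of_spec({"granulation_temp_c": 26.3, "blend_time_min": 12.0}))
```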

Current computerized systems often have audit trail or "history" features, but they may need to be turned on. In some cases, software systems don’t have what is needed, such as the ability to capture data at the time of the analysis or activity, so some redesign of software is occurring as the industry's understanding of good documentation practice grows, notes Cahilly.

Because process understanding is crucial, audit trail review should be done by the business function (the operators, engineers, or laboratory analysts) rather than by the information technology (IT) group. "The quality group can oversee the review and IT can implement a system, but the business needs to own the data and its integrity," says Rutherford.

The business group, not IT, should also be doing validation of flags and the audit system. "Validation is proving that the system meets your needs and is fit for purpose: that it provides technological control of data integrity," says Rutherford. "Some consider validation merely a documentation exercise, but really it comes down to whether you care if it works. You should know what a flag is supposed to do and test to show that it functions that way." Periodic reviews and a change-control process are also important.
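
In that spirit, a validation test states what the flag is supposed to do and demonstrates that it behaves that way. The sketch below uses Python's unittest against a stand-in flag function; in a real validation exercise the tests would exercise the actual system's flag (for example, an MES flag on modified values) rather than a re-implementation.

```python
import unittest

def flags_modified_value(original: float, entered: float) -> bool:
    """Stand-in for the flag under test: flag whenever an entered value differs."""
    return entered != original

class TestModifiedValueFlag(unittest.TestCase):
    def test_changed_value_raises_flag(self):
        self.assertTrue(flags_modified_value(100.0, 100.5))

    def test_unchanged_value_raises_no_flag(self):
        self.assertFalse(flags_modified_value(100.0, 100.0))

if __name__ == "__main__":
    unittest.main()
```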

A practice that can be used in addition to audit trail review is a forensic audit, which involves selecting high-risk targets based upon triggering criteria. "These targets are then reviewed and traced from initial data source to final data output (an end-to-end evaluation) to detect any wrongdoing," explains Brewer. "Forensic audits can be used either proactively, as part of the monitoring associated with the site management controls, or in response to a specific known failure or suspected wrongdoing."

References

  1. FDA, Draft Guidance for Industry: Data Integrity and Compliance With CGMP (April 2016).
  2. J. Wechsler, Pharmaceutical Technology 38 (9) (2014).
  3. MHRA, GMP Data Integrity Definitions and Guidance for Industry (March 2015).
  4. USP, Proposed General Chapter <1029> Good Documentation Guidelines (May 2014).
  5. WHO, Annex 5 Guidance on good data and record management practices, WHO Technical Report Series No. 996 (June 2016).
  6. FDA, Draft Guidance for Industry: Bioanalytical Method Validation (September 2013).

Article Details

Citation:
When referring to this article, please cite it as J. Markarian, "Data Integrity Challenges in Manufacturing," Pharmaceutical Technology 40 (7) 2016.
