Improving chromatographic QA/QC results

Article

Pharmaceutical Technology Europe

1 April 2006
Volume 18
Issue 4

The latest generation of HPLC instruments offers a new level of data security. Improvements in embedded instrument control programs and mass storage functionality offer a 'no data loss' guarantee. This level 5 instrument control makes data acquisition audit-safe and increases laboratory technicians' efficiency by freeing them from reanalysis work.

The loss of instrument acquisition data is not only an annoyance, but also a compliance risk that can affect the release of the final manufactured product. Integrated data buffering in the new generation of HPLC instruments allows shorter analysis review cycles with full data integrity and compliance.

These instruments now offer a 'no data loss' guarantee. This is achieved by an additional data storage system in the instrument acting as a fail-safe mechanism if communication errors occur. This sets a new data quality standard for chromatographic quality control (QC) and quality assurance (QA) analysis of final production batches prior to shipment.

Drug manufacturing must be closely monitored to ensure the highest quality of final consumer product and prevent any unwanted or harmful side-effects caused by undetected trace compounds. In addition, numerous decisions on the progress of a potential drug candidate, such as its stability and manufacturing process, must be made across the pharmaceutical value chain.

Ultimately, many of these decisions are based on the results of analytical instruments; hence the importance of high-quality analytical results and data. Here, qualitative (are all expected peaks present?) and quantitative (are calculated amounts within given limits?) results from analytical instruments must comply with regulations from the FDA and other sources that determine the manufacturing release of pharmaceutical products. In all instances, it is paramount that the data are correct and complete. Data loss is unacceptable.

How can data loss or poor data quality occur? The main cause of data loss is communication errors between instrument and storage server. Many client/server data systems, therefore, use an acquisition buffering device to buffer data before they are transferred over the network to the final server. This prevents communication errors between instrument and acquisition server from causing subsequent problems on the main company network. Consider a typical example in a pharmaceutical QC laboratory monitoring the quality of final drug products. Here, instruments must be operated 24 hours a day, 7 days a week (24/7 operation). When an instrument is operated in unattended mode, particularly at night, typical problems that may arise include instrument errors such as leaks, pressure limit violations or communication errors. Such cases lead to the loss of the analysis results of one or more runs. Should such a run be part of a system suitability test suite, or be used for compound calibration, the data from the entire analysis sequence are of no use and the sequence has to be repeated the next day. This ultimately results in costly delays in the shipment release of a batch of drugs.

Also, because of advances in analytical instrumentation, data volumes are constantly increasing. For example, with the latest generation of high-speed instruments, the typical size of data files has increased by a factor of eight, because runs that used to take 10–20 min on traditional instruments take only 2 min on the new high-speed instruments. This further increases the need to ensure good quality of instrument acquisition data, as well as of results.

A further impact of next-generation high-speed instrumentation is that the bottleneck in chromatographic analysis moves from the creation of data to their analysis. This increased demand on the analytical and medicinal chemist raises the need for more automation in the results review process, along with a stronger focus on data quality during instrument operation. In addition, during the late stages of drug development, as well as manufacturing, the second key demand on the results of analytical instruments is compliance with GMP guidelines. These include FDA's 21 CFR Parts 210 and 211, the more recent 21 CFR Part 11, and European guidelines under the umbrella of good laboratory practice (GLP). Similar guidelines also exist from the US Environmental Protection Agency (EPA), such as EPA 40 CFR 60.

To date, the high-throughput and compliance challenges have been treated as independent problems. The need to run more samples per hour was to be solved with enhanced separation techniques, while compliance was to be supported by laboratory software applications and related procedures and controls. In this article, we focus on an underlying common prerequisite that is often overlooked: the quality of the data created by the instruments. This includes the initial acquisition data from the detector module of the instrument, plus all 'meta' data (e.g., instrument logbooks; system parameters such as pressure, solvent composition or oven temperature; and any error documentation from the system) that describe the conditions under which the acquisition signal was generated. Good data quality means that data have been created reliably (i.e., the conditions of data creation are consistently documented) and are complete, with no temporary or permanent losses. If the data transfer path from instrument over acquisition controller to final server storage is safeguarded with buffering devices, results are consistently compliant, enabling efficient final release of manufactured products. Having a fully documented and completely secured data transfer path from analytical instrument to final storage location frees laboratory personnel from any rework caused by data loss. It also provides certainty when passing any laboratory-related audit and ultimately improves overall product quality.

The impact of instrument control

Loss of instrument acquisition data is a nuisance and a compliance risk: it can violate data integrity and results in costly time delays. At best, an analyst has to repeat sample preparation and analysis; at worst, the data loss may be irretrievable if the supply of sample is exhausted. As previously discussed, data loss is typically caused by communication errors between the analytical instrument and the acquisition controller. A simple instrument error status communication in a general purpose interface bus (GPIB, an instrument communication protocol based on the IEEE 488 industry standard) controlled system can bring down a complete acquisition server with up to four instruments.

The only means of error correction is rebooting this acquisition server. In unattended operation mode (such as automated overnight analysis), hundreds of sample results can be lost and all analysis steps have to be repeated. Furthermore, even a brief communication problem of just a few seconds can lead to the absence of a peak during a run. If the system then recovers and completes the run to produce data, unless somebody specifically checks the run logbook for this particular run, such an error may remain undetected. In light of this, it is very surprising that, other than adding acquisition buffering devices between instrument and network, no further measures have been taken to prevent data loss over the last 15 years. Here we describe how a new technology of embedded instrument control can completely prevent all possible data loss, thereby setting a new standard of data quality for chromatographic QA/QC analysis of final production batches prior to shipment.

Level 4 instrument control

Before we discuss the new standard for instrument control, let's start with a quick review of the current state of the art. The issues relating to data quality and results discussed above are in part addressed by compliance requirements. The first innovation in communication from instrument to workstation was driven by the FDA and its enforcement of 21 CFR Part 11. Part 11 created enormous interest in the storage of all information involved in the creation of a final result. This information is referred to as 'meta data' and includes all data acquired on the route from initial injection to final result, including acquisition conditions, run logbooks and a complete result revision history in audit trails.

To apply this concept to data acquisition, Winter and Huber defined the concept of 'level 4' instrument control in 2000 for full compliance with 21 CFR Part 11.1,2 Level 4 instrument control can best be described as bidirectional, ongoing communication between the analytical instrument and the control software. It enables users to document in the software all instrument information that is created during an analytical run (a minimal sketch of such a run record follows the list below). This includes:

  • The exact system configuration, including the instrument serial numbers and the analytical column description.

  • Documentation of any communication problem during acquisition in the run logbook.

  • Actual sample information.

  • Maintenance feedback alerting the user to necessary system maintenance tasks.
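To make the idea concrete, here is a minimal sketch in Python of what such a level 4 run record could look like. The class and field names are illustrative assumptions, not an actual chromatography data system's data model.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class Level4RunRecord:
    """Illustrative record of one analytical run with the meta data
    that level 4 instrument control documents alongside the signal."""
    sample_id: str
    instrument_serials: Dict[str, str]    # module name -> serial number
    column_description: str
    method_parameters: Dict[str, float]   # e.g., flow rate, oven temperature
    started: datetime
    run_logbook: List[str] = field(default_factory=list)

    def log_event(self, message: str) -> None:
        # Every event, including any communication problem during
        # acquisition, is time-stamped into the run logbook so the
        # audit trail stays complete.
        self.run_logbook.append(f"{datetime.now().isoformat()} {message}")
```

A communication problem during acquisition would then simply appear as a time-stamped logbook entry, for example record.log_event("GPIB timeout during acquisition").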

A graphical summary of the data flow in an instrument with level 4 instrument control is given in Figure 1. With level 4 instrument control, the instrument communicates all acquisition data, such as signals and spectra, to a data cache. From this cache the data are transferred directly to the instrument controlling device, such as a workstation PC or an acquisition controller. Data from the instrument's various sources are uploaded as they are monitored.

Figure 1

In terms of regulations, level 4 instrument control is a very comprehensive reply to the typical validation request of 'documenting' all key information. With the compliance requirements addressed, what remains open?

Compliance focuses on documentation, not error prevention. However, error prevention is the next logical step: once you are able to document an error, you will want to correct it on the fly. How does level 4 instrument control deal with the main cause of data loss, communication errors? If a communication error between the acquisition controller and the instrument persists for more than a few seconds, the instrument will lose its master control and move into an error status. If communication is restored within a few seconds, level 4 instruments can correct the error, but acquisition data will be lost. If these data include a peak used for the typical cyclical calibration of compounds, as required for system suitability checks, the entire sequence of runs will have to be reanalysed.

No data loss

So what can be done to improve data quality and prevent data loss?

The answer involves communication protocols, as well as new buffering capabilities. To secure instrument data, the three key building blocks that establish the communication have to work together:

  • The communication protocol (e.g., Ethernet/LAN, RS 232 or GPIB).

  • The embedded instrument control for data buffering.

  • The instrument control software.

The best communication protocol for the prevention of data loss is the transmission control protocol/internet protocol (TCP/IP), which is used in all standard local area network (LAN) communications because it can transfer large amounts of data securely. It can also reconnect and resend data packages if there were problems transmitting them. This built-in correction mechanism makes it superior to traditional instrument communication protocols, such as GPIB and RS 232. However, at the heart of data loss prevention are the instrument control software and, in particular, the data buffering in the embedded instrument control.
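The reconnect-and-resend behaviour can be pictured with a short Python sketch. This is a simplified, application-level illustration (the host, port and one-byte acknowledgement scheme are assumptions for the example), not the protocol of any particular instrument.

```python
import socket
import time

CHUNK_SIZE = 4096  # bytes per transfer unit (illustrative)

def send_with_reconnect(host: str, port: int, payload: bytes,
                        max_attempts: int = 5) -> None:
    """Send payload in chunks; if the connection drops, reconnect and
    resend from the first unacknowledged chunk, so nothing is lost."""
    chunks = [payload[i:i + CHUNK_SIZE]
              for i in range(0, len(payload), CHUNK_SIZE)]
    acked = 0      # index of the next chunk the receiver still needs
    attempts = 0
    while acked < len(chunks):
        try:
            with socket.create_connection((host, port), timeout=10) as conn:
                # Tell the receiver which chunk we are resuming from.
                conn.sendall(acked.to_bytes(8, "big"))
                for i in range(acked, len(chunks)):
                    conn.sendall(chunks[i])
                    if conn.recv(1) != b"\x06":  # wait for a 1-byte ACK
                        raise ConnectionError("missing acknowledgement")
                    acked = i + 1
        except OSError:
            attempts += 1
            if attempts >= max_attempts:
                raise                      # give up after repeated failures
            time.sleep(2 ** attempts)      # back off before reconnecting
```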

The 'traditional way' of data acquisition continuously uploads all acquisition data, at the time of their creation, through the communication interface to the controlling device, such as a PC. Any interruption of the communication between instrument and controlling device, for example through a network problem or an error in the communication protocol, results in immediate data loss. These connection problems may be brief (a matter of seconds) and, therefore, are likely to go undetected. With traditional instrumentation, such brief problems have not posed a big risk to data integrity: because typical run times are 20 min or more to detect 10 to 15 peaks, the likelihood of missing a complete peak is relatively low. However, with the latest generation of high-speed instruments, run times for the same number of peaks can be reduced to just 2 min. With these short run times, a communication loss of even 5 s may mean that a complete peak is missing. While data loss per se is not acceptable, the undetected loss of an individual peak from a quality control chromatogram may hold back the final shipment of batches of drugs, and significantly delay delivery and invoicing of final products.
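As a rough illustration (the 20 Hz sampling rate is an assumed figure for the arithmetic, not a specification of the instruments discussed): a 5 s dropout at 20 Hz removes 20 × 5 = 100 consecutive data points, and in a 2 min run resolving 10 peaks the peaks are on average only about 12 s apart and just a few seconds wide at the base, so such a gap can easily span an entire peak.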

Data buffering advantages

Only data buffering at the location of creation, in the instrument itself, can totally prevent data loss. For such complete data buffering to be possible, a new concept of instrument control is required: level 5. At the heart of this concept is a way of handling, storing and managing data with a back-up copy already within the instrument. In addition to the 'traditional way', a copy of all acquisition data is now stored in the instrument module itself.

This advance has only become available for the latest generation of chromatographic instruments because of price drops in computer chips and mass storage devices. With these changes, the embedded instrument control can now operate as a small computer with mass storage and RAM for transactions. These advances have increased the amount of mass storage and the number of transactions by a factor of 100.

This massive increase enables instrument control, for the first time, to go beyond the mere translation of control commands from the control software to the actual instruments. Such high-speed instruments can now store all instrument results and raw data on the mass storage device of the chromatographic detection system itself. This storage capacity within the actual instrument now enables the instrument to operate independently of the instrument control software that runs on a PC or an acquisition controller in a networked chromatographic data system (CDS). This makes the instrument completely immune to any communication errors interrupting the data transfer between instrument and the acquisition controller or workstation PC.

How is this achieved? The first implementation (as used in rapid resolution LC systems) decouples the instrument control and mass storage functions in the embedded control software, enabling full back-up storage on the mass storage device in the instrument. A dedicated data buffer in the instrument has the capacity to store instrument raw data, meta data and results amounting to about 25 min of acquisition data in rapid resolution mode (or 6–8 h of acquisition data in standard mode). The system stores acquisition raw data as well as all kinds of results created by the system, for instance fraction storage information or sample locations. These data are stored whenever the system is operating, thus providing a buffer of all acquisition data.
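A useful consequence of these figures: for a fixed buffer size, the buffered duration scales inversely with the data rate, so 25 min in rapid resolution mode versus 6–8 h (360–480 min) in standard mode implies that rapid resolution mode produces data roughly 360/25 ≈ 14 to 480/25 ≈ 19 times faster.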

The implementation of the 'no data loss' concept is shown in Figure 2. In addition to the immediate transfer of data from instrument to instrument controller (level 4), data are stored in a cyclical way on the embedded mass storage device. The embedded instrument control firmware copies all actual data from the instrument to the cyclical storage. It also adds markers for run start/end events, so that the information can be identified if a communication loss makes an upload from the cyclical storage to the controller necessary (a minimal sketch of such a buffer follows Figure 2).

Figure 2
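Here is a minimal Python sketch of such a cyclical buffer with run start/end markers. The class and method names are illustrative assumptions, not the firmware's actual interface.

```python
from collections import deque
from typing import Any, Iterator

class CyclicalRunBuffer:
    """Fixed-capacity buffer mirroring every acquisition record inside
    the instrument; once full, the oldest records are overwritten first.
    Run start/end markers let whole runs be recovered after a
    communication loss."""

    def __init__(self, capacity: int) -> None:
        self._entries: deque = deque(maxlen=capacity)

    def mark_run_start(self, run_id: str) -> None:
        self._entries.append(("RUN_START", run_id))

    def mark_run_end(self, run_id: str) -> None:
        self._entries.append(("RUN_END", run_id))

    def store(self, record: Any) -> None:
        # Called for every data point and meta data record, in parallel
        # to the live upload to the acquisition controller.
        self._entries.append(("DATA", record))

    def replay(self, run_id: str) -> Iterator[Any]:
        """Yield the buffered records of one run, e.g., when the
        controller reconnects and requests the data it is missing."""
        inside = False
        for kind, payload in self._entries:
            if kind == "RUN_START" and payload == run_id:
                inside = True
            elif kind == "RUN_END" and payload == run_id:
                return
            elif inside and kind == "DATA":
                yield payload
```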

In the context of this discussion, data buffering is of course the most important aspect of ensuring the quality of data, as well as of results. However, the mass storage of data on the instrument has additional advantages. For example, advancements in the instrument control software enable users to obtain early maintenance feedback on the buffered instrument data, or to store work lists directly on the instrument. The data buffering ensures that, in the event of a communication error, all data are safely buffered on the instrument. Once the error is corrected, the instrument reconnects to the controller software and uploads the buffered data. This means that even in the event of a complete communication breakdown, data are safe. This may save one or several days of reanalysing samples, particularly for overnight analysis in unattended operation.
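Reusing the CyclicalRunBuffer sketch above, the recovery after a communication breakdown could look like the following; upload_to_controller is a stand-in for the CDS upload call, not a real API.

```python
def upload_to_controller(record) -> None:
    """Stand-in for the real upload to the acquisition controller."""
    print("uploaded:", record)

# During acquisition, the instrument mirrors everything into the buffer:
buffer = CyclicalRunBuffer(capacity=100_000)
buffer.mark_run_start("overnight-042")
for data_point in [0.01, 0.02, 1.35, 0.04]:   # toy detector readings
    buffer.store(data_point)
buffer.mark_run_end("overnight-042")

# After the link is restored, the controller requests the missing run
# and the instrument replays it from the cyclical storage:
for record in buffer.replay("overnight-042"):
    upload_to_controller(record)
```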

Summary

Because of new technology in the embedded instrument control of the latest generation of HPLC systems, a new level of data quality can be achieved. With level 5 instrument control, the system provides a 'no data loss' guarantee through an integrated storage system in the instrument. This stores all acquisition data and keeps them available as a back-up in case of data losses, such as those caused by communication errors. With this integrated data buffering, such instruments deliver reliable analyses and support shorter analysis review cycles with full data integrity and compliance. This is particularly important within the pharmaceutical manufacturing process because it increases sample throughput and makes data acquisition audit-safe.

Data buffering at the source of acquisition — which is within the detector of a chromatographic instrument — prevents any data loss during data acquisition; for example, through communication errors or network problems. Furthermore, it will make data acquisition fully transparent and audit-safe by documenting all acquisition parameters with the final compound results (level 4 instrument control) and preventing any data loss through data buffering in the instrument (level 5 instrument control).

This data loss prevention by level 5 instrument control delivers a new level of data quality, ensuring error-free data acquisition by chromatographic instruments even when they are operated in unattended mode. This is particularly important for pharmaceutical manufacturing QA/QC laboratories that must run 24/7 to ensure timely and efficient final release of product.

Christoph Nickel is product marketing manager laboratory informatics at Agilent Technologies, Germany.

References

1. Code of Federal Regulations, Title 21, Food and Drugs, Part 11, "Electronic Records; Electronic Signatures; Final Rule," Federal Register 62(54), 13429–13466 (1997).

2. L. Huber and W. Winter, BioPharm, 13(9), 52–56 (2000).
