Regulatory submissions are a critical step in bringing new drugs and medical interventions to market, and the success of these submissions heavily relies on the quality and fitness of the data presented within the submission package to regulatory bodies. A recent FDA analysis showed that 32% of study data in submissions had significant issues with data conformity. If a submission is rejected due to non-conformance with study data requirements, it does not progress to the FDA Electronic Submissions Gateway, nor does it enter FDA electronic document rooms, which is where the official FDA review process begins. Of the new molecular entity (NME)/investigational new drug (IND) applications that successfully pass the study data conformance screening, only 50% are approved by FDA on their first submission. There is then a median delay of 435 days to approval following the first unsuccessful submission, which in turn postpones the availability of crucial new medications to patients. The stringent requirements set forth by regulatory bodies such as FDA and the European Medicines Agency (EMA) are not just procedural hurdles; they are essential safeguards for the integrity of the clinical trial process, patient safety, and drug efficacy.
The rapid advancements in medical science and data technology create both opportunities and challenges in maintaining the highest standards of data quality. As the industry navigates the complexities of clinical trials, the importance of accurate, reliable, and robust data cannot be overstated. Data must accurately reflect the clinical trial's findings, as inaccuracies can lead to incorrect conclusions about a drug's safety and efficacy, ultimately affecting patient health and public safety. This is why adherence to standards such as those provided by the Clinical Data Interchange Standards Consortium (CDISC) is essential to ensure that data are consistent, interpretable, and can be efficiently reviewed by regulatory bodies such as FDA. Moreover, complete and comprehensive data are crucial; any gaps can lead to the delays outlined above, as well as to questions from regulatory bodies that add further delay while they are resolved. Equally important is ensuring that the data remain current and relevant to the specific investigational drug and its intended use, as outdated or irrelevant data can skew the assessment of a drug's profile. These elements collectively uphold the integrity of data, which is fundamental to the decision-making process of drug approval and, ultimately, to safeguarding public health.
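To make the role of standards concrete, the following is a minimal sketch, assuming Python with pandas, of the kind of automated conformance check that validated tools such as Pinnacle 21 or CDISC's open-source CORE engine run at scale against SDTM datasets. The variable list, controlled-terminology subset, and function name here are illustrative assumptions for a Demographics (DM) domain, not an official CDISC specification.

```python
# A minimal, illustrative conformance check for an SDTM-style Demographics (DM)
# dataset. The variable list and terminology below are a simplified subset of
# the SDTM Implementation Guide and are not a substitute for a validated tool.
import pandas as pd

REQUIRED_DM_VARIABLES = ["STUDYID", "DOMAIN", "USUBJID", "SUBJID", "SEX"]
SEX_CONTROLLED_TERMS = {"M", "F", "U", "UNDIFFERENTIATED"}  # CDISC CT subset

def check_dm_conformance(dm: pd.DataFrame) -> list[str]:
    """Return a list of human-readable findings; an empty list means no issues."""
    findings = []

    # Structural check: every required variable must be present.
    for var in REQUIRED_DM_VARIABLES:
        if var not in dm.columns:
            findings.append(f"Missing required variable: {var}")

    # Completeness check: required variables must not contain nulls.
    for var in REQUIRED_DM_VARIABLES:
        if var in dm.columns and dm[var].isna().any():
            findings.append(f"Null values found in required variable: {var}")

    # Controlled-terminology check: SEX must use CDISC submission values.
    if "SEX" in dm.columns:
        bad = set(dm["SEX"].dropna()) - SEX_CONTROLLED_TERMS
        if bad:
            findings.append(f"Non-conformant SEX values: {sorted(bad)}")

    # Uniqueness check: USUBJID must uniquely identify each subject in DM.
    if "USUBJID" in dm.columns and dm["USUBJID"].duplicated().any():
        findings.append("Duplicate USUBJID values in DM")

    return findings
```

Running checks of this kind throughout the trial, rather than only at submission time, is what keeps datasets from failing the conformance screening described above.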
Maintaining data quality is fraught with challenges. These can range from technological limitations in data capture and storage to human errors in data entry and analysis. The complexity of clinical trials, involving multiple sites and varying patient populations, adds another layer of difficulty. The following highlights the main issues each of these challenges can bring:
Each of these challenges requires a tailored approach to mitigate risks and ensure the integrity and quality of data in regulatory submissions.
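As one illustration of such a tailored approach, the sketch below shows a simple programmatic edit check aimed at human data-entry errors. The column names, plausibility limits, and sample records are hypothetical; in practice, these limits are defined in each study's data validation plan before the database goes live.

```python
# A hedged sketch of a programmatic edit check for human data-entry errors,
# assuming a pandas DataFrame of vital-signs records. Limits are hypothetical.
import pandas as pd

PLAUSIBILITY_LIMITS = {
    "SYSBP": (60, 260),   # systolic blood pressure, mmHg
    "DIABP": (30, 160),   # diastolic blood pressure, mmHg
    "PULSE": (30, 220),   # heart rate, beats/min
}

def flag_suspect_records(vitals: pd.DataFrame) -> pd.DataFrame:
    """Return rows with out-of-range, missing, or non-numeric measurements,
    for routing back to the investigational site as data queries."""
    mask = pd.Series(False, index=vitals.index)
    for col, (low, high) in PLAUSIBILITY_LIMITS.items():
        if col in vitals.columns:
            # errors="coerce" turns non-numeric entries (e.g., "12O" typed
            # with a letter O) into NaN so they are flagged alongside
            # genuinely missing values.
            values = pd.to_numeric(vitals[col], errors="coerce")
            mask |= values.lt(low) | values.gt(high) | values.isna()
    return vitals[mask]

# Example usage with hand-entered records:
vitals = pd.DataFrame({
    "USUBJID": ["001", "002", "003"],
    "SYSBP": [120, 1200, "12O"],  # 1200 is a likely transcription error
    "DIABP": [80, 75, 70],
    "PULSE": [72, 68, None],
})
print(flag_suspect_records(vitals))  # flags subjects 002 and 003
```

Checks like this run continuously during the trial so that queries reach sites while source documents are still readily accessible, rather than surfacing as findings at submission time.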
Claude Price is head of Clinical Data Management at Quanticate.
Pharmaceutical Technology, eBook: Quality and Regulatory Sourcebook 2024, March 2024, pp. 35–37.

When referring to this article, please cite it as Price, C. The Importance of Quality Data for Regulatory Submissions. Pharmaceutical Technology®/Pharmaceutical Technology Europe® Quality and Regulatory Sourcebook eBook (March 2024).