Published on 06/05/2026
Case Study on Data Integrity Breach: Uncontrolled Excel Calculations in Batch Release
Data integrity breaches in pharmaceutical manufacturing can lead to significant compliance issues, particularly in batch release processes. In this case study, we will explore a realistic scenario involving uncontrolled Excel calculations that were discovered during the batch release phase of a sterile injectable. This article will guide you through detection, containment, investigation, and CAPA strategies that help mitigate such risks in the future.
After reading this article, you will be equipped with practical tools and strategies to address data integrity breaches, reduce the risk of similar issues, and respond effectively during inspections.
Symptoms/Signals on the Floor or in the Lab
During routine audits of batch release processes in a sterile injectable production facility, the QA team noted discrepancies in the final calculations of batch yields and excipient percentages reported in batch records. These inconsistencies were traced back to Excel spreadsheets used for data analysis, raising significant red flags regarding the reliability of the data reported. The following symptoms were identified:
- Final batch yields in released reports that did not match values recalculated from production logs.
- Excipient percentages that were inconsistent between the Excel spreadsheets and the batch records.
- Excel formulas that had been modified without a recorded change history or authorization.
These symptoms indicated a broader problem related to data governance and integrity controls in the manufacturing process, which needed immediate attention.
Likely Causes
After initial observations, a deeper dive into potential causes of the discrepancies revealed several areas of concern categorized under six main factors: Materials, Method, Machine, Man, Measurement, and Environment.
| Category | Potential Causes |
|---|---|
| Materials | Unvalidated data imports into Excel sheets. |
| Method | Lack of standardized procedures for data entry and validation. |
| Machine | Outdated computational tools not integrated with the LIMS. |
| Man | Insufficient training on data integrity and documentation practices. |
| Measurement | Manipulation of formulas or errors in data extraction. |
| Environment | Inconsistent data backup routines and a lack of oversight. |
Immediate Containment Actions (first 60 minutes)
Upon identifying the potential data integrity breach, the following containment actions were executed within the first hour:
- Suspension of all batch release activities that utilized data from affected Excel sheets.
- Immediate communication to the senior management and quality assurance team to alert them of the situation.
- Isolation of all files used in the calculations to prevent further discrepancies.
- Engagement of IT and QA teams to review the modification history of the Excel files in question.
- Documentation of the initial findings in a deviation report clearly describing the potential breach.
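Isolating the affected files can be supported by a simple integrity snapshot that records each file's size, last-modified time, and hash, so any change after isolation is detectable. This is a minimal sketch only; the file paths and the exact fields recorded are illustrative assumptions, not the facility's actual procedure:

```python
import hashlib
import os
from datetime import datetime, timezone

def snapshot_files(paths):
    """Record size, last-modified time (UTC), and SHA-256 hash for each file,
    so any later change to the isolated spreadsheets is detectable."""
    records = []
    for path in paths:
        stat = os.stat(path)
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        records.append({
            "file": path,
            "size_bytes": stat.st_size,
            "modified_utc": datetime.fromtimestamp(
                stat.st_mtime, tz=timezone.utc).isoformat(),
            "sha256": digest,
        })
    return records
```

Re-running the snapshot during the investigation and comparing hashes confirms the isolated copies were not altered after containment.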
Investigation Workflow (data to collect + how to interpret)
The investigation was structured to collect relevant data and accurately interpret its implications:
- Data Collection:
- Gather all versions of the Excel spreadsheets used in the batch release process.
- Collect batch records, including yields, excipients, and quality control results for the last 10 batches.
- Interview personnel involved in data entry and calculations for firsthand insights.
- Data Analysis:
- Compare finalized batch reports against production logs to identify discrepancies.
- Review audit trails of the Excel files to identify unauthorized modifications.
- Assess training records and compliance with data governance policies for applicable personnel.
The analysis aimed to pinpoint the source of errors and evaluate how they impacted batch release integrity, while also guiding corrective measures.
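The comparison of finalized batch reports against production logs can be sketched as a tolerance check per batch. This is a simplified illustration under assumed data structures (yields keyed by batch ID, a hypothetical 0.1-point tolerance), not the actual analysis script:

```python
def find_yield_discrepancies(reported, recalculated, tolerance_pct=0.1):
    """Compare yields reported in Excel batch reports against yields
    recalculated from production logs; flag batches outside tolerance
    or missing from the logs entirely."""
    discrepancies = []
    for batch_id, reported_yield in reported.items():
        expected = recalculated.get(batch_id)
        if expected is None:
            discrepancies.append(
                (batch_id, reported_yield, None, "missing in production log"))
        elif abs(reported_yield - expected) > tolerance_pct:
            discrepancies.append(
                (batch_id, reported_yield, expected, "outside tolerance"))
    return discrepancies
```

Each flagged batch then becomes a line item in the deviation record, with the reported value, the recalculated value, and the reason.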
Root Cause Tools (5-Why, Fishbone, Fault Tree) and when to use which
To methodically approach the root cause of the data integrity breach, we utilized the following tools:
- 5-Why Analysis: This technique enabled the team to drill down to the fundamental root cause by asking “why” repeatedly. For example:
- Why were there discrepancies? Because of calculation errors.
- Why did calculation errors occur? Due to manual data entry in Excel.
- Why was manual data entry necessary? Because automated data transfer from LIMS was unavailable.
- Why was automated transfer unavailable? Due to outdated software.
- Why was the software outdated? Lack of a software upgrade plan or budget allocation.
- Fishbone Diagram: This visual tool facilitated brainstorming of potential causes under the six categories used earlier (Materials, Method, Machine, Man, Measurement, Environment). It helped illustrate relationships and identify contributing factors.
- Fault Tree Analysis: Employed for complex scenarios, this method mapped out possible failure paths leading to the integrity breach, analyzing systems and process interactions to identify weak points in the workflow.
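The logic of a fault tree can be made concrete with a small sketch: basic events combine through OR and AND gates into the top event. The event names below are illustrative, not taken from the actual investigation; the structure reflects the general idea that a defect reaches release only when both a defect occurs and the detection layer fails:

```python
def or_gate(*events):
    # True if any contributing event occurs
    return any(events)

def and_gate(*events):
    # True only if all contributing events occur
    return all(events)

def release_data_unreliable(manual_entry_error, formula_tampered,
                            no_second_check, audit_trail_disabled):
    """Top event: unreliable data in a released batch record.
    A defect must be introduced AND every detection layer must fail."""
    defect_introduced = or_gate(manual_entry_error, formula_tampered)
    detection_failed = and_gate(no_second_check, audit_trail_disabled)
    return and_gate(defect_introduced, detection_failed)
```

Reading the tree this way makes the weak point explicit: strengthening any single detection layer (a second check, or audit trail review) blocks the path to the top event.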
CAPA Strategy (correction, corrective action, preventive action)
Based on the investigation findings, the following CAPA strategy was formulated:
- Correction: Re-verification of the affected batch calculations, together with immediate retraining of all staff involved in batch release documentation on appropriate data governance practices.
- Corrective Actions:
- Replace the Excel tools with a more reliable solution integrated with the existing LIMS.
- Implement a double-check mechanism in the batch release process to verify key calculations.
- Preventive Actions:
- Establish and enforce revised SOPs for usage of Excel for batch calculations, ensuring validation of results.
- Conduct regular audits and risk assessments on data integrity controls, maintaining compliance to regulatory standards.
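The double-check mechanism for key calculations amounts to an independent recalculation from raw counts, compared against the figure in the batch record. A minimal sketch, with the function name and the 0.05-point agreement tolerance as assumptions:

```python
def independent_yield_check(units_produced, units_started,
                            reported_yield_pct, tolerance_pct=0.05):
    """Second-person verification step: recalculate percent yield from
    raw counts and confirm it agrees with the batch record figure."""
    if units_started <= 0:
        raise ValueError("units_started must be positive")
    recalculated = 100.0 * units_produced / units_started
    agrees = abs(recalculated - reported_yield_pct) <= tolerance_pct
    return recalculated, agrees
```

In practice the verifier records both values and signs the check, so disagreement triggers a deviation before the batch is released.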
Control Strategy & Monitoring (SPC/trending, sampling, alarms, verification)
To prevent recurrence of such breaches and enhance control, a comprehensive control strategy was put into place:
- Statistical Process Control (SPC): Implementation of SPC tools to monitor batch yields and deviations in real-time, ensuring immediate identification of anomalies.
- Trend Analysis: Perform monthly trend analysis of data integrity errors in batch releases to identify patterns or recurring issues.
- Sampling Plans: Design a robust sampling plan to verify calculations periodically, ensuring data integrity before finalizing batch reports.
- Alarm Systems: Integration of alerts for deviation thresholds that trigger immediate investigation of data entry errors.
- Verification: Confirm reported data against physical records and electronic entries through audits by designated personnel.
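The SPC monitoring above can be sketched as an individuals-style control chart: yields outside mean ± 3 standard deviations are flagged for immediate investigation. This is a simplified illustration (a production implementation would typically derive limits from a moving range over a stable baseline period, not from the full data set including suspect points):

```python
from statistics import mean, stdev

def control_limits(yields):
    """Simplified individuals-chart limits: mean +/- 3 standard deviations."""
    center = mean(yields)
    sigma = stdev(yields)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(yields):
    """Return (index, value) for every yield outside the control limits."""
    lcl, _, ucl = control_limits(yields)
    return [(i, y) for i, y in enumerate(yields) if y < lcl or y > ucl]
```

Points returned by `out_of_control` would trigger the alarm threshold described above and open an immediate data-entry investigation.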
Validation / Re-qualification / Change Control impact (when needed)
As part of the corrective measures, validation and re-qualification efforts were essential:
- Re-qualification of the upgraded Excel system involved testing interoperability with the LIMS, documenting procedures, and ensuring user acceptance testing (UAT).
- Validation of the entire batch release process was re-assessed to ensure compliance with regulatory expectations, documenting any changes instituted during the CAPA implementation.
- Change control procedures were revised to reflect the updated processes and software, ensuring that future modifications go through formal validation and approval routes.
Inspection Readiness: what evidence to show (records, logs, batch docs, deviations)
To maintain inspection readiness, the following documentation and records were crucial:
- All deviation reports and CAPA documentation, detailing the investigation process.
- Logs from Excel modifications, including timestamps and personnel details on who executed changes.
- Training records showing that employees completed data integrity training.
- Process maps of modified procedures for data entry and calculations.
- Graphs illustrating trended data to showcase the effectiveness of corrective actions taken.
FAQs
What constitutes a data integrity breach in manufacturing?
A data integrity breach occurs when data is inaccurately recorded or altered without proper authorization or controls, risking the validity of production and quality processes.
How can we ensure compliance with data integrity principles?
Implement a robust data governance framework that includes training, monitoring, validation of systems, and clear documentation protocols.
What are the key components of an effective CAPA process?
Key components include identifying the root causes of issues, developing corrective actions, implementing preventive measures, and documenting everything for future reference.
How often should training on data integrity be conducted?
Regular training should occur at least twice a year, with refreshers as needed, especially after any significant changes in procedures or systems.
Are there specific regulations governing data integrity in pharmaceuticals?
Yes, governing bodies such as the FDA and EMA provide guidelines on data integrity, mandating adherence to Good Manufacturing Practices (GMP) and expectations for data verification.
How can audit trails assist in data integrity cases?
Audit trails provide transparency in data modifications, helping to trace unauthorized changes and verify the accuracy of reported data.
What role does technology play in maintaining data integrity?
Technology helps automate processes, reduce human error, and continuously monitor data accuracy.
What should be included in a data governance policy?
A data governance policy should outline data handling, integrity controls, validation protocols, training requirements, and accountability measures.
What is the significance of electronic signatures in batch documentation?
Electronic signatures ensure accountability and authenticity of data entries while maintaining compliance with regulatory standards, in place of traditional handwritten signatures.
How can risk assessment activities be integrated into data integrity management?
Regular risk assessments can identify potential vulnerabilities in data processes, enabling proactive measures to secure data integrity across operations.
What follow-up actions should be taken after an investigation concludes?
Follow-up actions include conducting regular reviews of the implemented CAPA, verifying adherence to corrected procedures, and planning future audits to ensure compliance.
Is it possible to use manual calculations while maintaining data integrity?
While manual calculations can still be used, comprehensive checks, validations, and documentation are essential to safeguard data integrity in the process.