
Good Manufacturing Practice (GMP) data integrity: a new look at an old topic, part 2

Writer: josephmorris77

Data integrity is fundamental to a pharmaceutical quality system, which ensures that medicines are of the required quality. A robust data governance approach will ensure that data is complete, consistent and accurate, irrespective of the format in which it is generated, used or retained.

This is the second in a series of 3 posts exploring the impact of organisational behaviour and procedures on reliable, consistent and accurate data in medicines manufacture. The first post in this series looked at the impact of organisational behaviour.

Designing systems to assure data quality and integrity


Data charts on a screen

A mature data governance system adopts a ‘quality risk management’ approach across all areas of the quality system. It requires continuous review, proportionate risk-reduction measures, and an understanding of residual risk across the organisation. Despite recent high-profile regulatory cases regarding falsification of analytical data, the collective experience of the MHRA Inspectorate is that data governance is not limited to laboratories or computerised systems. There are opportunities to strengthen both paper and computerised elements of the data lifecycle.

A useful acronym when considering data integrity is ALCOA: data must be attributable, legible (permanent), contemporaneous, original and accurate. The expectations for designing systems which reduce opportunities for data integrity failure are described in more detail in guidance published by MHRA. Simple (and often low-cost) system design can have a significant impact on the success of data governance. Some examples are given below, organised by the ALCOA principles.

Attributable:

The identity of the person completing a record should be unambiguous. Aliases or abridged names should only be permitted where they are used consistently and are attributable to a single individual. Shared aliases, or IT system log-ins which cannot differentiate between individuals, should not be used.
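As a rough illustration (not MHRA guidance), the 'attributable' test on log-ins can be expressed as a simple check: every alias or system log-in must map to exactly one named individual. The function and registry shape below are hypothetical.

```python
def unattributable_logins(registry):
    """Return the aliases or log-ins that cannot be tied to one person.

    `registry` (hypothetical) maps each alias or system log-in to the set
    of individuals who use it. Anything shared by several people, or by
    nobody, fails the 'attributable' test.
    """
    return sorted(alias for alias, users in registry.items() if len(users) != 1)
```

For example, a registry where `"lab_user"` is shared by two analysts would be flagged, while a personal log-in used by one named person would pass.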

Legible (permanent):

It should not be possible to modify or recreate data without an audit trail which preserves the original record. It is important not to forget paper records in this context. Blank forms for manual recording of data should also be controlled in a manner which prevents unauthorised re-creation.
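To make the audit-trail idea concrete, here is a minimal sketch (hypothetical class and field names, not any particular validated system) of an append-only record: a correction adds a new entry with its own author, timestamp and reason, and the original entry is never overwritten.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AuditEntry:
    value: str
    recorded_by: str
    recorded_at: datetime
    reason: str


@dataclass
class Record:
    """An append-only record: corrections never modify or remove the
    original entry, so the full history remains reviewable."""
    history: list = field(default_factory=list)

    def write(self, value, user, reason="original entry"):
        # Every write, including a correction, appends a new audit entry.
        self.history.append(
            AuditEntry(value, user, datetime.now(timezone.utc), reason))

    @property
    def current(self):
        return self.history[-1].value
```

Correcting an entry of "5.2 mg" to "5.3 mg" would then leave both values visible, each attributed and time-stamped, with the reason for the change recorded.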

A pile of papers and folders

It is important not to forget paper records


Exceptionally, there may be a valid reason to re-create a record, eg where it has been damaged beyond use, or where an error does not enable a GMP-compliant correction of the original. This must be managed through the quality system, either by making a ‘true copy’ (verified as a true replicate of the original), or by re-writing a new copy and retaining the original as evidence. In either case, the action must be approved with QA oversight and a documented justification.

It is generally accepted that correction fluid is not acceptable in GMP areas. However, companies may be unaware that their computerised systems often have ‘data annotation tools’ enabled. These permit changes to data which can alter the appearance of reports, and may not have a visible audit trail. From a practical perspective, this is ‘electronic correction fluid’, and should not be permitted.

Contemporaneous:

System design has significant impact upon contemporaneous record keeping. The availability of records in the right place at the right time removes the need for staff to use loose scraps of paper, or their memory, to retain information for retrospective completion in the official record.

When inspecting packaging operations, I still find it a common approach for manufacturers to use a single batch packaging record (BPR) for blistering and cartoning of a solid dosage form. However, if the BPR is located in the secondary packing area, it is impossible for staff in the primary packing area to make contemporaneous records, and vice versa. The BPR may also require periodic checks, such as equipment performance. Specifying exact time intervals (eg ‘every 60 minutes’) may result in an incentive for staff to ‘back date’ the time of the check if they were occupied at the exact time the activity was required. The system is encouraging staff to falsify the record, particularly if there is concern that missing an exact time point might lead to disciplinary measures.

Various medicines in blister packs

Splitting the BPR into 2 parts (primary and secondary) encourages the correct behaviour


This can be addressed by 2 simple changes: specifying an acceptable window for completion of the activity (eg ‘every 60 ±5 minutes’), and splitting the BPR into 2 parts (primary and secondary). Together, these encourage the correct behaviour and remove both the opportunity and the incentive to falsify the record.
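The tolerance-window idea can be sketched in a few lines (a hypothetical helper, not any real batch-record system): a periodic check is acceptable anywhere inside the window, rather than only at a single exact time point.

```python
def check_in_window(elapsed_minutes, interval=60, tolerance=5):
    """Return True when a periodic check falls inside the accepted window,
    eg 'every 60 ±5 minutes' rather than exactly 'every 60 minutes'."""
    return interval - tolerance <= elapsed_minutes <= interval + tolerance
```

A check performed at 58 or 64 minutes is then compliant as recorded, removing the incentive to back-date it to the exact hour.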


Original:

Original records must preserve data accuracy, completeness, content and meaning. Metadata (data about data) is vital to this aim, as it enables reconstruction of an activity – who did what, where and when. Some file formats do not maintain the full metadata record: so-called ‘flat files’ such as .pdf and .doc. We may know who created the file, and when, but there may be no information on how, when or by whom the data presented in that document was created, processed or amended. Flat files therefore carry an inherently greater data integrity risk, as they are easier to manipulate or delete as a single record, with limited opportunity for detection.
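The metadata gap can be illustrated with two hypothetical records (invented field names and values, for illustration only): the system's full record retains a processing history, while the flat-file export keeps only file-level attributes, so the activity cannot be reconstructed.

```python
# Hypothetical full record: keeps 'who did what, where and when'.
full_record = {
    "result": "98.7%",
    "acquired_by": "analyst1",
    "instrument": "HPLC-02",
    "processing_history": [
        {"action": "integrate", "by": "analyst1", "at": "09:40"},
        {"action": "reprocess", "by": "analyst2", "at": "11:05"},
    ],
}

# What a .pdf/.doc style flat file typically retains: creator and date
# only, with no trace of how the data was processed or amended.
flat_export = {
    "result": "98.7%",
    "created_by": "analyst2",
    "created_at": "11:10",
}


def can_reconstruct_activity(record):
    """The activity can only be reconstructed if processing metadata survives."""
    return bool(record.get("processing_history"))
```

Both records report the same result, but only the full record reveals that the data was reprocessed by a second analyst – exactly the information a reviewer needs.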

Accurate:

Automated data capture, with the required IT controls, provides greater control over the accuracy of a record. Where automation is not feasible, real-time second-operator verification of quality-critical observed values may be necessary.

Data review must include a review of raw data in its original form. If access to electronic raw data is not possible remotely, this is a good opportunity for the reviewer to escape the confines of their office. Reviewing paper copies or flat file reports of electronic data, even from a validated secure system, is unlikely to enable detection of anomalies. This is because the preparation of reports still requires operator intervention, which can influence what data is reported, and how it is presented.

The final post in this series will look at the recurring problem of ‘trial analysis’, and ways in which organisations within the supply chain can take steps to build confidence and reliance on each other’s data.

Don’t miss the next post: sign up to be notified by email when a new post comes out on the Inspectorate blog.
Check out our guidance on good practice for information on the inspection process and staying compliant.

