Learning from errors and discrepancies in clinical radiology

Descriptor: 

Sensible and effective processes need to be in place to ensure that all radiologists improve their own practice and the quality of the department by minimising the likelihood of repetition of errors and potential harm to patients.

Background: 

This audit is worth carrying out because all doctors (including radiologists) make errors. Errors are inevitable but potentially avoidable. Sensible and effective processes need to be in place to ensure that discrepancies and errors are identified and shared with other radiologists for learning purposes, minimising the chance of repetition and of harm to patients. Failure to address this governance requirement, or failure to correct a known problem or difficulty, may well be perceived as a greater failure than the problem itself. This audit addresses several strategies to minimise error and its impact on patients. Discrepancy (errors and complications) meetings identify errors and provide a forum for continuing education by learning from them [1-4]. A target of 5% of all reports should be subject to review [8]. This may be achieved through retrospective random review, MDT feedback or double reporting, in addition to routine peer text feedback (which facilitates both positive feedback and identification of areas for improvement) [8]. This audit is valuable both for individual radiologists' revalidation and for departmental quality and safety.

The Cycle

The standard: 

1. Every radiologist should aim to attend at least 50% of learning from discrepancy meetings (errors and complications meetings) [7]

2. The errors and complications meetings should be held at least every 2 months (minimum of 6 meetings per year) [7]

3. A record of cases discussed should be completed for each meeting, identifying error type, and made available to all departmental radiologists [7]

4. The record should be analysed at least once per year to ascertain whether any pattern of errors has emerged [7]

5. Departments should aim to provide systematic review in at least 5% of reports [8]

6. A system for peer text review should be available on radiology information systems [8]

Target: 

1. 50%

2. ≥6 meetings/year

3. 100%

4. At least once per year

5. ≥5% reports

6. System should be available

Assess local practice

Indicators: 

- For each radiologist, the percentage of meetings attended

- Number of meetings per year

- Record of cases discussed, with error type

- Evidence of review of error type (this may be through a separate audit)


Data items to be collected: 

- Register and log of radiologists’ errors and complications meeting

- Random sample of reports from RIS or PACS to show percentage with secondary review

- Identified facility for peer text feedback (from PACS Manager)

Suggested number: 

- The audit involves all the radiologists and all the errors and complications meetings

- Sample of 200 reports for identifying secondary review
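As an illustration, the compliance calculations implied by the targets above can be sketched in a few lines. The helper names and the sample figures below are hypothetical and for illustration only; they are not part of the RCR standard.

```python
# Illustrative audit-compliance calculations against the targets above.
# All numbers here are made-up examples, not real departmental data.

def attendance_rate(meetings_attended: int, meetings_held: int) -> float:
    """Percentage of meetings attended by one radiologist."""
    return 100.0 * meetings_attended / meetings_held

def secondary_review_rate(reviewed: int, sampled: int) -> float:
    """Percentage of sampled reports showing a secondary review."""
    return 100.0 * reviewed / sampled

# Hypothetical figures: 8 meetings held in the year, radiologist attended 5;
# 200-report RIS/PACS sample, 12 of which show a secondary review.
meetings_held = 8
attended = 5
sample_size = 200
reviewed = 12

att = attendance_rate(attended, meetings_held)       # 62.5
rev = secondary_review_rate(reviewed, sample_size)   # 6.0

print(f"Attendance: {att:.1f}% (target >= 50%): {'met' if att >= 50 else 'not met'}")
print(f"Meetings held: {meetings_held} (target >= 6/year): {'met' if meetings_held >= 6 else 'not met'}")
print(f"Secondary review: {rev:.1f}% (target >= 5%): {'met' if rev >= 5 else 'not met'}")
```

The same two-line calculation can be repeated per radiologist from the attendance register, and the 200-report sample gives the review percentage a resolution of 0.5% per report.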

Suggestions for change if target not met: 

- Remind radiologists of their responsibility to attend errors meetings as part of clinical governance, and that a personal record of attendance is required in the GMC folder for revalidation purposes

- Arrange protected time for the errors meetings as part of the monthly work programme and combine the regular monthly audit meeting (presentation of completed audits) with the errors meeting

- Identify staff member responsible for introducing change

- Approach vendors regarding facility for peer text feedback

Resources: 

- Audit lead to maintain the register of attendance and the log

- Clinical director (with the audit lead) to scrutinise the attendance records and the log of cases shown in order to assess participation by each of the radiologists

References: 
  1. Reinertsen JL. Let’s talk about error: Leaders should take responsibility for mistakes. BMJ 2000; 320: 730. 

  2. Pietro DA et al. Detecting and reporting medical errors: Why the dilemma? BMJ 2000; 320: 794-6.

  3. Wu AW. Medical error: The doctor who makes the mistake needs help too. BMJ 2000; 320: 726-7.

  4. Singer A. Mandatory regular meetings of hospital staff would complement medical audit and revalidation. BMJ 2000; 320: 1072.

  5. Department of Health. An organisation with a memory. Report of an expert group on learning from adverse events in the NHS chaired by the Chief Medical Officer. Norwich: Stationery Office, 2000.

  6. Bruno and Abujudeh (eds). Quality and Safety in Radiology. Oxford University Press, 2012: 93-104.

  7. Royal College of Radiologists. RCR Standards for learning from discrepancies meetings. https://www.rcr.ac.uk/sites/default/files/docs/radiology/pdf/BFCR(14)11_LDMs.pdf

  8. Royal College of Radiologists. Quality assurance in radiology reporting: peer feedback. https://www.rcr.ac.uk/system/files/publication/field_publication_files/BFCR%2814%2910_Peer_feedback.pdf

Editor's comments: 

Colleagues may find the process threatening. Aspects of confidentiality and record keeping will need to be carefully organised to reassure everyone involved. These aspects, and the precise organisational arrangements, will vary from department to department. Nevertheless, introducing the process, developing a constructive approach and fostering a no-blame culture should not be too difficult, and over time the process will appear less threatening.

• There are clear benefits to individual consultants. Reviewing errors within a closed meeting where all individuals share their experiences and discuss cases will be educational for all parties. It can also help to identify structural and training weaknesses within the department, and to underpin risk management.

• Problems recognised early are often easier to deal with than those that are well established and have been previously ignored [4]

Published Date: 
Tuesday 6 May 2008

Last Reviewed: 
Tuesday 6 December 2016