
The last decade (2010–2019) has seen calls to action to improve the prescribing practice of junior doctors.[[1–3]] An in-depth investigation into the causes of prescribing errors by foundation trainees in relation to their medical education (the EQUIP study[[1]]) in the UK reported a prescription error rate of 8.9% across all prescribed medicines, and although that is a UK study, there are similarities with New Zealand prevocational training programmes.[[3]] The EQUIP study revealed that existing teaching strategies are not working.[[1]] To believe a single intervention will prevent most prescribing errors is simplistic, and for improvement to occur, new prescribers need to learn from their mistakes.[[3–5]] Traditionally, the education of junior doctors has focused on individual competence and professional registration requirements, yet working in healthcare is collective and multidisciplinary, and errors arise through both human and system factors.[[6]]

In response to similar calls to action in New Zealand,[[1]] the medical education units at two of the larger New Zealand district health boards (DHBs) began working on an education intervention to improve prescribing and medication safety. They explored ways to leverage the interprofessional collaboration between doctors and pharmacists in their everyday interactions to promote effective prescribing practice.[[7]] This early work encouraged pharmacists to work collaboratively with medical staff to integrate medication safety into the postgraduate year 1 (PGY1) programme using interprofessional teaching methods and role modelling collaborative practice.

In 2015, the intervention was expanded to include a role for pharmacists coaching PGY1s on the wards. The work was evaluated by recording prescribing errors and the feedback of PGY1s and educators. A significant improvement in prescribing was demonstrated, with qualitative results suggesting that pharmacists coaching PGY1s on the ward was the strongest intervention.[[8]]

Simultaneously, a programme in the UK known as ‘ePiFFany’ (Effective Prescribing Insight for the Future) was adopting similar strategies and achieving similar results.[[9]] Over the past five years, the UK and New Zealand teams have worked together to share strategies, outcomes and lessons learned, and to contribute to knowledge about how workplace learning theory and interprofessional education improve the prescribing practice of junior doctors.

The UK ePiFFany approach

The ePiFFany educational approach is based on self-regulated learning and focuses on developing clinical reasoning when prescribing. It combines a simulated clinical encounter, which is filmed, with personalised and structured feedback, including a review of the filmed encounter, to facilitate deliberate practice throughout the four-month junior doctor rotation. A full description of the intervention is provided in Green, Shahzad and Wood.[[9]] The primary outcome measure, error rate per prescriber, was calculated using daily prescribing data. The three-site ePiFFany case study demonstrated the impact of the intervention on improving clinical outcomes (ie, reducing prescribing error rates). The intervention improved prescribing and patient safety behaviour across different subspecialties and contexts.[[9]]

The New Zealand application of ePiFFany

Two New Zealand DHBs were offered the opportunity to pilot and adapt the evidence-based ePiFFany approach with the support of the UK team. The New Zealand team took a stronger focus on the role of the pharmacist as an interprofessional coach. Following the UK approach, the aim was to accelerate the prescribing performance of PGY1 doctors. We anticipated that, after three months of work experience, our intervention group would be performing at the level reached by the control group after 12 months of work experience without the intervention.

Table 1: The key components of the New Zealand intervention.

*A full protocol of the simulation and the design of the ward coaching model are available on request from the corresponding author.

The programme aimed to accelerate the prescribing performance of PGY1 doctors. It was hoped that after three months the intervention cohort’s prescribing would be at the level of performance achieved by the control group after 12 months.

Study design

The intervention was evaluated using a multi-method design with these objectives:

  1. Assess the impact on patient care through reduced prescribing errors.
  2. Facilitate quality improvement and programme development by documenting participants’ experiences, insights and recommendations for improvement at each DHB site (PGY1 doctors, medical and pharmacy educators).

The first objective of this study was to assess the impact of a sustained educational intervention on prescribing practice over a three-month period. That intervention included simulations with personalised, structured, video-enhanced feedback and ongoing ward coaching by pharmacists on prescribing performance. Consistent with the intervention design used in Green et al (2020), an experimental group (consisting of PGY1s on their first placement) and an experienced control group (consisting of PGY1s on their fourth and final placement) were constructed to assess the effectiveness of the intervention. All prescriptions for both the control and intervention groups were audited daily and analysed for prescribing accuracy and appropriateness by an independent ward pharmacist.

To support programme development, qualitative data were also gathered through semi-structured interviews with PGY1 doctors and debrief meetings with medical and pharmacy educators, collecting feedback about satisfaction, implementation, experience and sustainability.

Methods

Prescribing audit data collection and analysis

Following the UK model, the New Zealand pharmacists completed prescribing audits daily, recording prescribing errors by the PGY1 doctors over a six-month period. The first (baseline) data set was collected over three months from doctors on their final rotation (Quarter 4). The second data set was collected during the following three months, which was Quarter 1 for the new incoming PGY1 doctors who received the intervention (Figure 1). Data for Sundays and public holidays were collected on the next working day. Any gaps were identified and accounted for.

Figure 1: Audit data collection time frame.

Data analysis

Error prevalence was calculated using the following formula: error prevalence (%) = (total number of errors ÷ total medicines prescribed) × 100.

The error data were tabulated. The prevalence of errors was compared statistically using a test of proportions in R.[[10]]

Data were stratified by type of error, severity of error and grade of prescriber. The descriptions of numerator and denominator data and error severity are attached (Appendix Figure 1).
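To illustrate the calculation and comparison described above, the sketch below computes error prevalence and applies a two-sample test of proportions in base R (the software cited for the analysis[[10]]). The counts are hypothetical placeholders, not the study data, and the code is a minimal sketch of the general approach rather than the audit tool used in this study.

```R
# Minimal sketch: error prevalence and test of proportions in base R.
# The counts below are hypothetical placeholders, not the study data.
errors    <- c(baseline = 40, intervention = 12)   # total prescribing errors per audit period
medicines <- c(baseline = 900, intervention = 850) # total medicines prescribed per audit period

# Error prevalence (%) = (total number of errors / total medicines prescribed) x 100
prevalence <- errors / medicines * 100
round(prevalence, 2)

# Relative reduction in error prevalence between the two periods (%)
round((prevalence["baseline"] - prevalence["intervention"]) / prevalence["baseline"] * 100, 1)

# Two-sample test of proportions comparing baseline and intervention error rates
prop.test(x = errors, n = medicines)
```

prop.test() performs a standard chi-squared test of equal proportions and returns the estimated proportions and a p-value, which is one conventional way of implementing the test of proportions described above.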

Qualitative data collection and analysis

PGY1 interviews

PGY1 doctors were interviewed individually by the independent project coordinator within a month of the completion of the intervention, using a semi-structured format (a mix of face-to-face and telephone interviews; Appendix Figure 2). Interviews were brief, pre-arranged and averaged 15 minutes, to ensure the doctors were not away from clinical service for any longer than necessary. Information sheets were shared at the time the interviews were arranged, and signed consent was obtained at the commencement of the interview. Responses were recorded on an anonymised template and checked for accuracy with the junior doctor at the end of the interview.

Medicine and pharmacy educator debriefs

Face-to-face group debriefs were held at each DHB site (site 1 and site 2) with pharmacists and medical education staff. Oral consent was obtained. Discussion focused on capturing each group’s perception of the value of the project. They were asked: What worked well? What did not work well? Did you feel this was a good use of your time? What do we need to consider in the future? The facilitator took notes and sought clarification or elaboration of points that were unclear. Notes from sites were kept separate, as there were variations between site 1 and site 2. There was a focus on looking forward, programme improvement and future development, including other site implementations.

Data analysis

Initial thematic analysis was undertaken following Creswell’s method.[[11]] Data from each stakeholder group were analysed for significant statements and quotes, and clusters of meaning were developed using colour coding. Common themes and patterns emerged at both sites, and the data from the two sites were combined. An informative label that resonated with the research team was selected for each theme.

Initially, feedback from medical and pharmacy teams was analysed separately. However, as significant site variations emerged alongside clear commonalities, medical and pharmacy responses were integrated for reporting results.

Ethics

This study received ethics approval from the UNITEC Research Ethics Committee (EC). EC registration number: 2016-1038.

Site approval and registration were granted by the participating DHBs. On 21 June 2016 the New Zealand Health and Disability Ethics Committee formally deemed the study to be outside its scope and not to require its approval.

Results

Quantitative results: prescribing audit outcomes

Prescribing errors reduced at both sites (Table 2), with the reduction reaching statistical significance at site 1.

The volume of prescribing on the wards at the two sites was comparable for the baseline and intervention groups (Table 2). Volumes were reduced at both sites for the intervention group due to the holiday period.

Junior doctor prescribing errors reduced at both sites following the intervention. At site 1 (DHB1), error prevalence reduced by approximately 79% (p=0.02). At site 2 (DHB2), error prevalence reduced by approximately 38% (p=0.35), although this reduction did not reach statistical significance.

Table 2: Prescribing errors by site. Based on N=14 for DHB1 and N=17 for DHB2 control groups. In the intervention groups, N=6 for DHB1 and N=4 for DHB2.

Impact on the frequency and severity of prescribing errors

At site 1, severe errors were eliminated in the intervention group, falling from 4.04% of total errors to zero (Table 3). There was no significant impact on error severity at site 2 (Table 4).

Table 3: Site 1 error severity as a proportion of total prescribing errors (PGY1 data).

Table 4: Site 2 error severity as a proportion of total prescribing errors (PGY1 data).

Qualitative results

PGY1 doctor interviews

Interviews were conducted with all 10 PGY1 doctors across the two sites (a 100% response rate).

Respondents were asked to rate their overall experience on a scale of 1 to 10, with 10 being the most positive. The median was 8 at site 1 (n=6) and 6.5 at site 2 (n=4).

Feedback was consistent across the two sites for seven of the nine themes identified, and these consistent responses were all positive. The overall experience was valued by the PGY1s, who agreed that this training was most useful early in the training year, that their reasoning when prescribing was enhanced and that they felt more confident. The authenticity of the simulation environment and the use of real patients were helpful for learning. They valued all the feedback offered, but the video of the simulation appeared less useful than reviewing prescribing on the ward with the pharmacist (detail provided in Appendix Table 1).

Notably, all respondents thought that the programme should be rolled out to a full cohort of PGY1s, and 60% thought it would be valuable to repeat the programme in their second postgraduate year, especially in subspecialty areas with complex medications (eg, gastroenterology, oncology).

Site variation emerged when the PGY1s were asked about improvements for the future.

Site variation themes

Information prior to consent ensured that respondents were aware this was a new programme and a trial, and that we wanted detailed comments on how to improve the programme for the next cohort. There were two areas of difference: the structure of the second simulation, and the experience of the delivery of the ward coaching.

Table 5: Site variation PGY1s.

Comments highlight variation in the implementation and delivery of the end-of-run simulation and the ward coaching across sites, which signals areas for improvement at site 2 and also considerations for the planning of new site implementations.

Medicine and pharmacy educators’ debriefs

Both common and site-specific comments emerged that mirrored the PGY1 feedback. The educators all felt that it was worthwhile, supported the concept and wanted to continue to develop the programme in the future.

“Conceptually it’s worthwhile.” (Medicine educator, site 2)

“We can see the benefit.” (Pharmacy educator, site 1)

Both sites reported positive responses from the PGY1 doctors, but both sites and both professions also noted the time and resources involved and felt that the current model would be difficult to sustain and deliver to a full cohort of 40–60 PGY1s.

Several features would need to be considered before a full rollout could be undertaken. Comments highlighted variation between sites in preparation, ward culture and the pharmacists’ previous experience and training as interprofessional coaches.

Differences between sites

1. Ward environment

Site 1 used a rehabilitation and general medical ward, and site 2 used a busy orthopaedic ward. Ward selection affects release time, both to attend the simulation and for meetings with the pharmacist.

“The wards that the intervention is based on need to be considered—for the very busy, acute wards, release time for simulation was hard.” (Medicine educator, site 2)

“Hard for pharmacists catching up with PGY1s when they are so busy.” (Pharmacy educator, site 2)

2. Simulation environment

Although simulations ran relatively smoothly, debriefs turned to a discussion of volume and capacity. At site 1 there was more technical help available from the simulation unit and more support staff were available to assist with patient support before and after the simulation. In addition, video recording and copying assistance ensured prompt return of the videos to the PGY1s. At site 2 there was less technical support.

“Using real patients is the gold standard but takes a lot of coordination, especially for the second simulation, to bring them in, support them and train them.” (Medicine educator, site 2)

“Need to use a full simulation unit with full support, which is resource intensive for large numbers of simulations. Could we run it in situ in the ward, linked to ward rounds?” (Medicine educator, site 2)

3. Engagement, training and briefing of pharmacists

Early engagement of pharmacy staff and training of pharmacists for the ward-based coaching role are critical, as the role represents a significant shift for them; this preparation varied by site. At site 2 the pharmacists were not as well prepared. Comments from the pharmacists at this site included:

“Our execution did not run as well as it could have.” (Pharmacy educator, site 2)

“Using our time is ok if they prescribe better.” (Pharmacy educator, site 2)

“Looking back, we needed to engage earlier.” (Pharmacy educator, site 2)

In comparison, site 1 had more experience working in a coaching role with junior medical staff and stronger pharmacy leadership. Their comments were:

“If it was rolled out to all PGY1s, it would decrease the pharmacist workload, as less follow-up would be needed.” (Pharmacy educator, site 1)

“It helped us build rapport—the PGY1s approached pharmacists more easily on other matters.” (Pharmacy educator, site 1)

“PGY1s would ring us with concerns.” (Pharmacy educator, site 1)

Comments also indicated a willingness for more pharmacy involvement in planning at site 1:

“Communicate with pharmacists more before the second simulation to have an idea of the major issues or themes that pharmacists identified on the ward.” (Pharmacy educator, site 1)

The variation in pharmacist feedback across the two sites mirrors the PGY1 doctors’ comments, highlighting the need for consistent training and preparation for all educators.

An incidental finding identified by the pharmacists at site 1 was that, during the simulation, six PGY1 doctors all prescribed differently for each patient.

Overall, the comments and discussion in all debrief groups can be summarised by this quote:

“A great educational opportunity—a good approach to teaching something so important, but we need to streamline it [and] make it more efficient—resources are needed to back it.” (Medicine educator, site 2)

Limitations

Limitations due to small cohort size and site variation are acknowledged. Nonetheless, the outcomes mirror the UK findings and demonstrate improvements in patient safety. As a trial, this project was about impact (measured by prescribing errors) and quality improvement, so it focused on transferability over generalisability. No two sites will ever be identical. The dual-site implementation highlighted regional strengths and weaknesses, raised key points for the transferability of the programme to other sites and informed the project moving forward.

Discussion

Impact on patient outcomes

The results, which match the UK experience, demonstrate a significant difference in patient outcomes as measured by prescribing error rate and error severity.[[9]] At site 1 the impact on patient outcomes exceeded our expectations: the error severity profile reduced significantly for three of the six categories, while moderate errors increased.

We are unable to explain the increase in moderate errors. Clinical staff suggest that this finding may be connected to the timing of the first rotation in New Zealand, which coincided with the summer holiday period and the start of a new rotation for many registrars. Lack of awareness of local protocols and lower staffing levels may have affected communication. In addition, inter-practitioner variability in prescribing, noted during the simulation, highlighted the importance of training PGY1 doctors in local protocols using current best practice and helping them to think critically about prescribing. The increase in moderate errors among prescribers new to a service highlights the need for effective training, given that consistency of approach is a cornerstone of safe prescribing practice.

The Health and Disability Commissioner’s analysis of complaints involving a medication error between 2009 and 2016 in New Zealand identified several factors that contributed to prescribing errors: failure to obtain necessary information (60%), failure to follow policy and protocol (20%), inadequate knowledge of the medication (17%) and training and orientation to the service (5% each).[[12]]

This supports our hypothesis that pharmacist availability at induction, to support information gathering and improve familiarity with ward protocols and policies, should be addressed early in the training year.

PGY1 doctor satisfaction

The training was well received by the PGY1 doctors. They not only found it useful but would like it repeated in their second year for more complex medications in subspecialty areas. From both the patient care and the PGY1 doctors’ perspectives, the evaluation shows that there is value in rolling out the programme to a full cohort at the trial sites, and that a full national rollout has the potential for a dramatic impact on patient care.

Sustainability

On follow-up with the medical education units, medicine and pharmacy educators raised issues of resourcing and sustainability that cannot be ignored. Although this initiative demonstrated that the educational programme can make a significant difference to patient care, a key learning for the implementation team has been that the current model is extremely resource intensive, particularly during the simulation laboratory sessions, which take two hours per person and would be difficult to roll out to 40–60 interns.

Regional variation

A dual-site implementation reminds us that workplace contextual and cultural factors will vary across sites and any widespread implementation needs to anticipate this.

All sites have their own implementation strengths and challenges, and a national rollout would need to include flexibility to accommodate these differences. Regional variation and the preparedness of all professional groups are key considerations. At site 1 the pharmacists had more experience and training for the coaching role, and this was evident in the feedback. Sites also vary in their levels of simulation support, ward staffing structures and the culture of wards, services and teams. These factors are important considerations for scaling up the project.

What next?

The model has several phases that draw on evidence-based educational practice, and it is now recognised that further work is needed to trial delivery options at additional sites while maintaining these core principles. Two aspects are being explored:

  1. Alternatives to running simulations off the ward at the beginning and end of the rotation. Two suggestions have emerged:
    — Doing small in situ simulations on the ward before or after ward rounds, with current patients and using iPhones to video the interaction.
    — Building scenarios into a workshop at the start of the rotation using pre-recorded simulations in situ or written cases. Use small group discussions to establish learning needs and as a baseline for self-assessments and self-directed learning at the end of the rotation.
  2. Developing tips and training for pharmacists coaching junior doctors on the ward.

The combined feedback from this study, an earlier New Zealand study[[8]] and a study exploring pharmacy and medicine co-working in Australia[[13]] indicates that pharmacists coaching junior doctors on the ward may be key to successfully reducing prescribing errors. In our study, the preparation the pharmacists received prior to the pilot was more extensive at site 1 than at site 2, and site 1 pharmacists had more experience with coaching PGY1 doctors. A refinement for the future is to develop a consistent training programme for pharmacists with an interprofessional educator focus.

Towards a sustainable model for large cohort implementation

The ongoing goal is to develop a flexible and sustainable model to help train entire cohorts of PGY1 doctors. The key emergent theme from the New Zealand experience is the role of the hospital pharmacist in the training of PGY1 doctors.

At the start of the 2018/19 training year, site 1 trialled a version that drew on the ePiFFany experience but replaced the simulation with a pharmacist-led, case-based workshop as a precursor to the ward-based learning with pharmacist coaching. This appears to be a more sustainable model that combines the ePiFFany core principles of supported self-directed learning on the ward with pharmacist coaching, an initial workshop and targeted feedback. This will be implemented at site 2 for the 2020/21 training year. Both sites are providing this for all current PGY1 doctors, with cohorts of 40 or more.

A process to monitor PGY1 doctors’ improvements over the length of the attachment, and an end-of-placement feedback activity that aligns with the ePiFFany model’s second simulation, are under development. The ongoing challenge is finding key indicators to measure improvements in medication safety during the training period that are more sustainable than an audit tool.

We hope that this refined model of delivery will have the flexibility to be shared with other New Zealand and Australian providers, and that we can work collaboratively to build and enhance the programme and share the ongoing development of resources.

Funding

Pfizer International funded the independent audit of prescribing that allowed us to include patient outcomes within the programme evaluation.

Appendix

Appendix Figure 1: Audit information.

Appendix Figure 2: Semi-structured interview questions—PGY1 doctors.

Appendix Table 1: Site consistent themes PGY1 interview data.

Author Information

Dr Dale Sheehan: Medical and Interprofessional Educator, Medical Education Unit, Deans Office, Otago Medical School, Dunedin. Avril Lee: Pharmacist – Quality Improvement, Pharmacy Department / Medical Education Training Unit, Waitematā DHB, North Shore, Auckland. Associate Professor Arindam Basu: Associate Professor, Public Health and Epidemiology, School of Health Sciences, University of Canterbury, Christchurch. Dr John Thwaites: Director of Medical Clinical Training and Consultant Gerontologist, Medical Education and Training Unit, Canterbury District Health Board, Christchurch.

Acknowledgements

The authors wish to acknowledge the generous support and assistance of Dr Rakesh Patel (Clinical Associate Professor in Medical Education, University of Nottingham) for not only sharing the ePiFFany programme, protocols and audit design, but also co-presenting with us at a number of Australasian forums. We also acknowledge Professor William Green (University of Leicester) for his assistance with audit data collation and analysis, and the pharmacists who worked with us at both New Zealand sites for their support, effort, expertise and generosity over the six-month period.

Correspondence

Dr Dale Sheehan, Medical and Interprofessional Educator, Medical Education Unit, Dean’s office, Otago Medical School, 021395747

Correspondence Email

Dalecsheehan@gmail.com

Competing Interests

Nil.

1. Dornan T, Ashcroft DM, Heathfield H, Lewis PJ, Miles J, Taylor D, et al. An In-Depth Investigation into the causes of prescribing errors by foundation trainees in relation to their medical education. EQUIP Study. Manchester: University of Manchester; 2009.

2. Davis P, Lay-Yee R, Briant R, Ali W, Scott A, Schug S. Adverse events in New Zealand public hospitals: occurrence and impact. N Z Med J. 2002;115(1167).

3. Tully MP. Prescribing errors in hospital practice. Br J Clin Pharmacol. 2012;74(4):668-75.

4. Thornton PD, Simon S, Mathew TH. Towards safer drug prescribing, dispensing and administration in hospitals. J Qual Clin Pract. 1999;19:41-5.

5. Dean FB, O'Grady K, Paschalides C, Utley M, Gallivan S. Providing feedback to hospital doctors about prescribing errors: a pilot study. Pharm World Sci. 2007;29(3):213-20.

6. Reason J. Human error: models and management. BMJ. 2000; 320(7237):768-70. doi:10.1136/bmj.320.7237.768

7. Lee A, Sheehan D, Alley P. Medication safety and quality improvement in PGY1 teaching. N Z Med J. 2013;126(1384).

8. Sheehan DC, Lee AP, Young ML, Werkmeister BJ, Thwaites JH. Opioids, the pharmacist and the junior doctor: reducing prescribing error. J Pharm Pract Res. 2019;49:356-63. doi:10.1002/jppr.1526

9. Green W, Shahzad MW, Wood S, et al. Improving junior doctor medicine prescribing and patient safety: an intervention using personalised, structured, video-enhanced feedback and deliberate practice. Br J Clin Pharmacol. 2020;1-13. doi:10.1111/bcp.14325

10. R Core Team. R: A language and environment for statistical computing [Internet]. Vienna, Austria: R Foundation for Statistical Computing; 2018. Available from: https://www.R-project.org/

11. Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 2nd ed. Sage Publications; 2007.

12. Health and Disability Commissioner. Complaints Closed by the Health and Disability Commissioner about Medication Errors: Analysis and Report 2009–2016. HDC; 2019.

13. Noble C, Billett S. Learning to prescribe through co-working: junior doctors, pharmacists and consultants. Med Educ. 2017; 51(4):442-51. doi: 10.1111/medu.13227.

For the PDF of this article,
contact nzmj@nzma.org.nz

View Article PDF

The last decade (2010–2019) has seen calls to action to improve the prescribing practice of junior doctors.[[1–3]] An in-depth investigation into the causes of prescribing errors by foundation trainees in relation to their medical education (the EQUIP study[[1]]) in the UK reported a prescription error rate of 8.9% for all prescribed medicines, and although that is a UK study, there are similarities with New Zealand prevocational training programmes.[[3]] The EQUIP study revealed that existing teaching strategies are not working.[[1]] To believe a single intervention will prevent most prescribing errors is simplistic, and for improvement to occur, new prescribers need to learn from their mistakes.[[3–5]] Traditionally, the education of junior doctors has focused on their competence and professional registration requirements. Working in healthcare is collective and multidisciplinary, and errors occur through human and system factors.[[6]]

In response to similar calls to action in New Zealand,[[1]] the medical education units at two of the larger New Zealand district health boards (DHBs) began working on an education intervention to improve prescribing and medication safety. They explored ways to leverage the interprofessional collaboration between doctors and pharmacists in their everyday interactions to promote effective prescribing practice.[[7]] This early work encouraged pharmacists to work collaboratively with medical staff to integrate medication safety into the postgraduate year 1 (PGY1) programme using interprofessional teaching methods and role modelling collaborative practice.

In 2015, the intervention was expanded to include a role for pharmacists coaching PGY1s on the wards. The work was evaluated by recording prescribing errors and the feedback of PGY1s and educators. A significant improvement in prescribing was demonstrated, with qualitative results suggesting that pharmacists coaching PGY1s on the ward was the strongest intervention.[[8]]

Simultaneously, a programme in the UK known as ‘ePPiFany’ (Effective Prescribing Insight for the Future) was adopting similar strategies and achieving similar results.[[9]] Over the past five years, the UK and New Zealand teams have worked together to share strategies, outcomes and lessons learned, to contribute to knowledge about how workplace learning theory and interprofessional education improves the prescribing practice of junior doctors.

The UK ePPiFany approach

The ePiFFany educational approach is based on self-regulated learning and focuses on developing clinical reasoning when prescribing. It combines a simulated clinical encounter, which is filmed, with personalised and structured feedback, including a review of the filmed encounter, to facilitate deliberate practice throughout the four-month junior doctor rotation. A full description of the intervention is provided in Green, Shahzad and Wood.[[9]] The primary outcome measure, error rate per prescriber, was calculated using daily prescribing data. The three-site ePiFFany case study demonstrated the impact of the intervention on improving clinical outcomes (ie, reducing prescribing error rates). The intervention improved prescribing and patient safety behaviour across different subspecialties and contexts.[[9]]

The New Zealand application of ePPiFany

Two New Zealand DHBs were offered the opportunity to pilot and adapt the evidence-based ePPiFany approach with the support of the UK team. The New Zealand team took a stronger focus on the role of the pharmacist as an interprofessional coach. Following the UK approach, the aim was to accelerate the prescribing performance of PGY1 doctors. We anticipated that, after three months of work experience, our intervention group would be performing at the same level that the control group would after 12 months of work experience and no intervention.

Table 1: The key components of the New Zealand intervention.

*A full protocol of the simulation and the design of the ward coaching model is available on request from the corresponding author.

The programme aimed to accelerate the prescribing performance of PGY1 doctors. It was hoped that after three months the intervention cohort’s prescribing would be at the level of performance as that achieved by the control group after 12 months.

Study design

The intervention was evaluated using a multi-method design with these objectives:

  1. Assess the impact on patient care through reduced prescribing errors.
  2. Facilitate quality improvement and programme development by documenting participants’ experiences, insights and recommendations for improvement at each DHB site (PGY1 doctor, medical and pharmacy educators).

The first objective of this study was to assess the impact of a sustained educational intervention on prescribing practice over a three-month period. That intervention included simulations with personalised, structured, video-enhanced feedback and ongoing ward coaching by pharmacists on prescribing performance. Consistent with the intervention design used in Green et al (2020), an experimental group (consisting of PGY1s on their first placement) and an experienced control group (consisting of PGY1s on their fourth and final placement) was constructed to assess the effectiveness of the intervention. All prescriptions (pre- and post-intervention group) were audited daily and analysed for prescribing accuracy and appropriateness by an independent ward pharmacist.

In order to further programme development, qualitative data was also gathered through semi-structured interviews with PGY1 doctors and debrief meetings with medical and pharmacy educators to collect feedback about satisfaction, implementation, experience and sustainability.

Methods

Prescribing audit data collection and analysis

In following the UK model, the New Zealand pharmacists completed prescribing audits daily, recording prescribing errors by the PGY1 doctors for a six-month period. The first baseline data set at three months was collected from doctors on their final rotation (Quarter 4). The second data set was collected during the following three months, which was Quarter 1 for the new incoming PGY1 doctors who received the intervention (Figure 1). Data for Sundays and public holidays were collected on the next working day. Any gaps were identified and accounted for.

Figure 1: Audit data collection time frame.

Data analysis

Error prevalence was calculated using the following formula: Error prevalence = (Total number of errors / Total medicines prescribed) X 100

The error data were tabulated. The prevalence of errors was compared statistically on the basis of the test of proportions using R.[[10]]

Data was stratified by type of error, severity of error and grade of prescriber. The descriptions of numerator and denominator data and error severity are attached (Appendix Figure 1).

Qualitative data collection and analysis

PGY1 interviews

PGY1 doctors were interviewed individually using a semi-structured format (a mix of face-to-face and telephone interviews) within a month of the completion of the intervention by the independent project coordinator (Appendix Figure 2). Interviews were brief, pre-arranged and averaged 15 minutes, to ensure the doctors were not away from clinical service for any longer than necessary. Information sheets were shared at the time the interviews were arranged, and signed consent was obtained at the commencement of the interview. Responses were recorded on an anonymised template and checked for accuracy with the junior doctor at the end of the interview.

Medicine and pharmacy educator debriefs

Face-to-face group debriefs were held at each DHB site (site 1 and site 2) with pharmacists and medical education staff. Oral consent was obtained. Discussion focused on capturing each group’s perception of the value of the project. They were asked: What worked well? What did not work well? Did you feel this was a good use of your time? What do we need to consider in the future? The facilitator took notes and sought clarification or elaboration of points that were unclear. Notes from sites were kept separate, as there were variations between site 1 and site 2. There was a focus on looking forward, programme improvement and future development, including other site implementations.

Data analysis

Initial thematic analysis was undertaken following Creswell’s method.[[11]] Data from each stakeholder group were analysed for significant statements and quotes, and clusters of meaning were developed using colour coding. Common themes and patterns emerged from both sites and data from each site was combined. An informative label for each theme that resonated with the research team was selected.

Initially, feedback from medical and pharmacy teams was analysed separately. However, as significant site variations emerged alongside clear commonalities, medical and pharmacy responses were integrated for reporting results.

Ethics

This study received ethics approval from the UNITEC Research Ethics Committee (EC). EC registration number: 2016-1038.

Site approval and registration was awarded by DHB. On 21 June 2016 the New Zealand Health and Disability Ethics Committee formally deemed it to be outside their scope and that the study did not require their approval.

Results

Quantitative results: prescribing audit outcomes

There was a significant reduction of prescribing errors at both sites (Table 2).

The volume of prescribing on the wards at the two sites was comparable for the baseline and intervention groups (Table 2). Volumes were reduced at both sites for the intervention group due to the holiday period.

Junior doctor prescribing errors at both sites reduced remarkably following the intervention. At site 1 (DHB1), error prevalence reduced by about 79% (p=0.02). The error prevalence at site 2 (DHB2) reduced by about 38% (p=0.35).

Table 2: Prescribing errors by site. Based on N=14 for DHB1 and N=17 for DHB2 control groups. In the intervention groups, N=6 for DHB 1 and N=4 for DHB2.

Impact on the frequency and severity of prescribing errors

At site 1, the proportion of severe errors reduced by 100%, from 4.04 to zero, in the intervention group (Table 3). We did not have a significant impact on the severity of error at site 2 (Table 4).

Table 3: Site 1 error severity as a proportion of total errors prescribed (PGY1 data).

Table 4: Site 2 error severity as a proportion of total errors prescribed (PGY1 data).

Qualitative results

PGY1 doctor interviews

Interviews were conducted with 10 PGY1 doctors in total across the two sites with a 100% response rate.

Respondents were asked to rate their overall experience on a scale of 1 to 10, with 10 being the most positive. The median was 8 at site 1 (n=6) and 6.5 at site 2 (n=4).

Feedback was consistent for seven of the nine themes identified across the two sites and these consistent responses were all positive. The overall experience was valued by the PGY1s, who agreed that this training was most useful early in the training year; that their reasoning when prescribing was enhanced; and that they felt more confident. The authenticity of the simulation environment and use of real patients was helpful for learning. They valued all the feedback offered, but the video of the simulation appeared less useful than reviewing prescribing on the ward with the pharmacist. (Detail provided in Appendix Table 1).

Significantly, they all thought that the programme should be rolled out to a full cohort of PGY1s, and 60% thought it would be valuable to repeat the programme in their second postgraduate year, especially in subspecialty areas with complex medications (eg, gastroenterology, oncology).

Site variation emerged when the PGY1s were asked about improvements for the future.

Site variation themes

Information prior to consent ensured that respondents were aware this was a new programme and a trial, and that we wanted detailed comments on how to improve the programme for the next cohort. There were two areas of difference: the structure of the second simulation, and the experience of the delivery of the ward coaching.

Table 5: Site variation PGY1s.

Comments highlight variation in the implementation and delivery of the end-of-run simulation and the ward coaching across sites, which signals areas for improvement at site 2 and also considerations for the planning of new site implementations.

Medicine and pharmacy educators’ debriefs

Both common and site-specific comments emerged that mirrored the PGY1 feedback. The educators all felt that it was worthwhile, supported the concept and wanted to continue to develop the programme in the future.

“Conceptually it’s worthwhile.” (Medicine educator, site 2)

“We can see the benefit.” (Pharmacy educator, site 1)

Both sites reported positive responses from the PGY1 doctors, but both sites and both professions also noted the time and resources involved and felt that the current model would be difficult to sustain and deliver to a full cohort of 40–60 PGY1s.

There were several features to consider for the future before a full roll out could be undertaken. Comments highlighted variation between sites in preparation, ward culture and the pharmacists’ previous experience and training as interprofessional coaches.

Differences between sites

1. Ward environment

Site 1 used a rehabilitation and general medical ward, and site 2 used a busy orthopaedic ward. Ward selection impacts on release time both to attend simulation and for meetings with the pharmacist.

“The wards that the intervention is based on need to be considered—for the very busy, acute wards, release time for simulation was hard.” (Medicine educator, site 2)

“Hard for pharmacists catching up with PGY1s when they are so busy.” (Pharmacy educator, site 2)

2. Simulation environment

Although simulations ran relatively smoothly, debriefs turned to a discussion of volume and capacity. At site 1 there was more technical help available from the simulation unit and more support staff were available to assist with patient support before and after the simulation. In addition, video recording and copying assistance ensured prompt return of the videos to the PGY1s. At site 2 there was less technical support.

“Using real patients is the gold standard but takes a lot of coordination, especially for the second simulation, to bring them in, support them and train them.” (Medicine educator, site 2)

“Need to use a full simulation unit with full support, which is resource intensive for large numbers of simulations. Could we run it insitu in ward linked to ward rounds? (Medicine educator, site 2)

3. Engagement, training and briefing of pharmacists.

Engagement of the pharmacy staff early and training pharmacists for the ward-based coaching is critical as there is a significant role shift for them, this varied by site. At site 2 the pharmacists were not as well prepared. Comments from this site by the pharmacists included:

“Our execution did not run as well as it could have.” (Pharmacy educator, site 2)

“Using our time is ok if they prescribe better.” (Pharmacy educator, site 2)

“Looking back, we needed to engage earlier. “(Pharmacy educator, site 2)

In comparison, site 1 had more experience working in a coaching role with junior medical staff and stronger pharmacy leadership. Their comments were:

“If it was rolled out to all PGY1s, it would decrease the pharmacist workload, as less follow-up would be needed.” (Pharmacy educator, site 1)

“It helped us build rapport—the PGY1s approached pharmacists more easily on other matters.” (Pharmacy educator, site 1)

“PGY1s would ring us with concerns.” (Pharmacy educator, site 1)

“Comments indicated a willingness for more pharmacy involvement in planning at site 1.” (Pharmacy educator, site 1)

“Communicate with pharmacists more before second simulation to have an idea of what the major issues or themes that pharmacists identified on the ward.” (Pharmacy educator, site 1)

The variation in pharmacist feedback across the two sites mirrors the PGY1 doctors’ comments, highlighting the need for consistent training and preparation for all educators.

An incidental finding identified by the pharmacists at site 1 was that, during the simulation, six PGY1 doctors all prescribed differently for each patient.

Overall, the comments and discussion in all debrief groups can be summarised by this quote:

“A great educational opportunity—a good approach to teaching something so important, but we need to streamline it make it more efficient—resources are needed to back it.” (Medicine educator, site 2)

Limitations

Limitations due to small cohort size and site variation are acknowledged. Nonetheless, the outcomes do mirror the UK findings and demonstrate improvements in patient safety. As a trial, this project was about impact (measured by prescribing errors) and quality improvement, so was focused on transferability over generalisability. No two sites will ever be identical. The dual-site implementation highlighted regional strengths and weaknesses, raised key points for transferability of the programme to other sites and informed the project moving forward.

Discussion

Impact on patient outcomes

The results, which match the UK experience, demonstrate a significant difference in patient outcomes measured by prescribing error rate and error severity.[[9]] At site 1 we made a difference to patient outcomes beyond our expectation. At site 1 the error severity profile reduced significantly for three of the six categories while moderate errors increased.

We are unable to explain the increase in moderate errors. Clinical staff suggest that this finding may be connected to the timing of the first rotation in New Zealand. The first PGY1 rotation in New Zealand coincided with the summer holiday period and the start of a new rotation for many registrars. Lack of awareness of local protocols and lower staff volumes may impact communication. In addition, interpractitioner variability of prescribing, noted during the simulation, highlighted the importance of training PGY1 doctors in local protocols using current best practice and helping them to think critically about prescribing. The increase in moderate errors by prescribers new to a service highlights the need for effective training, given that consistency in approach is a cornerstone of safe prescribing practice.

The Health and Disability Commissioner’s analysis of complaints involving a medication error between 2009 and 2016 in New Zealand identified several factors that contributed to prescribing errors: failure to obtain necessary information (60%), failure to follow policy and protocol (20%), inadequate knowledge of the medication (17%) and training and orientation to the service (5% each).[[12]]

This supports our hypothesis that the availability of pharmacists at induction to support information gathering and improve familiarity with ward protocols and policies must be addressed early in the training year.

PGY1 doctor satisfaction

The training was well received by the PGY1 doctors. They not only found it useful but would like it repeated in their second year for more complex medication in subspecialties. From both patient care and the PGY1 doctors’ perspectives, the evaluation shows that there is value in rolling out the programme to a full cohort at the trial sites, and that if a full nation rollout were undertaken, there is the potential for a dramatic impact on patient care.

Sustainability

On follow-up with the medical education units, medicine and pharmacy educators raised issues of resourcing and sustainability that cannot be ignored. Although this initiative demonstrated that the educational programme can make a significant difference to patient care, a key learning for the implementation team has been that the current model is extremely resource intensive, particularly during the simulation laboratory sessions, which take  two hours per person and would be difficult to roll out to 40–60 interns.

Regional variation

A dual-site implementation reminds us that workplace contextual and cultural factors will vary across sites and any widespread implementation needs to anticipate this.

All sites have their own implementation strengths and challenges, and a national rollout would need to include flexibility to accommodate these differences. Regional variation and the preparedness of all professional groups is a key consideration. At site 1 the pharmacists had more experience and training for the coaching role, and this was evident in the feedback. Sites also vary with regard to levels of simulation support, ward staffing structures and the culture of wards, services and teams. These factors are important considerations for scaling up the project.

What next?

The model has several phases that draw on evidence-based educational practice, and it is now recognised that further work needs to be done to trial options for delivery at additional sites that would maintain these core principles. Two aspects are being explored:

  1. Alternatives to running simulations off the ward at the beginning and end of the rotation. Two suggestions have emerged:
    — Doing small in situ simulations on the ward before or after ward rounds, with current patients and using iPhones to video the interaction.
    — Building scenarios into a workshop at the start of the rotation using pre-recorded simulations in situ or written cases. Use small group discussions to establish learning needs and as a baseline for self-assessments and self-directed learning at the end of the rotation.
  2. Developing tips and training for pharmacists coaching junior doctors on the ward.

The combined feedback from this study, an earlier New Zealand study[[8]] and a study exploring pharmacy and medicine co-working in Australia[[13]] indicates that coaching pharmacists on the ward may be a key for successfully reducing prescribing errors. In our study, the preparation the pharmacists received prior to the pilot was more extensive at site 1 than at site 2, and site 1 pharmacists had more experience with coaching PGY1 doctors. A refinement for the future is to develop a consistent training programme for pharmacists with an interprofessional educator focus.

Towards a sustainable model for large cohort implementation

The ongoing goal is to develop a flexible and sustainable model to help train entire cohorts of PGY1 doctors. The key emergent theme from the New Zealand experience is the role of the hospital pharmacist in the training of PGY1 doctors.

At the start of the 2018/19 training year, site 1 trialled a version that drew on the ePiFFany experience but replaced the simulation with a pharmacist-led, case-based workshop as a precursor to the ward-based learning with pharmacist coaching. This appears to be a more sustainable model that combines the ePPiFany core principles of supported self-directed learning on the ward with pharmacist coaching, an initial workshop and targeted feedback. This will be implemented at site 2 for the 2020/21 training year. Both sites are providing this for all current PGY1 doctors with cohorts of 40 or more.

A process to monitor PGY1 doctors’ improvements over the length of the attachment, and an activity to provide feedback at the end of placement simulation to align with the ePPiFany model’s second simulation, is under development. The ongoing challenge is finding key indicators to measure improvements in medication safety during the training period that are more sustainable than an audit tool.

We hope that this refined model of delivery will have the flexibility to be shared with other New Zealand and Australian providers, and that we can work collaboratively to build and enhance the programme and share the ongoing development of resources.

Funding

Pfizer International funded the independent audit of prescribing that allowed us to include patient outcomes within the programme evaluation.

Appendix

Appendix Figure 1: Audit information. View Appendix Figure 1.

Appendix Figure 2: Semi-structured interview questions—PGY1 doctors. View Appendix Figure 2.

Appendix Table 1: Site consistent themes PGY1 interview data. View Appendix Table 1.

Summary

Abstract

Aim

Method

Results

Conclusion

Author Information

Dr Dale Sheehan: Medical and Interprofessional Educator, Medical Education Unit, Deans Office, Otago Medical School, Dunedin. Avril Lee: Pharmacist – Quality Improvement, Pharmacy Department / Medical Education Training Unit, Waitematā DHB, North Shore, Auckland. Associate Professor Arindam Basu: Associate Professor, Public Health and Epidemiology, School of Health Sciences, University of Canterbury, Christchurch. Dr John Thwaites: Director of Medical Clinical Training and Consultant Gerontologist, Medical Education and Training Unit, Canterbury District Health Board, Christchurch.

Acknowledgements

The authors wish to acknowledge the generous support and assistance from Dr Rakesh Patel (Clinical Associate Professor in Medical Education, University of Nottingham) for not only sharing the ePPIFany programme, protocols and audit design, but also for co-presenting with us at a number of Australasian forums. We also acknowledge Professor William Green (University of Leicester) for his assistance with audit data collation and analysis and the pharmacists who worked with us at both New Zealand sites for their support, effort, expertise and generosity over the six-month period.

Correspondence

Dr Dale Sheehan, Medical and Interprofessional Educator, Medical Education Unit, Dean’s office, Otago Medical School, 021395747

Correspondence Email

Dalecsheehan@gmail.com

Competing Interests

Nil.

1. Dornan T, Ashcroft DM, Heathfield H, Lewis PJ, Miles J, Taylor D, et al. An In-Depth Investigation into the causes of prescribing errors by foundation trainees in relation to their medical education. EQUIP Study. Manchester: University of Manchester; 2009.

2. Davis P, Ley-Yee, Briant R, Ali W, Scott A, Schug S. Adverse events in New Zealand Public Hospitals: occurrence and impact. The New Zealand Medical Journal. 2002; 115 (1167).

3. Tully MP. Prescribing errors in hospital practice. British Journal of Clinical Pharmacology. 2012; 74(4):668-75

4. Thornton PD, Simon S, Mathew TH. Towards safer drug prescribing, dispensing and administration in hospitals. Journal of Quality Clinical Practice. 1999; 19:41-5

5. Dean FB, O'Grady K, Paschalides C. Utley M, Gallivan S. Providing feedback to hospital doctors about prescribing errors; a pilot study. Pharm. World Sci. 2007; 29(3):213-20.

6. Reason J. Human error: models and management. BMJ. 2000; 320(7237):768-70. doi:10.1136/bmj.320.7237.768

7. Lee A, Sheehan D, Alley P. Medication safety and quality improvement in PGY1 teaching. The New Zealand Medical Journal. 2013; 126(1384).

8. Sheehan, D.C., Lee, A.P., Young, M.L., Werkmeister, B.J. and Thwaites, J.H. Opioids, the pharmacist and the junior doctor: reducing prescribing error. J Pharm Pract Res. 2019; 49: 356-63. doi:10.1002/jppr.1526

9. Green W, Shahzad MW, Wood S,et al. Improving junior doctor medicine prescribing and patient safety: An intervention using personalised, structured, video enhance feedback and deliberate practice. Br J Clin Pharmacol. 2020; 1-13. https://doi.org/10.1111/bcp.14325

10. R Core Team [Internet]. R: A language and environment for statistical computing. R Foundation for Statistical Computing, 2018. Vienna, Austria. Available from: https://www.R-project.org/

11. Creswell, J. W. Qualitative inquiry and research design: Choosing among five approaches (2nd ed.) 2007. Sage Publications, Inc.

12. The Health and Disability Commissioner. Complaints Closed by the Health and Disability Commissioner about Medication Errors: Analysis and Report 2009–2016. 2019. HDC.

13. Noble C, Billett S. Learning to prescribe through co-working: junior doctors, pharmacists and consultants. Med Educ. 2017; 51(4):442-51. doi: 10.1111/medu.13227.

For the PDF of this article,
contact nzmj@nzma.org.nz

View Article PDF


Table 1: The key components of the New Zealand intervention.

*A full protocol of the simulation and the design of the ward coaching model is available on request from the corresponding author.

The programme aimed to accelerate the prescribing performance of PGY1 doctors. It was hoped that after three months the intervention cohort’s prescribing would match the level of performance achieved by the control group after 12 months.

Study design

The intervention was evaluated using a multi-method design with these objectives:

  1. Assess the impact on patient care through reduced prescribing errors.
  2. Facilitate quality improvement and programme development by documenting participants’ experiences, insights and recommendations for improvement at each DHB site (PGY1 doctor, medical and pharmacy educators).

The first objective of this study was to assess the impact on prescribing practice of a sustained educational intervention delivered over a three-month period. The intervention combined simulations with personalised, structured, video-enhanced feedback and ongoing ward coaching by pharmacists. Consistent with the intervention design used by Green et al (2020),[[9]] an experimental group (PGY1s on their first placement) and an experienced control group (PGY1s on their fourth and final placement) were constructed to assess the effectiveness of the intervention. All prescriptions written by the control and intervention groups were audited daily and analysed for prescribing accuracy and appropriateness by an independent ward pharmacist.

To inform programme development, qualitative data were also gathered through semi-structured interviews with PGY1 doctors and debrief meetings with medical and pharmacy educators, collecting feedback about satisfaction, implementation, experience and sustainability.

Methods

Prescribing audit data collection and analysis

Following the UK model, the New Zealand pharmacists completed prescribing audits daily, recording prescribing errors by PGY1 doctors over a six-month period. The first (baseline) data set was collected over three months from doctors on their final rotation (Quarter 4). The second data set was collected during the following three months, which was Quarter 1 for the new incoming PGY1 doctors who received the intervention (Figure 1). Data for Sundays and public holidays were collected on the next working day, and any gaps were identified and accounted for.

Figure 1: Audit data collection time frame.

Data analysis

Error prevalence was calculated using the following formula: error prevalence = (total number of errors / total medicines prescribed) × 100.

The error data were tabulated, and the prevalence of errors in the control and intervention groups was compared using a test of proportions in R.[[10]]

Data were stratified by type of error, severity of error and grade of prescriber. Descriptions of the numerator and denominator data and of the error severity categories are provided in Appendix Figure 1.
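
Illustratively, this calculation and comparison can be reproduced with the base prop.test function in R;[[10]] the sketch below is not the study analysis, and the error and prescription counts are hypothetical placeholders rather than the audited data.

# Illustrative sketch only: placeholder counts, not the audited study data
errors  <- c(control = 120, intervention = 40)     # total prescribing errors per group (hypothetical)
scripts <- c(control = 3000, intervention = 2800)  # total medicines prescribed per group (hypothetical)

# Error prevalence (%) = (total errors / total medicines prescribed) x 100
prevalence <- errors / scripts * 100
round(prevalence, 2)

# Two-sample test of proportions comparing error prevalence between the groups
prop.test(x = errors, n = scripts)

With the real numerator and denominator data (Appendix Figure 1), the same call produces comparisons of the kind reported in Table 2.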

Qualitative data collection and analysis

PGY1 interviews

PGY1 doctors were interviewed individually by the independent project coordinator within a month of completing the intervention, using a semi-structured format with a mix of face-to-face and telephone interviews (Appendix Figure 2). Interviews were brief and pre-arranged, averaging 15 minutes, to ensure the doctors were not away from clinical service for longer than necessary. Information sheets were shared when the interviews were arranged, and signed consent was obtained at the commencement of each interview. Responses were recorded on an anonymised template and checked for accuracy with the junior doctor at the end of the interview.

Medicine and pharmacy educator debriefs

Face-to-face group debriefs were held at each DHB site (site 1 and site 2) with pharmacists and medical education staff. Oral consent was obtained. Discussion focused on capturing each group’s perception of the value of the project. They were asked: What worked well? What did not work well? Did you feel this was a good use of your time? What do we need to consider in the future? The facilitator took notes and sought clarification or elaboration of points that were unclear. Notes from the two sites were kept separate, as there were variations between site 1 and site 2. The emphasis was on looking forward: programme improvement and future development, including implementation at other sites.

Data analysis

Initial thematic analysis was undertaken following Creswell’s method.[[11]] Data from each stakeholder group were analysed for significant statements and quotes, and clusters of meaning were developed using colour coding. Common themes and patterns emerged from both sites, so the data from the two sites were combined. An informative label that resonated with the research team was selected for each theme.

Initially, feedback from medical and pharmacy teams was analysed separately. However, as significant site variations emerged alongside clear commonalities, medical and pharmacy responses were integrated for reporting results.

Ethics

This study received ethics approval from the UNITEC Research Ethics Committee (EC). EC registration number: 2016-1038.

Site approval and registration were granted by each DHB. On 21 June 2016 the New Zealand Health and Disability Ethics Committee formally deemed the study to be outside its scope and confirmed that the study did not require its approval.

Results

Quantitative results: prescribing audit outcomes

Prescribing error prevalence fell at both sites (Table 2), although the reduction reached statistical significance only at site 1.

The volume of prescribing on the wards at the two sites was comparable for the baseline and intervention groups (Table 2). Volumes were reduced at both sites for the intervention group due to the holiday period.

Junior doctor prescribing errors fell at both sites following the intervention. At site 1 (DHB1), error prevalence fell by approximately 79% (p=0.02); at site 2 (DHB2), it fell by approximately 38% (p=0.35).

Table 2: Prescribing errors by site. Control groups: N=14 (DHB1) and N=17 (DHB2); intervention groups: N=6 (DHB1) and N=4 (DHB2).

Impact on the frequency and severity of prescribing errors

At site 1, the proportion of severe errors in the intervention group fell from 4.04% of total errors to zero (Table 3). There was no significant impact on error severity at site 2 (Table 4).

Table 3: Site 1 error severity as a proportion of total errors prescribed (PGY1 data).

Table 4: Site 2 error severity as a proportion of total errors prescribed (PGY1 data).

Qualitative results

PGY1 doctor interviews

Interviews were conducted with all 10 PGY1 doctors across the two sites (a 100% response rate).

Respondents were asked to rate their overall experience on a scale of 1 to 10, with 10 being the most positive. The median was 8 at site 1 (n=6) and 6.5 at site 2 (n=4).

Feedback was consistent for seven of the nine themes identified across the two sites, and these consistent responses were all positive. The overall experience was valued by the PGY1s, who agreed that the training was most useful early in the training year, that their reasoning when prescribing was enhanced and that they felt more confident. The authenticity of the simulation environment and the use of real patients were helpful for learning. They valued all the feedback offered, but the video of the simulation appeared less useful than reviewing prescribing on the ward with the pharmacist (detail is provided in Appendix Table 1).

Notably, they all thought that the programme should be rolled out to a full cohort of PGY1s, and 60% thought it would be valuable to repeat the programme in their second postgraduate year, especially in subspecialty areas with complex medications (eg, gastroenterology, oncology).

Site variation emerged when the PGY1s were asked about improvements for the future.

Site variation themes

Information provided prior to consent ensured that respondents were aware this was a new programme being trialled, and that we wanted detailed comments on how to improve it for the next cohort. Two areas of difference emerged: the structure of the second simulation, and the delivery of the ward coaching.

Table 5: Site variation PGY1s.

Comments highlighted variation in the implementation and delivery of the end-of-run simulation and the ward coaching across sites, signalling areas for improvement at site 2 as well as considerations for planning new site implementations.

Medicine and pharmacy educators’ debriefs

Both common and site-specific comments emerged that mirrored the PGY1 feedback. The educators all felt that it was worthwhile, supported the concept and wanted to continue to develop the programme in the future.

“Conceptually it’s worthwhile.” (Medicine educator, site 2)

“We can see the benefit.” (Pharmacy educator, site 1)

Both sites reported positive responses from the PGY1 doctors, but both sites and both professions also noted the time and resources involved and felt that the current model would be difficult to sustain and deliver to a full cohort of 40–60 PGY1s.

Several features would need to be considered before a full rollout could be undertaken. Comments highlighted variation between sites in preparation, ward culture and the pharmacists’ previous experience and training as interprofessional coaches.

Differences between sites

1. Ward environment

Site 1 used a rehabilitation and general medical ward, and site 2 used a busy orthopaedic ward. Ward selection affects release time, both for attending the simulation and for meetings with the pharmacist.

“The wards that the intervention is based on need to be considered—for the very busy, acute wards, release time for simulation was hard.” (Medicine educator, site 2)

“Hard for pharmacists catching up with PGY1s when they are so busy.” (Pharmacy educator, site 2)

2. Simulation environment

Although the simulations ran relatively smoothly, the debriefs turned to a discussion of volume and capacity. At site 1 there was more technical help available from the simulation unit, and more support staff were available to assist patients before and after the simulation. In addition, video recording and copying assistance ensured prompt return of the videos to the PGY1s. At site 2 there was less technical support.

“Using real patients is the gold standard but takes a lot of coordination, especially for the second simulation, to bring them in, support them and train them.” (Medicine educator, site 2)

“Need to use a full simulation unit with full support, which is resource intensive for large numbers of simulations. Could we run it in situ in the ward, linked to ward rounds?” (Medicine educator, site 2)

3. Engagement, training and briefing of pharmacists

Early engagement and training of the pharmacists for the ward-based coaching role are critical, as the role represents a significant shift for them; this preparation varied by site. At site 2 the pharmacists were not as well prepared. Comments from the pharmacists at this site included:

“Our execution did not run as well as it could have.” (Pharmacy educator, site 2)

“Using our time is ok if they prescribe better.” (Pharmacy educator, site 2)

“Looking back, we needed to engage earlier.” (Pharmacy educator, site 2)

In comparison, site 1 had more experience working in a coaching role with junior medical staff and stronger pharmacy leadership. Their comments were:

“If it was rolled out to all PGY1s, it would decrease the pharmacist workload, as less follow-up would be needed.” (Pharmacy educator, site 1)

“It helped us build rapport—the PGY1s approached pharmacists more easily on other matters.” (Pharmacy educator, site 1)

“PGY1s would ring us with concerns.” (Pharmacy educator, site 1)

Comments also indicated a willingness for more pharmacy involvement in planning at site 1:

“Communicate with pharmacists more before the second simulation to have an idea of the major issues or themes that pharmacists identified on the ward.” (Pharmacy educator, site 1)

The variation in pharmacist feedback across the two sites mirrors the PGY1 doctors’ comments, highlighting the need for consistent training and preparation for all educators.

An incidental finding identified by the pharmacists at site 1 was that, during the simulation, the six PGY1 doctors each prescribed differently for the same patients.

Overall, the comments and discussion in all debrief groups can be summarised by this quote:

“A great educational opportunity—a good approach to teaching something so important, but we need to streamline it and make it more efficient—resources are needed to back it.” (Medicine educator, site 2)

Limitations

Limitations due to the small cohort sizes and site variation are acknowledged. Nonetheless, the outcomes mirror the UK findings and demonstrate improvements in patient safety. As a trial, this project was concerned with impact (measured by prescribing errors) and quality improvement, so it focused on transferability rather than generalisability. No two sites will ever be identical. The dual-site implementation highlighted regional strengths and weaknesses, raised key points for the transferability of the programme to other sites and informed the project moving forward.

Discussion

Impact on patient outcomes

The results, which mirror the UK experience, demonstrate an improvement in patient outcomes as measured by prescribing error rate and error severity.[[9]] At site 1 the impact on patient outcomes exceeded our expectations: the error severity profile reduced significantly for three of the six categories, while moderate errors increased.

We are unable to explain the increase in moderate errors. Clinical staff suggested that this finding may be connected to the timing of the first rotation in New Zealand, which coincides with the summer holiday period and the start of a new rotation for many registrars. Lack of awareness of local protocols and lower staffing levels may have affected communication. In addition, the inter-practitioner variability in prescribing noted during the simulation highlighted the importance of training PGY1 doctors in local protocols using current best practice and helping them to think critically about prescribing. The increase in moderate errors by prescribers new to a service highlights the need for effective training, given that consistency of approach is a cornerstone of safe prescribing practice.

The Health and Disability Commissioner’s analysis of complaints involving a medication error between 2009 and 2016 in New Zealand identified several factors that contributed to prescribing errors: failure to obtain necessary information (60%), failure to follow policy and protocol (20%), inadequate knowledge of the medication (17%) and training and orientation to the service (5% each).[[12]]

This supports our hypothesis that pharmacist availability at induction, to support information gathering and build familiarity with ward protocols and policies, needs to be addressed early in the training year.

PGY1 doctor satisfaction

The training was well received by the PGY1 doctors. They not only found it useful but would like it repeated in their second year for more complex medications in subspecialties. From both the patient care and the PGY1 doctors’ perspectives, the evaluation shows that there is value in rolling out the programme to a full cohort at the trial sites, and that a full national rollout could have a dramatic impact on patient care.

Sustainability

On follow-up with the medical education units, medicine and pharmacy educators raised issues of resourcing and sustainability that cannot be ignored. Although this initiative demonstrated that the educational programme can make a significant difference to patient care, a key learning for the implementation team has been that the current model is extremely resource-intensive, particularly the simulation laboratory sessions, which take two hours per person (roughly 80–120 hours for a cohort of 40–60 interns) and would be difficult to deliver at that scale.

Regional variation

A dual-site implementation reminds us that workplace contextual and cultural factors will vary across sites and any widespread implementation needs to anticipate this.

All sites have their own implementation strengths and challenges, and a national rollout would need the flexibility to accommodate these differences. Regional variation and the preparedness of all professional groups are key considerations. At site 1 the pharmacists had more experience and training for the coaching role, and this was evident in the feedback. Sites also vary in their levels of simulation support, ward staffing structures and the culture of wards, services and teams. These factors are important considerations for scaling up the project.

What next?

The model has several phases that draw on evidence-based educational practice. It is now recognised that further work is needed to trial delivery options at additional sites while maintaining these core principles. Two aspects are being explored:

  1. Alternatives to running simulations off the ward at the beginning and end of the rotation. Two suggestions have emerged:
    — Doing small in situ simulations on the ward before or after ward rounds, with current patients and using iPhones to video the interaction.
    — Building scenarios into a workshop at the start of the rotation, using pre-recorded in situ simulations or written cases, with small-group discussions to establish learning needs and provide a baseline for self-assessment and self-directed learning at the end of the rotation.
  2. Developing tips and training for pharmacists coaching junior doctors on the ward.

The combined feedback from this study, an earlier New Zealand study[[8]] and a study exploring pharmacy and medicine co-working in Australia[[13]] indicates that pharmacist coaching on the ward may be key to successfully reducing prescribing errors. In our study, the preparation the pharmacists received prior to the pilot was more extensive at site 1 than at site 2, and site 1 pharmacists had more experience coaching PGY1 doctors. A refinement for the future is to develop a consistent training programme for pharmacists with an interprofessional educator focus.

Towards a sustainable model for large cohort implementation

The ongoing goal is to develop a flexible and sustainable model to help train entire cohorts of PGY1 doctors. The key emergent theme from the New Zealand experience is the role of the hospital pharmacist in the training of PGY1 doctors.

At the start of the 2018/19 training year, site 1 trialled a version that drew on the ePPiFany experience but replaced the simulation with a pharmacist-led, case-based workshop as a precursor to ward-based learning with pharmacist coaching. This appears to be a more sustainable model, combining the ePPiFany core principles of supported self-directed learning on the ward with pharmacist coaching, an initial workshop and targeted feedback. It will be implemented at site 2 for the 2020/21 training year. Both sites are providing this for all current PGY1 doctors, with cohorts of 40 or more.

A process to monitor PGY1 doctors’ improvement over the length of the attachment, and an end-of-placement feedback activity aligned with the second simulation of the ePPiFany model, are under development. The ongoing challenge is finding key indicators for measuring improvements in medication safety during the training period that are more sustainable than an audit tool.

We hope that this refined model of delivery will have the flexibility to be shared with other New Zealand and Australian providers, and that we can work collaboratively to build and enhance the programme and share the ongoing development of resources.

Funding

Pfizer International funded the independent audit of prescribing that allowed us to include patient outcomes within the programme evaluation.

Appendix

Appendix Figure 1: Audit information.

Appendix Figure 2: Semi-structured interview questions—PGY1 doctors.

Appendix Table 1: Site-consistent themes from the PGY1 interview data.


Author Information

Dr Dale Sheehan: Medical and Interprofessional Educator, Medical Education Unit, Dean’s Office, Otago Medical School, Dunedin. Avril Lee: Pharmacist – Quality Improvement, Pharmacy Department/Medical Education Training Unit, Waitematā DHB, North Shore, Auckland. Associate Professor Arindam Basu: Associate Professor, Public Health and Epidemiology, School of Health Sciences, University of Canterbury, Christchurch. Dr John Thwaites: Director of Medical Clinical Training and Consultant Gerontologist, Medical Education and Training Unit, Canterbury District Health Board, Christchurch.

Acknowledgements

The authors wish to acknowledge the generous support and assistance from Dr Rakesh Patel (Clinical Associate Professor in Medical Education, University of Nottingham) for not only sharing the ePPIFany programme, protocols and audit design, but also for co-presenting with us at a number of Australasian forums. We also acknowledge Professor William Green (University of Leicester) for his assistance with audit data collation and analysis and the pharmacists who worked with us at both New Zealand sites for their support, effort, expertise and generosity over the six-month period.

Correspondence

Dr Dale Sheehan, Medical and Interprofessional Educator, Medical Education Unit, Dean’s office, Otago Medical School, 021395747

Correspondence Email

Dalecsheehan@gmail.com

Competing Interests

Nil.

1. Dornan T, Ashcroft DM, Heathfield H, Lewis PJ, Miles J, Taylor D, et al. An in-depth investigation into the causes of prescribing errors by foundation trainees in relation to their medical education: EQUIP study. Manchester: University of Manchester; 2009.

2. Davis P, Lay-Yee R, Briant R, Ali W, Scott A, Schug S. Adverse events in New Zealand public hospitals: occurrence and impact. N Z Med J. 2002;115(1167).

3. Tully MP. Prescribing errors in hospital practice. Br J Clin Pharmacol. 2012;74(4):668-75.

4. Thornton PD, Simon S, Mathew TH. Towards safer drug prescribing, dispensing and administration in hospitals. J Qual Clin Pract. 1999;19:41-5.

5. Dean FB, O'Grady K, Paschalides C, Utley M, Gallivan S. Providing feedback to hospital doctors about prescribing errors: a pilot study. Pharm World Sci. 2007;29(3):213-20.

6. Reason J. Human error: models and management. BMJ. 2000;320(7237):768-70. doi:10.1136/bmj.320.7237.768

7. Lee A, Sheehan D, Alley P. Medication safety and quality improvement in PGY1 teaching. N Z Med J. 2013;126(1384).

8. Sheehan DC, Lee AP, Young ML, Werkmeister BJ, Thwaites JH. Opioids, the pharmacist and the junior doctor: reducing prescribing error. J Pharm Pract Res. 2019;49:356-63. doi:10.1002/jppr.1526

9. Green W, Shahzad MW, Wood S, et al. Improving junior doctor medicine prescribing and patient safety: an intervention using personalised, structured, video-enhanced feedback and deliberate practice. Br J Clin Pharmacol. 2020;1-13. doi:10.1111/bcp.14325

10. R Core Team. R: a language and environment for statistical computing [Internet]. Vienna, Austria: R Foundation for Statistical Computing; 2018. Available from: https://www.R-project.org/

11. Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 2nd ed. Thousand Oaks (CA): Sage Publications; 2007.

12. Health and Disability Commissioner. Complaints Closed by the Health and Disability Commissioner about Medication Errors: Analysis and Report 2009–2016. HDC; 2019.

13. Noble C, Billett S. Learning to prescribe through co-working: junior doctors, pharmacists and consultants. Med Educ. 2017;51(4):442-51. doi:10.1111/medu.13227
