1st August 2014, Volume 127 Number 1399

Steven Lillis, Heather Roblin

International medical graduates (IMGs) who wish to work in New Zealand may have to undertake a series of assessments depending on prior qualifications and experience; one such pathway is the NZREX Clinical pathway. In order to be eligible for this pathway, the IMG must have successfully met the Medical Council of New Zealand's (the Council's) English language policy and must have passed a Council-approved written examination within the 5 years prior to sitting NZREX Clinical.

NZREX Clinical is a 16-station Objective Structured Clinical Examination (OSCE). A detailed description of the assessment, including psychometric data, has previously been published.1

The average pass rate in the assessment over the past 5 years is 60%. Those who are successful in the assessment are eligible for provisional general registration (i.e. working under Council-monitored supervision) in New Zealand and are permitted to work as interns in specified hospital positions.

During this year, the IMG intern will be expected to complete four runs that must include one in medicine and one in surgery.2 Surgical and medical runs must be given a ‘category A’ rating by the Medical Council of New Zealand. A ‘category A’ surgical run is one which provides the trainee with substantial training in basic surgical principles. A ‘category A’ medical run is one in which there is a substantial content of general medical training. Category A runs are usually found in general medicine, paediatrics, general surgery and orthopaedics. It is acknowledged that attachment experiences will differ, with a broader exposure characteristic of peripheral hospitals and limited exposure associated with tertiary hospitals. The two remaining runs may be more specialised but remain subject to approval. Each IMG will have a clinical supervisor for a specific run who is responsible for giving support and guidance to the IMG and also for reporting on the progress being made.

Intern supervisors for IMGs are required to complete a supervisor report form at the end of each quarter. The grades given by the supervisor are based on the work undertaken by the IMG in a work context and as such represent a workplace-based assessment (WBA). The form assists the supervisor by directing attention to specific areas of performance, graded on a 5-point Likert scale.3

An 'unsatisfactory report' is defined as having any of the 19 attributes scored as a category 1 (performs significantly below that generally observed for this level of experience) or more than one category 2 (below expectation—requires further development). The form is discussed with the IMG and is then forwarded to the Council. The Council will discuss any unsatisfactory report received with the supervisor to determine whether any additional support or input from the Council is required.
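The classification rule above can be expressed as a short check. This is a minimal illustrative sketch only; the function name and the list-of-grades data layout are assumptions for illustration, not Council software:

```python
def is_unsatisfactory(scores):
    """Classify a supervisor report from its 19 graded attributes.

    A report is unsatisfactory if any attribute is scored category 1
    (performs significantly below that generally observed for this
    level of experience) or more than one attribute is scored
    category 2 (below expectation - requires further development).
    """
    assert len(scores) == 19, "the report form grades 19 attributes"
    has_category_1 = any(s == 1 for s in scores)
    category_2_count = sum(1 for s in scores if s == 2)
    return has_category_1 or category_2_count > 1

# A single category-2 grade alone does not make a report unsatisfactory.
print(is_unsatisfactory([3] * 18 + [2]))     # False
print(is_unsatisfactory([3] * 17 + [2, 2]))  # True: two category 2s
print(is_unsatisfactory([1] + [4] * 18))     # True: one category 1
```

Note that the rule is asymmetric: one category 1 is sufficient on its own, while category 2 grades only trigger an unsatisfactory report in combination.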

To progress from provisional general registration to general registration, the IMG must have three consecutive satisfactory reports. All supervisors of interns who are IMGs are invited to annual workshops run by the Council specifically for the purpose of assisting the supervisors to provide support in pastoral, professional and educational frameworks.

In measuring the quality of an assessment, a useful structure focuses on issues of validity, reliability, educational value, acceptability and cost.4 Validity refers to the ability of an assessment to measure the characteristics of the trait it was designed to measure.5 Within the concept of validity, several discrete variations have been described, including face validity (the impression of validity), content validity (does the test measure the defined content), criterion validity (how well the test predicts the outcome of another measurement method) and construct validity (how well the test measures the construct it is supposed to measure).6

The stated purpose of NZREX Clinical is ‘By passing NZREX Clinical, we then know you are competent to have provisional registration in New Zealand.’ 7 It is therefore of value to collect data on subsequent clinical performance in the workplace of those candidates who were successful in the NZREX Clinical as this information represents a measure of criterion validity. This paper reports on the progress of successful NZREX Clinical IMGs through the first year of provisional general registration in New Zealand.


All successful NZREX Clinical candidates who registered with the Council for clinical practice between September 2005 and August 2013 were identified. For each IMG, the supervisor report forms were extracted and the data anonymised and aggregated. Demographic data were also obtained, anonymised and aggregated.


Demographic data—There were 353 successful candidates between September 2005 and August 2013. Of those, 312 (88%) had a current Practising Certificate (PC) in August 2013.

Unsatisfactory reports—The overall number of IMGs with one or more unsatisfactory reports was 37 (10.4%). The number of unsatisfactory reports increased with the number of attempts taken to pass NZREX Clinical.

Of those who passed at their first attempt, 9% finished the clinical year with at least one unsatisfactory report; of those who required two attempts at NZREX Clinical, 13% received one or more unsatisfactory reports; and of those taking three attempts, 16% received one or more unsatisfactory reports.

Overall, only six doctors (2%) had more than one unsatisfactory report. Just over 1% of the cohort of 312 actively registered NZREX Clinical doctors had an open health concern, and the same number had an open complaint, as at August 2013. By run, 42% of unsatisfactory reports were generated in the first run, 35% in the second, 13% in the third and 11% in the fourth.

As three consecutive satisfactory reports are required for progression to a general scope, it is important to note that an IMG who receives an unsatisfactory report in the first run is still eligible to progress to a general scope after 12 months of working within a provisional general scope. Of those with an unsatisfactory report in the first run, 11 of the 23 had no further unsatisfactory reports and were able to meet general registration requirements by the end of that year.
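Why a first-run unsatisfactory report still permits progression within the year, while a later one may not, can be illustrated with a short sketch of the three-consecutive-reports rule (the function name and boolean representation are hypothetical, for illustration only):

```python
def eligible_for_general_scope(reports):
    """Return True if a sequence of quarterly supervisor reports
    contains three consecutive satisfactory reports.

    `reports` is a list of booleans, one per run in order,
    where True means the report was satisfactory.
    """
    consecutive = 0
    for satisfactory in reports:
        consecutive = consecutive + 1 if satisfactory else 0
        if consecutive >= 3:
            return True
    return False

# An unsatisfactory first run still leaves room for three
# consecutive satisfactory reports in runs 2-4 of the same year.
print(eligible_for_general_scope([False, True, True, True]))  # True
# An unsatisfactory second or third run does not, within four runs.
print(eligible_for_general_scope([True, False, True, True]))  # False
```

With only four runs in the provisional year, the first run is the only position where a single unsatisfactory report leaves the requirement achievable without extending supervised practice.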

Areas of concern—When an unsatisfactory report was generated, the mean number of concerns was 6.3 out of a possible 19 with a standard deviation of 4.0. The areas of concern noted in the unsatisfactory supervisor reports are presented in Table 5 in the same way as supervisor report forms are structured.


Table 5. Areas of concern in unsatisfactory reports (n=56)

Clinical knowledge and skills (domain: 28% of concerns)
    1. Clinical knowledge
    2. Professional knowledge
    3. Clinical clerking
    4. History taking
    5. Relevant procedural skills

Clinical judgment (domain: 35% of concerns)
    6. Diagnostic skills
    7. Patient management
    8. Time management
    9. Recognising limits

Patient communication (domain: 28% of concerns)
    10. Communication skills
    11. Ability to communicate with patients and families
    12. Sensitivity, ethical and cultural awareness
    13. Ability to communicate with other healthcare professionals
    14. Initiative and enthusiasm
    15. Takes responsibility for own learning
    16. Motivation to teach

Professional attitudes and behaviour (domain: 9% of concerns)
    17. Reliability and dependability
    18. Ability to cope with stress
    19. Personal manner

The cohort entering the clinical year had, by definition, all been successful in the NZREX Clinical OSCE, yet 10% went on to receive at least one unsatisfactory report. There are a number of reasons why this could happen. The OSCE reflects a traditional assessment undertaken outside the workplace, where measurements of reliability are required as well as mechanisms to ensure validity.8,9

The assessment undertaken by supervisors, based on observation of work in a clinical setting, has the theoretical basis of WBA, where constructivist methods are used to observe complex contextualised performance in the workplace.10 Thus the nature of what is assessed and the theoretical basis of the assessment differ between the two modalities. A study of 39 IMGs undergoing a number of tests in different modalities also concluded that this modal difference accounted for poor correlation on the same trait between WBA techniques and an OSCE.11 This may partially explain why some successful NZREX graduates performed poorly in clinical attachments.

A further confounding variable is the reliability of supervisor reports. Issues of feasibility prevent rigorous training and calibration of individual supervisors, and thus variability in reports can occur. There is also a variable, and sometimes inadequate, level of contact between intern and supervisor. The vexing question of false positives and false negatives also needs to be addressed.

The literature suggests that medical educators do pass underperforming students on occasion, and that individual assessor differences in what represents adequate skill, uncertainty over the accuracy of rating scales, and reliance on global impressions may confound the report from a clinical attachment.12,13 The converse may also apply, where inadvertent negative bias may exist in assessment processes, as evidenced by the recent controversy over the MRCGP examinations.14,15

The timing of the unsatisfactory report is of interest as there was a substantial reduction in numbers of such reports as the year progressed. Prior research has identified difficulties faced by IMGs in their intern year that include finding employment, integrating into the work role and social integration.16 It is likely that new work environments as well as the difficulties of resettling in a new country and potentially practising medicine in a second language are responsible for the initial difficulties experienced by these doctors. As experience is gained, many of the problems causing unsatisfactory reports would appear to have resolved. As already stated, three consecutive satisfactory reports are required for progression to a general scope, and the IMG who receives an unsatisfactory report in the first run is still eligible to progress to a general scope after 12 months of working within a provisional general scope.

The provisional general registration year is considered to have both service and educational functions. Some concerns exist with the overall educational value of WBA and the ability of this assessment methodology to improve educational outcomes.17 However, it is also clear that the quality of feedback in WBA is a key factor in improving clinical skills and knowledge.18

The structure of supervision for these IMGs is designed to provide feedback. The demographic data would suggest that a large proportion of those successful in the NZREX Clinical remain in the New Zealand workforce. Although the reasons for remaining in New Zealand are multifactorial, a contributor to such a decision would be the level of satisfaction with training and working environments. There would appear to be some difference in the workplace performance of NZREX Clinical IMGs according to the number of attempts taken to pass NZREX Clinical.

The number of unsatisfactory reports increased with the number of attempts taken to pass the assessment. If it is assumed that all successful NZREX Clinical IMGs share a common minimum level of skills, knowledge and ability, but that some required longer to attain that standard, then these doctors may be less capable of adapting to new circumstances.

With 90% of IMGs completing the provisional general registration year with no unsatisfactory reports, there is a good level of concordance between the two assessments. This result is consistent with the limited published research on the predictive validity of high-stakes OSCEs, in which excellent concordance was found between a selection OSCE for clinical placement and subsequent results in the clinical attachment.19 A different study, of construct validity in third-year medical students undertaking a psychiatry run, found weaker correlations.20

There was a fairly even spread of concerns across Clinical Knowledge and Skills, Clinical Judgment and Patient Communication, with a lower incidence of concerns over Professional Attitudes and Behaviour. Of note is the low incidence of concerns over cultural awareness. This may reflect the considerable emphasis placed on cultural awareness in New Zealand, with hospital orientations emphasising its importance and cultural issues being examinable in NZREX Clinical.

Diagnostic skills and patient management were the most common sub-categories of concern. Both are professional skills requiring complex cognitive processes in the workplace setting, which in turn depend on adequate clinical knowledge, communication skills and other foundations. Weakness in these 'building blocks', as well as poor clinical reasoning skills, will result in deficient diagnostic and management abilities.


Of IMGs entering the provisional general year of registration after NZREX Clinical, 90% had satisfactory reports from all four runs. Of those who received an unsatisfactory report, it was more common for that report to come from the first or second run. There was only a minor difference in clinical performance between those who took more than one attempt at NZREX Clinical and those who passed at their first attempt.

When unsatisfactory reports are generated, the most common concerns are in the area of 'Clinical judgment', but 'Clinical knowledge and skills' and 'Patient communication' issues are almost as common. For hospitals that take successful NZREX graduates, greater focus on these areas would be beneficial. The results of this research will inform the future direction of the NZREX Clinical examination.


Some overseas-trained doctors must pass a clinical examination before working in supervised positions as junior doctors. Of those who pass the examination, 90% do well in their first year of supervised practice. When problems occur, they usually arise in the first 6 months of practice.



To determine the frequency and nature of clinical difficulties faced in the first year of supervised clinical practice by international medical graduates (IMGs) who have successfully passed NZREX Clinical in order to be able to practise in New Zealand.


All doctors who successfully passed NZREX Clinical and who registered with the Medical Council of New Zealand (the Council) from 2005 to 2013 were identified. Supervisor reports for each of the four runs in the first year of practice were obtained, and reports in which concerns were raised over clinical performance were analysed.


Results—Of 353 IMGs successful in NZREX Clinical, 316 (89.6%) completed the subsequent clinical year with no adverse reports. Those requiring more than one attempt to pass NZREX Clinical showed an incremental increase in the number of unsatisfactory reports, in which areas of the IMGs' performance were rated as 'below the expected standard'. Fewer than 2% of IMGs had more than one unsatisfactory report. The majority of unsatisfactory reports were generated in the first half of the clinical year. The areas of concern found were Clinical Knowledge and Skills (28%), Clinical Judgment (35%), Patient Communication (28%) and Professional Attitudes and Behaviour (9%).


Most IMGs who were successful in NZREX Clinical performed well in the subsequent year of clinical practice. NZREX Clinical would appear to have acceptable criterion validity.

Author Information

Steven Lillis, Medical Adviser

Heather Roblin, Professional Standards Coordinator

Medical Council of New Zealand, Wellington


Dr Steven Lillis, Medical Adviser, Medical Council of New Zealand, Level 6, 80 The Terrace, Wellington, New Zealand.


1. Lillis S, Stuart M, Sidonie, Takai N. New Zealand Registration Examination (NZREX Clinical): 6 years of experience as an Objective Structured Clinical Examination (OSCE). N Z Med J. 2012 Sep 7;125(1361):74–80. http://journal.nzma.org.nz/journal/125-1361/5327/content.pdf

2. https://www.mcnz.org.nz/assets/News-and-Publications/Booklets/Education-and-Supervision-for-interns.pdf Accessed 19th June 2014

3. http://www.mcnz.org.nz/assets/Forms/RP3.pdf Accessed 1st May 2014

4. Van Der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996 Jan;1(1):41–67.

5. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001 Mar 24;357(9260):945–9.

6. Vallevand A, Violato C. A predictive and construct validity study of a high-stakes objective clinical examination for assessing the clinical competence of international medical graduates. Teach Learn Med. 2012;24(2):168–76.

7. http://www.mcnz.org.nz/get-registered/registration-exam-nzrex-clinical/ Accessed 1st May 2014

8. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med. 2008 Sep;40(8):574–8.

9. Fuller R, Homer M, Pell G. Longitudinal interrelationships of OSCE station level analyses, quality improvement and overall reliability. Med Teach. 2013 Jun;35(6):515–7.

10. Govaerts M, van der Vleuten CP. Validity in work-based assessment: expanding our horizons. Med Educ. 2013 Dec;47(12):1164–74.

11. Baig L, Violato C, Crutcher R. A construct validity study of clinical competence: a multitrait multimethod matrix approach. J Contin Educ Health Prof. 2010 Winter;30(1):19–25.

12. Yeates P, O'Neill P, Mann K, Eva K. Seeing the same thing differently: mechanisms that contribute to assessor differences in directly-observed performance assessments. Adv Health Sci Educ Theory Pract. 2013 Aug;18(3):325–41.

13. Monrouxe LV, Rees CE, Lewis NJ, Cleland JA. Medical educators' social acts of explaining passing underperformance in students: a qualitative study. Adv Health Sci Educ Theory Pract. 2011 May;16(2):239–52.

14. Esmail A, Roberts C. Academic performance of ethnic minority candidates and discrimination in the MRCGP examinations between 2010 and 2012: analysis of data. BMJ. 2013 Sep 26;347:f5662.

15. Graham M, Tsisi R, Cowley G. Racial bias unlikely to be a factor in the differential performance of candidates in the MRCGP exam. BMJ. 2013 Oct 29;347:f6447.

16. Lillis S, St George I, Upsdell R. Perceptions of migrant doctors joining the New Zealand medical workforce. N Z Med J. 2006 Feb 17;119(1229):U1844. http://journal.nzma.org.nz/journal/119-1229/1844/content.pdf

17. Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: a systematic review. BMJ. 2010 Sep 24;341:c5064.

18. Pelgrim EA, Kramer AW, Mokkink HG, van der Vleuten CP. The process of feedback in workplace-based assessment: organisation, delivery, continuity. Med Educ. 2012 Jun;46(6):604–12.

19. Vallevand A, Violato C. A predictive and construct validity study of a high-stakes objective clinical examination for assessing the clinical competence of international medical graduates. Teach Learn Med. 2012;24(2):168–76.

20. Park RS, Chibnall JT, Blaskiewicz RJ, et al. Construct validity of an objective structured clinical examination (OSCE) in psychiatry: associations with the clinical skills examination and other indicators. Acad Psychiatry. 2004 Summer;28(2):122–8.