INVITED EDITORIAL
Year : 2014  |  Volume : 2  |  Issue : 2  |  Page : 123-125

Performance-based assessment: Innovation in medical education


School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, Australia

Date of Web Publication: 11-Nov-2014

Correspondence Address:
Balakrishnan R Nair
School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle
Australia

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/2321-4848.144292


How to cite this article:
Nair BR, Parsons K. Performance-based assessment: Innovation in medical education. Arch Med Health Sci 2014;2:123-5

How to cite this URL:
Nair BR, Parsons K. Performance-based assessment: Innovation in medical education. Arch Med Health Sci [serial online] 2014 [cited 2019 Oct 15];2:123-5. Available from: http://www.amhsjournal.org/text.asp?2014/2/2/123/144292


Introduction

"The educational argument for integrating teaching, learning and assessment is powerful. We know that assessment drives learning and it is therefore imperative that workplace-based assessment focuses on important attributes rather than what is easiest to assess." [1]

The practice of medicine is a complex process requiring many attributes; medical knowledge is only one of them. Hence, thorough assessment of skills and performance is required in medical practice. Traditionally, doctors were assessed on competence (what the doctor can do) rather than performance (what the doctor does). [2]

We know that clinical competence does not predict performance, so we need new approaches to assessment. Performance-based assessment in the workplace is becoming popular. One major advantage of work-based assessment (WBA) is that it allows performance to be evaluated across a variety of contexts and experiences. [3],[4]

Societal responsibilities of the medical profession are a major part of the modern curriculum. [5] This is because what a doctor does affects not only the individual patient but also society at large. It is clear, as mentioned above, that no single person or single method of assessment can comprehensively assess performance in medicine. [6]

In this editorial, we explore the need and rationale for performance-based assessment in the workplace using a variety of tools. As van der Vleuten describes, optimizing assessment requires balancing the six components of the utility index, [7] namely, educational impact × validity × reliability × cost × acceptability × feasibility (E × V × R × C × A × F). We will discuss our experience with various work-based assessment tools over the past 3 years in assessing International Medical Graduates (IMGs) and draw some general conclusions. [8] The same tools and methodology can also be used for assessment in undergraduate and postgraduate programs.
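The multiplicative form of the utility index has a practical consequence worth making concrete: a near-zero score on any single component drags the overall utility toward zero, no matter how strong the others are. A minimal numeric sketch follows; the 0-1 component scores are purely illustrative values we have chosen for the example, not published data:

```python
# Illustrative sketch of van der Vleuten's multiplicative utility index.
# Component scores (0-1 here) are hypothetical, for demonstration only.

def utility_index(educational_impact, validity, reliability,
                  cost, acceptability, feasibility):
    """Multiply the six components: E x V x R x C x A x F.

    Because the model is multiplicative, a weak score on any one
    component (e.g. poor acceptability) collapses overall utility.
    """
    return (educational_impact * validity * reliability *
            cost * acceptability * feasibility)

# A tool that is strong on most components but poorly accepted:
print(round(utility_index(0.9, 0.8, 0.8, 0.7, 0.2, 0.9), 3))
```

The example shows why acceptability and feasibility cannot simply be traded away against validity and reliability when choosing assessment tools.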

What is WBA? It comprises a variety of tools that assess a doctor's ability to practice safely across a range of clinical settings and contexts, with assessments undertaken by multiple assessors to reduce bias.

We will now discuss the commonly used tools.

Gathering information from multiple sources, otherwise called multisource feedback (MSF) or 360-degree assessment (360), has been done in other professions for some time. The process involves completion of an MSF assessment by the individual and by nominated coworkers. The domains include communication, teamwork, empathy, honesty, and similar characteristics needed for practicing medicine in the 21st century. In our program, this feedback was reviewed by a multidisciplinary panel, and guidance, remediation, and mentoring were provided to participants as needed. In our experience, trainees value this feedback and have improved their performance as a result. [2]
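The core mechanics of MSF aggregation can be sketched in a few lines: average the peer ratings per domain and compare them with the self-rating, since a large self-peer gap is one signal a review panel may look at. The raters, domains chosen, and 1-5 scores below are hypothetical examples, not data from our program:

```python
from statistics import mean

# Hypothetical MSF ratings (1-5 scale) from a self-assessment plus
# nominated coworkers, across domains named in the editorial.
ratings = {
    "communication": {"self": 4, "nurse": 3, "registrar": 4, "consultant": 3},
    "teamwork":      {"self": 5, "nurse": 4, "registrar": 4, "consultant": 5},
    "empathy":       {"self": 4, "nurse": 2, "registrar": 3, "consultant": 3},
}

for domain, scores in ratings.items():
    peer = [v for rater, v in scores.items() if rater != "self"]
    # A large positive self-peer gap may flag limited self-insight
    # for the multidisciplinary review panel to discuss.
    gap = scores["self"] - mean(peer)
    print(f"{domain}: peer mean {mean(peer):.2f}, self-peer gap {gap:+.2f}")
```

In practice MSF instruments also collect free-text comments and anonymize raters; this sketch covers only the numeric aggregation step.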

The mini clinical evaluation exercise (mini-CEX) is a rating scale for assessing core clinical competencies, including medical interviewing skills, physical examination skills, and clinical communication. [9] Developed by Norcini, the mini-CEX has been used over the past decade and is a valid and reliable tool for assessing clinical skills. [10] The assessment tasks can be mapped against the curriculum and learning outcomes, which are assessed with a validated assessment form. [8] The individual is evaluated in a diverse range of clinical settings. To achieve reliability, at least 8-10 assessments should be done.
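The 8-10 encounter guideline reflects simple measurement statistics: if each single encounter score carries assessor and case variability, the standard error of the trainee's mean score shrinks with the square root of the number of encounters. A minimal numeric sketch, where the single-encounter standard deviation of 0.8 on a 9-point mini-CEX scale is an assumed, illustrative value rather than a published estimate:

```python
import math

# Why one mini-CEX is not enough: modelling assessor/case variability
# as independent noise, the standard error of the mean (SEM) of a
# trainee's scores falls as 1/sqrt(n) with n observed encounters.
sd_single = 0.8  # assumed spread of a single encounter score (illustrative)

for n in (1, 4, 8, 10):
    sem = sd_single / math.sqrt(n)
    print(f"{n:2d} encounters -> SEM {sem:.2f}")
```

Going from 1 to 8 encounters cuts the uncertainty in the mean score by nearly two-thirds, while gains beyond 10 encounters are marginal, which is consistent with the rule of thumb above.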

Case-based discussions (CbDs) are structured, nonjudgemental reviews that assess what the doctor has actually done in practice. This type of assessment sits at the pinnacle of Miller's pyramid [Figure 1]. Six or seven CbDs (10-20 minutes in duration) are essential to ensure curriculum coverage and that a variety of clinical areas are analyzed, giving a valid and reliable assessment process. [8] CbD assesses competence against standards, judgement, and professionalism. It can provide excellent learning opportunities and prompt changes in clinical practice that improve the trainee's abilities, a characteristic not duplicated by other assessment tools. It can also promote autonomy and self-directed learning. [11] The assessment takes the form of a one-on-one discussion and reflection.
Figure 1: Miller's pyramid in assessment



Direct observation of procedural skills (DOPS) is a structured rating scale that provides assessment of, and feedback on, practical procedures. Competencies commonly assessed include knowledge of the procedure, consent, analgesia, technical ability, aseptic technique, patient management, counselling, and communication. [12] A generalizability analysis and decision (D) study found that, to achieve reliability, a trainee should be observed by at least three different assessors across at least two procedures. [13]
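A program can verify that reliability rule against a trainee's observation log with a simple check. A minimal sketch follows; the procedures, assessor names, and the log structure are hypothetical, invented for illustration:

```python
# Hypothetical DOPS observation log for one trainee:
# (procedure, assessor) pairs recorded over a placement.
observations = [
    ("IV cannulation", "Dr A"), ("IV cannulation", "Dr B"),
    ("suturing", "Dr C"), ("suturing", "Dr A"),
    ("urinary catheterisation", "Dr B"),
]

procedures = {procedure for procedure, _ in observations}
assessors = {assessor for _, assessor in observations}

# Rule of thumb from the D-study cited above: at least two distinct
# procedures, observed by at least three different assessors.
meets_rule = len(procedures) >= 2 and len(assessors) >= 3
print(meets_rule)
```

Encoding the rule this way lets a training program flag trainees whose DOPS portfolio is too narrow (one procedure, or one assessor) before sign-off.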

Although it is ideal to combine these various assessments, adult learners rely on immediate constructive feedback. Therefore, constructive feedback should be incorporated into any assessment; feedback has been described as the basis of effective clinical learning. [14]

It is clear that assessor engagement is imperative for successful WBA. [8] All assessors should be trained and calibrated in all the assessment tools, [15] and the importance of assessor workshops in optimizing the learning opportunity through the delivery of effective feedback cannot be ignored. [9] Learner reflection, and the onus on learners to take responsibility for their own learning, is a valuable component of WBA. Providing trainees with detailed assessment and feedback results in faster acquisition of clinical skills and better patient care. [14],[16]

Workplace assessments aim to enable learners to demonstrate higher-level thinking and knowledge, and cost-effective tools are available to provide such thorough, mixed assessments. We have shown that the costs of these tools are acceptable. [17] For example, when we analyzed the cost of assessment over a 6-month period using WBA principles and tools, the average cost per doctor was A$10,000, which we believe is a reasonable price to pay for optimal outcomes. More and more medical schools and postgraduate training programs are introducing these modern methods of assessment, and the outcome is better prepared and qualified professionals. If we practice evidence-based medicine, we should practice evidence-based assessment too. This is the way forward in medical education.


Acknowledgment


The authors thank Steve Mears for editorial comments.

 
References

1. Chana N. Workplace-based assessment [Internet]. London: Faculty Development, London Deanery; 2008. Available from: http://www.faculty.londondeanery.ac.uk/e-learning/workplace-based-assessment/why-workplace-based-assessment [Last accessed on 2014 Jan 22].
2. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65:S63-7.
3. Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: A systematic review. BMJ 2010;341:c5064.
4. Beard J. Workplace-based assessment: The need for continued evaluation and refinement. Surgeon 2011;9:S12-3.
5. Turney B. Anatomy in a modern medical curriculum. Ann R Coll Surg Engl 2007;89:104-7.
6. Lake FR, Hamdorf JM. Teaching on the run tips 6: Determining competence. Med J Aust 2004;181:502-3.
7. Postgraduate Medical Education and Training Board (PMETB). Developing and maintaining an assessment system - a PMETB guide to good practice. London: PMETB; 2007. Available from: gmc-uk.org/Assessment_good_practice_v0207.pdf_31385949.pdf [Last accessed on 2014 Jan 22].
8. Nair BR, Hensley MJ, Parvathy MS, Lloyd DM, Murphy B, Ingham K, et al. A systematic approach to workplace-based assessment for international medical graduates. Med J Aust 2012;196:399-402.
9. Norcini JJ. ABC of learning and teaching in medicine: Work based assessment. BMJ 2003;326:753-5.
10. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: A method for assessing clinical skills. Ann Intern Med 2003;138:476-81.
11. Williamson JM, Osborne AJ. Critical analysis of case based discussion. Br J Med Pract 2012;5:514-7.
12. Cobb KA, Brown G, Jaarsma DA, Hammond RA. The educational impact of assessment: A comparison of DOPS and MCQs. Med Teach 2013;35:e1598-607.
13. Naeem N. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS). J Coll Physicians Surg Pak 2013;23:77-82.
14. Archer J. Feedback: It's all in the CHAT. Med Educ 2013;47:1059-61.
15. Liao KC, Pu SJ, Lui MS, Yang CW, Kuo HP. Development and implementation of a mini-clinical evaluation exercise (mini-CEX) program to assess the clinical competencies of internal medicine residents: From faculty development to curriculum evaluation. BMC Med Educ 2013;13:31.
16. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA 2009;302:1316-26.
17. Nair B, Searles A, Ling R, Wein J, Ingham K. Workplace-based assessment for international medical graduates: At what cost? Med J Aust 2014;200:41-4.
    

