Workplace-based assessment (WBA) is ideal, yet often neglected. WBAs let us see how learners apply their skills in a real environment, rather than in a mock-up or lab situation.
WBAs may be learner- or assessor-driven. Students are often asked to select a case on which they will be formally assessed, with a minimum number of assessments required; all the other cases they see are opportunities for feedback in a more informal manner.
Feedback
Feedback is still key. It should be focused on what the learner needs at this point in time and directed toward what will make future performances stronger. Feedback does not need to be private if it is phrased in a growth-mindset manner and is specific and related to what was seen. E.g. "You took a good history using predominantly open-ended questions. Nice job. Next time, make sure you leave time for the client to phrase their questions for you. You moved quickly, and the client may not have been able to ask everything they wanted to ask."
For many of these instruments, inter-rater variability tends to be high, though it may be improved with rater training. Unconscious bias has also been shown to be substantial.
Several WBAs may be combined to create a learner portfolio.
Types of assessment
Direct observation of procedural skills (DOPS)
Using guidelines, checklists, and/or global rating scores, assessors evaluate learners either live or via video capture. Guidelines and pass scores are usually set by expert consensus opinion. Students are generally allowed to remediate, with a maximum number of attempts set.
For ultrasound procedures, Tolsgaard found that reviewing four videos of trainee ultrasound exams was sufficient for raters to reach consensus through discussion.
Real time assessments (RTAs)
RTAs are very similar to DOPS but include the entire case. A student may have a minimum number of RTAs to complete, and these may be divided into subject areas (surgery, medicine, anesthesia, etc.).
Clinical evaluation exercises – CEX/ mini CEX
Using a single patient encounter, the learner works through the case while being assessed. A written patient report is generally produced.
A mini CEX uses shorter episodes of direct observation. Both include immediate feedback.
In training evaluation reports (ITERs)
ITERs are generally performed at the end of a rotation and encompass all activities over the rotation. There is high variability between programs in how marks and grades are assigned.
Standardized cases/ Clinical simulation
Paper cases may also be used to assess students when real cases are not available. Clinical reasoning can be assessed via written descriptions (essay) or script concordance testing.
Alternatively, a simulated client interaction may be set up. This minimizes variability between patients and clients.
Client surveys
Clients evaluate student performance from their perspective.
Multisource feedback (360 surveys)
A questionnaire is sent to different members of the team, potentially including clients. It is similar in purpose to the ITER and DOPS, but the observation periods differ in duration.
Objective structured clinical exams (OSCEs)
OSCEs can be used to standardize assessment and improve repeatability. They are generally focused on skills and consist of multiple short stations.
Resources
A Guide to Assessment in Veterinary Medicine (2014) – a great place to start
MG Tolsgaard. Assessment and learning of ultrasound skills in Obstetrics & Gynecology. Dan Med J 2018;65(2):B5445
MJB Govaerts et al. Workplace-based assessment: raters’ performance theories and constructs. Adv in Health Sci Educ (2013) 18:375–396
J. M. W. Moonen-van Loon et al. Composite reliability of a workplace-based assessment toolbox for postgraduate medical education. Adv in Health Sci Educ (2013) 18:1087–1102
KM Magnier et al. Workplace-Based Assessment Instruments in the Health Sciences. JVME 39(4) 6 2012
C Liu. An introduction to workplace-based assessments. Gastroenterology and hepatology from bed to bench, 2012, Vol.5(1), pp.24-8