Exploring examiner judgement of professional competence in…
Exercise physiology courses have transitioned to competency-based curricula, prompting universities to rethink assessment to ensure students are competent to practice. This study built on earlier research into rater cognition, capturing the factors that contribute to assessors' decisions about students' competency. The aims were to determine the sources of variation in the examination process and to document the factors that influence examiner judgement. Examiner judgement was explored from both quantitative and qualitative perspectives. Twenty-three examiners viewed three video encounters of student performance on an OSCE. Once the encounters were rated, analysis of variance was performed to determine how the variance was attributed, and a semi-structured interview drew out each examiner's reasoning behind their ratings. The results highlighted variability in the processes of observation, judgement and rating, with each examiner viewing student performance through a different lens. At a global level, however, analysis of variance indicated that the examiner had a minimal impact on the variance, the majority of which was explained by student performance on the task. One anomaly was the assessment of technical competency, where the examiner had a large impact on the rating, linked to assessing according to curriculum content. The thought processes behind judgements were diverse; had the qualitative results been used in isolation, they might have led the researchers to conclude that the examined performances would yield widely different ratings. As a cohort, however, the examiners were able to distinguish good from poor levels of competency, with most of the variation in ratings linked to the varying ability of the students.
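The variance-attribution step the abstract describes can be illustrated with a minimal sketch. The ratings below are invented for illustration (the study's actual data are not reproduced here), and the simple two-way sum-of-squares decomposition stands in for whatever ANOVA model the authors used; it shows how variance in ratings can be partitioned between student performance, examiner, and residual.

```python
# Hypothetical sketch: partition rating variance into student, examiner,
# and residual components for a fully crossed design (every examiner rates
# every student, one rating per cell). All numbers are invented.
from statistics import mean

# rows = students (the three videoed encounters), columns = examiners
ratings = [
    [8.0, 7.5, 8.5, 8.0],   # student A (strong performance)
    [5.0, 5.5, 4.5, 5.0],   # student B (middling performance)
    [2.5, 3.0, 2.0, 2.5],   # student C (weak performance)
]

n_students = len(ratings)
n_examiners = len(ratings[0])
grand = mean(r for row in ratings for r in row)

student_means = [mean(row) for row in ratings]
examiner_means = [mean(row[j] for row in ratings) for j in range(n_examiners)]

# Sums of squares for a two-way layout with one observation per cell
ss_student = n_examiners * sum((m - grand) ** 2 for m in student_means)
ss_examiner = n_students * sum((m - grand) ** 2 for m in examiner_means)
ss_total = sum((r - grand) ** 2 for row in ratings for r in row)
ss_residual = ss_total - ss_student - ss_examiner

print(f"student:  {ss_student / ss_total:.1%} of variance")
print(f"examiner: {ss_examiner / ss_total:.1%} of variance")
print(f"residual: {ss_residual / ss_total:.1%} of variance")
```

With data shaped like this, nearly all of the variance falls to the student component and very little to the examiner component, mirroring the abstract's global-level finding that student performance on task, not the examiner, explained most of the variance.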
Advances in Health Sciences Education – Springer Journals
Published: Jan 21, 2016