Context: Various research studies have examined whether expert or non-expert raters (faculty or students, evaluators or standardized patients) give more reliable and valid summative assessments of performance on Objective Structured Clinical Examinations (OSCEs). Less studied is whether non-faculty raters can provide formative feedback that allows students to take advantage of the educational opportunity that OSCEs provide. This question is becoming increasingly important, however, as the strain on faculty resources increases.

Methods: A questionnaire was developed to assess the quality of feedback that medical examiners provide during OSCEs. It was pilot tested for reliability using video recordings of OSCE performances. The questionnaires were then used to evaluate the feedback given during an actual OSCE in which clinical clerks, residents, and faculty served as examiners on two randomly selected test stations.

Results: The inter-rater reliability of the 19-item feedback questionnaire was 0.69 during the pilot test. Internal consistency was 0.90 during pilot testing and 0.95 in the real OSCE. Using this form, the feedback ratings assigned to clinical clerks were significantly greater than those assigned to faculty evaluators. Furthermore, performance on the same OSCE stations eight months later was not impaired by having been evaluated by student examiners.

Discussion: While evidence of mark inflation among the clinical clerk examiners should be addressed with examiner training, the current results suggest that clerks are capable of giving adequate formative feedback to more junior colleagues.
Advances in Health Sciences Education – Springer Journals
Published: Sep 21, 2004