Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/89541
Type: Journal article
Title: How was the intern year?: self and clinical assessment of four cohorts, from two medical curricula
Author: Laven, G.
Keefe, D.
Duggan, P.
Tonkin, A.
Citation: BMC Medical Education, 2014; 14(1):123-1-123-10
Publisher: BioMed Central
Issue Date: 2014
ISSN: 1472-6920
Statement of Responsibility: Gillian Laven, Dorothy Keefe, Paul Duggan, and Anne Tonkin
Abstract:
BACKGROUND: Problem-based curricula have provoked controversy amongst educators and students regarding outcomes in medical graduates, supporting the need for longitudinal evaluation of curriculum change. As part of a longitudinal evaluation program at the University of Adelaide, a mixed-method approach was used to compare the graduate outcomes of two curriculum cohorts: traditional lecture-based (‘old’) and problem-based (‘new’) learning.
METHODS: Graduates were asked to self-assess their preparedness for hospital practice and to consent to a comparative analysis of the work-place based assessments from their intern year. Comparative data were extracted from 692 work-place based assessments for 124 doctors who graduated from the University of Adelaide Medical School between 2003 and 2006.
RESULTS: Self-assessment: overall, graduates of the lecture-based curriculum rated the medical program significantly higher than graduates of the problem-based curriculum. However, there was no significant difference between the two curriculum cohorts in their preparedness across 13 clinical skills. There were, however, two of the 13 broad practitioner competencies in which the cohorts rated their preparedness significantly differently: problem-based graduates rated themselves better prepared in their ‘awareness of legal and ethical issues’, and lecture-based graduates rated themselves better prepared in their ‘understanding of disease processes’. Work-place based assessment: there were no significant differences between the two curriculum cohorts for ‘Appropriate Level of Competence’ or ‘Overall Appraisal’. Of the 14 work-place based assessment skills assessed for competence, no significant difference was found between the cohorts.
CONCLUSIONS: The differences in the perceived preparedness for hospital practice of the two curriculum cohorts are not reflected in the work-place based assessments of their competence as interns. No significant difference was found between the two cohorts in relation to their knowledge and clinical skills. However, results suggest a trend in ‘communication with peers and colleagues in other disciplines’ (χ² (3, N = 596) = 13.10, p = 0.056) that requires further exploration. In addition, we have learned that student confidence in a new curriculum may affect their self-perception of preparedness while not affecting their actual competence.
Keywords: Evaluation/assessment; Clinical performance; Curriculum development/evaluation; Problem-based; Intern/house officer training; Competence
Rights: © 2014 Laven et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.
DOI: 10.1186/1472-6920-14-123
Appears in Collections: Aurora harvest 7
Public Health publications

Files in This Item:
File: hdl_89541.pdf (Published version, 962.2 kB, Adobe PDF)

