Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/86547
Type: Journal article
Title: Evaluating guidelines for reporting empirical software engineering studies
Author: Kitchenham, B.
Al-Khilidar, H.
Babar, M.
Berry, M.
Cox, K.
Keung, J.
Kurniawati, F.
Staples, M.
Zhang, H.
Zhu, L.
Citation: Empirical Software Engineering: An International Journal, 2008; 13(1):97-121
Publisher: Springer US
Issue Date: 2008
ISSN: 1382-3256 (print)
1573-7616 (online)
Statement of Responsibility: Barbara Kitchenham, Hiyam Al-Khilidar, Muhammed Ali Babar, Mike Berry, Karl Cox, Jacky Keung, Felicia Kurniawati, Mark Staples, He Zhang, Liming Zhu
Abstract:
Background: Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. In order to address this problem, Jedlitschka and Pfahl have produced reporting guidelines for controlled experiments in software engineering. They pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted.
Aim: The aim of this paper is to present the method we used to evaluate the guidelines and to report the results of our evaluation exercise. We suggest our evaluation process may be of more general use if reporting guidelines for other types of empirical study are developed.
Method: We used a reading method inspired by perspective-based and checklist-based reviews to perform a theoretical evaluation of the guidelines. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer, and Author. Apart from the Author perspective, the reviews were based on a set of questions derived by brainstorming. A separate review was performed for each perspective. The review using the Author perspective considered each section of the guidelines sequentially.
Results: The reviews detected 44 issues where the guidelines would benefit from amendment or clarification, and 8 defects.
Conclusions: Reporting guidelines need to specify what information goes into what section and avoid excessive duplication. The current guidelines need to be revised and then subjected to further theoretical and empirical validation. Perspective-based checklists are a useful validation method, but the practitioner/consultant perspective presents difficulties.
Keywords: Controlled experiments
Software engineering
Guidelines
Perspective-based reading
Checklist-based reviews
Rights: © Springer Science + Business Media, LLC 2007
DOI: 10.1007/s10664-007-9053-5
Published version: http://dx.doi.org/10.1007/s10664-007-9053-5
Appears in Collections: Aurora harvest 2
Computer Science publications

Files in This Item:
There are no files associated with this item.