A Requirements elicitation and automated assessment system

Date

2007

Authors

Scott, William

Type

thesis

Abstract

Requirements form the foundation of the systems engineering process as they document the basis of an acceptable technical solution to a customer need. Not surprisingly then, the ability of a delivered system to fulfil a stated need is strongly correlated with the quality of the system's requirements. However, a review of requirement management tools shows that while they excel at storing requirements and managing their configuration, they lack the ability to provide informed feedback regarding the content of the requirements. The research outlined in this thesis addresses this deficiency. The contribution of the thesis starts with a task that was central to the original Australian Research Council research proposal that funded this work: an examination of the potential of the developing systems engineering neutral format standard, Version 5.1 of AP-233, to support requirements engineering including requirements assessment. The work concluded that the data protocol is suitable for transfer of textual requirements between tools. However, it was considered no more suitable to act as a knowledge representation for artificial reasoning than the existing representations utilised by contemporary requirement management tools. Armed with the knowledge derived from the AP-233 study, the focus of the research project turned to determining the most appropriate artificial reasoning technique to assess the quality of requirements and hence assist in requirements engineering activities. After a review of artificial intelligence techniques, it was determined that requirements assessment would best be tackled by multiple assessment systems each focussing on a different assessment level, namely statement quality; document integrity; and precedence of requirements against industry norms. We describe this approach as a hybrid, tiered, requirements assessment.
The thesis then describes the considerable work needed to develop the underpinning concepts and corresponding proof-of-concept software in order to realise the assessment system. The first of these is the definition of a knowledge representation that can contain the information found within requirements. The second is the design of a formal grammar derived from industry practice that forms the heart of the tier one assessment systems and the interfacing routines. This grammar can be used to parse existing requirements, help generate well-written requirements, and extract meaning from requirements statements in order to enable the higher-level assessments. The third underpinning concept that required research to define and develop was a reasoning strategy based on comparing requirements held in the novel knowledge representation. The technique adopted was to compare the requirements against other requirements and assign one of a discrete set of outcomes: unrelated, equivalent, partially equivalent, contradictory, partially contradictory, or overlapping. Tier 2 Assessment, document integrity, is thus made possible by using this comparison technique together with rules for interpreting the outcomes. Similarly, Tier 3 Assessment can be achieved by comparing requirements from the set of interest against a set of benchmark requirements. The capstone of the thesis is the demonstration of these concepts through their implementation in software tools and their evaluation in real-world requirements engineering situations. The results are very encouraging both for the generation of new requirements and for the high success rate in parsing and evaluating requirements. The resulting software, which is currently the subject of commercialisation discussions with tool vendors, is seen to provide genuinely worthwhile requirements assessment capability.
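The pairwise comparison scheme described above can be sketched in outline. The following Python fragment is a minimal illustration only, not the thesis's method: the discrete outcome set is taken from the abstract, but the comparison heuristic (token overlap with naive negation detection) and all function names are assumptions standing in for the grammar-based reasoning over the thesis's knowledge representation.

```python
from enum import Enum
from itertools import combinations

class Outcome(Enum):
    """Discrete comparison outcomes named in the abstract."""
    UNRELATED = "unrelated"
    EQUIVALENT = "equivalent"
    PARTIALLY_EQUIVALENT = "partially equivalent"
    CONTRADICTORY = "contradictory"
    PARTIALLY_CONTRADICTORY = "partially contradictory"
    OVERLAPPING = "overlapping"

def compare(req_a: str, req_b: str) -> Outcome:
    """Toy stand-in for the comparison step: token overlap plus naive
    negation detection. The real system reasons over requirements
    parsed by a formal grammar into a knowledge representation."""
    a, b = set(req_a.lower().split()), set(req_b.lower().split())
    shared = a & b
    if not shared:
        return Outcome.UNRELATED
    if a == b:
        return Outcome.EQUIVALENT
    negations = {"not", "never"}
    opposed = (negations & a) != (negations & b)  # one side negated
    substantial = len(shared) > min(len(a), len(b)) // 2
    if opposed and substantial:
        return Outcome.CONTRADICTORY
    if substantial:
        return Outcome.PARTIALLY_EQUIVALENT
    return Outcome.OVERLAPPING

def tier2_integrity(requirements: list[str]) -> list[tuple[int, int, Outcome]]:
    """Tier 2 (document integrity): flag requirement pairs within one
    set whose outcome suggests redundancy or conflict."""
    flagged = []
    for (i, a), (j, b) in combinations(enumerate(requirements), 2):
        outcome = compare(a, b)
        if outcome is not Outcome.UNRELATED:
            flagged.append((i, j, outcome))
    return flagged
```

Tier 3 would follow the same pattern, comparing the set of interest against a benchmark set rather than against itself.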

School/Discipline

University of South Australia School of Electrical and Information Engineering
Systems Engineering and Evaluation Centre

Dissertation Note

PhD Doctorate

Provenance

Copyright William Scott 2007

Description

EN-AUS
