Maturity Model Architect - A Tool for Maturity Assessment Support
Abstract—A Maturity Model represents a path towards an increasingly organized and systematic way of doing business. It is therefore a widely used technique for assessing certain aspects of organizations, such as business processes. A maturity assessment can enable stakeholders to clearly identify strengths and improvement points, and to prioritize actions in order to reach higher maturity levels. Maturity assessments can range from simple self-assessment questionnaires to full-blown assessment methods. This work presents the Maturity Model Architect (MMArch), a maturity model repository and assessment tool whose purpose is to support the execution of maturity assessments using enterprise architecture models, ontologies and description logics. To this end, this work details three possible methods for governing the application of these techniques. The main goal of this work is to develop a tool that automates and supports maturity assessment.
Index Terms—Maturity Model, Maturity Assessment, Business Process Management, Enterprise Architecture, Ontology,
Description Logics, OWL.
1 INTRODUCTION
Fig. 1. Maturity models’ assessment methods (SEI SCAMPI [17] and ISO/IEC 15504 [18]).
any further action. It is not an end, because it is a mobile and dynamic goal [7]. It is rather a state in which, under certain conditions, it can be accepted and a decision can be taken not to continue any further action. Several authors have defined maturity; however, many of the current definitions fit the context in which each particular maturity model was developed.

In [8], maturity is defined in terms of a specific process to explicitly define, manage, measure and control the evolutionary growth of an entity. In [9] it is suggested that maturity is associated with an evaluation criterion or the state of being complete, perfect and ready, and in [10] maturity is described as a concept which progresses from an initial state to a final (more advanced) state, that is, to higher levels of maturity.

Models in the domain of quality [12] usually define a series of sequential levels, which together form an anticipated or desired logical path from an initial state to a final state of maturity [11]. A maturity model can also be a tool to evaluate the maturity capabilities of certain elements and to select the appropriate actions to bring those elements to a higher level of maturity, as described in [13].

In some definitions, such as the one presented in [14], there is a concern with associating a maturity model with the concept of continuous improvement. According to [15], maturity models are particularly important for identifying strengths and weaknesses of the organizational context to which they are applied, and for collecting information through methodologies associated with benchmarking. For the purpose of this paper, a maturity level is the "degree of process improvement across a predefined set of process areas in which all goals in the set are attained." [16]
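Purely as an illustration of this definition (a sketch of ours, not part of MMArch), the snippet below encodes the rule that a maturity level is attained only when every goal in every process area of the predefined set is attained; the ProcessArea and Goal types are hypothetical.

```java
import java.util.List;

// Hypothetical types, introduced only for this illustration.
record Goal(String id, boolean attained) {}
record ProcessArea(String id, List<Goal> goals) {}

class MaturityLevelRule {
    // A maturity level is attained only if all goals in all process areas
    // of the predefined set are attained (cf. the definition quoted from [16]).
    static boolean levelAttained(List<ProcessArea> predefinedSet) {
        return predefinedSet.stream()
                .allMatch(pa -> pa.goals().stream().allMatch(Goal::attained));
    }
}
```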
2.2 Maturity Assessment
A maturity assessment is "an examination of one or more processes by a trained team of professionals using an appraisal model." [16]

Current maturity assessment methods focus on highly complex and specialized tasks performed by competent assessors in an organizational context. These tasks mainly focus on manually collecting evidence to substantiate the maturity level calculation. Because of the complexity of these methods, maturity assessment becomes an expensive and burdensome activity for organizations.

These methods usually start by creating an assessment plan, which describes how to conduct the assessment, as well as the schedule, the people involved, the necessary documents and how to collect evidence. Then a group of assessors, called the assessment team, follows the assessment plan: they collect all the necessary evidence, calculate the maturity levels and assemble the assessment report, which details the findings and maturity levels of the assessment. Then, based on the assessment results, the organization can plan for improvement by following an improvement plan.

The Software Engineering Institute (SEI) created the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) [17], which details the method to assess the processes described in the three constellations of CMMI. This method is composed of three main processes: (1) Plan and prepare for assessment, (2) Conduct appraisal, and (3) Report results. These are depicted in the top part of Fig. 1.

ISO/IEC 15504 also describes a method to guide the assessment of organizational processes, which is depicted in Fig. 1. The ISO/IEC 15504 assessment method is composed of seven main steps: (1) Initiation, (2) Planning, (3) Briefing, (4) Data Collection, (5) Data Validation, (6) Process Attribute Rating, and (7) Assessment Reporting. These steps are then further detailed in atomic tasks [18].

As can be seen in Fig. 1, there is a correlation between the steps of both assessment methods, as they have a common background behind their development [2]. These can be grouped into three main steps in these maturity assessment methods: (1) the assessment planning, (2) the assessment execution, and (3) the assessment reporting.

Regarding these two assessment methods, MMArch can be useful while conducting the appraisal (in SCAMPI) and while performing the data validation and process attribute rating (in ISO/IEC 15504). In the data validation step, users can benefit from MMArch to validate whether a certain architecture model developed during data collection is sound and complete enough to determine the maturity levels. Finally, in the process attribute rating step, users can benefit from MMArch as a way to automate the determination of the maturity levels and also as a way to substantiate the maturity level determination.

One major distinction between these two methods is the terminology used: SCAMPI uses the term appraisal when talking about assessment, while ISO/IEC 15504 uses the term assessment. Besides the difference in the term used, both terms have the same meaning in the two assessment methods. Because of this, certain documents such as the assessment record in ISO/IEC 15504 are called the appraisal record in SCAMPI, and the same happens with the assessment team and the appraisal team.

SPICE assessments. Further on, it was extended to also be able to assess +SAFE, CMMI SE 1.1, and Automotive SPICE. Similarly to SPICE 1-2-1, this tool was also developed to run on Microsoft Windows and has a comprehensive interface, depicted in Fig. 3.
the Griffith University. Moreover, these tools do not make use of ontologies, description logics or reasoning engines as a means to support the automation of a maturity model, which is one of the goals of our work.

However, in order for these semantic technology techniques to be applicable to existing maturity models and maturity assessment methods, there must be a set of application methods that guide their use, which will be detailed further in Section 3.

2.4 Ontologies and Description Logics
The term ontology originates from the Greek language, being a combination of "ontos" (being) and "logos" (word) [23]. From the perspective of philosophy, ontology is the "systematic explanation of existence" [24]. In the computer science domain, there are several definitions for the term. One of the most widely used definitions is in [4], building upon earlier definitions provided in [25] and [26]. This definition describes ontologies as a "formal, explicit specification of a shared conceptualization" [4]. According to [5], "conceptualization" refers to an "abstract, simplified view of the world", containing "the objects, concepts, and other entities that are assumed to exist in some area of interest and the relationships that hold among them" [6]. "Explicit" refers to the explicit definition of the "type of concepts used, and the constraints on their use" [4]. "Formal" refers to the fact that the conceptualization "should be machine readable" [4]. "Shared" reflects that the ontology "captures consensual knowledge" shared between several parties [4].

DL is "a family of knowledge representation formalisms that represent the knowledge of an application domain (the 'world') by first defining the relevant concepts of the domain (its terminology), and then using these concepts to specify properties of objects and individuals occurring in the domain (the world description)" [27], and can be seen as a "decidable fragment of first-order logic" [28]. Using this technique, the description of a domain consists of concepts, roles and individuals. Logical statements named axioms make it possible to declare relations between roles and concepts. There are several types of DL, which differ in their expressivity. The basic DL language is 𝒜ℒ, which stands for attributive language. 𝒜ℒ is a minimal language, and the remaining DLs can be seen as a family of languages that extend 𝒜ℒ. One example is 𝒜ℒ𝒞, which stands for attributive language with complements. 𝒜ℒ𝒞 is the most widely used DL in reasoners and is obtained by adding a negation (complement) operator (¬) to 𝒜ℒ.
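To make these constructs concrete, the sketch below (ours, for illustration only; the class and property names are invented and do not come from the paper) uses the OWL API [30] to state an 𝒜ℒ𝒞-style definition and a class expression involving its complement. A reasoner such as HermiT [35] could then classify individuals against such expressions.

```java
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

public class DlAxiomSketch {
    public static void main(String[] args) throws OWLOntologyCreationException {
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = manager.getOWLDataFactory();
        String ns = "http://example.org/mm#"; // hypothetical namespace
        OWLOntology ontology = manager.createOntology(IRI.create(ns));

        OWLClass process = df.getOWLClass(IRI.create(ns + "Process"));
        OWLClass document = df.getOWLClass(IRI.create(ns + "Document"));
        OWLClass documented = df.getOWLClass(IRI.create(ns + "DocumentedProcess"));
        OWLObjectProperty hasEvidence = df.getOWLObjectProperty(IRI.create(ns + "hasEvidence"));

        // DocumentedProcess is defined as: Process and (hasEvidence some Document),
        // an ALC-style concept built from intersection and existential restriction.
        OWLClassExpression definition = df.getOWLObjectIntersectionOf(
                process, df.getOWLObjectSomeValuesFrom(hasEvidence, document));
        manager.addAxiom(ontology, df.getOWLEquivalentClassesAxiom(documented, definition));

        // The complement operator (the "C" in ALC): processes that are not documented.
        OWLClassExpression undocumented = df.getOWLObjectIntersectionOf(
                process, df.getOWLObjectComplementOf(documented));
        System.out.println(undocumented);
    }
}
```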
3 APPLICATION METHODS
This section describes three possible methods for governing the application of the techniques detailed in Section 2 for the purpose of automating maturity assessment. MMArch provides support for executing these methods. From a maturity model developer viewpoint, these methods have the purpose of translating existing maturity assessment questionnaires into an ontology and then translating the assessment questions into DL queries, whose answers will be provided by executing reasoning engines over the ontology. From a maturity model user viewpoint, these methods enable them to instantiate a specific maturity model ontology and collect the assessment results for a given maturity model.

There are three possible methods described in this section:
• Translate an assessment questionnaire into an architecture model template and develop the DL queries to assess that architecture model;
• Instantiate an architecture model template, execute the DL queries and collect the assessment results; and
• Instantiate an ontology of an existing maturity model, execute a reasoner and gather the assessment results.

The roles associated with the activities of these methods are the following:
• Maturity model developer is responsible for developing the maturity model and creating the assessment questionnaire that will be used by the architect to develop a template architecture model.
• Architect is responsible for formalizing the assessment questionnaire into a template architecture model, for making sure that the template faithfully represents the assessment questionnaire, and for verifying that the ontology converted from the architecture models is complete and correct.
• Ontology engineer is responsible for converting the architecture models into an ontology and for translating the assessment questions into DL queries to be executed over the ontology.
• Assessor is responsible for performing a maturity assessment, instantiating the architecture model template, executing the DL queries over the instantiated architecture models, executing reasoners over populated ontologies of specific maturity models, and analyzing and collecting the assessment results.

The next subsections detail each of these three methods, describing the steps, the roles, the artefacts used and the techniques applied. The ArchiMate notation was used to depict these methods [29].

3.1 Architecture Model Template and DL Queries Development
The goal of this method is to develop the architecture model template and the DL queries for a specific maturity model, in order to be used when assessing real organizational scenarios. This method can be used either when developing a new maturity model or with an existing maturity model. It starts with the identification of the assessment questions by the maturity model developer. These questions are then used by an architect to develop the architecture model template, which in turn is converted into an ontology by an ontology engineer. Finally, the DL queries to assess a given scenario according to the assessment questions and the architecture model template are developed. Fig. 5 depicts an overview of this method using the ArchiMate notation.
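The paper does not describe the internal mapping rules of the converter, so the following is only one possible sketch, assuming a simple mapping in which every ArchiMate element becomes an OWL individual typed by a class named after its element type, and every relationship becomes an object property assertion. It uses the OWL API [30]; the Element and Relation records and the namespace are hypothetical.

```java
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

// Hypothetical converter sketch: ArchiMate elements become typed OWL individuals,
// ArchiMate relationships become object property assertions between them.
public class ArchiToOwlSketch {
    record Element(String id, String type) {}                       // e.g. ("RiskManagement", "BusinessProcess")
    record Relation(String source, String property, String target) {}

    public static OWLOntology convert(Iterable<Element> elements, Iterable<Relation> relations)
            throws OWLOntologyCreationException {
        OWLOntologyManager m = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = m.getOWLDataFactory();
        String ns = "http://example.org/assessment#";                // hypothetical namespace
        OWLOntology onto = m.createOntology(IRI.create(ns));

        for (Element e : elements) {
            OWLNamedIndividual ind = df.getOWLNamedIndividual(IRI.create(ns + e.id()));
            OWLClass type = df.getOWLClass(IRI.create(ns + e.type()));
            m.addAxiom(onto, df.getOWLClassAssertionAxiom(type, ind));
        }
        for (Relation r : relations) {
            OWLObjectProperty p = df.getOWLObjectProperty(IRI.create(ns + r.property()));
            m.addAxiom(onto, df.getOWLObjectPropertyAssertionAxiom(p,
                    df.getOWLNamedIndividual(IRI.create(ns + r.source())),
                    df.getOWLNamedIndividual(IRI.create(ns + r.target()))));
        }
        return onto;
    }
}
```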
This step consists of converting the instantiated architecture model into an ontology: the instantiated architecture model developed by the assessor is converted into an ontology, either by using a converter or by manually creating the ontology in an editor. The assessor must confirm that the final ontology faithfully captures the instantiated architecture models.

This method ends with the execution of the DL queries over the ontology representation of the instantiated architecture models. This is followed by the analysis of the results of these queries to determine the assessment results, in the form of one or more maturity levels, according to the specification of the maturity model. The role associated with this method fragment is the assessor. It consists of performing analysis using DL queries: in this step, the DL queries developed to analyze the architecture models, with the purpose of assessing the assessment questions of a specific maturity model, are executed by the assessor over the ontology representation of the instantiated architecture models. The assessor then uses the analysis results to determine the assessment results, in the form of one or more maturity levels, according to the specification of the maturity model.
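A minimal sketch of this query-execution step is given below, assuming that the instantiated architecture model is already available as an OWL ontology and that each assessment question has been translated into a DL class expression. It uses the HermiT reasoner [35] through the OWL API [30]; the criterionMet helper, and the reading of a non-empty answer set as a satisfied assessment question, are our assumptions rather than the actual MMArch implementation.

```java
import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.*;

// Illustrative sketch of executing one DL query over the ontology representation
// of an instantiated architecture model.
public class DlQueryExecutionSketch {
    public static boolean criterionMet(OWLOntology instanceOntology, OWLClassExpression query) {
        OWLReasoner reasoner = new ReasonerFactory().createReasoner(instanceOntology);
        // Individuals classified under the query expression are taken here as the
        // evidence that the corresponding assessment question is satisfied.
        NodeSet<OWLNamedIndividual> answers = reasoner.getInstances(query, false);
        reasoner.dispose();
        return !answers.isEmpty();
    }
}
```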
3.3 Assessment using a Maturity Model Ontology
The goal of this method is to support the assessment of specific organizational scenarios using an ontology specific to a maturity model. This ontology formalizes the maturity model and captures all the relevant maturity model components for performing an assessment, as well as the rules used to determine the maturity or capability levels. This method starts with the assessor creating the individuals in the ontology, which is then followed by the execution of a reasoner, and is finalized with the assessor gathering the assessment results from the inferred instantiated maturity model ontology. Fig. 7 depicts an overview of the method.

Fig. 7. Assessment using a maturity model ontology.

This method starts with the assessor creating the individuals in the maturity model ontology. These individuals represent the goals, practices, work products and resources specified by a maturity model and satisfied by an organization being assessed. The role associated with this method fragment is the assessor. It consists of creating individuals in an ontology: in this step, the assessor creates individuals in a maturity model ontology. This step represents the instantiation of an organizational scenario in the ontology. These individuals represent the goals, practices, work products and resources specified by a maturity model and satisfied by an organization being assessed.

The next step is the execution of a reasoner over the instantiated maturity model ontology. Finally, the reasoner creates the inferred instantiated maturity model ontology, which will be used to gather the assessment results.

This method ends with the assessor gathering the assessment results from the inferred instantiated maturity model ontology. These can be one or more maturity or capability levels, depending on the rules defined by the maturity model. The role associated with this method fragment is the assessor. It consists of gathering assessment results: in this step, the assessor uses the inferred instantiated maturity model ontology to gather the assessment results in the scope of the assessment. These can be one or more maturity or capability levels, depending on the rules defined by the maturity model.
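The sketch below illustrates, under our own assumptions, what this third method can look like with the OWL API [30] and HermiT [35]: a pre-existing maturity model ontology is populated with individuals describing the assessed organization, a reasoner is executed, and the inferred types of the assessed element are read off; in formalizations such as [36], [37] these inferred types include the capability or maturity level classes. The file name, namespace, individual and property names are all hypothetical.

```java
import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.OWLReasoner;

import java.io.File;

// Illustrative sketch only: populate a maturity model ontology with individuals,
// execute a reasoner, and read the inferred types (e.g. attained levels).
public class MaturityOntologyAssessmentSketch {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager m = OWLManager.createOWLOntologyManager();
        // Hypothetical file; in MMArch the ontology would come from the repository.
        OWLOntology onto = m.loadOntologyFromOntologyDocument(new File("maturity-model.owl"));
        OWLDataFactory df = m.getOWLDataFactory();
        String ns = "http://example.org/mm#"; // hypothetical namespace

        // One assessed process and one assessment criterion it satisfies.
        OWLNamedIndividual proc = df.getOWLNamedIndividual(IRI.create(ns + "RiskManagementProcess"));
        OWLNamedIndividual crit = df.getOWLNamedIndividual(IRI.create(ns + "Criterion_2_1"));
        OWLObjectProperty satisfies = df.getOWLObjectProperty(IRI.create(ns + "satisfies"));
        m.addAxiom(onto, df.getOWLObjectPropertyAssertionAxiom(satisfies, proc, crit));

        // The reasoner classifies the individual; with rules such as those in [36], [37]
        // the inferred types would include capability/maturity level classes.
        OWLReasoner reasoner = new ReasonerFactory().createReasoner(onto);
        reasoner.getTypes(proc, false).forEach(System.out::println);
        reasoner.dispose();
    }
}
```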
3.4 Summary
This section presented three application methods whose purpose is to guide the use of MMArch in the existing maturity assessment methods already presented in Section 2. To provide an overview of the inputs, activities, outputs and roles associated with each method, we provide a synthesized view in Table 1.

TABLE 1
SYNTHESIS OF THE APPLICATION METHODS

Method | Process Steps | Roles | Input | Output | Applications
Architecture Model Template and DL Queries Development | Identify assessment questions | Maturity model developer | None | Assessment questions | Specify Assessment Questions
Architecture Model Template and DL Queries Development | Develop architecture model template | Architect | Assessment questions | Architecture model template | Develop Architecture Model Template
Architecture Model Template and DL Queries Development | Convert architecture model template into an ontology | Architect, Ontology engineer | Architecture model template | Ontology representation of the architecture model template | Create Ontology Representation
Architecture Model Template and DL Queries Development | Develop DL queries for architecture model template | Ontology engineer | Ontology representation of the architecture model template | DL queries | Create Queries
Maturity Assessment using an Architecture Model Template | Instantiate architecture model template | Assessor | Architecture model template | Instantiated architecture model | Create Architecture Model Instance
Maturity Assessment using an Architecture Model Template | Convert instantiated architecture model into an ontology | Assessor | Instantiated architecture model | Ontology representation of the instantiated architecture model | Create Ontology Representation
Maturity Assessment using an Architecture Model Template | Perform analysis using DL queries | Assessor | Ontology representation of the instantiated architecture model; DL queries | Analysis results | Execute Analysis Queries
Maturity Assessment using a Maturity Model Ontology | Create individuals in ontology | Assessor | Maturity model ontology | Instantiated maturity model ontology | Manage Ontology
Maturity Assessment using a Maturity Model Ontology | Execute reasoner | Assessor | Instantiated maturity model ontology | Inferred instantiated maturity model ontology | Execute Ontology Reasoner
Maturity Assessment using a Maturity Model Ontology | Gather assessment results | Assessor | Inferred instantiated maturity model ontology | Assessment results | Manage Ontology
4 MMARCH: MATURITY MODEL ARCHITECT
This section illustrates a web-oriented application developed using the Microsoft .NET framework, Microsoft ASP.NET and Internet Information Services, among other technologies.
Fig. 12. MMArch ontology used for the OWL export functionality (classes: Maturity_Level, Capability, Assessment_Criterion; object properties: satisfiedBy, achievedBy).
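As a rough illustration of such an export (not the actual MMArch output, and with the direction of the satisfiedBy property assumed only for this example), the snippet below asserts a single fact using the Fig. 12 vocabulary and saves the ontology with the OWL API [30].

```java
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

import java.io.FileOutputStream;

// Minimal export sketch: assert one illustrative fact with the Fig. 12 vocabulary
// and serialize the ontology to a file.
public class OwlExportSketch {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager m = OWLManager.createOWLOntologyManager();
        OWLDataFactory df = m.getOWLDataFactory();
        String ns = "http://example.org/mmarch-export#"; // hypothetical namespace
        OWLOntology onto = m.createOntology(IRI.create(ns));

        OWLNamedIndividual level2 = df.getOWLNamedIndividual(IRI.create(ns + "Maturity_Level_2"));
        OWLNamedIndividual capability = df.getOWLNamedIndividual(IRI.create(ns + "Capability_RiskManagement"));
        OWLObjectProperty satisfiedBy = df.getOWLObjectProperty(IRI.create(ns + "satisfiedBy"));
        m.addAxiom(onto, df.getOWLObjectPropertyAssertionAxiom(satisfiedBy, level2, capability));

        try (FileOutputStream out = new FileOutputStream("assessment-export.owl")) {
            m.saveOntology(onto, out); // default serialization format
        }
    }
}
```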
5 CONCLUSIONS
This paper presented MMArch, a maturity model repository and assessment tool that makes use of a new approach to maturity assessment using enterprise architecture model analysis, ontologies and DL. For that purpose, we present an analysis of the related work in maturity models, ontologies and DL reasoning, concluding that existing tools do not make use of these techniques to support the automation of maturity assessment.

Fig. 13. MMArch example maturity assessment export to OWL.

MMArch allows maturity model developers to upload their maturity models, as well as the assessment criteria, expressed in DL queries, to verify the compliance of an organizational scenario against the maturity assessment criteria. Users can then log into MMArch, select the maturity model against which they wish to assess their organization, and provide the enterprise architecture models deemed necessary by the maturity model developer to get a report containing the organization's current maturity level(s); that report can then be used as an input for an improvement plan.

ACKNOWLEDGMENT
This work was supported by national funds through Fundação para a Ciência e a Tecnologia (FCT) with reference UID/CEC/50021/2013.
REFERENCES
[1] R. L. Nolan, "Managing the Computer Resource: A Stage Hypothesis," Communications of the ACM, vol. 16, pp. 399-405, 1973.
[2] D. M. Ahern, A. Clouse, R. Turner, CMMI Distilled: A Practical Introduction to Integrated Process Improvement, Third Edition, Addison Wesley Professional, 2008.
[3] ISO/IEC 15504:2004, Information technology - Process assessment, International Organization for Standardization and International Electrotechnical Commission Std., 2004.
[4] R. Studer, R. Benjamins, and D. Fensel, "Knowledge engineering: Principles and methods," Data & Knowledge Engineering, vol. 25, pp. 161-198, 1998.
[5] N. Guarino, D. Oberle, and S. Staab, "What Is an Ontology?," in Handbook on Ontologies, pp. 1-17, Springer Berlin Heidelberg, 2009.
[6] M. R. Genesereth and N. J. Nilsson, Logical Foundations of Artificial Intelligence, Morgan Kaufmann, Los Altos, CA, 1987.
[7] A. Tonini, M. Carvalho, M. Spínola, "Contribuição dos modelos de qualidade e maturidade na melhoria dos processos de software," Produção, vol. 18, no. 2, pp. 275-286, 2008.
[8] M. Paulk, B. Curtis, M. Chrissis, C. Weber, "Capability Maturity Model for Software, Version 1.1," CMU/SEI-93-TR-24, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA, 1993.
[9] R. Fitterer, P. Rohner, "Towards assessing the networkability of health care providers: a maturity model approach," Information Systems and E-Business Management, vol. 8, pp. 309-333, 2010.
[10] A. Sen, K. Ramammurthy, A. Sinha, "A model of data warehousing process maturity," IEEE Transactions on Software Engineering, 2011.
[11] M. Röglinger, J. Pöppelbuß, "What makes a useful maturity model? A framework for general design principles for maturity models and its demonstration in business process management," in Proceedings of the 19th European Conference on Information Systems, Helsinki, Finland, June 2011.
[12] N. Brookes, R. Clark, "Using maturity models to improve project management practice," in Proceedings of the POMS 20th Annual Conference, Florida, USA, 1-4 May 2009.
[13] M. Kohlegger, R. Maier, S. Thalmann, "Understanding maturity models: Results of a structured content analysis," in Proceedings of I-KNOW '09 and I-SEMANTICS '09, Graz, Austria, 2-4 September 2009.
[14] G. Jia, Y. Chen, X. Xue, J. Chen, J. Cao, K. Tang, "Program management organization maturity integrated model for mega construction programs in China," International Journal of Project Management, vol. 29, pp. 834-845, 2011.
[15] M. Koshgoftar, O. Osman, "Comparison between maturity models," in Proceedings of the 2nd IEEE International Conference on Computer Science and Information Technology, vol. 5, pp. 297-301, 2009.
[16] CMMI Product Team, "CMMI for Development, Version 1.3," Software Engineering Institute - Carnegie Mellon University, Tech. Rep. CMU/SEI-2010-TR-033, 2010.
[17] SCAMPI Upgrade Team, Standard CMMI Appraisal Method for Process Improvement (SCAMPI), Version 1.3: Method Definition Document, Software Engineering Institute - Carnegie Mellon University, Handbook CMU/SEI-2011-HB-001, 2011.
[18] ISO/IEC 15504-3:2004, Information technology - Process assessment - Part 3: Guidance on performing an assessment, International Organization for Standardization and International Electrotechnical Commission Std., 2004.
[19] HM&S IT-Consulting, "SPICE 1-2-1," 2012. [Online]. Available: http://www.spice121.com/cms/en/
[20] Griffith University, "Appraisal Assistant," 2007. [Online]. Available: https://www.sqi.griffith.edu.au/AppraisalAssistant/about.html
[21] AXELOS, "ITIL Maturity Model," 2015. [Online]. Available: https://www.axelos.com/best-practice-solutions/itil/itil-maturity-model
[22] E-ARK Project, "Information Governance Maturity Model Assessment Questionnaire," 2017. [Online]. Available: http://earkmaturitysurvey.dlmforum.eu/
[23] K. Breitman, M. A. Casanova, and W. Truszkowski, Semantic Web: Concepts, Technologies and Applications, Springer, 2007.
[24] A. Gomez-Perez and R. Benjamins, "Overview of knowledge sharing and reuse components: Ontologies and problem-solving methods," in Proceedings of the IJCAI-99 Workshop on Ontologies and Problem Solving Methods (KRR5), Stockholm, Sweden, 1999.
[25] T. R. Gruber, "A translation approach to portable ontology specifications," Knowledge Acquisition, vol. 5, pp. 199-220, 1993.
[26] W. N. Borst, Construction of Engineering Ontologies, PhD thesis, University of Twente, Enschede, 1997.
[27] F. Baader, D. Calvanese, D. McGuinness, D. Nardi, P. Patel-Schneider, The Description Logic Handbook: Theory, Implementation, and Applications, 1st ed., Cambridge University Press, New York, 2003.
[28] R. Vaculin, "Process Mediation Framework for Semantic Web Services," PhD thesis, Department of Theoretical Computer Science and Mathematical Logic, Faculty of Mathematics and Physics, Charles University, 2009.
[29] The Open Group, "ArchiMate 3.0.1 Specification," 2017. [Online]. Available: http://pubs.opengroup.org/architecture/archimate3-doc/
[30] M. Horridge, S. Bechhofer, "The OWL API: A Java API for Working with OWL 2 Ontologies," in Proceedings of the 5th International Workshop on OWL: Experiences and Directions (OWLED 2009), Chantilly, VA, United States, 2009.
[31] D. Proença, R. Vieira, J. Borbinha, "Information Governance Maturity Model - Final Development Iteration," in Proceedings of the 21st International Conference on Theory and Practice of Digital Libraries (TPDL 2017), Thessaloniki, Greece, 2017.
[32] ISO 14721:2010, Space data and information transfer systems - Open archival information system - Reference model, International Organization for Standardization, 2010.
[33] ISO 16363:2012, Space data and information transfer systems - Audit and certification of trustworthy digital repositories, International Organization for Standardization, 2012.
[34] ISO 20652:2006, Space data and information transfer systems - Producer-archive interface - Methodology abstract standard, International Organization for Standardization, 2006.
[35] B. Glimm, I. Horrocks, B. Motik, G. Stoilos, Z. Wang, "HermiT: An OWL 2 Reasoner," Journal of Automated Reasoning, vol. 53, pp. 245-269, 2014.
[36] D. Proença, J. Borbinha, "A Formalization of the ISO/IEC 15504: Enabling Automatic Inference of Capability Levels," in Proceedings of the 17th International Conference on Process Improvement and Capability Determination (SPICE 2017), Palma de Mallorca, Spain, 2017.
[37] D. Proença, J. Borbinha, "Formalizing the ISO/IEC 15504-5 and SEI CMMI v1.3: Enabling Automatic Inference of Maturity and Capability Levels," Computer Standards & Interfaces, 2018. (In Press)