Maturity Models For Data and Information Management: September 2018
{diogo.proenca, jlb}@tecnico.ulisboa.pt
1 Introduction
A maturity model is a technique that has proved valuable for measuring different aspects of
a process or an organization. It represents a path towards an increasingly organized and
systematic way of doing business.
A maturity model consists of a number of “maturity levels”, often five, running from the
lowest to the highest: initial, managed, defined, quantitatively managed, and optimizing
(the number of levels can vary, depending on the domain and the concerns
motivating the model). This technique provides organizations with: (1) a measure for auditing
and benchmarking; (2) a means of assessing progress against objectives;
(3) an understanding of strengths, weaknesses, and opportunities (which can support
decision making concerning strategy and project portfolio management).
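Because the levels form an ordinal scale, they map naturally onto an ordered enumeration. The Python sketch below is only an illustrative encoding: the level names follow the CMMI, but the numeric values and the gap computation are our own convention.

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """The five CMMI-style maturity levels, ordered lowest to highest."""
    INITIAL = 1
    MANAGED = 2
    DEFINED = 3
    QUANTITATIVELY_MANAGED = 4
    OPTIMIZING = 5

# Since the levels are ordinal, assessment results can be compared directly:
current, target = MaturityLevel.MANAGED, MaturityLevel.DEFINED
gap = int(target) - int(current)  # number of levels still to climb
```

An `IntEnum` is used rather than a plain `Enum` so that members support ordering comparisons out of the box.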
We can trace the subject of maturity models back to 1973 [1], and recognize as highlights
the Software Engineering Institute (SEI) Capability Maturity Model Integration
(CMMI) [2], first presented in 1991, and, in 2004, ISO/IEC 15504 [3]. Both
the CMMI and ISO/IEC 15504 are key references, born in the Software Engineering
domain, culminating decades of development and refinement of the corresponding
models. Moreover, there is certification for these two references, which are the de facto
assessment techniques to use when benchmarking organizations for their software en-
gineering process implementation and maturity. As such, in order for the results to be
comparable, there is a detailed maturity assessment method behind each of these ma-
turity models. These methods define in detail how to plan, conduct, and determine the
maturity levels of an assessment and how to present the results to the organization.
These methods make each assessment repeatable and comparable with results from
other organizations, allowing for benchmarking.
This paper is structured as follows. The first sections of this paper provide a back-
ground to the concepts of maturity and maturity model, the rationale for the selection
of the maturity models examined, and a brief description of each maturity model. A
more detailed analysis follows. The next section provides a discussion on the analysis
of the selected maturity models. A concluding section highlights gaps in the current
range of maturity models and identifies opportunities for further research.
2 Background
To evaluate maturity, organizational assessment models are used, which are also known
as stages-of-growth models, stage models, or stage theories [4].
Maturity is a state in which, once optimized for a particular organizational context, it
is not advisable to proceed with any further action. It is not an end, because it is a
“mobile and dynamic goal” [4]. It is rather a state in which, given certain conditions, it
is agreed not to continue any further action. Several authors have defined maturity;
however, many of these definitions are tied to the context in which each maturity model
was developed.
In [5], the authors define maturity as a specific process to explicitly define, manage,
measure and control the evolutionary growth of an entity. In turn, in [6] the authors
define maturity as a state in which an organization is perfectly able to achieve the goals
it sets itself. In [7] it is suggested that maturity is associated with an evaluation criterion,
or with the state of being complete, perfect, and ready; in [8] it is described as a concept
which progresses from an initial state to a more advanced final state, that is, to higher
levels of maturity. Similarly, in [9] maturity is related to the evolutionary progress
in demonstrating a particular capacity, or in the pursuit of a certain goal, from an initial
state to a desirable final state. In [10] maturity is seen as the “extent to which an
organization has explicitly and consistently deployed processes that are documented,
managed, measured, controlled, and continually improved.”
Most maturity model definitions found in the literature make clear that maturity models are
particularly important for identifying strengths and weaknesses of the organizational
context to which they are applied, and for collecting information through methodologies
associated with benchmarking.
In [4] the authors define maturity models as a series of sequential levels, which to-
gether form an anticipated or desired logical path from an initial state to a final state of
maturity. Röglinger et al. [14] explain that a maturity model includes “a sequence of
levels (or stages) that together form an anticipated, desired, or logical path from an
initial state to maturity.”
Some definitions involve common organizational concepts. For example, [11] defines
a maturity model as “... a framework of evaluation that allows an
organization to compare their projects against the best practices or the practices of
their competitors, while defining a structured path for improvement.” This definition is
deeply embedded in the concept of benchmarking. In other definitions, such as the one in
[12], we identify the concern of associating a maturity model with the concept of continuous
improvement. In [13], it was concluded that the great advantage of maturity models
is that they show that maturity must evolve through different dimensions and that, once
a maturity level is reached, some time is needed for it to actually be sustained.
The SEI explains that a maturity model “contains the essential elements of effective
processes for one or more areas of interest and describes an evolutionary improvement
path from ad hoc, immature processes to disciplined, mature processes with improved
quality and effectiveness.”
Currently, the lack of a generic, global standard for maturity models has been
identified as a cause of the poor dissemination of this concept.
model and position themselves among the maturity levels. However, there is no method
or tool to facilitate this assessment. This maturity model consists of five maturity levels
with ten attributes, called process perspectives.
JISC Records Management Maturity Model (2013) – Created by JISC infoNet
as a self-assessment tool for higher education institutions in England and
Wales [33]. It is based on a code of practice and, although independent from that code,
aims to help with compliance with it. Its goal is to help higher education
institutions assess their current approach to records management against
recommendations issued by the United Kingdom government, and to benchmark against
other similar organizations. Self-assessment is conducted using a spreadsheet
consisting of statements for each of the nine sections. Users choose the level that
best suits the organization for each statement. This maturity model consists of five
maturity levels with nine attributes, called sections.
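As a minimal illustration of how such a spreadsheet-based self-assessment could be tallied, the Python sketch below assigns a level per statement and aggregates per section. The section names and the “weakest statement caps the section” rule are our assumptions for illustration, not part of the JISC tool.

```python
# Hypothetical responses: for each section, the level (1-5) chosen for each of
# its statements. Section names here are invented for illustration.
responses = {
    "Creation and capture": [3, 4, 3],
    "Retention and disposal": [2, 3, 2],
}

# Assumed aggregation rule: a section is only as mature as its weakest
# statement (the usual "staged" convention; the JISC model may differ).
section_levels = {section: min(levels) for section, levels in responses.items()}
overall = min(section_levels.values())
```

The same dictionary comprehension scales to all nine sections; only the aggregation rule would need to change if the model prescribes a different one.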
CMMI Institute Data Management Maturity Model (2014) – A reference model
for data management process improvement created by the CMMI Institute. It defines the
fundamental business processes of data management and the specific capabilities that
constitute a path to maturity [34]. It allows organizations to evaluate themselves against
documented best practices, determine gaps, and improve data management across
functional, business, and geographic boundaries. Its aim is to facilitate an organization’s
understanding of data management as a critical infrastructure, through increasing
capabilities and practices. This maturity model consists of five maturity levels, with
attributes at two levels: six attributes called categories, which are further decomposed into 25
process areas.
Syracuse University Capability Maturity Model for Research Data Management
(2014) – Developed by the School of Information Studies at Syracuse University
in the USA [35]. It adopts the number, names, and principles of the CMMI levels.
RDM has become a trending topic in data management owing to increased attention
from government agencies such as the US National Science Foundation.
These funding agencies are raising the issue of maintaining good
RDM practices in the projects they fund. There is no assessment method
specified. This maturity model consists of five maturity levels with five attributes,
called key process areas.
Preservica Digital Preservation Maturity Model (2015) – Created on the premise
that organizations have realized it is critical for their business that information is
retained over a long period of time [36]. Preservica defines three main sections for the
maturity model. The first section is durable storage, which comprises levels 1 to 3,
where raw bit storage increases in safety and security. The second section comprises
levels 4 and 5, where the raw bits in storage become preserved and organized. The third
and last section is information preservation, which comprises level 6, where the
information survives the lifetime of the application that created it. There is no assessment
method specified. This maturity model consists of six maturity levels with no attributes
defined.
Digital Asset Management (DAM) Maturity Model (2017) – Provides a description
of where an organization is and where it needs to be, so that it can perform gap analysis
and understand what it needs to do to achieve the desired state of DAM implementation
[37]. There is a description of how to perform a self-assessment. It should begin by
identifying the stakeholders who recognized the need for DAM and can advocate in its
favor. A set of questionnaires is then created and administered to each of the identified
stakeholders, and the levels are determined from the answers. This maturity model
consists of five maturity levels, with attributes at two levels: four attributes called
categories, which are further decomposed into 15 dimensions.
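The procedure described above (identify stakeholders, administer questionnaires, derive levels from the answers) can be sketched as follows. The stakeholder roles, dimension names, and the averaging rule are our illustrative assumptions, not prescribed by the DAM model.

```python
from statistics import mean

# One questionnaire per stakeholder: each maps an assessed dimension to the
# level (1-5) that stakeholder selected. All names are hypothetical.
answers = {
    "archivist": {"People": 2, "Process": 3, "Technology": 2},
    "it_lead":   {"People": 4, "Process": 3, "Technology": 4},
}

# Assumed rule: a dimension's level is the rounded mean across stakeholders.
dimensions = sorted({d for a in answers.values() for d in a})
levels = {d: round(mean(a[d] for a in answers.values())) for d in dimensions}
```

Averaging across stakeholders is one plausible choice; a model concerned with weakest links might instead take the minimum per dimension.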
E-ARK Information Governance Maturity Model (2017) – Based on
OAIS/ISO 14721 [38], TRAC/ISO 16363 [15], and PAIMAS/ISO 20652 [39].
A2MIGO uses the dimensions described in ISO 9001 (Management, Processes, and
Infrastructure) and the maturity levels defined in the SEI CMMI (Initial, Managed, Defined,
Quantitatively Managed, Optimizing) [40]. The SEI CMMI levels were selected due to
their broader scope, which makes them suitable for wider fields such as information
governance. This maturity model provides a self-assessment questionnaire, details how
the results are analyzed, and clarifies the concepts being used [41]. This maturity model
consists of five maturity levels with three attributes, called dimensions.
4 Analysis
There is a growing body of knowledge studying maturity models in other domains, and
we draw from this work in our analysis. Mettler et al. [16] note the variety of research
that exists on maturity models, and we have attempted to cover a wide range to form
our theoretical foundation here. First, the works by Maier et al. [17] and Proenca et al.
[18] provide a similar survey of models in a variety of domains and that focus on the
application of models. Second, to understand the models as artefacts, we have drawn
on work in Design Science research, including the examples and approaches to define
requirements for the process of developing a maturity model [14], as well as general
design principles for maturity models [4].
We determined a set of attributes to analyze the existing options available for data
and information management maturity assessment, the results of which are detailed in
Table 2. We first determined the domain of the maturity model. For this work, we iden-
tified three domains that deal with data and information management. These are Infor-
mation Management (IM), Digital Preservation (DP), and Records Management (RM).
We also examined the nature of the assessment process and expected outputs. Spe-
cific requirements are necessary for different types of intended audience of the model,
e.g. to be shared internally in the organization, with external stakeholders, or both. As
well, we considered the assessment method, e.g. whether it is performed as a self-as-
sessment, third-party assisted, or by a certified practitioner.
Next, we examined the practicality of the maturity model, that is, whether its
recommendations are problem-specific or general in nature. For this aspect,
we examined whether the maturity model provides just general recommendations or
details specific improvement activities. A maturity model that provides only general
recommendations is categorized as a descriptive model; one that details specific
improvement activities is considered a prescriptive model, according to Röglinger et al. [4].
Table 2. Analyzed maturity models for assessment in data and information management.
Name Domain Audience AM PRA CER Origin IOP SWPI ACC
EIMM IM External TPA GR No C No No Free
IGMM RM Both SA SIA No C No Yes Charged
ECMM IM Both SA GR No C No No Free
RKMM RM Internal SA SIA No P Yes Yes Free
AMMM IM Internal TPA GR No A No Yes Free
DGMM DM Internal SA SIA No A No Yes Free
DPCMM DP Internal TPA GR No P No Yes Free
BDPMM DP Internal SA SIA No C No Yes Charged
RMMM RM Both SA GR No P No No Free
DMMM DM Both CP SIA Yes C Yes Yes Charged
CMMRDM DM Internal N/A GR No A No Yes Free
PDPMM DP Both N/A GR No C No No Free
DAMM DM Both SA, TPA GR No C No No Charged
A2MIGO DP Internal SA, TPA SIA No A Yes Yes Free
Legend: Columns - AM = Assessment Method; PRA = Practicality; CER = Certification; IOP = Improvement Opportunities
Prioritization; SWPI = Strong/Weak Point Identification; ACC = Accessibility. Column Domain - IM = Information
Management; DP = Digital Preservation; RM = Records Management; DM = Data Management. Column Assessment Method -
SA = Self-Assessment; TPA = Third-party Assisted; CP = Certified Professionals. Column Practicality - GR = General
recommendations; SIA = Specific improvement activities. Column Origin - C = Commercial; A = Academic; P = Practitioner.
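For readers who wish to query the comparison programmatically, rows of Table 2 can be transcribed as plain data. The sketch below encodes four rows and filters, for example, the freely accessible models that support self-assessment; the field selection is a simplification of the full table.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    domain: str        # IM, DP, RM, or DM
    assessment: tuple  # SA, TPA, and/or CP
    free: bool         # Accessibility column: Free vs. Charged

# Four rows transcribed from Table 2.
MODELS = [
    Model("ECMM", "IM", ("SA",), True),
    Model("RKMM", "RM", ("SA",), True),
    Model("DMMM", "DM", ("CP",), False),
    Model("A2MIGO", "DP", ("SA", "TPA"), True),
]

free_self_assessable = [m.name for m in MODELS if m.free and "SA" in m.assessment]
```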
5 Discussion
This analysis of maturity models has generated a number of insights into both the do-
mains that deal with data and information management and the maturity models them-
selves, as well as shedding light on the limitations around assessments performed using
maturity models.
One significant trend to emerge from this comparison is a noticeable increase in the
number and complexity of maturity models in recent years. This increasing interest in
assessment models mirrors the growing body of legislation surrounding the management
of information. Increased interest in and development of maturity models can
indicate a domain’s transition from a state of lacking definition and standardization
towards optimization and improvement, although this shift is not always valuable or
desired. Maturity models come with assumptions that sometimes conflict with the reality
of organizational settings. Improvement is often oriented towards quality control and
consistency, minimizing the unpredictability of outcomes over time and reducing
individual effort to a minimum. However, the culture built around the work of skilled
individuals can contrast abruptly with these assumptions, resulting in resistance to the
transition to a streamlined approach.
The assumptions of the most recognized maturity models such as those compliant
with ISO33000 [19], a family of standards for process assessment, include a process
orientation which considers the availability of multiple instances of the assessed pro-
cesses across the organization. Just as the CMMI was not universally praised in the
software industry [20], current highly detailed standards prescribing functional
requirements are not necessarily fit for all purposes. Additional limitations that result from
using maturity models include the tendency to oversimplify reality and to obscure
alternative paths to maturity [21].
Although some of the maturity models examined in this paper declare adherence to a
model such as the SEI CMMI, they often do not demonstrate awareness of its underlying
concepts and assumptions. One exception is A2MIGO, whose authors clearly show
the relevance and extent of the use of the CMMI in their work. In general,
greater clarity about underlying concepts and stronger adherence to design
principles for maturity models are needed to instill trust in these maturity models.
Acknowledgements
This work was supported by national funds through Fundação para a Ciência e a Tecno-
logia (FCT) with reference UID/CEC/50021/2013.
References
1. R. L. Nolan, "Managing the Computer Resource: A Stage Hypothesis," in Communications
of the ACM, vol. 16, pp. 399-405, 1973.