
Received April 9, 2020, accepted April 20, 2020, date of publication April 23, 2020, date of current version May 11, 2020.

Digital Object Identifier 10.1109/ACCESS.2020.2990117

Measuring and Improving Agile Processes in a Small-Size Software Development Company

MICHAŁ CHORAŚ1,2, TOMASZ SPRINGER1, RAFAŁ KOZIK1,2, LIDIA LÓPEZ3, SILVERIO MARTÍNEZ-FERNÁNDEZ3,4, (Member, IEEE), PRABHAT RAM5, PILAR RODRÍGUEZ5,6, (Member, IEEE), AND XAVIER FRANCH3

1 ITTI Sp. z o.o., 61-612 Poznań, Poland
2 Institute of Computer Science and Telecommunications, University of Science and Technology in Bydgoszcz (UTP), 85-796 Bydgoszcz, Poland
3 Department of Service and Information System Engineering (ESSI), Universitat Politècnica de Catalunya, 08034 Barcelona, Spain
4 Fraunhofer IESE, 67663 Kaiserslautern, Germany
5 M3S, Faculty of Information Technology and Electrical Engineering (ITEE), University of Oulu, 90570 Oulu, Finland
6 Department of Languages, Computer Systems and Software Engineering, Faculty of Computer Sciences, Universidad Politécnica de Madrid, 28040 Madrid, Spain
Corresponding author: Michał Choraś ([email protected])

This work was supported in part by the European Union's Horizon 2020 Research and Innovation Programme under Grant 732253 and in part by the University of Science and Technology in Bydgoszcz, Poland.

ABSTRACT Context: Agile software development has become commonplace in software development
companies due to the numerous benefits it provides. However, conducting Agile projects is demanding
in Small and Medium Enterprises (SMEs), because projects start and end quickly, but still have to fulfil
customers' quality requirements. Objective: This paper aims at reporting a practical experience on the use of metrics related to the software development process as a means of supporting SMEs in the development of software following an Agile methodology. Method: We followed Action-Research principles in a Polish
small-size software development company. We developed and executed a study protocol suited to the needs
of the company, using a pilot case. Results: A catalogue of Agile development process metrics practically
validated in the context of a small-size software development company, adopted by the company in their
Agile projects. Conclusions: Practitioners may adopt these metrics in their Agile projects, especially if
working in an SME, and customise them to their own needs and tools. Academics may use the findings as a
baseline for new research work, including new empirical studies.

INDEX TERMS Agile software development, process metrics, software engineering, software quality, rapid
software development, SMEs.

I. INTRODUCTION
Agile development methodologies are widely adopted nowadays by software development companies of every kind [37]. Industry surveys show that virtually all organisations use Agile methods to some extent, and over half of them have Agile as their usual approach to software development.1 Practitioners report many benefits, ranging from reduced time-to-market, to increased customer satisfaction and reduced development costs, among others.2 However, managing Agile projects may be challenging [10], especially in the case of Small and Medium Enterprises (SMEs). The challenge for the Product Owner and Scrum Master is at least twofold: to assure software product quality and to facilitate the effectiveness of the team and the process.

Currently, in many software development companies, teams are using various specific tools (such as Jira, GitLab and SonarQube) in order to support the development process and the quality of the code and products. This is usually done in a regular retrospective meeting that involves the whole team. As far as code quality is concerned, those tools provide sufficient information for the Scrum Team. However, there is still a gap and a need for more solutions reflecting team effectiveness and process quality. It can be stated that, at present, process improvement activities are mainly based on developers' perceptions, and little support is given to make process-wise data-driven decisions.

The major contribution of this paper comes in the form of a set of metrics that measure the Agile software development process (which we call process metrics hereafter) in

The associate editor coordinating the review of this manuscript and approving it for publication was Fabrizio Messina.
1 13th Annual State of Agile Report, 2019. https://www.stateofagile.com/#ufh-i-521251909-13th-annual-state-of-agile-report/473508
2 Hewlett-Packard Enterprise. Agile is the new normal, 2015. https://www.softwaretestinggenius.com/docs/4aa5-7619.pdf

This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/
78452 VOLUME 8, 2020
an SME-type of company, and the discussion on how those metrics helped the Scrum Team in the development of a commercial product. The metrics were built as part of an Action-Research collaboration involving a team of researchers and a Polish small-size software development company, ITTI Sp. z o.o., working together during the development of ITTI's CONTRA commercial product, in the context of the Q-Rapids EU project.3

3 https://www.q-rapids.eu/

This paper is structured as follows: Section II provides the background in process metrics and the Q-Rapids project. Section III surveys the state of the art in process metrics used in software development. Section IV presents the research method. Section V defines a set of process metrics. Section VI includes the discussion on the results. Section VII enumerates the threats to validity of the study. Finally, Section VIII concludes the paper.

II. BACKGROUND
A. PROCESS METRICS FOR SOFTWARE DEVELOPMENT
The scientific literature shows that measurement is integral to understanding, predicting and assessing software development projects [12], [42]. Software development involves many processes, and measurement enables us to characterize, control, predict, and improve those processes [28]. Being a human-centered activity, software processes are prone to problems [14], which lends further credence to why they should be continuously assessed and improved, to meet the expectations of the customers and the stakeholders of an organization [14]. Software process measurement can help in achieving the desired level of performance, capability, and quality [13], [30]. Moreover, measuring software processes also allows learning about the quality of the software product [33], [41].

Owing to the relevance of measurement in software development, software metrics have been studied for decades [21]. However, the increasing popularity of Agile software development (ASD) [37] makes understanding software metrics in the Agile context more relevant. Research recognizes the need for Agile organizations to use metrics, but empirical research on metrics in industrial ASD remains scarce [22]. Particularly, the rationale behind the metrics mentioned in the literature (e.g., burn-down charts, test-pass rates, and suitable pace) and how they are actually used in practice have received little attention [22]. In addition, although the aim of measuring in ASD is similar to that in traditional approaches (i.e. to plan and track Agile sprints or cycles, to monitor product quality, and to identify and fix process-related problems), the measurement programs are quite different in practice [22]. Agile's focus on lightweight practices, continuous delivery of working software, flexible development phases, and minimal documentation makes it necessary for measurement programs to be well aligned with the Agile mindset and the principle of simplicity [22]. In the particular case of process metrics, software processes are complex and intangible, making software process measurement challenging in practice [19], [41]. Moreover, due to time, budget and resource constraints, software measurement is rife with challenges, particularly in SMEs [9], [23], [36].

With considerations to resources [9], [23], [36], metric selection methods [3], infrastructure facilities, team size [8], and a well-planned software measurement program [23], [36], process metrics can assist SMEs in measuring and improving their process performance.

B. THE Q-RAPIDS PROJECT

FIGURE 1. Q-Rapids tool conceptual architecture.

Q-Rapids was a collaborative industry-academia project (funded by the European Commission under the H2020 Framework), involving three research partners and four companies. It proposed innovative methods and tools to support the software development industry in improving their quality levels (software and process) when using Agile and Rapid software development [6]. All the partners worked together under a co-creation strategy. Besides, every company adapted the results, as they were produced, to the specific needs of the company.

The Q-Rapids approach [16] is based on gathering and analysing data from several sources (software repositories, project management tools, system usage and quality of service) [20], [26]. Data is aggregated into quality indicators that are rendered to the different stakeholders by means of the Q-Rapids tool [25].

The Q-Rapids tool, as a result of the project, provides continuous assessment of the quality-related strategic indicators to decision makers. Figure 1 shows an excerpt of the conceptual architecture of the tool. The main modules are Data Gathering, Data Modelling and Analysis, and Strategic Decision Making.

The Data Gathering module is composed of different Apache Kafka connectors to enable gathering data from heterogeneous external data sources, such as static code analysis (e.g., SonarQube), continuous integration tools (e.g., Jenkins), code repositories (e.g., SVN, Git, GitLab), issue tracking tools (e.g., Redmine, GitLab, JIRA, Mantis), and usage logs.

The Data Modelling and Analysis module processes the data to assess software quality (product and process). Concretely, metrics are calculated from the gathered data.
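As an illustration of the Data Gathering step just described, the sketch below flattens a raw GitLab-API issue payload into the kind of flat record a connector might publish for indexing and metric computation. It assumes the public GitLab issues API field names (`iid`, `state`, `created_at`, `closed_at`, `time_stats`); the output keys are our own invention, not the actual Q-Rapids schema:

```python
from datetime import datetime

def _parse(ts: str) -> datetime:
    # GitLab timestamps look like "2019-03-01T09:30:00.000Z"
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def flatten_issue(raw: dict) -> dict:
    """Flatten a raw GitLab issue payload into a record suitable for
    indexing (e.g. in Elasticsearch) and downstream metric computation."""
    opened = _parse(raw["created_at"])
    closed = _parse(raw["closed_at"]) if raw.get("closed_at") else None
    time_stats = raw.get("time_stats", {})
    return {
        "id": raw["iid"],
        "state": raw["state"],  # "opened" or "closed"
        "assignee": (raw.get("assignee") or {}).get("username"),
        "labels": raw.get("labels", []),
        # GitLab reports effort in seconds; convert to hours
        "estimate_h": time_stats.get("time_estimate", 0) / 3600,
        "spent_h": time_stats.get("total_time_spent", 0) / 3600,
        # lifetime in days, only meaningful for closed issues
        "lifetime_days": ((closed - opened).total_seconds() / 86400
                          if closed else None),
    }
```

A connector would apply such a transformation to each issue fetched from the GitLab API and publish the result to a Kafka topic.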

These metrics are aggregated into quality factors related to development and usage. Finally, the quality factors are aggregated into high-level indicators named strategic indicators, which can be aligned to the strategic goals of the organisation [27].

The assessment data processed by the Data Modelling and Analysis module is visualised by the dashboards included in the Strategic Decision Making module. The data is visualised in aggregated form to the end-user through the web-based GUI, also named the Strategic Dashboard. The strategic dashboard also includes links to customised dashboards that can be developed to visualise dedicated charts, including the data ingested from the data producers' tools. This dashboard allows the user to display the data calculated for the current stage of the project, as well as the evolution of the metrics, factors and strategic indicators over time. Another configurable property of the data visualisation is the possibility to adjust the grid of the time-based charts to the current needs and present the evolution of data with granularity from days up to months. The dashboard also allows navigating through the different elements, which provides traceability and reinforces the understanding of the assessment.

Other software analytics tools similar to Q-Rapids have recently emerged in the software engineering landscape. Some of them are domain-dependent, e.g. the European Cooperation for Space Standardization (ECSS) metric framework to improve transparency of software development in customer-supplier relationships of space missions [35]. Also, some commercial tools are available in the market with similar characteristics to the Q-Rapids tool. For instance, Squore4 provides a dashboard similar to Q-Rapids' and includes several software metrics and indicators measuring software quality, although they are not comparable with the set of metrics analysed in this paper.

4 https://www.vector.com/int/en/products/products-a-z/software/squore/

III. RELATED WORK
There is a long history of research on metrics programs (MPs) [24], and plenty of literature recommending success factors for their implementation [17], [28], [34], [44]. However, literature on SMEs using MPs in the context of Agile is rather scarce. Moreover, the literature on measuring software processes and their role in improving SME processes is even scarcer. Measuring software processes with the use of process metrics enables objective and quantitative evaluation of software processes, which can lead to continuous improvement and learning [34], [42]. However, measuring process metrics is a challenge [43]. Software processes are inherently complex and intangible, which makes their measurement more difficult than that of their product counterparts [19], [41]. Ideally, measurement activities should consume little effort and time, while being adequate enough to meet an organization's measurement demands. Software organizations need to weigh in cost-efficiency while prioritizing measurement objectives and targets. SMEs have the added constraints of limited budget, ambitious deadlines, and short-term strategy [45]. Due to these reasons, measuring software processes, especially in an SME, becomes a bigger challenge.

Kupiainen et al. [22] conducted a systematic review of the use and impact of software metrics in ASD in industry. The authors reported that software metrics are mainly used for sprint planning, tracking progress, improving software quality, fixing software process, and motivating people. The authors reported that metrics like velocity, effort estimation, customer satisfaction, defect count, technical debt and build are used prominently in ASD. In their systematic review, Tahir et al. [46] observed that metrics for defects, effort, size, duration, productivity, employee commitment, and customer satisfaction are commonly reported in the state of the art. These findings complement another review by Gómez et al. [15], where complexity and size were found to be the most measured attributes in MPs. Other uses of metrics in ASD discussed in the literature are for planning and tracking software development [22], understanding development performance and product quality [48], measuring process quality [50], estimating effort [47], and reporting progress and quality to stakeholders not involved in the actual development [4]. Taken together, metrics targeting sprint planning, fixing software process, effort estimation, development performance, and software defects can be used to measure an organization's process performance. However, this objective is not expressly stated in any of the reviews mentioned above. On the contrary, [22] remarked that more studies are needed to explore the rationale behind utilizing the metrics the authors found in their review.

Most of the studies present initial emerging results of MP implementation in organizations, which have not been evaluated within a larger industrial context. One of the exceptions is the study by Dubinsky et al. [11] reporting on the experience of using an MP at an extreme programming (XP) development team of the Israeli Air Force. The authors found that using metrics to measure the amount, quality, pace, and work status could lead to more accurate and professional decision-making. A similar study by Díaz-Ley et al. [9] proposed a measurement framework customized for SMEs. One key benefit the authors reported was better measurement goals that align with the company's maturity. Specific to process metrics, most studies focus on using process metrics mainly to predict software faults/defects [18], [29], [40]. However, the role of process metrics in improving an organization's overall process performance, especially in the context of SMEs and ASD, is missing from these studies.

There have been studies evaluating MPs in SMEs, but the scope has been limited to a particular region, which makes it difficult to generalize their findings. For example, with the goal of evaluating MPs in the Pakistani software industry, Tahir et al. [46] conducted a systematic mapping study combined with a survey among 200 practitioners to highlight the state of measurement practices. Forty-two percent of the organizations that responded to the survey were SMEs. Overall, SMEs fared poorer than their larger counterparts. For instance, SMEs have the lowest share among

organizations that have any defined measurement process, measurement standards, and usage of measurement models and tools. Furthermore, 65% of SMEs tend to use MPs primarily at the project level, and only 13% of SMEs implement them across the organization. One of the positive findings, with respect to measuring software processes, was that 70% of the SMEs reported either focusing on measuring process or a combination of process and the other two entities. However, the corresponding primary studies were unclear on the context in which the process measurement was undertaken, and the focus on process metrics for process improvement was missing. For example, the study by Díaz-Ley et al. [9] reported the experiences of a Spanish SME in implementing an MP and reported that the practitioners could now objectively evaluate the trade-off between on-time releases and software product reliability. Tosun et al. [49] collaborated with a Turkish healthcare SME to institutionalize process improvement practices, and reported improvements in the organization's time allocation for requirements, coding, and testing steps. Furthermore, the authors found that defect rates as well as testing effort estimation decreased. One of the more interesting approaches for improving an SME's process was documented in an experimental study by Caballero et al. [5]. The authors introduced Scrum to enhance process productivity without impacting product quality at a very small enterprise. The authors claim that Scrum can prove to be a good alternative for process improvement in an organization with very limited resources, which has been a long-time concern in implementing MPs in SMEs. It is evident from these studies that the evaluations of MPs in SMEs are concerned mainly with overall software process improvement, where the role of process metrics towards this objective is either implied or absent altogether.

As per the state of the art, there is extensive reliance on measurement experts and experience [46], and organizations tend to prefer employee perception to objective measurement processes for process improvements [32]. In contrast, our study provides empirical evidence of using process metrics for improving process performance, and even facilitating decision-making. The empirical validation is an especially distinguishing aspect of our study, as it has been identified as a research gap in [22]. Furthermore, it should be noted that the Q-Rapids solution, embodying the MP, integrates basic features like automatic data collection, support for diverse data sources, expert-based metrics elicitation, and visualisation; something that is absent from the MPs reported in the literature.

IV. RESEARCH METHODOLOGY
A. CONTEXT
ITTI is a software development and consulting company based in Poznań, Poland. ITTI currently has about 70-90 employees. ITTI delivers software products in a number of application domains (e.g. administration, utilities, e-Health, and crisis management). In this paper, we report an Action-Research study focused on one particular ITTI software product, named and branded CONTRA. CONTRA is an enterprise-class integrated software system for Warehouse (WMS) and Manufacturing (MES) management, deployed in the form of a web application.

ITTI applies Scrum in their software development projects, including CONTRA. Typically, from 7 up to 10 developers work daily on specific deployments or on new features to improve the product. The Scrum team holds weekly sprint meetings on the last day of the sprint. Each Scrum Team meeting consists of the following parts: review, retrospective and planning for the next sprint.

B. ACTION-RESEARCH APPROACH
To conduct this research, we applied an Action-Research cycle: diagnosis, action planning and design, action taking, evaluation, and specifying learning [31], [38]. ITTI participants in the Q-Rapids project played a double role as researchers and project champions in the company.

The action started in September 2018 with the diagnosis of the industry needs in the form of process improvements that ITTI wanted to address. This originated our research goal and research questions (documented in Section IV-C).

As a participant of the H2020 Q-Rapids project, in order to tackle these improvements, ITTI decided to customise the Q-Rapids approach and tool with extended process metrics. In October 2018, both the team of researchers and the CONTRA Scrum team decided on the company repositories to be used and how to make this data actionable with the outcomes of the Q-Rapids project (action planning and design).

Next, the same joint researchers-practitioners team elicited 25 candidate process metrics fed with the selected company data. The resulting process metrics were used by the CONTRA Scrum Team during their meetings to reflect on the process performance and quality of the product, as well as to estimate tasks (action taking). This action took place from November 2018 to May 2019.

The process metrics were evaluated in a retrospective session with the Scrum team of CONTRA in June 2019 (evaluating).

Finally, the team learned the subset of process metrics which are most effective for the diagnosed problems (specifying learning). From July 2019 to the present, this subset of process metrics has been used in other projects at ITTI.

C. GOAL AND RESEARCH QUESTIONS
In the regular meetings reported in Section IV-A, ITTI Scrum Teams diagnosed the need to: (a) monitor the process performance of the team, (b) keep a stable product quality level while adding new features, and (c) improve the estimation of tasks during sprints.

Bearing in mind these industrial needs, ITTI considered the customisation of process metrics, a concept already implemented in the Q-Rapids project, and applied them to CONTRA, serving as the pilot case. Following the Goal-Question-Metric (GQM) approach [2], we can define the resulting

research goal of this study as: Analyze process metrics with the purpose of evaluating them with respect to monitoring and estimating the Agile process performance from the viewpoint of the Scrum team in the context of an SME.

We break this generic research goal into three research questions, aligned with the needs anticipated above:
• RQ1. Do process metrics help the Scrum team of an SME to monitor their own process performance?
• RQ2. Do process metrics support the Scrum team of an SME in keeping a stable product quality level while adding new features?
• RQ3. Do process metrics help the Scrum team of an SME to improve the estimation of tasks during sprints?

D. INSTRUMENTATION
In the action planning and design phase of the Action-Research cycle, we decided to use GitLab as the data source for the pilot. GitLab is an open-source, web-based tool that supports the full software development life cycle, with a focus on repository management and issue tracking, among other capabilities. GitLab is used extensively in all ITTI projects, and in particular, the CONTRA Scrum Team affirmed that it is the tool that may best reflect the process followed by the team during development.

ITTI gathered data from about 12 months of history of the CONTRA development in GitLab, so that the process metrics could be assessed over a long life span. During this time, a total of 31 unique assignees opened up to 2,975 issues, of which they closed up to 2,651, and 40,947 events describing the status changes of tasks or issues were recorded.

Table 1 includes the data collected for each issue from GitLab. This data is stored in a dedicated index in the Elasticsearch engine. From this data, during the studied period, the Q-Rapids tool provided a total of 1,830 metrics, 1,098 factors, and 732 strategic indicators assessment data points.

In order to evaluate the usefulness of the process metrics defined in CONTRA, the Scrum Team proceeded as follows:
• They implemented the connectors to GitLab that allowed them to effectively gather the data to start the measurement process (see Section II-B).
• Given the existence of two strategic indicators provided by the Q-Rapids project related to the first two research questions (Process Performance and Product Quality), they used the Q-Rapids dashboard for the process metrics related to these questions.
• For the third research question, they preferred to deploy Kibana dashboards in order to assess task estimation. In order to provide an integrated solution, these Kibana dashboards were integrated into the Q-Rapids dashboard.

Figure 2 illustrates the Q-Rapids dashboard by showing some examples of historical data views. In these charts we can see that Density of tickets pending testing and Density of bugs have remained stable at low and high quality values, respectively, in the period of fifteen days. On the other hand, in the same period, there is a clear improvement in the Average resolved issues' throughput (last 7 days) and some improvement in Density of estimated development tickets (last 7 days).

Figure 3 shows an example of the use of Kibana dashboards. In this case, the presented view aggregates several metrics related to the average elapsed time for tasks (according to their state).

V. RESULTS
A. PROCESS METRICS DEFINITION
In the Action taking phase of the Action-Research cycle, the ITTI CONTRA Scrum Team and the research team discussed and analysed what types of GitLab-based process metrics they considered candidates for assessing the Agile development processes at CONTRA. These metrics were implemented in the CONTRA case in order to understand their significance.

The CONTRA Scrum Team provided relevant observations that drove the design of the candidate set of metrics:
• The three main concepts that they use in their daily practices to monitor progress are: Task, Issue/Bug and Effort.
• The state transition among development tasks (opened → completed → closed) is particularly important in analysing progress.
• The effort is particularly interesting in relation to its estimation, because resource planning in the team (e.g., developer allocation) greatly depends on its accuracy.
• The concepts above may be analysed mainly from two perspectives: numerical (e.g., number, accumulated sum or average) and the time dimension.
• The metrics should be measurable using GitLab data.

Considering these principles, the team consolidated a proposal of 25 candidate metrics. Although many other metrics appeared interesting, the Action-Research team preferred to keep the proposal manageable in this first iteration, thus focusing on those metrics which the developers agreed upon as being the most determinant. The metrics can be divided into several categories, as shown in Table 2:
• General metrics. Following the Scrum Team observations, we propose an indication of the total number and the average number of development tasks (metrics #1 and #4), the number of tasks based on their status, e.g. completed, closed, etc. (#2 and #3), and the average lifetime of tasks (#5 and #6). Metrics #2 and #3 can be extended so that the number of tasks marked as ‘‘in progress’’, ‘‘testing’’, ‘‘ready’’, etc., can also be analysed, but we discarded this in order to keep the approach simple in this first iteration. Each of the general metrics can be calculated in different dimensions, i.e. per developer, per specific project, per area (frontend/backend), per sprint or release, and narrowing the timespan to a specific range.

TABLE 1. Issue data gathered from GitLab.

FIGURE 2. Visualization of process metrics using the Q-Rapids dashboard.

• Task estimation metrics. This category includes the metrics related to planning and effort allocation and the analysis of effort/resources consumption. They can indicate the accuracy of such estimation, including the average deviation of estimation in relation to the real effort consumption (metrics #7 and #8), the total sum of estimated or used resources (#11 and #12), as well as the completeness of task estimation (#9 and #10). Similarly to the general metrics, those related to task estimation can also be calculated in different dimensions, such as taking assignees, projects, sprints or time ranges into account.

FIGURE 3. Visualization of process metrics using Kibana.

FIGURE 4. Strategic indicators evolution view.

For the specific stages of the development process, we distinguished the metrics related to task implementation, bug fixing and testing.
• Task implementation metrics. This category includes time-based metrics indicating the average time of implementation and the average time of waiting for implementation (metrics #13 and #14). The other three metrics in this category are related to the task implementation status, namely assignment to the given sprint (#15 and #17) and to a given developer (#16).
• Bug fixing metrics. They include the number of tasks with a reported bug (metrics #18 and #21) and an indication of the average time needed/pending time to fix the bug (#19 and #20).
• Testing metrics. Similarly, metrics #22 and #23 indicate the average testing time and the average pending testing time, while metric #24 shows the current percentage of pending tasks to be tested.
• Other metrics. Moreover, we propose a metric showing the number of non-commented merge requests (metric #25), identified as relevant by the CONTRA developers.

B. PROCESS METRICS ASSESSMENT
Next, we report the impact of using the process metrics in the CONTRA pilot project (corresponding to the evaluating phase of the Action-Research cycle) on the three research questions: (a) monitoring process performance, (b) keeping a stable product quality level, and (c) improving the estimation of tasks during sprints.
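As a concrete illustration of how estimation-related metrics such as #7, #9, #11 and #12 can be derived from issue-tracker data, the sketch below computes a metric-#7-style estimation deviation and metric-#9-style missing-estimate counts. The `time_estimate` and `total_time_spent` fields (in seconds) follow GitLab's time-tracking statistics, but the record layout and function names are our own simplification, not the implementation used in the study.

```python
# Sketch: estimation-related process metrics from GitLab-style issue records.
# Field names follow GitLab's time_stats (seconds); the records are invented.
issues = [
    {"assignee": "dev1", "time_estimate": 7200, "total_time_spent": 9000},
    {"assignee": "dev1", "time_estimate": 0,    "total_time_spent": 3600},
    {"assignee": "dev2", "time_estimate": 3600, "total_time_spent": 3600},
]

def missing_estimates(issues):
    """Metric #9-like: number of tasks lacking an effort estimate."""
    return sum(1 for i in issues if not i["time_estimate"])

def avg_estimation_deviation(issues):
    """Metric #7-like: average relative deviation of the estimate vs. the
    real effort consumption, computed over estimated issues only."""
    estimated = [i for i in issues if i["time_estimate"]]
    if not estimated:
        return None
    devs = [abs(i["total_time_spent"] - i["time_estimate"]) / i["time_estimate"]
            for i in estimated]
    return sum(devs) / len(devs)

def total_effort(issues, field):
    """Metrics #11/#12-like: total estimated ('time_estimate') or
    used ('total_time_spent') effort."""
    return sum(i[field] for i in issues)

print(missing_estimates(issues))                    # 1
print(avg_estimation_deviation(issues))             # 0.125
print(total_effort(issues, "total_time_spent"))     # 16200
```

The same record structure can be grouped by assignee, project or sprint to obtain the per-dimension variants of these metrics described above.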


TABLE 2. Set of candidate process metrics for CONTRA.

The process of using the metrics was led by the CONTRA Project Owner. With respect to process performance and product quality, the Project Owner analysed monthly the strategic indicators rendered by the Q-Rapids dashboard for these two concepts.

1) PROCESS PERFORMANCE
Figure 4 shows an example, in which we can see that the Product Quality remains at a steady level, while in the case of Process Performance there are significant changes.
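The exact aggregation model behind these strategic indicators is not detailed here, so the sketch below is only a hedged illustration: it assumes that normalized metric values (in the range 0..1) roll up into factors, and factors into a strategic indicator, via plain weighted averages. The weights and values are invented; Q-Rapids' actual aggregation may differ.

```python
# Sketch: bottom-up aggregation of normalized metrics (0..1) into factors and
# a strategic indicator, using weighted averages. All numbers are assumptions.
def weighted_avg(values_weights):
    """Weighted average of (value, weight) pairs."""
    total_w = sum(w for _, w in values_weights)
    return sum(v * w for v, w in values_weights) / total_w

# a factor as the weighted average of its metrics (two metrics each, weight 1)
issues_velocity = weighted_avg([(0.8, 1.0), (0.6, 1.0)])
testing_performance = weighted_avg([(0.4, 1.0), (0.5, 1.0)])

# the strategic indicator as the weighted average of its factors
process_performance = weighted_avg([
    (issues_velocity, 2.0),        # factors may carry different weights
    (testing_performance, 1.0),
])
print(round(process_performance, 3))  # 0.617
```

Under such a model, a jump in one heavily weighted factor (here, Issues Velocity) can move the indicator while the others stay flat, which matches the kind of drill-down analysis described next.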


The Product Owner and the Scrum Master wanted to know the reason behind this change in the process performance behaviour, so they used the Q-Rapids detailed view capability applied to the Process Performance strategic indicator (see Figure 5). As a result, they were able to identify that the Issues Velocity factor improved significantly over the month, i.e., the development team was increasing its development velocity, while the Process Performance itself experienced only minor fluctuations.
Next, in order to understand in more detail the reason
behind the issues’ velocity improvement, they used the
detailed view capability to visualise the evolution of their
influencing metrics. Figure 6 indicates that the Average
Resolved Issue Throughput (last 7 days) metric improved over
the month.
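As a rough sketch of what an ''Average Resolved Issue Throughput (last 7 days)'' metric can look like, the snippet below counts the issues closed in the trailing 7-day window and averages them per day. The dates and the exact window semantics are illustrative assumptions, not the metric's actual Q-Rapids definition.

```python
# Sketch: trailing 7-day resolved-issue throughput (issues closed per day).
# Closing dates are invented; in practice they would come from GitLab data.
from datetime import date, timedelta

closed_dates = [date(2020, 3, d) for d in (2, 3, 3, 5, 9, 10, 10, 11)]

def throughput_last_7_days(closed_dates, today):
    """Issues closed per day over the window (today - 7 days, today]."""
    window_start = today - timedelta(days=7)
    in_window = [d for d in closed_dates if window_start < d <= today]
    return len(in_window) / 7

print(round(throughput_last_7_days(closed_dates, date(2020, 3, 11)), 2))  # 0.71
```

Evaluating this function for each day of the month yields the kind of evolution curve that the dashboard renders for this metric.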
When discussing the reason for this improvement in the team, it became apparent that the metrics visualization via the Q-Rapids dashboard allowed the Product Owner to improve his understanding of several aspects of the Scrum process, which had remained unknown before using Q-Rapids. In other words, Product Owners had been relying on anecdotal evidence rather than on real-time collected data about their development process.

FIGURE 5. Detailed view for process performance indicator.

2) PRODUCT QUALITY
As Figure 4 shows, the metric shown in the Q-Rapids dashboard did not bring extra value to the development process in this pilot project. The reason may be that other tools like SonarQube were already in place in the company, and therefore the code quality factors were already addressed. In any event, the Scrum Team considered it positive that, despite adding new features to the products (i.e., many tasks being closed during the sprint), the product quality remained at a steady level without acquiring technical debt.

3) TASK ESTIMATION
The lack of mechanisms for visualizing task estimation was reported before starting the study. By using the process metrics visualised with Q-Rapids, the Product Owner was able to see, as part of the Testing Performance factor, low values for the metric Density of Estimated Development Tickets (last 7 days) (Figure 6 (b), green line). In order to learn more, the Product Owner switched to the Kibana view on metrics and noticed that there were 37 non-estimated issues in the analysed timespan (see Figure 7).

This capability to smoothly switch into Kibana from Q-Rapids was highly appreciated by the CONTRA team. For instance, by using the Kibana dashboard and looking at the circular diagram, the Product Owner can identify key persons, e.g. the developers with the highest number of assigned issues (see Figure 8). The Kibana dashboard also includes information for analysing other statistics related to a given developer, e.g. the average time of fixing a bug or correction (see Figure 9). After clicking and selecting the particular developer's surname, the data related to this developer is filtered out. Now, the Product Owner can see that 7 out of the 9 tasks completed by this developer were non-estimated in terms of completion time. This helps to identify those tasks that were not estimated and take action, which results in improving the estimation of tasks from sprint to sprint.

C. METRICS SELECTION
In order to evaluate the usefulness of the metrics for the CONTRA Scrum Team (the learning phase in the Action-Research cycle), we executed a retrospective session involving the Scrum Master and one developer from CONTRA, and three researchers from the Q-Rapids project. The retrospective session was divided into three main activities: (a) exploration of the relevant process metrics used during six months in the CONTRA pilot project; (b) an open feedback session discussing the reasons for the impacts of using the process metrics; (c) documenting the results of the session in a template for the impacts of process metrics.

As a result of this session, the metrics that were considered most valuable are (see Table 2):
• Metric #7: Estimation accuracy per development task (per developer in project in specific timespan).
• Metric #9: Number of development tasks with lacking estimation of effort to be spent (''estimated'') per project per developer.
• Metric #10: Number of development tasks with lacking value of effort used (''spend'') per project per developer.
• Metric #11: Total sum of estimated effort values (''estimate'') per project per developer.
• Metric #12: Sum of used effort (''spend'') per project per developer.
• Metric #18: Number of development tasks with reported bug.
• Metric #19: Average time of task correction based on project board.
• Metric #20: Average time-to-correct of task based on the project board.


• Metric #21: Percentage of 'non-bug' type tasks to total tasks on the board.

FIGURE 6. Detailed view for process performance factors.

FIGURE 7. Kibana dashboard including non-estimated issues.

These metrics significantly improve the management of processes such as task estimation and bug fixing, which are crucial in the rapid development of high-quality and stable software. Moreover, after applying those metrics, team management is now more efficient and transparent.

VI. DISCUSSION
The overall assessment of the process metrics in the CONTRA case proved their value to the company. The proposed, calculated and visualized process metrics (using either the Q-Rapids dashboard, Kibana views or even ad-hoc visualizations developed at ITTI) were assessed as useful by the CONTRA Scrum Team, and some of the metrics are now used in practice not only in the pilot project but company-wide.

A. PROCESS METRICS IN THE SCRUM PROCESS
Once the selection was made, at each Scrum Team retrospective meeting, the team usually spends 15 to 20 minutes on visualizing and analysing these selected process metrics. Process metrics are a great fit, since this part of the meeting is devoted to people, processes, tools, lessons learnt and how to improve the way of working. Of course, the role of the Product Owner and Scrum Master is to make those discussions and displays interesting, but this turned out to be an easy job for them, because developers usually like statistics and trends/graphs, such as those shown in the previous section. These results, trends and metrics values are used to motivate the team and improve the process, and also to find problems in order to resolve them.

B. BENEFITS AND ADOPTION OF PROCESS METRICS AT ITTI
The most important advantage of the process metrics perceived by ITTI is the focus on the process and team effectiveness. The proposed solution has improved the way developers now report time spent on issues/tickets and allows for comparison to the effort planned.

Including the dashboards and the process metrics in the ITTI software development process enhanced and improved the willingness and efficiency in reporting spent time and planning the effort. Moreover, the gap between effort planned and spent is continuously decreasing, which means Product Owners, Scrum Masters as well as developers estimate much better.

Among the more practical advantages and possible decisions, the proposed metrics allow for efficient tracking of tasks/issues in the project, per developer and per sprint (or the chosen timespan).

FIGURE 8. Kibana dashboard including a chart for developers' assigned issues per developer. Black boxes are used to hide developers' real names.

FIGURE 9. Kibana dashboard including developers' statistics.

As shown in the previous section, the efficiency of each developer can now be checked. What we found out is that the optimal reported time/effort should be close to 4 days, which means that 1 day is spent on unreported aspects (e.g. when experienced developers help less experienced ones), and this is well understood and justified. What Product Owners and Scrum Masters mostly seek is information about the bottlenecks of the process, and basically how much time the ticket 'lies' in each phase of the process. The board of the
process at ITTI is presented in Figure 10. It consists of 8 steps, and the proposed process metrics nicely show the status of tickets in the project, per phase of the process, per developer and in the given timespan.

At ITTI, in the CONTRA project, the solution showed at first that the bottleneck was in the testing phase. This situation facilitated a quick decision to engage more testers, but that did not solve all the problems. While looking at the process metrics, the company found out that the bottleneck had shifted to the 'merge request' phase. Such a situation meant that the experienced developers (those who can perform code review and merge) did not have enough time and resources to perform tasks in this phase of the software development process. The situation is now solved by granting the rights to perform 'merges' to medium-experienced developers in order to improve the overall process.

C. CONSIDERATIONS ON HUMAN ASPECTS
An important aspect to note is that some of the proposed process metrics, while calculated per developer, have to be used wisely by the Product Owners and Scrum Masters, taking into account a plethora of human factors and aspects. This is extremely important especially now (as of early 2020), when the IT world is an employee's market and developers are scarce. At ITTI, the usage of the process metrics is also compliant with the General Data Protection Regulation, as well as with some practical guidelines such as those from the European project CIPHER [7].

VII. THREATS TO VALIDITY
As with any empirical study, there might be limitations to our research method and findings. This section discusses possible threats to validity in terms of construct, conclusion, internal, and external validity, and emphasises the mitigation actions applied.

A. CONSTRUCT VALIDITY
The retrospective session enabled us to further elaborate the practical relevance of the process metrics with two members of the Scrum team. The use of quantitative and qualitative measures and observations reduced the mono-method bias. Furthermore, we aimed at creating a safe environment, encouraging the participants to highlight any negative aspects and make suggestions for the improvement of the process metrics. Finally, some of our results could be caused by a non-optimal implementation of the process metrics (see Section VI.A). Still, these results are useful for others to learn how to build such an infrastructure in realistic settings.

B. CONCLUSION VALIDITY
To ensure the reliability of this evaluation, the measurement plan and procedure were documented in detail. Additionally, the results were reviewed by the Scrum team. In this way,


we mitigated risks such as fishing for results during the analysis, which would have led to a subjective analysis.

FIGURE 10. The view of the process board used at ITTI (the board shown is the real one, in Polish, but any other language can also be used).

C. INTERNAL VALIDITY
We evaluated the integrated Q-Rapids solution by drawing a convenience sample of a Scrum Master and a developer. One limitation of our work is that we were not able to get a random sample of participants in the pilot project. In addition, we defined an evaluation protocol in advance, which included a specific description of our planned procedure and the order of using the materials, i.e., an explanation with all the steps that had to be performed. After all the partners had agreed on the final version of the evaluation guidelines, we executed the evaluation accordingly. This should mitigate the fact that we needed to split the work of conducting the evaluation among different researchers and partners. Some of the five researchers who conducted the evaluation were involved in developing the Q-Rapids tool components. To minimise that bias, we made sure that in each case there were at least two researchers present, one acting as the moderator/experimenter and one as the observer, to emphasise that the participants could speak freely.

D. EXTERNAL VALIDITY
Our results are tied to the context of CONTRA. Our goal was to better understand practitioners' perception. We characterised the environment as realistically as possible and studied the suitability of our sampling (see Section IV.A). GitLab is one of the most extensively used software management tools in SME software development companies. Therefore, we can expect that this metrics analysis may provide actionable insights to a software development company for improving the quality of its processes.

VIII. CONCLUSION
In this paper we presented an approach to the definition and utilisation of process metrics related to Agile software development. This has been implemented with: the formulation of a set of process metrics, their assessment in a real project, and the description of the practical and empirical usage in a particular SME company.

More precisely, in terms of the research questions:
• RQ1: The major contribution of the paper is the much-needed solution to monitor the performance of each phase of the software development process. The solution includes a subset of effective process metrics.
• RQ2: Indeed, the major benefit for ITTI is the positive impact on the stability of the CONTRA system while adding new features. In fact, the offered software product (CONTRA) needs customisation for each client (while the domains of clients' businesses vary significantly). The proposed process metrics are continuously used to help assure the quality and stability of the software.
• RQ3: The value of the proposed solution lies in the mechanisms for visualizing task estimation, e.g., tracing tasks (tickets, issues) live (in real time). By using these process metrics visualisations, the Product Owner in the Scrum Team of ITTI was able to improve task estimations.

Even though our findings are based on a particular SME company and product, we believe that the presented findings on process metrics and Q-Rapids usage can be applicable to a wider context. ''If the forces within an organization that drove the observed behavior are likely to exist in other organizations, it is likely that those other organizations, too, will exhibit similar behavior'' [39].

In fact, most SME software development companies use Scrum-like processes such as the one presented in Fig. 10, and would be interested in practical metrics related to processes and team effectiveness. Even though the tools used, or the labels or names of the process phases, may be different, the solution is general, although of course it would require some customisation. It is worth noting that, in this paper, we showed real benefits based on a real implementation for GitLab; however, other tools such as JIRA can also be used as the data source, and in fact, the connectors to JIRA are already implemented and available as an output of the Q-Rapids project.5

Indeed, the proposed metrics and related Q-Rapids solutions fill the current need for tools related to the processes in Agile software development. Most tools focus on software quality or continuous integration, without measures for the process. Basically, there is only one competing solution that could be used to analyze the process, namely GitLab Time Tracker. However, as shown in the paper, we propose a wider set of calculated process metrics, better visualization, as well as much more enhanced analysis capabilities.

5 https://github.com/q-rapids/

ACKNOWLEDGMENT
The authors would like to thank all the members of the Q-Rapids H2020 project consortium.

REFERENCES
[1] E. Arisholm and L. C. Briand, ''Predicting fault-prone components in a Java legacy system,'' in Proc. ACM/IEEE Int. Symp. Empirical Softw. Eng. (ISESE), 2006, pp. 8–17.
[2] V. Basili, G. Caldiera, and H. Rombach, ''The goal question metric approach,'' Encyclopedia Softw. Eng., vol. 1, pp. 528–532, 1994.
[3] A. M. Bhatti, H. M. Abdullah, and C. Gencel, ''A model for selecting an optimum set of measures in software organizations,'' in Proc. Eur. Conf. Softw. Process Improvement, 2009, pp. 44–56.
[4] M. P. Boerman, Z. Lubsen, D. A. Tamburri, and J. Visser, ''Measuring and monitoring agile development status,'' in Proc. IEEE/ACM 6th Int. Workshop Emerg. Trends Softw. Metrics, May 2015, pp. 54–62.
[5] E. Caballero, J. A. Calvo-Manzano, and T. S. Feliu, ''Introducing Scrum in a very small enterprise: A productivity and quality analysis,'' Commun. Comput. Inf. Sci., vol. 172, pp. 215–224, 2011.
[6] M. Choraś, R. Kozik, D. Puchalski, and R. Renk, ''Increasing product owners' cognition and decision-making capabilities by data analysis approach,'' Cognition, Technol. Work, vol. 21, no. 2, pp. 191–200, May 2019.
[7] M. Choraś, R. Kozik, R. Renk, and W. Holubowicz, ''A practical framework and guidelines to enhance cyber security and privacy,'' in Proc. 8th Int. Conf. Comput. Intell. Secur. Inf. Syst., Burgos, Spain, Jun. 2015, pp. 15–17.
[8] V. Claudia, M. Mirna, and M. Jezreel, ''Characterization of software processes improvement needs in SMEs,'' in Proc. Int. Conf. Mechatronics, Electron. Automot. Eng. (ICMEAE), Dec. 2013, pp. 223–228.
[9] M. Díaz-Ley, F. García, and M. Piattini, ''Implementing a software measurement program in small and medium enterprises: A suitable framework,'' IET Softw., vol. 2, no. 5, pp. 417–436, 2008.
[10] K. Dikert, M. Paasivaara, and C. Lassenius, ''Challenges and success factors for large-scale agile transformations: A systematic literature review,'' J. Syst. Softw., vol. 119, pp. 87–108, Sep. 2016.
[11] Y. Dubinsky, D. Talby, O. Hazzan, and A. Keren, ''Agile metrics at the Israeli Air Force,'' in Proc. Agile Develop. Conf. (ADC), 2005, pp. 12–19.
[12] T. Dybå, ''Factors of software process improvement success in small and large organizations: An empirical study in the Scandinavian context,'' in Proc. 9th Eur. Softw. Eng. Conf., 2003, pp. 148–157.
[13] W. A. Florac and A. D. Carleton, Measuring the Software Process. Reading, MA, USA: Addison-Wesley, 1999.
[14] A. Fuggetta, ''Software process: A roadmap,'' in Proc. Conf. Future Softw. Eng., 2000, pp. 25–34.
[15] O. Gómez, H. Oktaba, M. Piattini, and F. García, ''A systematic review measurement in software engineering: State-of-the-art in measures,'' in Software and Data Technologies. Berlin, Germany: Springer, 2006, pp. 165–176.
[16] L. Guzmán, M. Oriol, P. Rodríguez, X. Franch, A. Jedlitschka, and M. Oivo, ''How can quality awareness support rapid software development?—A research preview,'' in Proc. REFSQ, 2017, pp. 167–173.
[17] T. Hall and N. Fenton, ''Implementing effective software metrics programs,'' IEEE Softw., vol. 14, no. 2, pp. 55–65, 1997.
[18] D. Radjenović, M. Heričko, R. Torkar, and A. Živkovič, ''Software fault prediction metrics: A systematic literature review,'' Inf. Softw. Technol., vol. 55, no. 8, pp. 1397–1418, Aug. 2013.
[19] M. Kasunic, ''The state of software measurement practice: Results of 2006 survey,'' Software Engineering Institute (SEI), Pittsburgh, PA, USA, Tech. Rep. CMU/SEI-2006-TR-009, ESC-TR-2006-009, 2006.
[20] R. Kozik, M. Choraś, D. Puchalski, and R. Renk, ''Q-Rapids framework for advanced data analysis to improve rapid software development,'' J. Ambient Intell. Hum. Comput., vol. 10, no. 5, pp. 1927–1936, May 2019.
[21] B. Kitchenham, ''What's up with software metrics?—A preliminary mapping study,'' J. Syst. Softw., vol. 83, no. 1, pp. 37–51, 2010.
[22] E. Kupiainen, M. V. Mäntylä, and J. Itkonen, ''Using metrics in agile and lean software development—A systematic literature review of industrial studies,'' Inf. Softw. Technol., vol. 62, pp. 143–163, Jun. 2015.
[23] C. Y. Laporte, S. Alexandre, and R. V. O'Connor, ''A software engineering lifecycle standard for very small enterprises,'' in Proc. Softw. Process Improvement, 2008, pp. 129–141.
[24] F. van Latum, R. van Solingen, M. Oivo, B. Hoisl, D. Rombach, and G. Ruhe, ''Adopting GQM based measurement in an industrial environment,'' IEEE Softw., vol. 15, no. 1, pp. 78–86, Dec. 1998.
[25] L. López, S. Martínez-Fernández, C. Gómez, M. Choraś, R. Kozik, L. Guzmán, A. M. Vollmer, X. Franch, and A. Jedlitschka, ''Q-Rapids tool prototype: Supporting decision-makers in managing quality in rapid software development,'' in Proc. CAiSE Forum, 2018, pp. 200–208.
[26] S. Martínez-Fernández, P. Jovanovic, X. Franch, and A. Jedlitschka, ''Towards automated data integration in software analytics,'' in Proc. Int. Workshop Real-Time Bus. Intell. Anal., 2018, p. 6.
[27] S. Martínez-Fernández, A. M. Vollmer, A. Jedlitschka, X. Franch, L. López, P. Ram, P. Rodríguez, S. Aaramaa, A. Bagnato, M. Choraś, and J. Partanen, ''Continuously assessing and improving software quality with software analytics tools: A case study,'' IEEE Access, vol. 7, pp. 68219–68239, 2019.
[28] M. G. Mendonca and V. R. Basili, ''Validation of an approach for improving existing measurement frameworks,'' IEEE Trans. Softw. Eng., vol. 26, no. 6, pp. 484–499, Jun. 2000.
[29] R. Moser, W. Pedrycz, and G. Succi, ''A comparative analysis of the efficiency of change metrics and static code attributes for defect prediction,'' in Proc. Int. Conf. Softw. Eng. (ICSE), 2008, pp. 181–190.
[30] D. J. Paulish and A. D. Carleton, ''Case studies of software-process-improvement measurement,'' Computer, vol. 27, no. 9, pp. 50–57, Sep. 1994.
[31] K. Petersen, C. Gencel, N. Asghari, D. Baca, and S. Betz, ''Action research as a model for industry-academia collaboration in the software engineering context,'' in Proc. Int. Workshop Long-Term Ind. Collaboration Softw. Eng. (WISE), New York, NY, USA, 2014, pp. 55–62, doi: 10.1145/2647648.2647656.
[32] F. J. Pino, F. García, and M. Piattini, ''Software process improvement in small and medium software enterprises: A systematic review,'' Softw. Qual. J., vol. 16, no. 2, pp. 237–261, Jun. 2008.
[33] P. Ram, P. Rodríguez, and M. Oivo, ''Software process measurement and related challenges in agile software development: A multiple case study,'' in Proc. Int. Conf. Product-Focused Softw. Process Improvement. Cham, Switzerland: Springer, Nov. 2018, pp. 272–287.
[34] P. Ram, P. Rodríguez, M. Oivo, and S. Martínez-Fernández, ''Success factors for effective process metrics operationalization in agile software development: A multiple case study,'' in Proc. IEEE/ACM Int. Conf. Softw. Syst. Processes (ICSSP), May 2019, pp. 14–23.
[35] C. R. Prause and A. Hönle, ''Emperor's new clothes: Transparency through metrication in customer-supplier relationships,'' in Product-Focused Software Process Improvement (Lecture Notes in Computer Science), vol. 11271, M. Kuhrmann, Ed. Cham, Switzerland: Springer, 2018.


[36] I. Richardson and C. G. Von Wangenheim, ''Guest Editors' introduction: Why are small software organizations different?'' IEEE Softw., vol. 24, no. 1, pp. 18–22, Jan. 2007.
[37] P. Rodríguez, J. Markkula, M. Oivo, and K. Turula, ''Survey on agile and lean usage in Finnish software industry,'' in Proc. ACM-IEEE Int. Symp. Empirical Softw. Eng. Meas. (ESEM), Lund, Sweden, 2012, pp. 139–148.
[38] P. S. M. dos Santos and G. H. Travassos, ''Action research can swing the balance in experimental software engineering,'' Adv. Comput., vol. 83, pp. 205–276, Dec. 2011.
[39] P. B. Seddon and R. Scheepers, ''Towards the improved treatment of generalization of knowledge claims in IS research: Drawing general conclusions from samples,'' Eur. J. Inf. Syst., vol. 21, no. 1, pp. 6–21, Jan. 2012.
[40] E. Shihab, Z. M. Jiang, W. M. Ibrahim, B. Adams, and A. E. Hassan, ''Understanding the impact of code and process metrics on post-release defects: A case study on the Eclipse project,'' in Proc. ACM-IEEE Int. Symp. Empirical Softw. Eng. Meas. (ESEM), Sep. 2010, p. 4.
[41] J. Soini, ''A survey of metrics use in Finnish software companies,'' in Proc. Int. Symp. Empirical Softw. Eng. Meas., Sep. 2011, pp. 49–57.
[42] R. van Solingen and E. Berghout, The Goal/Question/Metric Method: A Practical Guide for Quality Improvement of Software Development. New York, NY, USA: McGraw-Hill, 1999.
[43] M. Söylemez and A. Tarhan, ''Challenges of software process and product quality improvement: Catalyzing defect root-cause investigation by process enactment data analysis,'' Softw. Qual. J., vol. 26, no. 2, pp. 779–807, Jun. 2018.
[44] M. Staron and W. Meding, ''Factors determining long-term success of a measurement program: An industrial case study,'' e-Inform. Softw. Eng. J., vol. 1, no. 1, pp. 7–23, Jan. 2012.
[45] M. Sulayman and E. Mendes, ''A systematic literature review of software process improvement in small and medium Web companies,'' in Proc. Adv. Softw. Eng., 2009, pp. 1–8.
[46] T. Tahir, G. Rasool, W. Mehmood, and C. Gencel, ''An evaluation of software measurement processes in Pakistani software industry,'' IEEE Access, vol. 6, pp. 57868–57896, 2018.
[47] B. Tanveer, L. Guzmán, and U. M. Engel, ''Understanding and improving effort estimation in agile software development,'' in Proc. Int. Workshop Softw. Syst. Process (ICSSP), 2016, pp. 41–50.
[48] A. Tarhan and S. G. Yilmaz, ''Systematic analyses and comparison of development performance and product quality of incremental process and agile process,'' Inf. Softw. Technol., vol. 56, no. 5, pp. 477–494, May 2014.
[49] A. Tosun, A. Bener, and B. Turhan, ''Implementation of a software quality improvement project in an SME: A before and after comparison,'' in Proc. 35th Euromicro Conf. Softw. Eng. Adv. Appl., 2009, pp. 203–209.
[50] M. Unterkalmsteiner, T. Gorschek, A. K. M. M. Islam, C. K. Cheng, R. B. Permadi, and R. Feldt, ''Evaluation and measurement of software process improvement—A systematic literature review,'' IEEE Trans. Softw. Eng., vol. 38, no. 2, pp. 398–424, Apr. 2012.

MICHAŁ CHORAŚ is currently a Professor with the University of Science and Technology in Bydgoszcz (UTP), where he is also the Head of the Teleinformatics Systems Division and the PATRAS Research Group. He also works at FernUniversität in Hagen, Germany, and manages projects for ITTI Sp. z o.o. He has been involved in many EU projects (e.g., SocialTruth, CIPRNet, Q-Rapids, and INSPIRE). He currently coordinates H2020 SIMARGL. His interests include data science and pattern recognition in several domains, e.g., cyber security, image processing, software engineering, prediction, correlation, biometrics, and critical infrastructures protection. He has authored more than 230 reviewed scientific publications.

TOMASZ SPRINGER received the M.Sc. degree in applied computer science from Adam Mickiewicz University, Poland, in 2009. He currently holds the position of Lead IT Architect at ITTI sp. z o.o., Poznań, Poland. He is also an experienced Team Leader, a Product Owner, a Scrum Master, and an active Agile practitioner. He is involved in the development of the commercial software product CONTRA. He took part as an expert, end-user, and practitioner in the H2020 Q-Rapids project. For many years, he has been an active member of different EU FP7, H2020, European Defence Agency, European Space Agency, and NATO projects. He is personally interested in user-centered design, IT systems ergonomics, and usability.

RAFAŁ KOZIK received the Ph.D. degree in telecommunications from the University of Science and Technology (UTP) in Bydgoszcz, in 2013, and the D.Sc. degree in computer science from the West Pomeranian University of Technology in Szczecin, in 2019. Since 2009, he has been involved in a number of international and national research projects related to cybersecurity, critical infrastructures protection, software quality, and data privacy (e.g., FP7 INTERSECTION, FP7 INSPIRE, FP7 CAMINO, FP7 CIPRNet, SOPAS, SECOR, and H2020 Q-Rapids). He is currently an Assistant Professor with the Department of Telecommunication, University of Science and Technology in Bydgoszcz (UTP). He has authored more than 70 reviewed scientific publications.

LIDIA LÓPEZ received the Ph.D. degree in computing from the Technical University of Catalonia (UPC-BarcelonaTech), Spain, in 2013. From 2007 to 2012, she was an Assistant Teacher with UPC-BarcelonaTech. She is currently a Research Fellow with the Software and Services Engineering Research Group (GESSI), UPC-BarcelonaTech. Her research interests are software engineering, focused on requirements engineering, open source software, empirical software engineering, and data-driven decision-making processes in agile software development. Dr. López has been a PC Co-Chair of international conferences and workshops (CIbSE, iStar) and a PC member of several international conferences such as RCIS, ICSOFT, SAC, and CIbSE. She has also reviewed articles for journals including IST, JSS, and IEEE SOFTWARE. She has participated in several international research projects, e.g., Q-Rapids (H2020, work-package leader) and RISCOSS (FP7, work-package leader and partner representative).


SILVERIO MARTÍNEZ-FERNÁNDEZ (Member, IEEE) received the B.Sc., M.Sc., and Ph.D. degrees in computing from UPC-BarcelonaTech. He was a Researcher with the inSSIDE Research Group, UPC-BarcelonaTech, a Postdoctoral Fellow of the European Research Consortium for Informatics and Mathematics, from 2016 to 2018, and an Operative Project Manager with Fraunhofer IESE, Germany, from 2018 to 2019. He is currently an Assistant Professor with UPC-BarcelonaTech. He has authored more than 35 peer-reviewed publications and has an H-factor of 11 (according to Google Scholar). His interests include empirical software engineering, reference architectures, software analytics, and data-driven development, among others. In EU framework programmes, he acted as an Evaluation WP Leader in Q-Rapids (H2020, RIA) and participated in DESIRA (H2020, RIA). He has participated in the organization of several conferences and workshops: PROFES 2019 (PC Co-Chair), CESI@ICSE 2018 (PC Co-Chair), and QuASD@PROFES 2017–2018 (PC Co-Chair). He is an Editorial Board Member of the SCI-indexed journal IET Software (IET). He has also been a Reviewer for multiple journals (e.g., IST, JSS, and IJCIS) and a PC Member of international conferences (e.g., ESEM, ICSME, ECSA, and CIbSE).

PILAR RODRÍGUEZ (Member, IEEE) received the Ph.D. degree in computer science from the University of Oulu, Finland, in 2013. She is currently an Assistant Professor in software engineering with the Universidad Politécnica de Madrid, Spain, and a Docent at the University of Oulu, Finland. Her main research areas of interest are empirical software engineering, agile and lean software development, software quality, value-based software engineering, and human factors in software engineering. She is a member of the Review Board of leading SE journals such as TSE and EMSE. She has served as a PC Member for conferences such as ESEM, EASE, and XP. Recently, she has been the Leader of Work Package 2 in the H2020 Q-Rapids Project.
PRABHAT RAM received the M.Sc. degree in information processing science from the University of Oulu, Finland, in 2016. He is currently working as a Doctoral Researcher with the Empirical Software Engineering in Software, Systems and Services (M3S) Research Unit, University of Oulu. His research interests include empirical software engineering, software measurement, the Internet of Things, and system security and privacy.

XAVIER FRANCH received the Ph.D. degree in informatics from UPC, in 1996. He is currently a Professor of software engineering with the Universitat Politècnica de Catalunya (UPC-BarcelonaTech). His research interests embrace many fields in software engineering, including requirements engineering, empirical software engineering, open source software, and agile software development.
Dr. Franch is a member of the IST, REJ, IJCIS, and Computing editorial boards, the Journal First Chair of JSS, and a Deputy Editor of IET Software. He served as a PC Chair at RE'16, ICSOC'14, and CAiSE'12, and REFSQ'11, among others, and as GC for RE'08 and PROFES'19.

