Model-Based Intelligent User Interface Adaptation
https://doi.org/10.1007/s10270-021-00909-7
EXPERT VOICE
Received: 21 May 2021 / Revised: 28 June 2021 / Accepted: 29 June 2021 / Published online: 16 July 2021
© The Author(s) 2021
Abstract
Adapting the user interface of a software system to the requirements of the context of use continues to be a major challenge,
particularly when users become more demanding in terms of adaptation quality. A considerable number of methods have,
over the past three decades, provided some form of modelling with which to support user interface adaptation. There is,
however, a crucial issue in analysing the concepts, the underlying knowledge, and the user experience afforded by
these methods when comparing their benefits and shortcomings. These methods are so numerous that positioning a new
method in the state of the art is challenging. This paper, therefore, defines a conceptual reference framework for intelligent
user interface adaptation containing a set of conceptual adaptation properties that are useful for model-based user interface
adaptation. The objective of this set of properties is to understand any method, to compare various methods and to generate new
ideas for adaptation. We also analyse the opportunities that machine learning techniques could provide for data processing and
analysis in this context, and identify some open challenges in order to guarantee an appropriate user experience for end-users.
The relevant literature and our experience in research and industrial collaboration have been used as the basis on which to
propose future directions in which these challenges can be addressed.
Keywords Context of use · Intelligent user interface · Machine learning · Model-based software engineering · Model-driven
engineering · User interface adaptation · Conceptual reference framework
Communicated by Bernhard Rumpe.

B Silvia Abrahão
[email protected]
1 Universitat Politècnica de València, Valencia, Spain
2 Université catholique de Louvain, Ottignies-Louvain-la-Neuve, Belgium

1 Introduction

User interface (UI) adaptation consists of modifying a software system's UI in order to satisfy requirements, such as the needs, wishes, and preferences of a particular user or a group of users. Adaptation falls into two categories depending on whether the system or the end-user is responsible for making the adaptation [15]: adaptability refers to the end-user's ability to adapt the UI, whereas adaptivity or self-adaptation refers to the system's ability to perform UI adaptation. Personalisation is a particular form of adaptivity, usually for the UI contents, that is based on data originating solely from the end-user, such as personal traits [15]. When the data originate from sources that are external to the end-user, such as other user groups, recommendation occurs instead. Mixed-initiative adaptation [17] occurs when both the end-user and the system collaborate in order to make the adaptation.

UI adaptation should ultimately serve the end-user's benefit, by optimising factors contributing to the end-user's experience. For example, the objective of UI adaptation could be to increase efficiency (by reducing task completion time and error rate or by improving the learning curve), to ensure effectiveness (by guaranteeing full task completion) or to improve the user's subjective satisfaction, but could also be related to other factors, such as hedonic value or user disruption [18].

The challenge is to suggest the right adaptation at the right time in the right place in order to make it valuable for the end-user [4]. Otherwise, adaptation will be prone to limitations that could impede the expected benefits [21], if not thwart them: risk of misfit (the end-user's needs are incorrectly captured or interpreted), user cognitive disruption (the end-user is disrupted by the adaptation), lack of prediction (the end-user does not know when and how the adaptation will take place), lack of explanation (the end-user is not informed of the reasons for adaptation), lack of user involvement (the end-user does not have the opportunity to participate actively in the adaptation process), and risks as regards privacy (the system maintains personal information that the user wishes to keep private).

A number of model-based approaches with which to address these challenges have been proposed to support UI adaptation by the human–computer interaction (HCI) and software engineering (SE) communities. However, no study that summarises the current knowledge, reasoning, and experience gained by these approaches, along with their benefits and limitations, currently exists. These aspects are so numerous that positioning any new approach with respect to the prior work is difficult. Surveys of UI adaptation [2,13,20] synthesise adaptation concepts, methods, and tools. Most of them are, however, technology driven, limited in scope, or largely surpassed by recent technical progress, which makes them incomplete as regards covering the most recent adaptation approaches or exploring alternatives in a structured manner.

In this paper, we therefore present a conceptual reference framework for model-based intelligent UI adaptation that contains a set of conceptual adaptation properties. These properties are structured around the Quintilian questions—what, why, how, to what, who, when, and where—posed for model-based UI adaptation. The objective of these conceptual properties is to facilitate the understanding and comparison of adaptation capabilities, in addition to their integration into the model-based or model-driven engineering of the user interfaces of software systems, such as interactive applications, websites and desktop applications. These properties also help to identify open challenges and generate new ideas. In particular, progress in artificial intelligence (AI) and, more specifically, machine learning (ML), provides useful ways in which to support adaptation more effectively. We therefore analyse some opportunities that these fields may bring to model-based UI adaptation.

In Sect. 2, we present the current state of UI adaptation, while in Sect. 3, we define the conceptual framework for UI adaptation in order to locate the conceptual properties that support adaptation with respect to the Quintilian questions. These properties target the needs of two major stakeholder groups: they help system engineers to incorporate suitable UI adaptation mechanisms into model-based development more systematically, and they help practitioners to understand and compare UI adaptation methods. We conclude this paper in Sect. 4, with a call for action that includes a discussion of open challenges and future directions.

2 Current state of model-based UI adaptation

Pioneering work on UI adaptation started with Browne et al. [8], who used Moran's Command Language Grammar (CLG) to structure UI specifications into distinct aspects, ranging from tasks and abstract concepts to syntactic and physical components. These authors concluded that the major strength of CLG as regards UI adaptation is the principle of separation of concerns. Although this principle is enforced in CLG, it is not obvious how to easily propagate all specification aspects into the final code. These authors additionally state that CLG has very limited facilities with which to express UI presentation and behaviour.

Dieterich et al.'s taxonomy [13] has long been considered a seminal reference when classifying different types of adaptation configurations and methods. This taxonomy was obtained after analysing more than 200 papers, after which the UI adaptation methods found were structured in four stages: (1) the initiative, which specifies the entity, end-user or system, that expresses the intention to perform adaptation; (2) the proposal, which suggests those proposals that could be applied to adaptation, given the current context of use; (3) the decision, which specifies those adaptation proposals that best fit the requirements imposed by the context of use; and (4) the execution, which is responsible for enacting the adaptation method previously decided on.

However, López-Jaquero et al. [23] identified some shortcomings of this taxonomy: it does not support an explicit collaboration between entities (i.e. the user and the system, or even a third party) and it is restricted to the execution only. These authors specialised Norman's theory of action in the Isatine framework, which structures the UI adaptation into seven stages describing how the adaptation is carried out and by whom, thus addressing some of the Quintilian questions. The UI adaptation is understood to be a sequence of seven stages (Fig. 1, in which the user's parts are depicted in blue, while the system's parts are depicted in green): (1) UI adaptation goals, formally expressed in the system or informally maintained in the end-user's head, are established by an entity; (2) this entity takes the initiative in order to start a UI adaptation; (3) based on this input, some UI adaptation is subject to a specification so as to express how the adaptation will be conducted; (4) the UI adaptation selected is then applied; (5) a transition from an initial state before adaptation to a final state after adaptation is subsequently ensured in order to preserve continuity; (6) the results of this output are then subjected to interpretation by an entity based on the feedback provided by the system; and (7) the interpretation eventually leads to an evaluation of whether the initial goals established for the adaptation are (partially or totally) met. Depending on this evaluation, a new cycle could be initiated until the final goals are achieved.
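Read as a control loop, the seven Isatine stages above can be sketched in a few lines. This is an illustrative sketch, not López-Jaquero et al.'s implementation; the function and callback names are hypothetical:

```python
# Hypothetical sketch of the seven Isatine stages as a control loop.
# Stage names follow the framework; everything else is illustrative.

STAGES = ["goals", "initiative", "specification", "application",
          "transition", "interpretation", "evaluation"]

def isatine_cycle(goals_met, max_cycles=3):
    """Run the seven-stage cycle until the adaptation goals are met.

    `goals_met` stands in for stage 7 (evaluation): it receives the
    cycle number and returns True when the initial goals are
    (partially or totally) satisfied.
    """
    history = []
    for cycle in range(1, max_cycles + 1):
        for stage in STAGES:
            history.append((cycle, stage))  # each stage is enacted in order
        if goals_met(cycle):                # evaluation closes the loop
            return cycle, history
    return None, history                    # goals never reached

# A new cycle is initiated until the final goals are achieved:
cycle, history = isatine_cycle(lambda c: c == 2)
```

The callback makes the point that evaluation, not execution, terminates the process: adaptation is iterative rather than a one-shot transformation.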
models, such as the user model, is frequently mentioned, but their structure and usage are not made sufficiently explicit to help modellers and developers to implement adaptation [1]. On the one hand, UI adaptation methods have been investigated in human–computer interaction (HCI), but without making the means required to practically implement them sufficiently explicit. On the other, the software engineering (SE) community has advanced as regards principles and technologies with which to support the PDA-LDA cycle on the system side (e.g. the MAPE-K adaptation loop, models at runtime), but often relegates the perception, decision and action stages on the end-user side, typically addressed in HCI, to a secondary role.

3 Conceptual framework and properties for UI adaptation

In this section, we introduce a conceptual reference framework for intelligent model-based UI adaptation. The purpose of the framework is twofold: (1) to help software engineers to properly decompose the application into layers and modules that are suitable for supporting model-based UI adaptation, and (2) to provide a property-based classification of existing approaches in order to identify some trends and research directions.

3.1 The conceptual reference framework

Figure 3 depicts our conceptual framework with which to support intelligent UI adaptation based on MDE principles (e.g. abstraction, separation of concerns, automation) and technologies (e.g. modelling and metamodelling, model transformation). This framework is decomposed into four parts:

Fig. 3 Conceptual reference framework for intelligent UI adaptation highlighting four main components and their related conceptual adaptation properties

– The context of use, which represents the actor(s) that interact with their platform or device in any physical environment [9]. For example, a business person interacting with a smartphone in a busy airport represents a context of use that is radically different from that of a tourist browsing a laptop in a hotel. In order to support context-aware UI adaptation, a context probe (e.g. a camera, a sensor) senses the context and abstracts relevant data into useful model fragments corresponding to the context of use: a user model captures all data pertaining to the end-user (e.g. gender, age, interaction history, abilities, preferences, emotional state, past experience), a platform model captures data that are useful as regards characterising the target platform (e.g. screen resolution, sizes, interaction capabilities, CPU availability), and an environment model captures any environmental data that could influence the UI (e.g. location, stationary vs. mobile conditions, light, noise level, physical configuration, organisational and psycho-social constraints). For example, Figs. 4 and 5 reproduce two UIs of a trip planner in two different contexts of use.
– The software system, which is usually an interactive application, consists of the semantic core component that contains the business logic functions appertaining to the application domain. These functions are executed from the intelligent UI of this software, which could exploit up to four models [9]: a "task and domain" model captures domain abstractions, usually in the form of an object-oriented diagram, a UML class diagram or any other kind of domain model, while the "task" model captures how the end-user sees the interaction with the semantic core, independently of any implementation or technology.1 The task and domain models are transformed into an "abstract UI" model2 for a given context of use, but still without making any assumptions about the interaction modality and target platform. The "abstract UI" becomes a "concrete UI" when design options are decided for a particular modality, e.g. graphical, vocal, tactile, haptic or tangible, and for a particular platform, e.g. a smartphone, a tablet or a wall screen. A final UI is obtained from this concrete UI model by means of either model interpretation or model-to-code transformation. Any transformation between two levels can exploit the contents of the context model. Examples of these are: a forward engineering transformation (depicted as a downwards arrow in the software system), a reverse engineering transformation (depicted as an upwards arrow), or a self-modification (depicted as a loop), all of which can generate a subsequent model based on the previous one by exploiting the context model.
– The intelligent UI adaptor, which consists of six components, two of which are mandatory. At the core is the adaptation manager, which is responsible for performing any complete adaptation process from its initiation to its completion, such as according to the Isatine framework. This manager, therefore, stores and maintains adaptation parameters, which regulate the adaptation process with variable parameters, such as the level of automation, the frequency, the priority of adaptation rules or the preferred adaptation strategies. Adaptation parameters can be application-independent, such as the level of automation, or application-dependent, such as those shown in Figs. 6 and 7. The adaptation manager has its own UI, denominated the adaptation manager UI, which is sometimes referred to as a meta-UI (the UI above the original UI [11]) or extra-UI (the UI external to the UI of the application [26]). This UI enables the end-user to access and update the adaptation parameters and to conduct the whole adaptation process interactively so as to specifically perform adaptation operations, review them, accept them or reject them. In order to clearly differentiate the UI of the adaptation manager from that of the software system, it should be located in a separate location. Depending on the parameters, the adaptation manager executes the adaptation logic contained in the adaptation engine, which is usually implemented in the form of adaptation rules. The adaptation manager can call the adaptation transitioner in order to convey to the end-user the transition between the status before and after adaptation. For example, animated transitions [12] apply morphing techniques to show this step and to preserve continuity (Fig. 8). If necessary, the transitioner provides the end-user with information on why, how and when the adaptation is performed by requesting the adaptation explainer, which is responsible for explaining and justifying why any adaptation proposal or step will be executed [16]. Finally, an adaptation machine learning system can monitor the whole process over time, learn what the good adaptations are or which are preferred by the end-user, and recommend them in the future [7]. For example, TADAP [27] suggests adaptation operations, based on the user's interaction history, that the end-user can accept, reject, or re-parameterise by employing Hidden Markov Chains.
– The external sources contain any form of information that can be exploited in order to support and improve the adaptation process: data concerning individual items, information for semantically related data, knowledge gained from exploiting the information within a certain domain, and wisdom when knowledge can be reproduced in different domains. These sources are typically held by agents that are external to the software system, such as experts, brokers, recommenders, or any third party source. For example, when no adaptation proposal can be obtained, an external source may be required in order to attain one.

1 See the W3C recommendation for task models at https://www.w3.org/TR/task-models/.
2 See the W3C recommendation for the abstract UI model at https://www.w3.org/TR/abstract-ui/.
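The interplay between the adaptation manager, the adaptation engine, and the optional transitioner and explainer can be sketched as follows. This is a minimal illustration under assumed interfaces: the framework does not prescribe any API, and every class, method, and parameter name here is hypothetical:

```python
# Illustrative decomposition of the intelligent UI adaptor.
# All class, method, and parameter names are hypothetical.

class AdaptationEngine:
    """Holds the adaptation logic, here as simple condition/action rules."""
    def __init__(self, rules):
        self.rules = rules  # list of (condition, action) pairs

    def run(self, context, ui):
        for condition, action in self.rules:
            if condition(context):
                ui = action(ui)
        return ui

class AdaptationManager:
    """Drives a complete adaptation process, from initiation to completion."""
    def __init__(self, engine, parameters, transitioner=None, explainer=None):
        self.engine = engine
        self.parameters = parameters   # e.g. level of automation, frequency
        self.transitioner = transitioner
        self.explainer = explainer

    def adapt(self, context, ui):
        before = ui
        after = self.engine.run(context, ui)
        if self.transitioner:          # convey the before/after transition
            self.transitioner(before, after)
        if self.explainer:             # justify why the adaptation happened
            self.explainer(context, before, after)
        return after

# Example: enlarge fonts when the environment model reports dark conditions.
engine = AdaptationEngine([
    (lambda ctx: ctx["environment"]["light"] == "dark",
     lambda ui: {**ui, "font_size": ui["font_size"] + 4}),
])
manager = AdaptationManager(engine, parameters={"automation": "adaptivity"})
ui = manager.adapt({"environment": {"light": "dark"}}, {"font_size": 12})
```

The point of the decomposition is that the engine only computes new UI states, while the manager owns the process and decides whether transitioning and explaining are warranted by the adaptation parameters.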
3.2.1 Who
Fig. 8 Animated transition from the initial state before adaptation to the final state after adaptation [12]
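The kind of transition shown in Fig. 8 can be approximated by interpolating between the UI state before and after adaptation. A toy sketch with hypothetical names; the morphing techniques of [12] are far richer:

```python
# Toy sketch of an animated transition between the UI state before and
# after adaptation: intermediate frames linearly interpolate numeric
# widget properties so that the end-user perceives continuity.

def interpolate(before, after, t):
    """Blend two UI states for 0 <= t <= 1."""
    return {key: before[key] + (after[key] - before[key]) * t
            for key in before}

def transition_frames(before, after, steps=4):
    """Produce the intermediate states shown to preserve continuity."""
    return [interpolate(before, after, i / steps) for i in range(steps + 1)]

frames = transition_frames({"x": 0, "width": 100}, {"x": 40, "width": 60})
```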
The participation degree can be assessed by analysing who performs the following three steps:

– Initiative, which refers to who detects that there is a need to adapt the user interface. The adaptation process can usually be initiated by the user (U), the system (S) or a third party (T). For instance, the user can trigger an adaptation by selecting an element in the user interface, or the system can decide that an adaptation is needed by inferring it from a change in the context of use.
– Decision, which refers to who makes the decision to adapt the UI (i.e. the user, the system, or a third party). The decision is concerned with the identification of the adaptation proposals that best fit the need for the adaptation detected.
– Application, which refers to who is responsible for applying the adaptation, i.e. U, S, or T, or any combination.

The evaluation degree can be assessed by analysing who performs the following three steps:

– Transition, which refers to how the transition is performed from the original UI to that which is adapted. This criterion indicates whether the end-user is able to perceive how the adaptation is conducted, i.e. whether the user is aware of the intermediate steps taken when adapting the user interface.
– Interpretation, which refers to the user's ability to understand both the adaptation results and the adaptation execution itself.
– User feedback, which refers to the ability of the approach to provide feedback about the quality of the adaptation.

3.2.2 What

This dimension refers to what is adapted, which is further characterised through the use of five conceptual properties:

2 Adaptation target, which refers to which UI part is subject to adaptation: its presentation (layout), its dynamic behaviour, its navigation, its contents (e.g. text, images, videos), or any combination. For example, after identifying the end-user, Diffie [37] highlights parts of a website that have changed since the last visit.

3 Adaptation granularity, which refers to the smallest UI unit that is subject to adaptation. The adaptation unit could cover the UI presentation, the dialogue, the navigational flow or the contents. The following units may be subject to adaptation:

– Within widgets: the adaptation is applicable within an interaction object. For example, a list box is replaced with a drop-down list (Fig. 8).
– Across widgets, within a group: the adaptation is applicable to widgets within the same group.
– Across groups, within a container: the adaptation is applicable to groups of widgets within the same container.
– Across containers, within a software system: the adaptation is applicable to all groups within a software system, e.g. a single application.
– Across software systems: the adaptation is applicable to several software systems. For example, a particular adaptation is always applied to all applications used by a person.

A model-based approach may support the adaptation of one or more UI units. For example, Sottet et al. [36] support adaptations across different interaction objects (widgets), across groups within a container, and across containers within a software system.

4 UI type, which refers to the UI type that is subject to adaptation depending on its interaction modality:

– Graphical: concerns only the graphical part.
– Vocal: concerns only the vocal part.
– Tactile: concerns only the tactile part.
– Gestural: concerns only the gestural part.
– Haptic: concerns only the haptic modality.

For example, a rich internet UI is rendered as a vectorial graphical interface [24]. Nomadic gestures [40] adapt command gestures for a particular user that are transferable from one software system to another.

5 UI modality, which expresses how many modalities are incorporated into the UI adaptation, as follows:

– Monomodal: the approach supports only the adaptation of a single UI type.
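The granularity units above form an ordered scale, which lends itself to a simple encoding. A sketch with hypothetical names, including the within-widget example of a list box replaced by a drop-down list:

```python
# Hypothetical encoding of the five adaptation granularity units as an
# ordered scale, plus a within-widget adaptation example.

GRANULARITY = ["within_widget",
               "across_widgets_within_group",
               "across_groups_within_container",
               "across_containers_within_system",
               "across_software_systems"]

def finer_than(a, b):
    """True if unit `a` is a smaller adaptation unit than `b`."""
    return GRANULARITY.index(a) < GRANULARITY.index(b)

def adapt_within_widget(widget):
    """Replace a list box with a drop-down list, keeping its items."""
    if widget["type"] == "list_box":
        return {**widget, "type": "drop_down_list"}
    return widget

widget = adapt_within_widget({"type": "list_box", "items": ["car", "train"]})
```

An approach such as Sottet et al.'s [36] would then be characterised by the subset of this scale that it supports, rather than by a single level.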
6 Coverage, which expresses which part of the context model is exploited for UI adaptation: the user model (e.g. user profile, preferences, goals, tasks, emotional state, physical state), the platform model (e.g. screen resolution, browser, battery) and/or the environment model (e.g. location, noise, light). For example, a business traveller who rents a car via a smartphone in a noisy airport is considered as one context of use, and a tourist who books a car on a laptop while sitting on a sofa at home is considered as another context of use. Figure 4 covers the three models: the user who is a tourist, the platform detected as a tablet and the environment detecting dark conditions. Any variation in a model involved in the context model can initiate a contextual variation that will or will not be reflected via a UI adaptation. A small contextual variation could be considered not significant enough to trigger a UI adaptation, and it is not always desirable or advisable to perform such an adaptation for every slight contextual perturbation.

3.2.3 Why

This dimension is concerned with justifying the reasons why a UI adaptation is carried out. This depends on the user's goals and is defined by two properties:

7 Adaptation rationale, which refers to the reason why the adaptation is required and, more specifically, what the new requirements that need to be satisfied through the adaptation are. For example, an end-user expressing a preference for data selection rather than data input will see some sort of UI adaptation based on this preference.

8 Adaptation QAs, which refer to the quality attribute(s) that should be impacted by the UI adaptation process. For example, the ISO/IEC 25010 standard for software product quality [19] can be used as a reference to specify the quality attributes to be guaranteed or improved by any UI adaptation, such as usability, UI aesthetics, flexibility and portability. Since the UI adaptation is, in principle, performed for the ultimate benefit of the end-user, and not necessarily the software system, quality attributes such as accessibility and continuity are often oriented towards the end-user. UI plasticity [10] also represents a frequent quality attribute, as it expresses the ability of a UI to adapt itself depending on contextual variations while preserving usability.

3.2.4 Where

9 Adaptation location, which refers to where the adaptation logic is located:

– Client-side: when located inside the software.
– Server-side: when located outside the software system, which is typically the case in cloud computing.
– Proxy-side: when encapsulated in a proxy component to ensure some independence. For example, Fig. 5 depicts a UI in which weather forecasts were retrieved from a web service in XML and fed back into a proxy to decide how to present these data based on the adaptation parameters specified in Fig. 7. Locating this adaptation strategy inside the software would create a certain dependence between the UI and the web service.

The adaptation location directly influences how feedback loops are introduced into the components (Fig. 9). A software system devoid of adaptation (Fig. 9a) benefits from retroactive feedback only between the software system and the end-user. A software system with an adaptation engine (Fig. 9b) has two feedback loops: between the user and the adaptation engine and between the user and the system. Finally, three feedback loops are possible for an intelligent UI (Fig. 9c), which require the system to be decoupled from its intelligent UI. Current research efforts in the SE community [3,42] are focused on providing strategies and facilities with which to support UI adaptation based on one control loop with an adaptation engine, and there is a shortage of approaches that support more intelligent strategies based on two control loops with an adaptation manager.

Fig. 9 Feedback loops in adaptation: a one with the system without adaptation, b two with an adaptation engine, c three with an adaptation manager

10 Adaptation scope level, which refers to the level at which the adaptation process occurs, which is based on the three levels proposed by Nierstrasz and Meijler [30]:
12 Adaptation domain, which refers to the domain of human activity in which the adaptation takes place. A model-based UI adaptation approach can be general purpose (independent of the application domain) or devised for a specific domain (e.g. smart home, Internet-of-Things, ambient assisted living, smart cities, ERP systems). We believe that the application domain may influence the adaptation rationale or the adaptation QAs that should be ensured by a particular approach. For example, the objective of Akiki et al.'s approach [3] is to improve the UI usability of enterprise applications, such as ERP systems, by providing end-users with a minimal feature-set and an optimal layout.

Fig. 12 Adaptation automation levels (AAL)
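The adaptation automation levels (AAL) captured in Fig. 12, and detailed under the how dimension (Sect. 3.2.6), form an ordinal scale between adaptability and adaptivity. A minimal encoding, in which the level names follow the scale of Fig. 12 while the code itself is illustrative:

```python
# Hypothetical encoding of the ten adaptation automation levels (AAL),
# from fully manual adaptability (1) to full adaptivity (10).

AAL = ["adaptability", "proposability", "narrowing", "identification",
       "execution", "restriction", "information", "on-demand",
       "self-explanation", "adaptivity"]

def level_of(name):
    """1-based automation level for a named AAL stage."""
    return AAL.index(name) + 1

def is_mixed_initiative(name):
    """Levels 2 to 9 represent cases of mixed-initiative adaptation."""
    return 2 <= level_of(name) <= 9
```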
3.2.5 When

This dimension is concerned with when the adaptation takes place. This decision is not trivial, since the frequency of adaptation affects the system usability. It is defined by two properties:

13 Adaptation type. The UI adaptation type is said to be static when its process takes place during design (e.g. prototyping, sketching), development (e.g. compile), link or load time; dynamic when its process takes place during runtime; or hybrid when both are combined. For example, in the Yigitbas et al. approach [42], a rule-based execution environment supports the UI adaptation at runtime.

14 Adaptation time, which refers to the exact moment in time at which the UI adaptation occurs, which could be at one specific moment (single-step) or distributed throughout several moments in time (multi-step). In order to further characterise this conceptual property, we rely on the adaptation dimensions proposed by McKinley et al. [25], which result from a survey of adaptive systems. The adaptation is said to be hardwired (when the UI adaptation is embedded in the code of the software application, typically the UI code), customisable (when the UI adaptation enables some degree of pre-computed freedom), configurable (when the UI adaptation technique can be configured before executing it), tunable (when the UI adaptation technique can fine-tune the UI at run-time without modifying its code), or mutable (when the UI adaptation technique subsumes the run-time code modification of the software system, namely the UI code). McKinley et al. [25] mention that hardwired, customisable, and configurable cases are static, while tunable and mutable cases are, by definition, dynamic. For example, MiniAba [34] uses generative programming to automatically regenerate a new C++ project from dynamic specifications, which are thus dynamic and mutable.

3.2.6 How

This dimension is concerned with how the UI adaptation is performed. One critical issue is to what extent the software system can access the various aforementioned models to optimise the UI adaptation and to exploit them. It is characterised by five properties:

15 Adaptation method, which refers to the software engineering method used to adapt the UI. An adaptation method can be model-based/driven, or it can be combined with other methods, such as aspect-oriented modelling, component-based design, computational reflection (i.e. a programme's ability to reason about, and possibly alter, its own behaviour), dynamic interconnection, higher-order functional composition, higher-order modelling, macro-command expansion, mashup, modelling or programming by example, or the syntactical expansion of parameterised components. This paves the way towards investigating the effectiveness of combining these techniques with the purpose of improving the existing model-based UI adaptation approaches. For example, Blouin et al. [5] presented an approach that combines aspect-oriented modelling with property-based reasoning to control complex and dynamic user interface adaptations. The encapsulation of variable parts of interactive systems into aspects permits the dynamic adaptation of user interfaces, and the tagging of UI components and context models with QoS properties allows the reasoner to select the aspects best suited to the current context.

16 Adaptation automation degree, which refers to the level to which the UI adaptation is automated. There is a wide range of possible adaptation levels between adaptability (when UI adaptation is performed entirely manually by the end-user) and adaptivity (when UI adaptation is performed entirely by the system), which we define as follows based on [33] (see Fig. 12):

– Level 1. Adaptability (fully manual): the UI adaptation is performed entirely by the end-user.
– Level 2. Proposability: the intelligent UI manager proposes certain decisions that should be made in order to execute actions towards a UI adaptation to be performed by the system, and the end-user decides.
– Level 3. Narrowing: the intelligent UI manager sorts the proposed decisions according to certain criteria to facilitate the end-user's decision. For example, Fig. 11 proposes a suite of six new layouts in decreasing order of performance based on past user actions and parameters.
– Level 4. Identification: the intelligent UI manager identifies the best decision for the user to make from among all the proposals.
– Level 5. Execution: the intelligent UI manager executes the decision made by the end-user. For example, the end-user selects one of the new layouts presented in Fig. 11 to replace the existing one.
– Level 6. Restriction: the intelligent UI manager postpones the UI adaptation for a certain amount of time. If the end-user does not react, the UI adaptation will be processed as suggested. Otherwise, the end-user should use the adaptation manager UI to specify which actions to take.
– Level 7. Information: the intelligent UI manager performs the UI adaptation and triggers the adaptation transitioner and/or explainer in order to inform the end-user of this decision.
– Level 8. On-demand: the intelligent UI manager performs the UI adaptation and triggers the adaptation transitioner and/or explainer only if the end-user demands it.
– Level 9. Self-explanation: the intelligent UI manager performs the UI adaptation and triggers the adaptation transitioner and/or explainer when it decides to do so.
– Level 10. Adaptivity/self-adaptation: the intelligent UI manager performs the UI adaptation entirely automatically, without any user intervention.

Levels 2 to 9 represent various cases of mixed-initiative adaptation. While these levels cover a wide range of automation levels, they mainly relegate the end-user to a secondary role of decision maker. These levels should, therefore, be accompanied by appropriate actions that the end-user should take within the adaptation manager UI, which should offer more user involvement (when the UI adaptation is mostly performed by the user) or less involvement (when the UI adaptation is mostly performed by the adaptation manager). These cases should cover various degrees of user involvement depending on the user's willingness to drive the process and the knowledge required for this purpose. Most existing model-based/driven UI adaptation approaches do not properly involve the end-user during the adaptation process. Moreover, most of them use only data or information as external sources. There is consequently a shortage of approaches that use knowledge and wisdom to drive the adaptation process.

17 Adaptation logic, which refers to the algorithm(s) used to perform the UI adaptation. Typical examples of adaptation algorithms are:

– Probabilistic-based: the adaptation logic is performed by a probabilistic model (e.g. a Bayesian network).
– Rule-based: the adaptation logic is performed by rules (e.g. Event-Condition-Action (ECA) rules in the adaptation engine shown in Fig. 3). For example, Yigitbas et al. [43] presented a model-based approach with which to build context-adaptive UIs based on an adaptation model containing ECA rules.
– Case-based: the adaptation logic is based on case-based reasoning.
– Logic-based: the adaptation logic is based on logic (e.g. first-order predicate logic).
– Ontology-based: the adaptation logic is based on an ontology (e.g. a domain ontology).
– Evidence-based: the adaptation logic is performed by an evidence theory.
– Fuzzy-based: the adaptation logic is performed by a fuzzy approach (e.g. fuzzy sets).
more high-level actions to support UI adaptation. These lev-
18 Tool support, which refers to the automation level pro-
els are cumulative, thus requiring a sophisticated adaptation
vided by the UI adaptation manager. We define the following
manager.
levels:
To be more practical, we suggest distributing the mixed
initiative between the end-user, the system, and any third
party according to the seven stages of adaptation (Fig. 1):
goal, initiative, specification, application, transition, inter- – Level 0. Not automated: all steps of the UI adaptation are
pretation, and evaluation. For example, AB-HCI [22] sup- performed manually.
ports a mixed initiative for the three steps belonging to the – Level 1. Partially automated: one or more steps of the UI
gulf of execution, i.e. from initiative to application, but not the adaptation are supported by the Adaptation Manager and
subsequent stages belonging to the gulf of evaluation. Each can be manipulated via its adaptation manager UI.
stage is managed through a particular agent in a multi-agent – Level 2. Fully automated: all steps of the UI adaptation
architecture which adequately distributes responsibilities. are supported by the Adaptation Manager. For example,
An alternate characterisation of the adaptation automation Slime [35] adapts a final UI, in a completely automatic
degree could balance the UI adaptation with equal respon- manner, in order to avoid the awkward problem posed by
sibility (when the UI adaptation is performed equally by the bezels of contiguous displays (Fig. 13).
the end-user and the adaptation manager), with more user
involvement (when the UI adaptation is mostly performed
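The ten automation levels listed above can be read as a progressive hand-over of responsibility from the end-user to the intelligent UI manager. As a minimal illustrative sketch (the enum names follow the levels, but the decider/executor split is our own reading of the taxonomy, not something the framework prescribes), this could look like:

```typescript
// Illustrative sketch: the ten automation levels as a numeric enum, with a
// helper mapping each level to who decides and who executes the adaptation.
// The decider/executor assignment is an assumption, not part of the framework.
enum AutomationLevel {
  Adaptability = 1, // fully manual
  Proposability,    // 2
  Narrowing,        // 3
  Identification,   // 4
  Execution,        // 5
  Restriction,      // 6
  Information,      // 7
  OnDemand,         // 8
  SelfExplanation,  // 9
  SelfAdaptation,   // 10: fully automatic
}

type Party = "end-user" | "system";

interface Responsibility {
  decider: Party;  // who makes the adaptation decision
  executor: Party; // who applies it to the UI
}

function responsibility(level: AutomationLevel): Responsibility {
  // Level 1: the end-user both decides and executes.
  if (level === AutomationLevel.Adaptability) {
    return { decider: "end-user", executor: "end-user" };
  }
  // Levels 2-5: the manager proposes/narrows/identifies/executes,
  // but the end-user remains the decision maker.
  if (level <= AutomationLevel.Execution) {
    return { decider: "end-user", executor: "system" };
  }
  // Levels 6-10: the manager decides and executes, with decreasing
  // opportunities for the end-user to intervene.
  return { decider: "system", executor: "system" };
}
```

A call such as `responsibility(AutomationLevel.Narrowing)` then reports that the end-user still decides while the system executes, matching the mixed-initiative reading of levels 2 to 9.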
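The rule-based adaptation logic described above can be sketched as a minimal ECA engine: on an event, every rule whose condition holds against the context of use applies its action to the UI state. The context fields and the two sample rules below are hypothetical illustrations, not taken from [43] or from the adaptation engine of Fig. 3:

```typescript
// Minimal Event-Condition-Action (ECA) sketch for UI adaptation.
// Context fields and sample rules are invented for illustration.
interface Context {
  luminosity: number;  // ambient light, in lux (hypothetical sensor value)
  screenWidth: number; // in pixels
}

interface UIState {
  fontSize: number;
  layout: "single-column" | "two-column";
}

interface EcaRule {
  event: string;                        // event that triggers evaluation
  condition: (ctx: Context) => boolean; // guard on the context of use
  action: (ui: UIState) => UIState;     // adaptation to apply
}

const rules: EcaRule[] = [
  {
    event: "context-changed",
    condition: (ctx) => ctx.luminosity < 50, // dark environment
    action: (ui) => ({ ...ui, fontSize: ui.fontSize + 4 }),
  },
  {
    event: "context-changed",
    condition: (ctx) => ctx.screenWidth < 600, // narrow display
    action: (ui) => ({ ...ui, layout: "single-column" }),
  },
];

// On each event, fire every matching rule whose condition holds,
// threading the UI state through the triggered actions.
function adapt(event: string, ctx: Context, ui: UIState): UIState {
  return rules
    .filter((r) => r.event === event && r.condition(ctx))
    .reduce((state, r) => r.action(state), ui);
}
```

For instance, `adapt("context-changed", { luminosity: 20, screenWidth: 500 }, { fontSize: 12, layout: "two-column" })` fires both rules, enlarging the font and collapsing to a single column; a real adaptation engine would additionally handle rule priorities and conflicts.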
tions, etc. However, models at runtime is rarely applied to such models, and this may be challenging.

5. Relying on software co-evolution. Changes in software resulting from UI adaptation go far beyond merely modifying the UI and could potentially impact any component of the software system or the others represented in Fig. 3. For example, how can we align user interface changes with changes in the software architecture, and vice versa? There is a need to formalise these changes in order to reason about them for purposes such as maintainability, traceability, etc. The field of software evolution has an established tradition as regards formalising these aspects, but rarely as regards UI aspects. When the UI comes into play, software evolution should upgrade to software co-evolution, in which changes on both sides, the user interface and the software system, are formalised.

6. Considering adaptation as a multi-factorial problem. Since many contextual aspects could influence the quality of UI adaptation, multiple quality factors (e.g. adaptation QAs, adaptation automation degree, the user's characteristics) should be considered together in the same multi-factorial problem. Improving user performance could come at the expense of cognitive destabilisation. Another challenge is related to the analysis and resolution of conflicting UI adaptation alternatives. In this context, ML techniques could be used to support the decision making as regards selecting the UI adaptation that is closest to the end-user's intention.

The aforementioned suggestions represent opportunities for the modelling community to leverage UI adaptation of software systems by investigating some new avenues. A limitation of this approach is that the conceptual reference framework does not provide any prioritisation of its key features or of how they should be explored. In addition, it is impossible to consider them all together, although they are intertwined. We need, therefore, to investigate trade-offs and dependencies among the different properties and their levels to get a better understanding of the potential of the proposed framework. Software co-evolution, as suggested, is a form of keeping the human-in-the-loop paradigm in which the UI is adapted as much as possible as a collaboration between the end-user, the system, and any third party, especially when no consensus is reached between the end-user and the system.

Some properties of the reference framework present a level-wise assessment which represents particular capabilities of an approach to support intelligent UI adaptation, capabilities that increase with the level. More efforts are needed to validate more thoroughly the property levels for a wider set of existing adaptation approaches. In addition, instantiating the framework in specific model-based adaptation scenarios and building prototypes of the main components of the reference framework (e.g. adaptation engine, adaptation transitioner, adaptation machine learning, adaptation explainer) would allow us to gain further insights.

The authors of this expert voice trust that the proposed framework for model-based intelligent user interface adaptation will serve as a call for action that could lead to research initiatives by the modelling community.

Acknowledgements This work is supported by the Spanish Ministry of Science, Innovation, and Universities under Grant No. TIN2017-84550-R (Adapt@Cloud project) and by the Generalitat Valenciana under Grant No. AICO/2020/113 (UX-Adapt project). Arthur Sluÿters is funded by the "Fonds de la Recherche Scientifique - FNRS" under Grant n°40001931.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References

1. Abrahão, S., Bourdeleau, F., Cheng, B.H.C., Kokaly, S., Paige, R.F., Störrle, H., Whittle, J.: User experience for model-driven engineering: Challenges and future directions. In: Proceedings of the 20th ACM/IEEE International Conference on Model Driven Engineering Languages and Systems, MODELS 2017, Austin, TX, USA, September 17–22, 2017, pp. 229–236. IEEE Computer Society (2017). https://doi.org/10.1109/MODELS.2017.5
2. Akiki, P.A., Bandara, A.K., Yu, Y.: Adaptive model-driven user interface development systems. ACM Comput. Surv. 47(1), 9:1–9:33 (2014). https://doi.org/10.1145/2597999
3. Akiki, P.A., Bandara, A.K., Yu, Y.: Engineering adaptive model-driven user interfaces. IEEE Trans. Softw. Eng. 42(12), 1118–1147 (2016). https://doi.org/10.1109/TSE.2016.2553035
4. Alvarez-Cortes, V., Zarate, V.H., Ramirez Uresti, J.A., Zayas, B.E.: Current challenges and applications for adaptive user interfaces. In: I. Maurtua (ed.) Human-Computer Interaction, chap. 3, pp. 49–68. IntechOpen, London, UK (2009). https://doi.org/10.5772/7745. https://www.intechopen.com/books/human-computer-interaction/current-challenges-and-applications-for-adaptive-user-interfaces
5. Blouin, A., Morin, B., Beaudoux, O., Nain, G., Albers, P., Jézéquel, J.M.: Combining aspect-oriented modeling with property-based reasoning to improve user interface adaptation. In: Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems, EICS '11, pp. 85–94. Association for Computing Machinery, New York, NY, USA (2011). https://doi.org/10.1145/1996461.1996500
6. Bouillon, L., Limbourg, Q., Vanderdonckt, J., Michotte, B.: Reverse engineering of web pages based on derivations and transformations. In: Proceedings of Third Latin American Web Congress, LA-WEB '05, pp. 11. IEEE Computer Society Press,
Piscataway, USA (2005). https://doi.org/10.1109/LAWEB.2005.29
7. Bouzit, S., Calvary, G., Coutaz, J., Chêne, D., Petit, E., Vanderdonckt, J.: The PDA-LPA design space for user interface adaptation. In: Proceedings of the 11th IEEE International Conference on Research Challenges in Information Science, RCIS '17, pp. 353–364. IEEE Press, Hoboken, New Jersey, USA (2017). https://doi.org/10.1109/RCIS.2017.7956559
8. Browne, D., Totterdell, P., Norman, M. (eds.): Adaptive User Interfaces. Computers and People Series. Academic Press, London, UK (1990)
9. Calvary, G., Coutaz, J., Thevenin, D., Limbourg, Q., Bouillon, L., Vanderdonckt, J.: A unifying reference framework for multi-target user interfaces. Interact. Comput. 15(3), 289–308 (2003). https://doi.org/10.1016/S0953-5438(03)00010-9
10. Calvary, G., Coutaz, J., Thevenin, D., Limbourg, Q., Souchon, N., Bouillon, L., Florins, M., Vanderdonckt, J.: Plasticity of user interfaces: A revised reference framework. In: Proceedings of the First International Workshop on Task Models and Diagrams for User Interface Design, TAMODIA '02, pp. 127–134. INFOREC Publishing House Bucharest (2002). https://doi.org/10.5555/646617.697235
11. Coutaz, J.: Meta-user interfaces for ambient spaces. In: Coninx, K., Luyten, K., Schneider, K.A. (eds.) Task Models and Diagrams for Users Interface Design, pp. 1–15. Springer, Berlin (2007)
12. Dessart, C.E., Genaro Motti, V., Vanderdonckt, J.: Showing user interface adaptivity by animated transitions. In: Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems, EICS '11, pp. 95–104. ACM, New York, NY, USA (2011). https://doi.org/10.1145/1996461.1996501
13. Dieterich, H., Malinowski, U., Kuhme, T., Schneider-Hufschmidt, M.: State of the art in adaptive user interfaces. In: M. Schneider-Hufschmidt, T. Kuhme, U. Malinowski (eds.) Adaptive User Interfaces: Principles and Practice, chap. 10, pp. 13–48. Elsevier Science Publishers, Amsterdam (1994). https://www.elsevier.com/books/adaptive-user-interfaces/schneider-hufschmidt/978-0-444-81545-3
14. Furtado, E., Furtado, V., Silva, W.B., Rodrigues, D.W.T., da Silva Taddeo, L., Limbourg, Q., Vanderdonckt, J.: An ontology-based method for designing multiple user interfaces. In: Proceedings of International Workshop on Multiple User Interfaces, MUI '01 (2001). https://www.researchgate.net/publication/2567741_An_Ontology-Based_Method_for_Universal_Design_of_User_Interfaces
15. Gajos, K.Z., Chauncey, K.: The influence of personality traits and cognitive load on the use of adaptive user interfaces. In: Proceedings of the 22nd International Conference on Intelligent User Interfaces, IUI '17, pp. 301–306. ACM, New York, NY, USA (2017). https://doi.org/10.1145/3025171.3025192
16. García Frey, A., Calvary, G., Dupuy-Chessa, S., Mandran, N.: Model-based self-explanatory UIs for free, but are they valuable? In: P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, M. Winckler (eds.) Human-Computer Interaction – INTERACT 2013 – 14th IFIP TC 13 International Conference, Cape Town, South Africa, September 2–6, 2013, Proceedings, Part III, Lecture Notes in Computer Science, vol. 8119, pp. 144–161. Springer (2013). https://doi.org/10.1007/978-3-642-40477-1_9
17. Horvitz, E.: Principles of mixed-initiative user interfaces. In: Proceedings of the ACM International Conference on Human Factors in Computing Systems, CHI '99, pp. 159–166. ACM, New York, NY, USA (1999). https://doi.org/10.1145/302979.303030
18. Hui, B., Partridge, G., Boutilier, C.: A probabilistic mental model for estimating disruption. In: Proceedings of the 14th International Conference on Intelligent User Interfaces, IUI '09, pp. 287–296. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1502650.1502691
19. ISO: ISO/IEC 25010: Software Quality Product Standard. Standard, International Standard Organization, Geneva (2019). https://iso25000.com/index.php/en/iso-25000-standards/iso-25010?limit=3&limitstart=0
20. Kühme, T., Dieterich, H., Malinowski, U., Schneider-Hufschmidt, M.: Approaches to adaptivity in user interface technology: Survey and taxonomy. In: Proceedings of the IFIP TC2/WG2.7 Working Conference on Engineering for Human-Computer Interaction, pp. 225–252. North-Holland Publishing Co., Amsterdam, The Netherlands (1992). https://doi.org/10.5555/647103.717564. http://dl.acm.org/citation.cfm?id=647103.717564
21. Lavie, T., Meyer, J.: Benefits and costs of adaptive user interfaces. Int. J. Human Comput. Stud. 68(8), 508–524 (2010). https://doi.org/10.1016/j.ijhcs.2010.01.004. http://www.sciencedirect.com/science/article/pii/S1071581910000145
22. López-Jaquero, V., Simarro, F.M., González, P.: AB-HCI: an interface multi-agent system to support human-centred computing. IET Softw. 3(1), 14–25 (2009). https://doi.org/10.1049/iet-sen:20070108
23. López-Jaquero, V., Vanderdonckt, J., Simarro, F.M., González, P.: Towards an extended model of user interface adaptation: The ISATINE framework. In: J. Gulliksen, M.B. Harning, P.A. Palanque, G.C. van der Veer, J. Wesson (eds.) Proceedings of the Joint Working Conferences on Engineering Interactive Systems, EIS'07-EHCI'07-DSV-IS'07-HCSE'07, Salamanca, Spain, March 22–24, 2007, Lecture Notes in Computer Science, vol. 4940, pp. 374–392. Springer (2007). https://doi.org/10.1007/978-3-540-92698-6_23. https://link.springer.com/chapter/10.1007/978-3-540-92698-6_23
24. Martínez-Ruiz, F.J., Arteaga, J.M., Vanderdonckt, J., González-Calleros, J.M., González, R.M.: A first draft of a model-driven method for designing graphical user interfaces of rich internet applications. In: J.A. Sánchez (ed.) Fourth Latin American Web Congress (LA-Web 2006), 25–27 October 2006, Cholula, Puebla, Mexico, pp. 32–38. IEEE Computer Society (2006). https://doi.org/10.1109/LA-WEB.2006.1
25. McKinley, P.K., Sadjadi, S.M., Kasten, E.P., Cheng, B.H.C.: Composing adaptive software. Computer 37(7), 56–64 (2004). https://doi.org/10.1109/MC.2004.48
26. Melchior, J., Vanderdonckt, J., Roy, P.V.: A comparative evaluation of user preferences for extra-user interfaces. Int. J. Hum. Comput. Interact. 28(11), 760–767 (2012). https://doi.org/10.1080/10447318.2012.715544
27. Mezhoudi, N., Vanderdonckt, J.: Toward a task-driven intelligent GUI adaptation by mixed-initiative. Int. J. Hum. Comput. Interact. (2020). https://doi.org/10.1080/10447318.2020.1824742
28. Motti, V.G., Vanderdonckt, J.: A computational framework for context-aware adaptation of user interfaces. In: Proceedings of the 7th IEEE International Conference on Research Challenges in Information Science, RCIS '13, pp. 1–12 (2013). https://doi.org/10.1109/RCIS.2013.6577709
29. Nichols, J.: Using the crowd to understand and adapt user interfaces. In: Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, EICS '13, pp. 1–2. ACM, New York, NY, USA (2013). https://doi.org/10.1145/2494603.2480344
30. Nierstrasz, O., Meijler, T.D.: Research directions in software composition. ACM Comput. Surv. 27(2), 262–264 (1995). https://doi.org/10.1145/210376.210389
31. Nivethika, M., Vithiya, I., Anntharshika, S., Deegalla, S.: Personalized and adaptive user interface framework for mobile application. In: Proceedings of International Conference on Advances in Computing, Communications and Informatics, ICACCI '13, pp. 1913–1918. IEEE Press, Piscataway, USA (2013). https://doi.org/10.1109/ICACCI.2013.6637474
32. Paramythis, A., Weibelzahl, S., Masthoff, J.: Layered evaluation of interactive adaptive systems: framework and formative methods. User Model. User Adapt. Interact. 20(5), 383–453 (2010). https://doi.org/10.1007/s11257-010-9082-4
33. Parasuraman, R., Riley, V.: Humans and automation: use, misuse, disuse, abuse. Hum. Fact. 39(2), 230–253 (1997). https://doi.org/10.1518/001872097778543886
34. Schlee, M., Vanderdonckt, J.: Generative programming of graphical user interfaces. In: Proceedings of the Working Conference on Advanced Visual Interfaces, AVI '04, pp. 403–406. Association for Computing Machinery, New York, NY, USA (2004). https://doi.org/10.1145/989863.989936
35. Sluÿters, A., Vanderdonckt, J., Vatavu, R.D.: Engineering slidable graphical user interfaces with Slime. Proc. ACM Hum. Comput. Interact. (2021). https://doi.org/10.1145/3457147
36. Sottet, J.S., Calvary, G., Coutaz, J., Favre, J.M.: A model-driven engineering approach for the usability of plastic user interfaces. In: Gulliksen, J., Harning, M.B., Palanque, P., van der Veer, G.C., Wesson, J. (eds.) Engineering Interactive Systems, pp. 140–157. Springer, Berlin (2008)
37. Teevan, J., Dumais, S.T., Liebling, D.J., Hughes, R.L.: Changing how people view changes on the web. In: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST '09, pp. 237–246. Association for Computing Machinery, New York, NY, USA (2009). https://doi.org/10.1145/1622176.1622221
38. Todi, K., Bailly, G., Leiva, L., Oulasvirta, A.: Adapting user interfaces with model-based reinforcement learning. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, CHI '21. Association for Computing Machinery, New York, NY, USA (2021). https://doi.org/10.1145/3411764.3445497
39. Vanderdonckt, J., González-Calleros, J.M.: Task-driven plasticity: One step forward with UbiDraw. In: P. Forbrig, F. Paternò (eds.) Engineering Interactive Systems, Proceedings of Second Conference on Human-Centered Software Engineering, HCSE 2008, and 7th International Workshop on Task Models and Diagrams, TAMODIA 2008, Pisa, Italy, September 25–26, 2008, Lecture Notes in Computer Science, vol. 5247, pp. 181–196. Springer (2008). https://doi.org/10.1007/978-3-540-85992-5_16
40. Vatavu, R.: Nomadic gestures: A technique for reusing gesture commands for frequent ambient interactions. J. Ambient Intell. Smart Environ. 4(2), 79–93 (2012). https://doi.org/10.3233/AIS-2012-0137
41. van Velsen, L., van der Geest, T., Klaassen, R., Steehouder, M.F.: User-centered evaluation of adaptive and adaptable systems: a literature review. Knowl. Eng. Rev. 23(3), 261–281 (2008). https://doi.org/10.1017/S0269888908001379. https://www.cambridge.org/core/journals/knowledge-engineering-review/article/abs/usercentered-evaluation-of-adaptive-and-adaptable-systems-a-literature-review/C77A0D4AE8BAF5808E55214884245965
42. Yigitbas, E., Jovanovikj, I., Biermeier, K., Sauer, S., Engels, G.: Integrated model-driven development of self-adaptive user interfaces. Softw. Syst. Model. 19(5), 1057–1081 (2020). https://doi.org/10.1007/s10270-020-00777-7
43. Yigitbas, E., Sauer, S.: Engineering context-adaptive UIs for task-continuous cross-channel applications. In: Human-Centered and Error-Resilient Systems Development — IFIP WG 13.2/13.5 Joint Working Conference HCSE 2016 and HESSD 2016, Stockholm, Sweden, August 29–31, 2016, Proceedings, pp. 281–300. Springer (2016). https://doi.org/10.1007/978-3-319-44902-9_18

Silvia Abrahão is an Associate Professor at Universitat Politècnica de València, Spain. Her research interests include quality assurance in model-driven engineering, empirical assessment of software modeling approaches, model-driven cloud services development and monitoring, and the integration of usability into software development. Contact her at [email protected]

Emilio Insfran is an Associate Professor at Universitat Politècnica de València, Spain. His research interests include requirements engineering, model-driven engineering, DevOps, and cloud services development and evaluation. Contact him at [email protected]

Arthur Sluÿters is a PhD student in Computer Science at Université catholique de Louvain, Belgium, where he is an "aspirant FNRS" under contract no. 1.A434.21. His research interests include intelligent user interfaces (IUI), gesture recognition, gestural user interfaces, and radar-based interaction. Contact him at [email protected]

Jean Vanderdonckt is a Full Professor at Université catholique de Louvain, Belgium, where he leads the Louvain Interaction Lab. His research interests include engineering of interactive systems (EICS), intelligent user interfaces (IUI), multimodal systems such as gesture-based, information systems, and model-based/driven engineering of user interfaces. Contact him at [email protected]