Top Trends in Data and Analytics 2021
By Analysts Rita Sallam, Donald Feinberg, Pieter den Hamer, Shubhangi Vashisth, Farhan Choudhary,
Jim Hare, Lydia Clougherty Jones, Julian Sun, Yefim Natis, Carlie Idoine, Joseph Antelmi, Mark Beyer,
Ehtisham Zaidi, Henry Cook, Jacob Orup Lund, Erick Brethenoux, Svetlana Sicular, Sumit Agarwal,
Melissa Davis, Alan D. Duncan, Afraz Jaffri, Ankush Jain, Soyeb Barot, Saul Judah, Anthony Mullen,
James Richardson, Kurt Schlegel, Austin Kronz, Ted Friedman, W. Roy Schulte, Paul DeBeasi, Robert
Thanaraj
Initiatives: Data and Analytics Strategies; Analytics, BI and Data Science Solutions; Artificial
Intelligence; Data Management Solutions
The D&A trends covered in this research can help organizations respond to change,
uncertainty and the opportunities they bring over the next three years. D&A leaders
must examine how to turn these trends into mission-critical investments that
accelerate their capabilities to anticipate, shift and respond.
Overview
Opportunities
■ The COVID-19 pandemic, while devastating, has proven that society and even the most conservative
organizations can make dramatic changes, previously thought impossible, to survive and thrive. The
top data and analytics (D&A) trends highlighted in this report will accelerate and scale D&A-based
transformations to help organizations and society build their best future.
■ Digital and AI acceleration have been byproducts of disruption. However, the difficulty of moving AI
from pilot to production at scale impedes the business impact of AI. New AI tools and techniques can
enable complex and data-scarce use cases, while investments in XOps can operationalize them using
DevOps best practices.
■ Change acceleration has become necessary for survival. The ability to rapidly design composable
D&A and transparent decision flows using a common data fabric represents a critical competency for
a disruption-ready and resilient organization.
■ Distributed everything — data, people and devices — is accelerating. Using graph techniques to
uncover connections in combinations of diverse data at scale enriches data management, analytics,
AI and machine learning (ML), and enables innovation. Leveraging distributed D&A that resides in
edge computing environments, while giving every distributed user dynamic insights, represents new
opportunities for competitive differentiation and operationalizing business value.
■ Establishing D&A as a core business function with critical capabilities in business-domain-led D&A,
data literacy, data monetization, smarter data sharing and adaptive governance accelerates the
achievement of intended value from D&A investments.
Recommendations
D&A and other business leaders responsible for D&A strategies should:
■ Accelerate change by extending AI competencies to support complex and data-scarce use cases,
including those that use small and wide data, and by designing composable D&A and transparent
decision flows that use a common data fabric.
■ Operationalize the business value of data and analytics by unifying DevOps practices (XOps) across
data, ML, models and platforms, and by building D&A into a core business function.
■ Harness “distributed everything” to transform, optimize and scale analytics-centric processes and
business models by augmenting consumers with dynamic data stories, leveraging D&A at the edge,
and using graph analytics as a foundation for modern D&A.
■ Encourage innovation by putting in place success metrics and incentives that emphasize learning and
reward innovation.
■ Proactively monitor, experiment with or exploit key trends to respond to disruption and uncertainty,
innovate and rebuild. Don’t just react to trends as they mature.
D&A has never been more critical. It is harder to make sense of a world with ever more interdependencies
and unintended consequences, where a small change in one place can produce a large change elsewhere
(the butterfly effect). This demands better D&A: humans struggle to make sense of it all and need
assistance to make decisions that take ever more factors, stakeholders and data
sources into account. In addition, in the face of more competition, digitization and more emancipated
consumers, decisions must be made more quickly and more accurately, and must be personalized, again
requiring D&A to play a pivotal role.
The top D&A trends covered in this research (and summarized in Figure 1 with links to individual trends
in Table 1) represent business, market and technology dynamics that you cannot afford to ignore. They
have the potential to transform your enterprise, and will accelerate in their adoption over the next one to
three years. The accelerated speed at which disruption is occurring requires D&A leaders to have
structured and proactive mechanisms in place to identify technology trends and prioritize those with the
biggest potential impact on their competitive advantage (maybe even their survival in these uncertain
times). The trends are listed in no particular order of importance. Your ranking of them and whether you proactively
monitor them, conduct a proof of concept or deploy the capabilities represented by the trends will
depend on your mission-critical priorities, both urgent and longer-term, and how these trends can enable
them.
The global pandemic has been a major disruptive force of change. However, there are additional
dynamics driving our top D&A trends that predated the pandemic, but were accelerated by it. These
include the rate of AI, digital and overall innovation, cloud, and the convergence of D&A capabilities.
These dynamics underline the mission-critical role of D&A and the need to further improve its
effectiveness for all organizations and society at large. Implementing success metrics and incentives
that put an emphasis on learning and reward innovation when experimenting will further contribute to
success.
While the trends are divided into three categories, they may straddle two or all three categories. Given
this urgency, D&A leaders must address:
■ Accelerating change in D&A, leveraging innovations in AI, improved composability and more agile and
efficient integration of more diverse data sources.
■ Operationalizing business value through more effective XOps, better decision making and making
D&A an integral part of business.
■ Distributed everything, requiring the flexible connection of data and insights to empower an ever
wider audience of people and objects.
This report covers trends that are expected to enter the mainstream over the next three years.
For a current or immediate budget cycle view of trends and actionable advice, see Leadership Vision for
2021: Data and Analytics Leader.
Accelerating Change
Trend 1: Smarter, Responsible, Scalable AI
SPA: By 2023, 60% of organizations with more than 20 data scientists will require a professional code of
conduct incorporating ethical use of data and AI.
Description:
Smarter, more responsible and scalable AI disruptions are needed to support more complex and
data-scarce use cases, while protecting privacy and embedding AI models more effectively. Scaling and
operationalizing machine learning (ML) and AI for business impact will require extending the AI toolbox
with new techniques, including synthetic data, active learning and adaptive learning. New approaches to
building interpretable systems and operationalizing AI models into production are also an imperative.
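To make the synthetic-data technique named above concrete, here is a minimal sketch that fits a multivariate Gaussian to a handful of real records and samples new ones. It is illustrative only: the column names are hypothetical, and production-grade synthesizers use far richer models (copulas, GANs, variational autoencoders).

```python
import numpy as np

def synthesize(real: np.ndarray, n_samples: int, seed: int = 0) -> np.ndarray:
    """Fit a multivariate Gaussian to real records and sample synthetic ones.

    Deliberately simple: it preserves means and covariances but not
    higher-order structure, which richer synthesizers aim to capture.
    """
    rng = np.random.default_rng(seed)
    mean = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Hypothetical example: expand 50 real customer rows into 5,000 synthetic ones.
rng = np.random.default_rng(1)
real_data = np.column_stack([
    rng.normal(40, 10, 50),          # age
    rng.normal(60_000, 15_000, 50),  # income
    rng.normal(5, 2, 50),            # tenure in years
])
synthetic = synthesize(real_data, n_samples=5_000)
print(synthetic.shape)  # (5000, 3)
```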
Why Trending:
■ AI benefits are currently hard to take advantage of when data availability is limited, particularly in a
post-pandemic world where many models requiring large amounts of historical data may no longer be
relevant. Current data-intensive approaches are also compute- and energy-intensive and inflexible.
■ Bias and discrimination in decision making are amplified when automated with AI. Gartner inquiries
suggest that as AI becomes more central to all aspects of society and business, there is a growing
urgency around its ethical use and privacy.
■ Many organizations struggle to scale their AI prototypes and pilots to full production and wider usage,
and often underestimate the challenge of deploying and integrating AI with other systems. According
to the 2020 Gartner AI in Organizations Survey, only 53% of prototypes are eventually deployed. Once
in production, AI models require frequent monitoring to maintain accuracy and actual value creation.
Implications:
■ The current trends of smarter, more responsible and more scalable AI enable better learning
algorithms, interpretable systems and shorter time to value.
■ Better learning algorithms enable AI solution development with less data or “small data.” These
include composite AI, small data techniques and adaptive, federated, generative adversarial and self-
supervised machine learning.
■ Interpretable systems support the validation of AI decision automation or augmentation to build trust
and provide lineage and reproducibility, for example in augmented diagnosis in healthcare. They also
support the ethical use of AI while minimizing bias, protecting privacy and complying with regulations
and corporate policies, for example in loan application processing.
■ Shorter time to value helps to apply AI more broadly and quickly to support business innovation and
agility, as well as to make AI more accessible to a wider audience without technical assistance.
Actions:
■ Invest in techniques such as composite AI, which is the synergy of data-driven techniques such as
deep learning with expertise-driven techniques such as knowledge graphs and rules, agent-based
systems and other simulation techniques. Use adaptive, federated/collaborative and transfer learning,
allowing model improvement after initial deployment, enabling AI to become more adaptive and
robust. Overcome a lack of representative or labeled data by reinforcement, active, self-supervised or
zero/one-shot learning, or by generating synthetic data.
■ Build interpretable systems through explainable ML models and explicit decision models, which help
to build trust in AI. They also help to improve the accuracy, risk management and governance of
decision automation and the augmentation of human and artificial intelligence. Reduce bias by
improving the representativeness and diversity in data and the protection of data against poisoning.
Also put in place ethics guidelines and privacy by design, or techniques such as data masking and
differential privacy.
■ Leverage advances in the automation and augmentation of development and deployment activities,
as well as in model operations. Augmentation and automation improve productivity of AI experts and
make AI more accessible to a less technical audience. Explore new infrastructure, such as
neuromorphic hardware, to accelerate AI computations and workloads. Implement advanced model
operations, enabling easier deployment, lineage and more automated detection and replacement of
models with degrading performance.
Further Reading:
Key Actions to Prevent Machine Learning Failure Due to COVID-19-Related Data Drift
Trend 2: Composable Data and Analytics
Analysis by: Julian Sun, Yefim Natis, Carlie Idoine, Rita Sallam, Joseph Antelmi
SPA: By 2023, 60% of organizations will compose components from three or more analytics solutions to
build decision-oriented applications infused with analytics that connect insights to actions.
Description:
Composable D&A utilizes container- or microservices-based architecture and data fabric to assemble
flexible, modular, reusable and consumer-friendly D&A capabilities from existing assets. This transforms
monolithic data management and analytics applications into assemblies of data, analytics or other
application building blocks (known as packaged business capabilities or PBCs) through the use of
composition technologies such as low- and no-code capabilities, supporting adaptive and intelligent
decision making.
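As a loose illustration of this composition idea, the sketch below treats packaged business capabilities as small, independently built callables and assembles them into a decision-oriented flow. Every name in it is hypothetical; real platforms do this through APIs, containers and low-code designers rather than plain functions.

```python
from typing import Any, Callable, Dict, List

# Hypothetical packaged business capabilities (PBCs): each is a small,
# self-contained step that reads and enriches a shared record.
def ingest(record: Dict[str, Any]) -> Dict[str, Any]:
    record["rows"] = [3, 1, 4, 1, 5]  # stand-in for a data-fabric fetch
    return record

def score(record: Dict[str, Any]) -> Dict[str, Any]:
    record["score"] = sum(record["rows"]) / len(record["rows"])
    return record

def decide(record: Dict[str, Any]) -> Dict[str, Any]:
    record["action"] = "investigate" if record["score"] > 2.5 else "ignore"
    return record

def compose(steps: List[Callable]) -> Callable:
    """Assemble independently built PBCs into one decision-oriented flow."""
    def pipeline(record: Dict[str, Any]) -> Dict[str, Any]:
        for step in steps:
            record = step(record)
        return record
    return pipeline

app = compose([ingest, score, decide])  # low-code tools do this visually
print(app({}))  # {'rows': [3, 1, 4, 1, 5], 'score': 2.8, 'action': 'investigate'}
```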
Why Trending:
■ Gartner inquiries suggest that most large organizations have more than one “enterprise standard”
analytics and BI tool. Composing new applications from the PBCs of each (in combination with
others) will promote productivity and agility.
■ Container- or microservices-based analytics and business intelligence (ABI) and data science and
machine learning (DS/ML) platforms with improved APIs allow for the rapid and flexible assembly of
analytics applications.
■ Organizations often struggle to operationalize AI. Organizations can use composition to extend
established BI production systems to new AI capabilities.
■ Cloud-based marketplaces are becoming an effective channel for organizations to distribute and
share modular analytics capabilities.
Implications:
■ Alignment of data, analytics and application development and collaboration between D&A and
application teams: Organizations can use extension APIs from the D&A world to engage with the
application world, while application developers can contribute integration kits and new composed
PBCs to the D&A marketplaces.
■ Agile citizen development: Embedded analytics focused on dashboards and reporting are usually
implemented by IT. Less skilled business power users can use the low- or no-code capabilities to
compose more advanced analytics capabilities and workflows.
Actions:
■ Improve decision making by incorporating and assembling modular, reusable D&A capabilities with a
common data fabric.
■ Pilot composable analytics in the cloud, establishing an analytics marketplace to drive and support
collaboration and sharing.
Further Reading:
Trend 3: Data Fabric Is the Foundation
Analysis by: Mark Beyer, Ehtisham Zaidi, Henry Cook and Jacob Orup Lund
SPA: By 2023, artificial intelligence in the data fabric will be capable of reducing ongoing operational
costs for data quality and data mastering by up to 65%.
Description:
A data fabric utilizes continuous analytics over existing, discoverable and inferenced metadata assets to
support the design, deployment and utilization of integrated and reusable data objects regardless of
deployment platform and architectural approach. It can include automated orchestration for data
access, data integration, data quality, use of knowledge graphs, and even data utilization and usage
recommendations.
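A toy sketch of the metadata-driven idea: the catalog entries and thresholds below are invented, but they show how active metadata, rather than a fixed design, can drive the choice of data-delivery style.

```python
# A toy metadata catalog: in a real data fabric this is discovered and
# continuously updated; here it is hard-coded for illustration.
CATALOG = {
    "orders":    {"freshness_sla_s": 1,     "volume_gb": 200, "source": "kafka"},
    "customers": {"freshness_sla_s": 3600,  "volume_gb": 5,   "source": "crm_api"},
    "ledger":    {"freshness_sla_s": 86400, "volume_gb": 900, "source": "warehouse"},
}

def delivery_style(dataset: str) -> str:
    """Pick a data-delivery style from metadata rather than fixed design.

    The thresholds are invented; the point is that metadata drives the
    choice among streaming, virtualization and batch ETL dynamically.
    """
    meta = CATALOG[dataset]
    if meta["freshness_sla_s"] <= 60:
        return "stream"        # near-real-time replication
    if meta["volume_gb"] <= 10:
        return "virtualize"    # query in place, no copy
    return "batch_etl"         # scheduled bulk movement

for name in CATALOG:
    print(name, "->", delivery_style(name))
```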
Why Trending:
■ Siloed integration initiatives lead to poor customer experience — poor processes, integration errors,
missing or inaccurate data, or any combination of these.
■ Core data management functionalities currently appear in many different data management tools,
increasing cost, time to deployment and skills requirements, and causing delays in data utilization.
■ The data fabric is a single architecture that can address the levels of diversity, distribution, scale and
complexity in an organization’s data assets.
Implications:
■ Reduces time for integration design by 30%, deployment by 30% and maintenance by 70%, because
data fabric designs draw upon the ability to use, reuse and combine different data integration styles.
■ Automated data and metadata discovery, data quality and integration drive augmented data
management. Automating repetitive tasks that exist in most data quality, mastering and integration
solutions will lower the overall costs for these solutions (by 35-65% depending on the existing
approach).
■ Data fabrics leverage existing skills and technologies, such as existing data hubs, data lakes, data
warehouses, operational data stores, master data repositories and other traditional architecture and
design solutions, while introducing new approaches, tools and platforms.
Actions:
■ Ensure that your data fabric supports the combination of different data delivery styles dynamically
(through metadata-driven design) to support specific use cases.
■ Operationalize a data fabric by implementing continuous and evolving data engineering practices for
the data management ecosystem.
■ Build your data fabric by leveraging existing, well-understood and established integration technologies
and standards, but educate your team on new approaches and practices such as DataOps and data
engineering, including in edge environments.
Further Reading:
Trend 4: From Big to Small and Wide Data
Analysis by: Farhan Choudhary, Shubhangi Vashisth, Pieter den Hamer, Lydia Clougherty Jones and Jim
Hare
SPA: By 2025, 70% of organizations will be compelled to shift their focus from big to small and wide
data, providing more context for analytics and making AI less data-hungry.
Description:
As companies experience the limitations of big data as a critical enabler of analytics and AI, new
approaches known as “small data” and “wide data” are emerging. The wide data approach, applying X
analytics techniques, enables the analysis and synergy of a variety of small and large, unstructured and
structured data sources. The small data approach is about the application of analytical techniques that
require less data but still offer useful insights. This includes the tailored use of less data-hungry models,
rather than applying more data-hungry deep learning techniques in a one-size-fits-all approach.
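One small-data technique, sketched minimally below, is nearest-prototype classification: with a single labeled example per class, new observations are assigned to the closest reference. It is a stand-in for the few-/one-shot learning discussed here, and the sensor readings are hypothetical.

```python
import numpy as np

def nearest_prototype(prototypes: dict, x: np.ndarray) -> str:
    """Classify with one labeled example ("prototype") per class.

    A toy stand-in for few-/one-shot learning: instead of training a
    data-hungry model, compare new points to a handful of references.
    """
    return min(prototypes, key=lambda label: np.linalg.norm(prototypes[label] - x))

# Hypothetical sensor signatures: one reference reading per machine state.
prototypes = {
    "normal":  np.array([0.1, 0.0, 0.2]),
    "worn":    np.array([0.6, 0.4, 0.5]),
    "failing": np.array([1.2, 1.1, 0.9]),
}
print(nearest_prototype(prototypes, np.array([0.55, 0.38, 0.52])))  # -> "worn"
```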
Why Trending:
■ Decision making by humans and AI has become more complex and demanding, requiring a greater
variety of data for better situational awareness.
■ Small and large disruptions — such as the COVID-19 pandemic — cause historical data to become
obsolete more quickly. Techniques are needed to build analytics and AI with less data. In addition,
collecting sufficiently large volumes of historical or labeled data for analytics and AI is a challenge for
many organizations.
■ Data sourcing, data quality, bias and privacy protection are common challenges. However, the cost of
addressing these challenges in large datasets for conventional supervised ML can be prohibitive.
■ New analytics techniques are needed, capable of using available data more effectively, either by
reducing the required volume or by extracting more value from unstructured, diverse data sources.
Implications:
■ Small and wide data approaches enable more robust analytics and AI, reducing organizations’
dependency on big data and helping them attain a richer, more complete situational awareness or
360-degree view to support better decision making.
■ The wide data approach enables the analysis and synergy of a variety of small and large,
unstructured and structured data sources for more context and better situational awareness for both
human decision makers and AI applications.
■ X analytics derives insights from single or combined data sources in a variety of formats. These
include tabular, text, image, video, audio, voice, temperature or even smell and vibration. The data
itself comes from an increasing range of internal and external data sources such as data
marketplaces and brokers, social media, Internet of Things (IoT) sensors and digital twins.
■ The small data approach applies techniques such as certain time-series analysis techniques, few-shot
learning, synthetic data and self-supervised learning, as well as collaborative or federated, adaptive,
reinforcement and transfer learning.
■ Potential areas for innovation with small and wide data include, but are not limited to, demand
forecasting in retail, behavioral and emotional intelligence in (real-time) customer service applied to
hyperpersonalization and customer experience improvement. Other areas include physical security or
fraud detection and adaptive autonomous systems, such as robots, which constantly learn by the
analysis of correlations in time and space of events in different sensory channels.
Actions:
■ Lower the barrier to entry for advanced analytics and AI by considering small and wide data
approaches to mitigate the real or perceived lack of data.
■ Provide a richer context for more accurate business decision making by extending the AI toolbox with
small and wide data techniques, taking advantage of the growing availability of external data sources
through data sharing and marketplaces.
■ Improve the predictive power of data and the accuracy of models by considering alternative modeling
techniques, rather than overly relying on data-hungry deep learning approaches.
Further Reading:
Operationalizing Business Value
Trend 5: XOps
Analysis by: Erick Brethenoux, Ankush Jain, Afraz Jaffri, Soyeb Barot, Donald Feinberg
SPA: By 2025, 50% of enterprises will have devised artificial intelligence (AI) orchestration platforms to
operationalize AI, up from fewer than 10% in 2021.
Description:
The goal of XOps (DataOps, MLOps, ModelOps, PlatformOps, etc.) is to achieve efficiencies and
economies of scale using DevOps best practices and ensure reliability, reusability and repeatability while
reducing duplication of technology and processes and enabling automation.
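One recurring XOps task is watching for input drift once a model is live. The minimal sketch below uses a two-sample Kolmogorov-Smirnov test as the drift check; the threshold and data are hypothetical, and real MLOps/ModelOps platforms wrap scheduling, alerting and automated retraining around checks like this.

```python
import numpy as np
from scipy.stats import ks_2samp

def drift_alert(train_feature: np.ndarray,
                live_feature: np.ndarray,
                p_threshold: float = 0.01) -> bool:
    """Flag when a live feature's distribution drifts from training data.

    A two-sample Kolmogorov-Smirnov test is one common, simple choice.
    """
    stat, p_value = ks_2samp(train_feature, live_feature)
    return p_value < p_threshold

rng = np.random.default_rng(42)
training = rng.normal(0.0, 1.0, 10_000)
production = rng.normal(0.4, 1.0, 1_000)  # the world shifted after deployment
if drift_alert(training, production):
    print("drift detected: trigger retraining pipeline")
```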
Why Trending:
■ Analytics and AI solutions have struggled to keep pace with the growing complexity, implementation
diversity (hybrid, on-premises, multicloud) and the need for access and delivery of datasets, data
science, and ML artifacts and AI-based solutions.
■ Most analytics and AI projects fail because operationalization is only addressed as an afterthought.
The extra time taken to hand off models and analytics artifacts causes significant delays and lost
productivity in operationalization, as well as a lack of ownership and frustration among engineers.
■ Complexity around integrating the solution with existing enterprise applications and infrastructure is
the top barrier to scaling analytics and AI implementations and achieving business value.
■ The multiplication of Ops disciplines stemming out of DevOps best practices has caused significant
confusion in the marketplace. Yet, their reconciliation can bring significant advantages to
organizations that are able to harmonize those disciplines.
Implications:
■ Scaling prototypes that leverage analytics and AI assets can be achieved through the continuous and
sustainable operationalization of those assets. Operationalizing at scale will also address the
reproducibility, traceability, integrity and integrability of analytics and AI assets.
■ A unified XOps strategy provides the necessary transparency and visibility to track business value
throughout analytics and AI assets’ life cycles.
■ The flexible design and agile orchestration of governed decision-making systems that can adapt
continuously are the promises of a purposefully crafted XOps strategy. Additional benefits include the
ability to simulate entire decision processes to anticipate changes, reduce uncertainty, reduce
systems’ vulnerability to threats and attacks, and prepare for unexpected environmental changes.
■ Analytical modeling assets that are deployed across hybrid, edge and Internet of Things (IoT) contexts
require orchestration platforms that integrate capabilities and simplify access across contexts.
■ An integrated XOps strategy will make it possible to align and integrate the tasks needed to combine
different techniques (composite AI) to solve a problem within a single solution.
Actions:
■ Create an integrated XOps practice that blends disparate functions, teams and processes to support
data processing, model training, model management and model monitoring, allowing for continuous
delivery of AI-based systems. Establish clear lines of cooperation between DevOps practices and
analytics and AI labs or centers of excellence.
■ Establish a strong DevOps practice across the various stages — data pipeline, data science, ML, AI
and underlying infrastructure — to radically improve the delivery pipeline and operationalize your
analytics and AI architectures.
■ Build a new set of competencies above the traditional roles, including systems thinking, the emphasis
on feedback loops, and the promotion of experimentation and learning.
■ Nurture production AI mindsets even for AI pilots. This means utilizing standardized data and model
pipeline infrastructure stacks to maximize reusability, reproducibility, reliability and rate of success, all
the while minimizing AI time to production.
■ Enable a unified foundation for XOps, where DataOps, MLOps and ModelOps can be orchestrated.
Enable the customization of data and model pipelines across hybrid multicloud, edge and IoT contexts
by building talent around the operations of analytics and AI platforms.
Further Reading:
Trend 6: Engineering Decision Intelligence
SPA: By 2023, composable decisions that leverage a data fabric will reduce operational costs and
accelerate time to insight by 20%, while improving explainability.
Description:
Decision intelligence is a practical discipline framing a wide range of decision-making techniques. These
range from conventional analytics to AI and complex adaptive system applications, depending on the
decision context, to design, model, align, execute, monitor and tune decision models. Engineering
decision intelligence applies to not just individual decisions, but sequences of decisions, grouping them
into business processes and even networks of emergent decision making.
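To ground the idea of an explicit, engineered decision model, here is a minimal sketch that combines an ML score with named business rules and logs every decision for traceability. The rules, thresholds and field names are hypothetical.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Decision:
    applicant_id: str
    model_score: float  # e.g., probability of default from an ML model
    rule_fired: str
    outcome: str
    timestamp: float

def decide_loan(applicant_id: str, model_score: float, income: float) -> Decision:
    """Combine an ML score with explicit, inspectable business rules.

    Making each rule a named branch (rather than burying logic in the model)
    is what gives the decision traceability and explainability.
    """
    if income < 10_000:
        rule, outcome = "min_income_floor", "refer_to_human"
    elif model_score > 0.8:
        rule, outcome = "high_risk_cutoff", "decline"
    else:
        rule, outcome = "default_approve", "approve"
    decision = Decision(applicant_id, model_score, rule, outcome, time.time())
    print(json.dumps(asdict(decision)))  # stand-in for an audit log
    return decision

decide_loan("A-1001", model_score=0.35, income=52_000)  # -> approve
```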
Why Trending:
■ Decision making has become more complex as business and society have become more uncertain.
According to a Gartner Research Circle Survey, “65% of decisions made are more complex (involving
more stakeholders or choices) than they were two years ago.”
■ An organization’s business results, both positive and negative, are the sum total of the quality,
timeliness and accuracy of its decisions, as well as its ability to execute on those decisions.
■ There is more uncertainty about the direction of the market, whether decisions will be questioned, and
the durability of a decision.
■ To deal with unprecedented levels of business complexity and uncertainty, organizations must
improve their ability to accelerate accurate and highly contextualized decisions.
■ It is possible to design decision solutions and examine their consequences individually. However, it
becomes exponentially harder to determine and track the consequences at the aggregate scale,
particularly if they are networked and complex.
Implications:
■ Engineering decision intelligence supports improved decisions both individually and collectively, by
modeling and optimizing their interactions, measuring business impact, learning and adjusting.
■ As decisions become increasingly automated and augmented, engineering decisions for precision,
transparency, traceability, flexibility, reusability and explainability will improve trust and adoption.
■ Decisions have a human emotional component or impact with trade-offs. Changing how decisions are
made can have a significant impact on the role of decision makers. These shifts must be anticipated
and proactively managed.
Actions:
■ Create new decision-making habits by training decision makers to apply best practices such as critical
thinking, trade-off analysis, recognizing bias and listening to opposing views.
■ Increase data literacy skills throughout the organization as a step toward building a data-driven,
decision-centric organization. Establish D&A communities to encourage collaboration and
community-shared learning.
■ Consider creating a role for decision engineers by hiring or upskilling experts that are able to work with
decision makers to identify critical decisions that would benefit from the rigor of engineering decision
intelligence practices.
Further Reading:
Trend 7: D&A as a Core Business Function
Analysis by: Saul Judah, Melissa Davis, Alan D. Duncan, Lydia Clougherty Jones
SPA: By 2022, over 75% of centrally organized analytics programs will be replaced by a hybrid
organizational model that shares power with local domain data and analytics leaders.
Description:
D&A is shifting to become a core business function. Rather than it being a secondary activity done by IT
to support business outcomes, business leaders increasingly think in terms of D&A as one of their key
business capabilities to drive their business results. As organizations accelerate their digital business
transformation efforts, business-domain-led D&A, data literacy, data monetization, smarter data sharing
and adaptive governance increasingly play key business roles.
Why Trending:
■ Business domain leaders are taking control, ownership and responsibility of their D&A functions in
order to accelerate their digital business initiatives. However, they often underestimate the associated
complexity and risks, missing strategic business opportunities. Chief data officers (CDOs) who are
fully involved in setting business direction and goals, serving as executive decision makers, increase
the consistent production of business value by a factor of 2.6.
■ To be competitive using D&A, organizations need to leverage a broad range of data assets both
internal and external to the enterprise. They must share that data, build trust and adoption through
D&A governance and ensure impactful use of D&A by building data literacy skills. However, most D&A
strategies and the data, technology and organizational competencies required to enable them are not
aligned to business outcomes or mission-critical priorities.
■ To respond to disruptive change, organizations must be able to deliver innovation quickly and adapt
applications dynamically, reassembling capabilities from inside and outside the enterprise.
Implications:
■ With D&A as a core business function, it becomes a shared business asset aligned to business
results. D&A silos break down because of better collaboration between central and federated D&A
teams.
■ Operating D&A as a business function enables better business agility, responsiveness, resilience,
scalability and sustainability in support of a composable enterprise.
■ Data sharing collaboration as a core competency, even with competitors, could yield both
organizational competitive advantage and digital business acceleration.
■ Upgrading and coordinating the portfolio of D&A skills across business domains and central teams is
becoming a critical action for D&A leaders.
Actions:
■ Build balanced, collaborative organizational models for D&A that enable both enterprise and business
domain outcomes to be achieved through effective, trust-based governance.
■ Actively curate business area datasets that could be monetized or exchanged, building a business
function to maintain an inventory of possible information assets in an intelligent data catalog.
■ Give business leaders access to the right data at the right time to maximize business impact by
adopting a “must share data unless” approach to D&A. Enable this by recalibrating risk, establishing
trust-based mechanisms and engaging with augmented data ecosystems.
■ Identify quantifiable success criteria for key business outcomes and directly connect these with the
D&A assets that enable them.
Further Reading:
Roadmap for Data Literacy and Data-Driven Business Transformation: A Gartner Trend Insight Report
Data and Analytics Leaders Must Use Adaptive Governance to Succeed in Digital Business
Distributed Everything
Trend 8: Graph Relates Everything
Analysis by: Afraz Jaffri, Ankush Jain, Jim Hare, Pieter den Hamer
SPA: By 2025, graph technologies will be used in 80% of data and analytics innovations, up from 10% in
2021, facilitating rapid decision making across the enterprise.
Description:
Graph technologies encompass a wide variety of solutions that work with data represented as a set of
nodes and edges instead of tables, rows and columns. This representation makes it possible to find
relationships between people, places, things and events across diverse data. It intuitively models
relationships between entities and can capture business knowledge, making it easier to perform queries and answer
questions. In addition, modeling data as a graph opens up new analytical insights through the use of
graph algorithms.
Graphs are forming the foundation of many modern data and analytics capabilities. Increased
understanding and collaboration with business users, organizing and preparing data for downstream
processes, uncovering hidden insights, improving ML model creation and providing explainable AI are
just some of the uses driven by different graph technologies and techniques.
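A minimal sketch of the node-and-edge idea, using the open-source networkx library; the entities, relationship types and the fraud-flavored question are hypothetical.

```python
import networkx as nx

# Entities and their relationships as nodes and typed edges.
G = nx.Graph()
G.add_edge("Alice", "Acme Corp", relation="works_for")
G.add_edge("Bob", "Acme Corp", relation="works_for")
G.add_edge("Bob", "Account 42", relation="owns")
G.add_edge("Alice", "Account 42", relation="authorized_on")
G.add_edge("Account 42", "Offshore Ltd", relation="transfers_to")

# Relationship-first questions become one traversal, not a chain of SQL joins.
print(nx.shortest_path(G, "Alice", "Offshore Ltd"))
# ['Alice', 'Account 42', 'Offshore Ltd']

# Graph algorithms surface structurally important entities.
ranks = nx.pagerank(G)
print(max(ranks, key=ranks.get))  # most central node in this toy graph
```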
Why Trending:
■ Complex business problems require contextual awareness and understanding the variable nature of
connections and strengths across multiple entities, such as organizations, people or transactions.
Critical business questions that used to take months to resolve can now be answered in minutes.
■ Graphs form the foundation of modern D&A, with capabilities to enhance and improve user
collaboration, ML models and explainable AI. The recent Gartner AI in Organizations Survey
demonstrates that graph techniques are increasingly prevalent as AI maturity grows, going from 13%
adoption when AI maturity is lowest to 48% when maturity is highest.
■ No- and low-code tools that enable visual exploration and interaction with a graph are enabling
insights to be found without the need for graph query languages.
■ Improved, scalable and lower-cost processing options, including cloud-based services and dedicated
hardware, are making graph analytics and databases prime candidates for accelerated adoption.
■ Knowledge graphs can form a key component of data fabrics and give structure to images, audio,
video and natural languages. They do so by exposing metadata and business rules, enabling data
scientists to quickly identify and use the data they need while preserving context and representing all
forms of data in a standard queryable format.
Implications:
■ A change in thinking and development of a “graph mindset” are taking place as more organizations
identify use cases that graph techniques can solve. Up to 50% of Gartner inquiries on the topic of AI
involve discussion of the use of graph technology.
■ The number of products that incorporate graph technology will increase. Within these products, the
use of graphs may or may not be visible to end users, resulting in duplication and redundancy. There
needs to be an understanding of when an underlying graph model can be exposed, and how multiple
graphs can be combined.
■ AI solutions will evolve from being based on one type of model, or ensemble of models, to being made
of composite models, with graph techniques playing a prominent role. The use of graph techniques
will require a broader and deeper set of data science and AI skills with specialist roles, such as graph
engineer and ontology manager, appearing, as well as existing roles in data science teams becoming
proficient in graph techniques.
■ Graph technology underpins the creation of richer semantic models that can enhance augmented
analytics models, as well as the richness of conversational analytics. Organizations that use graphs
and semantic approaches for natural language technology projects will have less technical debt than
those that do not.
Actions:
■ Complement traditional analytics with graph technology when the primary business questions are
about the relationships between data rather than data values themselves.
■ Deduce actual relationships among data in multiple data stores, and identify enforced and implied
relationships in the data across organizational silos, by taking advantage of graph-enabled data and
metadata management capabilities.
■ Examine business processes that have a high potential for optimization through the use of graph
techniques and algorithms by creating a conceptual graph domain model for the process and testing
scenarios that graph algorithms could solve.
■ Identify potential use cases that can be simplified or accelerated with graphs for ML by evaluating
existing models that require intensive data preparation and feature selection workflows.
Further Reading:
Understanding When Graph Technologies are Best for Your Business Use Case
Case Study: Answering Critical Business Questions With Graph Analytics (Jaguar Land Rover)
Trend 9: The Rise of the Augmented Consumer
SPA: By 2025, augmented consumerization functionality will drive adoption of analytics and business
intelligence capabilities beyond 50% for the first time, influencing more business processes and
decisions.
Description:
The time users spend in predefined dashboards, which largely require manual exploration and some
degree of analyst skill, will be displaced by automated, conversational, mobile and dynamically
generated insights. These insights are customized to a user's context, delivered to their point of
consumption in dynamic, autogenerated and personalized data stories, and embedded in applications.
This will shift advanced analytical power to the information consumer (the augmented consumer),
giving them capabilities previously only available to analysts and citizen data scientists.
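As a toy version of a dynamically generated data story, the sketch below scans a user's metrics, picks the largest week-over-week change and renders it as a sentence. The metrics and phrasing are hypothetical; commercial ABI platforms add relevance ranking, driver analysis and natural language generation far beyond this.

```python
import pandas as pd

# Hypothetical weekly metrics for one user's business domain.
metrics = pd.DataFrame({
    "metric":    ["revenue", "churn_rate", "avg_order_value"],
    "last_week": [120_000,   0.031,        54.0],
    "this_week": [131_000,   0.047,        53.1],
})

# Surface the single most meaningful change for this user.
metrics["pct_change"] = (metrics["this_week"] - metrics["last_week"]) / metrics["last_week"]
top = metrics.loc[metrics["pct_change"].abs().idxmax()]

direction = "rose" if top["pct_change"] > 0 else "fell"
print(f"Heads up: {top['metric']} {direction} {abs(top['pct_change']):.0%} "
      f"week over week - tap to see likely drivers.")
# e.g. "Heads up: churn_rate rose 52% week over week - tap to see likely drivers."
```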
Why Trending:
■ Predefined dashboards often overwhelm users with data, and users need the skills and time to
manually explore what happened, why it happened and what to do about it. This can lead to incorrect
conclusions and flawed decisions and actions.
Implications:
■ Surfacing the most important insights for specific business and operational users at their point of
consumption and action, without the need for an analyst intermediary, will further expand adoption
and improve the impact of analytics on the organization.
■ New user experiences with ABI platforms range from extending existing predefined dashboards and
manual point-and-click exploration with augmented and conversational capabilities, to autogenerated
domain- and industry-specific insights and conversational capabilities. These capabilities are
embedded in enterprise applications and extensions of NLP-based interfaces. There are also
completely new dynamic user experiences, where the dashboard plays a minimal role. Dynamic data
stories highlight for each user the most meaningful changes in the business for them.
■ Technological immaturity, organizational immaturity, low levels of data literacy, resistance to change,
lack of trust in perceived “black box” approaches and concerns about the availability of data will
inhibit adoption and must be proactively managed.
Actions:
■ Evaluate your existing ABI tools, as well as innovative startups offering new augmented and NLP-
driven user experiences.
■ Prioritize consumer capabilities, including: embedded analytics to push personalized content directly
to the consumer’s natural workflow; natural language query to make it easy to find the right
information; analytics catalog to describe analytics content available and show the usage of that
content; automated insights to use machines to find insights; and dynamic storytelling to displace use
of predefined dashboards and manual exploration.
■ Accelerate adoption of analytics and increase their impact by teaching users how to use the newly
found analytical power to impact their specific business problems, opportunities and processes, as
well as by expanding the organization’s data literacy.
■ Build trust in autogenerated insights and models by back testing and offering users scenario planning
and “what-if analysis,” and by prioritizing explainability features in selected platforms.
Further Reading:
How Augmented Analytics Will Transform Your Organization: A Gartner Trend Insight Report
Trend 10: D&A at the Edge
Analysis by: Ted Friedman, W. Roy Schulte, Pieter den Hamer, Paul DeBeasi
SPA: By 2023, over 50% of the primary responsibility of data and analytics leaders will comprise data
created, managed and analyzed in edge environments.
Description:
Increasingly, data, analytics and the technologies supporting them reside outside traditional data centers
and cloud environments. This is where edge comes in — computing environments residing closer to
assets in the physical world and outside IT’s typical purview. As this trend continues, there is both a
requirement and a huge opportunity for organizations to enable greater flexibility in how and where data
management and analytics are carried out. These changes will significantly impact D&A leaders and
their teams, requiring new capabilities and skills while also opening up new opportunities to deliver
value.
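A minimal sketch of edge-resident analytics: readings are summarized where they originate, and only the summary (plus an alert on a threshold breach) is forwarded upstream. The gateway scenario, metric names and threshold are hypothetical.

```python
from statistics import mean

def edge_filter(readings, limit: float = 80.0) -> dict:
    """Run analytics where the data originates; ship only what matters.

    Instead of streaming every raw reading to a central cloud, the edge node
    computes a local summary and forwards an alert only on threshold breach,
    cutting latency and bandwidth and tolerating lost connectivity.
    """
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "peak": max(readings),
    }
    if summary["peak"] > limit:
        summary["alert"] = "over_limit"  # the only payload sent upstream
    return summary

# Hypothetical temperature samples captured on a factory-floor gateway.
print(edge_filter([71.2, 73.5, 70.9, 84.1, 72.0]))
# {'count': 5, 'mean': 74.34, 'peak': 84.1, 'alert': 'over_limit'}
```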
Why Trending:
■ By distributing D&A capabilities to edge environments, data-centric solutions can enable more real-
time value. For scenarios that require extremely low latency, the ability to capture and analyze data
close to the place and time of origin reduces latency issues.
■ More D&A solutions, such as those supporting IoT use cases, need to operate in disconnected (or
intermittently connected) scenarios. By bringing more powerful D&A capabilities to edge
environments, these solutions need not rely on centralized data centers or cloud resources.
■ By provisioning advanced analytics and AI capabilities to edge environments, the assets driven by
those environments can behave in an autonomous manner, with no support from external data
sources or processing capabilities. As demand grows for “smarter” physical assets in many industries,
supporting autonomous behavior will be a common requirement.
■ Governance issues related to sensitive or regulated data can constrain D&A teams from adopting
centralized or cloud-based environments — moving data outside its originating geography can violate
sovereignty regulations. By enabling data to be managed and processed in edge environments, or by
applying federated ML (sketched after this list), the enterprise can remain in compliance.
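The federated ML just mentioned can be made concrete with a minimal federated-averaging sketch: each site fits a shared linear model on its private data, and only the model weights travel. Everything here (three sites, the toy data, the learning rate) is hypothetical; production FedAvg implementations add secure aggregation, weighting by sample count and much more.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One site's gradient steps on private, locally held data (linear model)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(global_w, sites):
    """FedAvg: only model weights leave each site; raw data never moves."""
    updates = [local_update(global_w, X, y) for X, y in sites]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(3):  # three edge sites, each with its own private data
    X = rng.normal(size=(40, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=40)
    sites.append((X, y))

w = np.zeros(2)
for _ in range(20):  # communication rounds
    w = federated_average(w, sites)
print(w)  # approaches [2, -1] without pooling any raw data
```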
Implications:
■ Speed and agility: By placing data, analytic workloads and AI capabilities at optimal points ranging all
the way out to endpoint devices, D&A teams can enable more real-time use cases. In addition, the
flexibility to move D&A workloads up and down the continuum from centralized data centers or the
cloud to edge devices will enable greater optimization of resources.
■ Scale and reach: By using distributed computing resources and spreading the load across the
ecosystem, D&A teams can more broadly scale their capabilities and extend their impact into more
areas of the business. This includes use cases and outcomes traditionally managed only via
operational technology teams, such as those managing equipment in industrial settings. Dedicated
hardware for edge processing of data will continue to amplify these benefits.
■ Resiliency: Pushing D&A capabilities toward edge environments can also bring benefits in the form of
greater fault tolerance and autonomous behavior. If edge environments do not require centralized
resources, then issues with connectivity to or unplanned downtime of those centralized resources
don’t disrupt processes that rely on local edge capabilities.
■ Governance: With the distribution and complexity of edge environments comes a great challenge from
a D&A governance perspective. It will be critical for D&A teams to extend the reach of their governance
practices to include edge-resident D&A workloads in scope.
Actions:
■ Identify use cases where AI capabilities in edge environments can enable differentiated products and
services by collaborating with engineers, operations managers and satellite office managers working
in edge locations.
■ Plan to augment existing data management and analytics infrastructure to support edge deployment
by partnering with product teams that are implementing IoT platforms and similar distributed
computing architectures.
■ Identify opportunities for leverage and shared outcomes by establishing communication and
collaboration with operational technology (OT) teams.
■ Place a greater emphasis on end-to-end system design. Understanding the dependencies between all
components of distributed data pipelines, analytic workloads and AI models will be crucial to success.
■ Ensure safety and control by extending existing D&A governance capabilities to apply to edge
environments where appropriate.
Further Reading:
Trend 3: Data Fabric Is the Foundation — The data fabric has been an evolving top 10 D&A trend for the
past three years. This year, we focus on its market evolution and its foundational role in the D&A
architecture and intelligent composable business.
Trend 4: From Big to Small and Wide Data — This trend is related to a number of trends from the Top 10
Trends in Data and Analytics, 2020. This includes X analytics, data exchanges and sharing, and smarter,
faster, more responsible AI. It is also related to graph analytics as a foundational technology for related
diverse data in knowledge graphs and for finding patterns across diverse data using graph analytics
technologies.
Trend 6: Engineering Decision Intelligence — The Top 10 Trends in Data and Analytics, 2020 highlighted
the importance of decision intelligence. Engineering decision intelligence focuses on the need to use
composability and design approaches for decision intelligence systems as core components of an
organizational decision competency.
Trend 8: Graph Relates Everything — Graph analytics has been an emerging top D&A trend for the past
three years. We continue to evolve our analysis and recommendations as potential high-impact use
cases, vendor solutions, customer deployments and processing capabilities expand. Graphs relate to
and enable many of our current and past trends, including data fabric, natural language processing, the
rise of the augmented consumer, X analytics, and smarter, more responsible and more scalable AI.
Trend 9: The Rise of the Augmented Consumer — We identified the decline of the dashboard as a top
data and analytics trend in Top 10 Trends in Data and Analytics, 2020. The rise of the augmented
consumer is an evolution of this important market trend, which is growing in importance, to ensure the
realization of business value from data and analytics as business complexity and uncertainty increase.
Evidence
Composable Data and Analytics:
Gartner’s 2020 Magic Quadrant customer reference survey indicates that many large organizations have
more than one “enterprise standard” analytics and BI tool, with 41% stating they have multiple standards
in place.
Data Fabric:
Interactive briefings in which vendors provided Gartner with updates on their strategy, market positioning,
recent key developments and product roadmaps.
Feedback about tools and vendors captured during conversations with users of Gartner’s client inquiry
service.
Client inquiries to Gartner's data and analytics team — more than 500 aggregated conversations with
users of Gartner's client inquiry service from January 2018 through September 2019.
Gartner primary and secondary research into decision intelligence, composable applications and
composable data and analytics and AI.
Survey Analysis: Fifth Annual CDO Survey — Growth Must Continue in Order to Achieve Real Impact
© 2021 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. and its
affiliates. This publication may not be reproduced or distributed in any form without Gartner's prior written permission.
It consists of the opinions of Gartner's research organization, which should not be construed as statements of fact.
While the information contained in this publication has been obtained from sources believed to be reliable, Gartner
disclaims all warranties as to the accuracy, completeness or adequacy of such information. Although Gartner research
may address legal and financial issues, Gartner does not provide legal or investment advice and its research should not
be construed or used as such. Your access and use of this publication are governed by Gartner’s Usage Policy. Gartner
prides itself on its reputation for independence and objectivity. Its research is produced independently by its research
organization without input or influence from any third party. For further information, see "Guiding Principles on
Independence and Objectivity."