Tech Trends Deloitte 2025 (0703)

The document discusses the evolution of AI, highlighting a shift from large language models (LLMs) to smaller, specialized models and agentic AI that can perform discrete tasks. Organizations are increasingly investing in data management and exploring diverse AI applications, with a focus on efficiency and tailored solutions. The future of AI is expected to emphasize execution over augmentation, enabling more autonomous digital agents to enhance productivity across various sectors.

INFORMATION

What’s next for AI?


While large language models continue to advance, new models and agents are
proving to be more effective at discrete tasks. AI needs different horses for
different courses.
Kelly Raskovich, Bill Briggs, Mike Bechtel, and Abhijith Ravinutala

Blink and you’ll miss it: The speed of artificial intelligence’s advancement is outpacing expectations. Last year, as organizations scrambled to understand how to adopt generative AI, we cautioned Tech Trends 2024 readers to lead with need as they differentiate themselves from competitors and adopt a strategic approach to scaling their use of large language models (LLMs). Today, LLMs have taken root, with up to 70% of organizations, by some estimates, actively exploring or implementing LLM use cases.¹

But leading organizations are already considering AI’s next chapter. Instead of relying on foundation models built by large players in AI, which may be more powerful and built on more data than needed, enterprises are now thinking about implementing multiple, smaller models that can be more efficient for business requirements.² LLMs will continue to advance and be the best option for certain use cases, like general-purpose chatbots or simulations for scientific research, but the chatbot that peruses your financial data to think through missed revenue opportunities doesn’t need to be the same model that replies to customer inquiries. Put simply, we’re likely to see a proliferation of different horses for different courses.

A series of smaller models working in concert may end up serving different use cases than current LLM approaches. New open-source options and multimodal outputs (as opposed to just text) are enabling organizations to unlock entirely new offerings.³

In the years to come, the progress toward a growing number of smaller, more specialized models could once again move the goalposts of AI in the enterprise.

Organizations may witness a fundamental shift in AI from augmenting knowledge to augmenting execution. Investments being made today in agentic AI, as this next era is termed, could upend the way we work and live by arming consumers and businesses with armies of silicon-based assistants. Imagine AI agents that can carry out discrete tasks, like delivering a financial report in a board meeting or applying for a grant. “There’s an app for that” could well become “There’s an agent for that.”

Now: Getting the fundamentals right

LLMs are undoubtedly exciting but require a great deal of groundwork. Instead of building models themselves, many enterprises are partnering with companies like Anthropic or OpenAI or accessing AI models through hyperscalers.4 According to Gartner®, AI servers will account for close to 60% of hyperscalers’ total server spending.5 Some enterprises have found immediate business value in using LLMs, while others have remained wary about the accuracy and applicability of LLMs trained on external data.6 On an enterprise time scale, AI advancements are still in a nascent phase (crawling or walking, as we noted last year). According to recent surveys by Deloitte and Fivetran and Vanson Bourne, in most organizations, fewer than a third of generative AI experiments have moved into production, often because organizations struggle to access or cleanse all the data needed to run AI programs.7 To achieve scale, organizations will likely need to further think through data and technology, as well as strategy, process, and talent, as outlined in a recent Deloitte AI Institute report.
According to Deloitte’s 2024 State of Generative AI in the Enterprise Q3 report, 75% of surveyed organizations have increased their investments in data-life-cycle management due to generative AI.8 Data is foundational to LLMs, because bad inputs lead to worse outputs (in other words, garbage in, garbage squared). That’s why data-labeling costs can be a big driver of AI investment.9 While some AI companies scrape the internet to build the largest models possible, savvy enterprises create the smartest models possible, which requires better domain-specific “education” for their LLMs. For instance, LIFT Impact Partners, a Vancouver-based organization that provides resources to nonprofits, is fine-tuning its AI-enabled virtual assistants on appropriate data to help new Canadian immigrants process paperwork. “When you train it on your organization’s unique persona, data, and culture, it becomes significantly more relevant and effective,” says Bruce Dewar, president and CEO of LIFT Impact Partners. “It brings authenticity and becomes a true extension of your organization.”10

Data enablement issues are dynamic. Organizations surveyed by Deloitte said new issues could be exposed by the scale-up of AI pilots, unclear regulations around sensitive data, and questions around usage of external data (for example, licensed third-party data). That’s why 55% of organizations surveyed avoided certain AI use cases due to data-related issues, and an equal proportion are working to enhance their data security.11 Organizations could work around these issues by using out-of-the-box models offered by vendors, but differentiated AI impact will likely require differentiated enterprise data.

Thankfully, once the groundwork is laid, the benefits are clear: Two-thirds of organizations surveyed say they’re increasing investments in generative AI because they’ve seen strong value to date.12 Initial examples of real-world value are also appearing across industries, from insurance claims review to telecom troubleshooting and consumer segmentation tools.13 LLMs are also making waves in more specialized use cases, such as space repairs, nuclear modeling, and material design.14

As underlying data inputs improve and become more sustainable, LLMs and other advanced models (like simulations) may become easier to spin up and scale. But size isn’t everything. Over time, as methods for AI training and implementation proliferate, organizations are likely to pilot smaller models. Many may have data that can be more valuable than previously imagined, and putting it into action through smaller, task-oriented models can reduce time, effort, and hassle. We’re poised to move from large-scale AI projects to AI everywhere, as discussed in this year’s introduction.

New: Different horses for different courses

While LLMs have a vast array of use cases, the library is not infinite (yet). LLMs require massive resources, deal primarily with text, and are meant to augment human intelligence rather than take on and execute discrete tasks. As a result, says Vivek Mohindra, senior vice president of corporate strategy at Dell Technologies, “there is no one-size-fits-all approach to AI. There are going to be models of all sizes and purpose-built options—that’s one of our key beliefs in AI strategy.”15

Over the next 18 to 24 months, key AI vendors and enterprise users are likely to have a toolkit of models comprising increasingly sophisticated, robust LLMs along with other models more applicable to day-to-day use cases. Indeed, where LLMs are not the optimal choice, three pillars of AI are opening new avenues of value: small language models, multimodal models, and agentic AI (figure 1).

Small language models

LLM providers are racing to make AI models as efficient as possible. Instead of enabling new use cases, these efforts aim to rightsize or optimize models for existing use cases. For instance, massive models are not necessary for mundane tasks like summarizing an inspection report—a smaller model trained on similar documents would suffice and be more cost-efficient.

Small language models (SLMs) can be trained by enterprises on smaller, highly curated data sets to solve more specific problems, rather than general queries. For example, a company could train an SLM on its inventory information, enabling employees to quickly retrieve insights instead of manually parsing large data sets, a process that can sometimes take weeks. Insights from such an SLM could then be coupled with a user interface application for easy access.
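To make that concrete, the sketch below shows one way a team might fine-tune a small open-source language model on a curated internal text file using the Hugging Face transformers and datasets libraries. The base model, file name, and hyperparameters are illustrative assumptions, not recommendations from the report.

```python
# Minimal sketch: fine-tune a small open-source language model on curated
# enterprise text. Model name, data path, and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "HuggingFaceTB/SmolLM-135M"  # one example of a small open model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token  # many causal LMs ship without a pad token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# One curated document (for example, inventory notes) per line in a text file.
dataset = load_dataset("text", data_files={"train": "curated_inventory_notes.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="slm-inventory",
        per_device_train_batch_size=4,
        num_train_epochs=3,
        learning_rate=2e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("slm-inventory")  # serve this behind the user-facing application
```

In practice, such a fine-tuned model is often paired with retrieval over the live inventory system, so fast-changing facts do not have to be baked into the model weights.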
Figure 1

Different AI for different needs

Small language models
• Focus: Text, customizable, applied to different use cases (trainable)
• Input: Text
• Output: Some
• Data: Less
• Customization: Need to be customized and trained on data they would work with

Multimodal
• Focus: Can’t train on smaller data sets; needs greater input and has wider variety of output
• Input: More than text
• Output: More
• Data: Significant
• Customization: Less customization possible due to the volume of data required

Agentic
• Focus: Can take concrete actions
• Input: Text
• Output: Most
• Data: To be determined
• Customization: Vendors provide out-of-the-box capabilities, but works best when tailored

Source: Deloitte research.

Naveen Rao, vice president of AI at Databricks, believes more organizations will take this systems approach with AI: “A magic computer that understands everything is a sci-fi fantasy. Rather, in the same way we organize humans in the workplace, we should break apart our problems. Domain-specific and customized models can then address specific tasks, tools can run deterministic calculations, and databases can pull in relevant data. These AI systems deliver the solution better than any one component could do alone.”16

An added benefit of smaller models is that they can be run on-device and trained by enterprises on smaller, highly curated data sets to solve more specific problems, rather than general queries, as discussed in “Hardware is eating the world.” Companies like Microsoft and Mistral are currently working to distill such SLMs, built on fewer parameters, from their larger AI offerings, and Meta offers multiple options across smaller models and frontier models.17

Finally, much of the progress happening in SLMs is through open-source models offered by companies like Hugging Face or Arcee.AI.18 Such models are ripe for enterprise use since they can be customized for any number of needs, as long as IT teams have the internal AI talent to fine-tune them. In fact, a recent Databricks report indicates that over 75% of organizations are choosing smaller open-source models and customizing them for specific use cases.19 Since open-source models are constantly improving thanks to the contributions of a diverse programming community, the size and efficiency of these models are likely to improve at a rapid clip.
Multimodal models

Humans interact through a variety of mediums: text, body language, voice, videos, among others. Machines are now hoping to catch up.20 Given that business needs are not contained to text, it’s no surprise that companies are looking forward to AI that can take in and produce multiple mediums. In some ways, we’re already accustomed to multimodal AI, such as when we speak to digital assistants and receive text or images in return, or when we ride in cars that use a mix of computer vision and audio cues to provide driver assistance.21

Multimodal generative AI, on the other hand, is in its early stages. The first major models, Google’s Project Astra and OpenAI’s GPT-4 Omni, were showcased in May 2024, and Amazon Web Services’ Titan offering has similar capabilities.22 Progress in multimodal generative AI may be slow because it requires significantly higher amounts of data, resources, and hardware.23 In addition, the existing issues of hallucination and bias that plague text-based models may be exacerbated by multimodal generation.

Still, the enterprise use cases are promising. The notion of “train once, run anywhere (or any way)” promises a model that could be trained on text, but deliver answers in pictures, video, or sound, depending on the use case and the user’s preference, which improves digital inclusion. Companies like AMD aim to use the fledgling technology to quickly translate marketing materials from English to other languages or to generate content.24 For supply chain optimization, multimodal generative AI can be trained on sensor data, maintenance logs, and warehouse images to recommend ideal stock quantities.25 This also leads to new opportunities with spatial computing, which we write about in “Spatial computing takes center stage.” As the technology progresses and model architecture becomes more efficient, we can expect to see even more use cases in the next 18 to 24 months.

Agentic AI

The third new pillar of AI may pave the way for changes to our ways of working over the next decade. Large (or small) action models go beyond the question-and-answer capabilities of LLMs and complete discrete tasks in the real world. Examples range from booking a flight based on your travel preferences to providing automated customer support that can access databases and execute needed tasks—likely without the need for highly specialized prompts.26 The proliferation of such action models, working as autonomous digital agents, heralds the beginnings of agentic AI, and enterprise software vendors like Salesforce and ServiceNow are already touting these possibilities.27

Chris Bedi, chief customer officer at ServiceNow, believes that domain- or industry-specific agentic AI can change the game for humans and machine interaction in enterprises.28 For instance, in the company’s Xanadu platform, one AI agent can scan incoming customer issues against a history of incidents to come up with a recommendation for next steps. It then communicates to another autonomous agent that’s able to execute on those recommendations, and a human in the loop reviews those agent-to-agent communications to approve the hypotheses. In the same vein, one agent might be adept at managing workloads in the cloud, while another provisions orders for customers. As Bedi says, “Agentic AI cannot completely take the place of a human, but what it can do is work alongside your teams, handling repetitive tasks, seeking out information and resources, doing work in the background 24/7, 365 days a year.”29
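The handoff pattern Bedi describes can be pictured as a small orchestration loop. The sketch below is a generic illustration, not ServiceNow’s implementation: a triage agent proposes next steps for an incoming issue, an execution agent turns them into a plan, and a human approves before anything runs. The call_llm helper and the agent names are hypothetical stand-ins for whatever model endpoint an organization actually uses.

```python
# Illustrative sketch only: a triage agent recommends next steps, an execution
# agent turns them into a plan, and a human approves before anything is run.
from dataclasses import dataclass

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call (a hosted LLM, a fine-tuned SLM, etc.).
    return "1. Restart the affected service\n2. Notify the account owner"

@dataclass
class Ticket:
    description: str
    similar_incidents: list[str]

def triage_agent(ticket: Ticket) -> str:
    history = "\n".join(ticket.similar_incidents)
    return call_llm(
        f"Past incidents:\n{history}\n\nNew issue: {ticket.description}\n"
        "Recommend next steps."
    )

def execution_agent(recommendation: str) -> list[str]:
    plan = call_llm(f"Turn this into discrete, reversible actions:\n{recommendation}")
    return [step.strip() for step in plan.splitlines() if step.strip()]

def resolve(ticket: Ticket) -> None:
    recommendation = triage_agent(ticket)       # agent 1: scan history, recommend
    actions = execution_agent(recommendation)   # agent 2: plan concrete actions
    print("Proposed actions:", *actions, sep="\n- ")
    if input("Approve? [y/N] ").lower() == "y":  # human in the loop
        for step in actions:
            print(f"Executing: {step}")          # hand off to downstream automation
    else:
        print("Escalated to a human engineer.")

resolve(Ticket("VPN drops every hour", ["INC-104: VPN drops after idle timeout"]))
```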
Finally, aside from the different categories of AI models noted above, advancements in AI design and execution can also impact enterprise adoption—namely, the advent of liquid neural networks. “Liquid” refers to the flexibility in this new form of training AI through a neural network, a machine learning algorithm that mimics the human brain’s structure. Similar to how quantum computers are freed from the binary nature of classical computing, liquid neural networks can do more with less: A couple dozen nodes in the network might suffice, versus 100,000 nodes in a more traditional network. The cutting-edge technology aims to run on less computing power, with more transparency, opening up possibilities for embedding AI into edge devices, robotics, and safety-critical systems.30 In other words, it’s not just the applications of AI but also its underlying mechanisms that are ripe for improvement and disruption in the coming years.
Figure 2

Compound AI journey

1. Small language model: Retrieve data
2. Human: Apply tools to analyze data and create insights
3. Small language model: Create customer-facing social media content based on insights
4. Human: Review for accuracy and appropriateness
5. Multimodal: Generate marketing images based on output from step 3
6. Agentic: Schedule the marketing post for the most opportune time, based on content and target audience. Repeat process as needed.

Source: Deloitte research.
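A compound workflow like the one in figure 2 is usually stitched together with ordinary orchestration code. The sketch below simply mirrors the six steps; every function is a hypothetical stub standing in for a model, tool, or scheduler call rather than any particular product.

```python
# Illustrative orchestration of the compound AI journey in figure 2.
# Each step is a stub for a model, tool, or scheduler call.

def retrieve_data(query: str) -> str:                      # 1. small language model
    return f"raw records matching '{query}'"

def analyze(data: str) -> str:                              # 2. human applying analysis tools
    return f"insights derived from {data}"

def draft_post(insights: str) -> str:                       # 3. small language model
    return f"social media copy based on {insights}"

def human_review(draft: str) -> bool:                       # 4. human in the loop
    return input(f"Approve this draft?\n{draft}\n[y/N] ").lower() == "y"

def generate_images(draft: str) -> list[str]:               # 5. multimodal model
    return [f"marketing image rendered for: {draft}"]

def schedule_post(draft: str, images: list[str]) -> None:   # 6. agentic scheduling
    print("Scheduled for the most opportune time:", draft, images)

def run_campaign(query: str) -> None:
    draft = draft_post(analyze(retrieve_data(query)))
    if human_review(draft):                                  # nothing ships unreviewed
        schedule_post(draft, generate_images(draft))
    else:
        print("Sent back for revision; repeat the process as needed.")

run_campaign("Q3 customer sentiment")
```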

Next: There’s an agent for that

In the next decade, AI could be wholly focused on execution instead of human augmentation. A future employee could make a plain-language request to an AI agent, for example, “close the books for Q2 and generate a report on EBITDA.” Like in an enterprise hierarchy, the primary agent would then delegate the needed tasks to agents with discrete roles that cascade across different productivity suites to take action. As with humans, teamwork could be the missing ingredient that enables the machines to improve their capabilities.31 This leads to a few key considerations for the years to come (figure 2):

• AI-to-AI communication. Agents will likely have a more efficient way of communicating with each other than human language, as we don’t need human-imitating chatbots talking to each other.32 Better AI-to-AI communication can enhance outcomes, as fewer people will need to become experts to benefit from AI. Rather, AI can adapt to each person’s communication style.33

• Job displacement and creation. Some claim that roles such as prompt engineer could become obsolete.34 However, the AI expertise of those employees will remain pertinent as they focus on managing, training, and collaborating with AI agents as they do with LLMs today. For example, a lean IT team with AI experts might build the agents it needs in a sort of “AI factory” for the enterprise. The significant shift in the remaining workforce’s skills and education may ultimately reward more human skills like creativity and design, as mentioned in previous Tech Trends.

• Privacy and security. The proliferation of agents with system access is likely to raise broad concerns about cybersecurity, which will only become more important as time progresses and more of our data is accessed by AI systems. New paradigms for risk and trust will be required to make the most out of applying AI agents.
• Energy and resources. AI’s energy consumption is a growing concern.35 To mitigate environmental impacts, future AI development will need to balance performance with sustainability. It will need to take advantage of improvements in liquid neural networks or other efficient forms of training AI, not to mention the hardware needed to make all of this work, as we discuss in “Hardware is eating the world.”

• Leadership for the future. AI has transformative potential, as everyone has heard plenty over the last year, but only insofar as leadership allows. Applying AI as a faster way of doing things the way they’ve always been done will result in, at best, missed potential, and, at worst, amplified biases.36 Imaginative, courageous leaders should dare to take AI from calcified best practices to the creation of “next practices,” where we find new ways of organizing ourselves and our data toward an AI-enabled world.

When it comes to AI, enterprises will likely have the same considerations in the future that they do today: data, data, and data. Until AI systems can reach artificial general intelligence or learn as efficiently as the human brain,37 they will be hungry for more data and inputs to help them be more powerful and accurate. Steps taken today to organize, streamline, and protect enterprise data could pay dividends for years to come, as data debt could one day become the biggest portion of technical debt. Such groundwork should also help enterprises prepare for the litany of regulatory challenges and ethical uncertainties (such as data collection and use limitations, fairness concerns, lack of transparency) that come with shepherding this new, powerful technology into the future.38 The stakes of garbage in, garbage out are only going to grow: It would be much better to opt for genius in, genius squared.39
BUSINESS OF TECHNOLOGY

IT, amplified: AI elevates the reach (and remit) of the tech function
As the tech function shifts from leading digital transformation to
leading AI transformation, forward-thinking leaders are using this as an
opportunity to redefine the future of IT
Kelly Raskovich, Bill Briggs, Mike Bechtel, and Abhijith Ravinutala

Much has been said, including within the pages of Tech Trends, about the potential for artificial intelligence to revolutionize business use cases and outcomes. Nowhere is this more true than in the end-to-end life cycle of software engineering and the broader business of information technology, given generative AI’s ability to write code, test software, and augment tech talent in general. Deloitte research has shown that tech companies at the forefront of this organizational change are ready to realize the benefits: They are twice as likely as their more conservative peers to say generative AI is transforming their organization now or will within the next year.1

We wrote in a Tech Trends 2024 article that enterprises need to reorganize their developer experiences to help IT teams achieve the best results. Now, the AI hype cycle has placed an even greater focus on the tech function’s ways of working. IT has long been the lighthouse of digital transformation in the enterprise, but it must now take on AI transformation. Forward-thinking IT leaders are using the current moment as a once-in-a-generation opportunity to redefine roles and responsibilities, set investment priorities, and communicate value expectations. More importantly, by playing this pioneering role, chief information officers can help inspire other technology leaders to put AI transformation into practice.

After years of enterprises pursuing lean IT and everything-as-a-service offerings, AI is sparking a shift away from virtualization and austere budgets. Gartner predicts that “worldwide IT spending is expected to total $5.26 trillion in 2024, an increase of 7.5% from 2023.”2 As we discuss in “Hardware is eating the world,” hardware and infrastructure are having a moment, and enterprise IT spending and operations may shift accordingly.

As both traditional AI and generative AI become more capable and ubiquitous, each of the phases of tech delivery may see a shift from human in charge to human in the loop. Organizations need a clear strategy in place before that occurs. Based on Deloitte analysis, over the next 18 to 24 months, IT leaders should plan for AI transformation across five key pillars: engineering, talent, cloud financial operations (FinOps), infrastructure, and cyber risk.

This trend may usher in a new type of lean IT over the next decade. If commercial functions see an increased number of citizen developers or digital agents that can spin up applications on a whim, the role of the IT function may shift from building and maintaining to orchestrating and innovating. In that case, AI may not only be undercover, as we indicate in the introduction to this year’s report, but may also be overtly in the boardroom, overseeing tech operations in line with human needs.

Now: Spotlight—and higher spending—on IT

For years, IT has been under pressure to streamline sprawling cloud spend and curb costs. Since 2020, however, investments in tech have been on the rise thanks to pent-up demand for collaboration tools and the pandemic-era emphasis on digitalization.3 According
to Deloitte research, from 2020 to 2022, the global average technology budget as a percentage of revenue jumped from 4.25% to 5.49%, an increase that approximately doubled the previous revenue change from 2018 to 2020.4 And in 2024, US companies’ average budget for digital transformation as a percentage of revenue is 7.5%, with 5.4% coming from the IT budget.5

As demand for AI sparks another increase in spending, the finding from Deloitte’s 2023 Global Technology Leadership Study continues to ring true: Technology is the business, and tech spend is increasing as a result.

Today, enterprises are grappling with the new relevance of hardware, data management, and digitization in ramping up their usage of AI and realizing its value potential. In Deloitte’s Q2 State of Generative AI in the Enterprise report, businesses that rated themselves as having “very high” levels of expertise in generative AI were increasing their investment in hardware and cloud consumption much more than the average enterprise.6 Overall, 75% of organizations surveyed have increased their investments around data-life-cycle management due to generative AI.7

These figures point to a common theme: To realize the highest impact from gen AI, enterprises likely need to accelerate their cloud and data modernization efforts. AI has the potential to deliver efficiencies in cost, innovation, and a host of other areas, but the first step to accruing these benefits is for businesses to focus on making the right tech investments.8 Because of these crucial investment strategies, the spotlight is on tech leaders who are paving the way.

According to Deloitte research, over 60% of US-based technology leaders now report directly to their chief executives, an increase of more than 10 percentage points since 2020.9 This is a testament to the tech leader’s increased importance in setting the AI strategy rather than simply enabling it. Far from a cost center, IT is increasingly being seen as a differentiator in the AI age, as CEOs, following market trends, are keen on staying abreast of AI’s adoption in their enterprise.10

John Marcante, former global CIO of Vanguard and US CIO-in-residence at Deloitte, believes AI will fundamentally change the role of IT. He says, “The technology organization will be leaner, but have a wider purview. It will be more integrated with the business than ever. AI is moving fast, and centralization is a good way to ensure organizational speed and focus.”11

As IT gears up for the opportunity presented by AI—perhaps the opportunity that many tech leaders and employees have waited for—changes are already underway in how the technology function organizes itself and executes work. The stakes are high, and IT is due for a makeover.

New: An AI boost for IT

Over the next 18 to 24 months, the nature of the IT function is likely to change as enterprises increasingly employ generative AI. Deloitte’s foresight analysis suggests that, by 2027, even in the most conservative scenario, gen AI will be embedded into every company’s digital product or software footprint (figure 1), as we discuss across five key pillars.12

Engineering

In the traditional software development life cycle, manual testing, inexperienced developers, and disparate tool environments can lead to inefficiencies, as we’ve discussed in prior Tech Trends. Fortunately, AI is already having an impact on these areas. AI-assisted code generation, automated testing, and rapid data analytics all save developers more time for innovation and feature development. The productivity gain from coding alone is estimated to be worth US$12 billion in the United States.13

At Google, AI tools are being rolled out internally to developers. In a recent earnings call, CEO Sundar Pichai said that around 25 percent of the new code at the technology giant is developed using AI. Shivani Govil, senior director of product management for developer products, believes that “AI can transform how engineering teams work, leading to more capacity to innovate, less toil, and higher developer satisfaction. Google’s approach is to bring AI to our users and meet them where they are—by bringing the technology into products and tools that developers use every day to support them in their work. Over time, we can create even tighter alignment between the code and business requirements, allowing faster feedback loops, improved product market fit, and better alignment to the business outcomes.”14 In another example, a health care company used COBOL code assist to enable a junior developer with no experience in the programming language to generate an explanation file with 95% accuracy.15
Figure 1

How generative AI might transform IT ways of working


Over the next 18 to 24 months, enterprises may experience vast improvement in their technology teams as generative AI is
increasingly embedded into ways of working. Deloitte’s foresight analysis suggests that by 2027, even in the most
conservative scenario, gen AI will be embedded into every company’s digital product/software footprint. Manual and
time-consuming processes like code reviews, infrastructure configuration, and budget management can be automated and
improved, as we move from current to target state of AI in IT.

Engineering
• The problem: Manual, inefficient aspects of the traditional software development life cycle
• Necessary changes: Shift from writing code to defining the architecture, reviewing code, and orchestrating functionality
• Recommended actions: Tech leaders should expect human-in-the-loop code generation and review to become the standard

Talent
• The problem: Executives struggle to hire workers with the right backgrounds and are forced to delay projects
• Necessary changes: AI can generate rich learning and development media as well as documentation to upskill talent
• Recommended actions: Tech leaders should implement regular AI-powered learning recommendations and personalization as a new way of working

Cloud financial operations
• The problem: Runaway spend is common in the cloud, since resources can be provisioned with a click
• Necessary changes: AI-powered cost analysis, pattern detection, and resource allocation can optimize IT spend at new speeds
• Recommended actions: Leaders should consistently apply AI to help it earn its keep and optimize costs

Infrastructure
• The problem: Nearly half of enterprises are handling tasks like security, compliance, and service management on a manual basis
• Necessary changes: Automated resource allocation, predictive maintenance, and anomaly detection could revolutionize IT systems
• Recommended actions: Leaders should work toward an IT infrastructure that can heal itself as needed through AI

Cyber
• The problem: Generative AI and digital agents open up more attack surfaces than ever for bad actors
• Necessary changes: Automated data masking, incident response, and policy generation can optimize cybersecurity responses
• Recommended actions: Enterprises should take steps to further authenticate data and digital media through new tech or processes
Source: Deloitte research and analysis.

As Deloitte recently stated in a piece on engineering in the age of gen AI, the developer role is likely to shift from writing code to defining the architecture, reviewing code, and orchestrating functionality through contextualized prompt engineering. Tech leaders should anticipate human-in-the-loop code generation and review to be the standard over the next few years of AI adoption.16
Talent

Technology executives surveyed by Deloitte last year noted that they struggle to hire workers with critical IT backgrounds in security, machine learning, and software architecture, and are forced to delay projects with financial backing due to a shortage of appropriately skilled talent.17 As AI becomes the newest skill in demand, many companies may not even be able to find all the talent they need, leading to a hiring gap wherein nearly 50% of AI-related positions cannot be filled.18

As a result, tech leaders should focus on upskilling their own talent, another area where AI can help. Consider the potential benefits of AI-powered skills gap analyses and recommendations, personalized learning paths, and virtual tutors for on-demand learning. Bayer, the life sciences company, has used generative AI to summarize procedural documents and generate rich media such as animation for e-learning.19 Along the same lines, AI could generate documentation to help a new developer understand a legacy technology, and then create an associated learning podcast and exam for that same developer.

At Google, developers thrive on hands-on experience and problem-solving, so leaders are keen to provide AI learning and tools (like coding assistants) that meet developers where they are on their learning journey. “We can use AI to enhance learning, in context with emerging technologies, in ways that anticipate and support the rapidly changing skills and knowledge required to adapt to them,” says Sara Ortloff, senior director of developer experience at Google.20

As automation increases, tech talent would take an oversight role and enjoy more capacity to focus on innovation that can improve the bottom line (as we wrote about last year). This could help attract talent since, according to Deloitte research, the biggest incentive that attracts tech talent to new opportunities is the work they would do in the role.21

Cloud financial operations

Runaway spending became a common problem in the cloud era when resources could be provisioned with a click. Hyperscalers have offered data and tooling for finance teams and CIOs to keep better track of their team’s cloud usage, but many of these FinOps tools still require manual budgeting and offer limited visibility across disparate systems.22 The power of AI enables organizations to be more informed, proactive, and effective with their financial management. Real-time cost analysis, as well as robust pattern detection and resource allocation across systems, can optimize IT spending at a new speed.23 AI can help enterprises identify more cost-saving opportunities through better predictions and tracking.24 All of this is necessary because AI may significantly drive up cloud costs for large companies in the coming years. Applying AI to FinOps can help justify the investments in AI and optimize costs elsewhere while AI demand increases.25
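As a toy illustration of the pattern detection described above (not a feature of any particular FinOps product), flagging days whose cloud spend jumps well above a trailing baseline takes only a few lines; the window and threshold values here are arbitrary assumptions:

```python
# Toy example: flag daily cloud-spend anomalies against a trailing baseline.
from statistics import mean, stdev

def spend_anomalies(daily_spend: list[float], window: int = 7, threshold: float = 3.0):
    """Return (day_index, spend) pairs that sit more than `threshold`
    standard deviations above the trailing `window`-day average."""
    flagged = []
    for day in range(window, len(daily_spend)):
        baseline = daily_spend[day - window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_spend[day] - mu) / sigma > threshold:
            flagged.append((day, daily_spend[day]))
    return flagged

# A quiet week followed by a runaway workload on day 8 -> [(8, 4800)]
print(spend_anomalies([1020, 995, 1010, 980, 1005, 990, 1015, 1000, 4800]))
```

Real FinOps tooling goes much further (attribution, forecasting, rightsizing recommendations), but the underlying idea of learning a baseline and surfacing deviations is the same.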
Infrastructure

Across the very broad scope of IT infrastructure, from toolchain to service management, organizations haven’t seen as much automation as they want. Just a few years ago, studies estimated that nearly half of large enterprises were handling key tasks like security, compliance, and service management on a completely manual basis.26 The missing ingredient? Automation that can learn, improve, and react to the changing demands of a business. Now, that’s possible.

Automated resource allocation, predictive maintenance, and anomaly detection could all be possible in a system that’s set up to natively understand its own real-time status and then act.27 This emerging view of IT is known as autonomic, in reference to the human body’s autonomic nervous system that regulates its heart rate and breath, and adjusts dynamically to internal and external stimuli.28 As mentioned above, such a system would enable the change from human in charge to human in the loop, as infrastructure takes care of itself and surfaces only the issues that require human intervention. That’s why companies like eBay are already leveraging generative AI to scale their infrastructure and sort through troves of customer data, potentially leading to impactful changes to their platform.29
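A bare-bones sketch of that autonomic idea appears below: routine findings are remediated automatically, and anything unfamiliar is surfaced to a person. The findings, remediations, and loop cadence are placeholders rather than a description of any vendor’s system.

```python
# Bare-bones sketch of an autonomic loop: routine findings are remediated
# automatically; anything unfamiliar is escalated to a human operator.
import time

KNOWN_REMEDIATIONS = {
    "disk_usage_high": "expand volume and clear temp files",
    "service_unresponsive": "restart service and verify health check",
}

def collect_findings() -> list[str]:
    # Placeholder for telemetry: monitoring agents, log scanners, anomaly models.
    return ["disk_usage_high", "unrecognized_tls_certificate"]

def run_remediation(action: str) -> None:
    print(f"Auto-remediating: {action}")

def escalate(finding: str) -> None:
    print(f"Escalated to on-call engineer: {finding}")  # human in the loop

def autonomic_loop(iterations: int = 1, interval_seconds: float = 0.0) -> None:
    for _ in range(iterations):
        for finding in collect_findings():
            if finding in KNOWN_REMEDIATIONS:
                run_remediation(KNOWN_REMEDIATIONS[finding])
            else:
                escalate(finding)
        time.sleep(interval_seconds)

autonomic_loop()
```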

Cyber

Although AI may make many aspects of IT simpler or more efficient, it certainly introduces more complexity to cyber risk. As we wrote about last year, generative AI and synthetic media open up more attack surfaces than ever for phishing, deepfakes, prompt injection, and
others.30 As AI proliferates and digital agents become the newest business-to-business representatives, these issues may become more severe. Enterprises should take steps to work on data authentication, as in the example of SWEAR, a security company that has pioneered a way to verify digital media through the blockchain.31 Data masking, incident response, and automated policy generation are all also areas where generative AI can be applied to optimize cybersecurity responses and defend against attacks.32

Finally, as technology teams grow accustomed to the changes and challenges mentioned above, many will shift their focus to the innovation, agility, and growth that can be enabled by AI. Teams can streamline their IT workflows and reduce the need for manual intervention or offshoring, allowing IT to focus on higher-value activities.33 Indeed, an entire reallocation of IT resources is likely to take place. As Ian Cairns, CEO of Freeplay, has noted, “As with any major platform shift, the businesses that succeed will be the ones that can rethink and adapt how they work and build software for a new era.”34

Next: IT itself as a service

The current moment is like an all-hands-on-deck siren sounding for many IT teams, where product managers, domain experts, and business unit leaders are diving into the details of AI to stand up working proofs of concept. If the bet pays off and companies are able to improve their margins with this new technology, IT may complete its transition from a cost center and enabler to a true competitive differentiator. By then, the role of the CIO and their management of the tech estate could be dramatically altered.

Imagine a scenario over the next decade where IT transitions from a centrally controlled function to an innovation leader, providing reusable code blocks and platforms that business units can use to develop their own solutions. While IT-as-a-service may not be new, the previous understanding was that several aspects of a company’s IT infrastructure would be handed off to a new vendor.35 Looking forward, that vendor could be replaced by each organization’s internally trained and secure AI agents.

In this sense, IT itself could become a service run through online portals, where a combination of low-code or no-code technologies and advanced AI allows nontechnical users to create and run applications.36 For example, the role of the chief architect could look very different with many legacy tasks performed by a digital agent. Just as cloud computing blocks can today be opened with a click, entire applications may be available at a click in the next five to 10 years. Continuous tech learning and fluency would become essential across the enterprise, not just in IT, as employees and citizen developers would be encouraged to adapt to the latest technologies. Trust and security responsibilities would also broaden, with technology teams retaining humans in the loop to review data privacy, cybersecurity, and ethical AI practices.

Though the advancement of AI may call into question the future role of IT, it actually elevates the technology function in the enterprise once it’s embedded everywhere. Savvy tech leaders will need to develop a bevy of skills as tech and AI become even more important in the enterprise. These skills include journey and process knowledge, program and product management, business development, trust and compliance expertise, and ecosystem management (including AI tools and shareability). Leaders may also need to take on a new role as the enterprise’s educator and evangelist of AI, in order to drive change management.

Marcante says, “AI capabilities may be democratized for the business and spur innovation, but tech leaders have to drive the agenda. There has to be a set of guiding principles and goals that people can point to globally to move their enterprise forward.”37

Aer reading the articles, finish the Writing Task 2 question below.

You should spend around 40 minutes on this task.


Some scientists believe that in the future computers will be more intelligent than human beings.
While some see this as a positive development, others worry about the negative consequences.
Discuss both views and give your opinion.
Give reasons for your answer and include any relevant examples from your own knowledge or experience.
Write at least 250 words.