Using AI to Build Apps that Delight Customers
Contents
Introduction
Conclusion
Introduction
Today, breakthroughs in AI and data are ushering in a new era of app innovation, empowering businesses to
overcome their biggest challenges and inspire growth with intelligent apps and services. Using AI tools and data,
companies are rapidly developing unique solutions and digital experiences that drive immediate business results,
positioning themselves as industry leaders for the long term.
This eBook offers a comprehensive overview of how some successful businesses use AI and data to build intelligent
apps that delight customers and deliver game-changing results.
Intelligent apps provide an experience uniquely tailored to the customer. Not only do
they provide a wonderful experience, but they also give the customer real value that they wouldn’t find elsewhere.
AI apps delight customers in ways that we’re just beginning to explore. Whereas traditional apps have built-in
limitations, AI apps employ machine learning to continually learn and adapt, using advanced models powered
by cloud computing to optimise their results over time. The insights they provide are much more informative
and actionable than their non-AI counterparts.
Here’s a simple breakdown of the core differences that make intelligent apps more flexible, scalable and high-performing than traditional apps.
Implementation
Traditional apps: typically built on a monolithic architecture and deployed on-premises.
Intelligent apps: built on the cloud using a microservices architecture, with enhanced scalability that lets them handle unlimited traffic and data.
Building apps with AI may seem like a thing of the future – but it’s not. In the following pages, find out how
four companies fuelled their intelligent app strategy using Azure AI tools.
For almost 140 years, global strategy management consulting firm Arthur D. Little has been helping clients drive
technology innovation to enhance their industry-leading capabilities. The firm’s clients include the majority of the
Fortune 500, and its workforce extends across 46 offices in 39 countries. Teams of consultants work with a giant and
ever-growing wealth of data and documents – much of which is unsorted and unstructured.
With most of its intellectual capital being stored in complex data formats and multiple languages, traditional content
search methods were proving to be unsustainable. The firm couldn’t easily access the collective knowledge scattered
across documents, files and experts. Already a Microsoft customer, Arthur D. Little turned to Azure OpenAI Service,
Azure AI Search and Azure AI services to create and launch an internal AI solution to sort through and make sense
of complex document formats so they could deliver better service to their clients.
Useful information was typically stored in complex data formats and multiple languages, making searching
and interacting with relevant data difficult. The company wanted to make it easier to search through and summarise
documents so consultants could spend more time delivering personalised service.
The company wanted a solution that would perform fast searches while maintaining the highest levels
of confidentiality to comply with regulatory and contractual privacy requirements and preserve client trust.
The company wanted to innovate products and services using the most advanced technologies without
getting bogged down in manual management processes.
Through the power of pre-trained large language models, the team at Arthur D. Little were no longer simply searching, but
building insights as they browsed their knowledge base. They were able to bring generative AI to businesses in a responsible,
scalable, secure and compliant way.
Solutions used
To maximise the collective knowledge of its consultants, Arthur D. Little created an internal solution that draws
on text analytics and other AI enrichment capabilities in Azure AI services to improve indexing and deliver
consolidated data insights. Using this solution, consultants have access to summaries of documents with the
abstractive summarisation feature in Azure AI Language. Unlike extractive summarisation – which only extracts
sentences with relevant information – abstractive summarisation generates concise and coherent summaries,
saving the consultants from scanning long documents for information.
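To make that step concrete, here is a minimal sketch of how a document summary might be requested from Azure AI Language using the azure-ai-textanalytics Python SDK. The endpoint, key and sample text are placeholders, and the exact method name can vary between SDK versions.

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

# Placeholder endpoint and key for an Azure AI Language resource.
client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

# In practice this would be text extracted from a long report or slide deck.
documents = ["<full text of a lengthy consulting document>"]

# Submit an abstractive summarisation job and print the generated summaries.
poller = client.begin_abstract_summary(documents)
for result in poller.result():
    if not result.is_error:
        for summary in result.summaries:
            print(summary.text)
```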
Azure AI Search
Azure AI services
Outcomes
The abstractive summarisation in Azure AI Language was transformational in helping consultants access information.
For instance, it can take a 100-slide PowerPoint deck with fragmented text and images and immediately make
it readable and searchable, allowing consultants to determine whether a document’s content is relevant within
seconds. Plus, introducing text translation and entity linking in Azure AI Language and an Azure SQL Server
operational data mart helps consultants better understand the context and relationship of their data, breaking down
previous knowledge barriers.
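As a small illustration of the entity linking capability mentioned above, the sketch below reuses the TextAnalyticsClient from the previous example to pull linked entities out of a passage; the sample sentence is invented.

```python
# Assumes the TextAnalyticsClient created in the earlier summarisation sketch.
documents = ["Arthur D. Little advises many Fortune 500 companies on technology strategy."]

results = client.recognize_linked_entities(documents)
for doc in results:
    if not doc.is_error:
        for entity in doc.entities:
            # Each entity is linked to a well-known knowledge source such as Wikipedia.
            print(entity.name, entity.url, entity.data_source)
```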
Arthur D. Little became an early adopter of large language models through Azure OpenAI, which helped them store
inputs and responses in a secure environment and ensure that confidential information wouldn’t be used for training
purposes. For heightened security, the company also deployed Microsoft 365 Defender and Microsoft Intune
throughout the firm, giving it access to incident management abilities to quickly address any activity that appears
beyond the baseline.
Arthur D. Little’s infrastructure has been grounded in Windows Server virtual machines on Azure, allowing it to turn
off any remaining VMware software from its on-premises environment. Across every use case, the firm makes each
of its Azure services available to its development team without contacting Microsoft, so its teams can move straight
into its innovation and development phase without dealing with manual infrastructure management and monitoring.
Increasing flexibility
and accessibility
TIM pioneers synthesised voice service to increase customer satisfaction
Italian-Brazilian company TIM was founded in 1995. Today it serves tens of millions of customers daily
throughout Brazil, making it one of the largest organisations in the country in landline, mobile telephone and
internet services.
The company’s virtual assistant, named TAIS, handles customer service calls. The telephone service
was initially created with a robotic voice and a limited menu of options. Over time, this was replaced with
scripted human speech recorded in a studio. Although the human-sounding voice was an improvement
over the robotic one, the company still recognised that it lacked a quality of friendliness. After all, nobody
communicates with a friend through a list of options. With support from Microsoft, TIM implemented
a synthesised and realistic voice solution for AI-generated phone answering to give customers a more
approachable method of solving their problems over the phone.
With customers speaking a wide range of languages and needing different accommodations, TIM sought to provide
a service in over 100 languages and give customers more ways to control the conversation. For instance, some
customers might want to ask the voice to slow down to enhance their understanding – something the recorded
voice couldn’t do.
The voice recorded in the studio made any type of improvisation impossible, limiting the scope of the conversation.
The company wanted a phone answering solution that would move away from a scripted bot, which is limited in its
ability to address certain issues and doesn’t allow for improvisation.
The company’s evolution was leading it to serve foreign audiences and sponsor major events. TIM wanted to
increase its competitive standing in the industry by expanding its service capacity for international customers
without requiring costly human contributions.
Solutions used
To increase its capacity to provide friendly phone service, TIM created a synthesised and realistic voice solution
for AI-generated phone answering using AI tools from Azure. The company implemented Azure AI services,
Azure AI Speech and Text-to-Speech Neural Voice to create the neural voice channel.
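For a sense of how a neural voice channel like this can be driven in code, here is a minimal sketch using the Azure AI Speech SDK for Python. The key, region, voice name and Portuguese phrase are illustrative, and the SSML prosody rate shown is just one way (not necessarily TIM’s) to honour a caller’s request to speak more slowly.

```python
import azure.cognitiveservices.speech as speechsdk

# Placeholder credentials for an Azure AI Speech resource.
speech_config = speechsdk.SpeechConfig(subscription="<your-key>", region="<your-region>")
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)

# SSML lets the same neural voice speak more slowly when a caller asks for it.
ssml = """
<speak version='1.0' xml:lang='pt-BR'>
  <voice name='pt-BR-FranciscaNeural'>
    <prosody rate='-20%'>
      Olá! Como posso ajudar você hoje?
    </prosody>
  </voice>
</speak>
"""

result = synthesizer.speak_ssml_async(ssml).get()
if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Audio synthesised successfully.")
```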
TIM’s customers accepted the new voice with enthusiasm. They instantly noticed an improvement in clarity
and intonation. With the new phone answering tool, TIM’s business customers realised they could provide
enhanced services to their own customers over the phone without requiring additional human effort.
Azure AI Speech
Azure AI services
Outcomes
With the switch to neural voice, TIM has helped its customers reduce the amount of human contribution
needed to serve customers, which has directly influenced both customer satisfaction and cost savings.
According to the Forrester TEI of Azure AI study, the adoption of Azure AI has the potential to result
in a 40% reduction in customer support requests.
Using the new tool, phone customers have access to more options to help them better understand and
guide the conversation. They can request slower or faster speech, making it easier for companies to solve
their customers’ problems. This has enabled friendlier and more approachable phone service that was never
possible with a scripted bot.
The tool has expanded the company’s service capacity for international organisations by offering
a range of more than 400 different voices in 140 languages.
Enabling opportunities
to innovate
DeepBrain AI adds more advanced AI capabilities to its technology
Founded in 2016 as a chatbot service, DeepBrain AI has been on a mission to design AI solutions
for customers in finance, commerce, retail, education and media. By 2018, DeepBrain AI developed
an early prototype of its AI avatars. By 2019, it enabled voice synthesis capabilities for its solutions, and then
continued pushing the envelope by innovating AI avatar capabilities.
In its never-ending quest to innovate with AI, the company turned to the Azure platform to take advantage
of its large language model capabilities, among others. Now, the company has integrated Azure OpenAI
Service, Azure Cognitive Services and AKS to power intelligent AI-charged avatars that customers use for
training videos, news broadcasts, marketing videos, one-on-one interviews and more.
DeepBrain AI wanted to build and train sophisticated deep learning models on extensive datasets. This required
technology that could easily scale AI workloads and dynamically allocate computing resources while ensuring
high availability across all its applications.
With vast amounts of data to analyse to power its expanding portfolio of AI solutions, the company
wanted more tools for eliminating manual processes and cutting down on development time.
This would help them iterate quickly and save time and effort in managing infrastructure.
In addition to adding more advanced AI capabilities to its solutions, DeepBrain AI wanted go-to-market support
to help sell its solutions to a wide range of industries, including retail, customer service, finance and education.
Solutions used
Partnering with Azure has helped DeepBrain AI expand its AI solutions portfolio, including DeepBrain AI Avatar,
DeepBrain AI Human and DeepBrain AI Interview. These solutions use the capabilities of Azure OpenAI Service, Azure
AI services and Azure Kubernetes Service (AKS). Additionally, through the Azure Marketplace, DeepBrain AI is able
to sell solutions globally – gaining a higher level of exposure than ever before. Lead generation coming through the
Azure Marketplace has provided the company with a valuable revenue stream and helped get its product in front of
the customers who need it the most.
Azure AI services
Outcomes
Azure OpenAI Service and Azure AI services have helped DeepBrain AI build sophisticated deep learning models
and train them on extensive datasets, leading to breakthroughs in natural language processing, computer vision and
other AI domains. These solutions have been integrated into the DeepBrain AI portfolio of tools, including a photo-
realistic AI avatar that serves as an AI retailer, AI banker, AI tutor and more.
Using Azure AI Speech capabilities, DeepBrain AI develops state-of-the-art NLP models integrated directly
into its AI infrastructure. This has helped the company maintain accurate, efficient NLP solutions that can be
seamlessly integrated across a wide range of industries.
AKS provides a managed container orchestration service that automates containerised application deployment,
scaling and management. This helps the company easily scale AI workloads and shift computing resources to where
they’re needed most while providing high availability across all applications. The AKS functionality saves developers
time and effort in managing infrastructure so they can bring innovative solutions to market faster.
The Dutch railway system produces vast amounts of data that Nederlandse Spoorwegen (NS), the country’s
principal rail operator, could use as part of a customer-facing solution. NS saw how much data was flowing
in and realised it needed to be on the public cloud. The company chose to build its new crowdedness
indicator solution on Azure for its scaling capabilities, so its system could dynamically scale to
accommodate peak and off-peak travel times.
When it came to choosing a database, the team prioritised real-time updates, high performance and low
latency. Going in, the team didn’t yet know which fields it would need to store, but it had already started
building the REST API for the new project, so the schema-flexible Azure Cosmos DB for NoSQL was
the clear choice.
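That schema flexibility is easy to picture with a short sketch using the azure-cosmos Python SDK: crowdedness readings are stored as free-form JSON documents, so new fields can be added later without migrations. The account URL, key, database, container and field names below are all hypothetical.

```python
from azure.cosmos import CosmosClient

# Hypothetical connection details for a Cosmos DB for NoSQL account.
client = CosmosClient(url="https://<your-account>.documents.azure.com:443/", credential="<your-key>")
container = client.get_database_client("travel").get_container_client("crowdedness")

# Documents are schemaless JSON; fields can evolve as the API grows.
container.upsert_item({
    "id": "train-1234-2023-10-02T08:15",
    "trainNumber": 1234,
    "timestamp": "2023-10-02T08:15:00Z",
    "expectedCrowdedness": "high",
})

# Query the latest readings for a given train.
items = container.query_items(
    query="SELECT * FROM c WHERE c.trainNumber = @nr",
    parameters=[{"name": "@nr", "value": 1234}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item["timestamp"], item["expectedCrowdedness"])
```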
NS customer research showed that crowdedness is critical to customer satisfaction. While a train often has
seats available in the back, they frequently go unused because passengers don’t want to walk the length
of the train if they don’t know for certain that they’ll be able to sit once they get there. The company wanted
to help spread crowds more efficiently throughout the trains to decongest cars and give passengers a more
satisfying travel experience.
Travel crowds drastically change in size depending on the time of day, week and year. The new solution needed
to scale dynamically during rush hour and then scale back down during off-peak travel times.
The company wanted a more data-driven method to predict how and when people travel so it could adjust
its service. Check-in and check-out data from chip cards shows where and when passengers enter and exit
a train, and that data can be analysed historically to predict future travel patterns.
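As a rough illustration of how historical chip-card data could feed such predictions, the sketch below aggregates hypothetical check-in and check-out events with pandas to estimate typical occupancy per train, weekday and hour. The file and column names are invented, and this is a simple baseline rather than the model NS actually uses.

```python
import pandas as pd

# Hypothetical chip-card events: one row per check-in or check-out.
# Columns assumed: timestamp, train_number, event ("check_in" or "check_out").
events = pd.read_csv("chipcard_events.csv", parse_dates=["timestamp"])

# +1 passenger on check-in, -1 on check-out.
events["delta"] = events["event"].map({"check_in": 1, "check_out": -1})
events["hour"] = events["timestamp"].dt.floor("h")

# Net passenger change per train per hour; a running total gives occupancy.
hourly = (
    events.groupby(["train_number", "hour"])["delta"].sum()
    .groupby(level="train_number").cumsum()
    .rename("occupancy")
    .reset_index()
)

# Average occupancy per train, weekday and hour of day as a crowdedness baseline.
hourly["weekday"] = hourly["hour"].dt.dayofweek
hourly["hour_of_day"] = hourly["hour"].dt.hour
baseline = hourly.groupby(["train_number", "weekday", "hour_of_day"])["occupancy"].mean()
print(baseline.head())
```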
“Of the two million people who travel by rail each day, 95% check in to
see the status of their train. Our research shows that people especially
value the crowdedness indicator. They can choose from among several
indicators, and most choose crowdedness – it’s a very popular feature.”
Bram van Eck
Chief Product Owner/Senior Product Manager at NS
Solutions used
After a three-month process that took the team from proof of concept to production, NS was ready to launch its
crowdedness app. The solution is built entirely on Azure – with Azure Cosmos DB at its centre – and processes
hundreds of thousands of events per day. Additionally, the NS team relies on Azure DevOps to support the
end-to-end CI/CD process, Azure Monitor provides application insights and alerts integrated with the team’s Slack channel,
and Azure Advisor helps diagnose and solve system problems.
Outcomes
With the number of requests fluctuating drastically between peak and non-peak travel times, the team
uses the Azure Cosmos DB autoscaling feature to adjust capacity in response to workload demands.
The team uses Azure Databricks to build and train machine learning models that predict how crowded trains
will likely be over the next three days based on the data collected. Those predictions – what the team calls
the ‘crowdedness prognosis’ – are delivered nightly to Azure Cosmos DB (the ‘prognosis store’) and used to inform
the overall crowdedness calculation.
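In spirit, that nightly flow could look something like the outline below: a regression model trained on historical occupancy features, with its three-day predictions written to a Cosmos DB ‘prognosis store’. This is an illustrative sketch, not NS’s actual Databricks pipeline, and every name in it is hypothetical.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from azure.cosmos import CosmosClient

# Hypothetical training data: historical occupancy per train, weekday and hour.
history = pd.read_csv("occupancy_history.csv")
features = history[["train_number", "weekday", "hour_of_day"]]
model = GradientBoostingRegressor().fit(features, history["occupancy"])

# Score the scheduled departures for the next three days (same feature columns).
schedule = pd.read_csv("schedule_next_3_days.csv")
schedule["predicted_occupancy"] = model.predict(
    schedule[["train_number", "weekday", "hour_of_day"]]
)

# Write each prediction to the prognosis store (a Cosmos DB container).
container = (
    CosmosClient(url="https://<your-account>.documents.azure.com:443/", credential="<your-key>")
    .get_database_client("travel")
    .get_container_client("prognosis")
)
for row in schedule.itertuples():
    container.upsert_item({
        "id": f"{row.train_number}-{row.weekday}-{row.hour_of_day}",
        "trainNumber": int(row.train_number),
        "predictedOccupancy": float(row.predicted_occupancy),
    })
```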
NS publishes millions of travel advisories a day to help passengers plan a more comfortable travel route.
Of the nearly two million people who travel by rail every day, 95% are using the app to check the status of their
trains. Of the several indicators available in the app, most travellers check crowdedness to plan their trip.
The app also supports push notifications that alert passengers when a train’s expected crowdedness has changed or
when a disruption means a train will be late.
Azure OpenAI Service
Azure OpenAI Service provides access to powerful language models from OpenAI, such as GPT-4,
GPT-3.5 Turbo, Codex, DALL-E and Whisper, that perform tasks such as content generation, summarisation,
semantic search and natural language to code translation. Enterprises use this service to improve digital
customer experience by adding chatbot/generative AI capabilities to customer-facing solutions with Azure AI
services and Azure OpenAI.
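A minimal sketch of calling a chat model through Azure OpenAI Service with the openai Python package might look like the following; the endpoint, key, API version and deployment name are placeholders that depend on your own resource.

```python
from openai import AzureOpenAI

# Placeholder values for an Azure OpenAI resource and model deployment.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-gpt-deployment>",  # the deployment name, not the base model name
    messages=[
        {"role": "system", "content": "You are a helpful customer service assistant."},
        {"role": "user", "content": "Summarise my last invoice in two sentences."},
    ],
)
print(response.choices[0].message.content)
```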
Azure AI Search
Azure AI Search lets enterprises build rich search experiences over their private and heterogeneous data
sources in web, mobile and enterprise applications. Azure AI Search utilises advanced deep-learning models
to provide contextual and relevant results. It also supports features such as semantic search, knowledge
mining, summary results, faceting, suggestions, synonyms, geo-search and more.
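To give a flavour of the developer experience, here is a small sketch that queries an existing index with the azure-search-documents Python SDK; the service name, index name and fields are hypothetical.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Hypothetical search service and index.
search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="consulting-documents",
    credential=AzureKeyCredential("<your-query-key>"),
)

# A keyword query over the index; top limits the number of results returned.
results = search_client.search(search_text="renewable energy market entry strategy", top=5)
for doc in results:
    print(doc["title"], doc["@search.score"])
```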
Azure AI services
Azure AI services is a suite of out-of-the-box and customisable AI tools, APIs and models that help
modernise business processes faster. Azure AI services include services for vision, speech, language,
decision, metrics advisor, immersive reader and more. Enterprises use these services to build intelligent
applications that automate document processing, improve customer service, understand the root cause of
anomalies and extract insights from content.
Azure Kubernetes Service (AKS)
Azure Kubernetes Service simplifies deploying managed Kubernetes clusters in Azure by offloading the
operational overhead to Azure. Kubernetes is a popular open-source platform for orchestrating containers
that run applications. Enterprises use AKS to run their containerised applications at scale with high
availability and performance.
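As a flavour of what running containerised applications at scale can look like in code, the sketch below uses the official Kubernetes Python client to scale a deployment in an AKS cluster. The deployment name and namespace are invented, and it assumes cluster credentials have already been fetched (for example with az aks get-credentials).

```python
from kubernetes import client, config

# Assumes a kubeconfig already pointing at the AKS cluster.
config.load_kube_config()

apps = client.AppsV1Api()

# Scale a hypothetical model-serving deployment to five replicas.
apps.patch_namespaced_deployment_scale(
    name="avatar-inference",
    namespace="production",
    body={"spec": {"replicas": 5}},
)
```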
Azure Cosmos DB
Azure Cosmos DB is a globally distributed, multi-model database service that offers single-digit millisecond
response times, automatic and instant scalability and guaranteed speed at any scale. Azure Cosmos DB
supports multiple data models including document, key-value, graph and column-family data. It also
supports multiple APIs, such as native NoSQL, MongoDB API, PostgreSQL API, Apache Cassandra API and
more. Enterprises use Azure Cosmos DB to store and query their data in the most suitable model and API
for their application needs.
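That automatic scalability can be configured when a container is created. The sketch below requests autoscale throughput with the azure-cosmos Python SDK; the database, container and maximum RU/s value are illustrative, and ThroughputProperties assumes a recent SDK version.

```python
from azure.cosmos import CosmosClient, PartitionKey, ThroughputProperties

# Placeholder account details.
client = CosmosClient(url="https://<your-account>.documents.azure.com:443/", credential="<your-key>")
database = client.create_database_if_not_exists("travel")

# Autoscale adjusts throughput up to the configured maximum RU/s as load changes.
database.create_container_if_not_exists(
    id="crowdedness",
    partition_key=PartitionKey(path="/trainNumber"),
    offer_throughput=ThroughputProperties(auto_scale_max_throughput=4000),
)
```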
Conclusion
Migrate to Azure and take advantage of its AI solutions to empower your team to
develop and deploy intelligent apps quickly while maintaining the highest standards
of security and cost efficiency.
Learn more about how Azure helps enterprises innovate intelligent apps
while saving on costs.
Contact Sales
© 2023 Microsoft Corporation. All rights reserved. This document is provided ‘as-is’. Information
and views expressed in this document, including URL and other internet website references, may
change without notice. You bear the risk of using it. This document does not provide you with
any legal rights to any intellectual property in any Microsoft product. You may copy and use this
document for your internal, reference purposes.