AI At Work
EBOOK
The ever-increasing adoption of artificial intelligence and machine learning across a variety of industries
is evidence of the transformative and tangible business impact these technologies have the potential to
provide — when integrated responsibly and transparently.
One sector in particular that is experiencing growth in incorporating AI is supply chain management.
From predictive capabilities for optimized demand planning to autonomous vehicles and warehouse
robotics, stakeholders across the supply chain will continue to harness a proliferation of data sources
and technology that enables them to cut costs and accelerate profits for organizations of all sizes and
across all industries.
Challenges and Barriers
to AI Adoption
In a 2018 research paper, McKinsey identified supply chain and manufacturing as one of the two main areas of impact for AI, along
with marketing and sales. The firm estimated the “unlocked value” of AI for supply chain and manufacturing to be $1.2-$2 trillion.
Why, then, isn’t AI adoption within the supply chain industry higher, especially considering that the segment has been reliant
on data and analytics for years? While the answer is multifaceted, it can be broken down into three main buckets:
People: On top of the difficulty in hiring AI talent — which is global and not unique to any one industry or
sector — is the fact that existing teams and people are not empowered to leverage new techniques like machine
learning and AI. Organizations need to build teams that can actually use data at scale, working together (i.e.,
bringing those with extensive knowledge on the business together with those that have deep skills in data
science) to choose use cases and deliver value.
Processes: A leading challenge in this area is a general lack of clean, consistent, and usable data which can then
be applied across the greater supply chain ecosystem. Getting this piece right is the basis of all other processes
surrounding AI execution. That means before diving in and tackling use cases, supply chain organizations need
to take a step back and create a comprehensive, big-picture plan for making the development and use of data
and models pervasive.
Technology: Without the technology to access complete and current information, supply chain organizations run
the risk of fragmented data, a lack of perspective into how the data-to-insights process can truly drive business
impact, and missed opportunities — which often translate directly into lost capital. Choosing
technology that can easily combine data from a multitude of sources, that provides a level of transparency, and
that helps upskill people and solidify processes around AI is a critical challenge to address.
Stakeholders need to understand that using data to generate insights at scale involves fundamental organizational change,
and so far, many supply chain organizations have been hesitant to recalibrate. It is clear that the incredible opportunity
presented by further advances in data analytics and AI runs up against equally significant challenges, culturally and politically.
However, the arc of history bends unambiguously toward a heightened role for AI. In many ways, the economy and environment
depend on it. While using data and analytics isn’t a groundbreaking feat in the supply chain industry, now is the time for these
organizations to break down silos, merge disparate datasets, and operationalize their efforts to enable top-down
change with scalable data initiatives instead of isolating AI to one element of the supply chain, such as delivery.
The rest of this white paper will be devoted to detailing high-value use cases in supply chain that can help jump-start AI efforts.
Selecting a few use cases with which to begin can help address early people, process, and technological challenges, setting up
future AI projects for success. For more on how to choose the right use cases, see Defining a Successful AI Project.
Supply chain functions have been developed and optimized across businesses for thousands of years. Most of the
common methodology, framework, and vocabulary was established in 1990 by the Supply-Chain Council and has been managed
by the APICS foundation since 2014.
The Supply Chain Operations Reference (SCOR) model proposes definitions of five major processes:
• Plan: Processes that balance aggregate demand and supply to develop a course of action which best meets sourcing,
production, and delivery requirements
• Source: Processes that procure goods and services to meet planned or actual demand
• Make: Processes that transform product to a finished state to meet planned or actual demand
• Deliver: Processes that provide finished goods and services to meet planned or actual demand, typically including
order management, transportation management, and distribution management
• Return: Processes associated with returning or receiving returned products for any reason, which extend into post-
delivery customer support
While Source, Make, and Deliver can be seen as sequential processes, the Plan operations exist at different levels of operation
management — from global to local — and the Return process is generally managed transversely.
[Figure: the SCOR end-to-end process. Plan spans the chain from global to local; Source, Make, and Deliver repeat across each tier; Return runs transversely.]
Today's challenges stem from the following reality: “As more data becomes available, more transparency is required by all parties to
manage supplier and customer relationships and improve performance. A new layer of complexity appears when the end-to-end
chain — the supplier’s suppliers and the customer’s customers — needs to be managed and continually improved.”
Many information management systems have been put in place to manage the essential functions of the business. For
example, manufacturing execution systems (MES) are able to manage KPIs like manufacturing cycle time, yield, total effective
equipment performance, and so on. Enterprise resource planning (ERP) systems manage
information across domains such as financial accounting, management accounting, and human resources, along with
their associated KPIs.
Warehouse management systems (WMS) manage planning, staffing, and organization information to secure SKU
traceability and operational activities, while customer relationship management (CRM) systems handle information related
to final consumer orders, customer profiling, marketing operations, and so on.
Transactional information is now recorded in these information management systems and represents the organization's data assets,
including the history and behavior of the business. The market is driving new requirements such as:
• Product origin – Ensures transparency about the materials and components that are used for the final product,
enabling supplier adherence and safeguards against counterfeiting
• Shipment status – Uses track and trace data to check the location of products and proactively communicate if
discrepancies arise
• External effects – Tracks the temperature, shock, and humidity information of products as they move through the
supply chain to identify when breaches occur (a minimal detection sketch follows this list)
• Marketing – Communicates the supply chain story with your customers to prove compliance with requirements and
to increase buyer confidence
• Recall management – Quickly identifies and recalls only the affected material or production batch in the case of
quality issues
• Quality and sustainability – Collects and aggregates traceability data related to quality and sustainability on a single
platform
• Risk management – Proactively identifies and manages risks across the supply chain
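To make the external-effects requirement concrete, here is a minimal Python sketch that flags breaches in shipment sensor logs. The log structure, field names, and thresholds are illustrative assumptions rather than a reference implementation.

```python
# Minimal sketch: flag external-effects breaches in shipment sensor logs.
# The log format, field names, and thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class SensorReading:
    shipment_id: str
    timestamp: datetime
    temperature_c: float
    humidity_pct: float

def find_breaches(readings: List[SensorReading],
                  max_temp_c: float = 8.0,
                  max_humidity_pct: float = 60.0) -> List[SensorReading]:
    """Return every reading where the shipment left its allowed envelope."""
    return [r for r in readings
            if r.temperature_c > max_temp_c or r.humidity_pct > max_humidity_pct]

# Example: a cold-chain shipment with one temperature excursion.
log = [
    SensorReading("SHP-001", datetime(2021, 3, 1, 8, 0), 5.2, 45.0),
    SensorReading("SHP-001", datetime(2021, 3, 1, 9, 0), 9.1, 47.0),  # breach
]
for breach in find_breaches(log):
    print(f"{breach.shipment_id}: breach at {breach.timestamp} "
          f"({breach.temperature_c} C, {breach.humidity_pct}% RH)")
```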
Such new capabilities are increasingly designed and implemented using new data and analytics technologies. Breaking down
the silos between monolithic information systems is now a must-have in analytics use cases where correlations, statistics,
machine learning, or other algorithmic capabilities will improve the productivity and effectiveness of operations.
CONTROL TOWER
When a global view of the data is available through a single point, the data and analytics capabilities to improve performance and
minimize risk in a given supply chain are virtually unlimited. To evaluate the business impact of these capabilities in organizations, we
propose three categories of usage, outlined below.
RULE-BASED MODEL
Most information management systems implement rule-based model capabilities. Still, they usually manage silos of
information.
Many algorithm families, like expert systems or operations research techniques, offer very useful capabilities for multiple
supply chain challenges, as illustrated in the following examples:
• A space optimizer for either warehouse storage or a transport container can be built on a minimum bounding box algorithm.
• An efficient freight supplier recommender system can be implemented using constraint satisfaction algorithms, where
variables such as cost, lead time, quality, and carbon emissions are expressed as constraints to optimize (see the sketch after this list).
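To illustrate the second example, here is a minimal Python sketch of a freight supplier recommender framed as constrained optimization. The carrier data, constraint values, and weighting are illustrative assumptions; a production system would typically use a dedicated solver.

```python
# Minimal sketch: recommend a freight supplier under hard constraints
# (lead time, quality) and a soft objective (cost plus a carbon penalty).
# Carrier data, constraint values, and weights are illustrative assumptions.
carriers = [
    {"name": "CarrierA", "cost": 1200, "lead_time_days": 4, "quality": 0.97, "kg_co2": 310},
    {"name": "CarrierB", "cost": 950,  "lead_time_days": 7, "quality": 0.92, "kg_co2": 280},
    {"name": "CarrierC", "cost": 1100, "lead_time_days": 5, "quality": 0.99, "kg_co2": 450},
]

def recommend(carriers, max_lead_time=6, min_quality=0.95, co2_weight=0.5):
    # Hard constraints: lead time and quality must be satisfied.
    feasible = [c for c in carriers
                if c["lead_time_days"] <= max_lead_time and c["quality"] >= min_quality]
    if not feasible:
        return None
    # Soft objective: minimize cost plus a weighted carbon penalty.
    return min(feasible, key=lambda c: c["cost"] + co2_weight * c["kg_co2"])

best = recommend(carriers)
print(best["name"] if best else "No carrier satisfies the constraints")
```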
Many of the systems that we call intelligent use rule-based models on a daily basis. Impressive implementations
of these expert systems, built on painstaking human formalization of the rules, enabled chess automation: Garry Kasparov lost
to Deep Blue in 1997, and Deep Blue was a rule-based system.
At the end of the day, rule-based model capabilities offer very useful applications. In these systems, the input dataset does not
have to be very large. It has to be relevant, but it can be incomplete, or even nonexistent. Most of these algorithms are available
in open-source repositories and generally include strong documentation and examples.
Robotic process automation (RPA) systems manage rules, and sometimes cognitive tasks, in a way that makes the overall process appear
intelligent. Business intelligence systems that compute business KPIs implement business rules. Process intelligence
systems that dynamically analyze a process using its transactional data to highlight business compliance issues or process
bottlenecks also implement business rules.
MACHINE LEARNING MODEL
A machine learning model infers rules from many examples (outputs) and their context (inputs).
Machine learning is about creating models that learn from data. These models are different from rule-based
models since no human has defined the rules; instead, the rules are computed from a history of data, such as process-stage
transactions. Businesses can effectively:
• Learn the optimal stock coverage level for each SKU, taking into account factors such as seasonality and safety
stock.
• Learn a supplier’s delivery lead time to evaluate a risk level and/or challenge the supplier’s performance (a minimal sketch follows this list).
• Evaluate an asset’s risk of failure to help better plan maintenance of said assets.
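As a concrete illustration of the second item, the sketch below learns a supplier's delivery lead time from past orders and flags risky promises. The features, data, and model choice (a random forest regressor) are illustrative assumptions, not a production pipeline.

```python
# Minimal sketch: learn a supplier's delivery lead time from order history,
# then flag an order whose promised date looks risky. Features, data, and
# model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy history: [order_quantity, month, supplier_backlog] -> observed lead time (days)
X = np.array([[100, 1, 3], [250, 6, 8], [80, 11, 2], [300, 12, 9], [150, 3, 4]])
y = np.array([5.0, 9.0, 4.0, 12.0, 6.0])

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

new_order = np.array([[200, 12, 7]])
predicted = model.predict(new_order)[0]
promised = 6
print(f"Predicted lead time: {predicted:.1f} days (promised: {promised})")
if predicted > promised:
    print("Risk flagged: challenge the supplier or adjust the plan")
```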
Many of these tasks already exist in business, and machine learning mainly adds speed, accuracy, and availability.
When such a model is implemented, the business can use it as a recommender: the human still makes all decisions and is
responsible for all actions, with business processes in place to govern the risk and strategy of those decisions. Going one step
further, organizations can automate the action itself, for example:
• Automating the schedule of maintenance interventions using asset failure prediction. Here, we go one step beyond
the detection of the failure of an asset and automate the planning of the maintenance operation.
• Automating a product order to a supplier when predicted customer demand increases. Here, demand
sensing can be recommended by a machine learning algorithm that understands market trends.
Before automating an order to a supplier, the technology needs to check the SKU levels, the lead time, the
manufacturing time, and so on in order to secure the ordering decision, as a human would do (see the sketch after this list).
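The sketch below illustrates the second example: an automated reorder that only fires after the same safety checks a planner would make. All names, values, and thresholds are illustrative assumptions.

```python
# Minimal sketch: automate a reorder triggered by a demand-sensing forecast,
# but only after the checks a human planner would make. All names, values,
# and thresholds are illustrative assumptions.
def should_place_order(forecast_demand, on_hand, on_order, safety_stock,
                       supplier_lead_time_days, max_lead_time_days=10):
    projected_position = on_hand + on_order - forecast_demand
    if projected_position >= safety_stock:
        return False, "Stock position already covers forecast demand"
    if supplier_lead_time_days > max_lead_time_days:
        return False, "Lead time too long: escalate to a human planner"
    return True, f"Order {safety_stock - projected_position} units"

decision, reason = should_place_order(
    forecast_demand=500, on_hand=320, on_order=100,
    safety_stock=50, supplier_lead_time_days=7)
print(decision, "-", reason)
```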
ARTIFICIAL INTELLIGENCE MODEL
An artificial intelligence model offers a new service to the company, either to improve its
performance or to deliver new services.
The main difference from a machine learning model is in the design of the service that uses the prediction: business artificial
intelligence should use the machine learning model to automate an action. Current data analytics initiatives mainly
implement data science proofs of concept. The automation part, which relies on an algorithm, is difficult because of the
feedback loop: every automated action needs a feedback loop that indicates whether the action taken was relevant or not.
It is just like when a human makes a decision; if it is a wrong decision, a boss will say so. With AI, on the other hand, the
“boss” is the final user, which could be a client, an operator, or a collaborator. Moreover, for a context to be automatable, the
process must be simple and well defined. If exceptions occur too frequently, many decisions cannot be learned and future mistakes
will be made.
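A minimal sketch of such a feedback loop might look like the following, where every automated action and its outcome are logged so the relevance of decisions can be measured over time. The log format and field names are illustrative assumptions.

```python
# Minimal sketch of a feedback loop: every automated action is logged, and the
# "boss" (client, operator, or collaborator) later records whether it was
# relevant. Log format and field names are illustrative assumptions.
import json
from datetime import datetime, timezone

FEEDBACK_LOG = "automated_actions.jsonl"

def record_action(action_id, prediction, action_taken):
    entry = {"action_id": action_id, "prediction": prediction, "action": action_taken,
             "timestamp": datetime.now(timezone.utc).isoformat(), "outcome": None}
    with open(FEEDBACK_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

def record_outcome(action_id, was_relevant):
    with open(FEEDBACK_LOG, "a") as f:
        f.write(json.dumps({"action_id": action_id, "outcome": was_relevant}) + "\n")

def relevance_rate():
    outcomes = []
    with open(FEEDBACK_LOG) as f:
        for line in f:
            entry = json.loads(line)
            if entry.get("outcome") is not None:
                outcomes.append(entry["outcome"])
    return sum(outcomes) / len(outcomes) if outcomes else None

record_action("A-001", prediction=0.87, action_taken="reorder 130 units of SKU-42")
record_outcome("A-001", was_relevant=True)
print(f"Share of automated actions judged relevant: {relevance_rate():.0%}")
```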
DEMAND FORECASTING
Machine learning is hardly a new concept in demand planning. Demand planning applications have been harnessing data to
forecast customer demand since at least the beginning of the 21st century.
What has evolved is the sophistication of the tools, which now incorporate far more data inputs and are therefore able to
deliver analysis with far greater precision. There are seemingly infinite data points worth considering when projecting how
many baseball hats a large retailer will sell in the next six months: economic conditions, weather, which team won the World
Series, the price of cotton, and so on. These data inputs, all of which have varying levels of importance, can provide a much
more accurate reflection of the future than simply looking at demand from the year before, which is what most businesses will
do in the absence of any additional data analysis.
One of the major benefits of AI-enabled demand forecasting is the speed with which the technology can process an enormous
volume of data. Even more important, though, is the potential for the program to continually improve. It
will be constantly comparing its forecasts to the results on the ground,
identifying bits of data that help explain why the
forecast was off.
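A simplified sketch of this kind of multi-signal forecasting is shown below, comparing a model trained on several inputs against a naive "same as last year" baseline and then measuring its error once actuals arrive. The features, data, and model choice are illustrative assumptions.

```python
# Minimal sketch: forecast demand from several signals rather than last year's
# sales alone, then measure the error once actuals arrive. The features
# (weather, cotton price, promotions) and data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Toy monthly history: [last_year_units, avg_temp_c, cotton_price, promo_flag]
X = np.array([
    [1000, 5, 1.8, 0], [1200, 12, 1.7, 1], [900, 20, 1.9, 0],
    [1500, 25, 1.6, 1], [1100, 15, 1.8, 0], [1300, 8, 1.7, 1],
])
y = np.array([1050, 1400, 880, 1750, 1120, 1500])   # actual units sold

model = GradientBoostingRegressor(random_state=0).fit(X, y)

next_month = np.array([[1250, 22, 1.65, 1]])
ml_forecast = model.predict(next_month)[0]
naive_forecast = next_month[0][0]          # "same as last year" baseline
print(f"ML forecast: {ml_forecast:.0f} units vs. naive baseline: {naive_forecast:.0f}")

# Closing the loop: compare the forecast to what actually happened.
actual = 1600
print(f"Forecast error: {abs(ml_forecast - actual):.0f} units")
```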
To be clear, not all of today’s demand forecasting tools incorporate nearly this many factors into their analysis. If they’re not
doing it now, though, they likely will in the future. The path forward always includes more data from more sources.
PRICE OPTIMIZATION
Just as AI tools are helping companies more accurately forecast the number of prospective customers, they are also helping
predict how much those customers are willing to pay.
One of the most obvious examples of AI-enabled price optimization comes via Uber and Lyft, where prices fluctuate by the
second based on the supply of drivers and the demand from customers. These systems may very well incorporate other factors, such as
weather, which can affect how much people are willing to pay for a ride.
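A highly simplified sketch of supply-and-demand-driven pricing is shown below. The formula, surge cap, and weather adjustment are illustrative assumptions and do not reflect any provider's actual algorithm.

```python
# Highly simplified sketch of supply-and-demand pricing in the spirit of
# ride-hailing surge pricing. The formula, cap, and weather adjustment are
# illustrative assumptions, not any provider's actual algorithm.
def dynamic_price(base_fare, riders_requesting, drivers_available,
                  bad_weather=False, max_multiplier=3.0):
    if drivers_available <= 0:
        multiplier = max_multiplier
    else:
        multiplier = max(1.0, riders_requesting / drivers_available)
    if bad_weather:
        multiplier *= 1.2   # assume riders tolerate higher prices in bad weather
    return round(base_fare * min(multiplier, max_multiplier), 2)

print(dynamic_price(base_fare=10.0, riders_requesting=120, drivers_available=60))                   # 20.0
print(dynamic_price(base_fare=10.0, riders_requesting=40, drivers_available=80, bad_weather=True))  # 12.0
```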
However, it’s not just Silicon Valley disruptors that are looking to AI for dynamic pricing. It’s an increasingly important part of
supply chain management in nearly every industry. Gartner estimated that approximately 1,000 companies had adopted AI-
powered price optimization at the end of 2018. There’s still a tremendous amount of room for growth: the same study estimated
there are 10,000 companies solely in the B2B space that would benefit from price optimization technology.
The bottom line is that more accurate demand forecasting results in a more efficient supply chain that delivers major cost
savings and increased profits. Particularly in retail, it’s common for companies to overstock rather than face the prospect of
turning customers away. Even if the products are nonperishable and can be returned or sold at some point, excess stock takes up
space, time, and ultimately money that could be dedicated to something that provides actual value at that specific point in time.
As AI-powered tools are able to forecast demand with much greater confidence, wholesalers, retailers, and any other organization
involved in purchasing and selling products will be able to align their inventory much more closely with what customers want.
Pricemoov’s challenge was that data originating from legacy information systems, Oracle, or MySQL was dirty and required a full-time developer
to perform long ETL steps in PHP for cleaning. Once cleaned, the datasets were painfully fed into a model through custom-built
pipelines. And once finished, the replication and deployment process for the next customer took weeks.
THE SOLUTION
• Significantly speed up data cleaning and exporting, leveraging Dataiku’s visual point-and-click interface to enable
less experienced staff to assist with this process, leaving tenured data scientists to focus on modeling rather than
data prep and plumbing.
• Non-technical teams (like marketing) can build their skills and scale their efforts thanks to an intuitive, visual
point-and-click interface. Longer term, the goal is to have them efficiently and independently leveraging website
clickstreams and HDFS datasets.
• Better define a specific price per customer that evolves over time by melding data indicating demand with customers’
willingness to pay.
• Deliver specific insight for local branches by quickly applying geo clustering (a minimal clustering sketch follows this list).
• Quickly submit pricing options to local branches of brick-and-mortar stores, which can then choose to accept the options
or not and can seamlessly share feedback to improve the model.
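As one hypothetical illustration of the geo-clustering step, the sketch below groups branch locations so pricing can be tailored per cluster. The coordinates, number of clusters, and use of k-means are illustrative assumptions, not Pricemoov's actual pipeline.

```python
# Hypothetical sketch of the geo-clustering step: group branch locations so
# pricing recommendations can be tailored per cluster. Coordinates, cluster
# count, and the use of k-means are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

# Toy branch coordinates (latitude, longitude)
branches = np.array([
    [48.85, 2.35], [48.80, 2.30], [45.76, 4.84],
    [45.75, 4.85], [43.30, 5.37], [43.28, 5.40],
])

kmeans = KMeans(n_clusters=3, random_state=0, n_init=10).fit(branches)
for coords, cluster in zip(branches, kmeans.labels_):
    print(f"Branch at {coords} -> pricing cluster {cluster}")
```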
THE RESULTS
DELIVERY OF 10X MORE MODELS
After implementing Dataiku to scale their pricing optimization system and its surrounding processes, Pricemoov saw:
• A two-week improvement in the speed at which they could produce pricing and forecast models.
• An improvement in staff performance and development, allowing new hires to prototype code in Jupyter
notebooks and sales teams to better sell the product.
DIGITAL TWINS
Digital twins are a big part of smart factories. They have played a major role in helping organizations optimize supply
chain processes, and they promise to make an even greater impact in the coming years.
The concept of a digital twin is relatively old. Although the term wasn’t in use at the time, it was digital twin thinking that
allowed NASA to save the lives of the three American astronauts aboard Apollo 13 in April of 1970. After one of the spacecraft’s
oxygen tanks exploded more than 200,000 miles from Earth, NASA engineers in Houston figured out how to solve the
problem by troubleshooting in the replica of the spacecraft they had on the ground.
These days, the digital twin is entirely virtual, built on data collected from sensors on the physical asset, whether that’s an
airplane or a diaper production line. The twin interprets all of the behavior it is detecting from the physical asset: temperature,
pressure, vibrations, etc. With a seamless stream of data from the machine, the digital twin can detect problems and predict
failures, providing technicians with a crucial jump on necessary fixes that will prevent costly repairs and downtime that will
disrupt the supply chain.
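A minimal sketch of how such a twin might flag abnormal behavior from its sensor stream is shown below, using an unsupervised anomaly detector. The sensor channels, training data, and model choice are illustrative assumptions.

```python
# Minimal sketch: an anomaly detector trained on the sensor stream a digital
# twin mirrors, flagging readings that look abnormal. Sensor channels,
# training data, and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

# Historical readings under normal operation: [temperature, pressure, vibration]
rng = np.random.default_rng(0)
normal = rng.normal(loc=[70.0, 5.0, 0.2], scale=[2.0, 0.3, 0.05], size=(500, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New readings streaming in from the physical asset.
incoming = np.array([
    [71.0, 5.1, 0.22],   # looks normal
    [88.0, 7.4, 0.65],   # looks like trouble
])
for reading, label in zip(incoming, detector.predict(incoming)):
    status = "OK" if label == 1 else "ANOMALY: schedule an inspection"
    print(reading, status)
```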
The precision offered by digital twins allows organizations to get much more out of their equipment. It’s not just breakdowns
that are avoided, but unnecessary maintenance or other unnecessary precautions.
Boeing offers a compelling example. Its 737 aircraft can carry a maximum cargo of 80,000 kilograms. However, calculating
the total cargo weight is often imprecise because it’s based on the estimates of technicians. As a precaution, the planes often
fly with far less cargo than they could safely carry. Digital twins that are being fed data from sensors on the plane can deliver
the precise cargo weight, allowing the organization to get more cargo on each plane.
Without digital twins, the best strategy for preventative maintenance is to look at historic trends that will tell you when a given
piece of equipment is likely to break down. The problem is that every manufactured item is unique. Not only do even mass-
produced items all have slight variations, but they all live distinct lives, subject to different conditions that change how they
behave. Digital twins are a way to track the life of the object in real time, responding to its unique needs.
GO FURTHER:
Get the Step-by-Step Guide to Introducing AI
into Equipment Maintenance
The return on investment (ROI) for digital twins will vary between organizations and industries, but companies that have
digitized their supply chains have reported major returns. Unilever piloted a digital twin factory in Brazil and achieved $2.8
million in operating cost savings and a 3% increase in productivity.
Predictive maintenance plays a pivotal role in keeping rising equipment and machinery costs at bay by harnessing data from
multiple and varied sources, bringing it together, and using machine learning capabilities to foresee equipment failure before
it occurs, cutting repair and labor costs as well as the revenue historically lost to machine downtime.
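A simplified sketch of predictive maintenance framed as a classification problem is shown below: given aggregated features for a machine, estimate the probability it fails within a given horizon. The features, data, and horizon are illustrative assumptions.

```python
# Simplified sketch: predictive maintenance as classification. Given
# aggregated features for a machine, estimate the probability it fails within
# the next 30 days. Features, data, and horizon are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Toy features: [hours_run, avg_vibration, error_count_last_week]
X = np.array([[1200, 0.20, 0], [8000, 0.55, 4], [300, 0.15, 0],
              [9500, 0.70, 7], [5000, 0.35, 1], [8700, 0.60, 5]])
y = np.array([0, 1, 0, 1, 0, 1])   # 1 = failed within 30 days

clf = GradientBoostingClassifier(random_state=0).fit(X, y)

machine = np.array([[7600, 0.58, 3]])
p_fail = clf.predict_proba(machine)[0, 1]
print(f"Estimated 30-day failure risk: {p_fail:.0%}")
if p_fail > 0.5:
    print("Schedule maintenance before the next production run")
```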
VIRTUAL ASSISTANTS
Perhaps the most visible display of AI’s impact on the supply chain comes in the form of customer-facing applications, notably
chatbots and robot operators.
Organizations of all types and sizes are relying more and more on AI-powered applications to handle frontline customer
service work. For most organizations, a large percentage of customer service calls relate to simple issues that can be resolved
by a robot just as well as, if not better than, a human operator. Moreover, robots are much cheaper and don’t require training.
Estimates on the impact of AI on customer service costs vary, but they all anticipate billions in savings in the coming years.
A 2019 study by Juniper projected that chatbots alone would save banks $7.3 billion in customer service costs by 2023.
IBM estimates that businesses spend $1.3 trillion a year to field 265 billion customer service calls; it reckons chatbots can
already handle roughly 80% of them.
Additionally, virtual assistants can play a significant role at different points along the supply chain and in a wide range of
industries. Not only do they produce major cost savings by reducing customer service staff, but they enhance productivity by
allowing companies to redirect manpower to more complex customer service requests that demand human intelligence and
finesse. In the coming years, virtual assistants will only become more sophisticated, offering even more opportunities
to supplement or substitute for human labor.
A fitting example comes from UPS. For decades, the company has provided drivers with precise routes aimed at reducing
delivery time and minimizing the use of fuel and the likelihood of crashes. One of the hallmarks of the strategy has been to
avoid left turns (or right turns in countries where people drive on the left) both because left turns are much more likely to
result in crashes than right turns and because they typically require a longer wait.
In recent years, however, UPS has put its time-cutting techniques on steroids thanks to its own AI-enabled mapping system,
ORION (On-Road Integrated Optimization Navigation). The company describes:
ORION is a 1,000-page, algorithmic optimization. It's a technique for faster, more practical
problem solving, providing a solution for an immediate situation, even if it is neither optimal nor
perfect. ORION doesn't necessarily map the perfect route or even the best one. Rather, ORION
gives UPS drivers workable routes, based on experience. It learns over time and speeds up the
process. It gets smarter.
For each driver’s route, ORION considers more than 200,000 options. Since deploying the algorithm in 2012, the company
claims it has shaved an average of six to eight miles off the daily routes of each driver, resulting in a total of 100 million fewer
miles driven per year! That has resulted in a fuel economy savings of $50 million per year. However, the total cost savings
due to reduced overtime pay, maintenance, and repairs has been far greater; at the end of 2016 UPS claimed that ORION had
resulted in a cost avoidance of $300-$400 million.
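To give a flavor of this class of problem, the sketch below builds a workable (not optimal) delivery route with a nearest-neighbor heuristic. The stop coordinates and distance metric are illustrative assumptions; ORION's actual optimization is far more sophisticated.

```python
# Sketch of a "workable, not optimal" route: a nearest-neighbor heuristic that
# always drives to the closest remaining stop. Stop coordinates and the
# straight-line distance metric are illustrative assumptions.
import math

def nearest_neighbor_route(depot, stops):
    route, current, remaining = [depot], depot, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: math.dist(current, s))  # closest stop
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)   # return to the depot at the end of the day
    return route

depot = (0.0, 0.0)
stops = [(2.0, 3.0), (5.0, 1.0), (1.0, 7.0), (6.0, 4.0)]
print(nearest_neighbor_route(depot, stops))
```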
Other companies are implementing technology aimed at making their trips more efficient. Unlike UPS, most won’t invent their
own systems, but they can increasingly look to a variety of third-party systems that similarly harness vast amounts of data to
determine the quickest and most fuel-efficient routes available, helping drivers (and eventually autonomous vehicles) to cut
costs on everything from route selection to where to stop for gas.
Organizations can once again leverage predictive maintenance here to foresee component failure and address any issues
before they impact other transportation devices in the fleet. Instead of solely taking historical data and performing static
analysis, supply chain organizations should use real-time data to predict future asset performance and ensure real-time
feedback is acted upon for proactive maintenance and intervention.
SAFER, MORE PREDICTABLE SHIPPING
No matter how much planning goes into supply chain decisions, there’s always the prospect of unexpected obstacles. The
supply route that might offer the fastest possible delivery may also include a higher risk of disruption due to political instability,
crime, or labor unrest. AI can provide companies a clearer vision of the risks involved with supply chain decisions, allowing
them to minimize the chance of disruptions.
Implementing autonomous vehicles outside of the clearly-defined boundaries of the warehouse is significantly more
challenging but that’s not going to stop the world’s largest companies. In 2019, Walmart began to test self-driving vans to
move products in between two warehouses in Bentonville, Ark. Whether or not the pilot project with Gatik, an autonomous
vehicle startup, is successful, there’s little doubt that what Walmart envisions will one day become a reality for it and other
large retailers. For Walmart, autonomous technology may represent an existential necessity: the mega-corporation’s online
business has never been profitable due to high delivery costs. In an increasingly digital marketplace ecosystem, figuring out
how to turn a profit online is not an option, but a requirement.
Finally, of course, there are drones. It’s hard to say when we can expect widespread commercial use of unmanned aerial
devices, but there are strong indications that it’s not far away. The presumed leader in the use of drones for delivery is Amazon,
which already has a name for its future drone-based delivery system: Amazon Prime Air. UPS and CVS have also established a
partnership that pioneered the first commercial drone delivery of a medical prescription in the United States in 2019.
Autonomous drones offer a variety of benefits. They require less human labor, which reduces costs, but they also offer the
prospect of much faster delivery, since they are able to avoid traffic and take shortcuts that are unavailable to cars. Amazon
envisions drones as part of a new service in which customers can get products at their door within 30 minutes of ordering.
That type of service could deliver yet another major shock to the retail industry, which has already been transformed in recent
years due to online shopping in general and Amazon in particular.
While it is only a matter of time before Amazon and others have the technology to do autonomous air deliveries, whether or
not the technology is deployed to its full capability largely depends on how it is viewed by the public, political leaders, and
regulators. Some people will be terrified by the prospect of the skies being overtaken by thousands of delivery drones. There
will be concerns about safety, privacy, noise, and the visual impact on the natural environment. Many of the same concerns
will pose obstacles to autonomous ground vehicles. Although there is strong evidence that autonomous vehicles will make
the roads safer, it may take a while for people to hand over the keys to robots.
AI-powered technology can help businesses become more agile in their decision-making and response times, both of which
benefit them from a capital perspective. As suppliers, warehouse managers, and the like optimize their logistics and simplify
complex business problems, they can effectively maintain customer relationships and high service levels, generating both
customer loyalty and net new business.
Businesses that are ready to introduce AI into their supply chain should start with these three steps:
1. Vet their existing technology stack. Do teams dealing with data have a platform that will centralize their AI efforts in one
product and interface? If not, what does the process look like to implement one? If yes, does it include usability across a
diverse user profile, allowing participation from data scientists to non-coding, line of business executives?
2. Prioritize data quality and governance. With any AI function, the data should be clean, trusted, and accessible. Someone
needs to be continually responsible for its quality, making sure it’s updated, formatted, and being used appropriately to
maximize impact across various parts of the supply chain, from product sourcing and demand planning to customer delivery.
3. Communicate any AI initiatives broadly and establish a big-picture plan for AI. Don’t just rush into projects that you
know will have an immediate impact on revenue or cost reduction; rather, work on establishing longer-term processes for
collaboration and communication between business units and data teams so that any concerns arising from particular
data projects can be easily addressed in context.
Over time, the powerful combination of AI, data analytics, and machine learning will establish new industry standards for the
supply chain and logistics blueprints of today. By utilizing predictive data to inform strategic business decisions and adopting
technology that drastically improves productivity, stakeholders can effectively optimize their end-to-end supply chain.
Whether businesses are seeking to improve last-mile delivery, create demand forecasting models to help avoid stagnant
inventory, or simply drive productivity and efficiency within fulfillment centers, the application of AI and machine learning will
continue to penetrate and shape the supply chain and logistics ecosystem.
45,000+ ACTIVE USERS
450+ CUSTOMERS
Dataiku is the platform for Everyday AI, systemizing the use of data for exceptional
business results. Organizations that use Dataiku elevate their people (whether technical
and working in code or on the business side and low- or no-code) to extraordinary, arming
them with the ability to make better day-to-day decisions with data.