
AI Maturity Assessment and Roadmap Planning Tool

This tool will help you analyze your organization's current- and target-state maturity in AI capabilities and systematically develop a plan for your target AI practices. Based on Info-Tech's AI Maturity Framework, evaluate the current and target performance levels for your organization's core AI practices.

This tool's assessment uses Info-Tech's industry-leading AI Maturity Framework and includes the following scoring legend:
1 = Exploration
2 = Incorporation
3 = Proliferation
4 = Optimization
5 = Transformation

As you navigate the tool, enter data in cells shaded light grey.

Current-State & Target-State AI Maturity
- Tab 2. Maturity Assessment
- Tab 3. Maturity Assessment Results

AI Initiative Planning and Roadmap Creation
- Tab 4. Initiative Planning
- Tab 5. Gantt Chart

Use the current-state and target-state assessment results as a starting point for building your list of AI initiatives by business value driver to support your maturity growth toward your target-state AI capabilities.

For acceptable use of this tool, refer to Info-Tech's Terms of Use. These documents are intended to supply general information only, not specific professional or personal advice, and are not intended to be used as a substitute for any kind of professional advice. Use this document either in whole or in part as a basis and guide for document creation. To customize this document with corporate marks and titles, simply replace the Info-Tech information in the Header and Footer fields of this document.
Current-State vs. Target-State AI Maturity Assessment

Assess yourself according to the statements below to determine your organization's current maturity level. The five key dimensions of effective artificial intelligence capabilities are:
- AI Governance
- Data Management/AI Data Capabilities
- People & AI Skills
- AI Processes
- Technology & AI Infrastructure

Indicate the current level of maturity for each capability statement based on the asset being evaluated.
Use the following legend to guide you in assigning a current level of maturity for each statement:
1 = Exploration
2 = Incorporation
3 = Proliferation
4 = Optimization
5 = Transformation
Note: you must have answered "Yes" to all previous levels to receive scores in later levels.
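The workbook does not show the formula behind its dimension scores. As an illustration of how the cumulative rule above could translate the L1-L5 answers into a score, here is a minimal sketch, assuming "Yes" counts as a full level, "In Progress" as half a level, and counting stops at the first level that is not fully achieved. This reproduces the sample scores in this workbook (0.5, 2.5, 3.5) but is an assumption, not a confirmed Info-Tech formula.

```python
# Hypothetical scoring sketch: Yes = full level, In Progress = half a level,
# counting stops at the first level not fully achieved (levels are cumulative).

def dimension_score(answers):
    """answers: list of five strings ('Yes', 'In Progress', 'No') for L1-L5."""
    score = 0.0
    for answer in answers:
        if answer == "Yes":
            score += 1.0          # level fully achieved, keep counting
        elif answer == "In Progress":
            score += 0.5          # partial credit, then stop
            break
        else:
            break                 # "No" ends the count
    return score

# Sample answers from this assessment:
print(dimension_score(["In Progress", "In Progress", "In Progress", "No", "No"]))  # 0.5 (AI Governance)
print(dimension_score(["Yes", "Yes", "Yes", "In Progress", "No"]))                 # 3.5 (Data Management for AI)
```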

Effective AI Capability Enablers

Dimension | Current Maturity | Target Maturity
AI Governance (L1-L5) | 0.5 | 3.0
Data Management for AI (L1-L5) | 3.5 | 4.0
People & AI Skills (L1-L5) | 2.0 | 3.0
AI Processes (L1-L5) | 2.5 | 3.0
Tech & AI Infrastructure (L1-L5) | 2.0 | 3.0

Questions

AI Governance (L1-L5)
L1. Are we working to define a role and scope for AI governance in our organization?
L2. Are there documented policies and standards for the development and deployment of pilot AI applications?
L3. Do we have an organizational approach to define accountability and delegate authority that can be applied to AI?
L4. Are we using tools to automate and validate compliance with policies, principles, and standards?
L5. Do we have the right governance structures in place to support the transformative use of AI?

Data Management for AI (L1-L5)
L1. Is there an awareness in our organization of the data requirements for developing AI applications?
L2. Do we have a successful, repeatable approach to preparing data for AI pilot projects?
L3. Does our organization have standards and dedicated staff for data management, data quality, data integration, and data governance?
L4. Have relevant data platforms been optimized for AI and data analytics, and are there tools to enforce compliance with responsible AI principles?
L5. Is there an organization-wide understanding of how data can support innovation and responsible use of AI?

People & AI Skills (L1-L5)
L1. Is there an awareness in our organization of the skills required to build AI applications?
L2. Do we have the skills required to implement an AI proof of concept (PoC)?
L3. Are there sufficient staff and skills available to the organization to develop, deploy, and run AI applications in production?
L4. Is there a group responsible for educating staff on AI best practices and our organization's responsible AI guiding principles?
L5. Is there a culture where the organization is constantly assessing where business capabilities, services, and products can be reengineered or augmented with AI?

AI Processes (L1-L5)
L1. Is there an awareness in our organization of the core processes and supporting tools that are required to build and support AI applications?
L2. Do we have a standard process to iteratively identify, select, and pilot new AI use cases?
L3. Are there standard processes to scale, release, deploy, support, and enable use of AI applications?
L4. Are we automating deployment, testing, governance, audit, and support processes across our AI environment?
L5. Does our organization lead our industry by continuously improving and reengineering core processes to drive improved business outcomes?

Tech & AI Infrastructure (L1-L5)
L1. Is there an awareness in our organization of the infrastructure (hardware and software) required to build AI applications?
L2. Do we have the required technology infrastructure and AI tools available to build pilot or one-off AI applications?
L3. Is there a shared, standardized technology infrastructure that can be used to build and run multiple AI applications?
L4. Is our technology infrastructure optimized for AI and advanced analytics, and can it be deployed or scaled on demand by teams building and running AI applications within the organization?
L5. Is our organization developing innovative approaches to acquiring, building, or running AI infrastructure?
Documentation to Collect and Assessed Levels

For each dimension, collect the documentation and data listed below to support the maturity assessment, then record the Current Level (L1-L5) and Target Level (L1-L5) for each capability statement.

AI Governance (L1-L5)
Collect this documentation and data to support the maturity assessment:
- Documented AI governance processes, roles and accountabilities (RACI), and responsible AI principles
- AI policies, standards, and best-practice guidelines
- Formal AI governance bodies, charters, memberships, decision authorities, and relationship with other governing bodies
- A fully formed AI risk framework and decision making from governance bodies in relation to alignment to that framework
Current level: L1 In Progress, L2 In Progress, L3 In Progress, L4 No, L5 No
Target level: L1 Yes, L2 Yes, L3 Yes, L4 No, L5 No

Data Management for AI (L1-L5)
Collect this documentation and data to support the maturity assessment:
- A comprehensive data strategy
- Data literacy programs
- Data products or repeatable data pipelines
- Data glossary, semantic modeling, and data discovery and sharing processes
- Documented data governance, integration and data quality processes, and roles and accountabilities (RACI)
- Data policies, standards, and best-practice guidelines
- Formal data governance bodies, charters, memberships, decision authorities, relationship with other governing bodies
- A fully formed data risk management framework and decision making from governance bodies in relation to alignment to that framework
Current level: L1 Yes, L2 Yes, L3 Yes, L4 In Progress, L5 No
Target level: L1 Yes, L2 Yes, L3 Yes, L4 Yes, L5 No

People & AI Skills (L1-L5)
Collect this documentation and data to support the maturity assessment:
- AI education programs
- AI job descriptions and competencies
- AI competency assessments and development plans
- PoC execution and adopted learnings
- AI Center of Excellence team and responsibilities
- AI leadership roles (e.g. CAO, CDO, Head of AI)
Current level: L1 Yes, L2 Yes, L3 No, L4 No, L5 No
Target level: L1 Yes, L2 Yes, L3 Yes, L4 No, L5 No

AI Processes (L1-L5)
Collect this documentation and data to support the maturity assessment:
- Use of ML tools and documentation of ML processes
- AI/ML pilot projects and post-implementation reviews (PIRs)
- Documented AI, ML, and data science best practices and standards
- AI risk controls embedded into AI and ML processes and operations
Current level: L1 Yes, L2 Yes, L3 In Progress, L4 No, L5 No
Target level: L1 Yes, L2 Yes, L3 Yes, L4 No, L5 No

Tech & AI Infrastructure (L1-L5)
Collect this documentation and data to support the maturity assessment:
- Technology strategy incorporating AI technologies
- AI infrastructure analysis and sizing to support value/business case
- Cloud strategy incorporating AI technologies
- Use of hyperscaler AI tools (e.g. Microsoft, AWS) or specialist AI technologies
- Use of productionization technologies (e.g. Kubernetes, Docker)
Current level: L1 Yes, L2 Yes, L3 No, L4 No, L5 No
Target level: L1 Yes, L2 Yes, L3 Yes, L4 No, L5 No
Rationale, Gap Description, and Required Efforts

For each dimension, the assessment also records:
- Rationale for Assessment: the rationale behind the assessment, based upon harvested artifacts and questioning.
- Gap Description: a description of the gap between the current and target maturity levels.
- Comments on Required Efforts to Raise Maturity: how maturity is to be raised to the target level through the planned initiatives and activities.
Artificial Intelligence Maturity Assessment Results

Review the current maturity levels for each of the artificial intelligence practices and capabilities below in the bar graph and in the radar chart.

[Bar graph: Artificial Intelligence – Current-State to Target-State Maturity Assessment. Current Maturity vs. Target Maturity on a 0.00-5.00 scale for AI Governance (L1-L5), Data Management for AI (L1-L5), People & AI Skills (L1-L5), AI Processes (L1-L5), and Tech & AI Infrastructure (L1-L5).]

[Radar chart: Current and Target AI Maturity. Current Maturity vs. Target Maturity across the same five dimensions.]
Current-State to Target-State Assessment by Category

Category | Current Maturity | Target Maturity
AI Governance (L1-L5) | 0.50 | 3.00
Data Management for AI (L1-L5) | 3.50 | 4.00
People & AI Skills (L1-L5) | 2.00 | 3.00
AI Processes (L1-L5) | 2.50 | 3.00
Tech & AI Infrastructure (L1-L5) | 2.00 | 3.00
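To see where the largest uplift is needed, the category summary above can be turned into a gap list (target minus current). The following is a minimal sketch; the dimension names and scores are the sample values from this workbook, and sorting by largest gap is simply one convenient presentation.

```python
# Compute the maturity gap (target minus current) per dimension, largest first,
# using the sample scores from this workbook.

maturity = {
    "AI Governance":            {"current": 0.5, "target": 3.0},
    "Data Management for AI":   {"current": 3.5, "target": 4.0},
    "People & AI Skills":       {"current": 2.0, "target": 3.0},
    "AI Processes":             {"current": 2.5, "target": 3.0},
    "Tech & AI Infrastructure": {"current": 2.0, "target": 3.0},
}

for name, levels in sorted(maturity.items(),
                           key=lambda item: item[1]["target"] - item[1]["current"],
                           reverse=True):
    gap = levels["target"] - levels["current"]
    print(f"{name}: current {levels['current']}, target {levels['target']}, gap {gap:+.1f}")

# AI Governance shows the largest gap (+2.5) and would drive most initiatives.
```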
Generative AI Initiative Planning
Steps:
1. Determine the high, medium, and low levels for your costs, staffing, alignment with business, and risk reduction as appropriate to your own organization.
2. Indicate what your gap initiatives are in the first column, and move through the columns on the right to determine the cost and benefit of each initiative.
3. After reviewing the recommended prioritization, indicate when each initiative will be completed based on the recommendation and any other organizational consideration. Assign an owner for each of the initiatives.
4. Use the remaining columns for any additional information on these initiatives that may need to be indicated.

Cost and Benefit to Your Organization on Average
- Alignment With Business Value (e.g. revenue generation, customer experience, alignment to corporate value proposition): High = 3, Medium = 2, Low = 1
- Initial Costs (e.g. FTE, hardware acquisition, licensing, data acquisition, skills hiring and training): High = 3, Medium = 2, Low = 1
- Ongoing Costs (e.g. FTE, people and continuous development, licensing, processing costs, data quality, support): High = 3, Medium = 2, Low = 1
- Alignment & Adherence to Responsible AI Principles (e.g. does the project align to and enforce data privacy principles): High = 3, Medium = 2, Low = 1
- Project Complexity (e.g. availability of data, data quality, hardware capacity, scarcity of resources): High = 3, Medium = 2, Low = 1
- Project Feasibility (e.g. urgency, clarity of requirements, executive support): High = 3, Medium = 2, Low = 1
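For reference, the Recommended Priority (Value/Complexity) column in the table below pairs the Business Value rating with the Project Complexity rating (e.g. "High/Low"). The sketch below only illustrates that pairing, plus one possible numeric ranking; the helper names are hypothetical and are not part of the tool itself.

```python
# Illustrative sketch: build the "Recommended Priority (Value/Complexity)" label
# from the High/Medium/Low ratings above. Helper names are hypothetical.

RATING_SCORES = {"High": 3, "Medium": 2, "Low": 1}   # scoring legend from this tab

def short(rating):
    """Abbreviate a rating the way the workbook does (Medium -> Med)."""
    return "Med" if rating == "Medium" else rating

def recommended_priority(business_value, project_complexity):
    """Pair value and complexity, e.g. ('High', 'Low') -> 'High/Low'."""
    return f"{short(business_value)}/{short(project_complexity)}"

def priority_rank(business_value, project_complexity):
    """One possible ranking (assumption): higher value and lower complexity first."""
    return RATING_SCORES[business_value] - RATING_SCORES[project_complexity]

print(recommended_priority("High", "Low"))  # High/Low
print(priority_rank("High", "Low"))         # 2 (most attractive under this assumption)
```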

Business Case, Complexity & Effort

Enter planned and in-flight initiatives by business value driver. For each initiative, rate Business Value and Initial Costs (business case) and Ongoing Costs, AI Principle Alignment, Project Complexity, and Project Feasibility (complexity and effort); review the Recommended Priority (Value/Complexity); then set the start and end dates. The Assigned Responsibility column, the Supports AI Capability Dimensions & Maturity Uplift columns (AI Governance, Data Management for AI, People & AI Skills, AI Processes, Tech & AI Infrastructure), the Initiatives by Year summary (Year 1 to Year 5), and the Author and date (DD-Month-YY) fields are left blank in this sample.

Planned / In-Flight Initiative(s) | Business Value | Initial Costs | Ongoing Costs | AI Principle Alignment | Project Complexity | Project Feasibility | Recommended Priority (Value/Complexity) | Start Date | End Date

Increase Revenue
Initiative 1: Create sales proposals | Low - 1 | High - 3 | Low - 1 | Low - 1 | High - 3 | High - 3 | Low/High | 1-Apr-24 | 31-Dec-24
Initiative 2 | Medium - 2 | Medium - 2 | High - 1 | Medium - 2 | Medium - 2 | Medium - 2 | Med/Med | 31-Mar-24 | 30-Jun-25
Initiative 3 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-24 | 30-Jun-24
Test 4 | High - 3 | Medium - 2 | Medium - 2 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-24 | 31-Dec-24
Test 5 | Low - 1 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Low/Med | 1-Jan-25 | 30-Jun-25
Test 6 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-25 | 31-Dec-25
Test 7 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-26 | 31-Dec-26
Test 8 | High - 3 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | High/Med | 1-Mar-26 | 30-Jun-26
Test 9 | High - 3 | Medium - 2 | High - 1 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-26 | 30-Sep-26
Test 10 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-26 | 31-Dec-26

Improve Operational Efficiency
Test 1 | High - 3 | Medium - 2 | Low - 1 | High - 3 | High - 3 | Medium - 2 | High/High | 1-Jan-24 | 31-Dec-24
Test 2 | High - 3 | Medium - 2 | High - 1 | Low - 1 | Low - 1 | Low - 1 | High/Low | 31-Mar-24 | 30-Jun-25
Test 3 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-24 | 30-Jun-24
Test 4 | High - 3 | Medium - 2 | Medium - 2 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-24 | 31-Dec-24
Test 5 | Low - 1 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Low/Med | 1-Jan-25 | 30-Jun-25
Test 6 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-25 | 31-Dec-25
Test 7 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-26 | 31-Dec-26
Test 8 | High - 3 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | High/Med | 1-Mar-26 | 30-Jun-26
Test 9 | High - 3 | Medium - 2 | High - 1 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-26 | 30-Sep-26
Test 10 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-26 | 31-Dec-26

Accelerate Innovation
Test 1 | Low - 1 | Medium - 2 | Low - 1 | High - 3 | High - 3 | Medium - 2 | Low/High | 1-Jan-24 | 31-Dec-24
Test 2 | Medium - 2 | Medium - 2 | High - 1 | Medium - 2 | Medium - 2 | Medium - 2 | Med/Med | 31-Mar-24 | 30-Jun-25
Test 3 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-24 | 30-Jun-24
Test 4 | High - 3 | Medium - 2 | Medium - 2 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-24 | 31-Dec-24
Test 5 | Low - 1 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Low/Med | 1-Jan-25 | 30-Jun-25
Test 6 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-25 | 31-Dec-25
Test 7 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-26 | 31-Dec-26
Test 8 | High - 3 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | High/Med | 1-Mar-26 | 30-Jun-26
Test 9 | Medium - 2 | Medium - 2 | High - 1 | Low - 1 | Low - 1 | Low - 1 | Med/Low | 1-Jul-26 | 30-Sep-26
Test 10 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-26 | 31-Dec-26

Mitigate Risk
Test 1 | Low - 1 | Medium - 2 | Low - 1 | High - 3 | High - 3 | Medium - 2 | Low/High | 1-Jan-24 | 31-Dec-24
Test 2 | Medium - 2 | Medium - 2 | High - 1 | Medium - 2 | Medium - 2 | Medium - 2 | Med/Med | 31-Mar-24 | 30-Jun-25
Test 3 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-24 | 30-Jun-24
Test 4 | High - 3 | Medium - 2 | Medium - 2 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-24 | 31-Dec-24
Test 5 | Low - 1 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Low/Med | 1-Jan-25 | 30-Jun-25
Test 6 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jul-25 | 31-Dec-25
Test 7 | Medium - 2 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | Med/Med | 1-Jan-26 | 31-Dec-26
Test 8 | High - 3 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | High/Med | 1-Mar-26 | 30-Jun-26
Test 9 | High - 3 | Medium - 2 | High - 1 | Low - 1 | Low - 1 | Low - 1 | High/Low | 1-Jul-26 | 30-Sep-26
Test 10 | High - 3 | Medium - 2 | High - 1 | High - 1 | Medium - 2 | Medium - 2 | High/Med | 1-Jul-26 | 31-Dec-26
Artificial Intelligence Roadmap (Gantt Chart Format)
Steps:
1. Review the Gantt chart below, based on the start/end dates in tab 4.
2. Ensure that this chart is realistic for your own organization. If there are any dates that need to be adjusted, adjust them accordingly in tab 4 to ensure consistency in this tool.
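The Duration (Days) values in the table below appear to be the simple difference between the end and start dates. The following is a minimal sketch of that calculation, shown only as a cross-check under that assumption, not as the workbook's own formula.

```python
# Duration in days between start and end dates, matching the sample Gantt data.
from datetime import date

def duration_days(start, end):
    """Number of days between two dates, as plotted on the Gantt chart."""
    return (end - start).days

print(duration_days(date(2024, 4, 1), date(2024, 12, 31)))   # 274 (Initiative 1)
print(duration_days(date(2024, 3, 31), date(2025, 6, 30)))   # 456 (Initiative 2)
```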

Value Driver | Project Name | Starts | Ends | Duration (Days)

Increase Revenue
Initiative 1: Create sales proposals | 1-Apr-24 | 31-Dec-24 | 274
Initiative 2 | 31-Mar-24 | 30-Jun-25 | 456
Initiative 3 | 1-Jan-24 | 30-Jun-24 | 181
Test 4 | 1-Jul-24 | 31-Dec-24 | 183
Test 5 | 1-Jan-25 | 30-Jun-25 | 180
Test 6 | 1-Jul-25 | 31-Dec-25 | 183
Test 7 | 1-Jan-26 | 31-Dec-26 | 364
Test 8 | 1-Mar-26 | 30-Jun-26 | 121
Test 9 | 1-Jul-26 | 30-Sep-26 | 91
Test 10 | 1-Jul-26 | 31-Dec-26 | 183

Improve Operational Efficiency
Test 1 | 1-Jan-24 | 31-Dec-24 | 365
Test 2 | 31-Mar-24 | 30-Jun-25 | 456
Test 3 | 1-Jan-24 | 30-Jun-24 | 181
Test 4 | 1-Jul-24 | 31-Dec-24 | 183
Test 5 | 1-Jan-25 | 30-Jun-25 | 180
Test 6 | 1-Jul-25 | 31-Dec-25 | 183
Test 7 | 1-Jan-26 | 31-Dec-26 | 364
Test 8 | 1-Mar-26 | 30-Jun-26 | 121
Test 9 | 1-Jul-26 | 30-Sep-26 | 91
Test 10 | 1-Jul-26 | 31-Dec-26 | 183

Accelerate Innovation
Test 1 | 1-Jan-24 | 31-Dec-24 | 365
Test 2 | 31-Mar-24 | 30-Jun-25 | 456
Test 3 | 1-Jan-24 | 30-Jun-24 | 181
Test 4 | 1-Jul-24 | 31-Dec-24 | 183
Test 5 | 1-Jan-25 | 30-Jun-25 | 180
Test 6 | 1-Jul-25 | 31-Dec-25 | 183
Test 7 | 1-Jan-26 | 31-Dec-26 | 364
Test 8 | 1-Mar-26 | 30-Jun-26 | 121
Test 9 | 1-Jul-26 | 30-Sep-26 | 91
Test 10 | 1-Jul-26 | 31-Dec-26 | 183

Mitigate Risk
Test 1 | 1-Jan-24 | 31-Dec-24 | 365
Test 2 | 31-Mar-24 | 30-Jun-25 | 456
Test 3 | 1-Jan-24 | 30-Jun-24 | 181
Test 4 | 1-Jul-24 | 31-Dec-24 | 183
Test 5 | 1-Jan-25 | 30-Jun-25 | 180
Test 6 | 1-Jul-25 | 31-Dec-25 | 183
Test 7 | 1-Jan-26 | 31-Dec-26 | 364
Test 8 | 1-Mar-26 | 30-Jun-26 | 121
Test 9 | 1-Jul-26 | 30-Sep-26 | 91
Test 10 | 1-Jul-26 | 31-Dec-26 | 183

[Gantt chart: bars for each initiative plotted by month (Jan through Dec) across the roadmap years; the plotted timeline is not reproduced here.]