
Data Quality Framework

Version 1.3

Part of the Information Governance Framework (Schedule 16)


Documentation Control

Document Title | Format | Version | Status | Location | Approved | Author | Date
Data Quality Policy June 2010 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | June 2010
Data Quality Policy March 2011 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | March 2011
Data Quality Policy August 2011 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | August 2011
Data Quality Policy February 2013 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | February 2013
Data Quality Framework Draft | Word | 0.2 | Superseded | Network Drive (Bkup) * | No | Tony Holder | December 2013
Data Quality Framework | Word, PDF | 1.0 | Superseded | Network Drive (Bkup) * | No | Tony Holder | February 2014
Data Quality Framework | Word, PDF | 1.1 | Superseded | Network Drive (Bkup) * | Yes | Tony Holder | March 2014
Data Quality Framework | Word, PDF | 1.2 | Superseded | Network Drive (Bkup) * | Yes | Tony Holder | January 2015
Data Quality Framework | Word, PDF | 1.3 | Live | Network Drive, Intranet | Yes | Tony Holder | March 2015
* Original document has since been removed from the Network Drive but remains on the Network Backup for retention purposes.

Change Control

Document Title | Version | Effective Date | Pages | Reason
Data Quality Policy June 2010 | 0.1 | June 2010 | Not specified | Both protocol and policy updated to reflect new national guidance, links to spot check questions, recommendations from the Audit Commission and Internal Audit, and links to the new auditing method developed by the performance team.
Data Quality Policy March 2011 | 0.1 | March 2011 | Not specified | Updated to reflect changes to the national context and to include references/links to useful documents/websites.
Data Quality Policy August 2011 | 0.1 | August 2011 | Not specified | Updated to reflect changes to the corporate performance framework and data quality expectations.
Data Quality Policy February 2013 | 0.1 | February 2013 | Not specified | Minor updates to reflect internal changes to council structure terminology and to include the One Council logo.
Data Quality Framework Draft | 0.2 | December 2013 | All pages | Major changes to reflect data quality across the authority and not just the corporate performance framework. New terminology and policy alignment to Information Governance and Council Strategy/Priorities.
Data Quality Framework Draft | 0.2 | February 2014 | All pages | This document was superseded as it was no longer in draft form; it became the Data Quality Framework document, subject to approval.
Data Quality Framework | 1.0 | February 2014 | All pages | Minor changes to various pages to enable the document to be presented for approval. Submitted for consultation at II&VFM in February 2014, followed by CMT and cabinet member approval. Schedule 15 added to the title page to advise that this document is part of the Information Governance Framework.
Data Quality Framework | 1.1 | March 2014 | Pages 4, 5, 11, 13, 14 | Introduction updated, Audit Committee added to Roles, and the Auditing and Reporting section updated, along with a small section advising that KPI audits will be prioritised on council Strategy Priorities.
Data Quality Framework | 1.2 | January 2015 | Page 18 | KPI section updated to include Internal Audit requirement.
Data Quality Framework | 1.3 | March 2015 | Page 17, front cover | Removed the part advising that this would be stored on PMS as this has changed; changed the schedule number as the IG Framework was updated to include new sections.

Contents
1. Introduction to Data Quality ................................................................................................................... 4
1.1. Data hierarchy to decision making .................................................................................................... 5
1.2. Where data quality problems occur .................................................................................................. 5
2. Purpose..................................................................................................................................................... 7
3. Stakeholders and their information needs ............................................................................................ 7
4. Training ..................................................................................................................................................... 8
5. Roles and Responsibilities ..................................................................................................................... 9
Cabinet Member ....................................................................................................................... 9
Directors/Assistant Directors .................................................................................................... 9
Heads of Service ....................................................................................................................... 9
Service or Team Managers ....................................................................................................... 9
Information Asset Owners (IAOs) ............................................................................................. 9
Data Asset Owners (DAOs) ..................................................................................................... 10
Performance Management Editors ......................................................................................... 10
Performance Management Owners ........................................................................................ 10
Performance Management Approvers ................................................................................... 10
All Officers and Elected Members .......................................................................................... 10
Information Providers (Internal) .............................................................................................. 11
Information Providers (External) ............................................................................................. 11
Audit Committee ..................................................................................................................... 11
Senior Data Quality and Information Management Officer .................................................... 11

6. Auditing and Reporting ......................................................................................................................... 12


6.1. Auditing Authority Systems .......................................................................................................... 12
6.2. Auditing Key Performance Indicators (KPIs) ................................................................................. 13
6.3. Reporting ......................................................................................................................................... 13
Appendix 1 ..................................................................................................................................................... 14
Aspects of Data Quality ........................................................................................................................... 14
Appendix 2 ..................................................................................................................................................... 15
Five step approach to data quality .......................................................................................................... 15
Data Management Plan Audit ................................................................................................................. 16
Audit(s) Process Map ............................................................................................................................... 17
Appendix 3 ..................................................................................................................................................... 18
Data Quality Protocol for performance management............................................................................. 18

1. Introduction to Data Quality
Good data quality is required for any organisation to plan, make its key decisions, deploy
its resources and run its day-to-day operations smoothly. Data quality can mean
different things depending on your role within the council, but fundamentally business
analysis and business intelligence rely on the quality of the data used, and this can be
affected by the way it is:

Captured
Entered
Stored
Managed

The process of data quality assurance verifies the reliability and effectiveness of data.
Managing data quality requires a periodic approach and typically involves updating,
standardising and cleansing records.

A definition of Data Quality

"Data are of high quality if they are fit for their intended uses in operations, decision making
and planning" (J. M. Juran).

Data quality is the perception or assessment of data from a system (manual or
electronic) to establish the data's fitness to serve a particular purpose in a given
situation. This may be to deliver a service, to provide data to third parties, etc. Aspects of
data quality include:

Accuracy
Validity
Reliability
Timeliness
Relevance
Completeness

Please refer to Appendix 1 for a detailed explanation of the aspects of data quality.

Common data quality issues are incomplete or missing data, outdated information,
duplicate data and inputting errors. Assuring data quality can increase efficiency, enhance
customer satisfaction and enable more informed decisions.

The terms data, information, knowledge and wisdom are frequently used
interchangeably throughout the authority. Data is the raw material from which the authority
generates information. The diagram below provides a pictorial representation of the data
hierarchy, from the basic raw material of data to the wisdom that becomes the keystone of
informed decision making.

1.1. Data hierarchy to decision making

Data is captured by services for several reasons:

Operational running of a service
Measure progress towards targets, outcomes and priorities
Set targets
Compare current and past performance
Inform policy decisions
Enable the correct use of finite resources
Understand customer needs and requirements

Producing data that is fit for purpose should not be an end in itself, but an integral part of
the authority's operational, performance management, and governance arrangements.
Organisations that put data quality at the heart of their performance management systems
are most likely to be actively managing data in all aspects of their day-to-day business, in
a way that is proportionate to the cost of collection, turning that data into reliable
information that in turn provides the knowledge and wisdom to make decision making
more informed.

1.2. Where data quality problems occur


A key thing to understand about data quality is that it is neither purely an Information
Technology problem nor purely a service problem; in reality it is usually a combination of
both. The causes of poor data quality are often complex, involving inter-relationships
between people, process and technology. This has significant implications for how to go
about improving data quality: poor data quality is a problem for the authority as a whole.

People

Inadequate training
Sometimes people enter incorrect data because they have not been properly trained. For
example, a fault repair data officer who has not been made aware that fault codes
(identifying the cause of the fault) have been updated.

Human error
When people input data via keyboards or handwrite data into fields on forms, from time to
time they make mistakes. For example, an order officer who mishears a citizen's
postcode, or a citizen online who types in the wrong postcode. The most common errors
are those of transcription (where an incorrect keystroke is made) and transposition (where
digits are switched inadvertently).

Data as power
Many people in organisations want to capture and manage their own data. They feel this
gives them direct control over the data and are often reluctant to share it with others in the
organisation, seeing it as a source of influence and advantage.

Lack of ownership or responsibility for data quality
This is a major problem in many organisations. When data is wrong, all too often no-one is
held responsible for fixing the problem.

Denial
Many people are in denial about data quality problems, as their organisational culture
might regard them as failing. So they pretend all is fine, covering up issues as and when
they arise, not exposing them to others or seeking help to address them.

Process

Inappropriate goals and objectives
Many data quality problems arise where goals and objectives are set which pay no
attention to the need to preserve and enhance data quality. For example, in a sales
department telephone sales people are incentivised solely on the basis of the number of
calls they make and/or the value of the sales they secure. In this environment, it is
inevitable that data quality will suffer.

Process failures
Data quality problems can also arise from the fact that processes can be designed to
preserve and enhance data quality but are not adhered to. For example, in a major
telecommunication company, field engineers are expected to record any changes they
make to customer connections in roadside cabinets. Sometimes they fail to do this, so
that the physical connections wired in are not reflected in the data sources which store
connection data.

Process design
All processes require data to support them, and in a world of siloed processes it is hardly
surprising that supporting data stores are similarly fragmented. The HR department holds
data on employees used for personnel processes, the Finance Department a version for
payroll purposes, and so on. As a result, multiple copies of data proliferate.

Lack of common data standards
Often data standards have been defined to meet the specific needs of one business
process, so when data is transferred to another process the data takes on a different
meaning. For example, consider a global organisation that operates in both the UK and
USA. In the USA dates are normally formatted as MM-DD-YY, whereas in the UK they are
DD-MM-YY. This could present a major problem if data from the two territories ever needs
to be combined.

Poorly designed and implemented processes tend to generate poor quality data. Well
designed and implemented processes tend to generate good quality data. Data quality
can therefore be a good measure of both process design and implementation.

Technology

Shortcomings in data capture
IT data capture design needs to focus on providing pre-populated and/or existing data
held within or outside the organisation, to ensure that the data capture process creates
data that is consistent with data already held and with industry standards. For example,
using an address generation tool, which creates an address sourced from the UK Postal
Address File (PAF), only requires the agent or customer to capture a postcode to generate
a standard format address that can subsequently be matched to addresses already held.

Multiple data sources
The development of IT systems to support specific business processes has created the
multiple data sources we see in so many organisations today, with larger organisations
often holding hundreds of copies of variations of what is in effect the same data.

Process design
Some experts have suggested that up to 80% of a typical IT department's focus is on
trying to manage data interactions between IT systems. The existence of multiple, even
hundreds of systems creates the need for IT to build and maintain thousands of data
transfer and application interfaces between them. When changes occur in one system (for
instance the creation of a new data field in a customer database), these interfaces have to
cater for them, as the change will have a downstream impact on other systems. In this
world of immense complexity, things inevitably go wrong, with data quality frequently
suffering.
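To make the capture-time issues above concrete, the following is a minimal, illustrative sketch (in Python) of two defensive checks: a format check that catches many postcode transcription errors, and date parsing with an explicit format so UK and US conventions are never confused. The pattern and sample values are assumptions for the example, not a council standard.

    # Illustrative only: two capture-time checks for the issues described above.
    import re
    from datetime import datetime

    # Rough shape of a UK postcode (format check only; a production system
    # would validate against the Postal Address File, PAF).
    UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$")

    def looks_like_postcode(raw: str) -> bool:
        """Normalise the input, then check it has the shape of a UK postcode."""
        return bool(UK_POSTCODE.match(raw.strip().upper()))

    def parse_date(value: str, fmt: str = "%d-%m-%y") -> datetime:
        """Parse with an explicit format rather than guessing: '05-03-15' is
        5 March 2015 in DD-MM-YY but 3 May 2015 in MM-DD-YY."""
        return datetime.strptime(value, fmt)

    print(looks_like_postcode("dn15 6tr"))     # True  (normalised, then matched)
    print(looks_like_postcode("DN15 6T"))      # False (transcription error caught)
    print(parse_date("05-03-15"))              # 2015-03-05, read as DD-MM-YY
    print(parse_date("05-03-15", "%m-%d-%y"))  # 2015-05-03, read as MM-DD-YY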

2. Purpose
The purpose of this policy is to set out North Lincolnshire Council's approach for ensuring
that staff at all levels who have responsibility for collecting, analysing, manipulating or
using data:

Have a greater awareness of data quality and their responsibilities
Recognise the importance of good quality data, and how they contribute to it
Have the knowledge and competencies to produce good quality data and information.

This data quality framework links with the information governance framework, and services
should follow this policy when dealing with data. All data used to inform decision making
should be sound, and where services work with partners and/or contractors they should be
satisfied that the information that comes from these sources is correct. Services are
required to put in place a process of reviewing internal and external data; this should be
done by setting their own performance indicators for data quality and conducting audits.
Where data comes from an external body that is contracted through the council, the
requirement to produce accurate and timely data should be specifically built into the
contract terms and conditions.

3. Stakeholders and their information needs

Stakeholder | Information needs/uses
Service users and the public | Exercising choice, understanding the service standards to expect and holding public bodies to account.
Staff in public sector organisations | Delivering services day to day at the front line; the starting point for data collection and use.
Managers in public sector organisations | Monitoring and managing service delivery and benchmarking performance against others.
Local councillors, trust non-executives | Decision making; monitoring strategic objectives, targets and use of resources; ensuring accountability.
Partners | Monitoring the achievement of partnership targets and the use of resources; ensuring accountability.
Commissioners | Identifying population need and determining priorities and services for meeting it; monitoring the achievement of contractual arrangements.
Central government | Developing policy; monitoring progress of new initiatives and the achievement of national targets; publishing local performance information at national level; identifying poorly performing organisations and rewarding good performance with autonomy and resources.
Regulators | Monitoring performance and use of resources of local bodies; publishing comparative performance information and national studies; planning work programmes proportionate to risk.

4. Training
Awareness of data quality will be raised through council wide notices, e-learning courses
and, if required, workshop sessions for staff involved in data collection. The focus of the
training is to:

Develop greater awareness and understanding of what is meant by data quality.
Raise awareness of the content of the data quality framework and the importance of
data quality across the organisation.
Raise awareness of the importance of continual verification (auditing) of data to
check data quality arrangements.
Highlight roles and responsibilities for staff in terms of assuring the quality of data
and information they use, analyse, supply or have responsibility for.

Data quality is also included in both the corporate and management induction programmes
and is part of the employee generic competencies. Training needs should also be
addressed through one to ones and the employee appraisal process.

The council's aim is to keep reviewing and improving, and to have data quality
arrangements which are robust enough to withstand internal and external scrutiny.

5. Roles and Responsibilities

Cabinet Member
Cabinet Members have overall member responsibility for performance and need to be proactive in
raising issues around data quality and in challenging progress made against corporate and service
data quality objectives.

Directors/Assistant Directors

Directors and Assistant Directors have overall responsibility for challenging performance and data
quality.

Heads of Service

Heads of Service are accountable for the accuracy and quality of data and information within their
service area and need to ensure their officers are aware of their data quality requirements.

Service or Team Managers


Service Managers and Team Managers are responsible for the accuracy and quality of data and
information, undertaking necessary checks, complying with the guidance, and identifying and
implementing measures to improve the quality of data.

Information Asset Owner (IAO)


IAOs are concerned with the information used within the running of their particular area of business.
They are senior individuals and their role is to understand what information is held, what is added
and what is removed, how information is moved, and who has access and why. As a result they are
able to understand and address risks to the information, ensure that information is fully used
within the law for the public good, and provide written input to the SIRO annually on the security and
use of their asset.

Roles and Responsibilities continued

Data Asset Owner (DAO)


Data Asset Owner responsibilities are the same as those of IAOs, but for specific systems or
processes. They understand what information and data is held, what is added and what is
removed, how information or data is moved, and who has access and why. They are responsible
for the data quality of the system or process.

Performance Management Editors


Editors have an understanding of the relevant indicator definitions and guidance documents
sufficient to fulfil their roles. They ensure results, target data and relevant information are entered
into the Performance Management System (PMS) in line with the authority's data quality policy.

Performance Management Owners


Owners ensure that results, target data and information are entered onto the PMS in line with the
authority's data quality policy, and that all data has been subjected to data validation and quality
assurance to uphold the level of quality and ensure that accountability is clearly defined.

Performance Management Approvers


Approvers have a thorough understanding of the KPI definition and ensure that results submitted
have been subject to data validation and quality assurance checks in line with the authority's
policies.

All Officers and Elected Members


All Officers and Elected Members need to be aware of the data quality policy and, where
applicable, should be aware of how their day to day activities contribute to the quality and fitness
for purpose of data.

Roles and Responsibilities continued

Information Providers (Internal)


Authority departments or teams providing data or information must ensure it is supplied to the
relevant services in line with agreed policies. Anyone sharing data or information needs to ensure
that it is fit for purpose and of good quality.

Information Providers (External)


Authority departments or teams receiving data or information from external sources are responsible
for ensuring that the data or information received is fit for purpose and of good quality. They need
to ensure that sound governance arrangements are in place.

Audit Committee
The Audit Committee seeks assurance from officers and external audit that the council's audit,
financial, constitutional, governance, performance and risk arrangements are in place, and monitors
how well they are performing. It oversees the council's Annual Governance Statement, and seeks
assurance that systems, best practice and ongoing improvement practices are in place in relation
to Information Governance, including Data Quality.

Senior Data Quality and Information Management Officer

Ensures that results, target data and information are entered onto the PMS in line with the
authority's data quality policy, and that all data has been subjected to data validation and quality
assurance to uphold the level of quality and ensure that accountability is clearly defined.

6. Auditing and Reporting
All documentation created/produced, published and used during the auditing of systems and KPIs
will be subject to the authority's Records Management Policy and will therefore be subject to
appropriate retention and disposal.

6.1 Auditing Authority Systems


It is impossible for the authority to achieve 100% data quality; the aim should be for data to
be fit for purpose, not perfect. Data is an authority asset, so any data improvement
initiatives must be owned and led by the business in partnership with the Information
Governance Team (IGT). The IGT adopts a five step approach to data improvement; the
diagram (Fig.1) represents this five step approach and further detail is given in Appendix 2.

(Fig.1: the five step approach to data improvement)

A key element of ensuring sound data quality arrangements is the on-going programme of
audits. The Information Governance Team, in partnership with the authority's service areas,
will undertake an annual data management plan audit and system/process data quality
audit on a 5 year rolling programme:

Year 1 - Full data management plan audit followed, where appropriate, by a full data quality
audit.

Years 2-5 - Review of the data management plan and a random sample data quality audit.

Should there be a requirement, a full review of the data management plan and a full data
quality audit would be conducted.

Information is a key corporate asset to the authority and the Information Governance
Framework is a key policy; the data management plan and audit help to deliver it. The
audit could be completed by appropriate officers in the authority (i.e. Information Asset
Owners (IAOs) or Data Asset Owners (DAOs)). Anticipated completion time is
approximately 2-3 hours, depending on the information required to be gathered to support
the answers. It is not expected that the audit would be completed in one session, but the
Information Governance Team (IGT) would expect the audit to be completed within 2
weeks of commencement. On submission, the IGT will produce an action plan along with
a request (if appropriate) for a copy of the system/process data for profiling and a data
quality audit. The following 4 years would see random sample data quality audits,
ensuring that data quality is maintained or improved within reasonable percentages.
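As an illustration of the rolling programme described above, the sketch below (Python, with hypothetical names; not part of the framework itself) generates the five year sequence of audit events for a system:

    # Minimal sketch, assuming the 5 year cycle described above: a full DMP
    # audit in year 1, then plan review plus random sample audit in years 2-5.
    def audit_programme(system: str, start_year: int) -> list[str]:
        events = [f"{start_year}: {system} - full data management plan audit"
                  " (full data quality audit where appropriate)"]
        for year in range(start_year + 1, start_year + 5):
            events.append(f"{year}: {system} - DMP review + random sample"
                          " data quality audit")
        return events

    for event in audit_programme("Example system", 2015):
        print(event)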

6.2 Auditing Key Performance Indicators (KPIs)
All KPIs are subject to a Data Quality Audit. This framework recognises that the auditing
of KPIs requires more specific guidance and explanation; Appendix 3 therefore contains
the Data Quality Protocol for performance management.

Each KPI, once audited, will be monitored on the working KPI system, which is maintained
by the Strategy & Information Governance Team. This system contains the dates of the
audit and the KPI star rating, along with the full re-audit date and review dates if appropriate.

Each KPI undergoes an initial audit, prioritised according to the council's annual strategy,
i.e. priority one KPIs would generally be audited first, then priority two KPIs, and so on.
Once the audit has been completed, the date is entered into the working KPI system along
with the KPI's star rating. If all requirements have been met and there are no action
plans/recommendations from the audit, the KPI is awarded 5 stars and is then only
subject to a re-audit in 18 months' time. Should the audit fall short of five stars, a review is
required to ensure that the actions/recommendations have been implemented. The table
below explains the review periods:

Star Rating | Review Period
**** | 1 month
*** | 2 months
** | 3 months
* | 4 months
Q | 5 months

A Q rating refers to an audit where significant errors have been found in the calculation of
the Key Performance Indicator.
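A minimal sketch of the table above, mapping a star rating to its review period; the helper is hypothetical and not part of the working KPI system:

    # Review periods from the table above; "Q" marks an audit with significant
    # errors in the KPI calculation. A 5-star audit needs no review, only a
    # re-audit after 18 months. Illustrative helper only.
    REVIEW_MONTHS = {"****": 1, "***": 2, "**": 3, "*": 4, "Q": 5}

    def review_period_months(rating: str) -> int | None:
        """Months until review, or None for 5 stars (re-audit in 18 months)."""
        if rating == "*****":
            return None
        return REVIEW_MONTHS[rating]

    assert review_period_months("***") == 2
    assert review_period_months("*****") is None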

The Strategy & Information Governance Team will follow up all KPIs that fall short of 5
stars around the review period. Should actions/recommendations not have been
implemented by the review date, this will be escalated to the appropriate service
manager, with a period of one month to be actioned. Failure to implement the
actions/recommendations one month after the service manager has been notified will
result in the KPI being reported to the Audit Committee at the next available meeting.

6.3 Reporting
Findings of audits would be reported to the audit owners and managers, along with
aggregated information to the Audit Committee. The IGT will monitor the authority's overall
data quality along with the data quality of individual systems/processes and provide
assistance where and when required. The IGT will also monitor all KPI audits, undertake
reviews as mentioned above and report at each Audit Committee meeting.

Appendix 1
Aspects of Data Quality

Accuracy
Data should be sufficiently accurate for their intended purposes, representing clearly and in
enough detail the interaction provided at the point of activity. Data should be captured once,
although they may have multiple uses. Accuracy is most likely to be secured if data are captured
as close to the point of activity as possible. The need for accuracy must be balanced with the
importance of the uses for the data, and the costs and effort of collection. For example, it may be
appropriate to accept some degree of inaccuracy where timeliness is important.

Validity
Data should be recorded and used in compliance with relevant requirements, including
the correct application of any rules or definitions. This will ensure consistency between
periods and with similar organisations, measuring what is intended to be measured.
Where proxy data are used to compensate for an absence of actual data, bodies must
consider how well these data are able to satisfy the intended purpose.

Reliability
Data should reflect stable and consistent data collection processes across collection points and
over time, whether using manual or computer based systems, or a combination. Managers and
stakeholders should be confident that progress toward performance targets reflects real changes
rather than variations in data collection approaches or methods.

Timeliness
Data should be captured as quickly as possible after the event or activity and must be
available for the intended use within a reasonable time period. Data must be available
quickly and frequently enough to support information needs and to influence service or
management decisions.

Relevance
Data captured should be relevant to the purposes for which they are used. This entails periodic
review of requirements to reflect changing needs. It may be necessary to capture data at the
point of activity which is relevant only for other purposes, rather than for the current intervention.
Quality assurance and feedback processes are needed to ensure the quality of such data.

Completeness
Data requirements should be clearly specified based on the information needs of the
body, and data collection processes matched to these requirements. Monitoring
missing, incomplete, or invalid records can provide an indication of data quality and
can also point to problems in the recording of certain data items.
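By way of illustration, the sketch below quantifies two of these aspects, completeness and validity, for a small batch of records. The field names, sample values and validity rule are assumptions for the example, not a council specification.

    # Illustrative completeness/validity metrics over a batch of records.
    import re

    records = [
        {"postcode": "DN15 6TR", "date_of_birth": "1980-04-12"},
        {"postcode": "",         "date_of_birth": "1975-11-03"},
        {"postcode": "DN16 1AB", "date_of_birth": None},
    ]

    def completeness(rows, field):
        """Share of records where the field is present and non-empty."""
        return sum(1 for r in rows if r.get(field)) / len(rows)

    def validity(rows, field, pattern):
        """Share of filled values matching an agreed rule or definition."""
        values = [r[field] for r in rows if r.get(field)]
        return sum(1 for v in values if re.match(pattern, v)) / len(values)

    print(f"postcode completeness: {completeness(records, 'postcode'):.0%}")  # 67%
    print(f"postcode validity: {validity(records, 'postcode', r'^[A-Z]{1,2}[0-9]'):.0%}")  # 100%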

Appendix 2
Five step approach to data quality for systems, processes or functions

1. Problem & Opportunity Identification
Usually this is the starting point, where data quality problems and opportunities for
improvement are identified. There are a number of ways to identify problems; most
important among them is talking to stakeholders and listening to their views and
suggestions. The usual outcome of this step is a list of data quality problems for further
investigation. The Data Management Plan Audit would enable problem and opportunity
identification.

2. Diagnosis
Having identified potential problems and opportunities, this step examines them in
greater detail. This involves more detailed discussions with key stakeholders, drilling
down to quantify the problem, and data profiling. Profiling the data examines data
sources in detail and quantifies the current state of the data. Profiling the entire data will
enable you to obtain a comprehensive view of all data held and its quality in terms of
accuracy, completeness, and reliability. The system/process data quality audit would
diagnose the current position.

3. Proposal
At the end of this stage there will be a business case to justify investment in one or more
improvement projects. Do not start a data quality improvement project unless you are
confident it will deliver an acceptable return on investment (ROI). ROI can be a positive
commercial impact, but also non-financial imperatives, i.e. improved compliance. Always
include a 'do nothing' option in the business case, as it is important to spell out the
impact of taking no action.

4. Improvement
Once the business case(s) have been approved, the real work can begin. The main
tasks are appointing a project manager, defining the objectives and success criteria, and
then planning the project in accordance with the authority's project management
techniques and procedures. The objectives and success criteria should derive from the
business case, with consideration of the quality metrics and measures.

5. Consolidation
Data quality improvement is not a one-off process. Do not embark on a project unless it
makes both initial improvements to the data and maintains those improvements in the
longer term. Data cleansing is often mistaken for data quality improvement; while in the
short term data quality would improve, data cleansing does not solve the root causes of
the data quality problems and inevitably the data will start to deteriorate over a period of
time. Consolidation ensures that gains made in the improvement stage are preserved
and maintained. It is vital that this stage revisits the original business case, establishes
that the expected benefits of the project have been realised, and captures the lessons
learnt to feed into further projects.
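A minimal data profiling sketch for step 2 (Diagnosis) is shown below: it summarises each column's fill rate, distinct values and most common values to quantify the current state of the data. The sample rows are invented for illustration.

    # Illustrative profiling helper for the Diagnosis step.
    from collections import Counter

    def profile(rows: list[dict]) -> dict:
        columns = sorted({key for row in rows for key in row})
        report = {}
        for col in columns:
            filled = [row.get(col) for row in rows
                      if row.get(col) not in (None, "")]
            report[col] = {
                "fill_rate": len(filled) / len(rows),
                "distinct": len(set(filled)),
                "most_common": Counter(filled).most_common(3),
            }
        return report

    sample = [{"service": "Waste",   "ward": "Crosby"},
              {"service": "Waste",   "ward": ""},
              {"service": "Housing", "ward": "Crosby"}]
    for column, stats in profile(sample).items():
        print(column, stats)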
Data Management Plan Audit
The Data Management Plan Audit is a joint audit for Data Quality and Information Governance. The audit is
split into 9 sections (fig.1), consisting of over 80 questions. The Information Governance Team has been
tasked with undertaking an authority wide Information and Data Quality Audit. This will enable the
authority to produce a Data Management Plan that helps fulfil actions for internal audit, the NHS
information governance self-assessment and legislation.

Section 1 - Defining data and the principles of data management
Section 2 - Data that is 'fit for purpose'
Section 3 - Technologies for managing and exploiting data
Section 4 - Process and Administration
Section 5 - Data Analysis
Section 6 - The impact of legislation, regulation and compliance frameworks
Section 7 - Data Security
Section 8 - Data/records and Data Governance
Section 9 - Building a compelling business case for investing in data

(fig.1)

Information is a key corporate asset to the authority and the Information Governance Framework is a key
policy; this audit would help to deliver it. It is anticipated that the audit would be completed by
Systems Owners, Records Co-ordinators, Information Asset Owners (IAOs) or Data Asset Owners (DAOs).
Anticipated completion time is approximately 2-3 hours, depending on the information to be gathered to
support the answers. It is not expected that the audit would be completed in one session, but the IGT
would expect the audit to be completed within 2 weeks of commencement.

Audit(s) Process Map
Data Management Plan Audit (DMPA) and system/process data analysis, mapped over the five step approach to data quality:

Step 1 - Problem and opportunity identification: The Data Management Plan Audit (DMPA) is conducted; on
completion it is submitted to the Information Governance Team. System/process data profiling and
analysis is started.

Step 2 - Diagnosis: The audit is analysed by the Information Governance Team.

Step 3 - Proposal: The DMPA, once analysed, will provide an action plan for improvement. Data analysis
will provide data cleansing and process improvements.

Step 4 - Improvement: Implement the DMPA action plan and review the process in 6 months. Cleanse data
and improve process(es).

Step 5 - Consolidation: Re-enforce training and awareness of Information Governance and Data Quality.
Communicate lessons learnt from the audits.

Annual review of the DMPA / full or random sample data audit

The DMPA would be reviewed on an annual basis, along with an action plan (if required) and a 6 monthly
review of that action plan. The system/process data analysis would initially involve a full data audit to
enable profiling. This profiling enables a greater understanding of the data, so that fitness for purpose can
be assessed along with the percentage of complete records, recommended data cleansing and any other
business rules specific to the system/process. The data would then be subject to an annual random
sample check for the next 4 years to ensure that improvements are being made. Should the random
sample outcomes be within a 5% drop tolerance of the original result, no further action would be required;
however, should the random sample fall outside the tolerance levels, a full data quality audit could be
conducted with recommendations to rectify the situation.
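The 5% drop tolerance rule above can be expressed as a simple check; the sketch below is illustrative, and the scoring convention (scores as the fraction of records passing the agreed quality checks) is an assumption for the example.

    # Sketch of the tolerance rule above; scoring convention assumed.
    def sample_outcome(original: float, sample: float,
                       tolerance: float = 0.05) -> str:
        if sample >= original - tolerance:
            return "within tolerance - no further action required"
        return "outside tolerance - full data quality audit recommended"

    print(sample_outcome(0.92, 0.90))  # within tolerance
    print(sample_outcome(0.92, 0.80))  # outside tolerance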

Appendix 3
Data Quality Protocol for performance management (Key Performance Indicators - KPIs)

Performance management is a key aspect of the day-to-day operation of a service.
Performance information, often in the form of indicators, is a key illustration of how well a
service/function is performing in a particular area. This information can be used for a
number of purposes: monitoring how well a service is performing, setting future targets to
drive continuous improvement in how we deliver our services, benchmarking performance
or cost against other bodies, and informing the residents of North Lincolnshire of how well
the council is delivering services. In order that performance information can be used for
these purposes, the quality of the data used must be robust and reliable.

In most instances, the exact specification of how a performance indicator must be
calculated will be supplied by an external body. If this is the case, the guidelines
provided by the external body must be followed precisely, and a clear audit trail of how the
indicator has been calculated, backed up by hard-copy evidence where appropriate, must
be available. In other cases, a service may decide to monitor and review performance
information that it feels is more relevant to the council's ambitions and local priorities.

Whether performance information is being used by an external body to analyse our
delivery of services, or internally to help drive improvement, the need for robust, reliable
data is clear. With this in mind, this Data Quality Protocol has been produced. The
protocol elements take into consideration:

Key Lines of Enquiry from the Audit Commission, key principles from the Audit
Commission joint paper: Improving Information to Support Decision Making:
Standards for Better Data Quality.
Audit Commission spot check questions.
Improvements identified during external audits of our data quality arrangements
Advice and input from internal audit function
National & local benchmarking data
Best practice research.

Where practicable, all information and documents should be attached to the Performance
Management System (PMS). The People tab should also be checked regularly to ensure
that the correct people have access to the KPI, and adequate cover within PMS should be
maintained in case of any absence of the named editor.

The protocol has been structured as a checklist, to help you make sure the data you are
producing is accurate. This checklist should be used whether the data is being used
internally or externally. We have structured the checklist into three sections; please refer
to the diagram below:

General Data Quality Management Arrangements

Before collection/calculation

During/after calculation and target setting

During the audit, the controls in place will be evaluated in each area to assess the
robustness of the data quality arrangements. Upon completion of the audit the findings are
assessed and a Data Quality Rating is awarded to the service.

The lowest possible rating is one star and the highest possible rating is five stars. For each
section of the checklist evaluated, if all controls have been met and no recommendations
or improvements are identified, five stars are awarded. However, if an error is identified
which leads to incorrect calculation of the final result, the indicator is automatically
qualified and given a Q rating.

Star Rating | Description
* | Recommendations have been identified in 4 of 5 sections of the audit checklist
** | Recommendations have been identified in 3 of 5 sections of the audit checklist
*** | Recommendations have been identified in 2 of 5 sections of the audit checklist
**** | Recommendations have been identified in 1 of 5 sections of the audit checklist
***** | No recommendations identified
Q | Significant error found
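The rating rules in the table reduce to a simple calculation: one star is lost for each checklist section with recommendations, and a significant calculation error overrides everything with a Q. The helper below is a hypothetical sketch, not part of the audit tooling.

    # Illustrative helper following the table above (five checklist sections,
    # one star lost per section with recommendations, "Q" on significant error).
    def kpi_rating(sections_with_recommendations: int,
                   significant_error: bool = False) -> str:
        if significant_error:
            return "Q"
        if not 0 <= sections_with_recommendations <= 4:
            raise ValueError("the table covers 0-4 sections with recommendations")
        return "*" * (5 - sections_with_recommendations)

    assert kpi_rating(0) == "*****"
    assert kpi_rating(3) == "**"
    assert kpi_rating(1, significant_error=True) == "Q"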

Documentation is available on the Intralinc, which services can use to carry out audits of
their priority indicators. It is expected that ALL SERVICES adhere to the Data Quality
Protocol and that all services utilise the available audit documentation to test the
robustness of their data quality arrangements.

The documents available on the authority's Intralinc are:

Indicator Definition template
A key element of ensuring sound data quality arrangements is the on-going auditing of the
council's key priority indicators. Before an audit is carried out, an indicator definition
should be completed; its purpose is to define the key elements of the indicator, from the
description to how it is calculated. All KPIs require an Indicator Definition Document.

KPI Audit Template
A key element of ensuring sound data quality arrangements is the on-going auditing of the
council's key priority indicators. These audits are carried out by means of a checklist to
ensure that data quality requirements are being met. This document is the checklist
template used during an audit to assess the data quality controls a service has in place. It
is the responsibility of individual directorates to carry out a programme of audits on priority
indicators. This document can be downloaded and utilised by directorates for this purpose.

Performance Indicators Process Map Guidance
A process map is a diagrammatic way to define the sequence of activities (processes)
that must take place when calculating a performance indicator result, from data collection
through to final calculation. This guidance document has been developed to aid the
construction of process maps, ensuring a quality, consistent approach across the
organisation.
