Data Quality Framework
Version 1.3
Document Title | Format | Version | Status | Location | Approved | Author | Date
Data Quality Policy June 2010 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | June 2010
Data Quality Policy March 2011 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | March 2011
Data Quality Policy August 2011 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | August 2011
Data Quality Policy February 2013 | PDF | 0.1 | Superseded | Network Drive (Bkup) * | Yes | Nigel Manders | February 2013
Data Quality Framework Draft | Word | 0.2 | Superseded | Network Drive (Bkup) * | No | Tony Holder | December 2013
Data Quality Framework | Word, PDF | 1.0 | Superseded | Network Drive (Bkup) * | No | Tony Holder | February 2014
Data Quality Framework | Word, PDF | 1.1 | Superseded | Network Drive (Bkup) * | Yes | Tony Holder | March 2014
Data Quality Framework | Word, PDF | 1.2 | Superseded | Network Drive (Bkup) * | Yes | Tony Holder | January 2015
Data Quality Framework | Word, PDF | 1.3 | Live | Network Drive, Intranet | Yes | Tony Holder | March 2015
* Original document has since been removed from the Network Drive but remains on the Network Backup for retention purposes.
Change Control
Document Title | Version | Effective Date | Pages | Reason
Data Quality Policy June 2010 | 0.1 | June 2010 | Not specified | Protocol and policy updated to reflect new national guidance, links to spot check questions, recommendations from the Audit Commission and Internal Audit, and links to the new auditing method developed by the performance team.
Data Quality Policy March 2011 | 0.1 | March 2011 | Not specified | Updated to reflect changes to the national context and to include references/links to useful documents/websites.
Data Quality Policy August 2011 | 0.1 | August 2011 | Not specified | Updated to reflect changes to the corporate performance framework and data quality expectations.
Data Quality Policy February 2013 | 0.1 | February 2013 | Not specified | Minor updates to reflect internal changes to council structures and terminology, and to include the One Council logo.
Data Quality Framework Draft | 0.2 | December 2013 | All pages | Major changes to reflect data quality across the authority, not just the corporate performance framework. New terminology and policy alignment to Information Governance and Council Strategy/Priorities.
Data Quality Framework Draft | 0.2 | February 2014 | All pages | Superseded as no longer in draft form; the document has become the Data Quality Framework and is subject to approval.
Data Quality Framework | 1.0 | February 2014 | All pages | Minor changes to various pages to enable the document to be presented for approval. Submitted for consultation at II&VFM in February 2014, followed by CMT and cabinet member approval. Schedule 15 added to the title page to advise that this document is part of the Information Governance Framework.
Data Quality Framework | 1.1 | March 2014 | Pages 4, 5, 11, 13, 14 | Introduction updated, Audit Committee added to Roles, and the Auditing and Reporting section updated, along with a short section advising that KPI audits will be prioritised according to council strategy priorities.
Data Quality Framework | 1.2 | January 2015 | Page 18 | KPI section updated to include the Internal Audit requirement.
Data Quality Framework | 1.3 | March 2015 | Page 17, front cover | Removed the statement that documents would be stored on PMS, as this has changed; updated the schedule number as the IG Framework was updated to include new sections.
2|Page
Contents
1. Introduction to Data Quality ................................................... 4
1.1. Data hierarchy to decision making ............................................ 5
1.2. Where data quality problems occur ............................................ 5
2. Purpose ........................................................................ 7
3. Stakeholders and their information needs ....................................... 7
4. Training ....................................................................... 8
5. Roles and Responsibilities ..................................................... 9
Cabinet Member .................................................................... 9
Directors/Assistant Directors ..................................................... 9
Heads of Service .................................................................. 9
Audit Committee .................................................................. 11
6. Auditing and Reporting ........................................................ 12
Appendix 1: Aspects of Data Quality .............................................. 14
Appendix 2: Five step approach to data quality ................................... 15
Appendix 3: Data Quality Protocol for performance management ..................... 18
1. Introduction to Data Quality
Good data quality is required for any organisation to plan, make key decisions, deploy
its resources and run its operations smoothly. Data quality can mean different things
depending on your role within the council, but fundamentally business analysis and
business intelligence rely on the quality of the data used, and this can be affected by
the way data is:
Captured
Entered
Stored
Managed
The process of data quality assurance verifies the reliability and effectiveness of data.
Managing data quality requires a periodic approach and typically involves updating,
standardising and cleansing records.
Common data quality issues are incomplete or missing data, outdated information,
duplicate data and inputting errors. Assuring data quality can increase efficiency, enhance
customer satisfaction and enable more informed decisions.
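These common issues can be surfaced with a simple profiling pass over a data set. The sketch below is illustrative only: the field names and records are hypothetical, not drawn from any council system. It counts empty required fields and duplicate records in plain Python:

```python
from collections import Counter

def profile(records, required_fields):
    """Count basic data quality issues in a list of record dicts."""
    missing = Counter()   # required fields found empty or absent
    seen = Counter()      # full-record occurrences, used to spot duplicates
    for rec in records:
        for field in required_fields:
            if not rec.get(field):
                missing[field] += 1
        seen[tuple(sorted(rec.items()))] += 1
    duplicates = sum(n - 1 for n in seen.values() if n > 1)
    return {"missing": dict(missing), "duplicates": duplicates}

# Hypothetical example records
records = [
    {"name": "A. Smith", "postcode": "DN15 6NL"},
    {"name": "A. Smith", "postcode": "DN15 6NL"},  # duplicate entry
    {"name": "B. Jones", "postcode": ""},          # missing postcode
]
report = profile(records, ["name", "postcode"])
```

A report such as this only indicates where problems lie; cleansing and root-cause work follow the five step approach described in Appendix 2.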
The terms data, information, knowledge and wisdom are frequently used interchangeably
throughout the authority. Data is the raw material from which the authority generates
information. The diagram below provides a pictorial representation of the data hierarchy,
from the basic raw material of data to the wisdom that becomes the keystone of informed
decision making.
1.1. Data hierarchy to decision making
Producing data that is fit for purpose should not be an end in itself, but an integral part of
the authority's operational, performance management and governance arrangements.
Organisations that put data quality at the heart of their performance management systems
are most likely to be actively managing data in all aspects of their day-to-day business, in
a way that is proportionate to the cost of collection, and turning that data into reliable
information that in turn provides the knowledge and wisdom to support more informed
decision making.
(Diagram: where data quality problems occur – People, Process and Technology)
2. Purpose
The purpose of this policy is to set out North Lincolnshire Council's approach for ensuring
that staff at all levels who have responsibility for collecting, analysing, manipulating or
using data:
This data quality framework links with the information governance framework. Services
should follow this policy when dealing with data. All data used to inform decision making
should be sound, and where services work with partners and/or contractors they should be
satisfied that the information that comes from these sources is correct. Services are
required to put in place a process for reviewing internal and external data; this should be
done by setting their own performance indicators for data quality and by conducting audits.
Where data comes from an external body contracted through the council, the requirement
to produce accurate and timely data should be specifically built into the contract terms and
conditions.
3. Stakeholders and their information needs
Stakeholder | Information needs
Service users and the public | Exercising choice, understanding the service standards to expect and holding public bodies to account.
Staff in public sector organisations | Delivering services day to day at the front line; the starting point for data collection and use.
Managers in public sector organisations | Monitoring and managing service delivery, and benchmarking performance against others.
Local councillors, trust non-executives | Decision making; monitoring strategic objectives, targets and use of resources; ensuring accountability.
Commissioners | Identifying population need and determining priorities and services for meeting it; monitoring the achievement of contractual arrangements.
4. Training
Awareness of data quality will be raised through council-wide notices, e-learning courses
and, if required, workshop sessions for staff involved in data collection. The focus of the
training is to:
Data quality is also included in both the corporate and management induction programmes
and forms part of the employee generic competencies. Training needs should also be
addressed through one-to-ones and the employee appraisal process.
The council's aim is to keep reviewing and improving, and to have data quality
arrangements that are robust enough to withstand internal and external scrutiny.
5. Roles and Responsibilities
Cabinet Member
Cabinet Members have overall member responsibility for performance and need to be proactive in
raising issues around data quality and challenging progress made against corporate and service
data quality objectives.
Directors/Assistant Directors
Directors and Assistant Directors have overall responsibility for challenging performance and data
quality.
Heads of Service
Heads of Service are accountable for the accuracy and quality of data and information within their
service area and need to ensure their officers are aware of their data quality requirements.
Audit Committee
Seeks assurance from officers and external audit that the council's audit, financial, constitutional,
governance, performance and risk arrangements are in place, and monitors how well they are
performing. It oversees the council's Annual Governance Statement, and seeks assurance that
systems, best practice and ongoing improvement practices are in place in relation to Information
Governance, including Data Quality.
Ensure that results, target data and information are entered onto the PMS in line with the
authority's data quality policy. All data should be subject to data validation and quality assurance
to uphold the level of quality and to ensure that accountability is clearly defined.
6. Auditing and Reporting
All documentation created/produced, published and used during the auditing of systems and KPIs
will be subject to the authority's Records Management Policy, and therefore to appropriate
retention and disposal.
Year 1 | Full data management plan audit, followed where appropriate by a full data quality audit.
Years 2–5 | Review of the data management plan and a random sample data quality audit. Should there be a requirement, a full review of the data management plan and a full data quality audit would be conducted.
Information is a key corporate asset, and the Information Governance Framework is a key
policy; the data management plan and audit help to deliver it. The audit could be
completed by appropriate officers in the authority (i.e. Information Asset Owners (IAOs) or
Data Asset Owners (DAOs)). Anticipated completion time is approximately 2–3 hours,
depending on the information required to be gathered to support the answers. It is not
expected that the audit would be completed in one session, but the Information
Governance Team (IGT) would expect the audit to be completed within 2 weeks of
commencement. On submission, the IGT will produce an action plan, along with a request
(if appropriate) for a copy of the system/process data for profiling and a data quality audit.
The following 4 years would see random sample data quality audits, ensuring that data
quality is maintained or improved within reasonable percentage tolerances.
6.2 Auditing Key Performance Indicators (KPIs)
All KPIs are subject to a Data Quality Audit. This framework recognises that the auditing
of KPIs requires more specific guidance and explanation; Appendix 3 therefore contains
the Data Quality Protocol for performance management.
Each KPI, once audited, will be monitored on the working KPI system, which is maintained
by the Strategy & Information Governance Team. This system contains the dates of the
audit and the KPI star rating, along with the full re-audit date and review dates if appropriate.
Each KPI undergoes an initial audit, and audits will be prioritised according to the council's
annual strategy, i.e. priority one KPIs would generally be audited first, then priority two
KPIs, and so on. Once the audit has been completed, this date is entered into the working
KPI system along with its star rating. If all requirements have been met and there are no
action plans/recommendations from the audit, the KPI is awarded 5 stars and is then only
subject to a re-audit in 18 months' time; however, should the audit fall short of five stars,
a review is required to ensure that the actions/recommendations have been implemented.
The table below explains the review periods:
The Strategy & Information Governance Team will follow up all KPIs that fall short of 5
stars around the review period. Should actions/recommendations not have been
implemented by the review date then this will be escalated to the appropriate service
manager with a period of one month to be actioned. Failing to implement the
actions/recommendations one month after the service manager has been notified will then
result in the KPI being reported to the Audit Committee at the next available meeting.
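The re-audit and escalation timings above can be sketched as a small helper. This is an illustrative sketch only: the function, its return shape and the day counts used to approximate "18 months" and "one month" are assumptions, and the review periods for KPIs below five stars come from a table not reproduced here.

```python
from datetime import date, timedelta

REAUDIT_AFTER = timedelta(days=548)    # roughly 18 months, for a 5-star KPI
ESCALATION_GRACE = timedelta(days=30)  # one month after the service manager is notified

def next_action(audit_date, stars, notified_date=None):
    """Hypothetical helper: determine the next step due for an audited KPI."""
    if stars == 5:
        # full marks: only an additional re-audit in 18 months' time
        return ("re-audit", audit_date + REAUDIT_AFTER)
    if notified_date is not None:
        # actions still open at review: one month before Audit Committee referral
        return ("audit-committee-referral", notified_date + ESCALATION_GRACE)
    # below five stars: a review is required (interval set by the framework's table)
    return ("review", None)

action, due = next_action(date(2015, 3, 1), 5)
```

A KPI audited on 1 March 2015 with five stars would, under these assumed day counts, fall due for re-audit in late August 2016.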
6.3 Reporting
Findings of audits will be reported to the audit owners and managers, along with
aggregated information to the Audit Committee. The IGT will monitor the authority's overall
data quality, along with the data quality of individual systems/processes, and provide
assistance where and when required. The IGT will also monitor all KPI audits, undertake
reviews as described above, and report at each Audit Committee meeting.
Appendix 1
Aspects of Data Quality
Accuracy
Data should be sufficiently accurate for their intended purposes, representing clearly and in
enough detail the interaction provided at the point of activity. Data should be captured once,
although they may have multiple uses. Accuracy is most likely to be secured if data are captured
as close to the point of activity as possible. The need for accuracy must be balanced with the
importance of the uses for the data, and the costs and effort of collection. For example, it may be
appropriate to accept some degree of inaccuracy where timeliness is important.

Validity
Data should be recorded and used in compliance with relevant requirements, including
the correct application of any rules or definitions. This will ensure consistency between
periods and with similar organisations, measuring what is intended to be measured.
Where proxy data are used to compensate for an absence of actual data, bodies must
consider how well these data are able to satisfy the intended purpose.

Reliability
Data should reflect stable and consistent data collection processes across collection points and
over time, whether using manual or computer based systems, or a combination. Managers and
stakeholders should be confident that progress toward performance targets reflects real changes
rather than variations in data collection approaches or methods.

Timeliness
Data should be captured as quickly as possible after the event or activity and must be
available for the intended use within a reasonable time period. Data must be available
quickly and frequently enough to support information needs and to influence service or
management decisions.

Relevance
Data captured should be relevant to the purposes for which they are used. This entails periodic
review of requirements to reflect changing needs. It may be necessary to capture data at the
point of activity which is relevant only for other purposes, rather than for the current intervention.
Quality assurance and feedback processes are needed to ensure the quality of such data.

Completeness
Data requirements should be clearly specified based on the information needs of the
body, and data collection processes matched to these requirements. Monitoring
missing, incomplete, or invalid records can provide an indication of data quality and
can also point to problems in the recording of certain data items.
Appendix 2
Five step approach to data quality for systems, processes or functions
Step 1: Problem & Opportunity Identification
Usually this is the starting point, where data quality problems and opportunities for
improvement are identified. There are a number of ways to identify problems; most
important among them is talking to stakeholders and listening to their views and
suggestions. The usual outcome of this step is a list of data quality problems for further
investigation. The Data Management Plan Audit would enable problem and opportunity
identification.

Step 2: Diagnosis
Having identified potential problems and opportunities, this step examines them in
greater detail. This involves more detailed discussions with key stakeholders,
drilling down to quantify the problem, and data profiling. Profiling examines data
sources in detail and quantifies the current state of the data. Profiling the entire
data set will provide a comprehensive view of all data held and its quality in terms
of accuracy, completeness and reliability. The system/process data quality audit
would diagnose the current position.

Step 3: Proposal
At the end of this stage there will be a business case to justify investment in one or more
improvement projects. Do not start a data quality improvement project unless you are
confident it will deliver an acceptable return on investment (ROI). ROI can be a positive
commercial impact, but also a non-financial imperative, e.g. improved compliance.
Always include a 'do nothing' option in the business case, as it is important to spell out
the impact of taking no action.

Step 4: Improvement
Once the business case(s) have been approved, the real work can begin. The main
tasks are appointing a project manager, defining the objectives and success criteria,
and then planning the project in accordance with the authority's project management
techniques and procedures.
The objectives and success criteria should derive from the business case, with
consideration of the quality metrics and measures.

Step 5: Consolidation
Data quality improvement is not a one-off process. Do not embark on a project unless it
makes both initial improvements to the data and maintains those improvements in the
longer term.
Data cleansing is often mistaken for data quality improvement. While data quality would
improve in the short term, data cleansing does not solve the root causes of data quality
problems, and inevitably the data will start to deteriorate over time.
Consolidation ensures that gains made in the improvement stage are preserved and
maintained. It is vital that this stage revisits the original business case, establishes that
the expected benefits of the project have been realised, captures the lessons learnt and
feeds these into further projects.
Data Management Plan Audit
The Data Management Plan Audit is a joint audit for Data Quality and Information Governance. The audit is
split into 9 sections (fig.1), consisting of over 80 questions. The Information Governance Team has been
tasked with undertaking an authority-wide Information and Data Quality Audit. This will enable the
authority to produce a Data Management Plan that helps to fulfil actions for internal audit, the NHS
information governance self-assessment and legislation.
(fig.1)
Information is a key corporate asset, and the Information Governance Framework is a key policy;
this audit helps to deliver it. It is anticipated that the audit would be completed by Systems Owners,
Records Co-ordinators, Information Asset Owners (IAOs) or Data Asset Owners (DAOs).
Anticipated completion time is approximately 2–3 hours, depending on the information to be
gathered to support the answers. It is not expected that the audit would be completed in one
session, but the IGT would expect the audit to be completed within 2 weeks of commencement.
Audit(s) Process Map
Data Management Plan Audit (DMPA), System/Process data analysis mapped over the five step approach to data quality
Step 1: Problem and opportunity identification. The Data Management Plan Audit (DMPA) is conducted; on completion it is submitted to the Information Governance Team (IGT). System/process data profiling and analysis is started.
Step 2: Diagnosis. The audit is analysed by the IGT; data analysis identifies data cleansing and process improvements.
Step 3: Proposal. The DMPA, once analysed, provides an action plan for improvement.
Step 4: Improvement. Implement the DMPA action plan and review in 6 months; cleanse data and improve process(es).
Step 5: Consolidation. Re-enforce training and awareness of Information Governance and Data Quality; communicate lessons learnt from the audits.
The DMPA would be reviewed on an annual basis, along with an action plan (if required) and a 6-monthly review of that
action plan. The System/Process data analysis would initially involve a full data audit to enable profiling. This profiling
enables a greater understanding of the data, so that fitness for purpose can be assessed along with the percentage of
complete records, recommended data cleansing and any other business rules specific to the system/process. The data
would then be subject to an annual random sample check for the next 4 years, to ensure that improvements are being
made. Should the outcomes of the random sample be within a 5% drop tolerance of the original result, no further action
would be required; however, should the random sample fall outside the tolerance level, a full data quality audit could be
conducted, with recommendations to rectify the situation.
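The 5% drop tolerance described above can be expressed as a simple check. This is an illustrative sketch only: the function name and figures are hypothetical, and it assumes the tolerance is applied in percentage points to results expressed as a percentage (e.g. of complete records).

```python
def sample_within_tolerance(original_result, sample_result, tolerance=5.0):
    """Return True if the random sample has not dropped more than
    `tolerance` percentage points below the original audit result."""
    return (original_result - sample_result) <= tolerance

# Hypothetical figures: the full audit found 92% complete records.
ok = sample_within_tolerance(92.0, 88.0)            # 4-point drop: within tolerance
escalate = not sample_within_tolerance(92.0, 85.0)  # 7-point drop: full audit needed
```

Note that only a drop triggers action: a sample result above the original passes the check automatically.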
Appendix 3
Data Quality Protocol for performance management (Key Performance Indicators -KPIs)
Key Lines of Enquiry from the Audit Commission, key principles from the Audit
Commission joint paper: Improving Information to Support Decision Making:
Standards for Better Data Quality.
Audit Commission spot check questions.
Improvements identified during external audits of our data quality arrangements
Advice and input from internal audit function
National & local benchmarking data
Best practice research.
Where practicable, all information and documents should be attached to the Performance
Management System (PMS). The People tab should also be checked regularly to ensure
that the correct people have access to the KPI, and adequate cover within PMS should be
maintained in case of any absence of the named editor.
The protocol has been structured as a checklist to help you make sure the data you are
producing is accurate. This checklist should be used whether the data is being used
internally or externally. We have structured the checklist into three sections; please refer to
the diagram below:
Before collection/calculation
During the audit, the controls in place will be evaluated in each area to assess the
robustness of the data quality arrangements. Upon completion of the audit the findings are
assessed and a Data Quality Rating is awarded to the service.
The lowest possible rating is one star and the highest is five stars. For each section of
the checklist evaluated, if all controls have been met and no recommendations or
improvements are identified, five stars are awarded. However, if an error is identified
which leads to incorrect calculation of the final result, the indicator is automatically
qualified and given a Q rating.
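The rating rules above can be sketched as a small function. This is a hypothetical illustration of the logic, not the actual audit tool: the inputs (one boolean per checklist section, plus a calculation-error flag) are assumptions, and the mapping of partial compliance onto one to four stars is not specified in the framework, so a proportional mapping is assumed.

```python
def data_quality_rating(sections_met, calculation_error=False):
    """Award a Data Quality Rating from checklist results.

    sections_met: one boolean per checklist section, True when every control
    in that section was met with no recommendations or improvements.
    """
    if calculation_error:
        return "Q"   # qualified: the final result is calculated incorrectly
    if all(sections_met):
        return 5     # every control met, no recommendations: five stars
    # Hypothetical proportional mapping onto the remaining 1-4 star range.
    return max(1, round(4 * sum(sections_met) / len(sections_met)))
```

The Q rating deliberately overrides everything else: an indicator whose final result is wrong cannot earn stars, however strong its other controls.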
Documentation is available on the Intralinc, which services can use to carry out audits of
their priority indicators. It is expected that ALL SERVICES adhere to the Data Quality
Protocol and that all services utilise the available audit documentation to test the
robustness of their data quality arrangements.
Indicator Definition Template
A key element of ensuring sound data quality arrangements is the ongoing audit of the council's
key priority indicators. Before an audit is carried out, an indicator definition should be completed;
its purpose is to define the key elements of the indicator, from the description to how it is
calculated. All KPIs require an Indicator Definition Document.

KPI Audit Template
A key element of ensuring sound data quality arrangements is the ongoing audit of the council's
key priority indicators. These audits are carried out by means of a checklist to ensure that data
quality requirements are being met. This document is the checklist template used during an audit
to assess the data quality controls a service has in place. It is the responsibility of individual
directorates to carry out a programme of audits on priority indicators. This document can be
downloaded and utilised by directorates for this purpose.

Performance Indicators Process Map Guidance
A process map is a diagrammatic way to define the sequence of activities (processes) that must
take place when calculating a performance indicator result, from data collection through to final
calculation. This guidance document has been developed to aid the construction of process maps,
ensuring a quality, consistent approach across the organisation.