THREE RIVERS
DISTRICT COUNCIL
DATA QUALITY STRATEGY
MARCH 2009
CONTENTS
1. Introduction
2. Definitions
3. Current Position
4. Characteristics of Data Quality
- Accuracy
- Validity
- Reliability
- Timeliness
- Relevance
- Completeness
5. Data Quality Standards
- Governance and Leadership
- Policies
- Systems and Processes
- People and Skills
- Data Use and Reporting
6. Review and Action Plan
Appendix 1 - Performance Indicators
Appendix 2 - Data Quality Action Plan
Appendix 3 - Performance Indicator Quality Assurance sign-off form
1. Introduction
The Council needs information that is fit for purpose in order to manage services and account for performance. Information is used throughout the organisation to make judgements about the efficiency, effectiveness and responsiveness of services and to make complex decisions about priorities and the use of resources. Service users, and in particular members of the public, need accessible information to make informed decisions, and regulators and government departments rely on it to meet their responsibilities for making judgements about performance and governance.
The 2006 Local Government White Paper, Strong and Prosperous Communities, and the Local Government and Public Involvement in Health Act 2007 set out a new performance framework for local services. This places greater reliance on data quality, to provide robust data for local performance management, and to inform performance assessments. It also emphasises the need for local public services to use information to reshape services radically and to account to local people for performance. As increasing reliance is placed on performance information in performance management and assessment regimes, the need to demonstrate that the underlying data are reliable has become more critical.
In November 2007 the Audit Commission published ‘Improving information to support decision making: standards for better quality data’. This paper encourages public bodies to improve the quality of the data used for decision making, presenting a set of clear and concise standards, based on accepted good practice, which can be adopted on a voluntary basis.
The Council has published a Policy Statement for Data Quality[1] which outlines a commitment to data quality through the adoption of the Audit Commission’s Standards for Better Data Quality. This strategy builds on the Policy Statement and outlines an approach to improving data quality across the Council. Consistent, high-quality, timely and comprehensive information is vital to support good decision-making and improved service outcomes.
The Council’s Director of Corporate Resources and Governance is the lead officer for data quality.
With regard to performance indicators, each service has nominated in its service plan an officer or officers responsible for source data, data entry and checking the data.
The accuracy of data held within computer systems is the responsibility of nominated systems administrators and the ICT Manager has overall responsibility for data security.
2. Definitions
The terms ‘data’, ‘information’ and ‘knowledge’ are frequently used interchangeably and are defined in the following table. This document, and the standards it introduces, focus on data; that is, the basic facts from which information can be produced by processing or analysis.
Data / Data are numbers, words or images that have yet to be organised or analysed to answer a specific question.
Information / Produced through processing, manipulating and organising data to answer questions, adding to the knowledge of the receiver.
Knowledge / What is known by a person or persons. Involves interpreting information received, adding relevance and context to clarify the insights the information contains.
Source: Audit Commission
3. Current Position
During 2006 the Audit Commission implemented a revised approach to the audit of performance indicators in local government. This required the Council’s External Auditors to conclude on the arrangements for monitoring and reviewing performance, including arrangements to ensure data quality. A score was attributed, derived from a number of key lines of enquiry (KLOE) and areas of audit focus and evidence under the following headings:
- Governance and leadership
- Policies
- Systems and processes
- People and skills
- Data use and reporting
The auditors’ work on data quality has supported the Audit Commission’s reliance on performance indicators in its service assessments for the Comprehensive Performance Assessment, and this will continue to be the case with the introduction of the Comprehensive Area Assessment from 2009.
The auditor is specifically interested in the National Performance Indicators. The definitions of these and guidance as to how they should be compiled can be obtained from the Communities and Local Government website:
As partnership working increases, there is a greater reliance on the quality of data shared between the partners. This strategy recognises that the Council needs to satisfy itself that the data it is receiving from partners, and supplying to them, is accurate and timely. Validation procedures and security protocols will be agreed with partners.
A list of the Performance Indicators collected by the Council is attached at Appendix 1.
The Council has achieved an overall score of two, or ‘adequate performance’, for its management arrangements in respect of data quality.
4. Characteristics of Data Quality
The Audit Commission have identified six key characteristics of good quality data.
4.1 Accuracy
Data should be sufficiently accurate for the intended use and should be captured only once, although it may have multiple uses. Data should be captured at the point of activity.
- Data is always captured at the point of activity. Performance data is input directly into PerformancePlus[2] (P+) by the service manager or nominated data entry staff.
- Access to P+ for the purpose of data entry is restricted through secure password controls and limited access to appropriate data entry pages. Individual passwords can be changed by the user and must under no circumstances be used by anyone other than that user.
- Where appropriate, base data, i.e. numerators and denominators, will be input into the system, which will then calculate the result (a worked example follows this list). These have been determined in accordance with published guidance or agreed locally. This will eliminate calculation errors at this stage of the process, as well as providing contextual information for the reader.
- Data used for multiple purposes, such as population and number of households, is input once by the system administrator.
- Service heads will give an assurance as to the accuracy of data under their control (see Appendix 3).
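By way of illustration (the figures below are hypothetical, not actual performance data), a percentage indicator such as BC01, full plans applications vetted within 10 days, would be derived from its base data as follows: if 45 of 50 applications received in a quarter were vetted within 10 days, the system would calculate the result as 45 ÷ 50 × 100 = 90%.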
4.2 Validity
Data should be recorded and used in compliance with relevant requirements, including the correct application of any rules or definitions. This will ensure consistency between periods and with similar organisations, measuring what is intended to be measured.
- Relevant guidance and definitions are provided for all statutory performance indicators. Service Heads are informed of any revisions and amendments within 24 hours of receipt from the relevant government department. Local performance indicators comply with locally agreed guidance and definitions.
4.3 Reliability
Data should reflect stable and consistent data collection processes across collection points and over time. Progress toward performance targets should reflect real changes rather than variations in data collection approaches or methods.
- Source data is clearly identified and readily available from manual, automated or other systems and records. Protocols exist where data is provided by a third party, such as Hertfordshire Constabulary and Hertfordshire County Council.
4.4 Timeliness
Data should be captured as quickly as possible after the event or activity and must be available for the intended use within a reasonable time period. Data must be available quickly and frequently enough to support information needs and to influence service or management decisions.
- Performance data is requested to be available within one calendar month of the end of each quarter and is subsequently reported to the respective Policy and Scrutiny Panel on a quarterly basis. As part of the ongoing development of PerformancePlus it is intended that performance information will be exported through custom reporting and made available via the Three Rivers DC website. This will improve access to information and eliminate the delays associated with publishing information through traditional methods.
4.5 Relevance
Data captured should be relevant to the purposes for which it is to be used. This will require a periodic review of requirements to reflect changing needs.
- We have a duty to collect and report performance information against a wide range of statutory indicators. These are set out in the context of the Government’s White Paper – Strong and Prosperous Communities. Where appropriate each service will identify reliable local performance indicators to manage performance and drive improvement. These are reviewed on an annual basis to ensure relevance.
4.6 Completeness
Data requirements should be clearly specified based on the information needs of the organisation and data collection processes matched to these requirements.
- Checks will be made to ensure the completeness of data. An annual assessment of this is undertaken by Internal Audit.
5. Data Quality Standards
The Standards for Better Data Quality as identified by the Audit Commission define a framework of management arrangements that bodies can put in place, on a voluntary basis, to secure the quality of the data they use to manage and report on their activities.
These Standards reflect the KLOE described in paragraph 3. Below, this Strategy identifies the extent to which we currently meet these standards and recognises those areas that are not yet fully developed.
5.1 Governance and leadership
We will put in place a corporate framework for management and accountability of data quality, with a commitment to secure a culture of data quality throughout the organisation.
Key components:
5.1.1 There will be clear corporate leadership of data quality by those charged with governance.
5.1.2 A senior individual at top management level (for example a member of the senior management team) will have overall strategic responsibility for data quality, and this responsibility is not delegated.
5.1.3 The corporate objectives for data quality will be clearly defined (although this may not necessitate a discrete document for data quality), and agreed at top management level.
5.1.4 The data quality objectives will be linked to business objectives, cover all our activities, and have an associated delivery plan.
5.1.5 The commitment to data quality will be communicated clearly, reinforcing the message that all staff have a responsibility for data quality.
5.1.6 Accountability for data quality will be clearly defined and considered where relevant as part of the performance appraisal system.
5.1.7 There will be a framework in place to monitor and review data quality, with robust scrutiny by those charged with governance. The programme will be proportionate to risk.
5.1.8 Data quality will be embedded in risk management arrangements, with regular assessment of the risks associated with unreliable or inaccurate data.
5.1.9 Where applicable, we will take action to address the results of previous internal and external reviews of data quality.
5.1.10 Where there is joint working, there will be an agreement covering data quality with partners (for example, in the form of a data sharing protocol, statement, or service level agreement).
5.2 Policies
We will put in place appropriate policies or procedures to secure the quality of the data we record and use for reporting.
Key components:
5.2.1 Comprehensive guidance for staff on data quality, translating the corporate commitment into practice, will be provided and published. This may take the form of a policy, set of policies, or operational procedures, covering data collection, recording, analysis and reporting. The guidance will be implemented in all business areas.
5.2.2 Policies and procedures will meet the requirements of any relevant national standards, rules, definitions or guidance, for example the Data Protection Act, as well as defining local practices and monitoring arrangements.
5.2.3 Policies and procedures will be reviewed periodically and updated when needed. We will inform staff of any policy or procedure updates on a timely basis.
5.2.4 All relevant staff will have access to policies, guidance and support on data quality, and on the collection, recording, analysis, and reporting of data. Where possible this will be supported by information systems.
5.2.5 Policies, procedures and guidelines will be applied consistently. Mechanisms will be in place to check compliance in practice, and the results will be reported to top management. Corrective action will be taken where necessary.
5.3 Systems and processes
We have put in place systems and processes which secure the quality of data as part of our normal business activity.
Key components:
5.3.1 There are systems and processes in place for the collection, recording, analysis and reporting of data which are focused on securing data which are accurate, valid, reliable, timely, relevant and complete.
5.3.2 Systems and processes work according to the principle of right first time, rather than employing extensive data correction, cleansing or manipulation processes to produce the information required.
5.3.3 Arrangements for collecting, recording, compiling and reporting data are integrated into our business planning and management processes, supporting the day-to-day work of staff.
5.3.4 Information systems have built-in controls to minimise the scope for human error or manipulation and prevent erroneous data entry, missing data, or unauthorised data changes. Controls are reviewed at least annually to ensure they are working effectively.
5.3.5 Corporate security and recovery arrangements are in place. We regularly test our business critical systems to ensure that processes are secure, and results are reported to top management.
5.4 People and skills
We will put in place arrangements to ensure that staff have the knowledge, competencies and capacity for their roles in relation to data quality.
Key components:
5.4.1 Roles and responsibilities in relation to data quality will be clearly defined and documented, and incorporated where appropriate into job descriptions.
5.4.2 Data quality standards are set, and staff are assessed against these.
5.4.3 Staff are trained to ensure they have the capacity and skills for the effective collection, recording, analysis and reporting of data.
5.4.4 There will be a programme of training for data quality, tailored to needs. This will include regular updates for staff to ensure that changes in data quality procedures are disseminated and acted on.
5.4.5 Corporate arrangements will be in place to ensure that training provision is periodically evaluated and adapted to respond to changing needs.
5.5 Data use and reporting
We will put in place arrangements that are focused on ensuring that data supporting reported information are actively used in the decision making process, and will be subject to a system of internal control and validation.
Key components:
5.5.1 Internal and external reporting requirements will be critically assessed. Data provision is reviewed regularly to ensure it is aligned to these needs.
5.5.2 Data used for reporting to those charged with governance are also used for day-to-day management of our business. As a minimum, reported data, and the way they are used, will be fed back to those who create them to reinforce understanding of their wider role and importance.
5.5.3 Data will be used appropriately to support the levels of reporting and decision making needed (for example, forecasting achievement, monitoring service delivery and outcomes, and identifying corrective actions). Evidence will be provided to show that management action is taken to address service delivery issues identified by reporting.
5.5.4 Data which are used for external reporting are subject to rigorous verification, and to senior management approval.
5.5.5 All data returns are prepared and submitted on a timely basis and, where appropriate, are supported by a clear and complete audit trail.
6. Review and Action Plan
The Action Plan attached at Appendix 2 results from the recommendations contained in the external auditor’s data quality reports for 2008 and 2009.
APPENDIX 1
PERFORMANCE INDICATORS
Service Plan / PI reference / PI Description / Collection/monitoring frequency / Reported to (Committee) / Strategic Plan link ref.
Building Control / BC01 / Full plans applications vetted within 10 days (national PI top quartile) / Quarterly / Sustainable Environment / 1.2.1, 3.1.1
Building Control / BC02 / Full plans determined within statutory timescales (national PI) / Quarterly / Sustainable Environment / 1.2.1, 3.1.1
Building Control / BC03 / Inspect the same day if notified by 10.00 (national PI top quartile) / Quarterly / Sustainable Environment / 1.2.1, 3.1.1
Building Control / BC04 / Inspect the same day other inspections (national PI top quartile) / Quarterly / Sustainable Environment / 1.2.1, 3.1.1
Building Control / BC05 / Inspect within 24 hours of notified dangerous structures (national PI) / Quarterly / Sustainable Environment / 1.2.1, 3.1.1
Customer Services Centre / CSC01 / % of calls answered / Quarterly / Resources / 3.1.1
Customer Services Centre / CSC02 / % of calls answered within 15 seconds / Quarterly / Resources / 3.1.1
Customer Service Centre (lead to coordinate first submission) / NI 14 / Reducing avoidable contact: minimising the proportion of customer contact that is of low or no value to the customer / Annual / Resources / 3.1
Democratic Services / N/A / 10 Performance Standards introduced by Electoral Commission for Electoral Registration (July 2008). Reported to Regulatory Committee 13 August 2008. Assessed via self-assessment to Electoral Commission. / First deadline to report to Electoral Commission – 16/01/09 / Regulatory Services Committee / 3.1.1
Development Control / DC01 / Major Planning Applications (Target period is for decision within 13 weeks) (NI157) / Quarterly / Sustainable Environment / N/A
Development Control / DC02 / Minor Planning Applications (Target period is for decision within 8 weeks) (NI157) / Quarterly / Sustainable Environment / N/A
Development Control / DC03 / Other Planning Applications (Target period is for decision within 8 weeks) (NI157) / Quarterly / Sustainable Environment / N/A
Development Control / DC04 / Quarterly planning applications received / Quarterly / Sustainable Environment / N/A
Development Control / DC05 / Outstanding planning applications (average at one time) / Quarterly / Sustainable Environment / N/A
Development Control / DC06 / Committee applications determined within 8 weeks / Quarterly / Sustainable Environment / N/A