Subject: Police Training and Learning Performance Evaluation and Management

Purpose: This Circular gives guidance on the evaluation of the impact of training and learning in the police service on operational performance.

SUMMARY

This Circular aims to improve the conduct of evaluation of training and learning across the Service, with a particular focus on the impact on operational performance and return on investment. It is particularly targeted at those who require training and learning to be done, i.e. BCU Commanders and Heads of Departments (the client sponsor), who are responsible for determining the output or outcome required from training and learning and its evaluation. It emphasises the importance of the role of the client sponsor and that of the Police Authority.

It supports the delivery of the National Policing Plan (NPP) through the National Intelligence Model (NIM) and the monitoring and management of performance against the Police Performance Assessment Framework (PPAF). It promotes the involvement of the community in the evaluation of training and learning.

It provides a framework for conducting evaluations that includes how and when, as well as what, needs to be evaluated. Together with the revised Models for Learning and Development for the Police Service (available from Centrex and through NCALT at www.ncalt.com), it builds on the existing evaluation model to provide a more robust approach that is better able to deliver what the Service requires.

It repositions the Evaluator role in the context of the reform of police training and learning. It is intended that Evaluators will in future be recorded on a professional register.

This Circular has been endorsed by the Police Training and Development Board (PTDB). It replaces HOC 105/1991.

ACTION BY

Police Authorities - to satisfy themselves that forces are complying with this circular and are obtaining an acceptable return on investment in training. The Circular should be drawn to the attention of the Police Authority members who have responsibility for training, learning and evaluation.

Chief Officers - to ensure dissemination to client sponsors across the service and, in turn, compliance with the requirements of this circular; also to ensure that adequate resources are made available to carry out evaluations in line with national, regional and local priorities, according to the needs of sponsors and the professional practice of evaluation of training and learning. Chief Officers must ensure that there is independent oversight of the evaluation function, that the function is independent from the delivery of training and learning, and that it has access to all training and learning materials.

BCU Commanders, Heads of Department and other sponsors (the client) - to take responsibility for determining that evaluation occurs where appropriate and what is sought from an evaluation, and to ensure that evaluations are conducted in accordance with the requirements. Where necessary, guidance on what evaluation can do and how it works should be sought from Evaluators within force.

Training Managers - to act on the Circular, draw it to the attention of Force Evaluators and participate in the evaluation process from the contractor side.

Force Evaluators - to conduct evaluations in line with defined national guidance, national, regional and local priorities and as specified by sponsors; to inform and support sponsors in the development of the evaluation requirement.
HR Directors - to ensure implementation of the developments regarding the role and qualifications of force evaluators.

1. Scope

1.1 This Circular is aimed at ensuring that the impact of training and learning on operational performance is evaluated effectively. It is targeted at performance evaluation where it is specifically related to training and learning, not at performance evaluation generally.

2. National Evaluation Strategy

2.1 The Circular seeks to provide a common approach to evaluation in order to achieve the National Evaluation Strategy (NES) and the implementation of Foundation for Change 8 – Evaluation, as published by the ACPO/APA National Project for Best Value in Police Training in November 2002. A fuller explanation of the background and context of this circular is given in Annex 1.

3. National Policing Plan and National Intelligence Model

3.1 The National Intelligence Model (NIM) enables effective identification of resource and capability requirements in order to deliver the Policing Plan. Where capability falls short of the operational requirement, NIM reveals the need for training and learning.

3.2 Evaluation of training and learning delivered as a consequence measures the correct and proper application of NIM and is thus fundamental to delivery of the Police Reform Agenda and the National Policing Plan.

3.3 NIM is 'A Model for Policing' that ensures that information is fully researched, developed and analysed to provide intelligence that senior managers can use to:
• provide strategic direction;
• make tactical resourcing decisions about operational policing; and
• manage risk.

3.4 The application of evaluation methodology will lead to greater consistency of policing across the Service through the management of risk in terms of operational skills. It will provide more informed business planning through a practical link between training and learning provision and operational policing issues.

3.5 At a strategic level, NIM is strongly linked to all aspects of business planning, both in relation to the Policing Plans and the strategies of Crime and Disorder Reduction Partnerships. Evaluation of the investment planned and made in the skills and abilities of officers and staff in support of those areas will ensure that there is a clear link between the resources invested and the performance outcomes required.

4. Police Performance Assessment Framework

4.1 Evaluation must address the effectiveness of training and learning in having an impact on the areas of performance for which it was intended. Training and learning will invariably be designed to improve performance in areas that will be measured within the Police Performance Assessment Framework (PPAF). Indeed, in many cases it will be designed specifically to drive an indicator in a particular direction. In such cases it is likely that training and learning will not be the only intervention and thus will not be the sole contributor to performance improvement. However, it is incumbent upon sponsors and evaluators to identify the intended impact of training and learning and to specify the expected measurable changes. This will ensure that, wherever possible, the link between training and learning and operational performance improvement has been predicted and quantified prior to measurement.

5. Implementation

5.1 There is a need to be prescriptive about a number of aspects of evaluation activity and the context in which they are carried out.
This will ensure consistency, eliminate duplication and ensure that national priorities for action are met. This circular outlines the methodology for selecting what will be evaluated, and how and when the evaluations will be carried out.

5.2 What – A number of key elements must be considered in determining which programmes will be evaluated and at what level. These will include the prioritisation within the National Learning Requirement (NLR) as identified by the PTDB's impact assessment. They will consider the relative weights of strategic priority, individual need, time requirement, investment in learning (£s per head), cost to the service, number of learners and cost of training time (number of days).

5.3 PTDB will determine which programmes will require national evaluation activity. This may include some programmes of a specialist nature which are adopted only in certain parts of the service.

5.4 PTDB and ACPO Business Areas will also decide when to initiate national evaluations of existing programmes based on an assessment of performance indicators in PPAF.

5.5 Regions, through their Strategic Training Groups, and forces will also need to conduct other evaluations according to local prioritisation of needs and performance measures.

5.6 Furthermore, there will be a need for evaluations to be undertaken by the client side for some Centrex products and services, especially in response to requests from the National Training Managers Group.

5.7 The National Evaluation Strategy Implementation Group (NESIG) will consider the arrangements for interchange of evaluation resources between regions and Centrex to ensure the independence and integrity of evaluation reports. It will make proposals to the PTDB Executive to engage regional and force level resources in evaluating national programmes.

5.8 As NESIG brings all the regions together, it is a useful forum to promote regional collaboration.

5.9 How – All forces and regions must ensure that the role requirements of client side sponsors and contractor side practitioners are addressed, i.e. all programmes to be evaluated must have an identified sponsor and, wherever relevant and possible, there should be engagement with the community. The latter requirement is essential where the training or learning programme is community focused or based.

5.10 Further guidance is contained in the Association of Police Authorities' publication 'Involving communities in police learning and development', which can be accessed at www.apa.police.uk. See Annex 2 for a summary of roles and responsibilities.

5.11 Careful consideration will be given to which aspects within the programmes will be evaluated in order to measure the causal relationships between the inputs, processes and outcomes of training and learning activities.

5.12 The service is accustomed to using the Kirkpatrick model and there is no intention to change this. However, the model alone is insufficiently well defined to enable sponsors, stakeholders and evaluation practitioners to define the specific questions that an evaluation needs to answer.

5.13 In order to make the identification of causal relationships between inputs, processes and outcomes more achievable, and hence identify more readily the appropriate evaluation methodologies, the service will adopt a consistent approach to considering the questions that an evaluation needs to answer.
A full explanation of how this will be done will be provided in the forthcoming revision of the Centrex Models for Learning and Development, scheduled for publication in Spring 2005. (An illustrative sketch of one possible way of recording such questions is given after section 6 below.)

5.14 All evaluation projects will be conducted according to the Program Evaluation Standards published by the Joint Committee on Standards for Educational Evaluation at Western Michigan University. The Standards can be found at www.eval.org/EvaluationDocuments/progeval.html. Evaluation projects will also be conducted within a programme management framework based on EFQM, to be developed by NESIG for use across the service. The evaluation must be conducted under the guidance of a qualified evaluator, as defined by Skills for Justice (SfJ), who, after its creation, is a member of the Professional Register of Evaluators.

5.15 All quantitative data will be stored in a spreadsheet format so that it can be accessed and analysed using a standard statistical software package, enabling the sharing and aggregation of data for meta-analyses across the service and its partners. The data layout in the spreadsheet must be compliant with the input requirements of SPSS Inc. software, which is widely used for statistical analysis. (An illustrative data layout is sketched after section 6 below.) The data must also be utilised within a quality assurance cycle geared towards continuous improvement. The time periods for retention of all data will be as specified by the National Evaluation Strategy Implementation Group (NESIG).

5.16 These developments will also be incorporated into the next update of the Models for Learning and Development.

5.17 When – The scope, timing and phasing of evaluation must be determined through the Project Initiation Document (PID) and at the time of the training needs analysis, so that it is scheduled into the project timetable. It is anticipated that most programmes will have a pilot phase that will be evaluated within the quality assurance cycle, as specified above, before the programme is refined and rolled out.

5.18 There must also be a schedule for evaluating during and/or after the roll out, for formative and/or summative purposes.

6. Future Developments

6.1 The direction of the further development of evaluation activity will be determined by a specialist evaluator role within the PTDB Executive, who will work closely with NESIG.

6.2 NESIG will continue to direct the implementation of the NES, supported by the PTDB Executive. It will build and reinforce links to force and Centrex evaluators and to regional evaluation groups. This will also include linkage to NCALT (the National Centre for Applied Learning Technologies, Centrex) and the Police Licensing and Accreditation Board (PLAB) to ensure that evaluation has the infrastructure to be maintained as a business-as-usual activity at national, regional and force level.

6.3 It will ensure the creation and maintenance of a national database of evaluations that will be shared across the service.

6.4 It will seek to promote community involvement and work with the service and Centrex on the evolution of the Models for Learning and Development.

6.5 NESIG will also monitor the evaluation resources and evaluation projects across the service. It will ensure that evaluation projects at national and regional level have the necessary balance of Centrex and force evaluation resources to ensure the production of timely reports of high professional integrity.

6.6 Qualification routes for Evaluators are being examined in conjunction with Skills for Justice, and their training needs with Centrex.
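Illustration – recording evaluation questions (paragraphs 5.11-5.13). The consistent approach to framing evaluation questions will be defined in the revised Models for Learning and Development; pending that publication, the sketch below is a minimal, purely illustrative example (in Python) of one hypothetical way evaluation questions, the expected measurable changes and their data sources might be recorded against the familiar Kirkpatrick levels. The levels are the standard Kirkpatrick ones; every question, measure, data source and field name shown is an invented assumption, not a prescribed structure.

```python
# Purely illustrative sketch: one hypothetical way of recording the questions an
# evaluation needs to answer against the four Kirkpatrick levels. The questions,
# measures and data sources are invented examples, not a mandated scheme.
from dataclasses import dataclass

@dataclass
class EvaluationQuestion:
    kirkpatrick_level: int   # 1 reaction, 2 learning, 3 behaviour, 4 results
    question: str            # what the evaluation needs to answer
    measure: str             # the expected measurable change, specified up front
    data_source: str         # where the evidence will come from

# Hypothetical questions for a programme intended to drive a PPAF indicator.
questions = [
    EvaluationQuestion(1, "Did learners find the course relevant to their role?",
                       "Mean relevance rating of 4+ on a 1-5 scale",
                       "end-of-course questionnaire"),
    EvaluationQuestion(2, "Did learners' knowledge of the procedure improve?",
                       "Post-test scores at least 20 points above pre-test",
                       "pre/post knowledge tests"),
    EvaluationQuestion(3, "Is the procedure being applied on the BCU?",
                       "Correct application in 90% of sampled cases",
                       "supervisor workplace audit"),
    EvaluationQuestion(4, "Has the targeted indicator moved as predicted?",
                       "Indicator improves by the margin predicted at the TNA",
                       "force performance data"),
]

# Listing the questions in level order gives sponsors and evaluators a single
# agreed record of what will be measured before the programme is delivered.
for q in sorted(questions, key=lambda q: q.kirkpatrick_level):
    print(f"Level {q.kirkpatrick_level}: {q.question} -> {q.measure} ({q.data_source})")
```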
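Illustration – spreadsheet data layout (paragraph 5.15). Paragraph 5.15 requires quantitative evaluation data to be stored in a spreadsheet layout that a standard statistical package can read. The following is a minimal sketch only, assuming a conventional "one row per learner, one column per variable" layout with numeric codes and a separate codebook; the variable names, codes and file names are illustrative assumptions, not a mandated schema.

```python
# Illustrative sketch only: a hypothetical layout for storing quantitative
# evaluation data so it can be imported into a standard statistical package.
# Column names, codes and file names are assumptions, not a mandated schema.
import pandas as pd

# One row per learner, one column per variable ("one case per row" layout),
# with categorical answers stored as numeric codes.
records = pd.DataFrame(
    {
        "learner_id": [1001, 1002, 1003],
        "force_code": [12, 12, 31],          # hypothetical force identifiers
        "course_code": ["NIM-01", "NIM-01", "NIM-01"],
        "pre_test_score": [54, 61, 47],      # % correct before training
        "post_test_score": [78, 83, 70],     # % correct after training
        "reaction_rating": [4, 5, 3],        # 1-5 scale (reaction level)
    }
)

# A separate codebook records what each variable and code means, so that data
# from different forces can be aggregated and compared for meta-analysis.
codebook = pd.DataFrame(
    {
        "variable": ["reaction_rating", "force_code"],
        "meaning": ["1 = very poor ... 5 = very good", "force identifier code"],
    }
)

# Saving as CSV keeps the data in a plain spreadsheet format that SPSS, R,
# Excel and similar tools can all read.
records.to_csv("evaluation_data.csv", index=False)
codebook.to_csv("evaluation_codebook.csv", index=False)
```

Keeping one row per learner and documenting the codes separately is what makes later sharing and aggregation across forces straightforward, as envisaged in paragraph 5.15.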
7. Action Plan

7.1 The following generic action plan will be used over the next year by the PTDB and PTDB Executive to regulate national evaluation activity:

1. PTDB and the PTDB Executive will determine national evaluation priorities from 2005-06 onwards and advise NESIG and Regional Training Strategic Groups. Details will be published on the PTDB pages of the Police Reform website: http://www.policereform.gov.uk/ptdb/index.html.
2. Regional Training Strategic Groups will determine regional and some local evaluation priorities.
3. Forces will determine other local evaluation priorities.
4. NESIG will draft a planned programme for national evaluation activity, considering the demands placed on force, regional and Centrex resources and their availability, and submit it to the PTDB Executive.
5. The PTDB Executive will approve the programme for implementation by force, regional and Centrex evaluators.
6. Sponsors, Force Training Managers, Centrex and Evaluators will take specific evaluation projects forward.
7. NESIG will monitor evaluation activities against the programme, assist forces, regions and Centrex to balance resources, and report to the PTDB Executive.
8. The PTDB Executive will monitor the progress of the evaluation projects and report to PTDB.

8. Evaluators Forum

8.1 The Evaluators Forum, previously available on the PSSO website, is now hosted by the PTDB and is available at www.evaluation.policereform.gov.uk. For any enquiries about the website, please contact:

Pritha Ray
PLPU
2nd floor, Allington Towers
19 Allington Street
London SW1E 5EB
Tel: 020-7035-5018
pritha.ray@homeoffice.gsi.gov.uk

9. Further information

9.1 For further information, please contact:

Dorothy Gonsalves
PLPU
2nd floor, Allington Towers
19 Allington Street
London SW1E 5EB
Tel: 020-7035-5071
dorothyd.gonsalves@homeoffice.gsi.gov.uk

Kim White
PTDB Executive co-ordinator
2nd floor, Allington Towers
19 Allington Street
London SW1E 5EB
Tel: 020-7035-5040
Mobile: 07917-053741
kim.white12@homeoffice.gsi.gov.uk

Police Leadership and Powers Unit
Home Office
…January 2005

ANNEX 1 - BACKGROUND

In 1990 the Police Training Council's Steering Group on the Overview of Police Training Arrangements set up a working group to examine the issues involved in the evaluation of training. The outcome of that work was Home Office Circular 105/91 – The Evaluation of Training in the Police Service. That document has provided guidance on the way in which police training should be evaluated and the contribution that can be made by Training Evaluators. However, the document has not been updated since and, as a consequence, does not reflect the current context of the reform of police training.

During 2001 a working group was convened to produce a National Evaluation Strategy (NES) for Police Training. This was ratified by both the APA and ACPO and in October 2002 was launched across the service by the National Best Value Project in Police Training (NBVP) team. In 2003 it was incorporated into the Models for Learning and Development in the Police Service, published by the Central Police Training and Development Authority (Centrex), and the Home Office convened the National Evaluation Strategy Implementation Group (NESIG) to facilitate its adoption.

Recently the NES has increasingly been applied to a broader conception of police training, development and learning. As a result, HOC 105/91 no longer reflects the service's need for evaluation. This circular replaces HOC 105/91.
It is intended to provide clarity regarding a common approach to evaluation in order to achieve the National Evaluation Strategy and the implementation of Foundation for Change 8 – Evaluation. It will provide a framework for evaluation which includes how and when, as well as what, needs to be evaluated. It will also reposition the Evaluator role in the context of the reform of police training.

Consultation on the reform of police training began in 1999 and led to the publication of the Home Office paper 'Police Training – The Way Forward' in April 2000. The Police Training Council (PTC) was disbanded and replaced by the Police Training and Development Board (PTDB), which now has an Executive beneath it to take action forward. There were a number of other structural changes, including the reform of National Police Training into Centrex and the creation of the Police Standards and Skills Organisation (PSSO) and Her Majesty's Inspectorate of Constabulary for Training and Personnel (HMIC P&T).

APA and ACPO jointly established the NBVP in order to synchronise the Best Value Reviews (BVRs) of training at force level between 2001 and 2003. In order to facilitate this, several frameworks were developed, including the National Evaluation Strategy, the National Training Costing Model and the Foundations for Change. Work has continued with the National Performance and Development Review (PDR) and, through the PSSO (now part of Skills for Justice), on National Occupational Standards (NOS) and the Integrated Competency Framework (ICF).

A learning culture in the service is being promoted by the Home Office and HMIC by encouraging forces to attain Investors in People (IiP) status. HMIC P&T inspects both training planning and provision and the implementation of the BVR recommendations, whilst training is also considered in the wider HMIC Baseline Inspections within the European Foundation for Quality Management (EFQM) framework.

PTDB is re-drafting the National Strategy to Promote Learning in the Police Service (often referred to as the National Learning Strategy – NLS), but it has already been instrumental in focusing attention increasingly on the learner experience and learning effectiveness. In addition, the Development Portfolio of the ACPO Personnel Management Business Area makes recommendations to PTDB on priorities for the National Learning Requirement (NLR).

NESIG, in seeking to promote the NES, is aiming to ensure that due account is taken of these developments and the broader issues of programme management. These include quality assurance under the auspices of the Police Licensing and Accreditation Board (PLAB), which is accountable to PTDB; Evaluator training, qualifications and the professional register; the existing regional and national networks of Evaluators; client and contractor relationships as outlined in HOCs 18/2002 and 53/2003; and the roles and responsibilities of the owners, sponsors and users of evaluations.

ANNEX 2 - ROLES AND RESPONSIBILITIES

PTDB - Police Training and Development Board - is responsible for the direction of the policy and strategy of police training. It is a tripartite body made up of Home Office, ACPO and APA representatives. It also determines the criteria for national evaluation priorities.

PTDB Executive - is responsible to PTDB for the formulation of policy and strategy for police training and, on PTDB approval, their implementation.
It has tripartite representation, includes senior and experienced managers and practitioners, and is linked to the National Training Managers Group, the National Evaluation Strategy Implementation Group, Centrex, Skills for Justice and employees' representatives. It also acts on behalf of PTDB to specify the national priorities for evaluation activity.

NESIG - National Evaluation Strategy Implementation Group - is responsible to the PTDB Executive for the implementation of the National Evaluation Strategy for police training and for considering the balance of activities for evaluators nationally, in line with agreed priorities. It is linked directly to the PTDB Executive and the National Training Managers Group, and comprises professional evaluation specialists from Centrex, Skills for Justice and each ACPO region.

Sponsors - are responsible for determining what is sought from an evaluation. Sponsors will be from the Home Office for certain national initiatives, from ACPO Business Areas for Business Area and other national priorities, or from force ACPO and/or APA for addressing regional or force issues.

Force Training Managers - are often responsible for the evaluation resources within the force. Where this occurs, care must be taken to ensure that there is a split between the training practitioners and the evaluators, so that there is no conflict of interest. They will work collaboratively to ensure that resources are made available to carry out evaluations in line with national, regional and local priorities, according to the needs of sponsors and the professional practice requirements stipulated in this circular.

Evaluators - are responsible for conducting evaluations according to their professional qualifications and training, in line with defined national, regional and local priorities and as specified by sponsors.