Quality Assurance Framework in the HCSO

Katalin Szép, Judit Vigh, Erika Földesi, Kornelia Mag, Szilvia Katona
Hungarian Central Statistical Office, Statistical Research and Methodology Department
Keleti Károly utca 5-7, 1024 Budapest, Hungary
Katalin.szep@ksh.hu, Judit.vigh@ksh.hu, Erika.foldesi@ksh.hu, Kornelia.mag@ksh.hu, Szilvia.katona@ksh.hu

Keywords: quality framework, development of standard tools

1. Introduction

The HCSO has been aware of the quality-related developments achieved in other NSIs and at Eurostat, but the central methodological unit set up in 2002 was the organisational precondition for starting systematic work in this field in our office. Based on the knowledge gathered from the literature, from training courses, from study visits to Statistics Canada, Statistics Sweden and ONS UK, and from participation in Eurostat working parties, we developed a strategic plan. The HCSO declared its commitment to quality and published its quality concept and the action plan on the development of a quality assurance framework as part of the strategy of the HCSO for the period 2005-2008 (HCSO 2005; Szép et al. 2006). While the task on the quality assurance framework concentrated on data-producing processes and on statistics as products, the strategy also included other programmes closely related to the implementation of TQM principles, such as ‘Programme planning and self-assessment systems’, ‘Improving relations with respondents’, ‘Development of the meta-information system’ and a project on the development of systematic user satisfaction surveys.

2. The quality assurance framework in the HCSO

By the end of 2009 the programme on quality assurance was finished (HCSO 2009c). The new, separate quality policy has been published on the HCSO website (HCSO 2009b):

“The mission of the Hungarian Central Statistical Office as part of the European Statistical System is to provide credible and good-quality statistical services – adequate for users’ needs – about the state and changes of society, economy and environment.”

“The products of the HCSO are published data provided to users, statistical analyses, classifications, registers, methods and other statistical services. The statistical products of the HCSO – in line with the ISO definition – must be ‘fit for use’ and in accordance with the following interrelated components:
• relevance;
• accuracy;
• timeliness;
• punctuality;
• accessibility;
• clarity;
• comparability and coherence.”

The standards of the basic elements – methods and tools – are available internally. Quality guidelines have been documented and accepted for each phase of the statistical production process, containing the concepts, principles, guidelines and references to the basic literature. Standard process variables have been identified to characterise process quality (Mag 2006), and DESAP (a checklist in the form of a self-assessment questionnaire for evaluating statistical processes) has been adapted to our quality guidelines. The quality components and the standard indicators describing them have been defined, and a standard structure and content for quality reports are ready to characterise product quality in a standardised manner; the quality report form is complemented with self-assessment questions referring to product quality. Hence survey managers now have at their disposal the tools necessary to run the system and to perform regular quality assessments.
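To make the notions of a standard quality indicator and a standardised quality report more concrete, the sketch below shows one possible representation of a quality report record with two computed indicators. It is a minimal illustration only: the formulas follow the common Eurostat-style definitions of punctuality and timeliness, while the field names and the report structure are assumptions made for the example, not the internal HCSO standard.

```python
from datetime import date

def punctuality_days(scheduled: date, actual: date) -> int:
    """Punctuality: delay in days between the scheduled and the actual
    release date (zero or negative means the release was on time)."""
    return (actual - scheduled).days

def timeliness_days(reference_end: date, release: date) -> int:
    """Timeliness: days elapsed between the end of the reference period
    and the release of the data."""
    return (release - reference_end).days

# A minimal quality report record for one statistical domain
# (field names are illustrative, not the HCSO standard form).
quality_report = {
    "domain": "example domain (lowest level of the domain nomenclature)",
    "indicators": {
        "punctuality": punctuality_days(date(2009, 3, 31), date(2009, 4, 2)),
        "timeliness": timeliness_days(date(2008, 12, 31), date(2009, 4, 2)),
    },
    # free-text answers to the complementary self-assessment block
    "self_assessment": {
        "accuracy": "main error sources documented in the survey metadata",
    },
}

print(quality_report["indicators"])  # {'punctuality': 2, 'timeliness': 92}
```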
[Figure 1: Elements and tools of quality measurement and assessment. The figure relates user needs, quality requirements (standards, quality guidelines), the statistical production process and the statistical data to the measurement tools (process variables, quality indicators, quality reports) and the assessment tools (self-assessment, feedback talks, external and internal audits), which lead to action plans.]

The structure and functions of the quality framework are presented in Figure 1. The figure is based on the DatQAM handbook (Eurostat 2007); the items already implemented in the HCSO are highlighted in bold italics.

Training on quality has been on the agenda during the whole period, adapted to the actual needs and possibilities. Since 2006, internal courses on quality have been organised for managers, survey statisticians and other employees of the HCSO. As institutional background, the Planning Department is responsible for operating the system, while the methodology unit is responsible for developing the tools. In the following we highlight some key elements, drawbacks and synergies of the system development process.

3. The sequence of the development phases of the quality assurance framework

We intended to follow a step-by-step approach in setting up the planned system. In the preparatory phase our first task was the translation of the main standard documents from English into Hungarian; we had to establish the Hungarian terminology of this field on the basis of the existing statistical and TQM terminology. We took a systematic approach, with a view to developing the framework and elaborating a programme for the development of tools and methods. The quality concept and components were accepted together with the programme at the beginning of 2005. The first intention was to start with product quality measurement – quality control – following the historical evolution of quality work. However, by decision of the Strategic Development Council of our office, the projects on product quality and on process quality started at the same time, without this causing any problem. From an ethical point of view, the quality requirements should be known well before quality control or quality assurance starts. Consequently, within the process quality project the first task was the elaboration of the Quality guidelines as an internal methodological standard on the phases of the statistical process. Although the quality guidelines were almost ready (90%) in 2005, they were adopted as an internal regulation only in 2007, after a very long and exhaustive procedure of consultation, discussion and agreement within the office.

The development of the standard quality indicators characterising the quality components followed a different approach: the intention was to start measurement first, gain experience, and set concrete thresholds (if any) only afterwards. The first set of standard quality indicators was accepted in 2008 with a strict rule on their compulsory calculation. In 2009 the following standards were accepted as recommendations, without any strict obligation of regular implementation: the standard quality report and form with complementary self-assessment blocks, the standard process variables for the process phases, and the self-assessment questionnaire based on DESAP, adapted to the quality guidelines. We also started to develop a standard for internal audits supported by quality coaching (provided by Maria Dologova from Slovakia) and to work on the TQM approach (HCSO 2009a).
Keeping quality on the communication agenda was a continuous task: we presented it at regular meetings and organised special events, such as presentations in the frame of methodological days, a forum on the accepted quality programme, reporting fora after the quality conferences (Q2004, Q2006, Q2008), and the Code of Practice (CoP) self-assessment and peer reviews, both within the office and outside it, in the frame of the Statistical Council for the official statistical service and of the Statistical Committee for the academic community.

Training on quality has been a continuous activity in our office, with a widening scope in audience and special topics. From 2003 onwards several HCSO experts participated in Eurostat, TES, AMRADS and Istat quality training courses. Later on, wider participation became possible through the training courses held in the HCSO by foreign experts (Michael Colledge from Australia; Peter Lohauß, Anja Nimmergut and Karen Blanke from Germany; and Thomas Burg from Austria), organised with Eurostat support in 2005. In order to widen our knowledge and experience, we participated in the 2004-2005 Eurostat pilot project ‘Quality in Statistics’ and in the DatQAM grant, closed in 2007 (Colledge et al. 2006). Since 2006 we have run our internal quality courses, with the course structure and material updated yearly in order to meet changing needs and to adapt to the phases of the framework development.

The framework development was the task of the Statistical Research and Methodology Department, with the involvement of some experts from the Planning Department. The situation matured for organisational changes, as the operation and the development of the system had to be separated. A section of the Planning Department was appointed to manage the HCSO quality framework; the unit was renamed the Quality Management and Analysis Section, and its tasks were declared in the Organisational and Operational Regulation in 2010.

4. Problems that emerged during the development

We had to cope with limited resources from the beginning. Financial resources were used only for external training and conference participation. At the beginning four experts worked on the development of the quality framework as part of their work assignments; later on two more experts joined the team for special tasks. Teamwork was the only way to pool the different approaches, experience and knowledge of these methodologists. Subject matter experts could dedicate only limited time to the task, because within this period the HCSO staff was cut by 22%, without any reduction in tasks or outputs. In addition, the resource allocation measurement and planning system was set up in the same period, putting an extra administrative burden on statisticians. Subject matter experts participated in the two projects on data and process quality and in testing the results.

We used a cautious, step-by-step approach in setting up the quality framework. There was always a test phase before the introduction of new elements, and compulsory tasks have been introduced in a flexible way: the calculation of quality indicators is compulsory only where it is automated on the basis of IT tools. Our management was strongly committed to the development of the quality assurance framework, but we had to motivate and convince the staff in order to overcome resistance against the new way of working and the additional burden.
First, we explained that having quality requirements and goals that are monitored during the production phase and assessed at the end has always been an inherent part of statistical work, and that the new approach consisted only in the introduction of a systematic, regular practice. Secondly, we always started from an analysis of the state of the art and of the existing knowledge and practices; the proposals developed in the project teams were always circulated for office-wide consultation and discussed with all the subject matter areas. Thirdly, we always took into consideration the balance of costs and benefits from the statisticians’ point of view. For example, after the provision of the shortlist of standard quality indicators and the introduction of their compulsory calculation, the survey managers were assisted by the quality report and self-assessment tools.

In the past there was no forum to discuss cross-cutting issues such as general statistical methodology or quality. In 2007 the Board on Methodological Issues was set up as a new forum. The members of the Board are the representatives of all the subject matter departments, including the Planning Department; when the agenda items concern them, representatives of the IT and Dissemination Departments also participate in the work of the Board. This forum discusses new standards, cross-cutting tools, rules and feasibility studies to overcome problems, and makes proposals to the management.

We also had to face some problems concerning terminology. At the beginning it became clear that the same words and expressions were interpreted in different ways in the different departments, and it was very difficult to achieve consensus on the terminology; however, the office-wide project on the development of the meta-information system (Baracza et al. 2009) had a synergic effect when the concepts and definitions database was reviewed.

The need for new nomenclatures also emerged. The already existing business process model, developed by the Informatics Department, is written as a nomenclature that can be detailed on four levels; it is used by the metadata-driven systems (Györki 1996; Pap 2004). For the quality guidelines we needed another business process model, listing the phases of the whole production process; it was developed on the basis of the existing HCSO informatics process model and the statistical value chain. Harmonisation with the SDMX standard business process model is a task for the near future.

We had to find the ‘units’ of statistics to be characterised by the quality reports. In the HCSO we already had the nomenclature of programmes (used for the resource planning and allocation documentation system; it is hierarchical with three levels and about 300 items on the lowest level) and the nomenclature of statistical domains (by statistical theme, used for the metadata system; it has four hierarchical levels and about 100 items on the lowest level). The quality reports are linked to the lowest level of the statistical domains.

During the development of the standard tools and methods the question ‘Can standards exist at all?’ emerged several times. Subject matter experts were convinced that their area was so specific that no standard guidelines, methods or indicators could be defined or applied. Therefore, in the development of the internal standards we relied as much as possible on the knowledge accumulated in the HCSO and on our traditions. For example, in the project on product quality the first step was to identify standard quality indicators.
Based on a survey of the subject matter areas we compiled an inventory of quality indicators (programme, frequency, indicator, type and characteristics, nomenclatures used, formulas, etc.). Based on the analysis of this inventory and of the Eurostat standard quality indicators (2003), we identified a list of proposed indicators. Then we conducted a survey among the subject matter departments to enquire which indicators could be applied in the different domains. Obviously, bilateral consultations and clarifications were necessary to achieve a common understanding. The test phase served methodological and burden-estimation goals; each department participated with one or two domains (Földesi 2006). To reduce the burden on statisticians, we made an effort to introduce the quality indicators into the existing and currently developed metadata-driven IT systems which support the statistical processes. For example, different non-response rates can be calculated from the output of the metadata-driven data capture system GESA; a minimal sketch of such a calculation is given at the end of this section.

We set the target of building up a database of quality indicators and quality reports. For this we need information on the number, size, source, frequency, etc. of these indicators and on their current availability, so it was decided that the subject matter departments are temporarily responsible not only for calculating but also for storing the indicators. An internal audit was set up to control whether the departments had fulfilled this obligation; it was carried out by the internal audit section, independent of the quality experts, and its results are an important feedback.

In spite of the trainings and consultations (around 170 participants), it became obvious that further training and methodological support were needed concerning both the concepts and the operational competences. The members of the ‘quality respondents’ network – one per subject matter department – were nominated and trained, but in practice mostly other experts have dealt with the quality reports and indicators: several per department, frequently one per survey. Systematic capacity building for the technical implementation has not been sufficient.

A new problem emerged with the development of the office programme planning and performance measurement. Programmes are designated yearly for units, for managers and for each statistician, and the implementation and performance of the programmes are measured with indicators that mostly overlap with the quality indicators. If people are evaluated and rewarded on the basis of quality indicators, distortion can be expected: the main goal will be to prove excellent performance rather than to identify problems and ways to improve. And in principle, quality assessment is meant to judge products and processes, not people.
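As a concrete illustration of the indicator calculations mentioned above, the sketch below computes an unweighted unit non-response rate from a simple list of unit status codes. The record layout and status values are assumptions made for the example; the actual output format of GESA is not described in this paper.

```python
# Hypothetical capture-system output: one record per sample unit with a
# final status code (the real GESA record layout is not shown here).
units = [
    {"unit_id": 1, "status": "responded"},
    {"unit_id": 2, "status": "refused"},
    {"unit_id": 3, "status": "responded"},
    {"unit_id": 4, "status": "non_contact"},
    {"unit_id": 5, "status": "out_of_scope"},  # ineligible unit
]

def unit_non_response_rate(units):
    """Unweighted unit non-response rate: non-responding units as a share
    of eligible units (out-of-scope units are excluded from the denominator)."""
    eligible = [u for u in units if u["status"] != "out_of_scope"]
    non_respondents = [u for u in eligible if u["status"] != "responded"]
    return len(non_respondents) / len(eligible)

print(f"Unit non-response rate: {unit_non_response_rate(units):.0%}")  # 50%
```

Weighted variants and item non-response rates follow the same pattern, with design weights or item counts replacing the simple unit counts.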
5. Leading principles, main inputs

As part of the learning process, we intended to apply the leading TQM principles in the development of the quality assessment framework.

Customer in focus: In this programme statisticians were our customers and main partners. First we concentrated on understanding their way of working and their way of thinking about quality requirements, measurement and assessment; then we proceeded with capacity building to improve common understanding in the development process. We intended to take costs and benefits into consideration from their point of view.

Involvement of people: In line with our main principle of building on existing knowledge and traditions, we involved each department in mapping the traditional quality measurement tools. The development was carried out in projects involving experts from different departments. The training courses were open to all applicants and interested people, and the quality concepts were applied in other methodological courses as well. Special training courses were organised for the statisticians of the other institutions of the official statistical service. We used a wide range of communication tools, such as the intranet, the internet, presentations at different meetings (Statistical Council, Committee of Statistics, managerial fora) and the organisation of fora on quality.

Systems approach: We developed a strategic plan to build up the envisaged system.

Factual approach to decision making: The acceptance of the developed standard tools and methods, as well as of the working system, was a long procedure in each case. First the project team prepared the proposal and circulated it for opinions among the departments. Then it was put on the agenda of the Board on Methodological Issues, where the representatives of all departments discussed it. Finally it was discussed in the Strategic Development Council, on whose proposal the president of the office took his decision.

The main external inputs were: Eurostat support (quality working group meetings, AMRADS, ESTP courses, the Eurostat pilot project on quality, the DatQAM grant, and the results of previous programmes such as the LEG on Quality, the Handbook on improving quality by analysis of process variables, and quality coaching); the Eurostat requirements for quality reports in the different domains; the CoP self-assessment and peer review with their follow-up reports; NSI practices and experiences from Germany, Austria, the Netherlands, Italy, Spain, Slovenia, Portugal, Sweden, the UK and Finland; and, from the DatQAM handbook, the chapter on the strategy for the implementation of data quality assessment and its examples.

6. Conclusions

The development of an internal quality framework has been a necessary step towards a systematic quality assurance system. Management commitment and the external requirements and motivation from Eurostat have proved necessary, in addition to the broad involvement of the staff in the development process. The transparency of the development and adoption procedures was assured, as was communication within the HCSO and the official statistical service; however, more emphasis should have been put on practical training. The real achievement will come if, according to the plans, quality assurance becomes an integrated part of the HCSO planning system and the action plans emerging from the results of quality assessment become part of the working plans. The main challenges to overcome are still the quality-related administrative burden and the link between performance and quality measurement.

The HCSO did not have a leading role in systematic quality work, so it could make use of the experience of others, which accelerated our development process. However, the integration of our local knowledge, the adaptation to the HCSO environment and especially the change in the way of thinking and the development of the necessary competences in the broad community of statisticians all need time if a merely formal implementation is to be avoided.

REFERENCES

Baracza, G., Ercsey, Zs., Ábry, Cs. 2009. Metainformation System of the Hungarian Central Statistical Office. Hungarian Statistical Review, Special Number 13, pp. 103-128. http://www.ksh.hu/statszemle_archive/tartalom2009.html#iss_K13

Colledge, M., March, M. 1993. Quality Management: Development of a Framework for a Statistical Agency. Journal of Business and Economic Statistics 11(2): 157-165. http://www.jstor.org/pss/1391367

Colledge, M., Dalén, J., Georges, J., Lindén, H. 2006. Improving Statistical Quality Assessment in Ten Countries. European Conference on Quality in Survey Statistics (Q2006), Cardiff, UK. http://www.statistics.gov.uk/events/q2006/downloads/W08_Colledge.doc
Csereháti, Z. 2006. Multiple Donor Imputation Techniques in the Survey of Internet Service Providers in Hungary. European Conference on Quality in Survey Statistics (Q2006), Cardiff, UK. http://www.statistics.gov.uk/events/q2006/downloads/W06_Cserehati(A).doc

Eurostat 2007. Handbook on Data Quality Assessment – Methods and Tools (DatQAM). Written by Körner, T., Bergdahl, M., Elvers, E., Lohauss, P., Mag, K., Nimmergut, A., Sæbø, H.V., Szép, K., Zilhão, M.J. http://epp.eurostat.ec.europa.eu/portal/page/portal/quality/quality_reporting

Földesi, E. 2006. Introduction of Eurostat Standard Quality Indicators in the Hungarian Central Statistical Office – Dilemma. European Conference on Quality in Survey Statistics (Q2006), Cardiff, UK. http://www.statistics.gov.uk/events/q2006/downloads/T16_Foldesi.ppt

Györki, I. 1996. Survey Control as a Subsystem of the Statistical Information System (GÉSA). Seminar on Integrated Statistical Systems and Related Matters (ISIS ’96).

HCSO 2005. Strategy of the HCSO, 2005-2008. HCSO, Budapest. http://epp.eurostat.ec.europa.eu/portal/page/portal/pgp_insite/insite_docs/hu/HCSO_STRATEGY2005_2008.doc

HCSO 2006. Annual Report on the Strategy of the HCSO (2006). http://portal.ksh.hu/pls/ksh/docs/bemutatkozas/eng/strategy2006.pdf

HCSO 2009a. Strategy of the HCSO, 2009-2012. HCSO, Budapest. http://portal.ksh.hu/pls/ksh/docs/bemutatkozas/eng/strategy2009-2012.pdf

HCSO 2009b. Quality Policy of HCSO. HCSO, Budapest. http://portal.ksh.hu/portal/page?_pageid=38,605809&_dad=portal&_schema=PORTAL

HCSO 2009c. Quality Assurance Framework: Case of the Hungarian Central Statistical Office. United Nations Statistics Division, National Quality Assurance Frameworks, December 2009. http://unstats.un.org/unsd/dnss/QAF_comments/HCSO.pdf

Lisai, D. 2007. Choosing and Implementing a Quality Management System at Statistics Sweden. Master’s thesis in Mathematical Statistics, Stockholm. http://liu.diva-portal.org/smash/record.jsf?pid=diva2:17405

Mag, K. 2006. Process Approach of the Eurostat Standard Quality Indicators. European Conference on Quality in Survey Statistics (Q2006), Cardiff, UK.

Pap, I. 2004. Data Warehouse (DWH) Approach at the Hungarian Central Statistical Office (HCSO). Joint UNECE/Eurostat/OECD Work Session on Statistical Metadata (METIS), Geneva, 9-11 February 2004. http://www.unece.org/stats/documents/2004/02/metis/wp.17.e.pdf

Szép, K., Mag, K., Vigh, J. 2006. Quality of Statistics in the Strategy of HCSO. Statistical Days, Radenci, Slovenia. http://www.stat.si/radenci/program_2006/C_Szep.doc