Sandra Kröger
Georg-August-Universität Göttingen
Humboldtallee 3
37073 Göttingen
Germany
e-mail: sandra_kroeger@gmx.net
Number of words (text): 7,956

Abstract

Many scholars support the idea that the Open Method of Coordination (OMC) fosters learning processes. This article takes a closer look at the OMC in the field of poverty and social exclusion, and more particularly at its indicators, in order to evaluate empirically whether this assumption is well founded. Particular attention is given to the choice of indicators, the information they carry, their comparability and data availability, and the processes leading to the development of the indicators. It finally argues that in order to make the OMCs in the social field more meaningful, more competences must be given to the EU level.

Keywords: OMC, learning, indicators, poverty.

Do numbers induce learning? Assessing the Laeken indicators of the Open Method of Coordination1

1. Introduction2

In the last few years, the Open Method of Coordination (OMC) in the field of social policy has attracted considerable attention amongst social scientists, mostly in the context of the European Employment Strategy (EES)3, and to a considerably lesser extent in the fields of inclusion, pensions and health. Most analyses have focused on conceptual and theoretical issues4; fewer were concerned with in-depth analysis of the actual implementation of the various OMCs5; equally few have analysed the theoretical assumptions of the OMC or its concrete construction at EU level. Generally speaking, the OMC was – and mostly still is – seen as a promising governance instrument in the absence of more binding mechanisms to bring about the "modernisation" of the so-called European Social Model (ESM) by initiating mutual learning processes. Yet, it is argued here, in order to have an appropriate understanding of what the OMC can potentially do for "social Europe", it is essential to analyse the theoretical assumptions and the technical toolbox of the OMC at EU level and the processes surrounding their development. The OMCs have mostly been conceptualised as enhancing steering through "learning" – a precise definition of what "learning" should denote is mostly missing. This is certainly related to the Lisbon Conclusions which introduced the OMC, and to several political EU texts since, in which one can continuously find a strong focus on learning6. In the light of absent sanctioning mechanisms within the OMCs, it is assumed that "learning" can lead to policy change – and reform – through a shift in policy actors' understandings of social problems and their solutions7. However, strong empirical support for de facto learning processes induced by the OMC inclusion is missing. To the contrary, those who have invested in in-depth empirical research mostly conclude that the OMC inclusion has so far not delivered on either of the expected fronts, that is on effectiveness and legitimacy (Idema and Kelemen, forthcoming; Kröger 2006).

1 Paper originally presented under a different title at the young researchers workshop of ESPAnet in Bath, 12 April 2005.
2 I am grateful for critical and constructive comments from Milena Büchs, Hester Kan, Stephan Lessenich, Ramon Pena-Casas, Geny Piotty and John Veit-Wilson.
3 As examples, see Büchs (2005) and Goetschy (2003).
4 See Kröger 2005 for the inclusion process.
5 Exceptions include Büchs (2005) for the EES as well as Atkinson et al. (2002), Friedrich (2006) and Kröger (2006) for the inclusion process.
Indeed, one can wonder why the toolbox of the OMCs and its non-bindingness should promote "learning" processes in a politically highly sensitive policy area where further integration was and remains judged undesirable for reasons of institutional diversity and political and ideological disagreements8. Research often seems to neglect that we are dealing with a political process, in which different ideas, interests and resources are at play, and which is embedded in other, "stronger" political processes such as EMU or the Stability and Growth Pact (Scharpf 2002). In the following, I will deal with the commonly agreed indicators and their development in the context of the OMC inclusion and evaluate whether they can contribute to supranational learning processes. First, they constitute the "hardest" element of the OMC inclusion and can therefore best be evaluated. Second, they contribute to the framing of a given social problem and are therewith indicative of the current approach to social policy matters. Third, as will be shown, they point to the diverse difficulties that the soft governance approach of the OMC faces in the social field. Particular attention will be given to 1) the choice and content of indicators; 2) the information that the indicators transport; 3) comparability and data availability; as well as 4) the processes leading to the development of the indicators. Why are these four issues central in the "learning" context of the OMC? As for content, the main question is whether the choice of indicators is broad enough to draw a picture of the diverse aspects of poverty. If one assumes that different aspects of poverty interact with one another and can be mutually reinforcing, then a wide range of indicators needs to be taken into consideration. With respect to the second point, the main question of concern here is whether the indicators provide the information necessary to learn about the reasons for the respective performances, or, in other words, whether the indicators allow for the contextualisation of performances. Thirdly, in order to support learning processes, it is clear that the data need to be comparable, available and timely. Finally, if "learning" is to happen, it is essential that all actors concerned have the possibility to contribute to the debate, and that the process be open and transparent (Friedrich 2006). In the empirical section, I will review how far these four points are met by the Laeken indicators. After a brief review of the OMC inclusion process (2), the second part will review theories of learning and deliberation which have been used in the scientific community to conceptualise the OMC (3). The next section will confront these theories with the empirical reality of the commonly agreed indicators, their choice and development as well as the actors involved (4). Finally, the discussion will review why it might not (yet) be possible to conceive of the OMC as promoting learning processes and to what extent there remain structural limitations to supranational learning in the social policy field (5).

6 The Lisbon Conclusions imagine the OMC inclusion as "periodic monitoring, evaluation and peer review organised as mutual learning processes" (para 37).
7 See Berghman et al. 2003; Jacobsson 2002 and Trubek and Mosher 2003 among many others.
8 Disagreements about how to conceive of the social sphere and how to organize it institutionally and financially.
In addition to academic literature and policy documents, the analysis particularly draws upon in-depth interviews with involved actors9.

2. Introducing the OMC inclusion

In 2000, the Lisbon Council decided to officially install a governance instrument10, the OMC, which was seen as respecting the principle of subsidiarity11 in social matters in the EU while nevertheless leading member states in the direction of a convergence of performances and the "modernisation" of the ESM through ongoing processes of exchange, comparison, learning12 and benchmarking13. While diverging in some respects, most OMCs have since integrated common features such as the writing of National Action Plans (NAPs) by the member states, evaluative Joint Reports by the Commission and the Council, the gathering of data on the basis of commonly agreed indicators, peer reviews, potentially benchmarking, and the exchange of best practices. The four common objectives that were agreed upon for the social inclusion process at the Nice Council in December 2000 were: 1) access to employment and to resources, rights, goods and services, 2) prevention of the risks of exclusion, 3) help for the most vulnerable and 4) mobilisation of all relevant bodies. While these common objectives14 certainly reflect a multi-dimensional approach to social exclusion, they are very broad and leave considerable space for interpretation and implementation. Since its inauguration in 2000, member states have produced three NAPs (2001, 2003 and 2005), and the Commission respectively three Joint Reports (2002, 2004 and 2005). In June 2000, the Social Protection Committee (SPC) was established by the Council15. The SPC is meant to be the connecting mechanism between member states, the Council and the Commission with respect to the "modernisation" of the social protection systems. Its purpose is therefore broader than the issues at stake in the context of the OMC inclusion. Soon after its coming into existence, it created an indicator sub-group (ISG), which was and remains responsible for the elaboration and further development of common indicators.

9 See footnote 20.
10 Whereas the OMC was introduced in Lisbon, elements of it such as guidelines, benchmarking and peer review had already been in practice for a longer time.
11 Article 3b of the Treaty on the European Union states that "the Community shall take action, in accordance with the principle of subsidiarity, only if and insofar as the objectives of the proposed action cannot be sufficiently achieved by the Member States and can therefore, by reason of the scale or effects of the proposed action, be better achieved by the Community".
12 For a broad overview of the respective literature, see Zeitlin, "The Open Method of Coordination in Action: Theoretical Promise, Empirical Realities, Reform Strategy", in: Zeitlin, Jonathan, Pochet, Philippe and Lars Magnusson (eds.) 2005, The Open Method of Coordination in Action: The European Employment and Social Inclusion Strategies, Brussels: P.I.E.-Peter Lang, to be found at: http://eucenter.wisc.edu/OMC/open12.html.
13 For the origins of the concept of "benchmarking" in the economy as well as in the EU, see Arrowsmith et al. (2004) and de la Porte, Pochet and Room (2001).
14 It should be mentioned that they are spelled out in greater detail in the respective official EU document: Objectives in the fight against poverty and social exclusion (EC: 2001/C 82/02).
The so-called Laeken indicators were adopted by the Employment and Social Affairs Council in December 2001 and were therefore only used in the second and third rounds of NAPs. While these developments are considerable achievements, there has been stagnation with respect to quantified targets (Greve 2002: 11), which would – to some degree – permit an evaluation of compliance16. Yet, "if the process is to be meaningful and credible, targets are essential" (Atkinson et al. 2004: 66), as they are a proof of political commitment and a goal against which to measure progress. Without quantified targets, the intended benchmarking process also becomes impossible17. There has equally been no dynamic with respect to the further development of the common objectives.

3. Situating the OMC in the learning literature

Many different approaches have been offered in order to come to grips with the OMC. Nowadays, they are commonly referred to as the governance literature. This literature is inspired by theories of learning and deliberation (Sabel and Cohen 1997), of policy transfer (Dolowitz and Marsh 1996), of networks (Kohler-Koch 2001), of diffusion, and of naming and shaming (Trubek and Mosher 2003). Most researchers, however, have conceptualised the OMC as a learning instrument, particularly in the direction of ideational, cognitive and discursive learning (Jacobsson 2002; Overdevest 2002; Tucker 2003). This conceptualisation focuses particularly on the interactions amongst the involved actors. These interactions, so the expectation goes, can lead to policy change, namely by leading the concerned actors to modify the interpretations of their interests. The idea of learning can be traced back to the 1960s (Deutsch) and the 1970s (Heclo). In the 1990s, Hall strongly influenced the academic debate, defining policy learning as a "deliberate attempt to adjust the goals or techniques of policy in the light of consequences of past policy and new information" (Hall 1993: 278). Hall speaks of learning when policy change occurs after such a process and differentiates between first order (the settings of policy instruments), second order (the instruments or techniques themselves) and third order (the paradigm or overarching goals) learning. Some years later, the concept of a directly deliberative polyarchy was presented and has received broad attention since, also in the OMC literature (Sabel and Cohen 1997). The authors suggest that the steering of political systems might be better achieved through ongoing local learning processes than through centralised regulation. Consequently, they propose a reconfiguration of the traditional political (state) powers and their rights and duties, mainly shifting formal and material responsibilities to local units while leaving the dissemination of information and the distribution of financial means to central agencies.

15 See the Council Decision of 29 June 2000 on Setting up a Social Protection Committee (2000/436/EC). The SPC subsequently gained a Treaty basis in 2003.
16 The Common Outline for the NAPs 2003-2005 of the SPC invited member states to set targets that are ambitious but achievable, relevant, intelligible, quantified, measurable, and time specific (SPC 2003, Appendix I).
17 Ramon Pena-Casas has pointed out to me that a "pure benchmarking" process was not the aim of the exercise (and impossible in light of the principle of subsidiarity). While this is certainly right for the implementation of the OMC, benchmarking was and is nevertheless foreseen by the institutional equipment of the OMC/incl., and with it, the possibility for naming and shaming to happen.
Competition between the different local units about the "best model" would be supported by processes of mutual comparison and benchmarking. A central criticism of this model concerns its blindness towards the political character of learning in a public environment (de la Porte and Pochet 2003). Therefore, according to de la Porte and Pochet, the "DDP approach does not seem to add much value to understanding the OMC" (de la Porte and Pochet 2003: 8). More directly concerned with the social OMC processes in the EU, Hemerijck and Visser identified a whole set of reasons why processes of learning might not lead to any, or to improved, policy change. These include the assessment that learning is neither a sufficient nor a necessary condition for policy change; that learning from other countries is but one possible factor amongst others for change, and not necessarily the most important one; and that there is no reason to believe that learning necessarily improves performances, particularly if it does not rely on one's own experiences. They also noticed that "poorly developed evaluation methods tend to stand in the way of effective learning" (Hemerijck and Visser 2003: 17). Research has additionally shown that fundamental political, administrative, institutional and cultural aspects are often neglected, overlooked or forgotten in the analysis of learning processes, for example problems of collective action, diverging interests and the low predictability of outcomes, to name only a few (Barbier 2004; Kröger 2005). Trubek and Mosher, concerned with the functioning of the EES, introduced an important distinction when analysing potential supranational learning processes: one can either look at policy change once it has occurred and try to trace it back to learning processes, or one can analyse whether the instrument which is expected to support learning processes is well equipped to do so. Turning to the latter scenario, they suggest that learning can take place where public and private actors are brought together in deliberative problem-solving settings; where policy networks are enlarged; where decentralised experimentation is encouraged; where information on innovation is precise and commonly available; and where actors are encouraged to compare their results with those of the best performers in any area (Trubek and Mosher 2003). Addressing the Laeken indicators, Mabbett defended the learning conceptualisation of the OMC by differentiating between learning and evaluation, suggesting that an "evaluative exercise has to resolve problems of power and authority" (Mabbett 2004: 2) and thereby imposes certain "demands on the use of indicators which a learning orientation does not" (ibid.)18. The first point broadly pertains to the issue of subsidiarity and more concretely to who should have the authority to evaluate member states' performances. While the Commission was given the task of writing the Joint Reports, this does not mean that it is free to write whatever it wants19 (Idema and Kelemen 2006). The second point has to do with the choice of indicators. For evaluation and learning processes alike, it is not sufficient to rely exclusively on outcome-oriented indicators, as these do not take into account the larger environment in which a given performance is achieved: "The promise of benchmarking as a powerful tool of learning can be undermined by the elevation of quantitative criteria over more complicated issues to do with context and processes" (Arrowsmith et al.
2004: 312; see also de la Porte, Pochet and Room 2001: 292 and Kutsar 2000: 3). While distinguishing learning from evaluation has analytical value, this need not mean that learning does not depend on supportive mechanisms, binding rules and common definitions, nor that hierarchies are absent in such a logic. Indeed, there are strong arguments which speak against the conceptualisation of the OMC as a learning instrument. The first is that the OMC takes place in a political environment, that it is a truly political exercise in which different actors pursue different interests and goals, grounded in diverging if not opposed ideas and equipped with diverging amounts of resources (Radaelli 2004). Nowadays, this political environment is strongly influenced by the demands of European Monetary Union, the process and effects of "negative integration" (Scharpf 1996) and the supply-side focus of labour market policies. Secondly, the structure of the process is not supportive of enduring collective learning as it does not ensure that knowledge and experience are widely shared, discussed and evaluated (Casey and Gold 2005; Idema and Kelemen 2006). Thirdly, there is a strong tendency to neglect that the involved actors are embedded in a larger institutional and organisational environment and are therefore not free to learn whatever they may wish.

4. The Laeken indicators: a basis for learning20

Social indicators can be an important tool for evaluating a country's level of social development, for assessing the impact of policy, and for addressing social inequalities and the structural grounds, dimensions and degrees of social exclusion. A lot depends on how they are constructed and with which intentions. In Atkinson's words, one has to ask "what is the objective underlying an indicator and how does this influence the definition to be adopted?" (Atkinson 2002: 10). Within the context of the OMC, one idea sees the indicators as establishing a common language for the discussion of social policy issues and clearly as performance indicators (Atkinson et al. 2002: 19). Another idea is that a set of common indicators can advance the agenda of the social inclusion process. While social indicators certainly can add value to the analysis, evaluation and development of social policies, there are equally a number of difficulties with respect to the usage of common indicators, particularly in a supranational context such as the EU, which should not be forgotten21.

18 This does not mean that no power hierarchies are active in learning processes.
19 Particularly in 2001, the Commission had to deal with severe critique of its draft Joint Report from all member states for pronouncing too direct and harsh a critique of their NAPs and policy approaches and attempting to categorise them hierarchically.
20 In what follows I will use information gathered in seven in-depth interviews with members of the ISG, six of whom were (EU-15) member state representatives, among them the president of the ISG, and one from the secretariat provided by the Commission. The interviews were conducted between September and November 2005. Even though not directly used here, in order to come to an appropriate picture of the process under review, I can also draw upon information gathered in six in-depth interviews with (EU-15) members of the SPC, five of whom were from member states and one from the secretariat provided by the Commission. Additionally, I have received eight answers to an open questionnaire from EU-25 members of the SPC. This material was gathered during the same period.
21 Kröger has listed and explained these difficulties: the question of definition, of measurement, of taking into account time and space, of data availability, of the political character of indicators and of who constructs them (Kröger 2004).
During the 1970s and 1980s, the EU's concept of poverty came quite close to Townsend's (1970) notion of poverty: a relative concept, to be measured against the given standards of a given society and the means necessary to participate therein. By the 1990s, and owing to the academic22 and political influence stemming from French researchers and the then President of the Commission, Jacques Delors, the concept of poverty, with its focus on monetary exclusion, became less influential in the EU while that of social exclusion gained in prominence and acceptance. Social exclusion was seen as capable of grasping the multidimensional aspects of the phenomenon as well as its relational, agency and dynamic sides. Furthermore, it seemed to place greater weight on the structural causes of different forms of individual deprivation23, thereby increasingly focusing on the "barriers and processes by which people are excluded" (Greve 2002: 11) from social, political and cultural rights, a tradition going back to the concept of citizenship as developed by Marshall24. With the changing conceptualisation of poverty and social exclusion, the methods used to measure the related phenomena changed as well (from national household budget surveys to multi-dimensional and intertemporal data instruments (panel studies)).

4.1 Choice of indicators

How are these developments mirrored in the OMC inclusion? Both the common objectives and the subsequently adopted Laeken indicators provide a provisional answer, the indicators being the harder instrument as they are supposed to be the basis for evaluation, benchmarking and naming and shaming. The indicators suggested in October 2001 by the SPC and endorsed at the Laeken Council in December 2001 are divided into primary, secondary and tertiary indicators, the last being a voluntary domestic exercise. The ten primary indicators consist of:

22 Prominent authors on the definition of social exclusion include Serge Paugam, Robert Castel, Rob Atkinson, Ruth Levitas, Ruth Lister or Chiara Saraceno. For an overview of the shift from "poverty" to "social exclusion" and the respective debates, see Atkinson and Davoudi (2000), Todman (2004) or Goguel d'Allondans (2003).
23 Not all scholars perceive this shift as a positive one, and Veit-Wilson came to the opposite conclusion when stating: "The well documented change in Eurospeak terminology from poverty to social exclusion in the 1980s reflected a deliberate politically-driven expedient shift in discourse from politically-sensitive structural causes to politically-anodyne victim consequences" (Veit-Wilson 2003: 6).
24 Room has noted the following elements in the switch from poverty to social exclusion: 1) from financial to multi-dimensional disadvantage; 2) from a static to a dynamic analysis; 3) from a focus on the resources of the individual or household to a concern also with those of the local community; 4) from distributional to relational dimensions of stratification and disadvantage; and 5) from a continuum of inequality to catastrophic rupture (Room 2000).
1. Low income rate after transfers, with the low income threshold set at 60% of median income (with breakdowns by gender, age, most frequent activity status, household type and tenure status; as illustrative examples, the values for typical households);
2. Distribution of income (income inequality S80/S20);
3. Persistence of low income;
4. Median low income gap;
5. Regional cohesion (based on employment);
6. Long-term unemployment rate;
7. People living in jobless households;
8. Early school leavers not in further education or training;
9. Life expectancy at birth, split by gender;
10. Self-perceived health status.

The eight secondary indicators are:

11. Dispersion around the 60% median low income threshold after transfers (40%, 50% and 70% of median);
12. Low income rate anchored at a point in time;
13. Low income rate before transfers;
14. Distribution of income (Gini coefficient);
15. Persistence of low income (based on 50% of median income);
16. Long-term unemployment share within total unemployment;
17. Very long-term unemployment rate;
18. Persons with low educational attainment, by age and sex25.

25 http://europa.eu.int/comm/employment_social/soc-prot/soc-incl/index_en.htm.

Since the adoption of these 18 indicators in December 2001, two indicators have been added to the list: in-work poverty and the number of children living in jobless households. What does the choice of these indicators say about the framing of poverty and social exclusion in the context of the OMC inclusion? Out of the ten primary indicators, seven are income (4) or unemployment (3) related, while out of the eight secondary indicators, seven are income (5) or unemployment (2) related. This adds up to 14 out of 18 indicators being income or unemployment related, two relating to education and two to health. There are no indicators on housing, drug abuse, early (child) pregnancy, released prisoners, ethnic minorities and migrants, consumption, or social and political activities. Atkinson et al. have noted these gaps in coverage: "Major gaps in the areas and topics covered at this stage – recognized by the SPC and its ISG – reflect a combination of data unavailability and absence of clear conceptual underpinning in particular areas" (Atkinson et al. 2004: 59), particularly with respect to housing (see also Greve 2002: 11) and homelessness. Whereas the indicators of this OMC were continuously on the agenda in 2001-2002, they were discussed only five times during 2003 and 2004, and four times in 200526. This decrease in importance can largely be attributed to the emergence of other OMCs, namely those on pensions (late 2001) and health (2002), but also to the fact that the "easy" questions were resolved during the first two years of work whereas the difficult issues remain unsolved and more or less on the agenda27. The discussions since 2003 have particularly focused on non-monetary / deprivation indicators, indicators on housing and homelessness as well as on ethnic minorities and immigrants. Other regular points of the meetings included the passage from the European Community Household Panel (ECHP) to EU-SILC (Statistics on Income and Living Conditions), the review of the indicators used in the national reports, how to link the indicators more closely to the objectives, and the preparation of the streamlined OMC on social protection and social inclusion, beginning in 2006.
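To make the income-based indicators listed above more concrete, the following sketch computes three of them (the low income rate at 60% of median income, the S80/S20 quintile share ratio and the Gini coefficient) on a small invented income vector. It is purely illustrative: the function names and sample values are my own, and the official indicators are calculated on equivalised disposable household income with survey weights, which are omitted here.

```python
import statistics

def at_risk_of_poverty_rate(incomes, share_of_median=0.6):
    """Share of persons below a cut-off set at a fraction of median income.

    Mirrors the logic of the first Laeken indicator (60% of median), but omits
    the equivalence scale and survey weights used in the official methodology.
    """
    threshold = share_of_median * statistics.median(incomes)
    return sum(1 for y in incomes if y < threshold) / len(incomes)

def s80_s20(incomes):
    """Quintile share ratio: total income of the richest 20% over the poorest 20%."""
    ranked = sorted(incomes)
    k = len(ranked) // 5
    return sum(ranked[-k:]) / sum(ranked[:k])

def gini(incomes):
    """Gini coefficient computed from the rank-weighted sum of sorted incomes."""
    ranked = sorted(incomes)
    n = len(ranked)
    rank_weighted = sum((i + 1) * y for i, y in enumerate(ranked))
    return 2 * rank_weighted / (n * sum(ranked)) - (n + 1) / n

# Hypothetical toy sample of ten annual incomes, for illustration only.
sample = [8000, 12000, 15000, 18000, 22000, 26000, 31000, 40000, 55000, 90000]
print(at_risk_of_poverty_rate(sample))  # 0.2: two incomes fall below 60% of the median (14400)
print(s80_s20(sample))                  # 7.25
print(round(gini(sample), 3))           # roughly 0.38
```

The point of the sketch is simply that these are outcome measures computed on a distribution of incomes; nothing in them records the policies, inputs or processes behind a given distribution, which is precisely the limitation discussed in section 4.2 below.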
Finally, it is an open question whether the ISG should concentrate on finding and defining indicators or whether it should also analyse the results28, which would give it greater political weight. There are different reasons why these issues have remained on the agenda instead of being resolved. The first reason is that, contrary to the indicator group on employment, this indicator sub-group opted for a high degree of comparability, rendering consensus building more difficult29. With respect to the deprivation indicators, it has been particularly the French delegation speaking up against them, arguing that such indicators would distort the picture, whereas other delegations consider that such indicators are the only way of getting indicators in particular areas at all, namely for housing issues: "The French are strategically wrong in rejecting it. Because if we don't do it, there will be no indicators at all on, for example, housing"30. If this area, and particularly homelessness, has not yet become an object of consensus, this mainly has to do with the lack both of a commonly accepted definition and of data: "Indicators on homelessness and housing are important, but very difficult. First, because there is no adequate data available and second because it is unclear how the quality of housing should be judged (ex: lack of heating in northern and southern member states)"31. The question of ethnic minorities and immigrants was rendered difficult by the respective legislation of France and Portugal, which does not foresee the possibility of gathering data split up by national origin or ethnic ascription. Moreover, ethnic minorities vary in (quantitative) importance across the different member states, as does their social stratification. In June 2005, a preliminary consensus was found foreseeing strict guidelines for future reporting on these individuals, including their work situation32. There is thus a strong focus on employment- and income-related indicators, which generally tend to be adapted to the EES: "These indicators have a clear and distinct overlap with the employment strategy" (Greve 2002: 11). Social exclusion as framed through the Laeken indicators mainly appears as exclusion from the labour market and thereby exclusion from an income of one's own. While both are without any doubt fundamental for integration into society, they do "not tell us everything we need to know about the resources or living standards of households" (Atkinson et al. 2004: 61), an argument that points in the direction of more non-monetary indicators (Greve 2002).

26 This information stems from the official work programmes of the respective years, documents obtained through informal contacts.
27 Interview with a member of the ISG, 25.11.2005.
28 Interview with a member of the ISG, 25.11.2005.
29 Interview with a member of the ISG, 25.11.2005.
30 Interview with a member of the ISG, 17.11.2005.
31 Interview with a member of the ISG, 14.11.2005.
32 This was reported by two interviewees and can also be found in the report presented under the Luxembourg Presidency by Marlier et al. An immigrant is now defined as someone who crossed the border and has a different nationality (than EU).

4.2 Restriction to performance indicators

The SPC, and later the Council, opted for indicators that address final outcomes and thus integrate a static definition of social exclusion, while addressing neither the processes leading to it (Greve 2002: 11) nor its structural causes.
This choice has to do with respect for the principle of subsidiarity33: "The aim of the EU indicators is to measure social outcomes, not the means by which they are achieved" (Atkinson 2002: 8). Atkinson et al. bring in another argument: "Focusing on outcomes may also foster a co-operative attitude between the different national bodies – ministries, agencies, etc. – that have competencies in these areas, whereas as far as inputs are concerned they may be more inclined to see competition for resources as a zero-sum game" (Atkinson et al. 2004: 51-52). While this argument may have some validity, it should not be overemphasised, as the degree of politicisation of the OMC and its public visibility are very low. More importantly, outcome indicators do not represent the result of public policies alone, but reflect a multitude of interconnected developments in economy and society; isolating the effects of single policies is therefore hardly possible. This is a strong argument for indicators of policy inputs and outputs, and of policy processes, without which the Laeken indicators will "hardly themselves furnish the comparative policy-relevant information that such learning – and the sharing of good practice – will properly require" (Room 2004: 6). It is noteworthy that originally the Social Affairs Council had foreseen both "performance indicators" and "policy indicators" (Friedrich 2006). If the indicators themselves do not furnish this information, maybe the NAPs, the texts which the governments have handed to the Commission on a biennial basis, can compensate for this shortcoming? So far, these have turned out to be governmental reports rather than critical reviews of past policies or strategic planning of future policies (Idema 2004). Just as importantly, there are two additional shortcomings: for one, the indicators presented in the annex to the NAPs do not necessarily correspond to the policy measures presented in the text, and vice versa. Second, the information about a policy provided by the general text as well as by the annexed so-called good practices is by far insufficient to understand what the policy is about, how it functions, how it is financed, which actors are involved and so on (Kröger 2006). In other words: there is an important mismatch between the indicators and the text which is supposed to contextualise them. This is not to say that they are without any value: the indicators chosen may very well enable the member states and the EU alike to evaluate whether member states are moving in the same direction or not, as well as, in theory, serve as a tool for naming and shaming processes. Indicators may equally "serve to combat the tendency for national and European social policies to be developed in parallel universes" (Mabbett 2004: 15). It just will not be possible to say why member states moved in one direction or another. Yet for learning processes, this explanation is rather essential. The secretariat of the Commission and some national representatives are aware of this shortcoming and see room for indicators that are more responsive to political action: "There is room for policy or process or input measurement as well"34.

33
34 Interview with a member of the ISG, 21.10.2005.

4.3 Comparability and data availability

In order to learn in a supranational context, it is essential that the available data be comparable. For comparable data to exist, situations of poverty and social exclusion need to be somewhat comparable.
However, what might be considered as producing social exclusion, or as a state of exclusion, at one given point in time in one given region of the world (or the EU) by no means corresponds with a definition given in another region of the world (or the EU) at the same time: "People in different countries thus experience unemployment or poverty in very different social contexts. It is therefore possible that comparability across countries may, paradoxically, require the questions to differ across member states" (Atkinson et al. 2002: 177; de la Porte and Pochet 2003: 26). Some examples can be instructive. Employment rates may be lower in Italy than in the United Kingdom; this does not necessarily mean that fewer people have a paid job in Italy than in the UK, as informal networks may work in favour of some sort of labour market integration in Italy. A homelessness rate of 3% would be a scandal in the Scandinavian countries, as many people would probably die in the cold of the winter, whereas in Spain or Portugal, even though still politically a scandal, it rarely threatens people's lives and therefore might not be as big a priority: "What counts as unacceptable housing in Finland in the winter may be accepted in Cyprus or Portugal"35. Having two friends in rural Ireland may be all one can hope for on the Emerald Isle, whereas it may be a sign of social isolation in Greece. Some member states have rather well developed minimum assistance schemes allowing the unemployed to continue to live in dignity, whereas others do not. In short, the question of comparability in a supranational context is closely linked to the concept of adequacy, which "often varies from country to country, as it depends on specific cultural, social, environmental, and economic factors, such as the climatic differences between the North and the South of Europe" (Atkinson et al. 2002: 159). And evidently, the diverging traditions and institutions as well as the different climates not only render the interpretation of the data more difficult, but also the agreement on common indicators. One interviewee focused on the cultural differences making comparison difficult: "Furthermore, there is research evidence that quite small changes in the wording of questions on self-reported health (even in the listing of questions) can produce different results. The other problem is that cultural differences determine answers more than underlying health status and this makes cross-country comparisons difficult"36. Again, the interpretation of the data, necessary for learning processes, needs accompanying information about policies, budgets and laws, supposedly provided in the NAPs. As of now, there have been important differences between the year of data gathering and the policies described in the NAPs, leading to a situation where "countries will be reporting on the position some years in the past" (Atkinson et al. 2002: 183) and rendering learning processes difficult, meaning that the presented performances largely cannot be explained by the presented text. Also, the definition of an indicator "may have changed in the meantime even if there has been no change in policy" (ibid.; see also Kutsar 2000: 3). The problem of timely data has largely been acknowledged by the member states, leading to the introduction of EU-SILC, which is expected to solve it but will take another decade before producing data that can be compared over time.

35 Interview with a member of the ISG, 21.10.2005.
Finally, some data simply do not exist (yet), a challenge acknowledged by all interviewees, particularly with respect to homelessness, disabilities and illiteracy: "The data poses quite a bit of a problem"37.

4.4 The consultation process

The literature on governance, deliberation and learning emphasises the importance of the exchange of arguments in order to come to viable solutions and an acceptable shared understanding of the common good (Sabel and Cohen 1997). Another, more functional, argument reads that the more (and the more diverse) the actors involved in the policy-making process, the more effective the problem-solving strategies should be. This is why it is important to evaluate the sort and degree of participation and consultation in the process of indicator construction. It is possible to distinguish three types of consultation, namely institutionalised consultation of the member states and the Commission, semi-formal consultation of external academic or statistical experts (such as the OECD or the Atkinson group, see below) and informal consultation of NGOs (Friedrich 2002). Consultation of the member states happens nine to ten times a year in the official meetings of the ISG in Brussels. The OMC inclusion, however, need not be at the centre of attention at every meeting (other issues being the OMCs on pensions and health, for example). These meetings last half a day to a day and normally there are several items on the agenda. Particularly since enlargement38, it is thus easy to imagine that not all delegations can speak up on all items: "Out of 25 member states, maybe ten speak up during a meeting"39. According to all interviewees, these tend to be particularly Luxembourg, Belgium, France, Italy and the United Kingdom40. The EU-10 member states in particular are reported to be "silent". This state of affairs, however, is not reported to reflect coalition building of EU-15 vs. EU-10 member states41. With respect to the second group, the degree of participation had a lot to do with Frank Vandenbroucke, the former Belgian Federal Minister of Social Affairs and Pensions, who had made the indicators one of his top priorities for the Belgian Presidency of the EU (second half of 2001). Besides individual experts from the OECD, other DGs or Eurostat, it was particularly academic advice that Vandenbroucke encouraged and supported, and this resulted, in autumn 2000, in the setting up of a group of high-level academic experts, called the 'Atkinson group' after its chair, Tony Atkinson42. The Atkinson group was in regular contact with the ISG and drafted the final report for the SPC43; it also opened up the discussion to a broader academic network. Throughout the consultation process, the academic experts encountered clear rules of participation and were actively encouraged to participate by the Belgian Presidency (Friedrich 2002).

36 Interview with a member of the ISG, 21.10.2005. As an example, the interviewee cited the notoriously good humour of the Irish, who could still answer that everything is fine while being sick, as against the rather pessimistic character of the Germans, with a tendency to exaggerate sufferings.
37 Interview with a member of the ISG, 21.10.2005.
38 Note, however, that the EU-10 member states have participated in the ISG as guests since 2003.
39 Interview with a member of the ISG, 25.11.2005.
40 Member states that were mentioned once are Finland, Poland, Hungary, the Netherlands and Germany.
41 Note, however, that the ISG interviews were all conducted with EU-15 members of the ISG.
The visible highlight of Vandenbroucke's efforts – besides the adoption of the 18 indicators at the Laeken Council – was an international conference on "Indicators for Europe. Making Common European Objectives Work", held in mid-September 2001 and organised by the Belgian Presidency. It seems, however, that since the adoption of the Laeken indicators, consultation has involved less academic advice than in the first two years: "The first Atkinson report (2001/2002) has strongly influenced the ISG as the ISG only came into place then. Now, the work of the ISG has strongly influenced the second report (Marlier et al. 2005). The great value is that it puts on the table what the ISG is doing"44. No formal participation rules existed for NGOs, and officials' attitudes towards their participation seem to differ widely. While European NGOs showed political interest in influencing the indicators, they met considerably fewer facilitating conditions for participation than the academic experts45, which stands at odds with common objective no. 4 ("mobilisation of all relevant bodies") and with the aim of fostering the exchange of good practices, which are rather based on the ground where NGOs operate than at governmental levels. Apparently, some degree of access was possible upon request, but without (clear) rules of participation (Friedrich 2002).

42 Members were Tony Atkinson, Bea Cantillon, Eric Marlier, and Brian Nolan.
43 Its work was published a year later (Atkinson et al. 2002).
44 Interview with the secretariat of the ISG, 21.10.2005.
45 But it should also be mentioned that such a development seems rather unlikely, as people / representatives coming from "civil society" and / or working in NGOs – let alone the excluded – most of the time simply lack the technical know-how necessary for the construction of such indicators. Yet this should not be used as an argument for their exclusion from the process.

Both the EAPN46 and FEANTSA had informal access to the ISG and discussed several papers and the interim report: "FEANTSA, they try to participate all the time, trying to influence the process, wanting to make contributions. They have been in the ISG" – in contrast to EAPN47. They equally contributed their own evaluations and reports and organised several round tables. Yet the scope of the impact of their participation remains at best unclear. The perception of who established the contacts between the sub-group and the NGOs, either the ISG's secretariat or the NGOs, also differs. Whereas official actors have stressed the role of the secretariat, NGOs have a more critical view with respect to the demand for their participation (Friedrich 2002). The degree of 'openness', i.e. the extent to which non-public actors were incorporated into the definition process, thus varied significantly. It is fair to conclude that the Laeken indicators have been developed by quite closed groups. The influence of NGOs was limited48 and that of socially excluded people non-existent, whereas academic expert participation was extensive and important for the final output, as the adoption of Atkinson's proposal to use three levels of indicators as well as the adoption of the suggested methodological principles proves. One can wonder whether an increased consultation of NGOs would have meant a different set of indicators in the end; but without clear modes of consultation, this option was excluded from the beginning. As Atkinson et al.
have argued, it is essential to "intensify the efforts to engage a wide range of social actors", in particular not viewing the excluded as passive participants but as active actors: "Those suffering from social exclusion should co-determine how exclusion should be measured" (Atkinson et al. 2004: 63-64)49. Such an approach would not only increase the input legitimacy of the EU, but also profit from valuable knowledge coming from "experts on their own matter". There are diverging assessments of the nature of the discussions. While some interviewees stated that there is a tendency towards consensus building and finding ways of coming together, others estimated that "in the ISG, bargaining takes place more often" (than in the SPC)50. The latter evaluation matches the findings of Jacobsson and Vifell, who analysed the functioning of various social and economic committees of the EU. They found some evidence for cooperative deliberation in general discussions; things were different when it came "down to the formulation of recommendations or the exact definition of indicators, the discussion in the committees or in the bilateral consultations with the Commission, takes the form of pure negotiations and bargaining. Member States try to anticipate the recommendations and influence the exact wordings to make them nationally acceptable" (Jacobsson and Vifell 2004: 20). Not only does bargaining replace arguing but, according to these authors, power relations also replace good arguments in sensitive issue areas (ibid.). When ISG members were asked whether they had observed learning effects, answers diverged quite markedly. Two interviewees were rather sceptical, stating that "many member states don't have learning as their first interest"51 or that no discussion is taking place and that things that are not reported at the meetings are completely neglected52. Three interviewees particularly mentioned positive effects in the domestic context, accelerating the discussions around the definition of indicators and improving the system of data gathering, particularly but not only in the new member states. Several interviewees stated that there is a structured comparison of the different systems leading to an increased knowledge about other social systems. However, there was also a critical voice estimating that by now there is so much information that the risk exists of getting lost in it.

49 And they go on to cite the EAPN: 'The best indicators are those which gauge changes in the everyday lives of people living in poverty and social exclusion. Such indicators can only be defined through a participatory method which involves a close cooperation between them and researchers' (EAPN, cited in Atkinson et al. 2002: 187).
50 Interview with a member of the ISG and SPC, 27.10.2005.
51 Ibid.
52 Interview with a member of the ISG, 25.11.2005.

5. Discussion

The aim of this article was to see whether the Laeken indicators of the OMC inclusion can be seen as fostering social learning and deliberation in a supranational environment.
It became clear that these indicators may, if at all, serve to support learning processes in the fields of employment, unemployment and income, but not in other areas, as indicators there are either simply missing or not supported by adequate information on policies or by process indicators. The latter have so far not been introduced into the process as they conflict with the principle of subsidiarity, which leaves the means by which to achieve certain policy goals up to the member states. I have also shown that for different reasons – lack of available and timely data, lack of critical assessment of the domestic policies and performances in the NAPs by governments, and important institutional, cultural and geographic differences, to name some of them – comparability of the indicators is not (yet) assured. Yet without comparability of the data, learning must remain quite vague. Finally, I have drawn attention to the consultation process, where the analysis shows that not all actors concerned have been consulted to a sufficient degree, a central precondition for learning to happen. To conclude, let me turn to some more general remarks about the issue of supranational learning in the field of social policy. One of the basic ambiguities inherent in the OMC is that policy coordination needs to follow a top-down logic if it seriously attempts to counterbalance economic integration, while the learning processes which indicators are supposed to support require more of a bottom-up logic and would seem necessarily to include a broader range of actors and indicators produced on the ground (Room 2004)53. For learning to happen, there would need to be a greater commitment of key actors, and much more space would need to be dedicated to peer review and the contextualisation of best practices (Schludi 2003: 7), as performance benchmarking does not reveal causal relationships between policies and performances and therefore does not foster learning processes. Instead, this kind of benchmarking tends to be concerned (ideally) with target setting and quantitative measurement, encouraging participants to manipulate the evidence. It seems as if, in order to give the inclusion indicators greater weight, the issue of subsidiarity must be addressed. A soft policy coordination instrument such as the OMC inclusion is not capable of addressing questions of power or of distributional or judicial justice, and suffers from the absence of binding regulations (Scharpf 2002). If the EU is to become a social actor capable of counter-balancing the negative and disintegrating effects of globally marketised economies, then it is misguided in treating member states as if they were at the centre of attention (Room 2004: 12). In this sense, there is a need to complement the soft elements of the OMC with more coercive mechanisms in order to ensure some degree of compliance. If interested in fostering learning processes, member states should also ensure that these processes do not merely produce individual, ad hoc and interrupted learning effects, but organisational and institutional learning (Casey and Gold 2005). It might be time to question more widely the learning assumption of soft governance modes – very difficult to operationalise for empirical research anyhow (de la Porte and Pochet 2003) –
and see it instead as part and expression of the political process: might both the member states and the Commission not accentuate this component precisely in order to reassure themselves continuously that no competencies are being taken away and that the EU is doing something in the social field (Idema and Kelemen 2006), while in reality not much learning has occurred so far through the social OMCs (ibid.; Kröger 2006)?

53 Which is encouraged by the Commission, but not respected by the governments.

References

Arrowsmith, James, Sisson, Keith and Paul Marginson, 2004, "What can 'benchmarking' offer the open method of co-ordination?", in: Journal of European Public Policy, Vol. 11 (2), 311-328.

Atkinson, Rob and Simin Davoudi, 2000, "The Concept of Social Exclusion in the European Union: Context, Development and Possibilities", in: Journal of Common Market Studies, Vol. 38, no. 3, 427-448.

Atkinson, Tony, 2002, "Social Europe and Social Science", unpublished manuscript of the 13th ESRC Annual Lecture, to be found at: http://www.nuff.ox.ac.uk/users/atkinson/.

Atkinson, Tony, Cantillon, Bea, Marlier, Eric and Brian Nolan, 2002, Social Indicators: the EU and Social Inclusion, Oxford: Oxford University Press.

Atkinson, Tony, Marlier, Eric and Brian Nolan, 2004, "Indicators and Targets for Social Inclusion in the European Union", in: Journal of Common Market Studies, Vol. 42, no. 1, 47-75.

Barbier, Jean-Claude, 2004, "'Open methods of coordination' and national social policies: what sociological theories and methods?", Paper presented at the RC19 international conference, Paris, 2-4 September 2004.

Büchs, Milena, 2005, Dilemmas of post-regulatory European social policy coordination: The European Employment Strategy in Germany and the United Kingdom, PhD dissertation.

Casey, Bernard H. and Michael Gold, 2005, "Peer review of labour market programmes in the European Union: what can countries really learn from one another?", in: Journal of European Public Policy, Vol. 12 (1), 23-43.

de la Porte, Caroline, Pochet, Philippe and Graham Room, 2001, "Social benchmarking, policy making and new governance in the EU", in: Journal of European Social Policy, Vol. 11 (4), 291-307.

de la Porte, Caroline and Philippe Pochet, 2003, "The OMC Intertwined with the Debates on Governance, Democracy and Social Europe", Research on the Open Method of Co-ordination and European Integration prepared for Minister Frank Vandenbroucke, Minister for Social Affairs and Pensions, Belgium, http://eucenter.wisc.edu/OMC/Papers/delaportePochet.pdf.

Dolowitz, David P. and David Marsh, 1996, "Who Learns What from Whom: a Review of the Policy Transfer Literature", in: Political Studies, Vol. XLIV, 343-357.

Commission of the European Communities, 2001, European Governance, White Paper, COM (2001) 428 final, 25.07.2001, Brussels.

Commission of the European Communities, 2002, Joint Report on Social Inclusion – Part I: The European Union (including summary), Commission proposal no. 13926/01, Brussels.

Commission of the European Communities, 2003, Joint Report on Social Inclusion, Communication from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions, COM (2003) 773 final.

European Council, 2001, "Objectives in the fight against poverty and social exclusion" (EC: 2001/C 82/02).

European Council, 2000, "Council Decision of 29 June 2000 on Setting up a Social Protection Committee" (2000/436/EC).

Friedrich, Dawid, 2002, "The Open Method of Co-ordination: Bringing the Union Closer to the People?
Participation in the process of defining indicators for the OMC on social inclusion", Master's thesis, MSc programme "European Social Policy Analysis" (MESPA), University of Bath, UK.

Friedrich, Dawid, 2006, "Policy Process, Governance and Democracy in the EU: the Case of the Open Method of Coordination on social inclusion", in: Policy & Politics, Vol. 34 (2), 367-383.

Goetschy, Janine, 2003, "The European Employment Strategy, Multi-level Governance, and Policy Coordination", in: Zeitlin, Jonathan and David M. Trubek (eds.), Governing Work and Welfare in a New Economy: European and American Experiments, Oxford: Oxford University Press.

Goguel d'Allondans, Alban, 2003, L'Exclusion sociale. Les métamorphoses d'un concept (1960-2000), Paris: L'Harmattan.

Greve, Bent, 2002, "Is a supranational strategy for social inclusion possible?", Research papers from the Department of Social Sciences, Institut for Samfundsvidenskab og Erhvervsøkonomi, Research paper no. 06/02.

Hall, Peter, 1993, "Policy paradigms, social learning, and the state: The case of economic policy making in Britain", in: Comparative Politics, Vol. 25 (3), 275-296.

Hemerijck, Anton and Jelle Visser, 2003, Policy Learning in European Welfare States, unpublished manuscript, October 2003.

Idema and Kelemen, 2006, "New Modes of Governance, the Open Method of Coordination, and other Fashionable Red Herring", forthcoming in Perspectives on European Politics and Society.

Jacobsson, Kerstin, 2002, "Soft Regulation and the Subtle Transformation of States: The Case of EU Employment Policy", SCORE Working Paper 2002/4, Stockholm Center for Organizational Research, Stockholm.

Jacobsson, Kerstin and Åsa Vifell, 2004, "Towards Deliberative Supranationalism? Analysing the Role of Committees in Soft Co-ordination", preliminary chapter for Linsenmann, Meyer and Wessels (eds.), Economic Governance in the EU, Palgrave Macmillan.

Kohler-Koch, Beate, 2001, "European Networks and Ideas: Changing National Policies?", in: European Integration online Papers, 6 (6).

Kröger, Sandra, 2004, "The commonly agreed indicators in the context of the OMC/incl.: Challenges and limitations", Paper prepared for the French-German workshop on "Governance, Law and Technology", Centre Marc Bloch, Berlin, 7 December 2004.

Kröger, Sandra, 2005, "Coming to grips with soft governance: a conceptual framework for analysing the Open Method of Coordination in the field of poverty and social exclusion", Paper prepared for the Summer School 2005 of the Postgraduate Programme "The Future of the European Social Model", University of Göttingen, 18-21 July 2005.

Kröger, Sandra, 2006, "When Learning Hits Politics Or: Social Policy Coordination Left to the Administrations and the NGOs?", in: European Integration online Papers, 10 (3).

Kutsar, Dagmar, 2000, "Problems of Comparability of Sociodemographic Indicators between Developed Countries and Countries in Transition", Paper prepared for the seminar "Brainstorming on Social Indicators", Strasbourg, 14-15 September 2000, Council of Europe and Friedrich Ebert Stiftung.

Mabbett, Deborah, 2004, "Learning by numbers: The role of indicators in the social inclusion process", Paper prepared for the ESPAnet Conference, Oxford, 9-11 September 2004.

Marlier, Eric et al., 2005, Taking Forward the Social Inclusion Process, report written for the Luxembourg Presidency.

Radaelli, Claudio, 2004, "Who learns what?
Policy learning and the open method of coordination", Paper prepared for the ESRC seminar series "Implementing the Lisbon Strategy: Policy learning inside and outside the open method", European Research Institute, University of Birmingham, 26 November 2004.

Room, Graham, 2004, "Benchmarking Indicators, Policy Convergence and Political Choice", Paper prepared for the ESPAnet Conference, Oxford, 9-11 September 2004.

Room, Graham, 2000, "Social exclusion, solidarity and the challenge of globalisation", in: International Journal of Social Welfare, Vol. 9 (2), 103-119.

Sabel, Charles and Joshua Cohen, 1997, "Directly-Deliberative Polyarchy", in: European Law Journal, Vol. 3, no. 4, 313-340.

Scharpf, Fritz W., 1996, "Negative and Positive Integration in the Political Economy of European Welfare States", in: Marks, Gary et al., Governance in the European Union, London: Sage, 15-39.

Scharpf, Fritz W., 2002, "The European Social Model: Coping with the Challenges of Diversity", in: Journal of Common Market Studies, Vol. 40, no. 4, 645-670.

Social Protection Committee, 2003, Common Outline for the NAPs / inclusion 2003-2005, Brussels.

Todman, Lynn C., 2004, "Reflections on Social Exclusion: What is it? How is it different from U.S. Conceptualizations of Disadvantage? And why Americans might consider integrating it into U.S. social policy discourse", unpublished manuscript.

Trubek, David M. and Joshua S. Mosher, 2003, "New Governance, Employment Policy, and the European Social Model", in: Zeitlin, Jonathan and David M. Trubek (eds.), Governing Work and Welfare in a New Economy: European and American Experiments, Oxford/New York: Oxford University Press, 33-58.

Tucker, Christoph M., 2003, "The Lisbon Strategy and the Open Method of Coordination: A New Vision and the Revolutionary Potential of Soft Governance in the European Union", Paper presented at the 2003 Annual Meeting of the American Political Science Association, Chicago, 28-31 August 2003.