
The heuristic and holistic synthesis of large volumes of qualitative data: the TLC experience.

Phil Hodkinson, Gert Biesta, Denis Gleeson, David James and Keith Postlethwaite

Paper Presented at

RCBN 2005 Annual Conference

Cardiff, 22nd February 2005

Abstract

The Transforming Learning Cultures in FE project faced three daunting problems in making sense of its qualitative data. Firstly, there was the sheer volume of data: over 700 taped and transcribed interviews, notes from between 100 and 200 observations, and 16 detailed tutor diaries. Secondly, the research team consisted of 30 people, academic and FE based, located in 8 different institutions. Thirdly, there was a problem of scale of analysis – making sense of 16 learning sites, of the individual tutors and students within them, and of the more macro FE context. This paper explains why we did not adopt some of the well-known conventional approaches to qualitative analysis based upon the coding of transcripts. It goes on to describe, explain and evaluate our alternative process. This was staged, beginning with individual site descriptions and some individual student and tutor stories. We then heuristically used two 'instruments' to analyse both the learning cultures of these sites and the interventions which changed the cultures and learning in them, before finally moving beyond these instruments to develop an overarching theoretical position, together with an integrated account of learning, presented differently at different scales of examination. At its heart, we would describe our approach as one of collective and collaborative interpretation and synthesis, rather than the more common approach of analysis.

Contact details

Prof Phil Hodkinson,

The Lifelong Learning Institute,

Continuing Education Building,

University of Leeds,

Leeds, LS2 9JT,

UK.

Tel: 0113 343 3223

Email: p.m.hodkinson@leeds.ac.uk


Introduction

This paper has two purposes that are interwoven. The first is to explain the solution to a logistical problem encountered in the Transforming Learning Cultures in Further Education (TLC) project. That is, how to make sense of the huge volume of qualitative data that the research was generating, together with overlapping challenges rooted in a large, diverse and dispersed research team. The second is to use these specific problems to address one of the key points of contention in debates over educational and social science research methodology in the last 20 years. This point of contention is the extent to which methodology can and should be objective and neutral, rather than directly contributing to the construction of 'findings' from the data. Here, we are addressing that macro issue through a specific focus on what is commonly termed data analysis. The first part of this paper establishes the nature and background to these two related problems before going on to describe and explain the approaches adopted by the TLC. We then conclude with a more general discussion about the second, and deeper, methodological issue.

The TLC Project

Understanding the logistical problem faced by the TLC project requires an understanding of the nature of the project, what it was trying to do, and how it was organised (see Hodkinson and James, 2003, for a fuller account). Our starting point was that learning in FE, as elsewhere, is complex, and we set out to research that complexity, rather than to focus on one or two key variables. In the TLC we use the term 'culture' to indicate these complex relationships (James and Diment, 2003; Hodkinson et al., 2004a, b). The project aimed to examine, within a variety of settings, what a culture of learning is and how it can be transformed, based upon an acceptance that 'learning and thinking are always situated in a cultural setting, and always dependent upon the utilization of cultural resources' (Bruner, 1996, p 4). To conceptualise this, we turned to the work of Pierre Bourdieu (e.g., Bourdieu, 1977; 1998; Bourdieu and Wacquant, 1992; Grenfell and James, 1998). Bourdieu's theory-as-method provides a relational approach to learning that emphasises the mutual interdependence of social constraint and individual volition. Social practices are understood as having both an objective and a subjective reality at one and the same moment. Complex human relations and activities can be understood via theoretical tools that enable the 'unpacking' of social practices in social spaces: examples of these 'tools' include the notions of habitus (i.e., a collection of durable, transposable dispositions) and field (a set of positions and relationships defined by the possession and interaction of different amounts of economic, social and cultural capital). Habitus and field are mutually constituting, a point of considerable practical importance to the way that the actions of tutors, students and institutions are studied and understood. Put more concretely, our starting assumption was that learning would depend upon the complex interactions between the following factors, amongst others:

 Students' positions, dispositions and actions, influenced by their previous life histories

 Tutors' positions, dispositions and actions, influenced by their previous life histories

 The nature of the subject, including broader issues of 'disciplinary identity' and status, as well as specifics such as syllabus, assessment requirements, links with external agencies or employers, etc.

 College management approaches and procedures, together with organisational structures, site location and resources

 National policies towards FE, including qualification, funding and inspection regimes

 Wider social, economic and political contexts, which inter-penetrate all of the other points.

To organise data collection, we adopted nested case studies. Four case study FE colleges were selected and the design of the project negotiated with their principals and key staff. Each college was paired with one of the four host universities in the project. Within each college, four specific sites of learning and teaching were identified, providing 16 sites across the whole project. By 'site' we meant a location where tutor(s) and students worked together on learning. The sample is not representative of the whole of FE provision, but it does provide a wide enough range to allow either significant variations between sites, or significant common issues across them, to be identified. The main tutor in each site was funded for two hours a week to participate in the research. These 'participating tutors' attended regular meetings and workshops with their host university/college research team, were encouraged to keep reflective log books or diaries, and to observe each other's sites. They were encouraged to innovate as the research progressed, and where new approaches were attempted the research provided on-going evidence of what happened.

In addition to the participating tutors, each local research team had three core members: one of the project directors, nominally for one day per week; a half-time academic researcher, employed by the university; and an FE practitioner/researcher, seconded for two days a week to work on the project. In addition to working with the participating tutors, these core researchers interviewed about 6 students per site twice a year, using semi-structured interviews, and observed the practice in each site on regular occasions. Observations were unstructured. Participating tutors were also regularly interviewed, and given periodic feedback about what the research was showing about their particular site and more general issues across the project as a whole. They also kept detailed diaries for the duration of the project. In addition to these 16 qualitative case studies (which eventually became 17, as one participating tutor left and was replaced by another in a different site), the TLC also used regular questionnaire sweeps, to generate a broader picture of the sites. One director and one part-time researcher worked exclusively on this part of the project. In this paper, it is the qualitative work that is addressed. It should be noted that this was also a project with a relatively long time frame – four years. This meant that we could track changes effectively, but it was itself one of the reasons for the volume and complexity of data that were generated.

In approaching the analysis of this data, the TLC faced some difficult problems, each of which is also a strength. Firstly, the sheer volume of data, which gives us such rich and detailed pictures of learning, is overwhelming. By the conclusion we will have about 600 student interviews, 100 tutor interviews, 16 log books, over 100 observation notes, notes from local team meetings and discussions, interviews with a small number of college managers, etc. Our second problem came from the size and diversity of the core research team – 14 people, all part-time, with different professional roots and identities, split across four geographically distant partnerships. This gave a valuable depth of understanding to all our work, as sometimes contrasting perspectives were blended. However, it does make the core team difficult to manage, and there are tensions when some members feel that their perspectives or needs are marginalised. All team members had to balance their TLC activity against the rest of their working and family lives.

The Problem of Method

As qualitative research progressed, early approaches to methodology were rooted, knowingly or unknowingly, in standards previously set for quantitative work (Denzin and Lincoln, 2000). That is, much of the methods literature attempted to address the holy trinity of validity, reliability and generalisability, especially validity. For many, the adoption of a rigorous method was seen as the main way to preserve objectivity. That is, to establish the credibility of qualitative findings, through an assurance that they represented a true picture of the subject being researched, rather than a biased personal perspective of the researcher. Either explicitly or implicitly, these approaches assumed a realist position, namely, that there is a real world out there, separate from the researcher, and the job of researchers is to discover what that real world is like, through rigorous objective method. Thus Glaser and Strauss (1967) and later Strauss and Corbin (1997) developed grounded theory, using the method of constant comparison to arrive at a single, true understanding, as part of bottom-up theory construction. We are not concerned here with this ongoing debate in its broadest sense, though some of us have addressed these issues elsewhere (Biesta and Burbules, 2003; Hodkinson, 2004). Here we focus explicitly on analysis.


Most of the realist approaches to the analysis of qualitative data take the term analysis literally.

Analysis means to examine in detail to discover meaning, but also to break something down into its components or essential features. That is, the almost standard way of approaching realist analysis is to break down interview transcripts, observation notes etc. into component parts, for example through coding. One argument is that this forces the researcher's attention to detail, helping to avoid being biased by first impressions. Often, as in grounded theory (Strauss and Corbin, 1997) or more eclectic analysis guides, such as Miles and Huberman (1994), this initial coding is followed by one or more analytical algorithms – set and predetermined procedural protocols that work on the coded data to develop patterns and produce both understanding and the single most plausible and verifiable truth. Such algorithms, it is claimed, tame the subjectivity of the researcher, allowing the data to speak almost for itself. The skill of the researcher is to choose the most appropriate algorithms, to encourage this to happen. It also helps, from this perspective, if more than one researcher is involved. If, say, three researchers all apply the same appropriate algorithms, and then agree on the correct interpretation of the results, the outcomes are arguably as objective as qualitative research can be.
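To make the contrast concrete, the 'code and count' logic that underpins such procedural approaches can be caricatured in a few lines of code. The sketch below is purely illustrative and is not drawn from the TLC: the file layout, coding frame and keyword matching are invented, and real coding (whether manual or software-assisted) is far more subtle. It simply shows the kind of fragmentation into counted categories that this section describes.

```python
# Illustrative sketch only: a crude rendering of "code and count".
# The folder layout, coding frame and keywords are hypothetical,
# not the TLC's actual data or procedures.
from collections import Counter
from pathlib import Path

# A hand-built coding frame: category -> indicative keywords.
CODES = {
    "assessment": ["assessment", "exam", "grading", "portfolio"],
    "self-esteem": ["confidence", "self-esteem", "proud"],
}

def code_transcripts(folder: str) -> Counter:
    """Count how many transcript segments fall under each code."""
    counts = Counter()
    for path in sorted(Path(folder).glob("*.txt")):
        # Treat each blank-line-separated paragraph as one segment.
        for segment in path.read_text(encoding="utf-8").split("\n\n"):
            lowered = segment.lower()
            for code, keywords in CODES.items():
                if any(keyword in lowered for keyword in keywords):
                    counts[code] += 1
    return counts

# e.g. code_transcripts("interviews/") might yield
# Counter({"assessment": 376, "self-esteem": 423})
```

As we argue later, the difficulty is not producing such counts, but knowing what several hundred decontextualised 'hits' per category could tell us about learning cultures.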

However, there is an alternative type of approach, which is interpretative rather than realist.

Within this second approach, rather than a biased individual whose subjectivity is to be tamed, the researcher is seen as a person who actively constructs a meaningful story out of the data, maximising the benefits of their existing experience and insights. The emphasis here is on research as construction, rather than discovery (Smith and Deemer, 2000). Thus, Wolcott (1994) focuses on transforming qualitative data – changing it into something meaningful. Moustakas (1990) writes of what he terms heuristic analysis – making sense of data through immersion, then standing back and allowing the sub-conscious to work. As Colley (2001) suggests, synthesising data into a constructed story may be a better way of describing what is involved than is the term analysis. The TLC directors were more closely aligned to this interpretivist approach than to the alternative of coding etc. Our decision to use Bourdieu's thinking to structure the research project signalled this standpoint. Implicit in that decision was the understanding that the research was framed within a particular way of viewing the world, even if, as we have always asserted, we wanted to use the research to challenge our pre-assumptions. We return to this broader debate about methodology at the end of this paper. Before doing so, we focus on the TLC's approach to the two practical problems we faced.

Analysing or Synthesising the TLC data

We begin this section hypothetically. Had the TLC team been wedded to a more conventional procedural analysis of data, the scale and complexity of the project would have offered amazing hope, underpinned by practical impossibility. The hope would come from the size and diversity of the research team. If we could get 14 people, including five leading academics, five professional researchers and four seconded FE practitioners to completely agree about every finding generated from such a large and diverse sampling base, then those agreed truths would have been robust indeed. The impossibility arose from the very things that would have demonstrated our success.

A diverse and dispersed research team is arguably like any other similarly sized diverse and dispersed team. Agreement is seldom total, and shared understandings have to be worked for.

At different times, all of us have felt slightly out of step with what has been agreed, and occasionally one or more of us has felt significantly out of step. We had to work hard not to resemble the committee that was asked to design a horse but came up with a camel.

Agreements were often compromises, often strongly led by those with most power. In general, we found that we could agree most major, broad-brush findings more easily than the detail.

Often, that detail had to be taken partly on trust, because in every case, those researchers who had collected the data, and then their two close geographical colleagues, could determine what the details meant in ways that the others lacked the evidence to challenge.


In the archetypal realist approach, these sorts of difficulty are resolved through coding and algorithms. However, in this case, such an approach would have been almost impossible. Most coding approaches require the ongoing refinement of the codes or categories, and they have to be agreed and uniformly applied by the whole research team. This might have been feasible within each geographical partnership of three people, independently of the other groups. It would have been logistically impossible across all 12 qualitative researchers taken together.

The next source of impossibility arises from the sheer volume of data collected. Even using the most sophisticated software programmes available, the time taken coding all our interviews and observation notes would have been colossal. And what would we have done then? The numbers of data chunks in each category would also have been massive. How do you make sense of, say, 376 coded references to assessment, or 423 to self-esteem? What algorithms could we have chosen that were capable of universal application to this massive volume of data, and which would have helped us identify the truths in our data about how to improve learning?

Furthermore, how could we have balanced our twin objectives of progressively deepening understanding and also tracking changes, within such a coding approach? How many different frames of categorisation and potential combination would we have needed?

Of course, we never intended to try such a thing, but were faced with identifying an alternative that was almost as daunting. Many of the better known interpretivist approaches to data transformation or synthesis require a single researcher or small team to get to know all the data very well. For us, this was also impossible. The ways that we worked in geographically separated teams added a further risk – that we would have four different TLC projects – even without the issues of integrating the survey, which will be written about elsewhere.

Our final problem, which we return to later, was our decision to focus on complexity, culture and interrelationships. For this, analysis was little help and a potential distraction. It was not the parts that we were interested in, but how they reflexively related to each other. We had more to learn from ethnography (Wolcott, 1999) – but how to connect what were potentially 17 different ethnographies of learning sites?

In next describing and explaining what we did, we are employing hindsight. Some of what follows was planned and worked largely as intended, but much of it was developed as the project progressed. This was a project over four years, attempting several things which, to our knowledge, had not been done before. The methodology described below, and the findings that it generated, were co-produced by the research participants working together, drawing upon our contacts with the colleges and participating tutors, and our on-going immersion in the FE sector.

This was qualitative research as art (Wolcott, 1995), continually working out the next stage in the process, rather than having a detailed and unchanging pre-prepared plan to follow. This was a reflexive journey, as we constantly returned to the research objectives and questions and to our established but also developing principles as we worked out what to do next.

Synthesis through Stages

In essence, our approach to synthesis was through a series of stages, which are described separately. In practice they overlapped and were not followed precisely in sequence.

Site Learning Cultures

We had decided, when writing the research proposal, that the prime unit of data organisation was the learning site. The first stage of synthesising our data was, therefore, to construct accounts of the learning cultures in each of our 16, later 17, sites. This began as soon as our first sweep of data collection was complete. Each local team of three core researchers went through all the relevant data in detail, and produced a detailed account of the site. These accounts were lengthy (around 15,000 to 20,000 words) and contained interview quotations and extracts from observations and notebooks to evidence and illustrate the account. Each local team approached this task in their own way. In Leeds, for example, the team each read all the available data, then came together for a day workshop discussing one site. At this workshop, we shared our varied impressions of the site, and together identified some agreed provisional organising ideas. The individual responsible for collecting the data then wrote up a first draft, which was further refined by the others. We eventually shared our draft accounts with the relevant participating tutor. This was not primarily a matter of verification. Rather, we wanted them to have access to the insights we had gained. This was especially important, as they did not see the student interview transcripts, for reasons of confidentiality. In addition, we were interested in their reactions to the accounts, and the ensuing discussions helped us further improve them. For example, we might see something differently because of the tutor input. Also, the tutor's reaction to part of the account added to our understanding of them as teachers, and of their interactions with the site. In the second round of data collection, these site accounts were further refined and modified, usually by becoming even longer. As the project continued, work on the site cultures also continued. This work was also informed by the results from the quantitative survey, which, by grouping sites together on the basis of statistical analysis, sometimes confirmed and sometimes subtly changed the ways in which a particular site culture was understood (Postlethwaite and Maull, 2003).
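The survey strand referred to above is reported in Postlethwaite and Maull (2003). Purely as an illustration of the general technique of grouping sites statistically – the sketch is ours, and the site names, scales and scores below are invented rather than the project's actual survey items – such an analysis might look like this:

```python
# Illustrative sketch only: grouping sites by the similarity of their
# mean survey responses. Site names, scales and scores are invented.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

site_names = ["site_A", "site_B", "site_C", "site_D"]
# Rows: sites; columns: hypothetical mean scores on three survey scales.
site_means = np.array([
    [4.1, 3.2, 2.8],
    [4.0, 3.1, 2.9],
    [2.2, 4.5, 3.9],
    [2.4, 4.4, 3.7],
])

# Agglomerative (Ward) clustering on the distances between site profiles.
tree = linkage(site_means, method="ward")
groups = fcluster(tree, t=2, criterion="maxclust")
print(dict(zip(site_names, groups)))  # two groups of similar sites
```

The point of such groupings, for the TLC, was not the statistics themselves but the conversation they provoked with the qualitative site accounts: agreement confirmed an interpretation, while disagreement prompted another look at the data.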

We were always looking for further insights and for either confirmation of our understanding, or for changes to it. In addition, we were monitoring the sites longitudinally – mapping changes of various types. These included tracking any changes specifically introduced by the participating tutors. However, we stopped further extending the detailed accounts, because this work was too time-consuming, and distracted us from other important elements of the synthesis.

Cross-Site Themes

As soon as the first versions of the site culture case studies were written, we looked for possible themes that cut across them. Within the TLC, we saw this as a useful mechanism to pull together small groups of researchers across the geographically localised teams. This was important in avoiding unbridgeable divisions between the geographical clusters. It also contributed to the enhancement of research skills and understanding across the team. This proved problematic, as distance made face to face meetings difficult. Also, some of our early theme choices proved to have more lasting value than others. Eventually, two were worked up and published. The first focussed on the professionality of FE staff, drawing upon the data from our participating tutors (Gleeson et al., In Press). The second examined the impact of what we termed vocational habitus on those vocational courses which sustained close links with employers and workplaces (Colley et al., 2003b). Whilst we regard this work as a valuable contribution to the eventual project outcomes, we felt that we needed to supplement it with other approaches as we worked to transform and synthesise our data. This was partly because there were far too many candidate themes relevant to our research objectives, and the theme papers had proved very time-consuming to produce.

Individual Stories

It was always our intention to focus part of our effort into understanding the individual stories of participating tutors and students, in relation to the sites. Bloomer and Hodkinson (2000) had done this sort of research with students before, analysing their learning careers. In the TLC, we wished to see how learning careers and teaching careers intersected with site cultures.

Developing such individual stories presented us with two problems. The first was an unforeseen bias in our methodology. We were more focussed on individual tutors than on individual students. This was because the participating tutors worked with us and we knew them better; because we had more data on them, including their diaries; and because there was only one of them per site, whilst we were interviewing six or more students. Because of this, we had to make a conscious effort to look at some individual students, whereas the tutors' stories were always close to the forefront of our work. The second problem was that we could not complete any of these stories early in the project, as they continued to develop throughout the fieldwork period. This meant that in the final third of the project time, developing individual stories competed with the other major stages in our synthesis, described below. At the time of writing, we have one published piece with individual students as a central focus (Davies and Tedder, 2003) and others are under way. At least one of the participating tutors' stories is also well on the way to being ready for publication. However, even though not many of our numerous sample students' and tutors' stories will be published, 'drilling down' below the level of the site to examine individuals was a crucial stage in our synthesis.

Comparing Cultures and Interventions

Apart from the logistical problems of time and priorities, developing students' and tutors' stories was straightforward. However, the third stage in our synthesis process was much more difficult. We had to find ways to blend together our 17 site ethnographies. Our original proposal separated out two linked issues – understanding learning cultures, and developing principles of procedure for the improvement of teaching and learning. In the early part of our work we had concentrated almost exclusively on the first of these. We therefore had to find a way to blend these learning culture accounts, and also to direct our attention explicitly to change – to what we termed 'interventions'. That is, actions by the tutor or others that affected the learning culture and learning in the sites. We used parallel approaches for each of these things, but here we will start with the issue of culture.

We faced three problems. Firstly, each of us knew our own sites well, but had only partial awareness of the others. Secondly, the detailed accounts of each site were too long for anyone to put them all together and make sense of all 17. Thirdly, though we had excellent and detailed accounts of the culture of each site, the understanding of learning that underpinned them was often implicit rather than explicit, and there was no clearly shared theoretical position on learning across the team. Putting all three problems together, we needed to develop short summaries of each site, with an explicitly critical and theoretical focus on learning, based upon a broadly common framework. We did this through the development of what we termed an instrument to analyse learning cultures.

The first stage was to agree the instrument. One of us, Hodkinson, produced the first draft, in collaboration with his Leeds-based colleagues (Helen Colley and Tony Scaife). It was based on several existing approaches to learning as a cultural phenomenon, in the workplace and elsewhere (e.g. Beckett and Hager, 2002; Colley et al., 2003a; Engestrom, 2001; Fuller and Unwin, 2003, 2004; Lave and Wenger, 1991; Wenger, 1998). The Leeds team trialled and refined the draft, which was then discussed in two whole team workshops. It proved contentious, and a lot of effort and time went into developing a version which most TLC core team members could work with. This final version is shown in Appendix I. A look at it will reveal that this is not a conventional instrument, lacking the precision normally associated with such a device. Rather, it consists of a series of headings and associated issues, around which each site account was to be written. The result was 17 'short' site accounts. Each was written by the researcher primarily responsible for data collection in that site, supported by the other members of their local team.

We approached 'interventions' in a similar way. This time, James took the lead, by focussing our thinking on a range of issues derived from the literature on change in education. After two whole team workshops which progressively refined our thinking, we produced the final working version of what we termed the 'Interventions' instrument. This is shown in Appendix II. One of the key distinctions it drew was between interventions, mainly by the participating tutor, and 'intervening events' which impacted upon the site from elsewhere. We agreed that each local team would analyse four different interventions using this instrument, possibly, but not necessarily, one from each of their four sites. For reasons of local logistics, we eventually collected fourteen such interventions. These were discussed by the whole team at another workshop, and then further refined in the light of those discussions.

Synthesising the Instruments

In broad terms, we approached the synthesis of the two instrument analyses in the same way, though in practice the detail of the work was done differently. For reasons of space, we concentrate on the cultural analysis here. Each local team set out to produce short 6-10 page analyses of each of their local sites, using the cultural analysis instrument. The results, though insightful and illuminating, were far from the standardised products we had naively hoped for. They varied in length (4 pages to 20 pages), in focus, and in the ways that the parts of the instrument were understood and interpreted. This variability helped determine the form that the next stage of analysis took. It meant that there was no point in developing some further level of apparently standard technical analysis – the meta-data from the 17 sites were too different in form and content. Also, it was important not to simply reproduce the same headings that had been used in the instrument. To do so would be to reify their significance, and to impose the structure of the instrument on the eventual results. It was better to take a more holistic approach.

This entailed immersion in the 17 accounts, searching for issues, patterns, commonalities and differences, following what Moustakas (1990) termed 'heuristic research'. This immersion would ideally have been done by a small team, working closely together. Logistics prevented this, and Hodkinson carried out this part of the work himself. The next step was to further examine and test out the issues he identified across the whole research team, including going back, where necessary, to the more detailed data and evidence from which each site analysis had been constructed. In the interventions analysis, by contrast, there was more whole team input into the early stages.

This analysis of the learning cultures took two forms. On the one hand, we searched for common factors that seemed to cut across all or most of our sites. There were several of these (Hodkinson et al., 2004b), which together began to construct a picture of the learning culture of the FE sector as a whole. On the other, we looked for ways to group site cultures together – to search for types of learning culture within our data.

There are numerous differences between the 17 sites, many of which could form the basis for such a typology or classification of learning cultures. However, many of the possible ways forward entail focussing on one aspect of the learning culture. Thus, we could classify sites according to the types of student found in them. This is most obvious in relation to gender, for we have sites that were clearly female, clearly male, and mixed. Less clearly demarcated, but arguably no less significant, are issues of social class, ethnicity, age (mature students, under-16s, young people over 16), and educational achievement/ability. There may be very good reason to focus explicitly on one or more of these to see where it leads, but this would only be a good way of classifying learning cultures if we could establish that such issues of student identity were pre-eminent above other interrelated cultural variables, and this does not seem to be the case.

This same problem is replicated for many other possible single-factor classification contenders.

For example, the cultural analyses of sites reveal significant variations in relations with employment. There were at least two types of vocational course – those with close links to employment and those where links were looser. Then there were the non-vocational courses, of many different types. Alternatively, we could classify sites according to the type of tutor-student contact. Thus, there were full-time courses where a group of students worked with a small group of tutors; courses where students and tutor met only for one or two sessions a week, doing other things for the rest of the time; sites where students and tutor met mainly one to one; and one site where tutor and students never met.

There was very little point in following this road, though any of the issues listed might prove valuable lines of research inquiry, as significant elements of FE learning cultures. But this is different from classifying the learning cultures themselves. Our data strongly support the view that no single variable or group of related variables is universally pre-eminent in constructing the learning cultures in all sites, so we need a means of classifying learning cultures that is not reductionist in this way. If we take a cultural and relational view of learning, then it follows that types of relational patterns within the sites are likely to be significant. We identified two ways of doing this, based upon the site analyses themselves, though there may well be more. One was to examine issues of divergence and convergence across the sample as a whole. The other was to identify small clusters of sites, where the learning cultures share significant similarities. Some examples of the results of this approach are given in Hodkinson et al. (2004b).
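The inadequacy of single-factor classification is easy to demonstrate in miniature. In the hypothetical sketch below (the sites and attribute values are invented, not our actual cases), each candidate variable partitions the same sites differently, so no single partition can claim to be 'the' typology of learning cultures:

```python
# Illustrative sketch only: the same (invented) sites partition quite
# differently depending on which single attribute is used to classify them.
from collections import defaultdict

sites = [
    {"name": "site_A", "gender_mix": "female", "employer_links": "close", "contact": "full-time"},
    {"name": "site_B", "gender_mix": "mixed",  "employer_links": "close", "contact": "one-to-one"},
    {"name": "site_C", "gender_mix": "female", "employer_links": "loose", "contact": "weekly"},
    {"name": "site_D", "gender_mix": "male",   "employer_links": "loose", "contact": "full-time"},
]

def partition_by(attribute: str) -> dict:
    """Group site names by the value of a single attribute."""
    groups = defaultdict(list)
    for site in sites:
        groups[site[attribute]].append(site["name"])
    return dict(groups)

for attribute in ("gender_mix", "employer_links", "contact"):
    print(attribute, "->", partition_by(attribute))
# Each attribute yields a different, equally defensible grouping,
# which is why we sought relational rather than single-factor types.
```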


Of the two analyses, the learning cultures work was completed before the interventions work. This was for pragmatic reasons, rather than principle. However, it had one clear benefit, for as the interventions were analysed, we retained a clear awareness of what the learning cultures work was showing. It increasingly became apparent that the two had to be considered together and were interlinked. That is, even though work on the interventions analysis continues at the time of writing (January 2005), increasingly the two analyses are seen as parts of the same overall picture. At this time, results from the quantitative survey were also becoming blended into our overall synthesis. Put differently, we have not needed a formal process to integrate the two separate analyses. Conducting them and considering them across the whole team has itself been a process of further synthesis.

Looking beyond the immediacy of our data: a genealogy of learning in FE

As these analyses were progressing, they alerted us to some empirical issues that could not be directly addressed through our existing data. These issues clustered around a more macro view of FE as a whole, and the need to be able to map changes in the FE learning culture over a longer period than our 3.5 years of fieldwork. These two issues are linked. We were aware that much of the FE research literature writes of the move from Local Authority control to being independent colleges funded through a government quango (originally the FEFC, now the LSC) as transformatory (e.g. Ainley and Bailey, 1997). However, the detailed knowledge of FE in the team, held especially by Gleeson (c.f. Gleeson & Mardle, 1980; Gleeson, 1989, 1993; Gleeson & Shain, 1999), combined with early work of our attached research fellow, Mark Goodrham, suggested that there were continuities across this process of incorporation, as well as changes.

In the period of our research, many of the common elements of FE learning culture seemed to be closely related to post-incorporation policies and managerial approaches. For example, we had identified what one of us, Scaife, termed a ‘culture of the now’ in FE, which seemed to be linked to frequent policy changes and short-termism encouraged by the funding mechanisms. But was the ‘culture of the now’ a much more deep-rooted feature of English FE practice, pre-dating incorporation?

To address these sorts of issue, we embarked on an additional piece of research. A small group of us (Colley, Gleeson, Wahlberg) is working on a genealogy of learning in FE. That is, the group is looking back at documents (both official and research) about the period from about 1980 to the present day. In this task, the focus is on how learning has been conceptualised at different times, and on the links between conceptualisations of learning and other wider cultural influences.

This work will locate the learning culture of our more recent fieldwork in a longer period. As a key question about learning cultures and interventions focuses on the nature of change, this new task will add a vital extra dimension to our eventual synthesis of data.

Research Synthesis as Construction

Discerning readers will already have noticed a correspondence between our overall research approach, the methodology adopted and the major findings produced. Thus, we started with assumptions that learning was complex and relational – in our terms, cultural. This view was reinforced through much of our reading about learning. This paper has already explained how these theoretical lenses influenced our choice of methodology. The resulting outcomes were (i) to confirm the significance of a relational and cultural means of understanding learning; (ii) to produce a new theoretical position, that critically addressed our source literature, as well as making maximum use of our data (see Hodkinson et al., 2004a for an early account of this); and (iii) to produce a critique of the current learning culture(s) in FE, which in turn reveals serious shortcomings in current policy and managerial approaches in the sector and, inter alia, in much current research into teaching and learning.

Once our work is fully published, these achievements could be viewed in two ways, if we first bracket off important questions about whether we conducted the research well or badly, within our chosen approach. A positive reading is that we have constructed a new and potentially valuable way of understanding learning in FE, and for improving it. We hope that this does justice to structural, agentic and serendipitous influences on learning, including a recognition of the significance of power in relation to pedagogy and to definitions of what counts as good learning and teaching. Through this approach, we can help explain many problems that beset current policy and management approaches to teaching and learning in the sector, and explain why they don't work in the ways that their perpetrators hope. We can also point to new approaches to the improvement of learning in the sector. The more negative reading is that our research has not been 'objective' – because we set out with pre-formed and therefore prejudicial views about teaching and learning, and then selected a method that was inherently likely to confirm them.

That is, in deliberately avoiding the most common mechanisms for objectifying qualitative research (coding and algorithms) we allowed our collective biases free rein. We focussed only on readings of our data that reinforced and confirmed those biases, even if we allowed the detail of our position to be changed, sometimes substantially. These two ‘readings’ are interrelated, because our cultural and relational view of learning was the frame within which we were willing to synthesise, and this was signalled within the original bid. This leads to an obvious if hypothetical question – would our outcomes have been significantly different if we had adopted a different starting approach and techniques of analysis – and does this matter? Our answers to these questions would be yes, and no.

Why yes? As has been made clear in the above account, the mode of synthesis we adopted was integral to the production of the research outcomes, and was deeply influenced by our thinking and theorising, some of which was prior to the project. If we had used a very different approach, for example by following the grounded theory procedures precisely (Strauss and Corbin, 1997), at least some alternative issues would probably have surfaced. If we had not started looking at learning as complex and relational, the whole project would have been different. That is, dramatically different approaches to analysis/synthesis are likely to produce at least partly different findings, even from the same data set.

To us, this conclusion seems merely common sense. However, we would go on to argue that it is inconsequential. The huge volume of data collected in this project is capable of producing multiple findings, if it is analysed in different ways and if different researchers focus on different things. This does not mean, of course, that 'anything goes'. We cannot know exactly how many different and valid findings can be generated, but the number is clearly finite, at least in the sense that there are very many views of FE and learning that the data would not support – including, ironically, the view that is currently most dominant, if sometimes implicit, in UK policies towards the FE sector. What matters is not how many other alternative findings could have been produced, or even whether or not ours is the 'best' or most valuable of these alternatives. What matters is whether our findings are fully supported by evidence in the data we collected, make theoretical and practical sense, and offer the potential for improvement in learning if they are applied sensibly.

This takes us to the centre of the methodological debate which we flagged up at the start of this paper. In one sense only, advocates of the use of objective qualitative methodologies have to agree with us that different methods of analysis/synthesis can produce different findings.

Otherwise, what is the point in specifying the elements of a protocol for data analysis? Beyond this, however, their position logically requires, as Glaser and Strauss (1967) suggested, that ultimately there is one true, that is to say most correct, position that can eventually be discovered in any data set. The parallel assumption is that carefully chosen objective methods are neutral or transparent – that they allow this truth in the data to be revealed or uncovered. However, in relation to the TLC, we have argued that following such a line of analysis rather than synthesis might have obscured the all-important relationships between complex factors. For example, not treating site learning cultures as a holistic base for analysis would have made it difficult, especially in the time available, to achieve the levels of synthesis which are the hallmark of the TLC. We deliberately chose methods designed to maximise these sorts of relational insight, and applied them with as much rigour as we could. Alternative analytical approaches would have made the eventual outcomes of the TLC more difficult to achieve. That is, choice of methodology matters, because different methodologies help construct different types of research finding.

Methods cannot be chosen on the grounds that some offer greater neutrality and, therefore, a guarantee of a greater degree of objectivity in the knowledge produced at the end of the process, or because any one approach is inherently superior to another. Any method both enables and constrains the research process, making some outcomes more likely and others less. This is as true of analysis/synthesis as it is of other parts of the research process. As researchers, we have to choose and/or develop the approaches we use, and in so choosing, we influence the sorts of finding that we are able to construct.

Acknowledgements

We wish to thank the Economic and Social Research Council (ESRC) and the Teaching and Learning Research Programme (TLRP) for the funding and support that made this research possible. The project award number is L139251025. We also want to thank the other researchers in the TLC team, who were integral to the development of the project and the processes described here. They are Graham Anderson, Helen Colley, Jennie Davies, Kim Diment, Wendy Maull, Tony Scaife, Mike Tedder, Madeleine Wahlberg and Eunice Wheeler.

References

AINLEY, P. & BAILEY, B. (1997) The Business of Learning: staff and student experiences of further education in the 1990s (London: Cassell).

BECKETT, D. & HAGER, P. (2002) Life, Work and Learning: practice in postmodernity (London: Routledge).

BIESTA, G. & BURBULES, N.C. (2003) Pragmatism and Educational Research (Lanham, MD: Rowman & Littlefield).

BLOOMER, M. & HODKINSON, P. (2000) Learning Careers: continuity and change in young people's dispositions to learning, British Educational Research Journal, 26 (5) 583-598.

BOURDIEU, P. (1977) Outline of a Theory of Practice (Cambridge: Cambridge University Press).

BOURDIEU, P. (1998) Practical Reason (Cambridge: Polity Press).

BOURDIEU, P. & WACQUANT, L. (1992) An Invitation to Reflexive Sociology (Cambridge: Polity Press).

BRUNER, J. (1996) The Culture of Education (London: Harvard University Press).

COLLEY, H. (2001) Unravelling Myths of Mentor: power dynamics of mentoring relationships with 'disaffected' young people (PhD thesis, Manchester Metropolitan University).

COLLEY, H., HODKINSON, P. & MALCOLM, J. (2003a) Informality and Formality in Learning: a report for the Learning and Skills Research Centre (London: Learning and Skills Research Centre).

COLLEY, H., JAMES, D., TEDDER, M. & DIMENT, K. (2003b) Learning as Becoming in Vocational Education and Training: class, gender and the role of habitus, Journal of Vocational Education and Training, 55 (4) 471-497.

DAVIES, J. & TEDDER, M. (2003) Becoming Vocational: insights from two vocational courses in a further education college, Journal of Vocational Education and Training, 55 (4) 515-539.

DENZIN, N. & LINCOLN, Y. (2000) Introduction: the discipline and practice of qualitative research, in N. Denzin & Y. Lincoln (Eds) Handbook of Qualitative Research, 2nd Edition (Thousand Oaks, CA: Sage).

ENGESTROM, Y. (2001) Expansive Learning at Work: towards an activity-theoretical reconceptualisation, Journal of Education and Work, 14 (1) 133-156.

FULLER, A. & UNWIN, L. (2003) Learning as apprentices in the contemporary UK workplace: creating and managing expansive and restrictive participation, Journal of Education and Work, 16 (4) 407-426.

FULLER, A. & UNWIN, L. (2004) Expansive learning environments: integrating organizational and personal development, in H. Rainbird, A. Fuller & A. Munro (Eds) Workplace Learning in Context (London: Routledge).

GLASER, B. & STRAUSS, A. (1967) The Discovery of Grounded Theory (London: Weidenfeld and Nicolson).

GLEESON, D. (1989) The Paradox of Training (Milton Keynes: Open University Press).

GLEESON, D. (1993) Legislating for change: missed opportunities in the Further and Higher Education Act, British Journal of Education and Work, 6 (2) 29-40.

GLEESON, D., DAVIES, J. & WHEELER, E. (In Press) On the making of professionalism in the further education workplace, British Journal of Sociology of Education.

GLEESON, D. & MARDLE, G. (1980) Further Education or Training? A case study in the theory and practice of day-release education (London: Routledge & Kegan Paul).

GLEESON, D. & SHAIN, F. (1999) Managing ambiguity: between markets and managerialism – a case study of 'middle' managers in further education, Sociological Review, 47 (3) 461-490.

GRENFELL, M. & JAMES, D. (1998) Bourdieu and Education: acts of practical theory (London: Falmer).

HODKINSON, P. (2004) Research as a form of work: expertise, community and methodological objectivity, British Educational Research Journal, 30 (1) 9-26.

HODKINSON, P., BIESTA, G. & JAMES, D. (2004a) Towards a Cultural Theory of College-based Learning, paper presented at the British Educational Research Association Annual Conference, UMIST, Manchester, 16th-18th September.

HODKINSON, P., ANDERSON, G., COLLEY, H., DAVIES, J., DIMENT, K., SCAIFE, T., TEDDER, M., WAHLBERG, M. & WHEELER, E. (2004b) Learning Cultures in Further Education, paper presented at the BERA Annual Conference, UMIST, Manchester, 15th-18th September.

HODKINSON, P. & JAMES, D. (2003) Introduction: Transforming Learning Cultures in Further Education, Journal of Vocational Education and Training, 55 (4) 389-406.

JAMES, D. & DIMENT, K. (2003) Going Underground? Learning and assessment in an ambiguous space, Journal of Vocational Education and Training, 55 (4) 407-422.

LAVE, J. & WENGER, E. (1991) Situated Learning (Cambridge: Cambridge University Press).

MILES, M.B. & HUBERMAN, A.M. (1994) Qualitative Data Analysis: an expanded sourcebook, 2nd Edition (London: Sage).

MOUSTAKAS, C. (1990) Heuristic Research: design, methodology, and applications (London: Sage).

POSTLETHWAITE, K. & MAULL, W. (2003) Similarities and Differences amongst Learning Sites in Four Further Education Colleges in England, and some implications for the transformation of learning cultures, Journal of Vocational Education and Training, 55 (4) 447-470.

SMITH, J.K. & DEEMER, D.K. (2000) The Problem of Criteria in the Age of Relativism, in N.K. Denzin & Y.S. Lincoln (Eds) Handbook of Qualitative Research, 2nd Edition (London: Sage).

STRAUSS, A.L. & CORBIN, J. (Eds) (1997) Grounded Theory in Practice (Thousand Oaks, CA: Sage).

WENGER, E. (1998) Communities of Practice: learning, meaning, and identity (Cambridge: Cambridge University Press).

WOLCOTT, H.F. (1994) Transforming Qualitative Data: description, analysis and interpretation (London: Sage).

WOLCOTT, H.F. (1995) The Art of Fieldwork (London: Sage).

WOLCOTT, H.F. (1999) Ethnography: a way of seeing (Walnut Creek, CA: AltaMira Press).


APPENDIX I:

Analysing Learning Cultures

This is an Instrument for analysing learning cultures. The Instrument has two purposes:

 To explore ways of drawing wider lessons from our individual cases

 To give some clear indicators of possible transformations of learning cultures.

The Instrument has four parts:

 An analysis of convergences and divergences found in each site

 An analysis of balance (or lack of it) within each site

 An overview of each site, linked with summary strengths and weaknesses of the existing learning culture

 An analysis of possible ways in which the culture of each site might be transformed, setting out the costs and benefits of each approach.

Introduction

The concept of learning culture is a means of understanding a learning site, not the site itself. A learning culture should be understood as constituted by the actions, interpretations and dispositions of those who participate in it, or who influence it. (Participation, in this sense, includes many who are beyond the normal boundaries of the site, but whose practices contribute to it. The identity of such participants may vary from site to site. They may include college managers, OFSTED inspectors, LSC managers, government policy makers, and TLC researchers.) These actions, interpretations and dispositions are always interrelated with wider practices beyond the site concerned, such as national funding and inspection practices, and college management practices. They are grounded in the unequal social positions occupied by individuals and social groups. Participants' actions, interpretations and dispositions are partly structured by the learning culture in which they participate, and those cultures are structured by the dispositions, interpretations and actions of participants. Cultures persist even though individual participants come and go. The longitudinal nature of the TLC allows us to examine this, e.g. when we compare different cohorts of students.

In using the Instrument, it is important to look more broadly than just within each site. We also need to consider the relationships between site cultures and wider college cultures and procedures; national systems of, say, inspection and funding of FE; and between all of those and the wider social, political and economic factors that impinge on the learning practices of students or tutors. A learning culture may mean different things to different participants, or groups of participants. Also, unequal power relations within or around any learning culture influence the horizons for action and influence of participants, and mean that some participants will be more influential than others in the relational processes and practices of the site.

Though most of the analysis will rightly focus on the learning of students, in some sites there are significant issues around the learning of tutors. Include this as a sub-theme in the analysis, where relevant.

In applying the Instrument to any site, look for brevity, rather than excessive detail. In practice, we found that about two sides on each of the first three sections, and one side for the final section on transformations, was about right. For a site that is already known fairly well, the analysis took roughly one day to complete.
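Because the Instrument is a structured writing template rather than a measuring device, it can be pictured as a simple nested outline. The sketch below merely restates the Instrument's four parts in that form; the shorthand field names are ours, not part of the Instrument itself:

```python
# The Instrument's four parts restated as a writing template.
# The shorthand keys are ours; the content follows the sections below.
INSTRUMENT = {
    "convergence_and_divergence": [
        "dispositions, interpretations and actions",
        "attributes of in/formality",
    ],
    "balance": [
        "inclusion - exclusion",
        "challenge/perturbation/conflict/dynamism/risk - stability/safety/routine",
        "expansiveness - restrictiveness",
    ],
    "overview": [
        "overall picture of the learning culture",
        "main strengths and weaknesses, as bullet points",
        "the learning culture for the tutor(s)",
    ],
    "possible_transformations": [
        "strategies for cultural change, with costs and benefits, and for whom",
    ],
}
```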

Convergence and Divergence

In this part of the analysis, examine the convergence and/or divergence within and between the following perspectives:

1). Dispositions, Interpretations and Actions.

Here we are concerned with the extent to which the dispositions, interpretations and actions of students and tutors (and others, where relevant) are convergent or in tension with each other. Part of this concerns the convergence and/or divergence between students, and where relevant between tutors, as well as between students and tutors. Four things should be remembered:

 'Dispositions' is a sociological term, expressing the ways in which wider structural concerns, including gender, class and ethnicity, are represented through individuals, as well as apparently more individual attitudes towards learning, the site, other students and the tutors (or vice versa). It subsumes many factors that psychologists choose to separate out, such as motivation or locus of control.

 There are always power inequalities within sites, and convergence and/or divergence are partly a result of the operation of those power differentials.

 Actions and interactions are not separate from dispositions, and should not be overlooked. Actions and dispositions help construct any learning culture.

 Dispositions and actions both entail making interpretations of the learning culture, and of the actions and dispositions of other participants. Interpretations can be explicit and discursive, or tacit and practical.

2). Attributes of In/formality

Here, the concern is with a very wide range of factors which can be part of learning in many situations. Often in the literature, they are labelled as the characteristics of either formal or informal (non-formal) learning separately. Recent research (Colley et al., 2003a) suggests that formal and informal learning are NOT distinct from each other. Rather, many of the characteristics that are commonly attributed to either formal or informal learning, by various writers and speakers, are actually present to a varied extent in all learning situations. The question, therefore, is to examine the extent to which the attributes of in/formality present in any learning site are convergent and mutually supportive, or are divergent, acting in tension with each other. One short-hand way of thinking about this issue is to juxtapose the official, discursive forms of curriculum and intentional learning (formal) with the hidden curriculum and practices (informal). As a heuristic to consider this issue, Colley et al. suggested four groups, or aspects, of these attributes of in/formality in learning. These are:

 Learning processes

 Location and setting

 The content of learning

 The purposes of learning.

What are the costs and benefits of whatever convergences or divergences can be identified in each learning culture? These may vary from the perspectives of different participants.

Balance

Convergence and divergence alone are inadequate labels for the complexities of a learning culture. The second part of the instrument tests out a parallel notion of 'balance'. The argument here is that learning in any site is influenced by the balance between three sets of opposite tendencies. Too much of any one may impede successful learning. The 'optimum' balance is likely to vary from site to site, and may vary from the perspective of different participants in the same site. The three pairs of opposites are:

1). Inclusion - Exclusion

In all learning cultures, some students are excluded and others are included. To what extent are some students excluded from, or marginalised within, the culture being analysed? Are there degrees of inclusion/exclusion, for example where some students only turn up for part of the time, or turn up but do not fully buy into or participate in the normal practices of the culture? What is the balance between the interests of individual students (especially students who are 'different' from the group norms) and those of the group and site as a whole?

2). Challenge/perturbation/conflict/dynamism/risk - Stability/safety/routine

All these factors are likely to be present in all sites, but what are the specifics, and what is the nature of the balance between them? The literature suggests that both sides are important in contributing to learning.

Please consider as many of these different but inter-related factors as seem relevant in each culture. Again, remember that the balance may vary from participant to participant, but in this analysis we are looking for a short overview of issues.

3). Expansiveness - Restrictiveness

This idea comes from work by Fuller and Unwin (2003, in press) in relation to workplace learning. An expansive learning environment is one where there are wide-ranging opportunities to learn. A restrictive one is where opportunities are narrow and limited. They argue that for workplace learning, expansive is better. Our alternative suggestion, at least for college courses, is that balance is more significant. A learning environment can be either too expansive or too restrictive. What is the balance in each site?

Again, this balance may vary from participant to participant.

For all three ‘balance’ dimensions, we need to know:

What are the strengths and weaknesses of existing balances in each site?

What impact do those balances have on the learning of students and tutors?

Based upon the evidence we have, to what extent would learning be likely to improve if the balance were changed? What changes, if any, would be most likely to result in improved learning, and for whom?

N.B. It is important to consider all three balance dimensions with regard to factors outside the actual site, as well as within it. For example, risk, exclusion and restriction might come as much from college organisation, examination board specifications, or from wider social values and available cultural, social and economic capital, as from the characteristics of the site itself.
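
Again purely as an illustration, and not the project's own tooling, the following Python sketch shows one way judgements on the three balance dimensions might be recorded so that the balance can vary from participant to participant. The numeric scale and all names are hypothetical conveniences.

```python
# Purely illustrative sketch: recording judgements on the three 'balance' dimensions.
# The -1..+1 scale and all names are hypothetical, not TLC conventions.
from dataclasses import dataclass

BALANCE_DIMENSIONS = (
    "inclusion - exclusion",
    "challenge/risk - stability/safety/routine",
    "expansiveness - restrictiveness",
)

@dataclass
class BalanceJudgement:
    dimension: str    # one of BALANCE_DIMENSIONS
    perspective: str  # whose view: a tutor, a student group, the researcher, ...
    position: float   # -1.0 = wholly the first pole, +1.0 = wholly the second
    impact_on_learning: str  # impact of this balance on students' and tutors' learning
    change_for_whom: str     # would learning improve if the balance shifted, and for whom?

def overview(judgements: list) -> dict:
    """Crude per-dimension average, to prompt discussion of divergent perspectives."""
    positions: dict = {}
    for j in judgements:
        positions.setdefault(j.dimension, []).append(j.position)
    return {dim: sum(vals) / len(vals) for dim, vals in positions.items()}
```

A recorded position far from zero on any dimension would then prompt the questions above: what are the strengths and weaknesses of this balance, and would learning improve if it shifted?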

Overview

In this section, it is necessary to pull together an overall picture of the learning culture of the site, drawing on the different parts of the two sections. This is partly a matter of drawing out the interrelationships between the different parts of the analysis, and partly a matter of prioritising the issues and factors that matter most in this culture. This should make clear the relational nature of learning and learning cultures.

Following the overview, it is helpful to list, as bullet points, what seem to be the main strengths and weaknesses of the site culture. Sometimes, of course, the same factor will appear in both lists, and what are strengths for some participants may be weaknesses for others.

Devote some space in the overview to the culture of learning for the tutor(s) concerned.

Possible Transformations

The overview analysis, and the list of strengths and weaknesses, should make it possible to identify strategies for cultural change. In doing so, the following principles are important:

Do not restrict your analysis to activities that can be actioned by the tutors or students. Often, changes by the college, examination board, Government or others are more important.

Look for combined actions by several agents/agencies that might be needed to support possible transformations.

Do not be put off if some of the changes you think important seem politically unachievable in the current context. Part of our role, as researchers, is to point out where such wider factors inhibit desired actions or outcomes.

Try to identify, briefly, some of the main costs and benefits of any possible transformation, including costs and benefits to whom.

If relevant, include possible transformations of the learning culture for tutors.

Postscript

Though this is the version of the Instrument we want you to use, you should interpret it in ways most appropriate to the learning cultures you are analysing. The Instrument itself can and should be further refined, in the light of our further use of it and reflection upon it, with possible eventual publication.

References

COLLEY, H., HODKINSON, P. & MALCOLM, J. (2003) Informality and Formality in Learning (London: LSRC).

FULLER, A. & UNWIN, L. (2003, in press) Learning as apprentices in the contemporary UK workplace: creating and managing expansive and restrictive participation, Journal of Education and Work, 16(4).


Appendix II: Interventions and intervening events

An analytical tool for use across the sites

Introduction

This analytical tool was developed in a series of meetings within the TLC project workshops. It is intended to assist in the meaningful comparison and synthesis of interventions and intervening events across diverse learning sites. It does not seek a rigid classification of interventions and intervening events, but is a way of highlighting key cultural features within a project whose overall aim is to understand learning cultures.

From our initial team discussions it seemed as if the distinction between ‘intervention’ and ‘intervening event’ was an important starting-point. Our working definitions of these terms are as follows:

An intervention is where a participating tutor deliberately and consciously shifts some aspect of the activity system in order to produce other changes of some kind. Such shifts might be purely tutor-initiated, or might, in a strong or weak sense, be ‘someone else’s agenda’.

An intervening event is a change in the site that is clearly from ‘outside’, whether or not it is in keeping with a tutor’s wishes, practices or values. Tutors are likely to have to make changes to the activity system as a result of intervening events. Another way of putting this is that intervening events are likely to produce further interventions .

The five ‘nodes’ of the analytical tool

A. Nature, origins and impetus?

If the tutor decides to do something differently, to what does he/she attribute the decision? Is it on the back of experience, or tutor values, or official/unofficial feedback from students or managers or others? Is it thrown up by external pressures? Is it immanent in the situation, i.e. ready and waiting to happen? There is lots of scope here for ideas like ‘alignment’ (where someone tries to achieve a closer fit between student backgrounds/expectations and college provision) and individual or collective attempts to gain some greater harmony or synergy between one thing and something else. Concepts like field, disposition and habitus may be useful (what does ‘developing a more appropriate course for this type of student’ actually mean?), but so may social psychological ideas about motivation (such as ‘cognitive dissonance’) or about how the self is positioned as more or less in control (e.g. ‘locus of control’). If we think of tutors as learners, remember that the trigger for expansive learning is contradiction or disparity arising from an activity (see the brief Michael Young piece on Engestrom we shared at an earlier workshop – Young, 2001). Some learning cultures may be more ‘nailed down’, less open to change in some respects. Interactionist notions of the self, and of self and others, may also be helpful.

B. Situated rationale/justification?

What are the reasons given by the persons most directly involved for the intervention? How many different reasons are there, and by whom are they given? Are they private or public? Do they appear to belong to the tutor only, or can they be seen as shared in the immediate or wider relevant social group? What ‘order’ of rationale or justification is there (e.g. personal, professional, educational, financial, managerial, corporate, ethical), and what focus (e.g. is it pedagogical, strategic, communicative)? Is a particular intervention related to college or sector policy that has a technical-rational justification (e.g. ‘improvement’, ‘efficiency’, ‘service’ or ‘progress’), albeit translated into a local form?

More controversially, for example, is a change in (say) tutorial arrangements given justification on the grounds that it will protect student interests in the face of damaging institutional developments? Who gives these justifications, and why might that be?

C. Outcomes and consequences

Is there a discernible effect? Who says so? Would it be impossible to tell if there was? Are there intended and unintended consequences of the intervention? Will any effects show up in conventional measurements (e.g. retention and achievement) or will they be more subtle, or of a different order? Is it possible to check different perceptions of the same event or series of events? Are the consequences ones for tutors, students, the organisation, others – or some combination of these? Are they one-offs, or sustained in the site? Are there ‘transferable’ elements in the eyes of the people involved? If there is change, how fundamental is it, and for whom? Are there transformations at the level of individuals, groups, cultures? How significant is tutor intervention in relation to other intervening events? How much difference can the tutor make? If there is improvement, then improvement according to whom and by what criteria? How does power figure in relation to the intervention? Does it challenge or reproduce existing power structures?

D. Key contextual matters

It is worth trying to say how much change there is ‘anyway’ in a particular setting. The very notion of intervention implies a backdrop of other things that are not subjected to the same degree of deliberate change – so we need to address this. What makes it an intervention? Does the sheer variety of culture and practice mean that an intervention in one place is commonplace practice in another? As researchers, are we justified in singling something out as an intervention (to put this another way, how does it differ from the constant, everyday adjustments that are made)? How difficult is it for the tutor to make intervention x in a particular setting (i.e. we must take a relational view of this)? How is change seen in the particular context, and is this felt through peer pressure? For example, is pedagogical innovation seen as fancy, progressive nonsense, or as the ongoing responsibility of the committed professional? Who has ‘power over change’ in the particular setting?

E. Model of change

Does the tutor, the manager (or the students) hold a specific model of change? How is change seen in the particular site, and in the college? If there are notions of change in circulation, do they focus on individuals (and individual responsibilities) or on the organisation? Is there a ‘rhetoric’ and a ‘reality’ of change? Does the organisation claim that change is rational - measured, incremental, proactive - whilst some tutors claim they are in constant, chaotic, reactive change? The ‘change literature’ contains many theories of change. Kezar (2001) puts these into six categories, namely: evolutionary; teleological (rational); life cycle; dialectical/political; social cognition; and cultural. Kezar reminds us that each has different assumptions about such things as how and why change occurs and its outcomes. However, many writers draw eclectically from across these categories of theory (p. 2). Finally, Kezar suggests that in relation to higher education at least, ‘…organizational change can best be explained through political, social-cognition, and cultural models’ (p. 2).
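
To summarise the shape of the tool, and only as an illustration rather than as project software, the five nodes and the intervention/intervening event distinction could be captured in a template like the following Python sketch; the class and field names are hypothetical.

```python
# Purely illustrative sketch of the five-node tool as a documentation template.
# The intervention / intervening event distinction follows the working definitions
# above; the class and field names are hypothetical.
from dataclasses import dataclass, field

NODES = {
    "A": "Nature, origins and impetus",
    "B": "Situated rationale/justification",
    "C": "Outcomes and consequences",
    "D": "Key contextual matters",
    "E": "Model of change",
}

@dataclass
class ChangeRecord:
    site: str
    description: str
    kind: str  # "intervention" (tutor-initiated shift) or "intervening event" (from 'outside')
    notes: dict = field(default_factory=dict)  # node letter -> analysis under that node

    def __post_init__(self):
        if self.kind not in ("intervention", "intervening event"):
            raise ValueError("kind must be 'intervention' or 'intervening event'")
        unknown = set(self.notes) - set(NODES)
        if unknown:
            raise ValueError(f"notes refer to unknown node(s): {sorted(unknown)}")

# Hypothetical usage:
rec = ChangeRecord(
    site="hypothetical site",
    description="tutor introduces peer-marked drafts",
    kind="intervention",
    notes={"A": "on the back of informal student feedback",
           "C": "effects unlikely to show in retention/achievement measures"},
)
```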

References

Kezar, A. (2001) ‘Understanding and Facilitating Change in Higher Education in the 21st Century – ERIC Digest’. Accessed April 2004. Available at http://www.ericfacility.net/databases/ERIC_Digests/ed457763.html

Young, M. (2001) ‘Contextualising a New Approach to Learning: some comments on Yrjo Engestrom’s theory of expansive learning’, Journal of Education and Work, 14, 1, pp. 157-161.
