Educational Action Research, Volume 9, Number 3, 2001
Messy Method: the unfolding story
NIGEL MELLOR
Educational Psychology Team,
Student and Pupil Services, Monkseaton, United Kingdom
ABSTRACT This article outlines an attempt to develop a method of inquiry
which takes a frank look at the untidy realities of research. During his
practitioner-based PhD the author was mainly ‘working without rules’
(Appignanesi & Garrett, 1995, p. 50). New understandings of concepts such as
analysis, data, theory and writing began to evolve as he gradually embraced a
positive view of ‘mess’.[1] Over the 6 years of the study, the author made many
‘errors’, but wanted to be honest about these. He eventually began to see an
‘honesty trail’, together with clear communication (the material should be very
readable and subject to feedback) as essential parts of the project’s ‘strength’ (a
term offered in place of validity):
... the sciences have been enchanted by the myth that the
assiduous application of rigorous method will yield sound fact – as
if empirical methodology were some form of meat grinder from
which truth could be turned out like so many sausages. (Gergen,
1985, p. 273)
The Nightmare Begins
For much of my practitioner-based PhD, there was no research question
and no clear method. I was in Schön’s (1983) ‘swampy lowland ... of
confusing messes’ (p. 42), ‘working without rules in order to find out the
rules of what you’ve done’ (Appignanesi & Garratt, 1995, p. 50).
Initially, an aspect of practice intrigued me – my interventions as an
educational psychologist, with parents of children displaying emotional and
behaviour difficulties. They seemed to ‘work’, I had no particular wish to
alter them, but I was curious, although the nature of that curiosity was far
from clear:
The history of any inquiry begins with something that is less than
a problem ... It may be considered as an awareness that there is a
question to be asked, without anyone being able to frame the
question successfully. (Ravetz, 1971, p. 135)
I recall that I opened the project with a relatively unformed ‘questioning’, not
with a wish to change, simply a wish to understand, what I was doing.
However, as soon as I began to look, I wanted to improve some aspects of
what I did, but by no means all. Some of the reflection on practice served
simply to celebrate and reinforce existing approaches. I started to explore
my fairly naive assumptions about action research (see, for example,
Atkinson, 1994, and Mellor, 1998, for further discussion of some of the
potential complexities of action research methods). If that had been all,
however, I might have been able to sleep fairly easy, but I had unwittingly
begun a process that was to torment me for the next 6 years: researching
my own researching.
The inquiry had zero time allocation in my daytime job and had the
values of practice uppermost. Research had to fit round practice as best it
could. The project started on day one. As soon as I decided to start, I was in
the middle, collecting ‘data’ in a research diary, but with no clear purpose. I
felt I could not ‘shut up shop’ for a few months and quietly plan a cosy
project – the world would not go away. It seemed important to me to capture
this first, incoherent, unformed stage of the process, although I was not
totally sure why, other than having a faint idea that this activity was
generally overlooked and was, in some unknown way, a vital part of the
overall inquiry.
However, I was on the rack, with voices in my head constantly nagging
that this wasn’t ‘real research’, this wasn’t science. Cixous (1993) writes of
the need in original research to ‘think the unthinkable’ (p. 38). The struggle
I faced, however, with roots securely in scientific subjects (physics and
psychology degrees) was much more to unthink the thinkable – to free
myself from ‘the tyranny of method’ (Thomas, 1998, p. 151) and open up
that ‘actual experience in the field [which] is often messy and fraught’
(Atkinson, 1994, p. 399).
Some 4 years into the project I felt confident enough to assert that I
had the bare bones of a method that I was comfortable with (see Mellor,
1998, for details). This was an approach that could accommodate those
‘disorderly “messy” features of the research process’ (Fine & Deegan, 1996,
p. 437). The method was reflexive – I could use it to look at itself and it
changed in the process. I began to refine the method by applying it to new
areas. I will focus initially on one mini-project, my efforts to understand a
single part of the inquiry: how I went about ‘analysis’. Although difficult,
this mini-project felt relatively successful. Other dangers lay far ahead.
Just When You Thought You Were Safe ...
With a wish to comprehend the process of analysis in greater depth I began
to re-examine my diaries, but with an extra perspective: what exactly was I
doing? Again, I did not feel comfortable with adopting a ready-made
solution, such as the techniques outlined in various versions of grounded
theory, to analyse my analysis. Quite apart from acknowledging
disagreements over what exactly those techniques comprised (as explored
in, for instance, Melia, 1996) the research settings of grounded theory
advocates appeared in many ways quite different to mine. They seemed
often to involve strangers exploring other people’s territories or practitioners
with the relatively ‘objective’ facts of their practice to examine. In my case,
although I began with practice, I quickly became bound up with a moving
target: my own thoughts about methods of inquiry. Which ‘theory’ of
‘research’ would I be ‘grounding’ as my ideas on ‘method’ shifted and
developed over several years?
In the end I decided to ‘work without rules’, again, to find out my own
‘rules’ of analysis as I explored some of the ‘complex and messy problems’ of
research (Staw, 1981, p. 226). This intuitive style ran the risk of reinventing the wheel (although my hunch was that I might in fact end up
with a different type of wheel) or simply doing a bad job (although I hoped to
be very diligent in my considerations). In any case, the important part of the
process for me was not the wheel that appeared at the end (conventional or
otherwise), but the inventing that went in at the start.
One view of research I believe I carried at the start of the project
embodied a linear process:
Collect data
Analyse data
Write up.
My appreciation of the reality of the procedure gradually came to
incorporate a much more complex set of relationships influenced by
Baldamus’ (1972, 1976) concept of ‘double fitting’; Hampton’s (1993)
account of ‘platforms of understanding’; Minkin’s (1997) description of the
place of writing in research; McGrath et al’s (1981) ‘knowledge accrual
process’; St Pierre’s (1997) description of learning to ‘live in the middle of
things ... making do with the messiness’ (p. 176) and a range of other
sources. I will begin my exploration of analysis with Baldamus and double
fitting.
Double Fitting
Baldamus (1976) describes ‘double fitting’ as:
the unarticulated trial-and-error technique that occurs when an
investigator ‘simultaneously manipulates the thing he (sic) wants
to explain as well as his explanatory framework’. (p. 40)
He explores the ‘unofficial techniques’ (Baldamus, 1972, p. 282) employed
by sociologists in their research, those ‘inarticulated techniques, devices,
and practices which are customarily employed during the preparatory
stages in the production of formal theories’ (Baldamus, 1972, p. 295). In
these, he identifies a process of ‘double fitting’, to explain the ‘continuous
restructuring of conceptual frameworks’ (Baldamus, 1972, p. 295), which
takes place as the researchers work on the interplay between data and
emerging theory:
imagine a carpenter alternately altering the shape of a door and
the shape of the door-frame to obtain a better fit, or a locksmith
adjusting successively both the keyhole and the key. (Baldamus,
1972, p. 295, emphasis in original)
During this process the data help build a theory while at the same time the
theory helps the researcher see the data in a new light. This of course could
look suspiciously like ‘deliberate falsification ... cooking the facts’
(Baldamus, 1972, p. 295) – which of course it would be if the investigator
began with a fixed mind and simply set out to prove his or her case. In
practice, however:
At the beginning there is only a vague notion of some ... puzzling
phenomenon ... it is gradually articulated by trying to ‘fit’ it into ...
generally known ... concepts...[whose] combination may
nevertheless produce new meanings that might eventually
illuminate the original unfamiliar phenomenon [or we begin with] a
hunch ... a means ... to discover the existence of some regularity ...
among certain data. (Baldamus, 1972, pp. 296-297)
Baldamus sees ‘double fitting’ as progressive. The current project, however,
involved a more discontinuous process, reaching ‘platforms of
understanding’. The inquiry also explored elements Baldamus does not
consider. For instance, I began to see writing as a form of research itself and
began to re-configure my views of data and analysis, or what I came to
regard as ‘making sense’ (a phrase borrowed loosely from Miles &
Huberman, 1994, p. 245).
A Developing View of Some of the Elements of Making Sense
As I reflected, my view of working practices in my day-to-day job reinforced
my beliefs: total ‘error’ elimination is perhaps not possible and probably not
efficient; it may be better to reconcile oneself, for instance, to repeating a
number of items, rather than aiming for 100% accuracy the first time in all
aspects. In a rough parallel, analysis or making sense could go through
cycles of approximating:
In a recent conversation at work I remember arguing that having
no errors is actually inefficient. It means you’re spending too long
checking things, the work will never get done so clients who have
to wait suffer that way. Of course, I do my best not to cause harm
to the clients I am currently involved with, but part of the trick over
the years is learning what you really need to look out for, and how
to put right the mistakes you do make, if they are big enough to
worry about.
Not that this is an excuse for ‘sloppy’ work (or sloppy analysis) – I
am committed to trying to do my best – I just happen to think I
work best in this way. (Diary, 28 October 1998)
Gradually, I began to value errors/off-shoots/over-sights in the project:
‘Research is the process of going up blind alleys to see if they are blind’
(Bates, cited in Green, 1982, p. 217). I return to errors later in discussing
their somewhat paradoxical contribution to validity.
In trying to penetrate my own techniques more clearly I turned again
to Miles & Huberman (1994). They describe part of the process as ‘data
reduction’: to see what’s on the page I first have to get rid of as much as I
can (in psychologist’s terms ‘chunk it’), so that the material left can stand
out. How did I get the categories (to chunk it) in the first place? Problems of
‘selective perception’ came to the fore. Patterns that came readily to mind
could hide others:
Of course the problem is one of bias. I see the patterns I learn to
see. I may miss the significance of others and just not perceive
some. Time of day became relevant [for example]: thoughts
appeared during the night, or early morning or when walking in
the park in the afternoons on weekends, almost never during
hours at work. The season of the year, the political climate, my
state of health did not. (Diary, 28 October 1998)
Which brings me to my concern over the nature of the data I am
relying on: my diaries, and the theory-dependence of what I recorded in
them:
But how did I get the data in the first place? Some were obvious –
reading a book, talking to people about the research. Some were
not so obvious – walking in the park, sleeping, flicking [through
books] at random, counselling. The problem was [in all the mass I
collected] – what to attend to? (Diary, 29 October 1998)
Chalmers (1982) raises the issue that what we take as relevant data
depends on our theories about what is going on. For instance, in early
experiments on radio waves, he asks, should the scientist have attended to:
the readings on various meters, the presence or absence of sparks
... the dimensions of the circuits ... the colour of the meters, the
dimension of the laboratory, the state of the weather, the size of
his shoes. (p. 33)
Chalmers has set a rather neat trap here, however, which does not
ultimately detract from the strength of his point. He eventually discloses
how even apparently trivial or seemingly irrelevant circumstances may, with
hindsight, be vital – like the size of the room. It spoiled Hertz’s early work on
measuring the speed of radio waves as he was, in fact, often measuring
waves reflected from the walls. His theories at that time did not anticipate
such events, so the room itself was ignored.
There were many activities I did not record:
I did not record for instance listening to music – although I now
recall from my earlier more relaxed days of writing poetry that
playing guitar (badly) often triggered a poem (perhaps some
right brain-left brain interaction if you believe that sort of
analysis – reminds me of Sherlock Holmes playing his violin).
(Diary, 28 October 1998)
Thus, my data collection was ‘theory dependent’ in the sense that what I
thought the relevance of some activity could possibly be determined
whether or not I recorded it, and over time my theories changed (see below).
My Latest Understanding of the Process of Making Sense
Marshall & Rossman (1995) explain how data analysis is ‘a messy,
ambiguous, time-consuming, creative and fascinating process’ (p. 111) –
another confusing part of that ‘messy and error-strewn reality’ that is
research (Minkin, 1997, p. 16). St Pierre (1997) describes how she rejected a
‘clear, linear process of research’ (p. 180) and how all the activities ‘data
collection, analysis and interpretation happened simultaneously’ (p. 180).
For me, each of the elements of the process interacted as I worked towards a
platform of understanding. These multiple interactions are described in
depth in Mellor (1999); I restrict myself below to the main elements of the
process: exploration, ideas, writing, data. As a picture, the process I evolved
would look like Figure 1.
Figure 1. Making sense.
Writing
Writing could perhaps be seen as a simple ‘transcription’ of ideas worked
out elsewhere, as Bell implies in her book ‘Doing Your Research Project’:
‘When all the hard work of gathering and analysing evidence is complete,
you will need to write a final report’ (Bell, 1987, p. 151). Minkin (1997),
however, describes a fundamental shift in perception of the process:
‘writing involved not the recording of a creative outcome but participation in
a further creative process’ (Minkin, 1997, p. 178).
He explains how there is a ‘two-way interaction between continuously
developed knowledge and continuously developed text’ (Minkin, 1997,
p. 178). In Minkin’s experience, ‘there was no clear break in process
between the composing of thinking and the composing of writing’ (Minkin,
1997, p. 175). As well as discovering ambiguities, however, writing ‘could
develop a momentum of [its] own ... even small changes ... could take the
argument in unanticipated and significantly new directions’ (Minkin, 1997,
pp. 176-177) and ‘even sentences, occasionally took on new content,
involved new arguments and moved in unplanned directions’ (Minkin, 1997,
p. 177). Thus, the act of writing became itself part of the creative process.
Data
I reflect in my research diary on what I thought of as a kind of ‘knowledge
trick’ which Colin Biott (my supervisor) and others could perform with my
‘data’ (previous research diary entries), but which, at that stage, I felt I could
not. I was confident in writing the original notes, but could not claim them
as ‘knowledge’:
I am amazed to find my notes suddenly becoming ‘knowledge’.
Colin [Biott] could do that, I could not. I could write diary notes,
‘data’ but I could not assert these as ‘knowledge’.
Reading this article [of Colin’s] for the first time there is a kind of
unembarrassed ‘authority’ [in my diary notes that Colin quotes] –
[me] writing my diary is ‘telling it how it is’. Somehow [me] writing
an article about my diary feels different. There is an invisible
barrier. I believe others can assert the ‘validity’ of my ‘knowledge’.
It is a giant, terrifying leap for me to do that. Do my jottings have
any status?
Am I wrong ... is the research monster that dizzying height [of
academic authority] I can never aspire to, or just a ‘Wizard of Oz’
with much smoke and loud bangs and little substance?
Like the straw man, do I need [a] certificate ... from some fictitious
university, to prove I have a brain? I guess this is a struggle with
my ‘researcher identity’. If I cannot show P < 0.001, is it ‘true’? I
am still torn apart by this. I realise I am still trapped in my
positivist, scientific frame! Perhaps ‘it’ becomes ‘valid’ when I
fit it into a theoretical framework. But the ‘it’ is [still] my notes!
(Diary, 7 January 1998, emphasis in original)
This battle with ‘validity’, the pull of ‘science’ and the belief in my own
ability to create ‘knowledge’ were not, however, resolved (at least to a partial
level of satisfaction) for quite some time. Hollingsworth (1997) describes how,
in working with a group of teacher-researchers, confidence in ‘knowledge
bearing’ is not easily come by: ‘[t]heir own questions, imaginings and ways
of knowing were never good enough [in their own eyes]’ (p. 493). In part, the
thesis was about exploring the struggle to claim such personal knowledge.
At an early stage of the project, my views about data had begun to
dissolve, but in a rather different manner to St Pierre. With my ‘scientist’
hat on, I had been used to concrete images of research: ‘building’ theories
from ‘solid’ data. Richardson (1990) describes how ‘metaphor is the
backbone of social science writing ... [b]ut ... we often do not recognise [its]
role’ (p. 18) and ‘in standard social science writing, the metaphor for theory
is a “building”’ (Richardson, 1994, p. 524). My view of data had been as
incontrovertible ‘givens’, not constructs; ‘brute data’ (Manicas & Secord,
1983, p. 410); the raw material of the building; data as foundation stones.
I gradually came to a softer, more fluid view of the mass of reflexive
diary entries: data as stepping stones, not facts, unchanging in themselves,
on which and from which to erect an edifice, but shifting and unsure,
elements to rest briefly and lightly upon, clues to mark the path. Switching
metaphors to try to capture the essence of my position another way, data
were the river, but not the water (as, for instance, we might choose to see
the social, spiritual, economic and so on effects of a river, rather than the
chemistry of its constituents). In a sense I ‘skimmed’ over the data, not
dwelling on any particular item at length. Changing metaphors for the final
time, I was more interested in the route I was taking (and creating) than the
earth I was walking on.
Exploration
Exploration of the data I viewed as a collection of procedures similar to
McGrath et al’s (1981) ‘knowledge accrual’, but including other aspects such
as serendipity and incubation:
Many forms of intellectual endeavour can contribute to the
knowledge accrual process. Among them are such activities as:
thinking; talking to colleagues; reading past research literature;
sometimes, reading past nonresearch literature, including novels
and poetry; becoming knowledgeable in a related area (from which
vantage point ideas, analogies, discrepancies, may be noted);
reanalysing the data. (p. 211, emphasis in original)
Broadly, this ‘exploration’ might cover the more formulaic-sounding term
‘analysis’. However, ways of exploring the data, such as using incubation (a
walk in the park), often led to the creation of new data (new diary notes as I
mused unconsciously). Part of the ‘exploring’ was the writing.
Ideas
The ‘ideas’ generated formed a collection of suppositions at different levels of
complexity – from an overall view of the research process (the messy
method) to discrete aspects such as understanding how data could be
stepping stones. I use the term ‘ideas’ here, rather than ‘theories’ to capture
the varied and fluid nature of the output of the studies. Definitions of theory
often seemed so theory-laden.[2]
Now, my data were ‘theory-dependent’, in the sense that ideas
influenced what I collected as data. So, for instance, counselling, which at
first seemed more of a general support mechanism, came to be seen as a
vital research tool in clearing thinking and I began to record this as part of
the project, rather than as part of my on-going separate counselling
development. Writing eventually became seen as an important phase of
research in its own right, not just a means of recording (see above); and an
old, partly forgotten identity aspect of ‘being a writer’ also achieved a new
importance. My previous writing, of fiction itself, became a kind of data.
The topic of identity and its relationship with research entered the
scene towards the end of the project, September 1997, but it was not till
October 1998 that I came to appreciate the relevance of another ‘missing’
data set, to do with identity: my explorations as an ‘amateur scientist’ in
areas unconnected with the project (recorded in a scrap book). These, I
began to realise, might help me to integrate some of my perceptions of my
own identities. Initially, they had not been seen as relevant data. They had
not even been seen.
How ‘valid’ all this ‘making sense’ exercise might be is a question that I
turn to now. I wanted to be honest about my struggles, and decided to
include in the final write-up an account of all the errors, blind alleys and
off-shoots of the project. Throughout the research I had suppressed my
worries over validity. As the reality of facing an examination of the work
began to sink in, the horror of it all began to bubble up. I had escaped from
a ‘story book’ scientific method (Mitroff, 1983, p. 8), and thought I was free
of the pull of science. However, it came back to haunt me. I realised I was
drawing on some barely conscious, background idea of ‘scientific validity’ to
evaluate what I had done. I was still trapped. Could I be honest?
... The Horror Returns ...
A paper by Lenzo (1995) discusses some of the difficulties inherent in
presenting honest accounts. She discusses the difficulties of doctoral
students wishing to challenge ‘traditional forms of closed narratives’ (Lenzo,
1995, p. 19) and introduce doubts into their writing. She asks ‘What kind of
textual authority can admit to uncertainty ...?’ (Lenzo, 1995, p. 19). Lenzo,
however, misses an important point. For me, a resolution of the issue is not to
accept the terms of the dilemma, but to turn the problem on its head: to
assert that the admission of uncertainty becomes a source of authority; to
see that errors, side-turnings, blind-alleys and uncertainty render the text
more believable, not less; insofar as believability is a measure of authority,
they lend authority. I am now more suspicious of ‘hygienic’ accounts.
This position I reached was valuable to me in developing my
confidence; then I began to wonder whether, in my concerns over validity, I
was asking the right question.
Is ‘Validity’ the Right Question?
Anxieties over validity coincided with a crumbling of my long-standing but
unexamined ‘scientific’ stance. No single defining moment marked the
transition in my questioning of ‘science’[3] and of where ‘knowledge’ lay,
although a few fragments stand out. I recall, for instance, Phillips’s (1992)
comment that:
The worry about the warrant for conclusions drawn from a
qualitative inquiry will not wane, largely because the worry about
the warrant for conclusions drawn from any inquiry will not wane.
(p. 118)
An observation from Cixous ...:
when you look at the TV [news] the truth simply disappears. I see
massacres on TV and I do not cry. I have to go to the theatre for
that. There I can receive that subjective and poetical expression.
(quoted in Jeffries, 1997, p. 4)
latched on to a personal memory of the power of Shelley’s poem
‘Ozymandias’. There, no amount of ‘valid’, ‘objective’ history would capture,
for me, in quite the same way, that essence of human vanity that Shelley
encapsulates in a few lines about lifeless stones in the desert: the truth that
comes from lies. This in its turn made connections back into the academic
world through Wolcott’s (1990) gripping life or death story ‘when it really
matters, does validity really matter?’ (p. 135) and Scheurich’s (1996) critique
where post-positivists, although quite willing to dump conventional science,
continue to cherish the associated notion of validity: ‘they will not leave
home without it’ (p. 50).
Whatever the causes (and many other articles contributed to this
questioning, such as Mishler, 1990, and Lather, 1993), at about the same
time as my understanding of what actually constituted ‘science’, as
practised by real scientists, and of what constituted ‘scientific knowledge’,
began to shift, so my stance on validity shifted. This crystallised very late,
during the write-up stage. Rendering the event somewhat poetically, I found
I had ‘stepped through’: ‘Validity is the wrong word’ (Diary, 11 November
1998).
There was something in all this still worth struggling for, but what was
it? I needed another term. ‘Worthwhileness’ (House, 1980, cited in Dadds,
1995, p. 112) came close, but none of the alternatives – valuation, judgement,
good/bad – seemed quite right. As a provisional resting place I settled on
‘strength’ (while still hoping for something better): ‘validus’, strength, being
the Latin root of validity.
Research Strength and Practitioner Research
Several writers propose frameworks for the evaluation of inquiries; several
apply to action research, which was one of my starting points (Altrichter et al,
1993; Clarke et al, 1993; Lomax, 1994; Dadds, 1995; Tickle, 1995; Norris,
1997). Within practitioner research more generally there are recent texts
such as Robson (1993), Fuller & Petch (1995) and Reed & Procter (1995). It
is the latter book I draw on, particularly the chapter by Reed & Biott.
Although the various contributors focus on health care, their views appear
immediately translatable to other fields.
Reed & Biott (1995) see strength as a continuum, not as an either/or.
Melding the key points of their argument with those of the other writers
mentioned, I adopted a final simplified list.
Assessment of Research Strength
Is it a believable account of what I did? Is it full and honest? Does it ‘ring
true’ for you?
Is the writing clear and readable? Is there a good balance between
academic rigour and accessibility?
Does the account demonstrate a sufficient level of care in developing
methods of research and seeking wider critique? Is there sufficient detail
for you to judge?
Does the enterprise seem worthwhile? Does it lead to new
understandings?
Is what I did of interest to you? Does it ‘strike a chord’? Does it stimulate
you?
Is the study in keeping with practitioner values?
Is the research self-critical?
Finally, all things considered, is the method described something you
might be prepared to adopt, even in part and according to your own
needs? Can you relate to it?
However, while broadly happy with this breakdown, I continued to become
tangled up with anxieties about how individualistic the study might be.
Could it be generalised? Did it need to be?
Generalisation/Relatability
In the early stages of developing ideas, I sought encouragement as I dredged
up notions from the confusing research process I had set underway. I took
support where I could, in constant dialogue with an enormous range of
colleagues, while at the same time, using this dialogue to refine my ideas.
To begin with, I made only tentative sorties into subjecting them to the more
stringent testing of the public forum of conference papers and publication,
after trying them out with local research groups. I was ‘testing while
protecting’.[4] My first attempt at serious testing (a paper presented to
CARN 1995, where I was anxiously feeling my way) was nearly my last, but
interestingly, it set the seeds of what was to be an important later theme:
the role of counselling and emotions in research (see Mellor, 1999).
Initially, I had an inkling that what I was describing as ‘messy method’
was a universal method: if we are honest, we all work this way. The reality
was unpredictably different, however. My counsellor, Mike, who in many
respects could have been a prime candidate for such concepts (being very
‘alternative’, to put it loosely, in his views generally), initially denied
that he proceeded in a ‘messy’ way, in one of our early discussions:
Mike: In some ways I see [my] department muddling through and
I’m quite critical of that. In one sense I could use it [but] they’re
incredibly amateur – unsystematic and lacking in theory ... I
sometimes think I’m too disciplined. (Diary, 24 September 1996)
A relative by marriage, Celia, once a post-doctoral chemist from the
prestigious Imperial College, now an accountant and manager with a giant
multi-national, but still a scientist in outlook, agreed strongly that she did
use a messy approach.
So from the start I had both (unexpected) confirmation and
disconfirmation of the existence of ‘messy approaches’. In reconciling these
views, following dialogue with other researchers (and, indeed, with Mike and
Celia, extracting more of the subtleties of their beliefs), I eventually settled
for the following position, between two extremes. I do not try to claim that
everyone works the way I have tried to outline here. Neither do I try to claim
that my own approach is only relevant to me. I contend instead that some
people, some of the time, may in part adopt a ‘messy method’ such as I have
described.
Now, while much of the research time was taken up in convincing
myself that the approach and the effort were worthwhile, I wanted to do
more: to share these ideas with (like-minded) colleagues, indeed, to see if
colleagues were like-minded. Thus, the interviews above highlighted a
problem: what constituted sufficient grounds to say I could ‘generalise’ my
ideas (if that was what I wanted to do). To begin with I was a little
downhearted: I had wanted everyone to confirm my belief in the existence of
messy methods:
Initially I was disappointed [with Mike’s response]. I had a half-conscious
hope that, Piaget-like, from studying one or two children
(or in my case, just myself) I could have uncovered a universal, a
grand narrative (my ‘muddling through’ method of everyday life
and research). Post-modernism would of course reject such a
notion. (Diary, 24 September 1996)
Then, I began to see Mike’s response in a much more positive light. Here
was some useful feedback that made me examine more closely my
assumptions about generalisability and some of its alternatives. I was
drawn to Bassey’s (1981) ‘relatability’:
an important criterion for judging the merit of a case-study is the
extent to which the details are sufficient and appropriate for a
teacher working in a similar situation to relate his (sic) decision
making to that described in the case-study. (p. 85)
However, while attractive, this seemed to disguise a problem it shared with
generalisability: the question of numbers:
The positive thing which came out of this interview [with Mike] was
to do with generalisability. Bassey for instance talks about
‘relatability’, and others talk about a work having ‘resonance’. But
with how many people? Is one person feeling empathy, seeing a
resonance, relating the work to their own situation, enough? Is
two? What is the right number? Does getting an article published
count? Would a sample survey of a cohort of academics be
necessary? The literature is not clear on this point. That’s the
problem in using this kind of concept. (Diary, 24 September 1996)
Leaving to one side temporarily the question of numbers, I will address an
issue that seems to be buried in the above discussion: communication and
its contribution to validity/strength.
But With One Bound He Was Free ...
Communication
Beyond a selfish desire for the personal development of the practitioner
researcher in question, what value is there in research? Reed & Biott (1995)
point to the ‘usefulness of research in informing practice’ and the capacity
in ‘generating debate rather than solving it’ (p. 200). However, what use is
inquiry to other practitioners unless they read about it? Practitioners
appear to rely much more on tacit knowledge (Usher & Edwards, 1994, refer
to ‘subjugated knowledge’, p. 54) than on that from the academy.
While we cannot be sure that anything we write will ever be read, there
is, I feel, some duty to attempt to make what we write as accessible as
possible: to tell a good story, although that, in itself, is far from a
sufficient criterion on which to judge.
Excellent material may at times be excavated from impenetrable prose.
Good ideas do not always come cheap and the committed researcher must
be expected to work for his or her insights. Not all bad writing is bad
research. Phillips (1992), however, criticises the notion that ‘good writing’ is
by itself a sufficient warrant for inquiry. Citing Miles & Huberman, he
explains that ‘qualitative analyses can be evocative, illuminating, masterful,
and downright wrong’ (p. 114, emphasis in original). It is important to
examine Phillips’ ‘god’s eye’ view of truth,[5] but the point he makes is a fair
one. We need more than good prose to be assured of the value of a piece of
work: ‘a swindler’s story is coherent and convincing’ (Phillips, 1992, p. 114).
He argues ‘[c]redibility is a scandalously weak and inappropriate surrogate
for truth ... under appropriate circumstances any nonsense at all can be
judged as “credible”’ (p. 117).
Those taking a post-modern slant, such as Scheurich (1996), who
dismiss the valid/non-valid distinction, leave me unsatisfied. Can I write
anything and it will find a place somewhere in the great garden of
knowledge, in his ‘Bakhtinian ... carnival’ of ‘marginalized voices’ (p. 58)?
However, the form, if not the content, of his paper, at least in this example,
betrays Scheurich’s allegiance to a very elegant, conventional writing voice
in the refereed world of academic publication. I am left with the feeling that
there are still some (uncertain) ‘standards’ to strive for in judging the merit
of a piece of research, even if notions of validity are rejected.
My current ‘solution’ to some of the problems raised here is explored
below, where I discuss communication and critique.
Communication and Critique
The position I have come to is, I suspect, neither new nor particularly
radical. It combines communication with critique. For practitioner
researchers I want to communicate my ideas in a form they may be more
ready to access, to aid the search for relatability. This is not to be
patronising. Practitioner researchers are not some inferior breed of
researcher who need spoon-feeding, but perhaps their motivations and
positioning can be seen as different. Certainly in my case, the values of
practice, for instance, are pre-eminent.
Academic rigour, for me, is not as crucial as relevance and
accessibility (although they need not necessarily conflict). For casework, as
opposed to research, the ‘messy and unpredictable’ business (Kupferberg,
1996, p. 235) of my unique day-to-day practice will provide its own testing
ground. With regard to research, I would not wish to be cavalier with the
results of serious, dedicated researchers but ‘real situations demand [an] ...
approach which may escape the categories of applied science’ (KremerHayton, 1994, quoted in Kupferberg, 1996, p. 229). In a similar way, I may
wish to draw on a range of techniques and ideas in dealing with the messy
reality of my research practice.
Effectively communicating the outcomes of this inquiry to others
requires, I believe, that they feel some ‘resonance’; the writing
‘speaks to them’; they can ‘relate’ it to their situation. Full description of the
‘faltering reality’ of research or practice, presented as part of an ‘honesty
trail’, may contribute to that sense of ‘resonance’, although this is, for me,
more a question of faith at the present time. I do not have the evidence to
support the assertion: a catalogue of ‘errors’ might conceivably undermine
the credibility of the study in some eyes, although Lenzo (1995, p. 19) urges
a freeing from ‘the censorious hold of “science writing”’ (citing Richardson,
1994).
Cornett (1995) argues that ‘[d]oubt and uncertainty are the fuel that
drives ... research’ (p. 123) and that it is ‘[r]efreshing to find thoughtful
writings ... that vividly portray the intellectual and emotional struggles that
researchers face when they are truly concerned about truth value and
ethical behaviour in their work’ (p. 123), although ‘it is rare that authors
share their angst publicly’ (p. 123).
Accepting for now the value of frankness, to draw this discussion to a
close, I would like to focus on one last, key issue: the two-way nature of true
communication.
... Well, Almost Free
My problem as a writer is, in presenting an ‘honest’ account (indeed, any
account), how do I know it resonates or relates? Bassey has unfortunately
not developed his idea of relatability (Bassey, 1981, personal
communication). Feedback from papers and talks helps. Publication is
perhaps one index, but is not reliable. True evidence of resonance requires
the test of time: did people, in the end, pick it up?
However, even if I could find a supportive band, a clique of ‘believers’,
a group who felt ‘resonance’ with the ideas (being unflattering, a kind of
‘flat-earth society’, Phillips, 1992, p. 115) would that be convincing to the
unconverted? How far does it need to be? Would ‘[l]ocalized approaches
which develop from our unique contexts’ be acceptable, provided they were
not ‘corruptingly insular’ (Dadds, 1995, p. 113)? A universe of one is insular,
but in seeking support, how large a community do I need to communicate
with? How un-insular is un-insular enough? At the time of writing, the
difficulty remains unresolved.
If material is offered to a ‘community’; if it includes as much honesty
as I can muster and as much detail of what I did, as is consistent with being
readable and interesting; and if that community, over time, subjects this
work to stringent criticism (and does not accept a simple, immediate
‘consensus’ view) and finds it useful, then it may be possible in one sense,
to count the work as ‘valid’/‘worthwhile’/‘strong’. However, that process
may have a lifetime greater than the average PhD.
The End
Acknowledgements
I am deeply indebted to my supervisors Colin Biott and Sandy Wolfson,
together with an extensive ‘invisible college’ of relatives, friends, colleagues
and chance acquaintances.
Correspondence
Nigel Mellor, Educational Psychology team, Student and Pupil Services,
Chapel Lane Education Centre, Chapel Lane, Monkseaton, North Tyneside,
Tyne and Wear NE25 8AD, United Kingdom (nigel.mellor@ncl.ac.uk).
Notes
[1] ‘Managers ... manage messes’ (Ackoff, 1979, p. 99, quoted in Schön, 1983,
p. 16). Schön describes the ‘confusing “messes”’ of practice (p. 42). Lindblom
(1959) outlines a process of policy makers ‘muddling through’. Kupferberg
(1996) claims that ‘[c]haos and lack of structure is the first surprise of project
work’ (p. 237). Cook (1998) describes action researchers ‘bumbling, messing’
(p. 105). Spellman & Harper (1996) refer to the ‘[f]ailure, mistakes, regret and
other subjugated stories in family therapy’ (p. 205). Pava (1986) explains how
in planning there is an effective approach, which displays: ‘A disjointed,
undisciplined quality ... Goals and procedures are left unclear ... Action
precedes understanding’ (p. 631). The individual analyses presented by these
writers I will not address here. My point is that some concept of mess may be
necessary in exploring a widespread set of areas. There may be mess in the
problems that confront us; there may be mess in the way we tackle these. I
would wish, however, to rescue the term mess from its negative baggage: to
see messy approaches not as ‘sloppy’, but as difficult, requiring a high level of
skill; and to extract what structure I can from such approaches. As Cook
(1998) explains, ‘Mess is skilled – [a] very highly skilled process’ (p. 103). My
view of research in the social sciences as a messy process is reinforced by
these authors, many works explored earlier, and others noted in Mellor
(1999), such as Whyte (1955, methods appendix), Bannister (1981), Atkinson
et al (1991), Salmon (1992), Kleinman & Copp (1993), Stanley & Wise (1993),
Cornett (1995), Frost (1995).
[2] Compare, for instance, Winter (1998) and Bell (1987), to unearth the implicit
or explicit assumptions behind just two definitions.
[3] The project in the end is not anti-science, but anti- an unthinking ‘scientism’
and a simplistic view of what constitutes scientific method: ‘story book’
science.
[4] See Chalmers (1982) and my own attempts in Mellor (1999) at developing an
idea of ‘testing while protecting’. Drawing from Lakatos, Chalmers describes
how in the early stages a ‘research programme’ must be protected: ‘[it] must
be given a chance to realise its potential’ (p. 83). It must not be allowed to
crumble under the weight of immediate criticism; all new positions are bound
to be fallible: ‘An embarrassing historical fact for falsificationists is that if
their methodology had been strictly adhered to ... those theories generally
regarded as being among the best examples ... would have been rejected in
their infancy ... Newton’s gravitational theory was falsified by observation of
the moon’s orbit. It took almost fifty years to deflect this falsification onto
[other] causes’ (p. 66).
[5] Phillips (1992) criticises a consensual view of ‘truth’, arguing that a consensus
might form round a view such as the world is flat (see p. 115). In this,
however, he has the benefit of knowing the ‘truth’. He has a ‘god’s eye’ view.
What are we to make of a situation where the ‘truth’ is unknown and
whatever emerges may only do so after a long period of political and scientific
struggle and may, or may not, correspond to some ‘god’s eye’ truth?
Appreciating this point, for me – experiencing the real uncertainty of ‘not
knowing’ – meant engaging with present-day ‘hot’ science. I spent some time
examining the shifting discourse over the safety of GM foods as a prime
example where a convenient ‘god’s eye’ view was not to hand. The conclusion
of Phillips’ chapter is, for me, also somewhat unsatisfying. He holds up ‘truth’
as a ‘regulative ideal’ (p. 119, emphasis in original), much like Popper’s
famous piece on the search for the ‘mountain peak [of truth] permanently, or
almost permanently, wrapped in clouds’ (Popper, 1968, p. 226). This
unfortunately gives little guidance on what one is actually supposed to do in
the research setting, with this ‘regulative ideal’.
References
Ackoff, R. (1979) The Future of Operational Research is Past, Journal of the
Operational Research Society, 30, pp. 93-104.
Altrichter, H., Posch, P. & Somekh, B. (1993) Teachers Investigate their Work: an
introduction to the methods of action research. London: Routledge.
Appignanesi, R. & Garratt, C. (1995) Postmodernism for Beginners. Cambridge: Icon
Books.
Atkinson, B., Heath, A. & Chenail, R. (1991) Qualitative Research and the
Legitimization of Knowledge, Journal of Marital and Family Therapy, 17,
pp. 175-180.
Atkinson, S. (1994) Rethinking the Principles and Practice of Action Research: the
tensions for the teacher-researcher, Educational Action Research, 2, pp. 383-401.
Baldamus, W. (1972) The Role of Discoveries in Social Science, in T. Shanin (Ed.) The
Rules of the Game. London: Tavistock Publications.
Baldamus, W. (1976) The Structure of Sociological Inference. London: Martin
Robertson.
Bannister, D. (1981) Personal Construct Theory and Research Method, in P. Reason &
J. Rowan (Eds) Human Inquiry: a sourcebook of new paradigm research.
Chichester: John Wiley.
Bassey, M. (1981) Pedagogic Research: on the relative merits of search for
generalisation and study of single events, Oxford Review of Education, 7,
pp. 73-94.
Bell, J. (1987) Doing Your Research Project: a guide for first-time researchers in
education and the social sciences. Buckingham: Open University Press.
Chalmers, A.F. (1982) What is This Thing Called Science? 2nd edn. Buckingham:
Open University Press.
Cixous, H. (1993) Three Steps on the Ladder of Writing. New York: Columbia
University Press (Welleck Library Lectures, University of California, Irvine).
Clarke, J., Dudley, P., Edwards, A., Rowland, S., Ryan, C. & Winter, R. (1993) Ways
of Presenting and Critiquing Action Research Reports, Educational Action
Research, 1, pp. 490-492.
Cook, T. (1998) The Importance of Mess in Action Research, Educational Action
Research, 6, pp. 93-108.
Cornett, J.W. (1995) The Importance of Systematic Reflection: implications of a
naturalistic model of research, Anthropology and Education Quarterly, 26,
pp. 123-129.
Dadds, M. (1995) Passionate Enquiry and School Development: a story about teacher
action research. London: Falmer Press.
Fine, G.A. & Deegan, J.G. (1996) Three Principles of Serendip: insight, chance, and
discovery in qualitative research, International Journal of Qualitative Studies in
Education, 9, pp. 434-447.
Frost, D. (1995) Integrating Systematic Enquiry into Everyday Professional Practice:
towards some principles of procedure, British Education Research Journal, 21,
pp. 307-321.
Fuller, R. & Petch, A. (1995) Practitioner Research: the reflexive social worker.
Buckingham: Open University Press.
Gergen, K.J. (1985) The Social Constructionist Movement in Modern Psychology,
American Psychologist, March, pp. 266-275.
Green, J. (1982) A Dictionary of Contemporary Quotations. London: Pan.
Hampton, H. (1993) Behind the Looking Glass: practitioner research – who speaks
the problem? Educational Action Research, 1, pp. 257-273.
Hollingsworth, S. (1997) Killing the Angel in Academe: feminist praxis in action
research, Educational Action Research, 5, pp. 483-500.
House, E.R. (1980) Evaluating with Validity. Beverley Hills: Sage.
Jeffries, S. (1997) A Bit of the Other. Interview with Helene Cixous. Guardian, G2,
29 October, p. 4.
Kleinman, S. & Copp, M.A. (1993) Emotions and Fieldwork. London: Sage.
Kremer-Hayton, L. (1994) The Knowledge Teachers Use in Problem Solving Situations:
sources and forms, Scandinavian Journal of Educational Research, 38, p. 63.
Kupferberg, F. (1996) The Reality of Teaching: bringing disorder back into social
theory and the sociology of education, British Journal of Sociology of Education,
17, pp. 227-247.
Lather, P. (1993) Fertile Obsession: validity after poststructuralism, Sociological
Quarterly, 34, pp. 673-693.
Lenzo, K. (1995) Validity and Self-reflexivity Meet Poststructuralism: scientific ethos
and the transgressive self, Educational Researcher, 24, pp. 17-45.
Lindblom, C.E. (1959) The Science of Muddling Through, Public Administration
Review, 19, pp. 79-88.
Lomax, P. (1994) Standards, Criteria and the Problematic of Action Research within
an Award Bearing Course, Educational Action Research, 2, pp. 113-126.
Manicas, P.T. & Secord, P.F. (1983) Implications For Psychology of the New
Philosophy of Science, American Psychologist, April, pp. 399-413.
Marshall, C. & Rossman, G.B. (1995) Designing Qualitative Research, 2nd edn.
London: Sage Publications.
McGrath, J.E., Martin, J. & Kulka, R.A. (1981) Some Quasi-Rules for Making
Judgement Calls in Research, American Behavioral Scientist, 25, pp. 211-224.
Melia, K.M. (1996) Rediscovering Glaser, Qualitative Health Research, 6, pp. 368-378.
Mellor, N. (1998) Notes from a Method, Educational Action Research, 6, pp. 453-470.
Mellor, N. (1999) From Exploring Practice to Exploring Inquiry: a practitioner
researcher’s experience. PhD thesis, University of Northumbria at Newcastle (to
appear at www.staff.ncl.ac.uk/nigel.mellor).
Miles, M.B. & Huberman, A.M. (1994) Qualitative Data Analysis, 2nd edn. London:
Sage.
Minkin, L. (1997) Exits and Entrances: political research as a creative act. Sheffield:
Sheffield Hallam University Press.
Mishler, E.G. (1990) Validation in Inquiry-guided Research: the role of exemplars in
narrative studies, Harvard Educational Review, 60, pp. 415-442.
Mitroff, I.I. (1983) The Subjective Side of Science: a philosophical inquiry into the
psychology of the Apollo moon scientists. Seaside: Intersystems Publications.
Norris, N. (1997) Error, Bias and Validity in Qualitative Research, Educational Action
Research, 5, pp. 172-176.
Pava, C. (1986) New Strategies of Systems Change: reclaiming nonsynoptic methods,
Human Relations, 39, pp. 615-633.
Phillips, D.C. (1992) The Social Scientist’s Bestiary: a guide to fabled threats to, and
defences of, naturalistic social science. Oxford: Pergamon.
Popper, K. (1968) Conjectures and Refutations. New York: Harper Torchbooks.
Ravetz, J.R. (1971) Scientific Knowledge and its Social Problems. Oxford: Clarendon
Press.
Reed, J. & Biott, C. (1995) Evaluating and Developing Practitioner Research, in
J. Reed & S. Procter (Eds) Practitioner Research in Health Care. London:
Chapman & Hall.
Reed, J. & Procter, S. (Eds) (1995) Practitioner Research in Health Care. London:
Chapman & Hall.
Richardson, L. (1990) Writing Strategies: reaching diverse audiences. London: Sage.
Richardson, L. (1994) Writing: a method of inquiry, in N.K. Denzin & Y.S. Lincoln
(Eds) Handbook of Qualitative Research. London: Sage.
Robson, C. (1993) Real World Research: a resource for social scientists and
practitioner-researchers. Oxford: Blackwell.
Salmon, P. (1992) Achieving a PhD – ten students’ experience. Stoke-on-Trent:
Trentham Books.
Scheurich, J.J. (1996) The Masks of Validity: a deconstructive investigation,
International Journal of Qualitative Studies in Education, 9, pp. 49-60.
Schön, D.A. (1983) The Reflective Practitioner. Aldershot: Avebury.
Spellman, D. & Harper, D.J. (1996) Failures, Mistakes, Regret and Other Subjugated
Stories in Family Therapy, Journal of Family Therapy, 18, pp. 205-214.
St. Pierre, E.A. (1997) Methodology in the Fold and the Irruption of Transgressive
Data, International Journal of Qualitative Studies in Education, 10, pp. 175-189.
Stanley, L. & Wise, S. (1993) Breaking Out Again: feminist ontology and epistemology.
London: Routledge.
Staw, B.M. (1981) Some Judgements on the Judgement Call Approach, American
Behavioral Scientist, 25, pp. 225-232.
Thomas, G. (1998) The Myth of Rational Research, British Educational Research
Journal, 24, pp. 141-161.
Tickle, L. (1995) Testing for Quality in Educational Action Research: a terrifying
taxonomy, Educational Action Research, 3, pp. 233-237.
Usher, R. & Edwards, R. (1994) Postmodernism and Education. London: Routledge.
Whyte, W.F. (1955) Street Corner Society: the social structure of an Italian slum, 2nd
edn. Chicago: University of Chicago Press.
Winter, R. (1998) Managers, Spectators and Citizens: where does theory come from in
action research? Educational Action Research, 6, pp. 361-376.
Wolcott, H.F. (1990) On Seeking – and Rejecting – Validity in Qualitative Research, in
E.E. Eisner & A. Peshkin (Eds) Qualitative Inquiry in Education: the continuing
debate. New York: Teachers College Press.