Teachers' Application of Error Analysis in the Context of Learner Interviews
School of Education, University of the Witwatersrand
Funded by the Gauteng Department of Education
17 January 2012
Updated June 2013
School of Education
University of the Witwatersrand
27 St Andrews’ Road
Parktown
Johannesburg 2193
http://www.wits.ac.za/academic/humanities/education/l
© School of Education, University of the Witwatersrand
This work is licensed under a Creative Commons Attribution-NonCommercial 2.5 South Africa License.
Permission is granted under a Creative Commons Attribution-NonCommercial license to replicate, copy, distribute,
transmit, or adapt this work for non-commercial purposes only provided that attribution is provided as illustrated in the
citation below.
To view a copy of this licence visit http://creativecommons.org/licenses/by-nc/2.5/za/deed.en .
Citation:
Shalem, Y., & Sapire, I. 2012. Teachers’ Application of Error Analysis in the context of Learner Interviews. Johannesburg: Saide.
Contents page
Tables
Acknowledgements
Introduction
Report Plan
Section One: Activity Process
   1.1 Process
      1.1.1 Time line
      1.1.2 The Process
Section Two: Methodology
   2.1 Aim of the evaluation
   2.2 Interviews evaluated
   2.3 Evaluation Instruments
      2.3.1 The coding template
      2.3.2 The five activities that describe the teachers' participation in the interview, and their respective codes
      2.3.3 The activity that describes learner participation
   2.4 Training of coders and coding process
   2.5 Validity and reliability check
   2.6 Data analysis
Section Three: Quantitative analysis
   3.1 Overall picture
      3.1.1 Time spent on interview activities
      3.1.2 Teacher participation
      3.1.3 Learner participation
   3.2 Presence of the coded teacher interview activities
      3.2.1 Procedural Activity
      3.2.2 Conceptual Activity
      3.2.3 Awareness of error Activity
      3.2.4 Diagnostic reasoning Activity
      3.2.5 Use of everyday Activity
      3.2.6 Multiple explanations
   3.3 Quantitative Findings
Section Four: Qualitative analysis
   4.1 Sample
   4.2 Analysis of learner interviews
      4.2.1 Grade 4 learner interview
      4.2.2 Grade 5 learner interview
      4.2.3 Grade 7 learner interview
      4.2.4 Grade 9 learner interview
   4.3 Qualitative Findings
Section 5: Conclusion
   Recommendations for professional development and for further research
      Recommendations for professional development
      Recommendations for further research
References
Appendices
   Appendix 1
   Appendix 2
   Appendix 3
   Appendix 4
   Appendix 5
Tables
Table 1: Setting own test and conducting learner interviews
Table 2: Sample summary
Table 3: Percentage of time spent per activity across the interview sample set
Table 4: Percentage of time spent per activity across the interview sample set
Table 5: Percentage of time coded full over total interview time
Table 6: Percentage of time coded inaccurate over total interview time
Table 7: Percentage of time coded full and partial over total interview time
Table 8: Activity criteria abbreviations
Acknowledgements
This report was prepared by:
Professor Yael Shalem (Wits School of Education)
Ingrid Sapire (Wits School of Education)
Tessa Welch (Saide)
Maryla Bialobrzeska (Saide)
Liora Hellmann (Saide)
We would like to thank the following people for their role in the preparation of the
report:
Nico Molefe (Maths Education Consultant)
Bronwen Wilson-Thompson (Maths Education Consultant)
The project team for DIPIP Phases 1 and 2
Project Director: Prof Yael Shalem
Project Leader: Prof Karin Brodie
Project coordinators: Ingrid Sapire, Nico Molefe, Lynne Manson
The project team and evaluators would like to thank the Gauteng Department of
Education and in particular Reena Rampersad and Prem Govender for their support of
the project.
We would also like to thank all of the teachers, departmental subject facilitators, Wits
academic staff and students and the project administrators (Karen Clohessy, Shalati
Mabunda and team) who participated over the three years of the project.
Introduction
The learner interview was one of the two error-focused activities in DIPIP (Data Informed Practice Improvement Project) phases 1 and 2. The other error-focused activity was an error analysis of the evaluation data (ICAS 2006 and 2007), which included mainly multiple-choice items and a few open items. Additional open items were selected from the tests that the teachers designed in their small groups in the last six months of the project. The results of the groups' error analysis are reported in the Teachers' Knowledge and Error Analysis report (Shalem & Sapire, 2012).
During the final project activity (test setting and learner interviews) the teachers were
given an opportunity to design an open questions test, administer it in class, and
conduct an error analysis on the results. The purpose of the activity was to give the
teachers experience in designing their own assessments, which they could then analyse
in terms of learners’ errors and misconceptions. Groups were asked to set their tests on
misconceptions associated with the equal sign or with problem solving and visuals
(which related to the teaching activities done earlier in the project) (Shalem, Sapire,
Welch, Bialobrzeska & Hellman, 2011). In order to deepen this exploration groups were
asked to follow the error analysis with a learner interview, in which they could interact
with one (or more) learner and work with the learner diagnostically to establish what
the error is about. Instead of hypothesizing about the reasons for learners’ errors evident
in a large scale systemic test, the teachers could explore with their own learners the
reasons for their errors in the test.
The aim of the interview was to understand how teachers interact with learners about
mathematical knowledge in the context of a conversation about errors. The idea of
the learner interview broadly follows what Young (2005) calls "the think-aloud method". According to Young, this method intends to capture human thoughts as the data is collected. "The think-aloud method" can be applied in two ways: while the participant is undertaking an activity or after its completion (p20). The central aim of "the think-aloud method" is to capture what the subject (the learner) is actually doing (ibid) and why. There are two advantages of this method which are important for the DIPIP project. Unlike evaluation data, where a teacher needs to figure out what the sample of learners might have been thinking when they wrote something down on paper, the "think-aloud method" enables the teacher to act as a researcher and, through making statements, asking questions and explaining, to interrogate the learner's way of thinking. In this way the teacher can capture the sequence of the learner's thinking and its depth, the specific errors of which the learner is not aware, his/her misrecognition of key conceptual links, difficulties he/she experiences regarding the visual or other problem-solving skills a test question may require, or his/her misunderstanding of the instruction in the question. In
addition to the individual feedback time that a learner interview makes possible, the
idea here is that through appropriate engagement with the teacher, the learner gets an
opportunity to verbalise his/her thinking and the teacher gets an opportunity to
understand the learner’s difficulty in relation to the acquisition of the mathematical
content at hand. The main limitation of this method is that learners (or teachers) “who
are less capable of thinking about their own thinking will be less capable of reporting on
it” (p25). Another disadvantage (Sheinuk, 2010, p18) is “that not all thought is accessible
at all times and issues of language and articulation can impede the mental processes of
the participants from being accurately reported.”
Taking the above limitations into account, it is still important to acknowledge the value of these kinds of feedback situations in teaching, in particular for 'assessment for learning' (Gipps, 1999; Black et al., 2003; Gipps and Cumming, 2004; Shepard, 2000; Black
and Wiliam, 2006). One of the most challenging aspects of teaching is the transmission of
criteria. Researchers (Charlot, 2009) emphasise the importance of evaluative feedback for
helping learners recognise what counts as a legitimate text and for helping them develop
epistemic means to realise these texts in the future.1 Feedback practices work with
learners’ responses to help them both understand why they are not yet meeting criteria
and how they might begin to construct more appropriate responses. In this way,
feedback can be used to promote what Vygotsky (1987) called a ‘Zone of Proximal
Development’ by both recognising what the learner produces and articulating it in line
with more powerful forms of knowledge (Shalem and Slonimsky, 2010).
1 Bernstein (2000) distinguishes between 'recognition rules', which refer to the learner's ability to classify legitimate meanings – that is, to know what meanings fall outside a theoretical model and what may/may not be put together – and 'realisation rules', which refer to the acquirer's ability to produce/enact what counts as a legitimate text (2000, 16 and 209).
Report Plan
Section one
Activity and process: in this section we discuss the rationale for the learner interview
activity, its aim, its context within the DIPIP project, and the instructions teachers
received before the activity.
Section two
Methodology: in this section we describe the process of the evaluation: the sample, the aspects of the activity we selected for the evaluation and the codes we developed for them, the coding instrument, the training of the coders, the coding process and, lastly, the validation process.
Section three
Quantitative analysis: in this section we provide a graphical presentation of the coded interview sample, with findings. We address broad questions about the overall quality of the interview data. We present the quantitative data for the 13 teachers selected for the evaluation. Where appropriate we note the differences between the grouped grades – the six Grade 3-6 teachers and the seven Grade 7-9 teachers.
Section four
Qualitative analysis: for this section we selected four interviews, two from each of the grouped grades. In the analysis of these interviews we intend first to demonstrate our coding, so that the reader understands how codes and categories were assigned to teachers' utterances. Second, we describe explicitly how the different activities in a learner interview (five of which are led by the teacher and one by the learner) coalesce and together create a successful or a weak interview. This will help us deepen our central question for the evaluation of this activity, which is: What does the idea of teachers interpreting learner performance diagnostically mean in the context of a learner interview? We also consider other important questions relating to a diagnostic learner interview, such as: What do teachers, in fact, do when they interpret errors with the learner present? And what are the benefits for the learner?
Section five
We conclude with lessons to be learned from the project and recommendations both for
professional development and research.
Section One: Activity Process
1.1 Process
1.1.1 Time line
14 weeks: March to August 2010
The learner interview activity was part of the final project activity which involved
groups setting and marking tests for their classes, analysing the errors made by learners
when they wrote the tests and selecting particular learners to interview, based on the
errors they had made when writing the test. The preparation for the interviews, filming
of the interviews conducted in schools and subsequent large group presentations of
learner interviews took place over a period of 14 weeks between March and August
2010.
1.1.2 The Process
In their small groups, the teachers marked all of the written tests, looking out for
interesting learner responses. Then through a discussion in the group, they selected
three learners for a learner interview. The groups were requested to motivate the choice
of questions for the interview (in relation to the learners’ work) by describing:
• the mathematical concept the question is addressing
• the misconception evident in the learner's work
• the way it is exemplified in the learner's work, and
• what questions the learner could be asked to expose her/his thinking more fully.
Before the interviews, the groups discussed the following:
• ways of creating a conducive atmosphere for the interview, such that learners will feel safe enough to verbalise their thinking and mode of reasoning
• ways of avoiding leading questions
• watching one's body language and allowing oneself to be puzzled by the learner's answer rather than being critical of it, and
• types of questions that can be used to prompt a learner to talk.
All the learner interviews were videotaped. The interview was divided into two parts.
In the first part, the teacher and learner discussed the item, and the learner’s reasoning
in addressing the question. In the second part, the teacher set a mathematical problem
similar to the item already discussed.
The last part of this process was a presentation by the small groups to the large group.
Each group was asked to watch the recordings of all the learner interviews conducted in the group and select different episodes for different categories (for example, a moment when a teacher was impressed by, surprised by, or unsure about the learner's thinking during the interview, or when something planned/unplanned happened and how they handled it). Once the reflection was completed, the interviewer from the group selected one episode from her/his own interview for presentation to the large group (an episode where it was difficult to understand the learner's thinking). In the presentation to the large group, the interviewer needed to justify the selection, explaining why it was difficult to understand the learner's thinking in the chosen episode.
Table 1: Setting own test and conducting learner interviews

Material: Guideline on test setting
Group type: 11 small groups (2 Grade 3, 1 Grade 4, 2 Grade 5, 1 Grade 6, 1 Grade 7, 2 Grade 8, 2 Grade 9)
Tasks: Setting a test. Groups were asked to
• design a 6 item test (not more than 3 items to be multiple choice)2
• explain the rationale for the test (the concepts and procedures selected as a test focus, their relevance to the curriculum and the misconceptions anticipated)
• specify marking criteria for the test.

Material: Readings on assessment
Group type: At home
Tasks: Readings on assessment. Three chapters from a book on assessment3 were given to all group members as a resource. The reading of these chapters was not structured and group members were asked to read them at home in order to familiarise themselves with the ideas and be able to apply them in their test setting activity.

Material: Verbal instructions to groups
Group type: Large groups
Tasks: Prepare for presentation to large group. Groups were asked to
• prepare their tests and rationales for a presentation to the large group
• at the presentations, make notes of comments and feedback that applied to their group.

Material: Verbal instructions to groups
Group type: Small groups
Tasks: Revision of test. On the basis of feedback the groups revised their tests. The groups were required to explain the changes they made to the test.

Material: Guideline on choosing a learner for an interview
Group type: At school and in small groups
Tasks: Marking own test and selecting a learner for an interview. Teachers were asked to
• administer the test to one of their classes
• mark all of the written tests, looking out for interesting learner responses
• back in their groups, select three learners that they identified for a learner interview (see below)
• motivate the choice of question for the interview (in relation to the learners' work) by describing:
   – the mathematical concept the question is addressing
   – the misconception evident in the learner's work
   – the way it is exemplified in the learner's work
   – what questions the learner could be asked to expose her/his thinking more fully.

Material: Guidelines for learner interview
Group type: Small groups
Tasks: Preparation for learner interview. The aim of the interview was not to fish for the right answer but for the teacher to understand the learner's reasoning. The groups were to consider the following:
• ways of creating a conducive atmosphere for the interview
• ways of avoiding leading questions
• watching one's body language and allowing oneself to be puzzled by the learner's answer rather than being critical of it
• types of questions that can be used to prompt a learner to talk.

Material: Interview Protocol
Group type: Small groups and teachers' classrooms
Tasks: Learner interviews were in two parts:
1. The teacher and learner discuss the item, and the learner's reasoning in addressing the question.
2. The learner is asked to do a mathematical problem similar to the item already discussed.
The learner interviews were videoed.

Material: Learner interview presentation guide
Group type: Small groups and large groups
Tasks: Reflections on learner interviews and preparation for presentations. Each group was asked to
• watch the recordings of all the learner interviews conducted in the group
• select different episodes for different categories of episodes (eg a time that a teacher was impressed by, surprised by, or unsure about the learner's thinking during the interview, or when something planned/unplanned happened and how they handled it).
In addition to this, each volunteer interviewer was asked to
• select one episode (from her/his own interview) for presentation to the large group (an episode where it was difficult to understand the learner's thinking)
• justify the selection (reasons for selection and why it was difficult to understand the learner's thinking).

2 All groups designed tests with 6 questions. Several of them had questions that were broken up into sub-questions. Only one group included a multiple choice item (one).
3 Linn, RL and Miller, MD (2005) Measurement and Assessment in Teaching (New Jersey, Pearson Prentice Hall), Chapters 6, 7 and 9.
In this report we focus on the learner interviews. As in the error analysis of evaluation data, we aim to answer the following questions:
• What does the idea of teachers interpreting learner performance diagnostically mean in the context of a learner interview?
• What do teachers, in fact, do when they interpret errors with the learner present, and what are the benefits for the learner?
• In what ways can what they do be mapped onto the domains of teacher knowledge?
Section Two: Methodology
2.1 Aim of the evaluation
We aim to evaluate the quality of the application of error analysis as evidenced in the learner interviews carried out by the teachers. In particular, we aim to evaluate the quality of teachers' reasoning about errors when they are in a one-on-one conversation with a learner.
2.2 Interviews evaluated
For the purpose of analysis of teachers conducting learner interviews we selected a
sample of 13 interviews from teachers who taught in Round 1, Round 2 and Round 3.
The teachers whose interviews we selected corresponded to the teachers from the
respective rounds whose classroom lessons were to be evaluated for the purposes of the
report on the application of error analysis and diagnostic reasoning in classroom
teaching.
Table 2: Sample summary

Full set of data to be considered: Round 1 teachers: learner interviews conducted by all teachers participating in the learner interview activity. (Five of the seven teacher groups that taught in Round 1 took part in the learner interviews.)
Sample: Five interviews – one each from Grades 3, 5, 7, 8 and 9

Full set of data to be considered: Round 2 teachers: learner interviews conducted by all teachers participating in the learner interview activity. (Six teacher groups that taught in Round 2 took part in the learner interviews.)
Sample: Six interviews – one each from Grades 3, 4, 5, 6, 7 and 9

Full set of data to be considered: Round 3 teachers: learner interviews conducted by all teachers participating in the learner interview activity. (Two of the three teacher groups (Grade 7, 8 and 9) that taught in Round 3 took part in the learner interviews.)
Sample: Two interviews – one each from Grades 8 and 9
The teachers all took part in the interview activity at the same time although their
experience in the teaching activities took place over different times, according to the
round in which they did their teaching. Not all teachers participated in the interview
activity, since two teachers were unable to arrange a convenient time for the interview within the time limits of the activity.
2.3 Evaluation Instruments
This section gives detail on the instrument used in the interview analysis. An overview of the coding template is followed by a more detailed explanation of each of the activities coded for when using the instrument.
2.3.1 The coding template
Six evaluation criteria were drawn up by the DIPIP team based on the error analysis
criteria (Shalem & Sapire, 2012) and in line with activities that describe the teacher’s and
the learner’s participation in the interview (see Appendix 1).
A coding template was prepared for the coders so that coders could enter their codes for
each minute of each interview in a similar manner. The coding template included
column headers for each minute to be coded and row headers according to the criteria
used for coding (see Appendix 2).
Coding Instrument overview

General information on interviews
• Teacher name
• Grade
• Interview topic
• Summary – an overview of the interview content and general flow, written after the full interview was coded

Criterion code rows to code teacher participation
• Procedural (Proc)
• Conceptual (Con)
• Awareness of error (Awa)
• Diagnostic (Diag)
• Use of everyday (ED)

Coding Categories4
• Not present
• Inaccurate
• Partial
• Full

4 In the coding sheet, the categories were numbered Full (4), Partial (3), Inaccurate (2) and Not present (1).

Criterion code row to code learner participation
• Multiple explanations (Mult)

Coding categories
A simple count of the number of different explanations of errors given by learners during their interviews was used to code for multiple explanations. The following categories were coded:
• One explanation
• Two explanations
• Three explanations
• Four or more explanations
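To make the structure of the coding template concrete, the short sketch below shows one possible way of representing a minute-by-minute coding grid for a single interview in Python. It is an illustration only: the data structure, the sample values and the helper function are invented for this example and are not taken from the project's actual coding sheets.

# A minimal sketch (Python) of a minute-by-minute coding grid for one interview.
# Rows are the five teacher-participation criterion codes; columns are interview
# minutes. Cell values follow the sheet's numbering (4 = Full, 3 = Partial,
# 2 = Inaccurate, 1 = Not present); None stands for a blank cell, i.e. no
# activity coded for that criterion in that minute. All values are invented.
CATEGORIES = {4: "Full", 3: "Partial", 2: "Inaccurate", 1: "Not present"}

interview_grid = {
    "Proc": [3, None, 3, 4, None, 3],
    "Con":  [None, 2, None, None, 3, None],
    "Awa":  [None, None, 1, None, None, 3],
    "Diag": [3, 3, None, 3, 3, 3],
    "ED":   [None, None, None, None, None, None],
}

def minutes_coded(row):
    # Count the minutes in which any activity was coded for this criterion.
    return sum(1 for cell in row if cell is not None)

for criterion, row in interview_grid.items():
    print(criterion, minutes_coded(row), "of", len(row), "minutes coded")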
2.3.2 The five activities that describe the teachers' participation in the interview, and their respective codes
• Procedural activity – Interacting with learners on errors involves a great deal of discussion of procedural activity, in which statements, guiding instructions, questions and explanations are made by the teacher in order to clarify the procedure that is involved in the mathematical question. Procedural explanations need to be given with sufficient clarity and accuracy if the learners are to grasp the procedures and become competent in performing them. A Proc code was assigned every time the teacher addressed the learner on how to do the mathematical working required by the question, including what steps to take and in what order. For example (Grade 8 interview):
Teacher: Ok so you've got a 1, a 2 and a 6; and a nine (points to 1st row); and a 4,
a 5 and another 9 (points to 2nd row). And I want you to make as many different
true sentences as you can using only these numbers. Now you can use different
symbols, so you can use plus and minus and brackets and all that kind of stuff.
Learner: Ok.
Teacher: But if you can, just write down as many as you can think of, um, I don’t
know, in about one minute, then let’s see what you get.
Learner: Does it have to equal to 9 or can it be any of these numbers?
Teacher: No, no it can be anything. You can use them in any way you like.
Learner: Ok.
Four categories were selected to code the quality of the procedural activity conducted by the teachers during the interview: not present, inaccurate, partial and full. The teachers were not expected to engage procedurally during every minute, and so a "time count" for the code of not present for procedural activity is not meaningful in the context of the minute-by-minute coding of the interviews. However, for the overall interview, the category not present means that the teacher did not engage in procedural activity at all in the full course of the interview.
• Conceptual activity – Mathematical procedures need to be unpacked (Shalem & Sapire, 2012) and linked to the concepts to which they relate in order for learners to understand the relations between mathematical constructs embedded in the procedure. Interacting with learners on errors involves a focused conceptual activity whereby statements, questions and explanations made by the teacher conceptually open up (or fail to open up) the procedure that is involved in the mathematical question. The emphasis of this activity is on the teacher's conceptual work in relation to the procedure required by the question. A code Con was assigned every time the teacher was seen to attempt to conceptually unpack the procedure or an aspect of it, in order to illuminate the background and process of the procedure. For example (Grade 3 interview):
Learner: I was trying to draw the seats.
Teacher: Ah I think here you had the right idea didn’t you? (Learner nods.)
Alright. What type of sum do you think it might be?
Learner: It would be like a times.
Teacher: You think it’s like a times. How do you think you might … do you
think you might work it out differently now?
Learner: You could do it as times.
Teacher: You think you could do it as times. Show me how you think you might
be able to do it as times. Take a pencil. Show me how you’ll do it as times.
Four categories were selected to code the quality of the conceptual activity conducted by the teachers during the interview: not present, inaccurate, partial and full. The teachers were not expected to engage conceptually during every minute, and so a "time count" for the code of not present for conceptual activity is not meaningful in the context of the minute-by-minute coding of the interviews. However, for the overall interview, the category not present means that the teacher did not engage in conceptual activity at all in the full course of the interview.
• Awareness of error activity – This is a particularly important criterion which needs to be explained more fully in the context of learner interviews. Learner interviews
to be explained more fully in the context of learner interviews. Learner interviews
include different activities (we included five different activities), and these activities
take place relationally. They are used to reinforce each other and to clarify specific
ideas expressed by the teacher or the learner during the conversation. Nevertheless,
an interview in which the nature of the error does not become established in the
course of the conversation has no real educational value. A code Awa was assigned
specifically to statements made by the teacher that demonstrate her/his attempt/s
(direct or indirect) to establish the error around which the conversation is focused.
Establishing what the error is about is a delicate pedagogical matter since the aim of
the interview is for the teacher to probe the learner’s reasoning and not simply to
mark and state the error for the learner. This means that the code cannot be applied
to every minute of the interview. Nevertheless, at particular moments in the
interview and in response to what the learner verbalizes, the teacher is expected to
establish what the error is about. This is particularly so in view of the general
emphasis in the DIPIP project that learners’ errors are linked to general
misconceptions that need to be uncovered. The emphasis of this code is on teachers’
expressions to the learner of what the error is about, in response to what is
verbalized by the learner in the course of the interview.5 For example (Grade 9
interview):
Learner: As in for twenty-five (points to number on horizontal line), I did a two
point five, to, two hundred and fifty (points to number on vertical line), because
there was no twenty-five. So I used these variables (points to numbers on horizontal
line) just to say this is five, even though it’s zero comma five, I said this is five,
and like that…
Teacher: Ok, so you ignored the zero (points to zero) and…
Learner: Yes.
Teacher: Actually you multiplied by what? Multiplied by ten or the…?
Learner: No, I didn’t multiply it, I just ignored the (crosses out a number on
horizontal line)…
Teacher: You removed the zero.
Four categories were selected to code the quality of the awareness of error activity conducted by the teachers during the interview: not present, inaccurate, partial and full. In this code, not present refers to the minutes in the interview where the teacher should have, but did not, take an opportunity to establish the error in response to what the learner verbalized.

5 The code does not tell us whether the teacher is aware of the error or not (the teacher may withhold it intentionally), but rather whether the teacher conveys it coherently to the learner. Notwithstanding, in some places we draw implications about teachers' awareness of error from what they say.
• Diagnostic reasoning activity – The idea of diagnostic reasoning on the part of the teacher during error analysis involves the teacher going beyond the actual error to try to follow the way the learner was reasoning when he/she made the error. Through probing questions teachers can attempt to understand the way a learner was reasoning when he/she was solving the question. Much time in an interview can be spent on probing the learner's reasoning, but this does not in itself mean that the interview has a high quality of diagnostic activity. What matters is to understand the relation between the five activities and how, together, they turn the probing into a successful interview. A code Diag was assigned particularly to probing statements and questions that teachers use in order to pursue the learner's reasoning. For example (Grade 3 interview):
Learner: I thought I had to plus over there
Teacher: yes we plus there but when we get to your working out I don’t see your
subtraction symbol
Learner: I did not know if I had to make the sum a plus or a minus
Teacher: So you didn’t know if you had to plus or minus. Ok so for this sum you
were able to say 65-19+17. So you only struggled here?
Learner: yes maam
Teacher: So that's why when you get here you better add the whole thing rather
than showing your subtraction first?
Learner: yes maam
Teacher: so your problem is that you didn’t start subtracting and then only later
on adding
Four categories were selected to code the quality of the diagnostic reasoning activity conducted by the teachers during the interview: not present, inaccurate, partial and full. In this code the category not present refers to the minutes in the interview in which the teacher should have probed but did not, or in which her/his probing did not engage with the learner's reasoning behind the error.
• Use of everyday activity – Teachers often explain why learners make an error by appealing to everyday experiences that learners draw on and confuse with the mathematical context of the question. A code of ED was assigned every time the teacher drew on everyday experiences to explain mathematical content. The emphasis of this code is on the quality of the teacher's use of the everyday, judged by the links he/she makes to the mathematical understanding he/she attempts to advance when engaging the learner on her/his incorrect answer. For example (Grade 6 interview):
Teacher: No, but I say … if you’re saying you were supposed to stop here (points
out line) and you stopped there (points out line), I need you to explain to me why?
Ok…turn this around (turns page), what does it look like to you, if I turn that
around? These lines, what do you think they look like?
Learner: Rows.
Teacher: They look like my…?
Learner: Rows.
Teacher: My rows, ok. All my rows now. Where else will you see those little …
rows or little markings, or little demarcations?
Learner: In the table.
Teacher: On my…?
Learner: Table.
Teacher: Table?
Learner: Ruler.
Teacher: On my…?
Learner: Ruler.
Teacher: Ruler. Ok, look at my ruler very carefully and study it very carefully,
where do I start with my ruler (child points at one on ruler)? What number do I
start?
Learner: Zero.
Teacher: I start at zero (points to zero on ruler). From zero where do I go (points at
one on ruler)?
Learner: One.
Teacher: I go to one. And then (points to two on ruler)?
Learner: Two.
Teacher: And then (points to three on ruler)?
Learner: Three.
Four categories were selected to code the quality of the use of everyday activity conducted by the teachers during the interview: not present, inaccurate, partial and full. The teachers were not expected to engage with everyday explanations during every minute, and so a "time count" for the code of not present for everyday explanations is not meaningful in the context of the minute-by-minute coding of the interviews. However, for the overall interview, the category not present means that the teacher did not engage in everyday explanations at all. It may not have been appropriate in the context of the interview for the teacher to engage with everyday explanations, but this was beyond the scope of this report and would need further research to be properly investigated.
2.3.3 The activity that describes learner participation:
Multiple explanations stated by the learner – One of the challenges in learner
interviews is for teachers to give their learners the opportunity to state explanations of
the error. This is because when learners verbalise the thinking behind their errors and
provide claims about the mathematics they do, they provide teachers with material with
which they can work. This is the core aim of the ‘think-aloud’ method of research, as
explained above. The only aspect of learner participation in the interview that was
noted by this code was the number of explanations of different errors the learner stated
during the interview. The learner stated her/his explanations in response to teacher
probing during the interview. The number of different errors stated by the learner is
linked to the number of different topics raised by the teacher in the interview. The more
focused the interview, the fewer different errors are discussed in the course of the interview. For example, one of the best interviews was a Grade 4 interview where the focus throughout the interview was on the use of place value to understand how to subtract using the vertical algorithm (see the qualitative analysis section of this report). One of the weaker interviews was a Grade 9 interview where the teacher jumped from one
concept to another, introducing many errors (and related explanations) but in relation to
different concepts and without links between the concepts discussed in the interview
(see the qualitative analysis section of this report).
A code of Mult was assigned each time the learner gave voice to his/her explanation of his/her own error. All explanations were coded as feasible explanations of the error, since they were stated by the learner and represent the thinking of the learner as he/she speaks in the interview. The teacher was not the focus in this code, although the teacher's
responses to these error explanations receive attention in the other criteria. For a list of
some of the explanations given by learners in the sample interviews see Appendix 4.
2.4 Training of coders and coding process
Two external coders, both experts in the field of maths teacher education, coded the learner interviews. The two coders watched and coded interviews after discussion with the team. They raised queries with the team leaders, after which a meeting was held in which all coders sat together, watched an interview and discussed coding decisions. After this no further combined meetings were held, but individual coders continued to discuss queries as they worked through the video set. Both coders coded the full set of videos.
Each interview was coded minute by minute for each of the six activities (five on teacher participation and one on learner participation). It was not expected that each minute would receive a code on all six activities: some minutes received no codes, and some received one or more of the codes. When a code was assigned it was recorded with its relevant category descriptor, for example Conceptual: partial.
The spread of the four categories across the interview for each of the codes enabled us to allocate an overall category per interview. There were difficulties with the minute-by-minute coding, since it requires professional judgement to be made for each of these intervals throughout the course of the interview, but it did enable some insight into the types of activities the teachers engaged in during the course of the interviews that would otherwise not have been possible had the interviews only been given a global code per criterion.
Coders had to enter a code of 1 to 4 (representing the four categories not present,
inaccurate, partial and full) for each minute in which they noted activity on any of the
criteria. The coders inserted their codes in each column, per minute, according to the
given row heading for criterion coding. Cells were to be left blank when activity was not
present. Coders were required to make detailed notes in which they justified their
coding. For this purpose they were given another template consisting of two tables (see
Appendix 3). They completed these two tables for each of the coded interviews.
2.5 Validity and reliability check
In the overall set of coding there were similar patterns in the coding sets of the two coders, although professional judgement was not always exercised in the same way, and so the minute-by-minute codes were not always assigned in the same way. This was partly because coders did not necessarily make the same breaks between minutes when they watched the videos.6 A third coder arbitrated the codes and produced a final set of coding that was used for the quantitative analysis.
For purposes of alignment, a final overall code was assigned to each interview for each of the interview activity criteria, and these overall codes were compared between coders. Ultimately there was just over 70% alignment between the three coders, which was considered adequate.
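As an illustration of how an alignment figure of this kind can be computed, the sketch below counts the proportion of (interview, criterion) cells on which two coders assigned the same overall category. The data and the simple matching rule are invented for the example; the report does not spell out the exact alignment calculation that was used.

# Illustrative sketch only: percentage alignment between two coders on the
# overall codes (one category per interview per criterion). The entries below
# are invented toy data, not project data.
coder_a = {("interview 1", "Proc"): "Partial", ("interview 1", "Con"): "Not present",
           ("interview 2", "Proc"): "Full",    ("interview 2", "Con"): "Partial"}
coder_b = {("interview 1", "Proc"): "Partial", ("interview 1", "Con"): "Inaccurate",
           ("interview 2", "Proc"): "Full",    ("interview 2", "Con"): "Partial"}

shared_cells = set(coder_a) & set(coder_b)
agreements = sum(1 for cell in shared_cells if coder_a[cell] == coder_b[cell])
alignment = 100 * agreements / len(shared_cells)
print(f"{alignment:.0f}% alignment")  # 75% for this toy data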
2.6 Data analysis
The analysis of the coded data was done in three parts. The first is a quantitative
analysis of the 13 interviews in order to get a broad picture of how the teachers handled
the interviews. In the quantitative analysis section, graphs of the overall data set across
all videos are given and interpreted. Graphs which give the split across the grouped grades (3-6 and 7-9) are included in an appendix, since there were a few noteworthy differences between the two groups. We look at the pattern of performance that emerged from the graphs representing the sample data. In this we discuss the
teachers’ performance on each of the five activity criteria by discussing findings on the
four categories of quality for each of the criteria used to evaluate teacher participation.
As part of the quantitative analysis we include a brief examination of learner
participation according to the different errors which they spoke about while they were
being interviewed. This is done for the total of 13 learners interviewed, looking at the
spread of their statements of explanation. We conclude the quantitative analysis with a
list of its main findings.
The second analysis is a qualitative analysis in which we take excerpts from four
interviews and analyse the quality of teacher engagement, using the six activity criteria
drawn up for the evaluation. Our qualitative analysis is guided by the findings of the
quantitative analysis. This analysis enables us to demonstrate our criteria and even more
importantly to show the relations between the coded activities in each interview.
The last part of the analysis reflects on the project interview activity and the findings
and draws recommendations.
6 The final interview video analysis, like all our other video analyses, is based on transcribed data with minute-by-minute time intervals. Transcriptions also facilitated note taking and reporting on the interviews.
Section Three: Quantitative analysis
3.1 Overall picture
An overall code was assigned to each of the learner interviews according to the minute-by-minute coding. This was done by reviewing the spread of minute-by-minute codes assigned for each interview and assigning a global code for each of the criteria for each interview, as described in the methodology section above. These overall codes allow us to get an overview of the types of activities carried out in the interviews. We report the overall data first and then break it down into the different interview activities discussed in 2.3.2. We report on these activities for the sample set of 13 interviews selected for the evaluation, highlighting noteworthy differences between the two sets of grouped grades (six Grade 3-6 teachers and seven Grade 7-9 teachers) when these arise. For simplicity of language we refer to them as Grade 3-6 teachers and Grade 7-9 teachers.7
3.1.1 Time spent on interview activities
There were 103 minutes of interviews coded for the Grade 3-6 teachers and 120 minutes for the Grade 7-9 teachers. This means that a total of 223 minutes, or 3 hours and 43 minutes, was available for coding in the interview sample. In Figure 1 below the total number of minutes coded according to interview activity is shown.
Figure 1: Time spent per activity (number of minutes), Grade 3-6 and Grade 7-9 groups

Activity        Grade 3-6   Grade 7-9
Procedural          51          44
Conceptual          33          28
Awareness           26          32
Diagnostic          92         100
Everyday             7           2

7 In the Error Report we referred to groups. The analysis in this report hones in on the 13 teachers that were selected for the evaluation and so we refer to teachers and not to groups.
It is important to remember that seven interviews were selected from the Grade 7-9
group for the evaluation and only six from the Grade 3-6 group (because of the nature of
the sample, see Table 2), and that the number of minutes coded in each of these groups within the sample was not equal. In the next table we represent the percentage of time spent on the different interview activities in the grouped grades as well as in the overall sample. The percentages calculated for analysis are taken as proportions of the total number of minutes coded for each of the grouped grades (103 minutes for the Grade 3-6 teachers and 120 minutes for the Grade 7-9 teachers). It is important to note that, because the coding was done per minute and a single minute could show activity on more than one of the activity criteria (and hence receive more than one code), the percentages across all of the activity criteria do not total 100%.
Table 3: Percentage of time spent per activity across the interview sample set

             Procedural   Conceptual   Awareness   Diagnostic   Everyday
Grade 3-6      49,51%       32,04%      25,24%      89,32%       6,80%
Grade 7-9      36,67%       23,33%      26,67%      83,33%       1,67%
Overall        43,09%       27,68%      25,80%      86,33%       4,24%
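The Grade 3-6 and Grade 7-9 rows of Table 3 follow directly from the minute counts in Figure 1 and the coded totals of 103 and 120 minutes. The short computation below reproduces those two rows; it is included only to make the calculation explicit.

# Reproduce the group rows of Table 3 from the minute counts in Figure 1.
activities = ["Procedural", "Conceptual", "Awareness", "Diagnostic", "Everyday"]
minutes = {"Grade 3-6": ([51, 33, 26, 92, 7], 103),
           "Grade 7-9": ([44, 28, 32, 100, 2], 120)}

for group, (counts, total) in minutes.items():
    percentages = [round(100 * count / total, 2) for count in counts]
    print(group, dict(zip(activities, percentages)))
# Grade 3-6 -> 49.51, 32.04, 25.24, 89.32, 6.8
# Grade 7-9 -> 36.67, 23.33, 26.67, 83.33, 1.67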
Observations about the percentage of the total interview time spent on each of the
interview activities
1. The highest amount of interview activity time was spent on the diagnostic
activity, with the relative amounts of time spent by the two grade groups
virtually the same.
2. The second highest amount of interview activity time was spent on the
procedural activity, but in this activity more time was spent by the Grade 3-6 teachers.
3. Almost equal amounts of time were spent on the conceptual and awareness of
error activities. On awareness there was virtually no difference between the
relative amounts of time spent but the Grade 3-6 teachers spent more time on
conceptual activities than the Grade 7-9 teachers.
4. The least amount of interview activity time was spent on the everyday activity; the Grade 3-6 group spent more time here, though still very little overall.
Whenever we compare the grouped grades the percentages of time are used to facilitate
this comparison, since numbers of minutes per activity would weight the sample in
favour of the Grade 7-9 group for which a higher number of minutes were coded.
In both groups the teachers spent most of the interview time on diagnostic activity. By
comparison, the time spent on establishing awareness of the error is much lower. Very
little time is spent on working through the everyday, which is consistent with the
findings of the error analysis. In terms of teachers’ mode of engagement when the
teachers make statements, pose questions or explain, more time was spent on procedural
activity than on conceptual activity (more so by the Grade 3-6 teachers). In what follows
we investigate these time-related findings further. The following two considerations
guided the analysis:
1. Relation between procedural and conceptual
• The error analysis report shows a high correlation between teachers' procedural
and conceptual explanations of the error in the ICAS 2006/7 error analysis
(Shalem & Sapire, 2012, p.98). The report also shows that most of the groups’
explanations (of the correct solutions and of the error) are incomplete, missing
crucial steps in the analysis of what mathematics is needed to answer the
question (ibid). In view of this, our next step is to examine the quality of the
teachers’ procedural and the conceptual activity in the interview and the relation
between these activities.
• We do this in two ways. First, we look quantitatively at the pattern across the
whole sample but we refer to differences between the grouped grades when
these differences are higher than 10%. This analysis will direct us to the quality
of the teachers' procedural and conceptual activity, mainly in terms of the use of time in the interviews on statements, questions and explanations that are accurate and full (procedurally or conceptually). Secondly, through a qualitative analysis of four interviews, we examine what the teachers do when they engage with learners on procedural and conceptual activity and how they link these two activities.
2. Relation between awareness of error and diagnostic reasoning
• The error analysis shows a correlation between awareness of error and diagnosis of learners' reasoning behind the error. The report also shows that groups
struggled to describe learners’ reasoning behind the error. The above results on
use of time in the interviews show more activity on diagnostic reasoning than on
awareness of error. It suggests that teachers probe learners very often but do not
always take up the appropriate opportunity to establish awareness of error.
• In view of this, our next step is to examine the quality of teachers' probing of
learners’ reasoning and the ways in which teachers establish the nature of the
error. To repeat, much time spent in interviews on probing learners' reasoning does not in itself mean that the interview has a high quality of diagnostic activity. What matters is to understand the relation between all five different activities and how, together, they turn the probing into a successful interview.
• Our analysis of this will include a quantitative examination of the spread of the
diagnostic activity across the four categories. First, we look quantitatively at the
pattern across the whole sample but we refer to differences between the grouped
grades when these differences are higher than 10%. This analysis will direct us to
the quality of the teachers’ probing, mainly in terms of use of time on probing
questions that are accurate and clearly hone in on the error. We do the same for
the teachers’ use of time on awareness of error. Second, we examine in more
detail, using the above four interviews, what the teachers do when they probe
their learners and how they take opportunities to establish awareness of error
in response to what the learners bring. The latter is available to us through the
data on learners’ participation (see 3.1.3).
3.1.2 Teacher participation
The 13 teachers’ engagement with their learners in the interview was the focus of five of
the six coding criteria.
Figure 2: Overall code set (number of interviews assigned each overall category, per activity criterion)

Category        Procedural   Conceptual   Awareness   Diagnostic   Everyday
Not present          1            4           6            2           9
Inaccurate           2            3           1            6           0
Partial              8            4           3            3           4
Full                 2            2           3            2           0

The figure also grouped the activity criteria under the headings Subject Matter Knowledge and Pedagogical Content Knowledge.
Observations about overall coding of interview data for the total number of interviews.
1. In one of the 13 interviews the teacher did not offer any procedural activity at all. In
two of the 13 interviews inaccuracies were noted in the procedural activity offered
by the teachers. The majority of the teachers (eight out of 13) offered accurate but
incomplete procedural activities during the interviews. Only two of the 13 teachers
maintained procedural activity that is correct and includes all of the steps in the
procedure.
2. In four of the 13 interviews the teachers did not offer any conceptual activity at all. In
4 of the 13 interviews inaccuracies were noted in the conceptual activity offered by
the teachers. In three of the 13 interviews the conceptual activity that the teachers
offered included some but not all of the key conceptual links which illuminate the
background and process of the procedure. Only two of the 13 teachers (the same
teachers as in the above) maintained conceptual activity that illuminates
conceptually the background and process of the procedure.
3. In six of the 13 interviews the teachers did not take up any opportunity to establish the error in response to what the learner verbalised about her/his error. In one of the 13 interviews the teacher's statements about the error were mathematically flawed. In three of the 13 interviews the teachers' statements about the error were mathematically sound but did not link to a common misconception. Only two of the 13 teachers (the same teachers as in the above) offered statements about the error that were mathematically sound and suggested links to common misconceptions or errors.
4. In two out of the 13 interviews, the teachers either did not probe at all or their probing had no relation to what the learner verbalised (i.e. their probing shows that they did not listen to the learner's mathematical reasoning behind the error). In six of the 13 interviews, the teacher was found to use leading questions with little engagement with what the learner brings. In three of the 13 interviews the teachers' probing (the same teachers as in the above and one other) was responsive to what the learner brought, but was too broad in that it was not sufficiently honed in on the error. Only two teachers used probing questions that engaged the learner's reasoning and honed in on the error.
3.1.3 Learner participation
The aim was not to look at learners’ full participation in the interviews since the focus in
the interview analysis was on the teachers, but we coded the explanations of errors that
were given by learners so that we could analyse qualitatively the relationship between
teachers’ activity and learners’ activity (see 4.3).
Figure 3: Learner multiple explanations (number of interviews)

Number of explanations given      Number of interviews
One explanation                            2
Two explanations                           5
Three explanations                         3
Four or more explanations                  3
Observations of number of explanations stated by learners in the 13 interviews
1. Learners offered one explanation of one error in two of the 13 interviews.
2. Learners offered two explanations of two different errors in five of the 13 interviews
3. Learners offered three explanations of three different errors in three of the 13
interviews
4. Learners offered four or more explanations of four or more errors in three of the 13
interviews.
The overall picture of teachers’ and learners’ participation in the interviews points to
the following:
1. Most of the teachers (eight out of 13) use procedural activity, albeit partially. Conceptual activity appears to be weaker: four teachers did not offer it at all, and in another four interviews the teacher's conceptual activity was inaccurate.
2. In six of the 13 interviews the teachers did not take up any opportunity to establish
the error in response to what the learner verbalised about her/his error.
3. Inaccuracies are more prevalent in diagnostic reasoning. In many of the interviews (six
out of 13), the teacher was found to use leading questions with little engagement
with what the learner brought.
4. Most of the teachers (nine out of 13) do not connect to or draw on everyday life
experiences when they engage with learners on the error.
5. The teachers focused the discussion on one or two errors in seven out of the 13
interviews. The other six interviews seemed to be less focused.
3.2 Presence of the coded teacher interview activities
In this section of the report we conduct a quantitative analysis of the performance of the 13
teachers in the sample according to the six activity criteria drawn up for the interview
activity evaluation. We examine the teachers' performance according to each of the
interview activity criteria. Where the performance of the two grouped grades was
notably different we note this and refer the reader to the relevant graph. We also include a
brief examination of learner participation. We begin with the examination of the
teachers’ procedural activity during the learner interviews.
3.2.1 Procedural Activity
Figure 4: Procedural explanations in learner interviews
(Percentage of total interview time: Not present 0.00; Inaccurate 5.41; Partial 29.63; Full 8.05)
Observations about groups’ procedural activity during learner interviews:
1. In total the teachers spent 1 hour 35 minutes (43,09% of the total interview time of 3
hours 43 minutes) on procedural activity.
2. In 30% of the total interview time, the teachers gave incomplete procedural
statements, questions or explanations (partial). This means that during this time the
teachers gave procedural guiding comments that are accurate but do not include all
the steps needed.
3. In about 5% of the total interview time, the teachers gave statements, questions or
explanations which were categorised as procedurally inaccurate. This means that there
were flaws in their statements, questions or explanations, or they were so incomplete
that they might have been misleading to the learner.
4. In only 8% of the total interview time was the procedural activity engaged in by the
teachers categorised as full. On this category the grouped grades were notably
different: in comparison to the Grade 7-9 teachers, when the Grade 3-6 teachers
interacted with their learners on the error they spent more time (14,56% and 0,83%
respectively) engaging the learners with statements, questions and explanations that
are accurate and include all of the key steps in the procedure (see Appendix 5 for the
grouped grade graph).
5. Minute-by-minute coding cannot capture the not present category for procedural
explanations, since teachers cannot be expected to engage in procedural activity in
every minute. This category is shown in 3.1.2, which compares the 13 interviews
across all five teacher participation criteria over the whole interview.
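To illustrate how percentages like these are computed, the sketch below shows one possible way of aggregating minute-by-minute codes into category percentages of total interview time. It is a minimal illustration only, written for this report rather than drawn from it: the list-of-dictionaries data structure and the function name are assumptions, not the project's actual coding template or analysis tooling.

    from collections import Counter

    CATEGORIES = ["Not present", "Inaccurate", "Partial", "Full"]

    def category_percentages(coded_interviews, activity):
        # Percentage of all coded interview minutes falling into each
        # category for one activity (e.g. "Procedural").
        counts = Counter()
        total_minutes = 0
        for minutes in coded_interviews:      # one list of per-minute codes per interview
            for minute in minutes:
                total_minutes += 1
                code = minute.get(activity)   # e.g. "Partial"; None if the activity was not coded
                if code in CATEGORIES:
                    counts[code] += 1
        return {c: round(100.0 * counts[c] / total_minutes, 2) for c in CATEGORIES}

    # Hypothetical example: two short interviews coded for procedural activity only.
    coded_interviews = [
        [{"Procedural": "Partial"}, {"Procedural": "Full"}, {}],
        [{"Procedural": "Partial"}, {"Procedural": "Inaccurate"}],
    ]
    print(category_percentages(coded_interviews, "Procedural"))
    # {'Not present': 0.0, 'Inaccurate': 20.0, 'Partial': 40.0, 'Full': 20.0}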
3.2.2 Conceptual Activity
Figure 5: Conceptual explanations in learner interviews
(Percentage of total interview time: Not present 0.00; Inaccurate 9.09; Partial 9.65; Full 8.95)
Observations about groups’ conceptual activity during learner interviews:
1. In total the teachers spent 61 minutes (27,68% of the total interview time of 3 hours
and 43 minutes) on conceptual activity.
2. In 9,65% of the total interview time, the teachers gave statements, questions or
explanations that were conceptually incomplete (partial). This means that during this
time the teachers' statements, questions or explanations included some but not all of
the key conceptual links needed to illuminate the background and process of the
procedure.
3. In 9,09% of the total interview time the teachers gave statements, questions or
explanations which were conceptually inaccurate. This means that during this time the
teachers gave poorly conceived conceptual links in their statements, questions or
explanations, which are potentially confusing to learners.
4. In 8,95% of the total interview time, the teachers gave statements, questions or
explanations that included full conceptual links. On this category the grouped
grades were notably different: in comparison to the Grade 7-9 group, when the
Grade 3-6 teachers interacted with their learners on the error they spent quite a lot
more time (14,56% and 2,5% of their time respectively) engaging them with
statements, questions or explanations that conceptually illuminate the background
and process of the procedure (see Appendix 5 for the grouped grade graph).
5. Minute-by-minute coding cannot capture the not present category for conceptual
explanations, since teachers cannot be expected to engage in conceptual activity in
every minute. This category is shown in 3.1.2, which compares the 13 interviews
across all five teacher participation criteria over the whole interview.
3.2.3 Awareness of error Activity
Figure 6: Awareness of the mathematical error in learner interviews
(Percentage of total interview time: Not present 14.02; Inaccurate 1.25; Partial 6.52; Full 4.16)
Observations about groups’ awareness of the mathematical error given to learners in
the context of learner interviews:
1. In total the teachers spent 58 minutes (25,80% of the total interview time of 3 hours
and 43 minutes) on awareness of error activity.
2. In 6,52% of their total interview time, the teachers gave mathematically sound
statements, questions or explanations about the error, suggesting the establishment
of partial awareness of the error. On this category the grouped grades were notably
different: the Grade 3-6 teachers spent more time than the Grade 7-9 group (11,65%
and 1,66% of their interview time respectively) on giving statements, questions or
explanations about the error, albeit without linking these to common misconceptions
(see Appendix 5 for the grouped grade graph).
3. In only 1,25% of their total interview time, the teachers gave flawed statements,
questions or explanations about the error, which demonstrated poor awareness of
the error.
4. In 4,16% of the total interview time the teachers gave statements, questions or
explanations about the error that are mathematically sound and suggest links to
common misconceptions or errors.
5. In 14,02% of the total interview time the teachers' awareness of error activity was
coded as not present. On this category the grouped grades were notably different: in
17,50% and 8,73% of their total interview time respectively, the Grade 7-9 and the
Grade 3-6 teachers should have, but did not, take an opportunity to establish the
error in response to what the learner verbalised (see Appendix 5 for the grouped
grade graph).
3.2.4 Diagnostic reasoning Activity
Figure 7: Diagnostic reasoning in learner interviews
(Percentage of total interview time: Not present 11.86; Inaccurate 38.37; Partial 26.10; Full 9.99)
Observations about groups’ diagnostic reasoning given to learners in the context of
learner interviews:
1. In total the teachers spent 3 hours and 12 minutes (86% of the total interview time, 3
hours and 43 minutes) on diagnostic activity.
2. In 26,10% of the total interview time, the teachers demonstrated a partial diagnostic
mode of engagement. On this category the grouped grades were notably different:
the Grade 7-9 teachers probed more often than the Grade 3-6 teachers (36,66% and
15,53% of their interview time respectively) in ways that were coded as too broad
and not sufficiently honed in on the mathematical error (see Appendix 5 for the
grouped grade graph).
3. In 38,37% of the total interview time, the teachers probed learners but did not engage
with what the learner brought to the conversation. In this time teachers used only
leading questions with little engagement with what the learner brought. In this sense
the probing was inaccurate.
4. In 10% of their total interview time, the teachers demonstrated a full diagnostic mode
of engagement. On this category the grouped grades were notably different: the
Grade 3-6 teachers engaged more often with learners' reasoning and honed in on the
errors in the interviews than the Grade 7-9 teachers (17,47% and 2,5% of their
interview time respectively; see Appendix 5 for the grouped grade graph).
5. In 11,86% of the total interview time the teachers' diagnostic activity was coded as not
present. In this time the teachers should have probed but did not, or their probing did
not engage with the learner's reasoning behind the error.
3.2.5 Use of everyday Activity
Figure 8: Use of the everyday in explanations in learner interviews
(Percentage of total interview time: Not present 0.00; Inaccurate 1.25; Partial 2.50; Full 0.00)
Observations about groups’ use of the everyday in explanations given to learners in the
context of learner interviews:
1. There was generally minimal use of the everyday in the interview discussions.
3.2.6 Multiple explanations
Multiple explanations are coded according to the explanations stated by the learner over
the full duration of the interview and hence the percentages are worked out of the total
number of interviews and not the number of minutes (6 interviews in the Grade 3-6
group and 7 interviews in the Grade 7-9 group).
Figure 9: Multiple explanations in learner interviews
(Number of interviews, Grade 3-6: One explanation 1; Two explanations 3; Three explanations 1; Four or more explanations 1)
(Number of interviews, Grade 7-9: One explanation 0; Two explanations 2; Three explanations 3; Four or more explanations 2)
Observations about groups’ multiple explanations given by learners in the context of
learner interviews:
1. Grades 7-9 learners gave more explanations of different errors than the Grade 3-6
learners. This reflects a stronger focus on one error in the Grade 3-6 group and a
greater tendency to discuss a range of questions in the Grade 7-9 group.
3.3 Quantitative Findings.
Main finding:
In both groups the teachers spent most of the interview time on diagnostic activity,
probing the learners. By comparison, the time spent on establishing awareness of the
error is much lower. In terms of mode of engagement, when the teachers made
statements, posed questions and explained, more time was spent on procedural activity
than on conceptual activity (more so by the Grade 3-6 teachers). Very little time was
spent on working with learners' errors by drawing on everyday experiences. Analysis of
the number of errors raised in the interviews showed that in seven of the interviews
learners mentioned only one or two errors, while in the other six they mentioned three or more.
Table 4: Percentage of time spent per activity across the interview sample set

            Procedural   Conceptual   Awareness   Diagnostic   Everyday
Grade 3-6   49,51%       32,04%       25,24%      89,32%       6,80%
Grade 7-9   36,67%       23,33%       26,67%      83,33%       1,67%
Overall     43,09%       27,68%       25,80%      86,33%       4,24%
In what follows we provide more detail on different aspects of the main finding and
explain the specific findings:
1. Very little time is spent on working through the everyday, which is consistent
with the findings of the error analysis, that “groups draw primarily on
mathematical knowledge and less so on other possible explanations to explain
the correct answer or the errors” (Shalem & Sapire, 2012, p.97).
2. The teachers spent more time on procedural activity than they did on conceptual
activity. This means that in many more minutes (15,41% more) in the interview
the teachers engaged procedurally with the learners. Some of the time, teachers’
engagement was procedural as well as conceptual, but at other times, the focus
was on only procedural explanation. On its own this finding does not convey a
judgement of quality. Not every procedural activity in an interview situation
needs to be coupled with conceptual activity. A qualitative analysis is required to
understand the relationship between these activities in the interview, in order to
further understand if and how the teacher’s procedural engagement
compromises or contributes to the learner’s conceptual understanding.
3. The pattern of quality distribution within the procedural and conceptual
activities is different. Relative to each other, the teachers give more partial
procedural statements, questions and explanations than they do conceptual ones.
In other words, overall conceptual activity is less present in the interview, but
when it is present the distribution of quality across the three categories
(inaccurate, partial and full) is more even. In procedural activity, the percentage
of time is distributed unevenly across the three categories, with partial being the
highest.
4. In all five activities the teachers spent relatively little time (under 10% of the
total interview time) engaging with the learners in ways categorised as full for
the respective activity. Nevertheless, the Grade 3-6 teachers did better on this
category of quality in three activities (procedural, conceptual and diagnostic).
(See Appendix 5).
Table 5: Percentage of time coded full over total interview time

Interview Activity   Full
Procedural           8,05%
Conceptual           8,95%
Awareness            4,16%
Diagnostic           9,99%
Everyday             0%
5. Despite being the dominant activity in all 13 interviews, the diagnostic activity
was the one in which almost 40% of the time spent was categorised as inaccurate,
as the teachers' probing did not engage with the learners' reasoning. Taken
together with the 11,86% coded as not present, close to half of the time the
teachers engaged with learners diagnostically was wasted on very poor quality
interaction.
Table 6: Percentage of time coded inaccurate over total interview time

Interview Activity   Inaccurate
Procedural           5,41%
Conceptual           9,09%
Awareness            1,25%
Diagnostic           38,37%
Everyday             2,50%
6. The table below looks at the time the teachers spend on each of the five activities
in the two higher quality categories.
a. The table suggests that the teachers are confident in giving statements,
questions and explanations that address the mathematical procedure
required by the question. For the majority of the time these statements,
questions and explanations are partial, and in about 8% of the time spent on
procedural activity they are full. This is very different from the teachers'
conceptual engagement, which occurs much less often and in which, in
comparison to the procedural activity, more time is spent on poor quality
conceptual activity.
b. The relation between error awareness and diagnostic reasoning is also worrying. It
is worrying because the teachers spend too much time probing the
learners with statements and questions that are of poor quality (close to
50% of the total interview time) and which do not engage the learner's
reasoning. This constrains the learners’ ability to verbalize their reasoning
– be it their confusion, misrecognition or lack of knowledge. More so,
poor quality probing constrains the teacher’s ability to gather relevant
information and to construct a picture of what the problem is. For
teachers who have strong subject knowledge that may be less of a
problem; they can rely on their understanding of the field, on their
general knowledge of common errors. The error analysis report suggests
that this is not the case with this group of teachers. It suggests that
teachers’ content knowledge is poor, a finding that is consistent with
much other research in SA. We argue that poor quality of probing takes
away the focus and leads to no resolution on what the nature of the error
is. It curtails the teachers’ capacity to gather evidence, to think through
what the learner does verbalize or to respond appropriately. This is also
evident in the high percentage of time (approximately half of the activity
total time in the interview), that the teachers miss an opportunity to
establish what the error is.
Table 7: Percentage of time coded full and partial over total interview time

Interview Activity   Full     Partial   Total of partial and full   Activity total (% of the total interview time)
Procedural           8,05%    29,63%    37,68%                      43,09%
Conceptual           8,95%    9,65%     18,60%                      27,68%
Awareness            4,16%    6,52%     10,68%                      25,80%
Diagnostic           9,99%    26,10%    36,09%                      86,33%
Everyday             0%       2,50%     2,50%                       4,24%
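As a quick arithmetic check on the table (added here, not part of the original tabulation), the 'Total of partial and full' column is simply the sum of the Full and Partial columns for each activity:

    8,05% + 29,63% = 37,68% (procedural)
    8,95% + 9,65%  = 18,60% (conceptual)
    4,16% + 6,52%  = 10,68% (awareness)
    9,99% + 26,10% = 36,09% (diagnostic)
    0%    + 2,50%  =  2,50% (everyday)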
7. Arguably, the Grade 3-6 group interviews were more focused on one or two errors
than the Grade 7-9 group interviews. This may have been the case because
mathematical topics in the higher grades may have richer concept maps, and thus a
discussion around one error from a learner test may raise more than one related
concept (or error in relation to that concept). Notwithstanding this, the many different
explanations given in some of the Grade 7-9 interviews were the result not of a rich
discussion around a concept and its related concepts, but rather of the number of
different test items discussed by teachers during an interview.
These findings give rise to the following guiding questions for the interview:
1. In what ways does the teacher’s procedural engagement facilitate the learner’s
conceptual understanding of the error?
2. What kind of probing is productive in that it gathers relevant evidence for teachers
that they can use to make judgements on the nature of the error and on when to
establish it in the course of the interview?
3. Given that in the context of an interview partial explanations would have a place,
what is the role of a full explanation?
Section Four: Qualitative analysis
4.1 Sample
In the following we present four examples of learner interviews. We selected two
interviews from the Grade 3-6 group and two interviews from the Grade 7-9 group. In
our analysis we use the evaluation criteria (see 2.3.1) to describe the strengths and
weaknesses in the ways the four teachers engage the learners with statements, questions
and explanations about the error. In broad terms the analysis will focus on the quality of
progression in the conversation and on the extent to which the interview increases the
learner's understanding of the error.
The transcriptions are presented minute by minute, followed by analysis relating to each
time segment of the interview. The codes allocated in a given minute are listed in square
brackets next to that minute's heading; each code usually refers to a particular exchange
in that minute, although it may also apply to other exchanges in the same minute.
Criterion abbreviations are used as indicated in the table below.
Table 8: Activity criteria abbreviations

Procedural explanation: Proc
Conceptual explanation: Con
Awareness of error: Awa
Diagnostic reasoning: Diag
Use of the everyday: ED
Multiple learner explanations: LMult
Category descriptors are abbreviated to the first letter of the descriptor:
Not present (N),
Inaccurate (I),
Partial (P), and
Full (F).
4.2 Analysis of learner interviews
4.2.1 Grade 4 learner interview
The learner interview focuses on the following question selected from the test set by the
DIPIP grade 4 “small group”8. The teacher that elected to conduct the learner interview
selected this question since the learner had done the subtraction required in the solution
of this question incorrectly in a most unusual way.
The question was given as a word problem. Learners needed to read and interpret the
question in order to realise the sequence, the type of number operation they needed to
select, and the mathematical procedure they needed to follow in order to arrive at the
solution. First they needed to calculate Kim’s original mass (twice Amy’s i.e. 38 × 2 = 76
kg) and then they needed to subtract the number of kilograms she lost in the year (76 kg
– 19 kg = solution).
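For reference, the expected working for this item (the final value is not stated in the report and is added here as a worked check) is:

    Kim's original mass: 38 kg × 2 = 76 kg
    Mass after the loss: 76 kg − 19 kg = 57 kg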
Figure 10: Extract from Grade 4 learner test
Figure 11: Snapshot of learner working.
8 See Process Report for the classification of 'group' in the DIPIP project.
Extract from learner interview
The interview is 18 minutes long. The extract below is the first 8 minutes of the
interview.
Minute 1 [codes: Diag (P)]
Teacher: Good morning, you're going to do this interview with me. About weeks ago we were writing a test.
Learner: Yes
Teacher: And you have done, this is the question 5 that you have done and you have come as far as … you have added all the um kilograms because it's double the weight (points to paper with various numbers and words written thereon) and then, after that, you decided to subtract. Now something that I'm very interested in about is the way you were subtracting. Now can you please, while subtracting, explain to us step by step what you were doing.
Analysis – Minute 1
The teacher opens the interview with a broad probe, describing to the learner the method
of subtraction that she used to answer the test question.
Minute 2 [codes: Diag (F), Mult (1)]
Teacher: (points to paper) Because this is 76 take away 19, you did it, you got 21. 76 take away 19. Can you please, step-by-step, what you are doing … explain to me. (Places pen and paper in front of L. On paper is written 76 – 19 underneath each other.)
Learner: Because you can't subtract 6 (points to the 6 of 76) from 9 (points to the 9 of 19). So I took 1 away from the 7 (points to the 7 of 76, crosses out the 7 and writes 6 above it) so it becomes 6 and this one becomes 7 (crosses out the 6 of 76 and writes 7 above it). Now you still can't subtract it, so I took away 1 again (crosses out the 6 written above the 7 of 76 and writes 5 above it) and it became 5. And I added this one (crosses out the 7 written above the 6 of 76 and writes 8 above it), so it became 8.
Teacher: Alright. Ok.
Learner: And I still couldn't get it, so I took there (crosses out 5 on the left side and writes 4 above it) made it 4 and then I added (crosses out 8 on right hand side and writes 9 above it) and made it 9. So 9 minus 9 obviously is going to become zero so then I took it again and 4 took away 1 (crosses out 4 on left hand side and writes 3 above it), made it 3. And I added 10 (crosses out 9 on right hand side and writes 10 above it).
Analysis – Minute 2
The teacher continues to probe, this time referring more specifically to the learner’s
working. The probe hones in on the learner’s error and the teacher gives precise
instructions (“Can you please, step-by-step, what you are doing … explain to me”). The
learner responds in detail, giving her explanation of the way in which she had done the
computation. The learner takes the teacher through each step of the “borrowing from the
tens” that she did in her calculation. The teacher allows the learner all the time she needs
to explain her working in full and does not interrupt her or correct her, even though it
becomes clear well before the learner has finished her explanation where she had gone
wrong.
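The learner's written path, reconstructed here from her explanation in the transcript, makes the misconception visible: each time she "borrowed" she took one ten away but added only one unit, repeating until the units column allowed a subtraction:

    7 tens, 6 units → 6 tens, 7 units → 5 tens, 8 units → 4 tens, 9 units → 3 tens, 10 units
    then 10 − 9 = 1 and 3 − 1 = 2, giving the answer 21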
Minute 3 [codes: Diag (F), Awa (F), Con (F)]
Learner: 10 take away 1, um 9, I got 1. And 3 minus 1 was 2. (writes 21 in answer line) So that's how I got my answer.
Teacher: Well the way you were explaining to me very well. Right, now Jamie, I'm going to give you another sum so you can do the subtraction again, ok?
Learner: Ok.
Teacher: But this time we're going to use something else, something different. You going to use a little bit of blocks and you have to listen very carefully to me. (T places piece of red paper in front of L.) Right, now Jamie, ok can you please read the sum there on top for me (points to top of paper).
Learner: 45 minus 14.
Teacher: Minus 14, ok, right. Now can you please show me where all the units are?
Learner: Tens are on this side (points to right hand digits of numbers – ie 5 and 4).
Teacher: Name them for me, please.
Learner: Units are 5, 4.
Teacher: And 4. So you going to subtract the units from each other. Can you please tell me where the tens are?
Learner: The tens are on the left hand side (points to these).
Teacher: Show me.
Learner: It's 4 and 1 (points)
Teacher: 4 and 1 right.
Analysis – Minute 3
The teacher continues to probe diagnostically by introducing another example that she
has chosen that will enable her to work with the learner on the error embedded in her
test solution through working with concrete manipulatives – unifix cubes. The selection
of the example together with the use of concrete manipulatives shows that the teacher is
leading towards establishing the learner's error: the learner does not know how to
subtract two-digit numbers from each other when there is an impasse. An impasse occurs
when there are insufficient units in the minuend (the number from which the given
subtrahend must be subtracted) to simply subtract the units of the subtrahend (the
number which must be subtracted from the minuend) without rearranging, or working
in some way with, the tens and the units so that the subtraction can be made
possible. The teacher works conceptually with the learner by asking her to show her
where the tens and the units are in the given numbers. This further demonstrates the
teacher’s awareness that the learner’s error lies in her misunderstanding of the way she
needs to work with the place values of the numbers in the given subtraction problem.
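The two examples used in the interview illustrate the distinction; the worked forms below are added as a summary and are not part of the transcript:

    45 − 14: units 5 − 4 = 1 and tens 40 − 10 = 30, so 45 − 14 = 31 (no impasse, no regrouping needed)
    76 − 19: 6 < 9, so regroup 76 as 60 + 16; units 16 − 9 = 7 and tens 60 − 10 = 50, so 76 − 19 = 57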
Minute 4 [codes: Proc (F), Con (F)]
Teacher: Now what we going to do is we going to use a little bit of blocks, right. And you going to help me a bit (hands blocks to L). Please can you place all the tens for me on your diagram. Just place them there. Just place them (moves blocks onto paper) there you go. You going to place all the tens together. Like you said, tens (places 4 rows of tens in front of L). How many tens do you have there?
Learner: 4.
Teacher: 4 tens is equal to how much?
Learner: 4 tens equals 40.
Teacher: 40, good. Now I'm going to give you the units. How many units can you count?
Learner: It's 5.
Teacher: Do you think we should break them up a bit ne?
Learner: Ja.
Teacher: Ok there we go. Well done. Now Jamie what I want you now to do is um you have to subtract, right? So we going to start subtracting. First we going to start subtracting the units. How many units do you have?
Learner: 5
Teacher: How many must you take away from them
Learner: 4
Teacher: Do it for me.
Learner: 1, 2, 3, 4 – take them away (L removes 4 blocks)
Teacher: Can I give you a hand?
Analysis – Minute 4
In the interaction in minute 4 the teacher gives procedural instructions and questions to
the learner to direct her through the working with the blocks (unifix cubes), to help the
learner work through the subtraction of two digit numbers. She also asks conceptual
questions (“4 tens is equal to how much?”) to make sure that the learner is keeping in
mind the place values (and the meanings of these place values) of the numbers with
which she is working.
Minute 5 [codes: Proc (F), Con (F)]
Teacher: Let me give you a hand and I'll put it all the way over there (removes the 4 blocks to her right hand side) right.
Learner: Ok
Teacher: Ok so how many's left?
Learner: One
Teacher: Well done. Now we're going to subtract the tens. Can you do that for me?
Learner: Uh huh
Teacher: Ok how many tens must you take away from there?
Learner: One (indicates 1 with her finger).
Teacher: We are going to take one ten away, well done. So do it for me. (L removes one row of tens). Ok how many tens are left?
Learner: 3
Teacher: Altogether, what is your answer?
Learner: 31
Teacher: Ok so can you please write in there. When you take 5 away from 4 your answer was?
Learner: 31
Teacher: Ok it was 1 and then 3 ok well done. That is a star. (Removes paper and blocks from the L.) Right we're gonna take this and put this away. Ok so you had a little bit of practice there, we going to put it all the way over there. You are doing very well. Ok right now I'm going to give you another one. Ok. Right, there you go (places another piece of red paper in front of L with 76 – 19 written underneath each other). Ok can you read the sum for me please?
Learner: 76 minus 19.
Teacher: 76 minus 19. How … tell me which ones are the units.
Analysis – Minute 5
The interaction continues in the fifth minute. The teacher's activity is predominantly
procedural but at all times this carefully controlled procedure (linking hands-on work
with the unifix cubes to the subtraction of the two given numbers in the example) is
designed to expose the learner to the correct way of working with numbers in a
subtraction problem where there is an impasse. Conceptually, the teacher keeps bringing
the learner back to identifying the numbers according to their place value ("which ones
are the units") since this links to the underlying misconception evidenced in her test
working. The teacher finishes the first example and then moves on to a second example,
which is the same as the test question, but the teacher does not say anything about this
(she leaves this connection to be made later in the interview, in minute 11, which is not
part of this extract).
Minute 6 [codes: Con (F), Proc (F)]
Learner: Units are 6 and 9.
Teacher: 6 and 9, and the tens?
Learner: Is 7 and 1.
Teacher: 7 and 1. 7 tens ok, and we must take one tens away (hands blocks to L). So let's count the tens quickly, how many tens do we have there?
Learner: 1, 2, 3, 4, 5, 6, 7 (points to each tens as she counts).
Teacher: 7 tens, and how many units do I need?
Learner: 6
Teacher: 6 units, ok. (Hands 6 units blocks to L.) Remember what I taught you, the same steps that we did. I want you to take … um … you're subtracting units. How many units must you subtract?
Learner: 9
Teacher: 9. Now I want you to subtract 9 units from there.
Learner: I can't because 6 is a lower number from 9.
Teacher: So you can't subtract uh 6 from there. So now what do you think we should do? Because remember we are only allowed to use this (indicates the blocks on the red paper) and only take away from 6. Right. (L nods.) You're taking away from 76, you need to take away from there. So what do you think you should do now?
Analysis – Minute 6
In the sixth minute the interaction is once again predominantly procedural linking the
hands-on work to the conceptual and procedural activity involved in the subtraction of
the two numbers. Unlike the previous minutes of the interaction in which the teacher
guided the learner through the procedure step by step, in this minute the teacher
asks the learner to think of the procedure ("So what do you think you should do now?").
Minute 7 [codes: Con (F), Proc (F)]
Learner: I think you should, uh, take one ten away from 7 and put it in front of the units.
Teacher: Oh so you should take one ten away (points to a tens block) and you put it with the units. Can you do that? (L does this.) Ok does it look the same as the units?
Learner: No
Teacher: What should we do?
Learner: Break it up.
Teacher: And then if you break it up, what do you get?
Learner: You get uh 16.
Teacher: Oh ok, do it, show me. Show me how you get 16. (L proceeds to break up the one tens she placed with the units into separate blocks.) Ok well done, I'm so proud of you. So now what you did is, did you borrow from there? Did you take from there?
Learner: (L nods.) Uh huh.
Teacher: Ok show me – how many were there in the first place (points to the tens blocks)?
Learner: 7
Teacher: So then you took one away from them, how many is left?
Analysis – Minute 7
The teacher keeps the interview interaction focussed on the learner’s error, working
slowly and systematically through the example. She does not rush the learner and she
continually directs the learner to work with the blocks in order to understand the
numeric calculation that she must do.
Minute 8 [codes: Proc (F), Con (F), Awa (F), Diag (F)]
Teacher: Can you please indicate (points to paper), is it still 7?
Learner: No
Teacher: So indicate that it's not 7 anymore … (passes pen to L, who crosses out 7). And how many is it now?
Learner: 6 (writes this above 7).
Teacher: It is now?
Learner: 6
Teacher: 6. Right well done. So can you now take away?
Learner: Yes
Teacher: How many's this (points to 6 of units) now?
Learner: 6
Teacher: Is it still 6, the units?
Learner: No
Teacher: What is the units now?
Learner: 7
Teacher: Is it 7? Can you please count your units.
Learner: (Counts aloud whilst moving units blocks.) 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16.
Teacher: How many?
Learner: 16
Teacher: First you had 6 and then you add how much?
Learner: 10
Teacher: And it gives you?
Learner: 16
Teacher: Can you please write that six and you add what you did. And then you get your 16 ne? Like you said. (L writes 16 above 6). Well done. Can you now take 9 away from there?
Learner: Yes
Analysis – Minute 8
In the 8th minute the interaction in the interview allows the teacher to work with the
learner on her misconception in the written calculation, both procedurally and
conceptually. Procedurally, she shows the learner the correct way to write the
calculation. She asks, “Can you please indicate (points to paper), is it still 7?” and gives
the learner the opportunity to write the correct procedural steps, all the time linking this
activity to the concrete apparatus of the blocks. Conceptually she asks “So can you now
take away?” – calling on the learner to reason about whether or not she has overcome
the impasse. She also uses the blocks to allow the learner to establish the number of units
from which she is now subtracting (having exchanged one ten for ten units). The teacher
shows awareness of the learner’s error – which she is in the process of uncovering –
when she asks, “Is it still 6, the units?” A little later she asks, “Is it 7? Can you please
count your units.” The learner counts the units and finds out that there are actually 16
units from which she can now subtract 9 units, as required in the example. The focus
and gentle pace at which the teacher carries out this explanation is remarkable. The rest
of the interview continues in this way and by the end of the interview, the learner is able
to subtract two digit numbers correctly, showing her working and with an
understanding of the use of place value in the subtraction of two numbers when there is
an impasse.
4.2.2 Grade 5 learner interview
This learner interview is focused on the following question selected from the test set by
the DIPIP grade 5 teacher group. The teacher that elected to conduct the learner
interview selected this question since she was interested in finding out the reason for the
learner’s incorrect answer to the second part of the question.
The test question consists of two parts. The learner did the first part correctly and the
second part incorrectly.
1. The first part of the question requires the learners to write a mixed number that
has been illustrated diagrammatically using numerals.
2. The second part of the question requires the learners to convert the mixed
number into an improper fraction.
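For reference, the conversion that 1.2 calls for (the diagram shows 2 3/4, as confirmed by the learner's correct answer to 1.1 in the extract below) is:

    2 3/4 = (2 × 4 + 3)/4 = 11/4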
Figure 12: Extract from Grade 5 learner test
Question 1 – fractions
1.1 Write the diagram above as a mixed number. (1)
1.2 Write the diagram above as an improper fraction. (1)
Figure 13: Learner working in Grade 5 interview
Extract from learner interview
The interview is 13 minutes long. The extract below presents minutes 3-9 of the
interview.
Minute 3 [codes: Diag (P), Mult (1)]
Teacher: Do you still remember that one?
Learner: Yes
Teacher: Ok how did you do this problem? Can you please explain to me? (Moves paper in front of L.)
Learner: This problem teacher (points)?
Teacher: This one (points), the first one.
Learner: Teacher I said one whole plus one whole plus 3 over 4 equals to 2 and 3 over 4.
Teacher: Good, very good. Now explain to me the second step. How did it come about like this (points)?
Learner: Teacher I said 2 times 3, then I said is 6 and then I said 2 times 6 is equals to, is equals to 12 teacher. Then I said 12 over 6 teacher.
Analysis – Minute 3
The teacher's last question in Minute 3 ("How did it come about like this (points)?") is an
open question, leading into the exploration of the learner's error in relation to the second
part of question 1 in the learner’s test. The learner starts to explain what he did, giving
his first explanation of an error. It seems that the learner multiplied the whole number
part of the mixed number with the numerator of the fraction part of the mixed number
(getting 6), then he took this ‘product’ and multiplied it by the whole number (getting
12). This is a direct explanation of his working (shown above in Figure 13), but it does
not illuminate why he followed this procedure of converting a mixed number to an
improper fraction. The learner’s explanation is in direct response to the teacher asking
for explanation of how he did the question. In this exchange the teacher does not build
on the learner’s response, procedurally or conceptually.
Minute 4 [codes: Mult (1), Awa (N), Diag (I)]
Learner: And then I was confused, teacher. I wanted to say 12 over 2 … 3.
Teacher: Ok do you think that your answer is correct?
Learner: No
Teacher: Why do you think that it is incorrect?
Learner: Because, teacher, I wrote it wrong, teacher.
Teacher: What made you write it wrong?
Learner: Teacher, I was confused, teacher.
Teacher: You were confused, ok. Eh have you ever done this sum before? This problem before? Have you ever done a problem like this (points to paper) before?
Learner: No teacher (shakes head).
Teacher: In the classroom, where, have you ever done these sums?
Learner: No teacher.
Teacher: You didn't. (Sound of door opening and closing in background.) What, what are you thinking about right now about this problem?
Analysis – Minute 4
The learner continues with his explanation of what he did, saying that he was
“confused” and he tries to start saying what he had wanted to write (“I wanted to say 12
over 2 … 3”) but the teacher interrupts him and says “Ok do you think that your answer
is correct?”, as a direct response to the learner’s first attempt to explain his reasoning.
The teacher does not take the opportunity to probe the mathematical content of the
learner's explanation, for example by asking a question like "Why did you say 2 times 3?"
This shows that the teacher is not trying to establish the mathematical reasoning behind
the learner’s error. The teacher does not engage with the learner’s explanations but
rather pushes him about the correctness/incorrectness of his answer and whether or not
he has done such work before. She does ask another broad question, “What made you
write it wrong?” attempting to engage his reasoning but does not actually follow this
through or link it to his explanation given in the previous minute.
Minute 5 [codes: Awa (N), Diag (I)]
Teacher: The way you have done it, why? What do you think about it?
Learner: I'm not thinking right, teacher.
Teacher: You aren't thinking anything.
Learner: Eh (yes).
Teacher: Can you explain each step as you worked through the problem and give me reasons? You can work out this here (points to question). Work it out here (points to blank paper). Work it out.
Learner: (Picks up pencil and ruler and writes: R565 – R389 = …)
Analysis – Minute 5
The teacher’s first question in minute 5 is another open probe, calling for an explanation
of the learner’s reasoning, but the learner is unable to answer the question
mathematically and simply says "I'm not thinking right, teacher." In response to this,
and despite the clear signs the learner provides of his helplessness, the
teacher says to the learner, “You aren’t thinking anything”, showing no attempt to
establish the mathematical reasoning behind the learner’s error. The teacher instructs the
learner to work through what he had done, to write it down step-by-step, passing him a
piece of paper. The learner starts writing out the solution to the next question in the test.
Minute 6 [codes: Diag (I)]
Teacher: No not this one. The very same one, this one (points)
Learner: This (points)?
Teacher: Yes. The very same one, the one that you, you said is incorrect. You said it is not correct. (L is trying to erase something.) No just leave it here. Write it here. (L writes 2 × 6 = 12/6 and shows teacher.) Ok this is the … you said it's not correct.
Analysis – Minute 6
The teacher shows the learner which question he should be writing out, showing no
acknowledgement of his earlier explanation (given in minute 3); she just points to what
the learner had done in his test. Diagnostically this still shows no response to the
learner's reasoning. The teacher's instructions have no connection to the learner's
explanations of what he thinks the error was. All the teacher is doing is restating the
incorrectness of the learner's working.
Minute 7 [codes: Diag (I)]
Learner: Yes
Teacher: Now I want you to write it correctly. How are you going to make it correctly? Write it correctly? (L writes 2 ) Start it from here no? Start it from here, don't write anything that … starting from here. Do it correctly from that step. (L inserts × after the 2, so it reads 2 × 3/6 = 2. Pushes it towards teacher.)
Analysis – Minute 7
In minute 7 there is no progress. The teacher uses leading questions instructing the
learner to write the mixed number “correctly” although the error has not been
established yet. The probing continues at a low level by way of the teacher giving
repeated instructions as to from where and how the learner should re-write his working.
The teacher’s probing has little engagement with what the learner brings in minutes 3
and 4.
Minute 8 [codes: Diag (I)]
Teacher: Is that how you do it?
Learner: Yes teacher.
Teacher: Ok. You said two times..?
Learner: Two times 6.
Teacher: Here is an answer here. How did you do it here? From here (points to top of first paper).
Learner: From there teacher?
Teacher: Mm
Learner: I said one …
Teacher: No forget about that, from here (points to part of 1st line of sum). I want you to do this.
Learner: Here teacher?
Teacher: Yes
Learner: I said 2 plus 3 over 4.
Teacher: Ok work it out so that you should get the answer. (L writes 2 + 3/4 = 10/4.) Are you satisfied about the answer?
Analysis – Minute 8
In minute 8 there is still no progress as the probing continues by way of the teacher
giving repeated instructions as to from where and how the learner should re-write his
working. The teacher refers the learner to part one of his answer, saying to him, “Here is
an answer here. How did you do it here?” In doing this she has moved away from the
part of the question where the learner made the error to the part where he got it right.
She instructs him, "Ok work it out so that you should get the answer." The learner
responds by writing "2 + 3/4 = 10/4" on his sheet of paper.
Minute 9 [codes: Diag (I), Mult (2)]
Learner: Yes teacher.
Teacher: How did you work it out? Tell me, you said what?
Learner: I said 2 times 3
Teacher: 2 times 3
Learner: 2 times 3 is equals to 6
Teacher: Uh huh
Learner: Then I said plus 4
Teacher: Equals to?
Learner: 10 over 4.
Teacher: 10 over 4. Ok. Uh… do you think this is the correct one? That this one is the correct answer …
Learner: Yes teacher
Analysis – Minute 9
The teacher now has some new working from the learner, which again is not correct (but
in a different way), and she probes it, but without any attempt to engage with his
mathematical reasoning. She says “How did you work it out? Tell me, you said what?”
The learner attempts another explanation (of the new working with the different error)
and the teacher repeats after him what he says he did, and prompts him, step-by-step, to
write the conversion procedure. The learner explains that he has multiplied the
numerator of the fraction by the whole number to get 6. He then explains that he added
this result (6) to the denominator (4) of the fraction and wrote the sum (10) over the
denominator. This is the learner's second explanation of an error, a different error from
the one he made in the test. When the learner is finished the teacher asks, "Uh… do you
think this is the correct one? That this one is the correct answer …", again not engaging
with his reasoning. Again this is a non-productive prompt (she knows the answer is
wrong). Ultimately, in minute 11, the teacher shows the learner the correct procedure for
converting a mixed number into an improper fraction. She does so without giving any
conceptual links to the mathematical idea that fractions can be written in different ways
that are equal, so that the 'demonstration' of the correct procedure for the conversion
between forms of fractions that she gives will not necessarily shift the learner's
understanding.
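Set against the correct conversion noted earlier (11/4), the learner's two explanations, summarised here from the transcript, show two different errors rather than a repaired understanding:

    Test working (minute 3): 2 × 3 = 6, then 2 × 6 = 12, giving 12/6
    Interview working (minute 9): 2 × 3 = 6, then 6 + 4 = 10, giving 10/4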
4.2.3 Grade 7 learner interview
The learner interview is focused on the following test question, which the Grade 7 group
selected from the Grade 7 test they designed. The teacher that elected to conduct the
learner interview selected this test question since many learners (including the one she
chose to interview) had experienced difficulty with it in the test.
The question presents a shape (the top shape in the diagram below) which is to be
regarded as a “whole” into which three other shapes (the three smaller shapes below,
labelled A, B and C) need to be fitted. The wording of 3.1 says “What fraction of the
picture would be covered by each of the shapes below?” which implies that the learner
needs to find out what fraction of the bigger shape each of the given smaller shapes
represents. The question is testing if the learner can work out the fractional value of
parts of a whole. The activity requires that the learner checks how many of each of the
smaller shapes (taken one at a time and repeated as many times as necessary) are
needed to cover the whole shape above. A further instruction indicates that learners may
flip or rotate the shapes if necessary in order to make them fit into the given bigger
shape.
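Put generally (this formulation is a summary of the reasoning the question calls for, not wording from the test): if n copies of a shape are needed to cover the whole exactly, then one copy covers 1/n of the whole. The values established later in the interview follow this pattern:

    4 trapeziums cover the whole, so one trapezium covers 1/4
    4 parallelograms cover the whole, so one parallelogram covers 1/4
    8 triangles cover the whole, so one triangle covers 1/8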
Figure 14: Extract from Grade 7 learner test
Question Three:
Look at the picture.
3.1. What fraction of the picture would be covered by each of the shapes below?
(The shapes can be flipped or rotated when necessary)
A) _______________
B) ________________
C) ______________
Figure 15: Learner working in Grade 7 interview
The darker lines highlighted in this snapshot of the learner's
work show the way in which the learner thought she would
fit the three different smaller shapes into the bigger shape.
Extract from learner interview
The interview is 7 minutes long. The extract below presents minutes 1-7 of the interview.
Minute 1 [codes: Diag (P), Mult (1), Awa (P)]
Teacher: So now, this class thing that we did in class, ok, most of it you found very simple except for this whole question?
Learner: Yes
Teacher: (zooms onto question paper) To start with, what did you find difficult with this question?
Learner: When I first did this question, I thought it was pretty easy, because all the shapes could fit in, like once, but then you told me they had to be these sizes. So I didn't really know like what…I didn't really know what to do then afterwards, because I thought just put like…it doesn't matter what size they were as long as they fit in there. But then you said the actual size, so I didn't know, so that's when I got stuck, and time ran out.
Teacher: Ok, so then what were you actually trying to start drawing here?
Learner: I was trying to see if I could fit these triangles in, but then…
Teacher: Oh, these triangles [and trapeziums and parallelograms] (points to 3 shapes below top shape) to fit into here?
Learner: Yes. Then I put the other two.
Analysis – Minute 1
The teacher is trying to get from the learner what she was struggling with (note the
language: the teacher asks "what was difficult" and not "what do you think you did
wrong").
The questions the teacher asks are broad probes which do not hone directly in on the
error; they aim to get the learner to start talking about what she had done in the test.
With these broad questions the teacher gets the learner to explain what first confused
her. The learner states her first explanation of "error", which in this case is more that she
was confused by the question and was not sure what she had to do. Her explanation
verbalises a confusion: the question includes three different shapes which somehow
could be fitted altogether into the whole. The teacher acknowledges the learner's
explanation (that she thought she needed to put all three shapes into the big shape),
showing general (yet not specific) awareness of the learner's error. The teacher points to
the shapes in the question. The learner answers and it appears that they understand one
another. It is important to note, though, that the teacher's understanding of the learner's
error is not yet fully developed. What the teacher has established so far is only that the
learner started by trying to fit the triangle and then the other shapes into the whole.
Minute 2 [codes: Awa (P), Diag (P), Con (P), Proc (bold text) (P)]
Teacher: Ok. So that's what you obviously started doing, was drawing the triangle part.
Learner: Yes
Teacher: Ok. So now, this question here said to you: the shapes can be flipped or rotated. Alright. What did you understand by that when they told you that?
Learner: That these shapes themselves cannot just be put there, because otherwise they wouldn't make the proper shape. So obviously you had to turn them or just change them so that they can actually fit inside here.
Teacher: Ok. So do you think they were there for a reason?
Learner: Yes
Teacher: Ok. So now, if we go back to this question (points to the shape of 'the whole'), it says here: each of these shapes is a part of this whole. Because this is the whole (points to the shape of 'the whole'), and each of these (points to 2nd row of 3 shapes) is a part of that whole (points to the shape of 'the whole'). Ok. So you've got to use this shape (points to first shape-trapezium, in row of shapes), and it says, what fraction is shaded? And now it tells us that we can flip it around or turn it upside down. So, if we are trying to fit…how, if you could, how could you fit this (points to trapezium) into here (points to shape of 'the whole')? If you want you can use a pencil or a rubber or whatever, if you like.
Analysis – Minute 2
The teacher again acknowledges what the learner said she did (started drawing the
triangles) confirming for both of them the general awareness of the error, which does not
link to any misconception at this stage. She then moves to the next instruction (in the
question) that the shapes can be flipped or rotated, and probes the learner’s
understanding of that aspect of the question. This probe engages with the learner’s
reasoning but again does not hone in on the mathematical error. The learner’s response
indicates understanding of this instruction and the teacher moves on. The teacher then
gives a detailed interpretation of the question which is predominantly conceptual but
ends with more procedural instructions as to how the learner should proceed. She
explains that the question is drawing on an understanding of the relationship between
parts and a whole. The teacher then refers to the procedural mechanics of fitting the
given “part” shapes into the given “whole” shape.
Minute 3 [codes: Proc (F), Con (F)]
Teacher: How would you fit this trapezium (points to trapezium) into here (points to the shape of 'the whole'), to work out how many of them fit into the whole?
Learner: Um…I think I'd put two…maybe two there (draws line through top shape), so this is one trapezium, that's another (outlines shape of 2 trapeziums with bottom part of top shape)…or I could do the same here as well to (draws line across top part of top shape)…
Teacher: To cover the whole.
Learner: Yes. There would be four of them (outlines 2 more trapeziums on top part of top shape).
Teacher: Ok. So you've worked out that four of these (points to second row trapezium) can fit into here (points to top shape).
Learner: Yes
Teacher: Ok. So now when you go to this question here, it says: what fraction of the shape would be covered by each of these shapes? So if I took one of these (points to trapezium) and put it into the shape (points to top shape), what part of that whole would be shaded or covered?
Learner: A quarter.
Teacher: A quarter. That's correct. Because four of them (points to learner's 4 outlined trapeziums on top shape) fit into there.
Analysis – Minute 3
The teacher completes her explanation of the question by posing a more specific
question in relation to the first shape (a trapezium) using procedural language “How
would you fit this trapezium … into here… to work out how many of them fit into the
whole?” She then allows the learner to “do” the activity. The learner explains what she
would do, procedurally (“I think I’d put two…maybe two there (draws line through top
shape)…"). The teacher prompts her, giving the learner the language to state what she
has just done ("To cover the whole"). The teacher then asks the learner some questions
which call on conceptual understanding from the learner (“what part of that whole
would be covered?”). The teacher also reinforces the procedural activity the learner has
undertaken to fill the whole with smaller trapezium shapes (“So if I took one of these
(points to trapezium) and put it into the shape (points to the shape of ‘the whole’). The learner
is able to answer correctly, saying “A quarter”. The teacher explains conceptually (in
relation to the shapes they are working with) that they have found a quarter, saying
“Because four of them (points to learner’s 4 outlined trapeziums on top shape) fit into there.”
Minute 4 [codes: Diag (F), Mult (2), Awa (F), Con (P), Proc (bold text) (F)]
Teacher: Ok. So now how in the beginning, did you get… …a third (points to learner's answer)? What did you work out that you got a third for every part?
Learner: Because I thought that with all, I put…in the beginning, I originally thought the trapezium here (traces another line on the shape of 'the whole') and the parallelogram would be here (traces another shape on shape of 'the whole'), and there would be the triangle. And I thought that they each take up a third. So I put one third for each of them.
Teacher: Oh, I see! So that's what you actually understood the question that you take each part (points to 3 shapes in 2nd row), stick it in here, and then a third, one, two, three parts, would be covered. Now fractions are obviously equal parts, is that not so?
Learner: Yes, which doesn't make sense.
Teacher: Ok. So then it doesn't make sense. Alright, so now we've worked out that that trapezium (points to trapezium in 2nd row) is a quarter. If we take this parallelogram (points to parallelogram in 2nd row), how would you take this and fit it into there (points to top shape)? So how many can we get into there? (long pause)
Analysis – Minute 4
The teacher probes again for explanation of what the learner had done in the test –
trying to get to what the learner was thinking when she answered the test as she had.
She hones in on the mathematical error – calling the three different unequal shapes
thirds. The learner explains much more clearly than in the first minute that what she
thought was that because all three different shapes (parts) could fit into the bigger shape
(whole) at the same time, she could call them thirds. This is a common misconception in
learners who have not yet realised that thirds have to be equal in size. The teacher
establishes what the error is about when she acknowledges that the learner thought she
could just “stick it in here, and then a third, one, two, three parts, would be covered”. In
response to this statement, the teacher then refers back to the conceptual work that they
have done so far in the interview, establishing that the trapezium is a quarter of the
whole, and calls on more procedural activity. The conceptual links are implicit but the
procedural explanation is full – the learner must now work out how many
parallelograms will fit into the bigger shape.
Minute 5 [codes: Proc (F), Con (F)]
Teacher: How many of these…like you did the trapezium (points to trapezium) to fit in there (points to top shape), how could we draw it to try and get these (points to parallelogram) to go in there (points to top shape)? Or what would you do to try and get them into there?
Learner: Um…(rubs out previously drawn lines) I think…I would draw in half again (draws a horizontal line through top shape) and then there would also be (draws lines to make 4 rhombi) four that could fit.
Teacher: Ok, perfect. So there's also four that fits. So now if we go back to the question: what fraction would be covered by each of the shapes? So if I took one of these (points to parallelogram) and put it in any one of them (points to learner's 4 drawn parallelograms), what part of the whole is coloured in?
Learner: A quarter.
Teacher: Again it's a quarter. Ok, now the tricky one. The triangle. How are we going to work out, if I take one of these (points to triangle in 2nd row), and put it into the whole (points to top shape), what fraction that is?
Learner: (rubs out lines again) I'd draw a half (draws horizontal line through top shape again),
Analysis – Minute 5
The teacher continues to probe the learner to think conceptually while she works
through the next step of the activity (which involves procedural activity). The
procedural explanation includes all of the required key steps (“How many of these…like
you did the trapezium (points to trapezium) to fit in there (points to top shape), how could
we draw it to try and get these (points to parallelogram) to go in there (points to top shape)?
Or what would you do to try and get them into there?”) The conceptual explanation
(“So if I took one of these (points to parallelogram) and put it in any one of them (points to
learner’s 4 drawn parallelograms), what part of the whole is coloured in?”) includes the
background and process required in the procedure – it explains how the learner must
use what she has done when she finds out how many parallelograms can be drawn to fit
into the bigger shape.
Minute 6
Learner: …and then triangles (draws lines to make 8 triangles)…like how I did in the beginning…and then, yes…
Teacher: Ok, so now if we take one of them and covered part of that whole, what fraction of the whole would have been coloured in?
Learner: An eighth.
Teacher: An eighth. Ok, so now we know what those answers are: a quarter (points to trapezium), an eighth (points to triangle) and a quarter (points to parallelogram). Ok, so now it says here: if you only have two of these (points to trapezium), ok…what fraction of this whole picture (points to top shape) would be coloured in?
Learner: A half.
Teacher: It's a half.
Learner: Oh, I see now.
Teacher: Can you see?
Learner: Yes
Teacher: Alright, so now you got confused because you thought you had to take each of those (points to the 3 shapes in 2nd row) and stick them in separately (points to top shape), is that correct?
Learner: Yes
Teacher: Originally.
Learner: Yes
Teacher: But now how did you get a third (points to learner's answer)…if you took, like we drew, if you've got that parallelogram and that one's coloured in (shades parallelogram on top shape), and if we took the one triangle, and coloured it in (shades triangle in top shape),
Codes for this minute: Con (F), Awa (F), Diag (F)
Analysis – Minute 6
The learner is able to correctly find the fraction part which the triangle represents of the
bigger shape, showing that she is able to respond correctly to the teacher's conceptual
question "if we take one of them and covered part of that whole, what fraction of the
whole would have been coloured in?" The teacher reiterates the learner's error, referring
back to the learner's original comment ("because I thought just put like…it doesn't
matter what size they were as long as they fit in") – this comment is now made in the
light of the activity that has taken place in the interview. The teacher then probes the
learner's error further by asking "But now how did you get a third (points to learner's
answer)" and linking this question to the interview activity of finding parts of the whole
using the given shapes. This probe hones in on the error, which was that the learner had
called parts "thirds" when they were not actually thirds of the given whole because they
were not all equal in size.
Minute 7
Teacher: …and then the trapezium that you had originally drawn, and coloured that in (shades in trapezium on top shape). Now how did you get originally the answer of a third (points to learner's answer), because that's not a third of the picture (points to top shape), is it?
Learner: Because I thought that it could be any size something to do with three, and so I just put…because all three of them fit in there even though it's a different size, I just put a third for each of them.
Teacher: For each of them. Oh, ok. Perfect. Right. That's it. (counter at 6:30)
Codes for this minute: Diag (F), Mult (2)
Analysis – Minute 7
The teacher completes her probing question at the beginning of the seventh minute. The
learner is able to explain, with an awareness of what she has learnt in the interview (she
says “Because I thought that it could be any size something to do with three”) that she
had mistakenly called shapes of different sizes thirds (this is the same error that the
learner spoke about in minute 4). The teacher concludes the interview by expressing her
understanding of the learner’s error (“For each of them”).
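To recap the fraction relationships that the interview establishes (a summary of the transcript above, not part of the coded interview): four trapeziums tile the whole, so one trapezium covers 1/4 of it and two trapeziums cover 2 x 1/4 = 1/2; four parallelograms tile the whole, so one parallelogram also covers 1/4; eight triangles tile the whole, so one triangle covers 1/8. One of each shape therefore covers 1/4 + 1/4 + 1/8 = 5/8 of the whole, which is why the three unequal parts cannot each be "a third".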
4.2.4. Grade 9 learner interview
The learner interview is focused on the following questions, selected from the test set by
the DIPIP grade 9 teacher group. The teacher did not seem to have selected a particular
question with a particular error for his interview; rather, he went through the whole test.
The teacher began the interview by asking the learner about his response to the first
question and then continued to ask the learner about his answers to the rest of the
questions on the test as the interview progressed.
In the excerpt from the interview that follows, three questions are addressed.
1. The first question is a simple linear equation requiring the learners to solve for x.
2. The second question presents an open equation which the learners must
complete by inserting the correct operation symbols so that the equality of the
equation holds true. Instructions for the completion of the equation are given.
3. The third question presents an equation. There are two parts to the question: In
the first part the learners have to say whether or not the given equation is correct
and in the second part they are asked to correct the equation if it is not correct.
Figure 16: Extract from Grade 9 learner test
QUESTION 1
Solve for x:
5(x + 2) = 2x
QUESTION 2
Complete this statement so it is true that the LHS=RHS.
You may use any of the following operations in place of the *'s to complete the
statement.
+,-, ÷, x, ( )
6 * 8 * 4 = 24
QUESTION 3
5 + 9 = 14 ÷ 2 = 7 x 3 = 21
3.1 Is this statement correct?
3.2 If not, then what is a possible solution to the problem?
Figure 17: Learner working in Grade 9 interview
This image is not very clear since the teacher has signed over the learner's working. The
learner has written the following, upon which the first part of the interview is based:
5(x + 2) = 2x
5x + 2 = 7x
7x = 7x
Extract from learner interview
The interview is 30 minutes long. The extract below presents minutes 2-8 of the
interview.
Minute 2
Teacher: (continues) … here, eh, we, on question 1, this is a DIPIP planner test, hey?
Learner: Yes
Teacher: Ok we were requested to solve for x. Ok and uh as we have been requested to solve for x, you wrote 5 plus 2 … 5 into x plus 2 is going to be equal to 2x. And then on this statement here (points to paper) you wrote 5x plus 2 is equal to, is going to be equal to ..?
Learner: 7x
Teacher: 7x. And 7x is going to be equal to ..?
Learner: 7x
Teacher: Can you tell us how did you come to the answer?
Learner: Yes I wrote the test, but the test was too difficult because I was not prepared. The mistakes I have done in this sum it is because I said 5 times x is equal to 5x and then I said 5 times 2 is equal to … plus, uh… , plus 2x and then it equals to 7x. And then what I've done is that I said 7x is equal to 7x … because I was checking my sum.
Codes for this minute: Diag (P), Mult (1)
Analysis – Minute 2
The teacher probes the learner’s explanation of the error on the first question by first
referring to the learner’s working on his test paper and then asking an open question,
“Can you tell us how did you come to the answer?” This diagnostic probe does engage
with the learner’s reasoning but is broad and not yet honed in on the mathematical error
in question. The learner responds (giving his first explanation of an error). The learner
tries to explain what he was thinking when he did the working but he pauses and
stumbles through this explanation (“is equal to … plus, uh… , plus 2x”). The learner has
given a first explanation with which the teacher could now engage. This learner’s
explanation is not coherent and he clearly does not know how to simplify the brackets in
the equation or how to solve an equation. Part of the error is that the learner has applied
the distributive law incorrectly. This he explains very poorly (“The mistakes I have done
in this sum it is because I said 5 times x is equal to 5x and then I said 5 times 2 is equal to
… plus, uh… , plus 2x and then it equals to 7x.”). The next part of the error is that he
turned the equation into a statement of the equality of 7x and 7x; he has not actually
solved for x, which was required by the question ("And then what I've done is that I
said 7x is equal to 7x … because I was checking my sum”). The learner shifts the
emphasis of his explanation from the actual mathematics of the question, to the
procedure of “checking”, which he says he was doing.
Minute 3
Teacher: You were checking your sum?
Learner: Yes
Teacher: So here (points to paper) the question was that you must solve for x. So you said 7x is equal to 7x. What makes you think that your answer was correct?
Learner: It's because, what made my sum to be correct is that I think my sum is correct because I get 7x and then I started to say: no 7x is equal to 7x.
Teacher: But the question here (points to paper) was to solve for x. Say what is the value of x. That x is equal to so-and-so. So that's why I'm asking (points to paper) do you think that your answer is correct?
Learner: No, Sir you tell me that my answer is not correct but me I think, I think that my answer, I thought that my answer is correct because I saw 7x and then I write 7x is equal to 7x. But I was supposed to write x is equal to the answer.
Codes for this minute: Awa (N), Diag (I), Proc (P)
Analysis – Minute 3
In response to the learner’s explanation the teacher says “You were checking your
sum?”. The comment shows no attempt to establish awareness of the mathematical
content of the fuller explanation that the learner gave. The teacher only picks up on the
last part of the explanation (minute 2: “because I was checking my sum”) and when he
continues to probe does not respond to what the learner has said. Instead, the teacher
repeats what the question called for (“you must solve for x”) and asks “What makes you
think that your answer was correct?” The teacher then repeats a brief procedural
explanation of what the question was calling for, but this explanation though correct is
incomplete, because it does not include an explanation of the way in which the equation
needs to be simplified and the isolation of the variable required in the solution of the
equation. He just says, “Say what is the value of x. That x is equal to so-and-so”. He
then repeats his question, “So that’s why I’m asking (points to paper) do you think that
your answer is correct?” The learner responds, “No, Sir you tell me that my answer is
not correct but me I think, I think that my answer, I thought that my answer is correct
because I saw 7x and then I write 7x is equal to 7x. But I supposed to write x is equal to
the answer.” In his response the learner again shows little understanding of what he was
meant to do in order to solve the equation (multiply out the brackets, isolate the variable
and solve for x) and he repeats what he wrote in the test. But he now acknowledges that
he understands that he should have solved for x (“write x is equal to the answer”). In
this exchange the teacher shows no engagement with the learner's mathematical
reasoning; he only gives correctional instructions.
Minute 4
Teacher: x is equal to the answer. But you didn't do that. Instead you said 7x is equal to there (points to paper). So, but I want to find out. Here (points to paper) you said that … the question is 5 into x plus 2 is going to be equal to 2x. How did you get the 7 here? Because here (points to paper) you wrote the 2x here, but here it is now 7x. How did you find the 7x?
Learner: It's because of I add 5x plus 2x and then I got 7x, that's why I get 7x there.
Teacher: Oh ok, so you said 7x is equal to 7x, ok.
Learner: Yes
Teacher: Alright now I see. Ok now I want us to look at question 2. We'll come back to this one.
Learner: Yes
Teacher: And then because that is how you got the 7x (points to paper). But I realize that you said that you could have done this sum better by saying, by saying what?
Learner: 7x is equal to 7x.
Teacher: 7x is equal to 7x.
Learner: Yes
Teacher: By the way the question was that solve for x.
Learner: Solve for x.
Teacher: If this is so, what is it that you understand?
Learner: What I understand is that they want the value of x.
Codes for this minute: Diag (I), Mult (1), Awa (N), Proc (P)
Analysis – Minute 4
The teacher continues to probe the learner’s reasoning behind his error but with no
engagement with what the learner has offered by way of mathematical explanation so
far in the interview. The learner repeats his explanation of his error (more succinctly this
time): "It's because of I add 5x plus 2x and then I got 7x, that's why I get 7x there". The
teacher’s response to this “Oh ok, so you said 7x is equal to 7x, ok” shows no attempt to
engage with the learner’s inability to apply the distributive law. The teacher does not
pick up on the problem that 5(x + 2) ≠ 5x + 2x. After this, although the teacher says he
will “come back to this one” he repeats one more time that the learner had written “7x is
equal to 7x”. The teacher also repeats his correct but incomplete procedural explanation
that what the question actually required was to “solve for x”.
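For reference (this working is not offered at any point in the interview), correct application of the distributive law would have led to the solution the question was asking for:
5(x + 2) = 5x + 10, not 5x + 2x
so 5x + 10 = 2x
3x = –10
x = –10/3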
Minute 5
Teacher: They want the value of x, ok. To say x is equal to …
Learner: Equal to the answer.
Teacher: So but here did you do that?
Learner: No Sir
Teacher: Ok but you know I didn't understand something. When you write a test you become scared because you said you were scared.
Learner: Yes
Teacher: You were scared of what (laughs)?
Learner: I was not prepared, Sir.
Teacher: You were not prepared, so you were scared (smiles)?
Learner: Yes Sir
Teacher: Ok no don't be scared. Eh, question 2, let's look at question 2. In question 2 here (pulls out paper from back of pile), they say here: complete this statement so it is true that the left hand side is equal to the right hand side ok? (Hand appears on camera to remove clock which is blocking view of paper) Ok so they say: complete this statement so it is true that the left hand side is equal to the right hand side. You may use any of the following operation in the place of an asterisk or star to complete this statement. Now here they've given you the four basic operations … plus, minus, division, multiplication and also the brackets. Now they say 6 asterisk 8 asterisk 4 is going to be equal to 24.
Codes for this minute: Proc (P), Diag (I)
Analysis – Minute 5
The teacher repeats his incomplete procedural explanation (see minute 3) of what it
means to "solve for x" by saying that, "They want the value of x, ok. To say x is equal to
…" and follows this with a further probe, "So but here did you do that?" to which the
learner responds, "No Sir". This probing is again directed and does not engage with the
mathematical explanations the learner has already given. The teacher seems more
interested in the correctness of the learner's answer than in its mathematical content.
After a brief exchange with the learner about his being scared in the test because he did
not feel prepared for it, the teacher moves on to the second test question, reading
through the question's instructions. In doing so the teacher gives a procedural
explanation of what was expected in the second question, which is that learners
complete a given incomplete equation so that it expresses a valid equality. This
explanation does not include all of the key steps required in the solution of the question,
although it is a correct and full reading of the question itself. A fuller explanation could
have included an example of how to replace an asterisk with the appropriate operation.
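For reference (not an example the teacher offers), one completion that makes the statement true is 6 x (8 – 4) = 6 x 4 = 24.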
Minute 6
Teacher: I can see that you have written it there. But now, all of a sudden, you have written BODMAS and, uh, here they say, eh, the question was, uh, that you were supposed to use either plus, minus, division, multiplication or the brackets. But you've written BODMAS. Why did you write BODMAS here?
Learner: I wrote BODMAS because I want to do this sum step by step. Eh BODMAS means Bracket Of Division Must Addition and Subtraction. Now what I'm, what I do I start with the brackets and then I, I … my second step it was division. After division I get this answer (points to paper) and this answer I subtract with 8 and then give me 24.
Teacher: Ok now I see that.
Learner: Yes
Teacher: But now the way in which you were supposed to solve the problem which was that … 6 asterisk 8 …
Codes for this minute: Diag (P), Mult (2), Proc (P)
Analysis – Minute 6
The teacher refers to the learner’s test script (learner’s response to question 2) and says
that, “But now, all of a sudden, you have written BODMAS […] Why did you write
BODMAS here?” In this way the teacher does engage broadly with the learner’s
mathematical reasoning as written in the learner’s test script. The learner responds that,
“I wrote BODMAS because I want to do this sum step by step.” The teacher does not
respond to this explanation by showing any awareness of the learners’ mathematical
error. His response is, “Ok now I see that” which has no mathematical content. He
moves on to some more procedural explanation of the question (by way of reading out
the question itself) which is again accurate but incomplete since it does not expand on
what is required by the question.
Minute 7
Teacher: (continues) … you were supposed to put any of the basic operations to uh … make sure that the left hand side is equal to the right hand side, because that is what the statement is saying.
Teacher: Now, tell me, why do you think that your answer is correct?
Learner: My answer is correct because, eh … eh … I said 6 times 4 equals to 24 ne? And then I said 8 times 4 is equals to 32. And then I said 32 minus 8, it is equals to 24. That's proves me that my answer is correct.
Teacher: Ok, now, eh … why do you write equals sign in the same line? Equal to 32 minus 8 equal to 24? (Points to paper)
Learner: I was trying to show you, sir, that, um … I'm going step by step. I said 6 times 4 it is equals to 8 times 4. And then eh 32, eh, 8 times 4 times 4, 8 times 4 equals to 32. It equals to 32 minus 8 equals to 24.
Codes for this minute: Con (P), Diag (N), Mult (2), Awa (N)
Analysis – Minute 7
The teacher expresses the conceptual aspect (the equality of the two sides of an
equation) of the question, explaining that, “you were supposed to put any of the basic
operations to uh … make sure that the left hand side is equal to the right hand side”. He
then probes further, asking, “Now, tell me, why do you think that your answer is
correct?” Diagnostically this probe shows no connection to the learner’s mathematical
reasoning as expressed in the test. The learner again tries to explain his working (his
main point is that he was trying to do it step by step; he does not pick up on the errors
in the working he has written, he just reads it out as written), but the
teacher does not listen to what he says. The teacher pinpoints a technical error that the
learner has made in his written script, asking, “Ok, now, eh … why do you write equals
sign in the same line?” This technical error underlies in part the learner’s error (he is
recording incomplete mathematical number sentences and equating them) but the
teacher does not follow this through. The learner responds that it was because he was
trying to show the teacher that he is “going step by step”. The teacher’s response to this
explanation is given in the following minute.
Minute 8
Teacher: Ooh, ok, … alright. We'll also come back to that ok?
Learner: Yes
Teacher: Now let us look at our question 3. In question 3 you are given 5 plus 9 is equal to 14 divided by 2 is equal to 7 times 3 is equal to 21. Now they say: is the following statement correct? (Looks at L.)
Learner: No the following statement is not correct, sir.
Teacher: Now why didn't you write … because they wanted, they wanted I think maybe, a … a … an answer. But now you didn't give an answer whether is it correct or incorrect here.
Learner: Oh sir, I didn't understand that you want a… a… answer or the answer of 5 plus 9 minus 14, eh, divide by 2 equals to 7 times 3 equals to 21. I didn't understand that you want no or yes.
Teacher: Ok. But here they say: is the following statement correct?
Codes for this minute: Awa (N), Proc (P), Diag (I), Mult (3)
Analysis – Minute 8
The teacher does not respond to or explain the learner’s error, he simply says “Ooh, ok,
… alright. We’ll also come back to that ok?” and moves on to the next test question
(“Now let us look at our question 3”). He then gives a procedural reading of question 3
followed by another probe which does not engage with the learner’s mathematical
reasoning. The learner explains, “Oh sir, I didn’t understand that you want a… a…
answer” and goes on to explain that he thought maybe the “answer of 5 plus 9 minus 14,
eh, divide by 2 equals to 7 times 3 equals to 21” was what was needed. Here the learner
is trying to explain his confusion – that he had not understood that a “yes/no” answer
was required. The learner expands on the working out that he thought he was meant to
have given in answer to the question. Again, the teacher does not respond to the
learner’s explanation (which contains a similar error to that in minute 7 of equating
unequal numeric statements) but just reiterates what the test had asked, “But here they
say: is the following statement correct?” negating what the learner has said and
reinforcing his own authority rather than using the interview as an opportunity for
diagnosis and development of awareness of errors. The interview continues for a full 30
minutes, in a similar manner to the seven minutes presented here.
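For reference, the mathematical point that goes unaddressed is that an equals sign must hold across the whole chain. In the statement 5 + 9 = 14 ÷ 2 = 7 x 3 = 21 the chain fails as a single equality, since 5 + 9 = 14 but 14 ÷ 2 = 7, and 14 ≠ 7. A correct record of the intended working keeps the steps separate:
5 + 9 = 14
14 ÷ 2 = 7
7 x 3 = 21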
4.3 Qualitative Findings
The central research question is: What does the idea of teachers interpreting learner
performance diagnostically mean in the context of learner interviews? What do teachers, in
fact, do when they interpret errors with the learner present?
In order to use the qualitative data we formulated three specific questions that together
describe what teachers do when they work with learners' errors diagnostically:
1. In what ways does the teacher’s procedural/conceptual engagement facilitate the
learner’s conceptual/procedural understanding of the error?
Mathematical concepts and procedures are linked – in some ways they could be seen as
different expressions of the same thing – one being more practical and one more
abstract. When teachers are aware of the conceptual understanding required in order to
carry out certain procedures, or the procedural knowledge that when understood
fluently could facilitate conceptual understanding, they can use these two as required to
help learners develop their mathematical knowledge. In the example of the grade 4
learner interview, the teacher spends several minutes working very slowly and
systematically through the procedure of subtraction of two digit numbers, linking the
procedural explanation to the place values (conceptual knowledge) of the numbers
when necessary. In the Grade 5 interview on the other hand, throughout the whole
interview the teacher repeats the test question, reiterates what the learner was doing
wrongly in the test (minute 6), repeats what the learner is saying, asking irrelevant
question (minute 4), repeats the probing question (minute 4) but at no point does the
teacher, in fact, engage with the procedure or provides the learner with the relevant
concept, with which to view the procedure required by the question of converting the
mixed number into an improper fraction.
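For reference – assuming the test item was the mixed number 2 3/4 quoted in Appendix 4 (Interview 5) – the conversion that is never opened up in the Grade 5 interview is a single line of working:
2 3/4 = (2 x 4 + 3)/4 = 11/4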
The overall effect of a systematic procedural explanation (made with the necessary
conceptual links) is that the conceptual understanding of the learner is reinforced and
the procedural knowledge of the learner is developed. In the grade 4 learner interview,
in which the learner does not know how to subtract two-digit numbers from each other
when there is an impasse, the teacher works predominantly procedurally but at
particular moments of the interview connects the procedure to the underlying number
concept. Step by step the teacher guides the learner (see minutes 3 and 6), and checks
that she understands the instructions (see minutes 3 and 8). Whilst doing that (and whilst
linking the hands-on work with the unifix cubes to the subtraction of the two given
numbers in the example – see minutes 5 and 6), at specific points in the interview
the teacher brings in the conceptual background of the procedure. For example, in minute
3 she asks the learner to show her where the tens and the units are in the given
number of the example they are working on together, or asks her conceptual questions
such as "So can you now take away?" – calling on the learner to reason about whether or
not she has overcome the impasse (minute 8). The teacher keeps bringing the learner
back to identifying the numbers according to their place value, which links to the
underlying misconception evidenced in the test working of the learner. From this
example, we see that the teacher chooses when, and how, to illuminate the conceptual
background of the procedure. She does it by asking the learner questions that direct her to
use the concepts (for example minutes 3 and 4) or by thinking about the two concepts
relationally, as in "…does it look the same as the units?" (here the concrete unifix cubes
enable the learner to realise the difference between a digit in the units place and a digit
in the tens place).
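For illustration, the regrouping that the unifix-cube work models can be written out in one line (the test item, quoted in Appendix 4, was 76 – 19):
76 – 19 = (60 + 16) – (10 + 9) = (60 – 10) + (16 – 9) = 50 + 7 = 57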
In the Grade 5 and Grade 9 learner interviews, this is not happening. In the Grade 5
interview, after six minutes of interaction the partners are locked in the error – the learner
tries to explain it, the explanation gets ignored, and the teacher restates the incorrectness
of the learner's working and keeps asking the learner to explain the error, to show it, to
write it and to state the reasons. In contrast to the grade 4 learner, the grade 5 learner
loses her confidence in the interaction and states: "I'm not thinking right, teacher" (see
minute 5). In the grade 5 interview the teacher does not open up the mathematics of the
procedure (converting a mixed number into an improper fraction), let alone offer the
conceptual underpinning of the procedure. The grade 9 learner repeatedly acknowledges
that he has made errors, but this is more a defensive response to the teacher's ongoing
questioning about the correctness of his answer. The teacher's procedural contributions
are to repeat the literal meaning of "solve for x" by saying, "They want the value of x, ok.
To say x is equal to …" (minute 5). These statements do not respond directly to the
learner's mathematical explanations (which relate to his incorrect application of the
distributive law). Furthermore, he keeps moving on to the next test question, and so they
do not make any progress in unpacking the errors which are expressed.
In a focused interview the error is established by positing a relation between the
procedural and the conceptual, the learner is confident enough to articulate her/his errors
to the teacher, and the teacher's probing is focused on the mathematics.
2. What kind of probing is productive, in that it gathers relevant evidence for teachers
to make a judgement on the nature of the error and on when to establish it in the
course of the interview?
Probing with the intent to glean mathematically relevant information from learners is
productive. Productive probing can be open or directed; it relates to the learner's work
under scrutiny and it is in line with what the learner is in fact saying. Teachers need to
listen to their learners when they ask questions. In the example of the grade 4 learner
interview the teacher allows the learner to explain her working in full and does not
interrupt her or correct her (minute 2). This is very different to the Grade 5 interview, in
which the teacher interrupts the learner (minute 4) and even insults him (minute 5). In
the Grade 9 interview the teacher asks the learner for an explanation of the error and the
learner makes a stumbling attempt to explain it (minute 2) – the error seems to be that he
does not know how to simplify the brackets in the equation – but the teacher ignores the
learner's verbalised mathematical confusion and only addresses (see minute 3) the
learner's last few words, in which the learner in fact shifts his explanation to the
procedure of "checking".
In the DIPIP interviews teachers were told to keep their minds open and to listen and
respond to their learners. The qualitative (and the quantitative) analysis shows that
many of the teachers seem more intent on probing in decontextualized ways (i.e. not
related to what the learner verbalizes). They do that when:
• they expose the learners' lack of knowledge (see Grade 5 learner interview, minute 4);
• they re-teach the content, as in the grade 5 learner interview where the teacher uses leading questions instructing the learner to write the mixed number "correctly" although the error has not been established yet (see minute 7) and, close to the end of the interview, shows the learner the correct procedure for converting a mixed number into an improper fraction; or
• they repeat the test question several times without addressing the problem, as in the grade 9 learner interview where the teacher repeats a brief procedural explanation of what the question was calling for (see minutes 3, 4 and 5) and does not address the problem – the learner's inability to apply the distributive law.
None of these ways is productive, as none of them helps the teacher to establish a
focused interaction on the error. In these three instances, the teacher's response does not
shift the level of discussion from what the learner did to the nature of the error.
At the beginning of an interview a broader probe is probably suitable, but as the
interview progresses the teacher needs to take up the answers given by the learner, and
her/his own knowledge of the field, and delve more deeply into the mathematical
content related to the error. In this way the interview moves from a broad focus to a more
mathematically focused way of probing, and questions become more directed or
specific. In the Grade 7 learner interview, for example, the teacher starts with a very
broad question, "what did you find difficult with this question?" (minute 1). As the
interview progresses the probing becomes more specific and contextually relevant to
what the learner is verbalizing. First the learner says that she was confused by the
question and was not sure what she had to do. This is still broad (and continues in
minute 2). During this time the teacher is trying to call out the error and does so partially,
in line with her broad treatment of the error at this stage of the interview. But then the
teacher changes the type of questions – they become more specific in terms of the
mathematical content (for example, "How would you fit this trapezium … into here… to
work out how many of them fit into the whole?" in minute 3) or even more specific
("what did you work out that you got a third for every part?" in minute 4). She also
relates her question to the test question ("So now when you go to this question here, it
says: what fraction of the shape would be covered by each of these shapes? So if I took
one of these (points to trapezium) and put it into the shape (points to top shape), what part
of that whole would be shaded or covered?" in minute 4). This allows a process in which
the specific nature of the error gets established, in response to what the learner
verbalizes. This mode of probing enables the teacher to gather evidence from the learner
with which she continues to establish the nature of the error, using explicit mathematical
language. So in minute 4 the learner says:
Because I thought that with all, I put…in the beginning, I originally thought the
trapezium here (traces another line on the shape of ‘the whole’) and the parallelogram
would be here (traces another shape on shape of ‘the whole’), and there would be the
triangle. And I thought that they each take up a third. So I put one third for each
of them.
And the teacher responds, using the opportunity to state the error:
Oh, I see! So that’s what you actually understood the question that you take each
part (points to 3 shapes in 2nd row), stick it in here, and then a third, one, two, three
parts, would be covered. Now fractions are obviously equal parts, is that not so?
In the first four minutes of the conversation, the interaction changes in a very controlled
way. It moves from the learner's feeling confused about the question and not being sure
what she has to do (minutes 1 and 2), to trying out fitting the trapezium (minute 3), and
lastly to stating that she thought that the three different, unequal shapes could be thirds
("each take up a third"). The last of these moves by the learner enables the teacher to
define the error more fully ("fractions are obviously equal parts, is that not so?"). This is
in direct contrast to the Grade 5 learner interview, in which 9 out of the 13 minutes are
spent on repeated instructions as to from where and how the learner should re-write his
working. Throughout the interview the error is repeated, and awareness of the error
does not develop as a result of what the teacher or the learner says. Two minutes before
the end of the interview the teacher shows the learner the procedure for converting a
mixed number into an improper fraction.
Thus teacher’s probing the learner is an act of judgement about what to take up from
what the learner brings, how to connect it to the related mathematical knowledge, what
specifically to address in the next probing and more broadly, when and how to ask the
learner for further information.
3. Given that in the context of an interview partial explanations would have a place,
what is the role of a full explanation?
The progression of a discussion, if it is to be interactive, will naturally result in more
partial explanations as the interview progresses. Nevertheless, by the end of a successful
interview the learner should have reached the point where he/she is able to give a full
explanation of the concept.
The example of the Grade 4 learner interview given above shows the way in which
carefully structured, progressive partial explanations build up to a comprehensive and
well-scaffolded full explanation during the course of an interview.
interview, the learner had recognised the error in her initial working and was able to
correctly perform the procedure required to subtract one two-digit number from another
in an instance where “carrying” was required.
Section 5: Conclusion
Recommendations for professional development and for
further research
The interview activity undertaken during the last six months of DIPIP Phases 1 and 2
highlighted the difficulty that teachers have when dealing with learners’ mathematical
errors in the context of conversation with their learners. DIPIP teachers carried out these
interviews having spent more than two years in the project, working in groups with
colleagues and district officials under the guidance of group leaders who were highly
qualified mathematics education experts (university staff members or post graduate
students). These could be considered ideal circumstances for optimal performance, and
yet the teachers struggled to engage meaningfully with their learners in relation to
errors they had made. This evidence necessitates caution with regard to the potential of
developing teacher competence in addressing learner errors simply through
involvement in guided group discussions. The DIPIP findings did not relate directly to
teacher knowledge, but the implications of the findings could be indicative of this
knowledge. The poor overall demonstration of awareness of error (in relation to
diagnostic reasoning demonstrated through the use of probing questions) could point to
a general weakness in the mathematical content knowledge of the teachers.
Recommendations for professional development
Teacher involvement in activities such as learner interviews is useful and can be
undertaken as a professional development activity in the context of guided group
discussions. It should be noted, however, that teachers find it difficult to engage
meaningfully with their learners with regard to the errors that they make. Hence careful
planning is required in relation to the mathematical content development necessary for
meaningful discussions of errors that arise in these content areas.
Recommendations for further research
Further research could be done into the relationship between teachers' knowledge of
mathematical content and their ability to engage diagnostically with learners' errors and
demonstrate an awareness of mathematical errors.
Appendices:
Appendix 1
Interview Coding Criteria
Category descriptors: Learner interviews

Procedural
The emphasis of this code is on the teacher's procedural explanation of the error. Teaching mathematics involves a great deal of procedural explanation, which should be done fully and accurately for the learners to grasp and become competent in working with the procedures themselves.
Full: When the teacher explains the error to the learner/probes further, the teacher demonstrates a procedure. The procedure is accurate and includes all of the key steps in the procedure.
Partial: When the teacher explains the error to the learner/probes further, the teacher demonstrates a procedure. The procedure is accurate but it does not include all of the key steps in the procedure.
Inaccurate: When the teacher explains the error to the learner/probes further, the teacher demonstrates a procedure. The teacher's use of the procedure is inaccurate and thus shows lack of understanding of the procedure.
Not present: No procedural explanation is given.

Conceptual
The emphasis of this code is on the teacher's conceptual explanation of the procedure followed in the error. Mathematical procedures need to be unpacked and linked to the concepts to which they relate in order for learners to understand the mathematics embedded in the procedure.
Full: When the teacher explains the error to the learner/probes further, the teacher includes conceptual links. The explanation illuminates conceptually the background and process of the procedure.
Partial: When the teacher explains the error to the learner/probes further, the teacher includes conceptual links. The explanation includes some but not all of the key conceptual links which illuminate the background and process of the procedure.
Inaccurate: When the teacher explains the error to the learner/probes further, the teacher includes conceptual links. The explanation includes poorly conceived conceptual links and thus is potentially confusing.
Not present: No conceptual links are made in the explanation.

Awareness of mathematical error
The emphasis of this code is on the teacher's explanation to the learner of the actual mathematical error and not on the learner's reasoning.
Full: Teacher explains the mathematical error to the learner. The explanation is mathematically sound and suggests links to common misconceptions or errors.
Partial: Teacher explains the mathematical error to the learner. The explanation is mathematically sound but does not link to a common misconception or error.
Inaccurate: Teacher explains the mathematical error to the learner. The explanation is mathematically flawed.
Not present: No explanation is given of the mathematical error.

Diagnostic reasoning in feedback
The idea of error analysis goes beyond identifying a common error and/or misconception. The idea is to understand the way the teacher goes beyond the actual error to try and follow the way the learner was reasoning when s/he made the error. The emphasis of this code is on the teacher's attempt to engage with the learner on how the learner was reasoning when s/he was solving the question.
Full: Teacher seeks to find out the learner's mathematical reasoning behind the error. In response to the error the teacher probes further and asks the learner to explain the steps of her/his reasoning. Probing engages with the learner's reasoning and is open. Probing hones in on the mathematical error.
Partial: Teacher seeks to find out the learner's mathematical reasoning behind the error. In response to the error the teacher probes further and asks the learner to explain the steps of her/his reasoning. Probing engages with the learner's reasoning and is open but is too broad (not sufficiently honed in on the mathematical error).
Inaccurate: Teacher probes but does not seek to find out the learner's mathematical reasoning behind the error. In response to the error he/she teaches further. Probing is directed. The teacher uses only leading questions with little engagement with what the learner brings.
Not present: No attempt is made to probe or listen to the learner's mathematical reasoning behind the error. Re-teaching without any questioning.

Use of everyday knowledge
Teachers often explain why learners make an error by appealing to everyday experiences that learners draw on and confuse with the mathematical context of the question. The emphasis of this code is on the quality of the teacher's use of the everyday, judged by the links s/he makes to the mathematical understanding s/he attempts to advance, when engaging the learner on her/his incorrect answer.
Full: When the teacher explains the error to the learner/probes further, the teacher appeals to the everyday. The teacher's use of the 'everyday' enables mathematical understanding by making the link between the everyday and the mathematical clear.
Partial: When the teacher explains the error to the learner/probes further, the teacher appeals to the everyday. The teacher's use of the 'everyday' enables learner thinking but it does not properly explain the link to mathematical understanding.
Inaccurate: When the teacher explains the error to the learner/probes further, the teacher appeals to the everyday. The teacher's use of the 'everyday' dominates and obscures the mathematical understanding; no link to mathematical understanding is made.
Not present: No discussion of the everyday is done.

Multiple explanations of error
This code examines the teacher's probing/directing for learner explanations of their errors. One of the challenges in learner interviews is for teachers to give their learners the opportunity to voice their explanations of the error. This is because when learners give voice to the thinking behind their errors they can start to think for themselves about the flaws in their own explanations.
Full: Four or more explanations are given by the learner in the interview.
Partial: Three explanations are given by the learner in the interview.
Inaccurate: Two explanations are given by the learner in the interview.
Not present: One error explanation is given by the learner in the interview.
Appendix 2
Interview Coding Instrument
Video name (teacher and task):
The instrument is a grid with one row per minute of the interview (minutes 1-20) and one column per criterion:
Procedural
Conceptual
Awareness of mathematical error
Diagnostic reasoning in feedback
Use of everyday knowledge
Multiple explanations of error
Summary:
Appendix 3
Interview Coding Notes template
Table 1 (these codes were quantitatively summarised and used to inform the analysis).
Error focus notes
(based on detailed coding in excel sheet)
Teacher name and grade:
Teacher, grade 7 (see 4.2.3)
Time slot – Criterion and level assigned – Explanation, discussion and evidence
00:09 – Diagnostic reasoning and awareness of error, partial – Teacher clearly indicates to the learner which of the questions she had problems with and wants the learner to explain why she struggled with the question.
01:34 – Awareness of error, diagnostic reasoning, conceptual and procedural, partial – Teacher interprets the question for the learner. The learner then finds the question easy to do, which is fitting a trapezium onto a given shape.
03:02 – Awareness of error, diagnostic reasoning, conceptual and procedural, full – Teacher refers the learner to her incorrect answer and asks her how she got that, which the learner explains, saying she realises it doesn't make sense. She then asks the learner to fit a parallelogram onto the given shape (03:45), which the learner also finds easy to do.
04:43 – Conceptual and procedural, full – Teacher then asks the learner to fit a triangle onto the shape, which the learner finds easy to do.
05:43 – Procedural, conceptual and diagnostic reasoning, full – Teacher refers the learner back to her original answer again and asks her to explain how she got a third from fitting each of the pieces onto the given shape, i.e. trapezium, triangle and parallelogram. The learner says she got a third because she had three pieces to fit, regardless of the size of the pieces.
Table 2 (this information informed the analysis but was not quantitatively analysed).
General Interview summary (see separate sheet for criteria)
Teacher name and grade: Teacher, grade 7 (see 4.2.3)
Criterion – Level – Explanation, discussion and evidence
Interaction – 3 – Teacher created an atmosphere that is conducive for the learner to express herself with confidence.
Conceptual development – 3 – Teacher shows an excellent mastery of the concepts embedded in the question through her interpretation of the question and guiding of the learner.
Procedural development – 3 – Teacher works very well in explaining how the learner should approach the question.
Teacher mathematical knowledge – 3 – Through the interview, the teacher demonstrates strong mathematical knowledge.
Questioning – 3 – Teacher follows an acceptable format of posing questions to the learner, and gives sufficient time to respond. She also probes the learner to explain how she arrived at her incorrect answers.
Use of concrete material – 1 – No concrete materials are used in the interview.
Appendix 4
Example errors verbalised by learners in their interviews
Interview 1: Grade 3
Problem: Learner was trying to work out 65 – 19 + 17.
Learner words: "I said 9+7 and I said plus 5." "Because 65-19.. I know how to do that but when I have to add 17 it is difficult. Then I said, I organised my numbers into units and tens and hundreds."
Errors:
Focuses on units in an isolated manner/uses place value haphazardly.
Ignores operation symbols/does not know how to deal with the three-number string.

Interview 2: Grade 3
Problem: Work out the number of rows needed to seat a certain number of people.
Learner words: "90 + 15 equals 105." "It would be like a times."
Errors:
Says that the number of spectators is the number of rows.
Not able to express the operation required as "division" (although if this were properly probed it could be that she is thinking about it in an inverse way and the teacher may have worked with this).

Interview 3: Grade 4
Problem: Learner was trying to work out 76 – 19.
Learner words: "Because you can't subtract 6 (points to the 6 of 76) from 9 (points to the 9 of 19). So I took 1 away from the 7 (points to the 7 of 76, crosses out the 7 and writes 6 above it) so it becomes 6 and this one becomes 7 (crosses out the 6 of 76 and writes 7 above it). Now you still can't subtract it, so I took away 1 again (crosses out the 6 written above the 7 of 76 and writes 5 above it) and it became 5. And I added this one (crosses out the 7 written above the 6 of 76 and writes 8 above it), so it became 8."
Errors:
"Borrows" from the tens repeatedly by only carrying one across to the units each time.
Does not understand relative values of digits according to place value.
Interview 4: Grade 5
Problem: Learner was explaining how she got the answer to the question: "If fencing is bought in lengths of 10 metres how many lengths must Mr Brown buy?"
Learner words: "Teacher by connecting all this length (indicates by circling both hands above paper), teacher." "I learned measure how much."
Errors:
Gives the perimeter as the answer, does not work out how many lengths need to be bought.
Tries to apply his knowledge of how to work out the perimeter ("measure how much") to this question.

Interview 5: Grade 5
Problem: Learner was trying to convert a mixed number (2 3/4) to an improper fraction.
Learner words: "Teacher I said 2 times 3, then I said is 6 and then I said 2 times 6 is equals to, is equals to 12 teacher. Then I said 12 over 6 teacher."
Errors:
Works incorrectly with the whole number and fractional parts.
Does not understand that the different forms of the number should be equal (represent the same value).

Interview 6: Grade 6
Problem: Learner was trying to label the vertical axis of a graph.
Learner words: "I was supposed to start at zero, when I started at one…"
Errors:
Started labelling incorrectly.
Interview 7: Grade 7
Problem: Learner was trying to work out fractional parts of a whole.
Learner words: "… because I thought just put like…it doesn't matter what size they were as long as they fit in there." "And I thought that they each take up a third. So I put one third for each of them."
Errors:
Not sure about the wording of the question and so thinks she has to fit the different shapes in at the same time.
Thinks that fractional parts do not have to be the same size.

Interview 8: Grade 7
Problem: Learner was trying to work out how many diamond-shaped quadrilaterals would fit into a star shape.
Learner words: Counts the shapes she has put in: "(Pointing at various sections of diag A) 1, 2, 3, 4, 5, 6. Ok." But does not count a 7th one that is also there in her drawing on the test answer sheet.
Errors:
Does not fit the shapes into the bigger shape without overlapping them – so not fitting them into the bigger shape correctly.

Interview 9: Grade 8
Problem: Learner was trying to create different equations using given symbols, for which she had to work out relative values.
Learner words: "So I understood that the one with 2, this had to be a bigger value in order for it to equal the one thing; and using 3 those had to be smaller values in equal in order to equal to the one value."
Errors:
Difficulty in understanding the wording of the question.
Did not work with the given information to work out the relative values of the symbols – made assumptions about "bigger" and "smaller" values.
Interview 10: Grade 8
Problem: Learner was trying to work out the missing value in the equation: "9 – 5 = ☐ – 9".
Learner words: "Because on the block it showed us, what is nine minus five? So I worked it out and it gave me five."
Errors:
Does not keep in mind the equality of both full expressions on the two sides of the equation.
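(For reference, keeping both sides equal gives 9 – 5 = 4, so the missing value must satisfy ☐ – 9 = 4, i.e. ☐ = 13.)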
Interview 11: Grade 9
Problems: Learner needed to label the axes of a graph; learner had to find an equation using a table of values.
Learner words: "I made it 0,5 for 5, because the graph was not big enough." "I just multiplied. I thought I might find the name of the graph."
Errors:
Does not work out an appropriate scale, renames the values to be written on the axes instead.
Arbitrarily works with tables of values with no method as to how to find out the equation they may represent.

Interview 12: Grade 9
Problem: Learner was trying to label the vertical axis of a graph.
Learner words: "As in for twenty-five (points to number on horizontal line), I did a two point five, to, two hundred and fifty (points to number on vertical line), because there was no twenty-five. So I used these variables (points to numbers on horizontal line) just to say this is five, even though it's zero comma five, I said this is five, and like that…"
Errors:
Reduced the sizes of the numbers so that he could fit them in according to the "scale" that he used.

Interview 13: Grade 9
Problem: Learner was trying to find the solution to the equation 5(x + 2) = 2x.
Learner words: "It's because of I add 5x plus 2x and then I got 7x, that's why I get 7x there." "And then what I've done is that I said 7x is equal to 7x because I was checking my sum."
Errors:
Does not know how to apply the distributive law correctly.
Does not know how to isolate x in the equation.
Does not realise that he has to give a value of x in answer to the question.
Appendix 5
Grouped Grade Graphs
Figure 5.1 Procedural explanations in interviews by grouped grades
Procedural explanations in interviews (percentage): Not present / Inaccurate / Partial / Full
Grade 3-6: 0.00 / 5.83 / 30.10 / 13.59
Grade 7-9: 0.00 / 5.00 / 29.17 / 2.50
Figure 5.2 Conceptual explanations in interviews by grouped grades
Conceptual explanations in interviews (percentage): Not present / Inaccurate / Partial / Full
Grade 3-6: 0.00 / 10.68 / 6.80 / 14.56
Grade 7-9: 0.00 / 7.50 / 12.50 / 3.33
Figure 5.3 Awareness of error in interviews by grouped grades
Awareness of errors in interviews (percentage): Not present / Inaccurate / Partial / Full
Grade 3-6: 9.71 / 0.00 / 9.71 / 5.83
Grade 7-9: 18.33 / 2.50 / 3.33 / 2.50
Figure 5.4 Diagnostic reasoning in interviews by grouped grades
Diagnostic reasoning in interviews (percentage): Not present / Inaccurate / Partial / Full
Grade 3-6: 14.56 / 41.75 / 15.53 / 17.48
Grade 7-9: 9.17 / 35.00 / 36.67 / 2.50
Figure 5.5 Use of the everyday in interviews by grouped grades
Use of everyday in interviews (percentage): Not present / Inaccurate / Partial / Full
Grade 3-6: 0.00 / 1.67 / 4.17 / 0.00
Grade 7-9: 0.00 / 0.83 / 0.83 / 0.00
Figure 5.6 Learner multiple explanations in interviews by grouped grades
Multiple explanations in interviews (number of interviews): One explanation / Two explanations / Three explanations / Four or more explanations
Grade 3-6: 1 / 3 / 1 / 1
Grade 7-9: 0 / 2 / 3 / 2