A Question Generation Framework
for Teachers
Nguyen-Thinh Le, Alexej Shabas, and Niels Pinkwart
Department of Computer Science,
Humboldt-Universität zu Berlin, Berlin, Germany
nguyen-thinh.le@hu-berlin.de
Abstract. We report on the results of a study with 105 teachers about the
practice of question asking in German schools. Most teachers deploy questions
as an important teaching method; however, only a small number of them possess
systematic question asking techniques and prepare questions systematically for
their lessons. This motivates us to propose a framework for question
generation that is intended to help teachers prepare questions for their
lessons.
Keywords: Question generation · Question taxonomies · Questioning techniques
1 Introduction
For school teachers, question asking is an indispensable teaching technique. Dillon [5]
investigated questions asked by teachers in 27 classes in six secondary
schools in the USA and reported that asking questions accounts for about 60% of a
teaching session. The benefits of question asking in teaching are multifaceted and have been
investigated in many research studies [8, 9, 13]. How do teachers ask questions in their
class? Which question asking methods are applied by school teachers? Indeed, teachers
could attend a training workshop or self-study guidelines for effective question
asking [9, 10]. Some effective question asking strategies and techniques have been proposed
and validated empirically. For example, Stahl [12] suggested that teachers should allow
a wait-time of three seconds after each question to help students think about the question and
verbalize a good answer. With respect to adapting questions to students, researchers
have reported conflicting results. While Chafi and Elkhouzai [3] suggested asking more
questions of higher cognitive levels, Chin [4] argued that higher-order questions
would not necessarily motivate students to give answers at the required level. Instead,
Chin proposed a cognitive ladder for question asking, in which teachers first ask
lower-order questions before moving on to higher-order questions. Unfortunately, this
cognitive ladder has not yet been validated. In this paper, we investigate the practice of
question asking in German schools. The collected best practices of question asking are
intended to serve as the basis for a framework for automatic question generation, which may
help teachers prepare questions for their lessons systematically.
© Springer International Publishing AG, part of Springer Nature 2018
C. Penstein Rosé et al. (Eds.): AIED 2018, LNAI 10948, pp. 182–186, 2018.
https://doi.org/10.1007/978-3-319-93846-2_33
2 Pilot Study
We conducted an online questionnaire study. Participation was anonymous and
voluntary; we can therefore expect that the teachers who took part were not under
time pressure and answered honestly. In order to collect answers from teachers
in different federal states of Germany, we announced our survey via the
“Education Servers”1 in Germany. 143 teachers participated in the survey. Some of
them did not complete the questionnaire; after removing these incomplete entries,
105 complete data sets remained. The questionnaire was divided into three
sections. The first section was
intended to collect general data about the teacher. The second section concerned the
aims of asking questions and the frequently used specific question types. The third
section focused on the question strategies, i.e., when a question of which type should
be asked (e.g., sequence of questions of different types), the questioning techniques,
i.e., how a question should be asked (e.g., verbalization), the wait-time for students,
and the question taxonomies used (e.g., Bloom’s [1], Wilen’s [14]).
Table 1 summarizes results of the study. The participants reported that they had
long teaching experience (on average 19 years), and thus their reports on
question asking can be considered reliable. Teachers reported that they ask about 16 questions
and prepare about six questions per teaching session. (Note that answers were
given from memory; these numbers should therefore be considered teachers’
estimates rather than exact statistics.)
Regarding the aims of question asking, the results of this study are partly in accordance
with previously reported research. A recent study conducted in Moroccan
schools reported that most questions aim to check the understanding of students
[3]. Further aims of question asking found by that study are to recall factual
knowledge and to diagnose the difficulties of students. Another study [6]
investigated the cognitive support of question asking practice in secondary
schools in Pakistan. 267 questions were collected and classified. Khan and Inamullah
[6] reported that 67% of questions serve to test the knowledge level as well as to recall
students’ memory, and 23% of questions are used to support the understanding of
students. A further study showed that about 10% of questions used by teachers are
intended to stimulate students’ thinking [2]. Our study shows similar results in that
the participants use 10.66% of questions for reflection and critical thinking. We also
learn from our results that in German schools, teachers deploy a large proportion
of questions to enhance motivation (19.29%) and to support student-teacher interaction
and possibly also student-student interaction (19.80%). To the best of our knowledge, these
results have not been reported in previous research studies on question asking.
Regarding specific question types used by study participants, in addition to the most
frequently used question types (W-questions, open questions, questions using
subject-specific operators), six teachers could only give examples instead of naming
question types. Four teachers stated that suitable question types depend on the
learning/teaching subject, the age of the students, and the specific situation,
for example, “questions vary according to the grades,.., for younger students
questions are verbalized in the form of a request and sound friendly rather than
imperative.”

Table 1. Study results.

1. Teaching subject: 42.01% Natural Science; 31.6% Language; 21.18% Social Science; 5.21% Sport
2. School level/type: 42.86% Secondary level 1 (grades 5–10); 41.55% Secondary level 2 (grades 11–12/13); 2.6% Vocational schools; 3.25% Special needs schools
3. Teaching experience: Between 1 and 42 years, on average 19 years
4. No. of questions per teaching session: 16 asked questions (6 prepared questions)
5. Aims of question asking: 22.83% Enhance understanding; 19.80% Support interaction; 19.29% Motivate/stimulate students; 6.6% Support analysis; 3.05% Examine/test; 5.08% Evaluate learning performance
6. Frequently used question types: 60%: No question types; 39.2%: Yes: W-questions(a) (13 times), open questions (10 times), questions using subject-specific operators(b) (3 times)
7. Question strategies: 74.7%: No strategy; 25.3%: Yes, with strategies
8. Questioning techniques: 62%: No technique; 38%: Yes, with techniques
9. Question taxonomy: 64.4%: No question taxonomy; 31.6% Bloom’s; 4% Wilen’s
10. Wait-time: On average, 1 min

(a) Not all W-questions in the German language are identical with W-questions in English.
(b) Question operators are verbs for defining tasks on each competency level of each subject.

1 The “Education Server” (“Bildungsserver” in German) is a platform in Germany where
information about education in each federal state is published.
Regarding the strategies of question asking (e.g., the sequence of questions of different
types), 25.3% of participants (Table 1, Question 7) described their individual strategies,
which are very diverse. For example, one teacher described her strategy as
“questions should be asked from the simplest to the most complex” and
“involve the students with weak performance first, then students with strong
performance”. Another teacher suggested asking questions “from most difficult to easiest”
and “from general to detail”. Regarding the techniques of question asking (e.g.,
verbalization), similarly, 38% of participants reported individual questioning
techniques (Table 1, Question 8), e.g., “raise motivation through discrepancy”,
“if possible, no decision questions”, “expression should be diversified”. 64.4% of
participants do not know any question taxonomy. These results indicate that teachers
lack a systematic approach to question asking. One possible explanation is that
question asking has not been integrated into the curriculum for teacher training.
To address this deficit, we intend to develop a framework for automatic question
generation in order to help teachers (especially pre-service teachers and teachers
with little teaching experience) prepare questions for their lessons systematically
and, in the long term, internalize question taxonomies and some question types.
3 An Adaptable Framework for Question Generation
Since the strategies and techniques of question asking reported by the teachers who
participated in the study are diverse and have few features in common, it would not
be easy to integrate them into a framework for automatic question generation. Since
31.6% of the participating teachers know Bloom’s question taxonomy, integrating it
into the framework for automatic question generation is a sensible choice. To help
teachers apply question taxonomies and special question types systematically, our
proposed framework for question generation uses a set of question templates based on a
specific question taxonomy. To integrate frequently used question types into the
framework, a list of templates for W-questions may be specified. Open questions could
be deployed using the question classes “application” and “evaluation” of Bloom’s
taxonomy or by applying the six classes of Socratic questions [11]. Similarly, question
templates using operators for a specific subject could also be specified. Questions can
then be generated using the specified templates. A detailed description of the question
generation process can be found in [7].
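As an illustration, the template-based generation step can be sketched as follows. This is a minimal sketch rather than the implementation of [7]; the template texts and the Bloom’s-taxonomy labels are hypothetical examples.

```python
# Minimal sketch of template-based question generation (not the
# implementation of [7]). Each template carries a hypothetical
# Bloom's-taxonomy label and a placeholder for a concept.

TEMPLATES = [
    ("remember",   "What is {concept}?"),
    ("understand", "How would you explain {concept} in your own words?"),
    ("apply",      "Where could {concept} be used in practice?"),
    ("evaluate",   "What are the advantages and disadvantages of {concept}?"),
]

def generate_questions(topic, related_concepts):
    """Instantiate every template with the topic and each related concept."""
    questions = []
    for concept in [topic] + related_concepts:
        for level, template in TEMPLATES:
            questions.append((level, template.format(concept=concept)))
    return questions

# Example: a lesson topic with two related concepts, as might be
# retrieved from a semantic/lexical database such as ConceptNet.
qs = generate_questions("HTML", ["tag", "element"])
print(qs[0])  # -> ('remember', 'What is HTML?')
```

Since every retrieved concept is combined with every template, the number of generated questions grows quickly, which is why ranking the relevance of the generated questions becomes necessary.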
The initialized question templates can be modified by teachers to adapt to their
students (e.g., according to students’ age). The question generation algorithm uses a
lesson topic as input and generates questions using the templates. The question generation process also considers relevant concepts related to a lesson topic by querying a
semantic/lexical database (e.g., ConceptNet, WordNet). Since many relevant concepts
may be retrieved from such a database, and thus many questions may be
generated by inserting the retrieved concepts into the pre-specified question
templates, approaches for ranking the relevance of generated questions need to be
researched. In this paper, we investigate the applicability of the vector space model
approach, because it suits the questions generated by our framework. A detailed
description of this ranking algorithm is available in [7]. To research whether the vector
space model based ranking algorithm contributes to more relevant questions, we
integrated that algorithm into the framework for question generation. We used the
German version of the semantic database ConceptNet to retrieve relevant concepts
related to a lesson topic. We acquired participants (computer science teachers) for this
study by contacting 25 schools in Berlin and via the “Education server” platform of
each federal state in Germany. There were ten participants: nine computer science
school teachers and one university professor. Since the professor taught computer
science, this participant was also included in the analysis. The participants had an
average of 11.7 years of teaching experience. One participant had problems accessing
the system; hence, nine subjects remained in the study. We collected sixteen lesson
topics that the participants entered into the framework for question generation: HTML,
Encryption, Java, Algorithm, Automaton, Object, Parser, Grammar, Recursion, Network,
Database, Compiler, Object-oriented programming, Turing machine, Byte,
Programming. Out of all the generated questions, 171 questions were selected as useful
by the participants. Then, we compared the number of selections made by participants
against the ranking position calculated by the ranking algorithm. The ranking algorithm
demonstrated mixed results. For example, for the lesson topic “HTML”, the related
concept “Xhtml” was ranked on the first place, followed by the concepts “Tag” and
“Element”. The questions generated using the concept “Xhtml” were marked as useful
23 times by participants, followed by the concepts “Tag” (eight times) and
“Element” (six times). This might indicate that the ranking algorithm achieved good
accuracy for the lesson topic “HTML”. For the lesson topic “Network”, the related
concepts on the first two places, “Net” and “Internet”, yielded the questions most
often selected as useful. However, questions generated using the concepts on the 3rd
and 4th ranking places (“System” and “Computer network”) did not agree as well with
the participants’ selections. Concepts on ranking places 5 to 10 (“Al Qaida”, “Terror
network”, “Client”, “Link”, “Backbone”, “Terror cell”) were rarely marked as useful
by participants. Thus, for the topic “Network”, the ranking algorithm achieved
relatively good performance. However, the concepts “Al Qaida”, “Terror network”,
“Backbone”, and “Terror cell” should have been eliminated by the ranking algorithm,
because they do not belong to the context of computer science. The problem is that the
concept “Network” is associated with different contexts in the ConceptNet database.
Thus, we plan to improve the ranking algorithm by distinguishing the contexts of
related concepts.
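The vector-space ranking summarized above can be illustrated with a small sketch. This is not the algorithm of [7]; it is a minimal bag-of-words cosine-similarity ranking over hypothetical concept descriptions, showing how a topic-specific description can push out-of-context concepts such as “Terror network” down the ranking.

```python
# Minimal sketch of vector-space ranking (hypothetical data; the
# framework's actual algorithm is described in [7]). Each candidate
# concept is represented by a bag-of-words vector built from a short
# textual description, and candidates are ranked by cosine similarity
# to the lesson topic's description.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_concepts(topic_text, candidates):
    """Return candidate concept names sorted by similarity to the topic text."""
    topic_vec = Counter(topic_text.lower().split())
    scored = [(cosine(topic_vec, Counter(text.lower().split())), name)
              for name, text in candidates.items()]
    return [name for score, name in sorted(scored, reverse=True)]

# Hypothetical descriptions for the "Network" example from the study:
candidates = {
    "Internet": "a global computer network connecting many networks",
    "Terror network": "a covert organization of terror cells",
}
ranking = rank_concepts("a computer network connects computers", candidates)
print(ranking)  # -> ['Internet', 'Terror network']
```

The sketch also shows where the approach breaks down: a concept whose description happens to share vocabulary with the topic will rank highly even if it belongs to a different context, which motivates the planned context-aware improvement.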
References
1. Bloom, B.S.: Taxonomy of Educational Objectives: Cognitive Domain. Addison Wesley,
New York (1956)
2. Brown, G., Wragg, E.C.: Questioning. Routledge, London (1993)
3. Chafi, M.E., Elkhouzai, E.: Classroom interaction: investigating the forms and functions of
teacher questions in Moroccan primary school. J. Innov. Appl. Stud. 6(3), 352–361 (2014)
4. Chin, C.: Classroom interaction in science: teacher questioning and feedback to students’
responses. Int. J. Sci. Educ. 28(11), 1315–1346 (2007)
5. Dillon, J.T.: Questioning and Teaching: A Manual of Practice. Croom Helm, London (1988)
6. Khan, W.B., Inamullah, H.M.: A study of lower-order and higher-order questions at
secondary level. Asian Soc. Sci. 7(9), 149–157 (2011)
7. Le, N.T., Shabas, A., McLaren, P.: QUESGEN: a framework for automatic question
generation using semantic web and lexical databases. In: Frontiers of Cyberlearning,
Chap. 4 (2018). ISBN (print): 978-981-13-0649-5, ISBN (electronic):
978-981-13-0650-1
8. Lin, L., Atkinson, R.K., Savenye, W.C., Nelson, B.C.: Effects of visual cues and self-explanation prompts: empirical evidence in a multimedia environment. Interact. Learn.
Environ. J. 24, 799–813 (2014)
9. Morgan, N., Saxton, J.: Asking Better Questions. Pembroke Publishers, Markham (2006)
10. Ontario Ministry of Education: Asking effective questions. Capacity Building Series (2011)
11. Paul, R., Elder, L.: Critical thinking: the art of Socratic questioning, Part I. J. Dev.
Educ. 31(1), 36–37 (2007)
12. Stahl, R.J.: Using “Think-Time” and “Wait-Time” Skillfully in the Classroom. ERIC
Clearinghouse for Social Studies/Social Science Education, ED370885 (1994)
13. Tenenberg, J., Murphy, L.: Knowing what I know: an investigation of undergraduate
knowledge and self-knowledge of data structures. Comput. Sci. Educ. 15(4), 297–315
(2005)
14. Wilen, W.W.: Questioning Skills for Teachers. National Education Association,
Washington, DC (1991)