
Promoting active learning in Mathematics – a ‘Problems First’ approach.

 Donal Healy

Martin Marjoram

Ciaran O’Sullivan

 James Reilly

 Paul Robinson

5th Annual Conference in Mathematics and Statistics Service Teaching and Learning

IT Carlow and ITT Dublin, 24th and 25th May 2010


‘Problems First’ Project

Overview of Talk

 ‘Problems First’: brief background to the project

 The 3 sub-projects:

1. ‘Problems First’ for Mechanical Year 1 Group
2. ‘Problems First’ for Pharmaceutical Science Year 1 Group
3. Reflective Sheets in Key Skills Testing in Electronic Engineering Year 3

 Layout for each description:

Implementation

Student Feedback

Effectiveness of approach

Lessons Learnt


‘Problems First’ Project – Background

 Motivation: the ongoing challenge of engaging students in active learning in mathematics

A move from a traditional to a fully enquiry-based approach is daunting

An approach of incremental change in delivery is under investigation at ITT Dublin

 Funded by the SIF CONTINUE Innovations in Teaching, Learning and/or Inclusive Education Project

 Main aim: encourage active learning


‘Problems First’ Project – Background

 Staff identified areas or aspects of 3 existing Mathematics modules to be modified to:

 improve student engagement with the module

 increase student reflection on their learning

 and hence lead to improved learning

 Designed a questionnaire for evaluation of the student experience of this approach


‘Problems First’ for Mechanical Year 1 Group

Why?

 Mathematics 1 module, 1st semester of the level 7 degree in Mechanical Engineering

6 review sections at the beginning of the module, making up 40% of the overall course

Lecturer unhappy with the traditional approach for the review part

New approach (following Christenson’s input at the 14th SEFI MWG conference in 2008): materials to be studied are introduced via problem sets given to the students to work on first,

 followed by a subsequent lecture session to deal with any issues arising and to recap the material.


Implementation (Mechanical Year 1)

For each of the 6 review sections, students were first given a problem sheet; this was augmented where necessary by input from the lecturer.

What happened:

In most sessions students worked on the problems, with the lecturer helping individuals or small groups when his help was sought.

Some sessions were devoted to recapping and summarising key concepts that had arisen from the review problem sheets.

What was needed:

Class materials to enable this approach:

 Problem sheets (modified, with detailed solutions added)

 Topic notes (gap notes edited and filled)

A reflective diary template for the lecturer.


Student Feedback (Mechanical Year 1)

Questionnaire

4-point Likert scale:

 Agree Strongly, Agree, Disagree and Disagree Strongly

 5 areas:

Area                                         Questions
Facilitation                                 1 to 4
Documentation                                5, 6
Organisation of learning                     7
Demonstration of learning                    8
Group learning – dynamics and processes      9 to 12


Questionnaire Responses

Mechanical Year 1 (n = 21)

 Most students agreed or agreed strongly with most statement categories.

In particular there was strong agreement regarding the materials used, the ease of asking questions and confidence in answering exam questions on these topics.


Effectiveness of approach (Mechanical Year 1)

 Comparison of student performance between the 2009 ‘Problems First’ group and the 3 previous academic years.

 Benchmark of prior attainment using Leaving Certificate Mathematics grade (points):

                              Average Leaving Certificate Mathematics points score
2006, 2007, 2008 students     42.5
2009 students                 38.24

Mathematics 1:

Same lecturer

Method and standard of assessment components kept equivalent

All questions compulsory on the end of semester examination.


Effectiveness of approach (Mechanical Year 1)

 Two measures:

1. Improvement in student mark between a one-hour diagnostic test (administered at the first lecture) and an equivalent diagnostic test re-take (administered after the review material has been completed)

2. Student performance on the end of semester examination

Mechanical Engineering student performance comparisons:

                                                       2006, 2007, 2008 students   2009 students
Average improvement between diagnostic test
and CA test results (%)                                40.10                       25.57
Average end of semester examination result (%)         50.92                       58.14
Number of students taking examination                  48                          21


Lessons learnt (Mechanical Year 1)

 Food for thought!

No short-term improvement in learning of the review material

Overall improvement in module performance

Variation in test improvement scores and in examination performance was significant at the p = 1% level

Used one-way ANCOVA (analysis of covariance), with, for example, year as factor, Leaving Certificate Mathematics points as covariate and test improvement score as response; a sketch of this kind of analysis follows.
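As an illustration of the kind of one-way ANCOVA described above (a sketch under assumed data, not the authors' actual analysis), the model can be fitted in Python with statsmodels; the data frame and all values below are invented placeholders.

```python
# Sketch of a one-way ANCOVA: test improvement score as response, year as
# factor and Leaving Certificate Mathematics points as covariate.
# The data below are invented placeholders, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "year":        ["2008", "2008", "2008", "2009", "2009", "2009"],
    "lc_points":   [45, 40, 50, 35, 40, 30],
    "improvement": [42, 38, 45, 28, 24, 20],
})

# Fit the linear model with the year factor and the LC-points covariate.
model = smf.ols("improvement ~ C(year) + lc_points", data=df).fit()

# Type-II ANOVA table: F-test for the year effect adjusted for LC points.
print(anova_lm(model, typ=2))
```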


Lessons learnt (Mechanical Year 1)

 Observations

 an improved level of engagement by the 2009 student cohort

continued student effort was evident late in the semester

a statistically significant improvement, at the p = 1% level, on examination questions for topics covered later in the semester, for example:

                              Average mark on the examination question on the last topic covered in the semester
2006, 2007, 2008 students     36.2%
2009 students                 55.5%

Future plan

Repeat the ‘Problems First’ approach,

but refine the materials and approach for the review sections, with insights recorded in the reflective diary being key in informing changes.


‘Problems First’ for Pharmaceutical Science Year 1 Group

Why?

 Mathematics 1 module, 1st semester of the level 8 degree in Pharmaceutical Science

A section traditionally ignored by this class:

not attempted by many in the exam

poor marks from those who did attempt the question

 Encourage students to engage with the lecturer outside class hours

 One hour less per week, class size doubled


Implementation (Pharmaceutical Science Year 1)

Unseen problem

Groups of 4

Terms not explained

 10 weeks

 Series of introductory tasks

 Regular opportunities to meet with lecturer


Questionnaire Responses

Pharmaceutical Science Year 1 (n = 26)

There was strong agreement with the following statements:

The structure of the project enabled me to take more responsibility for my own learning

I felt comfortable asking questions relating to the project


Questionnaire Responses

Pharmaceutical Science Year 1 (n = 26)

There was strong disagreement with the following statement:

I found the problem presented to be easy to follow


Effectiveness of approach (Pharmaceutical Science Year 1)

 Comparison of student performance between the 2009 ‘Problems First’ group and the previous academic year

 No significant difference in marks at the 5% level

 Perhaps not a bad thing given the circumstances!

 Mathematics 1:

Same lecturer

Method and standard of assessment kept equivalent


Lessons learnt (Pharmaceutical Science Year 1)

 Introductory tasks completed to a very high standard

 Main aims not achieved by most

 Lack of engagement until shortly before the deadline:

preliminary deadlines to be implemented in future

final deadline to be set earlier in the semester

 Most students who did finally engage seemed surprised and encouraged by the benefits


Mathematics Key Skills – Why?

 Students do not bring key knowledge with them from one semester to the next

Students do not master the basics

Students need to refresh their key mathematics knowledge continuously

BUT

 Students will not concentrate on anything with no marks attached

 In later semesters, the basics are not tested directly


Mathematics Key Skills – What?

 Key Skills consists of:

1. Many categories of multiple-choice questions

 Designed to test material our students MUST be able to do.

 Each question comes with immediate feedback and a reference to a book chapter and an electronic resource.

2. Tests that draw randomly from particular categories of questions (see the sketch after this list)

 The tests are Moodle multiple-choice quizzes.

 We allow the tests to be repeated several times over a semester.

 Only a high mark is rewarded with credit.

3. Different tests for different groups and in different semesters

 We test material from earlier semesters that we consider to be “Key Skills” for the current semester.
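To make the random draw concrete, here is a minimal sketch (ours, not the Moodle implementation) of assembling a test by picking one question at random from each category; the category names and question IDs are invented placeholders.

```python
# Illustrative sketch only: build a test by drawing one question at random
# from each Key Skills category, as the slide describes the quizzes doing.
import random

question_bank = {
    "indices":         ["Q1", "Q2", "Q3"],   # invented categories and IDs
    "logs":            ["Q4", "Q5"],
    "differentiation": ["Q6", "Q7", "Q8"],
}

def build_test(bank, rng=random):
    """Return one randomly chosen question per category."""
    return {category: rng.choice(questions) for category, questions in bank.items()}

print(build_test(question_bank))
```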


Key Skills Reflection Sheets

 Absence of reflective learning?

While there was no systematic survey, many students admitted to not working on Key Skills topics between tests and to having no record of the question categories they got wrong.

We want students to be active learners, so we enforce a delay (usually of several days) between tests to allow students to consider question feedback and review their test attempts.

 Reflection Sheets

 Since September 2009 a structured reflection sheet has been piloted in 3rd Year Electronic Engineering to prompt students to identify and record areas in which they need to do revision work ahead of their next test attempt. Actions must be filled in against some or all of the question categories they got wrong, and the sheet must be returned before their next attempt.


Reflection Sheets – Implementation

 Piloted since September 2009

The sheet must be returned to the lecturer before the student may repeat the test

Actions must be filled in against some wrong answers

Examples: 4 reflection sheets for 2 students (who sat 5 tests each):

 Student 1: mark sequence 8, 8, 8, 13, 13

 Student 2: mark sequence 6, 6, 8, 10, 13


Student 1, Sheet 1 (see handout)


Reflection Sheets: Student Feedback

The reflection sheets themselves are a record of student feedback.

Some students (as above) filled the sheets in very diligently, giving detailed actions.

Others were careless about which questions they had got wrong and gave only generic actions, such as “studied” or “revised”.

No detailed study was done on the content of the sheets; the focus was on ensuring that a sheet was returned by every student before every repeat attempt, and that every student scoring less than full marks on a test was provided with a sheet straight away and encouraged to mark the wrong answers immediately.


Reflection Sheets: Effectiveness

For students taking more than one test

A much larger proportion of the group with reflection sheets reached the threshold level of 10 right answers

(where they begin to get more than 0% for Key Skills).

Since the introduction of Reflection Sheets, there have been increases in (see the sketch below):

The mean best score

The mean increase (1st test to best test)

The proportion of tests better than the previous test

The mean mark in the semester examination

[However, these increases for the most part are not large enough to be considered statistically significant.]
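As a small illustration (not the authors' code) of how these summary measures can be derived from a student's mark sequence, the sketch below reuses the two example sequences shown on the implementation slide.

```python
# Sketch: compute best score, increase (first test to best test) and the
# proportion of tests better than the previous test for each mark sequence.
marks = {
    "Student 1": [8, 8, 8, 13, 13],
    "Student 2": [6, 6, 8, 10, 13],
}

for student, seq in marks.items():
    best = max(seq)
    increase = best - seq[0]                       # first test to best test
    better = sum(1 for prev, cur in zip(seq, seq[1:]) if cur > prev)
    proportion_better = better / (len(seq) - 1)    # repeats better than the previous test
    print(student, best, increase, round(proportion_better, 2))
```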


Reflection Sheets: Effectiveness

Complication in Assessing Effectiveness

In 2007 (the first year of Key Skills), students were allowed to compensate for poor Key Skills marks with their performance in their midterm test and semester examination. This led to some students not engaging fully in the process. Since 2008, the 15% for Key Skills is based solely on Key Skills performance.

2009 students (with reflection sheets) are compared to

2007 and 2008 students combined and to 2008 students only (where the reflection sheets were the only difference in approach).


Proportions Passing “Threshold”

Fisher’s Exact Test: 2 × 2 contingency tables for passing the threshold of 10 right answers. Each p-value compares the ‘With Sheet’ group with the corresponding ‘No Sheet’ group.

All Repeating Students
                        With Sheet   No Sheet ('07 & '08)   No Sheet ('08 only)
Passing threshold       18           30                     14
Not passing             1            17                     7
p-value                 –            0.0198                 0.0504

Sub-Threshold Students
                        With Sheet   No Sheet ('07 & '08)   No Sheet ('08 only)
Passing threshold       13           14                     9
Not passing             1            14                     7
p-value                 –            0.0061                 0.0296

p-value: the probability of observing proportions at least this different if all of the students came from the same population and the observed differences arose by chance.
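For reference, a minimal sketch (not the authors' code) of a Fisher's exact test on one of the 2 × 2 tables above; the choice of a one-sided alternative is our assumption.

```python
# Fisher's exact test on the sub-threshold comparison:
# "With Sheet" vs "No Sheet ('07 & '08)".
from scipy.stats import fisher_exact

table = [[13, 1],    # With Sheet: passing threshold, not passing
         [14, 14]]   # No Sheet ('07 & '08): passing threshold, not passing

# One-sided test (sidedness is our assumption); gives a p-value close to
# the 0.0061 quoted on the slide.
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(p_value)
```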


Mean Increase

Mann-Whitney test of the hypothesis that the mean increase is unchanged

                All Repeating Students             Sub-Threshold Students
                2007 and 2008    2008 only         2007 and 2008    2008 only
p-value         0.0095           0.0485            0.0381           0.1261

Notes:

“Increase” for each student = best score – first score.

A two-sample t-test gives lower p-values.
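A minimal sketch (not the authors' code) of a Mann-Whitney test on per-student increases; the score lists are invented placeholders.

```python
# Mann-Whitney U test comparing per-student increases (best score - first score)
# for a group with reflection sheets against a group without.
from scipy.stats import mannwhitneyu

increase_with_sheets = [5, 5, 2, 4, 7]   # invented example values
increase_no_sheets   = [0, 1, 3, 0, 2]   # invented example values

stat, p_value = mannwhitneyu(increase_with_sheets, increase_no_sheets,
                             alternative="two-sided")
print(stat, p_value)
```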


Reflection Sheets: Lessons Learnt

 It is strongly recommended that the reflection sheet be viewed as an essential element in implementing key skills testing.

 The combination of using reflection sheets and offering no compensation for poor key skills performance seems to offer the best approach.

Future Implementation

 Efforts to improve the quality of students’ entries under

“Actions” will be considered.


Thank you

Any Questions?
