Guiding students onto a successful undergraduate path. Are we taking the
route through the “mole”-hill or are we climbing the mountain?
David Booth, Nicholas Brewer, Linda Morris and David Coates
School of Life Sciences: Learning & Teaching, University of Dundee, Scotland
Abstract
Prior to the development of a new Life Sciences curriculum at the University of Dundee, which began in the academic year 2011/12, it was clear that some students were weak in the applied numerate aspects of the biosciences, and that this presented serious difficulties at the later stages of their degree. To mitigate this, a system of bootstrapping was introduced in addition to raising entry requirements.
All students entering the undergraduate degree programmes in Biological/Biomedical Sciences undertake a ‘skills’ audit to assess their proficiency in maths, physics and chemistry. Those not achieving a satisfactory passing grade of 70% are automatically enrolled on a 20-credit intensive foundational module that supports their transition to independent learning and enhances their integration into the theory and practical modules that run in parallel. This module takes place over 10 weeks and contextually covers topics as diverse as moles/molarity, the use of logs and the physics of fluids.
As the University of Dundee values informed teaching through an empirical approach where
possible, we explore the effectiveness of this module. In a post-hoc, non-invasive and multidimensional approach, we analyse the strengths and weaknesses of these students relative
to their peers. Data are derived from the pre-entry skills audit, in-course assessments across
shared modules and running average grades from four cohorts of students entering first year
between 2011/12 and 2014/15.
We have evidence that students are embracing the module and are developing into promising scientists. At the half-way point of their academic career at the University of Dundee, their performance is indistinguishable from that of the rest of the cohort.
Author keywords: PCA, Numeracy, Skills Audit
Introduction
Key STEM skills and knowledge can often be overlooked and under-resourced in the morass of subject-specific material. For example, though it is an intrinsic component of the scientific method, statistical analysis is widely recognised as both difficult to teach and the boon/bane of the undergraduate student scientist (Reid and Petocz, 2002). Such skills are in high demand, and it is widely recognised in the literature that the ability to conduct analysis improves the depth of a science student's understanding (Colon-Berlingeri and Burrowes, 2011); permits the development of models (Elser and Hamilton, 2007); provides a strong foundation for quantitative approaches to laboratory work (Metz, 2008); supports the organisation of data (Rein et al., 2007); and enables rational evaluation of the strength of arguments (Schield, 2004). Given the highly competitive nature of employment in the sciences, in both academia and industry, it is vital to equip young scientists with the necessary prerequisite skills and knowledge for work. Generic STEM skills and knowledge are demonstrably of great value to graduates in the life sciences at present, with many postgraduate and advanced courses paying particular attention to fostering these skills precisely because of the positive effect they have on their graduates in terms of employability or recruitment into doctoral training programmes.
This position is further compounded by the substantive differences of scale and complexity between 20th and 21st century science. Those that possess the skillset to walk the unbeaten path and wrestle with big data can command the lion's share of insight and prestige, a notion reinforced by the results of a recent survey conducted by Dice (2014). Moore (1997) argues that sciences with a strong mathematics component need no longer be deliberately inaccessible, and that basic quantitative literacy for all students should be a common aim in the 21st century. Crucially, and directly related to this work, students of the biosciences are moving into a field of study that is increasingly driven by technological innovation and rapid refinement/evolution of technique and theory. As such, the prerequisite skills and knowledge expected of students are dramatically different from those of previous decades. Such a sea-change in the style of research, which is now both high-throughput and deeply computational (research areas often suffixed with -omics), is evidenced in the multidisciplinary teams focused on the field of synthetic biology (EASAC policy report 13, 2010), and in the established fields of drug discovery (Sun et al., 2013), personalised medicine (Everett et al., in press) and bioinformatics (Ouzounis, 2012).
Beginning in the academic year 2011/12, a new curriculum in the Life Sciences was implemented at the University of Dundee. This extensive realignment and redesign of the undergraduate curriculum presented a suitable blank canvas with regard to the teaching and assessment of mathematics, chemistry, physics and statistics. The ethos of the new pedagogical approach is one engrained in the teaching of key skills; critical thinking; interactive workshops; and training in “state of the art” techniques. Prior to this, an audit, and a subsequent checklist, of the skills and knowledge believed to be core to all Life Science students prior to entry into research-led teaching was mapped out. One particular area of weakness noted from previous cohorts was not just basic scientific numeracy but the ability to apply factors, convert units, calculate molarity and perform basic logarithmic conversions and interpretations. For some, even basic organic chemistry was identified as an issue. Though a penumbra of module- or instructor-specific, much less formal, non-biological STEM material was delivered according to the perceived needs of the particular curriculum element in question, the student experience and knowledge delivered varied wildly with the associated instructor.
Feedback and consultation with teaching staff who directly supervise honours students indicated that the “better” students at level 3 and beyond retain a broad awareness of the different areas of basic STEM knowledge and techniques, but that significant misconceptions still linger, and that the “weaker” students absorb almost nothing of value.
To address this, a module entitled BS11005 - An Introduction to Maths, Physics and Chemistry was developed for those with identifiable weaknesses in the areas described above. Prospective students meeting the entry requirements for a degree would be offered a place on the programme, and subsequently be identified as individuals that may benefit from enrolment on this induction module during this key transitionary period.
A skills audit was developed, comprising questions selected at random from the following topics: basic algebra, reading graphs, basic inorganic chemistry, log calculations, moles and molarity, basic organic chemistry, basic physical chemistry, electrical principles, semi-log graphs and statistics. Students were presented with thirty questions drawn from this question bank. On submission of the test, students were informed as to whether they had achieved a satisfactory score or whether they should try again. The composition of the skills audit varied slightly per test, with approximately 11-13 mathematics questions, 3-4 statistics questions, 10-13 chemistry questions and 3 physics questions. The audit had a time limit of 45 minutes and automatically submitted the answers after that time. The pass mark was arbitrarily set at 70%. The skills audit opened one month prior to matriculation with the advisor of studies in early September. The module ran for 11 weeks and covered topics in numeracy, the physics of fluids, electrical principles, and fundamental physical and organic chemistry, delivered in two- or three-week blocks. It was composed of a single lecture (two hours in duration) per week, with five associated practical sessions and three workshops. The module was assessed by five computer assessments, each worth 8% of the overall module mark, and a one-hour class test at the end of the module that covered all the material and was worth 60% of the overall module mark.
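To make the mechanics concrete, a minimal R sketch of assembling one such audit is given below; the data frame bank and its columns are hypothetical stand-ins, and the per-subject quotas shown are one possible draw from the ranges stated above.

```r
# Illustrative sketch only: assembling one 30-question skills audit from
# a question bank. 'bank' is a hypothetical data frame with one row per
# question and columns 'id' and 'subject'.
set.seed(42)  # reproducible example draw
quota <- c(Mathematics = 12, Statistics = 4, Chemistry = 11, Physics = 3)
audit <- do.call(rbind, lapply(names(quota), function(s) {
  pool <- bank[bank$subject == s, ]
  pool[sample(nrow(pool), quota[[s]]), ]
}))
nrow(audit)  # 30 questions, within the stated per-subject ranges
```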
Key questions that arise related to this approach are:
1. Can a skills audit prior to matriculation suitably discriminate those needing additional
credits associated with mathematics, chemistry and physics?
2. Does this approach have a meaningful impact on the students exposed to the
induction module? Are students completing the module academically competitive
relative to their peers?
Methodology
To explore the discriminatory power of the skills audit, 1102 responses from cohorts of students entering between the 2011/12 and 2014/15 academic years were collated. Individual questions within the audit were clustered according to subject, and marks received were linearly transformed such that each correct response was worth a single mark. As students would answer a random subset of questions, a count of questions attempted per subject was made and the student's performance in that area recorded as the proportion of correct responses. This produced a score for each of the four areas covered, namely Chemistry, Physics, Mathematics and Statistics. Those scoring less than 70% were classified as a “Fail”, and those scoring more classified as a “Pass”. To determine whether the composition of the audit in terms of question number per subject affected the student classification, a generalised linear model (GLM) with Poisson-distributed errors was fitted against question count for each subject set. To explore the relative contributions of the subjects towards producing a final classification, principal component analysis was performed, with the first and second principal components correlated against the proportion of correct answers per subject.
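A minimal R sketch of these two steps follows; the data frame audit and its column names are illustrative assumptions, not the authors' actual script.

```r
# Sketch under assumed names: 'audit' holds one row per attempt, with
# questions attempted per subject (e.g. chem.count) and the proportion
# of correct responses per subject (e.g. Chem.score).
audit$outcome <- factor(ifelse(audit$overall < 0.70, "Fail", "Pass"))

# GLM with Poisson-distributed errors: does the number of questions
# drawn per subject relate to the final classification?
summary(glm(chem.count ~ outcome, family = poisson, data = audit))
# (repeated for the mathematics, physics and statistics question counts)

# Principal component analysis of the four subject scores
subjects <- c("Math.score", "Stat.score", "Chem.score", "Physics.score")
pca <- prcomp(audit[, subjects])
summary(pca)   # standard deviations and proportions of variance
pca$rotation   # unrotated component loadings
```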
To explore the impact of BS11005 on subsequent performance, data were collected in the form of a set of summative running average grades per student, across the core curriculum modules, for those entering between the 2011/12 and 2013/14 academic years. Students were anonymised and selected randomly from the pools of those classified as failing and passing the skills audit (final n = 100 for each group). Data were explored for tendencies and correlations; a one-way ANCOVA was conducted to determine whether there was a statistically significant difference in second year running average grades between students who attended BS11005 (having gained a “Fail” at the skills audit on matriculation) and those who passed, controlling for first year running average grades; overall student performance was explored using one-way ANOVA and principal component analysis (Gorsuch, 1983). Plotting, summary statistics, and univariate and multivariate analyses were conducted using R (R Core Team, 2014).
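The comparison of running averages might be sketched in R as below, again with assumed names; aov() with the covariate entered first is one standard way to run such an ANCOVA.

```r
# Sketch of the grade comparison, assuming a data frame 'grades' with one
# row per sampled student (n = 200): 'group' (Fail/Pass at skills audit)
# and running average grades 'yr1' and 'yr2' (names are illustrative).
summary(aov(yr1 ~ group, data = grades))       # one-way ANOVA, first year

# One-way ANCOVA: second year running average by audit classification,
# controlling for first year running average
summary(aov(yr2 ~ yr1 + group, data = grades))

# Homogeneity of regression slopes: a non-significant yr1:group
# interaction indicates the two groups' gradients do not differ
summary(aov(yr2 ~ yr1 * group, data = grades))
```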
Analysis and discussion of skills audit
The number of questions attempted in each component of the skills audit was found not to have a significant effect on the outcome of the test (GLM for each of the four subject sets, p > 0.05). However, it was clear from principal component analysis that, taken in an omnibus fashion, the subjects contributed differentially towards classifying students as either a pass or a fail (Table 1).
                        PC1     PC2     PC3     PC4
Math.score              0.285  -0.180   0.551   0.763
Stat.score              0.447  -0.772  -0.452  -0.023
Chem.score              0.327  -0.181   0.666  -0.646
Physics.score           0.782   0.582  -0.221   0.005
Standard deviation      0.328   0.254   0.184   0.131
Proportion of Variance  0.482   0.289   0.152   0.077
Cumulative Proportion   0.482   0.771   0.923   1.000
Table 1. Varimax rotated component loadings for the four variables derived from the skills audit. An examination of the Kaiser-Meyer-Olkin measure of sampling adequacy suggested a factorable sample (KMO = 0.68).
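The rotation and sampling-adequacy figures reported in Table 1 could be reproduced along the following lines, continuing from the pca sketch above; the use of the psych package for the KMO statistic is an assumption rather than the authors' stated tool.

```r
# Varimax rotation of the component loadings (stats::varimax)
varimax(pca$rotation)

# Kaiser-Meyer-Olkin measure of sampling adequacy via the psych package
library(psych)
KMO(audit[, subjects])  # overall MSA, cf. the reported KMO = 0.68
```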
All four subjects load positively onto the first principal component, with only performance in
physics loading onto the second component (figure 1). This produces a characteristic
striated effect in the analysis, with linear clusters (running bottom left to top right) based on
physics performance, and those performing well in the other subjects tending towards the
bottom right within these clusters (figure 2).
Figure 1. Explanatory variables plotted against the first and second principal components, indicating that students scoring highly in chemistry, statistics and physics load positively on the first principal component.
Figure 2. First and second principal components of 1102 skills audit attempts between 2011 and 2014. The explained variation of each component is labelled on the axis; attempts classified as a fail are coloured red and those classified as a pass are coloured blue, with confidence ellipses; arrow vectors indicate the loadings of the four variables.
From these data it would appear that, whilst the skills audit is discriminating students based on ability, the quality of the assessment across the four key components is heterogeneous. Assessment of chemistry, mathematics and statistics is functioning well; however, physics, delivering the smallest number of questions, effectively remains untested. As such, a more evenly constructed and granular form of assessment of student ability may be necessary to determine whether a student requires remediation via BS11005. Likewise, it may even be appropriate to produce a four-point score for each student, for personal reflection and for focusing module choice outwith the College of Life Sciences.
Analysis of student performance after attendance of BS11005
Students classified as failing at the skills audit were found to differ significantly from those passing; however, this difference between the two groups was no longer significant by the end of the second year (see figure 3). In exploring the impact of BS11005 on the trajectory of students into the research-led teaching component of their undergraduate education, the second year running average was found to be significantly affected after controlling for first year performance.
Figure 3. Running average grade (mean ± 95% CI) of students passing the skills audit in first year (18.1 ± 0.227) and second year (17.5 ± 0.298), and of students failing the skills audit in first year (17.7 ± 0.278) and second year (17.5 ± 0.302). The asterisk above the first year barplot indicates a significant difference in the running average grade, using one-way ANOVA (F(1,198) = 6.562, p = 0.0112) with an effect size of 0.033.
Figure 4. Running average grade of students in first year versus second year. Students failing the skills audit and taking BS11005 are coloured red; those passing are coloured blue. Though the gradients were not significantly different, there was a weak but highly significant effect on the intercept (F(1,197) = 6.9493, p = 0.009) with an effect size of 0.03.
Analysis of the student running average data set by principal component analysis separated individuals by performance across all modules, with the strongest students being negatively loaded on the first principal component (see figures 5 and 6). Modules have themes, focusing on different skills and knowledge, and this is evidenced by the vectors of the factor loadings running in an almost perpendicular fashion between the first and second principal components. This is reinforced by the strong correlation of first and second year running averages.
Figure 5. First and second year running average grades plotted against the first and second principal components, indicating that students scoring highly in both years load negatively on the first principal component.
Figure 6. First and second principal components of 200 student running average grade sets between 2011 and 2013. The explained variation of each component is labelled on the axis; skills audit attempts classified as a fail, with BS11005 as an intervention, are coloured red and those classified as a pass, without BS11005, are coloured blue, with confidence ellipses; arrow vectors indicate the loadings of the variables. The Kaiser-Meyer-Olkin measure of sampling adequacy suggested a suitably factorable sample (KMO = 0.84).
Notably, students classified as failing or passing the skills audit were not found to cluster together, with both sets distributed almost equally through the first and second principal components. Whilst the confidence ellipse of the failing students is larger than that of those passing, the two groups are indistinguishable from one another.
Discussion
The use of a multivariate approach to exploring these data appears to be a valid tool for gleaning insight into assessment validity and student performance; this approach has been well documented in the literature and applied to a variety of student cohorts and subjects (Gardner, 1972; Sullivan, 1996; Divjak and Oreški, 2009; Erimafa et al., 2009).
Whilst these results report observations, and not the outcome of a controlled trial, it seems plausible that intervening with an introduction to basic science would have a positive effect on students perceived as lacking those skills/knowledge. It is notable that both sets of students converge academically by second year. Whilst it was generally observed in CLS that high-performing students can appear to excel across the board, there is a diverse range in student ability, whereby some of those classified as weak or failing the skills audit at entry excel overall in the core curriculum modules. This lack of a relationship might appear to invalidate such a measurement at matriculation as having poor validity in the long term; however, it is clear that students can be effectively discriminated with careful selection of questions related to the essential skills and knowledge required to fully embrace the core curriculum content.
Student performance overall correlates strongly between level 1 and level 2; it is interesting to note, however, that ability within one area does not necessarily correspond with ability in another, reflecting the multidimensional nature of student abilities and interests and the multidimensional nature of instructor-assigned grades. Bowers (2011) noted, with secondary school students, that whilst instructor-assigned grades can be subjective, standardisation in core assessments can account for academic ability, whereas instructor-assigned grades act as a benchmark for motivation, attitudes and behaviours. Likewise, Storkel et al. (2012) reported a diversity of ability and interests in pathology graduates enrolled to become clinicians. Given that students receive grades and assessment from a plethora of instructors, it seems unlikely that this would be a source of bias.
References
Anderson, L. W., Krathwohl, D. R., and Bloom, B. S. (2001) A Taxonomy for Learning,
Teaching, and Assessing a Revision of Bloom’s Taxonomy of Educational Objectives, New
York, NY:Longman.
Brady, L. (1995) Curriculum development, 5th edn. Sydney, Prentice-Hall.
Biggs, J. (1999) Teaching for Quality Learning at University. SRHE and Open University
Press, Buckingham.
Bowers, A.J. (2011) What's in a grade? The multidimensional nature of what teacher-assigned grades assess in high school. Educational Research and Evaluation: An International Journal on Theory and Practice, 17(3).
Colon-Berlingeri, M. and Burrowes, P.A. (2011) Teaching Biology through Statistics: Application of Statistical Methods in Genetics and Zoology Courses. CBE Life Sci Educ. 10(3): 259-267.
Crawford, C., Dearden, L. and Greaves, E. (2013) When you are born matters: evidence for
England. IFS Report 2013.0080. London: Institute for Fiscal Studies.
Crawley, M.J. (2007) The R Book. John Wiley & Sons, Ltd
DelMas, R., Garfield, J., Chance, B. and Ooms, A. (2006) Assessing Students' Conceptual Understanding After a First Course in Statistics. Statistics Education Research Journal, 6(2), 28-58.
Dice Salary Survey (2014) accessed at http://marketing.dice.com/ on the 1st Feb 2014.
Divjak, B. and Oreški, D. (2009) Prediction of Academic Performance Using Discriminant
Analysis. Proceedings of the ITI 2009 31st Int. Conf. on Information Technology Interfaces.
Elser, J.J. and Hamilton, A. (2007) Stoichiometry and the New Biology: The Future Is Now. PLoS Biol 5(7): e181. doi: 10.1371/journal.pbio.0050181
Erimafa, J.T., Iduseri, A. and Edokpa, I.W. (2009) Application of discriminant analysis to predict the class of degree for graduating students in a university system. International Journal of Physical Sciences, 4(1), 16-21.
Everett, J., Loo, R.L. and Pullen, F.S. (2013) Pharmacometabonomics and personalized medicine. Annals of Clinical Biochemistry. ISSN 0004-5632 (Print), 1758-1001 (Online) (in press) (doi: 10.1177/0004563213497929)
European Academies Science Advisory Council (2010) Realising European potential in
synthetic biology: scientific opportunities and good governance. Policy report 13. ISBN: 978-3-8047-2866-0
Fry, H., Ketteridge, S., and Marshall S. (2009) A Handbook for Teaching and Learning in
Higher Education, Routledge, Abingdon
Gardner, B. (1972) A multivariate computer analysis of students performances as a predictor
of performance as a surgical intern. Journal of Surgical Research Volume 12, Issue 3, Pages
216–219
Garfield, J. (2006) Collaboration in Statistics Education Research: Stories, Reflections, and
Lessons Learned, in International Statistical Institute Proceedings of the Seventh
International Conference on Teaching Statistics [online]. Available at
www.stat.auckland.ac.nz/~iase/publications/17/PL2_GARF.pdf.
Garfield, J.B. and Ben-Zvi, D. (2007) Developing students' statistical reasoning: connecting
research and teaching practice. Emeryville, CA: Key College Publishing.
Gorsuch, R.L.(1983) Factor analysis (2nd ed.) Hillsdale, NJ: Erlbaum
Hammersley, M. and Traianou, A. (2012) Ethics and Educational Research, British
Educational Research Association on-line resource.
Light, R. J., Singer, J. D., & Willett, J. B. (1990) By Design: Planning Research on Higher
Education, Cambridge, MA: Harvard.
Lock, R., Salt, D. & Soares, A. (2011) Subject Knowledge and pedagogy in science teacher
training. Education research [online]. Available at
http://www.wellcome.ac.uk/About-us/Publications/Reports/Education/
Loughland, A., Reid, A. and Petocz, P. (2002) Young People's Conceptions of Environment: A Phenomenographic Analysis. Environmental Education Research, 8, 187-197.
McGowan H. M. (2011) Planning a Comparative Experiment in Educational Settings. Journal
of Statistics Education, Volume 19, Number 2
Metz, A.M. (2008) Teaching Statistics in Biology: Using Inquiry-based Learning to
Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses. CBE Life
Sci Educ. 7(3): 317–326. doi: 10.1187/cbe.07-07-0046
Moore, D. S. (1997) New Pedagogy and New Content: The Case of Statistics. International
Statistical Review, 65: 123–137. doi: 10.1111/j.1751-5823.1997.tb00390.x
Morgan, B. (1999) "What are we Looking for in Theological Reflection?" Ministry Society
Theology, 13(2), 6-21.
Ouzounis, C.A. (2012) Rise and Demise of Bioinformatics? Promise and Progress. PLoS
Comput Biol 8(4): e1002487. doi: 10.1371/journal.pcbi.1002487
Paranjape, M.D. (2010) Crisis of statistics pedagogy in India. International Association of
Statistical Education (IASE) ICOTS8 Contributed Paper Refereed.
Prideaux, D. (2003) British Medical Journal. 326:268-279.
Print, M. (1993) Curriculum development and design, 2nd edn. Sydney, Allen & Unwin.
R Core Team. (2014) R: A Language and Environment for Statistical Computing. R
Foundation for Statistical Computing. Vienna, Austria. http://www.R-project.org
Ramsden, P. (1992) Learning to Teach in Higher Education, London: Routledge.
Reid, A. and Petocz, P. (2002) Students’ Conceptions of Statistics: A Phenomenographic
Study. Journal of Statistics Education Volume 10, Number 2
Reid (1997) "The Hierarchical Nature of Meaning in Music and the Understanding of Teaching and Learning," Advancing International Perspectives, 20, 626-631.
Rein, D.C., Sharkey, J. and Kinkus, J. (2007) Integrating Bioinformatic Instruction Into
Undergraduate Biology Laboratory Curriculum. Association for Biology Laboratory Education
(ABLE) 2006 Proceedings, Vol. 28:183-216
Schield, M. (2004) Curricular Development in Statistics Education. 2004 IASE Roundtable,
Lund Sweden
Storkel, H. L. , Woodson, M. B. , Wegner, J. R. & Daniels, D. B. (2012) Multidimensional
Student Assessment. The ASHA Leader.
Sullivan, W.G. (1996) Multivariate analysis of student performance in large engineering
economy classes. Frontiers in Education Conference. 26th Annual Conference.,
Proceedings of (Volume:1 )
Taylor, C., Rees, G. and Davies, R. (2013) Devolution and geographies of education: the
use of the Millennium Cohort Study for ‘home international’ comparisons across the UK.
Comparative Education, 49(3), 290-316.
Vere-Jones, D. (1995) The coming of age of statistical education. International Statistical
Review, 63, 3-23.
Sun, X., Vilar, S. and Tatonetti, N.P. (2013) High-Throughput Methods for Combinatorial Drug Discovery. Sci Transl Med. doi: 10.1126/scitranslmed.3006667