Quantitative Methods Strategic Advisor report
by John MacInnes, ESRC Strategic Advisor on Quantitative Methods Training, University of Edinburgh

1 History and context of the SA QM role

ESRC's engagement with the issue of the standard of QM training goes back to 2000 and the reforms to PG training initiated during Gordon Marshall's tenure of office as Chief Executive. It soon became clear, however, that a significant constraint on efforts to improve PG training was the typically low level of skills held by PGs at entry to such training. This not only constrained the level of skills that PG training might aspire to reach, but perhaps more importantly, meant that most students entering doctoral study primarily thought in terms of what they might do with the methodology they had most knowledge and experience of and felt most comfortable with: qualitative approaches.

Accordingly, in 2006 ESRC commissioned six pilot projects to explore innovation in undergraduate QM teaching, and a review of international practice by Dr Jonathan Parker. This formed part of a QM strategy that included:
• working in partnership with other organisations in order to promote social science and the use of statistics in schools;
• supporting the development of undergraduate curricula in quantitative methods;
• funding an international benchmarking review of best practice in the provision of undergraduate teaching in quantitative methods;
• the allocation of additional studentships for stand-alone masters courses in quantitative methods;
• increasing the number of ESRC studentships dedicated to advanced quantitative methods;
• the provision of an enhanced stipend for ESRC studentships that will be using advanced quantitative techniques;
• the provision of enhanced salaries to ESRC one year postdoctoral fellowships using advanced quantitative techniques;
• the funding of mid-career re-skilling opportunities through the Council's Research Methods programme, National Centre for Research Methods and Researcher Development Initiative.

In November 2007 the ESRC held a workshop 'Enhancing The UK Social Science Skills Base In Quantitative Methods: Developing Undergraduate Learning'. It considered reports from the pilot projects and the Parker review, reaching the following key conclusions:
• We need to address an anti-quantitative methods culture problem.
• Quantitative literacy needs to be embedded at several levels within the substantive social sciences as part of a wider effort to improve the method of scientific enquiry and investigation.
• There is a need to focus co-ordination with other partners and to concentrate on where ESRC can make a difference. We need to consider how this links into the work of the NCRM.
• There is a need for more incentives for students, for those teaching quantitative methods and for institutions, for improving quantitative skills.
• There is a need for materials to support teaching and learning of QM, for example teaching datasets through ESDS and building on that approach, as well as improving online resources.
Accordingly in 2008 ESRC created the Strategic Advisor role on a half time basis for 12 months, with a remit to "lead in the development of a £2m programme aimed at enhancing undergraduate teaching in quantitative methods across the UK social science community." The main activity of the SA in the initial year of the appointment was to:
• Evaluate the ESRC pilot projects and their implications
• Establish a mailing list for QM teachers to facilitate communication and discussion
• Consult with stakeholders about the undergraduate QM training deficit and ways to tackle it
• Collect evidence from a range of social science departments about how they addressed QM teaching and the barriers to improving provision
• Conduct a survey on the extent and nature of undergraduate QM teaching
• Produce a report on the current state of undergraduate QM provision and make recommendations about the best ways forward: Proposals to support and improve the teaching of quantitative research methods at undergraduate level in the UK (hereafter referred to as the 'MacInnes report')
• Hold a QM teachers workshop in London to consider the SA's report and to discuss ways forward

2 Progress against objectives

The MacInnes report made the following main recommendations. Against each recommendation is the action (if any) that has been taken arising from it.

The ESRC should commission a further series of projects in UG curriculum innovation in QM teaching, building upon the lessons learned from the pilots, and producing teaching resources that can be shared across universities.
The joint Curriculum Innovation/Researcher Development Initiative (CI/RDI) programme, with HEFCE funding and £400K from the British Academy, funded twenty projects. The commissioning panel met in October 2011 and projects were completed by autumn 2014. The British Academy (BA) has funded the production of an end of programme guide so that potential users can quickly find the resources produced by the programme that best suit their needs. The guide will be ready by December 2015. The 2014 survey (see below) found that there was a good level of awareness of the programme and its outputs.

Produce a web based 'off the shelf' blended learning course in QM.
The University of Edinburgh PI Professor Ailsa Henderson was awarded a CI/RDI grant to produce this resource.

Workshops that focus on basic QM skills for 'non-quants' staff.
Several CI/RDI projects included introductory level workshops on teaching QM accessible to 'non-quants' staff. The BA created a number of 'Skills Acquisition' awards to allow staff with little QM experience to be mentored and trained. Q-Step centres (see below) are encouraged to find ways of broadening the skills base of non-QM staff.

Support, training, resources and incentives for QM teachers; increased contact and collaboration between members of this community spread across different university departments should be fostered.
A QM teachers mailing list was established which now has around 350 members. Five national workshops were organised for QM teachers, held at the RSS and the British Academy, with up to 90 people attending. The CI/RDI projects created a range of teaching resources, lessons about effective teaching and video and other workshop material, accessible at quantitativemethods.ac.uk and other sites. A special issue of the journal Enhanced Learning in the Social Sciences was published in 2014 with material based on CI/RDI projects.
The HEA Social Science Summit in 2013 was devoted to teaching research methods, at which the SA gave the keynote address (available from the HEA as a paper). A second edition of the ESRC booklet on careers for students, 'Stand Out and Be Counted', was produced with BA funding and widely distributed. The BA produced a policy statement, 'Society Counts', arguing that more attention should be given to QM in the social sciences and more effort made to ensure that all undergraduates get a proper grounding in QM. The statement was supported by many of the major social science professional associations.

Contact was made with HE STEM (the Strategic Advisor will be speaking at their conference later this year) in order to learn from their experience. A key point emerging from this contact was the ubiquity of maths 'remedial' work in STEM subjects, including those where students typically have good passes at A-Level. The range of relevant online maths resources created within the STEM community was linked to the quantitativemethods.ac.uk page. STEM experience, however, was that online resources work best in tandem with face to face support. The BA hosted two conferences on QM and it was encouraging that the former Minister for Universities David Willetts used his speech to note the importance of 'STEM skills' in non-STEM subjects.

The SA made presentations to the three Research Methods Festivals that took place between 2009 and 2014. The SA joined the Advisory and Training Capacity boards of NCRM to advise on how NCRM might best support or complement QMI activities. In the bid for NCRM funding for 2014-19 he formed part of the successful Southampton-Manchester-Edinburgh bid, and will act as PI on an NCRM project to evaluate QM teaching pedagogy. To strengthen links with the RSS the SA was elected to its Council in 2013.

Training and incentives for students conducting secondary data analysis.
BA now provides scholarships for undergraduates to participate in the Essex Summer School.

Student placements to undertake secondary data analysis.
What began as discussions with Nuffield about extending its undergraduate bursaries for summer placements from the natural to the social sciences evolved into the Q-Step programme, jointly funded by Nuffield, ESRC and HEFCE with a budget of approximately £19.8m. Initially 15 universities were awarded Q-Step status and in February 2015 a further three universities were awarded 'affiliate' status in the programme. The programme funded up to 53 new posts, but in practice some universities have topped up Q-Step funding to appoint additional lecturers. Recruitment was an unqualified success, with many appointments made from beyond the UK. Manchester University has already trialled placements and the results were very positive, with employers especially pleased with the outcomes. Cardiff has already built up links with schools, as have many other centres, and held workshops for school teachers which were heavily oversubscribed. Edinburgh held a successful summer school for 50 pupils in association with Headstart. The development of Core Maths and the consequent demand from schools for maths related resources and expertise makes this an opportune time to explore such links. The next challenge is student recruitment, with most centres receiving their first cohort of students on new degree programmes in Autumn 2015.
However indications from the transfer of students from other degrees onto new Q-Step degrees (which can be done before UCAS codes have been obtained) are already very positive. In general centres have developed faster than originally hoped. There have been three meetings so far of centre convenors and it is clear that while there may be healthy competition between centres there is also a very great deal of mutual trust, collaboration and support, based on a common determination to take QM back to its proper place in the undergraduate curriculum. I have attended the launch of the Cardiff and Manchester centres and was struck by the enthusiasm and energy of all those involved. I have been struck by the exceptional awareness of, degree of interest in and enthusiasm for Q-Step across all stakeholders and within government. I discuss below how ESRC and others might best capitalise on and develop this goodwill.

The ESRC should consider holding an annual competition with cash prizes for the best undergraduate dissertation using QM.
This proposal was not taken forward. It was superseded by the Q-Step programme.

Changing student perceptions of QM.
Much research has convincingly established that students see QM as difficult and irrelevant. However there is some evidence of changing student perceptions. Williams et al (unpublished) found evidence that students were aware of the employability benefits of QM, as highlighted in the ESRC leaflet Stand Out and Be Counted. (BA funds permitted republication and widespread distribution of the original ESRC leaflet.) However Williams found that the key problem was one of confidence: many students did not see themselves as the kind of student who could master the QM skills needed to reap these labour market benefits. A range of CI/RDI projects address the issue of maths/statistics anxiety, as did articles in the ELISS special issue on QM teaching, and the consolidated report from the CI/RDI programme will draw what general conclusions are available. At present it is perhaps safe to draw some tentative conclusions. One is that making QM teaching engaging and attractive while delivering the skills that enable students to become confident with basic statistics is a genuine challenge with no 'magic bullet' solution. The Q-Step centres have the potential to generate a substantial body of expertise in this area that other universities will be able to draw on. However they will enjoy the advantage of a committed core of staff capable of providing inspiring teaching, which other universities may find more difficult to replicate. A second conclusion is that curriculum space will be vital. Developing, practising and reinforcing basic skills takes time. As part of the re-commissioning of the National Centre for Research Methods, ESRC has funded further research on effective QM pedagogy, led by the Edinburgh hub of NCRM with John MacInnes as PI.

Revise QAA benchmarks to strengthen the requirements for QM training.
QAA benchmarks are currently under revision for all social science subjects. The draft Geography benchmark statement was developed with input from one of the CI/RDI projects and the Royal Geographical Society, with a very welcome and much stronger emphasis on QM. With funding from the Nuffield Foundation the RGS will work with schools, Q-Step centres and others to promote geographers' quantitative understanding across the entire educational life course. This provides a useful model for other subject areas to follow.
I have been active in persuading relevant learned associations of the need to improve the unsatisfactory wording of these statements highlighted in the original MacInnes report.

Create a web portal for QM teaching resources.
The website quantitativemethods.ac.uk was established, hosted by NCRM and with funding from the HEA. This has been 'refreshed' as part of the BA funded project to publicise the outputs from CI/RDI.

The recognition process for the proposed Doctoral Training Centres should include a review of what procedures are in place at undergraduate level for QM training.
The SA reviewed all DTC/DTU applications and reported to the commissioning panel on the quality of provision for QM training made in the applications. The SA made a presentation to DTC directors in June 2012 on the QMI and the importance of both basic QM training for those with little exposure to QM in their undergraduate degrees and the provision of more advanced courses to develop the skills of future cohorts of students who already have a basic understanding of QM. In the next recognition round it will be important to ensure that DTCs are equipped to provide a range of more advanced courses which can build on the improved skills that we can expect the cohort of students emerging from the Q-Step centres to possess.

Applicants for research funding should be asked to briefly report their contribution to methods teaching and this should form a modest part of the assessment of proposals.
This proposal has not been taken forward. The Je-S system is unable to capture this information and it was felt that taking forward the other recommendations would have a more beneficial effect.

The ESRC should monitor the proportion of end of project reports making appropriate use of quantitative data.
This proposal has not been taken forward. The Je-S system is unable to capture this information so it would be very difficult to monitor this except manually.

ESRC should seek to develop links with schools so that pupils become aware of the relevance of QM and the wide range of their applications.
During the period of my appointment as SA the issue of maths education in secondary schools became a matter of extensive public debate, in part stimulated by the publication of the Nuffield report Is the UK an Outlier?, which drew attention to the very low proportion of students continuing to study maths after 16 in England and Wales (although the proportion in Scotland is slightly higher).

ESRC should consider how relevant findings from research it has funded could be adapted and used in undergraduate teaching. The ESRC should give appropriate support to the 'Use of Maths' A Level FSMQ initiative.
I became a member of the steering group for the Advisory Committee on Mathematics Education enquiry The Maths Needs of the Nation, which examined the demand for mathematical and statistical skills within employment and across different subject areas in Higher Education, and also contributed to the government sponsored 'Vorderman' report A World Class Mathematics Education for All. I also contributed to discussions facilitated by the Nuffield Foundation with the Secretary of State for Education pressing the case for widening participation in post-16 maths education. The government made a welcome commitment to move towards the eventual universal participation of school students in maths study until the age of 18.
I also contributed to the consultation around the development of the new Core Maths qualification, which is currently being rolled out and which closely follows the original model of the Use of Maths qualification. The Core Maths curriculum is in many ways highly relevant to students aspiring to study social sciences at university because of its applied focus and extensive use of statistical ideas. Perhaps even more importantly, it addresses the key problem of students who finished their maths study with GCSE and so have no maths practice for two or three years before arriving as undergraduates at university. Revisions to the Maths A-Level syllabus also include a new section on statistics. However it is clear that schools will face many challenges in rolling out the new qualification, in coping with the increased statistical content of A-Level, and with the large numbers of pupils who will be required to re-take GCSE Maths if they do not pass at first sitting. There will be a shortage of Maths teachers, existing teachers will need CPD, and it is likely that many or most of those teaching Core Maths will be teachers drawn from other subjects (business studies, economics, psychology, sociology, biology, STEM subjects). There is an opportunity here for ESRC which I return to below.

I contributed to a report by Roger Porkess, A World Full of Data: Statistics Opportunities across A Level Subjects, which identified severe problems caused by the reduction of coursework, the fragmentation of the curriculum across individual A-Level subjects, pressures to teach to the test, the existing skills base of teachers and limited opportunities for CPD. This severely limited students' exposure to the cyclical problem solving approach that underlies using data and simple statistical methods to address a problem or research question or work through the experimental process. By contrast there was an ever-widening range of opportunities for students to work with data and computers in ways that would equip them with relevant statistical literacy skills, for both HE and employment, focused on using and interpreting evidence obtained from data.

I also contributed to the Nuffield Foundation's investigation and report on Mathematics in A level assessments, which found that although in some subjects there were variable opportunities to use maths or statistics in assessments, the requirement to do so was typically very low; there was less mathematics in the examination papers than might be expected from reading the syllabuses, and what there was tended to be routine simple calculations. Further discussion of the results at a workshop organised by Nuffield in which the SA participated also revealed a preference from some professional associations to minimise the maths content of A-Levels to keep them attractive, leaving 'difficult' maths to be learnt in studying mathematics. I shall be addressing the conference of Sociology A-Level teachers in June 2015.

In 2014 I joined the advisory board of Mathematics in Education and Industry (MEI). Its chief executive, Charlie Stripp, produced a paper for the HLSG on Core Maths and the importance of demand for the new qualification from HE. I ran a workshop on maths and social science at the MEI's annual conference in 2012.
There have been some initiatives in the direction of materials for schools, with the ESRC website hosting some resources for schools and some projects, such as the survey of young people's views on the Scottish Independence Referendum, producing teaching materials. ESRC commissioned NatCen to produce a resource for schools based on the BSAS: www.natcen.ac.uk/our-expertise/policy-expertise/schools,education-training/bsa-quiz/

However there is now a strategic opportunity for ESRC to take this process further. The Core Maths syllabus focuses on the application of maths. Schools will face a considerable challenge in supporting the new qualification as it substantially increases the curriculum time devoted to maths. It is likely that much Core Maths teaching will be done by non-Maths teachers, including teachers of sociology, psychology, geography, economics, business studies and other social science disciplines. Teaching resources focusing on issues that interest young people and illustrating the applications of maths that Core Maths covers would be attractive to schools, and provide the very welcome signal that an aptitude for maths and an interest in society go together. In the recommendations below I suggest that ESRC should consider working with bodies such as MEI and the relevant social science learned societies to produce and distribute teaching resources.

The ESRC should take the lead in establishing a post-school level credit bearing qualification in the use of quantitative evidence based on social research methods.
With funding from BA I undertook a consultation over the possible form a QM qualification might take, talking to learned societies (including those such as BPS and SoB which already operate accreditation schemes), relevant employers' associations, subscribers to the QM mailing list and relevant fellows of the BA. I had extensive discussion with Roeland Beerthen, Education Officer at RSS, and sat on its strategic review of the qualifications it currently offers. The outcome of the review was that the RSS will move to accrediting the courses and assessment of others rather than continuing to provide its own syllabi and examinations (the Ordinary and Higher Certificate). I sit on the RSS's new advisory board and there is real interest in rolling out accreditation to courses or degree programmes which have a substantial statistical content, as well as traditional statistics degrees. Continuing the encouragement and preparation of support amongst social science learned societies for external recognition of the statistics content of their degrees (where there are some potentially delicate issues around sharing responsibility for accreditation with RSS) will be important and will complement the changes underway at RSS itself. There is more enthusiasm for accreditation than for an exam format, both because funding an examination system and persuading students to take it would be difficult, and because accreditation is more flexible. There is a general consensus about the range of skills proposed in the consultation. The consultation document is reproduced as an appendix to this document.

The ESRC should form a high-level strategy group to make the case to government and employers for the capacity of the social sciences to deliver better graduate numerical and statistical literacy.
The British Academy agreed to convene this group, chaired by Sir Ian Diamond and with representatives from all the key stakeholders involved in the provision of QM relevant training from school level onwards.
It is currently overseeing a 'State of the Nation' study on the supply of and demand for QM skills and drafting a manifesto for data skills. It has proved an invaluable means of communication between the key stakeholders and a forum for co-ordination and decisions over policy, publicity and lobbying in this area. After the end of my term as Strategic Advisor I will continue to sit on the group in a personal capacity. The HLSG will produce a report with policy recommendations based on the SoN investigation, timed to come just after the general election, and is meanwhile lobbying SPADs and others about the importance of the QM skills challenge. One impact of its behind the scenes work was the very welcome reference to QM skills and Q-Step centres in the Campaign for Social Science's report The Business of People: the significance of social science over the next decade:

"Building on such initiatives as the Q-Step Centres, social science education must increasingly equip the next generation of researchers with quantitative techniques, the capacity to acquire and analyse new forms of data and the disposition to collaborate with other scientists."

In June 2015 the HLSG launched its report Data Skills in the UK: State of the Nation Report at the Houses of Parliament.

Either through such a body if it is established, or directly, if it is not, the ESRC should make clear to university vice-chancellors that the weak position of methodology in general and QM in particular in university social science departments threatens the future health of UK social science and its international standing.
There have been useful discussions at the HLSG. Relationships with university VCs are a delicate question, insofar as the latter rightly see their role as, amongst other things, defending the autonomy of universities. ESRC does have the opportunity in its relationships with VCs to stress the importance of a robust methodological environment for good social science work, and that universities' long term success in securing research funds in part depends upon equipping undergraduate and postgraduate students, post-docs and early career researchers with a sound knowledge of both quantitative and qualitative methods. However there is a deeper issue that should perhaps best be tackled via discussion between RCUK, HEFCE and BIS: the atrophy of statistics departments in universities driven by the REF. Statistics lies near the core of methods in almost all sciences. It is important that those applying statistics are aware of new developments or alternative approaches. Yet this process is stifled by the REF. Some departments or research units may be large enough to appoint a statistician to work alongside subject specialists, but many may not be. Such statisticians can be returned within the substantive discipline of their colleagues if they have built up enough subject specialist knowledge, but would be unlikely to be returned under Statistics. The requirements of the REF have encouraged those statistics departments which remain (and which are often subsumed within Maths departments) to focus on mathematical rather than applied statistics, which further undermines a productive interchange of ideas between substantive disciplines and statistics departments.
Conversely, broader teaching of the application of statistics in many departments is carried out by people who may have had relatively little formal statistical training and may have few links with statisticians working in statistics departments. I return to this issue below.

3 The QM teaching survey: the 2014 'refresh' of the 2009 survey

In 2009 an online survey was undertaken in order to gather evidence about the extent and nature of QM teaching in university social science departments and the views of those teaching QM in these departments. The main results were published as an appendix to the Strategic Advisor's report. The 2014 survey drew responses from 178 respondents from 74 universities about 172 degree programmes plus variants, covering approximately 11,900 students. Joint or multiple degree programmes are 'double counted' in the following list showing the disciplinary spread covered by respondents in the survey.

Sociology 55
Criminology 26
Politics 23
International Relations/Peace/Conflict studies 14
Social Policy 14
Business/Management Studies 10
Human Geography 9
Education 5
History 5
Sports Studies 4
Communications/Culture & Media studies 3
Social Anthropology 3
Childhood and Youth Studies 2
Development Studies 2
Linguistics/Speech and Language studies 2
Behavioural Sciences 1
Health and Social Care 1

Comparisons with the 2009 survey have to be made with some caution. The 2009 survey analysed responses on teaching from 66 courses. Asking about courses was the only feasible way to gain reliable information without having to collect detailed information about the complex structure of teaching arrangements across a large number of degree programmes in different universities. The complexity of academic governance and degree delivery structures means that even simple data is not always directly comparable across institutions. There is no neat mapping of staff to disciplines to departments to degree programmes and ultimately to courses. Modularisation means that individual students on the same degree may follow quite different curricula, and what might be a separate degree programme in one HEI might be a variant of a broader degree in another. Individual courses may be shared across several degrees, sometimes with different assessment arrangements, and so on.

For the 2014 survey it was decided to make the degree programme the basic unit of analysis, and much greater effort was made to maximise the response rate. This was done because it was important to collect information about departments that might have little interest in QM, or other lack of motivation to participate, in order to gain as complete a picture as possible of the reach of the efforts by ESRC and others to improve QM provision since 2009. It was likely that departments which were above average in their level of interest in and commitment to UG QM teaching were over-represented in the 2009 survey. This drawback was relatively unimportant since it could safely be assumed that the issues facing these departments would also be challenges facing others. However for the 2014 survey it was important to try to reduce such non-response bias. A small incentive (a prize draw for book vouchers) was used and a great deal of effort was put into follow up emails and phone calls to non-responding HEIs which were known to offer social science degree programmes. The final survey was based on data from 178 respondents from 74 universities about 172 degree programmes: a much larger number than in 2009.
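Because the degree programme is the unit of analysis, several of the results reported below are shown both per programme and 'weighted by number of students', so that large and small programmes do not count equally. The following is a minimal sketch of how such student-weighting might be computed; the data frame and column names are hypothetical illustrations, not the survey's own variables or code.

```python
# Minimal sketch (hypothetical data and column names, not the survey's actual code):
# weighting programme-level survey responses by enrolment so that summary figures
# describe the average student rather than the average degree programme.
import pandas as pd

programmes = pd.DataFrame({
    "programme": ["A", "B", "C"],
    "students": [250, 40, 120],          # enrolment on each programme (assumed)
    "pct_diss_using_qm": [60, 5, 20],    # reported % of dissertations using QM (assumed)
})

# Unweighted mean: every programme counts equally.
unweighted = programmes["pct_diss_using_qm"].mean()

# Student-weighted mean: each programme counts in proportion to its enrolment.
weighted = (programmes["pct_diss_using_qm"] * programmes["students"]).sum() \
           / programmes["students"].sum()

print(f"Unweighted: {unweighted:.1f}%   Student-weighted: {weighted:.1f}%")
```

In this toy example the large programme with heavy QM use pulls the student-weighted figure well above the unweighted one, which is why the two columns in the dissertation tables below can differ noticeably.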
The data requested was also more finely disaggregated than in the original survey, and while it was designed to make comparisons possible, a balance had to be struck between this and collecting data that would give as detailed a picture of the state of QM teaching as possible, within the constraints of an online survey. Because there is no good sampling frame for social science degree programmes, departments, QM teaching staff or QM courses it is not possible to calculate a response rate. The Higher Education Statistics Agency (HESA) reports approximately 49,000 full time first year undergraduate students in social studies in 2013-14, but this total will include those studying Economics, who were excluded from our survey (Psychology students are classified to Biological Sciences by HESA), and those studying distance learning courses such as those provided by the Open University. The 2014 survey responses cover about 11,000 students, so the results that follow are based on the experience of something over one quarter of all UK based social science students studying subjects other than psychology or economics. 'Pre-1992' HEIs (n = 50) were better represented than others, but in terms of student numbers around one half of students covered in the survey came from 'post-1992' HEIs.

As with the 2009 survey there are courses and universities that are missing, and given the effort put into securing responses to the survey it is less likely that universities, degree programmes or courses featuring outstanding social science UG QM teaching have been missed than institutions where less attention is paid to QM, so this must be kept in mind when considering the results. However given that the 2014 results cover a much larger number of degree programmes and universities, it can safely be assumed that non-response bias is lower. It may also be reasonable to assume that departments with little interest in or provision for QM teaching would have been less likely to take part in the surveys, so that direct comparisons between the two surveys may underestimate the amount of change that has actually taken place.

The survey contained two sections: one covering QM teachers, which could be completed by multiple respondents in an institution, and one covering degree programmes, which could be completed as many times as there were degree programmes with significantly distinct arrangements for UG QM teaching.

Quantitative methods teaching

Table 1. Percentage of Degree Programmes including quantitative methods teaching
Degree programme; 2014 (%); N**; 2009 (%); N
All 92 Sociology 96 Social Policy 100 35 Criminology 88 Politics/International Relations Geography Social Anthropology Other 171 78 108 104 90* 51* 72 100 12 84 43 71 7 63 8 100 6 100 3 0 8 96 101 46 11
* Sociology and Social Policy were counted together in the 2009 report.
** Joint degree programmes (e.g. 'Criminology and Sociology') are counted once under each heading.
Table 2. Median Total QM Teaching Contact Hours by Degree Programme (hours shown as 2014 / 2009)
Degree Programme: Lectures; Seminars; Computer Labs; All; of which embedded (2014)
All: 20 / 14; 11 / 10; 10 / 10; 48 / 36; 10
Lower quartile: 12 / 10; 2 / 0; 6 / 6; 27 / 0
Upper quartile: 40 / 30; 20 / 20; 26 / 26; 98 / 60
Sociology: 25 / 20; 12 / 11; 12 / 10; 60 / 45
Social Policy: 12 / *; 12 / *; 24 / *; 48 / *
Criminology: 20 / 13; 11 / 3; 12 / 12; 48 / 40
Politics / International Relations: 12 / 12; 6 / 14; 6 / 2; 25 / 28
Geography: 10 / 17; 13 / 0; 5 / 10; 28 / 31
Other: 18 / 12; 12 / 0; 12 / 10; 57 / 12
* Sociology and Social Policy were counted together in the 2009 report.

Table 3. Embedded hours (2014)
Contact hours: Median; Mean; Min; LQ; UQ; Max
Total contact hours: 48; 58; 0; 19; 87; 240
Of which 'embedded' teaching: 10; 27; 0; 0; 41; 111

Table 2a (appendix) shows the spread of contact hours for all universities, for all degree programmes, for all degree programmes by type of university, by student numbers and type of university, and by student numbers by degree programme title. While there is evidence of some increase in the hours devoted to QM teaching, it is significant rather than substantial and there is clearly room for much further improvement. What is most striking from the figures in Table 2a is the great spread in the range of contact hours, from zero to 240. Three year degrees comprise 360 credits, which translates into some 3,600 'notional' learning hours and perhaps 360 to 540 contact hours if we assume 20-30 contact hours for a typical 20 credit course. This means that the median of 48 QM contact hours might comprise some 10% of the typical student's degree contact time. While this proportion appears reasonable, it must be kept in mind that it includes embedded teaching where QM is not the only focus. In the summer of 2015 the British Academy commissioned me to undertake a comparative study of QM teaching.

Table 4. Content of QM courses (%; shown as 2014 / 2009)
Topic: Taught in any year; Taught in first year
Descriptive statistics / Summary statistics of level and spread: 90 / 69; 26 / 19
Survey methods / Data collection methods: 85 / 69; 31 / 24
Contingency tables: 82 / 69; 26 / 16
Sampling / Sampling and inference: 86 / 68; 24 / 19
Significance tests: 86 / *; 20 / *
Graphic display / Visualisation of data: 87 / 66; 27 / 19
Questionnaire design: 87 / 65; 24 / 18
SPSS / software: 91 / 58; 27 / 15
Correlation / Measures of association: 83 / 49; 24 / 10
Controlling for a 3rd variable: 71 / 27; 10 / 3
Regression: 65 / 18; 9 / 3
Access to & analysis of secondary data: 75 / *; 25 / *

Table 5. Dissertation projects: proportion using QM (%)
Proportion of projects using QM: 2014; 2014 weighted*; 2009
50%+: 24; 39; 9
25-49%: 6; 4; 17
10-24%: 18; 22; 15
5-9%: 22; 13; 28
0-4%: 19; 23; 30
* Weighted by number of students.

Table 6. Dissertation projects: proportion using secondary analysis of microdata (2014*)
25%+: 2
10-24%: 5
5-9%: 24
1-4%: 26
None: 42
* Weighted by number of students on programme; no corresponding question in 2009.

There has been a welcome increase in the range of topics covered in QM teaching, especially in control and regression, and an increase in the proportion of final year projects using QM. However the proportion of projects attempting secondary data analysis is still low.

Quantitative Methods Teachers

The mean career length in HE was 13 years (median 11 years; lower quartile 6 years; upper quartile 17 years). 89% were in permanent posts and 39% were in promoted posts. Most described their skills as 'advanced', almost three quarters had done externally funded research in the previous three years and almost half had done more than one such piece of research.
Two thirds had published in peer reviewed journals using QM in this period and almost one half had produced more than one such publication in this period. This is similar to the profile reported in the 2009 survey, except that the proportion in promoted posts has fallen (from 54%). Given that some respondents to the survey will include staff recently recruited as part of the Q-Step programme we might expect such a fall. It is encouraging that despite this, the profile of QM teachers continues to be one of experienced, research active colleagues. Teachers with shorter experience in HE were slightly more likely to report postgraduate training as a source of their skills: indirect evidence of the impact of reforms enacted by ESRC since 2001.

Table 1. Level of QM Skills (%)
No QM skills: 5
Competent at 'basic' QM (e.g. descriptive statistics; standard procedures in SPSS, Stata, R or similar software; simple inference): 20
Sometimes use advanced QM (e.g. secondary analysis of complex data sets; multivariate analysis): 27
Regularly use advanced QM: 46
Other: 3

Table 2. Most important source of QM skills (col %; respondents could cite more than one source)
Source: 10 years or less in HE; 11 or more years; All
Own study: 45; 50; 48
UG training: 19; 26; 23
PG training: 65; 48; 56
Courses: 40; 30; 35
Other: 14; 7; 10
(N): 86; 92; 178

The most frequently reported disciplinary area was Sociology. 30% of respondents gave more than one disciplinary descriptor. In the analyses that follow by individual subject area or combinations of subject areas, such respondents are included under each subject area they returned, so that they are counted multiple times.

Table 3. Subject area (% of respondents)
Sociology 39
Politics/International Studies 19
Social Policy 19
Criminology/Socio-legal Studies 17
Human Geography/Population Studies/Demography 12
Education 10
Social Work 7
Business Studies 6
Social Theory 6
Statistics/Maths/Methodology 6
Health/Epidemiology 5
Economic/Social History 5
Cultural Studies 3
Gender/Feminist Studies 3
Social Anthropology 3

Table 4. Views about QM Teaching (row %: (strongly) agree; neither agree nor disagree; (strongly) disagree)
I enjoy teaching QM: 82; 12; 6
The students I teach enjoy learning QM: 36; 38; 26
Most of the students I teach don't like numbers: 64; 16; 19
Most students here are confident about using QM once they graduate: 26; 27; 46
In recent years my department has improved the QM teaching undergraduates receive: 67; 15; 19
In recent years my department has increased the proportion of the curriculum devoted to studying QM: 50; 17; 34
QM is taught and assessed within one or more substantive courses offered by my department: 71; 7; 22
My department has recently appointed staff with the ability to teach QM: 67; 6; 26
I have the resources I need to teach QM well: 63; 15; 21
My department sees QM teaching as a priority: 45; 24; 31
I am satisfied with the way my department teaches QM: 45; 22; 33
Quants is still a marginal interest in my department: 49; 13; 38
QM teaching is important for promotion here: 7; 30; 63

Figure 1. Views about QM teaching: percentage of respondents who (strongly) agree, neither agree nor disagree, or (strongly) disagree with each of the statements in Table 4.

Table 5. Views about QM Teaching: comparison with 2009 (%: (strongly) agree 2014, 2009; (strongly) disagree 2014, 2009)
I enjoy teaching QM: 81; 68; 6; 4
The students I teach enjoy learning QM: 35; 5; 25; 38
Most of the students I teach don't like numbers: 63; 84; 18; 3
Most students here are confident about using QM once they graduate: 25; 9; 44; 64
QM is taught and assessed within one or more substantive courses offered by my department (2009 wording: Other substantive courses in the department normally make use of quantitative evidence): 70; 31; 22; 49
I have the resources I need to teach QM well (2009 wording: I have the time and resources I need to teach quantitative methods well): 63; 26; 21; 39
My department sees QM teaching as a priority: 45; 29; 31; 47
Quants is still a marginal interest in my department (2009 wording, reverse coded: Quantitative methods are in the mainstream of the discipline here): 49; 62; 37; 22
QM teaching is important for promotion here: 9; 6; 62; 59

The responses to the attitude statements all appear to move in a positive direction since 2009. A larger proportion of respondents reported enjoying teaching QM (suggesting that the tendency to allocate such teaching as a 'chore' to those with little interest in it has declined). Embedded teaching seems to have spread more widely, with over two thirds of respondents reporting that it is done in their department. Two thirds thought that their department had improved the way in which it teaches QM in recent years, and a similar proportion reported that their department had appointed QM teaching staff (although this could represent staff turnover as well as a net addition). One half reported that the curriculum time devoted to QM had increased, and almost half (45%) thought that their department saw teaching QM as a priority. Although many more respondents thought that their students enjoyed learning QM and that they were confident about using it when they graduated, these proportions are still much lower (35% and 25%) than is desirable, and majorities still thought that the students they taught do not like number work and that QM was a 'marginal' interest in their department. Overall the responses to the attitude statements indicate substantial progress (especially if we assume that the coverage of the 2014 survey was significantly wider than its 2009 predecessor) but also much room for further improvement, especially in the expansion of curriculum space for QM teaching.

Respondents were asked how useful they felt different activities taken to improve QM had been, if they were aware of them. The results show that all the initiatives were seen by most teachers as useful, but between one third and one half of teachers were unaware of each activity. This suggests that there is still scope for publicising what is being done more effectively.
However it is also indirect evidence of the reach of the 2014 survey: it certainly appears that it was not just completed by those in departments fully familiar with the QM agenda and committed to it. It is also notable that Q-Step was the initiative with the highest level of support, including from those not in Q-Step centres.

Further analysis showed that degree of awareness (as measured by the number of initiatives that respondents reported they were not aware of) was uncorrelated with either length of service or the type of post respondents held. There was a weak relationship to subject area, with respondents from the better represented subject areas also reporting more awareness, and there was a stronger but still weak correlation (0.2) with responses to the attitude statements: those aware of a greater number of initiatives were slightly more likely to report improvements to QM teaching, more curriculum space, recent appointments of QM staff and priority for QM teaching. However correlation is not causation: while it could be that awareness of and participation in these initiatives helped stimulate departments to revise their approach to QM teaching, it is also possible that departments which were more motivated to reform QM teaching were more likely to become aware of or participate in these initiatives.

As we might expect there was an association with participation in training events. However there was also an association with type of university (awareness was higher in 'pre-92' HEIs) and with research and publishing records. It could be that respondents who are research active and therefore familiar with the activities of the ESRC have also been more likely to get involved with the QMI. There was also a strong relationship with positive reports of progress on QM teaching as measured by the attitude statements. However the direction of causality here cannot be inferred from the data we have. It could be that involvement in and awareness of the events and initiatives that comprise the QMI have encouraged departments to improve UG QM teaching. However it could also be that departments that were determined to improve their QM teaching were also departments where respondents became more aware of and involved in the QMI.

Figure 2. Reported usefulness of initiatives to improve QM (very useful; useful; not useful; not aware): the Q-Step programme and funding; workshops and seminars for QM teachers; the QM teachers mailing list; NCRM training courses; the ESRC Secondary Data Analysis Initiative call; the British Academy policy statement 'Society Counts'; the Curriculum Innovation/Researcher Development Initiative programme events; the student career booklet 'Stand Out and be Counted'; the quantitativemethods.ac.uk web page.

Table 6. Number of activities respondents were aware of
Number of activities: %; cumulative %
All: 11; 11
6-8: 20; 31
3-5: 28; 59
1-3: 24; 75
None: 25; -

The individual questionnaire

Q1 In which university do you now work?
Q2 For how many years have you worked in Higher Education? (please include both research and teaching based posts)
Q3 Is your current post permanent?
Q4 Are you in a 'promoted' post (e.g. Reader, Professor or equivalent)? 1 Yes  2 No  3 Don't Know  4 Other
Q5 How would you describe your own quantitative skills?
1 I do not have such skills.  2 I am competent at 'basic' QM (e.g. descriptive statistics; standard procedures in SPSS, Stata, R or similar software; simple inference).  3 I sometimes use advanced QM (e.g.
secondary analysis of complex data sets; multivariate analysis). 4 I regularly use advanced QM. 5Other Q6 What was the most important source of the quantitative skills you have now? Q6_1 Own study Q6_2 Undergraduate training Q6_3 Postgraduate training Q6_4 Attending specialist workshops or courses Q6_5Other Q6_Other Further details Q7 Have you done any externally funded research within the last three years? 1No 2 One grant 3 More than one grant 4Other Q8 Have you published any peer-reviewed work using quantitative techniques in the last three years? 1Yes 2One 3 More than One 4Other Q9Have you attended any training events, workshops or conferences dealing with quantitative methods teaching or that build QM skills you may pass on in teaching? Please exclude training events that build QM skills that you use principally for research. 1No 2One 3 More than one 4Other Q9_a If you have attended such an event please tell us who organised it. Q9_a_1ESRC Q9_a_2HEA Q9_a_3NCRM Q9_a_4 British Academy Q9_a_5 Professional Association (BSA, PSA, RSS etc) Q9_a_6Other Q9_a_Other Further details Q10What is the main disciplinary area of your department? (if there is more than one please choose as many as apply) Q10_1 Area studies Q10_2 Business Studies Q10_3 Criminology/ Socio-legal Studies Q10_4 Cultural studies Q10_5 Gender / Feminist studies Q10_6 Health / Epidemiology / Q10_7 Human Geography /Population Studies/Demography Q10_8 Economic / Social History Q10_9Education Q10_10Linguistics Q10_11 Politics /International Studies Q10_12 Social Anthropology Q10_13 Social Policy 19 20 Q10_14 Social Theory Q10_15 Social Work Q10_16Sociology Q10_17 Statistics / Maths / Methodology Q10_18 Research Unit Q10_19Other Q10_Other Further details Q11 Please give your opinion on the following statements about quantitative methods teaching. Q11_a I enjoy teaching QM. Q11_b The students I teach enjoy learning QM. Q11_c Most of the students I teach don’t like numbers. Q11_d Most students here are confident about using QM once they graduate. Q11_e In recent years my department has improved the QM teaching undergraduates receive. Q11_f In recent years my department has increased the proportion of the curriculum devoted to studying QM. Q11_g QM is taught and assessed within one or more substantive courses offered by my department. Q11_h My department has recently appointed staff with the ability to teach QM. Q11_i I have the resources I need to teach QM well. Q11_j My department sees QM teaching as a priority. Q11_k I am satisfied with the way my department teaches QM. Q11_l Quants is still a marginal interest in my department. Q11_m QM teaching is important for promotion here. 1 Strongly Agree 2Agree 3 Neither agree nor disagree 4Disagree 5 Strongly Disagree 6 Don’t Know 7 Not Applicable 8Other Q12 Please feel free to add further comments on any of your answers above Q13Over the last five years the ESRC, Nuffield Foundation, HEFCE, British Academy Higher Education Academy and others have been trying to improve the position of QM in UK Universities. The following lists some of the initiatives. Please indicate whether you have been aware of them and if so, how useful you think they have been. Q13_a ESRC Secondary Data Analysis Initiative call. Q13_b The Q-Step programme and funding. Q13_c Workshops and seminars for QM teachers. Q13_d NCRM training courses. Q13_e The QM teachers mailing list. Q13_f The quantitativemethods.ac.uk web page. Q13_g The student career booklet ‘’Stand Out and be Counted’’. 
Q13_h The British Academy policy statement ‘’Society Counts’’. Q13_i The Curriculum Innovation/Researcher Development Initiative programme events. 1 Very useful 2Useful 3 Not useful 4 Not aware of this 5Other Q14 Please add any observations you have on these or other initiatives to improve QM training. Q15 What do you think would be the best way to develop university QM training over the next five years? Quantitative Methods Strategic Advisor Report The institutional questionnaire Q1 Please enter the name of your university Q2 Which academic year does the following information relate to? 12013/14 22014/15 3Other Q3Please enter the name of up to six main degrees that share similar QM teaching arrangements (respondents could complete this information for as many( groups of ) degree programmes that had distinct arrangements for QM teaching Q3_a Degree 1 -- Name of the degree Q3_a_i Degree 1 -- Number of students Q3_b Degree 2 -- Name of the degree Q3_b_i Degree 2 -- Number of students Q3_c Degree 3 -- Name of the degree Q3_c_i Degree 3 -- Number of students Q3_d Degree 4 -- Name of the degree Q3_d_i Degree 4 -- Number of students Q3_e Degree 5 -- Name of the degree Q3_e_i Degree 5 -- Number of students Q3_f Degree 6 -- Name of the degree Q3_f_i Degree 6 -- Number of students Q4How many contact hours of quantitative methods teaching do students receive in their degree programme? ‘’Please include all QM teaching, including teaching ‘embedded’ in substantive courses as well as specialist ‘research methods’ courses.’’ E.g. if a methods course is divided between quantitative and qualitative methods, please base your answers only on the quantitative component(s); if a substantive course devotes a proportion of its teaching to introducing or practicing QM skills and this is reflected in assessment, please include such teaching in the hours entered. Q4_a Hours of lectures -- Optional pre-honours courses Q4_a_i Hours of lectures -- Compulsory pre-honours courses Q4_a_ii Hours of lectures -- Optional honours courses Q4_a_iii Hours of lectures -- Compulsory honours courses Q4_b Hours of seminars/workshops/tutorials -- Optional pre-honours courses Q4_b_i Hours of seminars/workshops/tutorials -- Compulsory pre-honours courses Q4_b_ii Hours of seminars/workshops/tutorials -- Optional honours courses Q4_b_iii Hours of seminars/workshops/tutorials -- Compulsory honours courses Q4_c Hours of computer labs -- Optional pre-honours courses Q4_c_i Hours of computer labs -- Compulsory pre-honours courses Q4_c_ii Hours of computer labs -- Optional honours courses Q4_c_iii Hours of computer labs -- Compulsory honours courses Q5Approximately how many hours of this teaching represents the delivery of QM skills ‘embedded’ in courses with a substantive focus? Q6 Please indicate which if any of the following topics are introduced at each level. 
Q6_a Questionnaire design -- Compulsory pre honours courses Q6_a_i Questionnaire design -- Optional pre-honours courses Q6_a_ii Questionnaire design -- Compulsory honours courses Q6_a_iii Questionnaire design -- Optional honours courses Q6_b Data collection methods -- Compulsory pre honours courses Q6_b_i Data collection methods -- Optional pre-honours courses Q6_b_ii Data collection methods -- Compulsory honours courses Q6_b_iii Data collection methods -- Optional honours courses Q6_c Summary statistics of level and spread -- Compulsory pre honours courses Q6_c_i Summary statistics of level and spread -- Optional pre-honours courses Q6_c_ii Summary statistics of level and spread -- Compulsory honours courses 21 22 Q6_c_iii Summary statistics of level and spread -- Optional honours courses Q6_d Contingency tables -- Compulsory pre honours courses Q6_d_i Contingency tables -- Optional pre-honours courses Q6_d_ii Contingency tables -- Compulsory honours courses Q6_d_iii Contingency tables -- Optional honours courses Q6_e Graphic display / visualisation of data -- Compulsory pre honours courses Q6_e_i Graphic display / visualisation of data -- Optional pre-honours courses Q6_e_ii Graphic display / visualisation of data -- Compulsory honours courses Q6_e_iii Graphic display / visualisation of data -- Optional honours courses Q6_f Measures of association -- Compulsory pre honours courses Q6_f_i Measures of association -- Optional pre-honours courses Q6_f_ii Measures of association -- Compulsory honours courses Q6_f_iii Measures of association -- Optional honours courses Q6_g Sampling and inference -- Compulsory pre honours courses Q6_g_i Sampling and inference -- Optional pre-honours courses Q6_g_ii Sampling and inference -- Compulsory honours courses Q6_g_iii Sampling and inference -- Optional honours courses Q6_h Significance tests -- Compulsory pre honours courses Q6_h_i Significance tests -- Optional pre-honours courses Q6_h_ii Significance tests -- Compulsory honours courses Q6_h_iii Significance tests -- Optional honours courses Q6_i Controlling for prior variables -- Compulsory pre honours courses Q6_i_i Controlling for prior variables -- Optional pre-honours courses Q6_i_ii Controlling for prior variables -- Compulsory honours courses Q6_i_iii Controlling for prior variables -- Optional honours courses Q6_j Regression -- Compulsory pre honours courses Q6_j_i Regression -- Optional pre-honours courses Q6_j_ii Regression -- Compulsory honours courses Q6_j_iii Regression -- Optional honours courses Q6_k Access to and analysis of secondary data -- Compulsory pre honours courses Q6_k_i Access to and analysis of secondary data -- Optional pre-honours courses Q6_k_ii Access to and analysis of secondary data -- Compulsory honours courses Q6_k_iii Access to and analysis of secondary data -- Optional honours courses Q6_l Use of software (SPSS, Stata etc) -- Compulsory pre honours courses Q6_l_i Use of software (SPSS, Stata etc) -- Optional pre-honours courses Q6_l_ii Use of software (SPSS, Stata etc) -- Compulsory honours courses Q6_l_iii Use of software (SPSS, Stata etc) -- Optional honours courses Q7 Do students normally complete a research project or dissertation as part of their degree? 1 Yes: compulsory 2 Yes: optional 3No 4Other Q7_aApproximately what proportion of students use a significant quantitative element in their projects or dissertations (e.g. analysis of official published data, secondary analysis of microdata, or original survey producing simple descriptive statistics)? 
Q7_b Of those quantitative projects how many undertake secondary analysis of microdata?

4 Social science graduates in the Futuretrack survey

4.1 Background: the Futuretrack Survey

In addition to the 2009 and 2014 QM surveys, the other major source of information that we have on students' skills and attitudes to them comes from the Futuretrack survey. Futuretrack followed a cohort of students who applied through UCAS to enter Higher Education between September 2005 and September 2006, surveying them at pre-entry, at entry, in year 3 (or the final year of their course if later), and on into employment or other destinations in 2011-12. The survey was led by Kate Purcell and Peter Elias at the Institute for Employment Research at the University of Warwick and funded by the Higher Education Careers Service Unit. Details of the survey and associated documentation are available at: www.hecsu.ac.uk/current_projects_futuretrack.htm I'm grateful to Kate, Peter and Ritva Ellison for giving me access to the microdata from the survey. Needless to say, I am solely responsible for the conclusions drawn here, and any errors and omissions.

Around 121,100 students were captured at entry, 26,500 at their final year and 17,100 in 2011-12, when most would have been in the labour market for two or three years. At each stage there was attrition from previous waves of the survey (especially between entry and subsequent waves) but also the recruitment of new members to the survey. Thus, for example, there is information at entry for 127,000 students, at third/final year for 26,000 students and in 2011-12 for 17,000. Around 8,000 students completed questionnaires at all three of these waves. Table 1 shows the number of responses for each wave and combination of waves.

Table 1. Number of responses at each wave and combination of waves (unweighted)
Entry, Y3/Final + 2011-12    7792
Entry + Y3/Final only       13626
Entry + 2011-12 only         5807
Entry only                  99834
Y3/Final + 2011-12 only      1057
Y3/Final only                3978
2011-12 only                 2418
All entry                  127059
All Y3/Final                26453
All 2011-12                 17074

These numbers mean that most subjects had sufficiently large numbers of students to facilitate separate analysis at each wave. To present the results, subjects are usually grouped as follows:

Medicine, vet, dentistry and allied subjects: Pre-clinical and clinical medicine, subjects allied to medicine including nursing, veterinary science, dentistry. This category has not been shown in tables relating directly to employment, as the longer length of most medical, dental and veterinary degree programmes means that not all Futuretrack students would have graduated or be in employment by the time of the fourth-wave study.

STEM: All natural science subjects, engineering, mathematics, statistics, technology, psychology, architecture and building.

Economics, finance & accounting: Economics, finance and accounting. These subjects have been grouped together as we could expect a high level of numerical competence to be fundamental to these degree programmes.

Social sciences not elsewhere specified (n.e.s.): All social science subjects including management and business studies, marketing, mass communication and documentation, but excluding psychology, economics, finance and accounting. Where not shown separately, psychology is included with STEM subjects.

Law, arts & humanities: All law, arts, humanities, philosophical, historical and languages studies.
Combinations of subjects across more than one of these groupings are usually not shown separately but are included in the totals for 'all subjects'. Results should be treated with some caution for the following reasons:

Non-response bias. The overall response rate for the survey after entry is quite low, and we do not have information on non-responders. Attrition and new recruitment also change the composition of the survey sample over time. For specific subject areas some of the Ns are small: around 200 students.

Measurement. Although mean scores are shown in some tables, it should be remembered that the original scores are ordinal in character. In particular the categories on the three-point scale evaluating skills in the 2011-12 wave are rather broad. The nature of the categories used also changed slightly over time.

Interpretation. In evaluating their skills and the contribution of their course in developing them, we cannot know what reference group(s) individual students had in mind: other students on their course, on similar courses elsewhere, other students at their university or other universities, their own expectations of themselves or others, and so on. It is safe to assume, for example, that maths students interpreted 'good' numerical skills, or the amount of their development, rather differently from history students. Nor can we know how sound students' evaluations of their own skills are.

4.2 Degree course skills

In their third year (or final year if it was later) students were asked 'How far do you think YOUR COURSE has enabled you to develop the following? [1 = very much; 2 = quite a lot; 3 = a little; 4 = very little; 5 = not at all.]' Twenty-one skills were listed: Ability to apply knowledge; Ability to use numerical data; Ability to work in a team; Awareness of strengths/weaknesses; Computer literacy; Critical analysis; Desire to go on learning; Entrepreneurial/Enterprise skills; Independence; Inter-personal skills; Logical thinking; Presentation skills; Problem-solving skills; Research skills; Self confidence; Self discipline; Self reliance; Specialist knowledge; Spoken communication; Time management; Written communication.

Figures 1 and 2 show the results for the skills 'Ability to use numerical data' and 'Written communication' by individual subjects and general subject groupings. While STEM subjects (with the partial exception of Maths) all manage to develop students' skills in written communication, few students in the arts or social sciences think that their course did much to develop their ability to use numerical data. Of particular interest are the results for Economics and Psychology: they are comparable to the STEM subjects in developing numerical analysis skills. The result for Biology is also shown separately because it is a STEM subject that nevertheless recruits students with similar levels of maths skills to social science subjects (a similar proportion of entrants to Biology have Maths A-Level as do entrants to social science subjects). Table 2 shows the data used in Figures 1 and 2. We can compare the pattern of skills development across the 21 skills for each subject or group of subjects. Figures 3 to 5 show the results for the social sciences as a whole (excluding Economics and Psychology), for Sociology and for Biology.

Figure 1. Final/Year 3 evaluation of how far course developed skill: 'Use numerical data'; all universities by specific and grouped degree subject.
[Figure 1 is a stacked bar chart showing, for each subject, the percentage of students answering 'Very much', 'Quite a lot', 'A little', 'Very little' and 'Not at all'; the data are given in Table 2.]

Figure 2. Final/Year 3 evaluation of how far course developed skill: 'Written communication'; all universities by specific and grouped degree subject.

[Figure 2 is a stacked bar chart showing, for each subject, the percentage of students answering 'Very much', 'Quite a lot', 'A little', 'Very little' and 'Not at all'; the data are given in Table 2.]

Table 2. Third/Final year: course development of skills (each row shows the percentage distribution across response categories)

(a) Ability to use numerical data
Subject                           Very much  Quite a lot  A little  Very little  Not at all
Maths                                63.2        26.0       10.2        0.4          0.2
Economics, finance & accounting      42.1        46.6        9.6        1.4          0.3
Psychology                           24.5        50.9       18.6        5.1          0.9
Biology                              18.9        54.0       22.9        3.7          0.5
Sociology                            15.7        27.4       41.7       11.8          3.5
Human Geography                       3.4        16.3       42.0       27.7         10.6
Politics                              3.0        10.2       30.3       33.6         23.0
All STEM subjects                    29.0        42.1       21.1        6.4          1.4
Soc. Sci. excl. Economics             7.7        23.1       32.7       24.0         12.5
Law, Arts & Humanities                2.1         6.5       23.2       32.9         35.4

(b) Written communication
Subject                           Very much  Quite a lot  A little  Very little  Not at all  Unweighted N
Maths                                10.5        29.4       38.3       14.7          7.1          383
Economics, finance & accounting      22.5        52.7       20.8        2.9          1.1          630
Psychology                           39.5        47.0       10.7        2.2          0.6          928
Biology                              36.1        48.4       13.1        2.3          0.0          357
Sociology                            42.2        45.0       12.8        0.0          0.0          136
Human Geography                      43.8        46.5        8.3        0.8          0.5          173
Politics                             44.7        42.6       12.1        0.3          0.3          315
All STEM subjects                    28.4        45.6       19.8        4.9          1.3         4984
Soc. Sci. excl. Economics            36.0        46.0       14.5        2.4          1.0         2800
Law, Arts & Humanities               42.7        39.0       13.7        3.5          1.0         4332

Figure 3. All Social Sciences
[Figure 3 is a stacked bar chart showing, for each of the 21 skills, the percentage of social science students (excluding Economics and Psychology) in each response category from 'Very much' to 'Not at all'.]
Figure 4. Sociology
[Figure 4 is a stacked bar chart showing, for each of the 21 skills, the percentage of Sociology students in each response category from 'Very much' to 'Not at all'.]

Figure 5. Biology
[Figure 5 is a stacked bar chart showing, for each of the 21 skills, the percentage of Biology students in each response category from 'Very much' to 'Not at all'.]

Table 3. Final/Year 3 views on employability (mean scores: 1 = Agree strongly, 7 = Disagree strongly) and median salary (£K)

Statements: (1) The subject I have studied is an advantage in looking for employment; (2) The skills I have developed on my course have made me more employable; (3) My course is developing the skills I believe I will need to get a job; (4) It will be easy for me to get the kind of job I want when I graduate; (5) I am optimistic about my long-term career prospects.

                            (1)   (2)   (3)   (4)   (5)   Median salary (£K)
STEM                        2.4   2.6   2.8   4.0   2.9         20
Psychology                  2.7   2.8   3.1   4.5   3.2         18
Econ. & related             2.0   2.6   2.8   3.7   2.7         22
Sociology                   3.8   3.3   3.5   4.4   3.3         18
Politics                    3.2   3.0   3.1   4.4   3.2         20
Human Geography             3.0   2.7   2.9   4.2   3.2         19
Social Work                 1.5   1.9   2.2   2.7   2.4         24
Social Sci. excl. Econ      2.7   2.7   2.9   4.1   3.0         20
Law, arts & humanities      3.1   2.8   3.1   4.4   3.3         18
All                         2.5   2.6   2.8   4.0   3.0         20

Figure 6. Final/Year 3 views on choice of course
[Figure 6 is a stacked bar chart showing, for each subject group (Medical, dental & vet; Allied to Medicine; STEM; Psychology; Economics & related; Sociology; Politics; Human Geography; Social Work; Social Sci. excl. Econ; Law, arts & humanities; All), the percentage who would definitely or probably choose the same course again, would choose a similar course but not this one, don't know, or would choose something completely different.]

4.3 Skills, knowledge and employability

In wave three (in their final year) students were also asked how far their course had developed knowledge, expertise or skills that would be useful in employment, through a series of agree/disagree statements where a score of 1 represented 'Agree strongly' and 7 'Disagree strongly', so that a lower mean score represents stronger agreement. The results are shown in Table 3 and Figure 6. Finally, students were asked whether, if they were starting again, they would choose the same course. Between two-thirds and nine-tenths said 'yes' (Figure 6).

These results are what we might expect. Students in vocational subjects are more optimistic about their immediate and longer term career prospects, and aware of the close link between the knowledge and skills their degree course developed and their career. STEM students are slightly more likely than social sciences and arts/humanities students to see their degree subject as an advantage in the labour market, but students in all three areas take a similar view of the employability benefits of their course skills. They don't anticipate that finding a job will be easy, but they are optimistic about their longer term career prospects.

4.4 Students in 2011-12

Social science graduates were rather more likely than others to be in a job. However other graduates were more likely than social science students to progress to PG study. (The high figure for 'all' students is caused by the higher proportion of students taking combinations of subjects at UG level who go on to PG study.) The unemployment figure for economics graduates is surprising, especially given their strong earnings performance, which we examine elsewhere. Possible speculative explanations might be that they spend longer on job search, or move more frequently between jobs. (Table 4)
Table 4. Employment situation 2011-12 by subject group (%). Columns: STEM; Psychology; Econ, finance & accounting; Social science n.e.s.; Law, arts & humanities; Total.

(Self) employed                                            65.7  63.1  72.8  74.9  68.9  65.6
Studying                                                   21.4  23.4  11.9  10.5  13.8  20.2
Unemployed                                                 11.2  10.6  14.1  12.0  14.3  11.6
Gap year, travelling, not looking for work, unpaid work,
other                                                       1.7   3.0   1.2   2.5   3.0   2.6

Social science graduates were paid less, were less satisfied with their jobs, were slightly less likely to be working with other graduates, were slightly less likely to be in a job that used either the knowledge or the skills they acquired in their degree (Table 5), and were less optimistic about their career prospects than graduates from STEM degrees (characteristics they shared with Psychology graduates). Further analysis showed substantial variation across individual social science subjects.

Table 5. Job use of degree course knowledge and skills (%). Columns: STEM; Psychology; Econ, finance & accounting; Social science n.e.s.; Law, arts & humanities; Total.

Only/mainly graduates in role                    49.8  39.8  77.6  45.4  46.2  50.8
About equal                                      25.3  28.5  10.3  27.8  30.1  26.2
Only/mainly non-graduates in role                24.9  31.7  12.1  26.8  23.7  23.1
Job uses skills developed in degree              81.7  72.9  86.5  79.3  72.3  78.3
Job uses subject knowledge acquired in degree    66.8  50.9  67.7  63.3  50.8  62.4

Table 6 shows a variety of graduates' views about their job and career prospects. First come a series of questions about satisfaction with their current job, based on mean scores measured on a 7-point scale where 1 = very satisfied and 7 = very dissatisfied. Again, social science graduates are generally more pessimistic about their current position and future prospects. Table 7 shows graduates' reports of their pay in their first employment after graduation and in their current job.

Table 6. Graduates' views of their current job and their career prospects, 2011-12 (mean scores). Values are given in the order: Nursing; Physics & Maths; Engineering (all); Other STEM subjects; Psychology; Economics; Politics; Human Geography; Social Work; Sociology; Other social science; Law, arts & humanities; Total.

Total pay (including overtime or bonuses): 3.6, 3.4, 3.4, 3.9, 4.4, 3.5, 4.1, 4.0, 4.1, 4.8, 4.0, 4.3, 4.0
The number of hours you work: 2.9, 3.1, 2.8, 3.2, 3.3, 3.1, 3.3, 3.3, 3.7, 3.6, 3.3, 3.5, 3.3
The actual work itself: 2.4, 2.8, 2.8, 3.1, 3.4, 3.0, 3.2, 3.4, 3.2, 3.8, 3.1, 3.4, 3.1
Job security: 2.6, 2.8, 2.6, 3.2, 3.4, 2.7, 3.2, 3.0, 3.4, 3.7, 3.3, 3.7, 3.3
Opportunity to use your own initiative: 2.7, 2.8, 2.7, 3.0, 3.3, 3.0, 3.1, 3.3, 3.1, 3.6, 3.1, 3.3, 3.1
Satisfaction with present job: 2.6, 3.0, 2.9, 3.3, 3.7, 3.0, 3.4, 3.5, 3.4, 3.8, 3.4, 3.6, 3.3
Current job appropriate for your skills and qualifications: 1.9, 2.9, 2.8, 3.4, 4.0, 3.0, 3.8, 3.8, 3.0, 4.4, 3.5, 3.8, 3.4
Promotion or career development prospects: 3.1, 3.1, 3.0, 3.7, 4.1, 2.9, 3.8, 3.9, 3.8, 4.4, 3.7, 4.0, 3.7
I have a clear idea about the occupation I hope to have in 5 years' time and the qualifications required to do so: 2.2, 3.5, 2.8, 3.1, 3.1, 3.0, 3.2, 3.5, 2.8, 3.4, 3.1, 3.1, 3.0
I am optimistic about my long-term career prospects: 2.3, 2.7, 2.4, 3.0, 3.3, 2.4, 3.0, 3.1, 3.1, 3.4, 2.9, 3.3, 3.0
I have the skills employers are likely to be looking for when recruiting for the kind of jobs I want: 2.0, 2.5, 2.2, 2.6, 2.8, 2.3, 2.7, 2.6, 2.3, 3.0, 2.6, 2.8, 2.6
Table 7. Gross pay, incl. overtime, bonuses etc. and before tax, NI or other deductions (%). Columns: STEM; Psychology; Econ, finance & accounting; Social science n.e.s.; Law, arts & humanities; General, combinations or not classified; Total.

First employment:
<£11,999      28.4  39.4  20.7  33.2  47.9  43.8  35.1
£12-17,999    25.0  36.2  20.6  30.3  28.3  27.2  25.9
£18-29,999    39.0  22.3  48.0  31.3  20.2  23.4  32.7
£30,000+       7.5   2.0  10.7   5.3   3.4   5.6   6.4

Current employment (2011-12):
<£11,999      18.1  25.2  11.8  34.4  18.9  34.3  29.1
£12-17,999    19.8  33.8  18.6  37.0  24.4  25.9  23.8
£18-29,999    47.3  36.9  39.8  26.4  44.5  32.5  34.7
£30,000+      14.7   4.1  29.9   2.2  12.2   7.3  12.5

Table 8 shows graduates' views on whether their degree course, or the skills it developed, have been an advantage in looking for employment. Again, social science graduates are more pessimistic than their STEM peers. Further analysis showed that these results were not greatly affected by the type of university (pre/post-'92) that students attended. It should be borne in mind that some of the Ns for individual subjects were small, so that results are subject to much wider confidence intervals (not shown). Nor can we tell what selection bias might have been introduced by response to the survey. One might speculate that graduates who were relatively dissatisfied with their labour market experience may have had a higher motivation to participate. Table 9 breaks this information down by individual subject areas.

Table 8. 2011-12 Degree course and employability. Mean scores, wave 4 (1 = Strongly agree, 7 = Strongly disagree); columns in order: STEM; Psychology; Econ, finance & accounting; Social science n.e.s.; Law, arts & humanities; General, combinations or not classified; Total.

'The undergraduate subject I studied has been an advantage in looking for employment': 2.9, 3.8, 2.7, 3.4, 3.7, 3.5, 3.2
'The skills I developed on my undergraduate course made me more employable': 2.7, 3.2, 2.8, 3.1, 3.1, 3.1, 2.9

Futuretrack wave 4 (2011-12): UCAS 2006 entrants to university who obtained a degree.
Table 9. Views on employability by individual subject area (mean scores: 1 = Strongly agree, 7 = Strongly disagree)

'The undergraduate subject I studied has been an advantage in looking for employment':
Nursing 1.5; Training Teachers 1.9; Pharmacology, Toxicology & Pharmacy 1.9; Anatomy, Physiology & Pathology 2.3; Physics 2.3; Social Work 2.4; Civil Engineering 2.4; Mechanical Engineering 2.4; Electronic & Electrical Engineering 2.5; Mathematics 2.6; Chemistry 2.7; Economics 2.7; Law 2.8; Computer Science 2.9; Molecular Biology & Biochemistry 2.9; Combs in Langs, Literature and related 3.0; Combs in Business & Admin Studies 3.0; Architecture 3.1; Business studies 3.1; Management studies 3.1; Combs of languages 3.2; Physical Geography & Environmental Sciences 3.3; Combinations within Social Studies 3.5; Biology 3.5; Academic studies in Education 3.5; Drama 3.5; Tourism, Transport & Travel 3.6; Design studies 3.6; English studies 3.7; Music 3.7; Psychology 3.8; Sports Science 3.8; Human Geography 3.8; Politics 3.8; History 3.8; Anthropology 3.9; Philosophy 4.0; Social Policy 4.0; Media studies 4.2; Cinematics & Photography 4.3; Sociology 4.4; Fine Art 4.5; All subjects 3.1.

'The skills I developed on my undergraduate course made me more employable':
Nursing 1.8; Pharmacology, Toxicology & Pharmacy 2.1; Anatomy, Physiology & Pathology 2.2; Training Teachers 2.2; Social Work 2.3; Physics 2.4; Mechanical Engineering 2.4; Civil Engineering 2.5; Electronic & Electrical Engineering 2.5; Combs in Langs, Literature and related 2.5; Molecular Biology & Biochemistry 2.7; Chemistry 2.7; Combs of languages 2.7; Economics 2.7; Mathematics 2.7; Physical Geography & Environmental Sciences 2.7; Computer Science 2.9; Combs in Business & Admin Studies 2.9; Human Geography 2.9; Drama 2.9; Management studies 2.9; Architecture 2.9; Sports Science 2.9; Biology 3.0; Law 3.0; Business studies 3.0; History 3.0; Social Policy 3.0; Music 3.0; Academic studies in Education 3.1; Politics 3.1; English studies 3.1; Anthropology 3.1; Psychology 3.2; Combinations within Social Studies 3.2; Design studies 3.2; Philosophy 3.3; Sociology 3.4; Media studies 3.4; Tourism, Transport & Travel 3.5; Fine Art 3.7; Cinematics & Photography 3.8; All subjects 2.8.

5 An overall assessment of progress so far

Given the longstanding and intractable nature of the problem of the quantitative skills deficit in UK social science, it is only right to be cautious in assessing progress. However, it is also clear that the signs of progress so far are encouraging and that the policies pursued by ESRC since 2009 have had a substantial and positive effect.

It is worthwhile recalling just how longstanding the current challenges are. In 1946 a government committee was established to investigate 'whether additional provision is necessary for research into social and economic questions' (Clapham, 1946: 1). The Clapham report noted that 'an adequate supply of statistical competence is quite fundamental to the advancement of knowledge of social and economic questions', and went on to suggest that 'there was a chronic struggle […] for the services of a supply of statisticians' (Clapham, 1946: 18).
In 1971 the Royal Statistical Society commissioned another, much more extensive, Report on the Use of Statistics in the Social Sciences. It noted that the quality of applications for research grants that came before the SSRC Statistics Committee was 'poor' and their number 'disappointingly low'; that non-numeracy among social science graduates prevailed, and was thought to have originated in school; that there was disorganised variety in what was taught and in the manner of teaching; and that a major problem was determining who was best suited to do the teaching - mathematicians or social scientists. In response to the report the chairman of the SSRC 'wrote a letter to all head-teachers of schools with sixth forms urging them to advise pupils intending to study social sciences at university to continue with mathematics to A-level if possible (S.S.R.C. Newsletter, No. 3)' (Rosenbaum 1971:3).

Problems accumulated over more than half a century are unlikely to be solved in a couple of years. However the volume of funding of the Q-Step programme, the enthusiasm and commitment of the staff involved, and its impact not only on universities with centres but also on other universities that decide to emulate them, suggest that it is well capable of delivering the hoped-for 'step change' in undergraduate methods teaching. Given this it would seem to make sense to consolidate what has been achieved rather than looking for further major programmes or initiatives at this stage. It makes sense to give Q-Step time to bed down before evaluating its performance. However, at the appropriate time, if the centres prove successful in meeting their aims, there will be an important job to do in ensuring that all stakeholders, including potential student entrants to social science, are fully aware of the benefits of good social science QM training and incentivised to ensure that it is firmly embedded in university social science. Hopefully successors to Q-Step, should they materialise, would be self-funding, as universities choose to imitate the likely success of Q-Step. However maintaining momentum in the face of the substantial inertia rooted in the current skill mix of university HE teaching staff will continue to be a challenge.

The results of the 2009 survey 'refresh' are very positive, but even here there are some areas where it appears progress may be slower than in others. In particular it will be important to monitor how far curriculum space devoted to QM increases across the HE sector as a whole. UK universities will remain well below the standards achieved by the best universities overseas unless the time devoted to quantitative methods, and to methods in general, continues to increase very substantially. The British Academy will release a report highlighting this issue shortly.

What made possible the degree of progress that has been achieved? Any evaluation of the progress made so far is inevitably subjective, but there are a few tentative conclusions that may be drawn. First, it was important that progress in QM was not seen as a zero-sum game, or as following the narrow interests of quant specialists, but rather as something that departments could work towards alongside their many other priorities and objectives. This was of course helped by the provision of extra resources, but just as important was the message that pursuing better QM training was not something that unnecessarily detracted from attention to qualitative methods.
It has been important to stress that supporting QM was not an attack on 'qual'. Over time this will become a more difficult balancing act, since greater attention to methodology will draw greater attention to just what qualitative approaches comprise. The latter embrace everything from highly sophisticated and powerful research techniques to, unfortunately, inchoate work with little empirical basis that claims qualitative status only by virtue of not being quantitative. Greater attention to empirical research, to the evaluation and interpretation of evidence, and to methods may provoke some resistance from those who see a proper emphasis on methods as some sort of commitment to 'positivism' or as undervaluing purely theoretical work.

Second, it was possible to construct a wide coalition of key stakeholders alongside the ESRC, including the Nuffield Foundation, HEFCE, BIS, RSS, BA and others, who could work together to pursue a range of objectives on which there was basic agreement. The volume of funding that this coalition made possible was also important, giving the QM community a useful lever with which to persuade deans, heads of school and other key university decision makers of the importance of QM teaching. Keeping this consensus active and effective will mean rising to the challenge of other changes to the HE environment, such as the potential Teaching Excellence Framework. It will be important to challenge arguments that have surfaced in the past, and which may return in the future, that too great an emphasis on QM may be unpopular with students or detract from the student experience. There is no doubt that unimaginative QM teaching or inattention to application can produce teaching that students see as dull or irrelevant. It will be important to ensure that stakeholders understand that good QM teaching can be just as popular and engaging as any other. However this relates back to the point made above about curriculum space. Rushed QM teaching that tries to pack the basics into an over-crowded curriculum in the minimum time is unlikely to be successful or appreciated by students.

Third, a vital component of these stakeholders was the community of QM teachers built up through the mailing list and workshops. In the past QM teachers have often felt isolated. It may be possible to develop the formal and informal networks that have grown up over the last six years in other ways in the future. It might be that mentoring across institutions or departments could play an important role. The CI/RDI initiative led to a step change in the range, quality and volume of training for QM teachers. This may need to be repeated at some point in the medium term.

Fourth, the flexibility of the strategic advisor role made it possible to undertake initiatives, network and establish communications in ways that would have been impossible to prescribe or define in advance. The excellent relationship that I enjoyed with the relevant ESRC officials made this work much more productive.

Finally, some of the success must be down to the 'zeitgeist'. The words 'big data' did not appear in the original MacInnes report, but the popular perception that an increasing volume of new kinds of potentially useful data would be a feature of contemporary society, and that more people with the skills to exploit this data would be urgently required, has been an important backdrop to the activity described here.
6 Future work

As Q-Step becomes established and the CI/RDI programme winds down, the volume of immediate, high-priority activity will subside; however the achievements made so far will still need to be nurtured and developed. The following are priorities and issues I have identified. Of these I would highlight schools (1), DTCs (4) and outreach to the natural sciences and humanities as the most important areas of work.

1 Core Maths and school outreach

Post-16 maths education and the new Core Maths qualification, for those who do not proceed to A-Level in Maths, will be rolled out over the next few years in England and Wales. This qualification offers several advantages for university social science, where most recruits do not have A-Level Maths. It focuses on the application of maths, a well-recognised weakness of most university entrants, and includes substantial attention to the collection and analysis of data. Schools face a formidable challenge in finding teachers with the skills and training to provide this new qualification, as well as teaching resources that school pupils will find engaging. This provides an opportunity for universities and the ESRC to engage with schools, helping to provide resources and training and, by doing so, signalling the important message that maths and statistics are relevant to the social as well as the natural sciences. Given the fragile teaching base for quantitative methods in the social sciences it would be unrealistic to expect universities to undertake a large volume of work in this area, but it might be possible to fund one or more pilot projects focused on helping schools to make the most of Core Maths. Q-Step centres are doing a good deal of outreach work and it would be desirable if the lessons they learn in this area could be communicated to other universities.

2 Outreach from Q-Step

The impact of the Q-Step programme will be greater to the extent that lessons about effective undergraduate QM pedagogy are learned and shared not only between Q-Step centres but across all universities. Q-Step staff will be busy establishing new degree programmes and courses, so it would be unrealistic to expect them to take on the burden of all of this work. Perhaps a way can be found to encourage and incentivise staff from other universities to take on some of it. NCRM will undertake a review and evaluation of QM pedagogy and the evidence base for different approaches as a three-year project starting in May 2015.

3 Benchmarks

QAA's view that benchmarks are the property of the relevant disciplinary community has meant that work to revise benchmarks has been a matter of informal networking and lobbying within the relevant learned societies. This approach was very successful in the case of Geography, but markedly less so with Politics and International Relations. It may be worth repeating the approach made to QAA in 2010, trying to persuade QAA that benchmarks ought also to respond to 'user' input and citing the many calls from BIS and others for better graduate data skills.

4 DTCs

The range of QM provision within DTCs probably continues to be uneven. It is important that the next round of recognition requires DTCs to demonstrate not only that they provide good basic QM training for all PGs, but also that they will have an adequate range of more advanced provision ready as the 'pipeline' delivers the first cohorts of prospective PGs graduating from the Q-Step centres. These graduates will have a much broader and deeper range of QM skills.
The size of this cohort will grow over time because of Q-Step and efforts by other universities. A useful but simple metric or target for DTCs would be the proportion of doctoral students using the existing UK or international data infrastructure in their research.

5 Nurturing the QM teaching community

It is important that the vibrant QM teaching community which has been built up over the last few years is nurtured and maintained. One way of doing this would be to continue to hold a regular workshop, seminar or conference which brings its members together and provides opportunities to network, exchange ideas and keep abreast of new developments. While I will continue to maintain the mailing list, it might be helpful to develop a newsletter, perhaps on a quarterly basis, for example highlighting new resources or developments in QM teaching. This may be more effective and attractive to teachers than the heterogeneous and irregular emails that they currently receive as members of the list, although the list has proved very popular. Finally, the quantitativemethods.ac.uk site will be most useful if it is regularly refreshed rather than allowed to decay, as tends to happen to many such sites as people move on or are distracted by other priorities.

6 Staff training

As the momentum of the Q-Step programme and other initiatives develops, there will hopefully be an increasing demand from staff without QM expertise to take up CPD or other forms of training. Fortunately many of the resources produced for undergraduates are also eminently suitable for staff new to QM. It would be useful to have a link on the ESRC site directing such staff towards training opportunities and resources.

7 Exploiting the data infrastructure

It is often observed that the UK has an outstanding, world-leading data infrastructure. Yet it is still also the case that this infrastructure is underused by the UK social science community, whether undergraduates, postgraduates or HE staff. The success of the Secondary Data Analysis Initiative has been very welcome. ESRC should consider ways in which applications for research grant funding might be required to state what elements of the data infrastructure the research will use, or alternatively why the infrastructure is not relevant to the research proposed. ESRC should also ensure that relevant parts of the infrastructure are made as accessible as possible to students and others. For example, online analysis using NESSTAR makes much of the wealth of the data infrastructure immediately accessible to students and others with only basic QM skills. However one of the ESRC's flagship projects, Understanding Society, does not yet have this capability. Providing it should be a high priority.

8 BA HLSG

The HLSG has become an active and effective forum which is currently undertaking a 'State of the Nation' survey of the supply of and demand for data skills, and will release a 'Data Skills Manifesto' and report based on this research just after the next election, to follow up the impact of 'Society Counts'. It is important that the committee structure within ESRC is kept aware of this channel of communication with all the key stakeholders in the QM area.

9 Employers

One of the problems encountered by all those working in this area has been to find a suitable interlocutor who can speak in an informed and authoritative way for employers.
This is despite the range of indirect or more informal evidence that data skills are highly valued by employers, and reports predicting that such skills will be in greater demand in the future. This may be an issue that it would be useful to raise at ESRC Council.

10 Big Data

The current level of interest in popular discussion of big data provides an opportunity for ESRC to make the case for the relevance of the kind of data skills that social science QM training can provide. The importance, potential, impact and novelty of 'Big Data' may all be exaggerated, but insofar as it captures the imagination of funders, policy makers or stakeholders it provides a strategic opportunity to make the case for the importance and relevance of quantitative methods. In particular it is vital that the message gets across that no matter how 'big' or novel the data, statistical analysis is still essential in order to make any sense of it.

11 Beyond the social sciences?

At the outset of the programme the decision was made to exclude the disciplines of experimental Psychology and Economics. In Psychology the requirements for accreditation of Psychology degrees and registration with the British Psychological Society mean that all students receive training in statistics and experimental method. In Economics, almost all students are trained in econometrics. Students in these two subjects did not face the dearth of training opportunities typical of other social sciences.

My work as strategic advisor brought me into contact with both arts and humanities, and STEM subjects. In the former, the arrival of new forms of data, such as digitised corpora of texts or historical records, has made quantitative methods relevant to new disciplines. I helped a small project, 'NASH', which ran a series of seminars on basic statistical techniques for historians, coordinated by Paul Atkinson (Lancaster). In the STEM subjects, with the possible exception of Physics, I discovered that the issues we have been dealing with in the social sciences are also a concern there: despite initiatives such as Sigma, there is concern both about the maths knowledge and capacity that students arrive with (not all STEM recruits have a good pass in A-Level maths) and about the extent and quality of QM teaching that they receive. This situation appears to be of most concern to Biology and Chemistry, both subjects where new methods make numerical and statistical analysis of data more important, but it is also relevant in Medicine, Engineering and elsewhere. I addressed a workshop for the Royal Society of Chemistry's 'Dial a Molecule' project, which had taken the 2009 Strategic Advisor's report as its starting point and whose members were keen to learn from and emulate the approach that ESRC had adopted. Chemists were concerned about the lack of attention to handling data using statistical methods in undergraduate chemists' lab training.

It would appear that, on the one hand, data analysis skills are becoming relevant to a whole range of disciplines in a new way, from medicine and public health to biology, chemistry, the humanities and elsewhere. On the other hand, many of these disciplines have discovered that they face the same problems of students' level of maths skills at entry, and similar problems of staff skills and motivation, as we in the social sciences imagined to be uniquely ours.
Part of the problem here may have been the hollowing out of university Statistics departments over the last several decades, driven by the decline of service teaching and the negative effect of the REF on cross- and inter-disciplinary work. The issues in quantitative methods that we have been addressing in the social sciences appear to be faced, in varying degrees, by the natural sciences too. It is in many ways remarkable that most academics responsible for the application of statistics in research, or for the teaching of undergraduate and postgraduate students, will themselves have had little or no formal statistical training.

This provides an opportunity for ESRC, which is the clear leader in this area, to share expertise and work with other research councils and stakeholders to consider the state of methods training, and of statistics training in particular, across the HE sector regardless of discipline. Perhaps a good first step would be to organise a stakeholder seminar on the issue of data skills and statistics training for UGs and PGs, under the auspices of RCUK, and with representation from HEFCE, the Royal Society, the British Academy, RSS and learned societies. I suspect that the issue of graduate statistical literacy will become increasingly important over the next several years, and that the substantial experience ESRC has built up here offers it a unique opportunity to take a lead in the sector.

12 The SA role in the future: an advisory group?

In many ways the strength of the strategic advisor role was the flexibility with which the SA could pursue issues and activities as the need arose. It would be desirable to continue this capacity, but it may also be preferable to have a change in personnel: a different strategic advisor might well bring fresh insights and energy to the challenge. With the CI/RDI programme complete, the Q-Step centres established and the HLSG effective and active, the volume of work for the strategic advisor role may be less in the future than it has been over the last several years. An alternative, which could either replace or complement the strategic advisor role, would be to set up an advisory group, on whose advice and experience ESRC could draw as needed and which could be relied upon to alert ESRC to relevant issues and developments. The advantage of any such group is that it can draw on a range of experience and expertise, but the disadvantage is of course that responsibility shared can also be responsibility avoided, especially in the context of fiercely competing priorities for individuals' time. Such an advisory group might report to the MIC or TSC or both, and be responsible for any workshops/conferences, newsletter and website.

Acknowledgements

The achievements detailed in this report owe much more to the effort, experience, enthusiasm and commitment of the many colleagues who worked closely with me than to anything I could have done on my own. At ESRC, Rachel Tyrrell, Sarah Werts and Claire Feary gave invaluable guidance and advice. Chief Executives Sir Ian Diamond and Paul Boyle were generous with their time and support for what was a new and potentially risky direction for a research council. Andrew Tomei, Sharon Witherspoon, Sarah Lock and Debbie O'Halloran at the Nuffield Foundation and Linda Allebon at HEFCE worked hard to transform Q-Step from a modest idea to a major programme.
Hetan Shah, Roeland Beertan, Neil Sheldon and John Pullinger at RSS, Anandini Yoganathan at the British Academy, David Walker (Getstats and HLSG) and John Craig (HEA) have also been generous with their time, sage advice and encouragement. Ms Plamena Panayotova has been an invaluable help with historical information about the link between statistics and UK social science. Finally, I have never ceased to be heartened and encouraged by the commitment of dozens of university social science QM teachers who share my vision of the great benefits to students, UK social science and wider society of making QM an engaging and exciting subject to master.

Appendix 1 BA qualification scoping consultation document

Scoping a national qualification in quantitative skills in social science

1 Background: the generic deficit in quantitative skills in UK social science

1.1 Several reports over the last decade have drawn attention to the development of a 'generic deficit' in skills in quantitative methods (QM) in UK Higher Education in the social sciences (excluding economics and psychology) and related humanities disciplines (history, economic history, human geography, journalism and media studies, empirical studies in law), as well as concern about the level of maths skills, and capacity to apply them, of students entering Higher Education. This deficit has become more worrying as the digital economy, technological innovation and the 'data deluge' bring new fields of study within the orbit of quantitative analysis, while advances in computing power make new forms of analysis feasible. Skills which have hitherto been mostly used in STEM disciplines have become important for a range of non-STEM subjects. One result of this is that few undergraduates approach postgraduate study with QM in mind, Masters and doctoral level teaching effort is spent on developing basic skills, and the marginalisation of QM is reproduced in the next cohort of university staff. International Benchmarking Reviews in Sociology and Politics have drawn attention to the comparative weakness of the UK in QM. Employers looking for quantitative skills from graduates do not expect a social science degree to provide them, unless it is in economics, and expect to train students in these skills themselves, even those with M-level degrees. Conversely, even basic quantitative skills (e.g. in the use of SPSS) give graduates a substantial employability advantage. The impact of the 'generic deficit' will become greater as continued technological innovation and the rise of the digital economy and society make quantitative and 'STEM' skills relevant to a continually expanding range of disciplines because of new forms of data generation and capture.

1.2 The report of the Strategic Advisor to the ESRC on undergraduate QM teaching (MacInnes 2010) estimated that the proportion of university academic staff in these social science subjects with sufficient expertise in QM to teach basic skills was very low (probably around 10-15%, but possibly much lower than that) and that interest in QM had become marginalised in many departments, with a worrying gulf between a small minority of staff usually highly proficient in these techniques and other staff with few or no skills. At undergraduate level, QM tended to be taught in specialist methods options that accounted for a small proportion (around 5% or less) of final degree credits, and was rarely integrated into the substantive components of degree courses, giving students the impression that methodology and the assessment of empirical evidence were of marginal significance to their degree. QM teaching was also seen as demanding, in part because of the greater amount of time taken in preparation and assessment, but also because of the low level of basic maths ability of the majority of students. Across the social sciences and humanities, fewer than one in five students will have studied maths to A-level or equivalent standard, and in many subjects the proportion is less than one in ten.
1.3 The Report made a number of recommendations which ESRC and others have taken up, including:

- With resources from the Funding Councils and British Academy, 20 projects in Curriculum Innovation and Researcher Development have been commissioned (total £1.7m) to promote improved undergraduate teaching and train faculty in QM teaching.
- The Nuffield Foundation and ESRC will fund several centres of excellence for undergraduate QM teaching in the social sciences. These centres will provide a broader range of course options for students than those typically currently available, prioritise secondary data analysis and offer students the opportunity to study degree programmes focused on quantitative methods. They will also attract students with Maths proficiency who may not wish to study traditional STEM subjects.
- The British Academy is establishing a High Level Strategy Group to bring stakeholders concerned with quantitative skills together to lobby government, raise awareness of the issue and coordinate action.
- A range of support resources for QM teaching have been produced, including an email and discussion list for QM teachers; workshops for QM teaching staff; careers booklets showcasing the labour market advantages of QM; and a web portal providing links to QM teaching resources.
- A key metric used in recognising ESRC Doctoral Training Centres was their capacity to deliver a high standard of QM and advanced QM provision. The QM Strategic Advisor is working with the National Centre for Research Methods to ensure that all social science postgraduate students develop a basic range of QM skills and also have the opportunity to develop advanced skills, e.g. through training bursaries to attend courses in other DTCs.
- The British Academy has drafted a Position Statement on quantitative skills, and with ESRC is leading discussions with the relevant academic professional associations (including BSA, PSA, SPA, SRA, RGS, SWiE) to raise awareness of the issue and build support for the revision of the existing QAA benchmarks for QM.

2 A National Qualification in Quantitative Methods

2.1 Although this is a significant programme of work, the scale and ingrained character of the problem requires a coordinated approach. The introduction of a national system of recognition of levels of achievement by graduates in quantitative skills would complement the measures described above and underpin further progress. Such a system would signal to employers (whose demand for QM skills is high) that a graduate had a solid command of these skills. Its existence would raise student awareness of the employability and career benefits of QM skills. For example, the QM content of courses could feature in KIS data. It would stimulate curriculum change and innovation by encouraging universities to introduce new course options or new or revised degree programmes whose learning outcomes aligned with these levels of achievement.
It would help attract school students with an aptitude for maths to the social as well as the natural sciences.

2.2 Such a system could operate in a number of different ways. One model would be a national examination that students could sit alongside or after their degree examinations (as happens in accountancy, financial economics, statistics or medicine). Another would be the recognition of university degree programmes with adequate training in QM (analogous to the way that, for example, the British Psychological Society accredits Psychology degree courses as preparation for graduate membership of the BPS). Such recognition would be based on the range, level and type of quantitative skills that students were expected to learn in the course of their degree, and the way these were assessed. Examinations, or the assessment of degree curricula against a set of standards, would need to be organised by an institution that was sufficiently prestigious to be readily recognised and trusted by the disciplines concerned, universities, employers and students. The process would need to be sustainable, with clearly identifiable resource streams. The system would work best if there is a clear sense of ownership of the process by those involved, so that those delivering the skills in universities see the qualifications and the assessment process that underpins them as meeting their needs as educators.

3 The curriculum content and core competencies

3.1 It may be useful to think of two sets of skills that need to be acquired in learning QM. The focus of quantitative research is usually some kind of systematic comparison, which has its roots in the experimental method, where the value of one variable is manipulated and its impact on another measured. While the interpretation of social survey results, for example, poses many of the interpretative questions raised within qualitative methods, it also requires understanding of the difference between experiment and observation, the nature of control, the role of selection effects and prior variables, and a good grasp of research design so that the most appropriate comparisons can be established. It requires an understanding of the nature of measurement error and the validity of constructs, such that students appreciate the limits of quantitative evidence as well as its strengths. It requires understanding of the principles of probability, random selection and inference. It also requires an understanding of the philosophy of social science concerning the status of empirical evidence and its relationship to the researcher. A quantitative approach therefore requires the ability to see the world in terms of its variability, captured through observing the distribution of relevant variables and how these may change over time.

3.2 The second set of skills concerns the more technical process of how comparisons are established, and what lessons it is legitimate to draw from them. This includes such issues as the nature of classification and measurement (validity and reliability); the identification of relevant variables; the selection of observations (sampling); procedures for measuring association between variables; procedures for identifying where such association is itself dependent on associations with other, prior variables; and the construction of 'models' that reduce the complexity and detail of the observed data to the essential story that it may reveal. This is where 'statistics' are used, in the sense of a corpus of proven logics of calculation that can reveal patterns in data, as well as the probability of such patterns being found in wider populations from which the data has been drawn.
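To give a concrete, deliberately simple illustration of this second set of skills, the sketch below uses simulated data and invented variable names (it is not drawn from any analysis in this report) to show how a two-way contingency table measures an association between two variables, and how a three-way table controlling for a prior variable can show that association to be spurious. It is a minimal sketch in Python using pandas; the same steps could be carried out in SPSS, Stata or R.

```python
# Illustrative sketch only: simulated data and invented variable names.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 5000

# A prior variable (here, whether a parent holds a degree) is constructed to drive both
# of the other variables, so any zero-order association between them is spurious.
parental_degree = rng.random(n) < 0.4
alevel_maths = rng.random(n) < np.where(parental_degree, 0.50, 0.15)
graduate_job = rng.random(n) < np.where(parental_degree, 0.70, 0.40)

df = pd.DataFrame({"parental_degree": parental_degree,
                   "alevel_maths": alevel_maths,
                   "graduate_job": graduate_job})

# Two-way contingency table (row proportions): an apparent association.
print(pd.crosstab(df["alevel_maths"], df["graduate_job"], normalize="index").round(2))

# Three-way table controlling for the prior variable: within each stratum the
# association largely disappears - the classic 'elaboration' of a spurious relationship.
print(pd.crosstab([df["parental_degree"], df["alevel_maths"]],
                  df["graduate_job"], normalize="index").round(2))
```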
3.3 These two skill sets are interdependent, because the first set can only be put into practice using the second set, while the relevance and purpose of the second set can only be appreciated within the context of the first. Moreover, particular technical skills, and the way they are used, vary across disciplines, in part because of the nature of the data that different disciplines collect. For example, international relations often works with small numbers of observations, because of its interest in the behaviour of states, and has little need to generalise to wider populations. Census or registration data available to demographers also makes consideration of sampling less necessary, but the techniques used to analyse tens of millions of observations are different to those suitable for a few dozen. Much social survey analysis turns on making inferences from samples to target populations. Studies of voting behaviour often deal with continuous variables (e.g. proportion of vote), as do studies of institutions (including economic enterprises, markets or entire countries). However sociological studies of individuals depend more on categorical variables (e.g. religion, occupation, educational qualifications), requiring different kinds of statistical analysis. Educational research, international comparative work and an increasing range of other work uses 'multi-level' comparisons, which take into account that observations (individuals, or even traits nested within individuals) are nested within larger units (classes, schools, countries). An expanding volume of longitudinal, time series, panel and repeated cross-sectional data requires understanding of the treatment of time as a variable, and recognition that repeated observations of the same unit of analysis are not independent of each other.

3.4 This makes it difficult to specify a common curriculum that would be suitable for every discipline, and this has been recognised in ESRC's support for embedding the learning of QM within a clear disciplinary context. Many studies have shown that students learn QM more effectively when they are convinced of its relevance. This is unlikely to be achieved if students only encounter quantitative evidence and analysis in specialist methods options in their degree programme. Moreover there is a healthy debate within disciplines about which quantitative methods are most appropriate (for example over the treatment of measurement error, weighting, non-response, the status of null hypotheses or the scope of assumptions about the linear nature of social structures or social change), as well as different traditions across disciplines in the adoption of particular techniques and terminology.

3.5 However there are substantial areas of common ground. It may be useful to distinguish three levels of knowledge/expertise.

3.6 Basic 'statistical literacy' skills that, arguably, every graduate should possess, regardless of their field of study, in order to cope with quantitative material in their personal and professional lives. These concern the ability to critically evaluate the use of quantitative evidence by others and to understand how to collect and interpret publicly available quantitative data.
Much of this material would be relevant to students in STEM disciplines, who might be proficient in much more advanced procedures used in their discipline, but not always able to transfer aspects of this knowledge to other contexts. A suggested list of these skills would be:

- The concept of a variable and its distribution;
- The concept of a proportion and its numerical and graphical expression;
- The concept of a rate, including rates of change;
- The concept of probability or risk, and the nature of randomness;
- Informal estimation and spurious accuracy;
- Summary descriptive statistics such as a mean, median or 'five number summaries';
- Graphical summaries of data and data visualisation;
- Conditional probability and Bayes theorem;
- The concepts of independence and association; correlation and its distinction from causation;
- Regression to the mean and its implications;
- Tabular data of the kind commonly found in reports: understanding how data may be standardised for purposes of comparison, discerning trends, observing associations, checking key items such as definitions of categories and sources of data;
- The concept of an experiment, and its similarity and difference to that of observation and control;
- Sources of measurement error in data, or a general appreciation of the way in which different kinds of data are constructed (rather than blind faith or blanket scepticism about 'numbers');
- The logic of random sampling and the importance of selection effects (but not how to go about making calculations, e.g. of the sample size needed to capture a given effect size);
- Common misuses of and mistakes in the presentation of statistics and quantitative evidence; common fallacies encountered in poor statistical reasoning (e.g. the ecological fallacy or the fallacy of affirming the consequent);
- An appreciation that many social regularities and patterns are visible only to quantitative analysis.

3.7 Skills that a graduate competent in quantitative methods in the social sciences should possess, and be able to apply independently using (an) appropriate software package(s). The details of these skills will vary more by discipline. For example, Geography students would be likely to develop skills in GIS that might be less relevant to a Psychology student. The development of these skills, the datasets used and the most common procedures will be more specific to each discipline, and we welcome comments from learned societies about this. Some of these skills are technical ones developed by learning to use a relevant software package such as SPSS, Stata or R. However skill in the use of such packages should always be seen as a means to the key end of good quantitative analysis based on a sound understanding of underlying principles, rather than a superficial ability to 'process' data. A sketch of one such workflow follows this list.

Research design: validity, reliability and control. Operationalisation of concepts, measurement error; randomisation, comparison, control and observation. The key role of prior variables and selection effects in social enquiry. Coping with social change and time. Theories of causation.

Data location, collection/construction and access. Survey designs and methods; sampling theory, sampling frames, stratification and clustering; cross-sectional, repeated cross-sectional, panel, cohort and longitudinal data; response rates and bias; measurement error. The census; major surveys (e.g. the LFS, ESS, US, BSAS); administrative data; transactional and social media data. The data archive; the question bank; other sources of national and international data. Data security, anonymisation, confidentiality and disclosure risk; data protection; legal and ethical obligations. Data management.

Preparing and manipulating data. Data management and curation; using survey documentation to identify, locate and interpret variables correctly; understanding weights; recoding variables; creating new variables; dealing with missing observations; dealing with variables created from multiple response questions; flat file representation of hierarchical data (individuals and households, merging files).

Exploring, analysing and presenting data. Levels of measurement, variable distributions, associations between variables, controlling for prior variables; data exploration and description; theory testing and elaboration, hypothesis formulation and testing, inference from samples to populations, confidence intervals, significance, effect size and power; the concept of a model and residuals from it; N-way contingency tables, comparison of means, correlation coefficients, analysis of variance; linear and logistic regression, including model fitting and analysis of residuals; graphical representations of data (histograms, charts, box and scatterplots). Good practice in the tabular and graphic presentation of data.
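As a hedged illustration of the level of skill envisaged at this stage, the sketch below works through a few of the steps listed above - recoding a variable, dealing explicitly with missing observations, applying a design weight to a descriptive estimate, producing a contingency table and fitting a simple logistic regression - using Python (pandas and statsmodels) on a simulated dataset with invented variable names (age, sex, income, weight, voted). A real analysis would start from documented survey microdata and its supplied weights; the same workflow could equally be carried out in SPSS, Stata or R.

```python
# Illustrative sketch only: a simulated 'survey extract' with invented variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "sex": rng.choice(["male", "female"], n),
    "income": np.where(rng.random(n) < 0.1, np.nan, rng.normal(28000, 9000, n)),  # ~10% missing
    "weight": rng.uniform(0.5, 2.0, n),   # stand-in for a survey design weight
    "voted": rng.random(n) < 0.65,
})

# Recode a continuous variable into analytically useful categories.
df["age_group"] = pd.cut(df["age"], bins=[17, 29, 49, 64, 120],
                         labels=["18-29", "30-49", "50-64", "65+"])

# Deal explicitly with missing observations rather than letting them vanish silently.
complete = df.dropna(subset=["income"])

# A weighted proportion: estimated turnout using the design weight.
weighted_turnout = np.average(complete["voted"], weights=complete["weight"])
print(f"Weighted turnout estimate: {weighted_turnout:.1%}")

# A contingency table of turnout by age group (row proportions).
print(pd.crosstab(complete["age_group"], complete["voted"], normalize="index").round(2))

# A simple logistic regression of turnout on age group and (rescaled) income.
complete = complete.assign(voted_int=complete["voted"].astype(int),
                           income_10k=complete["income"] / 10000)
model = smf.logit("voted_int ~ C(age_group) + income_10k", data=complete).fit(disp=False)
print(model.summary())
```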
3.7 Skills that a graduate competent in quantitative methods in the social sciences should possess, and be able to apply independently using (an) appropriate software package(s). The details of these skills will vary more by discipline: Geography students, for example, would be likely to develop skills in GIS that might be less relevant to a Psychology student. The development of these skills, the datasets used and the most common procedures will be more specific to each discipline, and we welcome comments from learned societies about this. Some of these skills are technical ones developed by learning to use a relevant software package such as SPSS, Stata or R. However, skill in the use of such packages should always be seen as a means to the key end of good quantitative analysis based on a sound understanding of underlying principles, rather than a superficial ability to ‘process’ data.

Data location, collection/construction and access
Survey designs and methods; sampling theory, sampling frames, stratification and clustering; cross-sectional, repeated cross-sectional, panel, cohort and longitudinal data; response rates and bias; measurement error. The census; major surveys (e.g. the LFS, ESS, US, BSAS); administrative data; transactional and social media data. The data archive; the question bank; other sources of national and international data. Data security, anonymisation, confidentiality and disclosure risk; data protection; legal and ethical obligations. Data management.

Preparing and manipulating data
Data management and curation; using survey documentation to identify, locate and interpret variables correctly; understanding weights; recoding variables; creating new variables; dealing with missing observations; dealing with variables created from multiple response questions; flat file representation of hierarchical data (individuals and households, merging files).

Exploring, analysing and presenting data
Levels of measurement, variable distributions, associations between variables, controlling for prior variables; data exploration and description; theory testing and elaboration, hypothesis formulation and testing, inference from samples to populations, confidence intervals, significance, effect size and power; the concept of a model and residuals from it; N-way contingency tables, comparison of means, correlation coefficients, analysis of variance; linear and logistic regression, including model fitting and analysis of residuals; graphical representations of data (histograms, charts, box and scatterplots). Good practice in the tabular and graphic presentation of data.

Research design: validity, reliability and control
Operationalisation of concepts, measurement error; randomisation, comparison, control and observation. The key role of prior variables and selection effects in social enquiry. Coping with social change and time. Theories of causation.

3.8 Advanced competence in quantitative methods would come from greater experience in the use of the skills developed under 3.7, and a better capacity for good statistical judgement arising from this. This could include a project or dissertation using QM, or a placement with a research organisation using QM. However, it could also require skill in a range of more elaborate procedures, such as:

• Multilevel models and hierarchical data;
• Event history analysis;
• Factor analysis;
• The General Linear Model.
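To make the second skill set more concrete, the sketch below illustrates, in a handful of lines, the kind of workflow described under 3.7: preparing data (recoding a variable and dropping missing observations) and then fitting and inspecting a logistic regression. It is a hypothetical illustration only; the dataset is simulated, the variable names are invented, and pandas/statsmodels stand in for whichever package (SPSS, Stata, R) a course actually uses.

# Hypothetical illustration of some 3.7-type skills: data preparation,
# then fitting a logistic regression and inspecting the model.
# The data are simulated; pandas/statsmodels stand in for SPSS, Stata or R.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 80, size=n).astype(float),
    "educ": rng.choice(["none", "school", "degree"], size=n),
})

# Preparing and manipulating data: introduce some missing ages, recode
# education into a new binary variable, then drop incomplete cases
df.loc[rng.random(n) < 0.05, "age"] = np.nan
df["graduate"] = (df["educ"] == "degree").astype(int)
df = df.dropna(subset=["age"])

# Simulate a binary outcome (e.g. voted / did not vote) that depends on
# age and education, so the model has something to find
p = 1 / (1 + np.exp(-(-2.5 + 0.04 * df["age"] + 0.8 * df["graduate"])))
df["voted"] = rng.binomial(1, p)

# Exploring and analysing: a logistic regression of turnout on age and education
model = smf.logit("voted ~ age + graduate", data=df).fit()
print(model.summary())          # coefficients, standard errors, confidence intervals
print(model.resid_pearson[:5])  # residuals, as a first check on model fit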
4 Quantitative Methods and (Social) Statistics

4.1 Expertise in quantitative methods is not the same as statistical expertise, and graduates competent in quantitative methods would neither be, nor should they be seen to be, statisticians. It makes sense to restrict the term statistician to those who have completed a degree in statistics recognised by the RSS, or who have passed the relevant RSS examinations. Knowledge and expertise in statistics requires, for example, a deeper mathematical knowledge of the laws that build upon the axioms of probability theory, and of the consequences of these for the distribution of errors resulting from different kinds of measurement drawn from different populations.

4.2 However, competence in quantitative methods requires a sound understanding of several concepts drawn from basic statistics (as outlined in sections 3.6 to 3.8 above) and the ability to recognise when and how to apply them. It thus requires a competence in the application and use of statistics that comes from an understanding of these concepts in a social science context. Given the importance of statistics to quantitative skills, it would be desirable for the Royal Statistical Society to play a key role in the development and regulation of standards for any national quantitative methods qualification.

5 Delivery

5.1 There is a range of existing models of graduate level professional qualifications in such areas as accountancy, finance, market research and medicine from whose experience we can learn.

5.2 One model would be to identify individual university courses or whole degree programmes that delivered the skills described above with an appropriate syllabus, learning objectives, outcomes and assessment. This would require periodic review by some competent authority, but decisions about how to organise course content, configuration and assessment to meet the demands of review would continue to rest with individual HEIs. For example, the skills described in 3.6 might be met by a one or two semester ‘statistical literacy’ course. A set of agreed common standards about the range of skills covered in such courses, and the level of achievement to be expected from students passing them, would give employers or postgraduate admissions officers clear information about students’ quantitative skills. Given its expertise in this area, it may make sense for the RSS, through getstats, to establish what ought to be a core curriculum or set of key standards for such courses, as a set of guidelines for universities to follow. Alternatively, universities could approach getstats for endorsement of their courses. Such endorsement would add considerable value to the courses, so that it would be reasonable for universities to bear the cost of any such process. The range of skills described in 3.7 would be more likely to be developed through a number of courses within a degree programme, perhaps including dissertation, project or placement work. Students reading the same degree, but making different course choices, would graduate with quite distinct levels of quantitative skills. Again, the capacity to evaluate a degree curriculum against a set of common standards would give useful information to both employers and students.

5.3 An alternative model would be a national exam, or other form of assessment, administered independently of HEIs by the awarding body, following a syllabus which it would set. This happens, for example, with the Chartered Financial Analyst (CFA) qualification. Individual HEIs might choose to prepare their students for such assessment by orienting their curricula to this syllabus, but would continue to assess students for such work as part of the award of their own degree.

5.4 Some mixture of these two systems would also be possible. Were the first model to be introduced, the interests of students who were not in recognised HEIs would need to be considered, so another possible route could be a ‘stand-alone’ qualification available to such students. Teaching could be delivered remotely via the web, through classes organised along the lines of the Open University, or by self-study. Project or dissertation work would almost certainly need some kind of face-to-face contact and supervision. This might be possible through some kind of placement arrangement with social research organisations. Any such route might best be delivered by a university, or consortium of universities, on a distance learning basis. Any such arrangement would require students to be able to register for study simultaneously with two universities.

6 Assessment

6.1 Many of the skills in all three sets are of a kind that can be assessed by traditional examination or coursework exercises. This is especially true where the key skill comprises the ability to recognise the relevance of a technical procedure and knowledge of how to implement it. This would make either delivery model above appropriate.
6.2 However, there are some skills, especially in the second set, where assessment of a more complete and extended piece of work can better evaluate students’ ability to bring together a range of different skills to address a substantive problem and report the results, as well as providing a formative element to the assessment. Such work usually requires supervision, or some form of regular contact with students, to monitor progress and provide advice. Any system of distance learning would need to provide for this. National assessment of project or dissertation work would require significant examiner resources.

6.3 Universities are autonomous in setting the form of exams and other assessments, and in establishing and monitoring standards of performance, within the framework of QAA benchmarks and the external examiner system. It would be possible to leave all assessment for the qualification within this system, were there some institutional mechanism for reviewing course content and standards, in a way similar to that formerly adopted by ESRC for the recognition of PG training outlets. Rather than prescription of some variant of a ‘national curriculum’ from the top down, this would work best if individual universities continued to decide themselves how to teach and assess QM skills, but in the knowledge that provision meeting a set of criteria would bring some form of recognition that would be clear to both students and employers.

6.4 A national examination system could guarantee common standards (and would drive course content in individual HEIs) and is arguably more meritocratic, but would also require a more substantial investment by the awarding institution to provide for the setting, administration and marking of exams or other forms of assessment.

7 Levels of proficiency and student volume

7.1 It would be possible for most undergraduates, regardless of their subject of study, to become proficient in the first set of skills (3.6), and for many universities to organise courses to deliver them. The volume of demand for such courses might well be high.

7.2 The second set of skills (3.7) is more directly relevant to students of the social sciences (including human geography, education, empirical studies in law, criminology, economic and social history and aspects of linguistics). Currently, most graduates in Psychology and Economics would have developed these or a similar set of skills. A small proportion of graduates in other disciplines, based in departments with a particular commitment to a quantitative approach, may have developed some or possibly even all of them. They represent the range of skills that formed the core of ESRC’s requirements for Masters level training, and are the focus of the forthcoming Nuffield programme. Over time they could become a required standard for most social science students, especially if driven by QAA benchmark reform. Provision would need to be made for students who find they have no aptitude for or interest in quantitative work. It could make less sense to make quantitative proficiency a core requirement for, say, political science or sociology, in the same way that it must inevitably be for economics. However, the ability to specialise ought to be balanced against the need, especially for students proceeding to postgraduate study, to have enough knowledge of all branches of a discipline to be able to follow its literature.
7.3 It ought to be possible for a significant minority of undergraduates, many of whom might expect to proceed to postgraduate study, to achieve the third set of skills. Undergraduate social science degree courses in the Netherlands, Belgium, Germany, Switzerland and the United States tackle these subjects, as do undergraduate econometrics courses in the UK. These skills are taught at Masters and doctoral level in most DTCs. It ought to be possible to open up this teaching to suitably qualified and motivated undergraduates.

8 Awards and titles

8.1 Employers of graduates and admissions officers for Masters and doctoral training need clear signals, based on a readily understandable set of qualification titles, about the level of proficiency and range of skills developed.

8.2 Proficiency in the first set of skills (3.6) could be recognised by the award of a Diploma. This would sit alongside the degree obtained by the student. Possible titles for such a diploma could be ‘Data literacy’, ‘Statistical literacy’ or ‘Numerical and statistical literacy’. Alternatively, it might be felt that a diploma was unnecessary, confusing or conferred too much weight on a limited range of skills, so that sufficient recognition would come from adequate description of the level of achievement in a Higher Education Achievement Report (HEAR). Were the HEAR route to be adopted, it would need to be clear that the course underpinning the HEAR entry was one that satisfied or exceeded the standards considered here. Endorsement by getstats would be one possible mechanism.

8.3 Proficiency in the second set of skills (3.7) could be recognised via the award of a degree with any title, but recognised as reaching a given standard of competence in quantitative methods. The key factor would be the visibility and prestige of such recognition, which in turn would be a question of the prestige of the body maintaining standards and the experience of employers or admissions officers of the worth of graduates holding the qualification.

8.4 Students whose degrees included advanced study in the full range of these skills might qualify for degrees with the title ‘with quantitative methods’. However, it might be more appropriate to reserve this title for students reaching the kinds of skills listed in 3.8. Universities themselves, guided by the need to send the right signals to students and employers, would be best placed to decide this. The reduced number of graduates initially likely to develop proficiency in the third set of skills could make provision for a separate award title unwieldy. Against this, high visibility for this group of students would be important in recruiting students with good maths skills onto social science degrees, as well as in signalling their preparedness for postgraduate study to admissions officers.

9 Ownership, administration and funding

9.1 To be sustainable, these qualifications will need to be ones that students want and that employers and graduate admissions officers value. They would need to be overseen by an appropriate national institution, or consortium of institutions, with the visibility and reputation that would give employers confidence in them. The most relevant existing bodies would be the Royal Statistical Society (which already operates its own examinations), the Economic and Social Research Council (which already has responsibility for postgraduate QM training) and the British Academy (which brings together the social sciences and humanities).
Learned societies would certainly be important stakeholders in the qualification. Unlike the BPS, the British Sociological Association, the Political Studies Association and the Social Policy Association do not currently recognise degree level courses; however, they play a key role in guiding the QAA benchmarks which set the standards for learning outcomes in university degree programmes.

9.2 While the necessary start-up funds might be sought from these stakeholders and BIS, ongoing funding would need to come either from students (e.g. through registration), or from universities seeking recognition for their courses, or both. An income stream substantial enough to secure sustainability would depend on student demand for these qualifications, which in turn would be related to the extent of employer recognition of their worth.

9.3 With ESRC curriculum innovation projects and work by JISC and the HEA producing new curriculum materials, and Nuffield initiative funding coming on stream in early 2013, a growing cohort of students with the skills to take the qualifications discussed here would graduate from summer 2015 onwards. Numbers will build up over time, and depend on the final scale of funding for the Nuffield programme, as well as the speed with which other universities follow the lead established by the Nuffield centres. At present it appears likely that the Nuffield programme will fund at least a dozen centres of excellence. Thus, while there is sufficient time for extensive discussion of the best way forward, it would be desirable to have a system in place by the summer of 2015.

References: reports on QM skills in HE, and maths skills post-16

British International Studies Association, Political Studies Association & ESRC (2007) International Benchmarking Review of UK Politics and International Studies. Swindon: ESRC.

British Sociological Association, Heads and Professors of Sociology & ESRC (2009) International Benchmarking Review of UK Sociology. Swindon: ESRC.

Department for Education and Skills (2004) Making Mathematics Count: The Report of Professor Adrian Smith’s Inquiry into Post-14 Mathematics Education, February. www.tda.gov.uk/upload/resources/pdf/m/mathsinquiry_finalreport

Department for Education and Skills (2005) 14–19 Education and Skills, Government White Paper, February. www.dfes.gov.uk/publications/1419educationandskills/

ESRC (1987) Horizons and Opportunities in the Social Sciences. ESRC.

ESRC (2007) Report of the Workshop on Enhancing the UK Social Science Skills Base in Quantitative Methods: Developing Undergraduate Learning. ESRC.

HM Treasury (2002) SET for Success: The Supply of People with Science, Technology, Engineering and Mathematics Skills: The Report of Sir Gareth Roberts’ Review, April. www.hmtreasury.gov.uk/Documents/Enterprise_and_Productivity/Research_and_Enterprise/ent_res_roberts.cfm

Lynch, R., Maio, G., Moore, G., Moore, L., Orford, S., Robinson, A., Taylor, C. & Whitfield, K. (2007) ESRC/HEFCW Scoping Study into Quantitative Methods Capacity Building in Wales. Final report to the ESRC and HEFCW.

Marshall, G. (2001) ‘Addressing a problem of research capacity’, Social Sciences: News from the ESRC (47): 2.

McVie, S., Coxon, A., Hawkins, P., Palmer, J. & Rice, R. (2008) ESRC/SFC Scoping Study into Quantitative Methods Capacity Building in Scotland. Final Report.

Mills, D. et al. (2006) Demographic Review of the UK Social Sciences. ESRC.
Onwuegbuzie, A.J. and N.L. Wilson (2003) ‘Statistics Anxiety: Nature, Etiology, Antecedents, Effects and Treatments: A Comprehensive Review of the Literature’, Teaching in Higher Education 8(2): 195–209.

Parker, J. et al. (2008) International Benchmarking Review of Best Practice in the Provision of Undergraduate Teaching in Quantitative Methods in the Social Sciences. ESRC.

Payne, G., Williams, M. & Chamberlain, S. (2004) ‘Methodological pluralism in British sociology’, Sociology 38(1): 153–163.

Vorderman, C., Budd, C., Dunne, R., Hart, M. & Porkess, R. (2011) A World-Class Mathematics Education for All Our Young People. Available at www.conservatives.com/News/News_stories/2011/08/~/media/Files/Downloadable%20Files/Vorderman%20maths%20report.ashx

Hodgen, J. & Sturman, L. (2010) Is the UK an Outlier? An International Comparison of Upper Secondary Mathematics Education. Nuffield Foundation. Available at www.nuffieldfoundation.org/sites/default/files/files/Is%20the%20UK%20an%20Outlier_Nuffield%20Foundation_v_FINAL.pdf

Advisory Committee on Mathematics Education (2011) Mathematical Needs: Mathematics in the Workplace and in Higher Education. Available at www.acme-uk.org/media/7624/acme_theme_a_final%20(2).pdf

Economic and Social Research Council
Polaris House
North Star Avenue
Swindon SN2 1UJ
www.esrc.ac.uk
@ESRC
Tel: 01793 413000