Assessment Mini Grant Award Summary Report

Krannert School of Management
Prepared by Charlene Sullivan and Jackie Rees Ulmer
May 31, 2013
The Krannert School of Management conducted a pilot administration of the ETS Major Field Test
(MFT) in Business Administration with 41 senior undergraduate students. The MFT is a 120-question standardized exam
designed to assess the Business Administration knowledge of senior undergraduate students in the
United States and Canada. The questions represent areas of content that are typically taught in
undergraduate business school required coursework, including Accounting, Economics, Finance, General
Management (including Strategic Management, Organizational Behavior, and Human Resources),
Quantitative Business Analysis, Marketing, Legal and Social Environments, Information Systems, and
International Issues.
As the School of Management does not currently have a required capstone course in the undergraduate
curriculum, we wanted to see how a sample of our graduating seniors would perform across content
areas. We also wanted to see how our students would perform relative to students at other
institutions. Finally, we wanted to try administering the exam on a small scale, to see what issues arose
and to gain insight into the feasibility of administering the exam (or a similar assessment) to either a
much larger sample or possibly to every graduating student.
ETS offers the MFT in Business Administration in both a paper-based format and an online format. We
chose the online format because it was cheaper ($25 per online test vs. $27 per paper-based exam)
and would provide results to both the administration and the student volunteers immediately.
Additionally, students are becoming quite accustomed to taking standardized tests online, such as the GRE and
GMAT, so the online version of the test was seen as more attractive to the students, and possibly easier
to administer in a computer lab.
After two failed attempts at recruiting enough student volunteers, we were able to work with faculty in
a senior-level required course (MGMT 45100 Strategic Management) to recruit enough volunteers for
the test. We were very pleased with the turnout, with 41 volunteers showing up at the designated time
and place. Volunteers were entered into a drawing to win up to $100 in compensation, with a minimum
guaranteed compensation of $5. Students also received nominal extra credit for their participation.
We found that it was very easy to procure and administer the tests to students. We used Krannert Lab 1,
a Krannert Computing Center (KCC) facility seating up to 45 students, to administer the exam. Because of
how the workstations in that lab are imaged and managed, the KCC staff had to do some configuration
work to allow the secured testing window to run. However, they were able to do this on fairly short
notice, and we had no technical problems during the administration of the exam.
When compared with all test takers from 413 institutions in the US and Canada between September 2010
and May 2013, our volunteers performed well. We were also pleased with how the students performed
within each subject, as shown in Table 1. We were then able to run a comparative report, comparing our
results against those of 11 schools identified as somewhat similar to Purdue; these 11 schools are shown
in Table 2. Our 41 volunteers scored close to the overall mean of the comparative group (157 vs. 157.5).
Table 1: Krannert Student Volunteer Results

Assessment Indicator Number | Assessment Indicator Title     | Krannert Mean Percent Correct | Mean Percent Correct of All Exam Takers
1                           | Accounting                     | 52                            | 43.9
2                           | Economics                      | 49                            | 44.3
3                           | Management                     | 58                            | 57.2
4                           | Quantitative Business Analysis | 53                            | 40.6
5                           | Finance                        | 56                            | 42.8
6                           | Marketing                      | 55                            | 55.2
7                           | Legal and Social Environment   | 57                            | 55.6
8                           | Information Systems            | 50                            | 48.4
9                           | International Issues           | 54                            | 52.8

Table 2: Institutions in Comparable Set

School Name                               | Number of Students
Ball State University, IN                 | 555
Boston College, MA                        | 263
Florida State University, FL              | 33
Georgia State University, GA              | 3,169
Northeastern University, MA               | 337
Oklahoma State University, OK             | 405
Tulane University, LA                     | 337
University of Florida, FL                 | 1,511
University of Nebraska – Lincoln, NE      | 1,513
University of Oklahoma, OK                | 200
Washington State University – Pullman, WA | 567
On the subject assessments, our little group did slightly better in accounting (52 vs. 51.5), very slightly
worse in economics (49 vs. 50.2), worse in management (58 vs. 61.6), far better in quantitative business
analysis (53 vs. 43.9), better in finance (56 vs. 53.1), worse in marketing (55 vs. 61.3), very slightly worse
in legal and social environment (57 vs. 57.2), worse in information systems (50 vs. 54.5), and worse in
international issues (54 vs. 58.5).
While our student volunteers comprise a very small, self-selected sample, and therefore we cannot
draw strong inferences about the quality of our programs or of our students from this experiment, we
have generated a number of action items based on these results. The first is that we will have
representative faculty review the test content related to their subject matter, to see whether the
questions on the test are representative of how our courses are taught on the West Lafayette campus.
It could be that we take a different approach to the subject matter and that our approach is entirely
appropriate given our mission. Our internship and full-time placement rates, another metric of student
achievement, are very high, particularly in information systems, so it could be that there is a mismatch
between our curriculum and the test, and that we should not make any changes based on these results.
The international issues subarea is the one likely to receive our greatest attention. We have found
through our Assurance of Learning assessments that we could be broader and deeper in how we
incorporate international issues into our required courses. Our upcoming EQUIS accreditation review and
the implementation of the embedded outcomes of the University’s Outcome-based Core Curriculum
also indicate that we need to work with instructors to identify more opportunities to engage our
undergraduates on this outcome.
This experiment will also seed discussions of curriculum change, in terms of whether to add a capstone
course to our undergraduate majors and/or whether a required senior assessment would be
appropriate. A senior capstone course would be attractive to recruiters and students; however, it would
require significant instructor resources in an already highly constrained environment. A required
assessment exam for graduating seniors would also have resource and logistical requirements. We
would likely have to switch back to the paper-based format due to the limited availability of
workstations on campus, and there is a substantial cost to administering the test to every graduating
senior ($27 per test this year). A question that has come up at assessment seminars run by the AACSB is
how to get students to take the test seriously, even if it is required for graduation. To achieve this, we
would likely have to build a 1-credit course around the test, which adds to student expense and burden,
of which we are highly cognizant. Even a 1-credit course, where performance on the test would be
somehow tied to the course grade, may not be enough to incentivize students to perform well, given
the nominal impact on GPA at that point in the academic career. Finally, the test itself is problematic.
Adopting the ETS (or any other non-Krannert) instrument could result in the curriculum being “steered”
toward performance on a third-party test, or “teaching to the test,” which would cause heated
discussion and has a very high likelihood of non-acceptance by the Krannert faculty. Building our own
assessment test would be an option, but it would be time-consuming and would present the same
logistical and student-incentive issues discussed above.