An example of using data to close the loop (Debra Bryant)

CLOSING THE LOOP:
A PRACTICAL APPLICATION
By Dr. Debra Bryant (NWCCU and Business Department Accreditation Liaison)
http://www.Dixie.edu/academic/using_reporting_results.php#what
What does it mean to “Close the Loop”?
“Closing the Loop” is one of the most important stages in the assessment process. It is
the action stage.
Once a department has
(a) decided what it wants its students to learn,
(b) determined where and when assessment of that learning should take place,
(c) gathered samples of students' work, and
(d) analyzed the data,
faculty take the time to evaluate whether students actually learned what they were expected to learn, and use that information to improve teaching and learning.
Faculty collaboratively:
* discuss the assessment results,
* reach conclusions about their meaning,
* decide what changes are needed, if any,
* determine the implications of those changes, and
* follow through to implement the changes.
An example from the Business Department
(a) Decide what we want students to learn
Mission
The mission of the Udvar-Hazy School of Business is to prepare students for
successful employment, advanced learning and service to community. We
are committed to providing an environment that embraces experiential
learning, stimulates academic excellence and incorporates ethical
considerations.
Goals
1. Provide students with core business knowledge and skills that enable attainment
of advanced business degrees and success in a rapidly changing, competitive
business environment. (Core Theme One – A Culture of Learning)
UHSB Student Learning Outcomes
1. Students will demonstrate a working level knowledge of the core functional areas of
business:
A. Students will demonstrate a working level knowledge of core business functions in
accounting, economics, finance, information systems, international business, legal and
social environment, marketing, and management.
B. Students will analyze a complex business situation, identify relevant functional business issues, and suggest viable courses of action.
(b) Determine where & when assessment of learning should take place
Direct Measurement: Major Field Test in Business by ETS
When: During the Capstone course
*We neglected to consider the “who” or “by whom” and the exact “when”.
(c) Gather samples of students’ work
Conduct the ETS Major Field Test in Business during the semester in the Testing Center.
(d) Analyze the data
Fall 2011

Learning Outcome for All Business Majors: 1A. Students will demonstrate a working level knowledge of core business functions in accounting, economics, finance, information systems, international business, legal and social environment, marketing, and management.
Assessment Method: Direct: Major Field Test in Business by ETS
Benchmark: Not set
Timing/Placement: Annually; Capstone, MGMT 4800
Results (40 students, percentile ranks): Overall: 37; Accounting: 61; Economics: 36; Management: 28; Quantitative: 10; Finance: 54; Marketing: 28; Legal: 30; Info Sys: 38; International: 47
(d) Analyze the data
The human response to bad results?
* Shock!
* Stupid students?
* Bad teaching?
* Defensiveness
* A quandary about where to start?
QUESTIONS TO ASK & THINGS TO CONSIDER WHEN LOOKING AT ASSESSMENT RESULTS
[adapted from "Closing the Loop," Allen & Driscoll, 2013 WASC Assessment Leadership Academy]
* What do the data say about your students' mastery of subject matter, of research skills, or of writing and speaking?
* What do the data say about your students' preparation for taking the next step in their careers?
* Are there areas where your students are outstanding? Are they consistently weak in some respects?
* Are graduates of your program getting good jobs, being accepted into reputable graduate schools, and reporting satisfaction with their undergraduate education?
* Do you see indications in student performance that point to weakness in any particular skills, such as research, writing, or critical thinking?
MORE QUESTIONS TO ASK & THINGS TO CONSIDER
* Do you see areas where performance is okay, but not outstanding, and where you would like it to improve?
* Do the results live up to the expectations we set?
  o Are our students meeting internal and external standards? How do our students compare to peers?
  o Are our students doing as well as they can?
* Are our expectations (benchmarks) appropriate? Should expectations be changed?
* Does the curriculum adequately address the learning outcomes?
* Are our teaching & curricula improving?
* What are the most effective tools to assess student learning? Do they clearly correspond to our program learning outcomes? Do the learning outcomes need to be clarified or revised?
Possible issues to consider with assessment practices:
* Need revisions to the outcome. It isn't what our students really learn.
* Better evidence, e.g., a better assignment or writing prompt. Conclusions are of questionable validity.
* Better sample. It was too small or biased; the results are not generalizable.
* Better rubric. The criteria aren't reasonable for our program.
* Calibration. Questionable reliability? Compare with something standardized.
* Summative or formative evidence needed?
* More direct assessment, to actually see what our students can do.
* Collected too much evidence. Make assessment more manageable.
* Need to involve more faculty, including adjuncts.
* Collected only one line of evidence and don't have confidence in our conclusion.
* There are many ways to close the loop. What should we do?
When results suggest the need for change, consider improvements to the program in the following areas:
* Pedagogy: e.g., changing course assignments; providing better formative feedback to students; using more active learning strategies to motivate and engage students
* Curriculum: e.g., adding a second required speech course; designating writing-intensive courses; changing prerequisites; resequencing courses for scaffolded learning
* Student support: e.g., improving tutoring services; adding online, self-study materials; developing specialized support by library or writing center staff; improving advising
* Faculty support: e.g., providing a writing-across-the-curriculum workshop; campus support for TAs; professional development for improving pedagogy or curricular design
* Equipment/Supplies/Space: e.g., new or updated computers or software; improvements or expansions of laboratories
* Budgeting and planning: e.g., reallocating funds to support improvement plans based on assessment findings; budgeting for new resources (software, staff)
* Management practices: e.g., establishing new procedures to ensure assessment results are tracked and used for follow-up planning and budgeting
Motivation Ideas for the MAPP Test & Major Field Tests
• A culture of assessment at the institution can be an immense benefit in motivating students for these types of tests. Faculty enthusiasm is a tremendous influence on students' perception of the importance of the test.
• Send a letter to entering freshmen explaining the importance of the test and letting them know they will be tested as freshmen and again as seniors. This works best in combination with the first idea.
• Divide test takers into teams, with prizes and recognition awarded to the top teams as well as the top individual performers.
• With Major Field Tests, some departments award different levels of credit toward a grade in a capstone course in proportion to the score received. (One institution uses the MFT Comparative Data Guide's average student score: if a student beats that average, he or she receives a full 10% credit toward the grade; if lower, the student receives 5%. A minimal sketch of this rule follows the list.)
• $20 gift certificates to the bookstore.
• Free cap & gown rental.
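As an illustration only, here is that credit rule in code (a minimal Python sketch; the function name and the fractions-of-a-grade representation are assumptions for the example, not anything the institution published):

    def mft_grade_credit(student_score: float, mft_average: float) -> float:
        """Share of the capstone course grade awarded for the ETS Major
        Field Test, per the rule described above: the full 10% if the
        student beats the MFT Comparative Data Guide average, else 5%."""
        return 0.10 if student_score > mft_average else 0.05

    # Example with hypothetical scores: a student scoring 162 against a
    # published average of 156 earns the full 10% of the course grade.
    print(mft_grade_credit(162, 156))  # 0.1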
(e) Close the Loop – Take action
1. Change the assessment faculty member (to a non-adjunct).
2. Implement motivational activities:
   - Emphasize with the new faculty member the importance and value of the assessment
   - Emphasize to students the value of the assessment (resume, grad school …)
   - School pride
   - Post high scorers' names
   - Awards?
3. Set benchmarks (average at least the 50th percentile in each area; a simple check like the sketch below illustrates the idea).
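A benchmark check is simple arithmetic. Purely as an illustration (the department published no code; the dictionary below just restates the Fall 2011 percentile ranks from earlier), a minimal Python sketch that flags every area below the new 50th-percentile benchmark:

    # Flag subject areas whose Fall 2011 ETS MFT percentile ranks fall
    # below the department's new benchmark (at least the 50th percentile).
    BENCHMARK = 50

    fall_2011 = {
        "Overall": 37, "Accounting": 61, "Economics": 36, "Management": 28,
        "Quantitative": 10, "Finance": 54, "Marketing": 28, "Legal": 30,
        "Info Sys": 38, "International": 47,
    }

    # Report the shortfalls, worst first.
    below = {area: p for area, p in fall_2011.items() if p < BENCHMARK}
    for area, p in sorted(below.items(), key=lambda kv: kv[1]):
        print(f"{area}: {p}th percentile (below the {BENCHMARK}th)")

Running it lists eight of the ten areas, from Quantitative (10th percentile) up to International (47th), which is exactly the picture the actions above were meant to address.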
(d) Analyze the data
Spring 2012

LO 1A Assessment Results: Percentile rank on the ETS Business Major Field Test: Overall=88; Acct=87; Econ=66; Mgmt=78; Quant=83; Fin=93; Mktg=84; Legal/Social=86; IS=88; International=82. Much higher scores than for Fall 2011. The two lowest areas were the 66th percentile in Economics and the 78th percentile in Management.

Action Taken/Closing the Loop:
1. Embed a grade for the ETS exam and create wall-of-recognition plaques for each semester cohort's percentile ranking.
2. Raise the benchmark in all areas to above the 75th percentile.
3. Place additional focus on & reinforcement of economics and management concept building.
4. All core areas of study will gather a list of foundational concepts to be shared with all faculty.
(d) Analyze the data
Fall 2012 & Spring 2013

Assessment Results:
Fall 2012 percentile rank on the ETS Business Major Field Test: Overall=94; Acct=87; Econ=96; Mgmt=96; Quant=96; Fin=88; Mktg=69; Legal/Social=99; IS=81; International=82. Note: Mktg below the 75th percentile.
Spring 2013 percentile rank on the ETS Business Major Field Test: Overall=94; Acct=92; Econ=97; Mgmt=86; Quant=90; Fin=85; Mktg=79; Legal/Social=97; IS=88; International=88. Note: Mktg still lowest, but above the 75th percentile.

Action Taken/Closing the Loop:
1. Move the ETS test to week 12 of the semester so that students are through most of the required coursework.
2. Revise prep information from faculty.
(d) Analyze the data
Fall 2013 & Spring 2014: Just when you think you have it right…

Assessment Results:
Fall 2013 percentile rank on the new ETS Business Major Field Test: Overall=91; Acct=84; Econ=72; Mgmt=66; Quant=77; Fin=71; Mktg=95; Legal/Social=98; IS=88; International=12.
Spring 2014 percentile rank on the new ETS Business Major Field Test: Overall=94; Acct=84; Econ=90; Mgmt=72; Quant=59; Fin=92; Mktg=84; Legal/Social=99; IS=94; International=21.

There are significant differences in these results: they are lower for all subject areas except Legal/Social & IS. International Issues is of greatest concern, but Economics, Finance, Mgmt & Quantitative Analysis also show low percentiles.
Note: The newly added process of across-discipline faculty involvement in test preparation was encouraging to students.
(d) Analyze the data
Fall 2013
Action Recommended (the new Business MFT includes significant test revision):
1. Determine the significance of the new test results. Request a copy of the new revised test sample.
2. Obtain AACSB SLOs for the subjects.
3. Each subject area should consider the significance of its lower percentile ranking. International results are of particular concern.
4. Re-evaluate International Business course content. Research AACSB SLOs for this area.
5. Assess results according to students' majors. Quantitative Analysis results are of particular concern for Bus. Admin. majors.
(d) Analyze the data – Trend it
Lesson: Don't panic! Apply the right questions and considerations.

ETS Business MFT percentile ranks by administration (the original table highlighted scores above 75 and below 50):

                   Fall   Spring  Fall   Spring  Fall 2013   Spring  Summer
                   2011   2012    2012   2013    (new test)  2014    2014
# Students          40     29      40     37       33          54      19
Overall             37     88      94     94       91          94      99
Accounting          61     87      87     92       84          84      98
Economics           36     66      96     97       72          90      99
Management          28     78      96     86       66          72      95
Quantitative        10     83      96     90       77          59      99
Finance             54     93      88     85       71          92      98
Marketing           28     84      69     79       95          84      94
Legal               30     86      99     97       98          99      97
Information Sys     38     88      81     88       88          94      99
International       47     82      82     88       12          21      74

Note: The test was revised as of Fall 2013.
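The "trend it" step can also be automated. Purely as an illustrative sketch (plain Python; data copied from the table above, with only three areas entered for brevity; the 20-point threshold is an assumption for the example), this flags any sharp semester-to-semester drop, such as International's fall from 88 to 12 when the revised test appeared:

    # Flag large drops in percentile rank between consecutive ETS MFT
    # administrations; such drops often signal a test revision or a
    # curriculum gap rather than a sudden change in student ability.
    SEMESTERS = ["Fall 2011", "Spring 2012", "Fall 2012", "Spring 2013",
                 "Fall 2013", "Spring 2014", "Summer 2014"]

    TREND = {  # percentile ranks in semester order, from the table above
        "Economics":     [36, 66, 96, 97, 72, 90, 99],
        "Marketing":     [28, 84, 69, 79, 95, 84, 94],
        "International": [47, 82, 82, 88, 12, 21, 74],
    }

    DROP_ALERT = 20  # flag drops of 20 percentile points or more

    for area, ranks in TREND.items():
        # Compare each administration with the one before it.
        for prev, cur, sem in zip(ranks, ranks[1:], SEMESTERS[1:]):
            if prev - cur >= DROP_ALERT:
                print(f"{area}: {prev} -> {cur} in {sem}; apply the "
                      f"questions and considerations above before acting")

Run against the full table, this threshold flags only the Fall 2013 drops, which coincide with the test revision. That is the point of the lesson: investigate the instrument before blaming students or teaching.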
Don't let assessment results dictate decisions. Assessment results should, however, inform faculty as they use professional judgment to make suitable decisions. Assessment results are meant to inform planning and influence decision making; therefore, reporting results to the various stakeholders (e.g., students, administration, accrediting agencies, alumni) is an integral part of "closing the loop."