
Ashley Finley
Senior Director of Assessment and Research, AAC&U
(Finley@aacu.org)

Bonnie Orcutt
Director of Learning Outcomes Assessment, Academic Policy, Massachusetts Department of Higher Education
(Borcutt@bhe.mass.edu)

Julie Carnahan
Senior Associate, SHEEO; MSC Project Director
(Jcarnahan@sheeo.org)

Terrel Rhodes
Vice President for the Office of Quality, Curriculum and Assessment, AAC&U; VALUE Project Director
(Rhodes@aacu.org)

Participants
Overview of Pilot Study versus Model
Purpose and impacts
Timeline
IRB and Student Consent
Faculty and Campus Responsibilities
Overview of pilot study communication structure
Outcomes, Assignment Parameters, and the VALUE Rubric
Faculty and Institution Benefits from Pilot Study Participation
Sampling
◦ Artifact Collection and De-Identification
Data Management System for the Pilot Study
Scoring, Results Aggregation, and Dissemination of Results
MSC Participants: CT, IN, KY, MA, MO, MN, OR, RI, and UT

Steering Committee: State point persons from each state, with representatives from SHEEO & AAC&U

Institution Point Persons: An institution point person from each campus in each state

Pilot Study Team: Representatives from states, SHEEO, & AAC&U





The MSC Model:
Multiple measures of learning outcomes – direct and indirect
Demonstration of learning across multiple modalities – projects, presentations, papers, performances, and others – assessed using the AAC&U VALUE rubrics
Assessment of learning at multiple points along students’ academic path
A representative sample of states, institutions, and students, in order to create useful benchmarks of learning to guide and inform campus-level work and public accountability
An assessment management system (i.e., database) enabling electronic uploading of campus-level data, online access to student work for scoring, analysis of state data, and generation of reports, while assuring confidentiality & security of data

The Pilot Study:
Two outcomes: written communication and quantitative literacy (optional: critical thinking)
Written work only
Assessment of learning for students who have completed 75% of the total number of credits required to graduate as of the start of the fall semester
9 states and approximately 70 public institutions, including both 2- and 4-year campuses
An assessment management system (i.e., database) – Taskstream – has been chosen for the pilot study
The Pilot Study is Process-Oriented
Focus:
 Proof of Concept:
◦ Ability (1) to produce useful assessment data for campus use, (2) to measure student learning using student artifacts as assessed with the LEAP VALUE rubrics, and (3) to organize aggregated data for comparison at the segment level and for presentation

 Proof of Feasibility:
◦ Ability (1) to scale up, (2) to sustain campus effort, and (3) to develop reliable, cost-effective, and sustainable scoring protocols

 Proof of Reliability:
◦ Analyze inter-rater reliability within a given level (multi-state, state, and/or local scoring) and between these levels (a reliability sketch follows below)
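To make the reliability analysis concrete, below is a minimal sketch of one standard chance-corrected agreement statistic, Cohen's kappa, applied to two scorers rating the same artifacts on the 0-4 VALUE rubric levels. The scores are hypothetical, and the pilot's actual reliability protocol may differ.

```python
# Illustrative sketch only: Cohen's kappa for two raters scoring the same
# artifacts on VALUE rubric levels (0-4). All scores below are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same artifacts."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance, from each rater's marginal score distribution
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores on ten artifacts
local_scorer = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
state_scorer = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(f"kappa = {cohens_kappa(local_scorer, state_scorer):.2f}")  # kappa = 0.71
```

The same function can be applied within a scoring level (two local scorers) or between levels (a local scorer versus a multi-state scorer), matching the comparisons named above.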
What is the intended long-term effect of the work?

Establish proof of process: that direct assessment of student work using the AAC&U VALUE rubrics can be coordinated and meaningfully analyzed to inform efforts to improve student learning across and within states and at the institutional level.

Establish proof of reliability with respect to centralized and decentralized scoring of student work.

Establish support for the validity of the assessment measure through random analysis of assignment instructions and student work, and through examination of trends at the multi-state, state, and campus levels.

Identify areas where additional professional development is needed and inform the focus of those activities.

Generate conversations and/or changes on campuses regarding assignment design and/or curriculum.
Phase I:
 Spring/Summer 2014
◦ Selection of pilot campuses (completed)
◦ Development of sampling protocols and assignment parameters (finalized)
◦ Identification of campuses, state point persons, and campus leads (completed)
◦ Faculty development workshops, webinars, and summits (underway)
 Fall 2014
◦ Development of scoring protocols and guidelines for selecting faculty scorers
◦ Selection of faculty scorers
◦ Continued faculty development activities
◦ Collection, de-identification, coding, and uploading of artifacts
◦ Creation of Artifact Information File
Phase II:
 Spring 2015
◦ Scoring of artifacts
◦ Data analysis and presentation of results
What is Assessment Research?
From the Belmont Report:
“…the term ‘research’ designates an activity designed to test an hypothesis, permit conclusions to be drawn, and thereby to develop or contribute to generalizable knowledge (expressed, for example, in theories, principles, and statements of relationships). Research is usually described in a formal protocol that sets forth an objective and a set of procedures designed to reach that objective.”

Definition of Human Subject:
Human subject means a living individual about whom an investigator (whether professional or student) conducting research obtains:
1. Data through intervention or interaction with the individual, or
2. Identifiable private information.
– 45 CFR 46.102(f)
Observation of individuals without interaction or identity determination does not count as “human subjects research.” It is human subjects research only if one of the following applies:
1. Data are gathered through intervention or interaction with the individuals studied, and/or
2. Identifiable private information is obtained.


Waiver of Informed Consent
An IRB may waive or alter the informed consent requirement when all of the following apply:
◦ The research involves no more than minimal risk to the subjects;
◦ The waiver or alteration will not adversely affect the rights and welfare of the subjects;
◦ The research could not practicably be carried out without the waiver or alteration; and
◦ Whenever appropriate, the subjects will be provided with additional pertinent information after participation.

What is Minimal Risk?
“Minimal risk means that the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests.” – 45 CFR 46.102(i)

Does Assessment of Student Learning Fall under the Exemption for Educational Practices?
“Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.” – 45 CFR 46.101(b)(1)

Broad Assignment Parameters:
◦ Facilitate the development and redesign of assignments to increase the degree of consistency across assignments
◦ Enhance the validity of our measure of student learning
◦ If we impose these parameters, are we now “teaching to the test”?

Assignment Coversheet:
◦ Allows faculty to indicate which dimensions of a learning outcome their assignment was designed to assess (see the sketch below)
◦ Coversheet responses will support a second level of analysis
◦ A webinar on assignment parameters, design, and redesign will be offered next week.
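For illustration, a coversheet entry might be captured as a simple record like the one below. The record layout is hypothetical; the dimension names are the criteria of the AAC&U VALUE Written Communication rubric.

```python
# Hypothetical coversheet record for one assignment. The dimension names are
# the criteria of the VALUE Written Communication rubric; the layout is illustrative.
coversheet = {
    "outcome": "Written Communication",
    "dimensions_targeted": {
        "Context of and Purpose for Writing": True,
        "Content Development": True,
        "Genre and Disciplinary Conventions": False,
        "Sources and Evidence": True,
        "Control of Syntax and Mechanics": True,
    },
}

# Second-level analysis can then restrict comparisons to the dimensions the
# assignment was actually designed to elicit:
targeted = [dim for dim, flagged in coversheet["dimensions_targeted"].items() if flagged]
print(targeted)
```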

Taskstream selected as vendor for pilot study

Requirements of selected vendor:
◦ To be able to work with any platform a campus may be using
◦ To be user friendly
◦ To be able to provide direct support to campus point persons
◦ To provide analysis across multiple data points

Database will enable:
◦ Uploading of artifacts for eventual scoring (2015)
◦ Generation of reports at the national, multi-state, state, and individual
campus levels
◦ Confidentiality of institutions and anonymity of students and faculty

Responsibilities of Faculty
◦ Participate in professional development activities, as needed
◦ Design, redesign, and align assignments with outcomes
◦ Submit the assignment, the assignment coversheet, and student work

Responsibilities of Professional Staff and Administration
◦ Provide professional development activities
◦ Collect, de-identify, and code student work to ensure confidentiality of the student and the faculty member
◦ Create an artifact information file containing state, institution, and student characteristics (a coding sketch follows below)
◦ Submit coded material to the data management system
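As a minimal sketch of the coding step (all field names here are hypothetical), each de-identified artifact could receive a random code that carries no student or faculty information, with the artifact information file pairing that code with state, institution, and student characteristics:

```python
# Illustrative sketch with hypothetical field names: assign each de-identified
# artifact a random code and write the artifact information file.
import csv
import uuid

def build_artifact_info_file(records, out_path="artifact_info.csv"):
    """records: dicts describing already de-identified student work.
    Returns a campus-only map from the local student key to the artifact code."""
    fields = ["artifact_id", "state", "institution_code", "outcome", "credits_completed"]
    id_map = {}
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for rec in records:
            artifact_id = uuid.uuid4().hex[:10]  # random code, no PII embedded
            id_map[rec["local_student_key"]] = artifact_id
            writer.writerow({
                "artifact_id": artifact_id,
                "state": rec["state"],
                "institution_code": rec["institution_code"],
                "outcome": rec["outcome"],
                "credits_completed": rec["credits_completed"],
            })
    return id_map  # retained on campus only; never uploaded
```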
Who do I go to when I have a question?

Participating Faculty → Institution Pilot Study Lead
Institution Pilot Study Leads → State Point Person
State Point Persons → MSC Steering Committee and Julie Carnahan, SHEEO Project Director
MSC Steering Committee and Julie Carnahan, SHEEO Project Director → MSC Pilot Study Team and, if needed, the MSC Subgroups (members of each of these groups come from institutions within each participating state)
Sampling guidelines have been finalized to ensure reasonably representative and random samples. To view them, go to:

Sampling Guideline Highlights

Eligible Student Population

Sample Size:
o Targeted minimum of 75–100 artifacts per outcome per institution
o Maximum of 10 artifacts per course and/or per faculty member
o A single artifact may only be used for one outcome
o Maximum of one artifact per student
(a sampler sketch follows at the end of this slide)

Sampling plans will be submitted for review, and feedback will be provided.

Sampling Consultants: Peter Ewell (NCHEMS) and Gary Pike (IUPUI)
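Below is a minimal sketch of a sampler that enforces the caps above; the record fields are hypothetical. It handles the per-student, per-course, and per-faculty limits directly, and the one-artifact-one-outcome rule holds by construction because each record is tagged with a single outcome.

```python
# Illustrative sketch: random sampling under the guideline caps.
# Each artifact record is a dict with hypothetical field names.
import random
from collections import defaultdict

def draw_sample(pool, outcome, target=100, max_per_course=10, max_per_faculty=10):
    """pool: dicts with 'student', 'course', 'faculty', and 'outcome' keys."""
    candidates = [a for a in pool if a["outcome"] == outcome]
    random.shuffle(candidates)
    per_course = defaultdict(int)
    per_faculty = defaultdict(int)
    sampled_students = set()
    sample = []
    for art in candidates:
        if (art["student"] in sampled_students          # one artifact per student
                or per_course[art["course"]] >= max_per_course
                or per_faculty[art["faculty"]] >= max_per_faculty):
            continue
        sample.append(art)
        sampled_students.add(art["student"])
        per_course[art["course"]] += 1
        per_faculty[art["faculty"]] += 1
        if len(sample) == target:
            break
    return sample
```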
How will confidentiality and anonymity be ensured?
The pilot study will involve multiple levels of de-identification:

Campus Level: Confidentiality will be maintained at the campus level. The institution collecting the student work will strip the work of all identifying information – course name, course number, faculty name, student name. The campus will then electronically upload the artifacts into the data management system.

State Level: Faculty and student anonymity and institution confidentiality will be maintained at the state level. Student work assessed at the state level will be stripped of any information identifying the institution the work came from.

Multi-State Level: Faculty, student, and institution anonymity will be maintained at the multi-state level.

Institution Benefits:
Advance assessment of essential learning outcomes on campuses
Develop sustainable assessment plans and sampling strategies
Contribute to the development of state and multi-state benchmarks for outcomes

Faculty Benefits:
Gain insight into, and understanding of, direct assessment of essential learning outcomes using rubrics
Develop frameworks for intentional assignment design
Participate in and contribute to faculty learning communities on and across campuses
Scoring: Faculty scorers will be blind to the institution and state that artifacts are drawn from, and faculty will be precluded from assessing artifacts they themselves submitted (a routing sketch follows below).
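A minimal sketch of conflict-free routing under these rules (identifiers hypothetical): each coded artifact is assigned to a scorer from a different institution, which is deliberately stricter than the stated rule, since it also guarantees no scorer sees work they submitted; scorers receive only artifact codes, never institution or state labels.

```python
# Illustrative sketch: route each coded artifact to a scorer from a different
# institution. Institution codes stay internal; scorers see only artifact IDs.
import random

def assign_scorers(artifact_institutions, scorer_institutions, seed=0):
    """artifact_institutions: {artifact_id: institution_code}
    scorer_institutions: {scorer_id: institution_code}
    Returns {artifact_id: scorer_id}."""
    rng = random.Random(seed)
    assignments = {}
    for art_id, art_inst in artifact_institutions.items():
        eligible = [s for s, inst in scorer_institutions.items() if inst != art_inst]
        assignments[art_id] = rng.choice(eligible)
    return assignments
```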
Aggregation of Results: Analysis of the ability to aggregate results at the state and multi-state level by segment – two-year and four-year – for meaningful benchmarking, state-to-state comparisons, and presentation.
No Public Reporting Out: Pilot study results will not be reported out publicly.

Institution results will not be identifiable at the state and multi-state level, with the exception of states with a relatively small number of institutions. Student work, assignment instructions, and trial data analysis will be coded so that materials and results can be returned to each participating institution.

Identifiable trends at the institution, state, and multi-state level will be available while maintaining institution anonymity.

Data disseminated to states will be aggregated by segment (a sketch follows below). Confidentiality of institutional identity will be maintained.
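A minimal sketch of the segment-level roll-up described above (field names hypothetical): scores are aggregated by state, segment, and outcome, so no institution identifier survives into disseminated results.

```python
# Illustrative sketch: mean rubric score by (state, segment, outcome),
# with institution identifiers dropped before aggregation.
from collections import defaultdict

def aggregate_by_segment(scored_artifacts):
    """scored_artifacts: dicts with 'state', 'segment' ('2-year' or '4-year'),
    'outcome', and 'score' keys; any institution field is ignored."""
    totals = defaultdict(lambda: [0.0, 0])  # key -> [sum of scores, count]
    for rec in scored_artifacts:
        key = (rec["state"], rec["segment"], rec["outcome"])
        totals[key][0] += rec["score"]
        totals[key][1] += 1
    return {key: round(total / n, 2) for key, (total, n) in totals.items()}
```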
Additional MSC Webinars
 Pilot Study Overview Webinar
May 9, 5-6pm (ET) (repeat of May 8 webinar)
 Assignment Design Webinar
May 13, 4-5pm (ET) and May 14, 4-5pm (ET)
 Sampling for the Pilot Study
May 21, 4-5pm (ET) and May 22, 4-5pm (ET)
 Multi-State Assessment: IRB & Student Consent
Date and Time TBD
 Coding, Formatting, Submitting: Using Taskstream
Date and Time TBD
Webinars will be recorded and posted to: http://www.sheeo.org/msc
Webinars already posted:
 Welcome to the MSC