Evaluation and Replacement of Individual Development and Educational Assessment (IDEA)
Stephen Burd (burd@unm.edu)
Associate Professor, ASM
Academic Technology Liaison
Presentation copies available online
http://averia.unm.edu
Last revised: 7/17/2016 3:49 AM
Project Context

In summer 2012, the Associate Provost (Greg Heileman) charged the Academic Technology Liaison (Stephen Burd) with identifying and evaluating alternative tools for student assessment of courses and instructors.

Rationale:
 High administrative complexity of the current system
 Difficulty in gathering and using survey responses/results for further analysis (e.g., data analytics and text mining)
 Concerns about the usefulness of results in promotion and tenure evaluation
 Faculty dissatisfaction with the current system

A working group was formed, with most faculty members drawn from the Faculty Senate Teaching Enhancement and IT Use Committees.
 http://averia.unm.edu/IdeaNextStep
Working Group Members

Faculty:
 Stephen Burd (ASM)
 Robert Busch (Chemical & Nuclear Engineering)
 Kevin Comerford (Library)
 Nick Flor (ASM)
 Kristopher Goodrich (Counselor Education)
 Chris Holden (Honors)
 Amy Neel (Speech & Hearing)
 Caleb Richardson (History)
 Mary Margaret Rogers (ASM)
 Julie Sykes (Spanish & Portuguese)

Other:
 Moira Gerety (Deputy Chief Information Officer)
 Greg Heileman (Associate Provost for Curriculum)
 Grace Liu (ASUNM)
 Kris Miranda (GPSA)
Goals for IDEA Replacement (IDEA-R)

Increase the use and usability of student feedback on courses/instructors for formative and summative purposes

Adopt a modern tool with:
 Greater flexibility for faculty, departments, and programs
 Online and mobile survey capabilities
 Improved reporting
 Support for analytics

These goals drove the RFP authoring process
Timeline

Fall 2012 – Working group examines faculty technology survey results and available products; determines that a replacement for IDEA is warranted
Spring/Summer 2013 – Working group examines available alternatives and sample RFPs in detail; develops and releases the RFP
Fall 2013 – RFP responses close in October; preliminary evaluation begins
Spring 2014 – Detailed evaluation of RFP responses; top responses identified; vendors demoed in early May:
 ConnectEDU (CourseEval)
 EvaluationKIT – used at CNM and NMSU
 eXplorance (Blue)
June 2014 – AVP Heileman reviews choices and feedback and chooses EvaluationKIT; working group concurs unanimously
July–Sept 2014 – Acceptance (sandbox) testing is successful
Sept–Oct 2014 – Contract negotiated; Provost approves purchase
Oct–Dec 2014 – Steering committee formed; pilot testing begins; initial discussion of related policies with the Faculty Senate
Spring 2015 – Evaluate pilot results and make adjustments; phase 1 rollout to 33–50% of UNM; finalize related policies
Summer 2015 – Full switchover to EvaluationKIT
Summary of Finalist Evaluations

eXplorance (Blue) – “The Lexus”
 Tops in functionality/features
 Much more expensive than the other two (≈ $400K)

EvaluationKIT – “The Hyundai”
 Acceptable in functionality/features
 Reasonable cost (< $40K)
 Some reservations about:
   Ease of use
   Usability beyond summative end-of-semester evaluations

ConnectEDU (CourseEval)
 Barely acceptable in functionality/features
 Significant concerns about the vendor’s viability, resources, and strategic direction for future product development
EvaluationKIT Selection Reasons

License and operational cost a bit less than IDEA

Positive feedback from CNM and NMSU

Satisfies “must-have” requirements

Moves us firmly into the 21st century

Gets us out of the paper-shuffling business

Extra features of eXplorance are unlikely to be used in the near term

Alternative tools exist for ad-hoc instructor-initiated surveys (e.g., UNM Learn, Opinio, paper, …)
Key EvaluationKIT Features

Survey structure similar to the old ICES system
 Develop a UNM question bank and/or “roll-your-own” questions
 A survey can “layer” questions from multiple organizational levels (see the sketch below)
 No explicit tie to learning objectives or inter-institutional norms
 Best use is mid-semester and end-of-semester evaluations – not well-suited to ad-hoc instructor-initiated surveys
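
To make the layering idea concrete, here is a minimal Python sketch; the level names and sample questions are hypothetical illustrations, not EvaluationKIT's actual data model or API:

    # Minimal sketch of layered survey assembly. The level names and questions
    # are hypothetical illustrations, not EvaluationKIT's actual data model.
    question_banks = {
        "institution": ["Overall, how would you rate this course?"],
        "department": ["Did this course prepare you for later courses in the major?"],
        "instructor": ["Were the weekly labs useful?"],
    }

    def build_survey(levels):
        """Concatenate each organizational level's questions, top-down."""
        return [q for level in levels for q in question_banks[level]]

    # A section's survey layers institution, department, and instructor questions.
    print(build_survey(["institution", "department", "instructor"]))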
Fully online system
 Hosted on vendor servers – no local installation option
 Survey definition and administration via a browser-based application
 Students complete surveys via browser or cellphone app
 Reports generated in PDF/Excel and viewed online, delivered via email, or downloaded
 Surveys/results can be extracted for downstream analytics (see the sketch below)
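
As a hedged example of such downstream analytics, the sketch below assumes a hypothetical CSV export; the file name and column names are illustrative, not EvaluationKIT's actual export schema:

    # Minimal analytics sketch over an exported results file. The file name and
    # columns ("course_id", "question", "rating", "comment") are assumptions.
    import pandas as pd

    results = pd.read_csv("ekit_results.csv")

    # Quantitative analytics: mean rating per course and question.
    print(results.groupby(["course_id", "question"])["rating"].mean().head())

    # Text-mining prep: collect free-text comments per course.
    comments = results.dropna(subset=["comment"]).groupby("course_id")["comment"].apply(list)
    print(comments.head())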
Where To From Here?

Fall 2014
 Start policy discussions with the Faculty Senate
 Plan and execute a first small pilot for Fall end-of-semester evaluations
   Participants: ASM, Architecture, Public Administration, UNM Gallup
   Experiment with (see the sketch after this list):
    Centralized and distributed administration
    Survey content
    Survey open/close dates – response rate impact
    Email communication with students – response rate impact
    Other communication with students – response rate impact
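
A minimal sketch of how those response-rate comparisons might be tabulated from pilot data, assuming a hypothetical per-section CSV (file and column names are illustrative):

    # Minimal sketch of a pilot response-rate comparison. The file name and
    # columns ("condition", "responses", "enrolled") are hypothetical.
    import pandas as pd

    sections = pd.read_csv("pilot_sections.csv")
    sections["response_rate"] = sections["responses"] / sections["enrolled"]

    # Average response rate under each experimental condition
    # (e.g., email reminders vs. none, early vs. late close dates).
    print(sections.groupby("condition")["response_rate"].mean())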
Spring 2015
 Evaluate first pilot results and plan the phase 1 roll-out
   Who will participate in this roll-out?
 Develop training materials
 Plan the summer/fall roll-out

Summer/Fall 2015
 Turn off IDEA
 Roll out EvaluationKIT across UNM
Policy Issues for Faculty Senate Consideration

Administration
 How will control over survey content and administration be distributed among academic affairs, schools & departments, faculty, and central IT services?

Tool specificity
 Should use of a UNM-approved tool be required?

Survey content requirements
 Will UNM adopt a set of standard questions included in all surveys? If so, what are they?
 Will UNM populate an institutional question bank from which questions can be chosen, and/or enable schools, departments, and instructors to create their own?

Confidentiality of survey respondents
 Is the existing language too strong, about right, or not strong enough?

Distribution and/or confidentiality of survey data and reporting
 Who gets to see what data/reports, and under what conditions?
 Do students or the public get to see any of it?
EvaluationKIT Mobile Interface Examples (screenshots)

EvaluationKIT Browser-Based Interface Example (screenshot)

EvaluationKIT Instructor/Course Report Example (screenshots, continued over two slides)