Admissions Review Working Group Meeting Notes 5/11/10
Attendees: Lisa Nordlund, Sarah Garner, Tom Duchamp, Nichole Fernkes, Marty Howell, Larry Ricker,
Paula Milligan, Maya Procell, Julie Steckler, Susan Recordon, Joan Abe, Tom Quinlivan


After quick introductions for the members who were not at the last meeting, we reiterated
the overall goals of the committee:
1. Understand how the current review process is done in the departments
2. Find commonality amongst the different processes
3. Discuss potential features to be included in an online review application
4. Prioritize those features
5. Create/discuss use cases for those features
6. Discuss what can be done with our current resources and time schedule
The task item from last week’s meeting was for group members to submit a description of their
department’s current admissions review process. The goal of this week’s meeting was to identify
commonalities among the review processes and to discuss potential features of an online
review application.
1. Assign reviewers: one of the first steps of the application review process is to assign
faculty reviewers to groups of applicants. There are a variety of ways in which these
reviewers are assigned:
a) Area of interest – applicants choose areas of interest in the online application,
and some departments use those interests to match them with
reviewers.
b) Ad hoc – all departments need the ability to assign applicants to reviewers
one at a time from a checklist, or to assign them to a group of reviewers.
c) Degree coding – applicants could also be assigned to a group of reviewers by
default, based on the degree coding of their program. This is currently done by
the College of Education.
d) Assign over time – some departments, such as Epidemiology, like to limit the
number of applicants a reviewer works on at one time, so they don’t become
overwhelmed. Once a reviewer finishes a set of ten, those applicants are
removed from the list and the next ten appear (see the sketch below).
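As a rough illustration only (the record fields, the fixed batch size of ten, and the function
names below are assumptions, not anything the group has decided), the “assign over time”
behavior in (d) might look like this:

    from dataclasses import dataclass
    from typing import List, Optional

    BATCH_SIZE = 10  # Epidemiology's example limit; would likely be configurable per department

    @dataclass
    class Assignment:
        applicant_id: str
        reviewer: Optional[str] = None   # who the applicant is currently assigned to
        reviewed: bool = False           # has that reviewer finished this application?

    def next_batch(reviewer: str, assignments: List[Assignment]) -> List[Assignment]:
        """Show the reviewer's unfinished applicants, or hand out the next
        ten unassigned ones once the current set is done."""
        pending = [a for a in assignments if a.reviewer == reviewer and not a.reviewed]
        if pending:
            return pending
        fresh = [a for a in assignments if a.reviewer is None][:BATCH_SIZE]
        for a in fresh:
            a.reviewer = reviewer
        return fresh

The point of the sketch is simply that a reviewer’s queue stays capped until the current set is
finished, after which the next set appears.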
2. Functional lists: a series of sortable lists could be created to increase efficiency while
tracking and reviewing applications:
a) Complete/Incomplete applications – a sortable list of applicants whose
applications are complete, along with email addresses, so that they can be
easily notified that all their materials have been received; also a list of
incomplete applications, with a sub-list of missing items and the applicant’s
email address for follow-up (a sketch of this appears after this list). A
possible feature of this list would be the ability to mark an application as
“review ready”: even though all materials may not have been received, the
department may determine that enough items are present to make the application
reviewable.
b) Lists for reviewers – reviewers who are assigned a subset of applicants will only
see those specific applicants when they log into the online review tool. All
applicants will still be visible in the current applicant list in MGP.
c) Ranked applicants – for departments that rank their applicants, reviewers will
be able to see a ranked list.
d) Monitor reviewers – one option discussed was a list of reviewers showing how
many of their assigned applicants have been reviewed, as a way for
administrators to monitor reviewer progress.
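A minimal sketch of the complete/incomplete lists in 2(a), assuming hypothetical field names
and an example set of required materials (neither reflects an actual decision):

    REQUIRED = {"transcript", "statement", "recommendations", "test scores"}

    def split_by_completeness(applicants):
        """Partition applicants into complete and incomplete lists; each
        incomplete entry carries its missing items and a contact email."""
        complete, incomplete = [], []
        for app in applicants:
            missing = REQUIRED - set(app["received"])
            if not missing or app.get("review_ready"):
                complete.append({"name": app["name"], "email": app["email"]})
            else:
                incomplete.append({"name": app["name"], "email": app["email"],
                                   "missing": sorted(missing)})
        return complete, incomplete

Either list could then be sorted (by name, program, or date) and used to email applicants about
received or missing materials.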
3. Improved faculty review process: due to the tedious nature of the current online review
process, improvements will need to be made to reduce the number of clicks necessary
to review an application in MGP. The group discussed the ideal solution – a PDF stitched
together with all the application materials and associated data.
One-click viewing of the entire application is difficult to achieve today because of the
technology and resources it would require. Converting all non-PDF documents to PDF and
stitching them together with the other application data and materials would require purchasing
third-party conversion software and a server, which the current budget cannot cover, as well as
more development hours than we have available. We recognize the need for this functionality
and have researched ways to create a PDF version of the entire application, but have not yet
found an approach that fits our current budget and resource limitations.
One possible way to improve the process would be to allow departments to
create an electronic reviewer work/cover sheet. An administrator could build a “review
profile”, in which they would select the particular application materials and data they
want reviewers to see. Possible items in this work sheet could include: test
scores, course work, BA school, GPA, degree year, areas of interest, recommender
names, and links to recommendations and other application materials.
Another necessary feature would be a place to capture reviewer notes.
Displaying notes chronologically, along with the author’s name, could promote dialogue
amongst reviewers. It would also be important to be able to turn the notes off so that
reviewers don’t influence one another’s opinions when first looking at an
application.
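To make the review profile and notes ideas a little more concrete, here is a small illustrative
sketch; the field names, the note structure, and the show_notes toggle are all assumptions for
the example, not a design:

    from datetime import datetime

    def build_cover_sheet(applicant, profile):
        """Return only the application fields the administrator selected
        for this department's review profile."""
        return {field: applicant.get(field) for field in profile}

    def visible_notes(notes, show_notes):
        """List notes chronologically with author names, or hide them so a
        reviewer's first read of an application is not influenced."""
        if not show_notes:
            return []
        ordered = sorted(notes, key=lambda n: n["when"])
        return [f"{n['when']:%Y-%m-%d} {n['author']}: {n['text']}" for n in ordered]

    # Example use with made-up data
    profile = ["test_scores", "gpa", "degree_year", "areas_of_interest"]
    notes = [{"when": datetime(2010, 5, 3), "author": "Reviewer A",
              "text": "Strong quantitative background."}]
    print(visible_notes(notes, show_notes=True))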
4. Scoring: it will be important to create a way to score the applications – especially for
those departments that use ranked lists. A variety of scoring systems are currently used.
Three possible ways to score:
a) Quantitative – many departments use formulas to create a quantitative
score for their applicants. What may work best out of the gate is a
text box where a score can be entered manually for each applicant.
b) Qualitative – departments could create their own list of
qualitative descriptors to be used to score applicants.
c) Comments – a general comments box will be necessary for reviewer
notes.
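As a simple illustration of the interim approach (a manually entered score plus comments) and
the ranked list it could feed, the sketch below uses assumed record fields and a plain
averaging rule; neither reflects a decision by the group:

    from dataclasses import dataclass, field
    from typing import Dict, List, Optional, Tuple

    @dataclass
    class Review:
        applicant_id: str
        reviewer: str
        score: Optional[float] = None     # manually entered quantitative score
        descriptors: List[str] = field(default_factory=list)  # optional qualitative tags
        comments: str = ""                # general reviewer notes

    def ranked_list(reviews: List[Review]) -> List[Tuple[str, float]]:
        """Average each applicant's scores across reviewers, highest first."""
        scores: Dict[str, List[float]] = {}
        for r in reviews:
            if r.score is not None:
                scores.setdefault(r.applicant_id, []).append(r.score)
        return sorted(((a, sum(s) / len(s)) for a, s in scores.items()),
                      key=lambda pair: pair[1], reverse=True)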
The group was asked to prioritize the features above, and it was decided that if the first three
features (assign reviewers, functional lists, and an improved reviewer process) were built, it
would go a long way toward making the review process easier and more efficient. A simple
scoring system (a text box for a score) could be used at first, and a more elaborate system
could be added later if necessary.
The topic for the next meeting will be the use cases for the features discussed.