Conference Log

Metrics and Course Sharing Conference
26 Sep 2006, 13:42:54
Contents
Evaluate Metrics
Rate Metrics
Discuss Metrics
Course Sharing
Step Name: Evaluate Metrics
Tool Name: Categorizer
Include the mode of delivery as well as the instructor.
1. Feedback
1. Fred Villa
 The DOL "high demand" jobs often do not exist in the areas served by the rural campuses, so we stretch what we are doing in an attempt to make our work fit this.
This is also true for possible funding streams for our rural students!
I appreciate the difficulty derived from definitions, particularly when it comes to restrictions on grant funding. The restrictions can work both for and against programs regionally. An important responsibility falls on the requesting campus or program: communicating the need to your students and communities so the funding need is considered a high priority by your Deans, Provosts, and Chancellor to the MAU.
 continuation of #1720 - how will the community campuses fit in meeting the "high demand" jobs in local areas and not have to stretch to make our unique needs fit with the larger picture?
Again, I appreciate the difficulties when a small campus with limited resources has to demonstrate its needs in competition for high-priority ranking against larger campuses. I encourage frequent and well-developed communication with your MAU's administration.
 Fred,
How will we be notified regarding the selections of the SB 137 awards after or on
the 16th?
Notifications of distribution for WFD/SB137 funding will be sent
out via email to Chancellors, Provosts, and Deans for
communication to programs.
 You mentioned that the Provosts will give you input into the selection process; how will they get input from submitters so they can provide informed input to you?
I would anticipate this conversation to occur prior to the
submittal if it is a high priority for the campus. I encourage
program priorities and campus priorities be frequently brought
to the attention of MAU administration during budget
discussions.
 Programs that reflect high demand often require GER classes like biology, math, English, and humanities; when will funds for these faculty positions be included in statewide priorities for increased funding, and not just funds for the specific job skills?
GER classes and their respective faculty positions would be
outside the intent of the WFD/SB137 funding. These positions
should be relayed to MAU administration as priority needs
through your budget talks and pressed as high priority needs to
provide student success in the programs addressing WFD and
High Growth areas.
 Show me the money, lots of money! :>)
:)
 :)
2. Pete Pinney and Dave Veazey
 this is good information
 How can I get more information about the expected completion rate for developmental courses? Should it be the same as for regular courses?
There are studies that look at the national rates, which I can try to provide you with by email.
The rates are not as high as core academic courses.
 Are we comparing ourselves to a nationwide trend?
We do reflect the national average in certain areas, but I can
find more comparisons for peer institutions.
 How do you monitor if a student has or has not taken a placement test?
This currently is entered in Banner.
 What about the special populations we have in Alaska? Should we compare to similar trends outside rather than to a composite national trend?
I believe we should track by disaggregating the data down to
gender, age, high school preparation (courses)/GPA and other
factors that create distinct, trackable groups.
 From a student services perspective, it would be nice to run a report in Banner on who has taken ASSET and what the scores were for all students.
Banner has a place for those scores.
 Is there a way to monitor students who need dev courses prior to being accepted fully into a degree program at the university (e.g., probation until they are at 100-level work)? This would give us a way to track students and an incentive for students to complete the DEV courses.
Mandatory assessment and placement should be able to distinguish just where the student falls. We are looking to find a more accurate diagnostic than the current COMPASS/ASSET.
 It would also be nice to know which non-degree-seeking students would have been on probation, but were not because they are not in a degree program.
I assumed that these students with some credit history do have a cumulative GPA recorded in Banner.
 Talk to me about an ASSET scores report. Test scores are also available to
advisors and faculty on UAOnline now. Colleen
Thanks, Colleen.
 I agree that we as student services personnel dealing with a large group of students need reports to help us target students properly. I am also concerned that low-grade reports are only required for freshmen. We miss a huge population this way.
We could ask faculty senate to extend the reporting to all
students. Are you saying ALL or all in developmental courses?
 I disagree that the DEV designator is a stigma that affects students. I began math at DEVM 050. I was more affected by the challenge of math and my math "disability" than by what designator the university used. I don't think students care about designators. And I continued through calculus, entirely unaffected by designators.
While individuals may not react to the designator, there are
studies that indicate this is a stumbling block for students to
select these courses when they need them.
 #1757 Full-time non-degree students at UAF are put on probation or academic dismissal at the end of the spring term.
Is there something impacting this set of students that I should
be looking at in terms of developmental education?
 there are also underage students in home-schooled programs who take dev.'l
ed. classes.
Should all campuses eventually use the same placement test?
This is why we should avoid a "one-size-fits-all" approach.
 At tribal college math-focused conferences I've heard that testing students "backwards," starting with more advanced math that they may have taken more recently, gives a more accurate picture. Then a student who needs a refresher in fractions but is ready for algebra doesn't have to take a complete course just to move forward.
This is a good idea to explore.
 But we need a way to track students who are doing poorly without looking at them individually, and if there is a report that will pull down non-degree-seeking, part-time students, please email us.
I will have to look into this.
 #1766 Please contact me. Colleen
3. Gwen White
 It's so great you could join us. Your insight is much appreciated.
 Gwen, If Kodiak did not have anyone in the last CCSSE meeting, please let us
know so we can get someone involved. We have included use of this survey in
our Enrollment Management Plan. Thanks, Connie
 I doubt we will get many of these surveys back from rural students.
 On the Community College Student Report, multiple references to "this
campus" are ambiguous. Many CRCD students attend classes at up to 5
campuses in a given semester.
4. AK-ICE Overview
2. New Metrics
1. Successful Course Completion (Course completion rates)
 What caveats can be put in place to prevent grade inflation, which will certainly be incentivized if this metric is implemented?
 Successful completion is not just whether they passed a course, but whether they were also successful at the next level; e.g., they passed DEVM 060, but did they then do well in DEVM F105?
 What about Pass/Fail courses? A Pass should be included.
I agree with this one
 This is an important metric for the community campuses based on assessment of C or better.
 Completion within 1 semester? How can we also count incompletes that are
completed within the following semester?
 Would need to have percentage of total enrolled students who successfully
complete with C or better.
 Course completion should be recognized as any completion other than
incomplete or failure.
 What is a completion of a course? A-D, depends on requirement of grade for
next level?
 Will this only apply to 100 level and above courses?
 What will we do about incompletes? Can we find a way to alter the course completion number after incompletes are done?
 This is an important internal measure of class/program success; however, it
may seem to be something that should just be a given level of expectation by
external stakeholders. Doesn't seem like a substantive "snapshot" measure.
 What about DEVM F105, in order to go on to MATH F107 you now have to
earn a B. But C is normally considered passing . . .
 What about Incomplete grades where a student completes the class
successfully up to 12 months later?
 This is an instructional evaluation that may not include the other considerations of distance education, like student services and academic advising.
 What is this really measuring? What will we do with this metric?
 Provides feedback on correct placement of students.
 I think there are two issues here. One is successful completion of a single course; the other is successful completion of a series of courses, where we see that a student's completion (grade) translates into mastery of the information so that they can successfully go on to the next course. The developmental completion rates metric might get at this issue, but it is not just a developmental issue.
 Should this be left to course assessments and not use it as a metric at the SW
level? Are we duplicating effort?
 Success often depends upon a good DEV foundation... success usually
equals success in persistence, which often equals "graduation".
 Successful course completion is largely dependent on initial placement
practices. We can't expect professors to teach students who aren't adequately
prepared. There need to be consistent placement practices and required
prerequisites.
 Many community campus students start by taking one course and may not
know if they will take another until after they have completed the course.
Successful course completion rate is a good metric to start with because it will
help us look at placement and advising practices.
 I think a student is successful if they pass a class and go on to the next level.
 We need to be mindful of whom these measures will be used for: reporting to the community, to the legislature, and internally. For example, completion in a course vs. completion in a series of courses is an internal measure of success, not one that would necessarily be useful to the public or productive to report to the public.
 What is the desired trend for this metric?
 I think this includes dev. courses, so it may not be necessary to have separate
Dev. metric.
Is this referring to just classes for credit? If so, it should be stated.
 I think this is not a good measure of student success unless the student's only
goal is to take a single course.
 My only concern is UAF DEVM F105, a B is required to go on to MATH F107.
 What about Inc or DF grades? Will we be consistently going back to see how these students did? Because these are not failures.
 Completion of a course also happens if the student gets an F. It is just not a
"successful" completion. However, it does mean that the student is finished with
the course. The concern is that some students withdraw or get an incomplete.
Those are the students that don't complete.
If a student fails a course, they did not complete the course. They did not receive credit and will have to repeat the course and receive the required grade (above the F) in order for it to apply to a degree program. F stands for fail, not completion.
 Maybe we need to look at Failed, did not complete (W, I, DF), and passed.
Exclude audits.
 This is a basic measurement that we need. However, it cannot stand alone. Other metrics that measure student success at moving on also need to be considered. We need to remember that no metric will capture everything, and even if this is an incomplete measurement, it is one piece of the puzzle.
 D's get degrees, I agree. Being below average isn't failure; F's and Incompletes are failures until course requirements are successfully completed. Course assessments shouldn't drive what is considered student success.
 Academic Program Planning (PBB measure) also addresses this issue
indirectly.
 A "C" should be a minimal standard for successful student completion.
 The President recommended that we take the perspective that we are serving students, and view these metrics from their perspective.
 2009. This seems to be an easy one to measure, so we should start as soon as possible.
 May want to add to the metric title that it is Successful CREDIT Course Completion, since many campuses also offer non-credit courses and workshops.
 #302 Then from that perspective we need to start asking each student what their goal is. I think we would be surprised that many students, while enrolling in a degree program, do not always see their end goal as a degree.
 I agree that C is a minimum standard for success.
 "D" grades are considered successes for general education and core requirements.
 There is a difference between completion and successful completion.
Completion would be defined as having received a grade other than W or NB.
Successful completion would be defined as having received a minimum grade of
'C' or pass. Incompletes would have to go into an unknown category. Perhaps
the metrics could be redone one year after the semester ends. The interesting
thing would be the amount of change in that year period. Withdrawals and No
Basis would not be included in the graded metrics. They could be included in the
attempted metric.
 Successful course completion is a basic measure of what we do. It is
important to measure
 Is successful course completion an indicator of basic overall persistence?
 If you say a "D" is not successful completion, then why not move to Pass/Fail grades and eliminate "D"s altogether? If you have the ability to give a below-average grade that is not an "F," then that should be your cutoff, not "C".
 #304 My interpretation is that this metric only applies to academic courses, as CEU (continuing education unit) and non-credit courses do not receive letter-type or pass/fail-type grades.
 This one should be pretty easy to implement. 2008
 #312 plus we are not metrically measured on non-credit courses
 Could this be combined with overall persistence? Do we need to establish a more clear definition before we set out to measure this?
 AY09-Hopefully the same year as we implement a developmental education
metric since they are aligned similarly.
 2009 - this should be easy to add. We already know the grades of students.
We just need to know the time period for reporting so that we don't lose those
students who take incompletes.
 AY09 - Calculation: # of students completing individual courses at or above
the course-defined minimum level of success.
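The calculation above (number of students completing at or above the course-defined minimum level of success) can be sketched as follows. The course IDs and per-course minimums here are illustrative; the log notes that DEVM F105 effectively requires a B to continue to MATH F107, but none of this is official policy.

```python
# Sketch: successful-completion rate with a per-course minimum grade.
# Course minimums are illustrative (the discussion notes DEVM F105
# effectively requires a B to continue); default minimum is C.

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

# course -> minimum grade counted as success (default C)
COURSE_MINIMUM = {"DEVM F105": "B"}

def is_success(course: str, grade: str) -> bool:
    """True if the grade meets the course-defined minimum (default C)."""
    if grade not in GRADE_POINTS:          # W, I, etc.: not (yet) a success
        return False
    minimum = COURSE_MINIMUM.get(course, "C")
    return GRADE_POINTS[grade] >= GRADE_POINTS[minimum]

def completion_rate(records):
    """records: iterable of (course, grade) pairs for enrolled students."""
    records = list(records)
    if not records:
        return 0.0
    successes = sum(is_success(c, g) for c, g in records)
    return successes / len(records)

rate = completion_rate([
    ("ENGL F111", "C"),   # success (meets default C minimum)
    ("DEVM F105", "C"),   # not a success (course minimum is B)
    ("DEVM F105", "B"),   # success
    ("MATH F107", "W"),   # withdrawal: not a success
])
# rate == 0.5
```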
 This metric could easily start as soon as adopted since it appears to be
looking at a straightforward percentage or quantitative number. I'm curious about
the unintended consequences of grade inflation or reduced rigor of the
instruction.
 Someone asked what is this really measuring. If students are paying for
courses but not passing them (not receiving the credit they paid for) this is a
problem. There are many things that might cause this to happen, including
student preparation, but if a lot of students don't successfully complete our
courses, then we need to know and do something to change it.
 2011. So much of "true" successful course completion depends on proper placement and advisement of students. Initial efforts to hold students to strengthened prerequisite and testing measures take time to implement. There may be initial resistance to implementing placement, so this takes time.
 Primary data is the successful completion of courses - incompletes are
secondary data that should not be put into this measure but should be further
investigated to get students to this measure
 #312 and #314 - good point. The best measure of campus productivity would include ALL students enrolled in credit or non-credit courses.
 I am concerned that this metric could lead to grade inflation to ensure that more students complete courses.
 AY 2009, since it is somewhat a measure now. I think we should also include a measure for non-credit courses, since many campuses offer those as well.
 Even though D's can be used in a degree program, the student must have a
2.0 overall GPA and a minimum grade of a 'C' in their major courses for the
majority of majors. More programs are beginning to require a minimum grade of
'C' in some of the core requirements.
 #330 - agreed
 I don't believe this will lead to grade inflation. Faculty are not going to do
something like that just to make the metric look better.
 The sooner the better, as it is evident there are many other issues that could be addressed if this basic measure were implemented.
 2008 -2009.
 This is a good argument for moving to competency-based mastery for grading
rather than arbitrary grade systems.
 Many students fail to complete classes due to personal issues...these we can't
do much to prevent, but that shouldn't stop us from working on the issues that we
can do something about...that's why we need to measure course completion and
keep working on correct placement, good access to advising, early support for
students transitioning to college...and other support measures....over time
strengthening course completion will improve retention etc measures but at the
community campus level the focus is best at the course level. Students may
complete a program based out of somewhere else, but if they received most of
their background at a CC then that is a success.
 In regards to #296, an F is not a successful completion of a course.
 #355 neither is an INC
 With regard to non-credit and CEU, the assumption is that everyone enrolled completes, since we post the grades automatically. All non-credit and CEU courses are posted with a single grade appropriate to their level via a batch process.
 #357 Incompletes are not completions, but can become completions if the student finishes their work by a deadline. With F grades the student does not have that option.
2. Basic Overall Persistence/Success (Persistence rates)
 This seems very important; however, does it tell us how long it takes someone to get through to completion, or is that measured somewhere else?
 This is important in that we need to follow students as they attempt to get a degree from more than one campus.
 Important to include those who begin as non-credit and non-degree-seeking
as an element in Persistence.
 Needs-based financial aid is an important piece to this component.
 Also important to have a larger window of time for completion (10 years
versus one year for example)
 What about transient populations, i.e., military personnel, who transfer duty stations and will not be counted in persistence data?
 Is basic overall persistence linked to successful course completion?
 This should be broken down by developmental courses and college credit
(100 level and above) courses.
 We could do better in identifying the few courses where most of our failures
occur and work to remedy those gatekeeper situations.
 We have so many students who do not attend campus on a regular semester-by-semester basis... how do we count those students? Their goal is still there, but they need that time for family, finances, etc.
 We also need to think about what happens when agency funding goes away.
 Considering the enrollment patterns of our students, this seems critical.
 We need to use this metric to improve our service to students to help them take more direct paths to completion, if that is their desire. Although many students choose to take the long road, others go there unadvised, and this should be addressed. In addition, we need to look at how our programs work for or against completion. How many times do we start a grant program to promote this or that and then watch students flounder when the support goes away?
 Non-credit students, and other students working toward a credential or an institutional certificate, need to be measured.
 Institutional research can easily track student movement within the UA
system, however tracking students who transfer outside UA is less
comprehensive (in other words, we can find some but not all students who enroll
in another HEI after UA).
 Maybe we need to develop more of an understanding of persistence and develop a "stop-out student" approach, i.e., a student has not dropped out if they don't return for so many semesters, but rather has stopped out.
 This measure would allow us to track all those students who stop in and out
over time, since it is longitudinal in nature. In fact, it will allow us to take credit for
our many lifelong learners, who come back year after year for personal
enrichment.
 I feel that each of these four "metrics" is about the same dimension: graduation depends upon persistence, which depends upon course completion, which depends upon DEV... I still think the UA/BOR/State Legislature/Public will be judging us on UnD HdCt & SCH.
 Support from partners/corporations is crucial for student persistence as a part
of needs-based financial support. Community campuses need a way to
recognize the importance of these partnerships and how they provide the
sustainability for the vocational needs of a region.
 The debate about UnD, HdCt, and SCH will continue until the emphasis on community campus numbers disappears and the counts are rolled into the overall MAU numbers, providing incentive for the MAU and the other community campuses in the region to work together.
 To measure persistence more accurately, it would be helpful to have a better
way to communicate and record what the goal of each student is from the start.
Sometimes the measures for persistence don't capture the fact that the goal of
the student has been met and that is why they've stopped out.
 If some campuses don't offer a program, will they not be penalized in the data compilation, i.e., not included?
 Don't agree with the refinement comment... most non-credit students do not have a goal to go on and take a credit class; apples and oranges. A 3-hour non-credit workshop in something has nothing to do with the credit programs. There should be a metric, though, of how many enrolled in non-credit workshops. Non-credit = CEUs and zero credit.
 What about a student from a rural area who is degree-seeking through a main campus, not their rural campus? Where do they get counted toward retention and persistence? The rural campus is supporting and providing courses, but the main campus gets the credit.
 How would this capture single-course credentials like CNA, PCA, welding, etc.? If it can't, then that should be qualified in the title of the metric and in the definition.
 If a student transfers from one campus to another within an MAU, or to another MAU, which campus gets counted? So, if someone persists at one campus or persists among campuses, how is this addressed?
 How can we also include the notion of "stop out". Would we find a break
lowers the persistence rate? Would there be another way to count students who
are intermittently persistent?
 Isn't persistence an indicator of the success of the existing metric "strategic
enrollment management planning"??
 There is actually some overlap/redundancy between basic persistence and
basic completion.
 Full time students are such a small % of our student population it's probably
not the best metric to study
 We don't need to parse out the enrollment of a cohort. But it is important to
follow them through their career as a student in other institutions.
 Again, there is no consistency between MAUs.
 Is there a way to count the non-credit student who is in ABE-GED-ESL courses and persists to the next year to continue working on their goals... not because they didn't complete, but because they have more to work on to gain further competencies, which may or may not include earning a GED (more ESL and literacy competencies).
 AY10 - there needs to be a standardization across the MAUs to be able to
compare statewide.
 AY10 or 11
 AY10 or 11 - refinements to the current retention metric can serve as the initial approach to demonstrating student persistence, so this one could be delayed while we work out the details.
 #370 - not all of us offer the ABE/GED program
 First step is consistency in policy across MAUs for "remain enrolled in
program"
 Also need to count non-degree seeking students. Many students take classes
without being admitted into a credential program.
 How about a measure of a student initiated goal for non-degree seeking
students? Would it be possible to track an individually developed goal?
 The refers to students seeking 'credentials' and 'degrees'...this should also
include workforce endorsements, licensures, and similar achievements.
 This metric could expand to those who are non-degree seeking.
 Maybe we need two non-degree-seeking types in Banner: those really non-degree-seeking, and those intending to seek a degree. It would require some questioning or evaluation of the student before registering, and then admitting them as such.
 please drop the potential refinement; someone taking a non-credit 3 hour
grant proposal workshop or a gardening class or a small bus. dev. workshop or a
GED class has nothing to do with measuring our success at how many of them
go on to enter a degree program. It's not their goal.
 AY 2011. I think the second part (Potential Refinement) is another measure, not part of the first. The first part seems to be tied to degree/credential seeking, not to the large numbers we serve successfully in one or two courses (they may be getting jobs or promotions) - it is all successful.
 NDS and degree-seeking students can be rolled together for an overall metric, but they have to be split out for campuses to see how they are doing. It will be hard to impact the NDS students, some of whom may not plan to take more than a single course - these are then automatically counted as a non-success even though they met their goal.
 Might want to take out the part specifying enrollment in a program. We need a
way to count and be measured on our non-degree seeking students who take
courses for life-long learning, job skills training, or other non-program specific
reasons. Our ability to retain these students over time is also a measure of
success.
 AY 2010
 The community campuses do more than just serve full-time, degree-seeking students. The non-credit students, and the special-interest classes students take for personal enrichment, need to be considered and measured.
 Potential refinement: Take the existing suggested refinement (in the
powerpoint) and take it out 10 years instead of 5. Many students take longer than
5 years to complete a credential.
 The definition of persistence at the community campus level is so much more than the traditional definition of persistence and should be identified and measured.
 2010 or 2012 - recommend a stay of execution on this one until there is consistency at the MAU level.
 I think we need some criteria, such as degree seeking, to form a meaningful
cohort. It's not reasonable to expect a non degree seeking student who's just
coming to us for a class to persist. We should focus on the students who come to
us with the stated goal that they want to persist, ie they want to work toward
degree or credential. Then, we should focus our persistence efforts on those
students.
 Maybe we need separate measures from the main campuses.
 The refinement implies that non-credit and CEU participants should be expected to then enroll in a program... we should not expect that, and we should not be dinged if they don't.
 First-time, full-time, degree-seeking freshmen is not a useful cohort. I would suggest including those who are admitted, whether first-time or not, and certainly not limiting it to full-time enrollment, from year to year (AY or FY). Financial constraints are a limiting factor for continuous enrollment.
 There are some non credit programs that lead to a credential that should be
captured.
 Include students who are non degree seeking. These undeclared students are
persisting (continuing) to explore... after a while it is not unusual for many to
discover (via their advisor) that they are 1 or 2 courses away from an Assoc
Degree (general).
 Cohorts should be defined by Academic year to Academic year... not by any
one semester.
 #396 I agree, and those numbers should be rolled up to the MAU level, as at the community campus the numbers are so small that they are almost insignificant.
 On our campus, we don't have many full-time students; a lot of students are taking a course and don't really have an idea of what they want to do.
 Measurement - 1 cohort for when the student took their first academic course
and a second cohort for when they took their first non-credit/CEU course.
Persistence would be measured in each cohort separately.
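The two-cohort measurement suggested above can be sketched as follows: each student's cohort is set by the academic year of their first course of a given type, and persistence is re-enrollment (of any type) the following year. The data shapes and field names are illustrative assumptions, not a Banner extract.

```python
# Sketch: AY-to-AY persistence for a cohort defined by first enrollment
# year, tracking academic and non-credit/CEU cohorts separately as
# suggested above. Data shapes are illustrative, not a Banner extract.
from collections import defaultdict

def persistence_rate(enrollments, course_type, cohort_year):
    """enrollments: iterable of (student_id, academic_year, course_type).
    Cohort = students whose FIRST enrollment of this type is cohort_year;
    persistence = fraction re-enrolled (in anything) in cohort_year + 1."""
    first_year = {}
    years_enrolled = defaultdict(set)
    for sid, ay, ctype in enrollments:
        years_enrolled[sid].add(ay)
        if ctype == course_type and (sid not in first_year or ay < first_year[sid]):
            first_year[sid] = ay
    cohort = [sid for sid, fy in first_year.items() if fy == cohort_year]
    if not cohort:
        return 0.0
    persisted = sum(1 for sid in cohort if cohort_year + 1 in years_enrolled[sid])
    return persisted / len(cohort)

data = [
    ("s1", 2008, "academic"), ("s1", 2009, "academic"),   # persisted
    ("s2", 2008, "academic"),                              # stopped out
    ("s3", 2008, "noncredit"), ("s3", 2009, "academic"),   # non-credit cohort
]
# academic 2008 cohort: s1, s2 -> 0.5; noncredit 2008 cohort: s3 -> 1.0
```

Because cohorts are keyed by academic year rather than semester, a student who skips a semester but returns within the year still counts as persisting, which partially accommodates the "stop-out" pattern discussed above.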
 Campuses do not currently flag 'first-time' NDS students, which would have to
be done for IR to identify these students.
 Personally, I think the current SW retention rate is a good measure for us. It gives us a microcosm of our student body and is a good indicator of what we are doing.
3. Developmental Success (Developmental Completion Rates)
 This measure works if a student's goal is just to complete a developmental course. Many students pass developmental courses (A, B, C, P) but are not necessarily prepared for college work.
 Developmental success especially in Math is defined as a B, at least at UAF.
Do we have separate rules for these exceptions or do we go with a generic rule?
 A definition of developmental education is needed in order to measure this.
 This is needed for the community campuses to measure the students who are
succeeding. Need to define success.
 Developmental courses do not transfer to other institutions through the
transfer credit process (because they are less than 100 level). So, a student who
"successfully completes" a developmental course at UAF wouldn't necessarily
enter UAA at college level if their assessment test scores didn't place them at
college level. Perhaps successful completion of a development course should
include an external measure such as a national placement test score.
 Depending on the student's starting level (DEVM, DEVE, DEVS), if they achieve a goal that does not consist of completing all levels of DEV, is that success? Or do they have to go through all DEV and into 100-level work to be successful?
 There are three basic levels of ability: those who are in need of immersion into
college-level preparation, those who only need an add-on with their core to be
successful, and those who can succeed with a minimum of help through
supplemental instruction and some advising.
 What courses are considered "developmental" courses? Some applied English and math courses might be included.
 Can this metric be used to go back to the K-12 programs to say that we are
not receiving students who are prepared for college? Can this be used as a
springboard for working with K-12 to develop programs with them so we receive
students who are prepared?
 Labeling of courses should get away from DEV and remedial.
 We generally have more students in this area than any other. Most of our
students who graduate have started at this level.
 Some students take the same developmental courses over and over, passing
each time. More complex measures such as % of students who go on to pass
college level course work will help assess the success of UA's developmental
program. This would go hand in hand with mandatory testing and placement.
 Since students taking developmental courses are less likely to complete a
recognized credential than those students who do not need developmental
courses, I think we need to devote additional resources to developmental
students.
 Inconsistencies in how "developmental" courses are handled across the UA
system may skew data if not carefully considered.
 Perhaps the most important part of this measure is our ability to help students
make progress along an educational continuum, so measuring a student's
enrollments and success in college-level courses AFTER the developmental
sequence may be the key to reporting campus productivity/effectiveness.
 Perhaps we need to retest a student after each Developmental Ed course to
make sure they are prepared, especially since students take courses from different
MAUs. We need to make sure we are preparing them before they go. The failure
rate when some of them do go on is a direct reflection that we failed!!!
 Need to track because many students never make it into freshman level
courses and we need to determine why.
 Many students take developmental courses concurrently with enrollment in
'college level' credits. Are there any challenges in accurately capturing their
"success"?
 Most of our students do need developmental courses in this area. The majority
of our student population is an older group.
 This measure doesn't have to do with the grade received, but whether the
student received an acceptable grade to qualify for acceptance into a 100 level
course.
 This area is receiving and will receive even more focus at the SW and
legislative levels. It will be important for us to get out in front of this new focus
and begin measuring and reporting these statistics.
 Developmental success takes two forms. Completion of developmental
courses by themselves and completion of developmental courses to successfully
continue into college level courses. Would these be looked at together or
separately?
 Support #73 question - separate or together - I think there is value in each
 The bridge between developmental courses and the subsequent academic
course should be clear. The information in one should lead to success in the
future course. The tracking of success in subsequent courses (disaggregated to
show like populations and their success) could track how well those bridges are
working or not.
 This is part of the course completion metric, so it may be redundant. Also, it
may not be consistently interpreted since there is such variation in how dev. ed is
conducted across the State.
 Building a clear and direct pathway from developmental coursework into
college level courses would be helpful for success. Transferring developmental
coursework cross-MAU would help students feel more compelled to be successful
in their developmental coursework.
 Not all students have a goal to go on to non-dev. classes, so we shouldn't be
evaluated by how many do.
 Get a complete list of what counts as a developmental course.
 Many designators, particularly in math, are considered developmental; e.g.,
CIOS 116, CTT 106, and DEVM 105 are all considered developmental math.
 We need a thorough list of what classes are being substituted for DEVE/M/S
courses. And look at successful completion, and successful completion of the
courses they take after. Perhaps we need to also look at cohort students, e.g.,
degree-seeking students or those in BA programs, to see if what we are measuring is
going to show what we think it is going to show.
 Developmental classes need to be measured in a number of ways. Degree
seeking students, non-degree seeking etc.
 We need to also set up parameters for what we are considering successful, i.e.,
students' goals, or what the MAUs traditionally see as goals, i.e., a cert or degree.
 In the past we have convoluted the developmental courses with A.A.S. course
requirements (like CIOS 160 and TTCH 131). The A.A.S. courses have been
shown not to prepare students for the college level courses. We need to separate
the courses that satisfy A.A.S. degrees (and those students who have the goal of
A.A.S. completion) which can be measured with completion and those courses
that are truly developmental courses that students take to get ready for college
level classes.
 We have a need to have a more accurate assessment of incoming students to
reduce the wide range of abilities found in a single classroom.
 A proposed measurement is placement tests before and after successful
completion (A,B,C) of a developmental class. For example, a student tests into
DEVM 105 beginning Spring semester, gets a B, and then tests into Math 107
beginning Fall semester. The problem with this is that it is cumbersome to
students and puts an additional burden on each student for excessive testing.
 Someone mentioned ACCUPLACER; remember, not everyone is using the same
tool to measure students, e.g., COMPASS, ASSET, ACCUPLACER, etc.
 A course designated as 'developmental' should not include 'applied' courses
that are degree requirements. Developmental courses are preparatory ones for
college-level instruction.
 1. View degree program for student: requirements for DEV, do they meet
them? Did they complete courses in that area or did they test out?
2. NODS - Is the student taking a course for refresher for job, self, etc.
Do we need to look at students who dropped or withdrew from a DEV course to
find out why they did? Can we improve the drop/withdraw rate of students in DEV
courses?
 I think the measure, as written (completion and then successful completion of
100 level math and English) is appropriate because other metrics (e.g.,
persistence) get at students who go on to cert or other programs which do not
require 100 level courses.
 Critical to this conversation is the STARS program and the SAC initiative to
require developmental testing and advising for students entering degree-awarding
programs, and the extent to which programs are not part of that exercise.
 It appears there is no standardization between UAF and UAA re: designators.
Will this be a problem?
 The STARS report and SAC are in support of mandatory student testing and
placement into developmental math and English courses. Without this in place, it
will be difficult to impact developmental success.
 Developmental courses should be those identified as stand-alone courses
(rather than developmental activities embedded in other courses) used as the
"first step" toward continuance along an educational path to basic completion of a
recognized credential.
 Developmental students' success can be measured by the outcomes of the
GERS that follow if they are degree seeking or certificate based.
 Even though other courses have been created at the 100 level that are
equivalent to DEV courses, they are not traditionally used to meet the
baccalaureate course requirements. So, they should not be counted as
developmental courses. The student will still have to test into the traditional
bacc level Math courses. These other courses have been developed to allow
students to earn a certificate or associates degree only and they do apply to
those degree requirements.
 The variety of developmental designators will create a challenge in capturing
all academic goals of individual students. We also have nontraditional students
who do not have the same deficits as entering high school students coming to
college. Any successful measure of this would require knowing where they come
in as opposed to how they exit. Groupings could capture gender, high school
preparation/GPA, ESL, and mandatory placement scores as a way to accurately
track success upon exit, according to a student plan of study.
 #106 I like the idea, but maybe the test is something done by the instructor at
the end of the semester. I think instructors need to see the measured outcomes of
their students. Let's face it, not all DEVM instructors are good instructors.
 "Developmental Success" is not necessarily successful completion. DEV
courses can be successful in that they help the potential student understand
what is required of college level work esp. attendance and homework!... that they
get a W or NB is appropriate for both the college and the student (former). As a
"Metric" the UA should appreciate how much work we put in helping students to
understand and build up their skills... HOW MUCH vs. Success is more
appropriate as a measure of the service we perform.
 How is the instructor's competency factored in? For example, if one instructor
has a 50% completion rate for their students and another instructor has a 70%
completion rate from the same population of students, isn't that indicative of
different teaching ability? Sometimes student failure is tied to less than optimal
teaching.
Include the mode of delivery as well as the instructor.
 This is a KEY issue for UA accountability and should be measured in some
form
 It seems that sometimes these measures are working counter to each other
with success in developmental instruction as goal toward college-level success
versus completing student-centered goals, like job skill development.
 Perhaps table this for now and use course completion metric to start with.
That way all courses whether they be developmental or above would be
"counted." We could measure successful completion of developmental courses
and separately measure all other credit courses.
 It's very important for a student starting out in college to take some
developmental courses if they need them to succeed in college. The
student should take a test to see if they are ready to move on to the next level, or
the instructor should let us know if they are ready to go on to the next level.
 Calculation - # of degree-seeking students, who place into developmental
courses, successfully completing the first 100-level Math and/or English. This
type of calculation would capture only those developmental students with goals
of enrolling in college-level courses and demonstrate their ability to succeed in
those courses, which is one of the goals of developmental studies.
Students do not always take a placement test to get into a
course. They choose to start at the lowest level, and even that course may be too
high a level (but it is the lowest offered by the college).
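The calculation proposed a few lines above (# of degree-seeking students who place into developmental courses and successfully complete the first 100-level Math and/or English) could be sketched as a simple filter over student records. This is an illustrative sketch only; the field names below are hypothetical, not actual Banner fields:

```python
# Sketch of the proposed developmental-success metric: count degree-seeking
# students who placed into developmental courses and later passed their first
# 100-level Math and/or English course. All field names are hypothetical.

def developmental_success_count(students):
    return sum(
        1
        for s in students
        if s["degree_seeking"]
        and s["placed_developmental"]
        and s["passed_first_100_level"]
    )

students = [
    {"degree_seeking": True,  "placed_developmental": True,  "passed_first_100_level": True},
    {"degree_seeking": True,  "placed_developmental": True,  "passed_first_100_level": False},
    {"degree_seeking": False, "placed_developmental": True,  "passed_first_100_level": True},
]
print(developmental_success_count(students))  # 1
```

As several comments note, the definition of "placed into developmental" itself varies by campus and placement tool, so any real implementation would first need a shared course list and placement standard.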
 If this metric remains, then it could be measured by the number who got a C or a
pass. Some dev. courses are 100-level classes, e.g., PRPE A108. Also, is Math
A105 seen as developmental if it meets AA requirements and doesn't meet BA/BS?
So, does a "developmental" course mean developmental for an AA or for a BA degree?
Designators and/or course numbers within the UA system from
UAF, UAA, UAS
 Some certificates have the equivalent of developmental content explicitly
"embedded" within them. How is this to be addressed?
 Developmental success may not always be measured by a single course. For
many students it will be measured by the sequence needed to develop the
competency.
 Looking at the DEV courses and how they are offered to see their success
rate as well as the student success rate. Example: Elive, audio, Blackboard,
paper-based correspondence, ...
 An interesting factoid for context - in the California State University system, 85
percent of incoming freshmen need remedial coursework. That system
successfully remediates most of these students, and those who aren't up to
college level after the freshman year can't go on. The point is that UA is not
doomed because K-12 doesn't prep students for college, nor is this a situation
unique to UA.
 This has a high priority at the statewide level, as demonstrated with the recent
STARS task force and funding for innovative developmental proposals. It would
probably be wise to propose at least some way to measure our successes in
these areas.
 Course modality should be seamless in a random look at completion/success.
The quality of a course should be irrelevant to the success of the student.
 #115 not all students intend to become degree seeking or cert seeking. I think
that we are not going to be able to look at all students, and maybe need to
specifically focus on cohorts of students, i.e., only degree seeking. Unfortunately
this would not measure many of the rural students.
 You can't just assume that completion rate for an instructor equals
competency. It is the student's ability to move successfully into the next class that
indicates instructor competency. Overall, I think the ability of students to
successfully gain skills and move through their chosen program is more important
in showing our abilities than judging us by single course completion, even if that is
the student's goal.
 FY 2009 - we will need a year or two to work out the policies with SAC,
Provosts and such. Plus we need to think about whether we are measuring data
that is not already collected.
 2010 - It will take a couple of years to get a proper list shared among campuses
about what counts as a developmental course and what counts as a successful
completion for the students.
 2010; Why wait? If it's worth doing, if we have confidence in the measure and
believe it's a valid measure, how could you justify waiting?
 Why can't we develop consistent course numbers, course objectives, and
syllabi for all "developmental" courses across the UA system? Is this desirable?
Is this achievable?
 2009 Why not as soon as possible? Courses are set up to be sequential
already. The outcome of a course taken as a prerequisite for college level
work is to prepare the student for that class. Developmental Ed departments are
working on strengthening their programs, assessing their programs already.
 UA is asking for $500,000 for support of developmental success in FY07. UA
will be held accountable NOW for its past and current performance in
developmental success.
 AY09 - This has high priority at the statewide level, so it seems apparent that
we will have to accept some level of accountability on this sooner rather than
later.
 2010 - Broad institutional implementation of metrics and budgetary
implications should be assessed prior to adding on metrics that have yet to be
fully defined and developed.
 Implementation could begin AY09, since mandatory placement policies are
going into effect, theoretically, this year, and funding requests for increased
developmental support begin FY08.
 #137 define successful completion - in this room alone there are several
different takes. We are going to have to work out these things.
 We embed developmental work into all our courses and most of our students
need substantial developmental work. I do not, however, think this is why my unit
exists; it is a necessary part of what we do, and therefore not a measure of
"productivity" as such.
 2010 - we have been working on developmental success for some time and we
should be ready to make and meet goals already.
 2008- But I recommend that we look at overall successful course completion
too rather than just completion of developmental courses. Perhaps there would be
some correlations.
 AY09--This has a lot of attention at SW level and we should start measuring
soon.
 In regards to #132 - course modality is not the same as course quality, and I
believe a poor quality course does impact a student - usually negatively.
 This should be implemented when the bugs can be worked out in FY 09.
Developmental Education is an important part of our mission and the students we
serve.
 Achieving some consistency across developmental programs appears to be
on the radar screen at the statewide level.
 2007. We are way behind on this. Students out there expect that we already
are accountable and that the course sequence they are advised into is the
correct sequence. And that the course content and delivery is worth their money
and time.
 Will the developmental courses include the non-credit ABE and ESL classes?
They are developmental too.
 AY 2010 - to give time to support the concept, determine all the various
developmental courses (what about those that embed DEV into most of their
courses?) and to move all MAUs to similar measures.
 2011 - We need time to collect several years of data to determine if we are
capturing what we really want to be measured on
 As soon as possible.
 I agree with the comment about student goals; that is vital for many of the
metrics like this one; not all people have as a goal to go on after they're
successful with their credit or non-credit dev. course.
 Maybe if a student tests into a developmental course they need to also take a
student skills course, maybe we shoy
 2009: Can start as soon as possible on tracking...finding consistency within
DEV programs across MAUs of what is covered in each course so students
transferring or taking courses across MAUs have consistency.
4. Basic Completion (# of students who earn a recognized credential)
 Many of the students at community campuses do not take classes with the
intent of receiving a degree or a credential.
 Recognized credentials are important to individual programs and to
communicate to employers regarding success of programs.
 What about students whose only goal was to take Grant Writing or
Technical Writing to improve their skills? No credential or degree was earned, but
they still achieved a goal. How in the world do we measure this?
 This is a critical way to acknowledge individual student achievement, which is
a high motivator for further study - it helps to create the sense of an educational/
career ladder, where students can step on and off as life circumstances and
employment requirements dictate.
 It is very important for a campus to get to count the success of students as
they complete at other campuses. The "assist" needs respect.
 A student can take 1 course and that makes them more employable than if
they did not take it. A Rural Campus would consider that a success. There are so
many students that are NODS who are not interested in a degree program...they
just want to be hired.
 Agree that this should include a measure of how the community campus
"assisted" in completion i.e. the statewide nursing and rad tech initiative
 In the list of examples, there were licensures and industry certifications. How
do we know when someone has completed one of these? How are these
defined? Is there a clear understanding of what a workforce credential is - that it
involves only CEU and non-credit courses?
 Increased advising/program of study tracking could add to the students'
ownership of the advancement of their own academic goals and help identify the
resources we can offer to help them achieve that goal.
 I feel that each of these four "metrics" is about the same dimension:
graduation depends upon persistence, which depends upon course completion,
which depends upon DEV... I still think UA/BOR/St. Legisl./Public will be judging
us on UnD HdCt & SCH.
 Community campuses often act as feeder campuses. Persistence needs to be
tracked at a SW level to be meaningful. A student who starts at a community
campus and continues at another campus should be considered retained.
 The AK Department of Labor and US DOL use common measures focusing
on employment beyond graduation or completion, as well as income progression.
To what extent are these Common Measures important to our efforts? Perhaps
we should at least look at them to see how our own measures might (or might
not) complement them.
 It is important that this measure count only recognized credentials and not just
a course(s) that does not lead to a license, or an industry- or state-recognized
credential.
 UA has one of the lowest graduation rates (25 percent for bachelor students,
less for associate and certificate) in the country.
 Should develop credentials that recognize the small successes so that
students may build upon these to reach their overall goal.
 A lot of our students are non-degree students and are taking courses to help
them succeed in their job.
 For many students, the important success is gaining employment or
advancing in employment. This may mean a certificate or degree of some sort,
but maybe not. We often get our new students based on their observation that
other students have obtained real-world success through taking courses with us.
Is this part of this metric, and how to measure it?
 Concerned that this may still omit single-course credentialing classes like
PCA, CNA, pipe welding, 6-pack licensing, CompTIA, etc.
 Get a complete list of what can count as success after taking a developmental
course.
 This is an excellent metric for the community campuses as it gives us an
opportunity to measure all that we do to meet the community college mission.
 AY09--This is easy to measure and provides a good look at our success.
 This metric has to have a reporting element that refers to all the campuses
that are involved in the student's transcript
 When a student reaches their goal this is success.
 I prefer the Calculation. The potential refinement leaves out certificates.
 2009 - this should be an easy measure once the definition of "credential" is
agreed upon
 The potential refinement should not be used.
 Sometimes a single course will make a student more employable, ie, auto
mechanics, welding, basic accounting.
 2010 - we need to give the system some time to test with data fed back to all
campuses.
 We need to coordinate the recording of all licensures, industry certs,
workforce credentials, occ endorsements, certs, degrees, etc. Not all campuses
are recording these in the same fashion in banner. We would need to sit down
together and coordinate so reporting would be accurate.
 Concur with the calculation which includes licensures, industry certs, WF
credentials; This gets at the "nut" of what's different at community campuses.
 AY09 - This is needed to allow us to include all those students who are
earning credentials below the associate's level and outside "high demand" - they
are currently not captured anywhere else.
 AY09 but without the refinement. As stated above many courses are
considered complete for particular jobs.
 How can we capture the single-course "programs" that aren't in UA-defined
programs like CNA, PCA, welding, etc. We're losing many now and no one is
tracking this. Some of these credentials are in the high priority areas that UA has.
 How do you define licensures and industry certificates? The specific licensure
and industry certificates need to be specifically defined so when someone asks
what the student must do to be counted in the category they can be given the
specific course requirements.
 How do we collect licenses that are not given through UA or a partnership
program we use, e.g., a student taking an ITS course to prepare for the
MCP/MCSE/MCSA test?
 #509 good point - communities may not have the identified "high demand"
jobs, but might have their own identified "high demand" jobs that are good for the
economy of the community
 Would be interesting to look at our graduates systemwide and see how many
campuses contributed to the success of the student.
 Many community campuses' degrees are MAU degrees, so how will a campus
and not the MAU be identified?
 This metric should count the number of awards - so that if a student gets more
than one award both are counted.
 Classes that are set up for preparation of testing to a certificate need to be
included as a recognized credential.
 2010. Will be interesting to track this now that OEC's are on the books.
 Basic completion could also be defined as the student having met their
personal goal - degree, a course, etc. But even though this could make the unit
look good in that their students are meeting their goals at a high rate, it is also one
that could be easily corrupted so as to say "look how good we are doing".
 Whether a student starts with developmental studies is also a factor. A student
starting in DEVM 050 and achieving a BS degree measured against a student
starting in Math 107 and achieving a BS degree is apples and oranges.
 We need a diverse 'cluster' of measures to capture the full range of services
provided by campuses. No one measure alone will be adequate; we need to
ensure that this cluster of measures paints a full and accurate picture of our
work.
 #511--This is the purpose of this metric to gather and track info on these
programs and courses.
 What is the base to get a percentage? If I have a small base I look really good
if I need a large number, but if I have a large base it could be detrimental. I can
use various numbers, but the problem is to be consistent from one to another.
 There are awards and industry certifications that come out of one or two
courses, not a series that is a BOR-approved program.
 This measure seems flawed. Is it a number of graduates and students
completing a certification, etc., or is it a rate? This is an important difference. If it
is a retention/persistence rate, then that needs some definition. If certificate
students wait until applying for graduation, then the rate would be 100%. How
helpful is that?
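The count-vs-rate distinction raised above can be made concrete: a rate depends entirely on which denominator (cohort) is chosen, so the same number of completers can look like 100% or 25%. The numbers below are hypothetical, not UA data:

```python
# Illustrates why "number vs. rate" matters: the same completer count yields
# very different rates depending on the chosen denominator.

completers = 40               # students earning a recognized credential

# Denominator 1: only students who applied for graduation (trivially high rate)
applied_for_graduation = 40
# Denominator 2: the entering degree-seeking cohort
entering_cohort = 160

rate_vs_applicants = completers / applied_for_graduation
rate_vs_cohort = completers / entering_cohort

print(f"{rate_vs_applicants:.0%} vs {rate_vs_cohort:.0%}")  # 100% vs 25%
```

This is why the comment's point stands: a retention/persistence rate is only meaningful once the denominator is defined and held consistent across campuses.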
5. What else could we measure?
 The AK Department of Labor and US DOL use common measures focusing
on employment beyond graduation or completion, as well as income progression.
To what extent are these Common Measures important to our efforts? Perhaps
we should at least look at them to see how our own measures might (or might
not) complement them.
 The percentage of the 18+ residents in our region that our regional campuses
serve.
 Pre/post testing is an additional way to measure student success.
 'Facilitation' or another term to describe the effort a campus makes to ensure
a student physically located near them is successful, regardless of whether the
student enrolls in a course or degree program at that campus.
 UA should focus on Accessibility = the percentage of our regional
communities in which we serve students per academic year.
 We need to measure students' goals somehow, to compare to their activity
and measure whether the goal was met.
 Instead of community maybe agency/partnerships. Not all communities have
the ability to give.
 Community contributions - $ and in-kind
 While we dropped 'transfers' as a topic of discussion, some may have
eliminated it because it implies moving from a community. But as more online
courses and degrees become available, 'transferring' may not involve moving
from your community. Our ability to provide a quality foundation for such
'transfers' may be important to measure.
 Community partnerships and economic support are important.
 UA should focus on service to UA students = measure support services:
advising, access equipment (internet, videoconf, audioconference, etc.),
classroom space in accessing distance courses, etc.
 A Lower 48 community college has advisors poll their students every fall to find
their goals. Maybe we could do something similar; however, does Banner have a
screen for this? It would be useful to see how students change their goals.
 Leveraged community resources measured in $$.
 It is important for each community campus and/or college to have engagement
with its community and region. How to measure the effectiveness, correct model,
or partnership level with local/regional education, government, organizations, or
industries would have to be determined. This is a critical piece of information.
 #341 look at the PPT slide handout in your packet. This is addressed.
 For some of the campuses, the number of sites (towns and villages) served
would be a very good metric to use. It takes a lot of effort, resources (and
creativity) to successfully serve a student who is the only college student in their
village. This would help support and promote outreach efforts and lead to more
students and more success. One successful student in a village often leads to
more.
 #348 excellent, it would be great to see how many go from a goal of one
course to improve skills to wanting to obtain a degree.
 Are we in agreement as to the end goal i.e. meeting the "community college
mission" so we are clear why we are proposing to measure anything regardless
of what it is?
 Successful 'graduates/completers' are an excellent indicator of effectiveness
in meeting community needs. How do we assess their 'success' beyond
graduation? Employment? wage progression? Continued education?
 Collecting and reporting student goals and whether the student goal has been
achieved might be more helpful to look at when considering persistence.
 More and more courses/certs/degrees are offered in summer (and not through
UAF Summer Sessions). Are these included in our analysis?
 We obviously need to discuss how we measure things and have it consistent
across MAUs and statewide.
 We should propose to measure that which is unique to what we do at the
community campuses to meet the community college mission - many of these
measurements feel like indicators to the successful achievement of existing
metrics that are measured at the MAU level
 It seems that much of our discussion of the other metrics is about how they
don't capture a certain group of our students, those whose goals are not easily
measurable by university standards (non-degree seeking, just looking for
additional job skills etc.). Maybe instead of modifying all the other metrics to
reflect these students we should add a separate metric just for these students.
 CEU's!!! CME's etc. Also how many sites we serve (often with only one
student there).
 #322 - McClenney had some measures related to employment that we may
want to go back to.
 Non-credit activity, etc, if not added into an enrollment metric.
 Course facilitation--could be as simple as counting all students enrolled in any
credit/non-credit course for the total # enrolled, counting students enrolled in
programs that are owned by your campus, and counting students enrolled in
programs other than your campus's programs. A percentage of how many students
of the whole are enrolled in each of these two areas could also be used. For example:
-1005 students enrolled in credit and non-credit courses
-503 students enrolled in courses "owned" by your campus
-502 students enrolled at your campus in courses "owned" by another campus
-50.05% of the students taking classes at your campus are taking your campus's
courses
-49.95% of the students taking classes at your campus are taking another
campus's courses
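The enrollment-split example above is simple arithmetic: note that 503/1005 and 502/1005 are each roughly 50%, so the shares should be computed rather than estimated. A minimal sketch using the same hypothetical figures:

```python
# Reworks the course-facilitation split above: shares of total enrollment
# owned by this campus vs. brokered from another campus. Figures are the
# hypothetical ones from the example, not real enrollment data.

total = 1005      # all students in credit and non-credit courses
own = 503         # enrolled in courses "owned" by your campus
brokered = 502    # enrolled in courses "owned" by another campus

assert own + brokered == total   # the two categories partition the total

own_share = own / total
brokered_share = brokered / total

print(round(own_share * 100, 2), round(brokered_share * 100, 2))  # 50.05 49.95
```

Keeping the two categories as an exact partition of the total (checked by the assertion) is what makes the percentages meaningful across campuses.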
3. Course-sharing Benefits
1. What tangible value or advantage might accrue from a DE course
sharing initiative?
 Credit and SCH for students in DE from other campuses whom my campus
provides support services for, and tuition for classes where we're providing the
instructor costs.
 Students get access to more courses that might better fit their schedules.
 Utilize faculty expertise at other campuses, save money by increasing
enrollments.
 More collaboration amongst campuses that benefits students.
 Courses will run with full seats and be more cost effective.
Courses that run in a cost effective way can help keep costs
down to students and may ensure more offerings of the course
for others.
 faculty work loads are more secure
 the ability to provide faculty in hard-to-fill positions
 students get courses they need that a home campus can't offer.
 Student accessibility to courses they could not get at the campus otherwise.
 Students will be able to access courses that they otherwise would not be able
to take.
 Campuses can get SCH and credit for our metrics unlike we do now when we
support other campuses' students.
 Provide students with courses when we do not have resources.
 Increase student head count numbers in courses that run at less than capacity,
resulting in an increase in faculty productivity.
 This approach allows for students to vote their favorite style of taking a course
with their registration.
 ability to offer courses for "travelling" students, e.g., deployed military, fishing
personnel, etc.
 more availability of courses from across the MAUs
 increase the ability to offer diversity of offerings
 payoff is in diversity of offerings
 Sharing courses across MAUs through AK ICE will increase options for
students and make those offerings more visible and accessible to students.
 Students will have more course options readily available to them.
 There would be a more heterogeneous group of students to work with in a
classroom setting.
 Course sharing, with the proposed plan to credit SCH, is one way to give
campuses an incentive to enroll students outside their campus. Currently,
encouraging students to do this can be viewed as 'harming' the campus; however,
this is not the case!
 Faculty are "learning" beings and are energized by the challenge of creating
something new; may reduce burnout
 Student can keep their connection to the local campus thus making it easier
for them to navigate the university bureaucracy.
 A payoff of this discussion is to continue troubleshooting cross-MAU delivery
issues and to then develop an elegant, student-friendly system for course
sharing.
 more accountability
 campus can increase their SCH though AK ICE
 Students and professor benefit from a more diverse class
 Can use faculty expertise no matter where located.
 UAF/CRCD uses an 80/20 split in tuition dollars. This supports the real costs
of serving as a brokering campus (student services, space, heat). We should
consider this resource-sharing approach as one potential advantage.
 Assurance of consistent information transfer to students
 less duplication of effort
1.1. System more responsive to student needs leading to
more and more-satisfied students
 Fewer courses canceled because of insufficient enrollment
 Full classrooms mean more diversity, better learning
 Better student access to a variety of courses
 quicker time to degree
1.1.1. Improved allegiance to campus leading
to higher retention
1.2. Improved public relations
 More positive public comments
 increased perception of accountability to the public
1.3. Increasing demonstrated campus productivity / improved metrics to statewide
 save one class
 improve faculty productivity by one student
1.4. Improved bottom line
 New source of revenue / Revenue sharing provides
incentive to cooperate
 Incentive to cooperate / reward for effort
 at least break even on the pilot project
1.5. Broader use of faculty expertise
 Easier to fill hard-to-fill positions
4. Course Sharing Challenges
1. What challenges will be raised by joining a course-sharing initiative?
 Lack of faculty expertise in distance delivery mechanisms
 Lack of faculty willing to teach distance.
 Students will have to have clear instructions, advising and logistical support
for books and IT support.
 Extra time is needed to collaborate; people are already busy
 Students may take something that their program really doesn't accept.
Students must work with their advisors as with any course to
ensure the course is acceptable.
 More reliance on scarce faculty training opportunities
 Different LMSs that require a learning curve for students and passwords for
access, creating multiple technology support needs and potential confusion for
students as to where to get help
Blackboard is a learning management system (LMS)
 Concerns raised by SAC/Provost Stell
 Different calendars and tuition/fees schedules
 Technical issues
 Communication needs to be clear, concise to avoid student confusion
 procedures for booksales that are consistent
Tuition differs between campuses
Receiving campus does not get a portion of the tuition
increased support staff needed
 Need to examine articulation of courses before they get requested for my
campus
 Provosts' reluctance to try something new. Erection of barriers that may not
really exist.
 Cost-sharing will have to be uniform and consistent.
 Faculty not cooperating or collaborating for articulation between MAUs
 That whole book/supply issue!
 Especially initially, there will be an increase in student support services
needed at the receiving campus as students deal with issues such as book
mailings, contacting instructors, technical support, etc.
 Mis-estimating the number of seats needed in a course
 This isn't a one-size-fits-all solution for crediting campuses for helping
students who are physically located on their site.
 Possible reduction of face-to-face enrollments on campus as a result of
increased distance offerings.
 one Blackboard shell needs to be created, not one per site
 MAUs would need to continue dialogues to ensure seamless delivery.
 Student access to blackboard
 Possible loss of revenue.
 Different Blackboard systems depending on MAUs
 Confusion for students about where to get help
 Even if we can't get approval to do this across the SW system, perhaps the
respective MAUs could do this on their own like UAA has.
 Cumulative obstacles for students when working across MAUs: books,
computer/Blackboard access, differing fees, differing schedules. No one is an
overwhelming problem, but taken together they can be daunting.
 There is little incentive for a campus that doesn't have a problem filling up
courses to deliver courses in a course-sharing program. They already have all
the tuition and SCH.
Then they don't need to play. There would be no requirement
for every campus to do this.
 a move to require a change in Banner. This would effectively end course
sharing.
 different start dates
 Campuses wanting to offer the same course.
 different tuitions
 Possible loss of SCH where none exists now
 mailing of textbooks
 Uaonline needs to be able to identify the specific campus the class is coming
from; it doesn't now.
 The initial use should be with full-time faculty to benefit workload issue.
But frequently campuses don't have FT faculty that are willing
to teach distance.
 we could get a much bigger black eye if we say we are open for business and
promise the students courses that don't count toward their program. Need a
system that is planned and creates order rather than making it more complex.
It is incumbent upon the students to work closely with their
advisors to ensure a specific course will work with their
program.
 Inconsistencies between MAUs in program requirements
 coordinate the deadline for advertising classes
 Could create an additional layer of paperwork etc.
 AK Ice requires duplicate data entry in Banner and in AK Ice - not a
significant resource use for a pilot program, but significant if the pilot is expanded
later.
 Challenge to be brave for the long view and avoid the short-sightedness that
occurs when a new idea is broached
 Strategic planning for new hires across regions may be impacted as course
sharing may make location of faculty less a factor of course delivery.
 Fear of loss - money, SCH, HC, autonomy, control etc.....
 ICE classes should not be limited to just degree-related classes
 Should different MAUs have agreements in place before doing this to avoid
misconceptions about course/program expectations?
 If other MAUs can't participate, can the UAA MAU still continue?
 Possible impacts on cross-regional schedule and systems already in
existence. For those that already have a system we need to look at how or why
we might want to change it.
 Need well-trained staff (advisors and registrar) to minimize glitches.
 For videoconference classes, not all campuses may have space.
 If retention and recruitment is not increased this will only be a redistribution of
existing students at the expense of other community campuses or the MAU.
 Cross-MAU course sharing should only be done after academic units have
worked out the exchange.
 Instructors/faculty will need a single course list for students for grading and
other academic functions.
 Need more bandwidth!!!
1.1. Faculty willingness, expertise, time, and resources to
engage in DE
1.2. Lack of resources to do DE
1.3. Differences of policy and practice and student outcomes
across MAUs
1.4. Getting information about best practices and lessons
learned from past pilot projects
1.5. Differences of technical infrastructure for DE across
campuses
 E.g. Blackboard
1.5.1. Adequate facilities at supporting
campuses
1.6. Student services -- sufficient resources and
communication lines to let students know about it.
1.7. Differences of academic schedules across units
1.8. Differences of tuition rates and fees
1.9. Providing books and supplies
1.10. Supporting students with disabilities
1.11. Adequate planning for all the details involved
1.12. Local, knowledgeable student support
1.13. Student financial logistics
1.14. Coordination to assure non-duplication of courses
1.15. Deadline coordination
1.16. Getting permission from SAC and reporting results
5. Pilot Project
1. Pilot Project
1.1. Case Study: Planning, Data Collection, Analysis
 Measuring of faculty production needs to be worked out with
SWIR.
 We need to do structured planning for the case-study. That
means we need to have some communication among the
stakeholders. I would be happy to help organize the
communication channels. Curt
 Courses must already be approved for transfer across the
SW system.
 I would like to see a pilot outside of the MAU - a cross MAU
course sharing pilot would truly identify the issues that have
been mentioned
 It seems that we've already had some examples that could
be used as limited case studies: AK Ice and CRCD's cross-regional schedule.
 UAF CRCD Health Programs would be interested in being a
pilot along with KUC & CC, IAC campuses.
 Why not have an inventory of all current course-sharing
mechanisms in UA? To include AK ICE, CRCD cross-regional, and UAS distance
course delivery. That would provide a wide array of ideas to work from in taking
next steps.
 Perhaps HLTH programs across MAUs would be a good
pilot.
 Ample input by registrars is critical to success; as you may
have divined, there are diverse opinions, so your sample
should reflect the diversity.
 Systems in UA that are already working as far as sharing
courses - with the addition of statistics produced by Gwen's
office - credit hour and head count can be based on service
campus as well as course (currently done) and academic
(where the degree is held).
 An inventory is one thing, but as many would agree, we
need to take action. The only way we are going to find out what
works is to try.
 Create for us a detailed checklist. Like a jet taking off, we
want to avoid problems by making sure we each have gone
through this checklist.
 Sharing single sections between campuses as opposed to
multiple sections
 Donna Schaad collected data to present results of the
AK Ice pilot at UAA. At least she requested data which she
indicated she was using for a final report. I haven't seen it but
believe it exists. She's now associated with WICHE and available
via email. Academic Affairs at UAA is still working with her on a
project.
 ECE and ITS programs may also be invited as they are
statewide programs.
 The courses selected for a pilot should be program driven
 Perhaps we could go through a trial course on AK ICE but
use Banner training, not production, to see how things worked
out?
 we need case studies of the AK ICE and CRCD cross-regional
system to identify the best practices of each for inside MAU
course sharing. Outside MAU course sharing is another issue
and should have a new pilot project implemented with a
statewide program like Health Programs.
 Perhaps develop this as stand-alone projects at each MAU
and then determine how to bring them together to course share
on a statewide basis
 There are two programs besides health that are already
cross-MAU - ITS and Networking - maybe one of those would
be a good pilot??
 We need to do a pilot program to see what courses would
work for different campuses. We need to have good planning
and good communication with the different MAUs
 I think we need to remember change is good and think
about what is best for the students and the campus.
 The need for serving students and credit hours for purposes
of PBB, and the need for $ to support those students, is very
important for all campuses.
Define: Service Campus, Academic Campus, Course Campus
Don't really understand what you are looking
for. With the UAA model, there is an
originating campus (who is providing the
instruction) and receiving campuses (any
campus that has students enrolled in the
course that aren't assigned to the originating
campus).
 Yes, let's make sure we don't reinvent processes that already
work. Some good ideas are here.
 The selected courses don't need to be program driven.
What would be wrong with doing GERs like COMM 111, ENG
111, MATH 105? Being program driven invites other issues;
GERs would be easier to start with, and transferability issues are
lessened.
 Exploring challenges with student advising, support services,
and fee payment logistics would inform the solutions to cross-MAU
registrations, as well as how students access their
courses and books.
 Begin with GER classes and not special topics as a pilot
project across institutions
 The advantage of AK ICE is that it acts as a repository of
information (collaboration, decisions, contacts... details, details,
details). Whatever system we end up piloting should include
some similar resource. The current system used by UAF rural sites
is dependent on a few people remembering a long laundry list
of conventions, contacts, procedures... details, details, details.
What seems simple to the people who use it all the time is
mind-boggling to someone who plans to use it occasionally
 #1893 is the core question that came up yesterday for
discussion that we tabled. This clearly needs to be discussed
some time soon.
 We need to also look to the future to determine what might
be the best course of action. I already have a great deal of
students going outside my MAU to take courses. I think this trend
will only increase - so for us, AK ICE or another mechanism
would be good.
 An Excel spreadsheet built that records which classes have
been offered through AK ICE, which campuses participated in
each course offered, who was the offering campus and who
was the receiving campus, with numbers recorded for each
participating campus -- under comments should be listed
specific problems or needs associated with the course.
Also should show if the number of seats originally 'bought'
were all used or if they were 'sold back'.
Would be good to list what campus need was fulfilled by both
the offering and receiving campus because of this course
offering.
 Keep the pilot limited to programs not just courses
 Students are our top priority. How do we make it easiest for
them to take courses (including receiving books, audio
information, etc.)?
 Course fees need to be reviewed across UA when looking
at this pilot project for DE classes
 1902: if limited to just program, many students we serve will
be left out.
 I think in the pilot we should also include a course like CDEs
that already has books and fees attached - how will we deal
with that - we need to figure it out and a pilot is the best way.
 How can we work across MAUs to support students in
registering for any course that they need for a degree program
that may or may not be held at our MAU?
 Where are the cookies?
 #1895 and 1897 - GERs work well. They have been an
integral part of the UAA AK ICE offerings to date.
 Begin with programs in mind... very helpful to have faculty
working together... better buy-in. For instance, in the CIOS
area, there is already good collaboration and many shared
courses. This program has a tremendous number of courses,
and there is new technology (applications) coming out all the
time. It becomes impossible for one campus to offer a full
assortment, but if programs at different campuses work
together, they can alternate what they teach. This would be a
win-win for faculty as well as students.
 We would not be here if it wasn't for our students. We need
to provide them the best possible service we can.
 Holly and Jennie C. want to play.
 Credit is given to the campus that is deserving - course,
servicing and academic
 1895: GERs are not what many campuses need for DE;
they already offer face-to-face GERs. No need to limit the pilot to
GERs and programs.
 Ruth would like to be a part of this discussion.
(To harass Holly and Jennie.)
 Sandy Gravley at MSC. I'm willing to participate.
 Jennifer M. will play too.
 re: 1898, that does not mean that the system is not worth
looking at, just that the information is not formalized the way
AK ICE is. I haven't seen yet that AK ICE will be able to handle
things like our cross-regional schedule, so we need to look at
all systems.
 Colleen A wants to play
 KPC would be willing to play.
 Funds allocated to support the CRCD schedule of classes
with the details of other MAU courses and their delivery
systems
 All the staff and faculty at Health programs wants to play!
 Kim @ CDE will play.....
 Even AK Ice courses have a naming convention for section
numbers. That is not unique to just the UAF and UAS cross-regional
and statewide offerings.
 Marketing efforts for student friendly schedules
 will the pilot project reflect the current ICE set-up where
SCH/HC goes to the receiving campus by having a section created
at the campus, or will an 80-20 split of tuition occur like UAF
already does for their classes?
6. Meeting Evaluation
1. What advantages do you see to having these meetings at community
campuses?
 The location helps create context to the issues as they relate to community
campuses.
 Having these meetings at community campuses is valuable as we get a
chance to see other campuses and learn about how they operate and experience
what our students do when they take a course delivered to another community
campus
 Thank you all for coming and thank you Karen for supporting this meeting.
 These meetings are excellent opportunities to share information, develop
ideas, and bring forward proposals that reflect the needs of our communities and
students.
 Community campus meetings are very valuable, and, as much as the directors
can tolerate, SW people should attend since we learn a whole lot.
 cross coordination efforts on system wide measures
 Meeting at a community campus has been helpful because many of us are
coming from community campuses and rarely have the chance to see how other
campuses like us operate. (Thanks Cathy for the tour).
 It's valuable to see other communities, and to give campuses a chance to
shine.
 None, since I did not even get to see the Ketchikan Campus or know what is
available to students.
 This was very valuable as it showed us another unique community campus
environment. We share some things as community campuses, but we also
respond to the needs of our different regions in different ways and it is very
helpful to experience a bit of each campus and region.
 There is something visceral about walking in the shoes of people who live in
the rural areas of Alaska. We should not lose sight of what life is like for people
outside Urban Alaska.
 It is always interesting to see a community and how a local campus serves
that community, thanks to UAS Ketchikan for sharing! It is also nice to be hosted!
 Have an opportunity to learn about campuses' sites and programs; not feel so
isolated. Also have some best practices affirmed.
 It was very beneficial to have the meeting at the community campuses as we
had the opportunity to see first hand the campus and resources.
 It is good when we come together and see what other campuses are doing to
help better serve our students, and to get different ideas.
 It was good to have the registrars included in this meeting. Even though the
topic was mainly directed to community campuses the main campus registrars
are responsible for the registration process at all their campuses and need to be
included in this type of a discussion. Thank you for including them.
 Meeting at different community campuses gives us a chance to see how they
operate, as well as put a face to people we have been working with over the years.
 It is important we learn more about our system of campuses so we can learn
about their challenges, exchange info, best practices, etc.
 RE: Ruth's comment - I think additional time to see more of the campus and
the area would have been helpful.
 Since we provide so many distance courses, it is not unusual for students
from around the state to enroll in one. It is beneficial to see the communities
from which our students come.
 Communication increases understanding and meeting at a home campus is a
great form of communication and increased understanding across the UA
system.
 To identify commonalities.
 Such meetings increase the camaraderie we should have. This group has more
in common with each other than they do with the deans at their MAUs.
 It is important to see the communities that other campus are located in and
the issues they deal with on a regular basis. It helps us to understand the needs
as expressed by these campuses in discussions.
 The discussions have been interesting and eye-opening. I do think that all the
current processes should have been discussed equally rather than the focus
being on AK Ice which is relatively new.
 I always find that we gain understanding as to the needs and or provisions
that other campuses have that can interact with our needs and or provisions
which makes for a great state-wide support system that benefits all of our
students. The fact that we are face to face, strengthens this bond.
 Some of our campuses are so small that if we had a meeting at our campus,
we would have to hold it at a different location. We would still love to see other
MAUs come to our campus and see how we operate.
 These meetings allow an opportunity to network with others in a more intimate
way than a video or phone conference would allow, and to see facilities. Great to
put names to faces (and voices!) The networking opportunities frequently go
beyond the topic and help us work better together and individually.
 Seeing the communities that we serve and the conditions that affect students
and services in those communities helps promote commonalities, shared
challenges, and a sense of shared responsibilities for the students of our unique
and dynamic state.
 Finding ways of improving opportunities for our students seems to be the most
important thing to us, which is helpful when working on solutions to our mutual
benefit. I'd expect a lot of progress to be made from more meetings like this one,
possibly in ways that could influence main campus relationships.
 The advantages are that we get to see what is at other campuses.
 It's good to come together and meet to discuss all the different issues.
 Thank you Curt for inviting us to this meeting. It would have been nice to see
the campus.
 This was the first time I was in the same room with the Campus Directors from
across the UA system.
2. What did we do that you liked?
 Using the collaborative software facilitated focus on issues and more efficient
use of time toward objectives.
 I think the group made HUGE strides on community campus metrics - this
aspect was invaluable.
 This collaborative software was a great way to capture many thoughts in a
short amount of time.
 The collaborative software and the process that went with it was great.
 Raised difficult issues in a facilitated environment that allowed for open
discussion.
 The meeting software has been helpful to cover some big topics and involve
everyone in a shorter period of time!
 The software allowed the topics and thoughts of others to be
available after the meeting. It was difficult in some ways not to hear what
others thought, since time was short for reading.
 It was good to get involved in a metric discussion prior to discussing how to
affect the metric on an individual campus.
 Great progress on metrics and rich discussion on course-sharing. The
technology overall was a useful aid. Thanks all!
 We liked the software that was used to support the meeting and
communication between attendees.
 Everyone pretty much had a chance for input
 The software was empowering and a time saver.
 I like the collaborative software, but do think that we could have used
additional time to discuss our work.
 The electronic input mechanism creates an excellent method for documenting
comments.
 Issues that were raised seemed to be ones where we are putting the cart
before the horse.
 Having a good facilitator, great tech support, and new communication software
to share really helped us work through a great deal of "touchy" information!
 I liked having Gwen here - she provided valuable insight into IR processes and
kept us clear on what data is collected and whether collecting the data we
want/need was feasible.
 I liked the fact that many of our comments were documented via this
software. Hopefully this information will be useful to the BOR and president.
 Because of the collaboration software we were able to move through an
agenda that would have taken twice as long.
 Since time was short, the Group Systems software for collaboration was an
excellent tool for allowing everyone to add comments that might not have been
heard due to time. It was also nice to break out the registration personnel to
discuss our concerns and bring them back to the directors.
 Involved student services/registrars.
 The software helped us to get many more ideas on the table than we would
have otherwise. This should be helpful in some of our future discussions.
 Anonymity is sometimes a good thing. Those that are not as willing to
verbalize their thoughts still had the chance to "voice" their opinions.
 good progress on metrics. Focused on strengths.
 Setting a specific purpose and outcomes, plus having someone to facilitate our
efforts. The IT setup certainly helped.
 I did like being introduced to this tool that helps everyone communicate and
be a part of the discussion on an equal and confidential basis.
 Liked the software format... great hospitality. Great organization.
 This meeting was informative. It gav
 Provided an excellent facilitator and a new tool for collaboration.
 Getting all the players around the table at once. The joint discussions were
really good and I like the Group Systems format for gathering information.
 I love the GS Meeting software for processing "big picture" topics and getting
feedback from all participants. Brilliant!
 The opportunity to meet and discuss issues that could truly improve student
access and campus standing with others with the same concerns, especially with
other "registrars".
 The overall topics were not easy to cover, but this meeting covered parts that
will help in making overall decisions that will help the community campuses (I
hope).
 The gathering together of the campus directors and the registrars so we could
discuss things in the relatively infant stage together. This gives the registrars a
heads-up about things that are coming in the future so they are not surprised when
they are tasked with making them happen.
3. What did we do that you would do differently in the future?
 Regarding AK Ice and UAF's distance delivery process, some background
information and evaluation already prepared would have sped the process.
This was my message written before the name tag was turned
on.
 More attention to making everyone comfortable with the meeting management
tool. I can see that this tool is difficult for some people, so working in more
discussion without the tool would be beneficial to accommodate everyone's style.
 Better up-front planning and dissemination of information/goals/expectations so
that the participants can be better prepared when they come to provide support
for the dialogue.
 Suggest that we spend a little more time introducing people at the get-go.
 Not try to take on two major change topics at a single two-day meeting.
 Include a campus tour.
 Send more information in advance to participants so that they can gather
relevant materials prior to attending.
 I would have had a microphone and speaker system
 Have a clear agenda for all participants involved Directors and Registrars.
 More information in advance for us to prepare from would be helpful.
 Ensure that we have plenty of time for discussion and diverse perspectives in
addition to typing comments into the computer.
 rearrange the room so we can all be involved in the discussion
 I would have liked to have a packet to read before we got here. Also, logistical
information was slow in getting out.
 Move Ketchikan closer to home :-) And have the travel arrangements clearer
from the beginning.
 It would be nice if the meeting was held in a place that did not have so much
flying up and down :o)
 It would be a cost savings if all were brought to Anchorage campus
 Having time set aside for Director's to talk about their group meeting that took
place on the afternoon of the first day.
 Tour host campus.
There was one given on Sunday evening.
 More time for discussion. I felt like we did the preliminary work, but weren't
able to refine our ideas. I'm sure others will do that later, but I would be more
comfortable if the group was able to clarify their thinking as a group instead of
relying on others.
 Have a clear agenda, and more information in advance to prepare us. I would
also add a wireless mic if needed.
 Give more of an overview before starting into an agenda item. Preview the
process and the end goals.
 Given as much attention, time and coverage to course sharing that is currently
happening in CRCD/UAF and discussed why UAA doesn't do the same. Are we
looking for a fix to something that may not be broken?
 I have poor hearing and it was difficult to hear some of the speakers with no
microphone.
 Develop and articulate outcomes and expectations more clearly. Introducing
new technology in combination with a loosely managed meeting created
unnecessary tension within the space. Small group meetings should have
reconfigured space to accommodate different atmospherics.
 The campus directors need time to meet alone without SW or MAU
administration.
 Provide a list of the 5-6 ICE courses being offered this semester so the other
campuses might have a fuller context of the diverse courses.
 It would have been good to have a brief demo or dry run with the software so
that everyone had a similar understanding of how it worked.
 The two complex topics (Metrics and AK ICE) should maybe have each had their
own time. It might have been an ambitious agenda for the time allowed. It was
a little tight for development, but all worked out!
 Take a vote on where individuals are on a subject; have people for or against
convince others of the advantages and disadvantages of the subject.
 Have a site for questions that arise during meeting and go through the
questions before moving on.
 While we have the record of what was typed into Group Systems, it would be
nice to have a recording, since many great ideas were vocalized.
4. What else would you like to do that we have not yet done?
 Keep the agenda open enough so we can look around the campus.
 I would have liked to have toured the Ketchikan campus and heard a little
more about them.
It wasn't clear to me that we were coming to this meeting to consider another
pilot project. I guess this is the way things move along (slowly) but we could have
had the goals clarified better before we arrived.
 I would like to learn more about how other MAU's work through some of the
issues presented in this two-day session.
 Additional communication between the Registrars and the Directors regarding
AK ICE and its process.
 Share the day's efforts more (we did so better today than yesterday). Have
some real closure on some of the LONG outstanding issues.
 Creating a statement of the importance of solving course sharing across
MAUs would emphasize the work completed to this point.
 See more of Ketchikan campus and Ketchikan. Also, more sharing of
information about how each campus, MAU operates. A short presentation from
each at the beginning would have been helpful. We community campuses still
don't know each other well enough to dive in without the intros and information
sharing.
 The first day seemed like we were swimming and not getting anywhere. I felt
that the goals were not clearly defined, allowing us to not always stay on topic,
which was important in such a small time frame. Today seemed more focused.
 More time for networking.
 Let's set a tentative date and location for our next meeting (February?)
 Break into smaller activity groups that mix the campuses so that we don't
appear to be one MAU or method pitted against another. We all have valuable
things to share and we need to remain open to what is available and why one
process may work better for one than for another. If we were to break into
smaller "mixed" groups, then more relationships can be built which bridge the
communications gap.
 Seemed like we touched on 2 big topics and could use a little more time to
assign tasks so we can keep the momentum going once we leave.
 Do more of the preliminary work via audioconference / Elluminate and save
the actual meeting for producing final products -- consider a state-wide audio
conference for spring meeting.
 If we are successfully able to use AK ICE, I'd like to see the Chancellors
require the MAU's to support us with courses that none of us can provide... the
nursing program is a fantastic example of this good support. (Linc)
 Thank Curt and the CDE crew for their work on bringing us all together - on
multiple levels. Thank Bob Briggs for shepherding the process with patience and
perseverance. Thank Cathy for the Sunday boat ride. Compliment our registrars
for their persistence, problem identification, and valuable participation.
Congratulate everyone for the excellent work products and progress on important
issues that this meeting produced. These statewide meetings have become
productive, substantive, and invaluable to the quality of UA educational
programs.
 I would like to see a list of AK ICE courses that have been (or are being)
offered and how they have fared.
 Perhaps a more directly related performance-based measurement vs. course-sharing connection.
 A date by which campuses offering ICE classes next semester would confirm
which courses will be offered.
 Thank Gwen for her help!!
 Thanks to UA for support.
Step Name: Rate Metrics
Tool Name: Alternative Analysis
Ballot Details
Ballot Items
1. Developmental Success (Developmental Completion Rates)
2. Successful Course Completion (Course completion rates)
3. Basic Overall Persistence/Success (Persistence rates)
4. Basic Completion (# of students who earn a recognized credential)
5. Transfer (Transfer rates)
Criteria
1. Yours
2. Statewide
Ballot
                                                                    Criteria
Ballot Items                                                        Yours              Statewide
Polling Method:                                                     Numeric Scale [1]  Numeric Scale [2]
Allow Bypass:                                                       Yes                Yes
Developmental Success (Developmental Completion Rates)
Successful Course Completion (Course completion rates)
Basic Overall Persistence/Success (Persistence rates)
Basic Completion (# of students who earn a recognized credential)
Transfer (Transfer rates)
Polling Guidelines:
1. Yours: How important for your campus? Please rate each metric on a scale from 1 to 5.
   1 = Very unimportant as an indicator of productivity for my campus
   5 = Very important as an indicator of productivity for my campus
2. Statewide: Please rate each metric on a scale from 1 to 5.
   1 = Very unimportant that this metric be implemented statewide
   5 = Very important that this metric be implemented statewide
Polling Details
Cell Statistic: mean
Number of Ballots Cast: 23
Number of Ballots Abstained: 1
Polling Results Table
Cell Statistic: mean
Polling Method: Numeric Scale (both criteria); Allow Bypass: Yes

Ballot Items                                                        Yours      Statewide
Developmental Success (Developmental Completion Rates)              3.850000   3.590909
Successful Course Completion (Course completion rates)              4.550000   4.136364
Basic Overall Persistence/Success (Persistence rates)               3.950000   3.909091
Basic Completion (# of students who earn a recognized credential)   3.450000   3.954545
Transfer (Transfer rates)                                           2.850000   2.863636
Vote Spreads
Column: Yours
Polling Method: Numeric Scale
Allow Bypass: Yes
Number of People Who Polled Each Value (no one polled values 6-10)

Ballot Items                                                        1   2   3   4   5    mean
Developmental Success (Developmental Completion Rates)              1   3   3   7   7    3.850000
Successful Course Completion (Course completion rates)              0   0   3   3   14   4.550000
Basic Overall Persistence/Success (Persistence rates)               0   2   4   7   7    3.950000
Basic Completion (# of students who earn a recognized credential)   1   3   7   4   5    3.450000
Transfer (Transfer rates)                                           3   7   4   2   4    2.850000
Column: Statewide
Polling Method: Numeric Scale
Allow Bypass: Yes
Number of People Who Polled Each Value (no one polled values 6-10)

Ballot Items                                                        1   2   3   4   5    mean
Developmental Success (Developmental Completion Rates)              3   1   7   6   6    3.590909
Successful Course Completion (Course completion rates)              1   1   4   4   12   4.136364
Basic Overall Persistence/Success (Persistence rates)               0   2   6   6   8    3.909091
Basic Completion (# of students who earn a recognized credential)   1   1   5   6   9    3.954545
Transfer (Transfer rates)                                           4   5   6   4   3    2.863636
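For reference, the "mean" cell statistic is simply the vote-weighted average of a spread row. A minimal sketch (the function name is mine, not the polling tool's):

```python
# Each spread row lists how many people polled each value on the 1-5 scale
# (values 6-10 received no votes). The mean is the vote-weighted average.

def spread_mean(counts, scale_start=1):
    """Weighted mean of a vote spread: counts[i] people chose value scale_start + i."""
    total_votes = sum(counts)
    weighted = sum(c * (scale_start + i) for i, c in enumerate(counts))
    return weighted / total_votes

# "Statewide" spread for Successful Course Completion: 1, 1, 4, 4, 12 voters
# chose values 1-5 respectively, giving 91 / 22.
print(round(spread_mean([1, 1, 4, 4, 12]), 6))  # 4.136364
```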
Step Name: Discuss Metrics
Tool Name: Categorizer
1. New Metrics
1. Successful Course Completion (Course completion rates)
 Successful completion is not just whether they passed a course, but whether
they were also successful at the next level, i.e., they passed DEVM 060 and then
did well in DEVM F105.
 This is an important metric for the community campuses, based on
assessment of C or better.
 Would need to have percentage of total enrolled students who successfully
complete with C or better.
 Course completion should be recognized as any completion other than
incomplete or failure.
 This is an important internal measure of class/program success; however, it
may seem to be something that should just be a given level of expectation by
external stakeholders. Doesn't seem like a substantive "snapshot" measure.
 What about DEVM F105, in order to go on to MATH F107 you now have to
earn a B. But C is normally considered passing . . .
 This is an instructional evaluation that may not include the other considerations
of distance education, like student services and academic advising.
 Provides feedback on correct placement of students.
 Success often depends upon a good DEV foundation... success usually
equals success in persistence, which often equals "graduation".
 Successful course completion is largely dependent on initial placement
practices. We can't expect professors to teach students who aren't adequately
prepared. There need to be consistent placement practices and required
prerequisites.
 Many community campus students start by taking one course and may not
know if they will take another until after they have completed the course.
Successful course completion rate is a good metric to start with because it will
help us look at placement and advising practices.
 I think a student is successful if they pass a class and go on to the next level.
 We need to be mindful about whom these measures will be used to report to:
the community, the legislature, and internally. For example, completion in a course
vs. completion in a series of courses is an internal measure of success, not one
that would necessarily be useful to the public or productive to report to the
public.
 What is the desired trend for this metric?
 I think this includes dev. courses, so it may not be necessary to have separate
Dev. metric.
 Is this referring to just classes for credit? If so, it should be stated.
 I think this is not a good measure of student success unless the student's only
goal is to take a single course.
 My only concern is UAF DEVM F105, a B is required to go on to MATH F107.
 Completion of a course also happens if the student gets an F. It is just not a
"successful" completion. However, it does mean that the student is finished with
the course. The concern is that some students withdraw or get an incomplete.
Those are the students that don't complete.
If a student fails a course, they did not complete the course.
They did not receive credit and will have to repeat the course
and receive the required grade (above the F) in order for it to
apply to a degree program.
F stands for fail, not completion.
 This is a basic measurement that we need. However, it cannot stand alone.
Other metrics that measure students success at moving on also need to be
considered. We need to remember that no metric will capture everything and
even if this is an incomplete measurement it is one piece of the puzzle.
 D's get degrees, I agree. Being below average isn't failure; F's and
Incompletes are failures until course requirements are successfully completed.
Course assessments shouldn't drive what is considered student success.
 Academic Program Planning (PBB measure) also addresses this issue
indirectly.
 A "C" should be a minimal standard for successful student completion.
 The President recommended the perspective that we are serving students and
should view these metrics from their perspective.
 2009 This seems to be an easy one to measure so we should start as soon as
possible.
 May want to add to the metric title that it is Successful CREDIT Course
Completion, since many campuses also offer non-credit courses and workshops.
 #302 Then from that perspective we need to start asking each student what
their goal is. I think we would be surprised that many students, while enrolling in a
degree program, do not always see a degree as their end goal.
 I agree that C is a minimum standard for success.
 "D" grades are considered successes for general education and core
requirements.
 There is a difference between completion and successful completion.
Completion would be defined as having received a grade other than W or NB.
Successful completion would be defined as having received a minimum grade of
'C' or pass. Incompletes would have to go into an unknown category. Perhaps
the metrics could be redone one year after the semester ends. The interesting
thing would be the amount of change in that year period. Withdrawals and No
Basis would not be included in the graded metrics. They could be included in the
attempted metric.
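The distinction drawn above between completion (any grade other than W or NB), successful completion (a minimum grade of C, or pass), and incompletes held in an unknown category could be sketched as follows. This is one commenter's proposed definition, not official UA policy, and the function name and grade codes are my own shorthand:

```python
# Classify a final grade per the proposed definition:
#   W = withdrawal, NB = no basis, I = incomplete, P = pass.
# W/NB are non-completions; I is unknown until resolved (re-run the
# metric a year after the semester ends); C or better (or P) is a
# successful completion; D/F completed the course but not successfully.

PASSING = {"A", "B", "C", "P"}  # minimum grade of C, or pass

def classify(grade):
    if grade in {"W", "NB"}:
        return "did not complete"
    if grade == "I":
        return "unknown until resolved"
    if grade in PASSING:
        return "successful completion"
    return "completion, not successful"  # D or F

print(classify("D"))  # completion, not successful
```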
 Successful course completion is a basic measure of what we do. It is
important to measure.
 Is successful course completion an indicator of basic overall persistence?
 #304 My interpretation is that this metric only applies to academic courses as
CEU (continuing education unit) and non-credit do not receive letter type or
pass/fail type grades.
 This one should be pretty easy to implement. 2008
 #312 Plus, we are not metrically measured on non-credit courses.
 Could this be combined with overall persistence? Do we need to establish a
clearer definition before we set out to measure this?
 AY09-Hopefully the same year as we implement a developmental education
metric since they are aligned similarly.
 2009 - this should be easy to add. We already know the grades of students.
We just need to know the time period for reporting so that we don't lose those
students who take incompletes.
 AY09 - Calculation: # of students completing individual courses at or above
the course-defined minimum level of success.
 This metric could easily start as soon as adopted since it appears to be
looking at a straightforward percentage or quantitative number. I'm curious about
the unintended consequences of grade inflation or reduced rigor of the
instruction.
 Someone asked what is this really measuring. If students are paying for
courses but not passing them (not receiving the credit they paid for) this is a
problem. There are many things that might cause this to happen, including
student preparation, but if a lot of students don't successfully complete our
courses, then we need to know and do something to change it.
 2011 So much of "true" successful course completion depends on proper
placement and advisement of students. Initial efforts to hold students to
strengthened prerequisite and testing measures take time to implement. There
may be initial resistance to implementing placement, so this takes time.
 Primary data is the successful completion of courses - incompletes are
secondary data that should not be put into this measure but should be further
investigated to get students to this measure
 #312 and 314 - good point. Best measure of campus productivity would
include ALL students enrolled in credit or non-credit.
 I am concerned that this metric could lead to grade inflation to ensure that
more students complete courses.
 AY 2009, since it is somewhat a measure now. I think we should also include
a measure for non-credit courses, since many offer those as well.
 Even though D's can be used in a degree program, the student must have a
2.0 overall GPA and a minimum grade of a 'C' in their major courses for the
majority of majors. More programs are beginning to require a minimum grade of
'C' in some of the core requirements.
 #330 - agreed
 I don't believe this will lead to grade inflation. Faculty are not going to do
something like that just to make the metric look better.
 The sooner the better, as it is evident there are many other issues that could be
addressed if this basic measure were implemented.
 2008 -2009.
 This is a good argument for moving to competency-based mastery for grading
rather than arbitrary grade systems.
 Many students fail to complete classes due to personal issues...these we can't
do much to prevent, but that shouldn't stop us from working on the issues that we
can do something about...that's why we need to measure course completion and
keep working on correct placement, good access to advising, early support for
students transitioning to college...and other support measures....over time
strengthening course completion will improve retention etc measures but at the
community campus level the focus is best at the course level. Students may
complete a program based out of somewhere else, but if they received most of
their background at a CC then that is a success.
 In regards to #296, an F is not a successful completion of a course.
 #355 neither is an INC
 With regard to non-credit and CEU, the assumption is that everyone enrolled
completes, since we post the grades automatically. All non-credit courses and
CEU courses are posted with a single grade appropriate to their level via a batch
process.
 #357 Incompletes are not completions, but can become completions if the
student finishes their work by a deadline. With F grades the student does not have
that option.
 What about Pass/Fail courses? A Pass should be included.
I agree with this one
1.1. Show-Stoppers
1.1.1. Completion within 1 semester? How can
we also count incompletes that are completed
within the following semester?
 What will we do about incompletes? Can
we find a way to alter the course completion
number after incompletes are done?
 What about Inc or DF? Will we be
consistently going back to see how these
students did? Because these are not
failures.
 What about Incomplete grades where a
student completes the class successfully up
to 12 months later?
1.1.2. Will this only apply to 100 level and
above courses?
 Should this apply to all courses, including
developmental?
1.1.3. What is this really measuring? What will
we do with this metric?
1.1.4. Should this be left to course
assessments and not use it as a metric at the
SW level? Are we duplicating effort?
1.1.5. What is a completion of a course? A-D,
depends on requirement of grade for next
level?
 What about no-basis grades?
 F should not be considered a completion
 Should D be considered a completion
when the student isn't ready for the next level?
 Maybe we need to look at Failed, did not
complete (W, I, DF), and passed. Exclude
audits.
 If you say a "D" is not successful
completion, then why not move to
Pass/Fail grades and eliminate "D"s
altogether? If you have the ability to give a
below-average grade that is not an "F,"
then that should be your cutoff, not "C."
1.1.6. What caveats can be put in place to
prevent grade inflation, which will certainly be
incentivized if this metric is implemented?
 Do you want to be held accountable for a
metric you can't impact?
1.1.7. I think there are two issues here. One is
successful course completion of a single
course, the other is successful completion of a
series of courses where we see that students
completion (grade) translates into mastery of
the information so that they can successfully
go on to the next course. The developmental
completion rates metric might get at this issue,
but it is not just a developmental issue.
 Degree progress is a way of looking at
this
 Consider concentrations also
2. Basic Overall Persistence/Success (Persistence rates)
 Needs-based financial aid is an important piece to this component.
 Is basic overall persistence linked to successful course completion?
 We could do better in identifying the few courses where most of our failures
occur and work to remedy those gatekeeper situations.
 We also need to think about what happens when agency funding goes away.
 Considering the enrollment patterns of our students, this seems critical.
 Institutional research can easily track student movement within the UA
system, however tracking students who transfer outside UA is less
comprehensive (in other words, we can find some but not all students who enroll
in another HEI after UA).
 This measure would allow us to track all those students who stop in and out
over time, since it is longitudinal in nature. In fact, it will allow us to take credit for
our many lifelong learners, who come back year after year for personal
enrichment.
 If some campuses don't offer a program, will they be penalized in the data
compilation, i.e., not included?
 Don't agree with the refinement comment. Most non-credit students do not
have a goal to go on and take a credit class; apples and oranges. A 3-hour
non-credit workshop in something has nothing to do with the credit programs.
There should be a metric, though, of how many enrolled in non-credit workshops.
Non-credit = CEUs and zero credit.
 How can we also include the notion of "stop out". Would we find a break
lowers the persistence rate? Would there be another way to count students who
are intermittently persistent?
 There is actually some overlap/redundancy between basic persistence and
basic completion.
 Full time students are such a small % of our student population it's probably
not the best metric to study
 We don't need to parse out the enrollment of a cohort. But it is important to
follow them through their career as a student in other institutions.
 Is there a way to count the non-credit student who is in ABE/GED/ESL
courses and persists to the next year to continue working on their goals, not
because they didn't complete, but because they have more to work on to gain
further competencies, which may or may not include earning a GED (more ESL
and literacy competencies)?
 AY10 or 11
 AY10 or 11 - refinements to the current retention metric can make the initial
approach to demonstrating student persistence, so this one could be delayed
while we work out the details.
 #370 - not all of us offer the ABE/GED program
 Also need to count non-degree seeking students. Many students take classes
without being admitted into a credential program.
 How about a measure of a student initiated goal for non-degree seeking
students? Would it be possible to track an individually developed goal?
 This metric could expand to those who are non-degree seeking.
 Maybe we need two non-degree-seeking types in Banner: those really
non-degree seeking, and those intending to seek a degree. It would require some
questioning or evaluating of the student before registering and then admitting them
as such.
 AY 2011 I think the second part (Potential Refinement) is another measure, not
part of the first. The first part seems tied to degree/credential seeking,
not to those large numbers we serve successfully in one or two courses. (They
may be getting jobs or promotions.) It is all successful.
 NDS and degree-seeking students can be rolled together for an overall
metric, but have to be split out for campuses to see how they are doing. It will be
hard to impact the NDS students, some of whom may not plan to take more than
a single course; these are then automatically counted as a non-success even
though they met their goal.
 Might want to take out the part specifying enrollment in a program. We need a
way to count and be measured on our non-degree seeking students who take
courses for life-long learning, job skills training, or other non-program specific
reasons. Our ability to retain these students over time is also a measure of
success.
 AY 2010
 Definition of persistence at the community campus level is so much more
than the traditional definition of persistence and should be identified and
measured.
 2010 or 2012. Recommend a stay of execution on this one until there is
consistency at the MAU level.
 I think we need some criteria, such as degree seeking, to form a meaningful
cohort. It's not reasonable to expect a non degree seeking student who's just
coming to us for a class to persist. We should focus on the students who come to
us with the stated goal that they want to persist, ie they want to work toward
degree or credential. Then, we should focus our persistence efforts on those
students.
 Maybe we need separate measures from the main campuses.
 The refinement implies that non-credit and CEU participants should be
expected to then enroll in a program. We should not expect that and be dinged if
they don't.
 There are some non credit programs that lead to a credential that should be
captured.
 Cohorts should be defined by Academic year to Academic year... not by any
one semester.
 #396 I agree, and those numbers should be rolled up to the MAU level, as at
the community campus the numbers are so small that they are almost insignificant.
 At our campus, we don't have many full-time students; a lot of students are
taking a course and don't really have an idea of what they want to do.
 Measurement - 1 cohort for when the student took their first academic course
and a second cohort for when they took their first non-credit/CEU course.
Persistence would be measured in each cohort separately.
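The two-cohort measurement suggested above (one cohort keyed to a student's first academic course, another to their first non-credit/CEU course, tracked separately) could be sketched roughly as follows. The record layout and function name are hypothetical illustrations, not actual Banner or IR data structures:

```python
# Hypothetical sketch: year-over-year persistence for a cohort defined by
# the academic year of a student's first course of a given type. A student
# "persists" here if they re-enroll in the following academic year.

def persistence_rate(cohort, enrollments, start_year):
    """Share of cohort students who re-enrolled in the year after start_year."""
    returned = {sid for (sid, year) in enrollments if year == start_year + 1}
    return len([sid for sid in cohort if sid in returned]) / len(cohort)

# Invented example: four students whose first credit course was in AY2009;
# two of them enrolled again in AY2010.
credit_cohort = ["s1", "s2", "s3", "s4"]
enrollments = [("s1", 2010), ("s3", 2010), ("s4", 2011)]
print(persistence_rate(credit_cohort, enrollments, 2009))  # 0.5
```

A non-credit/CEU cohort would be built the same way from the year of a student's first non-credit course, and the two rates reported side by side rather than mixed.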
 Campuses do not currently flag 'first-time' NDS students, which would have to
be done for IR to identify these students.
 Personally, I think the current SW retention rate is a good measure for us. It
gives us a microcosm of our student body and is a good indicator of what we are
doing.
2.1. Show Stoppers - Persistence
2.1.1. Important to include those who begin as
non-credit and non-degree-seeking as an
element in Persistence.
 This should be broken down by
developmental courses and college credit
(100 level and above) courses.
 Non-credit students and other students
working toward a credential or an institutional
certificate need to be measured.
 First-time, full-time, degree-seeking
freshman is not useful. I would suggest
including those who are admitted, whether
first-time or not, and certainly not limited to
only full-time enrollment, from year to year
(AY or FY). Financial constraints are a
limiting factor for continuous enrollment.
 Maybe we need to develop more of
an understanding of persistence and
develop a "stop-out student" approach,
i.e., a student has not dropped out if they don't
return for so many semesters, but has
stopped out.
 Include students who are non degree
seeking. These undeclared students are
persisting (continuing) to explore... after a
while it is not unusual for many to discover
(via their advisor) that they are 1 or 2
courses away from an Assoc Degree
(general).
 The Community Campuses do more than
just serve full-time degree-seeking students.
The non-credit students and special-interest
classes students take for personal
enrichment need to be considered and
measured.
 This refers to students seeking
'credentials' and 'degrees'; it should also
include workforce endorsements, licensures,
and similar achievements.
2.1.2. We have so many students who do not
attend campus on a regular semester-by-semester
basis. How do we count those students?
Their goal is still there, but they need that time
for family, finances, etc.
 If a student transfers from one campus to
another within an MAU, or to another MAU,
which campus gets counted? If someone
persists at one campus, or persists
across campuses, how is this addressed?
 this seems very important, however does
it tell us how long it takes someone to get
through to completion or is that measured
some other place?
 Also important to have a larger window of
time for completion (10 years versus one
year for example)
2.1.3. AY10 - there needs to be a
standardization across the MAUs to be able to
compare statewide.
 First step is consistency in policy across
MAUs for "remain enrolled in program"
 Again, there is no consistency between
MAUs.
2.1.4. Support from partners/corporations is
crucial for student persistence as a part of
needs-based financial support. Community
campuses need a way to recognize the
importance of these partnerships and how they
provide the sustainability for the vocational
needs of a region.
2.1.5. We need to use this metric to improve
our service to students to help them take more
direct paths to completion, if that is their
desire. Although many students choose to take
the long road, others go there unadvised and
this should be addressed. In addition, we need
to look at how our programs work for or against
completion. How many times do we start a
grant program to promote this or that and then
watch students flounder when the support goes
away?
 What do you do with this metric once you
get it?
 To measure persistence more accurately,
it would be helpful to have a better way to
communicate and record what the goal of
each student is from the start. Sometimes
the measures for persistence don't capture
the fact that the goal of the student has been
met and that is why they've stopped out.
2.1.6. Isn't persistence an indicator of the
success of the existing metric "strategic
enrollment management planning"??
 Does this conflict with the completion
metric?
2.1.7. What about a student from a rural area,
who is degree seeking through a main campus,
not their rural. Where do they get counted
toward retention and persistence? The rural
campus is supporting and providing courses,
but the main campus gets the credit.
 This is important in that we need to follow
students as they attempt to get a degree
from more than one campus.
2.1.8. please drop the potential refinement;
someone taking a non-credit 3 hour grant
proposal workshop or a gardening class or a
small bus. dev. workshop or a GED class has
nothing to do with measuring our success at
how many of them go on to enter a degree
program. It's not their goal.
2.1.9. The debate about UnD, HdCt, and SCH
will continue until the emphasis on community
campus numbers disappears and the counts
are rolled into the overall MAU numbers,
providing incentive for the MAU and other
community campuses in the region to work
together.
2.1.10. Potential refinement: Take the existing
suggested refinement (in the powerpoint) and
take it out 10 years instead of 5. Many students
take longer than 5 years to complete a
credential.
2.1.11. I feel that each of these four "metrics" is
all about the same dimension: graduation
depends upon persistence, which depends
upon course completion, which depends upon
DEV... I still think UA/BOR/St.Legisl/Public will
be judging us on UnD HdCt & SCH.
2.1.12. How would this capture single-course
credentials like CNA, PCA, welding, etc.? If it
can't, then it should be qualified in the title of
the metric and in its definition.
2.1.13. What about transient populations, e.g.,
military personnel, who transfer duty stations
and will not be counted in persistence data?
2.1.14. Developmental success especially in
Math is defined as a B, at least at UAF. Do we
have separate rules for these exceptions or do
we go with a generic rule?
3. Developmental Success (Developmental Completion Rates)
 This is needed for the community campuses to measure the students who are
succeeding. Need to define success.
 There are three basic levels of ability: those who are in need of immersion into
college-level preparation, those who only need an add-on with their core to be
successful, and those who can succeed with a minimum of help through
supplemental instruction and some advising.
 Labeling of courses should get away from DEV and remedial.
 We generally have more students in this area than any other. Most of our
students who graduate have started at this level.
 Perhaps we need to retest a student after each developmental ed course to
make sure they are prepared, especially since students take courses from different
MAUs. We need to make sure we are preparing them before they go. The failure
rate when some of them do go on is a direct reflection that we failed!
 Need to track because many students never make it into freshman level
courses and we need to determine why.
 Most of our students do need developmental courses in this area. The majority
of our student population is an older group.
 This measure doesn't have to do with the grade received, but whether the
student received an acceptable grade to qualify for acceptance into a 100 level
course.
 Support #73 question - separate or together - I think there is value in each
 Many designators, particularly in math, are considered developmental. ie,
CIOS 116 and CTT 106 and DEVM 105 are all considered developmental math
 We also need to set up parameters for what we consider successful, i.e.,
student goals, or what the MAUs traditionally see as goals, i.e., a cert or degree.
 We have a need to have a more accurate assessment of incoming students to
reduce the wide range of abilities found in a single classroom.
 I think the measure, as written (completion and then successful completion of
100 level math and English) is appropriate because other metrics (eg.
persistence) get at students who go on to cert or other programs which do not
require 100 level courses.
 Critical to this conversation is the STARS program and the SAC initiative to
require developmental testing and advising for students entering degree-awarding
programs, and the extent to which programs are not part of that exercise.
 Developmental courses should be those identified as stand-alone courses
(rather than developmental activities embedded in other courses) used as the
"first step" toward continuance along an educational path to basic completion of a
recognized credential.
 Developmental students' success can be measured by the outcomes of the
GERs that follow, if they are degree seeking or certificate based.
 Even though other courses have been created at the 100 level that are
equivalent to DEV courses, they are not traditionally used to meet the
baccalaureate course requirements. So, they should not be counted as
developmental courses. The student will still have to test into the traditional
bacc-level Math courses. These other courses have been developed to allow
students to earn a certificate or associates degree only and they do apply to
those degree requirements.
 The variety of developmental designators will create a challenge in capturing
all academic goals of individual students. We also have nontraditional students
who do not have the same deficits as entering high school students coming to
college. Any successful measure of this would require knowing where they come
in as opposed to how they exit. Groupings could capture gender, high school
preparation/GPA, ESL, and mandatory placement scores as a way to accurately
track success upon exit, according to a student plan of study.
 #106 I like the idea, but maybe the test is something done by the instructor at
the end of the semester. I think instructors need to see the measured outcomes of
their students. Let's face it, not all DEVM instructors are good instructors.
 This is a KEY issue for UA accountability and should be measured in some
form
 It seems that sometimes these measures are working counter to each other,
with success in developmental instruction as a goal toward college-level success
versus completing student-centered goals, like job skill development.
 Perhaps table this for now and use course completion metric to start with.
That way all courses whether they be developmental or above would be
"counted." We could measure successful completion of developmental courses
and separately measure all other credit courses.
 It's very important for a student starting out in college to take developmental
courses if they need them to succeed in college. The student should take a test
to see if they are ready to move on to the next level, or the instructor should let
us know if they are ready to go on.
 Calculation - # of degree-seeking students, who place into developmental
courses, successfully completing the first 100-level Math and/or English. This
type of calculation would capture only those developmental students with goals
of enrolling in college-level courses and demonstrate their ability to succeed in
those courses, which is one of the goals of developmental studies.
Students do not always take a placement test to get into a
course. They choose to start at the lowest level, and even that
course may be too high a level (but it is the lowest offered by
the college).
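The proposed calculation above (degree-seeking students who place into developmental courses and then pass their first 100-level Math/English) could be sketched as follows. This is a minimal illustration only; the record fields (`degree_seeking`, `placed_dev`, `passed_100_level`) are hypothetical and not drawn from any actual Banner schema.

```python
def developmental_success_rate(records):
    """Share of degree-seeking students who placed into developmental
    courses and went on to pass their first 100-level Math/English.
    Illustrative sketch only; field names are hypothetical."""
    # Cohort: degree-seeking students who placed into developmental courses.
    cohort = [r for r in records if r["degree_seeking"] and r["placed_dev"]]
    if not cohort:
        return 0.0
    # Of the cohort, count those who passed the first 100-level course.
    passed = sum(1 for r in cohort if r["passed_100_level"])
    return passed / len(cohort)

records = [
    {"id": 1, "degree_seeking": True,  "placed_dev": True,  "passed_100_level": True},
    {"id": 2, "degree_seeking": True,  "placed_dev": True,  "passed_100_level": False},
    {"id": 3, "degree_seeking": False, "placed_dev": True,  "passed_100_level": True},
    {"id": 4, "degree_seeking": True,  "placed_dev": False, "passed_100_level": True},
]
print(developmental_success_rate(records))  # 0.5 (only students 1 and 2 form the cohort)
```

Note that students 3 and 4 are excluded from the denominator, which matches the stated intent of capturing only developmental students with goals of enrolling in college-level courses.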
 If this metric remains, then it could be measured by the number who got a C or
a pass. Some dev. courses are 100-level classes, i.e., Prpe A108. Also, is Math
A105 seen as developmental if it meets AA requirements and doesn't meet
BA/BS? So, does "developmental" course mean developmental for an AA or a
BA degree?
Designators and/or course numbers within the UA system from
UAF, UAA, UAS
 Developmental success may not always be measured by a single course. For
many students it will be measured by the sequence needed to develop the
competency.
 An interesting factoid for context - in the California State University system, 85
percent of incoming freshmen need remedial coursework. That system
successfully remediates most of these students, and those who aren't up to
college level after the freshman year can't go on. The point is that UA is not
doomed because K-12 doesn't prep students for college, nor is this a situation
unique to UA.
 This has a high priority at the statewide level, as demonstrated with the recent
STARS task force and funding for innovative developmental proposals. It would
probably be wise to propose at least some way to measure our successes in
these areas.
 Course modality should be seamless in a random look at completion/success.
The quality of a course should be irrelevant to the success of the student.
 #115 not all students intend to become degree-seeking or cert-seeking. I think
that we are not going to be able to look at all students, and maybe need to
specifically focus on cohorts of students, i.e., only degree-seeking. Unfortunately,
this would not measure many of the rural students.
 You can't just assume that completion rate for an instructor equals
competency. It is the student's ability to move successfully into the next class that
indicates instructor competency. Overall, I think the ability of students to
successfully gain skills and move through their chosen program is more important
in showing our abilities than judging us by single course completion, even if that
is the student's goal.
 FY 2009 - we will need a year or two to work out the policies with SAC,
Provosts, and such. Plus, we need to think about whether we are measuring data
that is not already collected.
 2010 - It will take a couple of years to get a proper list shared among
campuses about what counts as a developmental course and what counts as a
successful completion for students.
 2010; Why wait? If it's worth doing, if we have confidence in the measure and
believe it's a valid measure, how could you justify waiting?
 2009. Why not as soon as possible? Courses are set up to be sequential
already. The outcome of a course taken as a prerequisite for college-level work
is to prepare the student for that class. Developmental Ed departments are
already working on strengthening and assessing their programs.
 UA is asking for $500,000 for support of developmental success in FY07. UA
will be held accountable NOW for its past and current performance in
developmental success.
 AY09 - This has high priority at the statewide level, so it seems apparent that
we will have to accept some level of accountability on this sooner rather than
later.
 2010 - Broad institutional implementation of metrics and budgetary
implications should be assessed prior to adding on metrics that have yet to be
fully defined and developed.
 Implementation could begin AY09, since mandatory placement policies are
going into effect, theoretically, this year, and funding requests for increased
developmental support begin FY08.
 We embed developmental work into all our courses, and most of our students
need substantial developmental work. I do not, however, think this is why my unit
exists; it is a necessary part of what we do, and therefore not a measure of
"productivity" as such.
 2010. We have been working on developmental success for some time, and
we should be ready to make and meet goals already.
 2008 - But I recommend that we look at overall successful course completion
too, rather than just completion of developmental courses. Perhaps there would
be some correlations.
 AY09--This has a lot of attention at SW level and we should start measuring
soon.
 Regarding #132 - course modality is not the same as course quality, and I
believe a poor-quality course does impact a student - usually negatively.
 This should be implemented when the bugs can be worked out in FY 09.
Developmental Education is an important part of our mission and the students we
serve.
 Achieving some consistency across developmental programs appears to be
on the radar screen at the statewide level.
 2007. We are way behind on this. Students out there expect that we already
are accountable and that the course sequence they are advised into is the
correct sequence. And that the course content and delivery is worth their money
and time.
 AY 2010, to give time to support the concept, determine all the various
developmental courses (what about those that embed DEV into most of their
courses?), and to move all MAUs to similar measures.
 2011 - We need time to collect several years of data to determine if we are
capturing what we really want to be measured on
 As soon as possible.
 I agree with the comment about student goals; that is vital for many of the
metrics like this one; not all people have as a goal to go on after they're
successful with their credit or non-credit dev. course.
 Maybe if a student tests into a developmental course they need to also take a
student skills course, maybe we shoy
 2009: Can start as soon as possible on tracking...finding consistency within
DEV programs across MAUs of what is covered in each course, so students
transferring or taking courses across MAUs have consistency.
 Perhaps we need to retest a student after each Developmental Ed course to
make sure they are prepared, especially since students take courses from
different MAUs. We need to make sure we are preparing them before they go.
The failure rate when some of them do go on is a direct reflection that we failed!
 A proposed measurement is placement tests before and after successful
completion (A,B,C) of a developmental class. For example, a student tests into
DEVM 105 beginning Spring semester, gets a B, and then tests into Math 107
beginning Fall semester. The problem with this is that it is cumbersome to
students and puts an additional burden on each student for excessive testing.
3.1. Show Stoppers - Developmental Success
3.1.1. Developmental courses do not transfer
to other institutions through the transfer credit
process (because they are less than 100 level).
So, a student who "successfully completes" a
developmental course at UAF wouldn't
necessarily enter UAA at college level if their
assessment test scores didn't place them at
college level. Perhaps successful completion of
a development course should include an
external measure such as a national placement
test score.
3.1.2. What courses are considered
"developmental" courses? Some applied
English and math courses might be included.
3.1.3. Some students take the same
developmental courses over and over, passing
each time. More complex measures such as %
of students who go on to pass college level
course work will help assess the success of
UA's developmental program. This would go
hand in hand with mandatory testing and
placement.
3.1.4. This measure works if a student's goal is
to just complete a developmental course. Many
students pass developmental courses
(A,B,C,P), but are not necessarily prepared for
college work.
3.1.5. A definition of developmental education
is needed to measure this.
3.1.6. Can this metric be used to go back to the
K-12 programs to say that we are not receiving
students who are prepared for college? Can
this be used as a springboard for working with
K-12 to develop programs with them so we
receive students who are prepared?
3.1.7. This is part of the course completion
metric, so it may be redundant.
3.1.8. Many students take developmental
courses concurrently with enrollment in 'college
level' credits. Are there any challenges in
accurately capturing their "success"?
3.1.9. Inconsistencies in how "developmental"
courses are handled across the UA system
may skew data if not carefully considered.
3.1.10. Depending on the student's starting
level (DEVM, DEVE, DEVS), if they achieve a
goal that does not consist of completing all
levels of DEV, is that success? Or do they
have to go through all DEV and into the 100
level to be successful?
3.1.11. Get a complete list of what counts as a
developmental course.
 Some courses that people are referring to
as developmental have been created for use
in Certificate and Associates programs only.
They are not recognized as prerequisites for
baccalaureate required courses or even as
prerequisites within the DEV series. I would
suggest that they are not true developmental
courses.
3.1.12. This area is receiving and will receive
even more focus at the SW and legislative
levels. It will be important for us to get out in
front of this new focus and begin measuring
and reporting these statistics.
3.1.13. Since students taking developmental
courses are less likely to complete a
recognized credential than those students who
do not need developmental courses, I think we
need to devote additional resources to
developmental students.
3.1.14. In the past we have conflated the
developmental courses with A.A.S. course
requirements (like CIOS 160 and TTCH 131).
The A.A.S. courses have been shown not to
prepare students for the college level courses.
We need to separate the courses that satisfy
A.A.S. degrees (and those students who have
the goal of A.A.S. completion) which can be
measured with completion and those courses
that are truly developmental courses that
students take to get ready for college level
classes.
3.1.15. Perhaps the most important part of this
measure is our ability to help students make
progress along an educational continuum, so
measuring a student's enrollments and
success in college-level courses AFTER the
developmental sequence may be the key to
reporting campus productivity/effectiveness.
3.1.16. Developmental success takes two
forms. Completion of developmental courses
by themselves and completion of
developmental courses to successfully
continue into college level courses. Would
these be looked at together or separately?
3.1.17. A course designated as 'developmental'
should not include 'applied' courses that are
degree requirements. Developmental courses
are preparatory ones for college-level
instruction.
3.1.18. Will the developmental courses include
the non-credit ABE and ESL classes? They are
developmental too.
3.1.19. This is part of the course completion
metric, so it may be redundant. Also, may not
be consistently interpreted since there is such
variation on how dev. ed is conducted across
the State.
3.1.20. The bridge between developmental
courses and the subsequent academic course
should be clear. The information in one should
lead to success in the future course. The
tracking of success in subsequent courses
(disaggregated to show like populations and
their success) could track how well those
bridges are working or not.
3.1.21. We need a thorough list of what
classes are being substituted for DEVE/M/S
courses, and to look at successful completion
of those classes and of the courses students
take after. Perhaps we need to also look at
cohort students, i.e., degree-seeking or those
in BA programs, to see if what we are
measuring is going to show what we think it is
going to show.
3.1.22. Building a clear and direct pathway
from developmental coursework into college
level courses would be helpful for success.
Transferring developmental coursework across
MAUs would help students feel more
compelled to be successful in their
developmental coursework.
3.1.23. Not all students have a goal to go on to
non-dev. classes, so we shouldn't be evaluated
by how many do.
3.1.24. How is the instructor's competency
factored in? For example, if one instructor has
a 50% completion rate for their students and
another instructor has a 70% completion rate
from the same population of students, isn't that
indicative of different teaching ability?
Sometimes student failure is tied to less than
optimal teaching.
Include the mode of delivery as well as the
instructor.
3.1.25. Someone mentioned Accuplacer;
remember, not everyone is using the same tool
to measure students, i.e., there is COMPASS,
ASSET, Accuplacer, etc.
3.1.26. #137 define successful completion - in
this room alone there are several different
takes. We are going to have to work out these
things.
3.1.27. Developmental classes need to be
measured in a number of ways. Degree
seeking students, non-degree seeking etc.
3.1.28. The STARS report and SAC are in
support of mandatory student testing and
placement into developmental math and
English courses. Without this in place, it will be
difficult to impact developmental success.
3.1.29. 1. View the degree program for the
student: requirements for DEV - do they meet
them? Did they complete courses in that area
or did they test out?
2. NODS - Is the student taking a course as a
refresher for a job, self, etc.?
Do we need to look at students who dropped or
withdrew from a DEV course to find out why
they did? Can we improve the drop/withdrawal
rate of students in DEV courses?
3.1.30. Some certificates have the equivalent
of developmental content explicitly "embedded"
within them. How is this to be addressed?
3.1.31. Why can't we develop consistent
course numbers, course objectives, and syllabi
for all "developmental" courses across the UA
system? Is this desirable? Is this achievable?
3.1.32. It appears there is no standardization
between UAF and UAA re: designators. Will
this be a problem?
3.1.33. Looking at the DEV courses and how
they are offered to see their success rate as
well as the student success rate. Example:
Elive, audio, Blackboard, paper-based
correspondence, ...
3.1.34. "Developmental Success" is not
necessarily successful completion. DEV
courses can be successful in that they help the
potential student understand what is required
of college-level work, especially attendance
and homework! That they get a W or NB is
appropriate for both the college and the
student (the former). As a "metric," UA should
appreciate how much work we put into helping
students understand and build up their
skills... HOW MUCH vs. success is more
appropriate as a measure of the service we
perform.
4. Basic Completion (# of students who earn a recognized credential)
 Recognized credentials are important to individual programs and for
communicating to employers regarding the success of programs.
 This is a critical way to acknowledge individual student achievement, which is
a high motivator for further study - it helps to create the sense of an educational/
career ladder, where students can step on and off as life circumstances and
employment requirements dictate.
 Increased advising/program of study tracking could add to the students'
ownership of the advancement of their own academic goals and help identify the
resources we can offer to help them achieve that goal.
 I feel that these four "metrics" are all about the same dimension:
graduation depends upon persistence, which depends upon course completion,
which depends upon DEV... I still think UA/BOR/St. Legisl./Public will be judging
us on UnD HdCt & SCH.
 Community campuses often act as feeder campuses. Persistence needs to be
tracked at a SW level to be meaningful. A student who starts at a community
campus and continues at another campus should be considered retained.
 The AK Department of Labor and US DOL use common measures focusing
on employment beyond graduation or completion, as well as income progression.
To what extent are these Common Measures important to our efforts? Perhaps
we should at least look at them to see how our own measures might (or might
not) complement them.
 It is important that this measure count only recognized credentials and not just
courses that do not lead to a license or an industry- or state-recognized
credential.
 UA has one of the lowest graduation rates (25 percent for bachelor students,
less for associate and certificate) in the country.
 Should develop credentials that recognize the small successes so that
students may build upon these to reach their overall goal.
 For many students, the important success is gaining employment or
advancing in employment. This may mean a certificate or degree of some sort,
but maybe not. We often get our new students based on their observation that
other students have obtained real-world success through taking courses with us.
Is this part of this metric and how to measure?
 Concerned that this may still omit single-course credentialing classes like
PCA, CNA, pipe welding, 6-pack licensing, CompTIA, etc.
 Get a complete list of what can count as success after taking a developmental
course.
 This is an excellent metric for the community campuses as it gives us an
opportunity to measure all that we do to meet the community college mission.
 AY09--This is easy to measure and provides a good look at our success.
 When a student reaches their goal this is success.
 I prefer the Calculation. The potential refinement leaves out certificates.
 2009 - this should be an easy measure once the definition of "credential" is
agreed upon
 The potential refinement should not be used.
 Sometimes a single course will make a student more employable, e.g., auto
mechanics, welding, basic accounting.
 2010 - we need to give the system some time to test with data fed back to all
campuses.
 Concur with the calculation which includes licensures, industry certs, WF
credentials; This gets at the "nut" of what's different at community campuses.
 AY09 - This is needed to allow us to include all those students who are
earning credentials below the associate's level and outside "high demand" - they
are currently not captured anywhere else.
 AY09, but without the refinement. As stated above, many courses are
considered complete for particular jobs.
 #509 good point - communities may not have the identified "high demand"
jobs, but might have their own identified "high demand" jobs that are good for the
economy of the community
 Would be interesting to look at our graduates systemwide and see how many
campuses contributed to the success of the student.
 Many community campuses' degrees are MAU degrees, so how will a campus,
and not the MAU, be identified?
 2010. Will be interesting to track this now that OECs are on the books.
 Whether a student starts with developmental studies is also a factor. A student
starting in DEVM 050 and achieving a BS degree measured against a student
starting in Math 107 and achieving a BS degree is apples and oranges.
 We need a diverse 'cluster' of measures to capture the full range of services
provided by campuses. No one measure alone will be adequate; we need to
ensure that this cluster of measures paints a full and accurate picture of our
work.
 #511--This is the purpose of this metric to gather and track info on these
programs and courses.
 This measure seems flawed. Is it a number of graduates and students
completing a certification, etc., or is it a rate? This is an important difference. If it
is a retention/persistence rate, then that needs some definition. If certificate
students wait until applying for graduation, then the rate would be 100%. How
helpful is that?
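The count-versus-rate distinction raised in the comment above can be made concrete with a small sketch. The cohort numbers and denominator definitions here are purely illustrative, not actual UA figures.

```python
# Illustrative only: how the choice of denominator changes a "completion rate".
entrants = 200     # certificate students who started the program (hypothetical)
applicants = 50    # those who later applied for graduation (hypothetical)
completers = 50    # those actually awarded the credential (hypothetical)

count = completers                             # a raw completion count: 50
rate_vs_applicants = completers / applicants   # trivially 100% if everyone who
                                               # applies for graduation completes
rate_vs_entrants = completers / entrants       # a very different story: 25%

print(count, rate_vs_applicants, rate_vs_entrants)
```

Measured against graduation applicants the rate is 1.0, while measured against program entrants it is 0.25, which is why the base of the percentage needs an agreed definition before the metric is reported.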
4.1. Show Stoppers - Completion
4.1.1. Many of the students at community
campuses do not take classes with the intent of
receiving a degree or a credential.
4.1.2. What about students whose only goal
was to take Grant Writing or Technical Writing
to improve their skills? No credential or degree
was earned, but they still achieved a goal. How
in the world do we measure this?
4.1.3. A student can take one course, and that
makes them more employable than if they did
not take it. A rural campus would consider that
a success. There are so many students who
are NODS and not interested in a degree
program...they just want to be hired.
4.1.4. Agree that this should include a measure
of how the community campus "assisted" in
completion i.e. the statewide nursing and rad
tech initiative
4.1.5. What is the base used to get a
percentage? If I have a small base I look really
good if I need a large number, but if I have a
large base it could be detrimental. I can use
various numbers, but the problem is to be
consistent from one to another.
4.1.6. We need to coordinate the recording of
all licensures, industry certs, workforce
credentials, occ endorsements, certs, degrees,
etc. Not all campuses are recording these in
the same fashion in banner. We would need to
sit down together and coordinate so reporting
would be accurate.
4.1.7. It is very important for a campus to get to
count the success of students as they
complete at other campuses. The "assist"
needs respect.
4.1.8. How do we collect licenses that are not
given through UA or a partnership program we
use, e.g., a student taking an ITS course to
prepare for the MCP/MCSE/MCSA test?
4.1.9. In the list of examples, there were
licensures and industry certifications. How do
we know when someone has completed one of
these? How are these defined? Is there a clear
understanding of what a workforce credential is
- that it involves only CEU and non-credit
courses?
4.1.10. Classes that are set up for preparation
of testing to a certificate need to be included as
a recognized credential.
4.1.11. This metric should count the number of
awards - so that if a student gets more than
one award both are counted.
4.1.12. How do you define licensures and
industry certificates? The specific licensure and
industry certificates need to be specifically
defined so when someone asks what the
student must do to be counted in the category
they can be given the specific course
requirements.
4.1.13. A lot of our students are non-degree
students and are taking courses to help them
succeed in their jobs.
4.1.14. How can we capture the single-course
"programs" that aren't in UA -defined programs
like CNA, PCA, welding, etc. We're losing
many now and no one is tracking this. Some of
these credentials are in the high priority areas
that UA has.
4.1.15. This metric has to have a reporting
element that refers to all the campuses that are
involved in the student's transcript
4.1.16. There are awards and industry
certifications that come out of one or two
courses, not a series that is a BOR-approved
program.
4.1.17. Basic completion could also be defined
as the student having met their personal goal -
a degree, a course, etc. But even though this
could make the unit look good in that their
students are meeting their goals at a high rate,
it is also one that could be easily corrupted so
as to say "look how good we are doing".
5. 'Facilitation' or another term to describe the effort a campus makes to
ensure a student physically located near them is successful, regardless of
whether the student enrolls in a course or degree program at that campus.
6. What else could we measure?
 The AK Department of Labor and US DOL use common measures focusing
on employment beyond graduation or completion, as well as income progression.
To what extent are these Common Measures important to our efforts? Perhaps
we should at least look at them to see how our own measures might (or might
not) complement them.
 The percentage of 18+ residents in our region that our regional campuses
serve.
 Pre/post testing is an additional way to measure student success.
 'Facilitation' or another term to describe the effort a campus makes to ensure
a student physically located near them is successful, regardless of whether the
student enrolls in a course or degree program at that campus.
 UA should focus on Accessibility = the percentage of our regional
communities in which we serve students per academic year.
 We need to measure students' goals somehow, to compare to their activity
and measure whether the goal was met.
 Instead of community maybe agency/partnerships. Not all communities have
the ability to give.
 Community contributions - $ and in-kind
 While we dropped 'transfers' as a topic of discussion, some may have
eliminated it because it implies moving from a community. But as more online
courses and degrees become available, 'transferring' may not involve moving
from your community. Our ability to provide a quality foundation for such
'transfers' may be important to measure.
 Community partnerships and economic support are important.
 UA should focus on service to UA students = measure support services:
advising, access equipment (internet, videoconf, audioconference, etc.),
classroom space in accessing distance courses, etc.
 A Lower 48 community college has advisors poll their students every fall to
find their goals. Maybe we could do something similar; however, does Banner
have a screen for this? It would be useful to see how students change their goals.
 Leveraged community resources measured in $$.
 It is important for each community campus and/or college to have engagement
with its community and region. How to measure the effectiveness, correct model,
or partnership level with local/regional education, government, organizations, or
industries would have to be determined. This is a critical piece of information.
 #341 look at the PPT slide handout in your packet. This is addressed.
 For some of the campuses, the number of sites (towns and villages) served
would be a very good metric to use. It takes a lot of effort, resources (and
creativity) to successfully serve a student who is the only college student in their
village. This would help support and promote outreach efforts and lead to more
students and more success. One successful student in a village often leads to
more.
 #348 excellent, it would be great to see how many go from a goal of one
course to improve skills to wanting to obtain a degree.
 Are we in agreement as to the end goal i.e. meeting the "community college
mission" so we are clear why we are proposing to measure anything regardless
of what it is?
 Successful 'graduates/completers' are an excellent indicator of effectiveness
in meeting community needs. How do we assess their 'success' beyond
graduation? Employment? wage progression? Continued education?
 Collecting and reporting student goals and whether the student goal has been
achieved might be more helpful to look at when considering persistence.
 More and more courses/certs/degrees are offered in summer (and not through
UAF Summer Sessions). Are these included in our analysis?
 We obviously need to discuss how we measure things and have it consistent
across MAUs and statewide.
 We should propose to measure that which is unique to what we do at the
community campuses to meet the community college mission - many of these
measurements feel like indicators to the successful achievement of existing
metrics that are measured at the MAU level
 It seems that much of our discussion of the other metrics is about how they
don't capture a certain group of our students, those whose goals are not easily
measurable by university standards (non-degree seeking, just looking for
additional job skills, etc.). Maybe instead of modifying all the other metrics to
reflect these students we should add a separate metric just for these students.
 CEUs! CMEs, etc. Also how many sites we serve (often with only one
student there).
 #322 - McClenney had some measures related to employment that we may
want to go back to.
 Number of graduates each year.
 Sponsored tuition and courses.
Step Name: Course Sharing
Tool Name: Brainstorming
1. What tangible value or advantage might accrue from a DE course sharing initiative?
Download