Metropolitan State College of Denver
Final Report of the
President’s Task Force on Student Learning Assessment
February 18, 2005
Task Force Members
Peggy O’Neill-Jones, Technical Communication & Media Production
Suzanne Discenza, Health Professions
Madison Holloway, Management
Joshua Raines, Environmental Science (student)
Zav Dadabhoy, Student Activities
Paul Myskiw, Assessment & Testing
Chuck Hathaway, Center for Academic Technology
Linda Curran, Academic Affairs
And special thanks to Frieda Holley,
Emeritus Associate Vice President for Academic Affairs and Emeritus Professor of Mathematics
INTRODUCTION
The President’s Task Force on Student Learning Assessment was issued its charge and
membership on September 13, 2004. The charge given by President Ray Kieft was to:
1. accomplish a review and listing of all the current activities related to the assessment of student learning and evaluate the effectiveness of those activities;
2. determine where (i.e., academic programs) MSCD is currently not conducting an assessment of student learning; and
3. recommend instruments and assessment activities which would provide a comprehensive program of the assessment of student learning across the college.
At their first meeting on September 28, 2004, the Task Force determined that a few
additional members would expand the assessment competencies of the group: a
representative familiar with online instruction, a student, and if possible, an MSCD alumnus.
Chuck Hathaway of the Center for Academic Technology was asked to join the Task Force at the
outset, and a student representative, Josh Raines, joined the group for the final phase of the Task
Force’s charge, which was to develop a set of recommendations. Despite repeated efforts, it was
not possible to recruit an MSCD alumnus whose schedule permitted joining the group.
The Task Force also decided that a meaningful assessment of student learning must include
more than strictly academic programs in the traditional sense of the word. Accordingly,
current assessment activities have been considered to fall under two broad headings:
I. Curricular Programs and Outcomes (including all types of academic curriculum, irrespective of delivery method or venue), and
II. Co-Curricular Programs and Outcomes (including programs or entities whose activities support, promote, and/or enhance student learning).
The former includes classroom delivery, correspondence courses, online instruction, internships, field placements, practica, professional preparation programs, etc. The latter heading includes, but is not limited to, advising, assessment & testing, programs for entering students, advocacy and counseling, tutoring, activities to promote communities of learners, and library, media, and computing resources.
The relationships among these programs and major types of assessment can be visualized as
follows:
[Figure: The Learning Environment, showing curricular and co-curricular programs, each with assessment of programs and assessment of outcomes.]
The Task Force also concluded that there are numerous assessment and evaluative activities occurring at MSCD that do not directly measure student learning. Many of these are service utilization studies or pertain to student satisfaction. While these assessment
activities are important to student success, they may not directly relate to student learning
outcomes, and have therefore not been included within the scope of this Task Force’s work.
The Task Force’s original charge referred to the Performance Contract being negotiated
between CCHE and MSCD, with the assessment of student learning to be “one of the performance
measures required by that performance contract”. However, it was tacitly understood that the
assessment would be required as well for the upcoming NCA accreditation visit in 2007. Task
Force members also acknowledged multiple other uses of such a comprehensive assessment
program, such as:
• preparation for MSCD’s systematic academic program review;
• annual departmental planning, faculty/staff orientation and professional development;
• enhancing student engagement in the educational experience; and
• increasing civic, community and workforce involvement.
MSCD’s unique modified open admissions policy also figured prominently in the discussions;
such a diverse student body might require a broader definition of student success. Since
MSCD’s doors are open, virtually without restriction, to every non-traditional-age student
seeking a college education, the college serves diverse purposes for a diverse clientele, which
includes:
• degree-seeking adult learners who attended college earlier in their lives but were forced to “stop out” for personal or financial reasons;
• adults who possess a college degree but are returning to pursue specialized certification or lifelong learning;
• high-achieving high school graduates who want to stay in the Denver area and are attracted by the real-world education MSCD provides;
• less prepared students who want an opportunity to prove themselves in a higher education environment;
• transfers to MSCD from two- and four-year institutions;
• first-time freshmen, who range in ability from “at-risk” to high school valedictorians; and
• non-traditional-age students who are first-time-to-college students.
About 56% of MSCD’s students are enrolled full-time, and 80% work full- or part-time.
Students range in age from 16 to 80 with a median age of 26 years. About 25% of the student
body is 30 years of age or older, but there has been an increase in the number of traditional-age
students who cannot afford to attend a traditional four-year campus and who must work to
partially or fully support themselves while attending college. (Additional detail concerning
student body characteristics is included in Appendix D.)
Initial discussions also touched on the data-related issues that attend management of a
comprehensive campus-wide assessment program. Data collection, dissemination, backup,
warehousing, security, and access issues were among the topics identified for future
consideration. The Task Force decided that, in addition to recommending what assessment instruments and activities should be used, some consideration should be given to how
the assessment activities might be carried out, including data storage, retrieval, and
management issues. Additionally, a separate Task Force is considering the role of online
education at MSCD. This also has implications for MSCD’s student learning assessment efforts.
A major issue is the lack of an easily accessible data repository that is systematically utilized by
all constituents. Much of the needed assessment data is already gathered and compartmentally
stored at MSCD. It is simply not accessible to the campus community on a wide-scale basis.
Overall, the challenges that MSCD faces are likely to be substantial. On the one hand there is the
planned statewide revision of K-12 curriculum; on the other, there are changing workforce
requirements. It will be important to address the challenge of a student body that spans a
wide continuum of computer and information literacy skills. Beyond these basic
“technology” skills, employers of MSCD graduates also want students who can think critically, deductively, inductively, and creatively; who have good oral and written communication skills; and who can work effectively and productively with people of diverse backgrounds.
The Task Force considered several definitions of assessment; the following seemed to best
encompass the group’s directives, philosophy, and goals:
Assessment is an ongoing process aimed at understanding and improving
student learning. It involves making our expectations explicit and public;
setting appropriate criteria and high standards for learning quality;
systematically gathering, analyzing, and interpreting evidence to determine
how well performance matches those expectations and standards; and using
the resulting information to document, explain, and improve performance.
When it is embedded effectively within larger institutional systems,
assessment can help us focus our collective attention, examine our
assumptions, and create a shared academic culture dedicated to assuring and
improving the quality of higher education.
-- Thomas A. Angelo
American Association for Higher Education Bulletin
November 1995, p. 7.
The Task Force also considered definitions of student learning that best suit the context of
our task. We agreed to define:
… learning as a comprehensive, holistic, transformative activity that
integrates academic learning and student development.
-- Learning Reconsidered: A Campus-wide Focus on the Student Experience
National Association of Student Personnel Administrators and American
College Personnel Association, 2004, p. 4.
***
TASK FORCE CHARGES
#1a : Review and list current activities related to the assessment of student learning.
#1b : Evaluate the effectiveness of the identified assessment activities.
I. CURRICULAR PROGRAMS & OUTCOMES
The Office of Academic Affairs currently oversees three major approaches for assessing student
learning at MSCD.
The first of these approaches is the Annual Departmental Assessment Reports. All academic
departments on campus submit to the Associate VP for Academic Affairs, Curriculum and
Programs an annual report of their assessment activities, including copies of the instrument(s)
used to assess student learning, and discussion of the assessment outcomes. A list of these
assessment activities, by department, is included in Appendix A. The Annual Assessment
Reports are submitted first to the respective Dean of each of the three Schools (Letters, Arts, &
Sciences, Business, and Professional Studies), and then to the Office of Academic Affairs. The
Associate VPAA examines each report and notifies the department of any omissions or
deficiencies that are apparent.
The second avenue is the Program Review process. The effectiveness of the identified
assessment activities is evaluated through a number of different mechanisms. The Program
Review process is conducted every seven years for each academic department, and it includes an
evaluation of the adequacy of the department’s assessment activities and instruments. The
process allows for faculty peer review (the Program Review Committee members include one
faculty member each from the School of Business, the School of Professional Studies, and the
School of Letters, Arts, and Sciences, a member of the Faculty Senate Curriculum Committee, an
at-large faculty member with expertise in multicultural studies, and the Director of the Program
Review Committee, who is also a member of the faculty), as well as evaluation by an outside
consultant. The evaluation of student assessment is always a key element in the consultant’s
charge when s/he reviews the academic program. Often the consultant is from an accrediting
agency or board, in which case assessment of student learning is a critical part of the
accreditation site visit and evaluation. If during the program review process the department’s
assessment activities and/or instruments are found to be lacking, the consultant will indicate so in
his/her report, and the department is asked to respond with a plan of improvement in their
Program Review Follow-up Questions. This information is typically presented to the MSCD
Board of Trustees one year after the Program Review. The chart in Appendix A includes a
column for each department’s most recent Program Review, with a synopsis of their
consultant’s evaluation of their student learning assessment activities.
The third approach is assessment of the effectiveness of MSCD’s General Studies curriculum.
This is accomplished in several ways. The first is the Academic Profile, a standardized general
education exam from the Educational Testing Service (ETS), which is given annually in selected
MSCD senior experience and teacher licensure courses. An attempt is made to select students
from different programs so that the results are somewhat representative of the general MSCD
student body. Group scores provide information on students’ knowledge of the humanities,
social sciences, and natural sciences and their ability to read, write, think critically, and use
mathematical data. The means of MSCD students’ scores are compared to the mean scores of
students at similar institutions nationwide.
The second avenue for assessment of General Studies is through surveys sent to graduates,
employers, and seniors as part of the program review process. (See Appendix H for the survey
instruments used.) The surveys ask about the perceived adequacy of critical thinking, writing,
analytical, numerical, and other skills that General Studies courses are charged with developing.
Graduates receive two surveys: one specific to their major, the Majors Survey, and one
addressing general education, the Two-to-Five Year Graduate Survey. An Employer Survey is
included in the packet, and graduates are asked to give that survey to their supervisor. Seniors
also receive two surveys: one asking for an evaluation of their experiences as a major in their
degree field, the other asking for an evaluation of their college experiences (this second survey
addresses general education). The alumni and senior surveys ask graduates and seniors how
much their experiences at MSCD contributed to their achieving the general education goals that
the college has for its students. The employer survey asks the supervisor to rate the level of the
graduate’s achievement of 15 of the 25 goals. Among other general education questions, the
senior survey asks students to rate their ability to perform certain skills, e.g., write clearly, think
critically, before and after attending MSCD. An analysis of the gap provides some indication of
MSCD’s contribution to their developing skills.
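To illustrate the gap analysis described above, here is a minimal sketch in Python, using hypothetical skills and ratings on a five-point scale (the actual survey instruments appear in Appendix H). It computes the mean “before” and “after” self-ratings and the resulting gap for each skill:

    # Hypothetical senior-survey gap analysis: students rate their ability on
    # each skill before and after attending MSCD (1 = poor, 5 = excellent).
    # The skills and ratings below are illustrative only.
    before = {"write clearly": [2, 3, 2, 4], "think critically": [3, 3, 2, 3]}
    after = {"write clearly": [4, 4, 3, 5], "think critically": [4, 5, 3, 4]}

    for skill in before:
        pre = sum(before[skill]) / len(before[skill])
        post = sum(after[skill]) / len(after[skill])
        # The gap (after minus before) is read as one indication of MSCD's
        # contribution to students' development of that skill.
        print(f"{skill}: before {pre:.2f}, after {post:.2f}, gap {post - pre:+.2f}")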
II. CO-CURRICULAR PROGRAMS & OUTCOMES
Most non-academic MSCD programs support student learning and foster student retention and
success. Some of these positive contributions are measured through satisfaction surveys, user
evaluations, usability studies, headcount information, etc. Many programs gather these types of
program evaluative data using a variety of tools: surveys, focus groups, computer-based surveys, accreditations, peer evaluations, etc. While these programs provide significant value to students at MSCD, few seem to have defined and articulated their role in student learning opportunities.
The Task Force noted that some of the co-curricular programs and services at the college have
positioned themselves as service-oriented entities rather than extensions of MSCD’s learning
endeavor. For example, many Student Services departments may “support” student learning indirectly, but they do not articulate learning as a primary outcome of their programs, nor do they expressly strive to achieve student learning as an outcome. Similarly,
numerous co-curricular programs succeed in facilitating some type of student learning by
their very nature; however, this happens as an incidental rather than an intentional, planned
outcome of the program.
In some cases, various departments and divisions include concepts of student learning in their
mission statements. However, it does not appear that these concepts have been systematically
operationalized or assessed.
At the broadest possible level, the College, through the Office of Student Activities, uses the
National Survey of Student Engagement (NSSE) to survey freshmen and seniors. The NSSE is an
annual survey that assesses the extent to which first-year and senior undergraduates engage in
educational practices associated with high levels of learning and development. The survey is
based on the premise that the frequency with which students engage in effective educational
practices indicates the quality of the educational experience. The survey measures five different
benchmarks of effective educational practices:
• level of academic challenge,
• active and collaborative learning,
• student interactions with faculty,
• enriching educational experiences, and
• supportive campus environments.
The survey also notes MSCD’s scores for these benchmarks of effective educational practices,
describing the percentile and decile ranking for each benchmark by comparing MSCD with a
national comparison group. The survey can be tailored both in terms of the comparison group
(e.g., Urban), as well as the specific questions asked. While MSCD has conducted the survey
three times in the last few years, the Task Force found little evidence that the survey results
were used for planning activities such as program enhancements, initiatives, or even
different ways of offering services.
Specific co-curricular programs conduct a variety of evaluative studies to determine program effectiveness and, in a few cases, to assess student learning:
• MSCD’s Academic Advising Center administers an on-the-spot survey of advising walk-ins to assess its effectiveness (see Appendix G).
• The Office of Student Activities emphasizes student learning in its tag line, "Get Involved, Learn More" [see http://studentactivities.mscd.edu/modules/office/mission.html ]. The primary function of the Office of Student Activities is to facilitate campus involvement. The Office provides learning opportunities through the creation of programs and activities linking students’ academic lives and their lives outside the classroom. As the focus of college community life, Student Activities encourages vigorous intellectual debate, critical thought, and deeper understanding of current issues and trends. The Office of Student Activities’ mission statement also provides the rationale for conducting several assessments; e.g., outcomes are identified for each co-curricular event in order to assess whether these goals have been met, and the Citizen Leadership program was formulated to incorporate systematic assessment of student learning.
• In addition to these examples, various co-curricular programs have engaged in
professional development opportunities. The Student Life and Student Services division
has encouraged staff to understand and incorporate student learning as well as assessment
concepts into their work. Various speakers, workshops and conferences have been
arranged to enable this over the past five years. This is clearly encouraging; however,
these initiatives still need to translate into student services operations through formalized
planning cycles with articulated guidance and direction from College leadership.
There is no necessary division between curricular and co-curricular student learning
outcomes. Appendix B lays out one possible framework for integrating the two areas so
crucial to student life at MSCD.
***
#2 : Determine where MSCD is currently not conducting an assessment of student learning.
I. CURRICULAR PROGRAMS & OUTCOMES
Areas that are lacking comprehensive assessment at MSCD include:
• service learning (such as the on-the-job training received by student workers in the Center for Academic Technology (CAT) lab; or activities of Students In Free Enterprise (SIFE), a national organization that sponsors competitions to develop business ability and contribute to the community; or the free tax return preparation undertaken by accounting students at MSCD for those in the community who cannot afford such services);
• information literacy (including library and internet and computer-assisted research competency);
• computer literacy (including software and hardware competency);
• the effectiveness of remedial or preparatory education;
• the ‘value added’ of MSCD’s General Studies curriculum;
• student assessment of MSCD’s Honors Program;
• the effectiveness of online instruction, compared with classroom instruction; and
• academic advising.
Service Learning. Some types of service, experiential, and community learning -- those which
are structured through MSCD’s Cooperative Education Internship Center -- are assessed. The
Cooperative Education Center has a systematic assessment process in place, which includes
student and employer evaluations. However, unless there are more than five students registered
for a particular Cooperative Education internship in a given semester, there is no required
evaluation of the faculty advisor. In fact, there are internships where six to ten students are
enrolled, but still no evaluation of the faculty advisor is being undertaken. There is also a lack of
student assessment of internship sites and preceptors. The challenge in any of these cases is to
devise a way to conduct the evaluations while protecting the evaluating students’ privacy.
Assessment is also lacking for internships/practica handled outside the formal Cooperative Education setting (i.e., those arranged through academic departments).
Information Literacy and Computer Literacy. Currently, there is no systematic assessment of
the information literacy or computer literacy of entering freshmen or transfer students, despite
the fact that students lacking in such skills cannot participate fully in MSCD’s learning
environment (use of the library and online course delivery are only two examples of this).
Defining ‘technology’ and deciding what is worthy of academic credit and what is ‘remedial’,
and where it should be taught, have been core issues in a campus discussion that has spanned
many years at MSCD. It is clear that students without technology skills are less employable, and
concern about this fact is increasing. In November 2004, the Chronicle of Higher Education (see
Appendix E) reported the following:
The Educational Testing Service plans to unveil a standardized test this week,
designed to measure the computer savvy and information literacy of college
students.
The test will evaluate how well students can judge the validity of online
information, build a spreadsheet, compose e-mail messages summarizing a
passage, and perform other tasks. Called the ICT Literacy Assessment – the
first three letters stand for “information and communication technology” – it
was developed on the basis of advice from a committee of representatives
from seven universities. … They based their work on an earlier panel of
professors, businessmen, and government officials that ETS convened. That
panel’s report, Digital Transformation: A Framework for ICT Literacy, argued
for the importance of assessing such skills: “The panel strongly believes that it
is time to expand the notion of the digital divide … to include the impact of
limited reading, numeracy, and problem-solving skills.”
One of the panel members became involved with the project because
… [minority students are] not coming to college with the same set of
information management skills as other students. “Many of them don’t have
access to the higher-end technology at home.”
The Chronicle of Higher Education, November 12, 2004, p. A33
Appendix E also contains information on the University of Maryland’s Technology Certification
Exam, which can be found at the following link:
http://www.tekxam.com/certification/certification.htm
Remedial Education, and the General Studies Curriculum. During the spring 2005 term,
10,236 MSCD students were identified as needing to take one or more remedial courses that will
not count toward their degree program requirements. There is no formal, systematic assessment
of the effectiveness of remedial or preparatory education that MSCD requires for these students.
The reasons for this are varied and complex. As mentioned earlier, MSCD has a very diverse
student population; these students arrive with varying abilities as well as varying academic
preparation. The practice of using pre- and post-tests to assess ‘value added’ would assume that
students have a similar capacity for learning. When such assumptions are questionable or
unproven, pre-tests might more appropriately be employed to place entering students at the
proper course level in a given curriculum. At least one MSCD department, Mathematical and
Computer Sciences, employs the Accuplacer for that purpose. The department employs a grid
using test scores and other data to guide students’ appropriate placement (see Appendix F; please
note that this placement grid is currently being updated).
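To make the placement use of a pre-test concrete, the following sketch shows one way a placement grid can be applied, mapping an Accuplacer score to a recommended course level. The cutoff scores and course names here are entirely hypothetical; the actual grid appears in Appendix F and is being updated:

    # Hypothetical placement grid: minimum test scores mapped to course levels.
    # The real Accuplacer cutoffs and courses are in Appendix F (under revision).
    PLACEMENT_GRID = [
        (85, "college-level mathematics"),
        (60, "intermediate algebra (preparatory)"),
        (0, "introductory algebra (remedial)"),
    ]

    def place_student(accuplacer_score: int) -> str:
        # Return the highest course level whose minimum score the student meets.
        for minimum, course in PLACEMENT_GRID:
            if accuplacer_score >= minimum:
                return course
        return PLACEMENT_GRID[-1][1]

    print(place_student(72))  # -> intermediate algebra (preparatory)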
A pre-test would meet both purposes discussed here: it would assess skills upon entry, as well as
readiness (academic preparation) to undertake MSCD’s general education courses. There is no
question that without such pre-tests, assessing the ‘value added’ of MSCD’s General Studies
curriculum is a similarly difficult undertaking. The Academic Profile (AP) testing of the
effectiveness of MSCD’s general education is predicated on the assumption that General Studies
permeates the entire MSCD curriculum; the test is designed to answer the question: “How does
the average student tested at MSCD compare to the average student tested at other institutions?”
The Academic Profile is currently serving as a post-test for General Studies skills; the efficacy of
MSCD’s particular General Studies courses is not addressed.
Therein lies an additional challenge: nearly half of MSCD’s entering students are transfer
students who have already taken some or most of their General Studies curriculum at other
institutions, so assessing MSCD’s ‘value added’ would require differentiating transfers from
native students, as well as transferred General Studies courses from General Studies courses
taken at MSCD.
Formal discussions are now underway, involving Community College of Denver faculty and
administrators and their MSCD counterparts, to address concerns with the effectiveness of
remedial courses. MSCD’s General Studies curriculum is currently undergoing the first phase of
a comprehensive review. It may be that these are the most appropriate responses to the current
lack of more conventional assessment in both of these areas.
Honors Program. Students enrolled in Honors classes are assessed in the same manner as students in MSCD’s other academic courses. However, student assessment of the Honors Program itself is
lacking. There are some indications that the program may be losing enrollment; assessing the
effectiveness of the program for the students who participate could be a starting point for a
discussion of its future positioning at MSCD. Increasing the visibility of the Honors Program is
one way of turning around the perceived community view of MSCD as primarily an “institution
of last resort”.
Online Instruction. It is becoming increasingly clear that assessment of online instruction, to
compare its effectiveness with that of resident (classroom-based) instruction, is needed. Online
education at MSCD has grown without a parallel administration, policy, or assessment that
acknowledges its differences from resident instruction. Although there have been past efforts to
address various issues (see, for example, the effort by MSCD Professor Larry Worster, in
Appendix C), a separate Task Force has been convened during the 2004-05 academic year and is
addressing this issue in a comprehensive, coordinated manner for the first time (see the draft
document in preparation by Dr. David Conde, Appendix D). The scope of the Online Task
Force’s charge includes assessment of online instruction.
Academic Advising. Advising is an activity that spans both curricular and co-curricular areas,
and it is somewhat arbitrary to differentiate academic from other types of advising related to
student success. Assessment is further complicated by the lack of a unified vision of what is
meant by ‘advising’ at MSCD. Consensus is lacking about such basic elements as a working
definition of advising, the scope of advising activities, or the appropriate distribution of advising
responsibilities across academic and/or administrative units, in spite of the fact that student
advising is now a formal part of faculty tenure dossiers. Data from the NSSE indicate that departmental advising produces higher satisfaction among students who have declared their academic majors than among non-declared (or not-yet-declared) students.
II. CO-CURRICULAR PROGRAMS & OUTCOMES
Many co-curricular programs across the country have established specific student learning
outcomes that support and enhance the mission of their college or university. These programs
are beginning to articulate the non-academic-specific skills they expect of students who interact
with their programs (see Appendix B for one such framework). These programs demonstrate
attention to instruction that transcends the classroom experience — education that encompasses
the whole collegiate experience — and thus articulate institutional learning competencies (or
student development goals) for all students. These articulated student development goals are
then transformed into action plans for educators, and are assessed regularly.
Comprehensive assessment is lacking in several co-curricular areas at MSCD:
• The Task Force found that there was little assessment of student learning in the Student Services area. While student learning occurs in various programs within the division, Student Services has not developed student learning as a cornerstone of its operations, and assessments therefore do not measure student learning outcomes.
• As growth trends continue in MSCD’s online course delivery and enrollment, it is increasingly clear that a need may exist for an online component of the Student Life and Student Services division, as an alternative to an entity tied to physical locations on the Auraria campus during “normal business hours”.
• Student advising plays a very important role in student success, but this is an activity that occurs at so many levels within the College that it will require a college-wide discussion to arrive at a coordinated means of assessment. (See Appendix G for a survey currently used by MSCD’s Advising Center.)
***
#3a : Recommend instruments and assessment activities which would provide a
comprehensive program of the assessment of student learning across MSCD.
#3b : Identify data management issues and considerations related to implementing a
comprehensive program of student learning assessment.
Many in the MSCD community would agree that there is a need to establish a “culture of
evidence”, in the form of systematic assessment activities built into the College’s infrastructure.
However, such a campus-wide assessment initiative would constitute a major culture shift for
students, faculty, and staff at MSCD. Although the elements for implementing a comprehensive
program of student learning assessment are listed last in this section, they are the foundation for
the set of recommendations which follows here.
I. RECOMMENDED INSTRUMENTS AND ASSESSMENT ACTIVITIES
1. Consider instituting a technology-based General Studies requirement. This does not
currently exist at MSCD. While there might conceivably be discipline-specific
technological skills, a campus-wide solution needs to be found, rather than piecemeal
individual department solutions. For example, assessing the information literacy and
computer literacy of entering students prior to the first week of classes would provide
them with valuable information that would bear on their full participation in the campus
community, as well as their post-graduation employability. This could be accomplished
via several avenues: a course (or courses) that all students could “test out” of, or “clinics” or one-credit-hour courses offered each semester or on weekends that could cover the relevant topics.
2. Devise a formal mechanism for using the annually-submitted reports on departmental
assessment of student learning in departmental planning processes. While academic
departments at MSCD conduct and report their annual assessments of student learning,
the Task Force found little evidence that these assessment results are systematically used
for planning activities in departments or programs.
3. Initiate a college-wide discussion to arrive at a coordinated means of assessing
student advising, and continue current efforts to assess the relative effectiveness of
departmental, school, and centralized academic advising.
4. Design an integrated curricular & co-curricular assessment program that uses
national Student Affairs models as a guide, in a purposeful, systematic, ongoing and
intentional manner. Ideally, such a program would be based on factual knowledge about
our diverse student body, what type of campus experiences they encounter and require,
and the quality of student learning outcomes that result. One example of a survey, the
Gaps Model of Service Quality, is being administered by MSCD’s Marketing department
in conjunction with one of their courses. The report for each area surveyed will be
available soon. Using such a model, academic departments could be assessed every five
to seven years, possibly in conjunction with program review. With additional resources,
the survey could possibly be handled by MSCD’s Assessment and Testing Center. A
shorter time frame (every 3 to 5 years) could be used for non-academic units.
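The Gaps Model compares what respondents expect of a service with what they perceive they actually received; a negative gap flags a service dimension that falls short of expectations. The sketch below assumes a five-point scale and uses hypothetical dimensions and ratings (the actual instrument is the one administered by the Marketing department):

    # Hypothetical Gaps Model (service quality) scoring: for each dimension,
    # gap = mean perception score minus mean expectation score.
    # Dimension names and ratings below are illustrative only.
    expectations = {"responsiveness": [5, 4, 5], "reliability": [4, 5, 4]}
    perceptions = {"responsiveness": [3, 4, 3], "reliability": [4, 4, 5]}

    for dim in expectations:
        e = sum(expectations[dim]) / len(expectations[dim])
        p = sum(perceptions[dim]) / len(perceptions[dim])
        # A negative gap indicates the unit falls short of what students expect.
        print(f"{dim}: expectation {e:.2f}, perception {p:.2f}, gap {p - e:+.2f}")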
II. CRUCIAL ELEMENTS FOR IMPLEMENTATION OF A COMPREHENSIVE PROGRAM OF
STUDENT LEARNING ASSESSMENT
1. Strong leadership will be crucial to a successful program of student learning
assessment. MSCD’s senior administration needs to provide vision, emphasis, direction
and leadership to make student learning and assessment of these learning activities a
stated priority. This is essential, but especially so for the co-curricular areas.
2. Buy-in of students for existing as well as possible new assessment tests will be
crucial. Informal student feedback in MSCD’s Assessment and Testing Center indicates
that many students feel “maxed out” with assessment testing.
3. There should be a College-wide emphasis on student learning which extends beyond
academic departments. Student learning should be a priority for all MSCD endeavors.
Co-curricular programs, including Student Services, must make student learning
intentional, deliberate and purposeful. Learning outcomes must then be systematically
assessed and incorporated into the planning cycle.
4. College resources must be committed to such a large-scale undertaking. Many
student affairs units across the nation have an Assessment and Research office, yet MSCD
does not have such a unit for the entire College. Resources needed include personnel and
operational resources for ongoing systematic assessment. Assessment instruments
currently range in cost from $11 to $23 per student per test. Professional development
resources will go a long way toward promoting faculty buy-in.
5. A way to compile and share assessment data on a platform that is accessible will be
a large and central issue. Presumably the data will need to be uploaded electronically to
CCHE, or at the least, be accessible in the likely event of an audit. Many of the annual
assessment test outcomes are not systematically considered in setting academic
departmental goals for the short- or longer-term.
6. MSCD will need a localized version of document sharing and data warehousing. It
seems logical and possibly cost-effective to investigate whether forms and data warehousing could be further developed in Banner before considering outside vendors.
Regardless of the system chosen, there would be multiple inputs and multiple users. It
would be important to allocate resources for professional development, as faculty would
need training, possibly reassigned time, and administrative support. System
administrators would also be needed; perhaps their training could be handled as part of
professional development activities. A two-year cycle in overall assessment goals could
position MSCD to meet almost any challenge or change, but strong College leadership
would be paramount in this undertaking.
APPENDIX A.
Academic
Program
African
American
Studies
Anthropology
Art
2/18/2005
Assessment
Method(s) Used:
School of Letters,
Arts and Sciences
Comprehensive essay
examination designed
by the faculty.
No report since 2001.
Faculty-designed test
given to seniors to
contrast their
knowledge and skills
related to
anthropology.
No report since 2001.
Studio art students:
evaluation of
completed artwork,
student’s artist
statement, and defense
before a faculty
committee.
Design students:
portfolio.
Art history students:
work in a senior
experience class –
research paper, oral
presentation, & test.
Being revised with
retirement of faculty.
Program Review
Last Year
Assessment Method
was Evaluated
(Program Review)
Program has few graduates.
The external consultant found the
goals too general to be measured in a
systematic fashion.
Recommendation: The consultant
suggested faculty revise their goals
and learning objectives and develop
an evaluation tool and methodology
to assess the revised goals and
objectives. The dean has asked that
the goals and methodology be
revised.
The external consultant found the
assessment procedure offered an
“evaluation of students that is
nationally consistent and thorough,”
and noted that students are achieving
the stated goals and competencies.
Concern: Three of the student
outcome goals are specifically
directed at art history students, and art
faculty assess them as one goal rather
than as separate goals.
Recommendation and Plans:
Faculty should refine their method of
assessment of art history majors
relative to the three goals, providing
an evaluation of each goal. This
concern will be addressed in the new
curriculum proposal, which is in
process.
2004
2001
14
Metropolitan State College of Denver, CO
Academic
Program
Behavioral
Science
Biology
Chemistry
2/18/2005
Assessment
Method(s) Used:
Program Review
Faculty-designed test
given to both
beginning students
and seniors to contrast
their knowledge and
skills related to
behavioral science.
Faculty are
considering using the
PLACE and PRAXIS
results since most
majors become
teachers.
Pre- and post-tests are used to assess
behavioral science students. A pretest is given to students who have
completed one social science course
and are taking a second. The same
exam is given as a post-test to
students who are a year from
graduating. Both external reviewers
were concerned about the number of
students tested — approximately 10%
of the graduates. Students whose
senior experience is student teaching
are not tested.
Recommendation and Action
Planned: Both external reviewers
recommended that all students take
the post-test, particularly students
taking the licensure sequence.
Faculty plan to work with teacher
education to discuss ways of
assessing students while they are
student teaching.
Students are assessed using the ETS’s
Major Field Achievement Test and
the external consultant noted the
results of the last five years show
students were in the 52nd percentile
(63rd in the last year.) He suggested
testing more students. The consultant
cautioned that the test assessed
content and not the skills that might
be desired, such as problem solving
and creative thinking. Faculty are
now giving a Classroom Test of
Scientific Reasoning to students at
the beginning and end of their
college career.
To assess the program, the
Department administers American
Chemical Society’s standardized
national examinations in general,
analytical, organic, and physical
chemistry as the final in the
appropriate classes. The results
enable faculty to compare the
performance of their students and
majors with that of chemistry students
nationally. Two overall percentile
rankings are reported, one for all
students who took the exam, and one
for chemistry majors.
ETS’s Major Field
Test in Biology
scores; professional
exam scores (MCAT,
VAT, DAT, GRE);
placement of
graduates.
American Chemical
Exams in general,
analytical, organic,
and physical
chemistry.
Last Year
Assessment Method
was Evaluated
(Program Review)
1999
2003
2001
15
Metropolitan State College of Denver, CO
Academic
Program
Chicano Studies
Computer
Science
English
2/18/2005
Assessment
Method(s) Used:
Program Review
Research paper
completed in the
senior experience
class. No report since
2001 or 2002.
Seniors are assessed in CHS 4850
Research Experience in Chicana/o
Studies in which they complete a
research paper on a topic of their
interest in Chicana/o Studies.
Students are expected to obtain some
research “experience” from a
community agency or institution.
The external consultant was
impressed with the amount of data
collected. He wrote: “The MSCD
assessment data [including the
surveys] is as good as the data I have
seen at schools that have achieved
CSAB accreditation. It is obvious to
me that the department responds to
assessment data; it is important that
the department document their
responses to this information.”
The College Program Review
Committee reported that the English
assessment program is strong.
Students submit portfolios of their
work, and usually two faculty
members evaluate the portfolio.
Faculty report that the assessment
results have focused their discussion
on curriculum concerns even though
they have not recently made
curricular changes based on the
results. The external consultant
examined some of the portfolios and
was impressed by the quality of
papers in them.
Faculty who taught
graduating seniors
collectively evaluate
the knowledge of each
student in each
skill/knowledge
category. The
average score in each
category is reported.
Portfolio. Content of
the portfolio differs by
concentration.
Last Year
Assessment Method
was Evaluated
(Program Review)
2002
2001
2000
16
Metropolitan State College of Denver, CO
Academic
Program
Environmental
Science
2/18/2005
Assessment
Method(s) Used:
Faculty evaluation of
students’ achievement
of the student
outcome goals in a
senior experience
course in which
students conduct a
research project.
Program Review
Student learning outcomes were
stated when the major was proposed,
but the committee found that
students’ achievement of the
outcomes is not being assessed.
Students’ achievement of some
outcomes is partially assessed in a
course taken by other majors, but no
attempt is being made to differentiate
the performances of the two groups or
to assess all the environmental
science learning outcomes. The
course in which assessment takes
place is not required of all
environmental science majors;
consequently some majors’
achievements are not assessed.
Recommendation: The College
Program Review Committee made
some suggestions on how faculty
might assess environmental science
majors’ achievement of the
environmental science learning
outcomes.
Last Year
Assessment Method
was Evaluated
(Program Review)
2004
17
Metropolitan State College of Denver, CO
Academic
Program
History
Human
Development
2/18/2005
Assessment
Method(s) Used:
Faculty evaluation of
students’ achievement
of the student
outcome goals in a
senior experience
course in which
students write a major
paper or a series of
shorter research
papers.
Faculty also use the
History MFAT and
students’ performance
on the
PLACE/PRAXIS.
New Major – Should
soon have reports.
Program Review
The external consultant found the
educational goals appropriate for an
undergraduate program in history. He
stated that his review of five years of
annual assessment reports indicated
significant progress on the part of
faculty in developing an effective
assessment plan. Assessment
activities show that writing skills and
knowledge of the third world are
areas that need improvement.
Overall, he observed that “most
students score at or above average in
the skills and content knowledge
specified in the program’s goals.” He
was pleased that some curriculum
revisions that had been accomplished
recently had been to address concerns
found by the assessment process.
Last Year
Assessment Method
was Evaluated
(Program Review)
2003
Recommendation: The external
consultant suggested the addition of a
new goal: “ability to develop and
deliver oral presentations with multimedia support.” He also mentioned
that the faculty might consider having
students develop portfolios, but
cautioned that the portfolio process is
beneficial to both faculty and students
only when there are a sufficient
number of full-time faculty. He noted
that the syllabi are specifically tied to
the Colorado standards for licensure.
Both external reviewers suggested
reviewing student performance in the
courses required for social science
licensure.
New major
18
Metropolitan State College of Denver, CO
Academic
Program
Assessment
Method(s) Used:
Program Review
Faculty adopted a
portfolio method
because employers
usually require a
portfolio. Due to
confusion, no
assessment has been
completed since the
Journalism Program
became a part of
Communication Arts
and Sciences.
Faculty use a two-part, locallydeveloped assessment test. The first
part is multiple choice and addresses
general knowledge of journalism.
The second part is a writing test.
According to the annual assessment
report, seniors are not performing as
well as faculty would like on the
assessment test. Journalism faculty
noted that there was no incentive for
students to do well and that the
assessment test did not reflect the new
emphases in photojournalism and
public relations.
Recommendation and Plans: The
College Program Review Committee
suggested that faculty consider
making the results on the assessment
test part of students’ grades and that
faculty revise the test as appropriate
for the other emphasis areas. Faculty
have become interested in using a
portfolio process, which would
enable them to include and evaluate
the work students do on internships
and other external experiences.
Professionals from the community
could be involved in the evaluation of
portfolios.
Faculty assess students’ achievement
of the student learning outcome goals
using students’ work in a senior
experience course. The College
Program Review Committee noted
that students in the GIS concentration
are not assessed, that some goals are
not assessed, and that environmental
science majors are assessed in the
same courses but there is no
differentiation by major.
Recommendation: The Committee
recommended that faculty review
their desired educational outcomes
goals and revise their assessment
methodology to create a method of
assessing the effectiveness of the
Land Use Program as a whole,
including the GIS concentration.
Journalism
Land Use
2/18/2005
Faculty evaluation of
students’ achievement
of some student
outcome goals in
ENV 4960, ENV
4970, and GIS 4890 in
which students
conduct a research
project. GIS is now
being assessed in a
manner similar to the
other concentrations.
Last Year
Assessment Method
was Evaluated
(Program Review)
2000
2003
19
Metropolitan State College of Denver, CO
Academic
Program
Mathematics
Meteorology
2/18/2005
Assessment
Method(s) Used:
Faculty who taught
graduating seniors
collectively evaluate
the knowledge of each
student in each
skill/knowledge
category. The
average score in each
category is reported.
Faculty-designed
exam given in a senior
course.
Program Review
Faculty who taught graduating seniors
collectively evaluate the knowledge
of each student in each
skill/knowledge category. The
average score in each category is
reported.
Recommendation: Dr. England
proposed that faculty expand their
one-credit senior seminar courses and
require that students produce an
artifact. Producing the artifact would
help majors integrate and synthesize
the mathematics they have learned.
Faculty could use the artifacts to
evaluate students’ achievement of the
desired student outcomes, and the
artifacts would provide evidence of
student learning that could be
examined by mathematicians external
to MSCD.
According to the external consultant,
“the educational goals in terms of
student outcomes and competencies
are well-articulated and assessed by
the faculty.” An exam is given in a
senior-level class to determine
students’ achievement of the student
outcome goals. He also wrote: “The
assessment plan is well-conceived and
implemented. The stability of test
results through the years is a positive
reflection of the consistency of
desired competencies.”
Last Year
Assessment Method
was Evaluated
(Program Review)
2002
2003
20
Metropolitan State College of Denver, CO
Academic
Program
Modern
Languages
Assessment
Method(s) Used:
Program Review
Current methods:
Brigham Young
University
Computerized
Adaptive Proficiency
Examination in
selected first- and
second-year language
courses. The courses
chosen are end of
sequence courses at
each level and
indicate whether
students are prepared
to take classes at the
next highest level. At
the upper division
level until 2002 a
modified oral
proficiency interview
was given to
graduating seniors in
all 4000 level courses
designated as Senior
Experience courses.
Assessment at the end of the first-year
and second-year sequences: Faculty
have been experimenting with the
Brigham Young University
Computerized Adaptive Placement
Examination. It is not meeting all
their needs, and in 1998-99, they
added an individual oral interview in
some classes.
Assessment of Seniors: Faculty use
students’ work in the senior
experience course as their assessment
measure, reporting only the mean
score. The external consultant found
the senior assessment plan weak and
ineffective. The College Program
Review Committee was concerned
that no data were provided for
specific desired outcomes.
Recommendation and Actions
Taken: Following the consultant’s
suggestion, faculty are developing an
oral proficiency exam and they are
researching portfolio programs. The
College Program Review Committee
urged faculty to use a method that
provides information about students’
attainment of desired goals.
Committees of full-time music faculty
review students at seven points in
their studies using the National
Association of Schools of Music
quality standards that are appropriate
to the sub-discipline and stage in their
studies. The external consultant
found the assessment methodology
gave a thorough evaluation of
students’ achievements and
competencies, and he noted that
faculty had used assessment results to
modify their teaching. He stated that
he would not recommend any changes
in the criteria or procedure.
Music
Music performance
students (BM in
Music): Senior
Recital. BA in Music
students: Senior
Project.
Music Education
Music education
students: Piano
Proficiency Exam and
Final Jury
2/18/2005
Last Year
Assessment Method
was Evaluated
(Program Review)
2000
2002
See Music above.
21
Metropolitan State College of Denver, CO
Academic
Program
Assessment
Method(s) Used:
Program Review
Seniors are required to write essay
responses to four questions to
demonstrate their knowledge of and
skills in philosophy. Each essay is
graded by at least two faculty and the
results are reviewed by all faculty.
Both reviewers were concerned that
this take-home assessment exam does
not count for part of a student’s grade
for a course, and students,
consequently, do not take the
assessment test seriously.
Philosophy
The portfolio method
was adopted because
it was recommended
by external consultant
and also considered
more effective.
Physics
Physics faculty including UCD faculty
- collectively assess
seniors’ achievement
of the student
outcome goals.
Missing 2003, 2004
2/18/2005
Last Year
Assessment Method
was Evaluated
(Program Review)
2000
Recommendation and Plans: Both
external reviewers thought the
assessment methodology should be
improved, and they made a number of
suggestions, including alternative
methodologies. The committee
recommended that faculty consider
making the test count as part of the
grade for a course and offered as an
alternative, requiring the GRE. The
consultant mentioned requiring
students to create a portfolio and that
faculty connect the development of
the portfolio with advising. Faculty
will consider using the GRE;
however, they would prefer to use a
portfolio process. Students would
begin compiling their portfolio at the
end of the sophomore year.
All faculty who have taught a
graduating senior rate the senior’s
achievement of each educational goal,
and the ratings are averaged. The
external consultant considered the
assessment process “excellent.”
2002
22
Metropolitan State College of Denver, CO
Academic
Program
Political Science
Assessment
Method(s) Used:
Faculty’s assessment
of seniors’
achievement of the
student learning goals
in the senior
experience course.
They also use ETS’s
Major Field Test in
Political Science
Psychology
Term paper in the
senior experience
course; research paper
and oral presentation
in a junior-level
course; student
survey.
Social Work
Employers survey
sent in evennumbered years;
capstone projects; exit
portfolios;
Baccalaureate
Program Directors
(BPD) Alumni Survey,
a standard assessment
used nationwide, sent
in odd-numbered
years.
2/18/2005
Program Review
Political science faculty use
Educational Testing Services’ Major
Field Achievement Test to assess
students’ learning. Forty students
took the test in 2002-03, and the
results were favorable, with the MSCD
mean being above the national mean
in most measures except United
States Government and Politics. The
mean percent correct scores in the
area of methodology were not strong,
but the MSCD mean was slightly
higher than the national mean.
The program uses a variety of
assessment measures: a term paper, a
student survey, and a research project
summarized in a paper and an oral
presentation. Changes have resulted
from the assessment activities.
Among them are the requirement of
out-of-class writing in all psychology
courses, the development of uniform
course standards, and the requirement
of a new, creative project in the final
research course rather than allowing
use of a previously existing project.
A Community Advisory Board
annually reviews the program’s
assessment results and, if appropriate,
suggests changes.
Last Year
Assessment Method
was Evaluated
(Program Review)
2004
1999
Review in progress.
23
Metropolitan State College of Denver, CO
Academic
Program
Sociology
2/18/2005
Assessment
Method(s) Used:
Faculty-developed test
given to seniors.
Program Review
According to the external consultant,
the program has a highly developed
process for assessing how well majors
meet identified student learning
objectives. He noted that faculty
analyze quantitative data generated by
evaluating essay exams given to
seniors annually. He was pleased that
the assessment results clearly indicate
that program majors develop a keen
awareness of diversity issues.
Overall, the results indicate that
program goals are generally met.
Highest overall scores on the
assessment test occurred in 2000, with
a slight decrease since that date.
Students consistently score lowest at
demonstrating an ability to apply
knowledge and skills to facilitate
social change. Faculty have
considered creating an applied social
change course at the 3000 level to
ensure that more majors take a
course in social change.
Recommendation: The external
consultant recommended that trends
in the assessment test should be
closely monitored to determine
whether decreased overall scores
represent an ongoing decline. Both
external reviewers suggested that a
course on applied social change could
address the low scores on the
associated objective. The College
Program Review Committee
suggested that an alternative strategy
would be to incorporate topics related
to social change into a variety of
courses. Faculty may want to
investigate whether their new course,
Social Action Through Art, is
contributing towards achieving this
objective.
Last Year
Assessment Method
was Evaluated
(Program Review)
2004
24
Metropolitan State College of Denver, CO
Academic Program: Speech Communication

Assessment Method(s) Used: Faculty collectively assess seniors' achievement of the desired outcomes. Faculty assess all seniors with whom they have worked, and the averages are reported.

Program Review: Faculty use a Likert-type assessment scale to evaluate seniors' achievement of the program's educational goals. Internship supervisors provide evaluations of students, but not in a systematic way. Students in two first-year General Studies courses are also assessed, using different methodologies. The external broadcasting consultant found that students were aware of the competencies expected of them. Another consultant observed that there is no testing of seniors' oral performance skill, and she doubted that the present methodology truly assesses the range of competencies desired by the faculty.

Recommendations: Two external consultants urged creation of a systematic way of using the internships for assessment. One consultant advocated the development of a strong alumni-tracking procedure and of needs-assessment procedures to determine the market need for the program's concentrations.

Last Year Assessment Method was Evaluated (Program Review): 2000

Academic Program: Theatre

Assessment Method(s) Used: Will use a portfolio, work in a senior experience course, and exit interviews.

Program Review: New major.

School of Business

Academic Program: Accounting

Assessment Method(s) Used: ETS Major Field Test for Business, accounting subscore.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

Academic Program: Computer Information Systems

Assessment Method(s) Used: A department-developed test and ETS Major Field Test for Business.

Last Year Assessment Method was Evaluated (Program Review): Review in progress
Academic Program: Economics

Assessment Method(s) Used: Work in a senior experience course, which includes a senior project, short papers, and class discussions. All faculty are offered the opportunity to be involved in evaluating the project. Faculty are also planning to require an oral presentation of the capstone research project. Finally, faculty use ETS's Major Field Test in Economics.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

Academic Program: Finance

Assessment Method(s) Used: ETS Major Field Test for Business; reports obtained from internship supervisors.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

Academic Program: Management

Assessment Method(s) Used: ETS Major Field Test for Business.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

Academic Program: Marketing

Assessment Method(s) Used: ETS Major Field Test for Business.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

School of Professional Studies

Academic Program: Aviation Management

Assessment Method(s) Used: Internship supervisors' evaluations.

Last Year Assessment Method was Evaluated (Program Review): Review in progress

Academic Program: Aviation Technology

Assessment Method(s) Used: Flight skills and aeronautical knowledge tested for students enrolled in AES 4710.

Last Year Assessment Method was Evaluated (Program Review): Review in progress
Academic Program: Civil Engineering Technology

Assessment Method(s) Used: Previous method: graduates' scores on the Engineering in Training exams, if available.

Program Review: The ABET accreditation team noted that although the engineering technology programs had started to gather assessment data, the results were not being used or analyzed. Faculty have not established performance criteria to determine students' level of competence on desired student outcomes. The College Program Review Committee stated that more information about graduates' abilities was needed, since some survey results were inconsistent.

Recommendation and Action Taken: ABET requires that the program implement a continuous improvement plan, being sure that the results obtained can be used for improvement and that students' competencies are evaluated. The College Program Review Committee advised that if faculty could not assess all student learning outcomes with one instrument, they might consider using students' work in selected upper-division courses to supplement the assessment instrument.

Last Year Assessment Method was Evaluated (Program Review): 2004

Academic Program: Criminal Justice and Criminology

Assessment Method(s) Used: The goals are being revised. A departmentally created assessment exam has been piloted. Results of this exam will be counted in CJC 4650 to ensure students take the exam seriously.

Last Year Assessment Method was Evaluated (Program Review): Review in progress
Academic Program: Electronics Engineering Technology

Assessment Method(s) Used: Past method: faculty evaluation of students' achievement of the student outcome goals in a senior experience course. As faculty resources dropped, this assessment activity was no longer carried out.

Program Review: Since the ABET site visit in October 2003, the faculty have been working on mapping program outcomes to specific course offerings. Once the mapping is completed, faculty will review the student products generated in the courses (e.g., written reports, examinations) to determine which of those products can be used to determine students' level of competency on a TAC Criterion 1 outcome. Once those products are determined, faculty will identify performance criteria. In addition, faculty are developing an assessment instrument, and appropriate performance criteria relative to that instrument, that will measure students' competencies and provide direct feedback to faculty and administrators regarding needs for program improvement. The protocol will be used this semester, and results will be analyzed and used to improve the program. A copy of the instrument is enclosed. (Information sent to ABET in April 2004; see also MET and CET.)

Last Year Assessment Method was Evaluated (Program Review): 2004

Academic Program: Health Care Management

Assessment Method(s) Used: Practicum preceptor evaluation, faculty supervisor evaluation, and student self-evaluation of seniors' achievement of the program exit behaviors; pre- and post-testing using the Watson-Glaser Critical Thinking Appraisal.

Program Review: Three measures are used to assess student outcomes. The Watson-Glaser Critical Thinking Appraisal is given at the beginning and end of the program. Faculty have identified five General Studies exit behaviors, and faculty assess students' achievement of the behaviors at the same time that students self-assess their achievement of those behaviors. Finally, faculty have identified eight health care management exit behaviors (educational outcomes). Faculty and practicum preceptors assess students' achievement of those behaviors, and students are asked to self-assess their achievement of the behaviors. The AUPHA reviewers wrote that faculty addressed program evaluation/assessment in an "innovative and creative manner" and recommended that the activities continue and be published.

Last Year Assessment Method was Evaluated (Program Review): 2002
Academic Program: Hospitality, Meeting, and Travel Administration

Assessment Method(s) Used: Faculty evaluation of a senior research project; industry professionals' evaluations of student team projects; industry certifications; exit survey of seniors. No analysis is provided by individual goal.

Program Review: For two years, 1995 and 1996, several student outcome goals were assessed in a senior experience course in which students designed and completed a research project working with two industry advisors. Students' achievement of some of the goals was measured using the advisors' evaluations. In 1998, HMTA faculty switched to asking seniors to evaluate their preparedness in different aspects of the industry and their preparedness for present and future positions in the industry. The committee criticized this methodology because students were judging their own preparedness.

Recommendation and Plans: The College Program Review Committee urged faculty to develop alternative measures of assessment. Faculty plan to expand the opportunities for assessment in the senior experience courses.

Last Year Assessment Method was Evaluated (Program Review): 1999

Academic Program: Human Performance and Sport

Assessment Method(s) Used: Internship or student-teaching supervisor evaluation and senior self-evaluation of graduating seniors' achievement of the student outcome goals.

Program Review: Faculty currently assess students' knowledge and skills by having the supervisors of required internships evaluate students' performances and by having students evaluate their own ability. The results are compiled by concentration. The low numbers of student and supervisor responses for some of the concentrations were a concern, and faculty were beginning to question the assessment methodology.

Recommendation: The College Program Review Committee urged faculty to review their assessment methodology; if the current methodology is kept, both external reviewers advocated that faculty take steps to increase the number of responses, especially from internship supervisors.

Last Year Assessment Method was Evaluated (Program Review): 2002
Academic Program: Human Services

Assessment Method(s) Used: Agency supervisors' evaluations of seniors' competencies shown during a required senior-level professional internship.

Program Review: To assess the program, faculty ask supervisors of students' required internships to evaluate aspects of the intern's performance. The College Program Review Committee noted that the faculty analyze the results extensively, but they do not analyze the results by concentration, so they are not able to evaluate students' achievements in the different concentrations. Also, the Committee was not certain that the same assessment methodology was appropriate for the NOA concentration.

Plans: Faculty intend to analyze the assessment results by concentration and to devise a different assessment methodology for the NOA concentration.

Last Year Assessment Method was Evaluated (Program Review): 2004

Academic Program: Industrial Design

Assessment Method(s) Used: Portfolio.

Program Review: Currently, Industrial Design faculty annually evaluate student portfolios as their assessment methodology. Portfolios are submitted by students at various educational levels, not just by seniors. With accreditation by the National Association of Schools of Art and Design (NASAD), the evaluation of portfolios will assume a greater importance than it currently has. Portfolios will be used in two ways: first, they will provide a method of distinguishing students who have the potential to become professional designers from those who do not, and second, they will serve as an assessment tool and process for reviewing the consistency and quality of continuing students' work.

Last Year Assessment Method was Evaluated (Program Review): 2003
Academic Program: Leisure Studies

Assessment Method(s) Used: Internship supervisor evaluation and senior self-evaluation of graduating seniors' achievement of the student outcome goals. One concentration also uses the National Council for Therapeutic Recreation Certification's Certification Exam for Certified Therapeutic Recreation Specialist.

Program Review: The 1999 Therapeutic Recreation test report showed that the mean performance of MSCD's 13 graduates was significantly above both the regional and national means of test takers. The College Program Review Committee observed that each year the number of returned assessment evaluations was extremely low and, therefore, not adequate to draw any conclusions.

Recommendation: The committee suggested creating a new method of assessment or a new mechanism for collecting the evaluations and, if the latter, stated that a goal should be to obtain evaluations from over half of the seniors. Plan: Faculty plan to implement a new strategy to collect the evaluations; they will allow the agencies' responses to be anonymous.

Last Year Assessment Method was Evaluated (Program Review): 2001

Academic Program: Mechanical Engineering Technology

Assessment Method(s) Used: Previous methods: faculty-designed test; graduates' pass rate on the Fundamentals of Engineering Exam.

Program Review: The ABET team noted that although the engineering technology programs had started to gather assessment data, the results were not being used or analyzed. Faculty have not established performance criteria to determine students' level of competence on desired student learning outcomes. The committee observed that employers' responses to the program review survey indicated a need for an assessment instrument that can determine students' ability to apply job-related knowledge and skills.

Recommendation and Action Taken: ABET requires that the program implement a continuous improvement plan, being sure that the results obtained can be used for improvement and that students' competencies are evaluated. Faculty discussed several approaches but proposed an examination. ABET emphasized that a variety of methods must be used. See also EET.

Last Year Assessment Method was Evaluated (Program Review): 2004
Academic Program: Nursing

Assessment Method(s) Used: Practicum preceptor evaluation, faculty supervisor evaluation, and student self-evaluation of seniors' achievement of the program exit behaviors; pre- and post-testing using the Watson-Glaser Critical Thinking Appraisal.

Program Review: The National League for Nursing requires that a program evaluate itself using 22 criteria. Each program is required to develop a "systematic plan for evaluation and outcomes assessment." The NLN team was pleased that all the methods and tools referenced in MSCD's systematic plan had been tested for reliability, validity, and trustworthiness. The team was concerned, however, that at the time of the accreditation visit, less than half of the criteria had been evaluated as planned.

Last Year Assessment Method was Evaluated (Program Review): 2001

Academic Program: Special Education

Program Review: New major.

Academic Program: Surveying and Mapping

Assessment Method(s) Used: Faculty plan to start using a comprehensive examination.

Program Review: Outcomes assessment was cited as a weakness by ABET in 2004: "Some elements of graduate assessment with respect to program objectives appeared to exist in an informal or irregular manner." ABET also cited a lack of documentation. MSCD reported to ABET that an outcomes assessment system and continuous improvement system have been created and implementation is underway.

Last Year Assessment Method was Evaluated (Program Review): 1999

Academic Program: Technical Communications

Assessment Method(s) Used: Faculty evaluation of individual students' achievement of the student outcome goals in a senior experience course.

Program Review: Faculty have used two assessment methods: evaluation of portfolios of students' work in 1994 and 1995, and assessment of majors in a capstone course since 1996. In the second, the instructor of a capstone course evaluates majors' attainment of each of the student outcome goals. The consultants found the portfolio method commendable and encouraged evaluation of the portfolios by members of the industry. They found the capstone course method a "...significant way to measure what majors have learned..." and noted that "assessment results indicate that students have obtained program competencies." Faculty report that the assessment results have, in part, influenced some of their curriculum changes.

Last Year Assessment Method was Evaluated (Program Review): 1999
Other Programs

Academic Program: Individualized Degree Program

Assessment Method(s) Used: Survey of faculty teaching a senior course in which an IDP student is enrolled, asking about the IDP student's achievement of goals; survey of that year's graduates asking if they believe they achieved the goals.

Program Review: The program assesses skills and abilities that cross disciplinary lines, e.g., "self-directed learning skills" and "the ability to identify, analyze, and systematically address problems and questions in her/his field of study and/or work," because the disciplines studied vary from student to student. IDP graduates are asked to indicate their growth in achieving the desired skills and abilities, and faculty who taught a graduating IDP senior are asked to rate the student's achievement of the skills and abilities. The external consultant wrote, "Past assessments reveal that IDP students have indeed achieved the desired competencies identified for the review." The staff continues to search for ways to improve assessment.

Academic Program: General Studies

Assessment Method(s) Used: Short form of the ETS Academic Profile exam, given in select senior experience courses.

Academic Program: General Studies

Program Review: Program review surveys sent to seniors ask how much their experiences at the college contributed to their achievement of the General Studies goals. Program review surveys sent to graduates ask how much their experiences at the college contributed to their achievement of the General Studies goals. Program review surveys sent to employers of MSCD graduates ask them to rate their employees' abilities and characteristics; the abilities and characteristics reflect the goals of the General Studies Program.

Academic Program: Honors Program

Assessment Method(s) Used: Individual courses in the Honors Program are assessed in the same manner as all other MSCD academic courses.

Recommendation: The College Program Review Committee urged the Director of the Honors Program to assess how well the Program was meeting the needs and expectations of the Honors students. To date, this has not been done.

Last Year Assessment Method was Evaluated (Program Review): 2001
APPENDIX B.
Student Learning Outcomes: A Conceptual Framework
Adapted from: Learning Reconsidered: A Campus-wide Focus on the Student Experience, National Association of Student Personnel Administrators and American College Personnel Association, 2004.
Student Outcome: Cognitive Complexity
Dimensions of Outcomes: Critical thinking, reflective thinking, effective reasoning, intellectual flexibility, emotion/cognition integration, identity/cognition integration.
Sample Developmental Experiences for Learning: Classroom teaching, readings and discussion; campus speakers; problem-based learning; action research; study abroad; learning communities; campus newspaper and media; cultural advocacy groups; diversity programs.

Student Outcome: Knowledge Acquisition, Integration, and Application
Dimensions of Outcomes: Understanding knowledge in a range of disciplines (acquisition); connecting knowledge to other knowledge, ideas, and experiences (integration); relating knowledge to daily life (application); pursuit of lifelong learning; career decidedness; technological competence; research skills.
Sample Developmental Experiences for Learning: Majors, minors, general education requirements, certificate programs, laboratories; action research; research teams; service learning; group projects; internships; jobs (on/off campus); career development courses and programs; living-learning communities; Web-based information search skills; activities programming boards (e.g., speakers); drama, arts, and music groups; literary magazines; special teams and activities (e.g., solar car, Model UN).

Student Outcome: Humanitarianism
Dimensions of Outcomes: Understanding and appreciation of human differences; cultural competency; social responsibility.
Sample Developmental Experiences for Learning: Multicultural requirements; membership in diverse student organizations; service learning; community-based learning; cultural festivals; identity group programming (e.g., diversity programs); study abroad; interdisciplinary courses; curriculum transformation.

Student Outcome: Civic Engagement
Dimensions of Outcomes: Sense of civic responsibility; commitment to public life through communities of practice; engagement in principled dissent; effectiveness in leadership.
Sample Developmental Experiences for Learning: Teach-ins; involvement in academic department/major; involvement in student organizations; service learning; various student governance groups, such as student government; club sports; emerging-leader programs; leadership courses; open forums; activism and protest; student conduct code; identity with campus community.

Student Outcome: Interpersonal and Intrapersonal Competence
Dimensions of Outcomes: Realistic self-appraisal and self-understanding; personal attributes such as identity, self-esteem, and confidence; ethics and integrity; personal goal setting; meaningful relationships; interdependence; collaboration; ability to work with people different from oneself.
Sample Developmental Experiences for Learning: Identity-based affinity groups; personal counseling; academic/life planning; individual advising; support groups; peer mentor programs; student governance groups; paraprofessional roles (e.g., tutoring, peer educators, peer mentor programs); student success and Upward Bound programs; disability support services; student employment; classroom project groups; classroom discussions.

Student Outcome: Practical Competence
Dimensions of Outcomes: Effective communication; capacity to manage one's affairs; economic self-sufficiency and vocational competence; maintaining health and wellness; prioritizing leisure pursuits; living a purposeful and satisfying life; technological competence.
Sample Developmental Experiences for Learning: Campus recreation programs; health center programs; drug and alcohol education; career development courses and programs; financial planning programs; club sports and recreation programs; personal counseling; academic/personal advising; portfolios; senior capstone course.

Student Outcome: Persistence and Academic Achievement
Dimensions of Outcomes: Managing the college experience to achieve academic and personal success; academic goal success, including degree attainment.
Sample Developmental Experiences for Learning: Learning skills; bridge programs; peer mentoring; faculty and staff mentoring; supplemental instruction/tutoring; orientation programs; academic advising; financial aid; disability support services; child care services.
APPENDIX C.
http://clem.mscd.edu/~worster/observation/online_guidelines.html
Online Class Observation Guidelines
Student Confidentiality: Students should be informed by the professor that their bulletin board
responses, chat rooms, emails, and other forms of communication will be observed during the
observation period.
Note: An observation of online teaching is obligatory for faculty whose handbook requirements call for observation of this portion of their teaching assignment.
The Online Class Peer Observation and Evaluation consists of four parts.

• Online Class Pre-Observation Request
  o This information is intended to be confidential between the observer and observee. Submission of this form will generate an email to the observer and observee. This email is only as secure as standard email.
  o If the current instructor is using a Web site created by another instructor, separate observations for each independently created component are recommended. This form allows for the identification of which components are to be critiqued during this observation.

• Online Class Post-Observation Comments
  o This form is optional. This information is intended to be confidential between the observer and observee. Submission of this form will generate an email to the observer and observee. The resulting email is only as secure as standard email.

• Online Class Observation Summary
  o This information may become part of the faculty member's permanent record for the purpose of annual evaluation, retention, and promotion. Submission of this form will generate an email to the observer and observee. The faculty member may choose to have the results emailed to the department chair. The resulting email is only as secure as standard email.

• Online Class Observation Response
  o This form is optional. The faculty member who was observed may respond to the observation summary. This information may become part of the faculty member's permanent record for the purpose of annual evaluation, retention, and promotion. Submission of this form will generate an email to the observer and observee. The faculty member may choose to have the results emailed to the department chair. The resulting email is only as secure as standard email.
The observer should take care to try to understand the approach to online education taken by the instructor. If possible, all forms of communication and assessment should be observed, including:

• Internal Web pages
  o Web pages that are housed on the official Web site for the class.
• External Web pages
  o Web pages that are housed outside the official Web site, often referred to as Web library resources or a Webliography.
• Email
  o The professor should copy the observer on responses to students and include students' messages in response.
• Broadcast Email
  o The professor should copy the observer on responses to students.
• Bulletin Board or Forum Discussions
  o The observer should take care to observe bulletin board discussions.
• Assessments
  o Online assessments, both practice and real tests, should be viewed by the observer.
  o Assessments that are administered on the campus or in the assessment center may not be available for viewing by the observer. The online professor may choose to make these available.
• Chat Room Sessions
  o If a chat room is used as an instruction tool, the observer may choose to observe a session by "lurking."
APPENDIX D.
I. INSTITUTIONAL ONLINE EDUCATION NARRATIVE
A. MSCD DISTINGUISHING FEATURES
With a Fall 2004 unduplicated headcount of 20,791, Metropolitan State College of
Denver (MSCD) is the largest four-year, public, undergraduate institution in the United States.
In 1965, the Colorado Legislature established MSCD as an urban college of opportunity with the
following mission:
“The mission of MSCD is to provide a high-quality, accessible, enriching education that
prepares students for successful careers, post-graduate education, and lifelong learning in a
multicultural, global, and technological society. The college fulfills its mission by working in
a partnership with the community-at-large and by fostering an atmosphere of scholarly
inquiry, creative activity, and mutual respect within a diverse campus community.”
In 2002, MSCD was honored as one of five outstanding public, comprehensive colleges offering bachelor's degrees in the western United States. In the same article, MSCD was named one of the top three institutions in the western region for minority enrollment, and number six for Hispanic enrollment (U.S. News and World Report, 2002).

MSCD offers baccalaureate degrees through three schools: the School of Letters, Arts, and Sciences (LAS); the School of Business (SCOB); and the School of Professional Studies (SPS). LAS provides degrees and certificates in 30 majors. SCOB offers courses leading to bachelor's degrees and certificates in six majors. SPS makes available degree and licensure programs in teacher education, technology, and public service professions.
In 1997, MSCD received a ten-year accreditation from the North Central Association of
Colleges and Schools (NCA). MSCD also possesses accreditations from the following agencies:
Technology Accreditation Commission of the Accreditation Board for Engineering and
Technology, Inc.; National Recreation and Park Association; American Association for Leisure
and Recreation; National Association of Schools of Music; National League for Nursing
Accrediting Commission; Council on Social Work Education; and National Council for
Accreditation of Teacher Education.
B. STUDENT CHARACTERISTICS
MSCD primarily serves residents of metropolitan Denver (93.5%); another 5.6% of students come from other locations within Colorado, and nearly 1% come from out of state or foreign countries. Approximately 25% are students of color, closely reflecting Denver's ethnic population.
Please refer to the following chart for details regarding student characteristics.
C. FACULTY CHARACTERISTICS
MSCD recognizes faculty as being among its greatest strengths. Terms such as "quality," "competent," "respect," "dedication," and "commitment to culturally diverse and disadvantaged students" routinely appear in reports, correspondence, and instructor evaluations.

MSCD lists 446 full-time and 555 adjunct faculty. Of these, 76% hold doctoral degrees and another 22% hold master's degrees; 18% represent minority cultures.
Faculty characteristics for MSCD are presented in the following chart.
II. ONLINE EDUCATION COMPREHENSIVE DEVELOPMENT PLAN NARRATIVE
A. VISION OF ONLINE EDUCATION AT MSCD
In undertaking undergraduate education, the institution sees technology as a tool for teaching
and learning and encourages creative uses of this technology to provide the resources for faculty
to fulfill the mission of the institution.
Within the wider educational landscape of the college, faculty and students move freely between classroom-based and online classes with an understanding of the demands and the value of each type of course delivery. Faculty are comfortable in the knowledge that the online curriculum is high quality, accessible, constantly evolving, and dynamic.

There is also a primary understanding that technology is a tool for learning and not an end in itself. Technology fulfills its promise when the institution and its people develop the ability to use it to its capacity and do not see it as a hindrance or barrier to learning or to expression.

When technology is applied in the delivery of instruction, the curriculum is delivered in the way best suited to each discipline, and online delivery is seen as neither a different, a better, nor an inferior way of teaching and learning. Rather, it extends education beyond the walls of the campus by employing more effective or efficient ways of communication.

In the online environment, faculty will move freely among the hardware and software, not seeing technology as a barrier but understanding its uses so well that it improves learning. Online education calls for a new generation of faculty who see curriculum and disciplines through the eyes of tradition but who apply imagination and high-level skills to reach students who are eager to learn in as many ways as possible.

Online education redefines the academic community, which becomes boundless yet exists in a virtual and psychological place that is MSCD. The college becomes a psychological space where learning exists and takes place in many ways.
Critical thinking and reflection are assisted by different pedagogies and by the knowledge and cooperation shared among faculty and students. This is unchanged by the application of technology: online education continues to contribute to the traditions of our disciplines by drawing connections between ideas and trends, as the academy instills the desire for learning and offers abilities, skills, and knowledge to a new generation of students and faculty.
B. DEFINITION OF ONLINE EDUCATION FOR MSCD
Although effective working definitions of online instruction already appear in the literature, MSCD must generate its own definition in order to guide the direction of the institutional and program development process. [Should we say why? Some may ask, why reinvent the wheel? This came up during the Friday 1/21/04 meeting of the Policy group.] This is critical given the current application of more than one method of instructional delivery and assessment that uses the same online platform. Preliminary efforts to define online education for MSCD have yielded the following:
In order to participate in mostly online learning, the faculty must:
• Prepare online content
• Arrange for online or on-campus testing
• Do the same course preparation as for on-campus courses [but some faculty say that it involves more time and effort than for on-campus classes]
• Translate instruction to Web environment techniques
• Learn or execute special methods of instructional design, communication, etc.
• Motivate student learning without having face-to-face interaction
• Communicate extensively with students
• Have extensive technical knowledge
In order to participate in mostly online learning, the student must:
• Have extensive technical knowledge [is this necessarily so? If it's "user-friendly," could students get by with less extensive technical knowledge? Otherwise, we have a big problem with the "digital divide"]
• Be a motivated self-learner
• Pay student fees and online fees
• Communicate extensively with the instructor
Other overarching issues that must be addressed are:
• How does the instructor orient students to the class?
• How does the instructor assess student learning in the class?
• How does the instructor guard against cheating?
• Is a special online orientation/testing center needed?
• How does the instructor stay consistent with other MSCD policies, such as attendance during the first week of class?
Specific definitions that reflect MSCD's working reality are:
Blended Online Learning: A Web-based, asynchronous environment is used. The instructor can choose the number of scheduled on-campus classes; the student/instructor face-to-face contact has an instructional purpose; the student pays campus fees only.

Mostly Online Learning: A Web-based, asynchronous environment is used. The instructor schedules one or more non-instructional meetings on campus, such as an orientation or assessment; the student may or may not meet with the instructor during the visit.

Totally Online Learning: A Web-based, asynchronous environment is used, with no on-campus class attendance requirement.
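These three working definitions differ only in whether on-campus meetings are scheduled and whether those meetings are instructional. The following sketch is offered purely as an illustration of that distinction (the function and type names are hypothetical, not part of any MSCD system):

```python
# Minimal sketch, assuming hypothetical names: classifies a course section
# under the three working definitions above.
from enum import Enum

class DeliveryMode(Enum):
    BLENDED = "Blended Online Learning"
    MOSTLY = "Mostly Online Learning"
    TOTALLY = "Totally Online Learning"

def classify(on_campus_meetings: int, instructional: bool) -> DeliveryMode:
    """All three modes assume a Web-based, asynchronous environment."""
    if on_campus_meetings == 0:
        return DeliveryMode.TOTALLY   # no on-campus class attendance requirement
    if instructional:
        return DeliveryMode.BLENDED   # face-to-face contact has an instructional purpose
    return DeliveryMode.MOSTLY        # non-instructional meetings, e.g., orientation or assessment

# Example: a section with two non-instructional orientation meetings
print(classify(2, instructional=False).value)  # Mostly Online Learning
```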
C. ACTIVITY GOALS
To construct this program to best serve the institution's needs, we propose a series of development activities:
1. To develop an online instructional system that is well defined and compatible with MSCD's instructional mission.
2. To construct a program of offerings that takes advantage of the knowledge gained from current online offerings.
3. To develop an online instructional program that directly involves faculty and academic program staff in the construction, maintenance, growth, and assessment of the total program offerings.
4. To broker the integration of curriculum, technology, faculty, and staff into a common commitment, understanding, and working environment distinct to online instruction.
D. ACADEMIC FACULTY AND STAFF ROLE AND FUNCTION
It is the faculty and academic leadership, including deans and chairs, who need to be charged with responsibility for the development of online programming and the structure that will house it. Committees and task groups provide the best and most consistent method of constructing the organizational mechanism to develop the program and to monitor and assess its effectiveness. Questions to be addressed by the task groups include:
1. Policy Group:
• What should be the definition of online education at MSCD?
• What is the scope of the role of online in MSCD's curriculum offerings?
• Who should teach online and why?
• What infrastructure should be designed to most effectively integrate online instruction into the institution?
• What should be the relationship of classroom-based and online curriculum?
• What should be the relationship of the classroom-based and online faculty?
• What should be the structure for governance of online curriculum decisions and approvals?
• What should be the structure for governance of online faculty?
• How should faculty intellectual property rights be addressed?
• What is the appropriate language for ADA compliance in online education?
• What is the relationship between a potential virtual college concept and other areas of the institution?
2. Curriculum Group:
• Who governs the development of online curriculum in each discipline?
• What type of design formats best carry out the delivery of online instruction?
• How are the syllabi for online courses different from those for on-campus courses?
• How do we determine the best way for students to be assessed and tested in an online course?
• What should be the elements of a faculty assessment instrument for online instruction?
• How do we determine online class size?
• What are the elements of a uniform presentation design that is more user-friendly for both faculty and students?
• What are the principles that will govern the design of online course construction to ensure that there is continuity across programs, schools, and the college?
• How many design formats are appropriate to accommodate the needs and styles of all college disciplines?
3. Faculty Personnel Group:
• What should be the process for selecting full-time faculty for online instruction?
• What should be the process for selecting part-time faculty for online instruction?
• What are the training requirements for online faculty?
• What should be the procedure for certifying faculty to teach online?
• Who should do the training for the development of online faculty?
• What are the contractual obligations of online faculty, and what are the salary considerations beyond those already established for on-campus faculty?
• What will be the mechanism for peer review, and how will it be applied?
• How shall the class loads for full-time faculty be determined?
• How shall the class loads for part-time faculty be determined?
• Should we recruit faculty specifically to teach online courses?
• Should there be a faculty personnel category relating to online instructors?
4. Use of Technology Group:
• What are the computer literacy requirements for online instruction, and what kind of training is required to achieve them?
• How can all of the electronic services of the college be integrated to support online students?
• What student support services are available to the online student community?
• What is Information Technology's role in supporting online students?
• What is Banner's role in student support services?
• What is the projection for growth in online instruction, and what are the resource needs for technology infrastructure to meet this growth?
• What would be the relationship of a "virtual college" to the other parts of the institution?
APPENDIX E.
http://chronicle.com
Section: Information Technology
Volume 51, Issue 12, Page A33
Testing Service to Unveil an Assessment of Computer and Information Literacy
By JEFFREY R. YOUNG
The Educational Testing Service plans to unveil a standardized test this week designed to
measure the computer savvy and information literacy of college students.
The test will evaluate how well students can judge the validity of online information, build a
spreadsheet, compose e-mail messages summarizing a passage, and perform other tasks. Called
the ICT Literacy Assessment -- the first three letters stand for "information and communication
technology" -- it was developed on the basis of advice from a committee of representatives from
seven universities.
Officials of the testing service said the new exam, which students will take online, is unique
because it attempts to measure not only proficiency in using computer software, but also
information-processing skills that are increasingly important in college and in many jobs.
The testing service, which administers the SAT in college admissions, hopes to have colleges
give the new exam to students to see how well prepared they are for high-tech assignments.
"Students generally arrive at colleges knowing how to use a computer or use technology for a
number of purposes, but they're not necessarily the purposes that will make the students
successful," said Teresa M. Egan, project manager for new-product development at ETS. "They
can chat with their friends, they can download MP3 files, do instant messaging, and all that, but
they've kind of lost the ability to home in on a research topic and ... evaluate all the information
that's thrown at them."
The first batch of tests is expected to be given in January. During the test's first year or so, the
service will release only aggregate results, which will be given to colleges that administer the
test. Test takers will not be given individual scores.
Institutions can use the data "for resource allocation, curriculum planning, how effective their
institution has been with educating students in that area," said Ms. Egan.
By 2006, once ETS officials have developed a baseline, she said, they will start giving out scores
to test takers, who could use them to place out of courses or to include with job applications. The
scores will consist of ratings indicating a strong, satisfactory, or poor performance in a series of
categories, Ms. Egan said.
Institutions giving the test in January will pay $20 per student, which Ms. Egan called a
"promotional price." The regular price will be about $25 per student, she said.
The test was designed with the help of a group of college officials that the company formed last
year. The seven institutions represented are the California Community College System; the
California State University System; the University of California at Los Angeles; the Universities
of Louisville, North Alabama, and Washington; and the University of Texas System.
'Transformation in Learning'
They based their work on an earlier panel of professors, businessmen, and government officials
that ETS convened. That panel's report, "Digital Transformation: A Framework for ICT
Literacy," argued for the importance of assessing such skills: "The panel strongly believes that it
is time to expand the notion of the digital divide."
"The current global public-policy focus is on the detrimental impact of limited access to
hardware, software and networks such as the Internet," the report said. "We believe this
characterization of the digital divide must be changed to include the impact of limited reading,
numeracy, and problem-solving skills."
Barbara A. O'Connor, a professor of communications at California State University at
Sacramento, was a member of both panels. "It was becoming abundantly clear that unless
students could integrate information technology in with other cognitive skills," she said, "it was
really not causing any transformation in their learning."
She said she had helped deliver a trial version of the test to her students recently. They "thought
it was a really good idea," she said, and told her that they wanted to have individual scores that
they could show to potential employers.
Ms. O'Connor said she got involved out of a concern that minority students were not coming to
college with the same set of information-management skills as other students. "Many of them
don't have access to the higher-end technology at home," she said.
She has been a proponent of distance education and has taught Web-based courses. But she said
she worries that not all students are prepared to take such courses. "I wanted to make sure we
weren't using distance education as a way to disadvantage those minority students," she said.
Other tests promise to measure computer literacy as well, Ms. O'Connor said, noting that some
colleges in California use a test called the International Computer Driving License. While that
test focuses on whether students can use specific types of software, she said, the new exam
"really doesn't take this approach.
"It is: How would you use those skills in any environment, and how would you use them
creatively in problem solving?"
The Educational Testing Service plans to announce the test this week at a conference on
information technology to be held in Tampa by the League for Innovation in the Community
College.
APPENDIX E (continued).
http://www.tekxam.com
The University of Maryland Technology Certification Exam
APPENDIX F.
MSCD Department of Mathematical and Computer Sciences
General Studies Math Placement Chart - For First-Time, Degree-Seeking Students
Applies only to students in or before their first semester as degree-seeking.
(Does not apply to non-degree-seeking students or those beyond their first semester as degree-seeking.)
ACT/SAT Math Score (from Feb 1999 or later):

Score                          MTH 1080   MTH 1210   MTH 1310    MTH 1110    MTH 1610
18 or lower / 459 or lower     No         No         No          No          No
19-23 / 460-559                Yes        Yes        Peer Study  Peer Study  Yes
24 or higher / 560 or higher   Yes        Yes        Yes         Yes         Yes
ACT/SAT score over 5 years old or not available:

Accuplacer Elementary Algebra Score (from Feb 1999 or later):

Score            MTH 1080   MTH 1210   MTH 1310    MTH 1110    MTH 1610
0-71             No         No         No          No          No
72-84            No*        No*        No          No          No*
85-99            Yes        Yes        Peer Study  Peer Study  Yes
100 or higher    Yes        Yes        Yes         Yes         Yes
Accuplacer Elementary Algebra Score (from Feb 1999 or later):

Score     MAT 030   MAT 060   MAT 090   MAT 106
0-44      Yes       Yes       No        No
45-60     Yes       Yes       Yes       No
61-84     Yes       Yes       Yes       Yes
General Studies Mathematics Placement Chart – Other
Applies to placements other than those required in the CCHE Remedial Plan.

Remedial course completed       MTH 1080                MTH 1210                MTH 1310   MTH 1110   MTH 1610
(grade of C or higher)
MAT 060 – Pre-Algebra           No                      No                      No         No         Peer Study
MAT 090 – Intro to Algebra      Retest with Accuplacer  Retest with Accuplacer  No         No         Peer Study
MAT 106 – Survey of Algebra     Yes                     Yes                     No         No         Yes
Accuplacer Elementary Algebra Score:

Score     MAT 030   MAT 060   MAT 090   MAT 106
0-44      Yes       Yes       No        No
45-60     Yes       Yes       Yes       No
61-84     Yes       Yes       Yes       Yes
Accuplacer College Level Math Score:

Score          MTH 1400   MTH 1410
0-69           No         No
70-84          Yes        No
85 or higher   Yes        Yes

Other Courses (grade of C or higher):

Course completed   MTH 1080   MTH 1210   MTH 1110   MTH 1310   MTH 1610   MTH 1320
MTH 1080           Yes        Yes        No         No         Yes        No
MTH 1210           Yes        Yes        No         No         Yes        No
MTH 1110           Yes        Yes        Yes        Yes        Yes        Yes
MTH 1310           Yes        Yes        Yes        Yes        Yes        Yes
MTH 1610           Yes        No         No         No         Yes        No
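Read row by row, each chart above is a lookup table from a score band to a per-course placement decision (Yes, No, or Peer Study). As an illustration only, the following Python sketch encodes the first ACT/SAT chart; the names are hypothetical, and the upper bounds (36 ACT, 800 SAT) are assumptions, since the chart states only "or higher":

```python
# Illustrative sketch only: encodes the ACT/SAT placement chart above as a
# lookup table. Not an MSCD placement tool.
COURSES = ("MTH 1080", "MTH 1210", "MTH 1310", "MTH 1110", "MTH 1610")

ACT_SAT_BANDS = [
    # (act_max, sat_max, decisions in COURSES order)
    (18, 459, ("No", "No", "No", "No", "No")),
    (23, 559, ("Yes", "Yes", "Peer Study", "Peer Study", "Yes")),
    (36, 800, ("Yes", "Yes", "Yes", "Yes", "Yes")),   # upper bounds assumed
]

def placement(score: int, test: str = "ACT") -> dict:
    """Return {course: decision} for a first-time, degree-seeking student."""
    idx = 0 if test == "ACT" else 1
    for band in ACT_SAT_BANDS:
        if score <= band[idx]:
            return dict(zip(COURSES, band[2]))
    raise ValueError("score above chart range")

print(placement(21)["MTH 1310"])   # Peer Study
print(placement(600, "SAT"))       # all Yes
```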
MSCD Department of Mathematical and Computer Sciences
General Studies Mathematics Placement Chart - For Students with College Level Math or Certain
Degrees
Applies to degreed students (AA, AS, BA, BS, BFA) OR transfer/continuing students with successful completion ("C" or better) of college-level math. (Degreed students who cannot demonstrate successful completion of college-level math may take MTH 1080 with Peer Study. If they earn a "C" or higher, it will serve as the prerequisite course for MTH 1210 or MTH 1610. Students with a background in mathematics should see Connie Novicoff [SI-135, phone (303) 556-4447] or math faculty for individual placement.)
Test Score/Transcripts Unavailable (for degreed students only): MTH 1080: Peer Study; MTH 1210: MTH 1080 with "C" or higher; MTH 1310: must test; MTH 1110: must test; MTH 1610: MTH 1080 with "C" or higher.

College Level Math Course (1999 or later): MTH 1080: Yes; MTH 1210: Yes; MTH 1310: must test; MTH 1110: must test; MTH 1610: Yes.
ACT/SAT Math Score (from Feb 1999 or later):

Score                          MTH 1080   MTH 1210   MTH 1310    MTH 1110    MTH 1610
18 or lower / 459 or lower     No         No         No          No          No
19-23 / 460-559                Yes        Yes        Peer Study  Peer Study  Yes
24 or higher / 560 or higher   Yes        Yes        Yes         Yes         Yes
Accuplacer Elementary Algebra Score (from Feb 1999 or later):

Score           MTH 1080     MTH 1210   MTH 1310    MTH 1110    MTH 1610
0-60            No           No         No          No          No
61-72           Peer Study   No         No          No          No
72-84           Peer Study   Yes        No          No          Yes
85-99           Yes          Yes        Peer Study  Peer Study  Yes
100 or higher   Yes          Yes        Yes         Yes         Yes
General Placement Test Score (from Feb 1999 or later):

Score          MTH 1080     MTH 1210   MTH 1310   MTH 1110   MTH 1610
0-8            No           No         No         No         No
9-17           Peer Study   No         No         No         No
18 or higher   Yes          Yes        No         No         No

Algebra Placement Test Score (from Feb 1999 or later):

Score          MTH 1080     MTH 1210   MTH 1310    MTH 1110    MTH 1610
0-8            No           No         No          No          No
9-11           Peer Study   No         No          No          Yes
12-16          Yes          Yes        Peer Study  Peer Study  Yes
17 or higher   Yes          Yes        Yes         Yes         Yes
APPENDIX G.
Academic Advising Center Evaluation Statistics: November 2004
(17% survey return rate)
1 - How satisfied were you with the courtesy and respect shown to you by the staff of the
Academic Advising Center?
Very Satisfied = 87%
Somewhat Satisfied = 9.1%
Satisfied total = 96.1%
Neutral = 2.2%
Somewhat Dissatisfied = .75%
Very Dissatisfied = .75%
Dissatisfied total = 1.51%
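The "Satisfied total" and "Dissatisfied total" figures here and throughout this appendix are sums of the two response categories on either side of Neutral. A minimal Python sketch of that tally (hypothetical names; illustrative only):

```python
# Illustrative sketch only: reproduces the "total" lines by summing the two
# response categories on each side of Neutral. Names are hypothetical.
responses = {
    "Very Satisfied": 87.0,
    "Somewhat Satisfied": 9.1,
    "Neutral": 2.2,
    "Somewhat Dissatisfied": 0.75,
    "Very Dissatisfied": 0.75,
}

satisfied_total = responses["Very Satisfied"] + responses["Somewhat Satisfied"]
dissatisfied_total = responses["Somewhat Dissatisfied"] + responses["Very Dissatisfied"]

print(f"Satisfied total = {satisfied_total:.1f}%")        # 96.1%
print(f"Dissatisfied total = {dissatisfied_total:.1f}%")  # 1.5% (the report lists
# 1.51%, presumably computed from unrounded survey values)
```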
2 – My academic advisor thoroughly addressed all of my questions and concerns:
Strongly Agree = 77.5%
Agree = 17.8%
Agree total = 95.3%
Neutral = 3.8%
Disagree = .77%
Strongly Disagree = 0%
Disagree total = .77%
3 – My academic advisor provided me with useful, accurate advice and information:
Strongly Agree = 78.7%
Agree = 16.5%
Agree total = 95.2%
Neutral = 3.15%
Disagree = 1.6%
Strongly Disagree = 0%
Disagree total = 1.6%
4 – My academic advisor was knowledgeable about general studies and major
requirements:
Strongly Agree = 70.6%
Agree = 24.6%
Agree total = 95.2%
Neutral = 4.8%
Disagree = 0%
Strongly Disagree = 0%
Disagree total = 0%
5 – My advisor helped me with goal setting and academic progress:
Strongly Agree = 57.9%
Agree = 27%
Agree total = 84.9%
Neutral = 14.3%
Disagree = .79%
Strongly Disagree = 0%
Disagree total = .79%
6 – Were you able to develop a satisfactory schedule?
Yes = 89%
No = 1.6%
Don’t know = 9.4%
7 – Were you referred to appropriate campus resources?
Yes = 90%
No = 1.6%
Don’t know = 8.8%
8 – Were you referred to a faculty advisor in the department of your major?
Yes = 46%
No = 37%
Don’t know = 17.6%
9 – Overall, how satisfied were you with the quality of the advising you received today in
the Center?
Very Satisfied = 85.9%
Satisfied = 11.7%
Satisfied total = 97.6%
Neutral = 2.3%
Somewhat dissatisfied = 0%
Very Dissatisfied = 0%
Dissatisfied total = 0%
Academic Advising Center Evaluation Statistics: June 2004
(21.8% return rate)
1 – How satisfied were you with the courtesy and respect shown to you by the staff of the
Academic Advising Center?
Very Satisfied = 89.5%
Somewhat Satisfied = 9.9%
Satisfied total = 99.4% (July=97.9%; Aug=96.5%)
Neutral = .6%
Somewhat Dissatisfied = 0%
Very Dissatisfied = 0%
Dissatisfied total = 0% (July=.2%; August=.6%)
2 – My academic advisor thoroughly addressed all of my questions and concerns:
Strongly Agree = 77.4%
Agree = 22%
Agree total = 99.4% (July=99%; Aug=97.2%)
Neutral = .6%
Disagree = 0%
Strongly Disagree = 0%
Disagree total = 0% (July=0%; Aug=.6%)
3 – My academic advisor provided me with useful, accurate advice and information:
Strongly Agree = 77%
Agree = 21.2%
Agree total = 98.2% (July=97.9%; Aug=95.1%)
Neutral = 1.8%
Disagree = 0%
Strongly Disagree = 0%
Disagree total = 0% (July=.2%; Aug=.6%)
4 – My academic advisor was knowledgeable about general studies and major
requirements:
Strongly Agree = 77%
Agree = 19.2%
Agree total = 96.2% (July=95%; Aug=94.5%)
Neutral = 3.8%
Disagree = 0%
Strongly Disagree = 0%
Disagree total = 0% (July=0%; Aug=.6%)
5 – My advisor helped me with goal setting and academic progress:
Strongly Agree = 60.5%
Agree = 28.3%
Agree total = 88.8% (July=86.2%; Aug=84.7%)
Neutral = 10%
Disagree = 1.2%
Strongly Disagree = 0%
Disagree total = 1.2% (July=.7%; Aug=1.4%)
6 – Were you able to develop a satisfactory schedule?
Yes = 94%
(July=94.4%; August=86.8%)
No = 3.5%
(July=1.2%; August=7.6%)
Don’t know = 2.5%
(July=4.4%; August=5.6%)
7 – Were you referred to appropriate campus resources?
Yes = 89.2%
(July=90.7%; August=87.6%)
No = 2.7%
(July=3.5%; August=4.8%)
Don’t know = 8.1%
(July=5.8%; August=7.6%)
8 – Were you referred to a faculty advisor in the department of your major?
Yes = 50.1%
(July=47%; August=40.6%)
No = 36.3%
(July=34.8%; August=42%)
Don’t know = 13.6%
(July=18.2%; August=17.4%)
9 – Overall, how satisfied were you with the quality of the advising you received today in
the Center?
Very Satisfied = 86.6%
Satisfied = 10.5%
Satisfied total = 97.1% (July=97.7%; Aug=96.5%)
Neutral = 2.3%
Somewhat dissatisfied = .3%
Very Dissatisfied = .3%
Dissatisfied total = .6% (July=.2%; Aug=.7%)
Quotable Quotes:
______ made a great suggestion for me concerning my major and was very helpful – I’m very
appreciative!
____ did a great job. Thank you!
The first time I came in here 10 years ago I received poor advice – However, ______ has been
very helpful in trying to get me in the remedial math class I must have to get into Math 1610 –
my last class.
_____ – she was the best ever!
___ was extremely helpful.
_____ was awesome! It’s really scary not knowing how to register, where everything is, and
who to talk to. ____ was SO helpful. I’ve seen 3 different advisors up at CSU and none ever
came close to being as helpful as ______. Thanks so much!
Student who rated us low in all categories: I’ve waited for 2 hours and ended up making my
own schedule. You need MORE advisors!
APPENDIX H.
Employer, Graduate, and Senior Surveys
for
Assessment of General Education at MSCD