Medgar Evers College, The City University of New York
Follow-Up Report
November 1, 2014
Monitoring Report
to the
Middle States Commission on Higher Education
from
MEDGAR EVERS COLLEGE
Brooklyn, NY 11225
Dr. Rudolph F. Crew, President
Dr. Ellie A. Fogarty, Vice President
Accreditation Liaison Officer
October 31, 2014
Subject of the Follow-Up Report:
Documenting (1) further implementation of the institutional strategic plan with evidence that institutional
assessment information is used for planning and allocating resources (Standards 2 and 7); (2) steps taken
to strengthen institutional capability to support institutional and student learning assessment activities
and decision-making (Standards 7 & 14); (3) evidence that course syllabi consistently include student
learning outcomes and that program goals and expected student learning outcomes are published for all
programs at all levels (Standard 14); and (4) further implementation of an organized and sustainable
process to assess achievement of expected student learning outcomes in all programs, with evidence that
assessment information is shared and is used for the continuous improvement of student learning
(Standard 14).
If this report follows an evaluation or follow-up visit, indicate the date of the Evaluation/Follow-Up Team's visit: October 15-16, 2013
Or, if this report follows the submission of a Periodic Review Report, indicate the date the PRR was submitted: September 4, 2012
Table of Contents
Introduction .......... p. 1

Addressing MSCHE's Concerns:

1. Document further implementation of the institutional strategic plan with evidence that institutional assessment information is used for planning and allocating resources (Standards 2 and 7) .......... p. 4

2. Steps taken to Strengthen Institutional Capability to Support Institutional and Student Learning Assessment Activities and Decision Making .......... p. 14
   Recommended Revision of the MEC Assessment Plan .......... p. 14
   Enhancing Institutional Effectiveness .......... p. 16
   Strengthening Student Learning Assessment .......... p. 18

3. Evidence that Course Syllabi consistently include Student Learning Outcomes and that Program Goals and Expected Student Learning Outcomes are published for all Programs at all levels (Standard 14) .......... p. 22

4. Further Implementation of an Organized and Sustainable Process to Assess Achievement of Expected Student Learning Outcomes in all Programs with Evidence that Assessment Information is shared and is used for the Continuous Improvement of Student Learning (Standard 14) .......... p. 25
   Developmental English .......... p. 28
   Developmental Mathematics .......... p. 28
   The First Year Experience Redesign .......... p. 30
   General Education Program Assessment .......... p. 30
   Academic Program-Level Assessments .......... p. 32
   Course Level Assessment .......... p. 35

Summary .......... p. 37
APPENDICES
Appendix A: 2012-2017 Institutional Strategic Plan (p. 4)
Appendix B: Sample Action Plans Rated by Rubric (pp. 4, 6)
Appendix C: MEC Performance Management Plan Report (p. 4)
Appendix D: 2014-2018 MEC Draft Strategic Plan: Claiming Prosperity (p. 4)
Appendix E: MEC Snapshot (p. 4)
Appendix F: Departmental Data Set Report (p. 4)
Appendix F.1: MEC Dashboard Report (p. 9)
Appendix G: MEC Action Plan Evaluation Rubric (p. 5)
Appendix H: Institutional Effectiveness and Assessment Committee & Sub Committee Membership (p. 5)
Appendix I: Action Plan Template (p. 5)
Appendix J: Operating Unit Assessment Plan Template (p. 7)
Appendix K: Comparison of Strategic Plans, PMP, and Operational Plan (pp. 7, 10)
Appendix L: College-Wide Retreat Documents (p. 8)
Appendix M: Revised Budget Call Memo and Template (p. 8)
Appendix N: Budget and Finance Presentations: "Aligning Budget Requests" & "Budget Overview Process" (p. 9)
Appendix O: New Budget Workshop Schedule (p. 9)
Appendix P: End of Year Highlights (p. 10)
Appendix Q: MEC Pipeline Initiative Overview (p. 11)
Appendix R: MEC Draft Investment Plan "Claiming Prosperity" (p. 11)
Appendix S: MEC Draft Operational Action Plan 2014-2015 (p. 12)
Appendix T: Key Outputs from 2013-2014 Strategic Action Planning (p. 12)
Appendix U: MEC Institutional Assessment According to the SSPM (pp. 12, 14)
Appendix V: Examples of Assessment Instruments (pp. 14, 26, 36)
Appendix W: MEC Assessment Plan (p. 14)
Appendix X: Program Reviews and External Evaluator Reports (pp. 17, 34)
Appendix X.1: Overview of Departmental Data Discussion Document (pp. 17, 34)
Appendix X.2: Concentrations Selected by Degree Students (pp. 17, 34)
Appendix Y: Operating Unit Assessment Plans (p. 18)
Appendix Z: External Consultant Review Reports, March 2014 (p. 18)
Appendix AA: Summaries of MEC Coordinators' Reports on Assessment Activities (p. 20)
Appendix BB: Departmental Assessment Plans (p. 21)
Appendix CC: Sample Assessment Handbook – Biology (pp. 21, 35)
Appendix DD: Syllabus Template (p. 22)
Appendix EE: Sample Syllabi by Department (p. 23)
Appendix FF: Assessment Organizational Chart (p. 25)
Appendix GG: CUE Report (p. 27)
Appendix HH: Developmental English Redesign Project (p. 28)
Appendix II: Developmental Mathematics Redesign Project (p. 29)
Appendix JJ: First Year Experience Redesign (p. 30)
Appendix KK: Pathways Resolution and Framework (p. 30)
Appendix LL: General Education Assessment Process and Rubrics (p. 31)
Appendix MM: Course Level Learning Assessment and Alignment Template (p. 36)

TABLES
Table 1: Comparison of 2013-2014 and 2014-2015 Action Planning Assessments (p. 6)
Table 2: Institutional Assessment Calendar (p. 9)
Table 3: Differentiation of Institutional Assessment and Student Learning Assessment (p. 15)
Table 4: Consultant's Review Summary on Assessment Practices – March 2014 (p. 19)
Table 5: Checklist of Departments Using New Syllabi Template as of Fall 2014 (p. 22)
Table 6: Developmental Needs of First Time Freshmen (p. 27)
Table 7: Summer Immersion Pass Rates (p. 27)
Table 8: Timeline for General Education Assessments (p. 31)
Table 9: Program Reviews and Accreditation Status Reports (p. 32)
Table 10: Progress on Planned Course Level Assessment (p. 36)

CHART
Figure 1: MEC Organizational Chart (p. 3)
GLOSSARY OF ACRONYMS
ACBSP: Accreditation Council for Business Schools and Programs
CLA: Collegiate Learning Assessment
CSWE: Council on Social Work Education
CUNY: City University of New York
EHAC: National Environmental Health Science & Protection Accreditation Council
FYE: Freshman Year Experience
GEP: General Education Program
IEAC: Institutional Effectiveness and Assessment Committee
MEC: Medgar Evers College
MSCHE: Middle States Commission on Higher Education
NCATE: National Council for Accreditation of Teacher Education
NLNAC: National League for Nursing Accrediting Commission
OAA: Office of Academic Affairs
OAQA: Office of Accreditation and Quality Assurance
OIRA: Office of Institutional Research and Assessment
OIT: Office of Information Technology
PMP: Performance Management Process
SEEK: Search for Education, Elevation, and Knowledge
SSPM: Student Success Progression Model
INTRODUCTION
Medgar Evers College (MEC) is the youngest of the four-year colleges among the 19
undergraduate institutions of the City University of New York (CUNY). It is a vibrant, vital, and
transformative traditionally black institution that embraces the enduring legacy of Medgar
Wiley Evers, expressed through education, self-actualization and community service. The
College provides access and opportunity for all students to become dynamic professionals,
scholars, and change agents in their communities and in the diverse and rapidly changing
world. Since its founding in 1969, through the collaborative efforts of the Chancellor and Board
of Trustees of CUNY, elected officials, and community leaders, MEC has grown by expanding
programming to include eight associate degree programs and 18 baccalaureate programs under
its three academic Schools: School of Business; School of Liberal Arts & Education; and School
of Science, Health and Technology. Additionally, the School of Professional and Community
Development offers a wide range of programs for youth and adults aimed at college
preparation, career development, and community involvement.
In addition to enlarging its academic programming, over the past 45 years MEC has
graduated 14,000 students who have contributed to Crown Heights, Brooklyn, New York City,
and the world beyond. Currently MEC enrolls nearly 7,000 undergraduate students, who reflect
an increasingly diverse student body: 86% African American; 10% Hispanic; 2% Asian/Pacific
Islander; 1% European American, and 1% Native American. The College provides these students
with the academic programming and student support necessary to educate and graduate
competent and caring professionals who carry forward MEC’s legacy of courage, strength, and
fortitude.
Since President Rudolph Crew took the College's helm in August 2013, strategic planning and ongoing assessment have been at the forefront of his initiatives to improve institutional effectiveness and student learning outcomes at MEC. His focus on accountability, coupled with his collaborative leadership style, has guided the College in working toward, and making substantial progress in meeting, professional higher education accreditation standards.
This monitoring report responds to the issues raised in the MSCHE Commission's letter of November 22, 2013, as well as the Visiting Team's recommendations and suggestions. The report provides evidence that the College has further implemented, and will continue to sustain, compliance with Standards 2, 7, and 14. The report is therefore organized around the four main concerns of the Commission.
The first section of this report addresses further implementation of the institutional
strategic plan with evidence that institutional assessment information is used for planning and
allocating resources (Standards 2 and 7). The streamlined process for college-wide action
planning, which includes financial and resource allocations, serves as the vehicle that drives the
strategic goals and initiatives and this report shows progress in meeting these goals. The second
section of the report describes the steps taken to strengthen institutional capability to support
institutional and student learning assessment activities and decision making (Standards 7 and
14), and provides evidence of a comprehensive, organized, and sustained process for the
assessment of institutional effectiveness and student learning at the institutional, program, and
course levels, including general education. The third section provides evidence of progress in
ensuring that course syllabi consistently include student learning outcomes and that program
goals and expected student learning outcomes are published for all programs at all levels
(Standard 14). The fourth section outlines the College's progress on the further implementation of an organized and sustainable process to assess achievement of expected student learning outcomes in all programs, with evidence that assessment information is shared and is used for the continuous improvement of student learning (Standard 14).
As the new leader at MEC, President Crew recognized the need to realign staff in key positions, as well as to make new hires, in order to carry out the mandates of the College's strategic goals effectively. Major personnel changes included the appointment of an Interim Senior Vice President and Provost in the Office of Academic Affairs, and the appointment of Vice Presidents for Student Affairs, School Initiatives and Enrollment Management Services, and for Finance & Administration. Assistant Vice Presidents were hired for Communications & Public Relations and for Facilities Management, Campus Planning & Operations, and a new Chief Information Officer for Information Technology was appointed. These new appointments
reflect structural and functional redesign. The new administration understood that the effective functioning of these major units required that strategic goals be well matched with expertise, so that planning, implementation, and outcomes would be of the highest quality. The current structure strengthens the College's ability to sustain these practices and to fulfill its mission and strategic goals objectively (see Figure 1: MEC Organizational Chart).
Figure 1: Abbreviated MEC Organizational Chart, showing the President's Executive Cabinet (President, Provost, Vice Presidents); the Academic Schools/Departments; the Institutional Effectiveness and Assessment Committee; the Office of Academic Affairs (Provost), with the Office of Accreditation and Quality Assurance and the Office of Institutional Research and Assessment; and the Administrative Offices (Vice Presidents) with their Operational Units.
1. Document further implementation of the institutional strategic plan with evidence that
institutional assessment information is used for planning and allocating resources (Standards
2 and 7)
During 2013-2014, Medgar Evers College continued to implement its 2012-2017 Institutional Strategic Plan (ISP), "Advancing the Spirit of Transformation, Realizing Dreams" (Appendix A: 2012-2017 Institutional Strategic Plan). Implementation occurred at three levels: (1) at the departmental level, through departmental/area Action Plans that collect short-term goals and initiatives; (2) at the School/Administration level, through a series of strategic planning meetings and workshops; and (3) at the institutional level, through a reorganization of areas and the hiring of new personnel, including the appointment of a Senior Vice President/Chief Operating Officer & Strategic Planning (COO).
As described in the MEC Assessment Plan, Action Plans "document the results of the previous year's efforts, and reflect goals, actions and budget priorities for the coming academic year" (Appendix B: Sample Action Plans Rated by Rubric). Each department/area sets goals that are linked both to the Strategic Plan and to the CUNY-wide Performance Management Plan (PMP) (Appendix C: MEC Performance Management Plan Report; Appendix D: 2014-2018 MEC Draft Strategic Plan: Claiming Prosperity). Submitted Action Plans also propose an estimated budget for each goal. Data and reports produced by the Office of Institutional Research and Assessment (OIRA), such as the College Snapshot and Departmental Data Sets, inform Action Plan goal-setting (Appendix E: MEC Snapshot; Appendix F: Departmental Data Set Report). Since Action Plans are now in their second year of implementation, departments/areas reported on the degree to which the previous year's goals were met and described any challenges that prevented goal attainment. For example, in Facilities Management, the renovation of the Library and the relocation of administrative and academic units were completed; the Office of Information Technology overhauled the technology infrastructure; the Office of Academic Affairs collaborated with administrative and academic units to develop assessment plans; Student Support Services restructured and relocated, and
new faculty were hired in English, Social Work, Biology, Business, the Library, and Mathematics. The compiled set of Action Plans will be available in the Exhibit Room.
A Sub-Committee of the Institutional Effectiveness and Assessment Committee was established and charged with reviewing Action Plans, utilizing the Action Plan Evaluation Rubric to assess the quality of each department's plan (Appendix G: MEC Action Plan Evaluation Rubric). Membership of the Sub-Committee represents key operational areas of the College and includes the following offices and administrators: Associate Provost (Chair); Assistant Provost; Director, Office of Institutional Research and Assessment (OIRA); Director, Budget Office; Director, External Relations and Development; Strategic Operations Manager, Information Technology; Director, Human Resources; and Student Advisement (Appendix H: Institutional Effectiveness and Assessment Committee & Sub Committee Membership).
As reported in the 2013 Monitoring Report, the Office of Academic Affairs initiated the
first college-wide action planning efforts in 2012-2013. The first call for Action Plan submissions
using the standard template was completed for 2013-2014 (Appendix I: Action Plan Template).
In September 2013, the Sub-Committee reviewed 2013-2014 Action Plans for 16 academic
departments and 30 operational units, using the Action Plan Evaluation Rubric developed in
2012-2013. The rubric provided feedback to individual academic departments and operational
units. Action Plans were evaluated on eight specific dimensions, each rated at four levels:
1. Strategic Initiative (major goals reflective of the area mission)
2. Objectives (critical to achieving the strategic initiative; clear, measurable, and attainable)
3. Actions (well-defined, reasonable benchmarks/deliverables; well-matched partnerships; key contributions to program/unit improvement)
4. Lead (functional expertise and beneficial collaborations)
5. Measures/Metrics (specific, appropriate, multiple measures)
6. Timeline (realistic milestones)
7. Budget (clear and justifiable; connected to actions/outcomes; identifiable fund sources)
8. Summative Assessment of Action Plan (overall evaluation of the Plan)
Plan reviewers rated each dimension as exceeds expectations, meets expectations, approaches expectations, or does not meet expectations. Results, summarized in Table 1:
Comparison of 2013-2014 and 2014-2015 Action Planning Assessments, showed that some units (21%) exceeded expectations in action planning and serve as exemplars among the plans reviewed. These units were thoughtful in their planning and used data sets to develop plans with achievable objectives, timelines, and budgets. Their strategic initiatives were clearly expressed, with supporting actions and appropriate lead persons and collaborations to carry out the tasks. Several units (27%) met expectations, while others (24%) approached expectations in developing practical action plans.
Although this was a relatively new process for most units, results showed that only 28% of the action plans submitted did not meet the basic expectations. Among the challenges identified were timelines and metrics that were generic and required more specificity, and a lack of clarity regarding actions and objectives. Samples of Action Plans rated at each level are included in Appendix B: Sample Action Plans Rated by Rubric.
Beginning in the fall of 2013, the Sub-Committee conducted a series of meetings with
academic departments and operational units to provide feedback on their 2013-2014 Action
Plans, and guidance on how to revise their plans. A total of 46 out of 48 departments/areas
(95.8%) participated in the initial Action Plan Review process.
Results from the second round of annual submissions (2014-15) show plans that are better realized than the initial submissions for 2013-14 (Table 1). For example, in the overall (summative) assessment of the submitted plans, 68% met or exceeded expectations in 2014-15 compared to 48% in 2013-14, and only 2% did not meet expectations in 2014-15 compared to 28% in 2013-14.
Table 1: Comparison of 2013-2014 and 2014-2015 Action Planning Assessments
(2013-14: N = 46 plans; 2014-15: N = 42 plans. Each cell shows the percentage of plans rated at that level, 2013-14 / 2014-15.)

Dimension                 Exceeds         Meets           Approaching     Does Not Meet
                          Expectations    Expectations    Expectations    Expectations
Strategic Initiative      37% / 33%       20% / 48%       7% / 9%         37% / 11%
Objectives                22% / 26%       30% / 46%       28% / 26%       20% / 2%
Actions                   17% / 20%       43% / 41%       24% / 35%       17% / 4%
Lead                      17% / 37%       33% / 41%       35% / 17%       15% / 4%
Measures/Metrics          15% / 24%       26% / 26%       24% / 30%       35% / 20%
Timeline                  - / -           - / -           - / -           - / -
Budget                    - / -           - / -           - / -           - / -
Summative Assessment      21% / -         27% / -         24% / -         28% / 2%
Based on the extensive and collaborative strategic planning and review activities, two
needs were identified:
1. an Institutional Assessment Data Platform that will serve as a primary database to house
Action Plans, aggregated institutional data, Institutional Research reports, and academic
assessment results to facilitate strategic planning, and
2. differentiation between the assessment template used for the Operational Units and
the template used for the Academic Departments.
To address these needs, the broader Institutional Effectiveness and Assessment
Committee tasked the Office of Information Technology (OIT) with identifying an online
database for MEC's institution-wide assessment undertakings. OIT identified Strategic Planning Online (SPOL), a cloud-based strategic planning platform, as a possible solution. Initial
presentations of SPOL’s capabilities have been met with positive responses. The database will
be managed by OIT, with the material collection responsibility shared by OIRA, Office of
Academic Affairs (OAA) and Office of Accreditation and Quality Assurance (OAQA).
A revised Operational Units Assessment Plan template was developed by the Office of
Accreditation, Quality Assurance and Institutional Effectiveness (OAQA) and was used for the
first iteration of the operating units’ assessment plans (Appendix J: Operating Unit Assessment
Plan Template).
In August 2013, President Rudolph "Rudy" Crew was appointed President of Medgar Evers College. His review of the College's key documents, in particular the 2012-2017 Institutional Strategic Plan (ISP), provided the basis for his vision of revitalizing the College in tandem with re-establishing the College as a change agent for the community. His inaugural State of the College Address in September 2013 presented his goals, which built upon but added more specificity to the 2012-2017 ISP (Appendix K: Comparison of Strategic Plans, PMP, and Operational Plan). These goals, which have come to be known as the "25's," include:
• a 25% increase in overall enrollment
• a 25% increase in first-time, full-time freshmen, with 25% of this increased freshmen cohort being Baccalaureate-level students
• a 25% increase in retention
• a 25% increase in graduation rates
• a 25% increase in internships
• $25M in fundraising
Undergirding these goals is the President's firm belief that the College's mission to serve an underserved community extends beyond offering access to education and community programming. Medgar Evers College must proactively strengthen the community by engaging with middle and high schools, parents, and community stakeholders, using strategies that improve student achievement and ensure that high school graduates are authentically college-ready when they receive their high school diplomas.
With these dynamic goals in place, President Crew scheduled a series of meetings and retreats, which included a three-day, off-campus, college-wide Retreat in February 2014. Under the guidance of an external facilitator, 85 members of the college community (administrators, faculty, staff, and students) engaged in reviewing nine critical strategic goals informed by the President's vision. Representative sub-committees were charged with exploring each goal, identifying priorities, and developing implementation plans, inclusive of budgetary and other resource needs (Appendix L: College-Wide Retreat Documents). Based on these meetings, discussions, and retreats, a draft of the 2014-2018 Strategic Plan was developed.
Concurrent with strategic planning meetings and retreats were further refinements to
the budget process such that allocations were both data driven and aligned with the Strategic
Initiatives of the College. In response to the new Strategic Initiatives, a revised budget request
was designed to highlight requests specifically related to Strategic Initiatives (Appendix M:
Revised Budget Call Memo and Template). Action Plans require that faculty and administrators
provide an estimated budget for each goal and action. The annual budget call process requires
that each department or area submit a detailed line item budget related to their Action Plan,
with requests not to exceed 3% of the previous year’s budget. Allocation decisions are based on
consultations with administrative and academic area heads to determine the College’s priorities
relative to budget requests. The Vice President for Finance and Administration made several presentations to the College reviewing and detailing the budget request process; these presentations illustrate the use of OIRA data sources to coordinate the action planning process with budget preparation and resource allocation (Appendix N: Budget and Finance Presentations "Aligning Budget Requests" & "Budget Overview Process").
One important finding noted by Finance and Administration is that if department/area
Action Plans do not include estimated budgets, it is difficult to show alignment between
strategic planning and budget expenditures. The Office of Finance and Administration
proposed providing additional workshops and one-on-one sessions to better inform
departments/areas about calculating costs in creating budgets (Appendix O: New Budget
Workshop Schedule).
In accordance with the Annual Institutional Assessment Calendar in Table 2, the Office
of Institutional Research and Assessment distributed The Departmental Data Set Report for
each academic department at the May 2014 Academic Council and the MEC Dashboard Report
(Appendix F.1: MEC Dashboard Report) for the IEAC meetings. These sessions were devoted to
OIRA and OAA guiding faculty and staff in analyzing and using these data sets to address their
PMP targets and program goals, as well as using them for program revisions, improvement and
future action planning. These information sessions were also supported by personnel from
Finance and Administration who provided assistance in using these data for budgeting and
resource allocation (Appendix N).
Table 2: Institutional Assessment Calendar

Spring, Fall
  2013-2014: Institutional data collection and analysis linked to each stage of the SSPM
  2014-2015 and beyond: Institutional data collection and analysis linked to each stage of the SSPM
January
  2013-2014: Action Planning Mid-Year Results Forum
  2014-2015 and beyond: President's Cabinet Retreat; CTLE Faculty Development; Action Planning Mid-Year Results Forum
February
  2013-2014: College-wide Retreat; distribution of the Data Set for Retreat Report; receive University PMP Goals and Procedures for the subsequent academic year; Budget Call
  2014-2015 and beyond: College-wide Retreat; distribution of the Data Set for Retreat Report; receive University PMP Goals and Procedures for the subsequent academic year; Budget Call
March
  2013-2014: PMP Goals & Targets discussion at Academic Council; Deans, chairs, and department heads discuss targets at departmental and school meetings and solicit target recommendations for the subsequent academic year; PMP departmental/unit progress reports for the current PMP cycle; Departmental/Unit Budget and Action Planning Sessions; Departmental/Unit PMP Sessions
  2014-2015 and beyond: PMP departmental/unit progress reports; Departmental/Unit Budget and Action Planning Sessions; Departmental/Unit PMP Sessions
April
  2013-2014: Dashboard and data sets distributed
  2014-2015 and beyond: Departmental Data Set Report distributed
May
  2013-2014: Action Planning Retreat
  2014-2015 and beyond: Action Planning Forum; MEC Dashboard Report distributed
June
  2013-2014: PMP Year-End Report and PMP Goals for the subsequent academic year submitted; Budget Allocation
  2014-2015 and beyond: PMP Report and Targets submitted; Departmental/Unit Action Plans submitted; Budget Allocation
August
  2014-2015 and beyond: Chair's Retreat; distribution of the Data Set for the Chair's Retreat
September
  2013-2014: Departmental/Action Plans submitted
  2014-2015 and beyond: Distribution of tentative enrollment and FTE data by department for the fall semester
October
  2013-2014: Publication of the MEC Snapshot
  2014-2015 and beyond: Publication of the MEC Snapshot of the prior academic year
November
  2013-2014: 2012-2013 Strategic Plan Implementation/Institutional Assessment Report
  2014-2015 and beyond: 2013-2014 Strategic Plan Implementation and progress-monitoring Institutional Assessment Report
December
  2013-2014: Community Council Meeting
  2014-2015 and beyond: Distribution of tentative retention data by program for meetings with academic chairs; Community Council Meeting
The President, his Cabinet, and selected campus leaders conducted a series of meetings with the community's elected officials, religious leaders, public school officials, principals, and teachers, as well as other community-based organizations, to elicit their recommendations on how Medgar Evers College could best meet their needs within the context of the College's historically community-based mission.
The outcomes of these college retreats and stakeholder meetings contributed to the
draft Strategic Plan, “Claiming Prosperity: The Medgar Evers College Revitalization Initiative,
2014-2018” (Appendix D). Appendix K: Comparison of Strategic Plans, PMP, and Operational
Plan illustrates how the new draft strategic plan builds upon the previous Strategic Plan, while
foregrounding the new emphasis of connecting to community. Highlights of these activities are
included in the End of Year Report (Appendix P: End of Year Highlights).
One of the strategic initiatives that directly addresses the consensus of community
leaders on how Medgar can best fulfill its historic mission is the Pipeline Initiative. In brief, this
initiative aims to improve college opportunity, access and preparedness for over 30,000
students by using customized interventions from elementary through high schools throughout
the borough of Brooklyn (Appendix Q: MEC Pipeline Initiative Overview). Listed under the
header Shaping a College For All Cultures in the comparison of Strategic Plans outlined in Appendix K, this initiative straddles two sectors of critical importance to MEC and its community: (1) college-led community development based primarily on improving educational outcomes, and (2) an increase in the number of college-ready high school students enrolling in MEC, thereby shortening time to graduation and decreasing the substantial costs the College incurs in addressing remedial and developmental needs. A structured foundation and corporate fundraising plan is in place for this community-based initiative (Appendix R: MEC Draft Investment Plan "Claiming Prosperity").
These new strategic initiatives required hiring new personnel and administrators, with a concomitant reorganization and realignment of responsibilities. The MEC Organizational Chart reflects the new emphasis on strategic planning, student support services, and community development. Major personnel changes included the appointment of an Interim Senior Vice President and Provost in the Office of Academic Affairs following the former Interim Provost's departure in April 2014. The College-wide Institutional Effectiveness and Assessment Committee, which is chaired by the Provost, continues to develop and deploy college-wide academic strategic planning.
The new Senior Vice President & Chief Operating Officer has administrative oversight of all strategic planning activities related to the development of a new MEC Facilities Master Plan. In collaboration with the Office of Academic Affairs, the COO was charged with reviewing existing practices and working closely with academic and operating units to refine action plans so that goals are clear, realistic, and informative for the MEC Facilities Master Plan. The COO's office, in conjunction with the Office of Institutional Research and Assessment (OIRA), identifies and develops data needs and reports in support of the Strategic Plan and the President's key initiatives.
The responsibilities of the newly hired Vice President for Student Affairs, School
Initiatives, and Enrollment Management Services reflect the President’s vision of a seamless
pre-K – 16 transition for students. School Initiatives is a new area of responsibility specifically
established to identify, create, and administer innovative college-community partnerships. With offices that are primarily student-service oriented (Admissions, Registrar, Financial Aid, and Testing) now consolidated, students can access these related services in one location.
In addition to these new positions, new Vice or Assistant Vice Presidents were hired to
lead Finance & Administration, Communications, Public Relations, Alumni and Foundation
Relations, as well as Facilities Management, Campus Planning & Operations.
Based on the new draft Strategic Plan, the College has developed an Operational Action
Plan to guide priority goals for the next year (2014-2015) in addition to the Investment Plan
that is used to solicit corporate and other funding to support the new goals and strategic
initiatives (Appendix S: MEC Draft Operational Action Plan 2014-2015). For example, the College
received external funding from Sprint Corporation to support Study Abroad; Santander Bank
partially funded the Writing Center and other student support areas; and United Parcel Service, Voya, Capital One Bank, National Grid, and Macy's provided funding for student scholarships and internships.
Ongoing work on several priority goals identified during the Retreat and sub-committee exercises led to the completion of several of them. Under the guidance of MEC's new administration, these outputs resulted from intentional coordination of the strategic planning process: major initiatives for improving the institution's effectiveness were identified and put forward through action plans, and this information was then used strategically for budgeting and resource allocation, all aimed at achieving the critical objectives of the College's mission and goals. Major outputs during 2013-2014 are summarized in Appendix T: Key Outputs from 2013-2014 Strategic Action Planning.
The College will continue to build its systematic process for close and continuous monitoring of the strategic initiatives. This monitoring will provide data that inform constituents about the Institution's progress, as well as the initiatives' impact on the overall Student Success Progression Model (SSPM) (Appendix U: MEC Institutional Assessment According to the SSPM), CUNY's Performance Management Process (PMP), and proposed revisions to the outdated College Master Plans. Progress monitoring of these indicators will serve as the basis for future institutional planning.
2. Steps taken to Strengthen Institutional Capability to Support Institutional and Student
Learning Assessment Activities and Decision Making.
Since receiving the MSCHE Team's and Commission's reports, the College, led by the Provost and the Office of Academic Affairs, has taken several deliberate steps to strengthen its institutional capability to support both institutional and student learning assessment activities and decision making. The Institutional Effectiveness and Assessment Committee (representative of all units) was divided into smaller sub-committees charged with facilitating the development of a streamlined process for implementing assessment.
College-wide sub-committees were appointed to lead four specific areas of assessment: Institutional Effectiveness and Assessment, General Education Assessment, Program-Level Assessment, and Course-Level Assessment. These sub-committees held several meetings and developed templates to further guide institutional assessment and the assessment of student learning. Among the products and processes emanating from these sub-committees, and shared with and adopted by the larger IEAC, were templates for operational assessment, curriculum mapping, and program and course assessments (Appendix V: Examples of Assessment Instruments).
To strengthen institutional activities, MEC engaged the college community by expanding
formal assessment practices. As a result of the recommendation by the MSCHE Visiting Team,
the College separated the Assessment Plan into two sub-plans to show the difference between
Institutional Effectiveness (Plan A) and Student Learning Assessment (Plan B).
Recommended Revision of the MEC Assessment Plan
The revised MEC Assessment Plan (Appendix W: MEC Assessment Plan) is differentiated
in Table 3 below and highlights the institutional effectiveness (IE) and student learning
assessment (SLA) activities. Under the revised Assessment model, Plan A (IE) focuses on the
effectiveness at each transition point of the Student Success Progression Model (SSPM), and
evaluates the trends of the institutional assessment measures listed in Appendix U: MEC
Institutional Assessment According to the SSPM, including remediation exit rates, gateway
course pass rates, average credit accumulation, and retention and graduation rates. The institutional level of assessment in the original plan is now entirely covered by IE. Plan B (SLA), on the other hand, focuses on student learning outcomes across the levels of the original plan: (1) Institutional; (2) Departmental/Program; (3) Course (gateway and credit-bearing); and (4) General Education. Table 3 below reflects how the four levels in the original Assessment Plan are now covered by the two Sub-Plans:
Table 3: The Differentiation of Institutional Assessment and Student Learning Assessment at the Four Assessment Levels in the Revised MEC Assessment Plan

1. Institutional Level
   Institutional Effectiveness and Assessment (IE - Plan A): (1) assessment activities that utilize measures of student success at the institutional level from the PMP and/or the MEC SSPM, such as retention rate, graduation rate, grade distribution, GPA, total credits, course pass rates, etc. (see the list of measures in Table 6); (2) assessment of Operational Offices and Student Support Offices; (3) assessment of all other areas identified in the Strategic Plan and PMP beyond the coverage of the SSPM, such as faculty productivity, financial management, etc.
   Student Learning Assessment (SLA - Plan B): The CLA exam or an equivalent CUNY-selected instrument is used at the institutional level. The outcomes data are collected by CUNY Central and the MEC Testing and OIRA Offices and are shared by both IE under Standard 7 and SLA under Standard 14.

2. Departmental/Program Level
   IE (Plan A): Departmental/program assessment activities that utilize the Departmental Data Set Report and the standard departmental assessment measures identified in Table 6.
   SLA (Plan B): Departmental/program assessment activities that utilize standard SLA tools such as licensing exams, e-portfolios, capstone courses, and ACT exams.

3. Course Level
   IE (Plan A): Course grade and withdrawal rate analysis; student evaluation of teaching.
   SLA (Plan B): The scope of course-level assessment includes developmental, gateway, and other credit-bearing courses.

4. General Education
   IE (Plan A): N/A
   SLA (Plan B): Gen Ed assessment is a Student Learning Assessment activity under Standard 14.
Enhancing Institutional Effectiveness
A number of institutional activities were undertaken to improve institutional effectiveness during the past year. Among these activities, the College:
i. Implemented CUNYfirst, an integrated resources and services tool used by all CUNY colleges;
ii. Enhanced facilities: renovation and relocation of Student Support Services and the School of Business into central areas; creation of mini lounges/study areas, recreation and exterior spaces; use of Archibus; and a faculty/staff lounge;
iii. Rebranded and instituted regular MEC marketing and communications;
iv. Continued action planning and strengthened budget processes;
v. Conducted review sessions on the use of institutional data sets in guiding departmental action planning;
vi. Coordinated assessment activities for operating units;
vii. Adhered to the schedule for the program review cycle;
viii. Increased academic support services through the coordinated undergraduate education (CUE) program, including expanded summer immersion and supplemental instruction; and
ix. Redesigned the First Year Experience.
The College joined the University in streamlining faculty and student processes (Admissions, Testing, Financial Aid, Advisement, Registration, Billing, and Records – Grades & Transcripts), human resources, business services and operations, and institutional research using CUNYfirst. This information management system is a fully integrated, unified resources and services tool. Its implementation makes the daily functions of students, faculty, and staff run more smoothly.
The Office of Facilities Management completed several major projects. Student support
services were consolidated into one central location, removing the challenges students faced in
accessing support and services in three separate buildings. The Registrar, Admissions, Financial
Aid, Academic Advising Center, and SEEK now operate out of one building. The Library
reopened in Fall 2014 with major enhancements and a fully staffed Writing Center is in
operation to support student learning activities. Based on feedback from student satisfaction
surveys, several mini lounges, study areas, recreation spaces and enhanced exterior were
created campus-wide to provide students with areas where they can engage in campus life
activities.
Facility upgrades were also completed for the School of Business: all department faculty and staff have been relocated to one central area that has been completely refurbished with new offices, furniture, and working spaces. One immediate action emanating from the college-wide retreat in February 2014 was the outfitting and opening of a faculty/staff lounge as a shared space for intellectual discourse.
Another major improvement at Medgar Evers College was the rebranding initiative launched by President Crew. The College's new website, logo, banners, and community stamp were deployed through a series of vigorous marketing and public relations activities carried out by the Office of Communications and Public Relations.
In addition to infrastructural developments and rebranding initiatives, institutional effectiveness was strengthened through the continuation of annual action planning by academic departments and operating units. These activities were supported by a strengthened budget call process in which departments and units received guidance in using institutional data and budget timelines for planning and decision making. Similarly, OIRA held review sessions to discuss the use of institutional data sets in guiding action planning, while the Office of Accreditation and Quality Assurance (OAQA) worked with Operating Units to coordinate their assessment activities. OAQA also ensured that the schedule for program reviews was maintained, as described in Table 9: Program Reviews and Accreditation Status Reports. Program reviews have been completed for the English, Mathematics, and Philosophy and Religion Departments, and reviews have been initiated for Biology, Psychology, and Social and Behavioral Sciences (Appendix X: Program Reviews and External Evaluator Reports; Appendix X.1: Overview of Departmental Data Discussion Document; and Appendix X.2: Concentrations Selected by Degree Students).
OAQA was additionally tasked with initiating Assessment Plans for the following administrative units: Communications; Finance and Administration; Student Advisement; Facilities; and Information Technology (Appendix Y: Operating Unit Assessment Plans). In this first iteration of developing assessment plans for non-academic units, one challenge was that the Administrative Heads were newly hired under President Crew and therefore had not had the benefit of participating in the MEC assessment-related activities sponsored over the past several years. For the most part, the divisions were unfamiliar with identifying how their activities were linked to student learning outcomes.
Strengthening Student Learning Assessment
The College took the following steps to strengthen the assessment of student learning:
i. Creating sub-committees from the Institutional Effectiveness and Assessment Committee (IEAC) to lead assessment activities at various levels, namely the Program/Departmental Level, the Course Level, and General Education;
ii. Retaining an external consultant to evaluate the College's current assessment practices and provide recommendations for further development;
iii. Appointing Academic Assessment Leaders to guide and coordinate assessment for each of the three Academic Schools (School level);
iv. Requesting that academic departments appoint one or two persons to serve as Departmental Assessment Coordinators;
v. Sponsoring faculty professional development in assessment; and
vi. Holding review sessions to inform end users about applying institutional data in reviewing student learning assessment.
In March 2014, the Office of Academic Affairs retained the MSCHE-recommended external consultant, Dr. Jo-Ellen Asbury, to provide both an evaluation of existing materials and guidance on next steps. Her review found that, among the 16 academic departments, only four provided evidence to support all dimensions of program assessment, as summarized in Table 4 below (Appendix Z: External Consultant Review Reports, March 2014).
Table 4: Consultant's Review Summary on Assessment Practices – March 2014
Rating Scale: 3 = exceeds expectations; 2 = meets expectations; 1 = present, but does not quite meet expectations; 0 = document not available for review
The consultant rated each department and degree program, across the three Academic Schools and the Library, on five dimensions: program mission statement/statement of purpose, program outcomes, program assessment plan, course outcomes, and curriculum map. A few departments did not submit documents in advance of the review. The complete ratings are contained in the consultant's reports (Appendix Z: External Consultant Review Reports, March 2014).
Following the Consultant’s report and suggestions, departments such as Social and
Behavioral Sciences (SBS), Mass Communications, Creative and Performing Arts and Speech
(MCCPAS) and Physical, Environmental and Computer Sciences (PECS) were charged with
clearly restating each program’s mission, goals and expected student learning outcomes. At the
departmental level, curriculum mapping was undertaken to ensure that degree programs and
course-based learning outcomes were clearly meeting program goals in a progressive and
integrated manner. The focused engagement of faculty and staff across the college in clearly articulating program- and course-specific goals and expectations, mapping curricula, and identifying assessment strategies resulted in comprehensive unit-based assessment plans and, in some instances, assessment handbooks.
To strengthen this exercise, selected faculty and staff with experience in assessment and
accreditation processes were given the task of assisting academic and administrative units in
revising program goals, mapping curricula, formalizing assessment plans and processes,
including timelines for data collection and analyses that would guide them in decision making
and action planning. Three Academic School Assessment Coordinators representing the three
Schools were appointed to streamline the departmental/unit level assessments through
workshops and professional development sessions (Appendix AA: Summaries of MEC
Coordinators’ Reports on Assessment Activities). Department/Unit Assessment Coordinators
were identified and charged to coordinate the assessment activities within their respective
departments. In addition, several departments and Schools conducted faculty and staff
assessment focused retreats.
Results from these activities reveal that there are 10 non-accredited academic departments
and five operating units with drafts of their assessment plans, pending full departmental
reviews and revisions (Appendix BB: Departmental Assessment Plans). Six of the non-accredited
academic programs (English, Biology, Chemistry, Mass Communications, Creative and
Performing Arts and Speech (MCCPAS), Psychology, and Social & Behavioral Sciences) also
drafted their Assessment Handbooks, and are in the process of departmental reviews before
posting on their Web pages (Appendix CC: Sample Assessment Handbook - Biology).
Departmental assessment coordinators will continue to be the first reviewers of program-level and course-level assessments within departments and will provide ongoing support to faculty and staff to ensure the continuity of the assessment culture. Additional support for this departmental process will be provided by the Academic School Assessment Coordinators as part of their respective school-based activities. The College is initiating School-level assessment days following the models of the accredited programs.
These activities are continuously undergirded by the Director of the Office of Accreditation and Quality Assurance, who ensures that proper assessment protocols are followed and provides ongoing assistance to non-academic units at the College. The Office of Academic Affairs provided additional resources to further facilitate the work of assessment. Ten Assessment Coordinators received financial support to attend the MSCHE Workshop on Becoming an Assessment Coordinator, as well as an all-day Assessment Symposium on Curriculum Mapping for Effective Assessment hosted by Nassau Community College in May 2014.
3. Evidence that Course Syllabi consistently include Student Learning Outcomes and that
Program Goals and Expected Student Learning Outcomes are published for all Programs at all
levels (Standard 14).
In March 2014, the Consultant found that, among the 36 operational and academic departments (degree and non-degree) evaluated, 94% [34] included their mission statements, 83% [30] identified program outcomes, and 69% [25] included course outcomes. Based on this finding, the Institutional Assessment Committee developed a syllabus template, which was subsequently reviewed and adopted. Syllabi for accredited programs, which follow their respective accreditors' requirements, were already consistent and systematic in including course descriptions, mission, goals, student learning outcomes, and related assessment instruments (Appendix DD: Syllabus Template). Following college-wide assessment activities in the summer, 15 of 19 (79%) academic departments have begun to revise their syllabi, with several of the revised syllabi in use as of Fall 2014. Progress in using the new syllabus template includes 10 departments with non-accredited programs and four departments with accredited programs, as detailed in the following table.
Table 5: Checklist of Departments Using New Syllabi Template as of Fall 2014
(✓ = new syllabus template in use as of Fall 2014; "Accredited format" = syllabi follow the respective accreditor's required format)

School of Business
  Accounting: Accredited format
  Business: ✓
  Computer Information Systems: Accredited format
  Economics & Finance: ✓ (FIN courses)
  Public Administration: ✓
School of Liberal Arts & Education
  Education: ✓
  English: ✓
  Psychology: ✓ (except 2 courses)
  Religious Studies: ✓
  Social & Behavioral Sciences: ✓
  Social Work: Accredited format
School of Science, Health & Technology
  Biology: ✓
  Chemistry: ✓
  Computer Science: ✓
  Environmental Science: ✓
  Mathematics: ✓
  Nursing: Accredited format
Other Academic Programs
  Freshman Seminar: ✓
  SEEK: ✓
Sample syllabi are posted on the College's website under their respective program links (Appendix EE: Sample Syllabi by Department). All students have access to the syllabi on the first day of classes through their Blackboard links or in person from instructional faculty, so that students are fully informed about the expectations for successful completion of courses in their programs.
The College’s recent rebranding efforts, spearheaded by the Communications and
External Relations Department, resulted in the creation of high-quality brochures for
departments and in a new College website with departmental pages that publish program goals
and expected student learning outcomes for Schools and Departments. To date, the
departments within the School of Science, Health and Technology (SSHT) have web pages that
are populated with mission, vision, and program goals. Following the completion of the SSHT
web pages, the Office of Communications & External Relations will work with the two other
Schools (School of Liberal Arts and Education and School of Business) to complete their web
pages. The redesign incorporating departmental revisions made during the past year of assessment activities is still underway. The timeline for full completion of the College's website is Spring 2015, by which time information for all programs will be current and publicly available.
Similarly, the College’s Catalog is undergoing revisions to be consistent with the
College’s new branding initiative. Updates to the College Catalog will include stating each
program’s mission, goals and expected student learning outcomes, among others. Expected
completion date is Spring 2015.
The College also plans to add a separate web page for the General Education/Pathways Curriculum that will include the program goals, student learning outcomes, and their respective assessments. This activity will commence after the review of the pilot data (Fall
2014) and subsequent revisions to the Curriculum at the program and/or course level based on
preliminary results. The General Education program assessment currently underway is included
in the syllabi for the respective courses being assessed using the Essential Learning Outcomes
Assessment rubrics. The anticipated completion date for including the General Education
curriculum among the College's web pages is Spring 2015 (see Table 8: Timeline for General Education Assessments, p. 31).
4. Further Implementation of an Organized and Sustainable Process to Assess Achievement of
Expected Student Learning Outcomes in all Programs with Evidence that Assessment
Information is shared and is used for the Continuous Improvement of Student Learning
(Standard 14).
Sustaining student learning assessment is guided by an Institutional Assessment Calendar (Table 2, p. 9) as well as by the processes and procedures implemented by the coordinating units, as shown in Appendix FF: Assessment Organizational Chart. At MEC, assessment of student
learning is framed by MEC’s Student Success Progression Model (SSPM) which identifies unique
points in a student’s progress that are critical to their college readiness, retention, progression
and time-to-graduation. Sustaining college-wide assessment of student learning encompasses both internal and external assessments. The major elements incorporated under the
Assessment of Student Learning therefore include university-wide (CUNY), institutional (MEC),
General Education/Pathways, program/departmental level, and course-level assessments as
designated in the SSPM framework and the revised Institutional Assessment Plan.
The College also committed to hiring an Assessment Director for student learning
assessment. Due to the changes in senior administration and the transition of the former
Interim Provost in April 2014, the position posting was delayed until July 2014, with an
expected hire date by January 2015. With the culture change promoted by the establishment of School and Departmental Assessment Coordinators, the long-term maintenance and use of college-wide assessments can now be continuously monitored and strengthened by an Assessment Director, once hired, with continued oversight by the Office of Accreditation and Quality Assurance and the Institutional Effectiveness and Assessment Committee and its Sub-Committees.
On October 16, 2014, the College’s Academic Council discussed and agreed to maintain the
processes used in developing and implementing the college-wide assessment practices.
Periodic data reports from the Office of Institutional Research and Assessment as well as
program-level and course-level assessments are continuously used for improvement of student
learning. During departmental retreats, meetings and calibration sessions, these results inform student learning at both the program and course levels. While the OIRA provides periodic
departmental datasets that include general information on grades, GPAs, and pass rates for
students, the program and course assessments provide more specific information on learning
outcomes based on program criteria.
The process for sustained assessment of student learning therefore begins at the course level with the instructional faculty: courses provide the learning experiences that define the student learning outcomes, and faculty use instruments that measure student performance on those expected outcomes. Samples of course assessments are in Appendix V: Examples of Assessment Instruments. These reports will be generated at the end
of each semester, beginning Fall 2014 using prescribed templates and will be submitted to the
Departmental Assessment Coordinators for analysis. Department coordinators will be guided by
School-based coordinators in reviewing data and analyses. They will work with their Chairs to
present outcomes data to their department staff for discussion and actions/decisions. Prior to
submission to the Office of Academic Affairs, Department Chairs will share summary reports at
their department and respective School meetings, showing how their data informs their
academic programs, action planning and PMP targets. The Office of Academic Affairs in
conjunction with the Institutional Effectiveness and Assessment Committee (IEAC) will review
reports and provide further guidance, if needed, in using pertinent data for future action
planning, including resource allocation and meeting PMP targets. The IEAC will receive reports
and also provide feedback and guidance in addressing any challenging areas/areas of concern.
OAQA archives data and reports on its SharePoint portal. Active reports and data, which include student learning outcomes, action plans, budget requests and allocations, will be stored on Strategic Planning Online (SPOL). This system will be active in Spring 2015. Public dissemination of student learning outcomes will be managed by the Assessment Director (new hire) under the guidance of the Office of Accreditation and Quality Assurance and in collaboration with the Office of Institutional Research and Assessment.
The College’s review and analysis of student outcomes in past years revealed critical needs
that warranted immediate steps to improve outcomes and progression in degree programs. The
greatest challenge continues to be the developmental needs of first-time freshmen in the areas of mathematics, reading and writing. As Table 6 indicates, 85.8% of the first-time freshmen who entered in Fall 2013 (897 of 1,046) required developmental education in at least one area.
Table 6: Developmental Needs of First Time Freshmen

Academic Year | Total First-Time Freshmen | Total in Need of Remediation | Needs 1 | Needs 2 | Needs 3 | Needs Any
Fall 2011 | 1201 | 934 | 42.2% | 19.2% | 16.3% | 77.8%
Fall 2012 | 1045 | 881 | 50.9% | 17.8% | 15.6% | 84.3%
Fall 2013 | 1046 | 897 | 59.3% | 17.1% | 9.4% | 85.8%

The Summer Immersion Programs were successful in meeting the developmental educational needs of the College’s students. Additional resources were allocated to the immersion program to increase the number of students prepared to enter BA programs, since data show that retention rates are higher for BA students exiting immersion than for AA students not enrolled in immersion (64.7% vs. 56.1%). Table 7
shows the pass rates of students who participated in the Summer Immersion program as
compared to those students enrolled in the fall semesters (Appendix GG: CUE Report).
Table 7: Summer Immersion Pass Rates

Semester | Math | Reading | Writing
Summer 2011 | 87% | 52% | 52%
Fall 2011 | 31% | 35% | 38%
Summer 2012 | 38% | 47% | 41%
Fall 2012 | 28% | 39% | 39%
Summer 2013 | 69% | 47% | 67%
Fall 2013 | 29% | 50% | 47%
Summer 2014 | 75% | 55% | 69%

The College recognizes the need to support students with developmental education in
order to facilitate their credit accumulation and progression through their degree programs. To
meet these needs the English and Mathematics departments developed specific developmental
education projects to improve student progress.
Developmental English
The main objective of the “CATW Course Redesign Project” was to improve the reading
and writing test pass rates (CATW and ACT Reading) of students in developmental level courses.
It was a collaborative effort between English faculty and Mass Communications, Creative and
Performing Arts and Speech (MCCPAS) department faculty, designed in the Spring of 2013.
Faculty members in the MCCPAS were chosen because they teach credit-bearing courses that can be taken in conjunction with developmental Reading and Writing courses: there are no prerequisites for Music (MUS 100), Art (ART 100), and Speech (SPCH 102). These are among the only credit-bearing courses that students can take while enrolled in remedial and developmental Reading and/or Writing.
Approximately 33 courses were targeted in the “CATW Course Redesign Project,” and
these were the Experimental Group. These were primarily Art 100, Music 100, and Speech 102
with 4 Dance classes included. In this experimental group, the goal was to have Music, Art, and
Speech faculty reinforce particular Reading and Writing skills specifically aimed at the Reading
and Writing exit tests. The control group consisted of the 48 classes that did not participate in the workshops or the Course Redesign Project. About 22% (572 students) of the total students (2,612) in Art, Music, and Speech were enrolled in a developmental Reading or Writing course, or both. About 481 students were in ENGL 112 classes (about 18%).
For the most part, the experimental group outperformed the control group. For
example, in Reading courses (005 and 006) the experimental group had a 60.78% pass rate
while the control group had a 50% pass rate. In the Writing courses the pass rate was 54.5% for
the experimental group, and the control group had, again, a lower pass rate of 45.2%.
A description of the implementation process, results from the pilot study, and future goals stemming from these initial results are included in Appendix HH: Developmental English Redesign Project.
Developmental Mathematics
The Developmental Mathematics program developed three differentiated project
approaches to meet the needs of students and to reduce the time spent in remediation:

•	Project #1 reduced the classroom lecture hours in MATH 009 and MATH 010 from 4 hours to 3 and converted the remaining hour into a 1½-hour recitation staffed by the instructor along with tutors or peer leaders. This resulted in more hands-on, individualized practice for students who, based on test scores, could benefit from a full semester of either MATH 009 or MATH 010. The Mathematics Department calls this approach “modulated self-pacing,” which includes diagnostic pre-tests administered at the beginning of each topic, an item analysis of the results, and an individualized study plan based on those results.

•	Project #2 combined MATH 009 and MATH 010 into an accelerated model for students who, based on test scores close to the cut-off, may not need two semesters of developmental math. Again, classroom lecture hours were reduced from 8 to 6, with a mandatory recitation time of 3 hours in a computer lab.

•	Project #3 combined MATH 010 with the gateway course MATH 136 into an accelerated model for students who, based on test scores close to the cut-off, can benefit from being placed directly into a college-level course with additional support. As in Project #2, classroom contact hours were reduced from 8 to 6, with a mandatory recitation of 3 hours in a computer lab.
Using results from the diagnostic pre-tests, faculty were able to individualize instruction and interventions and to develop study plans for students. Assessment of students’ strengths and needs garnered from the diagnostics led to more focused interventions, including tutoring and technology-based and hands-on instruction, approaches that are supported by research (Larsen et al., 2011).
Additionally, the Developmental Mathematics program developed alternative pathways
to improving mathematics proficiencies for non-STEM majors. The 2014 pilot of the
Mathematics Redesign program, funded by CUNY, resulted in a 76% pass rate and an 88%
retention rate among the students who participated. Details of the project, results and future
implementation timelines are included in Appendix II: Developmental Mathematics Redesign
Project.
The First Year Experience Redesign (FYE) Project
Institutional data for 2011-2013 revealed that while retention rates for all degree students ranged from 64% to 94%, retention rates for students needing developmental preparation were below 50% in both the first and second years of college. The FYE Redesign supports the transition of students from Associate to Baccalaureate programs, increases one-year retention rates and ultimately increases graduation rates as students progress from entrance to the College to degree attainment. The redesign of the first year experience was one
of the primary strategic decisions that resulted from the College-wide Strategic Planning Retreat
held February 2014.
The Committee proposed several activities to strengthen the experiences of all first-time
freshmen, including summer orientation immersion sessions, an intentional advisement model,
revision of the Freshman Seminar syllabi, bridge programming in developmental courses,
common hour programming, and thematic learning community cohorts, among others
(Appendix JJ: First Year Experience Redesign).
General Education Program Assessment
While the ultimate responsibility for program revisions and implementation is with
faculty, there are instances where initiatives for program revision are in response to policy
proposals and guidelines from CUNY. In November 2008, the College Council of Medgar Evers
College set up a General Education Committee to revise the General Education curriculum and
program of the College. The General Education Committee retrofitted the MEC General
Education Curriculum to align with Pathways. CUNY's new general education framework is a
central feature of Pathways. It lays out requirements that undergraduate students across CUNY
must meet. Importantly, it also guarantees that general education requirements fulfilled at one
CUNY college will carry over seamlessly if a student transfers to another CUNY college
(Appendix KK: Pathways Resolution and Framework). In Spring 2013, all academic units in the
College implemented changes to their degree programs in compliance with the CUNY mandate.
Medgar Evers College’s General Education Committee was charged with developing a
systematic method for assessment of the General Education curriculum. The initial task carried out by the Sub-Committee during 2013-2014 was the planning and development of a pilot study that would use two beginning courses (ENGL 112 – College Composition and MATH 115 – The Nature of Mathematics) to identify key assignments and common instruments to measure requisite student learning outcomes prior to full implementation of assessment across the general education curriculum. The Committee also reviewed capstone experiences across the general education courses to determine how they represent the General Education Program and how they can be used as an additional assessment of it. The first assessment of the general education curriculum is currently underway in Fall 2014, using the Essential Learning Outcomes of Liberal Education and
America’s Promise (LEAP) rubrics to measure student learning in the required core of the
general education curriculum (Appendix LL: General Education Assessment Process and
Rubrics).
Assessment across other courses in the General Education Program will be phased in
between the Fall 2014 and Spring 2016 semesters, using the following timetable (Table 8), with
full college-wide implementation by the end of Spring 2016. A phased approach was necessary
to accommodate the unique planning needs for each department/unit in measuring students’
progression in the general education curriculum clusters which were only fully implemented in
2013.
Table 8: Timeline for General Education Assessments

Scheduled Term | Department & General Education Program | Types of Assessments | Assessment Instruments
Fall 2014 | ENGL 112, MTH 115, ART 100, MUS 100 | Required Core; Course Level | LEAP Essential Learning Outcomes (ELOs) developed by the American Association of Colleges & Universities
Spring 2015 | ENGL 150, BIO 101, PHS 101, HIST 200, SSC 101, BIO 211 | Required Core; Socio-Cultural & Diversity; College Option Courses; Course Level | LEAP Essential Learning Outcomes (ELOs) developed by the American Association of Colleges & Universities; Departmental Assessments (TBD)
Fall 2015 | Cluster III | Flexible Core; Course Level | LEAP Essential Learning Outcomes (ELOs) developed by the American Association of Colleges & Universities (see Appendix LL)
Spring 2016 | Cluster IV | Socio-Cultural & Diversity; College Option Courses | Departmental Assessments (TBD) (see Appendix LL)

Academic Program-Level Assessments: Program Reviews and Professional Accreditation Self-Studies
As delineated in the MEC Assessment Plan submitted with the 2013 Monitoring Report,
the College instituted an Academic Program Review Schedule that included both professionally
accredited and non-accredited programs. The primary office responsible for these reviews is
the Office of Accreditation and Quality Assurance (OAQA). Progress on meeting the MEC
Academic Program Review Schedule as submitted is delineated in Table 9: Program Reviews
and Accreditation Status Reports.
Table 9: Program Reviews and Accreditation Status Reports

Accredited Programs

Program / Degree | Accrediting Body | Next Accreditation | 2012-2013 | 2013-2014 | 2014-2015 | Notes

School of Liberal Arts and Education
Education: Childhood Education – BA; Childhood Special Education – BA; Early Childhood Special Education – BA; Teachers Education – AA | NCATE | 2019 | E | AA | AA | See Action Plans for implementation of accreditor recommendations.
Social Work – BSSW | CSWE | 2015 | AA | AA | P | Additional faculty line hired. Administrative Assistant hired. Continued administration and evaluation of assessment instruments as scheduled.

School of Science, Health and Technology
Nursing: AAS/PN; BSN | NLNAC (AAS/PN); NYSED (BSN) | 2015 | AA | AA | P | Data collection initiated. Syllabi review initiated. Self Study Committees established and meeting. Faculty attended ACEN Conference; revised ACEN Standards reviewed.
Physical, Environmental & Computer Sciences: Environmental Science – BS | EHAC (in application stage) | – | S | E | AA | Quarterly reports submitted Dec. 2013 and April 2014 indicating progress made in meeting EHAC requirements for accreditation.

School of Business
Accounting – BS | ACBSP | 2014 | S | E | AA | See School of Business notes below.
Business Administration: Applied Management – BPS; Business – BS; Business – AS | ACBSP | 2014 | S | E | AA | See School of Business notes below.
Computer Information Systems: CIS – BS; Computer Application – AAS | ACBSP | 2014 | S | E | AA | See School of Business notes below.
School of Business notes: Self Study submitted Jan. 2014; accreditation re-affirmed May 2014. (The School received a one-year extension for submission of its Self-Study. In Summer 2013, faculty requested that OAQA take the lead in directing the Self Study.) Note: ACBSP accredits at the College/School level; thus, the School of Business is accredited to offer the degrees listed in the first column.
Non-accredited Programs

Program / Degree | Program Review Last Completed | 2012-2013 | 2013-2014 | 2014-2015 | Notes
English: English – BA; Liberal Arts – English Studies – AA | 2011 | E | I | AA | See Action Plans for implementation of External Reviewer recommendations.
Philosophy & Religious Studies – BA | 2006 | P | S | E | The Department submitted its Review in March 2014. The low number of degree students in the P & R concentration resulted in a recommendation to delist the degree program. A decision on delisting the degree program and integrating courses into Social & Behavioral Sciences is pending. No external reviewer was identified, based on the departmental decision to delist the program.
Psychology – BA | 2009 (inc.) | AA | P | S | Presentation made by OAQA at a departmental meeting discussing the overview and requirements of the Program Review. APA Guidelines distributed to all faculty. APA competencies table created by faculty to facilitate review and alignment of program goals and student learning outcomes to APA competencies.
Social & Behavioral Sciences: Liberal Studies – BA; Liberal Arts – AA | 2006 | AA | P | S | OAQA completed an analysis of course offerings and student enrollment in concentration areas and in the AA and BA programs. OAQA completed a data background report on the AA and BA degrees to facilitate data-driven decision making; this includes reviewing and trending reports received from OIRA and the Registrar.
Biology: Biology – BS; Science – AS | 2006 | P | S | E | Program mission and goals revised by faculty. The Biology Program Review remains in progress, with several sections to be completed. Data and survey results and analysis provided by OAQA and OIRA.
Mathematical Science – BS | 2014 | S | E | I | Mathematical Sciences Program Review submitted February 2014. Data and survey results provided by OAQA and OIRA. External Review Team on-site visit conducted on March 14, 2014. External Reviewers Report submitted April 2014. See Action Plans for implementation of recommendations.
Physical, Environmental & Computer Sciences: Computer Science – BS; Computer Science – AS | 2006 | AA | AA | AA | Preparation slated for next year.
Public Administration: Public Administration – BS; Public Administration – AS | 2006 | AA | AA | P | Preparation to be initiated this year.
KEY: P (Preparation); E (External review); S (Self Study); AA (Annual Assessment)
For 2013-14, OAQA provided procedural oversight and, in collaboration with OIRA, provided content for several of the School of Business Re-Accreditation Self-Study standards; provided data and content for the completed Program Reviews; and created data reports for the preparation stage of the Social and Behavioral Sciences Program Review (Appendix X: Program Reviews and External Evaluator Reports; Appendix X.1: Overview of Departmental Data Discussion Document; and Appendix X.2: Concentrations Selected by Degree Students).
In addition to working on their respective accreditation or Program Reviews, each
academic department reviewed and revised their mission statements, program goals and
student learning outcomes as part of the larger OAA Syllabus Revision Project. Results of these
activities will fold into the Program reviews and pending accreditation reports.
The initiatives of the administrative units included the implementation of CUNYfirst, the CUNY Pathways Initiative, completion of the Library expansion and other facilities upgrades, several financial audits, and an overhaul of the IT infrastructure. Nevertheless, each area submitted the first iteration of its assessment plan, which will provide the basis for future revisions (see Table 9: Program Reviews and Accreditation Reports, Administrative Units).
The Sub-Committee of the Institutional Effectiveness and Assessment Committee convened in September to review all plans. The next step in the cycle was to schedule two to three full-day review meetings to meet with each department's assessment representatives in forty-five-minute sessions, discuss the plan, and provide feedback. Departments whose plans failed to meet expectations were required to resubmit in February.
Further development of the strategic planning software will streamline the collection and
analysis of Action Plans. Annual academic action plans and the strategic operational plans
follow a standard format, and have been implemented across units during the last two years.
Progress made in design, utilization and assessment of strategic action planning initiatives,
including the identification of priority goals and budgetary decision-making can be found in
Appendix T: Key Outputs from 2013-2014 Strategic Action Planning.
Course Level Assessment
Through the Academic Assessment Committees’ evaluation of course-level assessments,
instructional faculty engaged in close review of their course content as it related to student
learning outcomes in the areas of knowledge, skills, reasoning, product and/or dispositions, and
how to use performance outcomes data to improve their own teaching as well as student
learning. Faculty also came to understand how each course contributes to program outcomes at the departmental level as well as to institutional outcomes in meeting the College’s mission and goals. Faculty members ensured that student learning outcomes reflect these alignments (Appendix CC: Sample Assessment Handbook - Biology).
Part of the responsibilities of the Assessment Coordinators was to ensure that academic
areas collect, analyze and report student learning outcomes on key learning experiences,
beginning Fall 2014. Samples of assessment instruments that are being used are included in
Appendix V: Examples of Assessment Instruments. This practice, in addition to the current use
of periodic departmental data sets provided by OIRA, will continue every semester as faculty
are further engaged in continuous reflection of outcomes data and their uses for both course
and program decision making and improvement. Departmental biannual reviews of these data, following the assessment plans’ data collection timelines, will also inform and guide strategic action planning and resource distribution, including budgetary allocations and CUNY PMP targets, thereby completing the assessment loop. Collecting data and providing evidence of course-level assessments are already practiced by accredited departments and serve as models for college-wide course assessments. Biannual college-wide data collection on course-level assessments, beginning in the Fall 2014 semester, will use these models. A sample template showing course-level key
learning assessments and their alignments with program goals is included in Appendix MM:
Course-Level Learning Assessment and Alignment Template.
The following Table 10 provides updates from Fall 2013 – Summer 2014 to the planned
course level assessments submitted in the 2013 Follow-Up Report.
Table 10: Progress on Planned Course Level Assessment

Plan | Activities Completed
Identify courses for assessment (2-4 per type per year) | Identified in Table 11
Faculty to participate (creating a cohort of faculty that will meet together periodically for professional development and sharing of strategies) | Assessment Coordinators Cohort: School-level Coordinators; Departmental Coordinators; OAQA
Provide training and opportunities for collaboration to support faculty in assessing student learning based on existing learning outcomes and assessment methods | Conference attendance (MSCHE, Nassau Community College); External Consultant (individual meetings and group workshop with the Consultant); Lead Assessment Coordinators – School Level
Provide support for faculty to review outcomes and develop plans for the improvement of student learning as well as assessment methods | In-College workshops; One-on-One meetings with Assessment Coordinators
Provide venues for faculty to share results and plans with colleagues | IEAC meetings/forums; School and department meetings
Provide professional development and financial support to expand the use of e-Portfolios for course-level assessment | Departments have determined that ePortfolio will be infused in capstones only. Currently, the Education and SEEK programs use ePortfolio.

Summary
Medgar Evers College is pleased to report and demonstrate its compliance in meeting the
rigorous standards to maintain accreditation from the Middle States Commission on Higher
Education. In spite of the many changes at the leadership level of the institution during the last
two years, the College’s administrators, faculty, staff and students united in developing and
strengthening a comprehensive and consistent culture of assessment. By consciously embracing
the recommendations and suggestions provided by MSCHE, as well as other external reviewers,
the College is now on a path to continuous improvement in all areas. The above examples provide evidence that Medgar Evers College is using assessment data not only to improve institutional effectiveness and student learning outcomes but also to impact the larger community of Central Brooklyn in ways that will eventually improve its intake pool and, ultimately, its retention and graduation outcomes. This is a moral obligation that reflects the
College’s mission and existence in Central Brooklyn.
This process of self-evaluation has not only generated a new commitment to effective
practice, but also has reenergized the college community in collaborative engagement in
refining current programs and developing new programs, services and facilities to better serve
our diverse student population, faculty, staff and community. Following a period of instability
and insecurity, Medgar Evers College is again at a place where it is valued for its contributions
to higher education, as evidenced by the increased student enrollment and retention,
community engagement and global awareness and partnership interests during the last year.
Under our new leadership, the College has made significant improvements across
several areas within one year, demonstrating the resiliency of the institution. Beginning with
the rebranding efforts, restructuring of student services delivery, aesthetic improvements,
community engagement, professional development initiatives, and collaborative consultations
fostered by President Crew, the College is now on track to fulfill the mandates for which it
was created: to serve the marginalized, culturally and linguistically diverse population of Central
Brooklyn and environs, with Courage, Strength and Fortitude.