ASSESSMENT TASK FORCE
FINAL REPORT
Mission, Goals, Outcomes
Measurement & Data
Results & Analysis
Use of Results & Analysis for Improvement
MARCH 2011
Assessment Task Force Members
Nedra Alcorn
Associate Vice President
Office of Student Services
Rondall Allen, Pharm.D.
Associate Dean for Student Affairs and Curricular Assessment
College of Pharmacy
B. Cecile Brookover, MBA, Ph.D., Chair
Director for Institutional Effectiveness and Assessment
Office of Planning, Institutional Research and Assessment
Jacques Detiege, M.S.
Assessment Specialist
Office of Academic Affairs
Van Allen Gale, MBA
Senior Institutional Research Analyst
Office of Planning, Institutional Research and Assessment
Monique Guillory, Ph.D.
Special Assistant to the Administration
Xavier University of Louisiana
Elizabeth Yost Hammer, Ph.D.
Director, Center for Advancement of Teaching
Office of Academic Affairs
Treva Lee, Ph.D.
Director for Institutional Research
Office of Planning, Institutional Research and Assessment
Jonathan G. Rotondo-McCord, Ph.D.
Associate Dean, College of Arts and Sciences
Office of the Dean, College of Arts & Sciences
Lisa Schulte-Gipson, Ph.D.
Associate Professor, Department of Psychology
Chair, Core Curriculum Assessment Committee
Executive Summary
Following a successful ten-year reaffirmation from SACS, Xavier initiated an assessment summit
to establish a strategic framework for assessment. While Xavier was found to be in compliance with all
SACS requirements and standards addressing assessment and outcomes assessment, the SACS On-site
Committee identified a number of areas for improvement. Members of Xavier’s SACS Leadership Team
and its Compliance Audit Committee also felt that a full review of the assessment process was
warranted and would set the stage for the next five- and ten-year SACS reviews. The summit was held on June 15, 2010, and was facilitated by a professional consultant from Leadership Strategies. Objectives included: (1) reach consensus on the purpose, or mission, of assessment at Xavier, (2) agree on a shared vision and long-term goals derived from that vision, (3) identify critical success factors and barriers for each of the long-term goals, (4) develop a list of possible strategies, or projects, for moving toward the vision and achieving critical success factors, (5) select priority strategies from the list of possible strategies, and (6) identify next steps, milestones, and accountabilities associated with the priority strategies.
An Assessment Task Force, chaired by Dr. Cecile Brookover, the Director for Institutional
Effectiveness and Assessment, was organized following the Summit and was charged to carry out a full
review of university assessment. The task force members represented a diverse set of nine
administrators, faculty and staff from the Xavier community.¹ The charge to this group included three prime objectives: to map the areas where assessment occurs, to identify strengths and weaknesses in
the assessment process and to identify best professional practices in assessment. In addition, the task
force was charged to develop a preliminary set of recommendations, to seek input regarding these
initial recommendations, and to report to the University Planning Council. This report documents the
findings of the Assessment Task Force and their recommendations for improvement of the assessment
process at Xavier.
The state of assessment at Xavier was certified as meeting core requirements and
comprehensive standards based upon the recent SACS reaccreditation. However, in carrying out its
charge, the Assessment Task Force found that units, departments, programs, and individuals have
widely varying abilities to understand, conduct, and use assessment activities appropriately. The Task
Force also found that while units, departments, and programs carried out the process of assessment successfully through the SACS reaccreditation review, there was widespread skepticism about the utility of the process. In focused interviews conducted as a part of carrying out its charge, the Task Force concluded that in many areas the assessment process was not yielding documented improvements and that many improvements were occurring outside the assessment process and were not well documented. The recommendations from the Task Force are summarized
under the two major categories of The Assessment Process and Outcomes Assessment; within the two
categories the recommendations are prioritized as Essential (E), which should be adopted now; Short-Term (ST), which should be adopted within one year; and Long-Term (LT), which should be adopted by the next three-year assessment cycle.

¹ The Task Force members are Ms. Nedra Alcorn, Dr. Rondall Allen, Dr. Cecile Brookover, Mr. Jacques Detiege, Mr. Allen Gale, Dr. Monique Guillory, Dr. Elizabeth Hammer, Dr. Treva Lee, Dr. Jonathan Rotondo-McCord, and Dr. Lisa Schulte-Gipson.
Assessment Process Recommendations
Coordination of effort and systems would improve the effectiveness of the assessment process as well
as the university’s institutional effectiveness. Resources related to conducting assessments would also
improve effectiveness. Recommendations related to the assessment process itself are as follows.
1. Establish a Xavier Assessment Group that serves as a monitoring and review committee for Xavier assessment practices. The committee would include appointed members from academic programs, administrative programs, and special programs as well as the Director for Institutional Effectiveness and Assessment, the Assessment Specialist, and the Chair of the Core Curriculum Review Committee. (LT)
2. The Director for Institutional Effectiveness and the Assessment Specialist should prepare a handbook for assessment at Xavier that explains the assessment process and includes specific instructions for carrying out assessments. (E)
3. The Director for Institutional Effectiveness and the Assessment Specialist should develop a website specifically for assessment that includes the aforementioned handbook and other resources. The website should be housed within the Office of Planning, Institutional Research and Assessment's website. (E)
4. The cycle of five-year academic program reviews should be coordinated with the annual program assessments so that the program reviews incorporate the currently assessed learning outcomes related to the specific program being reviewed. (ST)
5. The programs with courses that are assessed in the Core Curriculum assessment process should coordinate with the Core Curriculum Review by providing a method of assessing core learning outcomes beyond the CAAP standardized test. (E)
6. Individual academic program units should coordinate their curriculum with their learning outcomes by preparing a curriculum map to document which courses relate to the learning outcomes being assessed. The TracDat system provides a tool to develop a curriculum map. (LT)
7. Planning, assessment, improvement, and institutional effectiveness should be better coordinated by fully implementing the available components of the TracDat software. (ST)
8. Student recruitment and retention plans formulated by the academic departments should be incorporated into TracDat outcomes assessment. (LT)
9. Smaller academic programs in similar disciplines should consider coordinating their assessment by adopting some common outcomes that could be assessed as a larger sample. (ST)
10. Emphasize the importance of documentation in the TracDat Assessment Management System,
including documentation of outcomes, measures and related criteria, and improvements based
upon analysis of data. (E)
11. Set up assessment on a continuing basis throughout the year, not just at the end of the year. (ST)
12. Adopt the recommendations from the Evaluation Group arising from the Assessment Task Force
Department Chair Survey conducted in Fall of 2010. (ST and LT) Details of these
recommendations can be found under the Evaluation Group section of this document.
Outcomes Assessment Recommendations
Specific recommendations related to outcomes assessment can be divided according to whether the
outcome concerns academic programs or administrative units. Recommendations for academic
programs and related support programs include the following items.
1. SACS reviewers' comments should be used to improve assessment. (E)
2. Increase process compliance to prior levels. (E)
3. Use weaknesses identified in Summer Program assessment to improve assessment year-round. (E)
4. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
5. Limit the number of learning outcomes assessed in a three-year cycle to three to four specific (knowledge, skills, abilities) outcomes that can be assessed using multiple methods. (E)
6. Prepare a curriculum map to identify courses where outcomes can be assessed using embedded course assessment employing rubrics. (E)
7. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and improvement using face-to-face and online training. (E)
Recommendations for administrative units include the following items.
1. Increase process compliance to prior levels. (E)
2. Emphasize the assessment of unit operations and processes. (E)
3. Determine the core functions of the unit that allow for reaching the goals of the office. (E)
4. Determine how the unit interacts with other administrative and/or academic units when carrying out its core functions. Administrative units carry out their functions across multiple offices. (E)
5. Choose three core functions and state them as outcomes for evaluation for each three-year cycle of assessment. (E)
6. Include the Institutional Effectiveness Survey results as an additional outcome. (E)
7. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
8. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and improvement using face-to-face and online training. (E)
The Assessment Task Force presents these recommendations as steps in developing a process of assessment at Xavier University of Louisiana that will ensure our university mission is met for all students. Additionally, adoption and successful implementation of the recommendations will guarantee that assessment of our educational programs and administrative processes follows best practices that encourage continuous improvement based upon careful evaluation and evidence. These recommendations, when implemented in a systematic fashion according to the timeline presented above, will establish a culture of assessment for continued success.
INTRODUCTION
Following a successful ten-year reaffirmation from SACS, Xavier initiated an assessment summit
to establish a strategic framework for assessment. While Xavier was found to be in compliance with all
SACS requirements and standards addressing assessment and outcomes assessment, the SACS On-site
Committee identified a number of areas for improvement. Members of Xavier’s SACS Leadership Team
and its Compliance Audit Committee also felt that a full review of the assessment process was
warranted and would set the stage for the next five- and ten-year SACS reviews. The summit was held on June 15, 2010, and was facilitated by a professional consultant from Leadership Strategies. Objectives included: (1) reach consensus on the purpose, or mission, of assessment at Xavier, (2) agree on a shared vision and long-term goals derived from that vision, (3) identify critical success factors and barriers for each of the long-term goals, (4) develop a list of possible strategies, or projects, for moving toward the vision and achieving critical success factors, (5) select priority strategies from the list of possible strategies, and (6) identify next steps, milestones, and accountabilities associated with the priority strategies.
An Assessment Task Force, chaired by Dr. Cecile Brookover, the Director for Institutional
Effectiveness and Assessment, was organized following the Summit and was charged to carry out a
complete review of university assessment. The task force members represented a diverse set of nine
administrators, faculty and staff from the Xavier community.² The charge to this group included three prime objectives: to map the areas where assessment occurs, to identify strengths and weaknesses in
the assessment process and to identify best professional practices in assessment. In addition, the task
force was charged to develop a preliminary set of recommendations, to seek input regarding these
initial recommendations, and to report to the University Planning Council. This report documents the
findings of the Assessment Task Force and their recommendations for improvement of the assessment
process at Xavier.
The first meeting of the Assessment Task Force was held on August 11, 2010. The Chair presented a
plan for the operation of the group, and members stated their preference for assignment to one of
three working subcommittees. Subcommittee assignments were as follows:
1. Evaluation Group — Allen, Brookover, Rotondo-McCord and Schulte-Gipson
2. Best Practices Group — Alcorn, Brookover, Guillory and Hammer
3. Mapping Group — Brookover, Detiege, Gale and Lee
In February 2011, an interim PowerPoint presentation on Task Force activities was given by Dr.
Brookover to the University Planning Council. The present document is the final report for the
Assessment Task Force.
² The Task Force members are Ms. Nedra Alcorn, Dr. Rondall Allen, Dr. Cecile Brookover, Mr. Jacques Detiege, Mr. Allen Gale, Dr. Monique Guillory, Dr. Elizabeth Hammer, Dr. Treva Lee, Dr. Jonathan Rotondo-McCord, and Dr. Lisa Schulte-Gipson.
EVALUATION GROUP
The function of the Evaluation Group was to examine the strengths and weaknesses of the
current assessment system at Xavier. Their examination included a review of preparation for the recent
SACS review, a survey on assessment sent to the academic departments, a series of key informant
interviews of assessment personnel, and a review of recent assessment reports.
Survey of College Curriculum Initiatives, 2000-2010
In December 2009, as part of Xavier University’s overall preparation for SACS reaccreditation review,
the College of Arts & Sciences (CAS) Dean’s office requested the help of all college departments and
divisions in identifying important changes and revisions, made over the past decade, that have affected
either the core curriculum, departmental programs (majors and minors), or both. Department/division
chairs were asked to respond to the following three questions:
1. During the past few years (both before and after Katrina), what have been the most significant curriculum revisions made by your department? Examples of such revisions could be (but are not limited to): changes to requirements for the major, minor, honors sequences, or labs; development of new key courses; changes to core courses and/or core requirements offered by your department; or substantial redesign of the way existing courses are taught.
2. Have any of these changes been made as a result of your department's assessment (either direct or indirect) of student learning? If so, please explain.
Departmental summaries of key curriculum revisions, i.e. the natural end of any formal or informal
assessment process, would better allow retrospective identification of linkages between educational
outcomes assessment efforts—formal or informal, centralized or decentralized—and use of results to
improve programs. In order to provide context for the recent work of the newly formed Assessment Task Force, a brief synopsis of the College Curriculum Initiatives report follows.³ Since the strategy of the
report was to work “backwards” to identify points of linkage between curriculum change and
assessment, the summary below will treat these two themes in that order. After a short listing of
category types, selected examples are given for each type of curricular revision or assessment.
Types of curricular changes:
Revisions of departmental curricula during the years before and after Katrina can be grouped into
the following main categories.
1. Introduction of new courses and revision or deletion of existing offerings.
2. Revision, introduction, and deletion of major and minor programs.
3. Revisions to existing core curriculum and adoption of a new core.
4. Development of interdisciplinary programs (esp. Women's Studies).
5. Declared incorporation of fundamental themes (e.g. globalization, service learning) into the broader curriculum.
³ For the full report, see "College of Arts & Sciences Curriculum Initiatives, 2000-Present, Xavier University of Louisiana" (Jan. 2010). Cf. also "Xavier University of Louisiana, 2010 SACS/COC Focused Report" (Feb. 2010).

Types of assessments leading to change:
In general, the assessments considered in the report fall into the categories shown below.
6. Internal college and departmental self-studies (formal and informal)
7. Externally assisted evaluations
8. Student outcomes assessment (internal and external) of major programs and core
Examples:
The most common type of curricular change (cf. item 1 above) is the development of new courses, and revision of existing ones, in order to meet students' learning needs more effectively and to keep a department's offerings up-to-date and in line with new developments in research and learning.
This also happens in the context of review and revision of a major program as a whole (item 2). A good
example of both processes taking place in a complementary fashion was provided by the sociology
department, which between 2005 and 2010 undertook the following revisions:
1. Reduction of maximum enrollment limits in core SOCI 1010 sections, with the goal of increasing critical thinking and writing skills and incorporating service learning (items 1, 3, and 5).
2. Development of Women's Studies courses (item 4); use of ETS Major Field Test results to modify the sociology major (items 2, 7).
3. Review of nationwide discipline trends and use of both internal assessments and outside consultants to further improve the major program (items 1, 2, 6-8).
A second important area of curricular change in response to assessment can be noted where a
department focused on a particular key part of its curriculum for improvement, rather than an overhaul
of an entire major program. A good example of this kind of change took place in the biology department,
which used an internal self-study to identify weaknesses in the BIOL 1230/1240 (General Biology for
majors) sequence (items 1 and 6 above). In part as a result of this self-study, the department was
awarded Louisiana Board of Regents funding to develop a multi-year program to revise and improve the
BIOL 1230/1240 course sequence.
Another pattern of curriculum improvement took place in departments that chose to focus on the
effectiveness of their core curriculum offerings. This was the case with the history department which, as
a result of a two-year process of discussion and departmental retreats, developed a proposal to allow
students a much wider range of choice in fulfilling their history core requirement than had previously
been possible (items 3 and 6). Similar initiatives giving students greater flexibility of core course choice
were also undertaken in the departments of communication studies and theology. A greater emphasis on
globalization throughout the history curriculum (both core and upper-level) was also adopted, in part as
a result of earlier pre- and post-test assessments that demonstrated low student knowledge of world and
historical geography (items 5 and 8).
Streamlining the curriculum in the interest of greater efficiency and better use of resources has also
taken place in some areas, especially after Katrina. Examples include the elimination of the major
program in microbiology (biology department), and the minor programs in law and humanities
(philosophy department) and international studies (political science) (item 2 above).
Finally, a major ongoing assessment and improvement project continues to be the assessment of the
college core curriculum, which falls under the oversight of the CAS Core Curriculum Assessment
Committee (CCAC), with the assistance of administrative units and academic departments (items 3 and 8
above). While no single assessment project led to the adoption of the new core (effective Fall 2010 for all
new students), previous departmental assessments (both formal and informal) of their own core courses
informed faculty discussion and deliberation during the extended process of considering the new core
for adoption. The efforts of the CCAC have continued steadily on a yearly basis since Fall 2008, so that
future revisions of the core will be solidly based on comprehensive data and a regular assessment plan.
Assessment Survey of Department Chairs
Rather than duplicating the previous work of the comprehensive report described above, the
Evaluation sub-committee commissioned a survey of academic department chairs, which was carried out
by the Office of Planning, Institutional Research, and Assessment during fall of 2010.
Background
In the fall of 2010, the evaluation subcommittee of the Assessment Task Force developed a
survey to send to all department chairs in the College of Arts and Sciences. The survey was designed to
determine the following: What assessment items are included in the assessment plans of the various departments? Which stakeholders are actively involved in the assessment process? Do the departments have adequate resources to complete the assessment activities in their respective plans? And are the departments using assessment results to improve their programs?
Demographics
Surveys were sent out to the 18 departments composing the College of Arts and Sciences.
Fourteen surveys were returned by the specified deadline (a response rate of 78%). See Appendix B for
a list of the 14 departments that completed the survey. The average number of assessment goals developed was approximately seven (M = 6.62, SD = 2.26), with approximately 69% of them assessed yearly. Seventy-one percent of responding departments did not have an accrediting body, while 29% did.
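The reported response rate follows directly from the counts above:

$$\text{response rate} = \frac{14}{18} \approx 0.78 = 78\%$$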
Item A - Which of the following are included in your assessment plan?
In response to the item “Which of the following are included in your assessment plan,” 100% of departments indicated that student learning outcomes were included. Ten of the 14 responding departments (71%) indicated that student performance on senior comprehensive exams was included in assessment outcomes. Five of the 14 responding departments (36%) indicated that department goals, university mission, and student performance on national exams were included in their assessment plan. Four of the 14 departments (29%) indicated that department mission and student progression were included in their assessment plan. For a breakdown of the results by department see Appendix B, Table 1.
Item B - Who of the following is involved in developing the assessment plan for your department/division?
In response to the item “Who of the following is involved in developing the assessment plan for your department/division,” nine of the respondents (64%) indicated the chairperson, seven (50%) indicated a designated faculty group, and five (36%) indicated other. Other responses suggested that all faculty are involved and, in one response, that the former chair was involved. For a breakdown of the results by department see Appendix B, Table 2.
Item C - Who of the following is involved in data collection related to the assessment plan for your department/division?
In response to the item “Who of the following is involved in data collection related to the assessment plan for your department/division,” seven of the respondents (50%) indicated a designated faculty group, seven indicated the chairperson, and six (43%) indicated other. Other responses tended to indicate either all faculty (two responses) or individual faculty (two responses). Two respondents indicated that no one has been assigned this responsibility in their department. Table 3 in Appendix B provides a breakdown of this question by department.
Item D - Who of the following is involved in discussing/interpreting assessment results for your department/division?
In response to the item “Who of the following is involved in discussing/interpreting assessment results for your department/division,” nine of the respondents indicated the chairperson, six indicated a designated faculty group, six indicated other, and five indicated a designated faculty person. In general, other responses indicated that all faculty are involved. Table 4 in Appendix B provides a breakdown of this item by department.
Item E - Who is involved in evaluating the effectiveness of the assessment plan?
In response to the item “Who is involved in evaluating the effectiveness of the assessment plan,” six of the respondents (43%) indicated the faculty and six (43%) indicated other. Other responses indicated that all faculty are involved (three responses) or that the chair collaborates with one or two faculty members (one response). A breakdown by department can be found in Table 4 in Appendix B.
Item F - What priority does assessment take in relation to other departmental duties?
Overall, the 14 departments ranked the six items for this question in the following order from most important to least important: teaching, advising, scholarship, university service, assessment, and community service. Overall, assessment has the second-lowest priority. Other responses to this question included: “balancing all of the above and sharing with each other our thoughts and experiences in our teaching, scholarship, etc.” and “The above account for 70 plus hours per week. Is there any time left?” Table 5 in Appendix B provides the responses by department.
Item G - Do you have adequate resources to assess the items in your plan?
Ten respondents (71%) replied yes to this item, while four (29%) replied no. Respondents who replied in the negative indicated that they needed more time and/or “human assessment experts to evaluate the effectiveness, suggest improvements, and collect data.”
Item H - Are the results of the assessment plan being used to improve the department/division? Who decides how the results are used?
Thirteen of the respondents (93%) indicated that the assessment plan is being used to improve the department/division. Ten of the respondents (71%) indicated that faculty members decide how the results will be used to improve the department.
Item I - How many times per academic year does your department/division meet to discuss and/or review the assessment plan?
Respondents reported meeting an average of 2.43 times (SD = 1.02) per academic year to discuss and/or review the assessment plan.
Item J - My department/division uses previous assessment results to revise teaching and learning content, methods, and strategies.
On a one-to-seven Likert-type scale, with one being “strongly disagree” and seven being “strongly agree,” the mean response was 5.43 (SD = 1.99).
Recommendations from Analysis of the Chair Survey about Assessment
Based upon the data reported, the process of assessment varies across the different programs; some of the departments are not assessing all of their goals. All of the departments include student learning outcomes in their assessment plan, and a majority of the departments (10 of 14, 71%) assess student performance on the senior comprehensive exams. A third of the departments assess the university mission and student performance on national exams. Only one-third of the respondents reported that they include their division goals in their assessment plan. This may contradict the responses to questions 1 and 2, or it may reflect a misunderstanding of the question. Four departments indicated that they have an accrediting body; however, only one reported including accreditation requirements in its assessment plan.
The majority of the respondents indicated that the Department Chair and a designated faculty
group are responsible for developing the assessment plan. However, in some departments all faculty
members are engaged in the process. Although some departments include other stakeholders, the
designated faculty group and the chairperson are primarily responsible for discussing/interpreting the
assessment results. The majority of the respondents indicated that the Department Chair and a
designated faculty group are responsible for evaluating the effectiveness of the assessment plan.
However, in some departments all of the faculty members are engaged in the process. Two
respondents indicated that no one had been assigned that responsibility for their department.
Ten of the respondents indicated that they have enough resources to assess the items in their
assessment plan. All of the departments except one indicated that they are using the assessment results
to improve their respective departments. The majority of the respondents indicated that faculty are
responsible for making decisions on how the results will be used.
Twelve of the 14 departments state that they use previous assessment results to improve
teaching and learning content, methods, and strategies. Also, results from the Department and Division
Chair Survey indicate that many chairs are using the assessments results for curricular changes and
departmental improvements. However, the actual documentation of the use of assessments in the
TracDat assessment system does not indicate the same level of use of assessment results.
Recommendations
1. The Department Chairs and the faculty should determine why all of their goals are not being
assessed.
2. The Department Chairs and the faculty should evaluate their assessment plans to ensure their
goals are relevant, meaningful, and measurable.
3. The Department Chairs should rank-order goals, assessing essential goals more frequently than less essential goals.
4. The university should develop a list of core assessment items that should be included in the
departmental reviews and assessment plans of each department. The following items should be
considered for the list: student learning outcomes, department mission, department goals,
university mission, student progression, student performance on senior comprehensive exams,
student performance on national exams (if applicable), post-graduate training placement (if
applicable), and accreditation requirements.
5. All departments should consider involving all of the key stakeholders (e.g. administrators, faculty,
staff, students, and alumni) in developing the assessment plan.
6. All departments should consider involving all of the key stakeholders (e.g. administrators, faculty,
staff, students, and alumni) in discussing/interpreting the assessment results.
7. Encourage the Deans to work with the division chairs in meeting their needs to carry out the
assessment activities in their respective departments. Training and additional personnel were the
needed resources listed by the respondents.
8. Continue to encourage division chairs and the faculty to close the loop on all assessment projects by deciding upon actions to be taken based upon assessment results.
9. All departments should consider adding a student representative to their assessment team.
10. Encourage all chairs and faculty to use assessment results to improve teaching and learning in
their respective department.
11. The university could clarify priorities of activities (advising, assessment, community service,
scholarship, teaching, and university service).
Key Informant Interviews
Another component of the Evaluation Group’s process was to obtain information from people at
Xavier who had experience in assessment or in the SACS accreditation process by conducting a set of
“key informant interviews.” Key informant interviews are a needs assessment or evaluation tool that draws upon the expertise of people within an organization. Generally, they are semi-structured interviews, in which the interviewer begins with a set of prepared questions, but additional questions may be asked depending on the responses given. The prepared questions were as follows:
1. Briefly describe your experience with assessment at Xavier.
2. Given your specific responsibilities for assessment at Xavier, what can be done to improve that portion of assessment?
3. In your opinion, what is the strongest aspect of evaluation at Xavier and why?
4. In your opinion, what is the weakest aspect of evaluation at Xavier and why?
5. How could the process of assessment be improved?
6. Overall, what can be done to improve assessment at Xavier?
7. Please add anything that you think would help our committee with its task.
Interviews were conducted with Dr. Marguerite Giguette, Associate Vice President for Academic Affairs;
Dr. Ronald Durnford, Vice President of Planning, Institutional Research and Assessment; Dr. Rondall
Allen, Associate Dean for Student Affairs and Curricular Assessment, COP; Dr. Linda Mihm, Clinical
Associate Professor, Division of Clinical and Administrative Sciences, COP; Dr. Lisa Schulte-Gipson,
Associate Professor, Department of Psychology and Chair of the Core Curriculum Assessment
Committee; and Mr. Jacques Detiege, Assessment Specialist for Academic Affairs. The following table
summarizes responses to selected questions from the Key Informant Interviews.
Summary of Key Informant Interviews
Question A - What is the strongest aspect of assessment at Xavier and why?

Responses:
1. In general people are completing their assessments and have a decent understanding of them. Core Curriculum assessment has progressed a long way and has an established plan thanks to the help of Jonathan Rotondo-McCord and others.
2. In principle, it is comprehensive. Each unit assesses outcomes; faculty and staff are periodically reviewed; both administrative and academic units are periodically evaluated.
3. I believe that the culture of assessment is changing. Many individuals may not like assessment, but it seems that (more and more) they realize the importance of assessment.
4. Our Program Assessment Committee (COP).
5. The commitment to performing assessment, but there is not enough time to do it.
6. Capacity of the assessment staff.

Question B - What is the weakest aspect of assessment at Xavier and why?

Responses:
1. Assessment is more focused on process rather than tying learning outcomes to the curriculum. The biggest weakness is that assessment is not used for program changes. We need to ask “How has assessment led to this curriculum change?”
2. Most change occurs outside the formal assessment process; post-SACS process compliance has tapered off; the assessment too often does not lead to systematic improvements; assessment-driven actions and improvements are not always well documented; outcomes and measures are often pro forma.
3. Speaking from the perspective of the CCAC, the process of developing valid means to assess all aspects of the core is quite challenging.
4. Faculty engagement and assessment resources.
5. Lack of expertise and training.
6. Understanding and appreciation of the value of assessment among the general staff.

Question C - How could the process of assessment be improved at Xavier?

Responses:
1. The assessment cycle for programs has a due date that occurs when generally only Chairs are on campus. Who performs the “assessment of the assessments” and generates recommendations? Review of assessments should be conducted at more than one level (currently the Director of IR and Assessment and the Planning, IR and Assessment office) and include an Academic Affairs voice at the level of Dean and above. Refine TracDat language to avoid misunderstanding of the phrase “close the loop.” Determine why an action plan is required for all outcomes that have been met.
2. To improve, assessment must be perceived to be useful and must be useful. A combination of training, engagement, persuasion, feedback and encouragement. The process and timelines need to be clear. Perhaps an assessment crib sheet. Examples provided of effective and less-than-effective outcomes and measures. Automatic reminders with escalation.
3. Greater standardization in some (not all) areas, as with the previous example of administration of the CAAP & CAT tests.
4. Designated data person; designated assessment person; assessment in a timely manner; better dissemination.
5. Assessment data person; database expert.
6. Consequences for not complying with required documentation; rapid turnaround in reporting of assessment information collected.

Question D - Overall, what can be done to improve all assessment at Xavier?

Responses:
1. There needs to be more uniformity across programs in assessment. For example, should we consider using the ETS Major Field Test as a senior comprehensive exam for all programs, along with consistent criteria for passing? Currently Core Curriculum assessment is treated separately from program assessments, but specific core learning outcomes, such as writing, communication, etc., could be a part of departmental assessments. Program learning outcomes must address more students than just the majors in the program. Develop a system to handle assessment of small departments with very few majors (related to the prior statement).
2. Some of the pieces need attention: better integration of institutional effectiveness survey results and follow-up detail-driven surveys; incorporation of class evaluations into departmental learning outcomes; revision of program review; more effective administrative unit review.
3. Focus on improvement of process; focus on ensuring that assessments are valid.
4. Separate assessment department.
5. Separate assessment department.
6. Person specifically assigned to manage data and databases for the university as their only job.
The Key Informant Interviews produced strengths and weaknesses of the assessment process that harmonized with the other analyses cited by the Evaluation Group. However, some distinctive or emphasized elements were found, including integration of Institutional Effectiveness Survey results into assessments of administrative units, acquiring a database and data person to ensure good data for assessments, engagement in the assessment process by Xavier faculty and staff, more effective administrative unit review, incorporation of class evaluations into the departmental learning outcomes, and revision of the academic program review process.
Summary of Recent Assessment Reports
During the SACS reaccreditation process the following strengths and weaknesses were identified by the reviewers.
1. Process compliance is good.
2. Clear weaknesses in many outcomes and measurement of them.
3. Missing opportunity to link student learning outcomes to academic program reviews.
4. Improvements occurring outside the assessment process.
5. Assessment process not driving improvements and change.
In the report on Summer Programs the Assessment Specialist identified the following issues.
1. Documentation of reports in TracDat
2. Timeliness of data collection and reporting
3. Quality/consistency of data and measurement
4. Documentation of result status
In the 2009-2010 Assessment Focus Report the Director for Institutional Effectiveness and
Assessment raised these areas of concern based upon her analysis of assessment at Xavier.
1. Lack of timely compliance with the performance and documentation of assessment in TracDat is a serious problem for both academic departments and non-academic units.
2. Quality of outcomes and measurement is weak in many areas, often with only one measurement or criterion for a single, very specific outcome. Outcomes need to be broader with multiple methods of measurement.
3. Very little documentation can be found for improvements and change related to assessment.
4. Group and one-to-one training is needed in using the TracDat system, writing and measuring learning and other outcomes, and documenting improvements and change.
5. New learning and other outcomes are needed for the three-year cycle beginning in academic year 2010-2011, with an emphasis on broader outcomes and multiple means of assessment for the outcomes.
6. High-level training in the TracDat assessment system is needed for the Assessment Specialist and the Director of Assessment.
High-level training on the new version of TracDat occurred on January 24 and 25, 2011.
BEST PRACTICES GROUP
The task set forth for the Best Practices Group was to perform a literature review of best
practices by examining all resources available on the professional practice of assessment. This
examination included review of hundreds of articles, documents and websites related to assessment.
The American Association for Higher Education (AAHE) is a national organization that includes members from the entire spectrum of higher education stakeholders — administrators, faculty members, policymakers, accrediting agencies, government, students, and business. AAHE published and disseminated a document entitled Nine Principles of Good Practice for Assessing Student Learning (Astin et al., 1996), which can be summarized as follows.
1. The assessment of student learning begins with educational values. We should assess what we value in producing graduates from our universities and use assessment to improve our ability to educate our students. The values of an educational institution are found in its mission statement.

• Example — Alverno College, Mission Statement: Alverno College is an institution of higher education dedicated to the undergraduate education of women. The student — her learning and her personal and professional development — is the central focus of everyone associated with Alverno.

Alverno College expands on its mission statement with an explication of four major purposes. The first major purpose is directly related to academic program offerings and is shown below.

Creating a curriculum
The curriculum, designed by faculty as the major source for student attainment of educational goals, includes both a philosophy and a program of education. It is:
• ability-based and focused on student outcomes
• integrated in a liberal arts approach to the professions
• rooted in Catholic tradition
• designed to foster leadership and service in the community
• flexible, to accommodate the educational goals of women with diverse responsibilities
• affordable, to accommodate women's economic circumstances

• Xavier — Xavier University of Louisiana, founded by Saint Katharine Drexel and the Sisters of the Blessed Sacrament, is Catholic and historically Black. The ultimate purpose of the University is to contribute to the promotion of a more just and humane society by preparing its students to assume roles of leadership and service in a global society. This preparation takes place in a diverse learning and teaching environment that incorporates all relevant educational means, including research and community service.
2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. In educational institutions the mission and goals of the university flow downward into establishing missions and goals for the colleges, departments, core curricula, special programs, student services, and administrative services of the university, resulting in an integrated system, which is measured in cycles of time and which includes multiple outcomes and multiple methods of measuring those outcomes. Such a system allows for a strong structure for determining how to improve the education of students.

• Example — Pennsylvania State University: Assessment occurs throughout the University. It is found in classrooms, programs, departments, offices and colleges, and ultimately at the University level. (Pennsylvania State University Office of Planning and Institutional Assessment. (2005) Assessing for Improvement. Innovation Insights, 11.) http://www.psu.edu/president/pia/innovation/Assessing for Improvement 2.pdf

• Xavier — Xavier has worked to incorporate all assessment and assessment records into a central repository using the TracDat assessment system.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated
purposes. The process of assessment requires that each unit in the institutional hierarchy is in
agreement on the goals that it wishes to achieve and where in the student learning framework
those goals can be realized. When performed properly the assessment process clarifies what is
important, allowing each unit to focus on useful assessment that leads to improvement rather
than an exercise in measuring what you know you can easily achieve.
• Example — University of Central Florida, Academic Assessment Handbook: The purpose of this chapter [DEFINING STUDENT LEARNING OUTCOMES (SLO)] is to provide you with an overview and definition of student learning outcomes. The importance of explicitly defining expectations and standards is emphasized in this chapter. Also included is an extensive discussion on how to write clear and precise statements of outcomes for your program. http://oeas.ucf.edu/doc/acad assess handbook.pdf

• Xavier — Xavier has begun a series of training sessions for academic and administrative departments that emphasizes writing mission statements, goals, objectives, and student learning and other outcomes that are clearly stated with precise criteria for measurement.
4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. In educating students, universities produce graduates that have particular skills, abilities and knowledge. In order to produce graduates of quality, the university needs to have outcomes that measure all conditions and experiences that shape the students, including specific courses, curricula, programs, and extra-curricular experiences.

• Example — The University of Texas System: In higher education, we typically talk about knowledge, skills, abilities, and competencies as being one and the same. For example, we speak of competent mathematicians and knowledgeable mathematicians. Yet, skills and knowledge are acquired through learning experiences; the different combinations of skills and knowledge one has acquired in a given program define the competencies an individual possesses. These competencies are acquired through integrative learning experiences provided by academic programs. (Conceptual Framework, 2000.) http://www.utsystem.edu/aca/initiatives/assessment/conceptualFramework.htm

• Xavier — Xavier has begun encouraging multiple measures that include all aspects of student learning in the next three-year cycle of outcomes assessment that begins with the 2010-2011 academic year.
—
5. Assessment works best when it is ongoing, not episodic. Quality assessment is not a hectic push that occurs whenever accreditation rolls around, but a process of tracking outcomes in which individuals and groups of students are measured from semester to semester, year to year, in multiple-year cycles in their progress toward stated outcomes. The ongoing process of assessment leads to ongoing improvement in achieving outcomes as well as to improvement in the assessment process itself.

• Example — North Carolina State University guidelines: The assessment cycle is continuous. It should identify/document strengths, weaknesses, needs, improvements and future plans. http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html

• Xavier — Xavier has completed its first three-year assessment cycle using the TracDat assessment management system and has begun its next three-year cycle using new outcomes.
6. Assessment fosters wider improvement when representatives from across the educational
community are involved. Every part of the university community, including constituencies
beyond the campus, contributes to the quality of students that are produced. This means that
each university should encourage a culture of assessment across its campus and beyond. There
should be no question that the process of assessment is respected and valued and that
assessment activities are completed with care on a timely basis.
• Example — St. Olaf College, winner of a 2010 Council for Higher Education Accreditation Award: St. Olaf College is a nationally ranked liberal arts college with an enrollment of approximately 3,000. The college has developed an innovative model of assessment that sustains and supports student learning. Strong faculty leadership, extensive faculty and staff engagement, administrative support, grant-funded inter-institutional partnerships and student engagement in developing instruments and analyzing results are components of the college's assessment work.

• Xavier — Xavier is working to create a culture of assessment that reaches across the entire university. Its first steps were the collaboration across the entire Xavier community to obtain SACS reaccreditation and to develop its QEP, Read Today, Lead Tomorrow.
7. Assessment makes a difference when it begins with issues of use and illuminates questions
that people really care about. The process of assessment should produce information and
evidence that is deemed useful for making decisions in a system of continuous improvement.
The goal of assessment is not to generate data that pats the institution, department or unit on
its back, but that leads to making good decisions about how to proceed in the future.
• Example — North Carolina State University: Objectives and outcomes help translate the very broad goals of university, college, and department mission statements into the curriculum by which you fulfill that mission. They describe the knowledge, attitudes, and skills you want students to have when they finish a part of your program. They can apply to individual assignments, to courses, or to whole programs.

• Xavier — New guidelines for writing outcomes for academic and administrative units at Xavier have each program or office generate a list of the essential attributes their students should possess or functions that the administrative unit should perform. For academic programs, the list describes what knowledge, skills, and abilities students should possess upon completion. Next, outcomes are written with appropriate measurement so that programs and offices can determine if the outcomes are achieved.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions
that promote change. Assessment needs to be part of the institution’s culture so that it is
integrated into all decision-making processes such as planning, budgeting, personnel, and
curricula. For example, curriculum changes should also be made based upon evidence produced
by assessment.
• Example — Pennsylvania State University, Creating a Culture for Innovation and Improvement, Lessons Learned: Integrate CQI [Continuous Quality Improvement] into the core processes of the institution — how you hire, what you reward, what you communicate, how you measure, and how you develop faculty and staff. What gets measured is what gets done, and what's valued is what is rewarded. http://www.psu.edu/president/pia/innovation/creating a culture.pdf

• Xavier — One way Xavier is considering implementing a culture of assessment is to require curriculum change proposals to include assessment data. Core Curriculum assessment data led to the choice of reading as the focus of the QEP, Read Today, Lead Tomorrow.
9. Through assessment, educators meet responsibilities to students and to the public. As educators we have a responsibility to foster quality and improvement in educating our students; assessment provides us with the mechanism to do so. Also, by fulfilling this responsibility ourselves, universities are better able to control imposition of one-size-fits-all mandates from government.

• Example — The University of Texas System, Our Commitment to Accountability: One of the highest priorities of the Chancellor and the Board of Regents of The University of Texas System is to be accountable — to take responsibility for measuring and reporting the effectiveness of our work and to use that information to continuously improve our performance. http://www.utsystem.edu/osm/files/FactSheet.pdf

• Xavier — In order to fulfill our mission, which is basically an implied promise to our students and the public, Xavier must continue to improve the quality and reporting of our assessment system. The Assessment Summit and this Assessment Task Force Report provide a plan for the activation of a system of integrated best practices in assessment for Xavier University of Louisiana that will lead to university improvement.
Through a review of approximately 75 college and university websites for institutions that grant at a minimum a four-year undergraduate degree, the following commonalities were noted. Almost all universities have an assessment committee composed of members from the greater community who have expertise or interest in assessment. The committees are usually appointed. The majority of institutions have an assessment website with links to resources related to assessment best practices and other information. Almost all have an assessment handbook that spells out the assessment process for the university. Some have separate handbooks for administrative units, and almost all include assessment forms and examples of assessments that meet the requirements set forth in the handbooks.
Some of the best sources for best practices are listed in the annotated bibliography in Appendix
C. The resources are divided into the categories of General Assessment, Culture of Assessment,
Measurement and Evidence, and Using Assessment for Improvement. Links are provided to the
resources when available. These links can be placed on an Assessment Web Page.
MAPPING GROUP
The purpose of the Mapping Group was to document all assessment activities at Xavier. First,
members of the Task Force all documented their own involvement with assessment at Xavier. See the
charts below for documentation for the Offices of Academic Affairs and Planning, Institutional Research
and Assessment. In the Center for Advancement of Teaching assessment are reported via the TracDat
system, annual FaCTS reports to the Mellon Foundation, and Internal reports for annual planning.
Results from these reports are used to plan for the next year’s programming. Assessment in the College
of Pharmacy is set up with a Program Assessment Committee, which includes Basic Sciences and Clinical Faculty members, administrators from the College, and the Associate Dean for
Student Affairs and Curricular Assessment. See the table below listing the primary offices and positions
within the offices that have assessment responsibilities at Xavier.
From the Office of Fiscal Services, Grants and Contracts Accounting, records were obtained listing all active grants and funds. Analysis of this list revealed that many of the items were actually research grants that required an annual progress report, but no actual evaluation. An example of that type of item would be research grants funded under the NIH-NIGMS SCORE Grant Program. Using the list as a guide, a survey was sent to determine assessment activities related to grants and contracts (see Appendix D for the survey results). In order to be comprehensive, surveys were sent to all persons listed as Principal Investigators on the list. Responses were received from 32 PIs. Follow-up emails, telephone calls and investigation resulted in compiling information about assessment activities on all items on the list. All items required an annual progress report, and 34 items required an evaluation in the annual report. Of those 34, external evaluators (outside of Xavier) were required for eight of them (24%); internal evaluators (from Xavier) performed the assessments for the remaining 26 (76%). Most internal assessments are performed by the Assessment Specialist or the Director for Institutional Effectiveness.
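The evaluator percentages follow directly from the reported counts:

$$\frac{8}{34} \approx 24\% \ \text{(external)}, \qquad \frac{26}{34} \approx 76\% \ \text{(internal)}$$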
Assessment Responsibilities at Xavier

Office: Office of Planning, Institutional Research, and Assessment (OPIRA)
Position: Vice President for Planning, Institutional Research, & Assessment
Assessment responsibilities and reporting:
• SACS Liaison
• Emphasis on SACS 2.5, 3.3.1, and 3.2.8
• SACS Liaison and SACS Reports
• Supervisor for OPIRA activities, reports, and surveys
• Planning Focus Report
• Strategic Plan and Planning
• Oversight and reporting on University Institutional Effectiveness process
• Special Reports and Analyses (Competitor Profile, strategic information summaries, etc.)

Office: OPIRA
Position: Director for Institutional Effectiveness & Assessment
Assessment responsibilities and reporting:
• Assessment Focus Report on all annual University assessment activities
• Assessment Task Force Report (current year)
• Planning
• Institutional Effectiveness
• Academic Programs (CAS & COP)
• Academic Support Programs
• Administrative Units
• Administrative Support Units
• Annual Institutional Effectiveness Report
• TracDat Administration for all Xavier Assessments
• SACS Reports

Office: OPIRA
Position: Director for Institutional Research
Assessment responsibilities and reporting:
• Institutional Effectiveness
• Instructor Evaluations
• Instructor Evaluation Reports
• SACS 4.1 Student Achievement
• SACS Reports
• Annual Environmental Focus Report
Assessment Responsibilities at Xavier (continued)
Office
OPIRA
Position
Senior Institutional
Assessment Responsibilities
•
Office of
Academic
Affairs
Assessment
Specialist
Director for the
Center for
Advancement of
Teaching
•
Survey Results Reports
assessment surveys,
including Institutional
Effectiveness Survey
Research Analyst
Office of
Academic
Affairs
Prepares & analyses
Reporting
•
Archives assessment data
•
All Summer Programs,
including NASA
•
Annual Summer
Programs Report
•
QEP
•
QEP Reports
•
I Cubed
•
I Cubed Reports
•
Freshman Seminar
•
NASA Report, etc.
•
Teagle Foundation
•
Various special
assignments
•
Course Portfolio Working
•
TracDat Report
Groups, including QEP
Reading
•
FaCTS Reports to Mellon
Foundation
•
Internal Reports for
Planning
•
FaCTS Grants
•
Maintains Assessment
Toolbox & links to
resources
•
Classroom observations
and other assessment
services
26
FINAL RECOMMENDATIONS
The state of assessment at Xavier was certified as meeting core requirements and
comprehensive standards based upon the recent SACS reaccreditation. However, in carrying out its
charge, the Assessment Task Force found that units, departments, programs, and individuals have
widely varying abilities to understand, conduct, and use assessment activities appropriately. The Task
Force also found that while units, departments, and programs carried the assessment process successfully through the SACS reaccreditation review, there was widespread skepticism about the utility of the process. From focused interviews conducted as part of carrying out its charge, the Task Force concluded that in many areas the assessment process was not yielding documented improvements; many improvements were occurring outside the assessment process and were not well documented. The recommendations from the Task Force are summarized under the two major categories of The Assessment Process and Outcomes Assessment; within the two categories the recommendations are prioritized as Essential (E), to be adopted now; Short-Term (ST), to be adopted within one year; and Long-Term (LT), to be adopted by the next three-year assessment cycle.
Assessment Process Recommendations
Coordination of effort and systems would improve the effectiveness of the assessment process as well
as the university’s institutional effectiveness. Resources related to conducting assessments would also
improve effectiveness. Recommendations related to the assessment process itself are as follows.
13. Establish a Xavier Assessment Group that serves as a monitoring and review committee for
Xavier assessment practices. The committee would include appointed members from academic
programs, administrative programs, and special programs as well as the Director for Institutional
Effectiveness and Assessment, the Assessment Specialist, and the Chair of the Core Curriculum
Review Committee. (LT)
14. The Director for Institutional Effectiveness and the Assessment Specialist should prepare a
handbook for assessment at Xavier that explains the assessment process at Xavier and that
includes specific instructions for carrying out assessments. (E)
15. The Director for Institutional Effectiveness and the Assessment Specialist should develop a
website specifically for assessment that includes the aforementioned handbook and other
resources. The website should be housed with the Office of Planning, Institutional Research and
Assessment’s website. (E)
16. The cycle of five-year academic program reviews should be coordinated with the annual
program assessments so that the program reviews incorporate the currently-assessed learning
outcomes related to the specific program being reviewed. (ST)
17. The programs with courses that are assessed in the Core Curriculum assessment process should
coordinate with the Core Curriculum Review by providing a method of assessing core learning
outcomes beyond the CAAP standardized test. (E)
18. Individual academic program units should coordinate their curriculum with their learning outcomes by preparing a curriculum map to document which courses relate to the learning outcomes being assessed. The TracDat system provides a tool to develop a curriculum map; a minimal illustration of the idea appears after this list. (LT)
19. Planning, assessment, improvement, and institutional effectiveness should be better
coordinated by fully implementing the available components of the TracDat software. (ST)
20. Student recruitment and retention plans formulated by the academic departments should be
incorporated into TracDat outcomes assessment. (LT)
21. Smaller academic programs in similar disciplines should consider coordinating their assessment
by adopting some common outcomes that could be assessed as a larger sample. (ST)
22. Emphasize the importance of documentation in the TracDat Assessment Management System,
including documentation of outcomes, measures and related criteria, and improvements based
upon analysis of data. (E)
23. Set up assessment on a continuing basis throughout the year, not just at the end of the year.
(ST)
24. Adopt the recommendations from the Evaluation Group arising from the Assessment Task Force
Department Chair Survey conducted in Fall of 2010. (ST and LT) Details of these
recommendations can be found under the Evaluation Group section of this document.
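To make recommendation 18 concrete, the following minimal sketch shows one way a curriculum map can be represented. The outcomes and course numbers are hypothetical, and TracDat's curriculum-mapping tool stores this information in its own format rather than this structure.

# Hypothetical curriculum map: each assessed learning outcome is linked to
# the courses in which it is taught and measured. (Illustrative only; not
# TracDat's data model.)
curriculum_map = {
    "Outcome 1: Written communication":  ["ENGL 1010", "ENGL 1020", "HIST 3050"],
    "Outcome 2: Quantitative reasoning": ["MATH 1030", "CHEM 1011"],
    "Outcome 3: Disciplinary knowledge": ["BIOL 4010", "BIOL 4990"],
}

# Inverting the map shows, for each course, which outcomes it supports --
# the view a department needs when embedding assessments in courses.
course_to_outcomes = {}
for outcome, courses in curriculum_map.items():
    for course in courses:
        course_to_outcomes.setdefault(course, []).append(outcome)

for course in sorted(course_to_outcomes):
    print(course, "->", "; ".join(course_to_outcomes[course]))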
Outcomes Assessment Recommendations
Specific recommendations related to outcomes assessment can be divided according to whether the
outcome concerns academic programs or administrative units. Recommendations for academic
programs and related support programs include the following items.
8. SACS reviewers' comments should be used to improve assessment. (E)
9. Increase process compliance to prior levels. (E)
10. Use weaknesses identified in Summer Program assessment to improve assessment year-round. (E)
11. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
12. Limit the number of learning outcomes assessed in a three-year cycle to three to four specific (knowledge, skills, abilities) outcomes that can be assessed using multiple methods. (E)
13. Prepare a curriculum map to identify courses where outcomes can be assessed using embedded course assessment employing rubrics. (E)
14. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and
improvement using face-to-face and online training. (E)
Recommendations for administrative units include the following items.
9. Increase process compliance to prior levels. (E)
10. Emphasize the assessment of unit operations and processes. (E)
11. Determine the core functions of the unit that allow for reaching the goals of the office. (E)
12. Determine how the unit interacts with other administrative and/or academic units when
carrying out its core functions. Administrative units carry out their functions across multiple
offices. (E)
13. Choose three core functions and state them as outcomes for evaluation for each three-year cycle of assessment. (E)
14. Include the Institutional Effectiveness Survey results as an additional outcome. (E)
15. Have more than one assessment method for each outcome so that evidence for meeting the
outcomes will be strengthened. (E)
16. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and
improvement using face-to-face and online training. (E)
The Assessment Task Force presents these recommendations as steps in developing a process of assessment at Xavier University of Louisiana that will ensure our university mission is met for all students. Additionally, adoption and successful implementation of the recommendations will ensure that assessment of our educational programs and administrative processes follows best practices that encourage continuous improvement based upon careful evaluation and evidence. These recommendations, when implemented in a systematic fashion according to the timeline presented above, will establish a culture of assessment for continued success.
APPENDIX A
ASSESSMENT SUMMIT REPORT - XAVIER UNIVERSITY of LOUISIANA
June 15, 2010
Assessment Summit Output Document, v.1
Leadership Strategies
56 Perimeter Center East, #103
Atlanta, Georgia 30346
770.454.1440
www.leadstrat.com
Overview
Xavier University of Louisiana held a one-day assessment summit on June 15, 2010. Cynthia
Waisner from Leadership Strategies served as facilitator and documenter for the session.
Retreat Purpose and Objectives:
The purpose of the retreat was to establish a strategic framework for assessment at Xavier. Objectives of the retreat included the following:
• Reach consensus on the purpose, or mission, of assessment at Xavier;
• Agree on a shared vision and long-term goals derived from that vision;
• Identify critical success factors and barriers for each of the long-term goals;
• Develop a list of possible strategies, or projects, for moving toward the vision, achieving critical success factors, and/or reducing or eliminating identified barriers;
• Reach agreement on priority strategies from the list of possible strategies; and
• Identify next steps, milestones, and accountabilities associated with the priority strategies.
Retreat Agenda
• Opening Exercise: Hopes and Concerns
• Where We're Going - The Big Picture: Mission, Vision, Goals
• How We'll Get There: Critical Success Factors, Barriers, Strategies
• Strategy Prioritization
• Action Plans (High Level)
• Session Close
Documentation Contents
This document presents the results of the session as recorded by the facilitator. Comments appearing in italics represent additions made by the documenter for clarity. The document includes the following sections:
I. Opening Exercise
II. Vision
III. Goals
IV. Mission Statement
V. CSFs and Barriers
VI. Strategies and Priority Strategies
VII. Action Plan
VIII. Next Steps
I. Opening Exercise
Participants were asked to identify their greatest hopes and concerns for the day. Each of the four table groups was then asked to agree upon and report back their top hope and top concern.
Hopes and Concerns

Hope: That we create a culture of assessment that is useful, integrated, and coordinates all types of assessment across the University.
Concern: That our assessment process is not going to be integrated and useful across the University.

Hope: That we are able to breach the silos, connect the various assessments, and connect everything to the institution.
Concern: That we do not achieve an institution-wide focus.

Hope: That we develop a realistic, integrated, usable, and transparent system across all levels.
Concern: That we develop something that does not meet those same criteria.

Hope: That we develop a comprehensive framework for assessment, moving from the general to the more specific, and that we all have a better feeling and sense of purpose regarding assessment at the University.
Concern: That we not fully take into account Xavier's learning culture: can what we agree on here truly be transplanted across the University?
II. Vision
A visioning exercise was used, followed by breakout sessions in small groups to identify common themes. These themes were used to develop the mission, vision, and goals.
VISION: The vision provides a picture of a preferred future state.

Proposed Vision Statement
Assessment: Driving Growth and Change

The following slogans were on post-it pages on the Green Team's table. They are included here for possible use as the effort goes forward:
• Assess to be your best!
• Data are your friend!
• Bloom and grow!
III. Goals
Goals identify broad aims toward which the organization will work over a long (approximately 10-year) span of time:

BUY-IN: Maximize buy-in for assessment with all stakeholders throughout their affiliation with the University.

CULTURE OF ASSESSMENT: Promote a more accepting culture of assessment.
You may want to omit the qualifying words "more accepting" in your final version.

RESOURCES: Provide levels of resources needed to effectively carry out assessment.
Green Team edited to read: Provide resources needed to carry out assessment effectively.

EFFECTIVE ASSESSMENT MODEL: Promote an assessment model that is effective, systematic, integrated, collaborative, and used for growth and positive change. This model will include:
• Ensuring that data collection is accurate, timely, and consistent across the University;
• Ensuring regular evaluation of the assessment process and its usefulness to the University;
• Ensuring usefulness and use of the assessment systems;
• Maximizing availability of comparable data sets for analysis;
• Ensuring transparency of assessment needs, data, and results, as well as the responsibility for these needs, data, and results;
• Maintaining mission-oriented focus (this was originally stated as "learning-centered focus"); and
• Maintaining coordinated (centralized?) data.
IV. Mission Statement
A mission defines the overall purpose of an organization. The mission statement should describe what the organization does, for whom it does it, and the benefit.

Proposed Mission Statement
To systematically collect, analyze, and use relevant data from all levels (i.e., individual, departmental, and University) in order to continuously and effectively fulfill the mission and the overall goals of the University.

Groups were asked to identify possible edits to the above. They suggested:
• Include "identify goals"
• Fix the split infinitive and shorten it
• Add concept of "improvement"
• Move reason for assessment ("fulfilling the mission") to the beginning of the statement

The other three draft mission statements are included below. Specific language and concepts contained in them may be useful in crafting the final version of the mission statement:
• To set goals and evaluate performance to ensure outcomes are met in order to drive growth and change to the benefit of the overall University community.
• The purpose of assessment at Xavier is to support the mission of the University by: developing and implementing a plan for continuous cyclical improvement; documenting implementation results; and using the documentation to improve the plan.
• The purpose of assessment at Xavier is to: support the mission of student learning; identify critical areas for decision-making; and support decision-making, growth, and improvement through the systematic collection, organization, and analysis of information.
V. CSFs and Barriers
The participants next identified Critical Success Factors and Barriers for each goal. First, groups identified CSFs, those things which must go right, or the conditions that must be created, for the goals and objectives to succeed. Participants next identified Barriers, those conditions which might hinder success in achieving the objectives.
GOAL: BUY-IN
Critical Success Factors:
• Identifying the stakeholders and their unique roles and responsibilities
• Accountability
• Having a truly effective model and resources to follow that model
• Marketing plan
• Value obvious to stakeholders
• Simple and easy to use
Barriers:
• Misunderstanding of assessment
• Lack of knowledge about assessment
• Resistance to change
• Available time
• Perception and/or reality that effort is top-down
GOAL: CULTURE OF ASSESSMENT
Critical Success Factors:
• Stakeholders understand assessment
• Stakeholders have shared value for and language of assessment
• Stakeholders see the value of assessment
• Strong, consistent reinforcement of best policies
Barriers:
• Individual beliefs and mindset that past assessment results have not been used for growth and change
• Belief that assessment is only needed/used for administrative purposes (e.g., SACS)
• Results are not communicated consistently to stakeholders
• Lack of necessary resources
• Competing cultures and values
• Maintenance of the status quo
GOAL: RESOURCES
Critical Success Factors:
• All areas have the means for successful assessments
• Assessment at "any cost" (There was considerable discussion around this point, with the end result being clarification that this meant that assessment does not get shunted aside by competing priorities, but remains a high priority and area of focus "no matter what," i.e., "at any cost")
• Identification and evaluation of required resources
Barriers:
• Poor planning
• Competing institutional priorities
• Budget process is not aligned with assessment
• Quality and quantity of staff
GOAL: EFFECTIVE ASSESSMENT MODEL
Critical Success Factors:
• The model is useful at all levels
• Data are good and of a quality that is responsive to users and stakeholders
• Standard data templates are used where appropriate
• Responsible, trained personnel are assigned to each assessment
• There is stakeholder knowledge regarding the data that are already available
Barriers:
• Temptation/tendency to do what is easy vs. what is meaningful
• Fear of exposing weakness
• Territorial attitude towards data
VI. Strategies and Priority Strategies
Participants were next asked to brainstorm possible strategies for each of the goal areas, as well as to assign each "potential strategy" generated earlier in the day to one of the goal areas. Participants were asked to generate strategies that: directly affected achievement of the goal; strengthened, leveraged, or put into place a CSF; and/or weakened, removed, or lessened the impact of a barrier.
Once strategies were generated and reviewed, participants were asked to "vote," using dots, to establish priority strategies. Each participant was given four dots per goal area to assign as he or she wished. Results for each goal area, in decreasing order of priority, are provided below.
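The dot-vote tallying itself is simple to reproduce; the following minimal Python sketch (with made-up ballots, not the summit's actual votes) shows how the per-strategy point totals below would be computed:

from collections import Counter

# Hypothetical ballots: each participant distributes up to four dots per
# goal area across the strategies he or she favors (dots may be stacked).
ballots = [
    ["Marketing plan", "Marketing plan", "Incentive program", "Focus groups"],
    ["Incentive program", "Identify stakeholders", "Identify stakeholders",
     "Marketing plan"],
]

# One dot equals one point; strategies are then ranked by total points.
tally = Counter(dot for ballot in ballots for dot in ballot)
for strategy, points in tally.most_common():  # decreasing order of priority
    print(f"{strategy}: {points}")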
GOAL: BUY-IN
Strategy (total points awarded):
• Develop an incentive program that highlights successful assessment efforts, combined with Conduct a festival of assessment to report successes and issues (18)
• Develop a campus-wide marketing plan for assessment (17)
• Identify the stakeholders and their roles and responsibilities (16)
• Conduct focus groups to solicit input and integrate their input into the model (10) At the bottom of this post-it, someone added the comment "Top down."
• Develop a professional development plan for stakeholders (9)
• Identify resistant stakeholders and work with them, combined with Invite them to the festival and team them with assessment ambassadors (5) In the discussion, the point was made that even though this was not a top vote getter, it should be incorporated early into the plans and actions for assessment. Resistant stakeholders may have a lot of influence, and failing to address this early on could make the initiative more difficult or even sideline it altogether.
• Compulsory entrance and exit evaluations for students and employees (1)
• Regularly survey stakeholders and develop a strategy to determine the effectiveness of the model (1)
• Hold (monthly?) seminars: define outcomes, measurement, focus groups? (1)
• Communicate the work of this group and the barriers we have identified to implementation, and acknowledge problems in past systems (0)
GOAL: CULTURE OF ASSESSMENT
Strategy (total points awarded):
• Assessment training and education (22)
• Develop incentive system for assessment (15) Some participants felt that this should be combined with the strategy listed first, for a combined total of 37. This strategy also appeared in the "buy-in" category; when the totals from the two categories are combined, there are 33 points for this strategy. If you combine it with "assessment training and education," the total jumps to 55.
• Add an assessment link at the main Xavier web page (12)
• Develop a system for dissemination of results to all stakeholders annually (10)
• Promote scholarship of assessment to faculty (9)
• Develop video of success stories (5)
• Publish vision and mission of assessment on a document that includes the goals of assessment (4)
• Recognition for assessment: need both sticks and carrots (1)
• Create a video that describes the new assessment system (0)
• Food, entertainment, daily activities (0)
GOAL: RESOURCES
Strategy (total points awarded):
• All-Star Assessment Team, combined with Identify personnel and Identify who does what with institutional assessment (25)
• Identify needs: personnel, equipment, know-how, and training, combined with Identify all institutional resources that support assessment (23)
• Align budgetary allocations with plan (12)
• Appoint a separate QEP evaluator as stated in the QEP budget (9)
• Engage University Planning Council to elevate assessment as a University priority (5)
• Adhere to the strategic plan, combined with plan, plan, plan! (5)
• Integrate grant-funded assessment into general assessment plan (3)
• Attempt to identify new, external resources, especially for initial costs of developing/launching the new system (1)
GOAL: EFFECTIVE ASSESSMENT MODEL
Strategy (total points awarded):
• Look at best practices and identify which components could work at Xavier, combined with Evaluate current model: what is useful (strengths and weaknesses) (30)
• Develop official, central hub of information (who is responsible), combined with For each of the University's priorities, common or compatible departments across campus develop objectives, benchmarks, and a timeline; they meet regularly to discuss SWOT (12) A note connecting the two strategies read "All Star Assessment Team." See notes in the Action Plan regarding two visions for this assessment team.
• KPIs (Key Performance Indicators) (11)
• Invest in technology that will streamline processes (10)
• Develop uniform data sets with consistent variable names, based on institutional data (8)
• University defines what assessment means (6)
• Develop organizational plan (4)
• 1) Identify characteristics of overall assessment that all can agree on; 2) identify systems that need to be integrated; 3) make recommendations concerning next steps, actions, timelines; 4) identify resources needed to carry out the plan (1)
• Develop a plan for coordinating efforts across the campus (who is responsible, who is accountable?) (0)
• Have each unit or department input their data on a timely basis (0)
VII. Action Plan
After identifying the priorities, the group determined that the first step needed to be creation and charging of the "All-Star Assessment Team." There were two separate visions of who would be part of this group and what its primary task would be. Some saw the group as having a high degree of assessment subject matter expertise and doing the early analysis regarding who is currently doing what, what resources are available and needed, etc. The group would then make recommendations and open up the discussion to a broader, more inclusive group, eventually reporting findings and making recommendations to the Planning Council. Others saw the group as a more representative and inclusive body of those involved in assessment, meeting regularly to share issues and expertise, concerns, etc. After some discussion, the group decided that the former model should be the focus for this first group, with the understanding that the development of the more inclusive working committee will come at a later date.
This group will be a committee of SMEs in assessment. It will be led by Dr. Brookover, who will identify and invite other members. It was noted that it will be important to have representation from the non-academic side of the University. Recommendations are being developed regarding this appointment. The committee will function, in a sense, as a body of internal consultants. Its working process may include the following steps and stages:
1) map and evaluate the current system;
2) identify best practices;
3) identify strengths and weaknesses of the current system;
4) develop a preliminary set of recommendations;
5) seek input regarding these initial recommendations; and
6) report to the Planning Council.
The specific steps and working process will be defined by the committee. Next steps and responsibilities, including those for ensuring high-level support, are included in the next section.
VIII. Next Steps
The following action items and responsibilities were identified during the course of the retreat.

1. Complete documentation from the retreat. Assigned to: C. Waisner. Due: 6/18/10.
2. Wordsmith draft mission, vision, and goal statements and circulate for comments. Assigned to: R. Durnford. Due: 6/22/10.
3. Develop "one-pager" that outlines mission, vision, goals? (Possible follow-up item.) Assigned to: R. Durnford. Due: as determined by committee.
4. Identify membership for "All-Star" committee. Assigned to: C. Brookover. Due: 6/22/10.
5. Hold first meeting of "All-Star" committee. Purpose: to develop charge and working plan. Assigned to: C. Brookover. Due: 7/1/10.
6. Prepare announcement to University community reporting on the outcome of this meeting and the establishment of the "All-Star" committee, and indicating high-level support and commitment to this initiative. Assigned to: L. Blanchard and/or N. Francis. Due: 9/1/10.
APPENDIX B
Results from the Evaluation Group Chair Survey in Tables

Table 1 - Which of the following are included in your assessment plan?
Departments surveyed: Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, Theology.
Response options: Accreditation Requirements; Student Learning Outcomes; Student Progression; University Mission; Department Goals; Department Mission; Senior Comps Results; National Exam Results; Graduation Results; Job Placement; Post-grad Training Placement; Other.
Table 2 - Which of the following is involved in developing the assessment plan for your department/division?
Departments surveyed: Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, Theology.
Response options: Designated Faculty Person; Designated Faculty Group; Administrators; Alumni; Chairperson; Staff; Students; All or Other Faculty.
* Former Chair
Table 3 - Who of the following is involved in data collection related to the assessment plan for your department/division?
Departments surveyed: Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, Theology.
Response options: Designated Faculty Person; Designated Faculty Group; Administrators; Alumni; Chairperson; Staff; Students; All Faculty.
^ Whoever teaches History 4415
Table 4 - Who of the following is involved in discussing/interpreting assessment results for your department/division?
Departments surveyed: Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, Theology.
Response options: Designated Faculty Person; Designated Faculty Group; Administrators; Alumni; Chairperson; Staff; Students; All Faculty.
^ No one has been designated.
Table 5 - What priority does assessment take in relation to other departmental duties?

Department        | Advising | Assessment | Community Service | Scholarship | Teaching | University Service | Other
Biology           |    5     |     4      |         2         |      6      |    7     |         3          |
Business          |    6     |     2      |         4         |      7      |    3     |         5          |
Chemistry         |    5     |     2      |         3         |      6      |    7     |         4          |
Computer Science  |    5     |     2      |         6         |      7      |    3     |         4          |
Education         |    7     |     6      |         3         |      4      |    2     |         5          |
English           |    4     |     2      |         5         |      7      |    3     |         6          |
History           |    5     |     1      |         2         |      4      |    6     |         3          |
Language          |    7     |     3      |         4         |      5      |    6     |         2          |
Math              |    6     |     4      |         2         |      5      |    7     |         3          |
Philosophy        |    4     |     2      |         1         |      6      |    7     |         5          |  3
Political Science |    5     |     3      |         1         |      4      |    6     |         2          |  7*
Physics           |    5     |     3      |         2         |      6      |    7     |         4          |
Sociology         |    6     |     4      |         3         |      2      |    7     |         5          |
Theology          |    2     |     6      |         7         |      5      |    4     |         3          |
Overall Average   |   3.21   |    2.21    |       4.93        |    5.35     |   6.64   |        3.64        |

* Balancing all of the departmental duties. NOTE: The highest priority is 7, and the lowest priority is 1; blank cells indicate missing responses. The rankings are for individual departments, and the overall composite is computed from all departments reporting.
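The composite row is simply a per-duty mean over the departments that responded. A minimal Python sketch (with hypothetical rankings, not the survey data above) illustrates the computation:

# Each department ranks each duty from 1 (lowest priority) to 7 (highest);
# the composite for a duty is the mean over departments that responded.
# The rankings here are hypothetical, for illustration only.
rankings = {
    "Biology":   {"Assessment": 4, "Teaching": 7},
    "Business":  {"Assessment": 2, "Teaching": 7},
    "Chemistry": {"Assessment": 2},  # Teaching response missing
}

for duty in ["Assessment", "Teaching"]:
    values = [r[duty] for r in rankings.values() if duty in r]
    print(f"{duty}: {sum(values) / len(values):.2f}")
# Prints: Assessment: 2.67 and Teaching: 7.00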
APPENDIX C
Best Practices Resources
Best Practices - General

Alverno College publications
http://www.alverno.edu/for_educators/publications.html#sa
American Psychological Association. Assessment Cyberguide.
http://www.apa.org/ed/governance/bea/assessment-cyberguide-v2.pdf
This voluminous work is an excellent assessment publication.
Council of Regional Accrediting Commissions. (2003). Regional Accreditation and Student Learning: Principles for Good Practices.
This short publication gives a summary of good practices in assessment from the six regional accrediting
commissions.
Dwyer, C. A., Millett, C. M., & Payne, D. G. (June 2006). A Culture of Evidence: Postsecondary Assessment and Learning Outcomes. Educational Testing Service: Princeton, NJ.
This issue paper from ETS gives guidelines for college assessment at the institutional level and calls for
the six regional accrediting agencies to set up a national system which incorporates common measures
for student learning. A section that describes fair and psychometrically sound testing is included in the
paper.
Middle States Commission on Higher Education
http://www.msche.org/publications_view.asp?idPublicationType=5&txtPublicationType=Guidelines+for+Institutional+Improvement
http://www.msche.org/publications/examples-of-evidence-of-student-learning.pdf
These publications present a detailed, well-written account of all aspects of the assessment process.
North Carolina State University
http://www2.acs.ncsu.edu/UPA/assmt/Guide_Principles.htm
http://www2.acs.ncsu.edu/UPA/assmt/best_practice_stmt.htm
http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html
Pennsylvania State University (Penn State). Innovation Insights series.
This series of articles on assessment is one of the best overall resources for assessment best practices.
Links to some of these articles are shown below.
http://www.psu.edu/president/pia/innovation/creating_a_culture.pdf
http://www.psu.edu/president/pia/innovation/relationship_between%20continuous_improvement.pdf
http://www.psu.edu/president/pia/innovation/benchmarking_for_innovation.pdf
http://www.psu.edu/president/pia/innovation/facilitating_teams.pdf
http://www.psu.edu/president/pia/innovation/Using_surveys_for_data_collection_in_continuous_improvement.pdf
http://www.psu.edu/president/pia/innovation/improve7.pdf
http://www.psu.edu/president/pia/innovation/integrating_planning.pdf
http://www.psu.edu/president/pia/innovation/strategic_indicators.pdf
http://www.psu.edu/president/pia/innovation/tools_for_organizational_improvement.pdf
http://www.psu.edu/president/pia/innovation/Leading_for_Continuous_Improvement_v2.pdf
Stasson, M., Doherty, K., & Poe, M. Principles of Good Practice for Assessing Student Learning. Office of Academic Planning and Assessment, University of Massachusetts Amherst.
This 62-page handbook contains extremely useful information about the assessment process.
Shulman, L. S. (January/February 2007). Counting and Recounting: Assessment and the Quest for Accountability. Change.
This article discusses the seven pillars of assessment for accountability, which include multiple measures
and embedding assessment into courses, among others.
University of Central Florida. The Administrative Unit Handbook: Measuring Student Support Services and Administrative Outcomes.
http://oeas.ucf.edu/doc/adm_assess_handbook.pdf
This 41-page handbook specifically addresses assessment of administrative units. It provides good information on the entire assessment process.
University of Texas at Arlington. (Spring 2010). Unit Assessment Handbook.
This 91-page handbook covers every aspect of assessment and includes many examples for both academic and administrative assessment.
North Carolina State University. Internet Resources for Higher Education Outcomes Assessment.
http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
This website contains the most comprehensive list of, and links to, college-level outcomes assessment resources available. It was last updated on February 7, 2011.
Culture of Assessment
Lakos, A., & Phipps, S. (2004). Creating a Culture of Assessment: A Catalyst for Organizational Change. Portal: Libraries and the Academy, 4(3), 345-361. Johns Hopkins University Press: Baltimore, MD.
This article discusses the history of a culture of assessment in terms of organizational change and specifically applies it to university libraries.
Piascik, P., & Bird, E. (2008). Evaluation, Assessment, and Outcomes in Pharmacy Education: The 2007 AACP Institute - Creating and Sustaining a Culture of Assessment. American Journal of Pharmaceutical Education, 72(5), 1-9.
This article describes the development and implementation of an assessment program at the College of Pharmacy at the University of Kentucky. Successes and challenges are discussed.
Roberts, Anderson, Bird, and Cain. Creating a Culture of Assessment: Closing the Loop. University of Kentucky, School of Pharmacy. PDF of a PowerPoint presentation.
This PowerPoint presentation describes the steps in creating a culture of assessment at the UK School of Pharmacy. It emphasizes the collaboration between faculty, students, and other stakeholders in the assessment process.
Suggested Readings on Encouraging Faculty Engagement in Assessment. PDF chart.
This chart provides an extensive bibliography on obtaining faculty buy-in to the assessment process.
Measurement and Evidence
Examples of Evidence of Student Learning. PDF chart, Middle States Commission on Higher Education.
This handy chart gives numerous examples of direct and indirect evidence of student learning and of learning processes that promote student learning.
Suskie, L. (2006). The Role of Published Tests and Assessments in Higher Education. Middle States Commission on Higher Education.
This short article compares the three standardized measures in higher education: the ETS Measure of Academic Proficiency and Progress (MAPP), the ACT Collegiate Assessment of Academic Proficiency (CAAP), and the Council for Aid to Education Collegiate Learning Assessment (CLA).
Using Assessment for Improvement
Pennsylvania State University Office of Planning and Institutional Assessment. (2005). Assessing for Improvement. Innovation Insights, 11.
http://www.psu.edu/president/pia/innovation/Assessing_for_Improvement_2.pdf
This article from PSU's Innovation Insights series explains the reciprocal relationship between assessment and quality management.
APPENDIX D
E-mail Survey for the Mapping Group
Xavier University of Louisiana
Assessment Task Force

Xavier University has undertaken an important exercise to map activities related to assessment that are occurring throughout this institution. The goal is to gain an understanding of where assessment is taking place and who the responsible individuals are for completing these activities.
You have been identified as being a PI on a funded project. Please assist us with our efforts by completing this brief survey related to the assessment of your funded project.

Project Title:
Funding Agency:
Department / Division:
PI:
Individual(s) Responsible for Assessment / Evaluation:
Affiliation:

Is an external evaluator required for this project by the funder?  [ ] YES  [ ] NO
How frequently are assessment / evaluation reports required to be produced?
If reports are required, during which month are they due?
What is required in your reporting?
  Description of project activities / implementation  [ ] YES  [ ] NO
  Description of project participants / subjects (human)  [ ] YES  [ ] NO
  Evaluation of outcomes / results  [ ] YES  [ ] NO
  Fiscal reporting  [ ] YES  [ ] NO

Thank you for your assistance. If available, please provide us with an electronic copy of your required assessment report. Again, thank you.
APPENDIX E
Mapping Group Survey

For each funded project, the survey recorded: the PI; the campus department/division; the funding agency; the individual(s) responsible for assessment/evaluation; whether an external evaluator is required; how frequently assessment/evaluation reports must be produced; and whether reporting includes a description of project activities/implementation, a description of project participants/subjects (human), an evaluation of outcomes/results, and fiscal reporting.

Projects surveyed (PI; department/division; funding agency; individual(s) responsible for evaluation):
• Damon Williams; Office of Academic Affairs; U.S. Department of Education; Damon Williams and Monica Majors
• Kathleen Morgan; Center for Undergraduate Research; NSF; Kathleen Morgan
• Gene D'Amour; Office of Resource Development; National Institutes of Health; Gene D'Amour and Rachel Cruthirds (Office of Resource Development), Karen Zhang (Computer Science), and Maureen Shuh (Pharmacy)
• Rachel Cruthirds; Office of Resource Development; Louisiana Space Consortium; Hua Mei (Chemistry) and Rachel Cruthirds (Resource Development)
• Guangdi Wang; Chemistry; NIH; Michelle Soliman (Program Manager) and Dr. Guangdi Wang (Program Director)
• Lamartine Meda; Chemistry; National Science Foundation; Rose Shaw
• Kun Zhang; Computer Science; NIH, subcontracted from the Tulane Cancer Center (PI at Tulane: Dr. Erik Flemington); NA
• Kun Zhang; Computer Science; Louisiana Board of Regents; NA
• Maureen Shuh; RC/EEP; Maureen Shuh
• Nancy Martino; Communications Dept.; U.S. Dept. of Education, Office of Special Education Programs; Nancy Martino
• Galina Goloverda; Chemistry; Louisiana Board of Regents; Dr. Shawn Drew
• Olger C. Twyner, III; Office of Resource Development; U.S. Department of Education; Olger C. Twyner, III (two projects)

Additional PIs surveyed: Robert Blake II; Guangdi Wang; Cheryl Stevens; Dr. Syed Muniruzzaman; Thomas Wiese; Kitani Parker-Johnson; Dr. Rosalind Hale and Dr. Renee Akbar; Dr. William Sharpton (UNO) and Dr. Renee Akbar; Dr. Rosalind Pijeaux Hale; Dr. Renee Akbar; Gene D'Amour; and Alden Reine. Their departments and divisions include Basic Pharmaceutical Sciences (College of Pharmacy), Chemistry/Arts and Sciences, Chemistry, DCAS, Biology, COP DBPS, the Division of Education, the Department of Communications, and Resource Development. Their funding agencies include the Department of the Army (Army Research Office), ATSDR/AMHPS, NIH, HRSA, NSF, Shaw Environmental (a sub of USEPA), LCRC, US DOE, NASA, the Wallace Foundation (through the Louisiana Dept. of Ed.), the U.S. Department of Education, the U.S. Department of Agriculture, the Office of Naval Research, and NOAA. Evaluators named include Robert Blake II; Cheryl Stevens; Dr. Keisha Watson (Data Manager/Evaluator); Dr. Diana Anderson; Shaw and USEPA; the XU LCRC Int. Adv. Board; Dr. Peggy Kirby; Dr. Rosalind Pijeaux Hale and Ms. Kim Cherry; Dr. Renee Akbar; Dr. Nancy Martino; and Ms. Ahdija Donatto.

Reporting frequencies reported by the PIs ranged from weekly data reports and monthly activity reports through quarterly, semi-annual, and annual reports to reports due at the end of the funding cycle; one project noted that for 2010 all reports from 2007-2010 had to be re-written and resubmitted.