2007

Table of Contents

B. Understanding the Educational Environment Before Commencing the QEP
4. Broader Educational Policy Initiatives and the Connection to UT Arlington
C. Process, Part II: Identifying Student Learning – Higher Order Thinking Skills
D. Replication, Expansion, Innovation, and Institutionalization in Year 4 and Beyond

List of Figures and Tables

Figures:
1. Demographic Profile of UT Arlington’s Students (2006)
2. Fall to Fall Retention Rates at UT Arlington (2005-2006)
3. Graduation Rates at UT Arlington (2005-2006)
4. QEP Management Structure

Tables:
1. Higher Order Thinking Skills as Reflected in UEPs
2. CLA Score Differences between UT Arlington Freshmen and Seniors (2005-2006)
3. Performance of UT Arlington Freshmen and Seniors in Various Tasks on CLA (2005-2006)
4. Emphasis on Bloom’s Taxonomy of Learning
5. Development of Thinking Skills
6. Class Participation, Activities, and Feedback
7. Importance of Selected Learning Activities and Thinking Skills
8. Preparedness in Selected Thinking Skills
9. Employer Ratings of Importance of Selected Job-Related Tasks
10. Key Words Identified by Students and Alumni
12. Annual QEP Budget Summary (Years 1-10)
13. Summary Timeline for QEP Implementation
14. Pilot Project Summaries
I. Executive Summary
The Quality Enhancement Plan (QEP) of The University of Texas at Arlington reflects a shared vision of the educational experiences we want our students to have and of what we want our students to learn. After a careful, thoughtful, and extensive dialogue among the various interests that comprise our academic and broader communities (most significantly our faculty), the
University selected as the goal of its QEP the effective application of active learning to achieve higher order thinking skills. Through this QEP, UT Arlington seeks to foster an environment in which our students will be engaged partners in their own learning process and will be able to apply, analyze, synthesize, and evaluate their classroom experiences to better understand the world around them.
The process for selecting the QEP theme was designed to be open and stakeholder-driven. This process of institutional self-reflection was undertaken in concert with other University initiatives
(e.g., a strategic planning process and a new branding campaign), and took advantage of lessons learned throughout these endeavors to inform subsequent steps. This approach facilitated the creation of a well-integrated plan for an educational environment at UT Arlington centered on the acquisition and development of higher order thinking skills through carefully designed active learning experiences.
This process resulted in the following definition of active learning for the QEP:
Active learning places the student at the center of the learning process, making him/her a partner in discovery, not a passive receiver of information. It is a process that employs a variety of teaching and learning strategies, placing the responsibility for creating and defining the learning environment on the instructor and the responsibility for effective engagement in the learning process on the students. Active learning encourages students to communicate and interact with course materials through reading, writing, discussing, problem-solving, investigating, reflecting, and engaging in the higher order thinking tasks of application, analysis, synthesis, and evaluation. An active learning approach draws upon a continuum of teaching and learning strategies, including, for example, class discussion activities, undergraduate research, and community-based learning experiences.
This definition embodies many of the reasons why a QEP premised on active learning resonated so deeply with the University’s constituencies. The variety of teaching and learning strategies promotes innovation within the colleges and schools, consistent with our strong-college model of governance and history of strong teaching. This variety of approaches also acknowledges that our diverse and changing student population employs a number of different learning styles that need to be taken into consideration. Student engagement in the classroom and the acquisition and development of the higher order thinking skills of application, analysis, synthesis, and evaluation define the learning environment that our faculty want to promote, that our students indicate they want to participate in, that our alumni report is most beneficial to them, and that employers believe produces the skills they desire in future employees.
UT Arlington approaches its QEP from an institutional research perspective. The QEP is a narrowly tailored institutional research project designed to investigate the impact of the use of active learning techniques on students’ acquisition and development of higher order thinking skills. It is intended to address four research questions:
1. Does active learning contribute to enhancing higher order thinking skills among UT Arlington students?
2. What are the most effective active learning strategies (in terms of complexity, time on task, and intensity) for increasing higher order thinking skills?
3. At what level in the UT Arlington undergraduate experience does active learning have the most impact?
4. How does the effectiveness of active learning strategies vary across the colleges and schools as explored through the respective pilot projects?
Twelve pilot projects were selected for the QEP following an open call for pre-proposal submissions from throughout the University. These projects were chosen based on certain commonalities (similar approaches, similar concepts, and most importantly similar student learning outcomes). They offer an array of educational contexts that will allow us to explore the intervention of active learning techniques. As shown in the following table, the pilot projects cover nearly all the colleges and schools, a broad range of the undergraduate experiences from introductory classes to capstone courses, and a wide variety of learning environments.
College/School | Level | Learning Environment
Architecture | Upper-division course | Computer-based portfolio building
Business | Introductory | Using projects to enhance required curriculum (catapult group project)
Education | Upper-division course | Using blogs/podcasts to foster on-line e-communities of learners
Engineering | Capstone | Increasing reflective practice within engineering capstone experience across the college
Engineering | Introductory | Using response technologies in large class
Engineering | Introductory | Using computerized, interactive homework
Liberal Arts | Introductory | Using classroom response technologies in large class
Liberal Arts | Introductory | Using projects to enhance required curriculum (group project module on elections)
Liberal Arts | Junior/Senior | Symposia
Univ. Library/Liberal Arts | 50-minute session (primarily Freshmen) | Using projects to enhance required curriculum (plagiarism case studies)
Nursing | Capstone | Active learning projects on-campus as well as satellite locations
Science | Capstone | Problem-based research experience
The effectiveness of active learning techniques in the pilot projects will be assessed at the course, program, and University levels at several points in the QEP’s three-year implementation period.
Such assessments will provide data to explore, understand, and direct the classroom experience at
UT Arlington. At the end of the third year, these data will help us identify the most effective active learning techniques and best practices in the institution. The findings will inform the
University’s decision making in matters of allocating scarce resources, time, talent, and institutional energy, as we continuously seek to spread these best practices throughout the
University.
The QEP will also inform the University’s larger ten-year strategic planning initiative geared toward promoting what is referred to as its “Strategic Planning Priority I”—“to provide an environment that fosters broad-based education as well as professional studies designed to facilitate successful careers, personal development, and community service.” The QEP will offer evidence of teaching and student learning accomplishments and better enable the University to
develop teaching goals and strategies to reach this strategic planning priority. It will also initiate dialogue on teaching and learning so that faculty can learn from one another and disciplines can take advantage of one another’s perspectives. Finally, it will encourage and support innovation in the classroom.
The University’s plan to integrate active learning is designed to distinguish the student learning experience at UT Arlington from that of other institutions of higher learning and to create a cross-campus active learning environment that will become the hallmark of a UT Arlington education. We also seek to design an environment in which our students acquire and develop the thinking skills they need to be vital contributors to the workforce and the community and to ignite a thirst for lifelong learning. The QEP is a mechanism that allows the University to work toward fulfilling its institutional mission and its strategic plan, to help obtain its goal of becoming an institution of first choice, and to achieve institution-wide improvement through the pursuit of enhancing higher order thinking skills.
II. Institutional Context
A. The QEP and UT Arlington’s Mission and Culture
At the heart of UT Arlington’s mission lies its commitment to providing an enriching educational experience for its students. One of its core mission objectives is to “… prepare students for full, productive lives and informed and active citizenship” (UT Arlington Mission Statement). In pursuing this objective, UT Arlington has “… developed undergraduate and graduate curricula and classroom practices that engage students actively in the learning process.” What remains to be examined, however, is how the classroom experience can be tailored to help realize our institutional mission.
UT Arlington’s culture is defined by a diversity of academic programming and a strong-college model of governance. In addition to a history of strong teaching, the University is a “research university with high research activity” (“RU/H” in the Carnegie Foundation’s Classification of Institutions of Higher Education) aspiring to be classified as a “research university with very high research activity” (“RU/VH”). As a result of these various factors, the institution is capable of producing excellence in teaching and research innovation within the colleges and schools. Such innovation, however, can be further leveraged by connecting with and building upon similar initiatives across the colleges and schools and enhancing such efforts with alternative perspectives.
UT Arlington has a diverse student body that reflects the large metropolitan area in which it is situated. In Fall 2006, total enrollment at UT Arlington was 24,825, up from 21,180 in Fall 2001.
Undergraduates comprise over 77% of the total student population. A growing proportion of those attending UT Arlington are traditional college-aged students (30% are between 17 and 21 years old; 58% are between 17 and 24), yet the University continues to draw many students of non-traditional age who balance obtaining their degrees with their responsibilities to families and full-time jobs. The diversity of the student body is also seen in the large percentages of Hispanic (14%), African American (12%), Asian American (11%), and international (11%) students attending the University. A majority (54%) of our students are women (see Figure 1).
Figure 1: Demographic Profile of UT Arlington’s Students (2006)

[Bar chart, omitted: percentage of students (0% to 70%) in each demographic category, grouped by ethnicity (African American, Asian American, Hispanic American, Native American, White, International), gender (Male, Female), and age (17-21 years, 17-24 years).]

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Given this diversity, a common dialogue is needed to establish and maintain connections among the innovative pedagogical initiatives occurring across the University. The QEP development process afforded UT Arlington the opportunity to open this dialogue and to ask the colleges and schools, faculty, staff, administrators, alumni, and students to explain what a meaningful learning experience is for them. The galvanizing questions of “What kind of educational experience do we want our students to have?” and “What do we want our students to learn?” focused the attention of the academic community for more than a year. At the heart of this discussion was the notion that through a common, shared enterprise aimed at achieving student learning outcomes based on the acquisition and development of higher order thinking skills, the education we provide would be stronger and more meaningful.
Through meetings, focus groups, informal conversations, online surveys, and in-person surveys, a small number of common themes emerged. The faculty expressed a desire to have students come to class not only prepared, but also fully engaged in classroom participation. Students noted that they liked and learned best in classes that made the material “more real”—classes in which they could understand how the material they were learning related to what they learned in other courses and how this knowledge could be applied and used. Both faculty and students described a classroom as a shared enterprise, a partnership. Even those faculty and students who preferred a more traditional lecture format spoke of the desire to understand the material more deeply and sought to improve retention of what was presented. Alumni described their best classroom experiences as those defined by real-world application and observed that these experiences were key to a successful transition into the workplace. Employers indicated that they wanted our graduates to possess core thinking skills to meet the demands of their new environment.
It became clear that the various constituencies in the University community—faculty members, students, alumni, employers, and the administration alike—were describing common themes. At the heart of one theme was the core value of strong teaching built on the bedrock of a commitment to vibrant learning environments. With this realization came the answer to the question of “what kind of educational experience do we want our students to have,” and from it the QEP theme of active learning emerged. Many faculty members in the University are already engaged in and have had success in employing various techniques of active learning. These efforts provide a resource that the University can build on and draw from in developing a formal, intentionally designed QEP employing active learning.
Once active learning was defined as our theme, it became imperative to look beyond the means of instruction and to the heart of the question of what we want our students to know. In other words, what is the value added of a UT Arlington educational experience? A second common theme identified the value that the constituencies wanted the University to provide, namely the development of the higher order thinking skills of application, analysis, synthesis, and evaluation.
As a result, the goal of the QEP became to enhance student learning at UT Arlington by promoting active learning as a means to achieve higher order thinking skills.
B. Understanding the Educational Environment Before Commencing the QEP
UT Arlington has set its sights over the past few years on raising academic standards by implementing more rigorous admission requirements, promoting scholarly research (including undergraduate research), and increasing expectations for faculty and student performance. Each of the existing University-wide assessments discussed below indicates, however, a need to provide intervention in the classroom to enhance our students’ learning of higher order thinking skills.
This section considers the current state of the University in order to provide the context from which this QEP emanates.
Unit Effectiveness Plans (UEPs). The UEPs promote University-wide assessment on a two-year cycle of planning, implementing, evaluating, and improving. Each academic and administrative unit of the University completes a UEP, which links unit purpose and student learning outcomes to the mission, goals, and objectives in the University’s strategic plan. Units use the results of the assessment to develop proposals for improvement. Table 1 is based on a sampling of departments in the schools and colleges, and shows the number of higher order thinking activities (i.e., application, analysis, synthesis, and evaluation) cited as intended student learning outcomes in
UEPs from the previous cycle ending in Spring 2006.
Table 1: Higher Order Thinking Skills as Reflected in UEPs

College/School (departments sampled) | Application | Analysis | Synthesis | Evaluation
Business Administration (Accounting, Economics, Finance, Information Systems, Management, Marketing, Real Estate) | 15 | 6 | 10 | 2
Education (Athletic Training, Child/Bilingual Studies, Kinesiology) | 14 | 4 | 8 | 8
Engineering (Aerospace Engineering, Civil Engineering, Computer Science & Engineering, Electrical Engineering, Industrial Engineering, Mechanical Engineering, Software Engineering) | 13 | 3 | 11 | -
Liberal Arts (Advertising, Art, Art History, Broadcast Communication, Criminology & Criminal Justice, Communication Technology, English, French, German, History, Journalism, Music, Philosophy, Political Science, Russian, Spanish, Speech, Sociology, Theatre Arts) | 25 | 8 | 15 | 6
Nursing | 1 | - | 1 | -
Science (Biology & Quant. Biology, Medical Technology, Microbiology, Chemistry/Biochemistry, Geology, Mathematics & Math Sciences, Physics & Applied Physics, Psychology) | 7 | 6 | 8 | 1
Social Work | 1 | - | 1 | -
Honors College | - | - | 2 | -
SUPA: Interdisciplinary Studies | 2 | - | 1 | 3
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
When the QEP is in place, the number of student learning outcomes associated with higher order thinking is anticipated to increase as departments in the schools and colleges promote active learning strategies with these learning outcomes in mind.
University-Wide Tests and Surveys
• Collegiate Learning Assessment (CLA). The CLA is a test of critical thinking, reasoning, problem solving, and writing, and is not based on any specific course-related knowledge. It was
developed by the Council for Aid to Education in collaboration with the RAND Corporation.
When a college or university uses the CLA, it decides how to select students to participate in the test. At UT Arlington, the CLA has been administered annually to a sample of 100 freshmen and
100 seniors; the most recent test was administered in Spring 2006.
Table 2 shows overall CLA score differences between freshmen and seniors at UT Arlington for the 2005-2006 academic year.
Table 2: CLA Score Differences between UT Arlington Freshmen and Seniors (2005-2006)

 | Freshmen* | Seniors** | Value Added***
Mean SAT Score | 1045 | 1042 |
Expected CLA Score | 1075 | 1167 | 92
Actual CLA Score | 1124 | 1143 | 19
Difference (actual minus expected, in scale score points) | 49 | -24 | -73
Difference (actual minus expected, in standard errors) | 1.00 | -0.50 | -1.50
Performance Level (Well Above, Above, At, Below, or Well Below Expected) | Above | At | Below
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Notes:
*Freshmen: Based on the average SAT score (1045) of freshmen sampled at UT Arlington, the expected average CLA score would be 1075.
UT Arlington freshmen scored 1124, which is Above Expected.
**Seniors: Based on the average SAT score (1042) of seniors sampled, the expected average CLA score would be 1167. UT Arlington seniors scored 1143, which is At Expected.
***Value Added: Based on the average SAT scores of freshmen and seniors sampled at UT Arlington, the expected difference between seniors and freshmen on CLA would be 92 points. This difference is an estimate of the expected value added. The difference of the actual scores between the seniors (1143) and freshmen (1124) was 19 points, which is Below Expected.
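Restating the notes above as arithmetic, the value-added estimate follows directly from the expected and actual CLA scores in Table 2:

\[
\text{expected gain} = 1167 - 1075 = 92, \qquad \text{actual gain} = 1143 - 1124 = 19,
\]
\[
\text{actual} - \text{expected} = 19 - 92 = -73 \text{ scale points},
\]

which corresponds to the -1.50 standard errors shown in the Value Added column and places the value added in the Below Expected category.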
Table 3 shows deviation scores and associated performance levels for various tasks for UT
Arlington freshmen and seniors.
Table 3: Performance of UT Arlington Freshmen and Seniors in Various Tasks on CLA (2005-2006)

Task | Freshmen: Deviation Score | Freshmen: Performance Level | Seniors: Deviation Score | Seniors: Performance Level
Performance Tasks | 0.2 | At | -0.1 | At
Analytic Writing Task | 1.3 | Above | -0.8 | At
Make an Argument | 1.4 | Above | 0.0 | At
Critique an Argument | 1.1 | Above | -1.4 | Below
Total Score | 1.0 | Above | -0.5 | At
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Notes:
Deviation scores are expressed in terms of standard errors to facilitate comparisons among measures. Institutions whose actual scores fall between -1.00 and +1.00 in standard error from their expected scores are categorized as being At Expected. Institutions with actual scores greater than one standard error but less than two standard errors from their expected scores are in the Above Expected or Below Expected categories, depending on the direction of the deviation. Institutions with actual scores greater than two standard errors from their expected scores fall into Well Above Expected or Well Below Expected categories.
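Restated compactly, this banding rule maps a deviation score d (actual minus expected, in standard errors) to a performance level; note that behavior exactly at the one- or two-standard-error boundaries is not spelled out in the wording above:

\[
\text{level}(d) =
\begin{cases}
\text{Well Above Expected} & d > 2 \\
\text{Above Expected} & 1 < d < 2 \\
\text{At Expected} & -1 \le d \le 1 \\
\text{Below Expected} & -2 < d < -1 \\
\text{Well Below Expected} & d < -2
\end{cases}
\]

For example, the seniors’ deviation of -1.4 on Critique an Argument in Table 3 falls between one and two standard errors below expected and is therefore Below Expected, while their total-score deviation of -0.5 is At Expected.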
The results above indicate a need for intervention in order to increase the value added of a UT Arlington education. The findings, however, may be skewed by two systematic factors: (1) the small sample size (which will be rectified in the next administration of the test by using a larger sample); and (2) the more rigorous admission standards that the University has recently begun to employ.
• National Survey of Student Engagement (NSSE) and Faculty Survey of Student Engagement
(FSSE). The NSSE is administered annually to a random sample of freshmen and seniors, focusing on various aspects of college life. The FSSE is administered to the faculty. Faculty members are asked to base their ratings on a typical freshman or senior at their institutions. In
Spring 2006, 1,364 freshmen and 1,327 seniors were randomly selected for the NSSE. The response rates were 17% and 25%, respectively. Most of the freshmen who participated (92%) were of traditional age (younger than 24), whereas less than half (45%) of the participating seniors were. These proportions mirrored those of the UT Arlington student population at large, except that women, Asian American students, and international students responded at higher rates than their respective shares of the student population. For the FSSE in 2006, 1,177 faculty members were invited to participate, and 537 (46%) responded.
Table 4 shows UT Arlington faculty and student responses on how much their coursework emphasized the levels of Bloom’s taxonomy of learning (see Section III.D). Eighty-six percent of the faculty reported that they placed very much or quite a bit of emphasis on synthesizing and organizing ideas in lower-level courses, and 90% reported this emphasis in upper-level courses. Conversely, only 65% of the freshmen and 71% of the seniors reported that their coursework emphasized these activities. A much higher percentage of both freshmen and seniors reported an emphasis on memorizing facts or ideas (74% of freshmen and 66% of seniors chose “Very much” or “Quite a bit”), while only 32% and 27% of the faculty in lower- and upper-division courses, respectively, reported that they placed very much or quite a bit of emphasis on memorization.
Table 4: Emphasis on Bloom’s Taxonomy of Learning

Faculty responses (FSSE): percentage of faculty who reported placing quite a bit or very much emphasis on each item in their lower-division (LD) and upper-division (UD) courses. Student responses (NSSE): percentage of freshmen (FR) and seniors (SR) who reported that their coursework during the current school year emphasized each item very much or quite a bit.

Item | LD Faculty | UD Faculty | FR Students | SR Students
Memorizing facts, ideas, or methods from the course and readings | 32% (n = 42) | 27% (n = 75) | 74% (n = 157) | 66% (n = 213)
Analyzing the basic elements of an idea, experience, or theory | 90% (n = 117) | 94% (n = 260) | 73% (n = 156) | 80% (n = 258)
Synthesizing and organizing ideas, information, or experiences | 86% (n = 112) | 90% (n = 248) | 65% (n = 138) | 71% (n = 228)
Making judgments about the value of information, arguments, or methods | 70% (n = 91) | 76% (n = 211) | 69% (n = 145) | 67% (n = 215)
Applying theories or concepts to practical problems or in new situations | 82% (n = 106) | 87% (n = 242) | 73% (n = 152) | 80% (n = 260)

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Table 5 shows the extent to which faculty reported structuring their courses to facilitate their students’ development of certain thinking skills, and the extent to which students reported that their college experience contributed to these skills. The two perspectives differed in many categories.
Table 5: Development of Thinking Skills

Faculty responses (FSSE): percentage of faculty who structured their lower-division (LD) and upper-division (UD) courses quite a bit or very much so that students learn and develop in each area. Student responses (NSSE): percentage of freshmen (FR) and seniors (SR) who reported that their college experience contributed very much or quite a bit to their knowledge, skills, and personal development in each area.

Area | LD Faculty | UD Faculty | FR Students | SR Students
Writing clearly and effectively | 47% (n = 61) | 63% (n = 175) | 68% (n = 133) | 66% (n = 211)
Speaking clearly and effectively | 35% (n = 46) | 48% (n = 133) | 56% (n = 112) | 62% (n = 196)
Thinking critically and analytically | 92% (n = 121) | 95% (n = 262) | 78% (n = 153) | 83% (n = 259)
Analyzing quantitative problems | 55% (n = 71) | 60% (n = 164) | 61% (n = 119) | 74% (n = 229)
Solving complex real-world problems | 60% (n = 74) | 72% (n = 193) | 54% (n = 105) | 54% (n = 171)
Acquiring job or work-related knowledge and skills | 56% (n = 71) | 81% (n = 218) | 54% (n = 105) | 68% (n = 216)

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
With respect to class activities related to active learning, differences between the two sets of responses are also apparent in certain areas (see Table 6). For example, only 24% of the faculty reported that more than half of their lower-division students (28% for upper-division) frequently ask questions in class or contribute to class discussions, while 45% of freshmen and 64% of seniors said they did so often or very often. A similar pattern emerges on the question of receiving prompt feedback on academic performance: faculty responses in both the lower and upper divisions were quite high (84% and 86% very often or often, respectively), whereas only 42% of the freshmen and 55% of the seniors agreed.
Table 6: Class Participation, Activities, and Feedback

Faculty responses are from the FSSE; student responses are from the NSSE and reflect how often freshmen (FR) and seniors (SR) reported doing each of the following at their institution during the current school year. For the first item, the faculty figures are the percentage of faculty who reported that more than half of the students in their lower-division (LD) or upper-division (UD) courses do so frequently; for the remaining items, the percentage of faculty who reported that their students do so often or very often.

Item | LD Faculty | UD Faculty | FR Students | SR Students
Ask questions in class or contribute to class discussions | 24% (n = 33) | 28% (n = 84) | 45% (n = 104) | 64% (n = 212)
Work with other students on projects during class | 42% (n = 56) | 52% (n = 155) | 51% (n = 114) | 45% (n = 148)
Participate in a community-based project (e.g., service learning) as part of a regular course | 8% (n = 11) | 13% (n = 39) | 8% (n = 17) | 15% (n = 51)
Receive prompt written or oral feedback on academic performance | 84% (n = 112) | 86% (n = 255) | 42% (n = 92) | 55% (n = 177)
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
In some areas, however, the two groups of respondents were in close agreement (see Table 7). The percentages of faculty who rated as important or very important putting together ideas or concepts from different courses when completing assignments or during class discussions, and working on a paper or project that requires integrating ideas or information from various sources, closely matched the percentages of students who reported doing these things often or very often.
Table 7: Importance of Selected Learning Activities and Thinking Skills

Faculty responses (FSSE): percentage of faculty who reported that it is important or very important that students in their lower-division (LD) and upper-division (UD) courses do each of the following. Student responses (NSSE): percentage of freshmen (FR) and seniors (SR) who reported doing each of the following very often or often at their institution during the current school year.

Item | LD Faculty | UD Faculty | FR Students | SR Students
Work on a paper or project that requires integrating ideas or information from various sources | 60% (n = 78) | 76% (n = 216) | 68% (n = 156) | 79% (n = 266)
Put together ideas or concepts from different courses when completing assignments or during class discussions | 43% (n = 56) | 67% (n = 192) | 45% (n = 98) | 67% (n = 221)
Examine the strengths and weaknesses of one’s own views on a topic or issue | 65% (n = 84) | 74% (n = 213) | 43% (n = 88) | 52% (n = 168)

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
• UT Arlington Alumni Survey. This survey was last conducted in 2003. Of 6,850 alumni contacted, 1,682 responded (25%). The next survey is planned for 2007.
As shown in Table 8, a very large percentage of University alumni felt very well or adequately prepared in areas such as reading and writing (92.2%); speaking (82.0%); understanding principles of critical thinking (86.5%); and identifying, formulating, and solving problems (85.8%). Ratings were lower, however, in applying knowledge (76.5%), analyzing political and economic phenomena (66.7%), and designing/conducting experiments (66.5%).
Table 8: Preparedness in Selected Thinking Skills

Question: How well prepared were you in the following areas as a result of all courses you completed at UT Arlington?

Area | Very Well or Adequately | Somewhat or Poorly
Reading and writing clear, correct English | 92.2% (N = 606) | 7.8% (N = 51)
Understanding the basic principles of critical thinking, argument, and mathematical relationships | 86.5% (N = 592) | 13.4% (N = 92)
Identifying, formulating, and solving problems | 85.8% (N = 604) | 14.2% (N = 100)
Being proficient in oral communication | 82.0% (N = 558) | 18.0% (N = 123)
Understanding the scientific method and problem analysis | 81.9% (N = 547) | 18.1% (N = 121)
Gathering, analyzing, and interpreting data | 81.1% (N = 554) | 18.9% (N = 129)
Applying knowledge of math, science, and/or engineering | 76.5% (N = 486) | 23.5% (N = 149)
Analyzing political and economic phenomena | 66.7% (N = 403) | 33.3% (N = 201)
Designing/conducting experiments | 66.5% (N = 362) | 33.5% (N = 183)

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
• UT Arlington Employer Survey. The instrument was last distributed in 2004 to employers of graduates who responded to the Alumni Survey in 2003 and consented to our request to contact their employers. Of the 770 employers contacted, 136 responded (18%). The next survey is planned for 2008.
One of the major findings from this survey is that employers rated the performance of UT Arlington alumni on certain job skills higher than they rated the importance of those skills to the positions (see Table 9). Those skills include applying technical knowledge, processing and interpreting numerical data, thinking creatively, and planning projects. In other areas (defining and solving problems, speaking, listening, and writing), however, the employers rated the importance of the skills higher than the performance of their employees.
Table 9: Employer Ratings of Importance of Selected Job-Related Tasks

Task | Importance: N | Importance: Average Rating | Performance: N | Performance: Average Rating
Applying job-related conceptual knowledge | 134 | 4.37 | 135 | 4.35
Applying job-related technical knowledge | 134 | 4.20 | 134 | 4.30
Defining problems | 135 | 4.43 | 136 | 4.24
Solving problems | 134 | 4.57 | 135 | 4.31
Processing and interpreting numerical data | 134 | 3.75 | 132 | 4.11
Thinking creatively | 134 | 4.07 | 135 | 4.22
Speaking effectively | 134 | 4.28 | 136 | 4.09
Writing effectively | 134 | 4.20 | 136 | 4.15
Listening effectively | 134 | 4.50 | 136 | 4.29
Planning projects | 134 | 3.76 | 131 | 4.03

Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Notes:
Instructions for Importance Ratings: For each of the following items, please mark one response that best describes the importance of the skill or quality to high performance in the position held by the UT Arlington graduate named on the previous page (1 = Not at all important, 2 = Marginally important, 3 = Important, 4 = Very important, 5 = Essential).
Instructions for Performance Ratings: For each of the following items, please mark one response that best describes the performance of the UT Arlington graduate who holds the position, as compared to others in your organization who hold or have held similar positions
(1 = Poor performance, 2 = Fair performance, 3 = Good performance, 4 = Very good performance, 5 = Exceptional performance).
The above analyses suggest that a systematic, University-wide intervention in the classroom would help our students make better use of the active learning efforts our faculty members already report. Throughout the QEP development process, examples of active learning efforts across campus were brought to the forefront as evidence of environments that support students’ acquisition and development of higher order thinking skills. The discrepancy between perceptions and outcomes, however, suggests that we have room to improve, as does other analysis showing that the University lags behind the average of its Carnegie peer institutions in the implementation of active learning techniques. This QEP, therefore, has been developed to address these issues. We also intend to conduct a more systematic survey on the use of active learning as defined in the QEP in Spring 2007. This survey will provide an understanding of the scope and type of active learning initiatives at the University prior to the QEP’s implementation.
III. Developing the QEP
A. Process, Part I: Identifying the Theme – Active Learning
From the outset, the leadership of UT Arlington has been committed to an open, stakeholder-driven process for developing the QEP. Allowing the theme of the QEP to develop and evolve in such an organic manner meant that the University needed to begin identifying the theme early and to provide multiple opportunities for participation and input. Only after the desires of the academic community were identified and more narrowly defined through this process could the crafting of the QEP commence.
UT Arlington started the process of determining the QEP theme in Fall of 2004, when President
Spaniolo and Provost Dunn participated in a series of conversations with faculty on the development of a new University-wide strategic plan. Concurrent with this strategic planning process was a branding initiative, which culminated in the selection of the Be A Maverick identity; in choosing this theme, the University expressed a desire to develop students who are independent thinkers, not content to follow the crowd but instead eager to blaze new trails. The
QEP provided an additional opportunity to give meaning to the University’s branding efforts and its Strategic Planning Priority I, namely to provide “a high quality educational environment that contributes to student academic achievement, timely graduation and preparation to meet career goals.”
Seven possible themes for the QEP emerged from the input received in these conversations. In the Fall of 2005, the QEP leadership team asked the faculty to rank-order these themes (or to suggest other themes) through an online survey, in which 143 faculty members (about 10%) participated. The results were analyzed using a scoring index that took into account the level, frequency, and intensity of support for each theme. These results indicated significant support for four of the proposed themes (listed alphabetically): applied learning, communication skills, student engagement, and undergraduate research.
As conceived in the context of developing our QEP, a plan based on applied learning would link off-campus learning experiences such as cooperatives, internships, and community service learning to valued educational outcomes in a student’s major. A QEP seeking to improve the communication skills (both oral and written) of all University graduates would stress the principle that no idea is fully formed until it can be communicated. Student engagement may be thought of in broad terms, and might include student-life experiences as well as more traditional course-based forms of student activity. Finally, a plan based on enhancing undergraduate research as a part of every major might focus on requiring more research-based degree requirements and the use of capstone courses.
Continuing the efforts of President Spaniolo and Provost Dunn, the University’s QEP
Coordinator, Dr. Farrar-Myers, facilitated a series of conversations on the four potential themes in the Fall of 2005. She held numerous meetings with various faculty groups including the Faculty
Senate and the Undergraduate Assembly, as well as meetings with academic departments, academic deans, the Student Congress, and the University Development Board. In addition, these topics were brought to two open faculty focus groups for comment and refinement (38 attended the first session and 24 the second). Drawing on the collective feedback received (as well as analysis of the data culled from University-wide assessments referred to in Section II.B), an important point raised at the initial strategic planning conversations was reaffirmed: while each of the possible themes was important to UT Arlington’s academic community, we could not
afford to sacrifice the desired student learning outcome of promoting thinking skills by focusing on too narrow an approach.
The information, issues, and concerns gathered from these various sessions were then brought to a third faculty focus group composed of Faculty Senate representatives. During this session, specific recommendations about the QEP theme and approach were developed. Among these was the idea that choosing and pursuing an active learning pedagogy would tie together each of the topics that the University’s academic community was interested in pursuing, and would promote the kind of learning we desire for our students. These recommendations were then presented to the Reaffirmation Leadership Team, consisting of Provost Dunn, QEP Coordinator Farrar-Myers, SACS Liaison/Compliance Certification Coordinator Dr. Haws, and Senior Associate Provost Dr. Moore, and ultimately to President Spaniolo. Out of this full process, active learning was selected as the theme for the QEP.
From the process that started with the President’s and Provost’s conversations through the final
Faculty Senate focus group, a preliminary working definition (later refined) of “active learning” as conceived by the UT Arlington academic community emerged:
Active learning is a process that employs a variety of pedagogical approaches to place the primary responsibility for creating and/or applying knowledge on the students themselves. It puts the student at the center of the learning process, making him/her a partner in discovery, not a passive receiver of information.
Active learning requires students to interact with and integrate course material by reading, writing, discussing, problem-solving, investigating, reflecting, and engaging in such higher order thinking tasks as analysis, synthesis, evaluation, and critical thinking. An active learning approach draws upon such teaching and learning strategies as question-and-answer sessions, short in-class writing exercises, student research, internships and community service, team learning, and problem-based learning.
Furthermore, the Faculty Senate focus group offered four recommendations for turning the theme of active learning into a viable plan to enhance student learning: (1) allow for innovation across schools and colleges; (2) focus on the undergraduate experience; (3) identify models of active learning experiences and the student learning outcomes that can be derived from them through an RFP process; and (4) develop a plan that focuses on these models and demonstrates how they might serve as the basis for expanding active learning initiatives across campus.
These recommendations reflect a strong sense of what the University is and where it is headed.
For example, since UT Arlington follows a strong-college model of governance, the colleges and schools maintain a strong foundation of autonomy and decision-making. The recommendation to allow for innovation across schools and colleges emphasizes that the QEP should not be a top-down process; instead, instructors and their respective school/college and department units should be responsible for creating new and exciting ways to implement active learning techniques in the classroom. These techniques, though, should be employed as a means to achieving the important learning outcome identified by the academic community, namely enhancing thinking skills.
The recommendation calling for a focus on the undergraduate experience is tied to one of the most significant external influences on the current life of the University, namely the increasing population of traditional college-age students (particularly those residing on and near campus).
This influx requires us to address learning strategies for this generation in order to remain competitive and become an institution of first choice for the current and next generation of students. The third and fourth recommendations combined show a desire to highlight either the excellent active learning initiatives already taking place on our campus or those new ones that faculty members are willing to undertake; implicit in this is that we seek to learn from these initiatives with a view to spreading excellence throughout the campus. Further, this approach will allow us to use our scholarship to inform our teaching, thus marrying the two identities of the
University: its historical strength in teaching and its aspiration to be an RU/VH institution in the
Carnegie Foundation’s classification system.
B. Theoretical Foundations of Active Learning
Active learning offers a paradigm for student learning that differs from the traditional lecture-based model (Johnson et al., 2006). This paradigm allows students to “enjoy hands-on and minds-on experiences” (Benjamin, 1991) but does not necessarily require abandoning the lecture format in which many faculty and students feel most comfortable (see, e.g., Bonwell and Eison,
1991, ch. 3). Both the positive impact of active learning in higher education and the list of possible active learning techniques that can be employed are well documented. Knowing which strategies to employ and when and how to employ them, however, remains one of the challenges faced by educators seeking to promote the active learning paradigm. Bok (2006) contends that it is up to “each institution to conduct its own carefully constructed studies to determine the effects of active, problem-based teaching on its students.” Since many contextual variables have an impact on the effectiveness of any given active learning strategy, understanding how active learning can be implemented to enhance student learning within the contexts of UT Arlington’s history, strategic planning, and student population is a vital institutional research question.
1. What Constitutes Active Learning?
The rationale for using an active learning approach to education can be found in a simple statement by the philosopher Lao-Tse around the 5th century BC: “If you tell me, I will listen. If you show me, I will see. But if you let me experience, I will learn.” Active learning as a pedagogical tool has its roots in the cognitive learning theory of constructivism, in which the student constructs his or her own knowledge through continual discovery rather than having knowledge imparted by the instructor (Piaget, 1976; Brophy, 1983). It also draws upon the concept of experiential learning, where “knowledge is created through the transformation of experience” (Kolb, 1984; see also Dewey, 1938; Lewin, 1942; and Piaget, 1976). As the American
Psychological Association (1997) concluded, “the learning of complex subject matter is most effective when it is an internal process of constructing meaning from information and experience.” Compared to the traditional lecture-based approach to teaching, in which students are likened to sponges (Keeley, Ali & Gebing, 1998; Fox-Cardamone & Rue, 2003) or bank-like depositories of information received from their instructors (Freire, 1970), active learning strategies emphasize constructivist qualities such as independent inquiry and the structuring and re-structuring of knowledge (Niemi, 2002).
Active learning has been described in a variety of ways; as a result, some in the academic community may be confused as to what it actually means (Bonwell & Eison, 1991). John Dewey observed that learning is “something that an individual does when he studies. It is an active, personally conducted affair” (1924) and that “Education is a continuing reconstruction of experience” (1897). More recent descriptions (but not necessarily definitions) of active learning include: “Students learn by becoming involved … Student involvement refers to the amount of physical and psychological energy that the student devotes to the academic experience” (Astin,
1985) and “Learning is not a spectator sport. … [Students] must talk about what they are
learning, write about it, relate it to past experiences and apply it to their daily lives. They must make what they learn part of themselves” (Chickering and Gamson, 1987).
The many definitions of active learning share two common themes: first, that good education entails more than the transfer of knowledge from professor to student; and second, that students have a role and responsibility in developing their own knowledge. Commonly found in most descriptions of active learning are the elements of talking and listening, writing, reading, and reflecting, which “singly or in combination are the building blocks common to all active-learning strategies” (Meyers and Jones, 1993). On the one hand, the variety of definitions of active learning provides a starting point for understanding what constitutes active learning, while on the other hand the lack of a common definition provides an institution of higher education with the flexibility and opportunity to craft a definition of active learning appropriate for its own context.
Despite the many definitions of active learning, though, research has demonstrated that active learning techniques have a powerful impact on students’ learning, for example, on “measures of transfer of knowledge to new situations or measures of problem-solving, thinking, attitude change, or motivation for further learning” (McKeachie et al., 1986; for other studies measuring the impact of active learning techniques, see Kuh et al., 1997; Springer, 1997; Cabrera et al., 1998;
McCarthy and Anderson, 2000; and Pascarella and Terenzini, 2005).
2. Pedagogy that Supports Active Learning
Even with the acknowledged success of active learning techniques in improving student learning generally, active learning encompasses a wide array of both in-class and out-of-class activities that range on a continuum from simple to complex tasks (Bonwell and
Sutherland, 1996; Van Amburgh et al., 2005). Among the in-class techniques that can be employed are the modified lecture (including pausing during the lecture to allow discussion); using guided lectures and answering open-ended, student-generated questions (Bonwell & Eison,
1991); using primary sources in the classroom (May, 1986); cooperative learning (Smith, 1986); and simulations and role-playing games (Shannon, 1986; for a general discussion on active learning strategies, see Bonwell & Eison, 1991; Astin et al., 1984; and Schomberg, 1986).
Similarly, out-of-class experiences such as volunteering in the community or even contact with faculty out of the classroom (e.g., office hours) can enhance a student’s overall educational experience (Terenzini & Pascarella, 1994; Warren, 1997). Regardless of the specific techniques employed, a key to improving active learning in the classroom lies in improving the quantity, extent and depth of students’ involvement in their own educational experience (Weimer, 1996).
Lurking within the variety of definitions that could be used to describe active learning and the breadth of active learning techniques available to faculty members is the risk that active learning might not result in the educational gains that it promises. The effectiveness of any active learning technique is dependent on many contextual factors, some institutional, others student-based, and still others faculty-specific. As Wilbert J. McKeachie (1998) has noted, “The best answer to the question, ‘What is the most effective method of teaching?’ is that it depends on the goal, the student, the content, and the teacher.” Therefore, developing a QEP using active learning as the basis for enhancing student learning requires an understanding of what these contextual factors are and the impact they might have on the effectiveness of the techniques employed.
3. Factors that Impact the Success of Active Learning
The most basic institution-based factor to consider with respect to active learning techniques is the size of the class. Although a large class may inhibit the use of a number of active learning techniques, a willing instructor can modify a straight lecture format to employ such techniques as those listed above, as well as interactive problem-solving, small-group exercises, and reading and
analyzing passages of text (Bonwell & Eison, 1991; Yuretich et al., 2001). The subject matter and academic discipline of the class can also determine which active learning techniques will be most successful. While oral history assignments, for example, might be useful in a humanities/interdisciplinary context (Biddle-Perry, 2005), a problem-solving-based active learning strategy may be more relevant in a technical discipline (Butun, 2005; Prince, 2004).
Discipline-specific research has shown that pedagogies based on active learning can be employed successfully in a variety of subject matters, including science and medicine-related courses
(Modell, 1996), chemistry (Felder et al., 1998), physics (Laws et al., 1999), biology (Ebert-May &
Brewer, 1997), psychology (Yoder & Hochevar, 2005), political science (Brock & Cameron,
1999), public relations (Berger, 2002), and research methods and statistics (Helman & Horswill,
2002). These studies emphasize, however, that the use of active learning techniques needs to take into account the nature of the subject matter being studied.
Perhaps the most significant institution-based factor is whether numerous barriers to pedagogical change can be overcome (Bonwell & Eison, 1991). From colleagues’ disapproval (Sutherland,
1996) to institutional reluctance “to undertake a continuous, systematic effort to improve the quality of education” (Bok, 2006) to physical impediments such as large lecture halls with limited flexibility in seating arrangements, professors willing to pursue active learning strategies may have few incentives and little support, but, more significantly, may face negative consequences for doing so. From an institutional perspective, effective use of active learning requires an environment that permits and encourages professors to overcome the barriers to employing such techniques. Such an environment can be crafted through systematic support of active learning programming, as evidenced by the Minnesota State Colleges and Universities, which concluded that “[b]y training and supporting active learning advocates … [their program] positively affected, fairly broadly within the system, the environment for teaching and learning and the significance of faculty development efforts for improving student learning” (Minnesota State
Colleges and Universities System, 2006).
Many of the student-based variables center on the question of who constitutes a professor’s audience in any given class. The answer to that question could dictate which active learning techniques may and may not be successful since students construct new understandings based on the knowledge they currently possess (Hoover, 2006). Students further along in their academic careers may have had more opportunities to be exposed to, but may not necessarily have more experience with, active learning and other pedagogical techniques than freshmen or sophomores.
Students who have not been exposed to active learning techniques, though, typically require greater structure and repetition of tasks if they are to feel more comfortable taking risks in the classroom (Bonwell & Sutherland, 1996). Similarly, course objectives for a class full of majors likely will be different from those for non-majors (Bonwell & Sutherland, 1996; see also Goodwin et al., 1991).
Furthermore, according to one study (Huitt and Hummel, 1998, cited in Wood et al., 2001), only
35% of high school graduates have reached a stage in their cognitive development (the “formal operational” stage defined by Jean Piaget) where symbols are related to abstract concepts.
Instead, most are at a “concrete operational” stage where symbols are related to concrete objects.
Thus, instructional techniques must take into account that the students, even though they might be of traditional college age (or older), “may be limited in their understanding of abstract concepts” (Wood et al., 2001).
The observations above highlight an important factor to be considered in structuring a QEP based on active learning. Students have different learning styles that “emphasize some learning
abilities over others” (Kolb, 1981). These styles can be viewed through the lens of various theoretical constructs, such as an experiential learning model (Kolb, 1984; Fox & Ronkowski,
1997), a learning outcome model, a developmental approach, and a cognition and motivation theory (Cross, 1998). Regardless of which theoretical understanding of student learning styles one utilizes, identifying students’ learning styles and “getting students involved in thinking, questioning, and actively seeking knowledge is a key to effective education” (Cross, 1998).
Although active learning is a student-centered approach to teaching and has been shown to have a positive impact on student motivation (Gross Davis, 1993), students’ attitudes toward it, and indifference to learning in general, may pose the greatest threat to active learning (Warren, 1997). One key to overcoming such attitudes, however, appears to be communication and guidance from the professor (Felder & Brent, 2006). Indeed, Warren himself notes that “many active learning techniques fail simply because teachers do not take time to explain them” (Warren, 1997). In addition, providing students with opportunities to offer feedback on the class during the semester may be a way to mitigate student disapproval arising from the introduction of new teaching techniques (Sutherland, 1996).
Much of the scholarship cited above on student-based barriers to active learning was written before the current generation of traditional college-age students—the Millennial students born in
1982 and after who make up almost 60% of UT Arlington’s student population—started to attend college. This generation, though, may be more amenable to active learning strategies than their predecessors (Howe & Strauss, 2000). Millennial students prefer to learn by doing and/or as members of a collaborative group. Schroeder (2004) has shown that a majority of students entering college today prefer to learn through concrete experiences in a structured and linear environment; this leads him to posit that these students may be less comfortable with abstract thought. On the other hand, Bell (2003) reported that most college freshmen believe that education is to be endured rather than truly enjoyed. Furthermore, the majority of faculty learn and structure their courses in a manner quite different from that of their students, focusing, for example, on abstract concepts and ideas for later application only if time permits (Schroeder,
2004). Thus, in order to engage Millennial students, a professor needs to take advantage of their predisposition toward active engagement in course material, while remaining aware that certain types of activities may not resonate with his or her audience.
Although the Millennial generation of students comprises a growing portion of our student body,
UT Arlington is composed of a diverse mix of students. With such diversity comes a variety of learning styles. For example, adult students—a significant population at UT Arlington—tend to be “self-directed learners” (Knowles, 1980) who want to draw upon their experience and “would rather be actively involved in learning than sitting passively on the sidelines” (Meyers and Jones,
1993). Other research has explored differences in learning styles between women and men
(Gilligan, 1993; Belenky, Clinchy, Goldberger, and Tarule, 1986) and among ethnic groups
(Banks, 1988). Despite the variety of learning styles presented by a diverse student population, both generally and at UT Arlington specifically, Meyers and Jones (1993) conclude that “[t]hose who accept the premise that different students will learn in different ways … will find that active-learning strategies not only enliven the classroom but significantly improve their students’ thinking and learning capabilities.”
The third set of factors that influence the effectiveness of using active learning techniques relates to the course instructor. A professor’s personal style affects the level of interaction (that is, the
“level of interplay between an instructor and students and the level of interplay between students and other students”) with which he or she will be comfortable (Bonwell & Sutherland, 1996).
Also factored into this level of comfort are the instructor’s personality characteristics, the degree of control he or she seeks to have in the classroom, and his or her preference for certain teaching methods (Bonwell & Sutherland, 1996). Instructors need to understand each of these factors in order to determine which active learning techniques work best for them so that they can more effectively use them in the classroom. One implication this has for developing an active-learning-based QEP is that the University should offer professors sufficient programming in developing active learning techniques (Bok, 2006) to enable them to select the strategies that work best for them and to encourage them to make connections among their students, their subject matter, and themselves.
4. Broader Educational Policy Initiatives and the Connection to UT Arlington
In designing any course, instructors ask themselves not only what they want their students to learn, but also what they want their students to be able to do; the first question goes to knowledge, the second to skills (Bonwell & Sutherland, 1996). Indeed, universities are being pushed to develop students with both a deep knowledge of substantive material and the skill set necessary to employ such knowledge. A “higher-education system that creates new knowledge, contributes to economic prosperity and global competitiveness, and empowers citizens,” and that “gives Americans the workplace skills they need to adapt to a rapidly changing economy” were among the goals that a commission appointed by Secretary of Education Margaret Spellings identified as lacking in American higher education (Spellings Commission, 2006). Similarly, in 2000, the Texas Higher Education Coordinating Board adopted its Closing the Gaps plan, which seeks, among other goals, to increase graduation rates, to promote excellence in the state’s institutions of higher education, to see the state’s economy advance through a strong workforce and innovation in research, and to expand the minds of individuals and enable them to “develop a growing interest in the changing world around them” (THECB, 2000).
These national and state-level objectives are also found in UT Arlington’s Mission Statement and strategic planning priorities. As “a comprehensive research, teaching, and public service institution whose mission is the advancement of knowledge and the pursuit of excellence,” the
University seeks to “promote a culture of intellectual curiosity, rigorous inquiry, and high academic standards” among faculty and students alike. We are committed to promoting lifelong learning and good citizenship among our student body and diversity on our campus, all the while advancing the economic, social, and cultural welfare of our various constituencies. Research on active learning has shown that it can be a meaningful way to achieve these goals (Felder et al.,
1998)—whether those of the Spellings Commission, the Texas Higher Education Coordinating
Board, or our own institution.
C. Process, Part II: Identifying Student Learning – Higher Order Thinking Skills
Once the theme of the QEP was identified, a Steering Committee chaired by the QEP
Coordinator was named with members selected based on nominations from the academic deans and vice presidents. The Committee’s composition—with representatives from each of the nine academic units plus the Honors College, Faculty Senate, Academic Affairs, Student Affairs,
University Advising, University Libraries, Graduate School, and the student body (including a liaison from the Student Congress Executive Board)—was designed to ensure that all voices in the UT Arlington academic community were well represented as the University crafted its QEP.
The SACS Reaffirmation Leadership Team charged the Steering Committee with using the recommendations proffered by the third focus group to develop the QEP.
As its first order of business, the Steering Committee developed an RFP process to identify models of active learning experiences and issued a call for QEP pre-proposals [ click here for call for pre-proposals ]. In response to the widely publicized RFP, the Steering Committee received 38 pre-proposals from academic units, the University Library, Student Affairs, and Academic
Affairs. To ensure full discussion, the Committee reviewed the pre-proposals in multiple phases using several evaluation criteria and methods. First, each Committee member evaluated each proposal based on the criteria for evaluation set out in the RFP: the number of students affected, the replicability of the pre-proposal for other departments/programs, the clarity with which student learning outcomes and the plan for assessing outcomes were described, how this assessment would be utilized for continuous improvement, and the feasibility of implementation
(including budget and personnel). Thereafter, the Committee looked for commonalities across the pre-proposals—similar approaches, similar concepts and, most importantly, similar student learning outcomes. The Committee also took into account budgetary constraints, balance across units, and importance to the University. The objective of these deliberations was to develop a consistent and comprehensive QEP. In the end, the Committee believed it had assembled the most cohesive set of pre-proposals, one that would generate excitement for promoting active learning across the campus. This set would allow for systematic study of the effect of active learning pedagogies on student learning at the University.
The RFP process and the Steering Committee’s subsequent review yielded benefits beyond mere selection of the pre-proposals. First, going through the process and seeing both the strengths and shortcomings of each pre-proposal at that early stage highlighted a number of areas that the
Steering Committee needed to ensure were properly addressed in developing and implementing the QEP. For example, given the importance of assessing student learning outcomes, we needed to properly arm the Steering Committee to craft appropriate assessment tools. To that end, a number of Steering Committee members attended the Sixth Annual Assessment Conference at
Texas A&M University (February 23-25, 2006); and an external consultant specializing in assessment was brought in to work directly with the Steering Committee. To help prepare the faculty to state student learning outcomes on their syllabi and provide a means of assessing whether those outcomes are being met, the Provost’s Office and the Academy of Distinguished
Teachers brought Dr. Joseph Hoey, then the Director of Assessment at Georgia Institute of
Technology, to campus on February 20, 2006, to present a workshop for faculty on developing and assessing student learning outcomes. Second, reviewing all the pre-proposals greatly assisted the Steering Committee in refining our understanding of what active learning means in the UT Arlington educational context. Each pre-proposal submitted increased understanding of active learning initiatives, especially among the faculty members who participated as primary investigators.
The insights from the QEP pre-proposal submissions were supported by additional input that the QEP Coordinator received from ongoing conversations in the academic units and with such diverse constituencies as the Faculty Senate, the Undergraduate Assembly, the Graduate
Assembly, the alumni board, the student affairs leadership team, and students. The Steering
Committee took this input and the insights provided by the pre-proposal submissions and crafted a White Paper on the UT Arlington Active Learning Quality Enhancement Plan. This document was made available for the UT Arlington academic community’s review and comment, so that the input received could further inform the QEP development process. The White Paper contained sections about why active learning is right for UT Arlington, and resulted in a refined working definition of active learning more appropriate for the UT Arlington context:
Active learning places the student at the center of the learning process, making him/her a partner in discovery, not a passive receiver of information. It is a process that employs a variety of teaching and learning strategies to place the responsibility for creating and defining the learning environment on the instructor and the responsibility for effective engagement in the learning process on the students. Active learning encourages students to communicate and interact with course materials through reading, writing, discussing, problem-solving, investigating, reflecting, and engaging in the higher order thinking tasks of application, analysis, synthesis, and evaluation. An active learning approach draws upon a continuum of teaching and learning strategies, including, for example, class discussion activities, undergraduate research, and community-based learning experiences.
Perhaps the most significant change in this refined working definition of active learning is the recognition of the partnership between instructor and student, and the role each plays in a student’s educational experience. Whereas the University’s original working definition of active learning placed “the primary responsibility for creating and/or applying knowledge on the students themselves,” this new definition distinguished between the instructor’s responsibility for creating and defining the learning environment and the student’s responsibility for effectively engaging in the learning process.
Even as the University’s working definition of active learning was being refined, a key question still remained: What specifically would active learning seek to achieve? Although the academic community knew it wanted the QEP to address our students’ thinking skills generally, the specific learning outcomes that we want to achieve by promoting active learning on campus needed to be carefully articulated and operationalized. Just as the process for identifying the theme of the QEP was fluid, dynamic, and open-ended, so too was the process for establishing these outcomes.
At the start of the strategic planning conversations, many in the academic community expressed a desire to enhance our students’ critical thinking, which can be defined as “that mode of thinking about any subject, content, or problem in which the thinker improves the quality of his or her thinking by skillfully analyzing, assessing, and reconstructing it” (see http://www.criticalthinking.org). As the University’s process for developing the QEP continued, however, we focused less on the label of critical thinking and more on the desired outcome of enhancing our students’ ability to analyze and reconstruct their knowledge. Furthermore, the call for pre-proposals required submissions to address the kinds of student learning outcomes that could be achieved through the proposed active learning activities and how those outcomes could be assessed. Although the 38 pre-proposals were submitted from across the University, a commonality of outcomes emerged and reaffirmed the sense of the academic community expressed throughout the QEP process – namely, a desire to increase the higher order thinking skills of application, analysis, synthesis, and evaluation.
Independently of the evaluation of the pre-proposals, the Steering Committee also examined the various surveys discussed in Section II.B, including the employer and alumni surveys. The
Committee paid particular attention to deficiencies in the UT Arlington educational experience as reflected in the NSSE survey and the CLA. Furthermore, as indicated above, a gap existed between the importance employers placed on certain higher order thinking skills (defining problems, solving problems, applying conceptual knowledge, and effectively communicating) and the actual performance of these skills by UT Arlington graduates.
The Steering Committee conducted an additional student/alumni survey that requested feedback on the classroom experience and active learning by asking: (1) Describe your best classroom experience here at UT Arlington (e.g., teaching style, activities, etc.); (2) Describe your ideal classroom learning experience; (3) What does the term “active learning” mean to you?; and (4)
What suggestions might you have in your own area of study as to how “active learning” could be employed? This survey was conducted using a targeted sample of student and alumni opinion leaders during the Spring of 2006, and 224 responses were received. The key ideas identified by both students and alumni in their responses are summarized in Table 10.
Table 10: Key Words Identified by Students and Alumni

Students
Best and Ideal Classroom Experiences: interaction; class discussion; hands-on; group work; visual aids; participation; application; excitement; activities; presentations; lab-work; being engaged; discussion; enthusiastic instructors; small class; real-world experience/applications; part of the learning process; a caring and approachable instructor; teaching style/skills
What Does “Active Learning” Mean?: participation; activity; application; interactive; visual aids; presentations; technology; feedback; student involvement in teaching; engaging students; asking questions; out of class involvement; open discussion; hands-on experience; real world examples
Suggestions: internships; field trips; lab-work; projects; activities; practicing; interaction; seminars; discussion; technology; guest speakers; group projects; real world examples/experience; hands-on experience; application of material; presentations; communication

Alumni
Best and Ideal Classroom Experiences: small classes; active participation; lab-work; application; feedback; group or teamwork; interactive (working, students, teaching, and with professors); hands-on (experience/activity); use of technology (computers, Internet, videos, and multi-media)
What Does “Active Learning” Mean?: active; participation; research opportunities; application of what is learned; communication, thinking; hands-on activities; real world experience
Suggestions: technology; internships; field trips; application; lab-work; group assignments; real world exposure
Analysis of the responses reinforced the view of the Steering Committee that the QEP should be about promoting active learning as a means to achieve higher order thinking skills. The
Committee finalized its selection of the QEP pre-proposals with this outcome in mind.
D. Higher Order Thinking Skills in the UT Arlington Context
Developing thinking skills in the context of higher education leads one to Benjamin Bloom’s taxonomy of the cognitive domain (1956). This taxonomy arranges six levels of thinking skills, ordered from less to more complex:
1. Knowledge: the ability to recall or recognize information, ideas, and principles in the approximate form in which they were learned.
2. Comprehension: the translation, comprehension, or interpretation of information based on prior learning.
3. Application: the selection, transfer, and use of data and principles to complete a problem or task with a minimum of direction.
4. Analysis: distinguishing, classifying, and relating the assumptions, hypotheses, evidence, or structure of a statement or question.
5. Synthesis: the origination, integration, and combination of ideas into a product, plan, or proposal that is new to the individual.
6. Evaluation: an appraisal, assessment, or critique developed on the basis of specific standards and/or criteria (Huitt, 2004).
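To make such a mapping concrete, the short Python sketch below tags a set of assessment items with taxonomy levels and tallies the share that targets the four higher order levels. It is illustrative only: the level names come from the taxonomy above, but the item identifiers and their assignments are hypothetical, not actual QEP assessment items.

    # Illustrative sketch: tagging assessment items with levels of Bloom's
    # taxonomy and tallying the share aimed at higher order thinking.
    # The item identifiers below are hypothetical, not actual QEP items.

    HIGHER_ORDER = {"application", "analysis", "synthesis", "evaluation"}

    item_levels = {
        "exam1_q1": "knowledge",
        "exam1_q2": "comprehension",
        "exam1_q3": "application",
        "exam2_q1": "analysis",
        "exam2_q2": "synthesis",
        "final_essay": "evaluation",
    }

    higher = [item for item, level in item_levels.items() if level in HIGHER_ORDER]
    share = len(higher) / len(item_levels)
    print(f"{len(higher)} of {len(item_levels)} items ({share:.0%}) "
          "target higher order thinking skills")

A tally of this kind is one simple way to gauge how much of a course’s assessment is aimed at the four higher order levels.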
Huitt’s (1992) research concludes that students remember more when they learn to handle information at the higher levels of the taxonomy (application, analysis, synthesis, and evaluation) because more reflection and elaboration are required of them. The relationship between active learning and the higher levels of Bloom’s taxonomy has been documented. Active learning occurs while students are studying ideas, engaging in problem solving, and applying content. They acquire knowledge and skills while actively engaging in inquiry and reflecting on their experiences (Silberman, 1996).
Given the challenges placed upon higher education (particularly in state universities such as UT
Arlington) to be accountable to the taxpayers who sustain the institution (Spellings Commission,
2006), active learning should be geared toward the development of thinking skills that address the core wants and needs of our various constituencies. The enhancement of student learning on our campus, however, must fit within the vision that UT Arlington, its leaders, and its faculty have selected for the institution. As the University strives to become an RU/VH classified institution, we do not want to lose sight of our long and proud history of strong teaching. In recent years, we have promoted this strength through such vehicles as the Academy of Distinguished Teachers, the Honors College, and Freshmen Interest Groups (FIGs) within a Living Learning Center residence hall. The QEP affords the University the opportunity to take such efforts further by creating a culture in which our commitment to strong teaching is infused into our students’ everyday collegiate experience.
Additionally, the University desires to increase its current retention and graduation rates (see Figures 2 and 3). For example, in comparing UT Arlington’s graduation results with those of 15 similar institutions, UT Arlington lands in the bottom quartile, ranked thirteenth, according to the Education Trust’s College Results Online (available at http://www.collegeresults.org). A 2004 report produced by the Education Trust (Carey, 2004) demonstrated that institutions with high graduation rates worked hard to connect students to campus, cared about the quality of teaching and learning, and actively sought to improve teaching effectiveness. One of the long-range recommendations made in 2005 by a UT Arlington Graduation Rate Task Force was that the University increase faculty engagement with students through active learning strategies.
Figure 2: Fall to Fall Retention Rates at UT Arlington (2005-2006)
[Bar chart of retention rates (y-axis 60.0% to 78.0%) for Overall, White, African American, Hispanic, and Other student groups.]
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Note: Retention rates are based on first-time full-time degree-seeking freshmen still enrolled after one academic year.
Figure 3: Graduation Rates at UT Arlington (2005-2006)
[Bar chart of graduation rates (y-axis 0.0% to 70.0%) for Overall, White, African American, Hispanic, and Other first-time freshmen, plus Transfers.]
Source: UT Arlington Office of Institutional Research, Planning and Effectiveness.
Note: Graduation rates for transfer students are based on the number of new full-time transfers with at least 60 transfer hours who obtained their degree within four years. All other categories are based on first-time full-time freshmen graduating in six years.
All the objectives cited above—from enhancing critical thinking skills to making our students better contributors to the workforce and society to improving the University’s retention and graduation rates—can be achieved by using active learning as a means to promote and develop the higher order thinking skills among students. Tsui (2002) notes that “[h]igher-order cognitive skills … are invaluable to students’ future; they prepare individuals to tackle a multitude of challenges that they are likely to face in their personal lives, careers, and duties as responsible citizens … and groom individuals to become lifelong learners.” Such results of an active learning curriculum coincide with UT Arlington’s mission.
In order to make its best decisions, though, the University must systematically examine how its
QEP is developed, implemented, and assessed. How do our students adapt to an environment in which we seek to promote the development of their higher order thinking skills? Since organizations can be learning systems in their own right (Kolb, 1974; see also Senge, 1990
[revised 1994] regarding the concept of the “learning organization”), we must consider how the University adapts to this new environment; it, too, must be ready to learn and adapt. It follows that the QEP can be structured as an institutional research project, addressing this question: How can active learning to achieve higher order thinking skills be best designed to ensure maximum benefit for student learning and for the University as a whole?
E. Process, Part III: Preparing for the QEP
Having identified the key learning outcomes for the QEP and selected the pre-proposals, the
Steering Committee faced the task of turning these constructs into a concrete plan. Much of the work for doing so was delegated to subcommittees of the Steering Committee. The Research
Working Group initially strove to understand the landscape of active learning. Their work was supplemented by an active learning experience in which a graduate class in the School of Social Work, along with an independent study student, researched aspects of active learning, issues affecting higher education, and potential funding opportunities. The Research Group then continued to refine the theoretical foundations for our QEP and investigated faculty incentives and development. The Process Working Group crafted a timeline for implementation, compiled the overall budget for the QEP, and developed a management plan. The Assessment Working
Group developed the methodology of assessment and (in conjunction with the Process Group)
developed the QEP’s assessment plan. An additional marketing task force, composed of faculty members, students, a library representative, the University’s Public Affairs office, employers, and a representative of the President’s office, and chaired by an alumnus and an advising staff member, developed plans and recommendations to help constituents understand their respective roles in the QEP. In addition, this task force recommended the official title for the QEP, Active
Learning: Pathways to Higher Order Thinking at UT Arlington, which was unanimously endorsed by the QEP Steering Committee, the Reaffirmation Leadership Team, and President Spaniolo.
The QEP Steering Committee also identified the need to work with the authors of the twelve selected pre-proposal pilot projects to integrate them fully into a single comprehensive plan. Each group of authors submitted their proposals separately. Out of this group, however, we needed to form a single quality enhancement plan. To that end, a team chaired by the QEP Coordinator and including representatives from the College of Science and the College of Education attended Northeastern University’s Summer Institute on Experiential Education to learn how to develop a strategic plan for integrating the various projects. Pursuant to this plan, the Committee worked in Fall 2006 to assist the pilot project authors in refining their active learning strategies, identifying the specific learning outcomes associated with them, mapping them to Bloom’s taxonomy, and employing the assessments required for the QEP in the context of each course. In turn, the pilot project authors helped the Steering Committee identify and refine assessment strategies, implementation strategies, and issues of scheduling and control group design, as well as to appreciate the realities of the research design being created.
In addition, the Committee established a series of professional development programs that covered such topics as student learning outcomes and mapping to higher order thinking skills, assessment, and rubrics (both generally and specifically with respect to the Washington State rubric discussed in Section IV.A.3.a). It also provided for one-on-one interaction with the QEP
Coordinator and a faculty facilitator who specializes in assessment rubrics. Additional programming in Spring 2007 will address rubric inter-rater reliability, creation of a contextualized critical thinking rubric to assess oral and written assignments, creation of objective exams with embedded questions to measure desired student learning outcomes, creation of modified knowledge surveys (see Section IV.A.3.a), and research-related questions
(e.g., Institutional Review Board submissions). These development sessions, once refined during this process, will serve as models for developing programs for future faculty who desire to participate in this initiative.
The goal in working with the pilot project authors was to create a community of practice (defined as a group of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly [Wenger, 2006]). They, in turn, would be leaders in our efforts to implement active learning as a means to achieving higher order thinking skills and, more broadly, to enhance student learning across campus. In addition, research conducted on pilot primary investigators’ learning curves will assist in better preparing faculty across the University for undertaking active learning initiatives.
One of the most important components of the professional development program for pilot authors was a workshop conducted by Dr. Charles Bonwell, a nationally recognized expert on active learning, on September 13-14, 2006. Dr. Bonwell worked directly with pilot authors to develop active learning course experiences with a view to achieving higher order thinking skills.
In addition, Dr. Bonwell held an open workshop for the academic community at large on the topic of active learning, and approximately 130 people attended, including a representative from
each department; a video of the presentation was also posted to the University website. Dr.
Bonwell’s visit was crucial to finalizing the QEP.
Although the QEP focuses on the twelve pilot projects, it will demonstrate to the University community the benefits of an approach that emphasizes student learning through the acquisition and development of higher order thinking skills. To that end, a number of new faculty enhancement opportunities have already been initiated. For example, based upon the positive feedback received from Dr. Bonwell’s open workshop, an additional session afforded interested faculty members the opportunity to continue the conversations started by Dr. Bonwell and to participate in an active learning teaching circle; out of these discussions, two active learning teaching circles have already emerged. Faculty development opportunities such as these (as well as other efforts being undertaken within the various colleges and schools) can be a means of preparing additional faculty to be future participants, and ensure that the QEP pilot projects are truly part of a larger effort throughout the institution.
As discussed above, all constituencies—students, faculty, employers, alumni, and national and state regulators and policymakers—are requiring universities to be accountable and responsive to their demands. We deliberately chose a process for developing our QEP in such a way that the stakeholders within the University’s academic community would define for themselves how best to proceed. In the words of President Spaniolo at a general meeting of the faculty in Fall 2006:
“In short, our ability to help students improve these thinking skills—and to provide convincing evidence that we have been successful in our endeavor—is the very sort of assessment and accountability we should embrace and champion …
Just as we have devoted time and energy to describing what it means to be a
Maverick, we also need to explain, in a systematic and compelling way, how we change the lives of the students we engage.”
IV. Achieving the QEP’s Objectives
A. The QEP Design: From Goals to Assessment
1. The Goal of the QEP
Goal: The effective application of active learning to achieve higher order thinking skills.
Our QEP compels us to examine the effectiveness of active learning in promoting higher order thinking skills in two ways: (1) by studying the acquisition and development of higher order thinking skills in our students; and (2) by determining how active learning and its attendant effects can best be accomplished in the classroom by faculty. Further, the QEP itself is an institutional research project designed to address the following questions:
1. Does active learning contribute to enhancing higher order thinking skills among
UT Arlington students?
2. What are the most effective active learning strategies (in terms of complexity, time on task, and intensity) for increasing higher order thinking skills?
3. At what level in the UT Arlington undergraduate experience does active learning have the most impact?
4. How does the effectiveness of active learning strategies vary across the colleges/schools as explored through the respective pilot projects?
The first question will determine whether the relationship between active learning and higher order thinking skills analyzed in the scholarly literature exists on our campus. The second question allows the University community to understand which active learning techniques work best with our student population, and thus provides important information to inform future decisions ranging from designing courses to allocating resources. The third and fourth questions are designed to explore the nuances of active learning techniques among various subsets of the student population.
The twelve pilot projects provide opportunities for investigation in a variety of educational contexts, including large introductory classes, symposia and other upper-division courses, and capstone classes. They address the application of active learning techniques across a continuum of complexity:
• Low-complexity techniques: Computer-based interaction systems; personal response systems (“clickers”); self/peer formative assessment;
• Moderate-complexity techniques: Small group presentations/discussions; role playing/simulations/games; peer teaching; and
• High-complexity techniques: Cases; cooperative cases; cooperative learning/problem-based learning.
The projects also include variations in time-on-task and exposure to active learning techniques across a variety of intensities ranging from a single class period to a module within a class to a full-semester design.
In addition, these projects have been designed not only to create student learning outcomes that map to various levels of Bloom’s taxonomy, but also to determine whether the application of active learning techniques can lead to the higher order thinking skills of application, analysis, synthesis, and evaluation as deemed appropriate by the pilot project instructors. To this end, the projects will employ a series of formative and summative assessments (described in Section IV.A.3).
Furthermore, because most colleges and schools are represented by at least one pilot project, the
QEP provides a means to begin investigating the most effective active learning strategies across many programs throughout the University.
2. The Hypotheses
The QEP is designed to investigate the following hypotheses:
HA1: Availability of more opportunities for active learning in the classroom increases higher order thinking skills in UT Arlington undergraduate students.
Null Hypothesis #1: Implementation of active learning in the classroom does not promote higher order thinking skills in UT Arlington undergraduate students.
HA2: Participation in active learning in the classroom by UT Arlington undergraduate students increases higher order thinking skills.
Null Hypothesis #2: Participation in active learning by UT Arlington undergraduate students does not promote higher order thinking skills.
The primary tool we will use to assess the relationship between active learning and higher order thinking skills is the Rasch model (see, e.g., Fischer and Molenaar, 1995) with covariates (see
Appendix A for more information on the Rasch model and issues related to its implementation in the context of the QEP). The Rasch model is a statistical model that generates the probability of a student responding correctly to an item while taking into account the interplay between the student’s ability—a latent unmeasured trait—and the difficulty of the item. The probability of a correct response increases as the student’s ability rises relative to the item’s difficulty.
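The item response function at the heart of the model can be stated compactly. The Python sketch below is a minimal illustration using made-up ability and difficulty values; it is not the QEP’s estimation code, which (as described in Appendix A) fits the full Rasch model with covariates.

    import math

    def rasch_p_correct(ability: float, difficulty: float) -> float:
        """Rasch model: P(correct) = exp(a - d) / (1 + exp(a - d)).

        Ability and difficulty share one logit scale; when they are equal,
        the probability of a correct response is exactly 0.5.
        """
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    # Hypothetical values: a student of ability 0.5 logits facing an easy
    # item (difficulty -1.0) and a hard item (difficulty +1.5).
    for difficulty in (-1.0, 1.5):
        print(f"difficulty {difficulty:+.1f}: "
              f"P(correct) = {rasch_p_correct(0.5, difficulty):.2f}")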
In the case of the QEP, the underlying ability that will be estimated with the Rasch model is the student’s higher order thinking skills as categorized by Bloom’s taxonomy. The work of each student in each class studied will be evaluated using a series of items that will be mapped onto the various categories of Bloom’s taxonomy. In addition, each class will be assigned an active learning intensity score as measured by complexity (as rated by the Active Learning Inventory Tool
[ALIT] - see Section IV.A.3.a), time-on-task, and intensity of active learning experiences (as measured by observers scoring actual instruction using the ALIT). By examining the average higher order thinking skills estimates and their standard errors obtained from the Rasch model for each class, and taking into account the discipline, level, and active learning combinations represented by the QEP program classes, we will seek to identify where active learning has the greatest effect.
When determining the average higher order thinking skills measure, the Rasch model will take into account relevant confounders that may have an impact on a student’s probability of
answering an item correctly. Some confounders considered will be class attendance, SAT scores, prior knowledge of subject, student effort, gender, ethnicity, course load, hours working outside school, and class size. In addition, where possible, a class in the QEP program will be paired with a control class in which active learning techniques are employed minimally, if at all (i.e., a class with a low active learning intensity score). Then these paired classes will be compared, using the
Rasch model, to determine whether there is any difference between the average higher order thinking skills measure for the QEP program class using active learning and the control class not using active learning. In some instances, a corresponding control class may not be available for a particular QEP program class (for example, capstone courses within a major or specialized symposia). For such classes, an alternative method of control is needed: the effectiveness of active learning techniques in acquiring and developing higher order thinking skills will instead be measured across semesters.
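To illustrate the comparison step, the sketch below contrasts the average higher order thinking skills estimate of an experimental class with that of its paired control using a simple two-sample z-statistic. The class means and standard errors are made-up stand-ins for the quantities the Rasch analysis will produce; the actual procedure is described in Appendix A.

    import math

    def compare_classes(mean_exp, se_exp, mean_ctl, se_ctl):
        """Two-sample z-test on class-level ability estimates (in logits).

        Returns the estimated difference in average higher order thinking
        skills and the z-statistic for that difference.
        """
        diff = mean_exp - mean_ctl
        se_diff = math.sqrt(se_exp**2 + se_ctl**2)
        return diff, diff / se_diff

    # Hypothetical Rasch output: the experimental class averages 0.40 logits
    # (SE 0.12); its paired control averages 0.10 logits (SE 0.11).
    diff, z = compare_classes(0.40, 0.12, 0.10, 0.11)
    print(f"difference = {diff:.2f} logits, z = {z:.2f}")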
Table 11 summarizes some of the key design features of the pilot projects, including their level of active learning complexity and their control factors (for more detail on each pilot project, see
Section V).
Table 11: QEP Project Designs*

Aktosun. Active learning complexity: High. Control group design: No control**. Instructor: Same instructor.

Boardman. Active learning complexity: Low. Control group design: First two semesters, one experimental and two controls; after year 1, only experimental. Instructor: Same instructor until expansion in year three; then two additional courses and instructors.

Cole. Active learning complexity: Low. Control group design: 1 experimental and 1 control class each fall. Instructor: Same instructor, all sessions and all semesters, except for year 3, where a different instructor will be used.

Dillon. Active learning complexity: Low. Control group design: Module in election semesters and control in others; in year 3, add an additional class with a different instructor but the same module. Instructor: 2 instructors, with a different set of instructors employing the module in a different class in year 3.

Hirtle and Wiggins. Active learning complexity: High. Control group design: Each semester, one experimental and one control. Instructor: Same instructor.

Martin. Active learning complexity: High. Control group design: No control**. Instructor: Same instructor.

Maruszczak and Rusher. Active learning complexity: High. Control group design: No control**. Instructor: Same instructor year 1; phase into other engineering disciplines over time.

McMahon. Active learning complexity: Both moderate. Control group design: No control**. Instructor: Different instructors in years 2 and 3 in the Freshman/Sophomore class; same instructor in the Junior/Senior class.

Peterson. Active learning complexity: High. Control group design: No control**. Instructor: Same instructor.

Swafford and Prater. Active learning complexity: High. Control group design: Experimental every fall and control every spring. Instructor: Same instructor, all sessions and all semesters.

Trkay. Active learning complexity: Moderate. Control group design: ENGL 1301, multiple experimental and multiple controls; ENGL 1302, 2 experimental and no control; track students from 1301 to 1302 to see if active learning exposure has a cumulative effect. Instructor: ENGL 1301, same instructor within control and experimental; ENGL 1302, same instructor.

* All pilot projects contain a pre/post-test design.
** No control: will employ an across-semester comparison design.
A three-level approach has been designed to collect the relevant data discussed below for testing our hypotheses. Assessment activities will take place continually throughout the QEP. Course-level assessment will take place each semester that a pilot project is implemented. The semester of implementation will be determined by the pilot project primary investigator, taking into account constraints of teaching load and departmental scheduling issues. Review of the course-level data will occur during the subsequent semester. Furthermore, both QEP program-level and University-level assessments will take place during each summer following pilot project implementation. In addition to those specified below, other assessment tools, such as focus groups and non-participatory in-class observations, may be used. The ongoing course-level assessments and the yearly program and University assessments will allow for refinement of the design and of the associated assessment strategies for both individual pilot projects and the QEP as a whole, and thus close the assessment loop. They will also take into account and enhance the natural learning that will take place for both the program management and the faculty involved.
a. Course-Level Assessment of the Pilot Projects
The classroom-based protocol for collecting the necessary course-level data for assessment of active learning and higher order thinking skills will include the following. Given the variability in pilot project content and structure, the timing of some of the interim assessment tools listed below will vary among pilot projects. Furthermore, these data will also be collected from the identified control group courses.
Prior to Day One of Each Pilot Project
1. IDEA Student Rating of Instruction Tool, Part 1. The IDEA (Individual
Development & Educational Assessment) Center describes its Student Rating of
Instruction as focusing on student learning instead of the instructor’s teaching behavior. Prior to the beginning of a pilot project, each instructor will fill out a
Faculty Information Form identifying his/her teaching objectives for the project/course. [ For further information: http://www.idea.ksu.edu ]
Day One of Each Pilot Project
1. VARK Inventory (Visual, Aural, Read/Write, Kinesthetic). This inventory is a guide to assessing student learning styles and should be completed by both students and faculty members. It consists of a series of questions that are used to classify an individual’s learning style. As noted previously, the fact that students have different learning styles argues for using a variety of active learning strategies in the classroom. [ For further information: http://www.vark-learn.com/english/index.asp ]
2. Modified Knowledge Surveys. The Knowledge Survey employs a self-reporting scale to assess students’ confidence in their ability to answer hypothetical questions in a list of course topics. We will combine the Knowledge Survey with a
“knowledge probe” idea by asking students to attempt to answer a selection of the questions across the course content. Each pilot project course instructor will write and assess his or her own Modified Knowledge Survey as appropriate for the discipline. This combination will allow us not only to gauge confidence, but also to establish a baseline to measure knowledge of course content.
3. Student Data. Student data will include SAT scores, ACT scores, and/or GPA, if available; number of credit hours taken in the semester; number of credit hours accumulated overall; gender; ethnicity; transfer status; and class size, each of which may be a confounder to our model.
Every Day of Each Pilot Project
1. Daily attendance tracking. Attendance will be taken every time the class meets using a method that is appropriate given the size and type of class. Attendance can be taken in a variety of ways; examples include calling roll, passing around sign-up sheets, and having students turn in the answer to a question. Tracking attendance provides a measure of students’ effort and engagement in the class.
Interim Assessments within Each Pilot Project (timing will vary from project to project)
1. Student Effort. Three times during the semester, students’ effort, time devoted to class, and their impressions of their experience will be assessed using an instrument developed by the QEP Steering Committee. Topics to be surveyed include hours worked, class preparation, and their overall reflections on the course to date, among others.
2. Formative Assessments. Students’ reactions to active learning tasks in class will be collected through ongoing feedback using various Classroom Assessment
Techniques (CATs). This assessment allows faculty to make in-course adjustments and determine how well students are understanding the course material. It also will provide useful qualitative assessment of various instructional approaches.
3. Summative Assessments. Summative assessments will be collected using embedded questions on exams and collection of performance data on these questions. In those projects that employ field journals, portfolios, essays, essay exams, and oral presentations, a critical thinking rubric (based on Washington
State University’s rubric) will be used, contextualized for the course and assignments. This will provide comparable direct measures of desired student outcomes for the course. [ For further information: http://wsuctproject.wsu.edu/ctr.htm ]
4. ALIT. Periodic checks using the Active Learning Inventory Tool (ALIT) will be carried out by the QEP Coordinator and Undergraduate Reporting and
Assessment Specialist. ALIT was developed and validated by researchers from
Northeastern University, and the University has secured permission to employ this tool. The Coordinator and Assessment Specialist will be trained in the use of
ALIT by the designers from Northeastern University in Spring 2007. Use of ALIT will ensure that the primary investigators are employing their active learning strategies per their design. [ click here for ALIT ]
End-of-Semester Course/Project Assessments
1. Course Reflection Memo. A course reflection memo will be completed by pilot project faculty detailing any issues (benefits and problems) they may have faced as
a result of adding active learning techniques to their courses. They will address how long they have been teaching and whether they have had prior experience in using active learning techniques in the classroom. If so, they will be asked to reflect on their experiences relative to prior semesters. They will also address the impact active learning techniques have had on the coverage of topics described on the course syllabus as well as their perception of their students’ experience in class.
In addition, the instructor will discuss student learning outcomes, the acquisition and development (or non-acquisition and non-development) of higher order thinking skills, and what modifications, if any, they intend to make to the course in subsequent semesters.
2. NSSE. The normal yearly administration of the National Survey of Student
Engagement will include an oversampling of active learning pilot project student participants (and control sections, where appropriate). It will be administered in such a way as to allow this cohort to be isolated from the general sample for QEP assessment for both selected analysis and broader comparisons (as described in
Section IV.A.3.c).
3. FSSE. The normal yearly administration of the Faculty Survey of Student
Engagement will include an oversampling of active learning pilot project faculty participants. It will be administered in such a way as to allow this cohort to be isolated from the general sample for QEP assessment for both selected analysis and broader comparisons (as described in Section IV.A.3.c).
4. CLA. The normal yearly administration of the Collegiate Learning Assessment will include an oversampling of active learning pilot project students and a control group, if applicable. It will be administered in such a way as to allow this cohort to be isolated from the general sample for QEP assessment for both selected analysis and broader comparisons (as described in Section IV.A.3.c).
5. MyMav tag. A tag within the MyMav student information system will be placed on pilot project courses and student participants to allow for future tracking (e.g., alumni surveys, employer surveys, and later NSSE sampling).
6. IDEA Student Rating of Instruction Tool, Part 2. The IDEA Student Rating is determined at the end of the semester when the instructor is evaluated based on the objectives he or she identified in the Faculty Information Form. Each student provides a self-report of learning based on instructor-identified objectives. The resulting reports include both ratings of instructor effectiveness and suggestions as to how the instructor can improve performance in the classroom.
7. Modified Knowledge Survey. A follow-up Modified Knowledge Survey will be administered at the end of each course.
b. Program-Level Assessment of the Pilot Projects
Data at the QEP program level will be generated from course-level estimates derived in turn from the Rasch model. These data will produce average higher order thinking skills estimates for each of the QEP Program classes and the control classes available in an academic year. For a full discussion of the program-level assessment, see Appendix A.
c. University-Level Assessment of the Pilot Projects
Finally, data will be collected at the University level to determine the overall effect of the QEP on the University as a whole. University-level assessment of the QEP will be done annually, starting with baseline data collected during the 2006-2007 academic year. These data will come from the
NSSE, FSSE, CLA, and Alumni and Employer surveys (as administered). As noted above, students who participate in active learning courses will be asked to take part in UT Arlington’s yearly NSSE and CLA, and faculty teaching the active learning sections will also be asked to take the yearly FSSE. Isolating the student and faculty data will allow for comparison with the rest of the University-wide student and faculty samples, as applicable.
The Alumni Survey and Employer Survey data will be used on an ongoing basis to determine whether there is a change in learning outcomes overall for UT Arlington graduates. These surveys should provide insights into whether the concerted effort in the twelve pilot projects and any additional secondary effects on students in other courses (resulting from heightened awareness and proliferation of various teaching support programs) leads to noticeable changes in students’ acquisition and development of higher order thinking skills.
Unit Effectiveness Plans have been sampled prior to QEP implementation to establish a baseline
(see Section II.B). Comprehensive data will be collected in spring 2007 and for each UEP cycle thereafter. These data will measure any secondary effect the QEP may have on the rest of the
University community as seen through the assessment and mapping of relevant departmental learning outcomes. As noted above, although some departmental UEPs have their program learning outcomes focused on higher order thinking levels, one effect of this project might be that these numbers increase over the course of the QEP. Furthermore, each spring, a survey of faculty members regarding their use of active learning will be conducted, building upon the baseline survey conducted in Spring 2007 (see Section II.B).
Analysis at all three levels will allow the QEP to inform our decisions as to where to allocate scarce resources, as well as how best to roll out the pilot projects as models to the larger University community in the course of the ten-year strategic planning initiative. Our decisions as to how to proceed at the end of the QEP’s three-year implementation will be informed by the data culled from the faculty’s initial efforts in the pilot projects.
B. Additional Components of the QEP
Figure 4 depicts the QEP management structure. The QEP will be overseen by a half-time QEP
Coordinator (a tenured member of the faculty). In the first year of implementation, the permanent QEP Coordinator will be assisted by the current SACS QEP Coordinator.
In addition, the permanent faculty QEP Coordinator will chair a new permanent University-
Wide Standing Committee on Higher Order Thinking and Active Learning, whose composition will be similar to that of the current QEP Steering Committee. This body will include members representing each of the following academic units, appointed by their respective deans: School of
Architecture, College of Business Administration, College of Education, College of Engineering,
College of Liberal Arts, School of Nursing, College of Science, School of Social Work, School of
Urban and Public Affairs, and the Honors College. The UT Arlington Faculty Senate will be represented on the Committee by its chair, and the Committee will also include representatives from Academic Affairs, Student Affairs, University Advising, the University Library, and the
Graduate School (each appointed by their deans/directors). The Committee will also have two student representatives (one appointed by the Vice President for Student Affairs and the other a
member of the Student Congress Executive Board, who will serve as liaison to Student
Governance). Initially the Committee will include one member from the original QEP Steering
Committee as well as the SACS QEP Coordinator to provide institutional memory and continuity. These members will rotate off after the first year. Finally, a pilot project author will serve as a representative for the three-year implementation of the QEP to ensure adequate representation of the pilot project faculty’s voice in the decision-making regarding the QEP initiative.
The Standing Committee will be charged with making recommendations on the ongoing implementation of the QEP to the Coordinator, Provost, President, and other concerned QEP parties, and will oversee the assessment of the QEP. This Committee will also be charged with exploring additional ways to promote the initiative, and will disseminate information about the implementation of the QEP to the campus community. The Committee will assist in the development of the SACS five-year impact report as well as in refining the second-round call for pre-proposals and selection of the second-round pilot projects, as discussed below.
Additional support for implementing the QEP will be provided by two newly created positions.
An Undergraduate Reporting and Assessment Specialist will be added to the Office of
Institutional Research, Planning and Effectiveness to handle the assessment necessary for the
QEP [ click here for draft job description].
A Web Developer/Database Manager will structure and oversee the compilation, storage, and maintenance of QEP-related data and maintain the active learning website portal. This position will be supported by two graduate research assistants
[click here for draft job descriptions].
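The structure of this data store is not specified in the plan. Purely as a minimal sketch of the kind of schema the Web Developer/Database Manager might set up, the following Python/SQLite snippet links pilot projects to their per-semester course-level assessment records. All table, column, and file names here are illustrative assumptions, not details drawn from the QEP.

```python
import sqlite3

# Hypothetical schema for QEP-related data; all names are illustrative only.
conn = sqlite3.connect("qep.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS pilot_project (
    project_id INTEGER PRIMARY KEY,
    pi_name    TEXT NOT NULL,   -- primary investigator, e.g., 'Aktosun'
    course_no  TEXT NOT NULL,   -- e.g., 'MATH 4394'
    al_rating  TEXT             -- active learning rating: Low / Moderate / High
);

CREATE TABLE IF NOT EXISTS course_assessment (
    assessment_id INTEGER PRIMARY KEY,
    project_id    INTEGER REFERENCES pilot_project(project_id),
    semester      TEXT NOT NULL,  -- e.g., 'Fall 2007'
    n_students    INTEGER,
    pre_mean      REAL,           -- mean pre-test score
    post_mean     REAL            -- mean post-test score
);
""")
conn.commit()
```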
Figure 4: QEP Management Structure
[Organization chart. Boxes: Provost; Associate Provost; QEP Coordinator (0.5 faculty)*; SACS QEP Coordinator (0.25 faculty, one-year transition as Advisory Consultant); Web Developer/Database Manager, supported by two GRAs; Director of Institutional Research, Planning and Effectiveness; Undergraduate Reporting & Assessment Specialist; University-Wide Standing Committee on Higher Order Thinking and Active Learning; Communities of Practice (COP), 12 PIs**.]
Notes:
*QEP Coordinator is Chair of the University-Wide Standing Committee on Higher Order Thinking and Active Learning.
**The Community of Practice consists of the twelve initial primary investigators of the QEP pilot programs, plus some members of the original Steering Committee.
2. QEP Budget
The QEP calls for both new funding (drawn from the President's Office discretionary funding and the Provost's Office discretionary academic initiative funding) and existing funding (i.e., costs shared with other units on campus that would otherwise have to be covered by the QEP program).
These categories are broken down on the Comprehensive QEP Budget Breakout in Appendix B as follows:
New funding:
• Pilot project assessments;
• QEP management team personnel costs (the QEP Coordinator, the Undergraduate Reporting and Assessment Specialist, and the Web Developer/Database Manager and two graduate research assistants);
• Communities of Practice support;
• University-wide faculty initiatives, including speakers/workshops, mini-grants, and provisions for granting faculty development leaves; and
• Travel for QEP-related personnel and faculty to attend SACS and assessment-related conferences.
Existing funding:
• Continued University-wide assessments conducted and funded by the Office of
Institutional Research, Planning and Effectiveness, including NSSE, FSSE, and Employer and Alumni Surveys;
• Costs related to the University Library/Liberal Arts pilot project; and
• The University of Texas System payment for NSSE and CLA testing.
The primary investigator of each QEP pilot project submitted a tailored project budget that will be funded at the respective requested amounts for the three-year duration of the QEP (see
Appendix C for QEP Pilot Project Budget Years 1-3 with Average Projected Costs for Years 4-10).
Additionally, each project was provided with a graduate or undergraduate teaching assistant (if requested by the primary investigators) to aid with the collection of QEP-related data. Because the University administration is committed to funding the replication and expansion of the pilot projects in year 4 and beyond at no less than the established QEP levels, funding for years 4-10 is budgeted to continue at the average rate of the pilot project costs for years 1-3.
The Annual QEP Budget Summary in Table 12 was constructed by combining the budgets submitted by the twelve pilot projects (Appendix C) with the projected costs of supporting and expanding active learning/higher order thinking initiatives across campus (Appendix B).
Table 12: Annual QEP Budget Summary (Years 1-10)

            Year 1     Year 2     Year 3     Year 4     Year 5
New         $607,294   $556,983   $491,291   $551,856   $551,856
Existing    $12,070    $22,554    $13,054    $12,070    $22,554
Total       $619,364   $579,537   $504,345   $563,926   $574,410

            Year 6     Year 7     Year 8     Year 9     Year 10    Total (Years 1-10)
New         $551,856   $551,856   $551,856   $551,856   $551,856   $5,518,560
Existing    $13,054    $12,070    $22,554    $13,054    $12,070    $155,104
Total       $564,910   $563,926   $574,410   $564,910   $563,926   $5,673,664
Note: Due to the three-year cycle for many University-wide assessments, there is a recurring pattern of existing expenditures from year 4 onward: years 4, 7, and 10 are identical; year 5 equals year 8; and year 6 equals year 9. Refer to the Comprehensive Budget Breakout (Appendix B) for details. The budget may also be adjusted in future years to account for inflationary cost increases.
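The recurring pattern described in the note can be checked arithmetically: building the ten-year series from the year 1-3 figures and the stated year 4-10 pattern reproduces the totals shown in Table 12. A small verification sketch (all figures taken directly from the table):

```python
# Rebuild the Table 12 series and confirm the printed totals.
new = [607_294, 556_983, 491_291] + [551_856] * 7   # years 1-10, 'New'
existing_cycle = [12_070, 22_554, 13_054]           # repeats every three years
existing = [existing_cycle[i % 3] for i in range(10)]

assert sum(new) == 5_518_560                        # 'New' total, years 1-10
assert sum(existing) == 155_104                     # 'Existing' total
assert sum(new) + sum(existing) == 5_673_664        # grand total
```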
3. Other University-Wide Efforts: Continued Engagement of the Faculty and Academic Community
A vital component of the QEP is that the broader faculty and academic community continue to engage with active learning pedagogy and strive to enhance students’ higher order thinking skills.
To this end, the ongoing Faculty Enhancement Program (sponsored by the Office of the Provost) will address the QEP directly through:
• Information to new faculty through orientation and mentoring programs;
• Continued recognition of teaching excellence through induction into the Academy of
Distinguished Teachers;
• Workshops, speakers, and instructional support sessions.
In addition, the QEP calls for the creation/support of the following:
• Stipends for primary investigators to develop pilot project pre-proposals into full proposals;
• Eighteen yearly instructional support mini-grants [click here for draft form of grant application];
• Faculty Development Leave(s). Current policies allow awarding FDLs for course development. FDL proposals for developing courses featuring active learning will be encouraged;
• Creating an active learning website portal providing a virtual center for resources relevant to the active learning initiative [click here for the portal];
• The QEP Community of Practice, composed of the QEP pilot project primary investigators, who will participate in ongoing sessions and listserv dialogues;
• QEP pilot project primary investigators, who will share lessons learned from their own implementation experiences through presentations, the development of specialized teaching circles organized around the pilot projects (large classes, symposia, and capstones), and the pursuit of research and publications inspired by their respective projects;
• Recognizing innovation in employing active learning strategies in the nomination and selection process for the University’s new campus-wide Innovation in Teaching
Award;
• Working with the University committee currently investigating revisions to course evaluations administered at the end of all courses, with a view to adding questions related to active learning; and
• Incorporating the QEP initiative into the University’s programming and marketing strategies.
Appendix D expands the QEP management structure to depict the organizational structure for these other University-wide efforts.
Table 13 summarizes the steps for implementing the QEP during its three-year duration;
Appendix E provides a fuller description of this timeline by semester. Implementation of the QEP pilot projects will begin in the 2007-2008 academic year, with some projects initiating during the Fall 2007 semester and others during the Spring 2008 semester; the timing of initial implementation will depend on course scheduling. Course-level assessment of each pilot project will be conducted after every semester in which the course is taught, beginning in the 2007-
2008 academic year and continuing through the 2009-2010 academic year; program-level and
University-level assessments will be performed during the summer of each academic year. As noted previously, the feedback from these various assessments will be used to identify what, if any, adjustments need to be made within individual pilot projects and for the QEP program.
Table 13: Summary Timeline for QEP Implementation
[Grid charting each implementation step across the semesters Fall 2007 through Summer 2010, with an X marking the semesters in which a step is active. The implementation steps are:]
• Implementation of pilot projects (including modifications based on prior assessment findings)
• Completion of course-level assessment for the prior semester's classes
• CLA, NSSE, FSSE (with oversample)
• Alumni survey
• Employer survey
• Annual faculty survey of active learning practices
• Incorporation of QEP assessments and approaches into UEPs
• Completion of program-level and University-level assessments
• Assessment results shared with pilot project faculty and Standing Committee for course and/or program refinements
• Biannual survey of UEPs
• Faculty Enrichment Programs / additional QEP-related initiatives
• Pilot project PIs develop and utilize teaching circles
• Employ QEP-Specific Evaluation Rubric to assess impact and institutionalization of QEP
• Standing Committee and University administration determine next steps based on all prior assessments
In Summer 2010, after the final year of the QEP, a full assessment will take place incorporating all the prior program-level and University-level assessments. In addition, the QEP-Specific
Evaluation Rubric (see Appendix F for a working draft of this rubric) will be used to examine the impact of the QEP and the institutionalization of active learning on campus along four dimensions:
• Philosophy and mission of active learning;
• Faculty support for innovation in active learning;
• Student support for and involvement in active learning; and
• Institutional support for active learning.
This analysis will provide additional information in making decisions on how to proceed (e.g., expand all or some of the pilot project models, call for new proposals, or some combination thereof).
D. Replication, Expansion, Innovation, and Institutionalization in Year 4 and
Beyond
During academic year 2010-2011, the University-Wide Standing Committee on Higher Order
Thinking and Active Learning, along with the University administration, will need to decide which projects to support in the second round. As we see it now, there are three main options:
• Option 1: Call for new second-round pre-proposals that will make use of active learning techniques to facilitate higher order thinking skills in UT Arlington students.
• Option 2: Call for second-round pre-proposals that will replicate a subset of the more successful first-round pilot projects. This will allow primary investigators to investigate more fully the effects of some of the first-round pilot projects.
• Option 3: Call for second-round pre-proposals that will permit both new uses of active learning techniques to facilitate higher order thinking skills and replication of a subset of the more successful first-round pilot projects
(i.e., combine Options 1 and 2).
Based on what was learned during the development of the QEP, the Steering Committee recommends that the following matters be considered in connection with a second call for pre-proposals:
Faculty submitting pre-proposals should discuss the following items:
• The specific higher order thinking skills that students will acquire and develop, and the related student learning outcomes;
• Active learning techniques that will facilitate the acquisition and development of the higher order thinking skills targeted in the pre-proposal; and
• Assessment tools that will be employed to measure learning outcomes (including addressing how such tools will be developed or acquired by the faculty member).
Other aspects of a successful pre-proposal should include:
• A budget that contains specific categories of expenses (e.g., salaries, graduate student stipends, equipment, supplies). Each year of the project should be addressed separately and the value of the projected expenses in each of the budget categories should be clearly connected to learning outcomes and the ability of students to acquire higher order thinking skills;
• A discussion of the replicability of the project within the primary investigator’s discipline and/or across disciplines;
• Whether the project proposed is a new proposal or a variation of a previous proposal; and
• The number of students participating in the first year and an estimate of potential growth.
Assuming a three-year implementation and assessment cycle patterned after the QEP, the second round of projects would begin in the 2011-12 academic year and culminate in a final assessment following the 2013-14 academic year. Thus, a third call for active learning course proposals could be undertaken during 2014-15, with implementation commencing in 2015-16. Appendix G sets forth a timeline for the period of years 4 through 10 following the completion of the QEP.
In spring 2012 (year 5), the University will submit an impact report to SACS laying out the findings from our assessments of the QEP and the subsequent decisions informed by these data. In year ten, an analysis of the second five-year expansion effort will take place, using the data collected from relevant active learning courses and program participants during this period, together with the University-wide assessments and rubrics discussed above, in order to inform the next University strategic planning initiative. This assessment is intended to demonstrate not only the impact of active learning on the acquisition and development of our students' higher order thinking skills, but also the fundamental institutionalization of, and cultural change in, the approach to the educational enterprise at UT Arlington.
V. The QEP Pilot Projects
Table 14 provides detail on the individual pilot projects. These working charts were developed in conjunction with the primary investigators in order to flesh out the original pre-proposals and the design of the courses. In the case of objective tests, the summative assessment method will be to embed questions designed to assess the desired learning outcome. For field journals, portfolios, essays, essay exams, oral presentations, and the like, the course instructor will develop a contextualized version of the Washington State critical thinking rubric. These rubrics will be developed in Spring 2007, along with training to ensure inter-rater reliability in their use. Finally, the pre/post-test refers to the Modified Knowledge Surveys designed by the course instructor.
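The plan does not name a statistic for inter-rater reliability. Cohen's kappa is one conventional measure for two raters scoring the same artifacts against a rubric, and a calibration session might use it as in the minimal sketch below; the rubric scores shown are invented, and nothing here is part of the QEP's actual assessment design.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters assign the same level at random.
    expected = sum(ca[level] * cb[level] for level in set(ca) | set(cb)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical rubric levels (1-4) assigned to six essays by two trained raters.
print(round(cohens_kappa([3, 2, 4, 3, 1, 2], [3, 2, 3, 3, 1, 2]), 2))  # 0.76
```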
Table 14: Pilot Project Summaries
Primary Investigator: Aktosun
Pilot Project Name: Active Learning through URE Courses
Active Learning Rating(s): [High] -- (V) Problem Based Learning
Time on Task: Full
Course Abstract: Course No.: MATH 4394 – Undergraduate Research Experiences
In order to create an active learning environment for our undergraduate mathematics students, we propose to introduce an Undergraduate Research
Experiences (URE) course into our curriculum. Students enrolled are expected to perform meaningful research under the supervision and mentorship of active faculty researchers. This course aims at training students to improve their oral, written, computational, and presentation skills; to analyze a specific research problem and formulate it as a mathematical problem; to apply new and prior knowledge to solve the mathematical problem; to translate the mathematical solution by giving a physical interpretation to it; and to analyze the shortcomings of the mathematical model used, determine how realistic the model is, and to compare the solution with those obtained by others or from other methods. This course creates an open learning environment that is based on inquiry rather than on passive learning and one that challenges students to think, to use new and prior knowledge, to collaborate with peers and faculty, and to explain both the problem and the solution to others. The goals of this course will be assessed by direct and indirect methods. Direct methods of assessment will include the tracking of student demographics, the structure of each URE course, the research performed, basic information on the nature of the supervision, student research papers and presentations, and the academic progress of the students after the course. A rubric will be developed to ensure common standards are articulated and evaluated on both the oral and written components. Indirect methods of assessment will include end-of-semester student surveys that will ask about goal attainment, recommendations for future courses, and will allow the students to express how they felt about their research topic and advisor, whether they felt they had sufficient supervision, and how learning in this course compares with learning in a traditional course.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Use prior knowledge from calculus, linear algebra, and differential equations in order to analyze a specific research problem.
   Learning activities/formative assessment: work in groups to discuss with peers and summarize the fundamental results from prior knowledge areas.
   Thinking skills emphasized: Application.
   Summative assessment methods: evaluate short oral presentations to peers and written summary outlines by employing a contextualized critical thinking rubric.
2. Formulate the research problem as a mathematical problem in order to reach an appropriate resolution.
   Learning activities/formative assessment: work in groups to identify all the variables that may affect the solution, to discuss how such mathematical variables can be interrelated, to identify prior relevant research published in the literature, and to learn about library and other online resources.
   Thinking skills emphasized: Comprehension.
   Summative assessment methods: evaluate the introductory portion of a technical report by employing a contextualized critical thinking rubric.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
• Identify basic principles from such areas as complex variables, partial differential equations, and numerical analysis, and synthesize these with prior knowledge in order to solve the mathematical problem.
• Translate the mathematical solution in order to give a physical interpretation to it.
• Analyze the shortcomings of the mathematical model used in order to determine how realistic the model is.
• Compare the solution with those obtained by others or from other methods.
• Identify professional behaviors associated with mathematicians in order to discuss how professional mathematical scientists operate in academia and in industry.

Learning Activities/Formative Assessment
• Work in groups to discuss the research problem under study intensively with the instructor.
• Conduct laboratory work to learn symbolic and numerical implementation using the software packages Mathematica and Matlab.
• Work in groups to discuss and explain the mathematical result by using fundamental ideas from physics or other fields.
• Work in groups to discuss the reasonableness of the solution and how additional factors may complicate the problem under study.
• Work in groups to discuss how scientific results are disseminated and how scientific research is funded, and to consider the importance of combining research and education.

Thinking Skills Emphasized in Outcome and Learning Activities
• Knowledge and Synthesis (secondary: Application)
• Comprehension
• Analysis and Evaluation
• Knowledge

Summative Assessment Methods (each employing a contextualized critical thinking rubric)
• Evaluate the main portion of the joint technical report and the use of computer programs for numerical and symbolic computations.
• Evaluate the portion of the joint technical report discussing the findings.
• Evaluate the final section of the joint report, discussing possible generalizations and future work, and completion of the joint technical report.
• Evaluate the students' preparation of transparencies/posters and scientific abstracts, and their respective public presentations.

Semester      Class       Research Design
Fall 2007     MATH 4394   Pre-Post Test
Spring 2008   N/A         N/A
Fall 2008     MATH 4394   Pre-Post Test
Spring 2009   N/A         N/A
Fall 2009     MATH 4394   Pre-Post Test
Spring 2010   N/A         N/A
Primary Investigator: Boardman
Pilot Project Name: A Controlled Study on Classroom Response Technology for Large Classes
Active Learning Rating(s): [Low] -- (J) Computer Based Interaction Systems (personal response system); (G) Application Activity
Time on Task: Semester
Course Abstract: Course No.: XE 1104 – Introduction to Engineering
A classroom response system (CRS) will be used in the introductory engineering course XE 1104. By using the CRS as an assessment tool, students receive instant feedback on their responses, and the instructor can easily monitor student absorption and retention of the wide variety of topics covered by this critical course. As an introductory course, XE 1104 exposes students to engineering ethical issues and mandates that they work in teams of four, with each team required to have at least three of the engineering departments represented. By the end of the semester students should be able to: distinguish between the different engineering disciplines in order to identify the discipline that best fits their strengths and weaknesses; identify ethical and professional behaviors in order to assess and judge provided contemporary ethical case studies; communicate clearly and effectively in writing in order to practice professional behavior that will be expected in their careers; and work within a multi-disciplinary team in order to design and construct a project. The objective of this project is to assess the effectiveness of the CRS in increasing active learning and improving student learning outcomes by targeting three specific assessment areas: classroom learning, multi-disciplinary teamwork, and engineering ethics. These three goals will be assessed using student responses to questions on tests, course evaluations, end-of-semester peer evaluations, and student exam questions.
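The abstract lists the data sources but not the statistical comparison. One straightforward analysis, given the one-experimental/two-control design described below, is an independent-samples t-test on pre/post score gains between CRS and non-CRS sections. A sketch under that assumption (all scores are invented placeholders, not project data):

```python
from scipy import stats

# Hypothetical (pre, post) embedded-exam scores for students in a CRS
# section and in a control section; real data would come from XE 1104.
crs     = [(55, 78), (60, 74), (48, 70), (62, 80), (51, 69)]
control = [(57, 69), (59, 68), (50, 61), (61, 72), (52, 60)]

crs_gain     = [post - pre for pre, post in crs]
control_gain = [post - pre for pre, post in control]

# Did the CRS section gain significantly more, on average?
t, p = stats.ttest_ind(crs_gain, control_gain)
print(f"t = {t:.2f}, p = {p:.3f}")
```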
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Distinguish between the different engineering disciplines in order to identify the discipline that best fits their strengths and interests.
   Learning activities/formative assessment: clicker questions; quizzes; self-tests; professional organization meetings.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: exam questions; course evaluations.
2. Identify ethical and professional behaviors in order to assess and judge provided contemporary ethical cases.
   Learning activities/formative assessment: clicker questions; ethics case study discussion; quiz bowl (students compete in teams to answer questions).
   Thinking skills emphasized: Knowledge; Evaluation (secondary).
   Summative assessment methods: exam questions; course evaluation; case study written exercise (assessed using a contextualized critical thinking rubric).
3. Apply teamwork skills in order to design and construct in-class and out-of-class projects effectively within interdisciplinary groups.
   Learning activities/formative assessment: clicker questions; team contract; building a spaghetti water tower; developing a preliminary project report; ongoing peer evaluations.
   Thinking skills emphasized: Application.
   Summative assessment methods: project demonstration, final project report, and log/reflection books (each assessed using a contextualized critical thinking rubric); peer evaluations.
Semester      Class(es)                      No. of Students   Research Design
Fall 2007     XE 1104                        650               1 experimental section (CRS); 2 control sections (no CRS); Pre-post Test
Spring 2008   XE 1104                        600               1 experimental section (CRS); 2 control sections (no CRS); Pre-post Test
Fall 2008     XE 1104, IE 3312*              600               All CRS
Spring 2009   XE 1104, IE 3312*              650               All CRS
Fall 2009     XE 1104, IE 3312*, IE 3315**   700               All CRS
Spring 2010   XE 1104, IE 3312*, IE 3315**   700               All CRS
* Reflects plans for initial phase of expansion of model.
** Reflects plans for second phase of expansion of model.
Primary Investigator: Cole
Pilot Project Name: Personal Response Systems: Active Learning in a Large Lecture Hall
Active Learning Rating(s): [Low] -- (J) Computer Based Interaction Systems; (G) Application Activity [Primary]; Muddiest Point; Think/Pair/Share
Time on Task: Semester
Course Abstract: Course No.: HIST 1311 – United States History to 1865
This project will involve the re-development of a core requirement course, usually taught in a large lecture hall, to learn more about the benefits of a personal response system. This relatively new technology employs handheld devices (or "clickers") and innovative software, which together present the entire class's responses to instructor-posed questions instantly in a graph while also keeping track of individual responses. A personal response system allows an instructor to engage students during lecture, assess their comprehension, and track student learning over time, even in large courses where students are often inclined to remain anonymous and passive. Clicker technology in a humanities course like history can test students' ability to apply knowledge, but it can also engage students in their own historical analysis. This project will center on reorganizing the course around primary sources, with the aim of helping students, in the process of making their own interpretations, discover how well-informed and well-intentioned historians sometimes arrive at different understandings of the past. In addition to creating a more interactive course, this project will rely on intensive tracking that relates students' in-class participation to subsequent performance on class assignments and exams. Embedded test questions, as well as a rubric used to analyze their essay exams, will allow for further evaluation of the students' progress toward the course's desired learning outcomes. The overall goal will be to use these active learning strategies to teach introductory students not only influential interpretations of the development of American society and politics prior to the Civil War, but also how historians go about constructing those interpretations.
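How in-class participation will be related to later performance is not spelled out in the abstract. A simple first pass is a Pearson correlation between each student's clicker-response rate (which the software logs automatically) and exam score; a sketch with invented data follows.

```python
from scipy.stats import pearsonr

# Hypothetical per-student records: fraction of clicker questions answered
# over the semester, and final exam score (0-100).
participation = [0.95, 0.60, 0.80, 0.40, 0.75, 0.90, 0.55]
exam_score    = [88,   70,   81,   65,   74,   92,   68]

r, p = pearsonr(participation, exam_score)
print(f"r = {r:.2f}, p = {p:.3f}")  # correlation, of course, is not causation
```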
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Restate well-known explanations of past events in order to identify how other historians have explained transformations in colonial and antebellum U.S. history.
   Thinking skills emphasized: Comprehension (primary).
   Summative assessment methods: embedded exam questions (for both control and active learning sections); response questions mapped to higher order thinking skills.
2. Identify and distinguish between components of competing (and complementary) explanations of past events in order to apply critical analysis to important transformations in colonial and antebellum United States history.
   Learning activities/formative assessment: ask students to infer how a social, economic, or political historian would read a primary source; students create fake primary sources that would justify a political, social, or similar interpretation of an event; students debate two different explanations for an event.
   Thinking skills emphasized: Analysis (primary); Application (secondary).
   Summative assessment methods: essay exam assessed using a contextualized critical thinking rubric.
3. Evaluate competing explanations, choosing and/or synthesizing the most appropriate one(s), in order to present a coherent argument about what caused significant changes in colonial and antebellum American history.
   Thinking skills emphasized: Synthesis (primary).
   Summative assessment methods: essay exam assessed using a contextualized critical thinking rubric.

Semester      Class       Research Design
Fall 2007     HIST 1311   1 experimental section (CRS); 1 control section (no CRS); Pre-Post Test
Spring 2008   N/A         N/A
Fall 2008     HIST 1311   1 experimental section (CRS); 1 control section (no CRS); Pre-Post Test
Spring 2009   N/A         N/A
Fall 2009     HIST 1311   1 experimental section (CRS); 1 control section (no CRS); Pre-Post Test
Spring 2010   N/A         N/A
Primary Investigator: Deen
Pilot Project Name : Democracy in Action: Understanding the Democratic Process through an Analysis of Political Campaigns
Active Learning Rating(s): [Moderate] – (G) Application Activity [Primary]; (L) Small Group Presentations/Discussions
Time on Task: Module within course integrated throughout the three sections of the course
Course Abstract: Course No.: POLS 2311 U.S. Government
POLS 2311 U.S. Government is a required course for all undergraduate students at the University. In this project, students will participate in an active learning module on political campaigns and elections. The objective of this project is for students to better understand the electoral process and to assess the impact of contemporary campaigning on the democratic process. Working in small groups, students will conduct research on candidates running for political office (e.g., state representative, city council, county judge, district attorney, or perhaps even U.S. Congress), engage in and reflect on presentations from guest speakers regarding these races, evaluate the campaigns, and present their analysis to their classmates. This multi-pronged learning strategy allows students to develop important research and analytical skills by engaging actively with the course material through "real world" experiences. A common assessment rubric will be developed to ensure that the faculty involved are utilizing consistent standards to measure students' performance. Each presentation and essay will be examined using this rubric to ascertain the level of higher order thinking at which each student is operating. A pretest/posttest and embedded questions will also be employed to gauge the progression of students' skills.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

Section I of Course
1. Identify the components of classical Democratic Theory in order to discuss U.S. Government.
   Learning activities/formative assessment: read texts; respond to the professor's questions on the relationship of classical Democratic Theory to U.S. Government structures.
   Thinking skills emphasized: Comprehension.
   Summative assessment methods: closed and/or open-ended test questions.
2. Identify the components of contemporary U.S. political culture in order to discuss the context of democracy in the U.S.
   Learning activities/formative assessment: read texts; respond to the professor's questions; identify examples of shared values in popular culture (magazines, ads, newspapers).
   Thinking skills emphasized: Comprehension.
   Summative assessment methods: closed and/or open-ended test questions.
Section II of Course
3. Identify the components of the electoral environment that candidates for political office face in order to discuss the electoral process.
   Learning activities/formative assessment: read texts; respond to the professor's questions; keep reflection and dialogue notes on guest speakers; work in small groups to research political campaigns.
   Thinking skills emphasized: Comprehension.
   Summative assessment methods: closed and/or open-ended test questions.
4. Identify the role of the media in U.S. politics in order to discuss U.S. political culture and democracy.
   Learning activities/formative assessment: read texts; respond to the professor's questions; consume media to identify examples of the role of media in popular culture, looking for positive/negative framing.
   Thinking skills emphasized: Comprehension.
   Summative assessment methods: closed and/or open-ended test questions.
5. Analyze candidates' election materials in order to discuss the electoral process.
   Learning activities/formative assessment: read texts; respond to the professor's questions; work in small groups to compare across candidates' literature; work in small groups to classify examples of campaign advertisements (e.g., yard signs, mailers, bumper stickers) by effectiveness, target audience, allocation of resources, and positive/negative framing.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: closed and/or open-ended test questions.
6. Compare/contrast candidates' campaign literature with media reports of the campaign in order to evaluate their congruence, assess the role of the media in elections, and judge the viability of the campaign.
   Learning activities/formative assessment: read texts; respond to the professor's questions; compare and contrast candidate literature with media news reports; work in small groups to distinguish among newspaper endorsements to explain sources of support for candidates.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: closed and/or open-ended test questions.
Section III of Course
7. Synthesize their understanding of course concepts with their research on candidates' campaigns in order to apply democratic theory to real-life politics.
   Learning activities/formative assessment: create an oral presentation showing the application of course concepts to the candidates' campaigns.
   Thinking skills emphasized: Synthesis.
   Summative assessment methods: closed and/or open-ended test questions; reflective writing assignment using a contextualized assessment rubric.
8. Judge the electoral viability of a candidate's campaign in order to apply democratic theory.
   Learning activities/formative assessment: assess the political viability of candidates through a small-group oral presentation; develop and implement a viability rating scale.
   Thinking skills emphasized: Evaluation.
   Summative assessment methods: closed and/or open-ended test questions; reflective writing assignment using a contextualized assessment rubric.

Semester      Class       Research Design
Fall 2007     POLS 2311   Control group; Pre-Post test
Spring 2008   POLS 2311   Experimental group; Pre-Post test
Fall 2008     POLS 2311   Experimental group; Pre-Post test
Spring 2009   POLS 2311   Control group; Pre-Post test
Fall 2009     POLS 2312   Comparison group for evaluation of possible lecturer effect; Pre-Post test
Spring 2010   POLS 2311   Experimental group; Pre-Post test
Primary Investigator: Dillon
Pilot Project Name : An Instant-Response Homework System for Engineering Classes
Active Learning Rating(s): [Low] -- (A) Cold-calling on a student for a simple question based on a lecture point recently made; (G) Application Activity
Time on Task: Semester
Course Abstract: Course No.: EE 2315 – Circuit Analysis I
Homework problems assigned to engineering students help develop problem-solving skills with engineering models by requiring the students to calculate answers based on a supplied set of numerical parameters. Unfortunately, students have often lost focus on the problems by the time the assignment is returned to them. A solution to this problem is to have students complete homework on-line using a fully interactive program. The program generates random values for the model parameters so that each student gets a unique version of each assigned problem. Answers are automatically calculated, stored in a password-protected database under the student's user name, and compared with the student's submitted answers. This provides students with instant feedback that will contribute to critical thinking by quickly rejecting answers obtained by false reasoning or incorrect assumptions. In addition, collaboration between students will become a teaching/learning experience rather than just an exchange of data, since no two students will have exactly the same parameters. Having completed EE 2315, students are expected to be able to identify the fundamental laws of electric circuit analysis in order to effectively use the underlying concepts for analyzing electric circuits, and to use the basic methods of circuit analysis to efficiently formulate solutions for complex circuit problems. To assess whether these learning outcomes have been met, two sections of EE 2315 will be offered in the fall of 2007 with the same lectures. One section will be given conventional homework assignments, while the other will use the web-based system for homework. The evaluation of student learning will be based on formative in-class questioning, embedded examination questions and results, homework grades and tracking of performance on them, and class attendance. Using a dependable set of rubrics, we will be able to measure the students' abilities to apply knowledge of math, science, and engineering as well as to solve engineering problems.
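As a concrete illustration of the mechanism the abstract describes, the sketch below seeds a random generator from the student's user name so that each student always sees the same unique parameters for a given problem, computes the correct answer from those parameters, and grades a submission within a small numerical tolerance. The function names, the Ohm's-law example, and the 1% tolerance are all assumptions for illustration, not details of the actual system.

```python
import hashlib
import math
import random

def problem_params(username: str, problem_id: str) -> dict:
    """Deterministic per-student parameters: the same student always gets
    the same numbers, but different students get different ones."""
    seed = int(hashlib.sha256(f"{username}:{problem_id}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return {"V": rng.randint(5, 24), "R": rng.choice([100, 220, 470, 1000])}

def check_answer(username: str, problem_id: str, submitted: float) -> bool:
    """Compare a submitted current against the auto-computed answer I = V/R."""
    p = problem_params(username, problem_id)
    correct = p["V"] / p["R"]
    return math.isclose(submitted, correct, rel_tol=0.01)  # assumed 1% tolerance

p = problem_params("jdoe", "hw1-q3")
print(p, check_answer("jdoe", "hw1-q3", p["V"] / p["R"]))  # parameters, True
```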
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Identify the fundamental laws of electric circuit analysis in order to effectively use the underlying concepts for analyzing electric circuits.
   Learning activities/formative assessment: on-line homework (control section: out of the book); ask a specific student (by name) to answer a fundamental question about a recently explained topic; mini-projects: design simple elements (resistors, inductors, and capacitors).
   Thinking skills emphasized: Knowledge and Application.
   Summative assessment methods: exam questions (same in both control and experimental sections).
2. Use basic methods of circuit analysis in order to efficiently formulate solutions for complex circuit problems.
   Learning activities/formative assessment: on-line homework (control section: out of the book); ask a specific student (by name) to answer a fundamental question about a recently explained topic; mini-projects: design circuits using previously designed elements.
   Thinking skills emphasized: Application.
   Summative assessment methods: exam questions (same in both control and experimental sections).
Semester      Class     No. of Students   Research Design
Fall 2007     EE 2315   60-80             Two sections: one control group, one experimental group; Pre-Post test
Spring 2008   N/A       N/A               N/A
Fall 2008     EE 2315   60-80             Two sections: one control group, one experimental group; Pre-Post test
Spring 2009   N/A       N/A               N/A
Fall 2009     EE 2315   60-80             Two sections: one control group, one experimental group; Pre-Post test
Spring 2010   N/A       N/A               N/A
Primary Investigators: Hirtle & Wiggins
Pilot Project Name : New Media Literacy for a New Generation of Learners
Active Learning Rating(s): [High] -- (V) Cooperative Learning/Problem Based Learning
Time on Task: Semester
Course Abstract: Course No.: LIST 4343 – Content Area Reading and Writing
Today's new generation of learners is digitally literate, connected, immediate, experiential, social, and highly interactive and collaborative. Yet there is often a disconnect between the way the Net Generation is taught and the way its members go about learning, problem solving, and communicating in their daily lives. LIST 4343 aims to illustrate the importance of reading, writing, speaking, and listening across the curriculum, and it explores methods of teaching in all of these areas for grades 4-12. This study seeks to understand what happens when digital tools, such as blogs, podcasts, and vodcasts, are utilized, and to determine what impact these tools have on students' ability to understand and employ these skills within their teaching. To assess the impact of employing digital tools, data on enrolled students will be collected and evaluated using ethnographic software that analyzes textual, graphic, audio, and video data. Blogs and face-to-face meetings will also be used in order to assess the impact of digital tools on pedagogical comprehension in content area reading and writing across the content areas. Furthermore, a rubric will be developed to evaluate the acquisition of these skills in the students' final research projects.
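The ethnographic analysis software is not named in the abstract. Purely as a toy illustration of the kind of text coding such tools support, the sketch below tallies verbs loosely associated with Bloom's levels in a blog post; the verb lists and the whole approach are illustrative assumptions, far cruder than the project's actual qualitative analysis would be.

```python
import re

# Toy coding scheme: verbs loosely associated with four of Bloom's levels.
BLOOM_VERBS = {
    "application": {"apply", "use", "implement", "demonstrate"},
    "analysis":    {"compare", "contrast", "examine", "differentiate"},
    "synthesis":   {"design", "construct", "create", "integrate"},
    "evaluation":  {"judge", "critique", "justify", "assess"},
}

def code_post(text: str) -> dict:
    """Count occurrences of each level's verbs in one blog post."""
    words = re.findall(r"[a-z]+", text.lower())
    return {level: sum(w in verbs for w in words)
            for level, verbs in BLOOM_VERBS.items()}

print(code_post("I plan to integrate podcasts and assess how students apply them."))
```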
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
• Distinguish the important features of digital tools such as podcasting and blogging in order to increase cross-discipline dialogue.
• Distinguish the importance of podcasting and blogging in order to construct literacy learning in their content area.
• Apply their knowledge of literacy strategies in order to create practical applications in other content areas.
• Use insights from their cross-applications in order to create practical applications in other content areas.

Learning Activities/Formative Assessment
• Blog with each other, discussing how literacy can affect and enhance pedagogy in their content area.
• Blog practical applications of the various literacy strategies presented in each content area: based on their discussions, what specific activities can they incorporate in their content area?
• Mini-lesson as part of the literacy strategy presentation.

Thinking Skills Emphasized in Outcome and Learning Activities
Comprehension; Application.

Summative Assessment Methods
• Journal responses exploring relevant experience and knowledge of the student's content area and its relation to literacy, assessed using a contextualized critical thinking rubric.
• Literacy strategy presentations, with follow-up class or digital discourse explicating the use of literacy strategies in multiple content areas, analyzed for higher order/critical thinking skills reflecting knowledge of the application of literacy strategies in teaching and learning in the content areas; a contextualized critical thinking rubric will be used to assess these presentations.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
• Connect literacy learning with their content area in order to dialogue effectively with their peers in other content areas.
   Learning activities/formative assessment: blog/dialogue with each other across content areas on how literacy is relevant to them; create a concept map on each of the major content areas showing literacy as the center of the map.
   Thinking skills emphasized: Synthesis.
   Summative assessment methods: concept map presentations diagramming student understanding of the importance of literacy strategies in learning, followed up in class or digitally; these will be analyzed using a contextualized critical thinking rubric for connections made to the student's own content area in addition to other content areas. Students should demonstrate understanding and synthesis of interdisciplinary connections.
• Diagram the cross-cultural nature of literacy in order to analyze the relevance of literature in their content areas.
   Learning activities/formative assessment: blog/dialogue with each other to scaffold the pedagogical implications of literacy in their future practice.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: the class discussions in the control group and the class discussions and blogs in the experimental group will be examined using a contextualized critical thinking rubric for connections made to literacy in students' future practice; students will be required to develop a concept map connecting the content areas to their specific discipline within their journal response.
• Evaluate the value of literacy in order to incorporate it into their daily pedagogical practice.
   Learning activities/formative assessment: incorporate what they learned from the blogs into their final lesson plan.
   Thinking skills emphasized: Evaluation.
   Summative assessment methods: final lesson plan utilizing literacy strategies, with follow-up class or digital discussions, analyzed to determine how students incorporated literacy learning strategies in the various content areas.
Semester      Class       No. of Students   Research Design
Fall 2007     LIST 4343   35                Pre-Post test; analysis across semesters
Spring 2008   LIST 4343   35                Pre-Post test; analysis across semesters
Fall 2008     LIST 4343   35                Pre-Post test; analysis across semesters
Spring 2009   LIST 4343   35                Pre-Post test; analysis across semesters
Fall 2009     LIST 4343   35                Pre-Post test; analysis across semesters
Spring 2010   LIST 4343   35                Pre-Post test; analysis across semesters
Primary Investigator: Martin
Pilot Project Name : RN-BSN Capstone Seminar
Active Learning Rating(s): [High] -- (V) Cooperative Learning/Problem Based Learning [Primary]; (Q) Peer Teaching; (T) Cooperative Cases
Time on Task: Semester
Course Abstract: Course No.: NURS 4382 – Capstone Seminar, RN-BSN Program
This course is for the RN student who has enrolled to obtain a Bachelor of Science in Nursing. The course is offered simultaneously on the main campus as well as in remote locations. The enhanced version of the Capstone Seminar will require all students to give an oral presentation to meet an identified agency's need for education. Students will choose their presentation subject by assessing the agency's need for education and the current literature on timely topics in nursing and health care that are related to the identified needs of the agency. They will share these articles and what they learned from them at their agency site and with their fellow students in online discussions in which students from different sites are grouped together. Students will write a literature review on their chosen topic; once completed, this review will be used to construct a PowerPoint presentation to educate clients and/or staff in their chosen agency on that topic. These opportunities allow students the chance to broaden their knowledge by learning from each other, agency workers, and faculty members, as well as to employ the full range of their academic experience. To assess whether the goals are met, students' class discussion, online discussion, and the content and delivery of their presentations will all be assessed using an evaluation rubric. Written appraisals from the agency on the quality of the presentation and the usefulness and immediate applicability of the information provided will be gathered to judge the effect the students' presentations had on the agencies. This aspect of evaluation will demonstrate not only the effectiveness of the presentation, but also the relevance of the choice of topics. Information will also be collected from the students as to their confidence level after presenting and their likelihood of repeating this behavior. Additional plans call for tracking these students' changes in job settings after graduation and the number of RN-BSN students who progress to master's and doctoral programs, juxtaposed with learning styles and other criteria determined at the beginning of the QEP program. This latter assessment would serve to evaluate whether the outcomes desired by the overall program are met and how best to utilize this capstone experience.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Identify and analyze a need for education within an agency in order to recommend appropriate action.
   Learning activities/formative assessment: in a presentation, use previously acquired skills to determine a need the agency has for education of clients or staff.
   Thinking skills emphasized: Analysis and Evaluation (note: to identify in this context is at an evaluation level).
   Summative assessment methods: choice of educational need will be assessed as part of the presentation using a contextualized critical thinking rubric.
2. Synthesize peer-reviewed literature in order to use the information in a clinical agency.
   Learning activities/formative assessment: search for literature that speaks to a need for change within an agency with which the student works or has worked; write a critical analysis using peer-reviewed literature.
   Thinking skills emphasized: Synthesis.
   Summative assessment methods: the critical analysis paper will be assessed using a contextualized critical thinking rubric.
3. Evaluate the appropriate communication level for the target audience's educational experience in order to improve communication with staff, administration, or clients in their practice area.
   Learning activities/formative assessment: assess the communication level of the audience through various methods, such as interviews or review of agency materials.
   Thinking skills emphasized: Evaluation.
   Summative assessment methods: the communication level will be evaluated as part of the presentation using a contextualized critical thinking rubric.
4. Create a presentation in order to satisfy an identified need within an agency.
   Learning activities/formative assessment: create and present a PowerPoint presentation to satisfy the identified educational need.
   Thinking skills emphasized: Application.
   Summative assessment methods: presentation effectiveness will be assessed using a contextualized critical thinking rubric.
5. Analyze clinical research articles in order to determine the research's usefulness in evidence-based practice.
   Learning activities/formative assessment: class and online discussions; survey administered at the beginning and end of the class.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: class and online participation will be assessed using a contextualized critical thinking rubric.

Semester      Class       Research Design
Fall 2007     NURS 4382   Pre-Post test; analysis across semesters
Spring 2008   NURS 4382   Pre-Post test; analysis across semesters
Fall 2008     NURS 4382   Pre-Post test; analysis across semesters
Spring 2009   NURS 4382   Pre-Post test; analysis across semesters
Fall 2009     NURS 4382   Pre-Post test; analysis across semesters
Spring 2010   NURS 4382   Pre-Post test; analysis across semesters
Primary Investigators: Maruszczak & Rusher
Pilot Project Name : Interactive Digital Portfolio
Active Learning Rating(s): [High] -- (V) Cooperative Learning/Problem Based Learning
Time on Task: Semester
Course Abstract: Course No.: Arch 4395 – Selected Topics in Architecture (Interactive Digital Portfolio)
This course will provide a conceptual framework for the development of an interactive digital portfolio. The digital portfolio is used by the design disciplines to document creative work. This course comprises the Intermedia Toolbox, Active Archive, and Adaptive (Live) Portfolio, and culminates in a Personal Education Portfolio. The class will be structured as a series of individual and group active learning exercises and pedagogies that build toward a living document, simultaneously developing digital proficiencies and critical thinking skills. The Personal Education Portfolio will be a live, ongoing portfolio of the student's own relationship to learning, using interdisciplinary practices, new multimedia methods, and contemporary communication models. Assessment of students will be based on their ability to synthesize the four modules: acquiring digital proficiencies, incorporating collaborative and composite designs and interdisciplinary research, developing adaptive models/case studies, and synthesizing these into a new Interactive Educational Portfolio.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

Section I: Intermedia Toolbox
1. Describe the pedagogical typologies embedded in their work in order to design a performance-based start-up manual.
   Learning activity/formative assessment: Pedagogical Theater. Thinking skills emphasized: Comprehension.
2. Explain and assess the interrelatedness of their own work in order to effectively manipulate digital video to convey concepts and design intent and to communicate ideas.
   Learning activity/formative assessment: Captured Performances. Thinking skills emphasized: Application/Evaluation.
3. Adapt and react to intense, rapid-fire analytical exercises that challenge the student in order to quickly develop communication ideas in an impromptu manner.
   Learning activity/formative assessment: Graphic Software Sets. Thinking skills emphasized: Synthesis/Application.
4. Create digital environments through 3D in order to model ideas.
   Learning activity/formative assessment: 3D Modeling/Animation. Thinking skills emphasized: Synthesis.
5. Develop organizational structures for the portfolio in order to recognize its limitations and possibilities.
   Learning activity/formative assessment: Interactive Interfaces. Thinking skills emphasized: Synthesis.
Summative assessment methods (Section I), using a contextualized critical thinking rubric: the student's ability to categorize elements of work into pedagogical types; level of conceptual sophistication; ability to make critical assessments and adjustments to work; completion/adaptability/decision making/conceptual notation generation.
Section II: Active Archive
1. Gather and categorize graphic imagery in order to compile it at a professional level.
   Learning activity/formative assessment: indexing, collecting, and categorizing materials. Thinking skills emphasized: Analysis.
2. Compose digitally in order to create a dynamic scroll of imagery.
   Learning activity/formative assessment: documentary work: archiving, recalling, and composing materials. Thinking skills emphasized: Application.
3. Design a method in order to filter and evaluate a body of work.
   Learning activity/formative assessment: cognitive assessment: filtering, reflecting on, and designing materials. Thinking skills emphasized: Synthesis.
4. Connect ideas associated with projects in order to create definitions of typologies.
   Learning activity/formative assessment: typological analysis: establishing connections and relationships; understanding logical links. Thinking skills emphasized: Synthesis.
Summative assessment methods (Section II), using a contextualized critical thinking rubric: choice of projects selected will demonstrate the thought process of revealing one's self through a body of work; resolution/design logic; a written essay describing the process of filtering, design methodology, and reassessment of the body of work; definitions of typologies.

Section III: Interactive Live Portfolio
1. Develop a critical path in order to design, in a collaborative fashion, an electronic adaptive portfolio.
   Learning activities/formative assessment: electronic blackboards with cognitive maps; studio crews with interactive think tanks; Exquisite Corpse with interactive design. Thinking skills emphasized: Synthesis.
Summative assessment methods (Section III), using a contextualized critical thinking rubric: flexibility/adaptability/legibility; advancement of concepts and digital skills via cross-pollination; adaptability of design and student in a group setting.
Section IV: Personal Educational Portfolio
1. Self-evaluate their own work and group-evaluate colleagues' work in order to create a retrospective personal portfolio.
   Learning activities/formative assessment: retrospective self critique, group critique, and final analysis of work; synthesis of work using/in digital media. Thinking skills emphasized: Evaluation.
Summative assessment methods (Section IV), using a contextualized critical thinking rubric: the final digital portfolio.

Semester      Class       No. of Students   Research Design
Fall 2007     Arch 4395   20                Pre-Post test; analysis across semesters
Spring 2008   Arch 4395   20                Pre-Post test; analysis across semesters
Fall 2008     Arch 4395   20                Pre-Post test; analysis across semesters
Spring 2009   Arch 4395   20                Pre-Post test; analysis across semesters
Fall 2009     Arch 4395   20                Pre-Post test; analysis across semesters
Spring 2010   Arch 4395   20                Pre-Post test; analysis across semesters
Primary Investigator: McMahon
Pilot Project Name : Freshman/Sophomore Seminar Course
Active Learning Rating(s): [Moderate] -- (L) Small Group Presentations/Discussions [Primary]
Time on Task: Semester
Course Abstract: Course No.: HONR 2103 – Honors Special Topics
The Honors College will develop a freshman/sophomore and a junior/senior-level one-hour Honors undergraduate seminar to be offered every fall and spring semester on interdisciplinary topics attractive to Honors and Honors-qualified students from a variety of academic disciplines. Seminar topics will eventually be selected from proposals submitted by UT Arlington faculty; however, the initial phase of the project will utilize two courses. The freshman/sophomore course will be entitled "Honors Seminar" and the junior/senior course will be entitled "Advanced Honors Seminar." Individual seminar topic titles will be assigned under these main titles each semester the seminars are taught. These seminars will foster a learning environment in which students examine a focus topic, participate in interactive classroom discussions, freely exchange ideas and opinions, and explore new ideas. Students completing this course are expected to have: 1) knowledge of major aspects of the seminar topic; 2) the ability to critically read and discuss seminar materials and subject matter; 3) the ability to critically assess an assigned topic and develop critical questions and statements as the basis for group discussion; 4) the ability to carry out independent research, develop and deliver an oral presentation, and moderate a seminar discussion on an assigned topic; and 5) the ability to express and discuss opinions regarding an assigned topic with the instructor and peers in an interactive group setting. Student learning outcomes will be assessed using qualitative survey documents for students and faculty facilitators, as well as evaluation of the students' oral reports, short reports, and essays using a common grading rubric. In addition, instructors will assess students on their level of preparation for their oral presentations and for leading class discussion, as well as on their level of participation in the seminar.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…
1. Identify key course concepts in order to more effectively analyze a contemporary and/or historical issue/event.
   Learning activities/formative assessment: weekly reading assignments; class discussion.
   Thinking skills emphasized: Knowledge.
   Summative assessment methods: weekly short tests on assigned readings; student responses and topic discussion during seminar sessions, assessed by the instructor using a contextualized critical thinking rubric.
2. Analyze the assigned subject matter in order to develop questions for group discussion.
   Learning activities/formative assessment: preparation of questions for class discussion from reading assignments; class discussion.
   Thinking skills emphasized: Analysis.
   Summative assessment methods: evaluation of the appropriateness of student-prepared questions on the seminar topic by the instructor, using a contextualized critical thinking rubric.
3. Synthesize and apply conclusions from group discussions in order to create their own positions.
   Learning activities/formative assessment: drawing conclusions based on class discussion of reading assignments.
   Thinking skills emphasized: Synthesis and Application.
   Summative assessment methods: periodic short reports in which students state their conclusions and positions on specific topics addressed in the seminar, assessed by the instructor using a contextualized critical thinking rubric.
4. Evaluate their own conclusions by comparing them with peers' in order to identify the most effective conclusions.
   Learning activities/formative assessment: class discussion of assigned topic areas; development of final conclusions regarding the seminar topic in the final essay.
   Thinking skills emphasized: Analysis and Evaluation.
   Summative assessment methods: final essay describing overall conclusions regarding the seminar topic, assessed by the instructor using a contextualized critical thinking rubric.
Schedule: Fall 2007 through Spring 2010 (each semester). Class: HONR 2103; Research Design: Pre-Post test with analysis across semesters.
Primary Investigator: McMahon
Pilot Project Name: Junior/Senior Honors Seminar Courses
Active Learning Rating(s): [Moderate] -- (L) Small Group Presentations/Discussions (Primary); (Q) Peer Teaching; Basis: Semester
Course Abstract: Course No.: HONR 4103 – Honors Advanced Special Topics. See prior course abstract.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

1. Evaluate the subject matter covered in each seminar session in order to select the most appropriate material for their own claims or conclusions.
   Learning Activities/Formative Assessment: assigned readings; class discussion.
   Thinking Skills Emphasized: Evaluation & Analysis.
   Summative Assessment: short reports to be handed in prior to each seminar session, evaluated by the instructor with a contextualized critical thinking rubric.

2. Research, analyze, and synthesize published information on an assigned topic in order to orally present their findings and conclusions.
   Learning Activities/Formative Assessment: preparation of at least one oral report and leading class discussion on an assigned topic area by each student in the class.
   Thinking Skills Emphasized: Analysis & Synthesis.
   Summative Assessment: student preparation for, presentation of, and leading of class discussion on an assigned topic area, evaluated by the instructor using a contextualized critical thinking rubric.

3. Form and defend conclusions based on peer discussion of seminar topics in order to create sound and effective arguments.
   Learning Activities/Formative Assessment: active participation in class discussions, making sound arguments and presenting and defending conclusions.
   Thinking Skills Emphasized: Synthesis & Application.
   Summative Assessment: student participation in seminar discussion, assessed by the instructor using a contextualized critical thinking rubric; periodic short essays on assigned seminar-related topics, assessed with the same rubric; final essay describing overall conclusions regarding the seminar topic, assessed with the same rubric.

4. Identify and integrate key concepts into a coherent presentation in order to lead a group discussion and investigation of a particular subject area.
   Learning Activities/Formative Assessment: preparation and presentation of at least one oral report and leading class discussion on an assigned topic area by each student in the class.
   Thinking Skills Emphasized: Analysis & Synthesis (Application secondary).
   Summative Assessment: student preparation for, presentation of, and leading of class discussion on an assigned topic area, evaluated by the instructor using a contextualized critical thinking rubric.

Schedule: Fall 2007 through Spring 2010 (each semester). Class: HONR 4103; Research Design: Pre-Post test with analysis across semesters.
Primary Investigator: Peterson
Pilot Project Name: Quality Enhancement Plan (QEP) for Enhancement of the Engineering Capstone Design Experience
Active Learning Rating(s): [High] -- (V) Problem Based Learning; Basis: Semester
Course Abstract: Course No.: CSE 4316 – Computer System Design Project I
For each of the programs in the College of Engineering, students' undergraduate coursework culminates in a capstone design experience. The common characteristic of these experiences is a strong engineering design emphasis, usually in a team environment. Each team typically selects its own real-world problem and takes that problem from definition to design of a solution, and usually to implementation. Student teams work closely with faculty member(s) in the development of the project. The capstone design experience can produce direct evidence of the accomplishment of a subset of the required learning outcomes for each of the engineering programs. For example, students in Industrial Engineering (IE) must demonstrate their ability to communicate effectively: each student makes three class presentations that are videotaped and streamed onto a class website, where the presentations can be critiqued by others in the class and the students themselves receive feedback. Two of the three presentations deal with contemporary issues, thus achieving the student learning outcome of having knowledge of contemporary issues. Each of these capstone experiences employs a project log and a project final report. Currently the project log, like the final report, is simply a record of the students' activities. These logs will be retooled as journals that intentionally build in reflective questions developed around the key skills vital to the respective engineering practice. Furthermore, all capstone faculty members will work together to enhance their respective courses with additional formative assessment strategies throughout the capstone experience, and to develop a common rubric ensuring that instructors in each of these capstones evaluate and emphasize a common set of performance expectations for both the oral and written reports. This more methodical, step-by-step approach will allow students to apply their full academic experience as well as learn what their professional practice will entail. By enhancing the capstones in this manner, the respective engineering programs will ensure learning experiences more directly aligned with, and designed to meet, the College's and external accreditors' required learning outcomes.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

1. Analyze economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability issues in order to design a system, component, or process to meet desired needs within realistic constraints. (ABET outcome c)
   Learning Activities/Formative Assessment: journal / log book.
   Thinking Skills Emphasized: Analysis.
   Summative Assessment: professional oral presentation with demonstration, and project report, each assessed using a contextualized critical thinking rubric.
Student Learning Outcomes (continued):

2. Analyze economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability issues in order to identify, formulate, and solve engineering problems. (ABET outcome e)
   Learning Activities/Formative Assessment: journal / log book.
   Thinking Skills Emphasized: Knowledge, Analysis (because of classify), and Application.
   Summative Assessment: professional oral presentation with demonstration, and project report, each assessed using a contextualized critical thinking rubric.

3. Apply the techniques, skills, and modern engineering tools necessary for engineering practice in order to function in engineering practice. (ABET outcomes d, g, i, j, k)
   Learning Activities/Formative Assessment: journal / log book; reports, oral and written.
   Thinking Skills Emphasized: Application.
   Summative Assessment: professional oral presentation with demonstration, and project report, each assessed using a contextualized critical thinking rubric.

Schedule: Fall 2007 through Spring 2010 (each semester). Class: CSE 4316, with possibly additional capstones in Fall 2007 and at least one additional capstone in each subsequent semester. Research Design: Pre-Post test; analysis across semesters and across engineering disciplines; phased-in design.
Primary Investigators: Swafford & Prater
Pilot Project Name: Catapulting to Higher Learning
Active Learning Rating(s): [High] -- (V) Cooperative Learning / Problem Based Learning; (H) Student-Generated Questions; Basis: Module
Course Abstract: Course No.: OPMA 3306 – Introduction to Operations Management
Most business students understand topics such as accounting, finance, and marketing but lack a conceptual base when it comes to operations management; therefore, experiential learning is vital in operations management courses. OPMA 3306 introduces undergraduate students to various OM concepts and techniques. To supplement our lecture style of teaching, we add a group project as a required element of the course. Within our catapult project, students gain a deeper understanding by applying several quantitative and qualitative techniques covered in lecture. The project begins when student groups decide to buy or build their catapult. Most often, groups exercise their creative abilities and decide to design and build their own catapults. Groups then apply project management techniques and learn how to create production schedules and time-phased material requirement records for attaining resources. In the competition phase, groups use their catapults to create data representing their catapult’s ability to hit a defined target. They also gather observations on other groups’ catapults and catapulting process. Using the data, observations, and quality control techniques, groups assess their catapult’s performance and identify ways to improve it. After the second competition, groups compare the performance of different catapults and prepare a project report that integrates the different applied techniques. The project incorporates a variety of tools within a common context to assist students in understanding the integrative nature of analytical tools. Along with its integrated approach, it also promotes the development of effective communication and team participation skills by requiring students to provide various reports throughout the project. Group learning assessment is based on the analysis and explanation within each written report and the group’s overall comprehensive report. Individual student learning assessment is based on test performance throughout the course and pre-project and post-project performance comparison on questions covering applied concepts throughout the project.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

1. Identify and relate operations management concepts to competitive strategy in order to make operational decisions.
   Learning Activities/Formative Assessment: inside the classroom: participate in class discussion; provide answers to verbal questions; create a concept map; solve problems. Outside the classroom: complete assigned problems.
   Thinking Skills Emphasized: Analysis.
   Summative Assessment: quizzes; exam questions.
Student Learning Outcomes (continued):

2. Apply quantitative operations management tools in order to discuss the outcomes given certain information.
   Learning Activities/Formative Assessment: inside the classroom: participate in class discussion; provide answers to verbal questions; solve problems. Outside the classroom: complete assigned problems. For the QEP pilot version: apply various operations management tools in the context of the project.
   Thinking Skills Emphasized: Application.
   Summative Assessment: quizzes; exam questions; project report, assessed using a contextualized critical thinking rubric.

3. Identify qualitative factors affecting operational characteristics in order to compare different options within a business setting.
   Learning Activities/Formative Assessment: inside the classroom: participate in class discussion; provide answers to verbal questions; solve problems. Outside the classroom: complete assigned problems. For the QEP pilot version: explain the qualitative aspects that impacted the outcome of the catapult competition.
   Thinking Skills Emphasized: Analysis.
   Summative Assessment: quizzes; exam questions; project report, assessed using a contextualized critical thinking rubric.

4. Explain how operations management decisions relate to each other in order to assess the overall impact of a decision.
   Learning Activities/Formative Assessment: inside the classroom: participate in class discussion; provide answers to verbal questions; solve problems. Outside the classroom: complete assigned problems. For the QEP pilot version: explain the precedents in the process required to achieve a successful catapult launch.
   Thinking Skills Emphasized: Synthesis.
   Summative Assessment: quizzes; exam questions; project report, assessed using a contextualized critical thinking rubric.
Student Learning Outcomes (continued):

5. Compare and contrast different options related to operations management concepts in order to select the best option given business and environmental conditions.
   Learning Activities/Formative Assessment: inside the classroom: participate in class discussion; provide answers to verbal questions; solve problems. Outside the classroom: complete assigned problems. For the QEP pilot version: compare catapults' design and performance using quantitative and qualitative methods in order to judge catapult performance.
   Thinking Skills Emphasized: Evaluation.
   Summative Assessment: quizzes; exam questions; project report, assessed using a contextualized critical thinking rubric.

Schedule: Fall 2007 through Spring 2010 (each semester). Class: OPMA 3306, two sections; No. of Students: 120; Research Design: Pre-Post test, with the experimental group in fall semesters and the control group in spring semesters.
Primary Investigator: Trkay
Pilot Project Name: Academic Integrity 101: An Anti-Plagiarism Instruction Session
Active Learning Rating(s): [Moderate] -- (G) Application Activity; (L) Small Group Discussions; Basis: Semester
Course Abstract: Course Nos.: ENGL 1301 Critical Thinking, Reading, and Writing I & ENGL 1302 Critical Thinking, Reading, and Writing II
In Academic Integrity 101: An Anti-Plagiarism Instruction Session, students will be presented with a worksheet that contains only part of an essay. Working in small groups, students must determine whether the essay section has been plagiarized; identify what constitutes plagiarism; outline the disciplinary process at UT Arlington for violations of academic integrity standards; explore the possible consequences for those who continue to plagiarize in their professional lives; and demonstrate the proper citation of the “plagiarized” material. Each group presents portions of what it has discovered to the other groups, and all students are encouraged to participate in class discussion, thus challenging the students to think more deeply about the ethical use of information. By the end of the session, students should be able to interpret and apply their knowledge of the economic, legal, and social standards for information use in order to apply these standards to their research and writing, and to analyze a case example for evidence of plagiarism. Evaluation of the sessions will be completed by applying a four-point critical thinking rubric to students' homework assignments. Assessment of student learning of information usage will target three key areas: the students' ability to identify and summarize the economic, legal, and social standards for information use; their ability to identify and consider the ethical context for information use; and their ability to identify and assess the consequences of unethical information use.
Student Learning Outcomes: As a result of this course of instruction, students will be able to…

1. Interpret the economic, legal, and social standards for information use in order to apply these standards to their research and writing.
   Learning Activities/Formative Assessment: case analysis: (1) search for and find definitions of plagiarism; (2) identify legal and social consequences of plagiarism; (3) identify examples of the unethical use of information; (4) identify proper documentation for an information source.
   Thinking Skills Emphasized: Knowledge.
   Summative Assessment: contextualized critical thinking rubric applied to the reflective essay (Homework Part 2); numeric tally of plagiarism cases in course sections that have attended the active learning session.

2. Apply knowledge of the economic, legal, and social standards for information use in order to analyze a case example for evidence of plagiarism.
   Learning Activities/Formative Assessment: group discussion: (1) discuss how definitions of plagiarism apply to real-life use of information; (2) discuss techniques for avoiding plagiarism; (3) connect use of information to the possible consequences if information is used illegally or unethically; sample second case analysis without instructor assistance.
   Thinking Skills Emphasized: Application.
   Summative Assessment: contextualized critical thinking rubric applied to the individual case study (Homework Part 1).

Schedule: Fall 2007 through Spring 2010 (each semester). Classes: ENGL 1302 (experimental) and ENGL 1301 (experimental and control); No. of Students: 350; Research Design: Pre-Post test with control and experimental groups and analysis across semesters.
VI. References
American Psychological Association. 1997. Learner-Centered Psychological Principles: A Framework for School Design and Reform . Washington, DC: Center for Psychology in Schools and Education.
Astin, A., J.H. Blake, Z. Gamson, H. Hodgkinson, B. Lee, and K. Mortimer. 1984. Involvement in Learning: Realizing the Potential of American Higher Education. Washington, DC: National Institute of Education.
Astin, Alexander W. 1985. Achieving Educational Excellence . San Francisco: Jossey-Bass Inc.
Banks, James A. 1988. “Ethnicity, Class, Cognitive, and Motivational Styles: Research and Teaching Implications.”
Journal of Negro Education , 57: 452-466.
Baron, R. M. and Kenny, D. A. 1986. “The Moderator-Mediator Variable Distinction in Social Psychological
Research: Conceptual, Strategic, and Statistical Considerations.” Journal of Personality and Social Psychology ,
51, no. 6: 1173-1182.
Belenky, Mary, Blythe Clinchy, Nancy Goldberger, and Jill Tarule. 1986. Women’s Ways of Knowing: The
Development of Self, Voice, and Mind . New York: Basic Books.
Bell, James. 2003. Ideas for Teachers . Retrieved from Educators Reference Desk database available at http://classweb/howardcc.edu/jbell/all_college_teachers.htm (last visited February 22, 2006).
Benjamin, Ludy T., Jr. 1991. “Personalization and Active Learning in the Large Introductory Psychology Class.”
Teaching of Psychology , 18: 68-74.
Berger, Bruce K. 2002. “Applying Active Learning at the Graduate Level: Merger Issues at Newco.” Public Relations
Review , 28: 191-200.
Biddle-Perry, Geraldine. 2005. “Stimulating Critical Thinking in the Theoretically Timid: The Role and Value of Oral
History Assignments within an Interdisciplinary Context.” Art, Design & Communication in Higher
Education , 4: 85-99.
Bloom, Benjamin S., ed. 1956. Taxonomy of Educational Objectives: The Classification of Educational Goals . New York:
Longman.
Bloom, B.S., Englehart, M.D., Furst, E.J., Hill, W.H. and D.R. Krathwohl. 1956. Taxonomy of Educational Objectives:
The Classification of Educational Goals. Handbook I: Cognitive Domain. New York: Longman.
Bok, Derek. 2006. Our Underachieving Colleges . Princeton, NJ: Princeton University Press.
Bonwell, Charles C. and James A. Eison. 1991. Active Learning: Creating Excitement in the Classroom . ASHE-ERIC
Higher Education Report No. 1, Washington, D.C.: The George Washington University, School of
Education and Human Development.
Bonwell, Charles C. and Tracey E. Sutherland. 1996. “The Active Learning Continuum: Choosing Active Learning
Activities to Engage Students in the Classroom.” In Tracey E. Sutherland and Charles C. Bonwell, eds. 1996.
Using Active Learning in College Classes: A Range of Options for Faculty . San Francisco: Jossey-Bass Inc.
Brock, Kathy L. and Beverly J. Cameron. 1999. “Enlivening Political Science Courses with Kolb’s Learning
Preferences Model.” PS: Political Science and Politics , 32: 251-256.
Brophy, J. 1983. “Conceptualizing Student Motivation.” Educational Psychologist , 18: 200-215.
Butun, Erhan. 2005. “Teaching Genetic Algorithms in Electrical Engineering Education: A Problem-Based Learning
Approach.” International Journal of Electrical Engineering Education , 42: 223-233.
Cabrera, Alberto F., Amaury Nora, Elena M. Bernal, Patrick T. Terenzini, and Ernest T. Pascarella. 1998.
“Collaborative Learning: Preferences, Gains in Cognitive & Affective Outcomes, and Openness to Diversity
Among College Students.” Paper presented at the Annual Meeting of the Association for the Study of Higher
Education, Miami, FL.
Carey, K. 2004. A Matter of Degrees: Improving Graduation Rates in Four-Year Colleges and Universities. Washington,
D.C.: Education Trust.
Chickering, Arthur W. and Zelda F. Gamson. 1987. “Seven Principles for Good Practice in Undergraduate
Education.” AAHE Bulletin , 39: 3-7.
Chung, Jenny C.C. 2001. “Active Learning of Geriatric Rehabilitation: Deliberations of an Undergraduate
Occupational Therapy Programme.” Scandinavian Journal of Caring Science , 15: 250-256.
Cross, Patricia. 1998. “Why Learning Communities? Why Now?” About Campus , July-August, 1998, 4-11.
Devadoss, S. and Foltz, J., 1996. “Evaluation of Factors Influencing Student Class Attendance and Performance,”
American Journal of Agricultural Economics , 78, no. 3: 499-507.
Dewey, John. 1897. My Pedagogical Creed . New York: E.L. Kellogg & Co.
Dewey, John. 1924. Democracy and Education . New York: Macmillan.
Dewey, John. 1938. Experience and Education . New York, The Macmillan Company.
Durden, G. C. and Ellis, L. V., 1995. “The Effects of Attendance on Student Learning in Principles of Economics,”
The American Economic Review , 85, no. 2: 343-346.
Ebert-May, Diane and Carol Brewer. 1997. “Innovation in Large Lectures – Teaching for Active Learning.”
Bioscience , 47: 601-607.
Felder, Richard M. and Rebecca Brent. 2006. “Navigating the Bumpy Road to Student-Centered Instruction.”
Available at http://www.ncsu.edu/felder-public/Papers/Resist.html (last visited October 15, 2006).
Felder, Richard M., Gary N. Felder, and E. Jacquelin Dietz. 1998. “A Longitudinal Study of Engineering Student
Performance and Retention V. Comparisons with Traditionally-Taught Students.” Journal of Engineering
Education , 87: 469-480.
Fischer, G.H. and Molenaar, I. 1995. Rasch Models: Foundations, Recent Developments and Applications. New York: Springer-Verlag New York, Inc.
Fox, Richard L. and Shirley A. Ronkowski. 1997. “Learning Styles of Political Science Students.” PS: Political Science and Politics , 30: 732-737.
Fox-Cardamone, L. and S. Rue. 2003. “Students’ Responses to Active-Learning Strategies: An Examination of Small-
Group and Whole-Class Discussion.” Research for Educational Reform , 8: 3-15.
Freire, Paulo. 1970. Pedagogy of the Oppressed . New York: Herder and Herder.
Gilligan, Carol. 1993. In a Different Voice: Psychological Theory and Women’s Development . Cambridge, MA: Harvard
University Press.
Goodwin, Leonard, Judith E. Miller, and Ronald D. Cheetham. 1991. “Teaching Freshmen to Think – Does Active
Learning Work?” Bioscience , 41: 719-722.
Gross Davis, Barbara. 1993. Tools for Teaching . San Francisco: Jossey Bass.
Hagedorn, Linda Serra, Ernest T. Pascarella, Marcia Edison, John Braxton, Amaury Nora, and Patrick T. Terenzini.
1999. “Institutional Context and the Development of Critical Thinking: A Research Note.” The Review of
Higher Education , 22: 265-285.
Helman, Shaun and Mark S. Horswill. 2002. “Does the Introduction of Non-Traditional Teaching Techniques
Improve Psychology Undergraduates’ Performance in Statistics?” Psychology Learning and Teaching, 2: 12-
16.
Hoover, Wesley A. 2006. “The Practice Implications of Constructivism.” SEDL Letter, Volume IX, Number 3 .
Available at http://www.sedl.org/pubs/sedletter/v09n03/welcome.html (last visited October 17, 2006).
Howe, Neil and William Strauss. 2000. Millennials Rising: The Next Great Generation . New York: Vintage Books.
Huitt, W. 1992. “Problem Solving and Decision Making: Consideration of Individual Differences Using the Myers-
Briggs Type Indicator.” Journal of Psychological Type , 24: 33-44.
Huitt, W. 2004. “Bloom et al.’s Taxonomy of the Cognitive Domain.” Educational Psychology Interactive. Valdosta,
GA: Valdosta State University. Available at: http://chiron.valdosta.edu/whuitt/col/cogsys/bloom.html (last visited April 13, 2006).
Johnson, David W., Roger T. Johnson, and Karl A. Smith. 2006. Active Learning: Cooperation in the College
Classroom , 3rd ed. Edina, MN: Interaction Book Company.
Keeley, Stuart, Rahan Ali, & Tracy Gebing. 1998. “Beyond the Sponge Model: Encouraging Students’ Questioning
Skills in Abnormal Psychology.” Teaching of Psychology , 25: 270-274.
Knowles, Malcolm. 1980. The Modern Practice of Adult Education: From Pedagogy to Andragogy. Chicago: Follett
Publishing Co.
Kolb, D.A. 1984. Experiential Learning: Experience as the Source of Learning and Development . Englewood Cliffs, NJ:
Prentice-Hall.
Kolb, David A. 1974. “On Management and the Learning Process.” In David A. Kolb, Irwin M. Rubin, and James M.
McIntyre, eds. Organizational Psychology: A Book of Readings , second edition. Englewood Cliffs, NJ:
Prentice-Hall, Inc.
Kolb, David A. 1981. “Learning Styles and Disciplinary Differences.” In Arthur W. Chickering and Associates, eds.
Responding to the New Realities of Diverse Structures and a Changing Society . San Francisco: Jossey-Bass Inc.
Kuh, George D., C. Robert Pace, and Nick Vesper. 1997. “The Development of Process Indicators to Estimate
Student Gains Associated with Good Practices in Undergraduate Education.” Research in Higher Education ,
38: 435-454.
Laws, Priscilla, David Sokoloff, and Ronald Thornton. 1999. “Promoting Active Learning Using the Results of
Physics Education Research.” Science News , 13.
Lewin, Kurt. 1942. “Field Theory and Learning.” In D. Cartwright, ed. Field Theory in Social Science: Selected
Theoretical Papers. London: Social Science Paperbacks.
May, Elaine Tyler. 1986. “Using Primary Sources in the Classroom.” In Steven F. Schomberg, ed. Strategies for Active
Teaching and Learning in University Classrooms . Minneapolis: University of Minnesota.
McCarthy, J. Patrick and Liam Anderson. 2000. “Active Learning Techniques Versus Traditional Teaching Styles:
Two Experiments from History and Political Science.” Innovative Higher Education , 24: 279-294.
McKeachie, Wilbert J., Paul R. Pintrich, Y. Guang Lin, and David A. F. Smith. 1986. Teaching and Learning in the College
Classroom: A Review of the Research Literature . Ann Arbor: University of Michigan Press.
McKeachie, Wilbert J. 1998. Teaching Tips: Strategies, Research and Theory for College and University Teachers .
Boston: Houghton-Mifflin.
Meyers, Chet and Thomas B. Jones. 1993. Promoting Active Learning: Strategies for the College Classroom . San
Francisco: Jossey-Bass Publishers.
Minnesota State Colleges and Universities System. 2006. Learning That Lasts, 2002-2005: Final Report to the Bush
Foundation . Minneapolis: Minnesota State Colleges and Universities System.
Modell, H. I. 1996. “Preparing Students to Learn in an Active Learning Environment.” Advances in Physiology
Education , 15: 69-77.
Niemi, Hannele. 2002. “Active Learning: A Cultural Change Needed in Teacher Education and Schools.” Teaching and Teacher Education , 18: 763-780.
Pascarella, Ernest and Patrick Terenzini. 2005. How College Affects Students, vol. 2: A Third Decade of Research . San
Francisco: Jossey-Bass.
Piaget, Jean. 1976. The Psychology of Intelligence . Totowa, NJ: Littlefield Adams.
Prince, Michael. 2004. “Does Active Learning Work?: A Review of the Research.” Journal of Engineering Education
(July 2004): 1-9.
Schomberg, Steven F. 1986. Strategies for Active Teaching and Learning in University Classrooms . Minneapolis:
University of Minnesota.
Schroeder, Charles C. 2004. New Students – New Learning Styles . Available at http://www.virtualschool.edu/mon/Academia/KierseyLearningStyles.html (last visited October 8, 2006).
Senge, Peter M. 1994. The Fifth Discipline: The Art and Practice of the Learning Organization . New York: Currency
[Doubleday].
Shannon, Terrie M. 1986. “Introducing Simulation and Role Play.” In Steven F. Schomberg, ed. Strategies for Active
Teaching and Learning in University Classrooms . Minneapolis: University of Minnesota.
Silberman, Mel. 1996. Active Learning: 101 Strategies to Teach Any Subject . Boston: Allyn and Bacon.
Smith, Karl A. 1986. “Cooperative Learning Groups.” In Steven F. Schomberg, ed. Strategies for Active Teaching and
Learning in University Classrooms . Minneapolis: University of Minnesota.
Springer, Leonard. 1997. “Relating Concepts and Applications Through Structured Active Learning” (ERIC
Document No. 418855). Paper presented at the American Educational Research Association, Chicago, IL,
March 24-28, 1997.
Sutherland, Tracey E. 1996. “Emerging Issues in the Discussion of Active Learning.” In Tracey E. Sutherland and
Charles C. Bonwell, eds. 1996. Using Active Learning in College Classes: A Range of Options for Faculty . San
Francisco: Jossey-Bass Inc.
Sutherland, Tracey E. and Charles C. Bonwell. 1996. Using Active Learning in College Classes: A Range of Options for
Faculty . San Francisco: Jossey-Bass Inc.
Terenzini, Patrick T. and Ernest T. Pascarella. 1994. “Living with Myths: Undergraduate Education in America.”
Change , January/February, 1994: 28-32.
Texas Higher Education Coordinating Board. 2000. Closing the Gaps: The Texas Higher Education Plan . Available at http://www.thecb.state.tx.us/reports/PDF/0379.PDF (last visited October 7, 2006).
The Secretary of Education’s Commission on the Future of Higher Education. 2006. A Test of Leadership: Charting the Future of U.S. Higher Education (pre-publication copy, September 2006). Available at http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/pre-pub-report.pdf (last visited October 7,
2006).
The University of Texas at Arlington Graduation Rates Task Force. 2005. Final Report, June 1, 2005 .
Tsui, Lisa. 2002. “Fostering Critical Thinking Through Effective Pedagogy: Evidence from Four Institutional Case
Studies.” Journal of Higher Education , 73: 740-763.
University of Texas System Board of Regents. 2004. Accountability and Performance Report. Austin, TX: University of
Texas System.
Van Amburgh, J., J. Devlin, J. Kirwin, and D. Qualters. 2005. Active Learning Inventory Tool . Department of
Pharmacy Practice and Center for Effective University Teaching, Northeastern University, Boston, MA.
Warren, Russell G. 1997. “Engaging Students in Active Learning.” About Campus (March-April 1997): 16-20.
Weimer, Maryellen. 1996. “Active Learning: Quantity, Extent, Depth Count.” The Teaching Professor, 10: 1.
Wenger, Etienne. 2006. “Communities of Practice: A brief introduction.” Available at: http://www.ewenger.com/theory/ (last visited December 4, 2006).
Wood, Kay C., Harlan Smith, & Daurice Grossniklaus. 2001. “Piaget's Stages of Cognitive Development.” In M.
Orey, ed. Emerging Perspectives on Learning, Teaching, and Technology . Available at http://www.coe.uga.edu/epltt/piaget.htm (last visited October 11, 2006).
Yoder, Janice D. and Catherine M. Hochevar. 2005. “Encouraging Active Learning Can Improve Students’
Performance on Examinations.” Teaching of Psychology , 32: 91-95.
Yuretich, Richard F., Samia A. Khan, R. Mark Leckie, and John J. Clement. 2001. “Active-Learning Methods to
Improve Student Performance and Scientific Interest in a Large Introductory Oceanography Course.”
Journal of Geoscience Education , 49: 111-119.
Appendix A: Rasch Model Assessment Issues
I. Course-Level Assessment Plan for Pilot Program Courses
For assessment purposes, we wish to measure the students' abilities, not their performance on a particular set of items. Thus we need an analysis method that separates student ability from item difficulty. This capability is provided by the so-called Rasch model.
A. The Rasch Model
The Rasch model (see, e.g., Fischer and Molenaar (1995)) posits the existence of an “ability” parameter, $a_i$, descriptive of student $i$, and a “difficulty” parameter, $d_{jr}$, of item $j$, such that

$$p_{ijkr} = \frac{\exp\{a_i - d_{jr}\}}{\sum_{s=1}^{4} \exp\{a_i - d_{js}\}}. \qquad \text{(Eq. 1)}$$
Here $a_i$ is a latent (i.e., not directly measurable) quantity deemed to measure (along a latent continuum) the ability of student $i$ in the knowledge domain represented by item $j$. The assumption is that every student $i$ has such an ability $a_i$, such that the larger $a_i$ is, the more able student $i$ is in this specific knowledge domain.

The item difficulty, $d_{jr}$, represents the difficulty of achieving rubric rating $r$ on item $j$. The larger $d_{jr}$ is, the more difficult it is to attain rating $r$ on item $j$. Note also the tacit assumption that student ability and item difficulty vary along the same continuum, and hence are comparable.

In fact, from Eq. 1 we see that the student ability, $a_i$, is in direct opposition to the difficulty, $d_{jr}$, of item $j$, so that it is the difference between the student's ability and the item difficulty that governs the probability of attaining rating $r$ on item $j$.
B. Using the Rasch Model to Assess Higher Order Thinking Skills (HOTS) at the Course Level
For a course in which we have two or more sections running, one with and one without active learning (AL), this would be straightforward. Assuming we have the raw rubric responses on a set of standardized (across sections) items, the Rasch model can be fit, with the abilities $a_i$ assumed to be normally distributed ($N(\mu, \sigma^2)$) random effects, to yield a mean ($\mu$) and variance ($\sigma^2$) of the ability distribution for each stratum x section combination. These mean ability estimates could then be compared via standard statistical methods.
C. Three Examples
To illustrate the Rasch model in several cases we will encounter in analyzing the results of our QEP studies, we offer three examples.
1. Two Conditions, No Confounders

First, consider two sections, one without AL ($k=1$) and one with AL ($k=2$), with no other factors relevant (for this simple example). Assume that there are 5 items (on a test, say), each of which is scored on a 4-level rubric. So $j=1,2,3,4,5$ indexes items and $r=1,2,3,4$ indexes rubric levels (with $r=1$ denoting the lowest and $r=4$ the highest performance level). For each $j$, $k$, and $r=2,3,4$, define the multiple-logit transform of $p_{ijkr}$:

$$\mathrm{Logit}\, p_{ijkr} = \log(p_{ijkr} / p_{ijk1}).$$

Then the Rasch model for this situation is (recall: $i$ indexes students):

$$\mathrm{Logit}\, p_{ijkr} = \mu_k + S_i + d_{jr}, \qquad \text{(M1)}$$

where $\mu_k$ is the mean student ability for section $k$; $S_i$ is a random student effect with mean 0 and variance $\sigma^2$; and $d_{jr}$ is the difficulty of obtaining rubric rating $r$ on item $j$.

The analysis would consist of using model (M1) to estimate the $\mu_k$'s, $\sigma^2$, and the $d_{jr}$'s (which are of no direct interest), and then comparing the $\mu$'s. (Technical note: if we suspected that the ability variances differed across the $k$ levels, we could fit the above model to each group separately, test the variances for equality, and analyze the means regardless via Wald tests.)
2. Two Conditions with Confounder Control

As a slightly more complicated example, suppose that we want to control for attendance and SAT score in the previous example. Assuming that each of these is available more or less exactly measured (i.e., ungrouped), and that we wish to compare the mean abilities for groups $k=1$ and $k=2$ at attendance level $A_0$ and SAT level $T_0$, the Rasch model becomes (assuming linear relationships on the logit scale):

$$\mathrm{Logit}\, p_{ijkr} = \mu_k + S_i + d_{jr} + b_1 (A_{ik} - A_0) + b_2 (T_{ik} - T_0), \qquad \text{(M2)}$$

where $A_{ik}$ denotes the attendance of student $i$ in group $k$ and $T_{ik}$ denotes the SAT score of student $i$ in group $k$.

If we chose to use grouped versions of attendance or SAT, then we would write the model using indicator variables in the usual manner for regression models.
3. Two Conditions with Rater Effects
Some of the items used in our QEP will consist of projects that will require grading by multiple raters. Assuming that each of the raters grades some of the papers from both conditions, it will be possible to estimate, and remove, any rater effects which may exist. Assuming that the raters are blinded to the group status of the students, the following Rasch model will allow estimation of mean abilities in each group, after removal of any rater effects:

$$\mathrm{Logit}\, p_{ijkrg} = \mu_k + S_i + d_{jr} + R_g, \qquad \text{(M3)}$$

where the new subscript, $g$, indexes raters and the term $R_g$ denotes a rater effect. Note that the assumption that the raters are blinded as to condition $k$ allows the model to reasonably hold without a rater x group interaction present.

Estimates of the $\mu_k$'s from model (M3) are adjusted for both item difficulty and rater effects.
D. Feasibility
As a technical point, software (e.g., CONQUEST) is currently available to estimate all of models (M1-M3) via marginal maximum likelihood methods.
1. Instructors as Confounders
Inasmuch as AL (or any other teaching strategy) must be implemented by an instructor, separating the component of students' success due to AL from the component due to the instructor alone is a difficult problem. The issue is germane since, if AL techniques are effective not for the general population of faculty but only for those who fully embrace the concept, then promoting AL as a University-wide initiative may misplace resources.
Replication of courses across semesters, with the AL strategy held fixed but instructors varying across replications, will allow the component of variation due to different instructors to be estimated (while controlling for the other confounders). If this component turns out to be large, the conclusion will be that AL works for some instructors and not for others, whence uniform promotion of the techniques across the University may be unwarranted. If it turns out to be small, then University-wide implementation would be warranted.
A more direct isolation of the effect of AL (vs. no AL) – controlling for the instructor effect – would be to have one instructor teach two different sections of the same course during the same semester: one with AL, the other without.
The idea is that the instructor effect should be the same in both sections, and would be subtracted out of any comparison between the two sections. Of course, this idea suffers from the possibility of inadvertent instructor bias for or against AL, meaning the instructor effect might not be the same in both sections. Further, this approach may not account for faculty learning curves in employing AL in the classroom.
2. Impact of AL on Attendance
Research has shown (see, e.g., Devadoss & Foltz (1996) or Durden & Ellis (1995)) that a student's performance is directly impacted by her attendance in class. It may be that AL will increase attendance because it makes a class more fun and interesting. Thus, any increase in HOTS may be due to increased attendance, rather than being directly the result of AL. That is, attendance may be a mediator of the impact of AL on HOTS. This mediator effect can be tested by standard statistical methods (see Baron & Kenny (1986)). If it proves real, then focusing University resources on improving attendance by a variety of means, rather than just focusing on AL, may be desirable. This would give more options to instructors, especially if the caveat about AL depending on the instructor proves true.
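As a rough illustration of the Baron & Kenny (1986) steps in Python (simulated data; the variable names and effect sizes are hypothetical, not taken from the plan):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
al = rng.integers(0, 2, n).astype(float)          # 0 = no-AL section, 1 = AL section
attendance = 20 + 3 * al + rng.normal(0, 2, n)    # AL assumed to raise attendance
hots = 50 + 2 * attendance + rng.normal(0, 5, n)  # attendance assumed to drive HOTS

X_al = sm.add_constant(al)
total = sm.OLS(hots, X_al).fit()              # step 1: total effect of AL on HOTS
to_mediator = sm.OLS(attendance, X_al).fit()  # step 2: effect of AL on the mediator
X_both = sm.add_constant(np.column_stack([al, attendance]))
joint = sm.OLS(hots, X_both).fit()            # step 3: AL and attendance together

# Mediation is suggested if the AL coefficient shrinks toward zero
# once attendance is controlled for.
print("total AL effect:           ", total.params[1])
print("AL -> attendance:          ", to_mediator.params[1])
print("AL effect given attendance:", joint.params[1])
```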
F. Controls
To ascertain the effect of AL on HOTS, several control scenarios will be used. Whenever possible, two sections of the same course will be taught in the same semester by the same instructor, with the only difference between the two classes being the presence or absence of the active learning component. Where two sections of the exact same course in the same semester are not possible, similar courses will, if possible, be taught in the same semester by the same instructor, one with AL and one without. Also, the exact same course will be taught in successive semesters by the same instructor, one semester with AL and the other without. To examine the instructor effect, the same classes, both with and without AL, will be taught in subsequent semesters by different instructors.
II. Program-Level Assessment of HOTS
A. Introduction and Purpose
The purpose here is to outline a plan for assessing the efficacy of the proposed AL instructional strategies, based on viewing our program as a whole – rather than at the course level. The term “program” herein refers to the set of all our pilot courses.
B. Our QEP Goal Inspired the Structure of the Program Courses
The general goal for the QEP is to learn in which types of courses or disciplines, and at what level of intensity, AL works best on our campus. In the spirit of an exploratory design, the set of program courses was selected to represent a variety of levels of the following three factors:
• Course discipline (e.g., history, engineering, etc.)
• Course level (e.g., freshman, sophomore, etc.)
• Intensity of AL employed by the course instructor (measured on a 3-level ordinal scale, where 0 signifies no AL, 1 signifies an intermediate level, and 2 signifies a high level)
Obviously, many course disciplines might have been studied, and our QEP program could select only a few.
Similarly, all the possible course levels are not represented in our program, either; we chose to focus on the undergraduate experience.
Thus, although our program courses admit a three-factor framework, the design is of necessity incomplete (i.e., we will not be able to examine the efficacy of AL in all combinations of course disciplines/levels). The implications of this aspect of our program design are discussed in the next section.
C. Program-Level Assessment Strategy

1. Some More Focused Questions
Toward the general goal, we consider the following more-focused questions:
(Q1) Is AL, at some positive intensity, associated with improved HOTS in any of the program courses?
If so, the following two questions will be addressed:
(Q2) Is there any evidence among these courses that a higher AL intensity is associated with more HOTS improvement?
(Q3) Are there any distinctions, in terms of course discipline / level, between courses in which AL is, and is not, associated with improved HOTS?
Question 1 is crucial, since a negative answer to it sends us back to the drawing board about whether AL is worth pursuing.
Question 2 goes to the important matter of whether there is a dose-response relationship between AL intensity and HOTS performance. This is a standard way to investigate whether the presence of AL in the course instruction actually causes improved HOTS. Specifically, the presence of a dose-response relationship is generally viewed as a necessary condition for a causal relationship to exist.
Question 3 is a step toward trying to generalize the findings to other courses which might benefit from AL.
2. Difficulties due to the Program Design
The problem of addressing these questions on the basis of our program design is essentially one of meta-analysis. To wit:
i. The course-level assessments, each in itself a mini-study, will provide information (albeit of varying quality, depending on whether a no-AL control section is available) on the efficacy of AL at a specific intensity, at specific combinations of course discipline/level.
ii. Further, these course-level assessments will be available only for a small subset of the universe of potential course discipline/level combinations.
iii. Finally, among the available course-level assessments, the AL intensity will probably vary.
Our questions Q1-Q3 require us, in one manner or another, to compare or combine these bits of course-level information towards gaining answers. A complete design, meaning all combinations of all levels of the three factors, would make this a fairly routine task. As described below, our incomplete design makes things more difficult.
To begin, Question 1 will be simple to address, since it just requires looking over the list of course-level assessments and identifying the course/level combinations showing improvement in HOTS (accounting for uncertainty in our
HOTS ability estimates, of course).
However, any statistician will quickly recognize that questions 2 and 3 are inaccessible from our program design without simplifying assumptions. For example, to address question 2 in a reliable way, we would need HOTS data on different sections representing the same combination of course discipline/level, one section (at least) for each of the three levels of AL intensity. Even then, we could not be sure that any dose-response relationship found for this course discipline/level combination would persist to other such combinations (such a question could be studied with a complete design, but not with our incomplete design). The simplifying assumption required to take anything useful from this analysis would be the working hypothesis that such relationships are the same, regardless of discipline/level combination.
As another example, suppose (hypothetically) that question 3 is answered affirmatively, and that AL apparently succeeds in freshman Liberal Arts courses, but apparently fails in senior engineering courses. Is the difference in AL performance due to the different disciplines or to the different course levels? Or perhaps to some combination of both? Such questions are not addressable with our design. (To address this question requires observing all the combinations of liberal arts or engineering, with freshman or senior level courses.) Interpreting this result – in terms of which discipline/level combinations to try next – is impossible.
We end this section with the summary remark that our program design should give us a reasonably clear answer to Q1, but only hints at the answers to Q2 and Q3, lending further credence to the benefits of pursuing an expanded design.
D. Statistical Analysis for the Program-Level Assessment Plan
Here we sketch out the statistical analysis we intend to use in addressing Q1-Q3, in light of the above-noted difficulties. The main goal is to combine, across program courses for which it makes sense, the course-level information on the efficacy of AL in increasing HOTS ability. The discussion presumes familiarity with the statistical analysis of the HOTS data at the course level.
1. Statistical Model for Estimating HOTS Ability from Data across Program Courses

The basic quantities needed here are the outputs of the course-level assessments. Specifically, for each HOTS level of interest, we need, from each section $i$ under condition $s$ ($s=1$ for no AL, $s=2$ for AL) of program course $c$, the estimated mean ($\mu_{csi}$) of the ability distribution, along with its standard error ($se_{csi}$).

These mean ability estimates will already be adjusted to specific levels of confounders (like attendance, SAT score, etc.), so they may be considered reasonably comparable in this sense.

Since the mean ability estimates may (being maximum likelihood estimators) be presumed to be approximately normally distributed, but with possibly different standard errors across the $c,s,i$ combinations, a reasonable approach to their combination is to use a Gaussian linear model, as follows:

$$\mu_{csi} = u_s + E_{csi}, \qquad \text{(M4)}$$

where $u_s$ is the mean ability for AL level $s$, and $E_{csi}$ is a random error, presumed normal with mean zero and variance $(se_{csi})^2$.
Model (M4) might be re-written with the error term, $E_{csi}$, decomposed into a random course effect, a fixed effect of AL level, a random course x section interaction, and a (different) error term, and the corresponding variance components may indeed be of interest in assessing the corresponding sources of variation in any AL effects. However, the form (M4) allows easier treatment of the current problem of combining the estimates to produce an estimate of an assumed-meaningful mean ability over our program courses (or some subset of them, as we see fit to combine them). Specifically, owing to the potentially different standard errors of the estimates $\mu_{csi}$, model (M4) provides the basis for a weighted least squares strategy to combine the estimates.

Also worth noting, section-level covariates (like class size) might be incorporated into (M4) if they were thought to be important confounders.

The difference $u_2 - u_1$, estimable from model (M4), represents (barring the quibble noted below) a summary measure, over all the program courses, of the difference in mean ability for the given HOTS level.
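A minimal sketch of this combination step in Python (the numbers are placeholders; in practice the $\mu_{csi}$ and $se_{csi}$ would come from the course-level Rasch fits):

```python
import numpy as np

def combine(mu, se):
    """Inverse-variance weighted mean and its standard error: the weighted
    least squares fit of the intercept-only model mu_csi = u_s + E_csi."""
    w = 1.0 / np.asarray(se, dtype=float) ** 2
    mean = np.sum(w * np.asarray(mu, dtype=float)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

# Placeholder course-level estimates for each condition.
u1, se1 = combine([0.10, -0.05, 0.20], [0.08, 0.10, 0.12])  # s = 1: no AL
u2, se2 = combine([0.35, 0.25, 0.40], [0.09, 0.11, 0.10])   # s = 2: AL

diff, se_diff = u2 - u1, np.hypot(se1, se2)
print(f"u2 - u1 = {diff:.3f} (SE {se_diff:.3f}), z = {diff / se_diff:.2f}")
```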
An outstanding issue, not addressed here, is whether one can really compare the higher order thinking skills necessary for different disciplines and levels. Our program assessment strategy assumes that we are measuring the same HOTS in the individual classes; this may not be true and may be worthy of further investigation.
Appendix B: Comprehensive QEP Budget Breakout
Comprehensive QEP Budget Breakout Unit Costs
Year 1 Totals
New Existing
Year 2 Totals
New Existing
Pilot Project Implementation (Startup-Year 3)
Subtotal $290,620.92 $850.00 $251,329.92 $850.00
Pilot Project Assessment
Material Cost (Scantron, etc.) A $11,550.00 $11,550.00 $11,550.00
Classroom Assessments B
Notes:
$0.00 $0.00 $0.00
A
Scantrons for Modified Knowledge Survey and interim assessments for pilot projects & controls, IDEA forms (after first semester), and additional assessment related costs.
B
No Cost: UT Arlington is an IDEA pilot project participant for one semester. VARK is a free online inventory. Use of ALIT is by permission. CONQUEST is already owned.
Subtotal $11,550.00 $11,550.00 $11,550.00
QEP Management Team /Annual Salary & Benefits
QEP Management Team /Computer Equipment
Year 3 Totals
New Existing
$185,637.92 $850.00
$11,550.00
$0.00
Year 4 Totals
New Existing
$242,529.59 $850.00
$11,550.00
$0.00
$11,550.00 $11,550.00
Job Title
QEP Coordinator ( 1/2 time appt. based on Eng. Assoc. Prof. level)
Undergraduate Reporting & Assessment Specialist
C
Web Developer/Database Manager
GRA (2 assigned to Web Developer, 12 month basis)
D
Notes:
Subtotal /Annual Salary & Benefits
Subtotal
C Dell 670 Computer & HP Deskjet Color Printer.
D
Dell 670 Only.
Subtotal /Computer Equipment
Communities of Practice Support
Copy costs for handouts, binders for 14
Stipends (2 facilitators plus 12 PI's @ $500)
Refreshments for 9 meetings, 1 meal
Subtotal
University Wide Faculty Initiatives
Website of Active Learning Strategies
D
Speakers
Faculty Incentives
Mini Grant (Issue 18 at $1500/yr)
Faculty Development Leave
E
Notes:
E
D
Server existing within IRP.
Potential award for an Active Learning initiative.
Subtotal
Annual Salary Annual Benefits Computers
$42,500.00
$42,000.00
$14,025.00
$19,800.00
Existing Eq.
$3,790.00
$47,590.00
$24,000.00
$15,704.70 Existing Eq.
$7,920.00 $7,230.00
$213,539.70
$378.00
$7,000.00
$1,030.00
$8,408.00
$0.00
$8,000.00
$27,000.00
$10,000.00
$45,000.00
$11,020.00
Salary & Benefits
$56,525.00
$61,800.00
$63,294.70
$31,920.00
$213,539.70
$11,020.00
$378.00
$7,000.00
$1,030.00
$8,408.00
$0.00
$8,000.00
$27,000.00
$10,000.00
$45,000.00
Salary & Benefits
$56,525.00
$61,800.00
$63,294.70
$31,920.00
$213,539.70
$378.00
$7,000.00
$1,030.00
$8,408.00
$0.00
$8,000.00
$27,000.00
$10,000.00
$45,000.00
Salary & Benefits
$56,525.00
$61,800.00
$63,294.70
$31,920.00
$213,539.70
$378.00
$7,000.00
$1,030.00
$8,408.00
$0.00
$8,000.00
$27,000.00
$10,000.00
$45,000.00
Salary & Benefits
$56,525.00
$61,800.00
$63,294.70
$31,920.00
$213,539.70
$3,673.33
$378.00
$7,000.00
$1,030.00
$8,408.00
$0.00
$8,000.00
$27,000.00
$10,000.00
$45,000.00
A-6
Comprehensive QEP Budget Breakout Unit Costs

University Wide Assessments

Existing University Funding for Assessment
  NSSE (administered annually) [F]                                  $7,500.00
  Collegiate Learning Assessment (CLA) (administered annually) [F]  $10,400.00
  FSSE (administered annually)                                      $2,204.00
  SES/SAS (administered every 3 years starting in Year 1)           $1,516.00
  Alumni Survey (administered every 3 years starting in Year 2)     $12,000.00
  Employer Survey (administered every 3 years starting in Year 3)   $2,500.00
  Existing University Funding Subtotal                              $25,720.00

  Note: [F] $7,500 for NSSE and $10,400 for CLA paid by UT System annually.

New QEP University Assessment Funding
  NSSE Oversampling ($7.50/person for 200 students, annually)       $1,500.00
  CLA Oversampling ($20/person for 100 students, annually)          $2,000.00
  FSSE Oversampling (no additional cost, annually)                  $0.00
  New QEP University Assessment Funding Subtotal                    $3,500.00

Travel

SACS Conference [G]
  PreConference Workshops                                           $720.00
  Airfare                                                           $1,200.00
  Hotel                                                             $1,800.00
  Food                                                              $900.00
  Parking (5 days parking at $10/day/person)                        $150.00
  Registration                                                      $885.00
  Subtotal SACS Conference                                          $5,655.00
  QEP Coordinator Travel                                            $3,000.00
  Undergraduate Reporting & Assessment Spec. Travel                 $3,000.00
  Faculty Travel [H]                                                $12,000.00
  Subtotal Travel                                                   $23,655.00

  Notes: [G] QEP Coordinator & 2 PIs to SACS Annual Conference, 4 day conference. [H] 4 additional faculty to attend assessment-related conferences.

In each of Years 1-4, new QEP funding includes $3,500.00 for assessment oversampling and $23,655.00 for travel. Existing assessment funding varies with the 3-year survey cycle: $11,220.00 in Years 1 and 4 (NSSE, FSSE, SES/SAS), $21,704.00 in Year 2 (NSSE, FSSE, Alumni Survey), and $12,204.00 in Year 3 (NSSE, FSSE, Employer Survey).

Grand Total (Annual New & Existing)

            New            Existing      Total
  Year 1    $607,293.62    $12,070.00    $619,363.62
  Year 2    $556,982.62    $22,554.00    $579,536.62
  Year 3    $491,290.62    $13,054.00    $504,344.62
  Year 4    $551,855.62    $12,070.00    $563,925.62

Note: Year 4 amounts equal the corresponding average annual expenditures for Years 1-3.
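As an arithmetic cross-check (computed here from the figures above, not stated separately in the original tables), the Year 4 "new" total is the mean of the Year 1-3 "new" totals, and each year's "existing" total appears to equal that year's existing assessment funding plus the $850.00 annual library funding for the Trkay pilot project (Appendix C); for Year 1, for example:

\[
\frac{\$607{,}293.62 + \$556{,}982.62 + \$491{,}290.62}{3} = \$551{,}855.62,
\qquad
\$11{,}220.00 + \$850.00 = \$12{,}070.00 .
\]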
Appendix C: QEP Pilot Project Budget Years 1-3 with Average Projected Costs for Years 4-10

Yearly Totals: 12 Pilot Projects (Start Up Summer07; Year 1 Sept 07-Aug08; Year 2 Sept 08-Aug09; Year 3 Sept 09-Aug10)

                                       Grand Total, Years 1-3    Averaged Annual Cost, Years 4-10
  New                                  $758,844.76               $242,529.59
  Library-funded (Trkay project;
    $850.00 in each of Years 1-3)      $2,550.00                 $850.00
  Subtotal                             $761,394.76               $243,379.59
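The "Averaged Annual Cost" column is consistent with spreading only the recurring (non-start-up) costs over the three pilot years; the only start-up expenditures in the project breakouts below are $18,501.00 (Boardman) and $12,755.00 (Dillon). This check, computed from the figures in this appendix rather than stated in the original, works out as:

\[
\frac{\$758{,}844.76 - \$18{,}501.00 - \$12{,}755.00}{3} = \frac{\$727{,}588.76}{3} = \$242{,}529.59 ,
\]

with the subtotal row adding the library-funded project: $758,844.76 + $2,550.00 = $761,394.76 and $242,529.59 + $850.00 = $243,379.59.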
Breakout for QEP Pilot Project Budget Years 1-3 with Average Projected Costs for Years 4-10
(All pilot project budgets were determined by the pilot project primary investigators.)
1. Primary Investigator: Aktosun — Unit: College of Science

                        Start Up     Year 1          Year 2          Year 3          Subtotal
                        Summer07     Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      --           $31,555.00      $32,817.00      $34,130.00      $98,502.00
  Faculty Benefits      --           $9,466.00       $9,846.00       $10,239.00      $29,551.00
  Cumulative Total      $0.00        $41,021.00      $42,663.00      $44,369.00      $128,053.00

  Additional funding for GTA will be provided by Department of Mathematics.

2. Primary Investigator: Boardman — Unit: College of Engineering

                        Start Up     Year 1          Year 2          Year 3          Subtotal
                        Summer07     Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      $9,616.00    $5,624.00       $3,652.00       --              $18,892.00
  Faculty Benefits      $2,885.00    $1,688.00       $1,096.00       --              $5,669.00
  GTA                   --           $3,713.00       --              --              $3,713.00
  Hardware              $6,000.00    --              --              --              $6,000.00
  Cumulative Total      $18,501.00   $11,025.00      $4,748.00       $0.00           $34,274.00

3. Primary Investigator: Cole — Unit: College of Liberal Arts

                        Start Up     Year 1          Year 2          Year 3          Subtotal
                        Summer07     Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      --           $3,000.00       --              $3,000.00       $6,000.00
  GTA                   --           $8,450.00       $4,225.00       $4,225.00       $16,900.00
  Hardware              --           $7,000.00       --              $5,600.00       $12,600.00
  Cumulative Total      $0.00        $18,450.00      $4,225.00       $12,825.00      $35,500.00
4. Primary Investigator: Deen — Unit: Liberal Arts

                                 Start Up    Year 1          Year 2          Year 3          Subtotal
                                 Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries               --          $500.00         $500.00         $500.00         $1,500.00
  GTA (Undergraduate)            --          $1,000.00       $1,000.00       $1,000.00       $3,000.00
  Office Supplies/Photocopies    --          $198.00         $198.00         $198.00         $594.00
  Cumulative Total               $0.00       $1,698.00       $1,698.00       $1,698.00       $5,094.00

5. Primary Investigator: Dillon — Unit: College of Engineering

                        Start Up     Year 1          Year 2          Year 3          Subtotal
                        Summer07     Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      $9,812.00    $6,062.00       --              $7,500.00       $23,374.00
  Faculty Benefits      $2,943.00    $1,819.00       --              $2,250.00       $7,012.00
  GTA                   --           $5,872.50       $5,872.50       $5,872.50       $17,617.50
  Cumulative Total      $12,755.00   $13,753.50      $5,872.50       $15,622.50      $48,003.50

6. Primary Investigators: Hirtle & Wiggins — Unit: College of Education

                                 Start Up    Year 1          Year 2          Year 3          Subtotal
                                 Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  GTA                            --          $2,400.00       $2,400.00       $2,400.00       $7,200.00
  Hardware                       --          $12,250.00      --              --              $12,250.00
  Software                       --          $4,750.00       --              --              $4,750.00
  Training                       --          --              $2,500.00       --              $2,500.00
  Office Supplies/Photocopies    --          $500.00         $500.00         $500.00         $1,500.00
  Conference                     --          $5,000.00       $5,000.00       $5,000.00       $15,000.00
  Cumulative Total               $0.00       $24,900.00      $10,400.00      $7,900.00       $43,200.00

7. Primary Investigator: Martin — Unit: School of Nursing

                        Start Up    Year 1          Year 2          Year 3          Subtotal
                        Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      --          $18,000.00      $18,000.00      $18,000.00      $54,000.00
  Faculty Benefits      --          $5,400.00       $5,400.00       $5,400.00       $16,200.00
  GTA                   --          $19,248.42      $19,248.42      $19,248.42      $57,745.26
  Mileage*              --          $2,225.00       $2,225.00       $2,225.00       $6,675.00
  Cumulative Total      $0.00       $44,873.42      $44,873.42      $44,873.42      $134,620.26

  * Travel for Years 1-3 is figured at 5,000 miles each year, reimbursed at $0.445/mile (5,000 × $0.445 = $2,225.00 per year).
8. Primary Investigators: Maruszczak & Rusher — Unit: Architecture

                        Start Up    Year 1          Year 2          Year 3          Subtotal
                        Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries      --          $10,000.00      $10,000.00      $10,000.00      $30,000.00
  Faculty Benefits      --          $2,500.00       $2,500.00       $2,500.00       $7,500.00
  GTA*                  --          $7,500.00       $7,500.00      $7,500.00        $22,500.00
  Hardware              --          $4,500.00       --              --              $4,500.00
  Software              --          $9,500.00       --              --              $9,500.00
  Cumulative Total      $0.00       $34,000.00      $20,000.00      $20,000.00      $74,000.00

  * 2 GTAs per semester requested.

9. Primary Investigator: McMahon — Unit: Honors College

                         Start Up    Year 1          Year 2          Year 3          Subtotal
                         Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries       --          $8,000.00       $8,000.00       $8,000.00       $24,000.00
  Faculty Benefits       --          $3,600.00       $3,600.00       $3,600.00       $10,800.00
  GTA (Undergraduate)    --          $4,000.00       $4,000.00       $4,000.00       $12,000.00
  Cumulative Total       $0.00       $15,600.00      $15,600.00      $15,600.00      $46,800.00

10. Primary Investigator: Peterson — Unit: College of Engineering

                                 Start Up    Year 1          Year 2          Year 3          Subtotal
                                 Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Faculty Salaries               --          $57,500.00      $57,500.00      $20,000.00      $135,000.00
  GTA                            --          $12,000.00      $12,000.00      $12,000.00      $36,000.00
  Software                       --          $1,000.00       --              --              $1,000.00
  Office Supplies/Photocopies    --          $5,000.00       $6,100.00       $6,100.00       $17,200.00
  Airfare                        --          $4,000.00       $2,000.00       $3,000.00       $9,000.00
  Cumulative Total               $0.00       $79,500.00      $77,600.00      $41,100.00      $198,200.00

11. Primary Investigators: Prater & Swafford — Unit: Business

                                 Start Up    Year 1          Year 2          Year 3          Subtotal
                                 Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  GTA                            --          $5,000.00       $5,000.00       --              $10,000.00
  Office Supplies/Photocopies    --          $200.00         --              --              $200.00
  Airfare                        --          $600.00         $300.00         --              $900.00
  Cumulative Total               $0.00       $5,800.00       $5,300.00       $0.00           $11,100.00

12. Primary Investigator: Trkay — Unit: University Libraries/College of Liberal Arts

                         Start Up    Year 1          Year 2          Year 3          Subtotal
                         Summer07    Sept 07-Aug08   Sept 08-Aug09   Sept 09-Aug10
  Cumulative Total*      --          $850.00         $850.00         $850.00         $2,550.00

  * All funding is provided by University Library.
Appendix D: QEP Support Program for Active Learning

[Organizational chart] The chart shows: the Provost; the Associate Provost; the QEP Coordinator (0.5 faculty), with the SACS QEP Coordinator serving a one-year transition as advisory consultant (0.25 faculty); the Director of Institutional Research, Planning and Effectiveness and the Undergraduate Reporting & Assessment Specialist; the Web Developer/Database Manager, supported by GRAs; the University Wide Standing Committee on Higher Order Thinking and Active Learning; the Communities of Practice (COP), 12 PIs; Teaching Circles (Large Classes, Symposia, Capstone); and the Faculty Enrichment Programs (Faculty Mentoring Program, Teaching Circles, Academy of Distinguished Teachers, Instructional Support Sessions, Workshops/Speakers, New Faculty Orientation).
Appendix E: Timeline for QEP Implementation

Spring 2007
• QEP final draft is completed for submission to SACS by January 15, 2007 for on-site review and released to the University community.
• Active learning website portal launched and continually updated.
• Pilot-Project Community of Practice continues to meet to refine course-level assessments and prepare for implementation.
• SACS Reaffirmation Team visits.
• Implementation plan is finalized following on-site visit.

Summer 2007
• Permanent QEP Coordinator selected.
• University-Wide Standing Committee on Higher Order Thinking and Active Learning (the “Standing Committee”) takes over from QEP Steering Committee.
• Initial faculty survey of current active learning practices is conducted.
• Primary investigators continue preparation for fall 2007 and spring 2008 proposal implementation.
• Remaining relevant staff members are hired.
• Permanent QEP Coordinator and SACS QEP Coordinator work with all pilot project primary investigators to ensure all necessary assessment collection issues are addressed.
• Undergraduate Reporting and Assessment Specialist, along with Web Developer/Database Manager, work to ensure the QEP assessment database is completed; all Scantron-readable forms are developed and printed.

Fall 2007
• QEP pilot projects are implemented in accordance with QEP and individual designs.
• QEP Coordinator works with the Provost’s Faculty Enrichment Programs to ensure adequate coverage of topics/issues relevant to the QEP and continues additional initiatives to promote the QEP across the University.

Spring 2008
• QEP pilot projects are implemented in accordance with QEP and individual designs.
• Course-level assessment for pilot project classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Annual faculty survey of current active learning practices is conducted.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• QEP assessments and approaches are incorporated into the appropriate departments’ Unit Effectiveness Plan documents.

Summer 2008
• Course-level assessment for pilot project classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in QEP program and assessment strategies.

Fall 2008
• Pilot projects (modified if necessary based upon assessment findings) are implemented in accordance with QEP and individual designs.
• Biannual survey of UEPs to identify higher order thinking learning outcomes is completed.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• QEP pilot project primary investigators develop teaching circles.

Spring 2009
• Pilot projects (modified if necessary based upon assessment findings) are implemented in accordance with QEP and individual designs.
• Course-level assessment for pilot project classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Annual faculty survey of current active learning practices is conducted.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• Continued utilization of teaching circles coordinated by QEP pilot project primary investigators.
Summer 2009
• Course-level assessment for pilot project classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in QEP program and assessment strategies.

Fall 2009
• Pilot projects (modified if necessary based upon assessment findings) are implemented in accordance with QEP and individual designs.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• Continued utilization of teaching circles coordinated by QEP pilot project primary investigators.

Spring 2010
• Pilot projects (modified if necessary based upon assessment findings) are implemented in accordance with QEP and individual designs.
• Course-level assessment for pilot project classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Annual faculty survey of current active learning practices is conducted.
• QEP assessments are again incorporated into the appropriate departments’ Unit Effectiveness Plan documents for the 2010-2012 cycle.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• Continued utilization of teaching circles coordinated by QEP pilot project primary investigators.

Summer 2010
• Course-level assessment for pilot project classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the pilot project faculty and the Standing Committee for review and discussion of necessary refinements in QEP program and assessment strategies.
• Employ the QEP-Specific Evaluation Rubric to assess impact and institutionalization of QEP initiative.
Appendix F: The QEP-Specific Assessment Rubric*
DIMENSION I: PHILOSOPHY AND MISSION OF ACTIVE LEARNING AS A MEANS TO ACHIEVE HIGHER
ORDER THINKING SKILLS
A primary component of active learning institutionalization is the development of a campus-wide definition for active learning that provides meaning, focus, and emphasis for the QEP effort. How narrowly or broadly active learning is defined on our campus will affect which campus constituents participate/do not participate, which campus units will provide financial resources and other support, and the degree to which active learning will become part of the campus’ institutional fabric.
DIRECTIONS: For each of the four categories (rows), place a circle around the cell that best represents the CURRENT status of the development of a definition, philosophy, and mission of active learning.
Element: DEFINITION OF ACTIVE LEARNING
  Stage One — Critical Mass Building: There is no campus-wide definition for active learning. The term "active learning" is used inconsistently to describe a variety of experiential activities.
  Stage Two — Quality Building: There is an operationalized definition for active learning on the campus, but there is some variance and inconsistency in the application of the term.
  Stage Three — Sustained Institutionalization: The institution has a formal, universally accepted definition for active learning that is used consistently to operationalize many or most aspects of active learning on campus.
  Example Measures: Appearance in catalogs and UEPs.

Element: STRATEGIC PLANNING
  Stage One: The campus does not have an official strategic plan for advancing active learning on campus.
  Stage Two: Although certain short- and long-range goals for active learning have been defined, these goals have not been formalized into an official strategic plan that will guide the implementation of these goals.
  Stage Three: The campus has developed an official strategic plan for advancing active learning on campus, which includes viable short-range and long-range institutionalization goals.
  Example Measures: Number/percentage of departments with specific strategic plan for active learning with short- and long-range goals (e.g., UEP).

Element: ALIGNMENT WITH INSTITUTIONAL MISSION
  Stage One: While active learning complements many aspects of the institution's mission, it remains on the periphery of the campus. Active learning is rarely included in larger efforts that focus on the core mission of the institution.
  Stage Two: Active learning is often mentioned as a primary or important part of the institution's mission, but active learning is not included in the campus' official mission or strategic plan.
  Stage Three: Active learning is part of the primary concern and mission of the institution.
  Example Measures: Number/percentage of departments with specific strategic plan for active learning with short- and long-range goals (e.g., UEP).

Element: ALIGNMENT WITH EDUCATIONAL REFORM EFFORTS
  Stage One: Active learning stands alone and is not tied to other important, high-profile efforts on campus.
  Stage Two: Active learning is tied loosely or informally to other important, high-profile efforts on campus.
  Stage Three: Active learning is tied formally and purposefully to other important, high-profile efforts on campus.
  Example Measures: Number/percentage of faculty and students involved in active learning in classroom, participating in pilot projects and communities of practice.

*Adapted from "Institutionalization of Service-Learning Self-Assessment Rubric" (revised 2002), Andrew Furco, University of California, Berkeley, 1999. Based on the Kecskes/Muyllaert Continuums of Service Benchmark Worksheet, a project of Campus Compact at Brown University. As used in Appendix F, the term "active learning" is meant as a shorthand reference for the concept of "active learning as a means to enhance higher order thinking skills."
DIMENSION II: FACULTY SUPPORT FOR AND INVOLVEMENT IN ACTIVE LEARNING AS A MEANS TO
ACHIEVE HIGHER ORDER THINKING SKILLS
One of the essential factors for institutionalizing active learning is the degree to which faculty members are involved in implementation and advancement of active learning on a campus.
DIRECTIONS: For each of the four categories (rows), place a circle around the cell that best represents the CURRENT status of faculty involvement in and support for active learning.
Element: FACULTY KNOWLEDGE AND AWARENESS
  Stage One — Critical Mass Building: Very few faculty members know what active learning is or understand how active learning is different from other teaching and learning environments.
  Stage Two — Quality Building: An adequate number of faculty members know what active learning is and understand how active learning is different from other teaching and learning environments.
  Stage Three — Sustained Institutionalization: A substantial number of faculty members know what active learning is and can articulate how active learning is different from other teaching and learning environments.
  Example Measures: Number/percentage of faculty surveyed reflecting knowledge of active learning.

Element: FACULTY INVOLVEMENT & SUPPORT
  Stage One: Very few faculty members are instructors, supporters, or advocates of active learning. Few support the strong infusion of active learning into the academy or into their own professional work. Active learning activities are sustained by a few faculty members on campus.
  Stage Two: While a satisfactory number of faculty members are supportive of active learning, few of them are advocates for infusing active learning in the overall mission and/or their own professional work. An inadequate or unsatisfactory number of KEY faculty members are engaged in active learning.
  Stage Three: A substantial number of influential faculty members participate as instructors, supporters, and advocates of active learning and support the infusion of active learning both into the institution's overall mission AND the faculty members' individual professional work.
  Example Measures: Number/percentage of faculty surveyed reflecting implementation of active learning in course work (class and clinical); number/percentage of syllabi that mention active learning; number of web hits; number of faculty participating in teaching circles/communities of practice.

Element: FACULTY LEADERSHIP
  Stage One: None of the most influential faculty members on campus serve as leaders for advancing active learning on the campus.
  Stage Two: There are only one or two influential faculty members who provide leadership to the campus' active learning effort.
  Stage Three: A highly respected, influential group of faculty members serves as the campus' active learning leaders and/or advocates.
  Example Measures: Number of faculty serving as leaders of active learning within campus teaching circles or communities of practice.

Element: FACULTY INCENTIVES & REWARDS
  Stage One: In general, faculty members are not encouraged to engage in active learning; few if any incentives are provided (e.g., Faculty Development Leaves, funds for activities, etc.) to pursue active learning activities; faculty members' work in active learning is not usually recognized during their review, tenure, and promotion process.
  Stage Two: Although faculty members are encouraged and are provided various incentives (e.g., Faculty Development Leaves, funds for activities, etc.) to pursue active learning activities, their work in active learning is not always recognized during their review, tenure, and promotion process.
  Stage Three: Faculty who are involved in active learning receive recognition for it during the campus' review, tenure, and promotion process; faculty are encouraged and are provided various incentives (e.g., Faculty Development Leaves, funds for activities, etc.) to pursue active learning activities.
  Example Measures: Number of faculty recognized for active learning involvement yearly; number of faculty participating in QEP projects; number of faculty receiving active learning related Faculty Development Leaves.
DIMENSION III: STUDENT SUPPORT FOR AND INVOLVEMENT IN ACTIVE LEARNING AS A MEANS TO
ACHIEVE HIGHER ORDER THINKING SKILLS
An important element of active learning institutionalization is the degree to which students are aware of active learning opportunities on campus.
DIRECTIONS: For each category (rows), circle the cell that best represents the CURRENT status of student support for involvement.
Element: STUDENT AWARENESS
  Stage One — Critical Mass Building: There is no campus-wide mechanism for informing students about active learning.
  Stage Two — Quality Building: While there are some mechanisms for informing students about active learning, the mechanisms are sporadic and concentrated in only a few departments, courses, or programs.
  Stage Three — Sustained Institutionalization: There are campus-wide, coordinated mechanisms (e.g., active learning listings in the schedule of classes, course catalogs, etc.) that help students become aware of the various active learning courses that are available to them.
  Example Measures: Number of course syllabi reflecting definition and use of active learning to students.

Element: STUDENT OPPORTUNITIES
  Stage One: Few active learning opportunities exist for students; only a handful of active learning courses are available.
  Stage Two: Active learning options are limited to only a certain group of students (e.g., students in certain majors, honors students, seniors, etc.).
  Stage Three: Active learning options and opportunities are available to students in many areas, regardless of students' major, year in school, or academic and social interests.
  Example Measures: Number of students involved in an active learning-related student-to-student mentor program; recognition of such students.
DIMENSION IV: INSTITUTIONAL SUPPORT FOR ACTIVE LEARNING AS A MEANS TO ACHIEVE HIGHER
ORDER THINKING SKILLS
For active learning to become institutionalized on our campus, resources and support must be directed toward the effort.
DIRECTIONS: For each category (rows), circle the cell that best represents the CURRENT status of your campus’ institutional support for active learning.
Element: COORDINATING ENTITY
  Stage One — Critical Mass Building: There is no campus-wide coordinating committee to assist the implementation, advancement, and institutionalization of active learning.
  Stage Two — Quality Building: There is a coordinating committee on campus, but it either does not coordinate active learning activities exclusively or provides services only to a certain constituency (e.g., students, faculty) or limited part of campus (e.g., certain majors).
  Stage Three — Sustained Institutionalization: The institution maintains a viable coordinating committee that is devoted primarily to assisting the various campus constituencies in the implementation, advancement, and institutionalization of active learning.
  Example Measures: Establishment of University-Wide Standing Committee on Higher Order Thinking and Active Learning.

Element: POLICY-MAKING ENTITY
  Stage One: The institution's official/influential policy-making board(s)/committee(s) do not recognize active learning as an essential educational practice for the campus.
  Stage Two: The institution's official and influential policy-making board(s)/committee(s) recognize active learning as an essential educational practice for the campus, but no formal policies have been developed.
  Stage Three: The institution's policy-making board(s)/committee(s) recognize active learning as an essential educational practice for the campus, and formal policies have been developed or implemented.
  Example Measures: The number of curriculum committees that require an active learning component for new course approval.

Element: STAFFING
  Stage One: There are no staff/faculty members on campus whose paid responsibility is to advance and institutionalize active learning on the campus.
  Stage Two: Campus has an appropriate number of staff members who fully understand active learning and/or hold appropriate titles that can influence the advancement and institutionalization of active learning throughout the campus; however, their appointments are temporary.
  Stage Three: The campus houses and funds an appropriate number of permanent staff members who understand active learning and who hold appropriate titles that can influence the advancement and institutionalization of active learning on campus.
  Example Measures: Number/percentage of staff (e.g., advisors) surveyed reflecting knowledge of active learning and use at UT Arlington, and involvement with faculty to enhance use.

Element: FUNDING
  Stage One: The campus' active learning activities are supported primarily by sources outside the institution (e.g., short-term grants).
  Stage Two: The campus' active learning activities are supported both by sources outside the institution (e.g., short-term grants) and by University funds.
  Stage Three: The campus' active learning activities are supported primarily by University funds.
  Example Measures: Amount of administrative funding for pilot projects.

Element: DEPARTMENTAL SUPPORT
  Stage One: Few, if any, departments recognize active learning as a formal part of their academic programs.
  Stage Two: Several departments offer active learning opportunities and courses, but these opportunities typically are not a part of the formal academic program of the department and/or are not primarily supported by departmental funds.
  Stage Three: A fair to large number of departments provide active learning opportunities that are a part of the formal academic program and/or are primarily supported by departmental funds.
  Example Measures: Number of faculty workshops across campus on active learning; funding of faculty to travel to off-campus active learning workshops; number/percentage of departments with specific plans for active learning (e.g., UEP).

Element: EVALUATION & ASSESSMENT
  Stage One: No organized, campus-wide effort is underway to account for the number and quality of active learning activities taking place.
  Stage Two: An initiative to account for the number and quality of active learning activities taking place throughout the campus has been proposed.
  Stage Three: An ongoing, systematic assessment effort is in place accounting for the number and quality of active learning activities taking place throughout campus.
  Example Measures: Benchmarks (CLA, NSSE, FSSE, Alumni Survey, Employer Survey, pilot project assessments) as to successful use of active learning methods, and comparison of the progress of students in active learning courses with that of other students.
Appendix G: Timeline Following QEP Completion

Summer 2010
• The Standing Committee and University administration determine next steps based on findings of all prior assessments of QEP initiative.
• Begin to implement next steps (e.g., issuing call for second round of active learning course proposals and selection). [Funds are budgeted for the 2010-2011 academic year if the Standing Committee elects to continue the pilot projects during year 4 while implementing next steps and/or for start-up or other programmatic costs.]

Fall 2010
• Aspects of exemplary pilot projects are publicized in order to expand the reach of active learning on the UT Arlington campus; support infrastructure for expansions is fully in place.
• Biannual survey of UEPs to identify higher order thinking learning outcomes is completed.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Spring 2011
• Continued implementation of next steps (e.g., work with selected proposals to prepare for full implementation starting fall 2011).
• A new Second-Round Projects Community of Practice—consisting of the faculty who are implementing second-round proposals and assessment-team members—is constituted.
• Annual faculty survey of current active learning practices is conducted.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Summer 2011
• Continued implementation of next steps (e.g., work with selected proposals to prepare for full implementation starting fall 2011).
• Data from pilot-project assessments are incorporated where applicable as part of respective 2010-2012 departmental UEP assessments and reported to the Standing Committee.
• All QEP-related data are compiled for the SACS Impact Report, and drafting of the Report begins.

Fall 2011
• Second-round projects are implemented in accordance with programmatic and individual designs.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
• Continued drafting of SACS Impact Report.

Spring 2012
• Second-round projects are implemented in accordance with programmatic and individual designs.
• Course-level assessment for second-round classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• SACS Impact Report is submitted.
• Annual faculty survey of current active learning practices is conducted.
• QEP assessments are again incorporated into the appropriate departments’ Unit Effectiveness Plan documents for the 2012-2014 cycle.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Summer 2012
• Course-level assessment for second-round classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in second-round program and assessment strategies.

Fall 2012
• Second-round projects (modified if necessary based upon assessment findings) are implemented in accordance with programmatic and individual designs.
• Biannual survey of UEPs to identify higher order thinking learning outcomes is completed.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.
Spring 2013
• Second-round projects (modified if necessary based upon assessment findings) are implemented in accordance with programmatic and individual designs.
• Course-level assessment for second-round classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Annual faculty survey of current active learning practices is conducted.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Summer 2013
• Course-level assessment for second-round classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in second-round program and assessment strategies.

Fall 2013
• Second-round projects (modified if necessary based upon assessment findings) are implemented in accordance with programmatic and individual designs.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Spring 2014
• Second-round projects (modified if necessary based upon assessment findings) are implemented in accordance with programmatic and individual designs.
• Course-level assessment for second-round classes taught in the fall is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Annual faculty survey of current active learning practices is conducted.
• Second-round course assessments are again incorporated into the appropriate departments’ Unit Effectiveness Plan documents for the 2014-2016 cycle.
• QEP Coordinator continues to work with the Provost’s Faculty Enrichment Programs and to pursue initiatives promoting the QEP across the University.

Summer 2014
• Course-level assessment for second-round classes taught in the spring is completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in course or assessment strategies.
• Program-level and University-level assessments are completed by Undergraduate Reporting and Assessment Specialist. The findings are shared with the course faculty and the Standing Committee for review and discussion of necessary refinements in second-round program and assessment strategies.
• Employ programmatic rubric (comparable to the QEP-Specific Evaluation Rubric) to assess impact and institutionalization of full active learning initiative.
• The Standing Committee and University administration determine next steps based on findings of all prior assessments of full active learning initiative.

Fall 2014 – Spring 2017
• Determine next steps (e.g., continue to disperse active learning throughout the University).
• Utilize assessment findings from QEP and subsequent active learning initiatives to inform the University’s strategic plan evaluation (2016-2017).
Appendix H: Acknowledgments
This report is the culmination of the shared labors of many faculty, staff, students, alumni, employers, and other friends of the UT Arlington community, to whom the SACS Leadership
Team and the Steering Committee express their gratitude. Major contributors are listed below.
SACS Leadership Team
James D. Spaniolo (President); Dana Dunn (Provost); Michael Moore (Senior Associate
Provost); Pam Haws (SACS Liaison and Compliance Certification Coordinator); Victoria A.
Farrar-Myers (QEP Coordinator)
QEP Steering Committee
Rebecca Boles (Architecture); Mark Eakin (Business Administration); Jill Fox (Education); John
Priest (Engineering); Denny Bradshaw (Liberal Arts); Cheryl Anderson (Nursing); D.L. Hawkins
(Science); Doreen Elliott (Social Work); Joel Goldsteen (Urban and Public Affairs); Minerva
Cordero-Epperson (Honors); Dan Formanowicz (Faculty Senate); Araya Maurice (Academic
Affairs); Pat O’Neill (Student Affairs); Dawn Remmers (Advising); Julie Alexander (Library); Phil
Cohen (Graduate School); Carl Esposito (student); Holly Lortie (Student Governance Liaison);
Samuel Odamah (student); Bobbie Brown (GRA); Cheryl Cardell (Academic Affairs); Sharon
Judkins (Nursing); Sheena Odwani (student); Nancy Rowe (Statistical Services); Steven Apell
(GRA)
Primary Investigators (bolded) and Their Co-Authors
School of Architecture: Rebecca Boles, Dean Donald Gatzke, John Maruszczak, Thomas Rusher
College of Business Administration: Edmund Prater, Patricia Swafford
College of Education: Jeannine Hirtle, Nancy Hadaway, Beverly Boulware, Joy Wiggins, Kathleen Tice
College of Engineering: Bonnie Boardman, William E. Dillon, Lynn Peterson
Honors College: Robert McMahon, Karl Petruso, and Minerva Cordero-Epperson
College of Liberal Arts: Stephanie Cole, Rebecca Deen, Victoria A. Farrar-Myers, and Dale Story
University Library/College of Liberal Arts: Gretchen Trkay and Joanna Johnson
School of Nursing: Fran Martin
College of Science: Tuncay Aktosun, Ruth Gornet, Hristo Kojouharov, Barbara Shipman, Jianzhond Su
Other Pre-Proposal Submissions
School of Architecture: John Maruszczak, Ralph Hawkins, Brad Bell, Donald Gatzke, Rebecca
Boles, Roger Connah
College of Business Administration: Cynthia Kalina-Kaminsky and Cheryl Prachyl
Career Development: Cheri Butler
Campus Recreation: Sharon Carey
College of Education: Lee Ann Dumas, David Stader, Michael Anderson, Shirley Theriot and Kim
Ruebal
College of Engineering: John Patterson
Honors College: Karl Petruso
College of Liberal Arts: Miriam Byrd, Richard Francaviglia, Ann Hodges, Mary French, Charles
Knerr, Charles Nussbaum, Harry Reeder, and Kenneth Roemer
University Library: Julie Alexander and Evelyn Barker
School of Nursing: Mindy Anderson, Judy LeFlore and Jacqueline Michaels
Office for Students with Disabilities: Dianne Hengst
College of Science: Nilakshi Veeribathi, Robert Bonadurer, Manfred Cuntz, Jianzhond Yu, James
Horwitz, and Barbara Shipman
School of Urban and Public Affairs: Allen Repko
Deans of the Colleges and Schools
Donald Gatzke (School of Architecture); Daniel Himarios (College of Business Administration);
Jeannie Gerlach (College of Education); William Carroll (College of Engineering); Robert
McMahon (Honors College); Beth Wright (College of Liberal Arts); Elizabeth Poster (School of
Nursing); Paul Paulus (College of Science); Santos Hernandez (School of Social Work); Richard
Cole (School of Urban and Public Affairs)
Vice Presidents
Gary Cole (Vice President for Development); Ron Elsenbaumer (Vice President for Office of
Research); John Hall (Vice President for Administration and Campus Operations); Frank Lamas
(Vice President for Student Affairs); Suzanne Montague (Vice President for Office of
Information Technology); Amy Schultz (Interim Vice President for Communications); Rusty
Ward (Vice President for Business Affairs and Controller)
Student Congress Executive Board
Zac Sanders (President of Student Congress); Collins Watson (Vice President of Student
Congress); Holly Lortie (Parliamentarian of Student Congress); Kelly Earnest (Program Director of Student Congress); Stacey McKendry (Secretary of Student Congress); and Student Congress
Senators for 2005-2006 and 2006-2007
Northeastern University Summer Institute on Experiential Education at Martha’s Vineyard
UT Arlington team members: Jill Fox and Greg Hale; Martha’s Vineyard Summer Institute staff
QEP Marketing Task Force/Committee
Task Force: Kerri Ressl (Co-Chair/Alumni); Andy Axsom (Co-Chair/Advising Staff); Ricardo
Lopez (Employer/Alumni); Evelyn Barker (Library); Danny Woodward (President’s Office);
Carol Lehman (University Publications); Mark Lansdon (SACS Website); Seth Ressl
(Programming Events/Staff); William Knisley (Students); Andrea Sample (Students); Joe Tesoro
(Students); Carl Esposito (Students); Mark Peterson (Marketing Faculty Member)
Committee: Lisa Nagy (Co-Chair/Student Affairs); Andy Axsom (Co-Chair/Advising Staff); Kelly
Earnest (Student Affairs/Student Congress); Carl Esposito (Visitor’s Center/Alumni/Graduate
Student); John Hillas (Student Affairs); William Knisley (E.X.C.E.L. Campus Activities); Carol
Lehman (University Publications); Ricardo Lopez (UTA Alumni/Employer); Angel Taylor
(Student Success Programs); Dara Newton (Undergraduate Recruitment/Admissions); Mark
Peterson (Marketing Faculty Member); Michael Poole (E.X.C.E.L. Campus Activities); Amanda
Pritchard (Graduate Studies/Graduate Student); Kerri Ressl (Alumni Association)
External Speakers
Charles C. Bonwell; Joseph Hoey
Office of University Publications
Carol Lehman (Graphics); Robert Crosby (Photography); Chuck Pratt (Assistant Director of
University Publications); Teodozja Korzeniowski (Web Specialist); Mark Permenter (Director of
University Publications)
Video Services
Lisa Evans-Reagan (Director of University Video Services), James Sanders (TV Producer-
Director), Don Harris
Student Affairs Division
Jeff Sorenson (Assistant Vice President for Student Governance and Organizations); Marty
Sorensen (Assistant Vice President for Student Affairs); Molly Alfers (Assistant Director); Cathy
Prichett (Director of Programming, Honors College); Sharon Carey (Director of Campus
Recreation); Drew Barfield (Assistant Director of Intramural Sports & Sports Clubs); Chris
Muller (Associate Director of Campus Recreation); Doug Kuykendall (Assistant Vice President of
Student Affairs)
Distance Education
Pete Smith (Assistant Vice President of Distance Education); Mark Chapman (Technical Media
Coordinator); Clayton Beberstein (Manager, Technical Operations)
UT Arlington Library
Gerald Saxon (Dean); Mary Jo Lyons (Coordinator for Information Literacy); Gretchen Trkay
(Instruction & Information Literacy Librarian); Tommy Wingfield (Librarian); Kristin Swenson
(Web Specialist)
UT Arlington Classes
INTD 4345 (Architectural Graphics) and especially Tabitha Pease, Cortney Wells, Tetsuya
Okamoto, and Mary Duke; SOCW 6396 (Social Work Education: Principles and Skills) Spring
2006 Class
Consultants
Joni Spurlin (University Director of Assessment, North Carolina State University); Donna
Qualters (Director of Center for Effective University Teaching, Northeastern University); Mary
French (Assistant Professor in English)
Additional Contributors
Associate Deans and Academic Directors
Association of Academic Directors and Chairs
Bart Weiss (Associate Professor of Art & Art History)
Bill Daley (Director, Office of Information Technology)
Bob Wegner (Associate Dean for Program Research and Evaluation, SUPA)
Chairs, Departments, Colleges, and School faculty meetings who hosted QEP discussions
Christina Cobb (Alumni Director) and Alumni Board of Directors
David Tees (Director of Training and Services, SUPA)
Delene Remmers (Administrative Assistant, Counseling & Career Development)
Development Board of the University of Texas at Arlington
Elfriede Foster (Instructor, Architecture & Environmental Design)
Faculty Senate
Faculty survey respondents
Fatema Lotia (Student)
George Chave (Associate Professor of Music)
Judy Young (Director, Office of International Education)
Karl Petruso (QEP Editor, Professor Anthropology)
Linda Wilson (Assistant Provost)
Mary Lynn Crow (Professor of Education)
Office of Institutional Research, Planning, and Effectiveness
Offices of the President and Provost
Respondents to QEP White Paper and Iterative QEP draft release
Robert Hower (Chairperson and Professor of Art & Art History)
Stephanie Hamm (Social Work Ph.D. Student)
Stephen Morris (Student)
Strategic planning conversation and focus group participants
Student and Alumni survey respondents
Student presentation participants
Undergraduate and Graduate Assembly