EDRS 820 Executive Summary and Final Report


Running head: SIMPLE EVALUATION SUMMARY AND REPORT

Evaluation Executive Summary and Final Report

SIMPLE Design Framework for Interactive Teaching Development

Alison Smith

Robert Stansbery

Colleen Barry

George Mason University



Executive Summary

The SIMPLE Design Framework is a program designed to support university faculty in examining, developing, and implementing evidence-based interactive instructional practices in science, technology, engineering, and mathematics (STEM) courses. STEM faculty are hired, in many cases, as accomplished researchers in their fields; once hired at the university level, tenure-track faculty are encouraged and evaluated primarily as researchers and authors, and little training or development is provided to them in pedagogy (Fiore & Rosenquest, 2010;

Hjalmarson, Nelson, Huettel, Padgett, Wage, & Buck, 2013; Theall, 2005). In many STEM courses, particularly large courses that may contain hundreds of students, instruction is traditionally didactic and teacher-centered, with the instructor teaching through lecture (Hiebert & Morris, 2012). Yet contemporary research strongly supports a more interactive, engaging, and student-centered style of instruction, linking interactive instruction with higher levels of student achievement (Fiore & Rosenquest, 2010). This gap between research and practice has created a dilemma for university faculty members who recognize a need for changes to their instructional practices but have little training in instruction in their content-specific backgrounds. The SIMPLE Design Framework seeks to provide a format through which STEM faculty may collaborate with their colleagues to explore and experiment with changes to their instructional style.

The program uses a theory of change in which small, incremental changes in teaching practices will lead to larger, more comprehensive changes both individually (the instructor) and organizationally (throughout the department, the university, and, on a larger scale, the academy as a whole). It is designed to meet the needs of individual instructors as well as provide networks of “critical friends” through communities of practice in which colleagues share and

reflect upon their successes, challenges, solutions, and innovations in instructional practices and thereby help each other to implement similar changes.

To this end, a group of faculty from varying disciplines meets monthly to discuss practical issues and research related to interactive instruction. In turn, each member of this group acts as a facilitator for a small group of four to six instructors from similar content areas; this group acts to foster communication and provide a forum for discussion about new practices.

Design Memos, written by group members, document each new practice and serve to foster dialogue and reflection upon how each new practice was successful or challenging. In the coming academic year, the plan entails expansion of the faculty groups into further departments and additional groups. Stakeholders in the program include principal investigators Dr. Margret

Hjalmarson, Dr. Jill Nelson, Dr. Anastasia Samaras, and Dr. Cody Edwards, group leaders and small group participants, other STEM faculty members at George Mason University, STEM students, and The National Science Foundation, under whose grant the program is funded.

The ensuing formative evaluation examines the fidelity of the program implementation and addresses the following questions developed in conjunction with the evaluators and key stakeholders (Corbin & Strauss, 2008; Fitzpatrick, Sanders, & Worthen, 2011):

1. Broadly, how are the teams functioning?

2. Is the program sustainable? As it proceeds from one year to the next, will it be possible for interactive teaching strategies to build upon what has been done at this point? What might be the challenges to sustainability, and which aspects of the program work well in the current framework design?

3. How do the Design Memos foster interactive teaching practices? In what ways do the Design Memos help participants, and in what ways do they fall short?



Methods utilized in the evaluation were largely qualitative and included document analysis, semi-structured interviews, and observations of group leader meetings. Data sources included agendas and minutes of monthly group leader meetings, “Check-In” documents authored by group leaders, the evaluators’ observation field notes from group leader meetings, observation data recorded by a graduate research assistant, and Design Memos.

Additionally, data sources included interviews with Dr. Jill Nelson and group leader Dr. Jessica

Rosenberg. Using a grounded theory approach, the evaluators coded, compared, and analyzed sections of data to create broader categories with subthemes. The evaluators analyzed data individually, then discussed and refined categories and themes until reaching agreement (Corbin & Strauss,

2008).

Key findings follow several themes. To a large degree, the teams are functioning quite differently, meeting with varying regularity and making varying use of Design Memos. Ongoing dialogue continues to take place around the sharing of ideas, successes, and challenges in the development and implementation of interactive teaching practices. Groups and group leaders face a number of challenges, including difficulty scheduling and maintaining meetings due to the complex schedules of faculty instructors, who also serve as graduate mentors and, in the case of tenure-track faculty members, as prolific researchers and authors. However, the data also pointed to evidence of substantial growth in the development of interactive strategies, collegial collaboration, and a high degree of enthusiasm for the continuing project.

As a result of the evaluation findings, the evaluators suggest the following as possibilities to sustain and expand the SIMPLE Design Framework:


 A more structured approach to the use of Design Memos, including clearer direction and a template that preserves the desired level of flexibility while offering the scaffolding necessary to motivate busy faculty to complete them.

 Recruitment of faculty from the College of Education and Human Development as guest speakers at SIMPLE meetings, to aid continued faculty change by sharing their expertise in the science of teaching and learning.

 Pairing of successful group leaders with those whose small groups may be struggling, capitalizing on the success of colleagues. This pairing will allow group leaders to discuss successful practices, reflect on their own practice, and problem-solve together.

Additionally, this will provide an opportunity for those who may be struggling to observe what happens in successful groups and note strategies that they may translate to their own groups.

Introduction and Literature Review

STEM faculty members have multiple roles and responsibilities to fulfill beyond teaching. These professionals enter academia with highly refined expertise in content knowledge, research methodology, and practical skills, yet often they lack an understanding of effective teaching practices (Hiebert & Morris, 2012; Theall, 2005). This lack of teaching skills likely stems from the common criteria for hiring, promoting, and awarding tenure to instructors at most research universities, which focus on research contributions to the

STEM discipline (Suchman, 2014). As such, STEM pedagogy in higher education has been characterized by faculty who focus on transmission of discipline-based content knowledge to students. Yet there have been calls to change this culture of transmission-focused pedagogy in

STEM due to a robust body of research findings that support the benefits of engaged learning for


STEM undergraduates and their future careers in the field (AAU, 2013; Jamieson & Lohmann,

2012). The challenge facing STEM faculty then becomes how to begin implementing pedagogical changes in the college classroom in a sustainable way. The focus of this review, as an integral piece of the program evaluation, is to examine the current research on moving STEM faculty toward interactive teaching methods, as well as to examine ways that such changes become practice and continue to develop, at both the individual and institutional levels.

Problem

Traditional instruction in higher education, particularly in STEM courses, is structured around the static delivery of content from teacher to students, primarily via lengthy lectures and slides. The literature on interactive pedagogy for changing the way these courses are delivered is growing, yet there still exists a considerable gap in how these practices are translated into the university classroom (Fiore & Rosenquest, 2010; Hiebert &

Morris, 2012; Hjalmarson et al., 2013). At its core, pedagogy in the STEM classroom must shift beyond a passive transmission of information and toward an interactive approach that engages learners, retains students in the subject, and better prepares them for long-term success in the

STEM field. In order for this shift to occur, sustainable approaches to changing faculty teaching practice in the long term must be established.

Pedagogical Change Strategies

There is a burgeoning need in the United States to shift instruction in university STEM courses from instructor-centric to student-centric (Henderson et al., 2011). For this shift to occur, a fundamental change is needed both in pedagogical practices and in how those practices are regarded as essential to faculty work. While there is much research dedicated to

improving teaching and learning, many of these strategies have not been successfully adopted into the changing paradigm of higher education (Darlaston-Jones, Pike, Cohen, Young, Haunold,

& Drew, 2003; Hay, Wells, & Kinchin, 2008).

Henderson et al. (2011) reviewed literature on change strategies (coherent action plans for changing the practices of a group of individuals) used in undergraduate STEM courses.

The authors found broad groupings of researchers, each focusing on different aspects of teaching and learning. STEM-based researchers relied heavily on dissemination of curriculum and pedagogy, whereas faculty development researchers devoted much of their work to fostering reflective practitioners. Finally, higher education researchers homed in on broad-based policy authorization. Not only did these bodies of literature remain largely isolated from one another, but a mere 21% of the studies reviewed provided strong evidence of effective change strategies.

According to Ferrini-Mundy and Güçler (2009), there are three broad strategies for improving STEM pedagogy in universities. The first consists of building communities in institutions that are committed to changing pedagogy at the undergraduate level. The second relates to curricular resources that are more conducive to student learning. The third focuses on pedagogical changes to improve student learning, including the use of interactive technology such as clickers and engaging students in other interactive activities. The authors further examine the types of pedagogy research carried out in the recent movement to transform STEM education. In STEM fields, particularly engineering, there is an established need for a shift in pedagogy; however, there appear to be no clear signs of progress toward making this shift (Splitt, 2003). While numerous positive findings on interactive teaching

strategies in STEM have been reported and continue to be researched, a clear next step is to plan a strategy for effecting these changes, beginning at the individual and departmental levels.

Student Engagement and Learning

There exists a growing base of literature focused on the retention of students as it relates to institutional and faculty commitment in higher education settings (Peel, 2000; Tinto & Goodsell-Love, 1993). A natural by-product of faculty and institutional commitment to learners is the initiative to improve teaching.

This begins with an examination of how student learning occurs at the undergraduate STEM level. Relatively little is known about the specific processes that document student understanding of concepts in specific disciplines (Laurillard, 2002).

Measuring such change is often deemed nebulous; thus, many frameworks have either not been sufficiently developed or not been applied. Hay et al. (2008) argue that discipline-specific learning in higher education can be defined through learning as change and measured through visible means such as concept mapping. The authors stressed that student and instructor must work collaboratively so that both become agents of change in the classroom. Findings supported evidence-based concept mapping as effective in producing deep knowledge; further, the approach “has the potential to engage students and teachers in a discourse of scholarship” (p. 235).

Stemming from these responsive practices is the link to student retention and students’ commitment to their academic careers. Darlaston-Jones et al. (2003) conducted a series of interviews and surveys to better understand the climate of lecture classrooms at a large university. They found that students who felt a sense of belonging and acceptance in their academic community were less likely to withdraw from their studies than students who felt isolated and detached. Further, one factor that students who withdrew identified as contributing to their dissatisfaction was a lack of interpersonal activities and integration of content

outside of lectures. Students reported that they felt ignored in large lectures and were hesitant to contact the professor to seek assistance or guidance. Darlaston-Jones et al. (2003) note that students are more likely to become and remain engaged in their learning if they are given opportunities to practice both self-directed and small-group directed learning. In order to create an accepting community within a classroom, the professor must establish his or her role as a facilitator of learning and a resource to students. Theall (2005) reported related findings on the importance of student feedback for faculty members’ continuous improvement of their teaching, in a study that examined professional learning communities of teaching practice for faculty.

Faculty Learning and Development

The aforementioned research on student engagement supports the idea that self-directed learning as well as small-group directed learning are essential for students to remain engaged.

Similar findings exist in the extant literature on faculty learning and professional development.

Theall (2005) reviewed literature on college professor skill sets and pedagogy and developed a project aimed at the meta-profession of academia. One of the project’s primary objectives is to engage faculty members in professional dialogue and provide a structure to develop and evaluate faculty skills for the enrichment of student learning. The implications of such an undertaking center on grounding university faculty professional development in dialogue among faculty, so as to develop a responsive community of professional development.

Community building is a central aspect of collaborative learning within institutions of higher education. Fiore and Rosenquest (2010) explore how teacher pedagogy is cultivated through monthly inquiry groups in which teacher documentation of student learning serves as a

source of dialogue for examining teaching practices. Documentation of learning (e.g., work samples, video, audio, photos) provides opportunities for teachers to actively engage and reflect, themes embedded within the inquiry groups. The groups prompt teachers to move out of isolation and into an enriching collaborative environment. “Multiple perspectives often generate new questions and challenges…. The explicit act of making our teaching public has a ripple effect—similar to a still pond after a pebble has been tossed in the water” (Fiore & Rosenquest, 2010, p. 19). Similarly, Rallis et al. (2006) reported on a learning community of school superintendents implemented over several years. Year 1 focused on creating a community of superintendents and outlined explicit expectations, such as encouraging individuals to share work often, modeling sustained focus on an issue, and ensuring all discussion points were situated in the teaching and learning literature. Year 2 refined these expectations into group norms covering attendance, involvement, respect for confidentiality, candor and humility, and attentiveness. Year 3 added three superintendents to the community and focused on reflective sessions following direct observation of one another in their specific school settings. After years two and three of the program, qualitative data analysis revealed that the superintendents were modeling meetings in their own districts after the design of the network, lending support to the ideas that learning communities can effectively grow and have a sustained effect on others in the school community, and that faculty development changes take place over longer periods of time. Hiebert and Morris (2012) examined a reform effort in China whose success was also based on collaboration: teachers improved their pedagogy by observing one another, meeting collaboratively to discuss detailed lesson plans, and engaging systematic strategies to incorporate innovative teaching into the curriculum.


Not only are learning communities and collaboration key elements in beginning pedagogical change, but so are the individual instructor’s beliefs about teaching and learning. For pedagogical change to occur, the instructor must believe, or come to believe, that his or her current teaching practices, as well as goals for student learning, are unsatisfactory (Gess-Newsome et al., 2003). In other words, the faculty member must be motivated to change his or her teaching practice, because pedagogical practice is often deeply rooted in the instructor’s individual beliefs about teaching and learning, an idea further supported by Trigwell and Prosser (1996). The authors studied the correlation between conceptions of teaching and learning in a group of first-year science lecturers at a university and found that “those teachers who conceive of learning as information accumulation to meet external demands also conceive of teaching as transmitting information to students, and approach their teaching in terms of teacher-focused strategies” (Trigwell & Prosser, 1996, p. 281).

The research findings reviewed here on fostering student engagement, faculty learning and development, and the role of collaboration in effective faculty learning support the principles underlying the SIMPLE program design. To further illustrate, social cognitive learning theory as it applies to SIMPLE is discussed next.

Conceptual Framework for the SIMPLE Design Framework

The SIMPLE Design Framework is a model developed to support undergraduate science, technology, engineering, and mathematics (STEM) faculty in bringing evidence- and research-based interactive teaching practices into undergraduate courses. The program’s initiative is informed by the literature on collaborative teaching and learning (Fiore & Rosenquest, 2010; Rallis, Tedder, Lachman, & Elmore, 2006; Theall, 2005; Tinto & Goodsell-Love, 1993). The mission of the SIMPLE Design Framework is to support faculty development of interactive teaching approaches via small, supportive group networks of faculty from similar disciplines. The SIMPLE program design is based on research-supported principles that relate closely to the theory of learning put forth by social cognitive theory.

Among contemporary perspectives on social cognitive theory, Albert Bandura’s work has received wide recognition. Of the many components that comprise the theory, this review focuses on social persuasion and its relationship to self-efficacy, and on how these concepts support the principles of the SIMPLE program.

Social cognitive theory holds that individuals acquire knowledge through observing interactions within social contexts. The consequences of the actions of others help to shape the observer’s skills, beliefs, and attitudes, and help build a general framework of his or her world (Schunk, 2012). Extending this idea to teaching, adults learn through repeated opportunities to examine issues and to model from their peers. In their work on collaborative teaching and teaming, Rallis et al. (2006) discuss the importance of capitalizing on the social nature of teaching and learning and cite the concept of communities of practice. These groups are intended for professionals to come together in a neutral space and work toward a common goal of shared enterprise.

Together, those involved “work together to test out ideas, critique one another’s work, offer alternative conceptualizations, and provide both emotional and intellectual support” (Rossman & Rallis, 2003, p. xvi). Likewise, an aim of the SIMPLE program is to effect changes over time by forming small, supportive groups of faculty. The program officers’ idea behind the small group model is that learning is most effective when individuals collaborate with others, particularly when learning from peers who have more experience with the learning outcome; in this case, small group participants will learn from colleagues who are more experienced with interactive learning strategies.


Building on the idea that adults learn by observing interactions within social contexts is the idea of social persuasion. Bandura (1986) explains that verbal persuasion from peers, as well as the observation of peers’ success, contributes to an individual’s sense of self-efficacy in that same context. Self-efficacy refers to an individual’s beliefs about his or her ability to carry out an activity. Previous studies have found support for increased adult learner motivation and self-efficacy in settings where learning strategies incorporated successful peers as resources and role models (McAlearney, Robbins, Kowalczyk, Chisolm, & Song, 2012).

Although the McAlearney et al. (2012) study focused on employee training in a healthcare setting, the principle of learning from successful and experienced peers can be applied to the

SIMPLE design in its emphasis on support from, and learning from, fellow faculty within the discipline. An aim of the SIMPLE program is to disseminate teaching practices across the wider

STEM community, which the authors theorize will be dependent upon faculty recruiting their peers to participate. In addition, the increased self-efficacy that will ideally result from observing and working with experienced colleagues will lead to increased motivation for faculty to implement teaching changes, which many perceive to be a daunting task. The SIMPLE program officers state that an important characteristic for participants is the desire to try implementing interactive teaching strategies, an idea further upheld by research on teaching beliefs (Gess-Newsome et al., 2003).

Drawing these threads together, Hjalmarson et al. (2013) sought to incorporate innovations in teaching strategies presented in the literature with underpinnings from social cognitive theory to form the collaboration- and support-focused design of the SIMPLE program. It was hypothesized that learning occurs best in collaboration with peers, an idea upheld by other studies of adult learning grounded in social cognitive theory. The use of ongoing small collaborative faculty groups will

function as a platform for presenting new teaching strategies, transforming traditional STEM instruction in higher education in small, incremental steps. These ongoing groups will continue to disseminate these practices, based on the model of peer leadership and collaboration. A detailed description of the program is provided below.

Program Description

The SIMPLE Design Framework is a model developed to support undergraduate science, technology, engineering, and mathematics (STEM) faculty in bringing evidence- and research-based interactive teaching practices into undergraduate courses. The program’s mission is to support faculty development of interactive teaching approaches via small group networks of faculty from similar or complementary disciplines. It is hypothesized that participation in these small group networks will facilitate small changes to teaching and pedagogy that will lead faculty to make longer term, more permanent changes over time.

The SIMPLE Framework hypothesizes that by participating in support networks focused on small, incremental change in teaching practice over time, faculty will successfully effect larger changes in the long run (Hjalmarson et al., 2013).

Problem to be Addressed

The social problem addressed relates to the application of research-based pedagogy by university faculty in teaching STEM subjects. Historically, teaching of STEM subjects at the university level has been carried out through a transmission teaching style, emphasizing student mastery of content knowledge as conveyed by a content expert (i.e., the faculty member). In many universities, tenure in the STEM fields is granted based on faculty research and contributions to the field alone. There is often little or no training for STEM faculty regarding teaching theory or pedagogy (Suchman, 2014). As a result, STEM faculty rely on personal experience as students

and teachers, as well as their content knowledge, when teaching. Research has demonstrated that an increase in engaging, interactive pedagogies by STEM faculty is associated with increased student motivation, learning, and interest in the field (Jamieson & Lohmann, 2009).

Program Vision, Principles and Goals

The SIMPLE program aims to address the issue of STEM faculty pedagogical change at

George Mason University, based on the PIs’ experiences in a similar, earlier grant as well as current research on developing faculty teaching practice.

Dr. Nelson and Dr. Hjalmarson carried out a related research study in 2012 that contributed to the SIMPLE program’s design. This study examined the effects of small groups of faculty, along with shared resources, in pursuit of improved teaching strategies for engineering faculty. The authors succeeded in their goal of founding small, supportive groups focused on faculty teaching practice, with central aspects being the shared, formative evaluation of teaching practices by faculty (sharing and discussions) as well as deliverables such as design memos, in which faculty describe and outline their proposed classroom changes (Hjalmarson et al., 2013). These successful outcomes carried over as essential aspects of the SIMPLE program: small groups to foster discussion and participation, emphasis on small incremental changes, and a supportive and open environment that faculty felt would be responsive to each individual’s needs.

In addition to findings from the PIs’ previous faculty development grant, the design of the SIMPLE program also drew upon research related to faculty teaching development. One of the principal ideas behind the concept of the SIMPLE program and its implementation is that teaching is a design process. The PIs drew upon Laurillard’s (2012) concept of teaching as design, in which teachers should be responsive to their classroom environments and

make pedagogical changes accordingly. Rather than simply comparing best practices, the design concept emphasizes documenting proposed changes as well as continuously evaluating their implementation. The SIMPLE design allows for this by providing small, supportive groups of faculty that document their proposed teaching changes in design memos and discuss these changes in regularly scheduled meetings throughout the academic year. The creation of a tangible product to document design is further supported by research on K-12 teacher development. Another principle of the SIMPLE program is the idea that faculty collaboration and support are essential to disseminating interactive teaching practices. In particular, the ability to collaborate with colleagues who are experienced and/or knowledgeable about higher education teaching practices is beneficial (McKenna, Yalvac, & Light, 2009).

Another theme that arises regarding teaching development for university faculty is the gap between research and implementation. Despite the positive findings from many studies regarding implementation of interactive teaching strategies for STEM faculty, many traditional teaching practices continue unchanged (Cox & Harris, 2010; Jamieson & Lohmann, 2009). The

SIMPLE program seeks to address this gap by designing a deliberately open framework of small groups, to encourage increasing numbers of faculty to participate each year. The design is flexible enough to allow faculty with no experience in interactive teaching to begin developing their practice, while also supporting faculty who have been developing their practice for some time to carry out self-studies and eventually submit their findings for publication. Ideally, such publications could serve as evidence for promotion and tenure, as research has suggested that STEM faculty’s teaching practices should be considered in promotion decisions (Suchman, 2014).


Considering the outcomes of earlier small-group-based designs and existing research on faculty teaching development, two main goals for the current SIMPLE program are as follows:

1. Support the implementation and use of evidence-based and research-based teaching practices by faculty, document the design of these practices (Design Memos), and encourage self-study and further inquiry on teaching practices by faculty as they become more experienced.

2. Broaden the implementation of, and participation in, the SIMPLE program across STEM departments at George Mason University.

Program Design and Implementation

The SIMPLE program has a three-year outline for implementation. In all stages of program implementation, an emphasis on small groups is essential to ease scheduling and to promote a supportive environment for participants.

In Year One, group leaders participate in a small group in preparation to lead their own small groups, termed teaching design groups, the following year. Teaching design groups comprise approximately four to six individuals, usually from the same STEM discipline (to the extent possible), and meet roughly one to two times per month to discuss interactive teaching strategies. The small group size is purposeful and essential to foster communication and provide a comfortable environment that encourages individuals to share the changes they are making to their teaching practices. Group leaders seek to recruit participants who are open to discussing, and trying, small changes to their teaching practice. The creation of design memos is a critical component of teaching design, as they serve as documentation of each participant’s proposed teaching strategies, as well as an avenue to share ideas and information with others who may be interested in implementing similar practices. At least one of the teaching design groups will include

SIMPLE EVALUTION SUMMARY AND REPORT 18 participation by graduate teaching assistants, as they are heavily involved in the classroom activities of many STEM faculty in large undergraduate classes; and thus should be included in the development of any changes to teaching practice.

In Years Two and Three of the program, Teaching Inquiry Groups will be formed, usually by experienced participants who have been implementing design-based interactive teaching strategies for some time. The aim of these more intensive groups is to encourage faculty self-study and research on teaching strategies, with the goal of possible publication. Like the Teaching Design Groups, the Teaching Inquiry Groups will remain small and will meet approximately one to two times per month. The role of the group members becomes one of “critical friend”: providing advice and feedback, as well as performing validity and reliability checks on qualitative data collected or instruments issued to students as part of a faculty participant’s research efforts.

At the time of this evaluation, the SIMPLE program is roughly in Year Two. Approximately eight faculty are leading small groups. These faculty hold meetings approximately once per month, which are attended by the evaluation team when possible. Discussion continues on how each participant’s small group is progressing in terms of Design Memo development and sharing, as well as on the formal start of Teaching Inquiry Groups in the fall.

Participants are also discussing their plans for developing research and conference proposals.

Longer-term impact. In addition to supporting the research and implementation of interactive teaching strategies by George Mason University faculty, the SIMPLE design will provide a framework for faculty teaching development at other universities. The model is flexible enough to be implemented elsewhere while still providing guidance toward long-term goals and outcomes.


Stakeholders. Stakeholders include:

• The project team and group leaders (some overlap)
• PIs (Margret Hjalmarson, Jill Nelson, Cody Edwards)
• Group leaders and small group participants, including teaching assistants
• The broader network of George Mason University’s STEM faculty who are potential SIMPLE participants
• STEM students
• The NSF

Program Theory and Action

Theories of Change

Foundational to the SIMPLE Design Framework are three conceptualizations: that collegial sharing and reflection ease the transition to a more interactive pedagogical practice; that incremental changes in pedagogical practices lead to changes at a larger level, both personally (the instructor) and professionally (the university and the academy at large); and that interactive pedagogy leads to increased student achievement.

Colleagues sharing their ideas, successes, and challenges and engaging in group reflection leads to both a greater awareness of the range of interactive teaching practices and an increased ease in implementing new strategies in the classroom, both individually and collectively. Small changes made by one instructor, reflected upon and shared with colleagues, may build upon each other incrementally, leading to greater changes as the instructor expands upon those experiences and as colleagues pool their experiences in organizational growth and professional development. While student engagement, motivation, and, ultimately, achievement have all been linked to interactive teaching practices, pedagogy in the large STEM courses addressed in the framework is traditionally didactic and student-passive; an increase in interactive strategies designed to actively engage students is expected to drive enthusiasm and spur inspiration, thereby increasing student achievement (Fiore & Rosenquest, 2010).

Theory of Action

Built upon the theory of change described above, the SIMPLE Design Framework theory of action may be described as follows:

If the SIMPLE Design Framework group leaders engage in discussion and reflection on successful, research-based instructional practices in regularly scheduled group meetings, then the group leaders will engage their colleagues in regularly scheduled Teaching Design Group meetings.

If the group leaders engage their colleagues in regularly scheduled Teaching Design Group meetings, then the participating instructors will develop supportive collegial networks and will develop Design Memos.

If the participating instructors develop supportive collegial networks and Design Memos, then the Design Memos can be used in the groups to facilitate dialogue.

If the Design Memos facilitate dialogue, then the dialogue will serve to build a supportive environment in which instructors may share their successes and challenges.

If the dialogue serves to build a supportive environment in which instructors may share their successes and challenges, then instructors will develop successful interactive instructional practices through small, incremental changes.

If instructors develop successful interactive instructional practices through small, incremental changes, then the small, incremental changes will lead to larger, more comprehensive changes in successful instructional practices in the Teaching Design Groups.

If the small, incremental changes lead to more comprehensive changes in instructional practices in the Teaching Design Groups, then the instructors in those groups may share those changes in new Teaching Design Groups in other content areas and across the Academy.

If the Teaching Inquiry Groups are expanded into other content areas and across the Academy, then those additional Teaching Inquiry Groups, using the same actions and practices, will develop additional interactive instructional practices.

If additional Teaching Inquiry Groups develop additional interactive instructional practices, then instructors will use those additional instructional practices to further improve instruction in the STEM fields.

If instructors use additional instructional practices to further improve instruction in the STEM fields, then student engagement and motivation will improve throughout the STEM fields.

If student engagement and motivation improve throughout the STEM fields, then students will achieve higher levels of academic success in the STEM fields.

Logic Model Narrative

The purpose of this logic model (see Appendix C) is to highlight components of the SIMPLE Design Framework and to illustrate how the framework’s activities will ultimately contribute to the program’s short-, medial-, and long-term outcomes. Broadly, the SIMPLE program is situated within an NSF grant directed at increasing interactive teaching pedagogy among STEM faculty. The SIMPLE program responds to NSF’s WIDER grant, which has the primary goal of creating supportive and collaborative environments for STEM faculty to promote and enhance their use of evidence-based teaching practices. The fundamental priorities of the SIMPLE Design are to support faculty in the development of interactive teaching strategies and to broaden implementation of such practices across STEM departments.

The inputs listed in the logic model (e.g., Group Leaders, funding, Principal Investigators) are resources needed for the SIMPLE Design Framework to operate; these inputs make possible the activities intended to produce outcomes. Central to the SIMPLE Design Framework is the use of a small-group design. The Group Leader meetings provide a forum for Principal Investigators and Leaders to discuss implementation of interactive teaching methods as well as progress toward developing the smaller group meetings. These group meetings are intended to provide an outlet for faculty conversation, a willingness to try new methods, and discussion of challenges and successes.

Primary artifacts from the group meetings are Design Memos, which succinctly document how an interactive teaching design fits into the context of a particular course. These memos will provide substance for critical reflection on the pedagogical practices employed by members.

Stemming from these Group Leader meetings are the smaller group meetings, in which faculty discuss the Design Memos they create and the challenges they face. As with any small-group design, barriers and challenges exist that influence the adoption of new teaching practices.

External factors, such as the faculty time available to dedicate to Design Memos and their implementation, raise significant concerns about the commitment needed to change pedagogical practice.

In addition to the external factors that may influence the program, underlying assumptions of the SIMPLE Design Framework influence the linkages among inputs, activities, and the expected short-, medial-, and long-term outcomes of the program. It is well established in the literature that teaching is a complex act, and the SIMPLE Design Framework recognizes this critical issue (Henderson et al., 2011; Laurillard, 2002). However, the mission of SIMPLE is parsimonious; the overall assumption is that incremental changes over time may lead to sustained pedagogical shifts in faculty teaching. The SIMPLE Design Framework emphasizes that context, students, and content are important aspects of teaching, and that professors should evaluate and adjust their teaching processes in response to these factors.

Moreover, professional development is most effective when there is active collaboration among members of small groups.

Among the short-term expected outcomes, the small groups are designed to provide a community that fosters discussion, reflection, and dialogue about the interactive teaching practices developed via the Design Memos. Each individual Design Memo is dedicated to one interactive teaching practice, which becomes the focus for that faculty member. Concerning medial-term outcomes, the SIMPLE Design Framework is intended to expand the small-group model to other disciplines and continue the development of interactive teaching practices. Here it is important to acknowledge a potential barrier: the availability of individuals to join. Without interest from faculty in various disciplines, the SIMPLE Design Framework is at risk of dissolution. Likewise, the risk of participant attrition may also weaken the program.

Additional medial outcomes concern the Design Memos. The use of these individual strategies may serve as a platform for self-study and may translate into research for publications and future conference presentations. Engagement in these reflective practices is hoped to foster a better understanding of teaching and student learning (Samaras & Freese, 2006). Looking broadly into the future, the long-term outcomes expected from the SIMPLE Design Framework include sustained pedagogical change among STEM faculty. From these pedagogical changes, there will be natural shifts in how courses are designed and delivered across the various disciplines, consistent with the shift in teaching practices and reflecting interactive means of student engagement. Over time, it is intended that individuals will increase their fluency and proficiency in implementing the Design Memos. Lastly, the most expansive goal of the SIMPLE Design Framework is to promote replication of this type of program at other universities, changing how STEM faculty approach their pedagogy. The purpose of this formative evaluation is to provide information related to the fidelity of implementation of the SIMPLE Design Framework program and to address the following evaluation questions:

1. Broadly, how are the teams functioning?

2. Is the program sustainable? As it proceeds from one year to the next, will it be possible for interactive teaching strategies to build upon what has been done at this point? What might be the challenges to sustainability, as well as the aspects of the program that work well in the current framework design?

3. How do the Design Memos foster interactive teaching practices?

Methods

Need for the Evaluation

Just as the need for increased interactive teaching methods among STEM faculty has been documented (AAU, 2013; Suchman, 2014), so has the need for ongoing, formative evaluation of faculty efforts to make such changes. This evaluation of the SIMPLE Design Framework program will serve that purpose. The NSF requires a program evaluation as specified in the grant for the SIMPLE Design Framework (Hjalmarson et al., 2013). The evaluation team will approach this formative evaluation requirement with qualitative research methods and a participant-oriented evaluation approach, as outlined below.

Evaluation Questions

1. Broadly, how are the teams functioning?

2. Is the program sustainable? As it proceeds from one year to the next, will it be possible for interactive teaching strategies to build upon what has been done at this point? What might be the challenges to sustainability, as well as the aspects of the program that work well in the current framework design?

3. How do the Design Memos foster interactive teaching practices?

Participants

Participants will be drawn from the individuals serving as group leaders in the SIMPLE program (n = 10). All group leader participants are faculty in various STEM subjects at the participating university, a state institution located in the suburbs of a mid-sized U.S. city. The university enrolls approximately 22,000 undergraduate and approximately 9,000 graduate students of diverse ethnic backgrounds.

Sampling Procedures

Purposive sampling will be used to identify SIMPLE participants who are able to participate in interviews (Maxwell, 2012). A small sample such as this suits the needs of this evaluation, as the evaluators hope to address many of the evaluation questions within each interview and to gather rich, informative data from interacting with each participant (Bach & Oun, 2014).


Measures

Data collection methods and procedures. Interviews were conducted using a semi-structured interview protocol developed from the evaluation questions. Interview questions, along with ideas for follow-up probes, were developed for each evaluation question (Appendix A).

Information gathered from the initial interview with Dr. Nelson provided preliminary structure for the evaluation team’s decisions on how to frame the evaluation and helped ensure that the evaluation questions would be useful. One of the primary concerns Dr. Nelson raised was the overall functioning of the individual Group Leader teams.

First and foremost, are the teams meeting, and with what fidelity are these meetings occurring?

Further, what do the meetings look like and what are individuals’ trajectories regarding their abilities to take information from these meetings and translate it into their classrooms?

Second, what was the “growability” or sustainability of the project over time? What was being done in these smaller groups that might or might not be working, and how might that affect the longevity of the program? Finally, much discussion revolved around the use of the Design Memo as the primary deliverable of this project. In what ways were these Design Memos being used? The evaluation team synthesized this information, along with information from the grant documents, and generated three evaluation questions. To learn how the program functioned, semi-structured interview probes aligned to each of the evaluation questions were created (see Appendix A).

Interviews took place in person, in a setting chosen by the participant to facilitate convenience and comfort. Additionally, understanding the nature of faculty scheduling constraints, the evaluation team offered alternative interview formats (e.g., phone, Skype). Informed consent forms were collected from each participant. Copious field notes were taken during the initial interview with Dr. Nelson, and Dr. Rosenberg’s interview was recorded with an audio recording device and transcribed in word processing software immediately following the interview (Appendix B). Data will be analyzed using constant comparative analysis (Bach & Oun, 2014). The three evaluators will read through the interview transcript(s) and take detailed notes to generate a loose coding scheme. From these notes, themes will be extracted and discussed among the evaluation team to reach agreement on overall themes and findings.

Document analysis will serve as a second method. A primary source of data for the evaluation team will come from the meetings of group leaders. The SIMPLE program’s Blackboard site acts as a hub for program-related and meeting-related documents. It contains resources for group leaders and faculty, such as research articles on interactive teaching strategies, as well as data collected from the meetings themselves. The document analysis will focus on the latter type of documents, as these provide insight into group leader meeting activity.

Document inventory. Documents to be analyzed include:

1. Design Memos from either group leaders or their faculty participants
2. “Check-ins”: brief, open-ended question sheets filled out by each group leader at the end of every meeting, containing approximately two to five questions about progress, successes, barriers, points of discussion for future meetings, etc.
3. Meeting notes and transcriptions
4. Field notes from observations of faculty classes
5. Field notes from group leader meetings taken by members of the evaluation team


Finally, observation data were collected when the evaluation team observed three SIMPLE Design Framework group leader meetings over a three-month span. At least two members of the evaluation team were present at the February, March, and April 2015 group leader meetings and took extensive field notes throughout each meeting. Each meeting lasted approximately one hour. Attendance was variable, with the April 2015 meeting having the fewest participants (i.e., two group leaders, three PIs). Skype was used for PI participation at each meeting.

Analysis method. The evaluators approached this document analysis using grounded theory, a feature of which is the constant comparative method of coding (Corbin & Strauss, 2008; Maxwell, 2012). Key aspects of this approach are that data are collected and analyzed concurrently, and the researcher creates categories for the data during analysis rather than prior to it. Sections of coded data are then compared and analyzed together to create broader categories with subthemes. In the later stages, themes begin to emerge as categories are compared. Since the evaluation does not seek to support or verify an a priori theory or hypothesis, grounded theory was selected to synthesize and make meaning of the data.

To guide analysis, the authors focused on the development process as represented by the sequential order of documents. Many of the documents illustrate progression over an approximately six-month period (i.e., meeting agendas from September 2014 through March 2015 are available). This allowed for the examination of themes as they developed over the course of the academic year.

Methods to enhance quality. After the authors finished coding and developing categories and themes individually, findings were discussed and refined until agreement was reached.


Evaluation Design

Evaluation approach. Considering the exploratory, implementation-focused evaluation questions central to our project, the SIMPLE program evaluation team used a participant-oriented evaluation approach. Key aspects of this approach include the participation of program stakeholders at all levels, as well as continuous contact with stakeholders throughout the evaluation process (Fitzpatrick et al., 2011). In the context of the SIMPLE evaluation, program officers were interested in learning the current state of program implementation for each of the groups. Indeed, an initial interview with a PI, Dr. Jill Nelson, provided a base of knowledge from which our evaluation questions were formed. Our second interview was conducted with Dr. Jessica Rosenberg, a SIMPLE group leader. The flexible, open structure of the SIMPLE program allows for a high level of variability among faculty groups, and a goal of this evaluation is to learn whether what the groups are doing is sustainable.

Typically, participant-oriented evaluations are carried out with qualitative evaluation methods, which by their nature collect detailed, rich accounts of data from key participants (Fitzpatrick et al., 2011). The evaluation team employed qualitative methods such as semi-structured interviews, observational field notes, and document analysis to gather information to help answer the evaluation questions. Limitations of our participant-oriented approach include bias, given that the PIs and group leaders of the program are the only informants for the current evaluation. However, at this point in the SIMPLE program implementation, this limitation may be considered low risk because the goal of the evaluation is to obtain a detailed picture of what program implementation looks like in its current stage: what are the participants doing, how are they doing it, and what are the successes and barriers to continued implementation?


Data Analysis and Results

Since the nature of this evaluation is formative, the results presented in this section should be read with the understanding that this is an ongoing process. The information and suggestions offered are intended to spark future dialogue and communication to enhance the implementation of the SIMPLE Design Framework and are by no means final or summative.

Analysis of Results by Method

Interviews. Interview data were collected via one initial interview with Principal Investigator Dr. Jill Nelson and one interview with Group Leader Dr. Jessica Rosenberg. The evaluation team made multiple attempts to schedule additional interviews with Group Leaders and PIs and offered alternative interview formats such as Skype, FaceTime, and teleconferencing; however, these attempts resulted in either non-responses or time conflicts due to increased end-of-semester faculty responsibilities.

Obtaining data from only one group leader interview is a limitation of the evaluation; hence, overall themes related to the small groups are still in the initial phase of development. From the interview with Dr. Rosenberg, the evaluation team was able to ascertain how her small group is functioning. Dr. Rosenberg explained that her small group includes tenured and non-tenured faculty, all of whom have vastly differing schedule availability specific to their roles. All five small group participants are part of the Physics, Astronomy, and Computational Sciences department; however, a recent complication developed as this department is undergoing restructuring, causing additional stressors, meetings, and scheduling restrictions. The small group met somewhat regularly in the Fall 2014 semester (i.e., every other Friday); since then, however, the frequency has dropped to approximately once every five to six weeks. As of late April 2015, Dr. Rosenberg noted that it had been several weeks since their last meeting.

Within her small group, Dr. Rosenberg indicated that there is a “core group” of individuals who are more heavily focused on their teaching and pedagogy. These group members are highly motivated toward improving their teaching and incorporating and refining interactive teaching methods. The others in the small group have demonstrated disengagement, such as frequent absences and a lack of Design Memo participation.

Consequently, a large portion of the interview discussed barriers faced by Dr. Rosenberg and her small group. The most pressing issue, which permeated the interview, was faculty time constraints. Dr. Rosenberg stressed that the composition of the small group (i.e., tenured and non-tenured faculty) exacerbated these constraints, given each faculty member’s vastly differing responsibilities, which included research, leadership roles, and other academic activities.

Much of the content discussed in Dr. Rosenberg’s small group focused on concrete logistical issues, such as accessibility of Blackboard, scheduling, and administrative items.

Often, discussions revolved around how to solve issues in the classroom rather than reflection on pedagogy or interactive teaching methods. Moreover, discussions about individuals’ Design Memos were usually met with some degree of resistance from some group members.

Concerning the Design Memos, there appears to be a lack of consistency in implementing changes from semester to semester. This issue is related to the specific format of the courses each faculty member teaches. Dr. Rosenberg provided an example in which a participant taught a 300-plus-person lecture in the Fall 2014 semester and was able to successfully implement interactive teaching methods. Beginning in the Spring 2015 semester, however, this same participant shifted to a significantly smaller upper-level course (i.e., approximately 20-30 students), in which the interactive teaching method was not applicable.

Document Analysis. A document analysis was conducted using documents collected from the SIMPLE Design Framework Blackboard site as well as documents collected at the Group Leader meetings. Analyzed documents included meeting notes and agendas, transcriptions of Group Leader meetings, observation field notes of faculty courses, rosters of small groups, Group Leader check-ins, and Design Memos. The grant documents were also used as a reference throughout the analysis.

Several initial themes arose from the document analysis. Broadly, the teams are functioning in very different ways. According to the meeting minutes and “Check-In” documents, some groups were meeting more regularly than others, with varying levels of interactive teaching implementation. Consistent with Dr. Rosenberg’s interview, the Check-Ins also indicated group leader concerns about lack of member participation. Design Memo sharing is extremely limited, as evidenced by only four such documents being accessible on the Blackboard site. According to transcripts and meeting records, a large portion of the Group Leader meetings at the beginning of the 2014-2015 academic year (i.e., September, October, November) was dedicated to Design Memo sharing, whereas the meetings at the start of the spring semester (i.e., February, March, April) mostly centered on conference proposals, group attrition, and recruitment of potential new small group participants. Findings from the check-ins collected from each participant at the monthly Group Leader meetings indicate ongoing dialogue in the small group meetings about sharing ideas and collaborating on how to approach courses, in addition to several administrative and logistical issues. Discussions around assessments were also prevalent, specifically regarding the design of the assessments used.

Notably, Design Memos are not mentioned as a discussion point during these small group meetings.

Primarily, content from the faculty check-ins indicates barriers, challenges, and changes the groups are facing. The documents available on Blackboard, combined with information gathered from Dr. Rosenberg’s interview, indicate some degree of participant noncompliance with regard to completing Design Memos.

The document analysis also yielded evidence of different uses of the Design Memos by participants. In particular, questions about how to access resources on interactive teaching strategies appeared as a common theme throughout check-ins and meeting notes.

Notably, one Group Leader has taken steps toward translating the use of his Design Memo into research, with data collected from one of his courses.

Observations and Field Notes. Meeting rosters indicate that a total of ten individuals (including the Graduate Research Assistant) were present at the December 2014 meeting, yet only five were present in April (two of whom joined via Skype). This finding runs contrary to what might be expected, as December would have scheduling restrictions similar to those of late April/May (both meetings occurred near the end of a semester).

Synthesis of Results by Evaluation Question

1. Broadly, how are the teams functioning?

Data analysis reveals an overarching theme of greatly varying functionality across the Teaching Design Groups in several key areas, including the incidence of group meetings, the make-up and participation of group members, and the agendas and activities of the meetings held.


Incidence of group meetings. A few of the groups began meeting regularly and enthusiastically in September; leaders of other groups struggled to bring together willing participants and schedule meetings, with teams not meeting until the latter months of the fall semester. The regularity of meetings has also varied between groups, with some groups meeting each month and others meeting sporadically; one group began with a high degree of consistency in scheduled meetings, but meetings waned as spring semester activities increased. The increased attendance in the fall may be attributed to higher levels of motivation and excitement for the program, which may have waned over time.

Make-up and participation of group members. The Teaching Design Groups vary in membership, and some rosters have shifted during the year as instructors have been welcomed into a group or have fallen off in their participation. Some groups are composed of faculty members only, and some include graduate students serving as teaching assistants. One factor that has proven to be both beneficial and challenging is the inclusion of both tenure-track and non-tenured faculty members, who often have competing goals and differing standards to which they are held as instructors, researchers, and members of the Academy at large.

Another factor that varies between groups is the level of active participation of group members. Two group leaders expressed frustration with members’ repeated absences from scheduled meetings and apparent disinterest in, or pessimism toward, interactive teaching methods. Other group leaders, however, indicated a high degree of both participation and meaningful conversation about incorporating innovative practices into their pedagogy.

Meeting activities. Again, an overarching theme in the agendas, activities, and discussions of the Teaching Design Group meetings is their variability between groups. The format of group meetings is informal and flexible, and this flexibility comes through in the groups’ different undertakings. A few of the groups seemed to focus more often on logistical and administrative matters, such as time management, technology, and course design, that often seem more pressing to practicing instructors, than on sharing and exploring interactive instructional techniques. In contrast, some group leaders said that their groups function with high levels of communication, sharing of the challenges and successes of incorporating new practices into their teaching, and “a strong culture of attempting new engaging instructional strategies.”

2. Is the program sustainable? As it proceeds from one year to the next, will it be possible for interactive teaching strategies to build upon what’s been done at this point? What might be the challenges to sustainability as well as aspects of the program that work well in the current framework design?

The SIMPLE Design Framework implementation is in mid-process. This evaluation occurred at the end of the second year of a three-year design, and the primary investigators have all expressed interest in extending implementation further into the future. The sustainability of the program, as well as the possibilities for growth and expansion, were of foremost concern to many of the program stakeholders. As noted previously, one group leader has begun to shape the use of his interactive teaching Design Memo into educational research using data collected from one of his courses. This provides evidence of growth of the program as well as motivation on the part of group leaders to implement interactive teaching strategies. Data analysis seems to indicate that while challenges exist in the form of faculty time constraints, the varying membership and dynamics of existing groups, and the limited ready availability of resources and information, opportunities for growth upon the existing framework do exist in several forms.

Taken together, the inconsistent scheduling of small group meetings, combined with variable participation and implementation of interactive teaching methods, contributes to mixed findings regarding the sustainability of the program.

Faculty time constraints. In almost all forms of data available to the evaluators, the underlying challenge to implementation at every level is the nature of faculty work at a large research university. The frustrations expressed by the group leaders largely stem from a lack of available time to allocate to this or any project in addition to their obligations as instructors, mentors to graduate students, researchers, and authors. Indeed, of the 32 check-in documents produced at monthly group leader meetings, only nine did not explicitly cite time as a primary challenge (and several of those were written in a multiple-choice format in which time was not offered as one of the available answers).

Group membership and dynamics. One challenge that seems to have arisen quite often in the current Teaching Design Groups is the changing nature of group membership. Many of the group leaders expressed initial excitement about their group membership, but that excitement turned to frustration when members left because of conflicts, changes in their other obligations, or, in a few cases, disinterest in developing interactive instructional practices. Because the dissemination of interactive practices, as designed in the framework, relies on networks of colleagues who will continue to spread their innovations through an ever-increasing community of practice, it is a marked setback when members of a Teaching Design Group must withdraw from the project.

Another challenge exposed by the data analysis concerns the dynamics of the Teaching Design Groups. While the groups are designed to form collegial critical friend networks, in a few rare cases the group leaders shared that the groups viewed one or two colleagues as poor instructors; indeed, those colleagues may have been asked to participate because they are viewed so critically by their peers. While a critical friends network requires a level of mutual criticism in order to foster growth, the Teaching Design Groups are intended, by design, to be places in which colleagues may share successes and challenges in a safe, non-judgmental environment.

Ready availability of resources. Several of the group leaders expressed frustration with the limited availability of resources for instructors seeking to investigate interactive instructional practices. While the resources exist, faculty often have little time to search for them. Easier access to such resources was a repeated request from group leaders when asked about factors that challenged their efforts.

Opportunities for growth. While much of the available data focused on the hindrances that group leaders and Teaching Design Groups have encountered, opportunities for sustaining and growing the SIMPLE Design Framework were also apparent. Of particular use in determining this factor were some of the check-in documents from the group leader meetings; these check-ins asked group leaders specifically which topics and strategies they felt they would benefit from discussing at future meetings. Their answers point to a number of strategies and discussion topics to which the primary investigators may look for growth and expansion of the project.

3. How do the Design Memos foster interactive teaching practices? In what ways do the Design Memos help you? How do they not?

A key factor in the implementation of the SIMPLE Design Framework is the use of Design Memos as a tool to examine, reflect upon, and share emerging practices in interactive instruction. Yet very little data are available on the use of the Design Memos. Four are posted on the Blackboard site for the participating faculty to access, and very little discussion or documentation seems to have occurred regarding them. Indeed, only one of the 32 check-in documents written by group leaders at their monthly meetings contained a reference to the Design Memos, and that at the very beginning of the academic year as groups were being formed. In the interviews available to the evaluators, faculty expressed that some group members resisted attempts to guide them to use the Design Memos. What the data show, therefore, is that the extent to which the different groups and group leaders are using the Design Memos is unknown. While some may be using them a great deal (the existing Design Memos describe thoroughly investigated, implemented, and reflected-upon practices), others may not be using them at all.

Recommendations, Summary, and Conclusions

The nature of this evaluation is formative; as such, it is the aim of the evaluation team to draw conclusions and recommendations, based on the data presented, that could benefit the SIMPLE program in its continued development. In presenting our conclusions and recommendations, we ask that the program officers bear in mind the following limitations. First, the evaluation team faced time constraints: this evaluation had to be completed within a single semester as part of a graduate course in program evaluation. The larger program evaluation, as part of the grant, will continue to develop and guide the SIMPLE program as it progresses. Second, the evaluation team worked with the data and other resources available to us. We relied heavily on document analysis and field notes from observations of SIMPLE leader meetings. Documents, including meeting minutes and Design Memos, were often limited in their quantity and extent. Meeting minutes, for example, were a helpful resource in capturing significant conversations between group leaders, but our conclusions about Design Memos are limited by the small sample of Design Memos available (i.e., four). The evaluation team began attempting to schedule interviews toward the end of the Spring 2015 semester; only one interview with a group leader was successfully scheduled despite attempts to arrange more. This may have been a direct result of converging faculty time constraints at this late point in the semester. Finally, with the exception of one participant Design Memo, all data used for this evaluation reflect the perspective of the group leaders. As such, our conclusions about the state of the various small groups are based on the leader perspective alone. It should be noted that throughout the meeting minutes, as well as the check-ins, several group members made references to the enthusiasm of faculty interested in participating. However, because we focused this evaluation on the perspective of SIMPLE group leaders, we cannot incorporate participants’ perspectives into our conclusions at this stage of the evaluation.

Drawing together the findings from our data analysis, our knowledge of the program, and the relevant literature, we recommend considering the following program modifications in the interest of the program’s sustainability going forward.

Our first recommendation is to consider a more structured approach to the use of Design Memos. An emphasis on pedagogical change as a design process, an essential component of which is the documentation of plans, reflections, and revisions for teaching changes, has been supported by many studies (Gess-Newsome et al., 2003; Light, Calkins, Luna, & Drane, 2008; McKenna et al., 2009). Furthermore, this emphasis appears to be well established among group leaders in the SIMPLE program and was a clearly outlined principle in the SIMPLE grant proposal. Indeed, most group leaders expressed that writing Design Memos did help them think about and plan their teaching. However, there appears to be a lack of use, or at least of visibility, of these memos among participants. This could be due to the flexible nature of the guidance on Design Memo usage. From the program documents available to the evaluation team, it is unclear how Design Memos are meant to be used over the course of a participant’s continuing involvement in SIMPLE. That is, is the Design Memo template intended to be revised at regular intervals to reflect progress and revisions, i.e., a “living document,” or should reflection questions be put to participants at regular intervals, asking them to write several Design Memos over the course of their participation? Providing a more scaffolded recommendation for the use (and continued use) of Design Memos at each stage of involvement in the program may give participants more guidance in completing the memos. The evaluation team feels that providing increased direction and templates for the Design Memos would still allow the flexibility needed to accommodate various subjects and class styles, while offering the increased scaffolding necessary to motivate busy faculty to complete them. Furthermore, stronger requirements for participants’ use and completion of Design Memos could benefit further evaluation efforts by making more Design Memos available for review.

Our second recommendation is to consider recruiting faculty from the College of Education and Human Development (CEHD) as guest speakers at SIMPLE meetings. Preferably, and to the extent possible, these individuals would be included as regular participants in SIMPLE small groups or as co-group leaders. Other studies of similar programs for faculty teaching development have cited a mentor who is knowledgeable in the learning sciences as essential to sustaining faculty change: “The investigators recognized that in order to carry out effective research, the participants needed to be inclusive of not only the engineering faculty, but also individuals who had expertise in learning science principles, assessment, curricula development, and research methodologies appropriate for education research” (McKenna et al., 2009). While the long-term goals for SIMPLE include having STEM faculty as leaders to help disseminate the group design, our findings related to meeting activities, group dynamics, and the ready availability of resources suggest that group sustainability may benefit from a visiting faculty member with expertise in the science of learning attending group leader meetings. The inclusion of a non-departmental faculty member as a “critical friend” in small group meetings has the potential to provide a clearer distinction between small group meetings and regular departmental meetings. In addition, the visiting CEHD faculty member could serve as a valuable resource for ideas related to interactive teaching strategies, with which many groups struggled. Finally, we speculate that the attendance of a non-departmental faculty member at group meetings could motivate non-joiners to attend small group meetings more frequently, thus reducing attrition.

Our third recommendation relates to capitalizing on the success of those group leaders who are meeting regularly and making progress with their small groups. We feel this recommendation could benefit the continuity of SIMPLE as a program, particularly if our recommendation to recruit education faculty is not feasible. As stated in our findings, there is a high degree of variability in the frequency of group meetings as well as in participation. Successful leaders could invite a SIMPLE leader colleague who is struggling with group attrition to their own small group. This would allow struggling group leaders to observe what is happening in more successful groups and note any strategies that could carry over to their own groups.

Alternatively, as a second step in the evaluation process, a case study of a successful group leader could provide more insight into the underlying reasons why that leader’s small group is more active than others.

Finally, we would like to reiterate the limitations of our evaluation as mentioned above. Throughout the evaluation we found evidence of both group leaders’ and their participants’ motivation and excitement for transforming the nature of teaching in STEM subjects. However, it is becoming clear that faculty time and other intrapersonal barriers could become a threat to continued growth. We feel the best approach to addressing these barriers is to capitalize on the strengths of knowledgeable and successful peers, and to implement an increasingly scaffolded Design Memo structure among the small groups. In further evaluation efforts, we recommend incorporating the perspective of small group participants, as well as carrying out a case study of more active small groups and their leaders to further investigate the reasons for their success.

References

Association of American Universities (AAU). (2013). Framework for systemic change in undergraduate STEM teaching and learning. AAU Undergraduate STEM Education Initiative. Retrieved from http://www.aau.edu/WorkArea/DownloadAsset.aspx?id14357

Bach, C., & Oun, M. A. (2014). Qualitative research method summary. Journal of Multidisciplinary Engineering Science and Technology (JMEST), 1(5), 252-258. Retrieved from www.jmest.org

Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice Hall.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.

Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: Sage Publications.

Cox, M., & Harris, A. (2010). Comparison of pretenured and tenured engineering professors’ pedagogical practices within undergraduate bioengineering courses. International Journal for the Scholarship of Teaching and Learning, 4(1), 1–11. Retrieved from http://www.georgiasouthern.edu/ijsotl

Darlaston-Jones, D., Pike, L., Cohen, L., Young, A., Haunold, S., & Drew, N. (2003). Are they being served? Student expectations of higher education. Issues in Educational Research, 13, 31-52. Retrieved from http://ro.ecu.edu.au/ecuworks/3562

Ferrini-Mundy, J., & Güçler, B. (2009). Discipline-based efforts to enhance undergraduate STEM education. New Directions for Teaching & Learning, 117, 55-67. doi:10.1002/tl.344

Fiore, L., & Rosenquest, B. (2010). Shifting the culture of higher education: Influences on students, teachers, and pedagogy. Theory Into Practice, 49, 14-20. doi:10.1080/00405840903435535

Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Boston, MA: Pearson.

Gess-Newsome, J., Southerland, S. A., Johnston, A., & Woodbury, S. (2003). Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. American Educational Research Journal, 40, 731-767. doi:10.3102/00028312040003731

Hay, D. B., Wells, H., & Kinchin, I. M. (2008). Quantitative and qualitative measures of student learning at university level. Higher Education, 56, 221-239. doi:10.1007/s10734-007-9099-8

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48, 952-984. doi:10.1002/tea.20439

Hiebert, J., & Morris, A. K. (2012). Teaching, rather than teachers, as a path toward improving classroom instruction. Journal of Teacher Education, 63, 92-102. doi:10.1177/0022487111428328

Hjalmarson, M., Nelson, J. K., Huettel, L. G., Padgett, W. T., Wage, K. E., & Buck, J. R. (2013, June 23-26). Developing interactive teaching strategies for electrical engineering faculty. Paper presented at the ASEE Annual Conference and Exposition, Atlanta, GA.

Hjalmarson, M., Nelson, J. K., Huettel, L. G., Padgett, W. T., Wage, K. E., & Buck, J. R. (2012). Creating small interactive teaching development groups.

Jamieson, L., & Lohmann, J. (2009). Creating a culture for scholarly and systematic innovation in engineering education: Ensuring U.S. engineering has the right people with the right talent for a global society (Report No. 0743223). Retrieved from American Society for Engineering Education website: http://www.asee.org/about/board/committees/EEGE/upload/CCSSIEE_Phase1Report_June2009.pdf

Laurillard, D. (2002). Rethinking university teaching: A conversational framework for the effective use of learning technologies (2nd ed.). New York, NY: RoutledgeFalmer.

Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. New York, NY: Routledge.

Light, G., Calkins, S., Luna, M., & Drane, D. (2008). Assessing the impact of a year-long faculty development program on faculty approaches to teaching. International Journal of Teaching and Learning in Higher Education, 20, 168–181. Retrieved from http://www.isetl.org/ijtlhe

Maxwell, J. A. (2012). Qualitative research design: An interactive approach (3rd ed.). Thousand Oaks, CA: SAGE Publications.

McAlearney, A. S., Robbins, J., Kowalczyk, N., Chisolm, D. J., & Song, P. H. (2012). The role of cognitive and learning theories in supporting successful EHR system implementation training: A qualitative study. Medical Care Research and Review, 69, 294-315. doi:10.1177/1077558711436348

McKenna, A. K., Yalvac, B., & Light, G. J. (2009). The role of collaborative reflection on shaping engineering faculty teaching approaches. Journal of Engineering Education, 98, 17–26. doi:10.1002/j.2168-9830.2009.tb01002.x

Peel, M. (2000). ‘Nobody cares:’ The challenge of isolation in school to university transition. Journal of Institutional Research, 9(1), 22-34. Retrieved from http://www.seaair.au.edu/journal.html

Rallis, S., Tedder, J., Lachman, A., & Elmore, R. (2006). Superintendents in classrooms: From collegial conversation to collaborative action. The Phi Delta Kappan, 87, 537-545. Retrieved from http://www.jstor.org/stable/20442072

Rossman, G. B., & Rallis, S. F. (2012). Learning in the field: An introduction to qualitative research (3rd ed.). Thousand Oaks, CA: Sage Publications.

Samaras, A. P., & Freese, A. R. (2006). Self-study of teaching practices. New York, NY: Peter Lang Publishing, Inc.

Schunk, D. H. (2012). Social cognitive theory. In K. R. Harris, S. Graham, & T. Urdan (Eds.), APA educational psychology handbook: Vol. 1. Theories, constructs, and critical issues (pp. 101-123). doi:10.1037/13273-005

Splitt, F. G. (2003). The challenge to change: On realizing the new paradigm for engineering education. Journal of Engineering Education, 92, 181-187. doi:10.1002/j.2168-9830.2003.tb00756.x

Suchman, E. L. (2014). Changing academic culture to improve undergraduate STEM education. Trends in Microbiology, 22, 657-659. doi:10.1016/j.tim.2014.09.006

Theall, M. (2005). The multiple roles of the college professor. Retrieved from http://www.nea.org/home/34715

Tinto, V., & Goodsell-Love, A. (1993). Building community. Liberal Education, 79, 16-22. Retrieved from http://www.aacu.org

Trigwell, K., & Prosser, M. (1996). Changing approaches to teaching: A relational perspective. Studies in Higher Education, 21, 275-284. doi:10.1080/03075079612331381211

Appendix A

Grant Goals

1. Support faculty use of evidence-based and research-based teaching practices, both in terms of implementing research-based practices and collecting evidence to support designing interactive teaching.

2. Broaden implementation of the SIMPLE Design Framework for sustainable faculty teaching development across and within multiple STEM departments.

Evaluation Questions

1. Broadly, how are the teams functioning? [*PI facilitation to group leaders]

2. Is the program sustainable? As it proceeds from one year to the next, will it be possible for interactive teaching strategies to build upon what’s been done at this point? What might be the challenges to sustainability as well as aspects of the program that work well in the current framework design?

3. How do the Design Memos foster interactive teaching practices?

Interview questions and probes

What has been most effective about the team leader meetings? …Least effective?

How are the team leader meetings supporting your work with your groups?

In what ways do you think you have changed as a leader as a result of your participation in this grant?

How are the team leader meetings supporting the change(s) that you are making within your classes?

What changes are you making to your classes? Why did you decide to make these changes?

In what ways do you think you have changed as a teacher as a result of your participation in this grant?

What is your perception of how your changes as a teacher have affected your students?

How are your team meetings going?

Tell me about your (last) group meeting.

-- Tell me about what has worked well in your group.

-- What are some barriers/difficulties/challenges that you’ve had as a group?

-- (if needed) How often have you been able to meet as a group?

-- Have you found other avenues to collaborate outside of formal group meetings?

What types of teaching successes have faculty discussed in your group?

What types of teaching challenges have the faculty discussed?

-- Tell me about your thinking related to active learning – out of all of the grant activities, what has been most helpful to you in changing your teaching? How is your group helping you to make a change?

How have the Design Memos aided in this? (or not) (if applicable) How do you use the Design Memos you’ve created?

Are there other strategies that you think would be helpful?

-- What was your motivation to participate in this grant? What was your expectation when you joined the program, and is it living up to your expectations?

Appendix B

Transcription of interview with Dr. Jessica Rosenberg on 4/21/15

Rob Stansbery, Colleen Barry

[Beginning of Interview- Getting Set Up]

R: Well he said, yeah, we don’t have a date yet… until May

J: Well, yeah

R: He’s the only other one that…

J: It’s like if “If I don’t respond now, you’re never gonna hear from me” and he’s just gonna fall off that cliff of …

C: recording on multiple sources… and ready to take notes

R: Some of this is what you’ve already been talking about so I hope we’re not

J: That’s alright

R: I hope we’re not double dipping

J: That’s alright, I can repeat myself (laughing) I’m good at that

R: So, I think our first question is just to ask you about how your group meetings are going and ask you to tell us about your group and how it’s happening

J: OK. Ok so, uh, I’m in physics and astronomy and the group is a physics and astronomy group.

Ummm. It was it is so all people within the department

R: [Uh huh]

J: uhhh and it’s a mix

R: is that one department? Physics and astronomy?

J: Well right now we are the school of physics, astronomy, and computational science and we are about to become physics and astronomy and then um separate from computational data science, which is part of this complication that was referred to- this [air quotes] reorganization. Um but yeah physics and astronomy are together, they are one department and they always have been. And it’s a mix of both physics and astronomy people in the group, so we do have both sides of that. we’ve got both term faculty and tenure. I think right now we don’t have any tenure track faculty just we’re untenured uh, and currently one graduate student roped in, who’s… Interested in education… um and so as I’ve said it’s been sort of a mix of the term faculty are very active, very interested and involved.

They are the ones who tend to teach a lot of the labs. Um, at least two of the facul… well one of them is the lab organizer for the astronomy, the other one used to be one of their their come to depending on which track for physics and so she organizes the college physics, teaches the college physics which is the physics for biology students and premeds. The non-calculus based… um and so they’ve been very involved in sort of looking at their teaching and being involved with sort of changes in teaching and such things. And along with that have managed to rope in an evolving group of faculty who have some interest in these things but maybe less time, less sort of encouragement to do something about it. Um, and I would say it was fairly effective and constructive last semester. This semester, mostly for reasons of organization, I just sent an email to the group about our last meeting, and immediately got one back from the department chair “That’s when we’re having our next faculty meeting. Please reschedule” and that has sort of been the nature of it all semester and just keeps running into these kinds of issues

R: Yeah [nods in agreement]

J: Um. So I will shift it. Um. But. Yeah so I think it’s been a while since we’ve met now again because of the semester… but we’ve met a couple times. We’ve had one of our meetings we met with some of the Blackboard IT people to look at different ways to use Blackboard and accessibility and it’s a lot of…[unintelligible] a lot of the group was interested in that. Um so tends to be a little bit focused on concrete things, “how do we solve this problem?” Um but not entirely. We try to kind of push them toward other things as well but again a lot of the core group have things that they are doing that they are working on totally revising this college physics curriculum. Uh Becky’s constantly adapting the astronomy lab curriculum and she teaches in the ALT room and teaches some of the studio courses plus some of the distance courses so, I mean, it’s mostly a group who is in the process of trying a lot of stuff anyway.

R: You talked. You’ve already talked about the timing of scheduling issues; are there any other barriers or difficulties or challenges that you have as a group?

J: That’s far and away the biggest. Um. The other issue I mean I think it’s just a different version of the time issue is… I think this question is sort of the term versus tenure track faculty… it’s interesting because the goals, the constraints, the sort of time commitments are different. And blending the two is an interesting challenge. I think it’s good and I think it’s kind of a good thing to have that blended group. To have both perspectives but I think it does have its challenges. Um… where the term faculty their job is teaching and most of them are, that’s what their priorities are and that’s what they’re engaged in and you know and at least ours are very engaged with that, which is great. Trying to bring in the tenured faculty to to that where they’re not the experts and they’re not the ones who have been as engaged and its dynamics are a little bit different [laughs]. I think that is a bit of a challenge

R: Um. What has worked well with your group?

J: I think some of the concrete things… like… you know those blackboard sessions, last semester um we actually did a little bit on that sharing on how people, you know we’re talking about testing, and one of our faculty at that point was teaching this 300 person lecture course and was really looking for ideas on how to more effectively interact and engage with 300 students in the classroom and sort of sharing the concrete things. Um has been useful. It’s also just mostly been, and I say this for people to sort of share ideas. I said there is this core group that’s doing a lot of stuff anyway about that. Um. So when we’ve had the opportunity to kind of do that so people can work through “ok how do we do these things” or “I’m encountering various issues” and just the space to discuss has been productive I think.

R: Have you found other avenues outside of your, you know, scheduled formal group meetings for that kind of thing? For for collaboration and idea sharing and…

J: not much [laughs]

R: No?

J: a little bit. I mean there has been a little bit of like I connected one of the group members with Lori Bland to talk about assessment and how to evaluate what’s going on in his course and um. A bunch of us, several people in the group, teach in the ALT room so we all attend the ALT room group sessions. So just not the same, but it’s sort of related so there are connections there

R: What is your thinking related to active learning?

J: In what way?

R: um

J: I like it. I use it. Uh [laughs]

R: Well I think we are specifically looking at the Design Memos and how the Design Memos have aided in active learning and how how the different groups have used the memos they’ve created

J: So, we still need to work on getting the group to do that. As I’ve said there are … a … uh.. the core group that I work with are very involved in active learning anyway, as I’ve said, several of them teach in the ALT classroom, myself included this semester. So [unintelligible] by default. That’s what we’re doing. There’s a lot of lab courses, people teaching labs, so I think it takes on a slightly different perspective. For the most part, people in the group sort of have been interested in that and doing that and it’s you know we’ve sort of shared some ideas about different ways we’ve done it. But a lot of that was going on anyway. One of my goals with the group was to try to bring in people who were who were doing less of it to try to engage them and last semester one of them was as I’ve said this guy teaching this large lecture course.

Other Individuals Enter Room: [unintelligible]

C: OK

J: OK

R: OK, we’re good. Actually that was our… sorry

C: what was the last thing you said? Convincing other people… sorry

J: Yeah, so part of it was to bring in people to try to convince them to try some of these things and

C: ah the Design Memos, got it

J: and he thought about trying some of the active learning stuff. Mostly never got to it. It was a huge… and and a lot of the hurdles for him in particular because he was teaching this 300 person class, were these more sort of technical things which is why sitting down and talking about how to communicate with your students over blackboard. How do you use you know, clickers and is that an option. I think was useful to moving him in that direction but I mean you know, it’s one of those “welp, times up ok” you know, he’s onto a different course. He’s teaching a small upper level course now so. Maybe in the fall it will have an impact, we’ll see

R: Um has this been what you expected of it?

J: I’m not sure what I expected [laughs] of it honestly. Um it’s probably not too terribly different you know. I think… this blend of sort of different perspectives I think with this [unintelligible] people who are very committed and doing this stuff anyway. Combined with people who “well this sounds like a good idea” but they’re not as engaged. It’s challenging

R: Yeah

J: It’s definitely challenging I’m not sure I would have been surprised. I think that’s probably what I would have expected when I started [laughs] did expect when I started but that’s certainly was the case.

R: thanks appreciate your time

C: thanks

J: if you have other questions, let me know!

R: thanks

C: we will

J: we seem to have run out of time

R: we covered what we needed to?

J: so this is your project for class?

R: yeah

C: Thank you!

R: Thank you!

Appendix C
