The Implementation and Evaluation of a Multi-Site Teen Pregnancy Prevention Program in Five Western States: Lessons from the First Two Years

Alaska ∙ Idaho ∙ Montana ∙ Oregon ∙ Washington

The Northwest Coalition for Adolescent Health and Philliber Research Associates

Table of Contents

I. Introduction

II. Formation of the Coalition

III. The Interventions

IV. The Evaluation Requirements

V. The Planning Year

VI. Choosing School Sites and Students

VII. The Consent Process

VIII. The Random Assignment Process

IX. Implementing the Program

X. Data Collection

XI. Conclusion

References

Acknowledgements


I. Introduction

In 2010 the Office of Adolescent Health (OAH), part of the Department of Health and Human Services, issued a Funding Opportunity Announcement (FOA) for the Replication of Evidence-Based Programs for Teen Pregnancy Prevention. The FOA offered $75,000,000 for the replication of 28 programs found through rigorous evaluation to be effective in preventing teenage pregnancy. Most of these programs had been evaluated only once.

By September of 2010, OAH had funded 75 grantees in 32 states and the District of Columbia through a competitive process. Grants were awarded on four levels:

A level: $400,000 to $600,000 per year

B level: $600,000 to $1,000,000 per year

C level: $1,000,000 to $1,500,000 per year

D level: $1,500,000 to $4,000,000 per year

Those receiving grants on the C and D levels had to develop an independent, grantee-level evaluation using rigorous methods.

In addition, OAH awarded a contract to Mathematica, a research firm in Princeton and Washington, to assist these programs in meeting rigorous evaluation standards. Research Triangle Institute (RTI) International also received a contract to design performance measures that would be required from all grantees and to maintain a database of these measures from all of the funded programs. Curriculum-based programs were asked to submit their curricula for a review of medical accuracy and age appropriateness. All grants were awarded for five years, with the first of these years being for planning and pilot work.

The largest of these grants was $4,000,000 per year for five years, awarded to Planned Parenthood of the Great Northwest on behalf of six Planned Parenthood affiliates that together formed the Northwest Coalition for Adolescent Health (the Coalition). The group applied to replicate the Teen Outreach Program (TOP®) across Idaho, Alaska, Montana, Oregon and Washington. During the application phase, the Coalition asked Philliber Research Associates (PRA) to serve as its independent, outside evaluator and to design and implement the evaluation.

This report is the story of the joint work of the Coalition and PRA during the first three years of this project—one planning year and two years of implementation. It strives to capture the project's challenges and successes, with the intent to assist others who are both implementing and evaluating similar projects.


II. Formation of the Coalition

Education leaders at the six Planned Parenthood affiliates were trusted colleagues who had a history of collaboration. Planned Parenthood of the Great Northwest (PPGNW) decided to respond to the FOA to implement the Teen Outreach Program (TOP®), a program similar to the youth development program it had been implementing in Washington state for over 10 years. Six months prior to the FOA announcement, PPGNW did a thorough vetting of the programs on the CDC's proven program list. The goal was to find a program that would be a good match for a range of communities, rural and urban with diverse populations, and one that would not be a completely new start for the education team. Because of this experience providing youth development programs, it seemed feasible to select TOP® and scale up to a regional implementation. After TOP® was selected, the five Planned Parenthood affiliates approached to join the coalition reviewed the TOP® program information, participated in several exploratory conference calls, and then committed to joining. These partnerships expanded the project to encompass five states in the northwest: Alaska, Idaho, Montana, Oregon and Washington. Planned Parenthood of the Great Northwest became the fiscal agent for the project and would provide administrative oversight.

Once committed, coalition partners worked for a solid month to complete the grant proposal, participating in conference calls several times per week, populating charts with data and fleshing out shared workplans. Coalition members were on the ground in their communities solidifying relationships and securing Memoranda of Understanding (MOU) from school partners. One in-person meeting was held with the partners in order to finalize plans before final submission in May 2010.

III. The Interventions

From the list of 28 evaluated strategies for reducing teen pregnancy chosen by OAH for replication, the Coalition chose the Teen Outreach Program (TOP®). This program uses a curriculum called Changing Scenes that is available in four levels, depending on the age of the students enrolled. Originally developed for ninth and tenth graders, the program uses a combination of curriculum sessions and service learning to create a safe and supportive atmosphere for students, while leading them through a series of lessons on values, goal setting, sexuality, relationships, decision-making, human development and other topics. The Wyman Center, Inc., in St. Louis owns TOP® and suggests students should receive 25 lessons over a nine-month period and 20 hours of community service learning (CSL). The 20 hours of service learning are focused on action; however, planning, reflection and celebration are also key elements of effective CSL.

Both the curriculum of TOP® and the TOP® description of service learning have been changed since the early 1990s, when the program was first evaluated. The curriculum did not originally come in four levels; a curriculum revision was completed by Cornerstone in Houston, a former owner of the program. In the original version of TOP®, most of the community service involved individual placements of students in not-for-profit organizations, where they remained for the entire school year. The curriculum has always, however, allowed for debriefing of the service learning experience. Later in its implementation, many TOP® clubs found it more convenient or even necessary to offer group community service opportunities, especially for middle school youth.

The original evaluation of TOP®, also completed by PRA and Dr. Joseph Allen of the University of Virginia, used a randomized design with a control group and showed the program to be effective in reducing course failure, reducing school suspensions, and delaying pregnancy (Allen, Philliber, Herrling, and Kuperminc, 1997).

In designing this new evaluation of TOP®, the Coalition and PRA discussed whether to offer the control group of students an alternative program. Both the Coalition and the evaluation team were anxious to avoid disappointing students who were not chosen for TOP®, and were concerned that the concept of a "control group" might not be well accepted by school personnel.

It was decided that two programs would be offered. Community Voices (CV) was offered to each school alongside TOP® and was described as a program where students would meet two to four times a year to talk about issues important to young people in the community. Teachers and students were told that classes to receive TOP® or CV would be chosen by lottery—a phrase less technical than random assignment but well understood by the schools and the students. The same facilitators who delivered TOP® also delivered CV and were careful to steer clear of offering any sexuality education, any portion of the TOP® curriculum, or any service learning opportunities to the CV group. In fact, this worked quite well; in some schools the CV students wanted to meet more often, which was not allowed.

IV. The Evaluation Requirements

There were several rigorous evaluation requirements for this project, intended to help all of the programs produce credible evidence.

First, all of the programs receiving money at the C and D levels were required to use randomized control trials (true experimental designs) or at least quasi-experimental designs with some sort of comparison standard. The C and D level programs also had to follow students for one year after the program ended. In the case of our intervention, this meant we had to collect data from these students before the program began, when it was completed at the end of the school year, and then one year later. To ensure a large percentage of surveys were collected one year after the program, students had to be closely followed over time.

The evaluation also required an "intent-to-treat" model. This meant that any student who was originally randomized had to be followed through all three required data collection points—pre-program, post-program, and a one-year post-program follow-up. If students moved away during the program year, never attended for some reason, or otherwise really did not participate, they would always be counted and their outcomes assessed with the group to which they were originally assigned. Obviously, students with a lower dose of TOP® would not be expected to have the same outcomes as those who actually received the program as intended (25 curriculum sessions and 20 hours of community service). Everyone working on the project thus needed to be concerned about loss to follow-up and attendance. This was a new and frankly worrisome requirement for the program facilitators, because many had never participated in such a rigorous evaluation before.
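To make the intent-to-treat idea concrete, here is a minimal sketch in Python, using hypothetical student records and column names (assigned_group, sessions_attended, outcome), of how an outcome would be tabulated by the group students were originally assigned to, regardless of how much of the program they actually received:

```python
# Minimal intent-to-treat sketch (hypothetical data and column names).
# Every randomized student stays in the analysis with the group they were
# assigned to, even if they attended few or no sessions.
import pandas as pd

students = pd.DataFrame({
    "student_id":        [1, 2, 3, 4, 5, 6],
    "assigned_group":    ["TOP", "CV", "TOP", "CV", "TOP", "CV"],
    "sessions_attended": [25, 0, 3, 0, 0, 0],   # low-dose students are not dropped
    "outcome":           [0, 1, 0, 0, 1, 1],    # e.g., 1 = reported a risk behavior at follow-up
})

# Group means are computed over everyone originally randomized,
# not only those who received the full program.
itt_estimate = students.groupby("assigned_group")["outcome"].mean()
print(itt_estimate)
```

This is only an illustration of the analytic principle described above, not the project's actual analysis code.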

Each month, both the senior program staff on the project and the evaluation team participated in a conference call with the assigned OAH project officer and a technical assistance provider from Mathematica, a group tasked with monitoring evaluation strategies, providing assistance, and grading project progress on some 23 dimensions with red, yellow or green colors to indicate how well it was going.

Another requirement of the evaluation was that all of these programs use the same performance measures—such as whether students were sexually active, using contraception, or pregnant. Both the measures and the exact wording of these questions were required by OAH and RTI, to promote continuity across all the replications in what was measured and learned. This was an important research opportunity for the field; however, when schools or program staff wanted a question changed, it was not possible to grant their requests.

One of the important lessons learned throughout this work was:

Evaluation team members needed to take time to explain the evaluation requirements in non-technical language to program staff, and program staff needed to acquaint the evaluation team with their real-world experiences and challenges.

It took at least a full year for the program and evaluation staff to learn each other's jargon and develop a working relationship. The evaluators visited every affiliate in the Coalition, met with facilitators, and tried to explain all the conditions and players in this work. The program staff, in turn, voiced their concerns and made suggestions based on their program sites. Whenever possible, these suggestions were incorporated, which helped build trust and a deeper understanding of both the research and program sides of the project. All program and evaluation staff met in a two-day in-person retreat to have fun, get to know each other, and share. Slowly but eventually, a strong working partnership was developed, but staff turnover throughout a five-year project produced a continuing need to both establish new relationships and nurture old ones.

More opportunities to discuss challenges or highlight successes and best practices came in the form of coalition calls. Participants on these every-other-week GoTo Meetings included program managers from all Planned Parenthood affiliates, project leadership from Planned Parenthood of the Great Northwest, and members of the PRA team. Each meeting included updates from the Training Team, updates from the evaluation team, news from OAH, and individual affiliate reports.

V. The Planning Year

Year one of the five-year project was dedicated to piloting and planning. During this year, an administrative team (including an in-house TOP® training team) was formed at PPGNW, Coalition trainers were certified by Wyman as TOP® trainers, program facilitators were hired in all states and trained by the TOP® training team, and the evaluation team hired and trained data collectors who were located near the program sites. Wyman provided technical assistance and support as needed, which was especially important during the first year. The training team at PPGNW created materials and resources for the partnering affiliates, and the evaluation team created a data collection manual for the data collectors. Pilot tests of the TOP® program were run by each of the Planned Parenthood affiliates in the Coalition, and data collection forms and strategies were pretested. Memoranda of Understanding were solidified with schools that agreed to participate, and extensive discussions began with school partners to identify which classes in each school would be most suitable to host TOP® and CV.

The evaluation team asked each affiliate to develop implementation plans for their sites and scheduled calls with each one to ensure that the classes chosen for random assignment did indeed meet the standards and were well matched (there is more detail about this below).

Both the training staff at Planned Parenthood of the Great Northwest and the evaluation team prepared documents that affiliates could use to recruit schools. These documents explained both TOP® and CV and tried to anticipate questions. Affiliates were coached on what to tell schools, how to explain the evaluation strategies, and how to obtain both parental consents and student assents for program and evaluation participation.

VI. Choosing School Sites and Students

Overall, in the first two years of implementation TOP® and CV were offered in 67 sites through 264 clubs (132 TOP® and 132 CV). Almost 5000 students were served.

                 Overall     Cohort 1                Cohort 2
                             Total    TOP    CV      Total    TOP    CV
Sites                 67        40     40     40        59     59     59
Clubs                264       102     51     51       162     81     81
Students            4857      1946    962    984      2911   1506   1405


Getting all of these schools to participate and figuring out which students and classes were most suitable for TOP® and CV were not without challenges. Several important lessons can be taken from this process:

Develop strong and deep relationships with host agencies or schools.

What was learned early was that it was critical to develop and nurture good relationships with schools. Conditions and personnel at schools change, so some of the best plans had to be revised at the last minute when a school contact was lost. Such personnel changes are unavoidable, but creating a relationship with more than one person at each site produced more stability. This effort was aided by the relationships that the Planned Parenthood education department in each community had already forged with school partners. When new partnerships were required, well-honed skills were employed.

Remember that you are an outside program and must show sensitivity to the schools’ main goals.

The past track record of TOP® in improving grades and preventing suspensions, as well as its success in preventing early pregnancy—one of the leading causes of school dropout—was attractive to schools. But nothing offered in these schools could interfere with the current emphasis on improving test scores under national and state mandates. Removing students from core classes to do community service work, interfering with testing schedules, or otherwise losing sight of the schools' goals would likely be grounds for non-participation. Again, the importance of the trust that had been developed over time between Planned Parenthood educators and these school partners cannot be overstated. These were not strangers coming into the school asking to set up programs. Rather, they were trusted professionals who had been doing community education in many of the settings. When new partners were required, references were available from nearby schools to endorse the value of the organization providing the program.

Recognize that some schools with high mobility and those who enroll students in "alternative" settings will have poor attendance and high attrition.

The more mobility that exists at a school, the more difficult everything will become, from gathering consents to maintaining attendance. The higher the attrition, the poorer the evaluation quality will be. In spite of the great need for a program such as TOP® in schools with high-risk students, if young people do not actually receive the program because of poor attendance or high mobility, program resources are not being used most effectively. While "alternative schools" welcomed these programs, attendance at these same schools led to consistently lower program exposure (see table below).


Similarly, intact classes received more program sessions than did “pull-out” classes or after school groups.

Average attendance per club

                                      Overall    By Cohort       School type                Meeting type
                                                  1      2     Alternative  Non-Alternative  In School  After School  Pull out
% of classes attended                    54%     37%    66%        41%           56%            57%         48%         44%
Average classes attended                  16      11     20         12            17             17          15          13
Total classes offered (including
  curriculum and CSL group work)          30      29     30         30            30             30          32          29
Total CSL offered                         43      35     49         34            45             46          37          44
Average CSL hours per student             15      11     18         12            16             15          15          13
Group CSL hours offered                   22      19     24         23            22             21          26          20
Maximum individual hours offered          21      16     24         11            23             25          12          23

As the rising numbers from Cohort 1 to Cohort 2 (Cohort refers to the implementation years) demonstrate, data from the evaluation alerted program staff to these issues and program dosage improved substantially in the second implementation year. Facilitators intensified their efforts to maximize class attendance and created gains in both curriculum exposure and community service hours.

VII. The Consent Process

The human subjects review process for this project required both review of the entire project and, sometimes, a second review by local school board committees. Both required that parental consent be obtained for student participation in either TOP® or CV, consent for participation in the evaluation, and student assent to both program and evaluation participation. All consents had to be collected before random assignment could be completed because, as noted above, once students were randomized they had to be tracked to the end of the data collection period. The loss-to-follow-up rate would rise if students who had never returned consent forms had to be counted, so random assignment was done only among those who were fully consented.

Start early to obtain consents

Gathering consents in the first year took longer than expected. Some TOP® facilitators used incentives to obtain consents, including gift cards, raffles, pizza, or some other food reward. It was important to find things that were valuable to the students, while avoiding anything so valuable that they could not refuse, or anything that would get them to sign consents when they really did not intend to be part of the program.

By the second year, experience led to starting the process earlier and a better sense of which incentives were effective yet reasonable. School personnel, too, seemed more aware of the need for this process to be efficient and brief.

TOP® staff and facilitators must be committed to and heavily involved in the consent process.

Although school personnel were an important part of the consent process, it could not be turned over entirely to them. This was, after all, not their program. TOP® staff and facilitators had to be active in getting the consents and checking frequently on progress. Sometimes it was necessary to call parents to get these completed. In a few cases, facilitators went door to door to obtain them.

Avoid disappointment with program assignment.

Some facilitators began their recruitment by talking up only TOP® to the students, rather than also presenting CV as a desirable program. This happened because of incomplete understanding of how the random assignment would occur and that there could be no exceptions. The evaluation team prepared a document for program staff to help them explain the positive purposes of CV and emphasize that each school was being offered two programs—both valuable. This was less of a problem in the second year, because facilitators better understood the process and because students liked CV.

Develop a good system for transmitting completed consents to the evaluation team.

When the evaluation team went to sites to collect data from students, they could not survey any student from whom all required consents had not been received. While the evaluation team hired and trained local people to help collect data, there had to be a way to verify that all needed consents had been collected before deploying a data collector to a site. During the pilot phase of the project, it was learned that promised consent forms were not the same as consent forms in hand. A protocol was instituted calling for 85% of the expected consent forms to be in the hands of the evaluation team before data collection was scheduled. So as not to delay the start of a program, the evaluation team could then be at the site within five days after receiving completed consents.

To keep track of consent forms, a Consent Roster Form was created to record whose consents and assents were being transmitted since signatures were often not legible.

Sometimes faxed transmissions were missing pages, or scanned consent forms did not all get sent to PRA. A new cover page was created in the second year to keep track of what a facilitator had tried to transmit, and a second fax machine was installed to maximize the likelihood of receipt. To reduce issues with emails being missed or not received, each affiliate was given its own PRA email address to use. In addition, each affiliate was assigned a file transfer protocol (FTP) site, where they could see exactly what PRA had received. Program and evaluation staff made regular contact with each other to stay current on the status of consents.

Still, sometimes when a data collector arrived at a site, the facilitator's list of who had completed consent forms, and thus who could be surveyed, did not agree with the data collector's list. This occurred because some consents arrived on the day before or the day of data collection. This problem was solved in the second year by asking facilitators to bring copies of all their consents to a data collection event so that data collectors could verify that a student could be surveyed.

Provide incentives to facilitators for getting all their consents in prior to data collection day and for high attendance of consented students on survey days.

Even though the evaluation team tried to help with both of these tasks, the facilitators bore much of the work since they were on site and knew the students. While it was not originally in the budget, the evaluation team used part of its funds to reward facilitators for success at these tasks. In the first year, facilitators were incentivized for obtaining consents. In the second year, this money rewarded facilitators for having high attendance on survey days.

VIII. The Random Assignment Process

As noted above, many of the program staff had never participated in a randomized controlled trial, and even the evaluation team, which had such experience, had to meet a very rigorous set of standards. Several valuable lessons were learned:

Create an understanding among program staff about the requirements of random assignment.

It is important for everyone in a project like this one to learn that "random" does not mean "haphazard." Very intentional training was done with program staff so that they would understand the importance of the evaluation as well as its requirements. As the implementation expanded, the evaluation requirements of the program were emphasized during facilitator job interviews, and questions were specifically designed to assess willingness to comply. As noted above, groups of students were randomized into TOP® or CV. This was done in a few different ways. First, facilitators could recruit students either individually for after-school or pull-out groups, or as a group for in-class groups. Students were always randomized by cluster. For after-school and pull-out groups, students were first divided into two rosters by pulling names out of a hat and alternating between groups. Once the two rosters were formed, they were randomly assigned to TOP® or CV by the flip of a coin. This was the same process used with class rosters.
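As an illustration only, the following sketch in Python (with hypothetical student and class names) mirrors the "names out of a hat" and coin-flip procedure described above; it omits refinements such as keeping students from the same household on the same roster:

```python
# Sketch of the assignment procedure described above (hypothetical names).
# Consented students are split into two rosters by drawing names at random
# and alternating, then a coin flip decides which roster receives TOP and which CV.
import random

def split_into_rosters(students):
    """Draw names 'out of a hat' and alternate them between two rosters."""
    shuffled = random.sample(students, k=len(students))
    return shuffled[0::2], shuffled[1::2]

def assign_programs(roster_a, roster_b):
    """Flip a coin to decide which roster gets TOP and which gets CV."""
    if random.random() < 0.5:
        return {"TOP": roster_a, "CV": roster_b}
    return {"TOP": roster_b, "CV": roster_a}

consented = ["student_01", "student_02", "student_03",
             "student_04", "student_05", "student_06"]
roster_a, roster_b = split_into_rosters(consented)
print(assign_programs(roster_a, roster_b))

# For intact classes, the same coin flip is applied to a pair of matched class rosters.
print(assign_programs(["class_period_2"], ["class_period_5"]))
```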

Guidelines were needed to encourage understanding and consistency. Here are some of the guidelines established during the first two years of implementation:

Class subjects – Two classes randomized to receive TOP® or CV did not have to be the same subject, but they had to be comparable. One group could not be an advanced English class while the other was woodshop, for example, since these classes may enroll very different kinds of students. Any class that was required for all students could be used.

Grade level – Ninth grade students could not be compared to tenth grade students, as these groups would likely differ in their levels of sexual experience prior to baseline. However, mixed-grade groups could be used as long as they were similar (i.e., a mixed class of 9th and 10th graders could be compared to another mixed class of 9th and 10th graders).

Demographics – Two classes were also not good matches if they attracted students with different characteristics. For example, if one class enrolled primarily females while another attracted males, they were not a good pair for randomization.

Students could only be added up until two weeks after data collection – This allowed facilitators to add students who turned in consent and assent forms late while still maintaining a rigorous evaluation standard. After this point, if students joined the club they were not a part of the evaluation.

Students who lived in the same household were randomized to the same group – This reduced interaction between the two groups and the risk of cross-contamination.

All of these factors were checked via the written implementation plans prepared by each affiliate and discussed by telephone with the evaluation team before the school year began.

The random assignment "worked" in that, in both Cohorts 1 and 2, students were well matched on demographic and family characteristics. In Cohort 1, TOP® students were significantly less likely than CV students to be African American or some other race, or to have mothers who were college graduates or higher. They were significantly more likely to have mothers with less than high school educations or to have had sexuality education in the past. The Cohort 2 groups were even more similar, with only one significant difference: TOP® students were significantly more likely than CV students to be American Indians or Alaskan Natives.


[Tables: baseline demographics (gender, Hispanic ethnicity, race, average age, and grade level) and home life and prior program experience (languages spoken at home, eligibility for free or reduced-price lunch, household composition while growing up, mother's education, prior volunteer work, and prior sexuality education), shown for the overall sample and for the TOP® and CV groups in Cohorts 1 and 2; differences significant at p<0.05 are those summarized in the paragraph above.]

IX. Implementing the Program

As noted above, TOP® includes four levels of its curriculum designed for students of various ages. Wyman allows facilitators to use lessons from different levels in a single class, if they believe this would best meet the needs of the group. The topics on each level are similar but the material is presented in different ways. The other main strategy of TOP

® is community service, often requiring time outside the classroom—especially for actual service action of some kind.

During the first year in particular, facilitators were getting to know the curriculum and learning how to organize and run community service learning. What did we learn?

Give facilitators clear guidance about curriculum implementation expectations.

While Wyman did allow different levels of the curriculum to be implemented in a single club, the coalition leadership felt that implementation and outcomes would be more consistent if there was a limit to how many levels were used. Requiring facilitators to dig into the curriculum and plan a scope and sequence for each club, with no more than two lessons drawn from a different level, created a deeper understanding of the curriculum as well as more cohesive implementation.

Start your program as early in the school year as possible.

Delays with consent forms, school agreements to participate, school scheduling changes, and finding and training all of the needed facilitators caused some programs to have start dates delayed well into the fall. Thus, not all of them completed the 25 sessions by the end of the school year. Some kept meeting right into the summer to meet this benchmark, but with low attendance, as might be expected.

Get the community service learning up and running quickly.

Similarly, TOP® recommends a minimum of 20 hours of Community Service Learning (CSL) for students. This does not mean 20 hours of actual service in the community, since some of the planning, reflection, and celebration time connected with CSL can be counted in this total. During the first year, some TOP® clubs planned their CSL right into the spring months, leaving little time for actual implementation.

During the second year of program implementation, service hours increased substantially. The group began to use “A Taste of Service”—short but interesting projects that had the goal of getting students enthused about community service and doing some kind of service very quickly.

Running a 9-month program in a semester-oriented school requires creative solutions.


Since the program selected for implementation had to be offered over a nine-month period, student groups needed to remain intact for an entire school year. Some classes in these schools that were particularly suited for TOP®, like health or life skills classes, were only a semester in length. This problem was overcome when the schools agreed to keep the groups intact and move them into their next semester class together. This did not, however, always happen. In a few cases the CV classes and the TOP® classes were intermixed, creating problems for the evaluation. In other cases, a few students were moved away from a mostly intact group and had to be pulled out of their second-semester class once per week to keep them in TOP®, or once or twice during the semester to keep them in CV. At a few sites, TOP® and CV students remained in their assigned group type but were switched between which TOP® or CV club they attended.

X. Data Collection

Data collection in year one of program implementation was originally planned for fall and spring of the program year and in fall and spring of the year after the program. It soon became clear that the second fall data collection was simply too much. It often took two months of follow-up to complete all the surveys for one round of data collection, so the fall data collection timeframe often bled into when the spring data collection was to begin.

We obtained permission to drop the fall data collection in the second year and, instead, held fall reunions of the previous year's TOP® and CV groups for the purpose of updating our contact information. These reunion meetings were designed to be fun and included snacks and games. The lessons of our data collection experiences include:

Survey completion is improved by providing multiple methods of data collection.

There were many methods used to collect data. First, in-school group data collection was held. Program and evaluation staff supported this task by reminding students about survey day through texts, flyers, calls, emails, and announcements. Some also held raffles on survey day or paired the survey with a celebration of some kind. Surveys were read aloud to TOP® clubs and CV groups. Students received detailed promises of confidentiality and were encouraged to be thorough and honest in their responses.

Data collectors returned to schools where five or more students were absent on survey day to survey those students before or after school, during lunch, or during their normal club time. Facilitators greatly aided this process by tracking down students and arranging appointments with the data collectors.

Students who had still not completed a survey were given yet other opportunities to do so. Online surveys were created, and calls and flyers from the evaluation team and the facilitators provided students with the survey link and urged them to complete it, with the cash incentive later mailed or provided by the facilitator. Four toll-free numbers were also created for students to call to take the surveys at their convenience. Students were emailed and texted twice a week until they completed the survey or the data collection period ended.

In addition, a team of roughly twelve individuals was hired to make calls to students who had not taken the surveys. These students were called once or twice a week and more towards the end of the data collection period. Program staff also assisted by making calls themselves to the students to remind them to take the survey. On some occasions, students helped find other students or alerted their classmates that we were trying to survey them.

Having an evaluation team that is committed to evaluation standards and works toward problem solving when the protocol hits stumbling blocks is critical to the success of the project and ensures a high percentage of surveys completed by both the TOP® and CV groups. A staff team who understands the importance and requirements of rigorous evaluation helps the on-the-ground efforts work smoothly.

This variety of ways students were invited to complete surveys (in class, after school, on the phone, and via online survey) together yielded a response rate of 86% for both Cohort 1 and Cohort 2 at the end of the program year, and a one-year post-program follow-up rate of 82% for Cohort 1.

Of course, it could be that some of these data collection methods yielded better or different data than others. Each survey was coded by its data collection method and the data will be reviewed to determine where there was the greatest success. There was some concern about the willingness of students to disclose sensitive behavior such as having sexual intercourse if they were being interviewed on the phone or interviewed in person. A quick scan of early surveys did not suggest, however, that students were less willing to admit sensitive behavior using these methods.

Follow-up techniques                  N                          % of possible
                           Pre      Post     End          Pre      Post     End
Day one in person          4105     3463     1029         82%      69%      53%
In-person follow-up         331      199       44          7%       4%       2%
Total in person            4436     3662     1073         88%      73%      55%
Phone                        44      354      319          1%       7%      16%
Online                      392      285      196          8%       6%      10%
Door to door                  0        6        6          0%       0%       0%
Total                      4872     4307     1594         97%      86%      82%
Possible                   5019     5019     1946        100%     100%     100%

Incentives must actually incentivize.


At the end of the program year and the following fall, students were given $10 in cash for completing their surveys or updating contact information. This incentive was occasionally too low to encourage those who were not present on survey day to complete the survey on their own. In response, the incentive for the one-year follow-up survey was increased to $20. This was more effective, and students were more eager to complete the survey as a result.

Attendance and fidelity data need to be collected often and in as simple a way as possible.

Forms were developed to track program attendance and program fidelity over time. In the first year of implementation, the attendance form was a simple Word document. It was filled out by the facilitators each week and emailed to the PRA office, then entered into one overall database. The forms allowed facilitators to mark who attended, how long the sessions were, and whether the sessions were curriculum lessons or community service.

In the second year, attendance was collected on an Excel spreadsheet that facilitators could email or upload to a file transfer protocol site. These forms did not require facilitators to retype names; instead, they simply added additional dates to the form and used it all year. Facilitators showed regular and high compliance with this once-a-week task. Also in the second year, facilitators were asked to note on their attendance forms when a student left the school. This gave the evaluation team prompt notice to begin tracking the student to get a new address or another way to reach them.

Fidelity forms were created with the help of other evaluators replicating the TOP® curriculum. One was made for each curriculum lesson, yielding over 100 forms. Facilitators would complete the form created for the lesson(s) used each week and send these to PRA with the attendance forms. These forms were entered into an overall database, which was then compared with the attendance data to ensure accuracy in reporting over time: each week, each session had to have both an attendance form and a fidelity form, and the session types (community service or curriculum) were checked to confirm that they matched.
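To illustrate the kind of cross-check described above, here is a minimal sketch in Python (with hypothetical records and field names) that flags sessions missing a fidelity form or whose recorded session type conflicts with the attendance form:

```python
# Sketch of the weekly cross-check described above (hypothetical records and field names).
# Each session should appear in both the attendance and fidelity databases,
# and the recorded session type (curriculum vs. community service) should match.
attendance = [
    {"club": "A", "date": "2012-10-03", "type": "curriculum"},
    {"club": "A", "date": "2012-10-10", "type": "community service"},
]
fidelity = [
    {"club": "A", "date": "2012-10-03", "type": "curriculum"},
    {"club": "A", "date": "2012-10-10", "type": "curriculum"},  # mismatch to flag
]

def cross_check(attendance, fidelity):
    """Report sessions missing a matching fidelity form or with conflicting types."""
    fidelity_by_session = {(f["club"], f["date"]): f["type"] for f in fidelity}
    problems = []
    for record in attendance:
        key = (record["club"], record["date"])
        if key not in fidelity_by_session:
            problems.append(("missing fidelity form", key))
        elif fidelity_by_session[key] != record["type"]:
            problems.append(("type mismatch", key))
    return problems

print(cross_check(attendance, fidelity))
```

This is only a simplified illustration of the reconciliation logic, not the project's actual database procedures.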

Contact absent students promptly.

As part of our efforts to prevent program attrition, keep our loss-to-follow-up rate low, and maximize program exposure for those in TOP®, facilitators were urged to respond to absences quickly. Contacting absent students sends a message that they are missed and that program staff care. Each facilitator reached out to students who missed TOP® in the way they deemed would be most effective with the youth with whom they work: phone, text, stopping by the school the next day, and so on. Sometimes learning what was keeping a student away from the program provided an opportunity to problem-solve how to encourage and support their regular attendance.

Collect all the contact data you can get.


As part of every survey data collection, extensive contact information was collected. This included addresses, home and cell phone numbers, whether we could text, names and numbers of two adults who would always know where to reach the student, and parents'/guardians' work phone numbers. Cell phone numbers changed often, so updates were helpful. Sometimes it took all of those data to find a student for a survey, along with a concerted team effort by the program staff, the evaluation team, and a large number of paid helpers.

Interestingly, getting this contact information was often more difficult than getting students to tell us about their sexual behaviors and pregnancy histories. Even repeated assurances of confidentiality, and explanations that this information would only be used if they couldn't be found in school, weren't always reassuring. Sometimes, students simply did not have the information that was requested. Since most of the schools had rules about using cell phones in class, teachers gave students brief permission to look up numbers they had not memorized. Students also often knew what street they lived on but did not know their exact address. Other students were reluctant to disclose information for a variety of reasons, including parental directions never to do so. This is a particularly sensitive issue: students may be uncomfortable or embarrassed about where they live, and some may not have a consistent home or may move between several addresses.

It’s important for evaluation and program staff to be sensitive to these issues and be willing to not push a student who is uncomfortable. In some schools, administrative staff may be able to be helpful.

On survey days, following the survey, data collectors went with the class facilitator to the office to check on whether missing students were still registered in school and how long they had been absent. That information helped start the search. This is another important reason for building strong relationships and trust with school staff: they are key to evaluation success.

XI. Conclusion

This replication study has enrolled its third Cohort of students in the 2013-14 school year. Baseline data collection has been completed for this third year and, given the lessons cited above and the responses to challenges, seems to have proceeded more efficiently than ever before. Exploration of early outcomes of the program has begun, and those data will be shared in other documents. Program staff now receive bi-weekly reports from the evaluation team on attendance and community service hours—at their request. This enables better monitoring of progress in delivering a full Teen Outreach Program. The next data collection will require the completion of post-program surveys for Cohort 3 and one-year follow-up surveys for Cohort 2.

Clearly the multiple locations, the ambitious number of students targeted, and the geographic spread of the sites have created challenges. Gaining and maintaining cooperation from schools requires substantial investments of time and creative, customized approaches. Learning about partners in the project and genuinely seeking to connect with them to implement a program to help young people – a goal shared by all – helped everyone keep their eye on the prize. Managing data coming from so many people and places, and scheduling data collection, have also been daunting tasks, but project follow-up rates are high and program and evaluation staff have forged learning partnerships with each other.


References

Allen, J.P., Philliber, S., Herrling, S., & Kuperminc, G.P. (1997). Preventing teen pregnancy and academic failure: Experimental evaluation of a developmentally based approach. Child Development, 68(4), 729-742.


Acknowledgements

This document was made possible by the work of:

Planned Parenthood of the Great Northwest
Carole Miller ∙ Willa Marth
Amy Agnello ∙ Katherine Huffman
Maria Hamm ∙ Elise Pepple
Hannah Smith ∙ Libby Shafer
Teddy McGlynn-Wright ∙ Mona Grife
Jasmine Ramsey ∙ Jasmine Strasser
Yvette Avila ∙ Josie Evans-Graham
Heather Witt ∙ Sylvia Ramirez
Cody Hafer ∙ Mercedes Klein
Morgain MacDonald ∙ Andreya Smith
Megan Winn ∙ Meagan Niebler
Briana Galbreath ∙ Lacy Moran
Emily Reilly ∙ Mao Reich
Bea Daily ∙ Mandy Paradise
Francesca Castaneda-Barajas

Planned Parenthood of Columbia Willamette
Camelia Hison ∙ Jennifer Melo
Austin Lea ∙ Megan Ackerman
Ernesto Dominguez ∙ Olivia Jarratt
Misha Mayers ∙ Aurora Rodriguez
Ann Krier ∙ Jana Deiss
Laura Blaney ∙ Ngozi Olemgbe
Christy Alger-Williams ∙ Jody Alaniz
Ada Dortch ∙ Carla Remeschatis
Gina Farrell ∙ Bianca Taveras
Amanda McLaughlin ∙ Rhiannon Henry
Dawnyel Murray

Planned Parenthood of Southwestern Oregon
Maggie Sullivan ∙ Mary Gossart
Missy Hovland ∙ Gilberto Roman
Elissa Denton

Mt. Baker Planned Parenthood
Jill Sprouse ∙ Kendall Dodd
Sophia Beltran ∙ Tracy Dahlstedt
Lynnae Wilson ∙ Clint Weckerly
Blake Johnson ∙ Jason Fernandez

Planned Parenthood of Greater Washington and North Idaho
Ashlee Martinez ∙ Jamie Gilbert
Ian Sullivan ∙ Felicia Hernandez
Joy Jones ∙ Timara Shindehite
Amy Claussen ∙ Cindy Fine
Miranda Baerg ∙ Mike Palencia

Philliber Research Associates
Sally Brown ∙ Cindy Christensen
Ashley Philliber ∙ Susan Philliber
Theresa Stroble

Planned Parenthood of Montana
Jill Baker ∙ Kate Nessan
Angel Nordquist ∙ Nona Main
Joli Higbee ∙ Tovah Foss
Abby Sun ∙ Laura Mentch
Cindy Ballew ∙ Tracie Weiss
Hannah Wilson


The following organizations and individuals have partnered on this project. No other federal grant funding was used for this project outside of Grant Opportunity TP1A000075-02-00.

Individuals:

Meg McBroom ∙ Janice Blackmore ∙ Molly Westring ∙ Jeannie Wright ∙ Jared Verrall
Kevin Beason ∙ Molly McAleer ∙ Melissa Ann Galvez ∙ Becky Samelson
Tammy Napiontek ∙ Avery Ironhill ∙ Theresa Scott ∙ T’wina Franklin ∙ Annie O’Connell
Dr. Ginger Blackmon ∙ Amelia Johnson ∙ Jan Gilbert ∙ Darren Kellerby
Todd Burningham ∙ Janice Blackmore ∙ Molly Westring ∙ Kelly Holmes ∙ Meg McBroom

Organizations:

Rogue River Junior-Senior High School ∙ Central Medford High School
Phoenix High School ∙ St. Francis Dining Hall ∙ Open Meadow’s Step Up program
IRCO (Immigrant and Refugee Community Organization) ∙ Impact NW
Neighborhood House ∙ American Red Cross Willamette Chapter ∙ Center 50+
Tacoma Urban League ∙ City of Salem Parks and Recreation Department
Clay Street Table ∙ Doernbecher Children’s Hospital ∙ Friends of Trees
Green Acres Farm Sanctuary ∙ Habitat for Humanity Portland/Metro East
Impact NW – AKA Science Program ∙ Marion Polk Food Share
Meals on Wheels People ∙ Oregon Food Bank ∙ Oregon Humane Society
Boise State TRiO / Upward Bound Program ∙ P:ear
Portland Parks and Recreation – No Ivy League
Salvation Army in Marion and Polk Counties
Salvation Army Kroc Corps Community Center ∙ SOLVE
The Pixie Project ∙ Willamette Humane Society
Multnomah Youth Commission – Youth Against Violence Summit
bridgercare ∙ Aki Kurose Middle School ∙ Rainier Beach High School
Cleveland High School ∙ Franklin High School ∙ Global Connections High School
Health & Human Services High School
Technology, Engineering, Communication High School ∙ Cascade Middle School
Forks Middle School ∙ Salem YMCA’s Service Club ∙ Sequim High School
Upward Bound at Peninsula College ∙ Matt Griffin YMCA
Community Schools Collaboration ∙ Communities In Schools – Seattle
Southwest Youth and Family Services – New Futures ∙ Highland Tech Charter School
Mt. Edgecombe High School ∙ Sitkans Against Family Violence (SAFV)
Mt. Baker School District ∙ Ferndale School District ∙ Sedro Woolley School District
Burlington-Edison School District ∙ Mount Vernon School District
Portland Public School District ∙ Salem-Keizer School District

This project is supported by Grant Number TP1AH000075 from the Office of Adolescent Health. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the Office of Adolescent Health, the Office of the Assistant Secretary for Health, or the Department of Health and Human Services.
