Recruiting External Researchers

Identification of Expertise

The process of recruiting external researchers may begin with the development of a targeted list of potential researchers based on areas of expertise. Working from the topics and research questions on the district's or state's research agenda, districts and states might target researchers based on their experience publishing research in a particular content area, using a particular methodology, or working with partners in a collaborative approach.

The district or state research agenda identifies the content areas of greatest interest to the organization. Working with researchers experienced in priority content areas not only brings additional capacity to district implementation but also increases the likelihood of the district winning external grant competitions that assess researcher fit with the project.

If the district or state is focused on a particular set of research questions, a search for research expertise can include researchers with expertise in a particular methodology as well as a content area. The Evaluation Designs chart below can serve as a resource for matching methodology to a research question. Districts and states must assess their own readiness to engage in different research designs while weighing the benefits of obtaining stronger evidence of impact. Districts and states interested in highly rigorous designs to address questions of impact, or in sophisticated quasi-experimental designs, may focus more heavily on researcher expertise in methodology than in content area.

The Continuum of Rigor in Impact Evaluation Designs

Matched comparison group (least rigorous design: lowest confidence that results can be attributed to the program)
Example: Schools are selected to implement a new program through some nonrandom process (e.g., they volunteer). Before the program begins, these schools are matched on important background characteristics (e.g., student demographics and average test scores in the prior academic year) to other, nonparticipating schools. After the program has been implemented in the participating schools, the outcomes for the two groups of schools are compared to estimate the program's effect.
Type of question the evaluation can answer: Did outcomes differ between the matched groups of participating and nonparticipating schools?

Comparative interrupted time series
Example: Schools are selected to implement a new program, again through a nonrandom process. Before the program begins, these schools are matched to comparison schools with similar histories of background characteristics and outcomes. After the program has been implemented in the participating schools, trends in outcomes over time are compared for the two groups of schools.
Type of question the evaluation can answer: Did outcomes in the program schools improve more than would be expected given trends in similar nonparticipating schools?

Regression discontinuity (NOTE: NCEE is now accepting this method for questions of impact.)
Example: Schools are selected to implement a new program based on a predetermined "cut point" on a well-defined and easily measured criterion (e.g., proficiency rates below 25 percent). The outcomes for participating schools are then compared with the outcomes for schools that "just missed" being selected.
Type of question the evaluation can answer: What is the impact of the program on outcomes? Or, are outcomes in program schools different than they would have been absent the program?

Random assignment (most rigorous design: highest confidence that results can be attributed to the program)
Example: A set of schools is selected to implement a pilot program based on a random process (e.g., a lottery is used to select 20 pilot schools from among interested volunteers statewide). At the end of the pilot implementation period, the outcomes for pilot schools are compared with the outcomes for the other interested nonparticipating schools.
Type of question the evaluation can answer: What is the impact of the program on outcomes? Or, are outcomes in the pilot schools different than they would have been absent the program?

From: Perez-Johnson, Irma, Kirk Walters, Michael Puma, and others. Evaluating ARRA Programs and Other Educational Reforms: A Guide for States. Resource document developed jointly by the American Institutes for Research and Mathematica Policy Research, Inc. April 2011.

Districts and states might also consider searching for researchers who have experience and interest in working collaboratively with education agencies. Oftentimes, researchers are focused on building a portfolio of published work and do not place great value on work that prioritizes relevance and utility to practice. However, a growing number of education researchers have expressed renewed interest in conducting research that is informed by and relevant to practitioners. Working with researchers who have experience with research partnerships may bring additional expertise in building structures and resources for collaborative research projects that serve districts' or states' interests.

In considering researcher expertise in content, methodology, or approach, districts and states have several resources available that may assist in building a list of potential research partners.

Education Resources Information Center (ERIC). ERIC is a searchable database of education literature with thousands of abstracts for published and unpublished work, often including links to full text. The literature can be searched using keywords for content area, methodology, and/or approach. Keyword searches can be combined to limit the results to literature that addresses multiple areas of interest (e.g., content area and methodology). A review of abstracts can provide an assessment of fit between the authors of a study and the district's or state's interests.
www.eric.ed.gov

Institute of Education Sciences (IES): National Center for Education Research (NCER), National Center for Education Evaluation and Regional Assistance (NCEE), and National Center for Special Education Research (NCSER) websites. IES is the major source of federally funded education research. Research projects funded through the three centers listed above are typically rigorous and thoroughly reviewed. The funding opportunities under NCER and NCSER are arranged by content area, and a link is provided on each funding program's website to view previously funded projects. (See screen shot on page 4.) By searching through these funded projects, districts and states can identify top-quality researchers by content area and methodology used. In recent years, NCER has also provided funding opportunities for research partnerships. Searches under the "Evaluation of State and Local Programs and Policies" and "Researcher-Practitioner Partnership" programs can help identify researchers who are experienced in collaborative work. In addition, the research conducted under NCEE's Regional Educational Laboratory (REL) program typically represents collaborative research projects designed to answer a set of district or state research questions. The NCEE REL website therefore provides another source for finding potentially collaborative researchers.

NCER website: http://ies.ed.gov/funding/ncer_progs.asp
NCSER website: http://ies.ed.gov/funding/ncser_progs.asp
NCEE REL website: http://ies.ed.gov/ncee/edlabs/

IES-funded Content Centers. NCER also funds National Education Research and Development Centers on specific topics of broad interest, such as preventing high school dropout or improving teacher quality. (See screen shot below for a listing of the R&D Centers.)
The R&D Centers each maintain their own websites, which consolidate and disseminate information on research and researchers in their content areas. The IES website provides links to each of the individual centers.

IES website list of National R&D Centers: http://ies.ed.gov/ncer/randd

Syntheses of research in the content area. Several journals, such as the Review of Research in Education, specialize in reviewing education research literature by content area. A well-done synthesis can provide a bibliography that lists most of the major researchers in a particular content area.

Local university websites. Researchers at local universities often are eager to work with local districts and state education agencies. Searches of university websites should extend beyond schools of education to include sociology, psychology, political science, statistics, and content-area departments that are likely to house active education researchers whose primary focus is not on teacher preparation.

Reaching Out

Districts or states can contact researchers they have identified as potential partners directly via email to set up a time to talk about the district's or state's research needs and how they might intersect with the interests of the researcher or the research organization. This introductory email is an opportunity for districts and states to emphasize their interest in establishing collaboration around their research agenda and the need to partner on proposals for funding.

If the list of potential research partners is long, or if the district or state has been unable to identify potential partners, the agency can issue a request for responses (RFR) for a research partner. The RFR would solicit proposals from researchers for a no-cost contract to partner on joint submissions for funding grants. This RFR would not be a guarantee of money for the research partner, nor does it set up an exclusive relationship.
Rather, it allows the district or state to assess potential partners' capacity and approach in a formal way. See Appendix A for a sample RFR issued in 2012 by the Massachusetts Department of Elementary and Secondary Education.
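The lottery mechanism described under random assignment in the evaluation designs chart can be sketched in a few lines of code. The sketch below is illustrative only, assuming a hypothetical list of volunteer schools and a seeded standard-library random generator; it is not part of the guide or any specific evaluation.

```python
import random

def run_pilot_lottery(volunteer_schools, n_pilots, seed=2012):
    """Randomly select pilot schools from among interested volunteers.

    Seeding the generator makes the lottery reproducible, so the
    selection process can be documented and audited later.
    """
    rng = random.Random(seed)
    pilots = sorted(rng.sample(volunteer_schools, n_pilots))
    # Volunteers not selected form the comparison group.
    comparison = sorted(s for s in volunteer_schools if s not in pilots)
    return pilots, comparison

# Hypothetical example: six volunteer schools, two pilot slots.
volunteers = ["Adams", "Baker", "Cole", "Dewey", "Ellis", "Frost"]
pilots, comparison = run_pilot_lottery(volunteers, 2)
```

At the end of the pilot implementation period, outcomes for the schools in `pilots` would be compared with outcomes for the interested but nonparticipating schools in `comparison`, as the chart describes.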