Addendum to RFP - Center for the Study of Social Policy

ADDENDUM
to the Request for Proposals for
Child Abuse and Neglect Prevention Research and Demonstration Projects
Table of Contents
I. Purpose
II. Deciding to Submit a Proposal
III. Corrections in PowerPoint “Timetable” Slides
IV. Correction in the PowerPoint “Proposal Format” Slide
V. The Overarching Research Question
VI. Defining “Research”
VII. Focusing on the Protective Factors
VIII. Focusing on at Least Two of the Four Core Areas
IX. Differentiating “Social Support” and “Community Connections”
X. Focusing on and Measuring the Three Overarching Outcomes
XI. Proposing an Appropriate Research Design
XII. Proposing a Utilization-Focused Evaluation and Complex Systems Orientation
XIII. Regarding Collaborations
XIV. Considering Culture in the Design and Delivery of an Intervention
XV. Making a Contribution to the Field
XVI. Questions and Answers
   A. Timetable
   B. Memoranda of Agreement, Administrative Data
   C. Collaborations
   D. Budget and Tracking Costs
   E. Core Areas, Data Collection, Instrumentation
   F. Research, Research Design, Evaluation
   G. Staffing
   H. Sections of the Proposal
XVII. Face Sheet Template
Purpose
The purpose of the December 1, 2009 webinar was to: (a) provide general feedback about common
concerns that were observed in many of the letters of interest, (b) underscore and clarify selected points
described in the RFP, and (c) solicit additional questions from invitees. The PowerPoint slides for the
webinar were distributed to all invitees on December 2, 2009.
The purpose of this addendum is to: (a) reiterate some of the points highlighted in the webinar, (b)
correct three of the timetables included in the PowerPoint slides, and (c) provide answers to questions
submitted after the webinar. The majority of the clarification is found in the questions and answers section of this addendum, which is organized by category (e.g., budget, memoranda of agreement).
Deciding to Submit a Proposal
Although those receiving this addendum were invited to submit a proposal, you are under no obligation to do so. Please inform the QIC-EC Project Director via email (qic-ec@cssp.org) if you decide to forgo submitting a full proposal.
Corrections in PowerPoint “Timetable” Slides
In reference to the webinar PowerPoint slides 26, 27, and 28: slide 26 reflects the timetable from the RFP, indicating the 40 months of grant funding for the R&D projects by year. This slide is completely correct. Slides 27 and 28 contain errors indicating that there are 43 months of grant funding for the R&D projects. Several questions were raised about this discrepancy (as synthesized in question #1 in the question and answer section). CORRECTIONS and clarifications in slides 27 and 28 are:
1. Funding for the R&D projects ends June 30, 2013; funding for the QIC-EC ends September 30, 2013.
2. R&D project activity ends March 31, 2013.
3. R&D project wrap-up begins April 1. Thus, grantees have 3 months (April 1 to June 30) to close out their projects, including preparation of final reports and preparation for the September webinar.
4. The final grantees meeting will be in June, not August; the final meeting of the QIC-EC is in August.
5. Final progress, budget, and evaluation reports are due no later than July 31, 2013. It is not unusual
for the deadline for final reports to be after the end of funding.
6. Grantees lead final QIC-EC webinar in September 2013.
The monthly and/or quarterly timeline presented in your proposal should run from March 1, 2010 through June 30, 2013. Some of the common dates/deadlines to include are indicated in the next three tables. Other common events/dates/deadlines that should be included in the timeline can be found in the RFP. The original timetable and the corrected timetables follow.
Overall Timeframe of Project Implementation
Year 1: March 1, 2010 – September 30, 2010 = 7 months
Year 2: October 1, 2010 – September 30, 2011 = 12 months
Year 3: October 1, 2011 – September 30, 2012 = 12 months
Year 4: October 1, 2012 – June 30, 2013 = 9 months
Total Months of Funding for R&D Projects = 40 months
Last 4 Months of R&D Project Funding (REVISED)
March 31, 2013: Project activity ends
April 1, 2013: Project wrap-up begins (including preparation of final reports and preparation for the webinar)
June 2013: Final grantees meeting
June 30, 2013: Project wrap-up ends; funding for R&D projects ends
July 31, 2013: Deadline for submitting final progress, budget, and evaluation reports
September 2013: Grantees lead final QIC-EC webinar
Key Dates/Deadlines (REVISED)
Effective Date of Grants/Start-Up ................................................................ March 1, 2010
Refining the Plans ....................................................................... March 1 – April 30, 2010
Grantees Meeting .............................................................................March 24 & 25, 2010
Projects in Full Operation..............................................................................May 27, 2010
Early Childhood Summit ......................................................................... August 3-5, 2010
Project Activity Ends..................................................................................March 31, 2013
Project Wrap-Up Begins................................................................................. April 1, 2013
Final Grantees Meeting ..................................................................................... June 2013
Project Wrap-Up Ends ................................................................................. June 30, 2013
R&D Projects Funding Ends ......................................................................... June 30, 2013
Deadline for Submitting Final Reports ............................................................ July 31, 2013
Grantees Lead Final QIC-EC Webinar ....................................................... September 2013
QIC-EC Funding Ends .........................................................................September 30, 2013
NOTE: Semi-Annual Reports Due .............................................. September 30 & March 31
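For those who want to verify the corrected arithmetic, a minimal sketch in Python (illustrative only, not part of any required deliverable):

    # Count whole calendar months, inclusive of both endpoint months.
    def months_inclusive(start_year, start_month, end_year, end_month):
        return (end_year - start_year) * 12 + (end_month - start_month) + 1

    print(months_inclusive(2010, 3, 2013, 6))  # 40: March 1, 2010 through June 30, 2013 (R&D funding)
    print(months_inclusive(2010, 3, 2013, 9))  # 43: through September 30, 2013 (QIC-EC funding)

The erroneous 43-month figure on slides 27 and 28 comes from counting through the end of QIC-EC funding rather than the end of R&D project funding.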
Correction in PowerPoint “Proposal Format” Slide
PowerPoint slide #14, “Proposal Format,” indicates that the abstract should be limited to 120 words. Upon further review of the requirements for the content of the abstract (pages 58-59 of the RFP), this word limit is withdrawn. You should adhere to the specifications for the one-page abstract outlined in the RFP.
The Overarching Research Question
How and to what extent do collaborative interventions that increase protective factors and decrease
risk factors in core areas of the social ecology result in optimal child development, increased family
strengths, and decreased likelihood of child maltreatment, within families of young children at high risk for child maltreatment?
Please note the change in the language of the overarching research question from “How and to what extent do collaborations . . .” to “How and to what extent do collaborative interventions . . .” to emphasize two points: (a) the intervention(s) should be a collaborative enterprise and (b) the intervention(s) should be designed to increase protective factors and decrease risk factors. This overarching research question, plus your understanding of the gaps in knowledge, should guide the creation of more specific, relevant research questions, which should then guide the design of the collaborative intervention and evaluation.
Defining “Research”
Throughout the RFP, we use “research” in a broad sense to refer to various aspects of this research and
demonstration project. This includes reference to: (a) the research question(s) and design; (b) the collaborative
intervention chosen to address the research question(s); and (c) the activities around data collection, analysis,
interpretation, and use—which we are referring to as the evaluation. See pages 41-42 of the RFP for how we
have framed the evaluation phases. See pages 60-64 for details on what information to include in the Approach
versus the Evaluation section of your proposal.
Focusing on the Protective Factors
More empirical evidence is needed about the processes and outcomes of systematically building protective factors in families at high risk for child maltreatment. The projects funded by the QIC-EC must be designed to generate new knowledge that fills this gap. Interventions must be designed to increase the composite of six specific protective factors listed here and defined in Table 2 (page 13) of the RFP:
• parental resilience
• social connections
• knowledge of parenting and child development
• concrete support in times of need
• nurturing and attachment
• social and emotional competence in children
Based on the specific research question, applicants may choose to focus on additional protective factors in their interventions. However, the six cited above must be addressed by all grantees.
It may not be necessary to focus on each protective factor with each participating family; some protective factors may already be sufficiently strong in a family. Thus, in conjunction with each primary caregiver, grantees will gather baseline data on the protective factors to determine which one(s) of the six designated protective factors should be the focus of the intervention for each primary caregiver’s family. The protective factors that are already strong in a family should be recorded, monitored, and tracked.
Focusing on at Least Two of the Four Core Areas
The QIC-EC has selected a core area at each level of the social ecology that serves as a leverage point or area of
change within which the R&D projects could focus interventions.
• The core areas at the individual level are the primary caregiver and target child.
• The core area at the relationship level is what we call social support.
• The core area at the community level is what we call community connections.
• The core areas at the systems/societal level are public policy and social norms.
Applicants must propose research questions, goals, objectives, interventions, and evaluation strategies that focus
on at least two of the four core areas—inclusive of the primary caregiver and target child core area. NOTE: The
core areas where there are the greatest knowledge gaps (where a project offers the greatest contribution to the
field) are the relationship level and the systems level.
While R&D projects must propose interventions that focus on at least two of the four core areas, common
baseline and end-of-project data must be collected in all four of the core areas in order to determine changes in
and among all of the core areas. The common data to be collected in all four core areas will be determined jointly
by the selected R&D projects, the QIC-EC Team, and the QIC-EC Evaluation Team. Applicants should also show
the relationship of their work to each of the core areas in the theoretical framework, logic model, and data
analysis plan even if they are not part of the focus of the project.
Differentiating “Social Support” and “Community Connections”
While the core areas “social support” and “community connections” may seem similar, specific definitions for these core areas are provided in the RFP (pages 22-23) and should guide interventions proposed at these levels.
Focusing on and Measuring the Three Overarching Outcomes
The three outcomes of focus for all research and demonstration projects are: (a) optimal child development, (b)
increased family strengths, and (c) decreased likelihood of child maltreatment.
• The outcome “optimal child development” will be measured by pre- and post-intervention assessments of the “child well-being” domain, to include health, education/cognitive well-being, and social-emotional well-being indicators. To assess the effectiveness of the interventions, changes among the target children in the treatment group(s) will be compared to children in the comparison group(s).
• The outcome “increased family strengths” will be measured by pre- and post-intervention assessments of the following domains and indicators: “home and community” (home safety and social connectedness); “parent capacity” (parenting skills, parenting knowledge of child development, and parent mental health); “substance abuse” (type, frequency, and problem behaviors associated with risky substance use; participation in substance abuse treatment programs); “financial solvency” (income; housing stability; food security); and “family conflict” (types and levels of family conflict). To assess the effectiveness of the interventions, changes in home safety, social connectedness, and other characteristics of the primary caretakers in the treatment group(s) will be compared to primary caretakers in the comparison group(s).
• The outcome “decreased likelihood of child maltreatment” will be measured by pre- and post-intervention assessments of the balance between protective factors and risk factors. To assess the effectiveness of the interventions, the balance between protective factors and risk factors of the families in the treatment group(s) will be compared to that of the families in the comparison group(s).
We will take a multifactorial approach to assessment, so several instruments will be used to gather pre- and post-intervention data. This will enable projects to draw more adequate and useful conclusions from the assessments.
R&D projects will be required to use some common instruments to support the cross-site evaluation.
The instruments that are included in Table 4 in the RFP—on pages 26-28—are only prospective tools for
assessing indicators of the three outcomes. Final determinations about the number and types of common
instruments will be made after grantees are selected. Also, applicants may need to identify and use other
instruments related to unique aspects of their projects.
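To illustrate the pre/post, treatment-versus-comparison logic described above, here is a minimal sketch in Python. The scores, group sizes, and use of a simple t-test on change scores are our assumptions for illustration; actual analyses will depend on the instruments and design ultimately adopted.

    import numpy as np
    from scipy import stats

    # Hypothetical pre- and post-intervention scores on a single well-being indicator.
    pre_treatment   = np.array([52.0, 48.0, 55.0, 50.0, 47.0])
    post_treatment  = np.array([60.0, 55.0, 63.0, 58.0, 54.0])
    pre_comparison  = np.array([51.0, 49.0, 54.0, 50.0, 48.0])
    post_comparison = np.array([53.0, 50.0, 56.0, 51.0, 49.0])

    # Change scores capture pre-to-post movement within each group.
    change_treatment  = post_treatment - pre_treatment
    change_comparison = post_comparison - pre_comparison

    # Compare the mean change in the treatment group against the comparison group.
    t_stat, p_value = stats.ttest_ind(change_treatment, change_comparison)
    print(change_treatment.mean(), change_comparison.mean(), p_value)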
Measuring the Protective Factors
A protective factors assessment tool, currently under development by the QIC-EC, will be used by grantees as a
pre-, post-, and repeated measure instrument to gather baseline, intermediate, and final data on participating
families’ knowledge, attitudes, and behaviors that support and reflect the identified protective factors.
Measuring Risk Factors
Numerous risk assessment instruments have been developed to gauge the potential occurrence or reoccurrence
of child maltreatment; some are described in the RFP.
Proposing an Appropriate Research Design
The RFP asks applicants to have an intervention at the individual level and at least one other level of the social
ecology and to have a treatment and control/comparison group(s). We are very interested in learning about not
only the effectiveness of the individual level interventions, but also about the relationships and interactions
between other levels of the social ecology (i.e., relationship, community, and/or policy) and the individual level.
Our primary reason for presenting research designs A, B, and C in the December 1 webinar was to encourage you
to be as clear as possible about the research design you are proposing, to think through alternatives, and to
select one(s) that are appropriate to your research question(s). The 2X2 design that was presented during the
December 1 webinar is just one option. It is consistent with these expectations and can be a strong option for
investigating relationships. Applicants must consider their proposed collaborative intervention and their overall
research questions very carefully and justify the appropriateness of a research design that can address those
questions. Also, please keep in mind that, within the required experimental or quasi-experimental design, we do
anticipate that a mixed methods approach will be necessary to adequately address the systemic context and
nature of the interventions.
A comparison group is required at the individual level of the social ecology. Although a strong design would be to
have a comparison group at both levels of the social ecology that you are investigating, if a comparison group is
not proposed for the second level, applicants should provide a justification for why this is not possible and how
they plan to collect at least descriptive data that will help inform their local evaluation. Proposals will not be
automatically penalized if the design does not have a comparison group at the second level of the social ecology.
The bottom line is, let your research question(s) determine your design.
Proposing a Utilization-Focused Evaluation and Complex Systems Orientation
Proposed evaluations should reflect the evaluation framework delineated in the RFP—that is, a utilization-focused
evaluation and a complex systems orientation. You do not have to be an expert in utilization-focused design and
complex adaptive systems in order to develop an evaluation plan within this framework. Please review and think
about the articles referenced in the RFP, some of which are posted on our website or are easily downloaded from various sites; see the RFP list of references.
Regarding Collaborations
The proposal must include a description of: (a) how long each partner has worked with the lead agency and (b)
the nature or level of the relationships between the lead agency and other key partners with respect to the
coordination-cooperation-collaboration continuum or hierarchy. Page 16 of the RFP refers to Pollard’s description
of this continuum. Also, the coordination-cooperation-collaboration hierarchy is described in a collaboration
training module organized by Healthy Workplaces, LLC that is posted on our website.
As grantees will be required to track population level community data and other data on the target children (e.g.,
if a CPS report occurs during the project period), it will be necessary to establish a relationship with the agency
that maintains this data. Applicants are required to obtain a Memorandum of Agreement (MOA) with the entity
from which this type of administrative data will be collected, or explain why an MOA could not be achieved.
Applicants must collaborate with their state lead agency for the Federally-funded Community-Based Child Abuse Prevention (CBCAP) program; with their state Early Childhood Comprehensive Systems (ECCS) leaders; and with their state children’s trust and prevention funds, or explain why these collaborations could not be achieved.
Multiple, prominent roles for parents in the partnership are expected, giving parents a strong voice in the planning and implementation of the R&D projects. A resource document entitled “Building and Sustaining Effective Parent Partnerships: Stages of Relationship Development,” published by the National Alliance of Children’s Trust and Prevention Funds, is on our website.
Considering Culture in the Design and Delivery of an Intervention
It is extremely important to consider the culture of the participants in the design and delivery of a maltreatment
prevention strategy. Shonkoff and Phillips (2000) asserted:
Culture influences every aspect of human development. . . . Understanding this realm of influence is
central to efforts to understand the nature of early experience, what shapes it, and how young children
and the culture they share jointly influence each other over the course of development. . . . Given the
magnitude of its influence on the daily experiences of children, the relative disregard for cultural
influences in traditional child development research is striking (p. 25).
It is arguable that the use of evidence-based practices (EBPs) could address the differential treatment and outcomes that diverse populations encounter within the child welfare system. “However, it is equally likely that EBPs could exacerbate and deepen existing inequities if they are implemented without sufficient attention to cultural competence and/or if policymakers fail to take into account the many practices within diverse communities that are respected and highly valued” (Isaacs, Huang, Hernandez, & Echo-Hawk, 2005, pp. 4-5).
Integrating cultural considerations into program planning decisions must go beyond what is typically regarded as “culturally sensitive” practices. Daro, Barringer, and English (2009) noted:
Much has been written about the importance of designing parenting and early intervention programs that
are respectful of the participant’s culture. For the most part, program planners have responded to this
concern by delivering services in a participant’s primary language, matching participants and providers on
the basis of race and ethnicity, and incorporating traditional child rearing practices into a program’s
curriculum. Far less emphasis has been placed on testing the differential effects of evidence-based prevention programs on racial or cultural groups or the specific ways in which the concept of prevention is viewed by various groups and supported by their existing systems of informal support (p. 11).
Thus, proposals should include a description of how the intervention is culturally responsive and is appropriate for
the target population.
Making a Contribution to the Child Maltreatment Prevention Field
Applicants must make a strong case for how their proposed project could advance child abuse and neglect
prevention knowledge, research methodology, practices, services, and/or policy. An important part of the
contribution to the field could be related to cost issues—that is, identifying approaches to prevention that are
reasonable and cost-effective.
Questions and Answers
Timetable, Corrections in PowerPoint Timetable Slides
1. Are projects only being funded from March 1, 2010 to June 2013? This period is 40 months. However, the PowerPoint slides indicate that the grant ends September 30, 2013, which represents a total of 43 months of project activities. Are grantees being funded for 40 months while performing for 43 months? Should the budget include funds only for the 40 months up to June 30, 2013, or for the 43-month period ending September 30, 2013? How will the costs associated with the June-September dates be covered?
Based on the corrections provided earlier in this document, grantees are funded for 40 months and are performing for only 40 months; thus, the budget should end June 30, 2013. There should be no costs incurred after June 30th.
2. If the grant ends on June 30, 2013, is there really a final report due July 31, 2013, as stated in the webinar?
Yes, July 31st is the deadline for submitting the final reports. Preparation of the final reports should begin in April, since project activity ends March 31st.
3. If the grant actually ends September 30, 2013, what deliverables are due in the June-September 2013 time period?
Funding for the R&D projects ends June 30th; funding for the QIC-EC ends September 30th. As explained above, the deadline for submitting the final reports is July 31st, and grantees will lead a webinar in September. Preparation for these deliverables should occur before June 30th.
4. Is there a required date for when direct services to target populations must begin? A related question: is a start-up period (after May 27th) allowed for projects to develop innovative programs and train staff prior to initiating direct services?
The intent of using collaborations that already exist is to shorten the start-up time and move quickly to implementation. Because of this, our thinking is that the timeframe between announcement (Feb. 26) and implementation (May 27) would be sufficient for additional program development and staff training.
5. Does the project require that each participant be provided 37 months of treatment/intervention/comparison? What is the extent of the time that a participant is required to be directly served or to be part of the R&D?
No, we cannot require a specific length of time of participation. What is desirable is that target participants are engaged for a sufficient time that measurable change can be documented. Plans for identifying, recruiting, engaging, and retaining participants should be described. Also, plans for addressing the possibility of attrition should be included.
Memoranda of Agreement, Administrative Data
6. A Memorandum of Agreement (MOA) is needed from the appropriate entity from which administrative data will be collected. What is meant by “administrative data”? What if such data is publicly accessible; is an MOA still needed?
“Administrative data” refers to population level data of the respective communities (in this case, child maltreatment data) that is maintained by various agencies, such as reports of child abuse and neglect for children ages birth–5 years; disposition of child abuse and neglect reports (percentage substantiated and unsubstantiated); and emergency room visits for children birth–5, disaggregated by causes for the visit.
Obtaining the individual level child maltreatment data for participants in the projects will require having MOAs in place with the state or local child welfare agency. The point of getting MOAs is to ensure that grantees will have immediate access to administrative data and will not be delayed in being able to secure this data. If all the administrative data that will be required is a part of the public domain, then MOAs would not be needed. However, this is not the case in most states. The National Data Archive on Child Abuse and Neglect does have some public use datasets available. This resource can be found at: http://www.ndacan.cornell.edu/
7. How important is it to obtain emergency room visit data? If emergency room data are in fact required, do you have suggestions on methods and/or sources for obtaining it?
Emergency room data will be required if it is adopted as a population level indicator to be tracked by all grantees. This determination will be made after feedback and discussion by the grantees, the cross-site evaluation team, and the QIC-EC Team. Proposals should show the capacity to obtain population level data on indicators (e.g., reports of child abuse and neglect for children ages birth–5 years; emergency room visits; etc.) from the appropriate sources for their states and local jurisdictions.
Collaborations
8. During the webinar, there was mention of three organizations with whom we have to collaborate. What are the organizations?
Applicants must collaborate with their state lead agency for the Federally-funded Community-Based Child Abuse Prevention (CBCAP) program; with their state Early Childhood Comprehensive Systems (ECCS) leaders; and with their state children’s trust and prevention funds, or explain why these collaborations could not be achieved. See page 49 in the RFP.
A list of the state contacts for each of the organizations is available at:
For CBCAP: http://friendsnrc.org/contacts/contacts.asp
For ECCS: http://www.state-eccs.org/granteelist/index.htm
For the Children’s Trust and Prevention Funds: https://www.msu.edu/user/nactpf/about_members.htm
9. More information is requested about the requirement to collaborate with the state/local CBCAP lead, ECCS, and CTF. If our collaboration does not involve all three, what degree of justification do we need to provide for not doing so?
The “degree of justification” is hard to quantify. You must provide a reasonable explanation of why one or more of these collaborations could not be achieved.
10. When we rate the collaborations in our application based on the scale provided, would you like that described for every partner or just those that we identify as our major partners?
The proposal should state the nature and history of the relationships between the lead agency and other key partners in the collaboration, to demonstrate that an extended start-up period to develop working relationships will NOT be necessary. For example, “the lead agency has contracted with the department of sociology on four multi-year evaluation projects over the past ten years.”
11. Does collaboration mean the collaborators need to play a role in the delivery of services/intervention?
No, all partners do not have to play a role in the delivery of services/intervention. You must delineate the specific roles each partner will play in the project.
12. Page 59, #2: The 4th bullet asks, “How risks and resources will be shared.” Can you give more clarification on what you are looking for with regard to a description of “risks”?
Risk is typically defined as the possibility of suffering from harm and involves a level of uncertainty; some of the inherent possibilities may include a loss, hazard, or other undesirable outcome. Risk concerns the expected value of one or more results of participating in a project or endeavor, and can be viewed as incurring a cost or failing to attain a benefit. In this context, one variable of risk is the potential impact to one’s organization.
In collaborative relationships, organizations agree to operate under a common mission, in this case the development, management, implementation, and evaluation of the research and demonstration project. There is risk in collaborative relationships because each member of the collaboration contributes not only its expertise and other resources, but its reputation as well. Some questions to consider in weighing how risks will be shared are: (a) What do the partners have to gain or lose from their involvement? (b) What do they have to gain or lose from not being involved? (c) What is the impact to the individual organization, the whole, and the greater whole?
Budget, Tracking Costs
13. The budget instructions say that the budget format should be 4 columns and also that we should identify whether the various matching amounts are cash or in-kind contributions. What format should be used to make this identification (i.e., an additional budget column, the budget justification, or elsewhere)? Do you have any suggested budget templates?
We do not have a template. Please adhere to the guidelines, which indicate, among other specifications, that the non-federal match should be in a column(s) that follows the federal column. An explanation about the match should be included in the budget narrative/justification.
14. Can responders request more than $1.24 million (since fewer than 5 grants may be awarded), or is $1.24 million the maximum in federal funds that may be requested?
$1.24 million is the maximum amount allowed for the proposal. A budget over this amount will be counted as non-responsive to the RFP. Any adjustments will be made once the final number of grantees is determined.
15. How much will be left from $1.24 million after you deduct 7% for the cross-site evaluation?
$1,153,200.
16. How should we show the 7% for the cross-site evaluation in the budget? Should it be budgeted in the last budget period only or allocated throughout the budget in some other way?
The QIC-EC will withhold this amount from the final award; this should be reflected in the applicant’s proposed budget from the beginning, showing the award ($1.24 million) minus the 7% ($86,800), with the remainder built around $1,153,200.
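The arithmetic behind the answers to questions 15 and 16, as a short sketch in Python:

    award = 1_240_000                 # maximum federal request
    cross_site = round(award * 0.07)  # 7% withheld for the cross-site evaluation
    remainder = award - cross_site
    print(cross_site)  # 86800
    print(remainder)   # 1153200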
17. Since we are required to allocate 7% of the total funding request for the cross-site evaluation, will those funds be paid directly to the QIC's cross-site evaluator, or will we be able to use those funds to pay our local evaluator to participate in the cross-site evaluation?
The QIC-EC will withhold the 7% from the final award to cover the costs for the cross-site evaluation that will be conducted by the QIC-EC Evaluation Team. The activities grantees engage in related to the cross-site evaluation (e.g., meetings with other R&D projects or interacting with the cross-site evaluators and the QIC-EC partners) should be factored into the rest of your budget.
18. Since we will be submitting a budget that itemizes the costs of the intervention as well as the evaluation for the project, please explain how the “cost tracking” that is required as part of the proposal and project implementation is different from our budget and our actual expenses.
Since the projects should be built on existing collaborations and may use services that are NOT included in the budget for this project, the total costs of all the services should be tracked. This is to ensure that replication attempts or policy changes that might result from the research will include a more complete accounting of the true costs of operating the collaborative intervention under study.
19. In addition to the budget categories listed, what else should be considered when planning the budget?
It is important to submit a budget that fits the proposed research design, even if that means allocating more than the minimum amount specified for the evaluation.
20. Upon submission of the application, should we submit the approval of the indirect rate, or can we submit a copy of the request to approve an indirect cost?
As indicated in section 3.7.10, #12 (page 69) of the RFP, a copy of the documentation of the lead agency’s federally approved Indirect Cost Rate should be included in the appendix. An explanation would need to be provided in the budget narrative section if this is not possible and only a request to approve an indirect cost is appended, rather than an approved rate. It is also possible that some organizations will not have and do not plan to apply for an indirect cost rate. In that case, actual costs for all expenses would be shown in the appropriate budget categories.
21. Can the QIC approve an indirect rate?
No. You will need to apply for an indirect cost rate agreement if you don’t have one and if you wish to have one. This is a link to FAQs on indirect cost rates from HHS: http://rates.psc.gov/fms/dca/faq-general.pdf
This is the link for the Children’s Bureau Program Support Center, Division of Cost Allocation, for more information: http://rates.psc.gov/fms/dca/faq.html
22. Does the $1,240,000 cap on individual projects include indirect costs, or is this the cap on direct costs?
The amount includes both direct and indirect costs.
23. I gathered that 78% of funds can be used for programmatic costs and 22% for evaluation. However, on the webinar, I got the impression that we cannot use these funds for programmatic costs. Could you please clarify that?
The 78% of the budget is available to support the collaborative intervention (also known as programmatic costs) as well as other additional costs to support the research and evaluation. Applicants must propose a budget that provides a reasonable and adequate justification for the proposed costs, which will support the activities proposed for the project.
24. Page 65, #5 and #6: The instructions for allocating funds for the local evaluation (#5) say that it should be 15% of the Federal award, but the instructions for the cross-site evaluation (#6) say that it should be 7% of the total funding request. Is “total funding request” intended to mean the amount of the Federal award or the total budget?
The Federal award.
25. Since the total cost of the local evaluation will be composed of several different cost items, how do we identify the sum or total cost of the evaluation (to verify to you that it is at least 15%) in the grant? Should we do this in the Budget Justification or some other way?
The budget and budget narrative/justification are fine, as long as we can see how the 15% minimum is accounted for.
26. Can match be provided by the main applicant itself or by one of the partnering agencies in administrative salaries? Could match be provided in local evaluation costs by a partner agency?
The match could come from either the applicant or the partner, but this should be documented either through an MOA or other letter of support which specifies the exact nature and amount of the match.
27. Can we offer more than 10% in match?
Yes, by all means! Please note that if higher amounts are offered, the grantee will be held accountable for meeting the full amount listed in the proposal.
28. Should administrative salaries of staff who deal directly with client records, such as an Office Manager, Client Records Associate, etc., be considered as “indirect charges” or as “personnel salaries”?
This can be included in indirect if there is an approved indirect rate, or in personnel if there is not an approved rate. Your agency’s current practice may be your best guidance on this.
Core Areas, Data Collection, Instrumentation
29. Does our data collection/data tracking need to address all 4 core areas as part of our program design? For areas not addressed, do we need to propose what data we would collect and how we would do that, or can we indicate our willingness and capacity to participate in the cross-site study and wait for the details to emerge from the QIC-EC evaluation team?
The intervention and evaluations should address at least two core areas. However, common baseline and end-of-project data must be collected in all four of the core areas in order to determine changes in and among all of the core areas. The common data to be collected in all four core areas will be determined jointly by the grantees, the QIC-EC Team, and the QIC-EC Evaluation Team. With respect to the proposal, each of the core areas must be addressed in the theoretical framework, logic model, data analysis plan, and budget even if they are not a part of the intervention and evaluations and only common baseline and end-of-project data are being collected.
30. Please clarify the difference between the relationship level and the community level, since there could be some overlap.
The core area at the relationship level is what we call social support, and the core area at the community level is what we call community connections. These areas are described in the RFP on pages 22-23.
31. Is the QIC-EC expecting the local evaluator to monitor all outcome measures for each of the four (4) core areas/levels?
Yes, the local evaluator should be prepared to monitor all outcome measures. The final list of indicators will be determined after the grantees have been selected, with the goal of simplifying as much as possible.
32. Do you have a specific definition of neighborhoods that you’d like projects to use?
No. Applicants should provide conceptual and/or operational definitions relevant to their specific projects.
33. Since the QIC will convene a meeting to decide common measures, how much specificity is required in the proposal about the exact instruments that will be used?
You may propose instruments that you find uniquely appropriate for your project—and propose common instruments, as well—with the caveat that final determination of common instruments will be made jointly by the grantees, the QIC-EC Team, and the QIC-EC Evaluation Team.
34. The Ages and Stages Questionnaire (ASQ) is a very widely used screening tool. It was designed so that scores beneath the cutoff points indicate a need for further assessment, while scores above the cutoff suggest the child is on track developmentally. To use it the way it was designed, you can only measure child developmental outcomes as “delayed” or “not delayed,” but you wouldn’t be able to measure how much progress children made in terms of a standard score. Is this delayed/not delayed approach to measuring child developmental outcomes an acceptable way of describing child development, or do we need to include additional measures?
In general, yes, it is an acceptable way of describing development. However, whether we will use a dichotomous variable only or will need more sensitive measurements is the kind of issue that will be included in the joint discussion/decision making about data collection and common instrumentation.
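As an illustration of the dichotomous coding described in this answer, here is a minimal Python sketch; the domain names and cutoff values below are placeholders only (real ASQ cutoffs vary by age interval and domain):

    # Placeholder cutoffs; real ASQ cutoffs vary by questionnaire interval and domain.
    CUTOFFS = {"communication": 30.0, "gross_motor": 35.0, "fine_motor": 25.0}

    def asq_status(scores):
        """Return 'delayed' if any domain score falls below its cutoff,
        otherwise 'not delayed'."""
        delayed = any(scores[domain] < cutoff for domain, cutoff in CUTOFFS.items())
        return "delayed" if delayed else "not delayed"

    print(asq_status({"communication": 28.0, "gross_motor": 40.0, "fine_motor": 30.0}))  # delayed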
35. Could you provide a reference for the Demographic Data Tool described in Table 4 on page 26?
This instrument is currently under development by the QIC-EC.
36. Three data collection points (baseline, mid, and end) have been suggested. Are all three data collections required?
Yes. The specific times will be determined jointly by the grantees, the QIC-EC Team, and the QIC-EC Evaluation Team.
37. Could the R&D project collect the data over the phone, online, or by mail?
A justification for data collection methods—whether in person or by other means—should be provided in the proposal.
38. In response to the requirement to “address the 6 designated protective factors” . . . would it be acceptable to measure all 6 but not necessarily target all 6 through intervention? We understand that measurement of these areas can inform intervention activities (i.e., if it is found that a woman receiving parenting services needs some concrete supports), but, in the course of a randomized trial, there also has to be standardized service delivery, to preserve evaluation integrity.
It may not be necessary to focus on each protective factor with each participating family; some protective factors may already be sufficiently strong in a family. Thus, in collaboration with each primary caregiver, grantees will engage in an analysis of the respective baseline data gathered to determine which one(s) of the six protective factors should be the focus of the intervention for each primary caregiver’s family. This differs from a “one size fits all” intervention strategy and focuses on service delivery tailored to each family’s situation. The protective factors that are already strong in a family should be recorded, monitored, and tracked.
Based on the specific research question, applicants may choose to focus on additional protective factors in their interventions. However, the six cited in the RFP must be addressed by all grantees in the intervention or via monitoring/tracking.
39. Do we have to address all 6 protective factors, or can we address a subset of the protective factors? Is there a minimum number we must address?
See the answer to #38.
40. Can you clarify the long-term outcomes discussed during the webinar for “Increased Family Strengths” and “Decreased Likelihood of Maltreatment”? The indicators for those two outcomes appear to overlap, as they both increase protective factors.
See pages 20-21 of the RFP.
Research, Research Design, Evaluation
41. The term “research” is used several times in the RFP. Please clarify whether research refers to the project intervention or the research methodology.
For purposes of this RFP, we use “research” in a broad sense to refer to various aspects of the research and demonstration project. This includes reference to: (a) the research question(s) and design; (b) the collaborative intervention chosen to address the research question(s); and (c) the activities around data collection, analysis, interpretation, and use—which we are referring to as the evaluation. See pages 41-42 of the RFP for how we have framed the evaluation phases. See pages 60-64 for details on what information to include in the Approach versus the Evaluation section of your proposal.
42. Could we have two agencies involved for local evaluation purposes: one as a consultant on design and as the IRB approval body, and another providing data collection and analysis?
Yes.
43. Page 61, #3, 2nd bullet: Number three asks for a description of the “research intervention” and the bulleted items use the “intervention” language. The second bullet, however, asks for “any innovative features of the research project such as design, technological innovations.” Are you asking for innovative features of the intervention?
Yes.
44. It is stated that 15% of the total federal dollars awarded must be allocated for the local evaluation and another 7% for activities related to the cross-site study. However, there is a statement that the remaining 78% of the budget can be used to support the intervention “and research related to the intervention.” Please clarify the distinction between your expectations for the local evaluation versus “research related to the intervention.”
The research related to the intervention is the local evaluation. Thus, the research related to the intervention should be a part of the local evaluation budget. There should be enough funding within the 78% to cover the costs of the intervention being tested (if not otherwise covered by other funding sources) as well as other research activities that support the local evaluation and the research on the intervention. The specific allocation of the percentages within the 78% is at the discretion of the applicant, but the applicant needs to clearly demonstrate that all the required elements are covered.
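The resulting split of the maximum federal award, as a short sketch in Python (note that the 15% is a minimum, so the actual split will vary by proposal):

    federal_award  = 1_240_000
    local_eval_min = round(federal_award * 0.15)  # 186,000 minimum for the local evaluation
    cross_site     = round(federal_award * 0.07)  # 86,800 withheld for the cross-site evaluation
    remaining      = federal_award - local_eval_min - cross_site
    print(remaining)  # 967200: the "78%" available for the intervention and related research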
45. Regarding the scope of the proposed evaluation . . . Does the QIC-EC want a broad analysis of all factors/outcomes that have been proposed in the RFP, or should the design focus on obtaining an in-depth examination of the factors likely to be influenced by the innovation?
Both an in-depth focus on the factors likely to be influenced and a broader analysis of all the factors should be part of the evaluation’s scope. The balance will be influenced by your particular research question.
46. Following up on the above question . . . If the QIC-EC expects a broad AND in-depth analysis, there's a question as to whether the maximum grant amount will support such a broad and in-depth analysis.
Once you have identified a research question(s) that is important to address (and that is in line with the overarching research question in the RFP), you will need to make choices as to the mix of broad and in-depth analyses that will be most valuable in addressing the question(s) within the available resources.
47. Utilization-focused evaluation and evaluative inquiry for complex systems (including approaches suggested in the Kellogg report on evaluating systems initiatives) are not necessarily inconsistent with intervention research, but they emphasize quite different goals, objectives, and processes. This leads me to ask if you are envisioning three different study components: (a) data collection for the cross-site study; (b) a local evaluation that is focused on utility to the local stakeholders and decision makers; and (c) a study of the effectiveness of the intervention (or some variation of intervention research such as the nested design you illustrated in the webinar)? This is not explicitly stated in the RFP but seems to be implied by the statement in the early FAQs about 22% of the budget set aside for evaluation and cross-site participation, which also says that research on the intervention would come out of the other 78%.
As you have noted, utilization-focused evaluation and evaluative inquiry concerning complex systems are not inconsistent with intervention research. Although it is not common practice to date to combine these three features, this is your opportunity to do so. We are not necessarily looking for three different study components. However, we do expect that you will likely be using mixed methods. Hopefully, the research question and the resulting data collection and meaning-making processes can serve the multiple purposes of the cross-site study, the local needs for information, and the study of the effectiveness of the intervention. We expect that your data collection and analysis for your research/evaluation study will serve as the input for the cross-site evaluation.
For purposes of this RFP, we use “research” in a broad sense to refer to various aspects of the research and demonstration project. This includes reference to: (a) the research question(s) and design; (b) the collaborative intervention chosen to address the research question(s); and (c) the activities around data collection, analysis, interpretation, and use—which we are referring to as the evaluation. See pages 41-42 of the RFP for how we have framed the evaluation phases. See pages 60-64 for details on what information to include in the Approach versus the Evaluation section of your proposal.
48. Please clarify research designs A, B, and C from the webinar.
Slides 36, 38, and 40 provide explanations for each of these designs. Also see the answers to questions 49, 50, 51, and 52 below.
49. In reference to Design A, in addition to the core comparison group, are we expected to have another comparison group?
This will depend on your overall research question and the nature of the collaborative intervention. A strong design would be to have a comparison group at both levels of the social ecology that you are investigating. However, if a comparison group is not proposed for the second level, applicants should provide a justification for why this is not possible and explain how they plan to collect at least descriptive data that will help inform their local evaluation. Proposals will not be automatically penalized if the design does not have a comparison group at the second level of the social ecology. The bottom line is, let your research question(s) determine your design.
50. It looks like you want a 2 x 2 design with research questions for 2 core areas. This is really impossible for the amount of money you are providing. You are doubling the project. Do you really want a 2 x 2 design?
The RFP asks applicants to have an intervention at both the individual level and one other level of the social ecology and to have a treatment and control/comparison group. We are very interested in learning about not only the effectiveness of the individual level interventions, but also about the relationships and interactions between other levels of the social ecology (i.e., social relationships, community, and/or policy) and the individual level.
The 2X2 design that was presented during the December 1 webinar is just one option. It is consistent with these expectations and can be a strong option for investigating relationships. Applicants must consider their proposed collaborative intervention and their overall research questions very carefully and justify the appropriateness of a research design that can address those questions.
We do not intend for the use of a 2X2 design to be a doubling of the project. If it appears to be doubling the project, it may be because your LOI only addressed the individual level. We invited several partnerships that had defined an intervention at only the individual level to move forward to a full proposal because it appeared that they could reasonably add the second level and achieve a significant R&D project that meets our requirement.
Regarding the cost issue, as stated in the RFP, a project’s research is to focus on interventions that are already underway and/or ones that can be adjusted in fairly minor ways to become a meaningful focus for research. The interventions are not totally new initiatives. The funds are not intended primarily to support interventions but rather to conduct research on interventions. This is a research and demonstration project first and foremost.
51. In response to the requirement to “include a control/comparison group at the other core area(s) addressed,” would the following design be appropriate/strong: a randomized trial of a parenting intervention, implemented in collaboration with several community-based service agencies, with those randomly assigned to the parenting intervention or control groups also being randomly assigned to a supplemental social support component or not? That is, there would be 4 groups: parenting only, social support only, parenting + social support, and control only. Three core areas would be addressed (the individual, relationship, and community core areas); 2 would be rigorously evaluated through the randomized trial.
Yes.
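A minimal sketch of this 2X2 assignment in Python (the family labels, seed, and independent coin-flip randomization are our assumptions; in practice a blocked or stratified randomization would likely be used):

    import random

    random.seed(1)  # fixed seed so the illustration is reproducible
    families = [f"family_{i:02d}" for i in range(1, 13)]

    cell_names = {(False, False): "control only",
                  (True,  False): "parenting only",
                  (False, True):  "social support only",
                  (True,  True):  "parenting + social support"}

    for family in families:
        parenting = random.choice([True, False])   # individual-level factor
        support   = random.choice([True, False])   # relationship-level factor
        print(family, "->", cell_names[(parenting, support)])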
52. On slide #11 you indicate that applicants can submit a randomized control design or a quasi-experimental design. On slide #34, you provide an example of an appropriate research question addressing a hypothetical community connections core area in which a random design or quasi-experimental design might be employed.
On slide #36, you indicate your preference for designs that conform to “Research Design A.” As we read the example, however, it could only be conducted using a quasi-experimental design. That is, if two communities are included in the study (one serving as the treatment site and the other serving as the comparison site), families cannot, of course, be randomly assigned to the community in which they reside. As such, a true experimental design could not be possible.
Correct. For the situation where the second level is the community level, the design would be a quasi-experimental design. For the individual level, the design may be either experimental or quasi-experimental.
Can we assume instead that by “Research Design A” you are referring to proposals that assign young children and families to a collaborative service model (what you call “community”) as the treatment condition vs. assignment to a non-collaborative or non-existent service model?
Recall that there is to be an intervention at both the individual level and one other level. Research Design A could be interpreted as either an experimental or quasi-experimental design depending on your research question and the levels of the social ecology you are addressing.
Our primary reason for presenting research designs A, B, and C is to encourage you to be as clear as possible about the research design you are proposing, to think through alternatives, and to select one(s) that are appropriate to your research question(s). Let your research question(s) determine your design. A preliminary answer to an important research question is more valuable than a conclusive answer to an unimportant question. Findings from this research may well help set the stage for future research about critical issues.
Please keep in mind that, within the required experimental or quasi-experimental design, we do anticipate that a mixed methods approach will be necessary to adequately address the systemic context of the interventions.
53. Is it appropriate to propose a project that would be similar to a project and targeted population funded by another federal agency, where the project ended 4 months ago? Can data from the previous project be incorporated into an evaluation design?
It depends on your research question. We are not immediately ruling out the possibility.
54. Is it acceptable to propose a research design built on an evaluation that has been evidence-informed for the past 4 years and is moving along the continuum from evidence-informed to evidence-based?
It depends on your research question and whether you can meet the other design requirements.
55. In terms of numbers for our project, what is the ideal number of participants, including the intervention and comparison groups?
It will depend on your research design and research question. Conduct an a priori power analysis to determine what is appropriate in your situation.
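One common way to run such an a priori power analysis, sketched in Python with the statsmodels package; the effect size, alpha, and power below are placeholders to be replaced with values you can justify for your own intervention and outcome measures:

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.5,  # assumed effect size (Cohen's d)
                                       alpha=0.05,       # significance level
                                       power=0.80)       # desired statistical power
    print(round(n_per_group))  # roughly 64 participants per group for these inputs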
56. Would a single subject design be an appropriate methodology to utilize?
No.
Staffing
57. I have a question regarding the local evaluation component of the application. Are there any requirements/restrictions on inclusion of this function as part of the lead agency, or does the local evaluation have to be conducted externally?
The local evaluation does not have to be conducted externally. If the local evaluation is conducted internally, the applicant should address how the local evaluators plan to address any potential perceptions of bias or conflict of interest.
58. We anticipate that the Principal Investigator will oversee both the implementation of the intervention and the research and evaluation components. Do you see a problem with combining leadership for implementation and evaluation in one position?
As indicated on page 35 of the RFP, applicants are required to identify individuals who will assume primary responsibility for (a) management activities, (b) implementation of intervention activities, and (c) local project evaluation activities. The implementation of the intervention activities and the local project evaluation activities should be sufficiently separated to ensure an unbiased evaluation.
59. Would it be a problem if the PI (overseeing the project) is also the “local” evaluator?
See the answer to question #58. Also, see the section on “Key R&D Project Personnel” in the RFP (page 35).
60. We would like to have a staff person working at a state agency overseeing child welfare be our project director. But because she is salaried staff on the state payroll, she cannot draw a salary from this grant. (Her efforts will be in-kind.) Could you confirm whether this is acceptable?
The Project Director role requires a significant amount of time and level of effort to be committed. It is unclear whether the staff person you are proposing would have time available if she is committed to other duties. Given the critical nature of the role of the Project Director, the application will need to provide adequate justification that the proposed staff person can take on those responsibilities and has the approval of her agency to do so.
In terms of in-kind staff, this is possible for this grant. The in-kind contribution (i.e., time allocated to the project) should be included as part of the match for your project.
Sections of the Proposal (including logic model and appendix)
61. Is there a word limit on the title?
A title of no more than 12 words is recommended, but not required. The title should be a concise statement of the focus of the project and should identify the actual variables under investigation and the relationship between them.
62. When reviewing the Evaluation section, there are a few slides regarding research methodology. Do you want that information included in both the Approach and Evaluation sections of the application, since it is mentioned there?
Although related information is requested in these sections, each has very specific requirements. See pages 60-64 in the RFP.
63. If we have more than one evaluator working on the project, can we submit both CVs in the appendix?
Yes.
64. Can we include letters of recommendation from prior evaluation projects on behalf of the research team members in the appendix?
Include only the documents listed in section 3.7.10.
65. Do you have a specific format in mind for the presentation of the logic model? How many pages can the logic model be?
The logic model should be one page; as it is a table or graphic, it does not have to be double-spaced. There is no specific format for the presentation of the logic model. See section 2.5.1, page 30 of the RFP for information on the development of the logic model.
66. On page 16 of the RFP it states that we are required to describe plans for “(a) initial staff orientation to the work of the R&D project, (b) other staff training, and (c) on-going technical assistance for staff.” Where should this be described in the proposal?
This should be described in the “Approach” section; see #3, second bullet, page 61 of the RFP.
67. We are requesting clarification on the deliverables required of applicants.
Deliverables required of grantees are described in the RFP, for example in sections 2.5.8, page 34 (Reporting Requirements) and 2.5.16, page 46 (Products of R&D Projects).
68. May we use a slightly larger font than 12 pt (e.g., 14 pt) for labeling/titling on charts and graphs? (We're not going smaller than 11 pt.)
Yes.
69. Must we use Times New Roman font on charts and graphs?
A different font may be used for charts and graphs.
Face Sheet Template
(1 page limit; if necessary, may be single-spaced)
Proposal Identification #:
Title of Project:
Individual Responsible for Overall Project Implementation (e.g., Principal Investigator):
Name
Affiliation
Mailing Address
Telephone Number
Fax Number
Email Address
______________________________________________________
Signature
__________________________
Date
Individual Responsible for Fiscal Administration of the Grant (e.g., Grants Manager, CFO, etc.):
Name
Title/Position
Affiliation
Mailing Address
Telephone Number
Fax Number
Email Address
______________________________________________________
Signature
__________________________
Date