Focus Groups

Adding Focus Groups to Your Assessment Plan
Nathan Lindsay
November 12, 2013
The Assessment Cycle
1) Develop or revise divisional, departmental, or program goals and outcomes
2) Provide learning experiences (programs, services, activities, classes, etc.)
3) Determine assessment method(s)
4) Identify, develop, and administer measure(s)
5) Review assessment results
6) Use results to inform decisions and practices; provide evidence for student learning
What type of data do you need?
Quantitative
• Focus on numbers/numeric values
• Easier to report and analyze
• Can generalize to the greater population with larger samples
• Less influenced by social desirability
• Can be less of a time commitment and less expensive
Qualitative
• Focus on text/narrative from respondents
• More depth/robustness
• Ability to capture “elusive” evidence of student learning and development
• Specific sample
Selecting focus groups as a method
Pros
• Understand perceptions, beliefs, or opinions of participants
• Direct and indirect method
• More data in a shorter period of time
• Members build off of each other’s ideas
• Increased understanding of a specific topic
• Less expensive
• Larger number of participants
Cons
• Less useful when statistical data/reports are needed
• Group dynamics/mix can be difficult to manage
• Not generalizable
• Time needed for training and analysis
• Facilitation requires skill
• Lack of control over the discussion
(Stage, 1992; Stage & Manning, 2003)
Steps to using focus groups
1. Determine what type of focus groups will work best.
2. Determine how many focus groups you will conduct and
select appropriate times and locations.
3. Write your script/questions for the focus group sessions.
4. Select the students who will be invited to participate.
5. Conduct the focus group.
6. Debrief with other facilitators.
7. Compile the data.
8. Write the report.
The Assessment Cycle
(Cycle diagram repeated; see the six steps listed earlier.)
Resources Needed
Staff: One facilitator, one note taker (minimum)
Facilities: Eye contact, privacy, no interruptions
Materials:
• Audio or video recorder (recommended)
• Note-taking materials (e.g., paper/pen, laptop)
• Nametags
• Consent form
• Incentives
Time: 60–90 minutes for each session
Participants: 6–12 per group
Rounds: 2–5 (until redundancy is reached; see the sketch after this slide)
(Krueger, 1998b; Stage, 1992; Schuh, Upcraft, & Associates, 2001; Vaughn, Schuman, &
Sinagub, 1996)
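The “until redundancy is reached” guideline can be tracked with a simple tally of how many previously unheard themes each round produces. A minimal Python sketch, using hypothetical theme labels rather than anything from this presentation:

# Track when additional focus-group rounds stop surfacing new themes
# (redundancy/saturation). Theme labels here are hypothetical placeholders.
rounds = {
    "Round 1": {"building hours", "noise", "meeting space"},
    "Round 2": {"noise", "meeting space", "food options"},
    "Round 3": {"meeting space", "food options"},
}

seen = set()
for name, themes in rounds.items():
    new = themes - seen            # themes not raised in any earlier round
    seen |= themes
    print(f"{name}: {len(new)} new theme(s) -> {sorted(new) if new else 'none'}")
    if not new:
        print("No new themes surfaced; redundancy (saturation) has been reached.")

The stopping decision is ultimately a qualitative judgment, but a per-round count of new themes makes that judgment easier to document.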
Types of questions
• Experience and behavior questions
“How does your organization utilize the Student Center?”
• Opinion and values questions
“Why do you think the University should consider including a multipurpose space in the
Student Center expansion? Why shouldn’t we?”
• Feeling questions
(Intro about inclusivity) “How do you feel when in the Student Center?”
• Knowledge questions
“What Student Center resources are you aware of?”
• Sensory/environmental questions
“What do you see when you enter our Student Center?”
• Background and demographic questions
“Please tell us your name, class year, major, and what student organizations you are associated with.”
Developing questions
• Be clear.
• Avoid biased, loaded, or leading questions.
• Avoid making significant memory demands.
• Ask only one question at a time.
• Keep each question short.
• Don’t make assumptions.
• Define terms and concepts.
• Consider whether participants will feel comfortable answering honestly.
• Let participants admit they don’t know or can’t remember.
• Avoid closed-ended questions.
Sequence of questions
Trust focus
1. Ice breakers
2. “Safe” questions
3. Opinions, interpretations, feelings
4. Knowledge questions
Logic focus
1. Introductory questions
2. Transition questions
3. Key questions
4. Summary questions
(Krueger, 1998)
Example Protocol
Introductory questions:
• Please tell us your name, your class year, major, and what student
organizations you are associated with.
• How does your organization utilize the Student Center? What are some of the
reasons that students, faculty, and staff utilize the Student Center?
Transition questions:
• What are some events for which you would like to utilize the Student Center
but don’t? Why don’t you utilize the Student Center?
Key questions:
• Why do you think the University should consider including a multipurpose
space in the Student Center expansion?
• Why shouldn’t the University consider including a multipurpose space in the
Student Center expansion?
• What would your organization use the multipurpose space for?
• After providing operational definition of the multipurpose space: What is
missing from the definition? Are there other ways that your organization
would utilize the space?
Summary questions:
• Do you have any final comments?
Selecting Participants
1. Purposeful sampling (see the sketch after this slide):
• Demographics (representative of the population, or a targeted population)
• Experience and knowledge (ability to provide rich data)
2. Convenience (who will show up?)
Aim for commonalities among participants, but be aware of existing relationships.
(Vaughn, Schuman, & Sinagub, 1996; Rea & Parker, 1997; Stage, 1992)
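As an illustration of purposeful sampling, the Python sketch below draws a small, roughly balanced group from a hypothetical pool of volunteers, stratified by class year; the pool, strata, and group size are placeholder assumptions, not a prescribed procedure.

# Purposeful sampling sketch: round-robin selection across class-year strata
# from a hypothetical volunteer pool, until the target group size is reached.
import random

pool = [
    {"name": "A", "year": "First-year"}, {"name": "B", "year": "First-year"},
    {"name": "C", "year": "Sophomore"},  {"name": "D", "year": "Sophomore"},
    {"name": "E", "year": "Junior"},     {"name": "F", "year": "Junior"},
    {"name": "G", "year": "Senior"},     {"name": "H", "year": "Senior"},
    {"name": "I", "year": "Senior"},
]
group_size = 8

by_year = {}
for person in pool:
    by_year.setdefault(person["year"], []).append(person)

random.seed(0)  # reproducible for the example
selected = []
while len(selected) < group_size and any(by_year.values()):
    for members in by_year.values():
        if members and len(selected) < group_size:
            selected.append(members.pop(random.randrange(len(members))))

for person in selected:
    print(person["name"], "-", person["year"])

The same round-robin idea works for any stratum of interest (major, organization, residence hall); a convenience sample simply skips the stratification step.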
Facilitating focus groups
• Consider who should be the facilitator
• Proximity to topic
• Relationship to participants
• Establish ground rules
• The goal is to generate discussion among participants, not to reach a consensus or to answer to the facilitator
• Necessary to establish rapport with subjects
(Stage, 1992)
Focus Groups: Establishing Rapport
• Speak clearly
• Show interest
• Control negative body language
• Remember your role
• Stay on track
• Seek clarification
(Assessing Student Learning and Development, Bresciani et al., 2004, p. 55)
The “Dynamic” Element
The synergistic group effect: interactions among group members stimulate discussion in which one group member reacts to comments made by another.
(Berg, 1998, p. 101; Upcraft and Schuh, 1996; Stage & Manning, 2003)
Ensuring rigor
Trustworthiness
• Credibility: data, interpretations, and conclusions are credible to the
participants and others
• Transferability: the study might be useful in another setting
• Dependability: changes over time are accounted for
• Confirmability: the data and conclusions can be confirmed by
someone other than the researcher
Strategies
• Triangulation
• Peer debriefing
• Inter-rater reliability (a simple agreement check is sketched after this slide)
• Respondent debriefing
• Verification
(Lincoln & Guba, 1985)
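Of the strategies above, inter-rater reliability is the most readily quantified. A minimal sketch, assuming two coders have independently assigned one theme label to each of the same transcript excerpts (the excerpts and labels are hypothetical), computes simple percent agreement and Cohen's kappa:

# Inter-rater agreement between two coders on the same excerpts.
# Labels are hypothetical; real coding schemes will have more categories.
from collections import Counter

coder_a = ["space", "noise", "space", "hours", "noise", "space", "hours", "space"]
coder_b = ["space", "noise", "hours", "hours", "noise", "space", "space", "space"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # raw percent agreement

# Cohen's kappa corrects observed agreement for the agreement expected by chance
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
expected = sum((counts_a[label] / n) * (counts_b[label] / n)
               for label in set(coder_a) | set(coder_b))
kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.2f}")  # 0.75 for the data above
print(f"Cohen's kappa:     {kappa:.2f}")     # 0.60 for the data above

Because kappa discounts chance agreement, it is a more conservative figure than raw percent agreement and the more defensible number to report.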
The Assessment Cycle
(Cycle diagram repeated; see the six steps listed earlier.)
Analysis
After data collection is complete:
1. Frame the discussion/context
2. Review the data (tapes, written documents, notes)
3. Take one: Identify categories or themes
4. Discuss themes that have been identified
5. Take two: Categorize data into themes; Refine themes
6. Measure how frequently each theme recurs
Debrief
• What themes or issues were discussed?
• How did these differ from what we expected?
• How did these differ from what occurred in earlier focus
groups?
• What points need to be included in the report?
• What quotes should be remembered and possibly
included in the report?
• Should we do anything differently for the next focus
group?
(Krueger, 1998)
Reporting
A summary report lists the key themes under each topic, along with a verbatim quote or two that illustrates each theme and some indicators of frequency, extensiveness, and intensity (a minimal tallying sketch follows this slide).
• Frequency -- how often was the theme (or comment)
heard?
• Extensiveness -- how many different students expressed
the same theme?
• Intensity -- how strongly was the opinion expressed?
Remember to acknowledge limitations and possible bias!
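Frequency and extensiveness can be tallied directly once comments have been coded, while intensity remains a reader's judgment. A minimal Python sketch, assuming each coded comment records a student, a theme, and a quote (all hypothetical here):

# Build a per-theme summary: frequency (total mentions), extensiveness
# (distinct students), and one illustrative quote. All data are hypothetical.
from collections import defaultdict

coded_comments = [
    {"student": "S1", "theme": "meeting space", "quote": "We can never book a room."},
    {"student": "S2", "theme": "meeting space", "quote": "Rooms fill up weeks ahead."},
    {"student": "S1", "theme": "meeting space", "quote": "Even small rooms are taken."},
    {"student": "S3", "theme": "hours", "quote": "It closes too early on weekends."},
]

themes = defaultdict(lambda: {"frequency": 0, "students": set(), "quotes": []})
for comment in coded_comments:
    entry = themes[comment["theme"]]
    entry["frequency"] += 1                    # how often the theme was heard
    entry["students"].add(comment["student"])  # how many different students raised it
    entry["quotes"].append(comment["quote"])

for name, entry in sorted(themes.items(), key=lambda kv: kv[1]["frequency"], reverse=True):
    print(f"{name}: {entry['frequency']} mention(s) by {len(entry['students'])} student(s)")
    print(f'  e.g., "{entry["quotes"][0]}"')

A table built this way pairs naturally with the verbatim quotes chosen during the debrief.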
Focus Groups at the University of North
Carolina Wilmington (UNCW): 4 Examples
1) Association for Campus Entertainment (ACE)
Executive Board
2) Student Veterans Organization
3) Follow-up to Housing Benchmarking Survey
4) New Staff Members’ Perspectives on Diversity
Example #1: Campus Entertainment
Executive Board
• Focus groups conducted in fall 2008 and spring 2009
• Likert scale questionnaire of 13 items completed at the
beginning of the focus groups. Examples included:
• I have learned to work better as a part of a group/team.
• I am able to hold others accountable for their actions.
• Focus group questions mirrored the 13 outcomes:
• Have you learned to work better as a part of a
group/team? Why/why not?
• Are you able to hold others accountable for their
actions? Why/why not?
Campus Entertainment Executive Board
• Student responses recorded by note taking and a digital
recorder
• Graduate student took the lead in data analysis
• Very useful to compare related quantitative and qualitative
data
• The students gave valuable insights into several areas,
including:
• The ability to develop leadership skills and work as a team
• Accepting responsibility for completing and executing tasks
• Effectively communicating thoughts and ideas confidently in any
situation
Example #2: Student Veterans Organization
• There were 400 active military or veteran students at UNCW
• A focus group was conducted with the Student Veterans
Organization, which built upon a prior survey of veteran students.
• We learned that you need to balance convenience with location
(i.e., the lounge area can be noisy!)
• It was incredibly helpful to have direct quotes from the veterans for
report and grant writing
• The veterans’ feedback included the following:
• Please realize that we are our own group; we are not just non-traditional or adult students
• “Veterans talk to veterans”—marketing and information sharing
is more effective coming from within the group
Applying the Student Veterans
Focus Group Feedback
• More proactive in educating faculty and staff on student
veterans’ issues and resources.
• Increased academic and personal support resources for
student veterans.
• Special session at Orientation created for veterans.
• Make Veterans Day a more significant event on campus.
• Make more mentoring programs, social gatherings, and leadership opportunities available to the Student Veterans Organization.
Example #3: Follow-up to a Housing
Benchmarking Survey
• Benchmarking survey in fall 2007 produced many positive
results, but also some items of concern for Housing
administrators and the Vice Chancellor for Student Affairs
• Focus groups conducted in spring 2008 with a broad
range of students in Housing and Residence Life
• A document analysis of other institutions’ practices
(especially regarding safety) was also conducted to see
how we compared with our peers
Housing and Residence Life Focus Group Themes
Housekeeping:
• Students repeatedly identified issues with the work habits of housekeeping staff and a general neglect of duties (excessive time on breaks, sleeping or watching television in hall lounges, and lack of effort in cleaning bathroom areas); this was the most frequently reported issue.
• Students suggested that weekends would be a better use of cleaning staff time and effort.
Maintenance/Work Order Reporting:
• Students identified timeliness of repairs and a lack of knowledge about
timetables for follow-up visits as major concerns.
• Students requested a mechanism for identifying that a maintenance item was
addressed, was in the process of being addressed, or was being forwarded to
physical plant for a more specialized level of response.
Safety/Security Issues:
• Students were complimentary of residential security initiatives including door
access, front desk operations, and staff responses (including the presence of
University Police officers in the halls).
• A troubling percentage of students responded that they do not lock the door
on a normal basis when leaving their room.
Example #4: New Staff Members’
Perspectives on Diversity
• Purpose: To understand how aspects of diversity had
played a role in the hiring process (including in the ad,
phone interview, campus visit, and experiences since
being hired)
• Participants included 11 staff members in student affairs
who had been hired in the last two years
• Focus Group findings:
• Ensure that we are conveying adequate and correct
information about our student and staff diversity
• Questions regarding diversity in the interview process
need to be more consistent across candidates
The Benefits of Focus Groups in
Mixed Methods Assessments
• Drill Down
• Quantitative = What are the issues/numbers?
• Qualitative = Why are they issues?
• Greater Validity of Reports (You’ve gone the extra mile)
• High Touch Opportunities
• Students feel that their voice is heard, and you’re able to
access students’ “meaning making”
• Schuh (2009) contends that mixed methods
assessments will become more widely used
Generating Interest, Participation, and
Ownership in Focus Groups
• Many of the same principles outlined for quantitative
assessment apply
• For sampling, a purposeful sample is often the best
strategy, but captive audiences can also be used
• Incentives (t-shirts, food, drinks) are helpful in recruiting
participants
• Provide opportunities to discuss positive/negative
feedback
To Foster Greater Ownership:
Include the Students
When developing and administering a focus group
assessment, reviewing the data, and discussing potential
changes/action steps, think about:
• Resident assistants
• Peer mentors
• Orientation leaders
• Student club/organization leaders
• Tutors/Supplemental Instruction leaders
• Student employees
References
Berg, B. (1998). Qualitative research methods for the social sciences. Needham Heights, MA: Allyn & Bacon.
Bresciani, M.J., Zelna, C.L., & Anderson, J.A. (2004). Assessing student learning
and development: A handbook for practitioners. Washington, D.C.: National
Association of Student Personnel Administrators.
Krueger, R.A. (1998a). Analyzing and reporting focus group results. Thousand Oaks, CA: Sage Publications.
Krueger, R.A. (1998b). Moderating focus groups. Thousand Oaks, CA: Sage Publications.
Lincoln, Y., & Guba, E. (1985). Naturalistic inquiry. New York: Sage.
Marshall, C. & Rossman, G. (1999). Designing qualitative research (3rd edition).
Thousand Oaks, CA: Sage Publications.
Patton, M. (1990). Qualitative evaluation and research methods. Thousand
Oaks, CA: Sage Publications.
References (cont.)
Schuh, J. (2009). Assessment methods for student affairs. San Francisco:
Jossey Bass.
Schuh, J.H., Upcraft, M.L., & Associates (2001). Assessment practice in student
affairs. San Francisco: Jossey-Bass.
Stage, F.K. and associates. (1992). Diverse methods for research and
assessment of college students. Alexandria, VA: ACPA.
Stage, F. K. & Manning, K. (2003). Research in the college context. New York:
Brunner-Routledge.
Subramony, D. P., Lindsay, N., Middlebrook, R.H., & Fosse, C. (2002). Using
focus group interviews. Performance Improvement, 41(8): 38-45. Silver
Spring, MD: International Society for Performance Improvement.
Upcraft, M. L. & Schuh, J. H. (1996). Assessment in student affairs: A guide for
practitioners. San Francisco: Jossey-Bass Publishers.
Vaughn, S., Schuman, J. & Sinagub, J. (1996). Focus group interviews in
education and psychology. Thousand Oaks, CA: Sage Publications.