Impact and Experiences of Imagine Tomorrow 2014
Submitted to:
Imagine Tomorrow Executive Strategy Committee
and
Internal WSU Imagine Tomorrow Steering Committee
By
Jessica L. Beaver
Chad M. Gotch
Brian F. French
80 Cleveland Hall
Washington State University
Pullman, WA 99164-2136
(509) 335-8394
Study Overview
Description of study
The Imagine Tomorrow (IT) event held at Washington State University brings students and
professionals together to present innovative and creative projects in the areas of science, technology,
engineering, and mathematics (STEM); design; and human-behavior fields. To continue the success and
growth of IT, stakeholders must learn about participants, how they view the experience, and how they
are influenced by their participation. By gathering such data, we can begin to describe the impact IT has
on participants, identify areas that may need improvement, and implement changes to help IT reach its
full potential to create meaningful learning opportunities for students and contribute to a skilled and
energetic workforce. The following report summarizes survey responses from students, advisors, and
judges who participated in Imagine Tomorrow 2014. The information from these surveys will be used to
inform the more formal assessment process for the 2014-2015 year. We note this is the second
systematic data collection targeting both proximal and distal outcomes of the program. This report gives
descriptive information from the 2014 students, advisors, and judges. During the upcoming year, data
across the two years will be evaluated together to assess trends and differences on outcomes.
Methodology
There were 542 students, 50 advisors, and 115 judges who participated in IT 2014. To gather
information about their experiences, opinions, and other variables of interest, a separate questionnaire
was constructed for each group. Completion times were 30 minutes for the student questionnaire and 10 minutes each for the advisor and judge questionnaires. Each person was assigned a unique survey
access code so their responses could be tied back to demographic information reported during event
registration. Participation was voluntary, and all participant responses to questionnaire items were
anonymous.
Survey methodology followed the Tailored Design Method (Dillman). The process involved a
series of contacts with the participants, with communication at each point in time and for each group
being tailored to the specific context. All contacts were made via email. The first contact contained a
description of the purpose of the study and a URL link for the individual to access the questionnaire.
Responses were monitored, and follow-up emails were sent when response activity waned. Judges received two reminders to participate; advisors and students received a third. As
an incentive to participate, student respondents were entered into a random drawing for three Kindle
Fire HDXs.
Response Statistics
Valid email addresses were available for 434 of the 542 student participants. At the close of data
collection, 188 students had responded to the survey. Inspection of responses revealed 24 students had
not completed more than the first couple of pages of the survey. These students were removed from the
analysis, leaving a final sample of 164, corresponding to a response rate of 38%. In the advisor category,
37 of the 50 participants responded to the survey, corresponding to a response rate of 74%. Finally,
among the 115 judges who participated in IT 2014, 85 responded to the survey, for a response rate of
74%.
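For reference, each rate above follows from dividing usable responses by the pool invited to respond (students with valid email addresses, and all participating advisors and judges, whom we assume were all contacted):

$$\text{response rate} = \frac{\text{usable responses}}{\text{participants invited}}, \qquad \frac{164}{434} \approx 38\%, \qquad \frac{37}{50} = 74\%, \qquad \frac{85}{115} \approx 74\%$$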
Event participation
Figure 1 displays project categories as identified by students who responded to the survey. Of
projects presented at the IT event, the most-represented category was technology (37%). Projects concerning human behavior accounted for 31% of the total, and projects concerning biofuels and design accounted for 18% and 15%, respectively.
[Pie chart: Technology 37%, Human behavior 31%, Biofuels 18%, Design 15%]
Figure 1. Project categories as identified by student respondents.
A majority of students (77%) indicated they heard about the IT event from a teacher,
administrator, or counselor at their school. Another 16% heard about the event from other students at
their school. Sources of advisor awareness of the event were more varied. Many advisors heard about
the IT event through a teacher, administrator, or counselor at their school (40%) or via an email from IT
staff (14%). Other channels through which advisors learned about the IT event were a student at their school (9%), community members outside of school (3%), a poster or flier (11%), a conference (6%), a teacher, administrator, or counselor at another school (6%), or other unspecified means (11%).
Figure 2 depicts students' most important influences for participating in the IT event. The most prominent influence for students to participate in IT was their interest in the topic (49%). Fewer students indicated that their primary reason for participating in the IT event was that school personnel had encouraged them to do so or that they believed participation could help them get into college (14% each). A few students wanted to work with their friends (9%). Smaller proportions of students indicated that their number-one reason for participating in IT was attraction to the prizes (6%) or a reason other than the ones identified in the questionnaire (8%).
[Pie chart: Interested in topic 49%, School personnel encouraged me 14%, May help me get into college 14%, Wanted to work with friends 9%, Attracted by the prize money 6%, Other 8%]
Figure 2. Students’ most important influences for participating in Imagine Tomorrow.
Students, STEM, and career aspirations
Motivation, Identity, Teamwork, and Self-efficacy
To learn more about the students participating in the IT event, particularly concerning their
orientation to STEM academics and careers, students were asked to complete four non-cognitive scales
assessing their motivation for STEM classes in their high schools, how closely STEM work aligns with
their identity, the experiences and orientations they have toward teamwork, and the belief they have in
themselves to successfully engage in STEM activities (i.e., their STEM self-efficacy). Students self-reported on a 5-point response scale for each measure and were assigned a mean score based on their responses.
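As a minimal sketch of the scoring, assuming each scale score is the unweighted mean of a student's item responses (the number of items per scale is not reported here), student $i$'s score on scale $s$ with $k_s$ items is

$$\bar{x}_{is} = \frac{1}{k_s}\sum_{j=1}^{k_s} x_{isj}, \qquad x_{isj} \in \{1, 2, 3, 4, 5\},$$

so scores can range from 1 to 5, consistent with the medians reported below.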
Responses across all four scales were very positive. The median score on the STEM motivation
scale was 4.00, with scores ranging from 2.54 to 4.61. The median score on the STEM identity scale was
4.35, with scores ranging from 3.57 to 4.85. The median score on the teamwork scale was 4.23, with
scores ranging from 3.49 to 4.52. The median score on the STEM self-efficacy scale was 3.71, with scores
ranging from 1.97 to 4.58. While we cannot claim the Imagine Tomorrow experience produced such
positive responses, we can state that students who participated retained a positive orientation toward
STEM.
Career intentions
After participating in IT, students expressed largely positive interest in pursuing a career in the fields highlighted by the event (Figure 3). Over half the respondents were very interested or extremely
interested in pursuing a career in science or engineering. Less than one in four reported only a slight
interest in these career fields or no interest at all. Interest in a career related to studying human behavior or design was more dispersed, with students' interest levels residing mostly in the middle of the scale.
[Stacked bar chart: percentage of students reporting each level of interest (not at all, slightly, moderately, very, extremely interested) in science, engineering, design, and human-behavior careers]
Figure 3. Students’ interests in pursuing a career in science, engineering, design, or human-behavior.
In Figure 4 we see a representation of words used by students to describe whether they are
more or less interested in STEM, design, or human-behavior related fields after participation in IT. Large,
bold words represent high frequency of usage; smaller words were used less frequently. It appears
students have become more interested in the fields highlighted at the IT event. Students frequently
mentioned research as a reason why they had become more interested in the fields highlighted by the IT
event. Perhaps conducting their research projects enhanced their interests in these areas of study and
will inspire future careers in these fields.
Figure 4. Students’ interests in fields highlighted by the IT event.
Student respondents were also asked to report their opinions, along five dimensions, concerning their interest in pursuing a STEM career. A three-point response scale was provided for each dimension. Students overwhelmingly saw a STEM career as fun, inspiring, and interesting (Figure 5). Responses also indicated such a career reflected what most students want to do, while acknowledging that pursuing it would not be academically easy. It appears IT student participants are up for a challenge in order to realize their career aspirations.
[Stacked bar chart: student responses on five dimensions (not what I want to do vs. what I want to do; not fun vs. fun; not academically easy vs. academically easy; not inspiring vs. inspiring; not interesting vs. interesting)]
Figure 5. Students’ opinions about pursuing a career in STEM fields.
Quality of the experience
Overall quality
All students, judges, and advisors were asked to rate the overall atmosphere and quality of the
IT event on a 5-point scale ranging from 1 = poor to 5 = excellent. Because of the low number of responses indicating a poor, fair, or good quality rating, these three response categories were collapsed into a single category. Overall, the quality and atmosphere of the 2014 IT event were rated
very positively, with the majority of participants giving the two highest marks possible, very good (4) and
excellent (5), as shown in Figure 6.
[Bar chart: percentage of students, advisors, and judges giving overall ratings of 1-3, 4, or 5]
Figure 6. Participant ratings of the overall quality of the IT event.
Competition Experience
Students, advisors, and judges all responded to an open prompt asking them to identify the best
part of their IT experiences. Their responses are depicted in Figures 7-9. Students most often stated that the projects were the best part of their experience at the IT event. Other common responses were seeing all the presentations, participating in the competition, and interacting
with judges and friends throughout the event. Advisors and judges had very similar responses to each
other and both stated that meeting and interacting with students was the best part of their experiences.
Advisors also mentioned meeting and interacting with judges and watching students’ presentations.
Judges also mentioned seeing all of the different ideas and projects students had as well as the students’
enthusiasm for the competition. Clearly, the interactions among groups were a key component of the event for all involved.
Figure 7. Words used by students to describe the best part of IT 2014.
Figure 8. Words used by advisors to describe the best part of IT 2014.
Figure 9. Words used by judges to describe the best part of IT 2014.
Future Imagine Tomorrow events
Future participation
Students, advisors, and judges were all asked how likely they were to participate in future IT events. Responses were rated on a four-point scale from not at all likely to very likely. The majority of
students (71%), advisors (83%), and judges (71%) all indicated that they would be very likely to
participate in future IT events (Figure 10).
[Bar chart: percentage of students, advisors, and judges at each level (not at all likely, somewhat likely, likely, very likely)]
Figure 10. Participants' likelihood of participating in the IT event again.
Recommendation to others
All participants were asked how likely they were to recommend participation in IT events to others (Figure 11). The majority of students (64%), advisors (75%), and judges (73%) indicated that they would be very likely to recommend the IT event to colleagues and others. It is clear Imagine Tomorrow is
providing a positive and sustainable STEM experience to high school students and teachers in the
Northwest.
[Bar chart: percentage of students, advisors, and judges at each level (not at all likely, somewhat likely, likely, very likely)]
Figure 11. Participants' likelihood of recommending the IT event to others.
Closing Remarks
Evidence supports that Imagine Tomorrow is a productive and inspiring event
at Washington State University. Hundreds of participants from across the state and Northwest come to
the Pullman campus to present innovative and often ambitious work. These participants leave with
strongly positive feelings about their experience. At an even broader level of impact, IT appears to
maintain and perhaps even cultivate an interest in STEM education and careers. It is clear the projects
and, more importantly, the ideas, bring people together to interact in positive ways. This impact sets up
IT to make positive contributions not only to those who participate in the event, but also to the
economic health of the state, as the students move into careers that generate products and services
that position the state to compete in a global marketplace. Noted areas for potential improvement of
the event appear easily addressable. Ensuring participants have information necessary to get oriented to
campus, adequate time to carry out their responsibilities, good food, and relevant and stimulating
interactions with professionals in the field (including the keynote address) will allow IT to carry on the
positive contributions it is already making. The next phase of analysis with existing data will begin to
look at trends across years on common assessments and student outcomes.