Improving Student Knowledge and Skills With Digital Learning Tools:
A Study of the Effectiveness of the Cengage MindTap Application
ABSTRACT
The purpose of this study was to explore the impact of MindTap’s
use on student learning and to gather student and teacher
perceptions for further refinement of the product. This study
investigated the following question: Do students in classes that use
MindTap show gains in course knowledge and skills? The study also
explored the results by gender and ethnicity.
The study used a mixed-methods research design, combining quantitative and qualitative
approaches, to address this question. First, using a pre-post,
treatment-group-only design, this study evaluated student growth in
knowledge and skills. Second, a qualitative study of instructor and
student perceptions of MindTap effectiveness using surveys and
interviews was conducted. The findings indicate that students in
classes using MindTap substantially increased their course
knowledge and skills. An effect size of .84 was found for students in
Nutrition classes, and an effect size of .77 was found for students in
Macroeconomics classes.
This study was conducted by MarketingWorks and SEG Measurement, independent educational research firms.
This research was supported by a grant from Cengage Learning.
January 2013
Table of Contents
Executive Summary
   Overview
   Findings
   Conclusion
Chapter 1: Introduction
   Student Sample
Chapter 2: Quantitative Study
   Study Design
   Description of the Pretest and Posttest
   Description of the Treatment
   Findings
   Descriptive Statistics
   Evaluating Growth in Course Content Knowledge and Skills
Chapter 3: Qualitative Study
   Feedback from Instructors
   Feedback from Students
Chapter 4: Conclusion
Executive Summary
Overview
In 2012, Cengage Learning introduced a new online course management and instructional support
tool, MindTap. In the fall of 2012, MarketingWorks and SEG Measurement, two independent
research organizations, conducted a semester-long study of students and instructors in classes using
MindTap as a preliminary evaluation of the effectiveness of the product.
Instructors and students in classes using MindTap completed an introductory survey, a bi-weekly
survey, and a concluding survey to ascertain their perceptions of the efficacy of MindTap, how it
was used, areas of strength and weakness, and recommendations for improvement. A treatment-group-only design was used to gather quantitative evidence of effectiveness. Students in classes
using MindTap took a pretest of course content knowledge and skills at the beginning of the
semester and a posttest at the end of the semester. The amount of growth between pre- and
posttest was analyzed statistically as an indicator of the effectiveness of MindTap.
Findings
Students in classes using MindTap showed substantial growth from pretest to posttest in both
Macroeconomics and Nutrition (see Figure 1). Students in Nutrition classes using MindTap showed
significant gains in course content knowledge and skills (t=10.21;df=90;p<.01; Effect size=.84).
Students in Macroeconomics classes using MindTap showed significant gains in course content
knowledge and skills (t=5.89;df=58;p<.01; Effect size=.77).
At the conclusion of the semester, the average rating of instructors’ overall experience with
MindTap was 2.9 out of 4, or a grade of B. In addition, “adoption potential,” meaning the
willingness to recommend MindTap to other colleagues, was rated 3.1 out of 4, with 77% saying
there was an “excellent” or “above average” possibility that they would recommend MindTap to
their colleagues. All but one of the instructors said that the program contributed to the
improvement of student learning, and 85% said that students were either very or somewhat engaged
with their coursework, largely as a result of their use of MindTap.
Students rated their overall experience with MindTap as 3.3 out of 4, a solid B+ rating. In addition,
most found the coursework to be very engaging, saying that they were more engaged in the course
because of their use of MindTap. Finally, about two-thirds of students said that they thought they
had learned more than they would have otherwise because of their use of MindTap. On average,
students rated their agreement with the statement “I would recommend the use of MindTap for this
course to my friends” as 4.6 out of 5.
Conclusion
The results of both the qualitative and quantitative study suggest that MindTap is an effective tool
for improving student learning. Both students and instructors reported positive perceptions of the
MindTap product, and students in both the Nutrition and Macroeconomics classes using MindTap
substantially increased their course content knowledge and skills over the course of a semester.
Cengage plans to confirm these preliminary positive results in larger studies to be conducted during
the 2013-2014 school year.
Chapter 1: Introduction
In 2012, Cengage Learning introduced a new digital course management and instructional support
tool, MindTap, an online, personalized learning experience built on Cengage Learning’s content.
MindTap combines student learning tools (readings, multimedia, activities, and assessments) into a
Learning Path that guides students through their course. Instructors can personalize the experience
by customizing Cengage Learning content and learning tools, including adding their own content via
apps that integrate with the MindTap framework and with Learning Management Systems.
To gather preliminary evidence of the effectiveness of the MindTap program, MarketingWorks and
SEG Measurement, two independent research organizations, conducted a semester-long study in the
fall of 2012 of students and instructors in classes using MindTap. The study was designed to collect
information for use in refining the MindTap application and to conduct a preliminary evaluation of
the effectiveness of MindTap. The primary research question for this evaluation was: Do students in
classrooms using MindTap show significant gains in course content knowledge and skills?
All instructors and a sample of students in classes using MindTap completed an introductory survey,
a bi-weekly survey, and a concluding survey to ascertain their perceptions of the efficacy of
MindTap, how it was used, areas of strength and weakness, and recommendations for improvement.
Both instructors and students were interviewed mid-way through the semester to further determine
their perceptions of the MindTap product.
A pre-post, treatment-group-only design was used to gather quantitative evidence of effectiveness
from a subset of the classes. Students in these classes using MindTap were administered a pretest of
course content knowledge and skills at the beginning of the semester and a posttest of knowledge
and skills at the end of the semester. The amount of growth between pre- and posttest was analyzed
as a preliminary indicator of MindTap’s effectiveness.
Instructor Sample
A total of 13 instructors teaching a variety of courses at 13 different institutions of higher education
provided feedback about their experiences using the MindTap product throughout the fall 2012
semester. All instructors used one of the following eight Cengage Learning courses:

A People & A Nation, Brief Ninth Edition: Beth Norton et al. (2)
Business: Pride/Hughes/Kapoor, 10th Edition (1)
Business Communication: Process and Product, M. E. Guffey (1)
Management: Richard L. Daft (1)
Medical Terminology for Health Professions: Ann Ehrlich/Carol L. Schroeder, 6th Edition (2)
Nutrition: Concepts and Controversies: Frances Sizer (2)
Principles of Economics: N. Gregory Mankiw, 6th Edition (2)
Principles of Macroeconomics: N. Gregory Mankiw, 6th Edition (2)
Instructors and students in three course areas originally participated in the study of the efficacy of
MindTap: Macroeconomics, Nutrition, and Medical Terminology. Since fewer than 10 Medical
Terminology students participated in the full study, they were excluded from the quantitative portion
of the study but remained in the qualitative feedback portion.
To summarize, all 13 institutions in the research were non-profit schools, with four being two-year
schools and eight being four-year schools. Over half of the schools were in urban environments, with the rest
divided between suburban and rural locations. A list of participating institutions appears in the
Appendix.
Instructors held a wide range of titles, from adjunct instructor to department chair. A majority of
instructors were full-time; were 41 or older; were white; held a master's as their highest degree; and
had four or more years of experience teaching overall and teaching this specific course. Just over
half were female. All said they were very or somewhat experienced with the use of technology.
All but three instructors were using MindTap for the first time when the research was conducted,
with the majority using the associated textbook for three years or less. All but one instructor
selected it on their own. They were using MindTap on a desktop (6), laptop (3), or with Blackboard
(3). The majority had used similar products in the past, including three who had used Aplia, another
Cengage Learning product. Video activities and assessments were the digital tools most widely used
with MindTap.
Demographic breakdown of the instructors and institutions is as follows:
(Counts are numbers of instructors.)

Type of Institution
   2-year non-profit: 4
   2-year for-profit: 0
   4-year non-profit: 8
   4-year for-profit: 0

Instructor Title
   Adjunct Instructor/Professor: 3
   Instructor: 2
   Assistant Professor: 2
   Associate Professor: 3
   Professor: 1
   Other: 1 Dept Chair, 1 Sr Academic Staff

Full-time/Part-time Status
   Full-time: 10
   Part-time: 3

Highest Degree Attained
   Bachelors: 0
   Masters: 8
   Other professional degree beyond BA: 1
   Ed.D/PhD: 3
   Other: 1 (all but dissertation)

Teaching Experience
   First year: 2
   1-3 years: 2
   4-10 years: 4
   More than 10 years: 5

Years Teaching This Course
   First year: 2
   1-3 years: 2
   4-10 years: 7
   More than 10 years: 2

Years Using Textbook
   First semester: 6
   1-3 years: 3
   4-6 years: 3
   More than 6 years: 1

Time Using MindTap
   First semester: 10
   1 semester: 0
   2 semesters: 3

Use of Similar Products in Past
   Yes: Cengage Emerge, SAM 2010, My Lab/Mastering, Aplia (3), Connect (McGraw Hill), MyHistoryLab (Pearson), Cengage Brain
   No: 5

Choice of MindTap
   Instructor chose it: 12
   Department required it: 1

Location of School
   Urban: 7
   Suburban: 3
   Rural: 3

Digital Tools Used
   Lecture capture system: 1
   Student response devices: 1
   Social networking: 2
   Google apps for education: 0
   Video activities or assessment: 6
   Web collaborative tools: 0
   Other: Blackboard (2), Tweet, Blog, Moodle (2), Podcasts/movies, ANGEL
   None: 1

Experience with Technology
   Very experienced: 8
   Somewhat experienced: 5
   Not very experienced: 0
   Not at all experienced: 0

Gender
   Male: 6
   Female: 7

Age
   21-30: 1
   31-40: 3
   41-50: 2
   More than 50: 7

Ethnicity
   White/Caucasian: 12
   Black: 0
   Hispanic: 1
   Asian: 0

Platform Using MindTap
   Blackboard: 3
   Desktop: 6
   Laptop: 3
   Smartphone: 0
   Other: 1
Student Sample
Approximately 260 students participated in the study of the effectiveness of Cengage Learning's
MindTap. Table 1 shows the number of students in each gender and ethnic category. This sample
provides a 99% level of confidence that the margin of error is within 93% of the results that would
have been obtained had all US college students been studied. For Macroeconomics, about three-fifths
of the participants were male and two-fifths were female. Nearly three-fifths of the
Macroeconomics students were Caucasian, about one-fifth were classified as Asian, and one-tenth
were Hispanic; the remaining Macroeconomics students were classified as other, with no students
reporting they were African American.
For Nutrition, half the students were male and half were female. One-third of the students
participating in Nutrition classes were Hispanic, one-quarter were Caucasian, just under one-fifth
were Asian or classified as other, and the remaining 6% were African American.
Table 1
Profile of Student Participants
Variable                    Macroeconomics          Nutrition
                            Number of Students      Number of Students
Gender
   Male                     49 (59%)                86 (50%)
   Female                   34 (41%)                86 (50%)
   TOTAL                    83                      172
Ethnicity
   Caucasian                47 (57%)                44 (26%)
   African American         0 (0%)                  11 (6%)
   Hispanic                 10 (12%)                57 (33%)
   Asian/Pacific Islander   17 (20%)                27 (16%)
   Other                    5 (6%)                  26 (15%)
   Not Reported             4 (5%)                  7 (4%)
   TOTAL                    83                      172
In some instances, teachers did not provide complete background information for a student or a
student did not take one of the tests included in the analyses. Where data was missing, the student’s
results were eliminated from those analyses.
A total of 26 of the 260 students were selected to provide feedback during the course of the study;
demographic information was collected directly from these students. This sample provides a 99% level
of confidence that the margin of error is within 90% of the results that would have been obtained
had all 260 students been sampled. In general, these students were under 30 (almost half were 21 or
younger), spoke English as their native language, were enrolled full-time at their institutions, earned
high school GPAs of B or better, expected to complete a bachelor's or master's degree, and were
somewhat experienced with technology. Most had prior experience with digital tools in their classes.
Few were using any digital tools for studying, however, with social networking the most common
choice, cited by 23% of the students.
Demographic breakdown of the 26 students is as follows:
Age
   0-20: 46%
   21-30: 39%
   31-40: 11%
   41-50: 4%
   51 or above: 0%

English as Native Language
   Yes: 89%

No. of Courses Taken in the Semester
   1-3: 11%
   4-6: 79%
   7 or more: 11%

GPA in High School
   A or A+: 14%
   A-: 29%
   B or B+: 50%
   B-: 4%
   C or C+: 4%
   Less than C-: 0%

Disabilities/Medical Conditions
   Learning disability (e.g., dyslexia): 4%
   Psychological disorder: 4%

Highest Degree Intended to Earn
   Associates: 7%
   Bachelor's: 32%
   Master's: 43%
   Professional (e.g., MD, PhD, JD): 18%

Technology Experience
   Very experienced: 32%
   Somewhat experienced: 68%
   Not very experienced: 0%
   Not at all experienced: 0%

Prior Digital Experience in Classes Prior to MindTap
   Yes: 71%
   No: 21%
   Not sure: 7%

Technology Tools Used This Year in School for Studying/Learning (Required / Optional)
   E-portfolio: 3% / 1%
   Student response system (clickers): 9% / 4%
   Lecture capture system: 1% / 4%
   Social networking: 14% / 23%
   Video activities or assessment: 16% / 17%
   Web collaboration tools: 8% / 10%
   Social document sharing: 19% / 16%
   Group messaging for study groups: 16% / 16%
   Note taking or Web clipping service: 5% / 9%
   Other: 9% / 1%
Chapter 2: Quantitative Study
Study Design
The growth in students' course content knowledge and skills was evaluated using a pre-post,
treatment-group-only design. Students in classes using MindTap were given a pretest at the beginning
of the semester and a posttest at the end of the semester. The difference in the pretest and posttest
results was analyzed statistically as an indicator of the effectiveness of MindTap instruction.
Description of the Pretest and Posttest
Student learning was operationalized as the gains in course content knowledge and skills between
the pre- and posttest. Student knowledge and skills in the study was measured using a 30-item
pretest administered at the beginning of the semester in August or September and a 30-item posttest
administered at the end of the semester in December. The pretest and the posttest contained 15
questions in common and 15 unique questions.
An item analysis revealed that all items were technically sound. No items had p values below .30 or
above .95, and no items had point biserial correlations (discrimination coefficients) below .20. This
shows that the questions were neither too hard nor too easy and effectively differentiated between
those students who had the knowledge and skills and those who did not.
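The item statistics referenced above can be computed directly from a scored response matrix. The sketch below is illustrative only, not the study's actual scoring code; it assumes 0/1 item scores and uses a corrected item-total correlation as the point biserial.

    # A minimal sketch of the item analysis described above, assuming a 0/1 scored
    # response matrix (rows = students, columns = items). Thresholds mirror those
    # reported: p values between .30 and .95 and point biserials of .20 or higher.
    import numpy as np

    def item_analysis(responses: np.ndarray):
        """Return item difficulty (p value), point-biserial discrimination, and flags."""
        p_values = responses.mean(axis=0)            # proportion of students answering correctly
        total = responses.sum(axis=1)                # each student's total score
        n_items = responses.shape[1]
        point_biserials = np.empty(n_items)
        for j in range(n_items):
            rest = total - responses[:, j]           # total score excluding the item itself
            point_biserials[j] = np.corrcoef(responses[:, j], rest)[0, 1]
        flagged = (p_values < 0.30) | (p_values > 0.95) | (point_biserials < 0.20)
        return p_values, point_biserials, flagged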
To help ensure the validity of the measures used, both the Macroeconomics and Nutrition tests were
designed to measure the content taught in the typical Macroeconomics and Nutrition courses
reflected in the textbooks used in the study. For each content area, the major content and skill
objectives were identified to correspond to the chapter headings in the book. Each content/skill
objective was “assigned” 1-3 test items based on the amount and importance of the content and
skills associated with the objective. The items were written to generate content parallel pre- and
posttests for the study. The reliability of the Macroeconomics pre- and posttest was .82 (KR-20).
The reliability of the Nutrition pre- and posttest was .84 (KR-20). Reliability of .80 or above is
generally considered acceptable.
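The KR-20 coefficient cited above follows a standard formula based on item difficulties and the variance of total scores. A minimal sketch, assuming 0/1 item scores, follows; it is not the study's actual computation.

    # Kuder-Richardson Formula 20 (KR-20) for a dichotomously scored test.
    import numpy as np

    def kr20(responses: np.ndarray) -> float:
        """responses: array of 0/1 item scores, rows = students, columns = items."""
        k = responses.shape[1]                           # number of items (30 per test here)
        p = responses.mean(axis=0)                       # item difficulties
        q = 1.0 - p
        total_var = responses.sum(axis=1).var(ddof=1)    # variance of students' total scores
        return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)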
Description of the Treatment
The treatment in this study was classroom use of MindTap. Instructors reported using MindTap for
an average of 4.17 hours per week. Students received between 10 and 16 weeks of instruction.
Data Collection
Instructors participating in the study were provided with the pretests, answer sheets, and
administration manuals in August 2012. The instructors then administered the pretest according to
the administration instructions provided. At the end of the semester, in December 2012, the
instructors administered the posttest.
The completed test booklets and answer sheets were returned to SEG Measurement for processing.
The answer sheets were analyzed and entered into a database. All data were reviewed and checked
for accuracy before scoring and analysis. The pre and posttest student records were merged to
create a single record for each student containing both pre and posttest data. The pretest and
posttest results were compared as a basis for evaluating growth.
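The matching step described above amounts to joining pretest and posttest records on a common student identifier and keeping only students present in both files. A minimal sketch follows; the file and column names (student_id, pre_score, post_score) are hypothetical, not the study's actual layout.

    # Merge pretest and posttest records into one row per student.
    import pandas as pd

    pre = pd.read_csv("pretest_scores.csv")    # hypothetical columns: student_id, pre_score
    post = pd.read_csv("posttest_scores.csv")  # hypothetical columns: student_id, post_score

    # An inner join keeps only matched records, dropping students who joined the
    # class late or left before the posttest was administered.
    matched = pre.merge(post, on="student_id", how="inner")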
Qualitative and quantitative feedback about MindTap was obtained from all instructors through
PubCentral’s CAP tool on a biweekly basis, starting with feedback for the two weeks prior to
September 30, 2012 and ending with feedback for the two weeks prior to December 14, 2012. In
addition, a concluding survey was administered at the end of the semester, typically during the week
of December 17. And finally, a telephone interview was conducted about mid-way through the
semester with all instructors.
In addition, 26 students from the 13 classes provided feedback about their experiences with
MindTap throughout the research study. Like instructors, qualitative and quantitative feedback
about MindTap was obtained from all students through PubCentral’s CAP tool on a biweekly basis.
In addition, a concluding survey was administered at the end of the semester, and a telephone
interview was conducted about mid-way through the semester.
Findings
Descriptive Statistics
The first step in analyzing the results of the study was to examine the average scores (mean)
achieved by students on the pre and posttests as well as the variation (standard deviation) in those
scores. We calculated the mean and standard deviation for both the pre and posttests (see Table 2).
Table 2
Descriptive Statistics
MindTap Pre and Posttest Data

Course Area        N Pretest   Mean Pretest   SD Pretest   N Posttest   Mean Posttest   SD Posttest
Macroeconomics     59          14.25          3.16         59           17.31           3.96
Nutrition          91          14.47          4.07         91           18.30           4.54
Evaluating Growth in Course Content Knowledge and Skills
Student course content knowledge and skills at the beginning of the semester were compared to
their knowledge and skills at the end of the semester using t tests. The t test is a commonly
accepted statistic for comparing growth over time and evaluating whether or not those differences
are due to chance or reflect actual growth. The magnitude of the difference is expressed as the
effect size, a commonly accepted measure of the degree of difference. (Effect size was calculated as
the difference between the pre and posttest score means, divided by the overall standard deviation.)
Only students for whom matched pretest and posttest results were available were included in the
analysis. Students who left the class during this period or who joined the class during this period
were not included in the growth comparisons.
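The analysis just described is a paired t test on matched pre/post scores plus an effect size computed as the mean gain divided by an overall standard deviation. The sketch below is illustrative only: the arrays are placeholders, and treating the "overall" standard deviation as the pooled pretest and posttest scores is an assumption, since the report does not specify the exact pooling.

    # Paired t test and effect size for matched pretest/posttest scores.
    import numpy as np
    from scipy import stats

    def growth_analysis(pre: np.ndarray, post: np.ndarray):
        """pre and post must be aligned so that element i belongs to the same student."""
        t_stat, p_value = stats.ttest_rel(post, pre)          # paired t test, df = n - 1
        overall_sd = np.concatenate([pre, post]).std(ddof=1)  # assumed "overall" SD
        effect_size = (post.mean() - pre.mean()) / overall_sd
        return t_stat, p_value, effect_size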
Statistically significant growth was found within classes using MindTap for both Nutrition and
Macroeconomics. Students in classes that used MindTap showed substantial growth from pretest to
posttest in both course areas. Students in Nutrition classes using MindTap showed significant gains
in course content knowledge and skills (t=10.21;df=90;p<.01; Effect size=.84). Students in
Macroeconomics classes using MindTap showed significant gains in course content knowledge and
skills (t=5.89;df=58;p<.01; Effect size=.77). This means that the students using MindTap showed
gains in course knowledge and skills greater than what would be expected by chance.
Chapter 3: Qualitative Study
Feedback from Instructors
Qualitative and quantitative feedback about MindTap was obtained from all instructors through
PubCentral’s CAP tool on a biweekly basis, starting with feedback for the two weeks prior to
September 30, 2012 and ending with feedback for the two weeks prior to December 14, 2012. In
addition, a concluding survey was administered at the end of the semester, typically during the week
of December 17. And finally, a telephone interview was conducted about mid-way through the
semester with all instructors.
At the conclusion of the semester, the average rating of instructors’ overall experience with
MindTap was 2.9 out of 4, representing a grade of B. Typical comments from participants included
the following:
The information included in MindTap is excellent. Great work!
For my three overall objectives - - cost to students, better engagement, and ease of adapting to our requirements
- - MindTap met and exceeded expectations.
All but one student was able to adapt to ebooks without going back to the textbook. It seems the technology
will be accepted by the students more as time goes by.
A key strength of MindTap is that the variety of exercises appeals to students with different learning styles.
The assignments are well rounded in that they measure comprehension of basic facts as well as concepts, and
the primary source assignments help students develop critical thinking skills.
MindTap allowed me to start students very late in the semester, completing the course with only a one-week
extension. Without MindTap, I would not have been able to assist them.
Negative comments typically came from a minority of instructors and usually related to the way
particular elements of the program functioned. For example:
MindTap was very difficult to get to work right in the first half of the semester, although this did get better as
time went on, perhaps as students became more familiar with the program’s quirks and how to work around
them.
There were some aspects of my ability to manage the course that were frustrating. For example, when only
one student takes a quiz, I couldn’t make further changes to the ground rules (e.g., how many attempts a
student could make).
The substance of the activities and quizzes fell below my expectations. For some activities, there’s not much
substance. And the quizzes included strangely worded questions. There’s a lot of potential, but significant
obstacles that need to be addressed in terms of technical issues as well as intellectual content.
“Adoption potential” (meaning the willingness to recommend MindTap to other colleagues) was
rated 3.1 out of 4, with 10 out of 13 instructors (77%) saying there was an “excellent” or “above
average” possibility that they would recommend MindTap to their colleagues. Strengths of the
system included:
Ability to turn it over to students and get them to self-engage and go beyond listening to a lecture. Its basic
organization and ease of adaptability to specific instructors’ needs make it superior.
Having all of MindTap instead of the textbook and MindTap helps students to make better use of the
program.
Progress tracking and ability to add material as supplements that are gradable, use of video uploading
Applications that are embedded in the text
Help given students who need motivation in preparing for tests and reinforcing the lessons they should be
learning from the lectures and readings
The “live” links that allow students to move vertically as well as horizontally through the assigned chapters,
the layout of the chapter listing learning outcomes and objectives, and the links in each chapter allowing
students to navigate to a specific section of a chapter easily
The functionality and tools in the ebook
Suggestions for improving the product included the following:
More diversity in videos (to include people of color) and/or sharing database links for web sites with videos
that can be loaded into MindTap.1
More functionality in the progress tracking and kaltura/Google apps 2
More professor control with regard to quizzes (e.g., allowing students to see answers once the quiz is
completed, changing questions, changing the number of chances students have to complete the quizzes, and
integrating with Blackboard and MoodleRooms) 3
Accessibility for hearing and visually impaired students
1 Since the study was completed, the ability to add web links has been added to MindTap.
2 Similarly, MindTap 2.6 made significant improvements in the progress app.
3 Integration with Blackboard Learn 9.1 and above was added after the study and includes grade book exchange.
More flexibility with assignments, more customization, tools to better manage individual students, and
improvements in the progress app
Flexibility in how students can view the ebook pages (a significant minority didn’t like having to use the etext
in a vertical position on the “blackboard” and wanted it traditionally laid out.) 4
On average, instructors felt that MindTap met the needs of students with a wide range of capabilities
(average rating 2.9 out of 4). In addition, most instructors felt that MindTap made teaching easier
and better (average rating of 2.5 out of 4), although three complained that it made more work for
them because students were asking a lot of questions about the program.
All but two of the 13 instructors (85%) said that students were either very engaged or somewhat
engaged with their coursework, and all but three (77%) said that students’ engagement with the
coursework was impacted by their use of MindTap.
Some relevant comments follow:
There was more in-class participation, better attendance, and more discussion.
The tool pushes them to stay with reading, and this helped with out-of-class structure and personal discipline.
Students are engaged with the product so they can learn the information. I explained to them that it’s like
learning an international language.
I’d say about two-thirds of the students were very engaged.
The students who were most engaged in the class also thought MindTap had promising capabilities, although
I agree that it’s not altogether necessary for an introductory course in psychology.
4 The MindTap 2.6 release will allow students with iPads to change the orientation of the text in the MindTap Reader.
Students are accustomed to going through many classes without having to open their textbooks. With
MindTap, that’s impossible as it requires students to actually read the book closely. It requires more work
from them. From an instructor’s point of view, this is excellent. However, when students complained about
MindTap I perceived that this is what they were really chafing against. The problem wasn’t with
MindTap; they were irritated that they had to actually read and think about the book.
There are some students who loved the course, others who I have not seen since the first day!
All but one of the instructors (92%) said that MindTap contributed to the improvement of student
learning outcomes. (31% said it greatly contributed, 54% said it contributed somewhat, and 8% said
it contributed slightly). Specific comments include the following:
Great ebook with pre-lecture and post-lecture parts, meaning that students have a notion of the materials
before the lecture without reading the whole chapter. That’s a major difference compared to other ebooks.
Students that used the ebook loved the interface.
I think it kept students focused on the material throughout the semester rather than simply before a test.
The great thing about MindTap is that it accommodates a great variety of learning styles so the students can
find a combination of resources that best help them learn.
As with any assignments, students varied in their dedication to attend to the work. Students who seriously
approached the assignments had higher scores on the unit tests and had a stronger commitment of chapter
content.
In spite of time management and other challenges students faced in these business writing classes, anecdotal
evidence suggests that students’ business writing seems better than it has in past semesters.
In addition, all but two instructors (85%) said that MindTap helped students to get through the
course and focus on areas where they needed the most help - - either extremely well (38%) or
somewhat well (46%). Specific comments include the following:
The structure of MindTap is such that it keeps students on task. I like the learning path because it focuses
students’ attention on what we are covering in class that week.
I think MindTap helps students understand the material to the degree that it gets them to think about the
material from more than one perspective.
I believe the quizzes helped students a great deal. The “check your understanding” activities were very
helpful.
The learning outcomes and outline of each chapter before the post-lecture quiz and reading are great ways to
get students focused on what’s important.
The average rating of instructors' satisfaction with the instruction they received for setting up their MindTap course
was 2.8 out of 4. Some instructors felt the resources and training sessions were “excellent” while
others felt the setup was “extremely complicated” and even “stressful.” One mentioned not
receiving a welcome package with instructions as they had expected from prior experience, and a
couple of instructors mentioned that they learned the program “on the fly” and discovered features
that had not been explained on the WebEx training they received.
To evaluate their own experience with MindTap, instructors were asked to rate their agreement with
the following statements on a 5-point scale, with 1 meaning complete disagreement and 5 meaning
complete agreement. Average ratings for the 13 instructors were as follows:
MindTap allowed me to engage more with my students.                  3.5
MindTap helped me track my students’ progress more easily.           3.2
MindTap helped to improve my teaching of the course.                 3.6
MindTap’s content worked well with my textbook in this course.       3.9
The primary issue raised by several instructors concerned the difficulty of tracking students’ progress. They
complained that no average is calculated for each student to give them an idea of their progress and
that instructors can view only percentages and not numbers that would allow grade calculations.
One instructor commented that the Aplia grade book was much more functional than the one in
MindTap and did a far superior job of tracking the progress of students.
Content assets named as being of most and least value to the instructors were as follows:

Most Value
   Assessments: 1
   Game play: 1
   Notes: 1
   Read aloud: 1
   Flash cards: 2
   Videos: 2
   Pre and post-tests: 1
   Prelecture part of ebook: 1
   Glossary: 1
   Ebook functions: 2
   Quizzes: 3
   Kaltura: 1
   Audio chapter overview: 1
   Links to primary sources: 1
   Zoomable maps: 1
   Learning outcomes and objectives: 1

Least Value
   Lack of diversity: 2
   ConnectYard: 3
   Homework post-lecture: 1
   Activities that merely test reading comprehension: 1
   More advanced bells and whistles (e.g., ability to use outside sources): 1
   Progress app's lack of functionality: 1
   Dictionary and glossary: 1
Average ratings on a 4-point scale (where A=4 and F=0) of specific MindTap features from all of
the bi-weekly surveys completed throughout the semester were as follows:
Overall structure: 3.3 out of 4
Overall experience: 3.0
Cengage support: 3.7
Readings: 3.6
Quizzes and homework: 3.1
Feedback from Students
Qualitative and quantitative feedback about MindTap was also obtained from a sample of students
through PubCentral’s CAP tool on a biweekly basis, starting with feedback for the two weeks prior
to September 30, 2012 and ending with feedback for the two weeks prior to December 14, 2012. In
addition, a concluding survey was administered at the end of the semester, and a telephone interview
was conducted about mid-way through the semester.
Twenty-six students completed all of the biweekly surveys, the mid-semester interview, and the
concluding survey. As a group, they used MindTap about once a week (38%), several times a week
(50%), or almost every day (8%); and they typically spent 2-3 hours a week on homework and
studying outside of class (65%).
Half of the students reported that they experienced problems using MindTap, although many noted
the problems receded as the semester progressed. The following issues were noted:

- Freezing pages when using the read-aloud feature or taking quizzes
- JavaScript not refreshing when doing assignments
- Slow loading, freezing, or not loading at all without exiting out of the program completely
- Responses to quizzes not always recorded
- Getting results from quizzes
- Speed of the system
- Insufficient help in the tutorial about how to access the apps
- Inability to use the backspace button without the page reloading
- Opening assignments
- Class book access that was disabled
On average, students rated their overall experience using MindTap as 3.3 out of 4 (a B+ average
rating). Comments from students include the following:
I thoroughly liked the program. The information is relevant and easily accessible.
A great alternative to the book. In fact, I only used the book twice during the semester and that was due to
not having access to MindTap on my iPad.
I love this program. The information is useful, relevant to what we are learning, and it provided a good
overview.
Much better than the other ones out there.
I enjoyed using MindTap. It was extremely useful for my course. It was very helpful as a study tool because
it kept my work very organized and there are a lot of useful tools to use.
MindTap was always awesome. I was always able to use it effectively and I’m overall very pleased with my
experience. I hope other classes I take in the future will utilize MindTap.
MindTap was extremely useful for me. Working with a computer program is much more attractive for people
in our generation than to do practice problems from a textbook. Also, it is a way for me as a visual and
kinesthetic learner to interact with the material. See the material more than once, through lectures, text
readings, and then MindTap helped me learn better - - repetition of the concepts. I only rated it a B because
of the technical problems and because the pre-lecture quizzes don’t allow you to see which questions you got
wrong.
I feel that MindTap really improved my learning of the materials with strengths being the text, media
resources, galleries, and quizzes. All of these were very useful to the learning process. Weaknesses of the
system include: ineffective tutorial, not being able to utilize the text and galleries in full screen (some
images appeared very blurry and were hard to read the small print), and at times the system ran very slow but
got better after mentioning it in the reviews. All in all, I would say that the positives outweigh the negatives.
Technically, MindTap functions well once learned.
My over all experience was a good experience. It is an extremely easy to use program that I would
recommend to anyone wanting to take an online history course.
A lot better than I expected!
By and large, students found the coursework to be either very engaging (58%) or somewhat
engaging (35%), with 54% saying that they were more engaged in the course because of their use of
MindTap. As a sign of their engagement, 19% said they discussed topics, ideas, or concepts with
their instructors or other faculty members outside of class frequently (almost every week), and an
additional 31% said they did so sometimes (several times during the course).
Comments from students about their engagement and learning included the following:
I realized, after I had stopped using MindTap, how convenient it was to be able to work on an assignment
and check the answers immediately to see how well I was understanding the material.
My only suggestion to be more engaged would be to include more quizzes that can be accessed (not just for
grades or request of the instructor), and to place any apps that would benefit writing papers on the tools bar.
One suggestion would be a thesaurus next to the dictionary to make it conveniently located while I am
reviewing learning concepts and writing papers at the same time.
The media made me want to learn more about what I was studying.
I was very engaged in my coursework because MindTap was a simple, helpful website that provided me with
my homework and quizzes.
I was very engaged because 1) thanks to MindTap, I was engaged in the material in more than one way - rather than only using the book or the notes to study the material, I was engaged with the material online,
and 2) thanks to MindTap, I spent more time with the coursework overall, resulting in my being more
engaged.
My interest in the course was definitely increased as MindTap helped me understand what was going on.
Even when I fell behind on the readings and lectures, I knew I could catch up through MindTap and use it
as a checkpoint for whether I was grasping important concepts.
I think this resource makes learning much more engaging than a typical book. The tutorials, flashcards and
videos were excellent.
Interaction with other students and the instructor was a must and was enhanced through MindTap.
I was much more engaged than I'd otherwise be because homework and quizzes that I take in lecture are not
as helpful as the ones on the website. I can refer to the textbook at any time and write notes.
I was more interested and engaged in the material thanks to MindTap because MindTap was not only a
comprehensive resource, it was
an enjoyable one. I looked forward to using it and that made me more interested and engaged.
MindTap kept me interested in the course!
In addition, almost two-thirds of the students (65%) said that they thought they had learned more
than they would have otherwise because of their use of MindTap. Comments included:
MindTap greatly improved what I learned over the duration of this course and I do feel that I learned more
than I otherwise would have. The apps, media gallery, and text make it easy to learn and provide an
alternative to just reading a paper text. This increases the interest in the topics.
Because I spent more time with the material, in a number of different ways, I learned more thanks to
MindTap.
Content assets named as being of most and least value to students were the following, with Quizzes being
cited most frequently as of most value and Connect Yard as of least value:
Asset (citations as most value / least value)
   Read-aloud text: 2 / 2
   Flashcards: 7 / 1
   Highlighter: 3 / 2
   Note cards: 4 / 1
   Search tool: 4
   Post-lecture readings: 1
   Quizzes: 11
   Homework assignments: 4
   Dictionary/Glossary: 4
   Explanations of correct answers: 1 / 1 / 3
   Tutorials: 1
   Videos: 1
   Diet Analysis Plus: 1
   Online textbook: 1
   Tests: 2
   Media: 1
   Small book in the side tab: 1
   Enlarged print: 1
   Chapter reviews: 2 / 1 / 1
   Connect Yard: 4
   No printable features: 1
   Objectives: 1
   Google Docs: 2
   Readings: 1
To evaluate their own experience with MindTap, students were asked to rate their agreement with
the following on a 5-point scale, with 1 meaning complete disagreement and 5 meaning complete
agreement. Average ratings for the 26 students responding were as follows:
Average Ratings
   I would recommend the use of MindTap for this course to my friends.        4.6
   MindTap helped me better understand the expectations of my instructor.     4.1
   MindTap helped me complete assignments on time.                            4.0
   MindTap helped me better prepare for tests.                                4.3
   MindTap was valuable in helping me learn new concepts.                     4.5
   Using MindTap allowed me to better track my progress in this course.       3.8
   Using MindTap allowed me to engage more with my instructor.                3.5
   Using MindTap allowed me to engage more with my fellow students.           3.3
   I used the textbook less because of MindTap.                               3.6
   The MindTap site was easy to navigate.                                     4.4
   The content of MindTap was relevant to me.                                 4.7
   The content of MindTap was relevant to my course.                          4.8
Ratings overall were high and in all cases above the midpoint of 3.0 on the 5-point scale. The
highest ratings were given to MindTap’s relevancy to the students’ course (4.8), relevancy to them
personally (4.7), its value in helping students learn new concepts (4.5), its ease of navigation (4.4),
and its help in preparing students for tests (4.3). Most importantly in terms of overall acceptance, all
but two agreed completely (17 students) or somewhat (7 students) that they would recommend the
use of MindTap for this course to their friends.
There was less agreement about MindTap’s ability to engage students with their fellow students (3.3)
or with their instructors (3.5).
In addition to technical issues as detailed above, other specific recommendations to improve
MindTap include the following:

- Allow students to see which questions they answered incorrectly and to view the answers to the practice tests and quizzes
- Add a thesaurus next to the dictionary to make it more conveniently located
- Include a spell checker along with the dictionary if a student doesn’t know how to spell a word being looked up
- Make MindTap available in a textbook format
- Add games or more interactive media to make it more engaging
- Allow instructors to customize the quizzes to make them more aligned to the portions of the work actually covered in class
- Add mobile apps
- Allow images to be zoomed in
- Make the illustrations and charts more visible
Chapter 4: Conclusion
SEG Measurement and MarketingWorks conducted research during the Fall 2012 semester to
determine instructor and student perceptions of Cengage’s MindTap digital instructional tool and to
examine its effectiveness. A qualitative study collected survey and interview data from instructors
and students using MindTap to determine their perceptions of the product. A quantitative study of
MindTap users evaluated the extent to which students in Nutrition and Macroeconomics classes
increased their content knowledge and skills. Growth in knowledge and skills was evaluated using a
pre-post, treatment-group-only design.
The quantitative study found that students in classes that used MindTap showed substantial growth
in knowledge and skills. Nutrition students grew more than four-fifths of a standard deviation (effect
size = .84), and Macroeconomics students grew more than three-quarters of a standard deviation
(effect size = .77). Researchers generally consider gains of this size to be quite large, and they are
meaningfully greater than those typically seen across a college semester, suggesting that MindTap
may be a useful tool for improving student learning.
The results of the qualitative study were generally positive as well, with over three-fourths of
instructors and 92% of students surveyed indicating a strong or good possibility of recommending
MindTap to their colleagues and friends.
The quantitative study results are preliminary. The positive findings of this study need to be
confirmed with larger samples of students and with more robust research employing a treatment-
control group design. Cengage Learning plans to complete such studies during the 2013-2014
school year to confirm the positive results found in this study.