Solving Two Common eLearning Problems:
Improving Retention Rates
Managing The Proctoring Process
Provided by
Dr. Mac Adkins, President
SmarterServices
Archive: http://www.slideshare.net/socialsmarterservices/measuring-what-matters-noncognitive-skills-grit
• How do you determine who can be enrolled at your school?
– Standardized test scores
– Prior grade point averages
– Admissions exams
• The National Association for College Admission Counseling rated these factors:
• CONSIDERABLY IMPORTANT
– College prep course grades
– Strength of high school curriculum
– Standardized test scores
– Overall GPA
• MODERATELY IMPORTANT
– Admissions essay
– Letters of recommendation
– Demonstrated interest
– Class rank
– Extracurricular commitment
A study funded by the Bill and Melinda Gates Foundation ranked these reasons why students leave college:
1. Conflict with work schedule
2. Affordability of tuition
3. Lack of support from family – financial and practical support
4. Lack of belief that a college degree is valuable
5. Lack of discipline – too much socializing, not enough studying
http://www.publicagenda.org/pages/with-their-whole-lives-ahead-of-them
Employers
Colleges
Faculty
National Research Council
US Department of Education
Mothers
National Association of Colleges and Employers
Survey of Employers: http://www.unl.edu/svcaa/documents/how_employers_see_candidates.pdf
Elements of Mission Statements From 35 Universities
1. Knowledge, learning, mastery of general principles
2. Continuous learning, intellectual interest, curiosity
3. Artistic cultural appreciation
4. Appreciation for diversity
5. Leadership
6. Interpersonal skills
7. Social responsibility, citizenship and involvement
8. Physical and psychosocial health
9. Career preparation
10. Adaptability and life skills
11. Perseverance
12. Ethics and integrity
Michigan State University, 2004
WICHE Cooperative for Educational Technologies, 2013
COGNITIVE
Problem solving
Critical thinking
Systems thinking
Study skills
Adaptability
Creativity
Meta-cognitive skills
INTERPERSONAL
Communication
Social Intelligence
Teamwork
Leadership
Cultural sensitivity
Tolerance for diversity
INTRAPERSONAL
Anxiety
Self-efficacy
Self-concept
Attributions
Work ethic
Persistence
Organization
Time management
Integrity
Life-long learning
“The test score accountability movement and conventional educational approaches tend to focus on intellectual aspects of success, such as content knowledge. However, this is not sufficient. If students are to achieve their full potential, they must have opportunities to engage and develop a much richer set of skills. There is a growing movement to explore the potential of the “noncognitive” factors — attributes, dispositions, social skills, attitudes, and intrapersonal resources, independent of intellectual ability — that high-achieving individuals draw upon to accomplish success.”
Are You Beginning To See The Picture?
• Non-cognitive skills matter
– Determine student retention
– Determine employer satisfaction
– Determine online course success
– Federal agencies recognize their importance
– They are the mission of many schools
– Parents value them
“Years of schooling predicts labor market outcomes — cognitive skills account for only 20%; therefore 80% of the ‘years of schooling’ benefit is due to noncognitive skills” (Bowles, Gintis, & Osborne, 2001).
http://www.umass.edu/preferen/gintis/jelpap.pdf
APTITUDE ATTITUDE SITUATION
You can’t change a tiger’s stripes, but you can teach that tiger to hunt in a different environment.
1. Optic – A lens through which students can view their strengths and opportunities for improvement
2. Student Service – A tool to guide students toward available resources for support
3. Placement – Developmental / remedial course placement
4. Talking Points – A collection of statements which academic advisors can use to advise their students
5. Early Alert – A list of students who are likely to benefit from the instructor reaching out to them early in the course
6. Predictive Analytics – A set of data which can be analyzed at the individual and aggregate level to project student performance
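As an illustration of how readiness data can be used at both the individual and the aggregate level, here is a minimal pandas sketch. The column names, scores, and the 70-point cut are hypothetical placeholders, not SmarterMeasure's actual data model or reporting.

```python
# Minimal sketch of individual- and aggregate-level use of readiness scores.
# Column names and the 70-point cut are hypothetical, not SmarterMeasure's own.
import pandas as pd

scores = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "section":    ["ENG101-01", "ENG101-01", "ENG101-02", "ENG101-02"],
    "individual_attributes": [82, 58, 91, 64],   # percent scores
    "life_factors":          [75, 49, 88, 71],
    "technical_knowledge":   [90, 62, 85, 55],
})

# Individual level: flag students below a chosen cut point for early outreach.
CUT_POINT = 70
scores["early_alert"] = (
    scores[["individual_attributes", "life_factors", "technical_knowledge"]]
    .lt(CUT_POINT).any(axis=1)
)

# Aggregate level: mean readiness by section, to spot cohorts that may need support.
section_means = scores.groupby("section")[
    ["individual_attributes", "life_factors", "technical_knowledge"]
].mean()

print(scores[["student_id", "early_alert"]])
print(section_means)
```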
• Instructor ratings – Time and task intensive for the faculty
• Observer records – Expensive and time consuming
• Letters of recommendation – Rarely objective
• Interviews – Time consuming to conduct and code
• Socioeconomic data – Beneficial mostly at the aggregate level due to exceptions and bias
• Self-assessment – Yes, there are limitations, but it is the preferred method
                        SmarterMeasure   ACT Engage   ETS SuccessNavigator   Wonderlic Admissions Risk Profile
Individual Attributes         X               X                X                            X
Life Factors                  X
Learning Styles               X
Technical Skills              X
Reading Skills                X
Keyboarding Skills            X
Custom Questions              X
SmarterMeasure Learning Readiness Indicator
• The leading student readiness assessment, based on non-cognitive indicators of success
• Validated assessment and highly configurable assessment engine
• Used by over 500 colleges and universities
• Taken by over 2,900,000 students since 2002
What Does The Assessment Measure?
INTERNAL
• Individual Attributes: Motivation, Procrastination, Time Management, Help Seeking, Locus of Control
• Learning Styles: Visual, Verbal, Social, Solitary, Physical, Aural, Logical
EXTERNAL
• Life Factors: Availability of Time, Dedicated Place, Reason, Support from Family
SKILLS
• Technical: Technology Usage, Life Application, Tech Vocabulary, Computing Access
• Typing: Rate, Accuracy
• On-Screen Reading: Rate, Recall
Adjusting the cut points can make the reporting a more accurate predictor of success.
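As a sketch of what adjusting a cut point means in practice, the loop below sweeps candidate cut points on a readiness score and keeps the one that best classifies a hypothetical set of student outcomes. Real cut-point tuning would use an institution's own historical SmarterMeasure scores and grade data.

```python
# Sketch: choosing a cut point for a readiness score so it best separates
# successful from unsuccessful students. The data and labels are hypothetical.
import numpy as np

readiness = np.array([42, 55, 61, 64, 70, 73, 78, 81, 88, 93])   # percent scores
succeeded = np.array([0,  0,  0,  1,  0,  1,  1,  1,  1,  1])    # 1 = passed the course

best_cut, best_acc = None, 0.0
for cut in range(40, 96):
    predicted = (readiness >= cut).astype(int)      # predict success at/above the cut
    acc = (predicted == succeeded).mean()
    if acc > best_acc:
        best_cut, best_acc = cut, acc

print(f"best cut point: {best_cut}  (classification accuracy {best_acc:.0%})")
```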
• Orientation Course
• Enrollment Process
• Information Webinar
• Public Website
• Class Participation
• 68% of client schools administer the assessment to all students, not just eLearning students
• More important than taking your child’s temperature is taking appropriate action based on their temperature.
• More important than measuring student readiness is taking appropriate action based on the scores.
Progression of SmarterMeasure Data Utilization
• Predictive
• Correlation
• Comparison
• Descriptive
• Student Service
Research ideas are available on the Research page of the website.
Approaches to Research Projects
• Internally Conducted
• Company Assisted
• Professionally Assisted
• 6% to 13% more students failed online courses than on-ground courses.
• Intervention Plan
- Administer SmarterMeasure
- Identify which constructs best predicted success
- Provide "Success Tips" as identified
- Distributed by website, email, orientation course, records office, library, posters, and mail
• Analyzed 3,228 cases over two years
• Significant positive correlation between Individual Attributes and grades
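For context, a correlation of this kind can be checked with a few lines of SciPy. The scores and grades below are fabricated for illustration; the study's actual data and coefficients are not reproduced here.

```python
# Sketch of the kind of correlation test reported above: Individual Attributes
# score vs. final grade. The numbers are made up for illustration.
from scipy.stats import pearsonr

attribute_scores = [55, 62, 68, 71, 74, 80, 83, 88, 91, 95]   # readiness percent scores
final_grades     = [61, 70, 65, 78, 74, 85, 80, 88, 93, 90]   # course grades (0-100)

r, p = pearsonr(attribute_scores, final_grades)
print(f"r = {r:.2f}, p = {p:.4f}")   # a positive, significant r supports the finding
```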
Motivation Impacts Grades
Before SmarterMeasure™ was implemented, 6% to 13% more students failed online courses than students taking on-ground courses. After the implementation, the gap narrowed: 1.3% to 5.8% more online students failed than on-ground students.
Failure rates were reduced by as much as 10%.
• Empower eLearning staff, faculty advisors, and academic counselors with student data:
- Motivation
- Self-Discipline
- Time Management
“In summary, the implementation of SmarterMeasure has helped students to achieve better academic success by identifying their strengths and weaknesses in online learning.”
In essence, with various strategies implemented to promote SmarterMeasure™, a “culture” was created during advising and registration in which students, faculty, and support staff know that there is a way for students to see if they are a good fit for learning online.
• We need to know which students to advise to take online, hybrid or on-campus courses.
• We need to know which students to direct to which student services to help them succeed.
• We need to know how to best design our courses so that new students are not overwhelmed.
• What is the relationship between measures of student readiness and variables of:
– Academic Success - GPA
– Engagement – Survey (N=587)
– Satisfaction – Survey (Representative Sample based on GPA and number of courses taken per term)
– Retention – Re-enrollment data
• Phase One – Summer 2011
– Included data from all three delivery systems – online, hybrid and on-campus
– Analyzed data at the scale level
• Phase Two – Fall 2011
– Focused the research on online learners only
– Analyzed data at the sub-scale level
• A neutral, third-party research firm (Applied Measurement Associates) used the following statistical analyses in the project:
– ANOVA, independent-samples t-tests, discriminant analysis, structural equation modeling, multiple regression, and correlation
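To make the named analyses concrete, here is a minimal SciPy sketch of two of them, an independent-samples t-test and a one-way ANOVA, run on hypothetical readiness scores. It shows the shape of the tests, not the study's data or results.

```python
# Minimal sketch of two of the analyses named above, on hypothetical data:
# an independent-samples t-test (retained vs. not retained) and a one-way
# ANOVA (GPA bands) on a readiness-scale score.
from scipy.stats import ttest_ind, f_oneway

retained     = [74, 81, 77, 88, 69, 83, 90]   # readiness scores, retained students
not_retained = [58, 66, 61, 72, 55, 64]       # readiness scores, students who left

t, p_t = ttest_ind(retained, not_retained, equal_var=False)
print(f"t = {t:.2f}, p = {p_t:.4f}")

low_gpa, mid_gpa, high_gpa = [60, 64, 58], [70, 75, 72, 68], [84, 88, 91, 86]
F, p_f = f_oneway(low_gpa, mid_gpa, high_gpa)
print(f"F = {F:.2f}, p = {p_f:.4f}")
```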
• Academic Achievement
– The scales of Individual Attributes, Technical Knowledge, and Life Factors showed statistically significant mean differences across the measures of GPA.
• Retention
– The measure of Learning Styles produced a statistically significant mean difference between students who were retained and those who left.
• A 73% classification accuracy was achieved with this retention measure.
– The scales of Individual Attributes and Technical Knowledge were statistically significant predictors of retention as measured by the number of courses taken per term.
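A discriminant analysis of the kind behind the 73% figure above can be sketched with scikit-learn. The feature matrix and retention labels below are fabricated, and a real study would report held-out or cross-validated accuracy.

```python
# Sketch of a discriminant analysis classifying retained vs. not-retained
# students from two readiness scales. The data are fabricated for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[82, 75], [58, 49], [91, 88], [64, 71],    # [individual_attributes,
              [77, 80], [52, 60], [85, 79], [60, 55]])    #  technical_knowledge]
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])                    # 1 = retained

lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)   # a real study would use held-out or cross-validated data
print(f"classification accuracy: {accuracy:.0%}")
```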
• Engagement
– The scales of Individual Attributes and Technical Competency had statistically significant relationships with the four survey items related to Engagement.
– The scales of Life Factors, Individual Attributes, Technical Competency, Technical Knowledge, and Learning Styles were used to correctly classify responses to the survey questions related to engagement and satisfaction with up to 93% classification accuracy.
• Satisfaction
– Structural equation modeling was used to create a hypothesized theoretical model to determine whether SmarterMeasure scores would predict satisfaction as measured by the survey.
– Results indicated that student responses to the readiness variables, collected prior to taking online courses, were statistically significant indicators of later student satisfaction.
– Therefore, the multiple SmarterMeasure assessment scores are a predictor of the Career Education survey responses.
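The study itself used structural equation modeling; as a lighter-weight stand-in that poses the same prediction question, the sketch below regresses a satisfaction score on readiness-scale scores with statsmodels. All values are hypothetical.

```python
# Simplified stand-in for the SEM analysis described above: multiple regression
# of a satisfaction score on readiness-scale scores. All data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "satisfaction":          [3.2, 4.1, 2.8, 4.5, 3.9, 2.5, 4.8, 3.4],   # survey, 1-5
    "individual_attributes": [62, 80, 55, 88, 75, 50, 93, 68],
    "life_factors":          [58, 77, 60, 85, 72, 48, 90, 66],
    "technical_competency":  [70, 82, 52, 91, 79, 57, 95, 64],
})

model = smf.ols(
    "satisfaction ~ individual_attributes + life_factors + technical_competency",
    data=df,
).fit()
print(model.summary())   # coefficients and p-values for each readiness scale
```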
• Statistically Significant Relationships

                        Academic Achievement   Engagement   Retention
Individual Attributes            X                 X            X
Technical Knowledge              X                 X            X
Learning Styles                                    X            X
Life Factors                     X                 X
Technical Competency                               X
• Student Categorizations
– Enrollment Status
• Positive – active/graduated (34.3%)
• Negative – withdrew/dismissed/transfer (65.7%)
– Academic Success Status
• Passing – A, B or C (48.9%)
• Failing – D, F or Other (21.1%)
– Transfer Credit – (21.8%)
– Not reported – (8.2%)
Readiness domain subscales with statistically significant differences, by student categorization:
• Life Factors – Positive vs. Negative: Place, Reason, and Skills; Pass vs. Fail: Place
• Learning Styles – Positive vs. Negative: Social and Logical; Pass vs. Fail: N/A
• Personal Attributes – Positive vs. Negative: Academic, Help Seeking, Procrastination, Time Management, and Locus of Control; Pass vs. Fail: Time Management
• Technical Competency – Positive vs. Negative: Internet Competency; Pass vs. Fail: Internet Competency and Computer Competency
• Technical Knowledge – Positive vs. Negative: Technology Usage and Technical Vocabulary; Pass vs. Fail: Technical Vocabulary
Readiness domain subscales with statistically significant differences by GPA:
• Life Factors – Place and Skills (F = 12.35, p = .0001)
• Learning Styles – Verbal and Logical (F = 3.95, p = .02)
• Personal Attributes – Help Seeking, Time Management, and Locus of Control (F = 21.11, p = .0001)
• Technical Competency – Computer and Internet Competency (F = 22.75, p = .0001)
• Technical Knowledge – Technology Vocabulary (F = 38.76, p = .0001)
Readiness domain subscales with statistically significant differences by credit hours earned:
• Life Factors – Place (F = 12.37, p = .0001)
• Learning Styles – Visual (F = 6.81, p = .01)
• Personal Attributes – Academic Attributes, Help Seeking, and Locus of Control (F = 13.40, p = .0001)
• Technical Competency – Computer Competency and Internet Competency (F = 12.23, p = .0001)
• Technical Knowledge – Technology Usage and Technology Vocabulary (F = 26.97, p = .0001)
• We need to know which students to advise to take online, hybrid or on-campus courses.
– A profile of a strong online student is one who:
• Has a dedicated place to study online
• Possesses strong time management skills
• Demonstrates strong technical skills
• Exhibits a strong vocabulary of technology terms
• We need to know which students to direct to which student services to help them succeed.
– An online student who should be directed toward remedial/support resources is one who:
• Has a weak reason for returning to school
• Has weak prior academic skills
• Is not likely to seek help on their own
• Is prone to procrastinate
• Has a low internal locus of control
• Has weak technology skills
• We need to know how to best design our courses so that new students are not overwhelmed.
– Limit advanced technology in courses offered early in a curriculum
– Foster frequent teacher to student interaction early in the course
– Require milestones in assignments to prevent procrastination
– Clearly provide links to people/resources for assistance
• Required in a Freshman Experience course
• Students reflect on scores and identify areas for improvement in their Personal Development Plan
• Group reflection with others with similar levels of readiness
• Compared the traits, attributes, and skills of the online and hybrid students.
• Substantial differences between the two groups existed.
• Changes were made to the instructional design process for each delivery system.
• Correlational analysis between SmarterMeasure scores and student satisfaction, retention, and academic success
• Statistically significant factors across the measures of Satisfaction, Retention, and Success included Motivation, Technical Competency, and Availability of Time.
• Aggregate analysis of SmarterMeasure data to identify mean scores for students
• Comparison made to the national mean scores from the Student Readiness Report
(Chart: Argosy mean scores compared with national mean scores)
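A comparison of an institution's mean score against the national mean can be sketched as a one-sample t-test. The local scores and the national figure below are placeholders, not values from the Student Readiness Report.

```python
# Sketch of the aggregate comparison described above: test whether one
# institution's mean Individual Attributes score differs from the national mean.
# The national mean and the local scores are placeholders, not report values.
from scipy.stats import ttest_1samp

local_scores  = [72, 68, 81, 77, 65, 84, 70, 79, 73, 76]   # one institution's students
NATIONAL_MEAN = 74.0                                        # placeholder national figure

t, p = ttest_1samp(local_scores, NATIONAL_MEAN)
print(f"local mean = {sum(local_scores)/len(local_scores):.1f}, t = {t:.2f}, p = {p:.4f}")
```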
• Findings were shared with the instructional design and student services groups and improvements in processes were made.
For example, since technical competency scores increase as the students take more online courses, the instructional designers purposefully allowed only basic forms of technology to be infused into the first courses that students take.
• Required as an admissions assessment
• Integral part of their QEP
• Computed correlations between grades and SmarterMeasure sub-scales for over 4,000 students
(Diagram: correlations of Grades with the Life Factors, Personal Attributes, Learning Styles, and Technical sub-scales)
• Statistically significant correlations:
- Dedicated place, support from employers and family, access to study resources, and academic skills (Life Factors)
- Tech vocabulary (Technical Knowledge)
- Procrastination (Individual Attributes)
(Chart: grades by High Score vs. Low Score groups on Skills, Resources, and Time)
Less than 10% of students with low scores experienced academic success.
North Central Michigan College – Petoskey, MI
• What is the relationship between measures of online student readiness and measures of online student satisfaction?
• Data from 1,611 students who completed both the SmarterMeasure Learning Readiness Indicator and the Priority Survey for Online Learners were analyzed.
• There were statistically significant relationships between factors of readiness and satisfaction.
• 2014 Student Readiness Report
• Data from 460,406 students from 367 colleges and universities
• 67% were female
• 55% were Caucasian/White
• 51% had never taken an online course before
• 39% were traditional aged college students
• 71% were students at an associate’s level institution
• Dominant Social learning style
• Highly motivated
• Moderate reading skills
• Pressed for time
• Increasing technical skills
• Four demographic variables have had a statistically significant higher mean for six years in a row:
– Females have had the highest means in Individual Attributes.
– Males have had the highest means in Technical Knowledge.
– Caucasians have had the highest means in Technical Knowledge.
– Students who have taken five or more online courses have had the highest means in Individual Attributes and Technical Knowledge.
• Statistically significant relationships exist between measures of online student readiness and measures of academic success, engagement, satisfaction and retention.
SmarterMeasure.com
• What testing formats are provided by your school?
– On-campus in a testing center
– Virtual (Bvirtual, ProctorFree, ProctorU)
– Human proctor
• Who organizes the work flow for each?
– Faculty member’s responsibility
– Staff member’s responsibility
• How many exams are proctored in a typical term?
• What LMS do you use?
• Not a virtual proctoring solution
– But it does help you organize the process of using virtual proctoring solutions like Bvirtual, ProctorFree, or ProctorU
• Not a student authentication solution
– But it can help you work with services like Authentify or EVS (Electronic Verification Systems)
• Not testing integrity software
– But it can help you use services like Respondus or ExamSoft
You can continue using whatever testing software you are currently using.
We just make the process of coordinating it easier.
Did I mention that it is typically FREE to schools!
What is SmarterProctoring?
• Software that works within your LMS to help students find different types of approved proctors (virtual, testing center, human, etc.), while also allowing faculty to see where each student is in the test scheduling process
• A system to organize all of the tasks associated with planning a proctoring session
• A database of vetted, trained, identity-verified testing proctors
Organizing the Proctoring Process
• Dashboard to monitor each student’s progress
• Archived communication between student, faculty and proctor
• Manage multiple proctoring types such as local testing centers, human proctors, virtual proctors, and machine proctors
• Integrated services with tools such as Bvirtual, RegisterBlast, ProctorFree, and ProctorCam
• Create workflows with companies like ProctorU and ExamSoft
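To illustrate the kind of per-student workflow tracking described above, here is a small, hypothetical Python data model: each exam session moves through a few states, and a dashboard view is just a grouping on the current state. The states, fields, and names are illustrative assumptions, not SmarterProctoring's actual design or API.

```python
# Hypothetical sketch of proctoring-session workflow tracking. States and
# fields are illustrative, not SmarterProctoring's actual data model.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ProctorType(Enum):
    TESTING_CENTER = auto()
    HUMAN = auto()
    VIRTUAL = auto()

class SessionState(Enum):
    NOT_STARTED = auto()
    PROCTOR_SELECTED = auto()
    SESSION_SCHEDULED = auto()
    EXAM_COMPLETED = auto()

@dataclass
class ProctoringSession:
    student: str
    exam: str
    proctor_type: Optional[ProctorType] = None
    state: SessionState = SessionState.NOT_STARTED

    def select_proctor(self, proctor_type: ProctorType) -> None:
        self.proctor_type = proctor_type
        self.state = SessionState.PROCTOR_SELECTED

    def schedule(self) -> None:
        self.state = SessionState.SESSION_SCHEDULED

sessions = [ProctoringSession("Ana", "BIO101 Final"),
            ProctoringSession("Ben", "BIO101 Final")]
sessions[0].select_proctor(ProctorType.VIRTUAL)
sessions[0].schedule()

# Instructor-dashboard view: who is at which step of the scheduling process.
for s in sessions:
    print(f"{s.student}: {s.state.name}")
```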
Approved Human Proctors
• Nationwide database of human proctors
• Vetted by your selection criteria
• Verified through third party verification
• Trained and tested on proctoring responsibilities
• Offer your students multiple proctoring solutions
• Have a system that does the organizational work for you
• Know where each student is in the proctor selection and scheduling process
• Have confidence that students' exams are being taken securely
Tools Included in SmarterProctoring
• Faculty Exam Dashboard – Shows a roster of all students and where each student is in scheduling their testing session
• Exam Configuration Panel – Allows the instructor to input exam and proctor notes and to customize which types of proctors are available to students for each exam
• Database of Vetted Human Proctors – Neutral proctors with qualifying credentials whom SmarterProctoring has approved and trained
• Integration – Seamless integration with multiple proctoring solutions (Bvirtual, RegisterBlast, ProctorFree, etc.)
Course Dashboard
Exam Dashboard
Exam Configuration Panel
Find a Proctor
• View/manage upcoming testing sessions which were scheduled via SmarterProctoring.
• Integration with your scheduling software may also be possible.
Human Proctor Network
• Andrew Davis
• andrew@smarterservices.com
• (205) 994-6264
• www.SmarterProctoring.com
SmarterServices.com
(877) 499-SMARTER info@SmarterServices.com