Measuring Grit: Do Non-Cognitive Attributes Impact Academic Success, Engagement, Satisfaction, and Retention?
Alan Manley, Sales Manager, SmarterServices

Measuring Grits
• We recommend a glass beaker for measuring grits. We are from Alabama; we know about these things.

What is Grit?
• Why does one student who had straight A's in high school drop out of college after one year, while another excels?
• Why does one single mom with three children graduate summa cum laude while another drops out?

What is Grit?
• Grit is that elusive quality that prompts one student to stick with it while others quit.
• For over ten years we have measured levels of grit in over 2,000,000 students at over 500 colleges and universities.
• Today I want to share with you the results of research on the impact that grit has on student success.

Some Students Seem To Have More Grit Than Others

Three Approaches to Measuring Grit
• Stick your head in the sand.
• Use a brief, non-prescriptive survey.
• Use the SmarterMeasure Learning Readiness Indicator.

What is SmarterMeasure?
• A 124-item online skills test and attributes inventory that measures a student's level of readiness for studying online
• Used by over 500 colleges and universities
• Taken by over 2,000,000 students

What Are The Ingredients In Our Grits?

What Does The Assessment Measure?
• EXTERNAL
  - LIFE FACTORS: Availability of Time, Dedicated Place, Reason, Support from Family
  - TECHNICAL: Technology Usage, Life Application, Tech Vocabulary, Computing Access
• INTERNAL
  - INDIVIDUAL ATTRIBUTES: Motivation, Procrastination, Time Management, Help Seeking, Locus of Control
  - LEARNING STYLES: Visual, Verbal, Social, Solitary, Physical, Aural, Logical
• SKILLS
  - TYPING: Rate, Accuracy
  - ON-SCREEN READING: Rate, Recall

How Does The Beaker Work?

How Do You Spot a Bad Grit?

Adjusting Readiness Ranges
• Adjusting the cut points can make the reporting a more accurate predictor of success.

How Do Schools Use It?
• Orientation Course
• Enrollment Process
• Information Webinar
• Public Website
• Class Participation
• Facebook

Thermometer Analogy
• More important than taking your child's temperature is taking appropriate action based on their temperature.
• More important than measuring student readiness is taking appropriate action based on the scores.

Measuring the Grits

Progression of SmarterMeasure Data Utilization
• Predictive
• Correlation
• Comparison
• Descriptive
• Student Service

Research Ideas on the Research Page of the Website

Approaches to Research Projects
• Internally Conducted
• Company Assisted
• Professionally Assisted

Middlesex Community College
• 6% to 13% more students failed online courses than on-ground courses.
• Intervention Plan:
  - Administer SmarterMeasure
  - Identify which constructs best predicted success
  - Provide "Success Tips" as identified
  - Distribute via website, email, orientation course, records office, library, posters, and mail

Research Findings
• Analyzed 3,228 cases over two years
• Significant positive correlation between individual attributes and grades

Motivation Impacts Grades

Results of Middlesex Research
• Before SmarterMeasure™ was implemented, 6% to 13% more students failed online courses than students taking on-ground courses.
• After the implementation, the gaps narrowed: 1.3% to 5.8% more online students failed than on-ground students.
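The Middlesex research findings above come down to a correlation between readiness subscale scores and grades. Purely as a hypothetical illustration of that kind of calculation (not the school's actual analysis; the file and column names are invented), a minimal Python sketch might look like this:

```python
# Minimal sketch of a readiness-vs-grades correlation, in the spirit of the
# Middlesex findings described above. File and column names are hypothetical.
import pandas as pd
from scipy import stats

# Each row: one student's SmarterMeasure subscale score and final grade points (0-4 scale).
df = pd.read_csv("student_readiness_grades.csv")  # hypothetical export

# Pearson correlation between the Individual Attributes score and grade points.
r, p = stats.pearsonr(df["individual_attributes"], df["grade_points"])
print(f"r = {r:.3f}, p = {p:.4f}, n = {len(df)}")

# A positive, statistically significant r (p < .05) would mirror the reported
# finding that individual attributes relate to grades.
```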
Results of Middlesex Research
• Failure rates were reduced by as much as 10%.

Action Plan
• Empower eLearning staff, faculty advisors, and academic counselors with student data.
• Three areas of focus: Motivation, Self-Discipline, and Time Management.

Project Summary
• "In summary, the implementation of SmarterMeasure has helped students to achieve better academic success by identifying their strengths and weaknesses in online learning."
• In essence, the strategies used to promote SmarterMeasure™ created a "culture" during advising and registration in which students, faculty, and support staff know there is a way for students to see whether they are a good fit for learning online.

CEC - The Need
• We need to know which students to advise to take online, hybrid, or on-campus courses.
• We need to know which students to direct to which student services to help them succeed.
• We need to know how to best design our courses so that new students are not overwhelmed.

The Analysis
• What is the relationship between measures of student readiness and:
  - Academic Success: GPA
  - Engagement: Survey (N = 587)
  - Satisfaction: Survey (representative sample based on GPA and number of courses taken per term)
  - Retention: Re-enrollment data

The Analysis
• Phase One (Summer 2011)
  - Included data from all three delivery systems: online, hybrid, and on-campus
  - Analyzed data at the scale level
• Phase Two (Fall 2011)
  - Focused the research on online learners only
  - Analyzed data at the sub-scale level
• A neutral, third-party research firm (Applied Measurement Associates) used the following statistical analyses in the project: ANOVA, independent-samples t-tests, discriminant analysis, structural equation modeling, multiple regression, and correlation.

The Findings
• Statistically significant relationships were found between the readiness domains (Individual Attributes, Technical Knowledge, Learning Styles, Life Factors, and Technical Competency) and the outcomes of academic achievement, engagement, and retention; in particular, Individual Attributes and Technical Competency were significantly related to all three outcomes.

The Findings
• Student Categorizations
  - Enrollment Status: Positive (active/graduated) 34.3%; Negative (withdrew/dismissed/transferred) 65.7%
  - Academic Success Status: Passing (A, B, or C) 48.9%; Failing (D, F, or Other) 21.1%; Transfer Credit 21.8%; Not Reported 8.2%

The Findings - Correlates
• Subscales correlated with enrollment status (Positive vs. Negative) and grade status (Pass vs. Fail):
  - Life Factors: Place, Reason, and Skills (Positive vs. Negative); Place (Pass vs. Fail)
  - Learning Styles: Social and Logical (Positive vs. Negative); N/A (Pass vs. Fail)
  - Personal Attributes: Academic, Help Seeking, Procrastination, Time Management, and Locus of Control (Positive vs. Negative); Time Management (Pass vs. Fail)
  - Technical Competency: Internet Competency (Positive vs. Negative); Internet Competency and Computer Competency (Pass vs. Fail)
  - Technical Knowledge: Technology Usage and Technical Vocabulary (Positive vs. Negative); Technical Vocabulary (Pass vs. Fail)
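The F and p statistics in the predictor tables that follow come from analyses such as the one-way ANOVA named in the methods list above. As a hypothetical illustration only (the data file, column names, and GPA grouping below are invented, not the study's actual data), a minimal Python sketch of such a test:

```python
# Minimal one-way ANOVA sketch of the kind behind the F and p values reported
# below. The file, columns, and GPA banding used here are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("readiness_outcomes.csv")  # hypothetical export of scores and outcomes

# Compare Time Management scores across GPA bands (e.g., low / middle / high).
groups = [grp["time_management"].to_numpy() for _, grp in df.groupby("gpa_band")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A large F with a small p (e.g., p < .05) indicates that mean readiness scores
# differ across the outcome groups, i.e., the subscale is a candidate predictor.
```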
The Findings - Predictors of GPA
• Life Factors: Place and Skills (F = 12.35, p = .0001)
• Learning Styles: Verbal and Logical (F = 3.95, p = .02)
• Personal Attributes: Help Seeking, Time Management, and Locus of Control (F = 21.11, p = .0001)
• Technical Competency: Computer and Internet Competency (F = 22.75, p = .0001)
• Technical Knowledge: Technology Vocabulary (F = 38.76, p = .0001)

The Findings - Predictors of Credit Hours Earned
• Life Factors: Place (F = 12.37, p = .0001)
• Learning Styles: Visual (F = 6.81, p = .01)
• Personal Attributes: Academic Attributes, Help Seeking, and Locus of Control (F = 13.40, p = .0001)
• Technical Competency: Computer Competency and Internet Competency (F = 12.23, p = .0001)
• Technical Knowledge: Technology Usage and Technology Vocabulary (F = 26.97, p = .0001)

The Recommendations
• We need to know which students to advise to take online, hybrid, or on-campus courses.
  - A profile of a strong online student is one who:
    • Has a dedicated place to study online
    • Possesses strong time management skills
    • Demonstrates strong technical skills
    • Exhibits a strong vocabulary of technology terms

The Recommendations
• We need to know which students to direct to which student services to help them succeed.
  - An online student who should be directed toward remedial/support resources is one who:
    • Has a weak reason for returning to school
    • Has weak prior academic skills
    • Is not likely to seek help on their own
    • Is prone to procrastinate
    • Has a low internal locus of control
    • Has weak technology skills

The Recommendations
• We need to know how to best design our courses so that new students are not overwhelmed.
  - Limit advanced technology in courses offered early in a curriculum
  - Foster frequent teacher-to-student interaction early in the course
  - Require milestones in assignments to prevent procrastination
  - Clearly provide links to people/resources for assistance

Argosy University
• Required in the Freshman Experience course
• Students reflect on scores and identify areas for improvement in their Personal Development Plan
• Group reflection with others with similar levels of readiness

Argosy University - COMPARE
• Compared the traits, attributes, and skills of the online and hybrid students.
• Substantial differences between the two groups existed.
• Changes were made to the instructional design process for each delivery system.

Argosy University - EXPLORE
• Correlational analysis between SmarterMeasure scores and student satisfaction, retention, and academic success.
• Statistically significant factors: Technical Competency, Motivation, and Availability of Time.

Argosy University - TREND
• Aggregate analysis of SmarterMeasure data to identify mean scores for students.
• Comparison made to the national mean scores from the Student Readiness Report.

Argosy University - APPLY
• Findings were shared with the instructional design and student services groups, and improvements in processes were made.
• For example, since technical competency scores increase as students take more online courses, the instructional designers purposefully allowed only basic forms of technology to be infused into the first courses that students take.
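Argosy's COMPARE step contrasts the readiness profiles of online and hybrid students. One conventional way to make that kind of comparison is an independent-samples t-test per subscale; the sketch below is a hypothetical illustration only (invented file, column names, and delivery labels), not Argosy's actual analysis.

```python
# Hypothetical sketch: compare online vs. hybrid students on one readiness
# subscale, in the spirit of Argosy's COMPARE step. Data and names are invented.
import pandas as pd
from scipy import stats

df = pd.read_csv("argosy_readiness.csv")  # hypothetical export with a 'delivery' column

online = df.loc[df["delivery"] == "online", "technical_competency"]
hybrid = df.loc[df["delivery"] == "hybrid", "technical_competency"]

# Welch's t-test does not assume equal variances between the two groups.
t_stat, p_value = stats.ttest_ind(online, hybrid, equal_var=False)
print(f"online mean = {online.mean():.1f}, hybrid mean = {hybrid.mean():.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A significant difference would support tailoring instructional design
# separately for each delivery mode, as the slides describe.
```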
J. Sargeant Reynolds Community College
• Required as an admissions assessment
• Integral part of their QEP
• Computed correlations between grades and the SmarterMeasure sub-scales (Life Factors, Individual Attributes, Learning Styles, Technical) for over 4,000 students.

Findings
• Statistically significant correlations with grades:
  - Dedicated place, support from employers and family, access to study resources, and academic skills (Life Factors)
  - Tech vocabulary (Technical Knowledge)
  - Procrastination (Individual Attributes)

Academic Success Rates
• [Chart: academic success rates for high-scoring vs. low-scoring students on Skills, Resources, and Time.]
• Less than 10% of students with low scores experienced academic success.

Five Schools
• What is the relationship between measures of online student readiness and measures of online student satisfaction?

Methodology
• Incoming vs. Outgoing
• Data from 1,611 students who completed both the SmarterMeasure Learning Readiness Indicator and the Priority Survey for Online Learners were analyzed.

Findings
• There were statistically significant relationships between factors of readiness and satisfaction.

National Data
• 2012 Student Readiness Report
• Data from 690,927 students at 324 colleges and universities

Online Learner Demographics
• 70% were female
• 59% were Caucasian/White
• 54% had never taken an online course before
• 35% were traditional-aged college students
• 52% were students at an associate's-level institution

Online Learner Demographics
• Dominant social learning style
• Highly motivated
• Moderate reading skills
• Pressed for time
• Fast typists
• Increasing technical skills

Profile of an Online Student
• Four demographic variables have had a statistically significantly higher mean for four years in a row.
• Females score higher in Individual Attributes, Academic Attributes, and Time Management; males score higher in Technical Knowledge.

Profile of an Online Student
• Caucasians have had the highest means in Technical Knowledge for four years.
• Students who have taken five or more online courses have had the highest means for four years in Individual Attributes, Technical Knowledge, and Procrastination.

Conclusion
• Statistically significant relationships exist between measures of online student readiness and measures of academic success, engagement, satisfaction, and retention.

Conclusion
• Students individually benefit and schools collectively benefit from measuring learner readiness and responding appropriately.

SmarterMeasure.com

SmarterMeasure References
• Middlesex, Argosy, J. Sargeant Reynolds: http://smartermeasure.com/research/research-results/
• Noel-Levitz Study: https://www.noellevitz.com/papers-research-higher-education/2011/2011-adult-and-online-learner-satisfaction-priorities-reports
• Student Readiness Report: http://smartermeasure.com/smartermeasure/assets/File/Online-Student-Readiness-Report.pdf

"Live as if you were to die tomorrow. Learn as if you were to live forever." Mahatma Gandhi

For More Info
SmarterServices.com
(877) 499-SMARTER
info@SmarterServices.com