Program Quality Improvement Report 2009-2010
Department of Education, School of Education and Behavioral Sciences
Master of Education Program
CIP Code: 130101
Program Code: 650

Student Learning Outcomes
1. Curriculum - Develop and deliver curriculum based on theoretical foundations of the discipline (Application)
2. Technology - Demonstrate the use of technology in support of teaching and learning (Application)
3. SPA - Know and/or demonstrate the subject matter, professional knowledge, and skills outlined by the respective specialized professional association (Application)
4. Research - Analyze, utilize, and conduct research critically (Evaluation)
5. Diversity - Identify developmental and individual differences and adjust practices accordingly (Application)
6. Assessment - Monitor and assess pupil learning (Application)
7. Reflection - Reflect upon and evaluate his/her own practices (Evaluation)
*These outcomes align with the following standards: National Board for Professional Teaching Standards, Council for Exceptional Children, International Reading Association, and Association for Childhood Education International.

Alignment of Outcomes
1. The program outcomes for the Master of Education Program align with these elements of Cameron University's mission statement:
• "fosters a student-centered academic environment that combines innovative classroom teaching with experiential learning"
• "...prepares students for professional success, responsible citizenship, life-long learning, and meaningful contributions to a rapidly changing world"
2. Cameron University's Strategic Plan 2013 has as its first commitment that we are "becoming the University of Choice by providing students a top quality education." All of the program outcomes for the Master of Education Program are in alignment with this primary core value; specifically, we align with:
• Maintain and enhance Cameron's commitment to providing programs of the highest quality in instruction, research, and service to better meet the needs of the citizens of the region.
• Assure efficient, effective course delivery in multiple formats.
• Attract, develop, and retain diverse, high-quality faculty and staff.
3. These outcomes relate to our students and Southwest Oklahoma in that we:
• offer programs that continue to meet the needs of graduate students in this area, including coursework for those who hold an alternative teaching license and courses designed for a wide variety of teaching levels (i.e., elementary, secondary, special education, literacy);
• constantly assess our students' needs and offer courses at different sites via ITV and online as the need arises; and
• continually aim to attract more diverse students through recruitment efforts.
Learning Outcomes: Curriculum, Technology, SPA (content, skills, professional knowledge), Research, Diversity, Assessment, Reflection

Measures of Learning Outcomes
Direct Measures
• Portfolio
• Analysis of Growth (AOG) Paper
• Dispositional Assessments
Indirect Measures
• Exit Interviews

Report on Actions for Previous Priority Outcomes: Technology
Program Outcome | Actions Implemented | When Implemented | Resource Implications
TECHNOLOGY | Added 3 Hitachi StarBoards | Installed summer 2010 | $4,110.00
TECHNOLOGY | Trained faculty on use of the StarBoards | Early fall 2010 | None
TECHNOLOGY | Implemented a hybrid class for EDUC 5143 Multiculturalism | Spring 2010 | None
TECHNOLOGY | Purchased 2 laptops for student and faculty use | Spring 2010 | $1,600.00

Technology Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio: Multimedia Presentation | 0% | 65% | 35% | 3.40 (n=23) | 3.30 (n=29) | 3.70 (n=26)
Analysis of Growth: Knowledge of and ability to integrate technology | 0% | 50% | 50% | 3.90 (n=25) | 3.61 (n=10) | 4.21 (n=10)
Exit Survey: Ability to integrate technology in instruction | N/A | N/A | N/A | 4.14 (n=21) | 4.23 (n=28) | 4.77 (n=17)
Exit Survey: Use information technology | N/A | N/A | N/A | 4.06 (n=21) | 4.27 (n=28) | 4.80 (n=13)

Report on Actions for Previous Priority Outcomes: SPA
Program Outcome | Actions Implemented | When Implemented | Resource Implications
SPA | Renamed reflection rubrics so that the course name is included | Spring 2010 | None
SPA | Discussed the importance of emphasizing SPA standards in courses at the Graduate Faculty Meeting | Spring 2010 | None

SPA Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio: M.Ed. reflections from all courses | 2% | 43% | 55% | 3.30 (n=263) | 3.44 (n=128) | 4.00 (n=122)
Analysis of Growth: SPA Standards | 0% | 70% | 30% | 3.72 (n=25) | 3.52 (n=10) | 3.64 (n=10)
Exit Survey: Comprehension of professional standards in your area | N/A | N/A | N/A | 4.09 (n=21) | 4.47 (n=28) | 4.41 (n=17)
Exit Survey: Acquired knowledge, skills, and dispositions delineated in professional, state, and institutional standards | N/A | N/A | N/A | 4.09 (n=21) | 4.36 (n=28) | 4.88 (n=17)

Report on Actions for Previous Priority Outcomes: Research
Program Outcome | Actions Implemented | When Implemented | Resource Implications
Research | Implemented a methodology assignment in the Introduction to Graduate Research class | Spring 2010 | None
Research | Discussed the importance of integrating educational research in ALL graduate classes at the Graduate Faculty Committee meeting and the departmental faculty meeting | Fall 2009 | None

Research Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio Assessment (EDUC 5103 Research Proposal Rubric, Methods and Data Criteria 5 & 9) | 0% | 31.4% | 68.6% | 4.20 (n=36) | 3.69 (n=49) | 4.20 (n=51)
Analysis of Growth Paper | N/A | N/A | N/A | N/A | N/A | N/A
Exit Survey: Ability to use research to improve practice | N/A | N/A | N/A | N/A | 4.50 (n=22) | 4.62 (n=17)

Student-Learning Outcome and Measurements: DIVERSITY
Program Objective 5: Identify developmental and individual differences and adjust practices accordingly
Curriculum Area or Target Audience: Diversity - EDUC 5143 Multiculturalism in American Education, a core course required of all M.Ed. candidates
Measurements:
1. Portfolio Assessment: Annotated Bibliography (assessed each semester)
2. Analysis of Growth Paper (assessed in the last semester before graduation)
3. Exit Survey (administered in the last semester before graduation)
Method used to determine validity of measurement instruments: Content validity - all rubrics were reviewed by faculty and the advisory board.
Methods used to determine reliability of measurements: Inter-rater reliability - beginning Fall 2009, inter-rater reliability studies were started on portfolio artifacts. Analysis of Growth rubrics were reassessed.

Display of Assessment Data: Diversity Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio Assessment - EDUC 5143 Multiculturalism in American Education | 4% | 50% | 46% | 3.81 (n=19) | 4.50 (n=39) | 3.20 (n=20)
Analysis of Growth Paper - Role and importance of diversity | 0% | 30% | 70% | 3.85 (n=2) | 3.86 (n=10) | 3.64 (n=10)
Exit Survey - Work with diverse students | N/A | N/A | N/A | N/A | 4.26 (n=21) | 4.59 (n=17)

Analysis of Assessment Data: Diversity
[Bar chart: mean scores for the Portfolio, AOG, and Exit Survey measures across 2007-2008, 2008-2009, and 2009-2010, on a 0-5 scale]

Action Plan for Student Learning Outcome: Diversity
Program Outcome: Diversity
Action: Change the format of the class from 8 weeks to 16 weeks.
Justification: The action is based on scores below 3.00 in three areas of the Annotated Bibliography assignment (n=20): research question, 2.6; topic rationale, 2.8; reading analysis, 2.7. The action was necessary because the average score excluding mechanics was 2.9.
Implementation Timeline: Spring 2011
Resource Implications: None

Student-Learning Outcome and Measurements: TECHNOLOGY
Program Objective 2: Demonstrate the use of technology in support of teaching and learning
Curriculum Area or Target Audience: Technology - EDUC 5913 Multimedia in the Classroom, a core course required of all M.Ed. candidates
Measurements:
1. Portfolio Assessment: Multimedia Presentation (assessed each semester)
2. Analysis of Growth Paper (assessed in the last semester before graduation)
3. Exit Survey (administered in the last semester before graduation)
Method used to determine validity of measurement instruments: Content validity - all rubrics were reviewed by faculty and the advisory board.
Methods used to determine reliability of measurements: Inter-rater reliability - beginning Fall 2009, inter-rater reliability studies were started on portfolio artifacts; however, the rubric for Multimedia in the Classroom was not included in this study because the same instructor teaches the course each semester. Analysis of Growth rubrics were reassessed.
Display of Assessment Data: Technology Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio Assessment: EDUC 5913 Multimedia Presentation | 0% | 65% | 35% | 3.40 (n=23) | 3.30 (n=29) | 3.70 (n=26)
Analysis of Growth Paper: Knowledge of and ability to integrate technology | 0% | 50% | 50% | 3.90 (n=25) | 3.61 (n=10) | 4.21 (n=10)
Exit Survey: Ability to integrate technology in instruction | N/A | N/A | N/A | 4.14 (n=21) | 4.23 (n=28) | 4.77 (n=17)

Analysis of Assessment Data: Technology
[Bar chart: mean scores for the Portfolio, AOG Paper, and Exit Survey measures across 2007-2008, 2008-2009, and 2009-2010, on a 0.00-5.00 scale]

Action Plan for Student Learning Outcome: Technology
Program Outcome: Technology
Action: Increase the use of technology by obtaining a grant for Apple iPads.
Justification:
• Cameron teacher education students learn on a weekly basis the need for effective communication of objectives and content to students. This necessary skill teaches candidates to create content-delivery vehicles that will motivate and inspire students to learn.
• Emerging technology holds promise of providing a new and more effective platform to promote student learning. The Apple iPad is at the forefront of new technology, using application-driven tools to provide a new and fundamentally different way for users to communicate and retrieve content.
• Placing this emerging technology in instructional technology coursework for teaching candidates will place Cameron's graduates at the forefront of teaching.
Implementation Timeline: Spring 2011
Resource Implications: $5,000

Student-Learning Outcome and Measurements: RESEARCH
Program Objective 4: Analyze, utilize, and conduct research critically
Curriculum Area or Target Audience: Research - EDUC 5103 Introduction to Graduate Research, a core course required of all M.Ed. candidates
Measurements:
1. Portfolio Assessment: Research Proposal (assessed each semester)
2. Analysis of Growth Paper (assessed in the last semester before graduation)
3. Exit Survey (administered in the last semester before graduation)
Method used to determine validity of measurement instruments: Content validity - all rubrics were reviewed by faculty and the advisory board.
Methods used to determine reliability of measurements: Inter-rater reliability - beginning Fall 2009, inter-rater reliability studies were started on portfolio artifacts. Research Proposal rubrics were reassessed, and Analysis of Growth rubrics were reassessed.

Display of Assessment Data: Research Mean Scores
Scale: 1 = Poor, 2 = Below Average, 3 = Average, 4 = Above Average, 5 = Superior. The minimum average required to pass is 3.00.
Measure | 1 | 3 | 5 | 2007-2008 Mean | 2008-2009 Mean | 2009-2010 Mean
Portfolio Assessment (EDUC 5103 Research Proposal Rubric, Methods and Data Criteria 5 & 9) | 0% | 31.4% | 68.6% | 4.20 (n=36) | 3.69 (n=49) | 4.20 (n=51)
Analysis of Growth Paper | N/A | N/A | N/A | N/A | N/A | N/A
Exit Survey: Ability to use research to improve practice | N/A | N/A | N/A | N/A | 4.50 (n=22) | 4.62 (n=17)

Analysis of Assessment Data: Research
[Bar chart: mean scores for the Portfolio and Exit Survey measures for 2008-2009 and 2009-2010, on a 0-5 scale]

Action Plan for Student Learning Outcome: Research
Program Outcome: Research
Action 1: Continue implementation of the methodology assignment, which has been in effect for spring 2010 and summer 2010 (implementation timeline: N/A; resource implications: none).
Action 2: Have EDUC 5103 candidates give a presentation on the article analysis assignment (implementation timeline: Spring 2011; resource implications: none).

Published Information on Graduates, Academic Year 09-10
Status | Summer 2009 | Fall 2009 | Spring 2010 | Total
Working in Discipline | 0 | 2 (Teaching and Learning) | 8 (4 Elementary, 4 Teaching and Learning) | 10
Unknown | 0 | 0 | 1 | 1