PEC MEETING – COLLEGE OF EDUCATION AND HUMAN SERVICES
Meeting Minutes
Wednesday, April 18, 2007

I. Announcements
Reminder: Dean's Spring Fling on the 24th. Vonnie opened by thanking everyone for their work on assessments.

II. Presentation & Discussion
Discussion was based on what Vonnie learned at the ATE – VA conference. Following is the breakdown:

A. WHAT SHE LEARNED FROM THE CONFERENCE
•All colleges of education must complete a biennial measurement for the VDOE.
•The CEHS Assistant Dean's Office will probably coordinate that, but there will also be input from the programs. A sample of the draft survey that will need to be completed was passed out.
•VITAL is coming – a data collection system the state will use to collect data on all student teachers in Virginia. In time, it will follow students from the time they enter the student teaching program until they retire. Vonnie expects that the bulk of the data will come from the biennial plans.

B. STUDENT TEACHERS
•Student teachers must have a minimum of 150 hours of direct teaching and at least 300 hours in student teaching.
•Internships for administrators must be 320 hours.
A question was raised about videotaping students to show that they are meeting objectives. Vonnie said it was a good idea, although she didn't know how it would be organized; if a workable approach could be found, it would be acceptable.

C. IMPACT ON STUDENT LEARNING
Programs must be able to provide evidence that the teacher candidate can affect student learning. Our TWS really helps with this piece. If you can document this anywhere else, please do!

D. PRAXIS II
•Programs must have PRAXIS II passing rates of at least 70% until 2010, and 80% thereafter. This could cause a problem for small programs.
•Ten students are no longer necessary before this applies; ALL programs that require PRAXIS II must meet this requirement.
•The test must be passed before a student can be licensed.
•We need to better prepare our students for Praxis II.
•If a student fails and does not retake the test, it counts against us.
Question from the Math department: If the test is not taken, does it still count against us? Students will be considered non-completers; however, if they do not student teach, they are not really being tracked.
The Math department also asked: since failed tests count against us, why not advise students not to take the test until the end of the program?

E. PRAXIS II for 2005-2006
•El. Ed. – 70
•English – 6
•MS math – 4
•Social Studies – 9
•MS/SS – 3
•Business Ed. – 1
•Music – 5
•Art – 5
•Spanish – 2
•Biology – 1
•M.S. science – 2
•Health & PE – 10

F. Getting Ready for NCATE
Actually, we are in great shape!

The Unit
•Continue with data collection for the conceptual framework.
•Units fail when they have no conceptual framework and/or assessment problems.
•Thanks to you, we have neither problem.

SPAs
•Talk about your standards to colleagues and to students – start now.
•Place appropriate standard numbers on syllabi.
•Collect artifacts – the good, the bad, and the ugly.
•Keep data managed.
•Program meetings need minutes, even a meeting with just one or two people or the chair.

Shared Programs – How can we make this happen?
•Arts & Sciences needs to talk with Education more, and vice versa. We teach the same students; how can we be more efficient and effective?
•Take minutes and save them.

Office of Professional Services/A&S
•MUST document feedback from cooperating teachers and university supervisors to student teachers.
•Need to retain a copy of observation forms.

"New" Ideas for discussion
•Have students complete a self-evaluation of dispositions at admission.
•Set specific dates for program entry – Oct. 15/March 1.
•Are we really happy with the current observation form for student teaching?
•Collect diversity information during the partnership semester too.

Reliability
•TWS needs inter-rater reliability.
•What do unacceptable, acceptable, and target mean?
a) Have examples
b) Have notes from meetings

How is it going?
•Unit data collection
•SPA data collection
•SPA data management

Ling spoke regarding rubrics: how can you control the quality of rubrics in terms of validity? She said the smaller the group, the harder it is to demonstrate how good is good. She suggested reviewing the rubrics to specify criteria: make sure they can be observed and measured, and keep them simple. She also spoke about making sure that the performance levels have descriptions and are clearly distinguished. Rubrics must be documented. She suggested always keeping in mind that validity and reliability evidence must be demonstrated. Also, with a very small population, demonstration is at best all that can be achieved; such programs should use a more generic rubric, which is easier for the college to collect at the highest level. The question remains: do we measure what we think we measure? Documentation is key.