North Carolina Computer Skills Tests
CTE Summer Conference, Koury Convention Center, Greensboro
July 27, 2005
Presenter: Randy Craven, Information Technology Manager, Technical Outreach for Public Schools, NC State University

Session Purpose
This session provides an overview of the North Carolina Computer Skills Tests to be used during 2005-06, including:
• 1992 & 1998 Curriculum Tests
• 2004 Curriculum Online Test
• 2004 Curriculum Alternate Assessment

CAUTION
This session contains factual information concerning an actively developing process. Statements made represent current plans for the near future; end results may turn out to be different.

The Requirement
• In May 1991, the North Carolina State Board of Education (SBE) adopted a policy requiring that all students, beginning with the class of 2000, demonstrate computer proficiency in order to receive a high school diploma (Feature C of the Quality Assurance Program). In October 1995, the SBE modified the requirement to take effect with the graduating class of 2001.

Curricula - Tests
• 1992 Curriculum Tests for students who entered grade 8 from 1996-1997 through 1999-2000
• 1998 Curriculum Tests for students who entered grade 8 from 2000-2001 through 2004-2005
• 2004 Curriculum Tests for students who entered grade 8 from 2005-06 and beyond

One Requirement
• 1992 & 1998 Curricula – Passing two separate tests is needed to fulfill the computer proficiency requirement for students entering 8th grade during school years 1996-97 to 2004-05
  – Multiple-Choice
  – Performance
• 2004 Curriculum – Passing only one test is required

What - Tests
• 1992 Curriculum Multiple-Choice – Scored Locally
• 1992 Curriculum Performance – Scored Centrally
• 1992 Portfolio – Scored Locally
• 1998 Curriculum Multiple-Choice – Scored Locally
• 1998 Curriculum Performance – Scored Centrally
• 1998 Portfolio – Scored Locally
• 2004 Curriculum Online Combination – Scored Centrally
• 2004 Curriculum Alternate Assessment Field Test – Scored Centrally
• 2004 Curriculum Alternate Assessment – Scored Locally

When - Schedule (Now – 2005-06 School Year)
• 2004 Curriculum Alternate Field Test – ASAP at the beginning of the school year for 9th graders
• 1992 & 1998 Curriculum Tests – Early October – Mid November
• 2004 Curriculum Online Primary Window – Mid October – Mid January
• 2004 Curriculum Alternate – Early November – Mid January
• 1992 & 1998 Curriculum Tests – Early February – Mid March
• 2004 Curriculum Standards Analysis – Mid January – Mid February
• 2004 Curriculum Standards Set by SBE
• 2004 Curriculum Results Reported – Early March
• 2004 Curriculum Online & Alternate Secondary Window – Late March – Mid June

Grades Affected
• Grade 8
  – Online Test
  – Alternate Assessment
• Grade 9
  – Alternate Assessment Field Test
• Grades 9-12 (for those not yet meeting the standard)
  – 1998 Curriculum Tests
  – 1998 Curriculum Portfolio
• Students who entered grade 8 between 1996-97 and 1999-2000 (seeking a diploma, not yet meeting the standard)
  – 1992 & 1998 Curriculum Tests
  – 1992 & 1998 Curriculum Portfolio

Administration Times*
• 1992 Curriculum
  – Multiple-Choice – 105 minutes
  – Performance – 90 minutes
• 1998 Curriculum
  – Multiple-Choice – 110 minutes
  – Performance – 133 minutes
• 2004 Curriculum
  – One Online Test – Estimated 120 minutes
  – One Alternate – Time not currently known
*Note: Administration times do not include distribution of materials, printing and organizing of student printouts [performance], packaging, shipping, and other logistical activities.
More About…
• 1992 & 1998 Curricula Tests
• 2004 Curriculum Online Test
• 2004 Curriculum Alternate Test
• NCDesk
• NCRegistration
• 2004 Curriculum Test Development History
• Other Items

1992 & 1998 Test Details
• Pencil-paper-computer-printer processes
• 2 separate tests with different standards to be passed
• Multiple-Choice – Scored Locally
• Performance – Scored Centrally
• Must be given once per year to students who have not yet met the standard

Old Paper Trail
• Printing of test materials
  – Test booklets
  – Answer documents
  – Header sheets and shipping lists
• Excess ordering (~100% overage)
• Printing of student work for performance
• Handling of materials

Test Files Support
• Currently supporting 23 different software packages for the performance test
• Distribution of test files to the field
• Equity of administration is questionable
• The Online Test delivers required files to students as the test is in progress

Scoring Performance
• Currently, performance test booklets are hand scored by a scoring contractor at a central location
  – Time
    • Fall administration – approximately 2 months
    • Spring administration – approximately 1 month
  – Costs
    • High
    • Scorers, staffing, space, supplies

Scoring Performance (cont.)
• Scores are reliable and valid
• Inter-rater reliability is high [93%+]
• Reliability monitoring, qualified scorers
But…
• Potential for human error still exists
• Student work does not always provide evidence that the student used correct methods to accomplish the task

2004 Online Test Details
• Tests the adopted 2004 curriculum
• Uses NCDesk as the client interface
• Delivered through the Internet from a central testing server

Goals of the Online Test
• Merge two tests into one…
• Reduce the administration, testing, scoring, and logistical time required…
• Provide a universal delivery to increase equity for a "standardized" test…
• Eliminate costs of printing and reduce paper waste…
• Decrease the handling of so much paper…
• Maximize reliability and validity of scores…
• Return scores more efficiently and in a more timely manner…
• Keep pace with the changing face of technology, testing, and scoring…

The Online Test
• 72 items total
• 4 sections
• 18 items per section
• Items not delivered randomly
• Not divided into strand-specific sections
• Not divided into performance and multiple-choice sections

The Online Test (cont.)
• Different forms automatically delivered to each student
• Performance vs. Multiple-Choice
  – 50% performance based
  – 50% multiple-choice based
• Embedded field test items for future test development
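For orientation only, the sketch below (Java) shows one way a delivered form could be represented, assuming nothing beyond the structure stated above: 4 sections of 18 items, roughly half performance based and half multiple-choice, mixed rather than grouped. The class, field, and layout choices are illustrative assumptions, not the actual NCDesk data model.

    // Illustrative only: a hypothetical representation of one delivered form.
    // Names and layout are assumptions; this is not the actual NCDesk data model.
    public class TestFormSketch {
        enum ItemType { MULTIPLE_CHOICE, PERFORMANCE }

        static final int SECTIONS = 4;
        static final int ITEMS_PER_SECTION = 18;   // 4 x 18 = 72 items total

        public static void main(String[] args) {
            ItemType[][] form = new ItemType[SECTIONS][ITEMS_PER_SECTION];
            // Roughly half of the items are performance based and half multiple-choice,
            // and the two types are mixed rather than split into separate sections.
            for (int s = 0; s < SECTIONS; s++) {
                for (int i = 0; i < ITEMS_PER_SECTION; i++) {
                    form[s][i] = (i % 2 == 0) ? ItemType.PERFORMANCE : ItemType.MULTIPLE_CHOICE;
                }
            }
            System.out.println("Sections: " + form.length + ", items per section: " + form[0].length);
        }
    }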
NCDesk Client
• NCDesk Integrated Java Applications Suite
  – Delivery of multiple-choice items
  – Delivery of performance items within the testing environment
  – Includes:
    • editor/word processing application
    • database application
    • spreadsheet application
    • e-mail composer application
    • window management application

Online Test Environment
• Secure
  – Encrypted
  – Save function disabled in NCDesk during the test
• Self-contained
  – No "surfing" of the Internet within NCDesk
  – No "cut, copy, paste" functionality outside of the test
• Data Warehousing
  – Student responses are stored on the server when moving between sections and questions
  – Allows for recovery of the test and data if workstations crash or other technical problems are encountered

Online Test Screen
• Restore, flag control, and position text
• Item stem
• Foils [multiple-choice] or application [performance]
  – Separate scroll bars for each
• Navigation buttons
• Pause button

Online Test Navigation
• Section number identified on each item page [i.e., Section 1]
• Item number within the section identified on each item page [i.e., Question 2 of 18]

Online Test Controls
• Restore
  – Clears the item of changes and restores it to its original format
• Flag
  – Identifies the item on the navigation bar at section end with a red question mark as an indicator that the student may need to revisit it prior to exiting the section
  – The student can still exit a section if items are flagged

Online Test Navigation (cont.)
• Students can move forward and backward within a section
• Returning to a section is not permitted once the section is completed
• Navigational buttons [PREV (previous), NEXT]
• Linear movement backwards or forwards within a section

Online Test Functions
• End Section
  – Links to the section summary page
• Pause
  – Pauses the test at the immediate location for recovery without exiting the test environment
  – 15-minute timeout – the test administrator will have to log the student back into the test if it is paused longer than 15 minutes
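The data warehousing and pause behavior described above suggest a simple client-side pattern: post the student's responses to the central server at each navigation step, and track how long a pause has lasted so that an administrator re-login can be required after 15 minutes. The sketch below (Java) uses hypothetical class and method names and stubs out the network call; it is not NCDesk source code.

    // A minimal sketch of the checkpoint-and-pause pattern described above.
    // Class, method, and field names are hypothetical, not the actual NCDesk code.
    import java.util.concurrent.TimeUnit;

    public class TestSessionSketch {
        private static final long PAUSE_TIMEOUT_MS = TimeUnit.MINUTES.toMillis(15);
        private long pausedAt = -1;

        // Called whenever the student moves between questions or sections:
        // the current response is sent to the central server so the test can be
        // recovered if the workstation crashes.
        void onNavigate(String itemId, String response) {
            sendToCentralServer(itemId, response);
        }

        void pause() {
            pausedAt = System.currentTimeMillis();
        }

        // If the pause lasted longer than 15 minutes, the test administrator must
        // log the student back in before the test can resume.
        boolean resumeRequiresAdministratorLogin() {
            return pausedAt >= 0 && System.currentTimeMillis() - pausedAt > PAUSE_TIMEOUT_MS;
        }

        private void sendToCentralServer(String itemId, String response) {
            // Placeholder for the encrypted upload to the central testing server.
        }

        public static void main(String[] args) {
            TestSessionSketch session = new TestSessionSketch();
            session.onNavigate("item-1", "sample response");
            session.pause();
            System.out.println("Administrator login required to resume: "
                    + session.resumeRequiresAdministratorLogin());
        }
    }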
Online Test Navigation – Section Summary
• Navigation bar at the end of each section
• Non-linear movement to any item in the section
• Displays which items have been flagged
• Displays which items have been answered
• Moving to the next section displays a warning

Online Test Development History
• Feasibility Study/Trial – Fall 2003
• Feasibility Study/Trial – Fall 2004
• Field Test – Spring 2005

Feasibility Studies
• Conducted research into the feasibility of delivering an Internet-based test environment
  – Fall 2003
  – Fall 2004
• Conducted research into the performance of local and central technology during delivery
• Conducted research into the overall performance of the test environment and applications
• Conducted research into item performance within the test environment and applications
• Received feedback and implemented debugging, redevelopment, or new development

Feasibility Studies (cont.)
• Fall 2003
  – Volunteer sites
  – Adults only
  – 1,926 starts; 1,351 finishes
  – 62 LEAs represented, 193 schools
• Fall 2004
  – At minimum, 10 locally chosen students per school containing eighth-grade students
  – 5,620 starts; 4,783 finishes

Field Testing
• Spring 2005
• Continued research into the overall performance of the test environment and applications
• Conducted research into item performance
• Data used to construct operational form(s)
• Receiving feedback and implementing debugging or redevelopment where needed
  – Note: implementation of a change can only occur where it does not affect the performance of an item

Spring 2005 Field Test
• Sampled population of schools and students
• Window: April 11 – June 15
• 8,510 students chosen for the sample
• 6,361 starts; 6,198 finishes
• Alternate Assessment also field tested
  – Window: May 9 – June 15
  – 2,000 students chosen for the sample

Computer Skills Alternate Assessment
• Results of the feasibility studies and Federal mandates required development of an alternate assessment instrument for two distinct populations:
  – Students with special needs who could not access the online test using available accommodations
  – Students who could not access the online test as a result of technical/technology limitations [i.e., unable to meet minimum requirements for bandwidth, computer, etc.]

Computer Skills Alternate Assessment (cont.)
• Field tested Spring 2005
  – ~2,000 students sampled
  – 1,338 total documents processed
  – 1,323 contained multiple-choice data
  – 765 contained performance data
• Different delivery from the online test, but equal rigor of standard [item difficulty level, thinking skills, etc.]
• One test consisting of two distinct sections

Computer Skills Alternate Assessment (cont.)
• Multiple-Choice Section
  – 36 items
  – Traditional
• Performance Section
  – 27 total items
    • 26 performance-based, administrator-rated [yes or no] items
    • 1 administrator-rated [yes or no] item evaluating student proficiency with the computer over the course of time
  – Computer-based
  – Individualized administration
  – Use supplied files and local applications [i.e., word processing, database, etc.] to complete the tasks required by the items
  – Files provided in text format for conversion into local applications [PDFs provided to serve as blueprints]
Computer Skills Alternate Assessment (cont.)
• Item performance, results, and feedback are being analyzed at this time
• Highly likely to be field tested again in early Fall 2005 with 9th graders
• Test files will be provided for State-supported platforms/packages in future administrations

NCDesk
• Center of the Universe
  – Test
    • Access the test at the log-in page [School code, User name, Password]
  – Test Simulation
    • Practice activity to simulate the real test environment
  – Verify Connection
    • Runs a test to verify whether a secure connection to the test server is established
  – Documentation
    • Links to the website for information, updates, etc.
  – Applications
    • Access to all applications integrated into the test environment for use and familiarization

Technical
• NCDesk is a locally installed client Java application
  – Client computers must have a Java runtime installed
• A quality Internet connection is required for accessing the test environment
  – An Internet connection is not required for the NCDesk applications when used for learning and practice
• NCDesk communicates with a central server for testing [not hosted locally]
• Auto-update system checks for the current NCDesk version
• Sufficient RAM recommended
• CPU of good clock speed and recent vintage recommended
• Minimum amount of drive space available
• Sufficient amount of bandwidth required during testing
• Best resource for technical recommendations
  – http://ncdesk.ncsu.edu/ncdesk/technote.asp

Technical Notes – Proposed Client Computer Requirements
Special Note: Client computer systems running the minimum 128 MB RAM need to reduce the number of background applications running when trying to use NCDesk. Background applications consume memory resources that can become critically low when other applications are running. These types of applications include hidden applications, system inits (Macintosh), and system tray applications (Windows). The following proposed client computer requirements are posted with the assumption that currently active background applications are at a minimum.

Supported | Platform | Minimum Java Runtime | Oldest OS Version | Minimum Processor | Minimum Free Hard Drive Space | Minimum RAM
No  | Microsoft Windows 95        | NA | NA | NA | NA | NA
Yes | Microsoft Windows 98        | Java 2 Runtime Environment (JRE) Version 1.4.2_06 | 1st Edition | Pentium 166 MHz | 57 MB | 128 MB
Yes | Microsoft Windows Me        | Java 2 Runtime Environment (JRE) Version 1.4.2_06 | Me | Pentium 200 MHz | 57 MB | 128 MB
No  | Microsoft Windows NT        | NA | NA | NA | NA | NA
Yes | Microsoft Windows 2000 Pro  | Java 2 Runtime Environment (JRE) Version 1.4.2_06 | SP3 | Pentium 233 MHz | 57 MB | 256 MB
Yes | Microsoft Windows XP        | Java 2 Runtime Environment (JRE) Version 1.4.2_06 | SP1 | Pentium 200 MHz | 70 MB | 256 MB
No  | Macintosh "Classic"         | NA | NA | NA | NA | NA
Yes | Macintosh OS X (Jaguar)     | Java 2 Runtime Environment Version 1.4.1 Update 1 | OS X 10.2.x | PowerPC | 26 MB | 128 MB
Yes | Macintosh OS X (Panther)    | Java 2 Runtime Environment Version 1.4.2 Update 2 (based on 1.4.2_05 SDK) | OS X 10.3.3 | PowerPC | 26 MB | 128 MB
Yes | Macintosh OS X (Tiger)      | Java 2 Runtime Environment Version TBA | OS X 10.4 | PowerPC | TBA | TBA
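As a rough illustration of what a local technology coordinator might run before testing, the sketch below (Java) reports the installed Java version and JVM memory and attempts to open a secure connection. It is not the NCDesk "Verify Connection" feature itself; the URL is a placeholder, and the checks simply echo the proposed minimums above.

    // A minimal pre-testing environment check, offered as an assumption-laden sketch.
    // It is NOT the NCDesk "Verify Connection" feature; the URL below is a placeholder.
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ClientCheckSketch {
        public static void main(String[] args) throws Exception {
            // Installed Java runtime version (the proposed minimums call for a 1.4.x JRE).
            System.out.println("Java version: " + System.getProperty("java.version"));

            // Memory available to the JVM; the proposed minimums start at 128 MB of system RAM.
            long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
            System.out.println("Max JVM memory: " + maxMb + " MB");

            // Can a secure connection be opened at all? Replace the placeholder URL with
            // the address your testing coordinator provides.
            URL url = new URL("https://example.org/");   // placeholder, not the test server
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(10000);
            conn.connect();
            System.out.println("Connection response code: " + conn.getResponseCode());
            conn.disconnect();
        }
    }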
NCRegistration
A test registration system (NCRegistration) is being developed to work in conjunction with any of the online testing programs being developed or offered to public schools in North Carolina. School systems and schools will use the system to indicate students who are eligible for testing within a testing window and to schedule test administrations within predefined testing sessions for groups of students.

NCRegistration Functions
• Administrative – User access rights functions. Users are state-level administrators, regional-level administrators, local district test coordinators, school test coordinators, test administrators, and possibly teachers.
• Bulk Registration – Function to allow bulk file uploads of student records to register large groups of students to a testing window (see the sketch below)
• Single Registration – Function to allow registration of a single student to a testing window
• Test Session Scheduling – Indicating numbers of students at a school per test administration
• Student Information Questions (SIQ) – Additional data collection process
• Reports – Test results
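The bulk registration upload format is not published in this presentation, so the following is only a sketch under assumptions (Java): it supposes a comma-separated file of student ID, last name, first name, school code, and testing window, with hypothetical names throughout. Whatever layout NCRegistration ultimately requires will be specified in its own documentation.

    // Assumption-laden sketch of reading a bulk registration file.
    // The comma-separated layout (studentId,lastName,firstName,schoolCode,testingWindow)
    // is hypothetical; NCRegistration's real upload format is defined elsewhere.
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.ArrayList;
    import java.util.List;

    public class BulkRegistrationSketch {
        public static void main(String[] args) throws Exception {
            List<String[]> records = new ArrayList<String[]>();
            BufferedReader in = new BufferedReader(new FileReader("students.csv")); // hypothetical file
            String line;
            while ((line = in.readLine()) != null) {
                if (line.trim().length() == 0) continue;   // skip blank lines
                String[] fields = line.split(",");
                if (fields.length != 5) {
                    System.err.println("Skipping malformed record: " + line);
                    continue;
                }
                records.add(fields);
            }
            in.close();
            System.out.println(records.size() + " student records ready to register.");
        }
    }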
Other Items
• Current Activities
• Accessibility Issues
• Helpdesk
• Mobile Labs
• Future Plans
• Necessities for Success
• Web Site Reference
• Contact Information

Current Activities
• Analysis of field test data
  – Item performance, results, feedback from the field
• Development and implementation of scoring parameters for items
• Analysis of technical issues arising during field testing
  – Ongoing debugging and development of the technology and test environment
• Creation of operational form(s) based on analysis of field test data
• Development of new items [item writing] for embedding in the future
• Additions to and testing of NCDesk & NCRegistration

Accessibility Issues
• There are definite accessibility issues with online testing!
• Standard accommodations are still available
• Choice of large or regular font size for NCDesk
• Keyboard and mouse actions are functional
• Currently developing the ability to integrate and support assistive technology [i.e., screen readers - JAWS]
• Exploring multiple options for accessibility [zoom functions, etc.]
• Implementation of additional assistive technology is likely to be an extended process

Helpdesk
• Activated for the feasibility studies/trials and field testing, and will be available for the operational administration
• Assistance provided prior to, during, and after testing
• Addresses NCRegistration, NCDesk, the Computer Skills Alternate Assessment, and any other issues involved in delivery and implementation of the online test
• http://cskills.ncsu.edu/ncdesk/helpdesk.asp
• Email cskills@ncsu.edu to request assistance

Mobile Labs
• Available for schools/systems unable to test because of technical limitations
• By request only [actual process for requests still in development]
• Availability issues and division of time

The Future
• Online testing is the future
  – Most states are in the process of either implementing or maintaining an online testing program
  – North Carolina is moving forward with online testing…this is only the beginning
• Students are more positive about online testing than administrators/teachers/staff
  – Trends suggest students are more comfortable and engaged with online testing…overwhelming support from them
• Technology concerns are warranted, but…
  – Implementation of the technology will become seamless over time as traditional options for testing are exhausted
  – Systems/schools have been successful in implementing this test

Necessities for Success
• Dissemination and sharing of information
  – Local, State, National, International
  – Use the resources available and act as a resource
• Question
  – Online testing is a new world, so do not be afraid to question things or offer your opinion
• Support
  – There will be some growing pains, but never waver in your support
  – Support at all levels, between all divisions and peoples, is absolutely required
• Learn
  – Do not be complacent in your knowledge; always seek more

Questions???

Web Sites Reference
• http://cskills.ncsu.edu – Link to the home page of the North Carolina Online Test of Computer Skills
• http://ncdesk.ncsu.edu – Direct link to the home page for the NCDesk application suite
• http://www.ncpublicschools.org/curriculum/computerskills – Link to the Computer/Technology Skills Standard Course of Study on the North Carolina Department of Public Instruction website
• http://community.learnnc.org/dpi/tech – Link to the Computer/Technology Skills page for Curriculum and School Reform on the North Carolina Department of Public Instruction website
• http://www.ncpublicschools.org/accountability/testing/computerskills – Link to computer skills testing information on the North Carolina Department of Public Instruction website
• http://tps.dpi.state.nc.us/ – Link to the Technology Implementation & Planning Services page on the North Carolina Department of Public Instruction website
• http://www.ncpublicschools.org/techservices – Link to the Technology Services page on the North Carolina Department of Public Instruction website

Contact Information
Scott Ragsdale – Project Manager, North Carolina Computer Skills Assessments – scott_ragsdale@ncsu.edu
Randy Craven – Technical Manager – randy_craven@ncsu.edu
Jim Kroening – Program Manager, Performance Assessments – jkroening@dpi.state.nc.us