Assessment of Web-based
Education Tools: Are your
Students Learning Through
Technology?
Andrea Irby
Undergraduate Academic Programs
North Carolina State University
Background
NC State is a large, public, Research I institution.
19,000 undergraduates; 9,000 graduate students.
Under pressure in the late 1990s to grow
Possible expansion of Distance Education
Virtual Advising Center created 1999
Why Technology?
Enables us to transcend many barriers
Physical campus
Race
Country, nationality
“Saving Face”
Student development theory: meet them where they are
Enables us to be more effective and efficient, especially in times of constrained resources
Types of Information Exchange
Synchronous vs. Asynchronous
Data - ways of expressing things
Information - the arrangement of data into
meaningful patterns
Knowledge - the application and productive use of
information
Information versus Knowledge
Technology in Education
Can transform key relationships
Personalization: Targeting the student vs. the
department
Engaging the student: making applications interactive
Student Decision Support: Information becomes
knowledge
Create community: cultivate student relationships
Carl Jacobson, 2000
Engaging Students On-Line
Web-sites
On-line Course Management Systems
Email/Listservs
On-line Chat or Instant Messenger-type tools
Bulletin Boards and Blogs
Social Networking sites
“Students will gravitate toward the technology medium that best
meets their needs”
Karen Thurmond, University of Memphis
Focus on Web-sites
2nd Generation Web-sites: information based
3rd Generation Web-sites: interactive, knowledge based
How do we know if/how many students come to our web-site?
Are they able to find the information they need?
Is it clear and meaningful to them?
Do they receive our intended information?
How to Know
Log Data Analysis (data mining)
Usability Studies
Accessibility Testing
Log Data Analysis
A log entry is created each time a person visits a web page.
Analytic tools then either pull the log data from the server that hosts the web-site or read it from code embedded in each web page (a minimal parsing sketch follows the sample tools below).
Sample tools
Google Analytics (reads embedded page code), free
www.google.com/analytics
SAS Web Analytics (pulls from server)
www.sas.com/solutions/webanalytics
TopShareWare (pulls from server)
www.topshareware.com/LogSuite-download38225.htm
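As a rough illustration of the "pulls from server" approach, here is a minimal Python sketch that reads an Apache-style combined access log and turns each hit into a structured record. The file name access.log and the exact field layout are assumptions for illustration, not details of the Virtual Advising Center setup.

```python
# Minimal sketch: parse an Apache/NCSA "combined" access log (assumed
# format) so the raw hits can be mined for the reports described next.
import re
from datetime import datetime

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_log(path="access.log"):
    """Yield one dict per hit: visitor IP, timestamp, page, browser."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m:
                continue  # skip malformed lines rather than failing
            rec = m.groupdict()
            # timestamps look like "10/Oct/2006:13:55:36 -0400"
            rec["time"] = datetime.strptime(rec["time"], "%d/%b/%Y:%H:%M:%S %z")
            yield rec

if __name__ == "__main__":
    hits = list(parse_log())
    print(f"{len(hits)} hits from {len({h['ip'] for h in hits})} distinct IPs")
```

Hosted tools such as Google Analytics skip this step by reading a snippet embedded in each page instead of the server's log files.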
Easy to Read Reports
(in real life)
Information Received
Visitors/Users
Repeat Visitors/Cached Visitors
How long on site
Traffic by time of day and month
Where enter, exit, or drop out of site
Browsers, operating systems
Where they are logging in from
Campaign/Action Tracking
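To show how raw hits become these report figures, here is a minimal sketch that aggregates parsed hit records into visitors, repeat visitors, traffic by time of day and weekday, and entry/exit pages. The tiny inline sample data is invented purely for illustration; in practice the records would come from a log parser or analytics tool.

```python
# Minimal sketch: aggregate parsed hits into the kinds of figures a
# web analytics report presents (sample data invented for illustration).
from collections import Counter
from datetime import datetime

hits = [  # each hit: visitor IP, timestamp, page requested
    {"ip": "10.0.0.1", "time": datetime(2006, 10, 9, 9, 15), "path": "/"},
    {"ip": "10.0.0.1", "time": datetime(2006, 10, 9, 9, 18), "path": "/advising/faq"},
    {"ip": "10.0.0.2", "time": datetime(2006, 10, 9, 14, 2), "path": "/"},
    {"ip": "10.0.0.1", "time": datetime(2006, 10, 10, 9, 5), "path": "/advising/contact"},
]

hits_per_ip = Counter(h["ip"] for h in hits)
visitors = len(hits_per_ip)                         # distinct visitors (by IP)
repeat_visitors = sum(1 for n in hits_per_ip.values() if n > 1)
by_hour = Counter(h["time"].hour for h in hits)     # traffic by time of day
by_weekday = Counter(h["time"].strftime("%A") for h in hits)

# Entry and exit pages: first and last page seen for each visitor.
paths_by_ip = {}
for h in sorted(hits, key=lambda h: h["time"]):
    paths_by_ip.setdefault(h["ip"], []).append(h["path"])
entry_pages = Counter(p[0] for p in paths_by_ip.values())
exit_pages = Counter(p[-1] for p in paths_by_ip.values())

print(f"hits={len(hits)} visitors={visitors} repeat visitors={repeat_visitors}")
print("busiest hours:", by_hour.most_common(2))
print("busiest days:", by_weekday.most_common(2))
print("top entry pages:", entry_pages.most_common(2))
print("top exit pages:", exit_pages.most_common(2))
```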
Virtual Advising Center data mining
2000: 72,419 annual visitors; 26,795 repeat visitors; 176 avg. visitors per day; 1,021,947 hits to site
2006: 285,053 annual visitors; 108,320 repeat visitors; 780 avg. visitors per day; 8,269,215 hits annually
Virtual Advising Center data mining
Monday through Thursday are the busiest days.
Visitors used to spend an average of 2 minutes on the site; now about 5.
Paths in and out; pages viewed.
IP locations: who's accessing the site on campus, off campus, and around the world.
Heaviest traffic times per day.
How we used this information
Sent surveys out on busiest days
Used busiest times of day to determine staff
coverage for chat
Used IP address/location information to help IT
staff on campus place kiosks in high demand
locations
To enhance relationships with local feeder
community colleges
Usability Testing
Observed “Focus” Groups for web-sites
Real data versus self-reported data
Examines how people use some human-made object
Controlled experiment
Small, cross section of users
Trained users or “Hallway” users
Usability Testing
Pre-test questionnaires gather input on the subject's familiarity with the web-site
Actual test involves a series of short, simple tasks
the subject is asked to complete
Observations are recorded by “tester”
Patterns in how the subject completes each task are noted
Emotional responses are key to watch
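One simple way to keep those observations consistent across sessions is a small note-taking script for the tester. The sketch below is only an assumed workflow with hypothetical sample tasks, not the protocol used for the Virtual Advising Center.

```python
# Minimal sketch: time each usability task and capture the tester's
# observations in a structured record (tasks and fields are hypothetical).
import time
from dataclasses import dataclass, field

@dataclass
class TaskResult:
    task: str
    seconds: float
    completed: bool
    notes: list[str] = field(default_factory=list)

def run_task(description):
    """Time one task; the tester records the outcome and observations."""
    print(f"TASK: {description}")
    start = time.monotonic()
    input("Press Enter when the participant finishes or gives up... ")
    elapsed = time.monotonic() - start
    completed = input("Completed successfully? (y/n) ").strip().lower() == "y"
    notes = input("Observations (hesitation, wrong paths, reactions): ")
    return TaskResult(description, elapsed, completed, [notes])

if __name__ == "__main__":
    tasks = [  # short, simple tasks, as described above (samples only)
        "Find the deadline to drop a course.",
        "Locate contact information for your adviser.",
    ]
    for result in [run_task(t) for t in tasks]:
        status = "ok" if result.completed else "FAILED"
        print(f"{status:6} {result.seconds:5.1f}s  {result.task}")
```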
Set up
Location can be:
In a computer lab with multiple teams
In a usability lab
At your own computer desk
Usability Testing
Post-tests record overall impressions of the web-site and experience; reflection
Very Informative Web-site on Usability Testing:
www.usability.gov
If testing uncovers difficulties, redesign
Test in small groups and often
For the Virtual Advising Center, it's how we keep students at the forefront of our design
Accessibility
Inclusive web-communities allow people with
diverse abilities to be engaged in on-line
learning
Web-Content Accessibility Guidelines
http://www.w3.org/TR/1999/WAI-WEBCONTENT-19990505/#def-checkpoint
3 Priorities for Website Accessibility
Accessibility
Series of Checkpoints
Priority 1: must satisfy these checkpoints, or one or more groups will find it impossible to access and use the information
Priority 2: should satisfy these design elements
Priority 3: may satisfy these design elements
Results in A, AA, AAA website ratings
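Parts of the Priority 1 review can be automated before any manual testing. The sketch below illustrates one such check, flagging img tags with no alt attribute (a text-equivalent issue under WCAG 1.0 checkpoint 1.1); the sample HTML is invented, and a real audit would pair checks like this with screen-reader testing.

```python
# Minimal sketch: flag images that have no alt attribute at all,
# one automatable piece of a Priority 1 accessibility review.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if "alt" not in attr_dict:   # no text equivalent provided
                self.problems.append(attr_dict.get("src", "<unknown src>"))

def check_page(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.problems

if __name__ == "__main__":
    sample = '<img src="logo.gif"><img src="map.png" alt="Campus map">'
    for src in check_page(sample):
        print("Priority 1 issue: image without alt text:", src)
```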
Accessibility
Test your site with on-line tools
Test your site with screen readers (e.g., JAWS)
On-line web resources to help you:
http://atrc.utoronto.ca/
http://ncsu.edu/it/access/development/software/tools.php
http://www.accessible.org/bobby-approved.html
Evaluating Learning
Web-site must be interactive
tests, essays, portfolios, chats, email
Determine what we want students to learn
Get student input
Variety of ways: focus groups, checklists, rubrics, and content analysis
Have more than one staff member working on
assessment
What we’ve tried and learned
Focus groups: what we may see as learning, they see as “help.”
Over time, they may recognize what they have learned, which raises the question of when to assess.
Electronic communication trails are rich with
evidence of learning.
Students are readmitting, graduating, learning about
themselves, picking majors, staying, expanding their
academic portfolios.
Evaluating Learning
Need to develop a rubric for evaluating rich content (a simple content-analysis sketch follows below).
George Mason Business School example
Databases/Microsoft Access tool
Technology is a tool: one through which students can learn, and one that helps us with assessment.
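As a starting point for such a rubric, a simple keyword pass over e-mail or chat transcripts can flag messages that deserve a closer human read. The rubric categories, phrases, and sample messages below are illustrative assumptions, not the actual instrument used with the Virtual Advising Center.

```python
# Minimal sketch: keyword-based first pass over advising messages,
# tallying hypothetical rubric categories for later human review.
from collections import Counter

RUBRIC = {  # category -> indicative phrases (hypothetical)
    "choosing a major": ["major", "curriculum", "change of major"],
    "academic planning": ["schedule", "prerequisite", "degree audit"],
    "self-knowledge": ["i realized", "i learned", "my strengths"],
}

def score_message(text):
    """Return the rubric categories whose phrases appear in one message."""
    text = text.lower()
    found = Counter()
    for category, phrases in RUBRIC.items():
        if any(p in text for p in phrases):
            found[category] += 1
    return found

if __name__ == "__main__":
    messages = [
        "I realized I want a change of major after reviewing my degree audit.",
        "Which prerequisite do I need before I can schedule CH 101?",
    ]
    totals = Counter()
    for m in messages:
        totals += score_message(m)
    for category, count in totals.most_common():
        print(f"{category}: {count} message(s)")
```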
Discussion? Questions?