Report on the Web Accessibility Audit of Selected University of Guelph Websites Using Screen Reading Software

Summary:

This report was commissioned by the University of Guelph's Information Services Council (ISC) and jointly funded by Communications and Public Affairs, the Human Rights and Equity Office and the Office of the CIO. In keeping with ISC's web accessibility plan, the objective of the report was to assess the accessibility and usability of key campus websites through hands-on screen reader testing and to provide a detailed summary of the findings.

The evaluation of the websites listed in the report was conducted during the summer of 2007. The campus web environment is constantly changing, however: some sites were being redesigned at the time of the audit and others have since been redesigned. While already out of date in some instances, the information gathered by the audit adds to our understanding of the state of campus web accessibility. It will also provide campus web designers with valuable feedback on the accessibility of their design practices; even those who have since revamped their websites may benefit from reviewing the accessibility of the previous version.

Method:

Our web audit team consisted of two members: a former Guelph student with a visual impairment who is an experienced screen reader user, and a sighted student who provides IT support on campus and has knowledge of web design. The two worked in tandem, using the screen reader as a tool for testing web pages and detecting instances of inaccessible or non-user-friendly web design. These instances included web content that:

- did not meet the World Wide Web Consortium's Web Content Accessibility Guidelines 1.0 Priority 1 checkpoints (see the Checklist of Checkpoints for Web Content Accessibility Guidelines 1.0, http://www.w3.org/TR/WCAG10/fullchecklist.html);
- was available to a sighted user but could not be perceived or accessed by an individual using a screen reader;
- was unnecessarily difficult, time-consuming or confusing for an individual using a screen reader to access.

The value of this pairing is that the sighted team member can help assess whether the person accessing a website with a screen reader is getting information equivalent to what a sighted visitor receives.

The students were given roughly 60 hours to complete the audit, which was conducted during June and early July 2007. They examined each site using JAWS version 7.x screen reading software in combination with Internet Explorer version 6, approaching each website from the perspective of students seeking relevant information or services.

It should be noted that the time available to interact with the websites was limited, so only the main functions or services of each page were checked. Because of these time constraints, the web audit team may not have accurately identified some web accessibility issues or, alternatively, may not have discovered all the accessibility issues in the websites audited. We would therefore encourage campus web designers to familiarize themselves with the basic tenets and tools of accessible web design and incorporate them into their work; the responsibility for ensuring that the web environment of the University of Guelph is accessible to all is a shared one.
Websites audited:

The following websites, including associated sites, were audited:

- Academic Calendars
- Arboretum
- Computing and Communications Services
- Hospitality Services
- WebAdvisor
- Student Financial Services
- Athletics & Recreation
- Central Students Association
- Graduate Students Association
- Student Affairs
- College of Arts
- College of Management and Economics
- College of Social & Applied Human Sciences
- College of Physical & Engineering Science
- College of Biological Science
- Ontario Agricultural College
- Ontario Veterinary College
- Human Resources
- Human Rights and Equity Office
- Macdonald Stewart Art Centre
- University of Guelph beta homepage
- University of Guelph-Humber
- My Portico
- Library beta website
- Open Learning
- WebCT
- Wellness Centre

The web audit team recorded the accessibility issues they encountered and rated their severity according to the categories below.

Barrier Legend:

- Serious: poses a barrier to accessing information
- Moderate: makes information more difficult to access
- Minor: makes information more difficult to access but is easily corrected
- Usability: could be improved for all users

Findings*:

Total number of websites audited: 80
Total number of websites with accessibility issues: 47
Percentage of websites with accessibility issues: 58.7
Percentage of websites without discernible problems: 41.3

Total number of websites with serious accessibility issues: 17
Percentage of websites with serious issues: 21.3
Total number of websites with moderate or minor accessibility issues: 16
Percentage of websites with moderate or minor issues: 20
Total number of websites with minor accessibility issues only: 14
Percentage of websites with minor issues only: 17.5

Total number of issues identified: 111

Number of serious issues: 20
Percentage of issues that were serious: 18.0
Examples: Serious issues were mainly characterized by web content, objects or applications that were inaccessible to keyboard commands (e.g., JavaScript 'mouse-over' menus) or were presented in a strictly visual format (e.g., image-only PDF documents).

Number of moderate issues: 33
Percentage of issues that were moderate: 29.7
Examples: Moderate issues were characterized by design flaws that made finding information on the page difficult (e.g., misuse of headings, overly complicated tables, lack of 'skip-nav' links, missing labels on edit and drop-down boxes, etc.).

Number of minor issues: 47
Percentage of issues that were minor: 42.3
Examples: Minor issues also made finding information on a webpage difficult but are easily corrected, typically consisting of missing alternate ('alt') text for images and graphical links, or poorly identified links (e.g., "click here").

Number of usability issues: 11
Percentage of issues that were usability based: 9.9
Examples: Usability issues were characterized by errors (e.g., broken links), missing information or unintuitive page layout.

Hypothetical markup sketches illustrating the serious, moderate and minor categories follow these findings.

* These numbers are approximate, since the audit team may not have been able to find all the accessibility issues on the sites they audited.
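To make the serious category concrete, the sketch below contrasts the two menu patterns described above. It is a minimal, hypothetical example rather than markup taken from any audited site, and the showMenu/hideMenu functions and file names are invented placeholders:

    <!-- Serious barrier (hypothetical): the submenu opens only on
         mouse-over, so a keyboard or screen reader user can never
         reach the links inside it. -->
    <div onmouseover="showMenu('depts')" onmouseout="hideMenu('depts')">
      Departments
    </div>

    <!-- Keyboard-accessible alternative: a real link that also reacts
         to focus and blur, so the menu opens without a mouse, and the
         link still leads somewhere if scripting is unavailable. -->
    <a href="departments.html"
       onmouseover="showMenu('depts')" onfocus="showMenu('depts')"
       onmouseout="hideMenu('depts')" onblur="hideMenu('depts')">
      Departments
    </a>

Pairing each mouse handler with a keyboard equivalent reflects WCAG 1.0's guidance on device-independent event handlers. The other serious pattern noted, image-only PDF documents, requires an HTML or text equivalent rather than a markup fix.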
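The moderate defects are likewise easiest to see in markup. The hypothetical sketch below shows an unlabeled edit box and a page without a skip-nav link, each followed by a straightforward correction; all names are invented for illustration:

    <!-- Moderate (hypothetical): an unlabeled edit box. JAWS announces
         only "edit", leaving the user to guess what the field is for. -->
    <input type="text" name="query">

    <!-- Corrected: an explicit label gives the field a spoken name. -->
    <label for="query">Search the site:</label>
    <input type="text" name="query" id="query">

    <!-- Moderate: without a skip link, a screen reader user must listen
         to the entire navigation list on every page before reaching the
         content. A skip link placed first in the page avoids this. -->
    <a href="#content">Skip to main content</a>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="admissions.html">Admissions</a></li>
    </ul>
    <h1 id="content">Page content begins here</h1>

Heading misuse is corrected in the same spirit: marking section titles with real <h1>-<h6> elements lets JAWS users jump between sections instead of reading the page linearly.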
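Minor issues are the cheapest to fix, which is why the audit rated them separately. A hypothetical before-and-after sketch:

    <!-- Minor (hypothetical): a graphical link with no alt text; JAWS
         falls back to reading the image file name aloud. -->
    <a href="map.html"><img src="map.gif"></a>
    <!-- Corrected: alt text gives the link a meaningful spoken name. -->
    <a href="map.html"><img src="map.gif" alt="Campus map"></a>

    <!-- Minor: link text that is meaningless when read out of context. -->
    <a href="calendar.html">click here</a>
    <!-- Corrected: descriptive text identifies the destination. -->
    <a href="calendar.html">Undergraduate Calendar</a>

Because JAWS users often browse by pulling up a list of every link on a page, each link needs to make sense on its own; "click here" repeated ten times conveys nothing.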
Interpretation:

On the basis of the rough sampling above, the accessibility picture that emerges from the web audit is a positive one: 41.3% of the 80 websites checked had no discernible accessibility issues, and of the 47 websites that did have accessibility issues, only 17 (21.3%) contained serious barriers. This is also reflected in the range of accessibility issues encountered by the web audit team: of the 111 accessibility issues documented in the report, only 20 (18.0%) were considered serious, while the largest share (47, or 42.3%) were minor.

The report does contain some caveats, however. The audit team noted that sites like the Library link to many third-party websites, which may not be designed with the needs of individuals with disabilities in mind; this can result in a mixed accessibility experience. We also need to continue to ensure that online learning systems such as Blackboard, My Portico and Open Learning are accessible, given the increasing role they play in the campus learning environment.

Application:

There is consensus amongst the report's commissioners that it should be used to encourage the owners of the websites identified to correct any documented accessibility issues that are still outstanding. Correction of these issues would be noted in the final publication of the full report. Additionally, since the report gives a good overview of the kinds of barriers that web designers inadvertently create, it could also be used to identify the topics addressed in any campus educational initiatives on web accessibility.

Screen Reader Web Audit Team: Ed Granados, Carin Headrick
Project Manager: Gerrit Bos
Project Coordinator: Athol Gow