BarCampLondon: Mike Davies on automated accessibility testing
The sessions at BarCampLondon have just started and I'm sitting in one right now called Automated Accessibility Testing, presented by Mike Davies. The subject caught my attention as I'm very much a proponent of a Usability Approach to Accessibility, and I want to see what solutions exist on the other side of the fence, in the Checkbox Approach camp. (The two don't necessarily have to be diametrically opposed, of course.)
From what I've heard in the first few minutes, the session looks like it's going to be surprising...
Notes from the session:
Knock-on benefits of accessibility:
- Accessibility significantly improves Search Engine generated traffic (30%)
- Reduces the task completion time for people without disabilities
- Lowers the cost of maintenance
- Improves conversion rates (doubles them)
- Behind closed doors, the numbers are more spectacular
Mike's talking about his last project and the wide range of testing they did, including usability and expert testing (so they didn't just do automated testing). He's talking about how certain institutions -- for example, local governments -- rely only on automated testing (in other words, Checkbox Accessibility).
Mike's mentioned SiteMorse -- a company that does automated accessibility testing on your sites.
SiteMorse: Snakeoil or misunderstood?
- Marketing advantage
- Belligerent/aggressive
- Concerns of misselling
- Evidence of pressure sales calls
- Evidence of spam behavior
- Mike says he's biased and critical -- says SiteMorse has had a go at him publicly :)
- SiteMorse is being seen (erroneously) as a ranking for accessibility
- RNIB, rightly, see it as an alternative
- 100% compliance with automated tests != 100% compliance with the requirements
- SiteMorse: 20% of the score is made up of accessibility tests (A, AA). Correlation between SiteMorse scores and accessible web sites? A 0.46 correlation score for the FTSE 100. The local government correlation is higher: 0.7.
- Sales pitch: on 125 web sites, SiteMorse will save you 90 hours. But how much more manual testing is left? There's a lack of context to understand what this means.
- How do you measure the quality of a closed tool? Blackbox testing. Using test cases.
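To make the correlation figures above concrete: a score like 0.46 or 0.7 would typically be a Pearson correlation between the tool's scores and some manual measure of accessibility. A minimal sketch (the data below is entirely made up for illustration; I don't know what inputs SiteMorse or the study actually used):

```python
# Pearson correlation between an automated tool's scores and
# hypothetical expert-review scores for the same sites.
# All numbers here are invented for illustration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tool_scores   = [6.1, 7.4, 5.0, 8.2, 6.8]  # hypothetical SiteMorse-style scores
manual_scores = [5.5, 8.0, 4.2, 7.0, 7.5]  # hypothetical expert-review scores
print(round(pearson(tool_scores, manual_scores), 2))  # 0.84
```

A value near 1.0 would mean the tool's ranking closely tracks manual findings; 0.46 is a fairly weak relationship, which is Mike's point.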
Mike created test cases to test SiteMorse.
- Checkpoint 1.1: alternative content. How much can be automated? Images: a tool is going to miss background images applied via CSS. SiteMorse will fail you if your image doesn't have alt text, but there are other valid alternatives: a longdesc attribute, or the text equivalent may sit below the image on the page itself. Thus SiteMorse simplifies the checkpoint and may fail you even if you don't really fail Checkpoint 1.1. Mike built a test case and is showing us a table of the results. What SiteMorse reports and what is actually the case differ considerably. For example, it assumes that any object tag is Flash or a script, when it could be an image or a web page.
- Mike wants to publish his extensive data so that people can make a rational decision based on them. If people know the weaknesses of a tool, they will know how to use it.
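The blind spots Mike describes can be sketched with a toy checker. This is my own illustration of the *kind* of simplified rule an automated tool might apply, not SiteMorse's actual algorithm:

```python
# A naive "Checkpoint 1.1" scan: flag every <img> without an alt
# attribute. Toy illustration only -- not SiteMorse's real logic.
from html.parser import HTMLParser

class NaiveAltChecker(HTMLParser):
    """Collects the src of every <img> that lacks an alt attribute."""
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.failures.append(attrs.get("src", "<unknown>"))

def check(html):
    checker = NaiveAltChecker()
    checker.feed(html)
    return checker.failures

# False positive: the text alternative sits right below the image,
# so a human reviewer passes this page, but the naive rule fails it.
page = '<img src="chart.png"><p>Sales rose 30% after the redesign.</p>'
print(check(page))  # ['chart.png']

# False negative: a meaningful CSS background image produces no <img>
# markup at all, so it sails through unflagged.
page = '<div style="background-image: url(logo.png)">Home</div>'
print(check(page))  # []
```

The gap between what the rule measures (attribute present?) and what the checkpoint requires (an equivalent alternative exists?) is exactly why 100% on the automated test doesn't mean 100% compliance.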
Was a very interesting session. I look forward to reading Mike's research once it's published.