I just caught the last of Shawn Henry’s SXSW panel. Key takeaway: there are white areas of practices that are clearly good for accessibility and black areas of practices that are clearly bad for accessibility; avoid worrying about the gray area in the middle. She mentioned the ability of web accessibility experts to endlessly debate the ins and outs of alt text. For example:
These discussions are helpful and essential for establishing best practices. However, they become harmful to the extent that a developer gets tied up arguing about “gray areas” instead of building accessible content.
I’m headed to Austin tomorrow. Here is a list of some of the presentations focused specifically on accessibility:
In addition, there are sessions on internationalization, web standards, and the ongoing browser wars. Audio from each session will be recorded and, at some point, made available on the website.
Any sessions that I missed? I hope to see some of you there, you can get in touch with me here.
Via Accessites.org, a summary of a 21-page report from AbilityNet on the accessibility of social networks. The report analyzes Facebook, MySpace, YouTube, Yahoo and Bebo and finds that they all stink. Of those five sites, Yahoo was the only one to receive a two-star rating, and that’s two stars out of five. The other four all received one star, indicating they are ‘very inaccessible’.
The Dataportability.org group has been in the news of late as a number of major social networks have become involved. While its stated mission is not to make social networks accessible for users with disabilities, the goal it is working toward should have that overall effect over time. The group promotes the use of existing open standards to facilitate sharing social network user content outside of that network. Over time, this will open the door to the creation of more accessible interfaces for at least some of the data that is currently available only through proprietary, inaccessible systems.
Hopefully, the process of making user content more readily available will also lead to improvements in accessibility along the way. In the meantime, are users with disabilities being left out of the social networking revolution? Is anyone doing this right?
I first heard of Livescribe from an announcement of their upcoming ‘smartpen’ and noted that a former professor of mine, Andy Van Schaack, PhD, was involved in the project. A few weeks ago, I was able to attend a presentation from Dr. Van Schaack and learned a little more about the project. I was especially interested in the note on the presentation flyer that said he would “present his current NSF-funded study on the use of the smartpen to support blind college students in science, engineering and math.”
He started off with promotional information about how the Livescribe system is a new computing platform with a great team behind it. By all accounts it appears to be a laudable effort among comparable smart pen technologies. It works by recording audio while you take notes with the pen on special paper. You can then go back to any point in your notes, tap on the page, and listen to the corresponding audio. Your notes also become searchable and shareable.
Then he addressed how the project would help make content accessible for students who are blind. Currently, a student who is blind studying in a field that requires the use of graphs or diagrams would need a technology toolkit such as a laptop paired with a not inexpensive tactile talking tablet. The price point lowers significantly as students are able to use the Livescribe pen with the more affordable Sewell raised-line drawing kit. This combination of technology allows the students to draw and annotate a graph with the ability to reference the graphic later and access whatever audio was recorded at the time.
(Update) Andy sent me a couple of additional links to share:
Hopefully, anyone subscribed to this blog also follows 456 Berea Street. If not, you missed a great post from Roger Johansson titled Overdoing Accessibility. Go read the article and then subscribe to his feed.
Mike Cherim tackled the same subject awhile back on Avoiding Extreme Accessibility.
Bim Egan ran a whole series of articles titled Too much accessibility – TITLE attributes.
The two attributes that appeared on all three lists were tabindex and accesskey. The lesson? Take the time to understand your users, then evaluate the work that you are doing to make sure it is actually helping those you are trying to help.
Day 13 of 24 Ways brings us CSS for Accessibility by Ann McMeekin. Ann discusses the proper use of line-height for users with dyslexia and how to use the :focus pseudo-class to let keyboard users (even those using Internet Explorer) know when a link has focus.
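The two techniques Ann covers can be sketched in a few lines of CSS. This is a minimal illustration under my own assumptions (the selectors and colors are mine, not taken from her article); older versions of Internet Explorer don’t honor :focus on links, which is why :active is included as a partial fallback:

```css
/* Give keyboard users the same visual cue mouse users get on hover.
   :active is paired with :focus as a fallback for old IE, which
   applies :active to links but ignores :focus. */
a:hover,
a:focus,
a:active {
  outline: 2px solid #036;
  background-color: #ffc;
}

/* A generous line-height gives readers with dyslexia more breathing
   room between lines of text. */
body {
  line-height: 1.5;
}
```

Note that the :hover, :focus, and :active styles are kept identical, so keyboard and mouse users receive the same feedback.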
I recently presented on disability awareness in building accessible websites to a group of interaction designers. At the end, I was asked about examples of a specific person with a disability as well as design considerations for that person. This is what I found:
Personas of Persons with Disabilities and Recommended Design Considerations
- Fluid, a user experience project for open source projects, created the persona of Sara Windsor, a faculty member who is blind, and outlined some considerations in designing an accessible user experience for her.
- Living with Disabilities, from the University of Michigan, offers profiles of users who are blind, have low vision, are hearing impaired, have impaired motor control, or are cognitively challenged, with design considerations for each.
Personas of Persons with Disabilities
Regardless of whether you use personas, the examples are worth going through to better understand accessibility from a different perspective, even if that perspective belongs to a make-believe person.
If the personas aren’t doing it for you, take a gander at some of these videos and experiences to get a better feel for how persons with disabilities access the web:
Guideline 1.3 Create content that can be presented in different ways (for example spoken aloud, simpler layout, etc.) without losing information or structure
Draft Guideline 1.3 of the Web Content Accessibility Guidelines 2.0 discusses the importance of making information available in a form that can be perceived by the user- either directly or through an assistive technology. For example, if a site uses an image to convey meaning, then the alt text should be present so that it can be seen (in a text-only environment), heard (by a screen reader) or even felt (through a refreshable braille display). This pertains to both presentation and structure.
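As a small illustration of the alt text point above (the image names and wording here are hypothetical, not from the guideline itself), the same markup serves a text-only browser, a screen reader, and a refreshable braille display:

```html
<!-- The chart conveys information, so the alt text carries that
     same information for users who cannot see the image. -->
<img src="sales-chart.png"
     alt="Bar chart: sales rose from 120 units in January to 310 in March">

<!-- A purely decorative image gets empty alt text so assistive
     technology can skip it entirely. -->
<img src="divider.png" alt="">
```

The decision is about meaning, not pixels: if removing the image would lose information, the alt text must supply it; if not, empty alt text keeps it out of the way.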
The technologies mentioned above- text-only browsers, screen readers and electronic braille- all do a good job of presenting information to the appropriate audience. However, what about users with cognitive disabilities? Text read aloud may be helpful in some situations, but what about a graphical representation of content? There are already some efforts in this area, such as the Communicate: Webwide symbol-supported browser, which claims to be able to represent over 29,000 words with symbols. While I applaud their efforts, with a Windows-only, proprietary, subscription-based product, I doubt there will be any widespread adoption in the near future.
Perhaps one day there will be an open system that facilitates the collection and use of symbols in everyday tools. Until then, here are some other projects and ideas that are exploring the use of symbols:
Google, Microsoft, Yahoo and AOL have announced their participation in the Internet Captioning Forum (ICF) established to promote the use of captioning for online video. They will be working with the National Center for Accessible Media (NCAM) at WGBH Boston. You can read quotes from Vint Cerf and other representatives from each of the big four at the National Center for Technology Innovation regarding the effort.
I am neither deaf nor hard of hearing, but the captions are almost always on when the television is on in our home. I enjoy the viewing experience and miss fewer mumbled words here and there when I can glance down at the captions as needed.
Google Video currently provides detailed instructions on adding captioning to videos. Additionally, in the Google Video Help Center, this question is asked, “Do you generate captions/subtitles for my video?” and the answer they provide is a promising “Not at this time.” They also have a section where you can view examples of captioned videos.
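For anyone curious what a caption file actually looks like, here is a fragment in the common SubRip (.srt) format- one of the plain-text formats typically accepted for online video captioning (the timings and dialogue below are invented for illustration):

```
1
00:00:01,000 --> 00:00:04,000
Welcome to the demo.

2
00:00:04,500 --> 00:00:07,000
Captions help viewers catch every word.
```

Each cue is just a sequence number, a start and end timecode, and the caption text, separated by blank lines- simple enough to write in any text editor.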
NCAM Press Release
(hat tip: The Assistive Technology Blog)