Do We Get What We Pay For?

Introduction

Historically, both on this blog and on BlindConfidential, I have very rarely engaged with commenters. I write articles, I post them, people read them or not and some choose to comment. While I was reading Marco Zehe’s excellent Android review series, I observed him engage with his commenters both in the comments on the series and in the text of other articles in the series as they appeared. Last night, for the first time ever, I posted a comment to my own blog in response to something that an individual defending Android had posted. As a pile of mostly negative comments has been posted regarding “The Amish User Experience,” the article I posted yesterday, I chose to write a separate post containing my thoughts on their notions rather than responding to them in the comments section.

I am also going to explore the titular subject of this article, “Do blind technology consumers get what we pay for?” and I’m quite certain that, on this subject, the Android fans will trash me again. Bring it on, boys.

In most of my articles, I provide links to virtually all proper nouns and terms I think readers might find confusing. This article has a few links but I ran out of time today and didn’t add more. I’m sure that any links today’s piece would have had appear in the article I wrote yesterday, and this page links to that article.

Did We Get What We Paid For?

One commenter wrote, “The gay / LGBTQ community, in the past, used to be more flamboyant. They would openly dress or do certain actions to attract haters to them, in order to raise awareness. I am seeing a shift in this as of late, the community is taking a more humble approach and accepting themselves first before seeking acceptance from others.”

If we’re going to use the LGBT community as a metaphor, I’ll paraphrase the gay former Massachusetts congressman Barney Frank on the day President Clinton signed the Defense of Marriage Act: “How long are we going to wait for our rights, Mr. President? How long are we going to wait?” And, I ask you, how long are you going to wait for Google to end the discrimination they perpetrate against people with disabilities through their technological segregation?

I ask, “How would the LGBTQ community react if Google charged their community full price for a product or service and, then, only permitted them to use a subsection of the features?”

Accessibility doesn’t apply to the majority of the LGBTQ community, so let’s consider a hypothetical. What if Google decided to go into the hotel business and told all LGBTQ people, all racial minorities and a few religious minorities that, while they need to pay full price for their rooms, they cannot use the swimming pool, the gym or any other facility? Now, please tell me how the discrimination we face due to technological segregation is any different?

In the hotel example, any of the aforementioned minorities would probably start to remedy the problem by going straight to the Department of Justice and having the place closed down if changes weren’t made immediately. About two years ago, Chris Cotters, a member of the Freedom Scientific board, flew to Tampa for a meeting and tried to check into the Westshore Hotel. There, the on-duty manager tried to refuse him a room. He called his employer, a big-time Boston law firm, their people called the company that owned the hotel and, within an hour, that manager had been fired. That is how discrimination should and must be handled. Why, then, shouldn’t the people in charge of making Android accessible receive the same treatment for continuing to enforce technological segregation on our community?

When I buy something, I expect to be able to use 100% of its features. I paid full price for one Android device and I bought a second one (the Nexus/7 I used for my research) secondhand. In both cases, I paid for all of the features on the device and, as Apple has done with iOS/7, I expect to be able to use all of the features for which I paid my hard-earned dollars.

In reality, as anyone can read in the comprehensive testing I did and described in “I Give Up,” there are a whole lot of features that are not accessible to people with vision impairment and, far worse, to people who are deaf-blind. Shouldn’t we get a discount reflecting the percentage of inaccessible features when we buy such a device?

I don’t want to hear, “Well, I can use the subsection of accessible features to do everything I want,” as that’s the most selfish thing anyone can say about accessibility. Readers of this blog will know that I don’t write about personal use cases; I stick to objective measures: standards, guidelines and best practices. I do this because, as a user, I am a statistically insignificant sample of one. I can also only do functional testing from the perspective of a user with a total vision impairment. Hence, I look at standards developed for universal accessibility so as to ensure that my testing applies not only to me but, rather, to all people with disabilities that require access technology.

If I only tested the apps and features that I would want to use, I would have saved myself a whole lot of time and frustration. Instead, I tried to test every feature, every app, every control in each app and so on, as I cannot predict what other people, the people who read this blog, might want to do, and neither can you.

Claiming that an Android device is accessible means that your definition of accessibility allows my deaf-blind friend Scott to be unable to use it and, in your mind, that’s OK. You’re saying that we don’t deserve every feature for which we paid, even though we paid full price, and you’re saying that your personal use cases are more important than the collective use cases of all people with disabilities.

Shooting The Messengers

If one takes a look at the traffic on the Eyes Free mailing list, one would think that my old buddy Marco and I were the most evil villains in the blindness community. What did Marco and I do to provoke such anger? We spent our personal time, entirely without compensation, to research an Android system in as objective a way as possible. You’ll notice that there’s no “donate” button on this blog or on Marco’s either; we do this testing to inform readers of the results of our findings. I test against published standards, guidelines and objective measures; Marco did functional testing based on actually using the device.

When we each started our efforts, we both hoped that Android would be an accessibility giant; we both wanted to write really positive pieces. Instead, based on the data we gathered, we wrote articles telling the truth: Android, judged both by objective measures and by more subjective functional testing, failed on nearly every count. The reaction by the Eyes Free community, though, was to dig in and, without correcting a single fact in any of the articles we’ve published on the matter, toss ad hominem at us. We spent a lot of time and personal energy actually testing these systems and reported the results. So, I suppose, if you don’t like the news, you’ll shoot the reporter.

What amazes me, as a blind technology consumer, is that Marco and I received far more anger than these same people, who paid full price for devices on which they can only use a subsection of the features, ever direct at Google, a company they obviously worship with some sort of religious fixation, for not being 100% accessible in the way that Apple has been with iOS/7. You can shout at Marco and me all you like; it still doesn’t change the situation: you pay full price for Android, you don’t get a full feature set.

Sure, I used fairly inflammatory language in “Amish” but Marco wrote everything without the sarcasm readers expect from Gonz Blinko. Marco is a truly and incredibly nice person; the same is rarely said of me. Marco engaged with the Eyes Free community during his testing (something someone critical of my piece commented positively about yesterday); I did my testing in a black box. Even with two very different approaches, we concur: a blind person gets a subset of the features for which they paid.

Anyway, feel free to call me as many names as you like but, please, lay off Marco. I write using vocabulary that may incite; Marco does not. Be fair, he’s worked for a lifetime in accessibility and has delivered a whole lot of the software blind people enjoy today, including the terrific accessibility experience you Android fans have in FireFox. Read my profile: I’m a self-proclaimed crackpot, stoner and loudmouth; Marco is the real deal, he works his ass off to make the world a more accessible place nearly every hour of every day.

Bias?

Marco and, more so, I have been accused of having a pro-iOS bias. This is true but it’s not based in a “belief” that Apple does a better job but, rather, in having tested both systems extensively, gathered our data, added it all up and, voila! we find that one system is more accessible than the other. We report with a highly fact-based bias. So, if we have a true “bias,” it is for reporting on actual testing results and not on how we feel. Data matters!

To those of you who have accused either or both of us of bias, I have a single challenge. Take an iOS device and an Android device and, as I did, test every aspect of every app on each, scoring one point for every item that meets every applicable requirement of the iOS or Android accessibility API (some controls will have six or more items to test) and zero points when any fail. Then, divide by the total number of tests performed to get a score. Apple will get an A+ with 100% (in integers) and Android will get a failing grade. Don’t take my word for it, don’t be lazy, do the work and you’ll see the results yourself. My work can be replicated and repeating an experiment is at the crux of finding the truth.
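If anyone wants to replicate the arithmetic, it is simple enough to sketch in a few lines of Python; the test results below are invented placeholders, not my actual data, and the function name is just something I made up for illustration.

# Minimal sketch of the pass/fail scoring described above.
# The results list is hypothetical placeholder data, not real test output.
def accessibility_score(results):
    """results: one boolean per individual test performed; a control with
    six testable aspects contributes six entries, and an entry is True
    only when every requirement for that aspect passes."""
    if not results:
        raise ValueError("no tests were performed")
    passed = sum(1 for r in results if r)
    return 100.0 * passed / len(results)

# Made-up example: three of four tests pass.
print(f"score: {accessibility_score([True, True, True, False]):.0f}%")  # 75%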

Dueling Mobile?

Years ago, there was an annual event at CSUN called “Dueling Windows.” On stage, there would be a JAWS user, a Window-Eyes user and users of a few of the long-forgotten screen readers. The users on stage were not employees of the screen reader companies but the companies were allowed to approve the users as experts. Then, side by side, on identical PCs running all of the same software (excluding the different screen readers), they were asked by a panel to perform tasks. The users were not given the tasks in advance, and the tasks were always designed to test a very wide range of use cases. This was really fun and, for those of us working on screen readers back then, it was incredibly informative. Many times, we would see something happen with our user on stage and return to the office to make it better in the future. It also gave consumers a good taste of what worked well and what did not with each screen reader so they could make a buying choice. Sadly, after JAWS won the event five or six years in a row, they stopped doing it as it was like watching the New York Yankees play against a Long Island Little League team.

I’d like to propose a “Dueling Mobile” event that works similarly. On stage, we could have a user hand-picked by each of Apple, Google and Microsoft to represent them. A panel of experts could compile a list of tasks common to mobile computing. One at a time, the users on the stage would try to accomplish the tasks. Success would be judged on the amount of time it takes each to accomplish the task and the number of gestures necessary to perform it, and I’m sure my more scholarly friends would come up with a number of additional metrics against which the contestants could be judged; one possible tally is sketched below. The CSUN call for papers went out last week; if someone wants to work on this as a proposal, I’ll be happy to help.
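To make the judging concrete, here is one way such a tally might be computed; the contestants, times and gesture counts below are entirely made up for illustration, not results from any real event.

# Hypothetical tally for a "Dueling Mobile" round: lower is better on both
# metrics, with elapsed time ranked first and gesture count as a tie breaker.
results = {
    "Contestant A": {"seconds": 42, "gestures": 9},
    "Contestant B": {"seconds": 61, "gestures": 15},
    "Contestant C": {"seconds": 55, "gestures": 12},
}
ranking = sorted(results, key=lambda c: (results[c]["seconds"],
                                         results[c]["gestures"]))
for place, contestant in enumerate(ranking, start=1):
    r = results[contestant]
    print(f"{place}. {contestant}: {r['seconds']}s, {r['gestures']} gestures")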

What will be accomplished by such an event? We will have another data set based in an objective measure that we can publish, and blind consumers will be better informed when they need to make a purchasing decision.

Why Do I Write This Blog?

One commenter asked why I would take the time to write my blog. I enjoy writing, I studied writing in graduate school at Harvard, writing is what I do. I write this blog because I enjoy working through ideas in written form. I enjoy the process and I enjoy the conversation that my articles sometimes provoke.

I suppose a fair number of readers like it too as virtually all of my pieces get hundreds of hits and, this year, a half dozen or so have gotten more than a thousand, with one over 5000. In the past month, my blog has been featured on the front pages of Daring Fireball and TechCrunch so I suppose people in the mainstream are enjoying it too.

What I cannot answer is why readers come back to my blog as frequently as they do. When I write a piece, I never know if it will be a hit or not. I had thought, for instance, that two of my recent articles, one critical of the VoiceOver support in Safari on OS X and the other about how hard it is to find the history of access technology online, would be big hits (by my lowly standards of a big hit); instead, they were two of the worst-performing articles I published this year. Other articles, like “Remembering GW Micro,” felt self-serving even to me as it discusses my own role in AT history, but it is one of the articles that has gotten more than a thousand hits. So, I never know, I just write what comes to mind, toss it out there and hope some people enjoy my work.

Conclusions

If we don’t get access to every feature for which we paid, we are being ripped off.

Discrimination through technological segregation, especially now that web sites, under rulings by the US Department of Justice, are, indeed, places of public accommodation, is identical to segregation in the bricks-and-mortar world. We don’t tolerate it there, so why tolerate it in our technology?

Use data to drive your arguments and you won’t be accused of ad hominem and other logical fallacies.

And, if you want to shout about the accessibility in Android, put up or shut up. Do the testing like Marco and I did. Test everything like I did. Then, publish your results. If you are unwilling to do the work Marco and I did but insist you’re right, I just ask, where’s the data?

The Amish User Experience

Introduction

Last month, I attended HopeX, the Hackers on Planet Earth conference in New York City. It was a terrific event and I encourage all of my readers to come to the next Hope conference when it happens in 2016. At HopeX, I enjoyed a lot of different talks and I had a lot of fun hanging out in the Lock Picking Village, where I was taught how to pick simple locks, a fun hobby for a blind person as everything one needs to do is entirely tactile, although hearing the little “clicks” helps too.

The first talk I attended at HopeX was presented by a terrific woman whom I would later get the chance to meet. Her name is Gus Andrews (@gusandrews on Twitter). Gus described a talk at a previous Hope conference given by Eben Moglen, the founder and head of the Software Freedom Law Center (SFLC), a man I know reasonably well and someone whom I respect greatly. Moglen described Apple as a “vampire” that lures unsuspecting technology consumers into using its products by providing “sexy” user experiences that make its technology easy to use while it takes away your information freedoms. Andrews, in her talk, responded to Moglen’s statement by asking the question, “Are free software proponents, Stallman, Moglen and others who insist on using GNU/Linux systems, the ‘Amish’ of the computer-using public?”

Andrews’ thesis suggests that some people will eschew a nice, comfortable and simple user experience purely because they have some sort of religious obsession or philosophical bent that causes them to choose what is metaphorically similar to an “Amish” experience. They give up a nice and easy user experience; they even pronounce that they prefer a user experience that is less efficient and less “pleasant,” as they seem to believe that doing things in a simple and intuitive manner somehow offends their religious fixation with living in the technological equivalent of a hand-built, survivalist-type cabin in the woods where they can live out their fantasies of technological and moral superiority.

This article intends to explore this notion as it may apply to the community of blind people using computational devices.

Corrections and An Apology

In the text that follows, I state that the Microsoft mobile phone platform remained inaccessible. A regular reader sent me a correction on Twitter telling me of the recent release of Windows Phone 8.1 and that the Narrator screen reader it comes with is quite credible. I haven’t seen one of these yet so I won’t write more about it.

I would like to apologize to readers for using the phrase “Ted Kaczynski cabin” as a metaphor for someone who lives without the standard amenities of modern life. When I wrote that, I was thinking “off the grid, survivalist sort” and not specifically of a man who committed murderous acts of terror, one of which severely injured a man no more than a few blocks from where I sit writing this in Cambridge, Massachusetts. I apologize for using this metaphor; it was insensitive and I’ve changed the article to reflect what I actually meant with that phrase.

Another Expert Gives Up On Android

Marco Zehe works as an accessibility engineer at the Mozilla Foundation. He personally has worked on the excellent accessibility solution provided in FireFox on Android so he knows that operating system both as a user and as a developer. Prior to joining the Mozilla Foundation, Marco worked at Freedom Scientific for a number of years and, in my opinion, was the single most important contributor to the excellent design that the braille support in JAWS has today.

In the summer of 2013, Marco wrote a blog article describing his experience attempting to use an Android based phone for thirty days. In that article, Marco detailed a number of fundamental showstoppers in his use cases; this month (August 2014), Marco tried to repeat the exercise to determine if, indeed, he could use an Android device accessibly, conveniently and effectively in his personal use cases. Marco’s series starts on his blog and, from there, you can find links to the entire series.

Earlier this year, I ran a series of three articles, “Testing Android Accessibility: I Give Up!” “Testing Android Accessibility: A Deaf-Blind Perspective” and “Testing Android Accessibility: The Programmers’ Perspective.” I wrote “I Give Up” and “The Programmers’ Perspective” and a really terrific deaf-blind fellow named Scott wrote the third.

Marco and I took very different approaches to our testing. I followed a system based entirely in objective measures, standards (take a look at the BBC Mobile Accessibility Checklist; it was written by the same team that created the mobile checklist for Section 508 and it will be US law when GSA completes its final acceptance process) and the basics of the science of human factors. I tested every control in every app that ships on a standard tablet from Google, looking for anything from unlabeled graphics to objects out of the swipe order and so on. Marco took a very personal, use case approach and attempted to fulfill all of his mobile computing needs with an Android phone. My approach was rigid and refused to take into account applications from third parties; Marco used every resource he could find to try to create a usable experience for himself. I would flag every unlabeled graphic or control and everything that wasn’t in the swipe order as a failure; Marco would accept that sometimes a blind Android user may need to label controls for himself and poke around with “explore by touch” to find items that aren’t in the tab order. I slammed Google for refusing to include accessibility in its automated testing processes as virtually all of the problems I found could be discovered simply by an automated testing tool and corrected easily and inexpensively; Marco took an approach that ignored Google’s failed software engineering processes and only explored the user experience itself.
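To show how trivially an automated pass could catch the kinds of defects I flagged by hand, here is a minimal sketch; the flattened element records and field names are made-up stand-ins for whatever a real accessibility tree dump would contain, not Google’s actual tooling or API.

# Sketch of an automated check for the two defect classes discussed above:
# unlabeled interactive controls and visible elements left out of the swipe
# (focus) order. The data format is hypothetical.
def audit(elements):
    failures = []
    for el in elements:
        if el.get("interactive") and not el.get("label"):
            failures.append((el["id"], "unlabeled control"))
        if el.get("visible") and not el.get("in_swipe_order"):
            failures.append((el["id"], "not reachable by swipe navigation"))
    return failures

# Invented example data for illustration only.
screen = [
    {"id": "btn_send", "interactive": True, "visible": True,
     "label": "Send", "in_swipe_order": True},
    {"id": "img_settings", "interactive": True, "visible": True,
     "label": "", "in_swipe_order": False},
]
for element_id, problem in audit(screen):
    print(f"{element_id}: {problem}")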

Quite obviously, Marco and I viewed the task of testing Android accessibility very differently. I did as I do and stuck to published requirements and known best practices; Marco took a user-centric approach and listened to advice from people on the Eyes Free mailing list about third party applications that can, in their opinion, replace the broken apps that carry the Google brand name. The most surprising thing is that, given the radically different ways Marco and I looked at the platform, we came to the same conclusion and even used nearly identical vocabulary to describe Android’s failed accessibility experience, namely, “I Give Up,” in my case, and “I Quit,” in Marco’s.

I wrote the first article in my series with the title, “I Give Up!” Marco wrote the eighteenth in his series and titled it, “I Quit!” My articles were widely criticized by blind Android enthusiasts for taking a standards based approach (something I documented in an article called “Standards Are Important”) as they all asked why I didn’t try a variety of third party apps that I could use in a moderately to very accessible manner. My answer was that I was only testing the out-of-the-box system sold by Google.

Marco’s work should end this controversy. Android accessibility failed both objective and subjective testing procedures performed and reviewed by noted accessibility experts.

The Blind Amish

In preparation for writing this article today, I went to Marco’s blog yesterday, reread each of the eighteen days of his attempt to live with an Android phone for thirty days before giving up and read all of the comments written by his readers. One comment jumped out at me. Its author stated very eloquently that the people who hang out on the Eyes Free mailing list often say that blind Android users need to “be patient” and to “wait for Google to catch up.” The person who posted the comment and I seem to share the same opinion: “why suffer an inefficient and unpleasant user experience when there are good and excellent alternatives?” and “why decide to be Amish in your technology choices?” Blind users of mobile devices already have two good choices for tablets (Apple and Microsoft) and, now, with the release of Windows Phone 8.1, there’s a choice on a mobile phone.

Let’s consider the notion that, indeed, Android is an actual choice when its accessibility is so fundamentally broken, even when one allows for using a bunch of third party apps to do what a sighted person can do with their Nexus or other Android device in the first few seconds of ownership. Is living without indoor plumbing, a hot water heater and all of the other comforts the Amish reject a true lifestyle choice? Is going through what Marco and I experienced in our testing of an Android device, when compared to iOS and Windows 8.1 on a tablet, really a “choice” when the interface is so inefficient? As Marco demonstrated, it is impossible for an expert-level blind technology user, a person who invented a whole lot of things the rest of us blind people use every day, to live with an Android device for a month, let alone as a permanent solution. Given all of this, I can only conclude that, no, Android is not a choice at all.

If you like living in the technological equivalent of an Amish community, a system that requires far more effort and provides profoundly less comfort than the alternative, please go right ahead and do so; just don’t tell the rest of the world that your choice is “accessible” when, in fact, it’s really only marginally useful when compared to the state-of-the-art. When you claim that something is “accessible” when, in reality, it is not, you only encourage companies with a history of poor accessibility to continue being poor as they will find some blind Amish willing to state that virtually anything that talks at all is accessible. By claiming that Android is accessible, you make the work of accessibility professionals and advocates much harder as we then need to convince our clients that the truth of the Android accessibility experience is that they will fail all known regulations unless they do profoundly more work than they would need to make an app accessible on the Apple and Microsoft operating systems.

What About Other OSes?

Almost every day, I have cause to use iOS/7 on my phone, OS X Yosemite beta on my laptop, Windows 8.1 on a convertible I got about a year ago and Ubuntu GNU/Linux, both via a command line over ssh and in a virtual machine on my Macintosh. I tried to live with an Android device for three months and, finally, I gave up on it. Of these operating systems, I feel that, regarding accessibility, iOS/7 is the most comprehensive, with 100% (when rounded to integers) of its features, when measured objectively, accessible not just to a blind person but to people with a panoply of disabilities. Windows 8.1 comes in second but I take away points for a relatively small number of stock items I found that had accessibility problems and, more so, because its built-in screen reader, Microsoft Narrator, remains sorely substandard when compared to the state-of-the-art coming from third party screen readers like NVDA and JAWS. I’d put OS X in third place based upon the official Mavericks release (it’s what I’ve been using for most of the past year) and, for now, I’ll reserve comment on the Yosemite beta as I signed its non-disclosure agreement (NDA) and I’m a bit of a stickler for obeying contracts I’ve signed. This leaves the GNU/Linux experience in dead last place, as I’d also say that my Android experience was more pleasant than most of what I deal with daily in Gnome with Orca.

Gus Andrews said of all GNU/Linux users that they seem to be the Amish of the mainstream technology community. I’ll say that, in addition to Android users, GNU/Linux-using blind people, especially those who use the Gnome windowing system, are the Amish of our community.

Conclusions

No matter how an expert tests accessibility on Android or GNU/Linux, whether it’s me taking an objective, standards-based approach, Marco doing a user-based, subjective set of tests or the programmers I used as sources on the “Programmers’ Perspective” article just trying to do their jobs, Android and GNU/Linux are accessibility outposts.

Some will argue that Android can save a user some money but so can going off the grid and living in a survivalist cabin in the hills. There are many ways people can save money but why go Amish on us to save a few bucks? If you believe your time is valuable, why spend so much more of it configuring a system when Microsoft and Apple provide excellent choices out-of-the-box?

So, don’t live in a cabin in the woods, come into modernity and enjoy mobile devices from companies who take accessibility so seriously that they actually deliver it today.

Apple and the Accessible Internet

Introduction

Any regular reader of this blog (both of you) would already know that I enjoy using a bunch of different products from Apple. I use an iPhone 5S running iOS/7, a Macbook Air running OSX/Mavericks with all of its updates, we have an AppleTV set-top box and we use an Apple TimeCapsule router. The first thing one notices when they get any of these devices is that their interfaces are 100% accessible in the iOS case and nearly 100% accessible on OSX right out-of-the-box. For this reason alone, Apple is by far the leader among mainstream companies trying to solve the problems of accessibility for people with vision and other print impairments. Apple continues to make its accessibility better with each release but, while it may be #1, Apple still has a lot of work ahead to be truly competitive with third party screen readers on the Internet.

Any user of a popular Windows screen reader (JAWS and NVDA) or even those with less popularity (Window-Eyes, SystemAccess, ChromeVox and Orca) will, for a variety of reasons, be entirely underwhelmed with the functionality of VoiceOver on a Macintosh with the Safari web browser.

This piece started as a bug report I wrote up for some contacts I have at Apple. For all intents and purposes, I have changed very little between the email I sent to friends there and this article. I’ve removed the names of some individuals who are not public figures, added a bunch of links and did a bit of other clean-up, removing some personal comments and such. This article is specifically about how VoiceOver works with Safari on OSX and may not be applicable in any way to iOS/7 or any other Apple products. Internet support, in my mind, is the single aspect of using a Macintosh with a screen reader that remains substandard which, as Apple is setting the standard in so many other areas, makes me sad.

My Specific Use Cases

It’s possible that each user has his or her own set of cases that are important to them. Like everyone else, I use the Internet for a lot of different things but, most importantly, I write a blog. My blog tends to use other Internet sites as source materials. Therefore, being able to copy and paste from sites is really important to me and, sometimes, when I go to a site and hit VO+ENTER to start selecting text, I hear the “scratching” sound and it actually selects text; sometimes, I just hear a ding and it refuses to select text using this method. On some occasions when the VO web site text selection facility doesn’t work, I can just use SHIFT+navigation keys and the text will be selected; on other occasions, the only way I can select text on a web site is by doing a “select all,” copying and pasting the entire page into a text editor and finding the piece I’m looking for there. This is, in my mind, one of the worst problems with VO on OSX.

The Overall User Experience

Most other popular screen readers (JAWS and NVDA) and some less popular ones (Window-Eyes, ChromeVox, SystemAccess and Orca) allow the user to navigate around the page using only cursor keys as if in a word processing document. Originally, Orca’s FireFox support, also designed by the person who is now the lead UI developer for VoiceOver, functioned similarly to the VoiceOver design, where arrow keys are virtually meaningless except when combined with a modifier key. Orca, known neither for being terribly competitive with other screen readers nor for a pleasant user experience, took a step back and changed its UI design to be like JAWS, the screen reader that set the standard for Internet accessibility (if you disagree, I can provide a pile of links to actual testing scorecards that, quite objectively, demonstrate the superiority of JAWS in all of these areas, including the WAI user agent guidelines). Apple, quite obviously, has infinitely more resources than does the Orca project (as far as I can tell, Orca has exactly one developer, Joanie Diggs, working part time on the effort) and can certainly make this happen.

Navigating on a Web Site

As far as I can tell, all other screen readers on general purpose computers (desktops, laptops) allow for single character navigation of a web page. In fact, all but Window-Eyes use the same standard set of keystrokes (h for next heading, t for next table, etc.) and, with all other screen readers, navigating a web page is profoundly more efficient. With NVDA (I don’t use JAWS), I go to a web page and hit “h” and I’m brought to the first heading, I hit “h” again and I’m at the next one and, if I follow that by typing a “t,” I then go to the next table and so on. With VO, I load a web page and, if I want to go to the next heading, I need to hit VO+u first to make sure it’s set for heading navigation and then either find the heading I’m looking for in the list box or, after setting the utility dialogue to headings, use VO+down arrow to find the next one; then, when I want to find the table, I need to go back into the utility dialogue, change to tables and start over. Hence, finding the object I’m seeking requires far more keystrokes and far more cognitive processing but, worse, it makes switching from any other screen reader to VO much more difficult. I need to use OSX, iOS, Windows and GNU/Linux nearly every day so anything that improves the similarity of screen readers is important, based entirely on the HCI concept called “discoverability.”

On a personal use case note, I cannot tell you how many times I’ve been using VoiceOver, used Command+TAB to switch to another application, returned to Safari and hit the keystroke to go to the next object, only to find that I had forgotten to set the granularity back to headings and hear something entirely useless like “No more tables,” all of which could have been avoided if Apple would just implement the same sort of system as exists in the more popular Windows screen readers. Maybe I’m a bit of a stoner and, therefore, forget which granularity I had VoiceOver set to but I’m willing to bet that lots of other users make this mistake frequently as well. The rotor for granularity changes works reasonably well on iOS but changing granularity on OSX is unnecessarily cumbersome.

Correction: When I wrote the two prior paragraphs this morning, I did so in the absence of any awareness of the QuickNav Commander now available in VoiceOver. For all intents and purposes, if you go into the VoiceOver Utility (VO+F8 if you’re running VO), go to Commanders and select the QuickNav tab, you can turn on “Single character navigation” there and have an experience similar to that available in Windows screen readers. Back when I worked on JAWS, we had something of an unwritten rule: if we added a cool new feature, we made sure it was turned on by default in the next release of the screen reader so that users would find it right away. I don’t tend to read a lot of release notes and, until my friend and accessibility jock Donal Fitzpatrick (@fitzpatrickd on Twitter) pointed this feature out to me, I didn’t know it was there. So, for all intents and purposes, you can ignore the two paragraphs preceding this one as, given this feature, they are just not true.

Performance and Time

VoiceOver is ridiculously slow on “noisy” web pages (those with lots of objects). Go to this site about harmonica playing, search on a popular artist (Bob Dylan has a lot of stuff up there), bring up the Item Chooser (VO+I) and count the seconds it takes for the item chooser list box to appear; if you’re using the same 2012 model Macbook Air as me, you’ll see that this takes a little more than 7 seconds. Now, using NVDA on a cheap Windows laptop, hit NVDA+F7 to bring up its analogue of the Item Chooser and you will find that its list box is on the screen and talking in less than a single second. NVDA is also using cross-application communication via an API to gather its data but, using caching and other performance enhancing techniques, it actually responds in a functional amount of time; in 2014, waiting 7 seconds for a computer to do anything other than downloading something big from an online source is simply absurd.

When, in September 1999, we at FS released JAWS 3.31, we used Jamal Mazrui’s EmpowermentZone web site as our favorite reference page. Jamal has something like 1700 links on the home page and, according to VO, it has 3789 objects in all. Back then, we were running on 60 MHz Pentium processors with megabytes of RAM, and JAWS 3.31 could load its object list dialogue on this page in about 20 seconds (compared with about 25 minutes using Window-Eyes). Just now, when I went into Safari to test this page, it took about 30 seconds for VO to load its item chooser on hardware more than a decade newer, a quad-core system whose speed is measured in GHz, with thousands of times more RAM and so on. We solved this problem on Windows 98, effectively a 16-bit system; certainly, Apple can solve this problem now that much faster hardware is available.

The Broken Item Chooser

If a user hits VO+i to bring up the item chooser before a page has finished loading, it will bring up the list box but, when one hits ENTER on an item, it will just ding and not bring the user to the point he had requested. VO seems to load all of its data much more slowly than any other screen reader (if I bring up the NVDA analogue of this dialogue by hitting the keystroke immediately after requesting a page, it appears immediately and is never out of sync with the rest of NVDA). I’m going to guess that this is a threading issue, which are hard to fix, but this bug has been present for years now; I have reported it and I’m certain that others have reported this problem to Apple as well.

For no reason apparent to anyone outside of Apple, it seems that the Item Chooser information isn’t cached anywhere. Hence, when one hits VO+i on the same page twice, VO takes as much time to build the list the second time as the first. If the page hasn’t changed, the Item Chooser information should all be present either in memory or cached on a disk and should, even given the other VO constraints, load virtually instantly the second time through. 
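To illustrate the sort of caching I have in mind, here is a minimal sketch; the page fingerprint and the expensive list-building step are hypothetical stand-ins for whatever VoiceOver actually does internally, which I obviously cannot see.

# Sketch of caching an Item Chooser style element list keyed on a fingerprint
# of the page, so a second request on an unchanged page returns instantly.
# All names here are hypothetical.
import hashlib

_cache = {}

def item_chooser_entries(url, page_source, build_list):
    """build_list is the expensive function that walks the page and returns
    the list of navigable items; it only runs when the page has changed."""
    fingerprint = hashlib.sha256((url + page_source).encode()).hexdigest()
    if fingerprint not in _cache:
        _cache[fingerprint] = build_list(page_source)
    return _cache[fingerprint]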

What About “Clickable” Items?

When VO describes an item as a link, using VO+SPACEBAR will always open it. When, however, VO reports an item as being “clickable,” more often than not, VO+SPACEBAR does absolutely nothing and hitting VO+SHIFT+SPACEBAR to send a mouse click to the object works infrequently. I have my VO configuration set to have the mouse cursor follow focus and I also try hitting VO+SHIFT+F5 to route the mouse cursor to the object I’m on but that seems to rarely work as well. [Note: while here in TextEdit, routing the mouse cursor works properly but, while using Apple Mail, with my mouse cursor set to follow VO, I hit VO+SHIFT+F5 and wasn’t brought to the word I had just typed nor was I even brought to the edit area where I was typing but, rather, I clicked something on the Dock, a seriously bad outcome.]

Compared to JAWS

People who have read articles on this blog like “I Give Up” or “An Open Letter to Mark Riccobono” will know that I’m not just a user of Apple products but, based entirely on accessibility, I’m something of an advocate, often recommending their hardware to other blind users. Now that Apple seems to have made iWork, their office suite, mostly accessible, the Internet is the only aspect of VoiceOver that I still don’t like much. Readers of this blog will have heard me say, “Apple has set the gold standard for out-of-the-box accessibility,” which is true for almost everything except the Internet. Online, JAWS remains the king with NVDA a close second. This is the one area where Apple really needs to do some massive improvements.

If I actually picked up the user agent guidelines and tested each item separately, I would find a ton more bugs. I would find a really big list of defects that one would not encounter if they used JAWS or NVDA. It’s pretty much the Internet access that causes me to use Windows to do most of my research and a lot of my writing these days. On all scorecards regarding screen reader functionality published online (I’m working on an article about these reports coming soon to this blog), JAWS remains the gold standard for using a speech interface to read the web. Apple may have set the bar for nearly everything else but, if Apple wants to be the best, they have a lot of work ahead of them.

The Object Model

VoiceOver arranges its web information by object but doesn’t also include a simpler navigation metaphor. Hence, as I wrote above, it uses a different system for moving from object to object so a separate keystroke is needed for each separate object. If a web site contains the sentence (including the links):

“Let’s compare the number of keystrokes necessary to read this sentence with JAWS, NVDA, SystemAccess, VoiceOver and a few other bits of access technology.”

an NVDA or JAWS user might read the entire sentence by issuing a single keystroke (a down arrow, perhaps) but, with VoiceOver reading each object separately, one needs to issue a keystroke for each link plus each chunk of text separating them, for a total of eight keystrokes to read a single sentence. Also, while doing a “read all” of an entire web page, the user will hear pauses caused by VoiceOver trying to add a tone for each link, making the entire reading experience sound really choppy. This is massively inefficient for the user and should be corrected immediately.

The Interaction Model

When a user accesses a web site with JAWS or NVDA, the information is pretty much organized like a word processing document with all of the same keystrokes for navigating through such. With VoiceOver, Apple introduced a model that attempts to group chunks of a web site into larger blocks that a user can navigate between and, when they are in a place they want more detail, the users can “interact” with that portion of the web page. In theory, this system should allow for greater efficiency as it permits the user to easily jump past information in these groups.

Very unfortunately, though, the interaction model only seems to improve efficiency on tremendously well organized web sites and, more often than not, actually requires the user to issue more keystrokes than in the “virtual buffer” model presented by most other screen readers out there. For a quick example of this, if you’re using a Macintosh running VoiceOver to read this page, find the place where one can follow me on Twitter and you’ll notice that you need to interact (and, therefore, stop interacting when you’re done) with an item that contains very little actual information. With JAWS or NVDA, however, simply moving from line to line with arrow keys gets you everything you want.

In my experience, the interaction model causes far more efficiency problems than it solves.

Conclusions

This isn’t my most well-organized piece and it gets repetitive in places. I wanted to show, however, how even Apple, the world’s leader in out-of-the-box accessibility, still needs to continue improving. I’m certain that this item will gather me a bunch of comments (either public or privately through the contact form on this site) about other problems blind users encounter with VoiceOver on Macintosh. If you have other bugs to report, sending them to me may make a future version of this article more comprehensive but, at the same time, I urge you to report any problems you encounter to Apple’s Accessibility email address so they may both know about the bugs and understand just how many people are badly affected by defects and design flaws in their accessibility software.

Preserving Our History

Introduction

Recently, I wrote an article called “Job Access With Bugs?,” in which I explore some of the generally accepted notions around access technology for PWVI. That article came as part of my ongoing attempt to record the history of the screen reader in the years following 1998, when I joined Henter-Joyce as Director of Software Engineering. These articles have been popular with our readers and I’m happy that some of our history is preserved in them even if my work tends to be loaded with opinion and conjecture and is based largely on anecdote rather than serious historical inquiry.

In these articles, I try to include a link to every proper noun when it first appears in the pieces and I try to include links to concepts that may be unfamiliar to our readers. When doing so, as I wrote in the conclusion of “Job Access,” I try to find links to objective materials, mostly Wikipedia, rather than personal blogs or marketing-oriented company web sites. While writing “Job Access,” however, I realized that little of our history, the history of technology for PWVI, has been recorded in the public record.

This piece intends to encourage people to write and edit Wikipedia pages about the technology we use and have been using for a few decades now, it proposes an idea for gathering an oral history describing our use cases and how such technology has affected our lives over the years and, lastly, it describes a computer museum interested in curating a collection of AT hardware.

The Blazie Engineering Braille ’N Speak

Arguably, the Braille ’N Speak (BNS) from Blazie Engineering may be the single most important bit of access technology for PWVI in history. I know literally hundreds of blind people for whom the BNS was their first piece of access technology and who, using this once remarkable device, were able to attend school, go to university and perform a lot of professional functions. I also know dozens of blind software professionals who got their start programming by first learning BNS Basic. This device is certainly an important part of our history but it has no Wikipedia entry, nor is there an entry for Dean Blazie, the inventor of the BNS, or for Blazie Engineering, the very important AT company that built the product.

In my mind, this should be the first item corrected on Wikipedia. Someone who knows a lot more about the BNS than I do (I never owned or used one myself) should write up an article about it. It would also be important to add entries for Dean the man (again, he’s someone I’ve met a few times so someone with a greater level of familiarity with Dean should write such) and for Blazie Engineering, a very important manufacturer of braille devices as well.

Henter-Joyce, GW Micro and Window-Eyes

While JAWS and Ted Henter have token Wikipedia articles about them (something that we really must improve and something I might edit myself), companies important to our history, Henter-Joyce and GW Micro, do not. Window-Eyes, for many years the second most popular screen reader and the first to embrace an API-based strategy for gathering data, regrettably also has no Wikipedia entry. Doug Geoffray, the most visible member of the GW Micro team, is also without a Wikipedia entry.

I can probably write an article about Henter-Joyce as I can call Ted to get the story right, but someone other than me would need to write up articles about GW Micro, Window-Eyes, Doug Geoffray and the others there who helped invent our future.

Less Prominent Technology

I know a whole lot about a few things but virtually nothing about many of the other technologies that PWVI have used over the years. My own braille skills are horrible so I’ve never actually used a braille display nor have I done much with a braille keyboard. While I managed MAGic and WYNN (a product for users with learning disabilities), I’ve never used them myself and, beyond the theoretical side of this sort of technology, I can’t really speak to such.

It’s important that our history is preserved so, please, if you’re so inclined, make yourself a Wikipedia account and start documenting our history.

An AT Oral History

If our readers think it’s a good idea, I will set up a wiki on this site where PWVI can write up their stories about how they use access technology and how it has affected their lives. Here, in an informal way, individuals can tell the stories that I hear daily from people orally. That an individual got themselves a copy of JAWS, spent time learning it and was able to use these skills to advance their career, further their education or do something else productive with such is a major part of our history that remains unrecorded. An “accessibility stories” collection would provide a single place on the Internet where these stories could be collected and made available to others.

Personal stories are a major aspect of history and, if we launch such a wiki, we’d have a place where such stories could be found, studied and organized in a manner that doesn’t exist today. Of course, such a wiki would be useless if no one is willing to write stories for it. So, if you think this is a good idea and are willing to post at least one story about how you’ve used AT, please tell me so and, if I hear from enough people, I’ll add the functionality to this web site.

What About the Hardware?

While the BNS may be the most important piece of hardware this community has ever enjoyed using, it is certainly not the only one with great importance. At the same time, as far as my research could tell, the only museum on Earth that has a BNS in its collection is the Smithsonian where, along with JAWS for Windows 3.20, it is one of only two pieces of access technology in the collection.

Recently, I attended the HopeX conference in New York City. There, I had the opportunity to meet with a bunch of guys involved with a vintage computer museum. I’m currently in negotiations with them about launching an access technology area in their collection. To this end, if you have old access technology hardware around that you would like to donate to the museum, please connect with me through the contact form on this site and we can make arrangements to have your old hardware shipped to their museum. I will be writing up stories about each device and will, of course, require your input to ensure their accuracy so people visiting the museum can understand why each device was important to our community when it was current.

Conclusions

I personally feel that it is a tragedy that our history has not been properly preserved. Our heroes, people like Ted Henter, Dean Blazie, Glenn Gordon and others, are simply not remembered online the way that those who made far less important mainstream technology are. The devices that our community used to get our education, to work in industry and elsewhere are not remembered either. Adding and improving Wikipedia entries is easy and, if you have old hardware, I seem to have found a home for it as well. I’d like to have the oral history wiki as well but I’m uncertain that we’ll get enough volunteers writing up their stories to make it worthwhile.

So, please help our community preserve its history. We’re at an interesting time when we can write our own history and, in my opinion, we really should be doing so.

An Open Letter To Mark Riccobono

A Note To Our Readers

Mark Riccobono is the new president of NFB, the nation’s largest group advocating for people with vision impairment. I find him an interesting choice as president of the organization. This is a letter I’ve drafted to him regarding NFB, technology and its recent resolution asking Apple to require accessibility for submission to its AppStore.

The Letter

Dear Mark,

If you don’t know who I am, as a matter of introduction, I’ve been working in access technology and accessibility since 1998. I’m a former VP/Software Engineering at Freedom Scientific and have been something of an accessibility researcher, advocate, activist, gadfly, loudmouth and crackpot since. You can learn all about me by reading the blog where this letter has been posted and in the archive of BlindConfidential, the very popular blog I wrote for a lot of years.

To start, please accept my sincere congratulations on your election to the presidency of the National Federation of the Blind (NFB). I’m highly encouraged that NFB now has someone at the top of the organization who seems (to me at least) to have a grasp of the world of technology, research and the tools that a blind person needs to compete in professional settings, in schools and to enjoy a connected life in the information age. I look forward to seeing how your insights affect NFB policy and actions as we move into the future.

I would also like to congratulate you on having successfully led the project that resulted in your being able to drive a car at Daytona. While I live in a big city and rarely need to get into a car for any reason, I recognize the profound level of freedom that could be accorded blind people if they could operate a motor vehicle independently. That NFB could work with Virginia Tech to make such an amazingly innovative system is, indeed, a tremendous achievement and I look forward to seeing it evolve into the future.

As I’m a technology specialist, I read the NFB resolution stating that it will work with Apple to improve the accessibility of third party applications on its AppStore with great interest. I also read the piece you wrote further explaining the resolution and found it informative as well.

Having read both, it is my understanding that the resolution states that NFB believes that Apple should require accessibility compliance as a condition of inclusion in their AppStore. I agree with this assertion entirely.

I was also happy to read the whereas clauses in the resolution and enjoyed reading how you summarized such in your article. It is heartening to hear NFB state publicly, in a resolution, that Apple is the clear leader in accessibility and that Apple has done more than any other OS vendor to accommodate our needs in their technology. I agree with these statements entirely as well.

If, however, I attended the NFB convention as a delegate (an incredibly unlikely event as I’m not an NFB member), I would, although I agree entirely with the language of the resolution, have had to vote against its passing. While everything the resolution says is excellent, the problems are with what it doesn’t say and how its passing was perceived in the community of blind technology experts.

If this resolution, instead of saying, “We resolve that Apple…” instead said, “We resolve that Amazon, Apple, Google, Microsoft and all OS vendors with an online software store…” I would be writing an article celebrating its passing. Singling out Apple, however, even with the statements that they already do a better job with accessibility than any other vendor, is not in my mind an acceptable statement to make.

Specifically, this resolution asks Apple to ensure accessibility of third party applications at a level which NFB has not resolved to ask of Amazon, Google and Microsoft for applications that carry their brand names. As the whereas clauses clearly state, Apple is already the best in this space and asking Apple to do something regarding software over which it has no control is, as I said above, an excellent idea but, in the absence of insisting that the rest of the industry first reach parity with accessibility on the Apple systems for software over which they have complete control, I find raising the bar of requirements for Apple without asking its competitors to do the same to be a statement that will, at best, cause confusion in the world of technology for people with vision impairment. The resolution, because of what it didn’t include, is perceived as a criticism of the best player in the game while ignoring similar and much worse problems at Amazon, Google and Microsoft.

Please also realize that this is 2014, a time in history when a long written resolution followed by an article explaining it by the president of NFB will be read by very few people interested in the subject. I write long form essays and I’m an accessibility nerd of the highest order; hence, I’m the kind of guy who actually reads such things. I’m sad to report, though, that few of my peers in the world of blindness and technology would take the time to read through such or take the time to fully understand the nuances therein. This is the age of 140 character conversations and, while not true, the perception of this latest NFB resolution on Twitter is, “NFB slams Apple again” and “Riccobono doubles down on Apple slam.” I agree that these summaries are unfair but we don’t live in a fair world where everyone takes the time to read the details. Perception is tremendously important and, speaking on behalf of other blind technology professionals to whom I’ve spoken in the past week, NFB has yet another major problem in the hearts and minds of this community. I don’t know how to fix this problem but it’s something about which NFB needs to be aware if it hopes to regain credibility among this admittedly elite class of blind professionals.

I believe that you can be an agent of change within NFB. I’m happy to hear that NFBCS has a new leader and I hope to see NFB improve its statements on technology as we move forward. While I may or may not agree with NFB resolutions that are passed in the future, speaking for myself and others to whom I’ve spoken, simply making NFB public statements consistent when addressing technology vendors will help substantially with this credibility issue. If you’re going to resolve that one technology vendor do something, please resolve that they all do it and you’ll find that some people like me, vocal critics of NFB in the past, will start paying attention and, perhaps, join and become active in NFB in the future.

In conclusion, I’m very happy to see you as the new president of NFB. I’m excited about progress on your automobile project. I agree with the text of the aforementioned resolution but I’m concerned with seeing Apple singled out for technological developments out of its control without a similar standard being applied to Amazon, Google and Microsoft for software entirely under their control. It seems very inconsistent to me and those to whom I’ve spoken. NFB has a perception problem among blind technology professionals and consistency in statements about technology would go a long way to allowing NFB to regain credibility in this community.

Sincerely,
Chris “Gonz blinko” Hofstader
7/23/2014

Job Access With Bugs?

Introduction

For years, I’ve heard anecdotal reports that JAWS, the world’s most popular screen reader, has more bugs, is less reliable, is more unstable and is of generally poorer quality than some of its competitors. In that same period, starting in 1998 and continuing until today, I have never seen a single bit of quantitative evidence demonstrating that this is true. I hear people around the community make these claims based on personal experience, experience that is certainly valid, but no one has published a scorecard that lists every feature in every application supported by each screen reader, tests each one and reports the results. I’ve also never seen any detailed reports on reliability, only the same sorts of personal stories.

In this article, I want to explore some of the generally accepted notions about screen reader quality and functionality and ask why, if JAWS is such a bad piece of software, it maintains a marketshare over 50% and still dominates in most professional settings. Furthermore, I want to explore some of the issues discussed in “Remembering GW Micro,” the article I published last month.

As a matter of disclosure, I don’t use JAWS. For the most part, my primary system is a Macbook Air running OSX Mavericks with the VoiceOver screen reader. I do use Windows with some frequency but, on that system, I use NVDA because I really like how it works in Firefox. This is a second theme I hope to explore in this piece: is the opportunity for career advancement, education and other advanced computer usage provided by JAWS more valuable than having fewer bugs if, indeed, JAWS does have more bugs than its competitors?

Ted Henter

Before there was a JAWS, Ted Henter, its inventor and leader for many years, came to a realization: while some talking computer technology had already emerged, none of it was vocationally oriented. In those days, Ted worked for Dean Blazie, a close friend of his to this day, where they made the Braille ’n Speak (BNS), a truly remarkable device in its day. A blind user could do a lot with a BNS but it provided no access to the programs one might use in a job or at a university.

To solve this problem, Ted found an investor and started working on the DOS version of a program he called Job Access With Speech. From day one, the defining value behind JAWS was to provide access to professional situations and, to this day, it remains the dominant access technology for blind people in professional settings.

GW Micro Marketing

I joined Henter-Joyce in October of 1998. Among the first things I noticed was that the GW Micro web site claimed that Window-Eyes was “rock solid.” I’ve heard this claim repeated in their marketing materials and in reports from their users. What I’ve never seen is the scorecard I mention in the introduction of this article. I try to base my opinions in evidence; when I did my evaluation of Android, I tested every single feature that came out-of-the-box on my Nexus 7. Before I make a claim about quality or the lack thereof, I try to perform as full an evaluation as I can or find a published report containing such written by a credible source. In the 16 years I’ve been following screen readers, I’ve never seen a single report card of this sort for Windows screen readers, just lots of personal reports, lots of anecdote without evidence.

Does the lack of quantitative evidence mean that the assertions that JAWS is less stable than its competitors are untrue? Absolutely not, it just means that there is no data that can answer this question, so I’ll leave it unanswered. It’s not unreasonable for someone making a purchasing decision to rely on the anecdotal reports written by other users as, in the absence of real data, it’s all a blind consumer might have.

Regarding Window-Eyes, when Microsoft announced that one could get a copy at no extra cost if they owned Office, I grabbed a copy. I did not perform an extensive evaluation of the product as the reliability problems I found in the first half hour of using it convinced me that continuing with my evaluation was a waste of time. Specifically, on the Windows login screen, if one mistypes their password, Window-Eyes reads nothing of the resulting error dialog beyond its “OK” button, so a user doesn’t know what he’s saying “OK” to. Then, I discovered that when a user launches Window-Eyes, it may not read applications that were opened before it was started – a problem that exists in neither JAWS nor NVDA. Others whom I trust intimately have reported other major bugs as well. If Window-Eyes is, indeed, “rock solid,” I don’t see it.

Meanwhile, Window-Eyes remains the only screen reader on Windows that still does not support either touch gestures for navigation or Aria on the Internet (yeah, I know, GW Micro says it’s coming but it took them a decade to get Java supported so “is coming” may mean in 2025). Window-Eyes, in my mind, remains highly buggy and as feature poor as anything on the market today.

Let’s Look At Some Numbers

According to the 2014 WebAIM statistics, JAWS holds a marketshare in excess of 50% with NVDA approaching 20% and Window-Eyes falling in at about six points. To make the arithmetic easier, let’s say that JAWS has 8 times the number of users as Window-Eyes. Hence, it is run on 8 times as wide a variety of hardware, with 8 times as many combinations of personal settings, setups and Windows configurations. Let’s also assume that there are 8 times as many JAWS users discussing their problems online and, therefore, that it’s 8 times as likely that a JAWS bug will be seen by the Internet reading public as would a bug in Window-Eyes. Is it possible that JAWS’ much broader user base and much larger exposure in online media (formal and otherwise) may lead one to believe that it is actually more buggy? In the absence of the aforementioned scorecard, we cannot know.

JAWS’ Broader Feature Set

No one questions that JAWS is more feature rich than any other screen reader. It became so because of Ted’s commitment to providing a tool that blind people could use in professional settings. As far as anyone can tell, JAWS is still dominant in these settings because of its feature set, features which are absolutely necessary for many people to hold a job or further their education.

After I wrote the article describing my memories of GW Micro, a reader posted a comment reasserting, without any evidence, that GW won’t release a feature until “it’s rock solid” parroting Window-Eyes marketing literature. The person who posted the comment continued by stating that GW didn’t add Java support to Window-Eyes until version 8.0 and suggested that the near decade it took them to catch up to JAWS in this area was because of their commitment to quality. This implies that GW Micro had been working on their Java support for all of that time but chose not to release it until it was “rock solid” which, of course, is false. GW Micro didn’t add Java support until they were absolutely forced to do so by market demands.

What if the JAWS team had also decided to wait many years before adding Java support? A year after JAWS first supported the Java Access Bridge, the University of Florida (one of the top twenty public engineering colleges in the US) decided to change its computer science and computer engineering curriculum from one based in the Scheme programming language (a Lisp-like language developed at MIT in the seventies) to one based in Java. A blind student in that program could have, if he so chose, used Window-Eyes, as it was among the approved AT provided by the university but, had he made that choice, he would have had to drop out of the program because, using Window-Eyes, he could not possibly have done his class work. I suppose that the person who wrote the post considered this when he posted his statement and I suppose also that he thinks that waiting a decade for your AT to catch up to the reality of the technological world is a good idea. Our hypothetical blind student had no choice: he either chose JAWS or he failed out of college.

Personally, I think that saving that student’s college career is the most important thing a screen reader team can do with its time but, as always, I’d like to hear your comments.

A Data Point I’d Like To See

WebAIM statistics are nice, especially because they run year to year and allow us to observe trends. It’s also a self-selecting survey which, like all self-selecting surveys, is fraught with problems. Is one screen reader underrepresented in the report while another is overrepresented? This is a question the WebAIM report cannot answer. It would be impractical to expand the WebAIM survey to include more personal information about screen reader users. Unfortunately, there is very little other published data that can tell us much about the make up of the screen reader using public.

The data points I’d like to see, in a real, well constructed study, would help us learn much more about the efficacy of a particular screen reader. Specifically, I’d like to learn the median income of an employed JAWS user versus the median income of users of other screen readers. I’d also like to learn the average level of education attained by users of JAWS versus users of the other screen readers. Based purely in anecdote and in the complete absence of real statistical data, I’m willing to bet anyone $100 that JAWS users are A: more likely to be employed, B: likely to make more money and C: likely to be better educated than users of any other screen reader except, perhaps, NVDA. Of course, it would cost much more than a hundred bucks to do the study properly so the bet is probably not worth taking.

As I wrote in “Remembering,” I believe this is why Window-Eyes failed in the market and is why GW Micro is no longer a going concern. JAWS did everything possible to build a base in employment sectors, NVDA came along and grabbed a whole lot of the more technical blinks and SystemAccess grabbed the novice users while Window-Eyes offered nothing special at all.

Fanboyism

Earlier this year, when I published the three Android reviews, I expected and received a spanking from its loyal enthusiasts. Years ago, when I wrote BlindConfidential articles with titles like “Apple Just Sucks,” I got spanked by Apple’s fanboys. When I write critically about Window-Eyes, I hear from its loyal users as well. I understand that people love the things they use, the technology in which they’ve invested a lot of time and energy learning and they respond to criticism of their favorite things. I admit, I cringe when I hear some of my favorite things criticized as well.

What I didn’t expect from the Android series, though, was the celebration tossed by the iOS fans. In my mind, celebrating accessibility failures is never a good idea. I really like my Macbook Air and my iPhone 5S but I want all devices to be equally or more accessible. I take no joy in writing a review of accessibility that, based upon testing I’ve done or published reports from credible sources, is substandard. That you chose a device my blog suggests is “better” is a bad reason to celebrate that other devices may not be as good. This isn’t a game, Apple ain’t the Red Sox and Google ain’t the Yankees and there’s no reason to root for one massively profit generating corporation over another.

When I write a critical piece, I do so to inform my readers of results I have learned about some bit of technology. I do not do so to “gloat” that I had made a particular purchasing decision over another. I have no skin in this game; if a new device comes out tomorrow that I think I will like, I’ll go get it no matter the vendor. I view technology as tools and nothing more and I don’t root for Craftsman versus Snap-on either.

Conclusions

In general, I think that the access technology business needs much more real data driving the opinion pieces that are so rampant in this community. We all have our favorite things and it’s good that some people write about them, create tutorials and do all of the other things that make using computing devices much simpler for our community, but it’s also essential that we try to stick to facts, find the data to support our assertions and view all marketing literature with a very skeptical eye.

While editing this piece, I went through my usual process of adding links to as many of the proper nouns in this article as possible. I usually add a link to the first occurrence of any proper noun I use in an article. I always prefer including a link to a Wikipedia entry instead of a company or personal web site as Wikipedia’s crowdsourced manner of creating content is far more likely to be objective than are web sites written by businesses as marketing tools or by individuals about themselves. In this piece, I found a Wikipedia article I could link to about Ted Henter but not one about Dean Blazie. Some popular screen readers have Wikipedia entries, some do not.

Perhaps it’s a result of poor accessibility in the Wikipedia interface one uses to add or edit an article but, no matter the reason, the history of access technology, the products, the people who created them and the steady improvement of such is hardly reflected on Wikipedia. This is the one forum where we, as consumers, advocates, developers and users can write our own history and it’s something that we should do as soon as possible.

Back In The Game

Introduction

After I left my Freedom Scientific office for the last time in November 2004, I turned my professional path from working for an AT company to working in accessibility contracting, mostly for research groups. This time has allowed me to work on projects far outside the range of things that may be profitable in the future, to learn a whole lot about how standards work and to work alongside some of the nicest and smartest people in the AT world.

Last week, we announced that a new and very different accessibility consulting company, 3MouseTechnology (3MT), founded by Mike Calvo, Christopher Toth and me along with nine others, had opened for business.

I wrote the 3MT announcement press release and, for this article, decided to pretend I was a real journalist and compose using the third person as if I had been a recipient of the release and not its author. I talked to other 3MT members, got quotes and even quote myself in the article as if I was someone other than me. Gonzo journalism, my typical style, tends to be written in the first person so this is a bit of an experiment for me to see if I can write effectively in this style.

This article will, in third person, tell the story of 3MT, how we came to the decision to create the company, why we created the business and a bit about what we hope to do in the future. I hope you, my loyal readers, enjoy the piece and, as always, please post comments as I’d love to hear what you think.

A New Approach to Accessibility Consulting

“Mike and I wanted to do something different,” says Chris Hofstader, a founding member of 3MT. “We knew a lot of really smart, independent and productive blind software engineers, guys who’ve made some really impressive technology over the years and, given the enormous and growing demand for qualified accessibility specialists, we decided to build a new company around technology experts from within the community of AT users.”

“At 3 Mouse Technology (3MT), three quarters of our owners, our members if you will, are either blind or low vision. We bring the engineering skills, the knowledge of standards, guidelines and best practices and an intimate knowledge of the usability of a system to our clients,” adds Mike Calvo, another 3MT founder.

Nothing About Us Without Us

“Many times over the years,” adds Hofstader, “I’ve been asked, where can we find blind software engineers? and, with 3MT, I have an answer and it’s ‘right here.’”

“We believe that it is essential that people with disabilities be involved in every step of the development process,” says Mike Calvo, “As we can do the design, testing and usability components of a process, we’re available to those who hope to make their technology accessible at every stage of their project.”

3MT Services?

3 Mouse Technology is available to work on every sort of accessibility related project. 3MT does web accessibility design and remediation; its members have made accessible applications and access technology on most operating systems, have participated in the development of five different screen readers and have used virtually every access technology designed for this population.

“With our experience in AT, web and application development,” says Christopher Toth, “we have a combination of skills that no other accessibility contractor can claim. We are a total solutions provider available to companies of all sizes and needs.”

Companies seeking help with compliance with regulations like Section 508, the ADA Restoration Act, the 21st Century Communications and Video Accessibility Act (CVAA), and similar laws in various US states, the EU and other nations that have signed the UN Convention on the Rights of Persons with Disabilities will find solutions at 3MT.

The Exploding Need For Accessibility Services

Last year, the US Department of Justice (DOJ) announced that it had to delay the roll out of the technological requirements in the ADA Restoration Act to state, county and municipal governments for exactly one reason: “there are too few accessibility experts available to handle all of the remediation work required to come into compliance.” Meanwhile, the same US DOJ joined NFB in its lawsuit against H&R Block, contending that corporate web sites, like that of the tax preparation giant, are “places of public accommodation” and, therefore, fall under the regulation of the Americans with Disabilities Act (ADA), creating an even larger set of opportunities for contractors in this field.

“We don’t view the other accessibility contract services companies as competitors,” says Calvo. “In fact, most of us have good friends who own, operate and/or work for some of the other companies doing accessibility work. The fact is, this is a really big pie with a lot of slices and there’s tremendous opportunity in this field.”

3 Mouse Technology intends to work in cooperation with other related businesses and organizations and will in the coming weeks be announcing a number of strategic corporate partnerships with other teams with a similar set of goals.

Who Is 3 Mouse Technology?

“As far as I can tell,” says Hofstader, “3 Mouse Technology is the first team ever to include individuals from all areas of technology and vision impairment. We have Matt Campbell, the single most accomplished developer in this field. We have people who’ve worked on five different screen readers. We have folks who’ve made accessible applications on every major operating system out there. We have done web accessibility, application accessibility and have made access technology on multiple platforms. We are a total solutions provider.”

“If you asked me if we could get people from JAWS, SystemAccess, NVDA and so many other former competitors all on the same team,” adds Calvo, “I’d have never believed it. Here, with 3MT, though, we’ve assembled a team of former rivals to work toward a common set of goals.”

A Differently Organized Company

“When Mike and I first started tossing around ideas for a new project,” says Hofstader, “we knew we wanted to do something to help promote the careers of other blind people working in engineering. I knew some of the guys, Toth, Tyler, Matt and a few others, and was so impressed with their skills, their history, their energy and their commitment to making accessible technologies that I didn’t want them as employees; rather, I felt that we should all be partners running a business using democratic principles.” 3MT is organized as a workers’ cooperative in which each member owns an equal portion of its equity and has the same voting power on company matters.

“I wanted to ensure that every member of this team had full access to all of its information, including financial and contract details,” continues Hofstader. “I’ve seen too many consulting businesses take on risks without including their employees in the decision making process and later surprise their staff with an unexpected layoff. We wanted all of our members to be able to participate in proposing contracts, to follow our income and to be able to perform their own risk analysis before electing to participate in an effort. In effect, we wanted the entire team to participate in executive level functions.”

“While a democratic decision making process slows us down a little,” says Christopher Toth, “it also brings a level of scrutiny to everything we do and, because this is such an accomplished team, it includes a far wider range of inputs than would a typically organized company.”

The 3MT Future

Along with its short term goals of providing expert level contract services, 3MT will, in the future, be doing a number of different and innovative technology projects. While no one at 3MT is willing to talk too openly about its future, readers can assume that there will be a number of exciting announcements coming from it in the next few months.

Conclusions

I honestly believe that 3MT now boasts the most accomplished team of blind programmers in history. We bring a broad range of skills to the industry as well as the user perspective, which is essential for testing all of the functional requirements in the standards out there. If you would like to buy some services from us, please go to the 3MT site and send us an email. While there, sign up for our mailing lists so you can ensure you get all of our announcements.

Remembering GW Micro

Introduction

Over the past few weeks, those of us who follow such things have heard two major announcements from GW Micro: the first, that they had decided to start selling consulting services and the second, that they had been acquired by AI Squared, the Vermont based publisher of the market dominant low vision AT, ZoomText.

Over the years, in this blog and on BlindConfidential, I had often mused about GW Micro and its Window-Eyes screen reader. After learning of the AI Squared acquisition, I found myself reminiscing about Window-Eyes, especially during the years when I worked at Henter-Joyce and Freedom Scientific.

Before I start, though, I want all readers to understand that this article is a personal essay written entirely from my own memory of events and based entirely in my own opinion of the history I relate herein. This article will not cite references as I’m not using any. I will try to tell the story as best as I can remember but readers should keep in mind that human memory is highly fallible so be assured that some facts and dates and such in this article are probably incorrect. Also, readers should remember that I’m an engineering manager by profession and a writer by avocation and not, therefore, a credible business analyst. The opinions regarding how companies behave and how businesses ebb and flow are given from my personal perspective. I have worked professionally in the software business since 1979 and my opinions are based in having worked in the industry for most of my life and are not definitive by any known definition of the word.

Thus, if I get some facts wildly wrong, I’m happy to (as always on this blog) correct them. Minor factual issues, though, will be ignored if they do not reflect on the story I’m telling.

Window-Eyes: My First “Real” Screen Reader

In 1997, I found myself in a tremendously unfamiliar situation: I had no job and my vision had, due to retinitis pigmentosa, deteriorated to the point at which I could no longer see well enough to use a computer. As I had been programming computers professionally since 1979 and as a hobbyist since 1971, I fell into an incredible state of despair. “Computing is all I know,” I thought, “My career is over.” Then, as I had done so many times before, I decided to solve my problem by making a program that would read aloud the contents of the screen. I had a Macintosh back then and, using Apple’s development tools and a combination of C++ and AppleScript, I wrote myself a little utility that would take the selection from the screen, copy it to the clipboard and then push the contents of the clipboard to the old MacinTalk speech synthesizer. Thus, my first screen reader was a home-brewed hack I made for myself.
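
For readers who enjoy the mechanics, what follows is a minimal Python sketch of the same clipboard-to-speech idea, written against today’s OSX rather than the classic Mac OS of the original; it assumes only the built-in pbpaste and say command line tools and is an illustration of the concept, not a reconstruction of my old AppleScript and C++ hack.

# A minimal clipboard-to-speech loop in the spirit of the hack described
# above, sketched for modern OSX. It assumes the built-in command line
# tools pbpaste (read the clipboard) and say (speech synthesis).
import subprocess
import time

def speak_clipboard(poll_seconds=1.0):
    """Poll the clipboard and speak any new text that appears on it."""
    last_spoken = ""
    while True:
        # Read whatever text is currently on the clipboard.
        text = subprocess.run(
            ["pbpaste"], capture_output=True, text=True
        ).stdout.strip()
        # Only speak when the clipboard content has changed.
        if text and text != last_spoken:
            subprocess.run(["say", text])
            last_spoken = text
        time.sleep(poll_seconds)

if __name__ == "__main__":
    speak_clipboard()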

With my old Macintosh desktop loaded with my personal screen reader, I could navigate menus using CloseView, a really terrible screen magnifier that Apple once included with Macintosh products, at 16X magnification in reverse video. With that, I enrolled in a creative writing program at Harvard and never expected to work in software again. A few months later, though, my father told me he had had a conversation with an old friend who, coincidentally, was also blind from RP and who told him about a program called Window-Eyes. Being a terrific dad, mine bought me a Gateway laptop and a copy of Window-Eyes.

When the packages arrived at our home, I had already done some research into Window-Eyes and was really looking forward to giving it a try. After my wife spent a few hours on the phone with GW Micro technical support getting it installed, I sat down and gave it a whirl. “Holy shit!” I exclaimed. “This is the coolest thing I’ve ever seen, er… heard!” I would use Window-Eyes for the rest of that year enjoying learning about writing from some really outstanding instructors.

The following summer, though, I started thinking more vocationally and wanted to get back into the work force. Clearly, with a screen reader, I could probably work making software again. I sent my resume to GW Micro and never heard a response. I sent my resume to a friend at Microsoft and he suggested I apply for a job at a Florida based company called Henter-Joyce. I went to the HJ web site and saw that, indeed, they had an opening for an engineering manager. I sent them my resume, had a few phone calls with Ted, Glen and some others and the rest is history.

Switching To JAWS

During the year in which I used Window-Eyes, I really didn’t become anything one would describe as a “power user.” Instead, I did pretty well in Word, Internet Explorer 3.x and the long forgotten email client, Eudora. I studied creative writing so all I really needed was a word processor, a browser and a mail program. While I interviewed with HJ, though, Ted Henter felt it essential that I at least try JAWS and he sent me an evaluation copy while they decided whether or not to hire me. After getting JAWS installed on the same Gateway laptop, I started using it for more or less the same tasks. The one major difference I noticed was that, at that point in history, the Window-Eyes “MSAA Mode” for reading web content did a much nicer job than did the JAWS 3.20 “page reformatting” techniques. Otherwise, once I learned the JAWS keymaps, the switch was simple and, as HJ offered me the job leading the JAWS development team, I was sold on it.

The Collegial Competitors

Prior to joining HJ in 1998, my career in software had been entirely mainstream. I was, therefore, accustomed to the sort of competition one observed in the mainstream software industry back in those days. In access technology, though, I was introduced to a whole new way of viewing things. In September 1999, after we released JAWS 3.31, the first version with the virtual buffer approach to the Internet, I mentioned on a public mailing list discussing screen readers that “JAWS can load Jamal Mazrui’s www.empowermentzone.com site in under 30 seconds while it takes Window-Eyes more than 25 minutes to do the same,” and the reaction I got puzzled me. Another HJ executive pulled me aside and told me, “the GW guys are upset about your email.”

“Sure,” I said. “Our performance kicks their ass, I’d expect them to be upset.”

“It’s not that,” continued my colleague. “They’re upset that you criticized them so publicly.”

“How else would I criticize them, they’re the competition after all,” I added.

I couldn’t believe that the AT biz took such a “candy assed” approach to competition. In my mind, if we said something that was absolutely true, something anyone who owned the two screen readers could test side by side, we should announce where we did better as loudly as possible. I also felt that, when we heard that Window-Eyes bested JAWS in some area, we should do our damnedest to ensure that our users would soon have something equal to or better in an upcoming JAWS release.

I thought this was how competition worked in the software industry: if a competitor beats you at something it’s incumbent upon you to catch up and try to beat them; if you do something better than a competitor, then you should expect them to retaliate with something cool in their next release as well. What I would learn about the AT biz, though, is that it follows a very different set of rules.

The Race To Windows NT

In the years prior to my joining HJ, they acquired a screen magnification program called MAGic from its Massachusetts based developer. This deal happened before I joined the team so I wasn’t present and have no details about the acquisition. MAGic would never become a market success, losing to the AI^2 ZoomText, which still maintains a monopoly like share of the market, in every screen magnifier marketshare report I’ve ever seen. What MAGic did, though, was give JAWS the video hooking technology necessary to get into the Windows NT market. Ted and the gang at HJ realized that the US federal government, primarily for security purposes, was upgrading all of its computers from DOS to Windows NT 4 and JAWS and MAGic became the only screen reader and magnifier combination that the feds could buy.

Having a Windows NT solution allowed HJ to sign a long term contract with the US Social Security Administration (SSA) which would be followed by large and long term contracts with the Department of Education and a number of other US federal government agencies. These big time deals, followed by the implementation of Section 508 in the federal space, gave HJ and later Freedom Scientific a massive infusion of cash, something that neither AI^2 nor GW could then boast.

When I joined HJ, the estimated marketshare figures for JAWS and Window-Eyes would have them tied with around 40% each with Dolphin and all others holding onto the remaining 20%. On screen magnifiers, MAGic was third behind both ZoomText and the Dolphin products.

Investment In JAWS

When HJ got the first contract with SSA, as I show above, JAWS and Window-Eyes were effectively tied in the marketshare competition. Ted Henter could have chosen to take a whole lot of those dollars home as a windfall but, instead, chose to invest most of your tax dollars back into the business. Along with other executives, Jerry Bowman, Eric Damery and Glen Gordon, Ted worked to bring in professionals like Sharon Spenser and me to help build a management team that could successfully bring the business to the next level. From my first months on the job, I worked well on the team with Glen and Eric and, together, we set out to win the screen reader wars. We were done with “speak no evil” marketing, we dropped the gloves and got down to the business of building a market giant.

What none of us expected, though, was just how easy GW Micro would make our quest for the dominant position.

The GW Micro MSAA Fetish

As I wrote in my recent article on the importance of standards, MSAA was the first accessibility API available on a major operating system. Unfortunately, as I also describe in that article, the early versions of MSAA were not up to the task of providing a truly accessible experience in all but the most simple of applications. For that reason, we at HJ/FS chose to use different techniques to gather information which we would then provide to our users in speech and braille. With JAWS 3.31, we demonstrated that the GW MSAA solution had profound performance problems and, along with IBM’s Home Page Reader, we would over the following few years also demonstrate that, using other techniques, a screen reader could approach 100% of the web user agent guidelines. GW Micro nonetheless steadfastly adhered to their MSAA only strategy, a strategy that would help spell their market demise. Year after year, review after review, blind users, government purchasing agents and everyone else who paid any attention would read how JAWS had gotten “even better” on the Internet while Window-Eyes stood more or less in place.

To a guy like me with a mainstream software background, this was incredibly confusing. Both FS and IBM demonstrated solid progress year in and year out while GW Micro, ostensibly the competition to JAWS, did virtually nothing to catch up.

One might think that the income HJ/FS derived from its big time government contracts fueled the rapid improvements in JAWS but, while this is true for other areas, virtually all of the code written to support the Internet from JAWS 3.31 until at least JAWS 5 was done by one guy, Glen Gordon, who was in the employ of HJ before the big contracts started coming in. If one programmer, granted one very smart programmer, at HJ/FS could do all of the work, certainly GW Micro could have afforded to do the same. Of course, that’s how a “normal” company in a “normal” industry would behave and, as I said at the top, the AT industry plays by a weird set of rules.

The Great GW Micro Contribution

Shortly after Congress passed Section 508 of the Rehabilitation Act and the Bush administration published its rules for implementing it, a whole lot of mainstream software companies realized that, in order to keep their big time government sales, they had to at least pretend to pay attention to accessibility. At FS, we would get inquiries from multi-billion dollar companies like Adobe, Macromedia, PeopleSoft, Oracle and others. These mainstream giants needed an accessibility solution. To work with these businesses, I proposed, and FS launched, its software consulting team which, roughly 15 years later, is today led by accessibility rockstar Matt Ater. We viewed the accessibility of mainstream software as the problem of its publishers and, while such work was important, we were certainly not going to work for software monsters like these without compensation. Typically, in one of these situations, the big time software company would approach us, we’d say that we’d be happy to work with them for our hourly rate ($125 per hour in those days) and, soon, we’d get a “thanks but no thanks” note from the prospective client and would learn that GW Micro was willing to take on the work at no charge.

All of that was a long time ago so I do not recall the order of the inquiries but, after the first of these companies approached FS and then chose to go with GW instead, we heard back from them with a, “the federal government doesn’t want it if it doesn’t work with JAWS,” which, of course, we knew already and had already told the potential client in our original proposal. At this point, though, the mainstream vendor had an MSAA solution delivered to them at no cost by GW Micro so they would only need to pay FS to do the JAWS side of the effort, saving the billion dollar corporation tens to hundreds of thousands of dollars. At FS, we would privately say “thank you” to GW for doing the work as, typically within a month of their doing the heavy lifting, JAWS would use the MSAA implementation they had done pro bono.

While this work was disastrous for the GW Micro business and, to this day, I cannot understand why they would repeatedly make the same mistake, it served the community of access technology users very well. A whole lot of mainstream software became more accessible because GW Micro worked as volunteers on their MSAA solutions. This, in turn, helped set a precedent that, indeed, using a documented accessibility API was a good strategy and, today, with excellent APIs on virtually every OS, the model GW professed is the industry standard.

We should all be grateful to GW Micro for doing this work as it established a solid foothold for a standards based approach to accessibility, even if it hurt their business badly. It might have made sense if an independent advocacy group like NFB or ACB had helped fund this kind of development work but, for a small company like GW Micro, it seemed suicidal.

Microsoft Office

The team at Microsoft that wrote Office back then ignored MSAA almost entirely. JAWS users needed to use Office in their jobs and would need access to its more advanced features in order to get a promotion or do well in a university. At FS, we used a different standard to gain access to the information in these programs. Our decision was to use Visual Basic for Applications (VBA), a programming tool used by corporations and others who want to use Office features in their proprietary applications. By simply adding the ability to query applications via the VBA interface, the JAWS scripting language gained a well documented manner of providing our users with access to information otherwise unavailable to a screen reader. This is one area where the greater HJ/FS income does come into play: once we added the VBA feature to the scripting language, we could hire a staff of full time JAWS scripters and let them invent the future.
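
To give a flavor of what querying an Office application through its automation object model looks like, here is a rough Python sketch using the pywin32 package rather than the JAWS scripting language. It assumes Windows, pywin32 installed and a running copy of Microsoft Word with a document open; it illustrates the idea, not the actual JAWS implementation.

# Ask Word, through the same automation object model that VBA macros use,
# about the current selection. Assumes Windows, the pywin32 package and a
# running copy of Microsoft Word with a document open.
import win32com.client

def describe_word_selection():
    """Return a short description of the current Word selection."""
    word = win32com.client.GetActiveObject("Word.Application")
    selection = word.Selection
    # The object model exposes the selected text and the formatting details
    # a screen reader might want to announce.
    text = selection.Text
    font_name = selection.Font.Name
    font_size = selection.Font.Size
    return f"Selected: {text!r}, {font_name} at {font_size} points"

if __name__ == "__main__":
    print(describe_word_selection())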

GW Micro held to the position that a scripting language was too hard for the average user to learn, a straw man argument as few JAWS users would ever need to write a script themselves. As we added dozens of new features to the JAWS Office support with each release, including advanced features in PowerPoint, MS Project and even charts and graphs in Excel, Window-Eyes seemed to be simply waiting for the day that MS would improve MSAA enough to do all of the cool things we could with JAWS. Ultimately, GW Micro would add a VBA enabled scripting facility to Window-Eyes but it would come years after JAWS and NVDA had crushed them in the marketshare battles.

Why GW Micro refused to add the functionality that JAWS had for years in Office still confuses me today. Did they really think they were competing by simply doing what appeared to me to be as little as possible?

The No Cost Window-Eyes

Earlier this year, Microsoft announced that any user of a licensed version of MS Office 2010 or newer could download a copy of Window-Eyes at no charge. This announcement made quite a splash but I chose not to write about it as our friends at Serotek had written a good piece on the matter and I had nothing to add. Like many others who hadn’t given Window-Eyes a look in a long time, I downloaded a copy and gave it a try. Right off, I noticed that it didn’t work with my favorite applications as well as does NVDA and, rather than spending a lot more time digging into it, I left it installed on my Windows tablet but haven’t launched it in months.

My impression, though, from talking to a bunch of other blind people who’ve also tried the no cost Window-Eyes, is a solid sense of “who cares.” Most of my screen reader using friends gave it a whirl and, for the most part, returned to NVDA, with a few still using JAWS.

Any regular reader of this blog would know that I’m something of a standards freak. A decade ago, Window-Eyes refused to compete in the standards battle. Today, compared to NVDA, a free screen reader written mostly by two very underpaid guys in Australia (I cannot comment on the current state of JAWS as I don’t have a copy installed anywhere), Window-Eyes remains far behind. As far as I can tell, Window-Eyes ignores most of the Aria specification, so it would be useless even in a web app built accessibly to the standard. So, if you care about web accessibility through standards, best practices and objective measures, support NVDA as they’re doing it right and they need all of the help they can get.

Conclusions

To be honest, I can’t conclude anything about GW Micro other than that they were the most puzzling competitor I’ve ever come up against. Their willingness to give away their consulting time for free and their refusal to even try to match JAWS functionality in MS Office or on the Internet still feel like corporate suicide to me. That GW Micro lasted as long as it did as an independent endeavor also confuses me: who were their customers? Obviously, given the most recent WebAIM statistics showing Window-Eyes with a share below virtually all other screen readers, the answer is “very few people.”

I predict that when MS releases Windows 9, it will contain a new Narrator that will be competitive with Apple’s VoiceOver on OSX. At that point, the no cost Window-Eyes will be obviated and it will become a forgotten product overnight. People who need more performance and more features than the Narrator I imagine will continue to use either NVDA or, especially those who use applications that require a lot of customization, JAWS. My predictions are based on nothing more than an idea I’ve pulled out of my butt so, if it comes true, remember, you heard it here first.

Standards Are Important

Introduction

For nearly twenty years, the community of people working on disability related issues involving technology has worked very hard to create a set of standards, guidelines and best practices for accessibility. This article intends to explore the history of why the rich set of standards we now have available to developers had to be created and why standards based, objective measures are the only way to reliably measure the accessibility of a device, an app, a web site or an operating system.

It’s Not Just About Blind Users

This article was motivated by a tweet I received from one of my Twitter friends. I had stated that universal accessibility means including people with all disabilities, not just we blinks. His response, “why do i when i look at a phone or computer should i care about other disabilities? like motor impairment…” startled me. He, a blind person, was stating boldly that, if technology works for him with a screen reader, it doesn’t matter whether it also works with access technology designed for people with disabilities unrelated to vision. Following his logic, all mainstream web developers should probably just ignore the accessibility standards altogether as, if we blind people don’t care about people with other disabilities, why should any web developer care about people with any disability, blind or otherwise?

All technological accessibility standards of which I am aware are designed to provide access to people with as many different disabilities as possible. Hence, a standards based approach to accessibility solves not just problems encountered by people with vision impairment but allows access technology designed for every imaginable disability to function successfully.

Some History

Fifteen or so years ago, when the Internet was truly the wild west, accessibility was handled on a site by site, AT by AT basis. Web sites that used mostly text and stuck to standard controls were, more or less, accessible; those that were more visually appealing were rarely so. The world is dominated by sighted people who like looking at pretty things; highly visually oriented web design became the norm, so accessibility of web content had to be invented.

Screen readers like JAWS and a nifty Internet accessibility tool from IBM called Home Page Reader (HPR) started tackling the problem of delivering the Internet to people with vision impairment. GW Micro used the strategy of gathering its web information using the Microsoft Active Accessibility (MSAA) API which, as we’ll see, wasn’t yet up to the task of supplying a screen reader with everything it needed to provide to its users. JAWS, Window-Eyes and the popular low vision tool ZoomText all also used different heuristics for gathering information from a web browser, heuristics as non-standard and as proprietary as proprietary can be. In the blindness/low vision space alone, chaos dominated the accessibility field.

The Invention of the Virtual Buffer

While GW Micro chose to stick with the MSAA approach to web accessibility, the teams at Freedom Scientific and IBM took JAWS and HPR in a radically different direction. With the release of JAWS 3.31, the world of blind computer users was introduced to the virtual buffer, now the standard way of delivering web content in all Windows based screen readers.

[Just a definition: the term “virtual buffer” was coined by Eric Damery, Glen Gordon and me in a meeting at FS in the months immediately prior to the 3.31 release. Our definition of the term was, “a buffer with no on screen components that a user could read using arrow keys like in a word processor, activate links within and do a few other things necessary then for JAWS to provide superior access to the Internet.” In the years since, though, mostly when speaking to friends who work on or use the Orca screen reader on the Gnome desktop, I’ve heard a modified definition of “virtual buffer” that adds a component of preprocessing done by a screen reader to the data before it is placed into the buffer to be read by the user. As both JAWS and HPR did, indeed, augment the text and include workarounds to force some information to be readable when it was far from accessible in a standard manner, the term “virtual buffer” applies under both definitions.]

The teams working on JAWS at FS and on HPR at IBM came to an identical conclusion simultaneously: if we wanted to provide a rich web browsing experience to blind computer users, if we hoped to follow the web User Agent Guidelines (UAG), we could not use MSAA to gather the data as MSAA didn’t support most of the elements needed to comply with the UAG. At FS, we solved the problem by using the VBA interface to Internet Explorer and asking it to give us all of the HTML it was using to display the page on screen. JAWS then, as if it were a mini-browser built into a screen reader, would parse the HTML itself, add words like “link” and the like and place the user into a “virtual buffer” that could be navigated like a word processing document. IBM would follow about six months later with a very similar solution in HPR.
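
For readers who want to see the shape of the idea, here is a toy Python sketch of a virtual buffer: it flattens a snippet of HTML into a list of plain text lines, announcing links the way a screen reader might, so a user could arrow through the page like a word processing document. It is a sketch of the concept built on the standard library’s HTML parser, not a description of how JAWS or HPR actually did it.

# A toy "virtual buffer": flatten HTML into plain text lines, prefixing
# link text with the word "link" the way a screen reader announces it.
from html.parser import HTMLParser

class VirtualBuffer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.lines = []        # the flattened, navigable buffer
        self._in_link = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.lines.append(f"link {text}" if self._in_link else text)

page = '<h1>Welcome</h1><p>Read the <a href="/faq">FAQ</a> first.</p>'
buffer = VirtualBuffer()
buffer.feed(page)
for line in buffer.lines:  # arrow key navigation would step through these
    print(line)            # Welcome / Read the / link FAQ / first.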

Virtual Buffer Workarounds

Both FS and IBM took the approaches we chose in order to provide the best possible experience to blind users who wanted to use the Internet. Neither team should ever apologize for what we did as, had we waited for standards to emerge, our users would have had no more than a tiny fraction of what we could provide through what, in retrospect, were truly ugly, kludgerous hacks. In our quest to make an excellent experience for ourselves (both the JAWS and HPR teams were filled with actual users, something GW Micro couldn’t boast), for our users and our customers, and to set a bar for what “excellent” accessibility to the Internet meant for blind users, we broke every rule in the book and, to this day, I’m proud of that work.

Unfortunately, there were, as is often the case, unintended negative consequences of our having taken the law into our own hands. Specifically, a lie started to spread around the community of people interested in web accessibility, and I was one of the people who promoted the great falsehood, “A site is accessible if it works with JAWS.”

JAWS was, without a doubt, the gold standard for accessibility for people with profound to total vision impairment and, indeed, regarding standards, JAWS and HPR could boast scores greater than 90% when tested against the user agent guidelines while Window-Eyes and the Dolphin products scored forty percent and below, so we were “standards compliant” in that regard. We were not, however, compliant with the web content guidelines and all of the razzmatazz we did behind the scenes to make non-compliant content appear to our users in an accessible manner made JAWS a terrible testing tool for all but one case: testing whether something was compatible with JAWS. In those days, a web site could claim it was “accessible” if it had been tested with JAWS and worked, but users of Window-Eyes, ZoomText, the Dolphin products and access technology designed for disabilities other than blindness were screwed.

The Testing Against JAWS Hangover

A decade ago, when JAWS was the king and truly set the standard for accessibility on the web for blind users, the statement that something was “accessible if it worked with JAWS,” while untrue, did hold some credibility. By 2004, JAWS was holding a monopoly marketshare and, as the other screen readers were far behind in standards compliance, was also the one blindness related product that one could use to exercise the standards fully. I don’t know if it’s still there but FS then published an HTML document we wrote with a title like “The HTML Challenge” that one could load into their favorite browser with their favorite screen reader to see just how much of the standard was exposed to them. Of course, users who tried the challenge with JAWS or Home Page Reader got terrific results; those who tried it with Window-Eyes, HAL and other screen readers were sadly disappointed with the product they chose.

Thus, for good reasons a decade ago, testing web content against JAWS became a practice used at many web development shops that care about accessibility. Unfortunately, today it is no longer sufficient to test with JAWS and, even worse, JAWS is no longer the screen reader most compliant with web standards like WCAG 2.0 and WAI/Aria. Today, the top screen reader regarding standards compliance is NVDA, a FLOSS screen reader for Windows.

This does not, however, mean that testing against NVDA is a guarantee of standards compliance either. Remember, testing against a screen reader will only provide you with results apropos to screen reader users and will ignore important aspects of the standards that are meaningless to people with vision impairment.

One really important item in WCAG is the set of rules on the rate at which something can “flicker” on the screen. Obviously, NVDA, JAWS or any other screen reader wouldn’t monitor whether something is flashing on and off; blind users wouldn’t care, but some people with epilepsy can have a seizure triggered by an object on the screen flickering at a specific number of hertz. In some cases, such seizures can cause permanent brain damage, so this is a really important part of the accessibility standards that one would miss by testing only against a screen reader.
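
As an illustration only, here is a deliberately simplified Python check inspired by WCAG 2.0 success criterion 2.3.1, which limits general flashing to three flashes in any one second period; the real criterion also considers flash area and saturated red flashes, none of which a screen reader has any reason to measure.

# Flag content that flashes more than three times in any one second
# window, a much-simplified reading of WCAG 2.0 criterion 2.3.1.
def exceeds_flash_threshold(flash_times, max_per_second=3):
    """flash_times: seconds at which each flash occurs, in ascending order."""
    for i, start in enumerate(flash_times):
        # Count flashes inside the one second window beginning at this flash.
        in_window = [t for t in flash_times[i:] if t < start + 1.0]
        if len(in_window) > max_per_second:
            return True
    return False

print(exceeds_flash_threshold([0.0, 0.2, 0.4, 0.6]))  # True: four flashes within one second
print(exceeds_flash_threshold([0.0, 0.5, 1.0, 1.5]))  # False: never more than three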

If Not With JAWS, How Can We Test?

There are a number of good automated accessibility testing tools out there and, as part of a new project I’m doing on another blog, I’m working to build a web page filled with pointers to such resources. As I’ve just started creating a catalogue of these tools and haven’t had the time to see which are current, which are free, no cost or come with a hefty price tag and which are themselves accessible (I’m not going to promote an accessibility tool that a PWD cannot also use), I don’t yet have such a list to post publicly. I am told that the Internet Explorer accessibility plug-in from The Paciello Group is very good but I can’t vouch for it myself. I’m sure that if you google “web accessibility test remediation tool” you will find some other good ones as well.
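
To show the flavor of what such tools automate, here is a rudimentary Python sketch using the BeautifulSoup library (installed as beautifulsoup4) that flags two common failures, images without alt text and form inputs without labels; a real testing tool covers far more of the standards than this and I am not presenting it as a substitute for any of them.

# Two rudimentary accessibility checks: images missing alt attributes and
# form inputs with neither an associated <label> nor an aria-label.
# Requires the beautifulsoup4 package.
from bs4 import BeautifulSoup

def basic_accessibility_report(html):
    soup = BeautifulSoup(html, "html.parser")
    problems = []
    # Images must carry an alt attribute (an empty alt is fine for decoration).
    for img in soup.find_all("img"):
        if not img.has_attr("alt"):
            problems.append(f"Image missing alt text: {img}")
    # Inputs should be referenced by a <label for="..."> or carry aria-label.
    labelled_ids = {label.get("for") for label in soup.find_all("label")}
    for field in soup.find_all("input"):
        if field.get("type") in ("hidden", "submit", "button"):
            continue
        if field.get("id") not in labelled_ids and not field.get("aria-label"):
            problems.append(f"Input missing a label: {field}")
    return problems

page = '<img src="logo.png"><input type="text" name="q">'
for problem in basic_accessibility_report(page):
    print(problem)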

Now, Standards Exist

As I wrote above, fifteen years ago, AT developers had to hack their way through web accessibility; today, that is no longer true. Today, we have standards like the Web Content Accessibility Guidelines 2.0 (WCAG), WAI/Aria for web apps and, most recently, the BBC Mobile Checklist for device accessibility. There is no reason any technology in 2014 should not be fully compliant already, as none of these standards demands any difficult engineering. Some aspects of making a web app accessible can be tricky but, if you’re accomplished enough a developer to be making a complex web app, I’m highly confident that you can figure out the WAI/Aria part as well.

A Call For Universal Accessibility

Atop this article, I mention the problem that can arise when blind people decide that we’re “special” and agree to leave behind people with other disabilities as if they didn’t matter. If we make the argument, “it works for me” and walk away from other groups who benefit from standards based accessibility, we are cutting our own throats. The reason that the web is more accessible today than ever before is because an increasingly large number of web sites are choosing to be compatible with standards and guidelines for accessibility. If we, as consumers of access technology eschew standards in favor of our own selfish desires, we are endorsing bad accessibility for ourselves as well.

Conclusions

If you want to make sure that your web site is accessible, please do so in a manner that is based in the standards and don’t just test with a screen reader. Testing with a screen reader is a good way to ensure that you’ve gotten most things right but, unless you’ve tested against the standard itself, you may still trigger a seizure somewhere by accident.

If you are a consumer of access technology, whether blind or otherwise, insist that your AT be as standards compliant as possible so you can enjoy the excellent accessibility available in programs like Microsoft Office Online.

If you are a person with a disability, please try to stand together with those of us who endorse universal design as the only appropriate route to universal accessibility. The route to universal design on the Internet is through standards compliance, whether accessibility related or not. If we, as blind people, ignore the needs of those with other disabilities, we are simply begging the mainstream to ignore our needs as well.

Back In The USA

Introduction

As I wrote in my previous piece, I spent the weekend of 4/12-13 at the QED conference in Manchester, England. This event is run by the terrific people at the Merseyside Skeptics Society (MSS) who did an incredible job of lining up speakers, arranging panels and delivering all of it in an entirely accessible manner. If you are interested in science, humanism, skepticism, and related subjects, do attend QED in the future, you will not be disappointed.

Gratitude

I’d like to start by thanking the amazing team at MSS for doing such an incredible job organizing and delivering my favorite conference every year. These guys, Mike Hall, Michael Marshall, Andy Wilson and the rest of the gang, did an amazing job of making this a tremendously welcoming event for all, including us people with disabilities. They are a terrific bunch of people whom, if you get the opportunity, you should meet and befriend as they are simply awesome.

I’d also like to specifically thank a few friends for hanging out and making my time there so special. These include Hayley and Charlie Stevens, James and Liz from Pod Delusion, Adam and his terrific mom Jeanie and many more. Part of what makes QED so special is having the opportunity to socialize with so many other really smart and interesting people. If you attend a QED in the future, you will find that you already have friends there, you just haven’t met them yet.

The QED Speakers

I enjoyed virtually every presentation and panel I attended at QED. Most special, however, was the keynote speaker, Nate Phelps, formerly of the hateful Westboro Baptist Church. Phelps described, in harrowing detail, his life growing up as a son of the violent Fred Phelps. What made Nate’s talk so compelling is that it was delivered entirely without the anger, bitterness or “hate” one might assume someone who escaped the hell of his early life would maintain. Phelps spoke with kindness and I don’t think a single person of the 550 or so in the room wasn’t tremendously moved by his talk.

Being Reasonable

A large part of QED is the notion of “being reasonable.” In fact, Michael Marshall, a QED organizer, and my dear friend Hayley Stevens do a terrific podcast called “Be Reasonable” in which they interview people with beliefs radically different from their own. I asked Hayley, “How do you remain so patient? How don’t you lose your shit talking to these people?” Hayley said, “We, Marsh and me, we just want to learn so we’ve learned to be good at listening.”

At one point in the conference, I had the opportunity to enjoy a hallway chat with James O’Malley, editor of the awesome Pod Delusion podcast to which I make an occasional contribution. James asked me a simple question: “How can we enforce accessibility regulations, standards and such on web sites that get fewer than 20 hits per month?” James’ query interested me. I’m terrific at telling people that the accessibility of their technology (web site, app, whatever) is shit but, when I do, I have, in the past, only had remediation solutions in hand for the wealthiest organizations out there: big companies, government agencies, universities and the kinds of institutions who can afford to pay high priced consulting dollars for experts. I’ve had nothing to offer all of the important skeptical sites run by individuals and groups altogether too small to pay anyone, let alone a contractor, to do accessibility remediation.

During one panel, I asked a question that was really more of a statement on accessibility, the right to read, literacy rights, discrimination and other fundamental issues regarding disability. I became very aggressive; I was a dick. The reality of the situation is that everyone on the stage wanted to be accessible; they simply didn’t know how. Thus, I’m launching a new project associated with Skeptability, a disability-centric sister site to Skepchick, that will gather accessibility resources in a manner that they can be used by non-engineers to do their own remediation. Overwhelmingly, new web sites in this community are based on WordPress, and those that aren’t tend to use either Drupal or Joomla, systems on which an author can make their work very accessible with very little time or effort involved. I’m going to try to make it all as simple as possible and will try to write the prose using as little jargon as I can. I hope having such a resource will help make the world of skepticism more welcoming to all.
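To give a flavor of how mechanical these checks can be, here is a rough sketch, entirely my own illustration and not part of the planned Skeptability resource, of the kind of automated first pass a guide like this might describe in plain language. It assumes Python with the third-party requests and beautifulsoup4 packages, and it flags three of the most common, most easily fixed problems on sites built with WordPress, Drupal or Joomla: images without alternative text, a missing page language declaration and a missing or duplicated top-level heading.

```python
# check_a11y.py -- an illustrative sketch only, not part of the planned resource.
# Assumes the third-party packages `requests` and `beautifulsoup4` are installed.

import sys

import requests
from bs4 import BeautifulSoup


def check_page(url):
    """Fetch a page and report a few common, easily fixed accessibility problems."""
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    problems = []

    # Every <img> should carry an alt attribute describing the image
    # (alt="" is fine for purely decorative images).
    for img in soup.find_all("img"):
        if img.get("alt") is None:
            problems.append("Image missing alt text: %s" % img.get("src", "(no src)"))

    # The <html> element should declare the page language so screen readers
    # can pick the right speech synthesizer voice.
    html_tag = soup.find("html")
    if html_tag is None or not html_tag.get("lang"):
        problems.append("Page does not declare a language on the <html> element.")

    # A single <h1> gives screen reader users a reliable anchor for the page.
    if len(soup.find_all("h1")) != 1:
        problems.append("Page should have exactly one <h1> heading.")

    return problems


if __name__ == "__main__":
    for problem in check_page(sys.argv[1]):
        print(problem)
```

Running something like this against a single page prints a plain list of problems an author can fix from their CMS editor without ever touching code. The resource itself will be written as jargon-free prose rather than scripts, but the checks are the same ones I’d walk people through.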

If I’m not part of the solution, I’m part of the problem. Around this community, I’ve been a good critic but a terrible fixer.

Be Unreasonable?

Of the more than a dozen presentations and panels I attended in Manchester, including the terrific Skepticamp organized by the Pod Delusion gang on Friday, there was only one that I didn’t enjoy much. This was the “Guerrilla Skeptics” talk given by Las Vegas magician and mentalist Mark Edward.

First, Edward, a performer of tremendous talent, told us that he wouldn’t do a demonstration of cold reading or mentalism because the audience would already know about that stuff. The audience has all previously seen Richard Wiseman turn a tea towel into a rubber chicken, but the audience always enjoys seeing the trick again, even if Wiseman has long since grown bored with doing the gag. Perhaps, if Edward had warmed up the audience with a bit of humor and “magic,” he would have seemed less angry and would have been a more effective presenter.

During his talk, Edward showed a slide containing a picture of a banner stating, “Sylvia Brown is a Convicted Felon.” This is true; the late pseudo-psychic Sylvia Brown was convicted of a felony in her past. When I heard him mention this, I muttered to the person sitting beside me, “Convicted felon? You mean people like Nelson Mandela, Mohandas Gandhi, Martin Luther King, Malcolm X and Brian Dunning?” and he snickered.

Ad hominem, last I checked, is a logical fallacy. That Brown had been convicted of a felony is orthogonal to whether or not she had actual psychic abilities.

If we explore the case of skeptical celebrity Brian Dunning, we may find a different perspective on using such logical fallacies in our arguments. Dunning pled guilty to fraud charges and admitted that he had stolen more than a million US dollars using illegal and fraudulent techniques. Dunning has, thus far, refused to apologize for the crimes for which he was convicted, instead saying, “It wasn’t really very much money.” Having had a drug addict friend of mine spend six months in a Florida county jail for stealing $200 worth of crap from a discount store, while seeing Dunning steal more than 5,000 times as much and serve no time, certainly annoys me, but that’s a function of the general inequities of the American legal system: steal a little, go to jail; steal a lot, get a fine and continue with your safe suburban life.

Does Dunning’s criminal behavior cast doubt on the value of his Skeptoid podcast? I would say “no”; the hacking crimes of which he was convicted say nothing about the quality of the research, presentation or anything else about Skeptoid, one of my favorite podcasts. Dunning’s work product is outstanding, and it’s one of the very few podcasts that never backs up in my pod catcher; when I see a Skeptoid episode has dropped, I listen almost immediately. Dunning’s work in the skeptical movement is undoubtedly excellent but, indeed, he is not just a convicted felon, he’s a convicted fraud. Thus, the ad hominem statement that Sylvia Brown had been a felon says as much about her as the same statement does about Dunning. If we’re going to use logical fallacies to combat those with whom we disagree, then, to avoid hypocrisy, we need to accept the same fallacious statements about our friends.

When Edward said that Phil Plait’s notion of “don’t be a dick” was a bad idea for behavior towards all but “friends and family,” I started hearing under-the-breath mutterings from others sitting near me. In short, they came down to, “this is what is wrong with skepticism in America.” I felt a bit of shame for my fellow US skeptics and, more so, started questioning my own tactics regarding accessibility and the skeptical movement.

I’m happy that I sat through Edward’s talk as, in many ways, it has helped me formalize my approach to “being reasonable” while trying to effect change regarding accessibility. Edward caused me to ask, “Am I that guy?” and, when the answer was, “Well, shit Chris, you are…” I decided to make some changes in how I approach people regarding the issue I personally find most important.

The 20 Is Plenty Campaign

One of the most interesting conversations I had at QED was over breakfast with a lovely woman named Anna Semlyen. Anna is the leader of the “20 Is Plenty” campaign to have speed limits in residential areas reduced to twenty miles per hour. Her core issue is the rights of individuals to walk and ride bicycles more safely. Her Skepticamp presentation was loaded with highly compelling data for why this is a really good idea and, of course, pedestrian issues are also at the core of the movement for independence for people with disabilities. It was absolutely terrific to have the opportunity to discuss the intersection of her issues with those on which I work, and I look forward to helping promote this issue in the US in the future.

Conclusions

As I say at the top of this piece, if you haven’t attended QED before, come next year; if you’ve come in the past, please return as I’d enjoy meeting you again. To all of the organizers, speakers and attendees, here’s a big Gonz thank you for making the event so incredible.