2014 In Review And Predictions For 2015

Introduction

It’s been a while since I’ve done one of my random musings pieces. I think about all sorts of shit and, sometimes, I type up such notions and present them to you, my loyal readers. In this piece, I thought I might talk a bit about this blog itself and how it performed over the past year, the various topics discussed here and then let my stream of consciousness take this piece in whatever direction the wetware points.

I had gone into my web analytics page to take a look at a few things and, as I found certain trends interesting, I thought I’d share them with you in case you find something of interest there too. As this piece is really musings about these numbers, I didn’t do any math beyond calculating a few averages and lumping things together in a manner that makes sense to me. Thus, don’t draw any serious conclusions about anything from this piece: in addition to being heavily biased by my own ideas on matters, the statistics are noisy and lack any independent data from which a conclusion could be drawn. As it says on the programs given out at Vegas psychic shows, this is “for entertainment purposes only.”

I’d also like to express my sincere gratitude to all of the people who help me make this blog possible. There are a number of you out there, and I appreciate the help you give me by providing answers to technical questions, advice on organizing articles and all of the other things you do to permit me to publish articles that are reasonably accurate and informative. Of course, I’d also like to thank all of you who visit the blog, read the articles, occasionally leave comments and so on; it’s you readers who motivate me to keep writing.

Blog Statistics

When I wrote the BlindConfidential blog, I used some hit count utility available to those of us who wrote on Blogger back in those days. I’d look at the statistics now and then and occasionally feel proud of a big week or feel disappointed in a bad one. Those statistics were very raw and extraordinarily noisy, and I doubt they reflected true visits, as that particular utility didn’t even filter for unique visitors.

In late November 2013, a year or so ago, I decided to install Piwik to track statistics about this blog. Piwik is profoundly more interesting than the utility I used previously: its UI is mostly accessible (my best experience with it was with NVDA and FireFox, though it was quite acceptable with VoiceOver and Safari on OS X), its developers seem committed to improving its accessibility and it has a panoply of features for analyzing a site’s web traffic. I turned my Piwik installation live on 12/1/13, a year ago, so I now have twelve full months of data about this blog.

Note: I installed Piwik and started looking at our statistics purely out of curiosity and not for any business purpose. This blog and its predecessors do not accept advertising, nor do they ask for donations. Hence, gathering analytics is, for me, an intellectual endeavor that leads me to questions like, “Why do I get a lot of hits when I write about topic A but very few if I write about topic B?”

The Big Numbers

In the twelve months ending November 30, 2014, this blog, discounting for bots and such, received something more than 26,000 hits, an unremarkable number in an era of viral Internet media but not bad for a crackpot like me. A bit unfortunately, to get a truly accurate representation of my 2014 statistics, I’ll exclude nearly 6000 of those hits from my analysis, as they all came when Daring Fireball linked to an article I had written here in 2013. Thus, I’ll be using 20,000 as the grand total for the year and will note when and if I use the top line number instead.

In total, we published 22 articles here this year which, against that 20,000 adjusted total, gives us a mean hit count of about 900 per article. A mean isn’t a terribly interesting statistic, though, as the distribution of hits across the articles is far from even.
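The arithmetic behind these figures is simple enough to sketch; the numbers below are the approximate values quoted above, not exact analytics output:

```python
# Back-of-the-envelope version of the numbers discussed above.
# All figures are the approximate values quoted in the text.
total_hits = 26_000           # twelve-month total, bots excluded
daring_fireball_hits = 6_000  # spike from the mainstream link to a 2013 article
articles_published = 22

adjusted_total = total_hits - daring_fireball_hits  # the 20,000 "grand total"
mean_hits = adjusted_total / articles_published     # about 900 per article

print(f"adjusted total: {adjusted_total}")
print(f"mean hits per article: {mean_hits:.0f}")
```

As the text notes, the mean hides a very uneven distribution, so it is only a rough summary.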

The Article Popularity Curve

This year, the subject matter on this blog can be broken down in a variety of ways. The most obvious split, to me, is that I wrote about accessibility on Android, I wrote about accessibility on Apple products, I wrote articles about the history of screen reading on Windows and I wrote what we’ll lump together as general interest pieces, a category we’ll call “other,” giving us four major topic categories.

When, in 2014, we published an article here about Android accessibility, it attracted a mean readership (excluding the article that Daring Fireball linked to) of more than 2000 hits, with one, Testing Android Accessibility: I Give Up, receiving just over 4500 on its own. Also interesting is that, including the article that got the mainstream link, a number of the most popular articles this year were ones I had written in 2013 about Android accessibility as well.

In second place come the articles about screen reader and access technology history, specifically those I wrote about my personal experience during the days when the Windows screen reader wars were raging between FS and GW Micro. Of the top ten articles this year, three come from this category, and one, The Death Of Screen Reader Innovation, was a 2013 article on the subject that was also (based on the really crappy analytics thing I used then) the most popular article of that year. I’ve been writing articles like these since the BC days and they’ve always been popular among readers.

Articles critical of Apple accessibility were the third major group of pieces I wrote in 2014. These gathered a mean hit count of about 500 each and, to me, were a disappointment. I hadn’t written anything terribly critical of Apple in years, since the BC days in fact. Unlike the Android and historical pieces, these articles got few comments and made little noise on Twitter as well.

The fourth and final category, the one I call “Other,” ranges in subject matter from William Faulkner and Led Zeppelin to a couple of announcements to preserving the history of access technology, and it received very few hits at all. It appears that, if Gonz Blinko strays too far from his central themes, very few people take the time to click through.

I “advertise” each blog article the same way: when I first post it, I send out a tweet with the headline and the link and, when appropriate, include the #a11y and #accessibility hashtags. About five days later, I’ll tweet out the link again with an “in case you missed it” preface. When I last looked, I had just over 600 Twitter followers. If someone tweets something I especially like about an article, I’ll retweet and favorite it as well. Thus, I don’t spend much time marketing this blog, and I’m happy to see that the numbers on some articles must come from word of mouth, as they exceed the number of people to whom I send links.

Analyzing These Numbers

Here’s where I find myself scratching my head. I’ve broken a number of the things we’ve published here this year into series. The series that, far and away, received the most hits was “Testing Android Accessibility,” with I Give Up getting over 4500 and The Deaf-Blind Perspective and The Programmers’ Perspective receiving around 1200 each. Historically, my articles get about three quarters of their hits in the first three days after publication, and the totals tend to stop rising after an article has been live for around ten days. This series, however, continues to get about 75 hits per article per month, showing continued interest in our testing.

What distinguishes the three “Testing Android Accessibility” pieces from others I had written critical of Android or the much less popular articles I did on Apple? What made these three pieces so much more attractive than the more outlandish Amish User Experience or Do We Get What We Pay For?

I think the thing that made these three pieces so much more popular is that they were the most data-driven articles I’ve ever written or, in the deaf-blind case, that a guest author had written. While those articles contained opinion, the conclusions were drawn from actual test results included in the articles themselves. Most of the content here is derived from my somewhat educated personal thoughts on a subject; for this series, we spent a lot of time doing a lot of work before the articles were written and published, and I think their success reflects that effort.

The Statistics Appear Upside Down

The next question arises when I compare the relative success of articles on this blog with actual marketshare numbers, published and observed. As far as I can tell, Apple has an overwhelmingly large portion of the blind user market on mobile devices, while Android, with published statistics at 12% but an observed share that’s even smaller, has had at best marginal uptake in this community. Yet, if I publish an article about Android accessibility, it’ll get 500 hits in the first few hours, while an Apple piece is lucky to get 500 hits in its lifetime.

Android fanboy behavior cannot account for this large a discrepancy. Our friends on the Eyes Free list do feel a fierce sense of loyalty to their chosen platform, and I’m certain to hear from them when I write a critical piece about it but, while vocal, there aren’t that many people actually active on Eyes Free and in other blindness and Android communities. Thus, the fanboys and my haters alone cannot account for the popularity. Digging a bit deeper into my Piwik statistics, I find that (including the article Daring Fireball linked to) only 6.8% of the 26,000 hits came from Android systems, while more than 50% came from Windows and roughly 30% from Apple products which, allowing for a reasonably large margin of error, roughly reflects the actual user distribution among blind technology consumers, both published and observed.
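For what it’s worth, the platform breakdown above can be turned back into rough hit counts; the percentages below are the approximate figures quoted in the text:

```python
# Rough hit counts per platform, derived from the approximate shares
# quoted above against the full 26,000-hit total (which includes the
# Daring Fireball spike).
total_hits = 26_000
shares = {"Android": 0.068, "Windows": 0.50, "Apple": 0.30}

counts = {platform: round(total_hits * share) for platform, share in shares.items()}
# Android works out to fewer than 1,800 hits: nowhere near enough to
# explain, on its own, the popularity of the Android articles.
print(counts)
```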

Nothing in the Piwik reports about geography says much about who reads these articles either. One outlier here is that all three of the Testing Android Accessibility articles got a spike in hits from a specific city in South Korea where Samsung has a plant. So, maybe an actual engineering organization is paying attention. I notice a similar spike from Cupertino, California when I write about Apple and if I mentioned GW Micro, I would see a spike in hits from Fort Wayne, Indiana. But, I can’t believe that the insiders, the actual engineering sorts responsible for this technology cause such a large bump in hits either. In fact, when I drill down on these numbers, I see that we’re talking about a very small number of actual hits so, statistically, we can’t derive any conclusions from the geographical data.

By looking at my Piwik report, I learn that most of the hits I get here come via Twitter but, after that, it’s search engines. And, indeed, the top search term that led people to this blog in 2014 was “Android Accessibility.” This, in turn, becomes a self-fulfilling prophecy: people google that term, they find one of our articles, they click, our SEO gets better, and it becomes more likely that the next person who searches on that term will find us, and so on.

Over the past couple of days, I have talked about these statistics and this article with others, trying to find conclusions we can draw. Fundamentally, why do so many Windows, iOS and Macintosh users come to this blog to read articles about Android? The only solid answer we could come up with is, “we don’t know.” We all seem to suspect that people read articles about what they don’t already own; if they use Apple devices, they already know about its defects, so they don’t come here to read about them. We came to a few other conclusions, but none were supported by the data, so I’ll leave them out.

What About The History Articles?

A whole lot of people, upon meeting me for the first time, tell me that they really enjoy these articles. I enjoy writing them as well, as I get to talk about the days when I was actually relevant, productive and on top of the world’s most popular screen reader. I don’t just write them to massage my ego, though. These articles have had, for years, a running theme: the lack of fundamental business principles, specifically the lack of real competition, in the blindness sector of access technology.

I think these articles remain popular for a couple of reasons. First, a lot of you readers have been looking at my stuff for a lot of years, and these articles tend to be part of a continuum of commentary. Those days were an exciting time to be around the AT biz; on JAWS, we were doing significant things with each release and, as happens during periods of irrational exuberance, we thought that run would never end.

In addition to the rapid progress in screen reading during that time, we also saw the beginning of Section 508 which, while certainly not a success even a decade later, caused an explosion in new federal government jobs for blind people. Not only was the JAWS marketshare growing to complete dominance, the market itself was expanding faster than ever previously observed. Those days were a lot of fun, and I hope another generation of accessibility specialists gets to experience something as enjoyable in the future.

Lastly, a whole lot of the things I predicted on the pages of BlindConfidential came true. Readers who were rightfully skeptical of the articles I published back then (I encourage my readers to be as skeptical as possible of everything I write, as well as everything else they read on a blog and in mainstream publications; believe nothing, not even this) have, years later, come back and agreed that my predictions about the Windows screen reading future had largely come true. The lack of competition in the space allowed JAWS to deteriorate; falling sales caused by GW Micro’s suicidal business plan prevented that company from catching up; NVDA came along and grabbed roughly 22% of the Windows screen reader market but, lagging in its Office support, hasn’t caught on in institutional settings; and the Windows screen reading world is a quagmire of differently broken access technologies.

The Year’s Biggest Disappointment

As I’ve written in the introduction sections of various other articles, I’m never certain what will and will not be a hit on this blog. Having analyzed the Piwik reports, I suppose I can predict that an article about Android or screen reader history will outperform something I might do about Apple, and that few people will read anything I write about anything else but, for a specific article, I never really know. One such piece was Preserving Our History, published back in July.

In Preserving, I present the problem that the history of access technology for people who are blind seems to remain unwritten. Products I felt were of tremendous historical importance, the Blazie Braille ’N Speak for instance, and people like its inventor, Dean Blazie, have no Wikipedia entries and very little of merit written about them anywhere online. Wikipedia has long articles about completely random pieces of mainstream technology, but even the most groundbreaking access technologies are disappearing from history.

This subject felt both important and of interest to a broad group of potential readers. I wrote the piece and, as I describe above, tweeted out a link with the two accessibility related hashtags I use. The next day, I looked at Piwik and saw that maybe 25 people had clicked through. After waiting a few days, I did my usual second tweet and was greeted with the most deafening silence I could have imagined. To date, the article has received fewer than 100 hits. People in this community, excepting those who wrote comments or sent me a note through the contact form, just don’t seem to have an interest in preserving our history. This made me sad.

What About The Future?

In preparation for this piece, I reread the twenty top articles from my statistics. This blog was really dark this year. Almost every article I wrote in 2014 has a rather negative conclusion. I’m uncertain if it will be any more cheerful moving forward.

For the first year in many, 2014 saw no new fiction written under the Gonz Blinko nom de plume. I have three such pieces in various states of disrepair and incompletion, but my inner Hunter S. Thompson has failed to inspire me to lampoon the industry and my own life lately. Some people around FS hated those pieces but, in general, they were pretty popular. I don’t like poking fun at people whom I don’t know well, as I’m not sure how they’ll take it, and, these days, I do try to be less of a dick than I was when I wrote BC, especially in its early days. Still, to those of you who’ve asked, you can probably expect new Gonz material in the coming year.

The Gonz Blinko Predictions for 2015

I’m going to take a look into my somewhat resin covered crystal ball. Well, actually, I cannot afford an actual crystal ball, so I’m using the water chamber on a glass bong that I bought up in Haight-Ashbury a few years back to observe the murky future. I’m feeling exceptionally intuitive as I stare into this bit of glass and see nothing (I can’t see, I’m blind, you morons), but the sound of the diffuser clinking against the sides, surrounded by the splashing of the water, makes me as confident of the following predictions as I am in the late Sylvia Browne’s ability to find a kidnapped child:

  • Something important regarding accessibility for people with vision impairment will come out of Amazon. I don’t know whether this means that Peter Korn will lead an effort to fork Android accessibility and make a proprietary Amazon solution for Android, whether we’ll see tremendous improvements to the accessibility of the Amazon web properties or what specifically will happen but, from the sounds in the bong, I predict significance.
  • Google’s accessibility will see some dramatic improvements, though probably mostly in the second half of 2015. This prediction is based on one bit of actual data: Vic Tsaran, a blind guy who did a terrific job on accessibility at Yahoo, now seems to be leading the charge at Google. Past performance does not guarantee future returns, but I’ll wager a few bucks that Victor, if anyone, can start fixing the systemic problems with accessibility at Google.
  • NVDA and VoiceOver will continue to see marketshare growth while JAWS, Window-Eyes and others continue to fall. One serious wildcard in this specific equation is whether or not Narrator in Windows 10, which apparently has a scripting language built in, will succeed. I don’t know anyone who has tried to use Narrator in Windows 10 and, obviously, I haven’t heard anyone tell me that they wrote or edited a script for it, so all I can say is that this is an interesting random data point that we should keep an eye on in the coming year.
  • Samsung, for reasons entirely unrelated to accessibility, will follow in Amazon’s footsteps, fork Android and give it its own brand name. Huge companies rarely like being beholden to each other, and Samsung needs its own OS to optimize for features on its hardware. A major reason that iOS devices can outperform their Android cousins while having a lower-powered CPU is that their software is optimized for very specific hardware components, something impossible in a generic platform like Android.
  • A number of new micro businesses will launch selling NVDA technical support, making my favorite Windows screen reader considerably more attractive to institutional installations.
  • I will publish at least one article that gets me slammed by the fanboy community surrounding some bit of access technology. It’s unlikely that this will be the Android peeps: the fact that Google has brought on an individual like Tsaran suggests they mean business so, while I would still recommend avoiding Android if you’re blind, I’ll postpone any further analysis for a while to wait and see what Victor might accomplish there. And, on a personal note, I’m really bored with all things Android; progress in accessibility in the L release, based on scraping Eyes Free and not testing anything myself, seems to remain tragically slow, so there’s nothing left for me to write about regarding it until something gets profoundly better or actually gets much worse.
  • Windows tablets and very low cost micro-laptop things from Dell, HP and elsewhere will emerge as the first real competition to iOS in the blindness mobile accessibility space. I see a whole lot of hardware coming online at the under-$300 price point; if users toss NVDA or their favorite screen reader onto one of these devices, they’ll have a low cost portable system with a UI they’ve been using for years.
  • Something important will happen regarding accessibility to mathematics for screen reader users. Over the past 18 months, we’ve seen MathML support added to VoiceOver and to JAWS 16, and we saw the impressive demo that Sina Bahram and his friends at Design Science did at CSUN 2014 in this area as well. Meanwhile, I’m hearing poor reports about the VO and JAWS solutions in their current incarnations, but the trend points toward improvement in providing math to blind people.
  • Sadly, I believe that accessibility on Apple devices will remain the best thing available for blind users in the mobile space while continuing to deteriorate. On any institutional sale where accessibility is a requirement, iOS can continue winning in the absence of any real competition on accessibility. Hence, there’s no market force pushing Apple to regain its 100% compliance policy, something I think is reflected in both iOS 8.x.x and OS X Yosemite.

Will these things come true? I don’t know; the bong likes to give hints of the future but is rarely specific. These predictions are Gonz’s first attempts at the paranormal, so his intuition may not be focused properly.

Conclusions

I want to thank all of you who’ve visited the blog, read the articles, posted the comments, sent me emails through the contact form, tweeted and retweeted links to the articles, connected with me on Twitter, told me that you read my work when we’ve met at a conference or participated in this blog in any way in 2014 and, indeed, over the entire 8 years I’ve been blogging. I honestly enjoy reading all comments posted here, even those that are quite antagonistic toward me personally.

Of all of the comments we got in 2014, my favorite moment came when a reader described me as an “irresponsible journalist,” because it elevates me, a self-described stoner, crackpot and loudmouth, to the level of “journalism,” something I’d never claim for myself. If this guy is right, I’ve taken a step up from blogger, an author who is irresponsible almost by definition, all the way up to journalist, albeit an irresponsible one. Those words in a comment on this blog make me smile, as they suggest that some people actually think, however inaccurately, that these articles have real power to influence people and their purchasing decisions.

I’m not sure what to expect on this blog in the coming year. I’ll focus less on Android, as there’s nothing left to say about it other than “I hope Victor is successful at Google.” I’ll likely explore the competition, or lack thereof, theme from different angles as we see events actually unfold. And, while they’re my least popular pieces, I’ll probably write more articles in the “other” category, as those seem to be the ideas I’m thinking up lately.

Thanks again for your support!

Apple, The Company I Hate To Love, Part 3: The Macintosh User Experience

Introduction

Recently, I have been writing a series of articles about accessibility and Apple, describing the cognitive dissonance I feel when I’m in a position in which I must praise the Cupertino technology giant. I wrote the first article, “Apple and the Accessible Internet,” before I realized this would become a series, so it reads like a stand-alone piece. Then, after the release of iOS 8 and the Yosemite version of Macintosh OS X, and a bit of encouragement from some readers, I launched a series investigating broader issues regarding Apple and accessibility. You can read the first article, “My Long History Fighting Apple,” about my activism on intellectual and information freedoms, and the second, “Where’s The Competition?,” in which I revisit a common theme for this blog, namely the current and historic lack of competition in accessibility and how this phenomenon hurts blind users. These are not great examples of my writing skills; I stand by the opinions presented, but please forgive me for the mediocre writing and repetitiveness of the material, as I’ve been highly distracted while working with my new guide dog.

I also work on another blog called Skeptability, a pan-disability site that discusses the intersection of disability with feminism, social justice, skepticism, humanism, atheism and related subjects. My general rule separates my articles between the two sites: pieces that are more technical, more laden with jargon and more dependent on historical knowledge about the access technology field get published here, while, when I write for Skeptability, I write about things of interest to a broader audience. My Skeptability articles tend to be less dark than this blog and, if you’re so inclined, you can read an article there about my experience at guide dog school called “My Time At Guide Dog School.”

I have been a blind user of the Macintosh for a pretty long time now. I first wrote about this experience on my old BlindConfidential blog in an article called “Eating An Elephant, Part 2: Apple Rising,” where I prefaced the piece with a discussion of Apple’s deplorable history regarding intellectual property law but went on to talk about how good Macintosh accessibility had become at that point. Back then, I ran an experiment in which I didn’t reboot or restart my Macintosh with VoiceOver running until I absolutely had to; my record was more than 40 days without needing to restart the laptop or the screen reader. Today, a bunch of years later, I rarely go a single day without rebooting my MacBook Air or restarting VoiceOver. Plain and simple, I cannot be as productive with my Macintosh as I once was, and I will soon be returning to Windows as my full time system, using the Macintosh only for my audio work.

This article explores the, from an accessibility standpoint, very sloppy Yosemite operating system release, as well as problems with OS X accessibility that have been with us for years. As far as I can tell, Apple has been made aware of all of these issues, with reports reaching them repeatedly over a lot of years, but the Apple engineers have ignored them. In fact, Apple seems to treat Macintosh accessibility as an orphan stepchild of the much more comprehensive iOS version of the same.

I’d like to thank my friend and fellow accessibility expert Bryan Smart for the conversations we’ve had in preparation for this piece. Bryan is a really smart and very insightful individual on issues regarding accessibility; you, my loyal readers, should visit his blog, where you can listen to his work investigating some of the issues described herein.

The Sloppy Yosemite release

As I mentioned in the second article in this series, it appears as if Apple has hired an accessibility quality assurance specialist out of the notoriously sloppy Google testing department. Yosemite does contain some accessibility improvements, most notably in the browser and iWork, and in the addition of MathML support in a number of apps. These are all very solid steps forward but, very sadly, they are overshadowed by newly introduced accessibility problems along with long standing issues that have yet to be remedied. I didn’t do a lot of testing to prepare for this piece and will be writing from personal experience rather than reporting results from a formal testing procedure. The guys at AppleVis wrote a terrific and much more detailed article called “Features and Bugs in OS X 10.10 Yosemite,” which you should read if you’re looking for a fuller report.

AppleMail

I tend to keep my email app running at all times on all of the different OSes I use. Email is, for me, an essential tool for business, recreation, personal and professional correspondence and nearly every other activity in which I participate. Years ago, when I wrote “Apple Rising,” AppleMail was both entirely compliant with the Apple accessibility API and very usable for a VoiceOver user.

Over the years, AppleMail has seen its accessibility deteriorate. In the Yosemite version, using “Classic Mode” for the display, when a user opens an email that is part of a thread, they will hear “Embedded” followed by “Embedded Unknown, Embedded Unknown.” If one then interacts with the first thing labeled as “Embedded,” they will find themselves in that email but must first navigate through no fewer than a half dozen buttons that VoiceOver identifies in speech only as “button.” Thus, we find ourselves in a window in an app that’s very important to my daily life with a bunch of unlabeled items in its interface, even in “Classic Mode.” In general, AppleMail feels a lot like something released by Google; regarding accessibility, it isn’t even up to an alpha test level, as it remains feature incomplete. The bugs in AppleMail are all really easy things to fix and are easy test cases that should have been caught by an automated test suite; hence, they are solidly “stupid” bugs.
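To illustrate the “easy test cases” point, here is a hypothetical sketch of the kind of automated check that would catch unlabeled buttons. The element tree and field names below are invented for illustration and don’t correspond to any real Apple API:

```python
# Hypothetical automated check for unlabeled buttons. A dict tree stands
# in for whatever element hierarchy a real accessibility API exposes.
def find_unlabeled_buttons(element):
    """Walk a UI element tree, collecting buttons with no usable label."""
    unlabeled = []
    if element.get("role") == "button" and not element.get("label"):
        unlabeled.append(element)
    for child in element.get("children", []):
        unlabeled.extend(find_unlabeled_buttons(child))
    return unlabeled

# A toy window resembling the threaded-email view described above:
window = {
    "role": "window",
    "label": "Message Viewer",
    "children": [
        {"role": "button", "label": "Reply"},
        {"role": "button", "label": ""},    # announced only as "button"
        {"role": "button", "label": None},  # ditto
    ],
}

print(f"unlabeled buttons found: {len(find_unlabeled_buttons(window))}")
```

A test this simple, run against every window in an app, is the sort of thing a regression suite catches automatically, which is why bugs like these read as “stupid.”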

Finder

When Apple released the Mavericks version of OS X in 2013, they introduced some nasty accessibility bugs in Finder, one of the most essential bits of software for all Macintosh users. Specifically, when one tried to navigate through the sidebar to move to a certain folder, focus was lost and, instead of reading the items in the part of the interface the VoiceOver user thought they were interacting with, VoiceOver read file names and, indeed, moved focus from the sidebar to the table of files. For a VoiceOver user, this is a usability nightmare and, while I think Apple had fixed it in a later version of Mavericks, it appears to have been broken again in Yosemite.

This problem leads me to question Apple’s quality assurance and software engineering methods. If a bug existed and was fixed in an earlier version of the operating system, the fix should have long ago been integrated into the main trunk of the source tree but, apparently, Apple chose not to carry the accessibility fixes present in Mavericks into the Yosemite release. This also suggests that Apple either does not test its accessibility features and VoiceOver, or chooses to ignore bugs reported by its internal testing teams and by the army of blind people out there willing to spend their personal time reporting accessibility problems to Apple. I know which bugs I personally reported during the Yosemite beta cycle and, much to my chagrin, I saw very few of them fixed in the final release.

Other Problems

While my notions about AppleMail and Finder are accurate and things you can test for yourself, they do not even approach a complete look at Yosemite accessibility. As I suggest above, please do read the AppleVis article for far more detail. Suffice it to say that OS X accessibility has had problems for a number of releases and, with each new version, it deteriorates further.

Yosemite And The Internet

After publishing “Apple And The Accessible Internet,” I received an email from the people who staff the accessibility@apple.com address. The author asked me to install the Yosemite beta, test the improved Internet support and report my findings. I typically, and politely, refuse to run pre-release software without being compensated for my time but, in this case, I made an exception and elected to work as a volunteer testing this OS release.

I was pleased when I went to my first web site using Safari, VoiceOver and Yosemite. The first thing I did, with QuickNav turned off, was to start navigating around using the cursor keys in a manner similar to how I interact with Firefox using NVDA or Internet Explorer with JAWS. I also enjoyed the relatively new feature that allows a VO user to navigate a web site with single key commands, similar to QuickKeys in JAWS and like features in all Windows screen readers.

When, however, I tried to actually use the new Yosemite version of VoiceOver in Safari, I found a number of problems.

An Interface Out Of Sync With Itself

If you are running OS X Yosemite (10.10), you can try this on this very page. First, make sure QuickNav is turned on, then hit “h” a few times to get to a heading somewhere on the site; it doesn’t matter where. Next, turn QuickNav off (left arrow plus right arrow toggles it) and start navigating with the cursor keys in the new simulated virtual cursor mode. You will discover that the two navigation modes are out of sync with each other. A user would expect that hitting the down arrow after navigating by heading would read the first line after the heading text; in Yosemite, you will find that cursor navigation, assuming you hadn’t used it earlier, starts from the top of the page no matter where QuickNav had left you. This turns the new cursor navigation feature into a demo of things to come as it is not actually usable in its current state. A lot of VoiceOver for OS X has seemed more like a demo than production code for a long time.
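To make the expectation concrete, here is a minimal Python sketch, with entirely hypothetical names, of the behavior a user would expect: both navigation modes read and update one shared cursor position, so arrow-key navigation picks up wherever QuickNav left off rather than restarting at the top of the page.

```python
class PageNavigator:
    """Illustrative only: one cursor shared by both navigation modes."""

    def __init__(self, elements):
        # elements: list of (kind, text) tuples in document order
        self.elements = elements
        self.cursor = 0  # single position used by BOTH navigation modes

    def next_heading(self):
        """QuickNav-style 'h': jump to the next heading, moving the shared cursor."""
        for i in range(self.cursor + 1, len(self.elements)):
            if self.elements[i][0] == "heading":
                self.cursor = i  # the shared cursor follows QuickNav
                return self.elements[i][1]
        return None

    def next_line(self):
        """Arrow-key navigation: continue from wherever the cursor is now."""
        if self.cursor + 1 < len(self.elements):
            self.cursor += 1
            return self.elements[self.cursor][1]
        return None
```

With a shared cursor, hitting down arrow after a QuickNav heading jump reads the line after the heading, which is exactly what the Yosemite implementation fails to do.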

Split Lines

Due to Apple’s philosophical obsession with ensuring that VoiceOver only represents information as it appears on the screen (more on this later in this article), cursor navigation on an Internet site reads the information exactly as it appears visually in Safari. This means that, when using cursor navigation or running a “read all” with cursor navigation turned on, the user will hear words that Safari hyphenated for layout read with the hyphens included. If the user has sounds turned on for misspelled words, the hyphenation will, by its nature, create misspelled words and the user will also experience the latency problem caused by having sounds inserted sequentially into the audio stream. NVDA does not exhibit this problem and, if I remember correctly, neither does JAWS.
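For illustration, a screen reader could rejoin words that were split purely for visual layout before sending text to speech and the spell checker. This rough Python sketch uses hypothetical names; a real implementation would also need to distinguish layout hyphenation from genuine hyphens and soft hyphen characters.

```python
def join_visual_hyphenation(lines):
    """Rejoin words hyphenated purely for visual layout so the speech
    stream (and any spell checking on it) sees whole words."""
    out = []
    for line in lines:
        if out and out[-1].endswith("-"):
            out[-1] = out[-1][:-1] + line  # glue the split word back together
        else:
            out.append(line)
    return " ".join(out)
```

The point is that the fix is a preprocessing pass on the text sent to the synthesizer, not a change to what is drawn on screen.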

Faithful representation of on screen information is very nice in some cases but, in this one and a number of others, it inserts a layer of inefficiency into the user experience.

Copy And Paste

I spend a lot of my time writing and, like most authors these days, I use the Internet as source material for my work. It is therefore essential that I be able to copy information from web sites and paste it into my text editor for integration either into one of these blog articles or, far more important, into the documents I prepare for my clients. With VoiceOver and Safari, copy and paste is a never ending adventure.

On this site, one can select text using cursor key navigation along with the SHIFT key, as one would expect, but, on many sites I tried, the same selection, copy and paste do not work at all. VoiceOver does provide a keystroke for selecting text on web pages but it also works very inconsistently. When I’ve reported problems with selecting text on web sites to Apple, they responded with ambiguous answers that tended to say something unspecific like, “something about that site prevents us from selecting text.” I’d accept this as an answer grounded in web accessibility standards and guidelines if the people at Apple would tell me which piece of WCAG 2.0 or standard HTML was violated, but they never include that information in their responses. Meanwhile, NVDA handles the same pages perfectly in Firefox and, in my opinion, if one screen reader can do something properly, they all can.

In general, the Yosemite version of VoiceOver and Safari provide a nicer experience on the web than did Mavericks but, as it also contains a whole lot of the problems that were reported by users of earlier versions of OS X, it remains far behind JAWS and NVDA in its actual usability.

Latency and Sounds in VoiceOver

A really long time ago, TV Raman (now at Google accessibility) added the notion of an “earcon” to his emacspeak software. More than ten years ago, JAWS became the first screen reader to include this idea with the advent of its Speech and Sounds Manager. An earcon is a sound a user hears in lieu of speech, augmenting the audio stream so that the user spends less time listening to speech and more time actually getting work done. Going back to the early versions of VoiceOver on OS X, Apple included the concept of an earcon to deliver information but implemented it in the worst way possible.

While I worked at HJ, Ted Henter personally taught me to count syllables in any text that JAWS would speak to its users. Ted demonstrated that every syllable or pause spoken to a user takes up a unit of that user’s time. We invented the Speech and Sounds Manager to help users reduce the number of syllables they need to hear to get the same amount of semantic information in less time. As a quick example, one can set JAWS to play a tone instead of saying “link” when it finds one. The important feature of the JAWS implementation, however, is that the sound plays simultaneously with the text being spoken.

As you can hear if you listen to Bryan Smart’s recordings on this matter, the VoiceOver developers made a rather bizarre decision when they implemented the sound feature on OS X. Specifically, instead of playing the sound simultaneously with the spoken text, VoiceOver adds its sounds sequentially to the audio stream. Thus, instead of saving time, each sound played by VO adds more time to that which the user needs to spend hearing the same amount of information. According to Bryan’s work, this delay is never less than 200 milliseconds and can go as long as a half second. One fifth of a second doesn’t sound like much but such interruptions cause a cognitive hiccup that could easily be avoided by playing the sounds at the same time as the text is spoken. Apple’s sound system adds time, thus reducing efficiency while also breaking up the text in a manner that disrupts one’s attention.
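The time cost is easy to model. In this back-of-the-envelope Python sketch (the numbers are illustrative, taken from Bryan’s measurements above), concurrent playback costs the longer of the two durations while sequential playback costs their sum:

```python
def listening_time(speech_ms, earcon_ms, concurrent):
    """Total time the user spends listening for one chunk of output.
    Concurrent playback overlaps the earcon with the speech; sequential
    playback (the VoiceOver approach described above) adds them up."""
    if concurrent:
        return max(speech_ms, earcon_ms)
    return speech_ms + earcon_ms
```

With 800 milliseconds of speech and a 200 millisecond earcon, concurrent playback takes 800 milliseconds while sequential playback takes 1,000; multiplied across the hundreds of links and controls a user hits in a day, the difference is substantial.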

This problem and its related efficiency issues have been reported to Apple many times over the years and people have discussed it in blog articles and podcasts, but Apple continues to refuse to remedy this major problem with its interface.

The latency issues aren’t always associated with the sounds being played. If one uses any text augmentations, including having VO change the pitch for links and misspelled words, they are accompanied by a delay of no less than 100 milliseconds, making these features interesting but not entirely usable.

Complex Apps And Efficiency

Apple must be commended for the excellent work it has done regarding accessibility in software like Xcode and Garageband. As far as I can tell, a VoiceOver user now has access to all of the features in both of these very complex user interfaces. For me, an occasional podcaster, having Garageband for recording and mixing available to me has been a lot of fun. I also enjoy using Garageband to create “virtual bands” to jam along with using loops and related features. At the beginning, the VoiceOver interface in Garageband worked well for me but I was a novice then and, as I grew more proficient with the program, I found many tasks were tremendously cumbersome.

As I’m only passingly familiar with Xcode (I don’t write software for Apple devices), the examples I’ll use in this section will come from Garageband but apply to almost every Apple branded Macintosh application of any complexity, including iWork apps like Pages and Numbers.

Faithful Representation Of On Screen Information

When I worked on JAWS, a frequent complaint we received at FS from the field came from sighted people or from JAWS users who needed to work closely with sighted colleagues. The problem was a result of JAWS speaking information in a manner different from how it appears on the screen. Sighted trainers became frustrated when the speech didn’t match the visual display, and I can remember trying to ask my sighted wife for help at times and both of us getting frustrated by the difference between the speech and the screen. The people who designed VoiceOver chose a radically different approach, ensuring that on screen information is accurately represented in what the user hears.

The JAWS philosophy comes from Ted Henter’s insistence on not only providing an accessible solution but also making sure that the solution is as efficient to use as possible. I’m sad to say that, as far as I can tell, no screen readers other than JAWS and NVDA even attempt to maximize efficiency anymore. The problem with the JAWS approach, however, is that it comes with a steep learning curve: users who want to use complex applications efficiently with JAWS must spend a fair amount of time learning keystrokes specific to the application they need to use and, in most cases, will need to live with some aspects of the application remaining inaccessible. The Apple approach solves the discoverability problem, as a novice can poke around the Garageband interface and find everything in a fairly intuitive manner; at the same time, it provides little in terms of efficiency for intermediate to advanced users.

Using Garageband, I often find myself spending more time navigating from control to control than I do actually working on my recordings.

A Lack Of Native Keystrokes

In general, Windows programs tend to have more accelerator keys for interacting with features than do those on Macintosh, and it would be useful for Macintosh apps to have the same. While I can perform every task and use every feature in Garageband, many require me to issue a pile of keystrokes both to navigate from place to place and to drive an on screen simulation of the mouse. Indeed, my experience is nearly identical to what a sighted user enjoys but without the efficiencies provided by having vision. Where a sighted user can move quickly with a mouse or trackpad, a blind user needs to step through every item in between and often perform actions with a keyboard that could be made profoundly easier if a single keystroke were available.

The Interaction Model

In an attempt to make navigation more efficient, the VoiceOver developers invented a user interface system that groups interface items together so that the user can either jump past a group’s contents or, if they so choose, interact with the group and access the information therein. Unfortunately, the grouping seems to be done algorithmically and this facility doesn’t work terribly well.

Using the Macintosh version of iTunes as an example, a user can observe some areas made more efficient by the interaction model while also finding areas where they need to step through a bunch of controls that are not grouped together in any useful manner. This is true of many other applications as well; the interaction model demos well but is implemented so inconsistently throughout the Apple branded apps on OS X as to be of marginal use at best.

The interaction model also imposes a hierarchy on the interface. In a complex app like Garageband or Xcode, a VoiceOver user needs to climb up and down a tree of embedded groups with which they must interact separately. Moving from a place buried deeply in one set of nested groups to another place buried in a different group requires a ton of keystrokes just for the navigation, which could be obviated with either native accelerator keystrokes or keystrokes added specifically for VoiceOver users.

It appears as if these groups and the interaction model were presented as an idea, included in VoiceOver and then ignored as the software matured. I do not believe that this interface model is incompatible with efficiency; I just think that it has only been partially implemented and that it needs much more work moving forward.

A Lack Of A Real Scripting Language

AppleScript is available but has so many restrictions that it is nearly useless as a scripting system for VoiceOver. First and foremost, it is very difficult to share AppleScripts with other users, as doing so requires copying the files individually and adding keystrokes separately on each system. It is also impossible to assign a non-global keystroke to an AppleScript, so application specific ones are impossible as well. AppleScript cannot fire on UI events so, continuing with the Garageband examples, one cannot have a sound play only when an on screen audio meter hits a certain level or some other interesting UI event happens. After many years of criticizing JAWS for having a scripting language while falling further and further behind in the functionality wars, GW Micro finally added a real scripting facility to Window-Eyes; it’s now time that Apple do the same for VoiceOver.

Bryan Smart works for DancingDots, a company that makes CakeTalking, an impressive set of JAWS scripts that, among other things, provide access to the popular Sonar audio editing software on Windows. Why would people pay a lot of money for JAWS, a lot of money for the DancingDots scripts and a lot of money for Sonar when they can get Garageband, VoiceOver and a laptop all for the price of a Macintosh? Because they need to use Sonar efficiently and Garageband, while an excellent choice for a novice, cannot be used efficiently by a VoiceOver user. Complex applications seem to need a scripting language to accommodate users as they grow increasingly proficient.

Syllables, Syllables, Syllables

As I wrote above, Ted Henter taught JAWS developers to count syllables whenever we added text to be spoken by JAWS. After running Yosemite for a few days, I changed my verbosity setting from “High” (the default) to “Medium” but still find that VoiceOver takes too much time to express some very simple ideas.

In AppleMail, for instance, VoiceOver reads “reply was sent” instead of simply “replied,” which would save two syllables plus the time spent on the whitespace separating the words. When I use CMD+TAB to leave my text editor for another app and then again to return, VoiceOver says, “space with applications TextEdit, Mail, Safari…” and lists all of the apps I have running, even if I hit CONTROL to tell VoiceOver to stop speaking. In TextEdit, where I’m writing this piece, if I type a quotation mark, instead of saying “quote” or some other single syllable term, VoiceOver says “left double quotation mark,” enough syllables to fill a mouthful or more.
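The fix is conceptually trivial: a lookup table mapping verbose phrases to terse equivalents carrying the same semantic information, which is essentially what syllable counting drives you toward. This Python fragment is purely illustrative, built from the examples above:

```python
# Illustrative only: a terse-phrase table of the sort that syllable
# counting suggests. The phrases come from the examples in the text.
TERSE = {
    "reply was sent": "replied",            # 4 syllables -> 2
    "left double quotation mark": "quote",  # 7 syllables -> 1
}

def tersify(phrase):
    """Substitute a shorter phrase carrying the same semantic information."""
    return TERSE.get(phrase, phrase)
```

A real screen reader would tie such a table to the verbosity setting, so users who prefer the longer phrases could keep them.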

I could go on. It seems that VoiceOver speech is overly verbose in far too many places to list. Whether a key label or a text augmentation, it is essential that the user hear as few syllables as possible in order to maximize efficiency.

Forcing A Keyboard Into A Mouse’s Job

Most blind people access general purpose computers using a keyboard and this is how I use my Macintosh, only rarely using the TrackPad. As I mention above, the VoiceOver UI is designed to mimic as closely as possible the on screen information. Quite sadly, a keyboard is not an efficient mouse or trackpad replacement.

The notion of drag and drop makes sense visually: the user “grabs” an object with the mouse or trackpad, drags it across the screen to its destination and then drops it by releasing the button on the pointing device. Using a keyboard to navigate object by object until one arrives at the target destination is a hunt and peck process at best and far too cumbersome to use at worst. Yet there are Apple branded apps, including Garageband, that allow the user to interact with some features only through drag and drop, inserting a profound level of inefficiency into a VoiceOver user’s experience. Why not also allow cut, copy and paste as an alternative to drag and drop? Doing so would provide a UI metaphor that makes sense to a person driving a Macintosh with only a keyboard.

In Garageband, there are custom controls for moving the play head, selecting blocks of audio, inserting and deleting blocks and so on. For a VoiceOver user to do these things, they must jump through weird UI hoops to force a keyboard to act like a mouse. Plainly and simply, this could be corrected either by adding native keystrokes to Garageband or by allowing VoiceOver to be customized as extensively as one can customize JAWS or NVDA. In its current state, a blind person can use these features (along with similar ones in other Apple apps) but only with a great deal of superfluous keyboarding.

In short, though, using a keyboard to faithfully mimic what sighted users would do with the mouse is a poor idea in practice.

Conclusions

  • Apple, largely due to its iOS offerings, remains the leader in out-of-the-box accessibility. At the same time, accessibility on OS X has deteriorated from release to release and has major problems delivering information and permitting interaction efficiently.
  • Both iOS/8 and Yosemite contain a lot of “stupid” bugs, defects that should have been discovered by automated testing and remedied with about a minute of effort typing some text into a dialogue box.
  • Making VoiceOver on Macintosh into an efficient system will require changing some of its deeply held philosophical positions and I doubt this will ever actually happen.


Apple: The Company I Hate To Love, Part 2: Where’s the Competition?

Reintroduction

Most of this article appeared first on this blog yesterday (10/14/14) under the title, “Apple: The Company I Hate To Love.” A number of our most loyal readers asked that I split that story up and start, as I did for Android, a series on the problems I perceive surrounding Apple and accessibility these days. So, being a blogger who tries to be responsive to his readers and fully understanding why they felt this should be a series, here’s part 2, in which I discuss the problems I’ve experienced since installing iOS/8 and my continued issues with the lack of competition in this space.

Introduction

For years, both here and on my BlindConfidential blog before it, I have railed against the lack of competition in the screen reader business. Years before systems like iOS, Android and Fire existed, I ranted about how GW Micro chose what I described as a “non-compete” strategy in the market battles between JAWS and Window-Eyes. I’ve demonstrated in these articles how the community of screen reader users was screwed in the end: once JAWS was allowed to reach a position of market dominance, FS was left without incentive to continue making JAWS great because, in reality, if the competition “sucks worse,” you remain the winner.

I have always and probably will always blame the lack of competition not on the winners nor on the consumers but, rather, squarely in the lap of the businesses who chose not to compete. It isn’t the fault of the JAWS developers that they built the best screen reader on the market back in those days.

Actually, rethinking this, I suppose it is indeed my fault and that of Eric Damery that we elected to spend the development dollars to make software like Excel and PowerPoint not just demo well but be usable in real professional settings. It’s my fault and that of Glen Gordon that we didn’t take the then broken MSAA approach to web accessibility but, rather, decided to invent the virtual buffer, the invention most blind people enjoy on Windows and, to a lesser extent, other platforms today. It’s definitely our personal fault, as we are bad people who did the awful thing of making the best product out there and, as a result, achieving a monopoly position, a position Apple holds today in the mobile accessibility space.

It isn’t the fault of people who told the world to buy Apple products for being accessible and to eschew products whose accessibility remains poor. It isn’t my fault that Google makes a poor accessibility solution, that’s Google’s fault. I report on what I observe and I encourage people to buy the best and, today, in spite of the disappointing iOS/8 release, Apple remains the best, even if they may not be as good as they were in their previous release.

Buying an Android device today, if accessibility is the standard on which one makes the decision, is a really bad idea. Buying Android today doesn’t create competition but, rather, discourages it, as it tells the manufacturers, “it’s ok to suck.” It also tells the leader that it can stop working because, if users accept that crap, why should the best even consider for a second getting better? Competition will start in this space when there are two or more players who can claim what iOS/7 did, namely, 100% compatibility with their own accessibility API. As no mobile device other than those running iOS comes even close to iOS/8, defects and all, going to Android only tells Apple that it’s ok to suck even more, as we’ll buy this stuff just to avoid buying product from them.

I heard this exact same argument while at FS. People would say things like, “Sure, Window-Eyes is a poor alternative but I’ll use it just to promote competition.” How well did that work out? In those days, people at AFB told me they refused to write a fair review comparing JAWS and Window-Eyes as they feared killing the competition between the two screen readers. As an FS VP, I railed very publicly against AccessWorld for saying that MAGic (the FS low vision software) was nearly as good as ZoomText because it was not so; I found that sort of article entirely misleading, as some readers, if they actually believed AccessWorld, might choose MAGic over its far superior competitor. I vowed to work to drive MAGic to catch up (another of my personal failures). Promoting substandard solutions does not drive the leader to improve, it does the exact opposite and, as we saw with JAWS and Window-Eyes, a leader who isn’t pushed by its competitors will allow its technology to atrophy.

Can someone find me another industry where any consumers say, “I’m going to buy the crappy one, I’m going to reward them with my dollars just to encourage them to do better in the future?” No, of course not.

Fans Versus Consumers

It’s playoff time of year so my attention turns to baseball and I’ll use a baseball metaphor to describe what I consider to be the difference between a “fan” or “fanboy” if you prefer and a consumer.

Let’s say that you live in New York, where you have a choice between two baseball teams, the Yankees and the Mets. Let’s add that, in this particular season, the Mets are a really terrific team and the Yankees are a poor one. If the Yankees and the Mets are playing at the same time, in their separate stadiums of course, and you want to go to a baseball game, you need to make a choice: travel to the Bronx and pay to see the Yankees or head out to Queens and pay to see the Mets.

If, in this case where the Mets are a superior product, you choose to go see the Yankees, you do so because you are a Yankee “fan” or “fanboy” if you prefer; if you choose to go to the Mets game, you are making a consumer based choice and buying the better product. If you think that buying a ticket for the Yankees will help them build a better team in the future, you are like the fans of the Chicago Cubs who haven’t won a World Series in more than a century, you are buying hope without reality.

This is, fundamentally, why winning teams draw large crowds and, in cities other than Boston, San Francisco or New York where money is so abundant, poorly performing teams draw poor attendance.

The iOS/8 Fiasco

I did not join the beta program to test iOS/8; I’m entirely unwilling to pay Apple $100 per year for the privilege of running broken, pre-release software. That is work for which individuals should be paid as quality assurance professionals, not free labor that billion dollar corporations should enjoy from volunteers. All I can say, however, is that it appears as if Apple accessibility must have hired a QA person out of Google, as a number of glaringly obvious accessibility bugs, defects that were reported by people paying Apple for the right to report bugs, remain in the released version of the software. We’re not talking about obscure problems that require a lot of steps to reproduce or that may result from a strange and unpredictable combination of features, apps and hardware; rather, these are the really stupid bugs, the ones any automated testing process should have caught, and they are present in many areas of iOS/8.

So, it remains that iOS/7 is the all time out-of-the-box accessibility champion. As iOS/7 can no longer be purchased from Apple, this also means that the most accessible solution for mobile computing is now a thing of the past. We’ve regressed in iOS/8 and Apple must be taken to task for such. That iOS/8 is crappy, though, does not mean, “go out and get an Android device” as Android remains far worse. Apple set the gold standard in iOS/7 and, with iOS/8, has taken a step backward but remains, by far, the best accessibility solution for people with profound to total vision impairment.

I’ve spent most of the past month in a car traveling from the Boston area, where we spend our summers, to Florida, unpacking and then getting back in the car for a much shorter drive south to Palmetto, Florida, where I spent 25 days in guide dog school. As I was learning to work with a wonderful new dog, I didn’t have the time to do any serious testing of iOS/8 myself. Please do read the very comprehensive article on iOS/8 accessibility bugs on AppleVis if you need more details, as the problems I mention in this article are a subset of those I’ve personally experienced and not the result of a comprehensive test plan. I tend to ignore AppleVis in general as I find its editorial gist too soft on Apple and too light on criticism. This article, however, is pretty good and reflects much more of an effort than the item you are currently reading.

The Stupid Bugs

I am using the word “stupid” here specifically to describe obvious bugs that should have been caught by automated testing. These are the sorts of bugs that drive me crazy about Android accessibility, prompting the question, “How can you miss something as simple as putting a control into the tab order or adding a label to a button?” as testing for such should take no more than a few seconds of an automated tool telling the developer, “Hey stupid, you forgot the damned tab stop.” These are not bugs that require thought to remedy; they can mostly be handled with a tiny bit of typing. If, on iOS/7, a blind user installed everything that came out-of-the-box plus all no cost Apple branded iOS/7 apps, they would find more than a thousand accessibility API tests that could be performed, of which all but a tiny fraction (10 or so) passed, giving a result of 100% when rounded to integers. While this number is far worse on Android than on iOS/8, the new iOS offering certainly does not hit the 100% mark and is probably somewhere above the 90% level. Compared to Google, Microsoft, Amazon and Samsung, this is still the best score on the market today by at least 30 points but, as the newly introduced bugs are mostly “stupid” ones, the trend toward regression at Apple is alarming.
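As a rough illustration of how cheap such testing is, here is a Python sketch of an automated check for exactly these two “stupid” bugs. The control representation is hypothetical, not any real Apple or Google API:

```python
# A sketch of the kind of automated check that catches "stupid" bugs:
# every interactive control must have an accessibility label and a tab stop.
# The dictionary shape used here is illustrative only.

def find_stupid_bugs(controls):
    """Return one human-readable report line per obvious accessibility defect."""
    bugs = []
    for c in controls:
        if not c.get("label"):
            bugs.append(f"{c['id']}: missing accessibility label")
        if not c.get("in_tab_order", False):
            bugs.append(f"{c['id']}: not in the tab order")
    return bugs
```

Run over every control in every shipped app, a check like this costs seconds per build, which is why defects of this class slipping into a release say so much about the process behind it.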

The Apple Monopoly Position

I had a lot of time alone while at guide dog school; it’s largely a “hurry up and wait” experience and, while other students were training, I had a lot of time to think. What came to mind as I went from iOS/8 to 8.01 to 8.02 was partially prompted by other students in my class and at another guide dog school where, coincidentally, my dearest friend was getting a new dog at the same time.

Of the 24 students in the two classes, 22 used at least one Apple device. All of these people, quite obviously, were also blind but, in no other way were they a narrow sample: they spread an age range from around 20 to over 80, with a wide array of educational backgrounds, personal histories and so on. As a sample of adult blind people, this, while not scientific in any manner, was a fairly diverse group. The things we all had in common were that we used iOS devices and handled guide dogs. Of the two who didn’t use iOS, one was an older woman who still used an old Nokia N82 with Talks and the other was an Android user who admittedly didn’t use the device for much more than accepting and placing phone calls. So, 22 of 24 users had already moved to iOS.

This complete market dominance led me to think of JAWS in late 2002. By then, Freedom Scientific held a market share among new product sales of over 80%. We had achieved a monopoly position and, while I may rant and rave about such decisions on philosophical grounds, it would have been a seriously poor business decision to continue investing in JAWS as we had been because, quite simply, the market demands created by serious competition had disappeared. I say “serious” competition because, in theory, GW Micro and Window-Eyes “competed” with JAWS and, also in theory only, Android competes with Apple in the accessibility space; they just do not compete with any serious effort. Thus, in the absence of competition, why should Apple do anything but wait for the others to catch up?

Don’t blame the Apple fanboys for creating this environment; they saw the best thing this community has ever seen out-of-the-box and they rightfully celebrated the top thing on the market. The blame here falls directly in Google’s lap: TV Raman and his team produced a horrible solution riddled with the stupidest of bugs and Google corporate policy doesn’t even require accessibility testing on anything the company makes. Don’t blame the monopoly position on Apple; they only did what this community asked of them, namely, deliver a device 100% accessible to people with profound to total vision impairment. In the same way that blaming FS for the failures at GW is absurd, so is blaming Apple for what are solidly problems at the businesses that claim to compete with them.

I’m not blaming the users for buying the best thing; that is, indeed, how competition works: two or more companies release similar products, consumers evaluate them and buy the one they prefer. Apple built a highly preferable system, or so the market share numbers tell us, and it was so profoundly preferable that virtually all blind consumers, as a function of competition, chose the system that best met their needs. If a large number of blind people were to suddenly abandon iOS in the hope that buying an Android device would “promote competition in the future,” they would miss the definition of competition because, on the day you buy the device, you have, by rewarding the manufacturer with your money, actually announced that the inferior option has won, as you have given them the only prize a large corporation cares about.

No traditional market forces are at play in this situation and all I can say is that I really do hope that Peter Korn can bring some actual competition to this space.

How Does This Happen?

Something, I don’t know what, is different inside Apple these days. Maybe it’s the new CEO, maybe it’s something else, maybe they really did put a person out of Google in charge of accessibility QA; I don’t know. All I know is that no one seems to be minding the store. If the stupid bugs are starting to slip through, what can we expect next? I’m glad that iOS/8 has support for MathML and has added some other interesting new features but, overall, the release is unnecessarily sloppy.

Some of the most annoying bugs I’ve encountered have nothing to do with accessibility. One in particular: I hang up a call, another finger accidentally taps a number on the keypad, and the tone from that number starts to play and does not stop. If this thing is called a phone, the one app that should work flawlessly is the one for using the phone, isn’t it? This doesn’t just happen to VO users; it’s a stupid bug that a lot of people are experiencing, and they have to entirely reboot the phone to make it stop. Really? A phone button is stuck down? You guys didn’t think of testing that?

Other bugs, some related to accessibility, some not, seem so stupid that I can only wonder whether anyone at Apple tested them at all or listened to beta testers, as I’m highly confident that most, if not all, of the most obvious bugs would have been caught there. As I wrote above, I’m not an iOS beta tester so I’m running on assumptions here but, if they had as few as two blind people testing and reporting iOS/8 bugs, they’d have heard reports of most if not all of these problems and, as I wrote above, most of these could have been remedied in less than a minute each by anyone who can type.

What is it that seems, regarding accessibility at least, to have allowed Apple to think it can do such a sloppy release? In my mind, it’s the fault of their competitors refusing to make a credible solution at all. If everyone else sucks, they give the leader carte blanche to suck too. When Window-Eyes fell behind JAWS, they could have worked really hard to catch up, especially when, around the release of JAWS 7, it became very obvious to the general public that FS was working far less hard on JAWS than it had previously. If Google released an Android with an accessibility score even close to that of iOS/8 with all of its bugs included, there would be true choice and a lot of users would have an incentive to give it a try; in its current condition, Android is not “competition” but, rather, capitulation to Apple’s dominance.

Conclusions

Apple is doing something different and dangerous with their accessibility strategy. By choosing to release iOS/8 with so many glaringly obvious bugs, they have allowed accessibility regressions to vastly overshadow the accessibility improvements in the release. My personal conclusion is that this is the result of a failure by Apple’s competitors, most notably Google and Microsoft, to actually compete in this space. Apple released iOS/7 with a 100% accessibility API compatibility rating, the only out-of-the-box solution that has even tried to achieve such a thing. Apple is still the clear leader in accessibility in the mobile computing arena but has proven that they can disappoint as well as surprise this community with their accessibility efforts.

I’m feeling tremendously discouraged. I’d love to be able to say, “Apple is blowing it, support one of their competitors,” but, in good faith, as iOS/8 is still substantially better in all areas of accessibility than anything from Google, Amazon or Microsoft, I’d be recommending an even worse solution. Apple and iOS/8 may suck but they suck far less than the competition. I refuse to look at trend lines in this space as they are historically unpredictable but, based on both insider and public information, I think that MS and Amazon might be making a solid move forward in accessibility. Google has demonstrated a few promising signs (Chrome is more accessible on Android and Windows, Google Docs seems to be catching up to Microsoft Office Online in accessibility) but we’ve heard so many promises from Google for so long that, with them, I take a wait-and-see attitude, ignoring any statement about the future that isn’t accompanied by actual functioning bits.

I still conclude that the fault for this lies entirely in the hands of Apple’s competitors. If Apple had someone knocking on the accessibility marketshare door, they might not be so cavalier in choosing which bugs to fix and which to force upon us as paying customers. As long as Apple can say, “we suck less,” they will continue to be allowed to suck further until they drop all of the way down to the standards of their competition. If we, as blind consumers, accept a lower standard for accessibility, we are part of the problem, not part of the solution.


Reintroduction

Yesterday (10/14/14), I posted an article called “Apple: The Company I Hate To Love,” in which I described my personal history fighting Apple on matters of intellectual property law and software freedoms as well as discussing their recent iOS/8 release (a major disappointment to me). A few of my most loyal readers suggested that I break the article up into two or more pieces as I did with the Android articles written throughout this year. So, this is a revised version that will serve as part one of the series; it discusses the IP related issues and Apple’s poor record on them.

Introduction

For the past few years, based on what I’ve written in this blog and elsewhere, blind enthusiasts of the Android platform have labeled me an Apple fanboy. It is true that I use Apple devices and that I applaud Apple for its outstanding out-of-the-box accessibility in iOS/7 and the pretty good version of the same on OS X. For full disclosure, I also have an AppleTV attached to our home entertainment system (if you like Netflix, Hulu Plus and the other content available on this set top box, this $99 device is fully accessible and works really nicely with its mini version of VoiceOver) and we use an Apple Time Capsule router with a really big hard disk for backups. I make my personal purchasing decisions based almost entirely on accessibility and, today, Apple is the clear leader in such.

What the Android fans neglect to notice are the 66 articles I had published on BlindConfidential that mentioned Apple. Most of those, all but eight if I counted correctly, treated Apple with harsh criticism. In those days, Apple products were an accessibility nightmare and I took them to task for such. For most of those years, in fact, Apple’s accessibility was so bad that it wasn’t even worth writing about.

My mentions of Apple in BlindConfidential, therefore, had little to do with accessibility as, really, how much can one write when the content boils down to “Does. Not. Work.”? My criticism of Apple, then and again in the piece you are reading right now, comes from what I believe to be their terrible history in many things related to intellectual property. Unlike their then miserable accessibility, this aspect of Apple hasn’t changed in any substantive manner in the years hence.

I did write about this with some frequency on my old BlindConfidential blog. BC was a very different sort of blog than is this one. Then, I published nearly three times per week and wrote in a very informal personal essay style. I rarely cited references in those days and wrote mostly opinion pieces based in my own gonzo view of the world. BC also contained a lot of satirical fiction written by my alter-ego and nom de plume, Gonz Blinko which a lot of people enjoyed back in those days. BC was really popular in its day but, over time, it became too hard to keep up its pace and I got tired of writing from such a personal perspective.

If you go back and read the things I wrote about Apple in those days, please do also read what the Apple fanboys said about me back then. If I did a global search and replace on words like “Apple,” “Macintosh,” and “iPod” with “Google,” “Android” and “Nexus,” those comments would be virtually identical to the ones I get from Android fans when I write about their favorite thing. Then, as now, I compare all accessibility to the gold standard. Today, iOS/7 is that gold standard; when I wrote BC, it was JAWS on Windows XP and, of course, in those days I was labeled a Windows fanboy.

When Apple got accessibility mostly right, I changed my position, that’s how I roll, I’m not “religious” about anything, when new information is presented, my position changes. That’s just how we skeptics behave. Based entirely on the logical fallacy called “argument from authority” (it’s the opposite of ad hominem, it suggests that because of the messenger, an idea is a good one) I believe that Peter Korn (just because he’s Peter Korn), the head of accessibility at Amazon these days, will lead the Fire OS products to a level of accessibility similar to that which we enjoy on Apple products today. If and when that happens or when any other player reaches that level, I’ll celebrate it loudly.

My History Around Apple

The rest of this article comes from the original version. It discusses some areas of intellectual property law over which I’ve fought Apple tooth and nail in the past. Some readers ask me why I don’t write about such things anymore. First, for a long time, I was the only blind blogger who ever ranged into areas involving intellectual freedoms, FLOSS software and other similar ideas. I actively embraced the free software philosophy but, in four years as the Director of Access Technology at the Free Software Foundation, I accomplished nothing of merit; I sincerely hope that Jonathan, my replacement in the role, can be more successful in it. I had done my time in that arena, had felt rejected by that community purely over accessibility issues and had decided to move on.

More importantly, though, mainstream thinkers like my friends Richard Stallman and Eben Moglen do a far better job of addressing these issues than I do. My expertise is in accessibility and I think I do a pretty good job of commenting on it. I understand a lot of the theory and philosophy behind the free software movement and, intellectually, I embrace a lot of it still but, if you want to learn about such matters, read what you can by Stallman and Moglen and you’ll learn far more than I can teach you. My accessibility niche is good enough for me but I encourage you to read the real leaders in that field to learn more and make informed decisions on your own. Thus, if you’re looking to read about free software, please read the best work from the top guys and not my occasional rantings on the matter.

Finally, these days, accessibility is my passion. I work with a number of super smart and talented young blind technology people and see them, on a daily basis, struggle with things that were simple for me when I was a partially sighted college student back in the late seventies. I have one friend and business partner who, when he wanted to build his own HRTF library, had to teach himself enough of the differential equations, linear algebra and other really hard high level mathematics by reading raw LaTeX files. Without support for MathML, something FS tried to patent more than a decade ago, a screen reader really cannot perform to the level he needs to accomplish his educational goals. For me, that this extraordinarily bright young man has to deal with such overt inconveniences is an indictment of accessibility in general on all platforms, as none provides him with what he needs now. His situation makes me angry as hell; FS could have been doing this in JAWS for a decade now but, instead, a small market portion (math students) of a small minority (blind people) gets even less attention than the already tiny level of attention that a more generic set of use cases for a blind person would. It’s for guys like him, the future of people with vision impairment, that I get excited. If, in any way small or large, I can contribute to a more accessible future, I’ll do so. Thus, I don’t spend too much time thinking about information and software freedom issues anymore. I’m just not that into it these days.

Apple v. Microsoft

In early 1987, I was programming in assembly language and C on DOS and Unix based computers. I was a low level hacker who worked on device drivers and other silicon-under-the-fingernails, right-on-top-of-the-hardware programming tasks. If you had less than 1K left on a ROM chip and needed a routine optimized for size, I was your guy. While, a few years earlier, I had done some work on Apple II systems in 6502 assembly language, Apple, in the early Macintosh era, played no noticeable part in my life.

Then, the phone rang in our Cambridge condo. My wife answered it and called, “Chris? It’s rms on the phone.” I had no idea that this call from Richard Stallman would have such a profound influence on the rest of my life. In fact, the call itself was only to invite me to join him and a few others from around the MIT Laboratory for Artificial Intelligence for lunch at a Central Square Italian restaurant.

As that meal was eaten more than 27 years ago, I ask my readers to forgive me for not remembering who else had joined us that afternoon as there would be so many more informal meetings like this in the immediate future that, in my mind, who was in attendance at any particular one in any particular restaurant all blends together. I was always there, Stallman was always there, Gerry Sussman, Hal Abelson and others were often there. Lots of others popped in and out.

At the first Central Square luncheon, Stallman presented the idea of working to fight a legal assertion made by Lotus Development Corporation that their user interface was covered by copyright. Lotus had filed a lawsuit against two small software companies who made Lotus 1-2-3 knock-offs called “VP Planner” and “The Twin.” Our little group of friends with an interest in intellectual freedoms would, to fight the further encroachment on those freedoms that broadening the scope of copyright to include functional elements would represent, form the League For Programming Freedom (LPF), and Stallman and I would be its first leaders.

Unfortunately for VP Planner and The Twin, the LPF was incorporated and off the ground a few months too late. They had both lost their cases at the federal district court level and, as neither had the financial wherewithal to file an appeal, both settled with Lotus out of court. With this victory in hand, Lotus would escalate their monopolistic assertions and file suit against Borland over the 1-2-3 compatibility mode in Quattro Pro, their spreadsheet. Observing the success Lotus had in the lower courts, Apple Computer would, in turn, file a “look and feel” lawsuit against Microsoft for having overlapping windows in Windows 3.1.

As in their cases against VP Planner and The Twin, Lotus would prevail in the federal district court. Unlike those earlier cases, though, Borland had the resources to appeal; they had Philippe Kahn, a nice guy in person but a ferocious CEO in battle; they had Bob Kohn, another super nice guy in person but one of the smartest general counsels any corporation has ever had on their team; and they had LPF on their side. Likewise, Apple would win their case in the lower court and Microsoft would file an appeal.

What could a tiny non-profit like LPF, with its few thousand dollars in the bank, do to affect the outcome of a lawsuit between bazillion dollar giant corporations? We decided our best strategy would be to do whatever we could with publicity for as little money as possible but, more so, I would organize a “friend of the court” brief to file with the circuit court of appeals demonstrating how Apple and Lotus were wrong in their legal reasoning. In this effort, I had the fantastic opportunity to talk to and gather the signatures of most of the greatest minds in computer science history in opposition to Apple and Lotus. Among the signatories were Marvin Minsky and John McCarthy, the fathers of artificial intelligence; Richard Stallman, the founder of the free, libre, open source software movement; Jerry and Julie Sussman and Hal Abelson, authors of the most popular computer science textbook in history; and Donald Knuth, the man who invented the analysis of algorithms and the author of TeX. We had 125 individuals of this sort on the list. We won the Borland case, and the court hearing the Apple case put that one on hold until the Supreme Court ruled in our favor; there, we had filed another very similar brief but with 150 signatures of top computer scientists from around the world.

The battle over look and feel copyright, a battle we would win in the US Supreme Court, was a lot of fun for me on a personal level. I got to meet and spend time with legends of computer science, pretty lofty company for an assembly language hacker who specialized in working in tiny spaces where complex algorithms need not apply. It was also fun winning such a big and important effort. For a few days after the Supremes made their ruling, I got to celebrate with Philippe Kahn (who gave LPF a huge part of the credit for their win) and a bunch of others. As this was my first serious endeavor in software freedom policy, though, I also didn’t realize how hard the future would be.

Software Patents

The next major effort taken on by LPF was to stand in opposition to all patents on software. I still hold this position today. All software patents are, for a variety of very good reasons that you can read in works by Stallman, Moglen and even older articles that I wrote on BC and before with titles like “Patently Absurd” and “Patently Ridiculous,” a bad idea. If you don’t care to read what the leaders in that movement have written on the subject, I’ll summarize: all algorithms can be expressed as functions in the lambda calculus; hence, all algorithms are, indeed, mathematics, and math isn’t invented but, rather, discovered. When Benoit Mandelbrot and IBM tried to patent his work on fractal geometry, the US Supreme Court ruled that mathematics cannot be patented as math is a discovery and not an invention and, thus, is not covered by US patent law. A number of court cases, led most notably by AT&T, chipped away at this ruling and ultimately we find ourselves in the horrible intellectual property framework we are in today.

Apple hopped onto the software patent nonsense early and often. They have not changed a whit since those days more than two decades ago when I was active in LPF. As far as I can tell, Apple has not, as I suggested they would, used their patents on accessibility technologies against any of their competitors but, due to the ridiculous notions about patents in the US, EU, Japan and elsewhere, Apple is permitted in the “free” world to take away your software freedoms. Again, don’t blame Apple, they are doing what is best for their business; blame the democracies whose populations elect officials who promote such intellectual monopolization. You, my readers, are also voters. I’ve testified in congressional hearings, written articles, organized briefs; I’ve done everything I possibly can to stop software patents and, as an individual, I’ve failed in this effort. If you are as angry about software patents as I am, go out and do something about it. Otherwise, we cannot blame big corporations, Apple or otherwise, for acting legally. If we, the voters, give Apple these business advantages, we have only ourselves to blame when they screw us.

Conclusions

This section is easy: Apple has today, and has always had, a deplorable record on issues involving intellectual property and software freedoms.

Do We Get What We Pay For?

Introduction

Historically, both on this blog and on BlindConfidential, I have very rarely engaged with commenters. I write articles, I post them, people read them or not and some choose to comment. While I was reading Marco Zehe’s excellent Android review series, I observed him engage with his commenters both in the comments on the series and in the text of later articles in the series as they appeared. Last night, for the first time ever, I posted a comment to my own blog in response to something that an individual defending Android had posted. As there have been a pile of mostly negative comments posted regarding “The Amish User Experience,” the article I posted yesterday, I chose, instead of responding to them in the comments section, to write a separate post containing my thoughts on their notions.

I am also going to explore the titular subject of this article, “Do blind technology consumers get what we pay for?” and I’m quite certain that, on this subject, the Android fans will trash me again. Bring it on, boys.

In most of my articles, I provide links for virtually all proper nouns and terms I think readers might find confusing. This article has a few links but I ran out of time today and didn’t add more. I’m sure that any links today’s piece would have had appear in the one I wrote yesterday, and there are links on this page to that article.

Did We Get What We Paid For?

One commenter wrote, “The gay / LGBTQ community, in the past, used to be more flamboyant. They would openly dress or do certain actions to attract haters to them, in order to raise awareness. I am seeing a shift in this as of late, the community is taking a more humble approach and accepting themselves first before seeking acceptance from others.”

If we’re going to use the LGBT community as a metaphor, I’ll paraphrase the gay former Massachusetts congressman, Barney Frank on the day President Clinton signed the Defense of Marriage Act, “How long are we going to wait for our rights Mr. President? How long are we going to wait?” And, I ask you, how long are you going to wait for Google to end the discrimination they perpetrate against people with disabilities through their technological segregation?

I ask, “How would the LGBTQ community react if Google charged their community full price for a product or service and, then, only permitted them to use a subsection of the features?”

Accessibility doesn’t apply to the majority of the LGBTQ community so let’s suggest a hypothetical. What if Google decided to go into the hotel business and told all LGBTQ people, all racial minorities and a few religious minorities that, while they need to pay full price for their room, they cannot use the swimming pool, the gym or any other facility? Now, please tell me how the discrimination we face due to technological segregation is any different?

In the hotel example, any of the aforementioned minorities would probably start to remedy the problem by going straight to the Department of Justice to have the place closed down if changes weren’t made immediately. About two years ago, Chris Cotters, a member of the Freedom Scientific board, flew to Tampa for a meeting and tried to check into the Westshore Hotel. There, the on-duty manager tried to refuse him a room. He called his employer, a big time Boston law firm, their people called the company that owned the hotel and, within an hour, that manager had been fired. That is how discrimination should and must be handled. Why, then, shouldn’t the people in charge of making Android accessible receive the same treatment for continuing to enforce technological segregation on our community?

When I buy something, I expect to be able to use 100% of its features. I paid full price for an Android device and I bought a second one (the Nexus/7 I used for my research) second hand. In either case, I paid for all of the features on the device and, as Apple has done with iOS/7, I expect to be able to use all of the features for which I paid my hard earned dollars.

In reality, as anyone can read in the comprehensive testing I did and described in “I Give Up,” there are a whole lot of features that are not accessible to people with vision impairment and, far worse, to people who are deaf-blind. Shouldn’t we get a discount reflecting the percentage of inaccessible features when we buy such a device?

I don’t want to hear, “Well, I can use the subsection of accessible features to do everything I want,” as that’s the most selfish thing anyone can say about accessibility. Readers of this blog will know that I don’t write about personal use cases; I stick to objective measures: standards, guidelines and best practices. I do this because, as a user, I am a statistically insignificant sample size of one. I can also only do functional testing from the perspective of a user with a total vision impairment. Hence, I look at standards developed for universal accessibility so as to ensure that my testing applies not only to me but, rather, to all people with disabilities who require access technology.

If I only tested the apps and features that I would want to use, I would have saved myself a whole lot of time and frustration. Instead, I tried to test every feature, every app, every control in each and so on as I cannot predict what other people, the people who read this blog, might want to do and neither can you.

Claiming that an Android device is accessible means that, by your definition of accessibility, my deaf-blind friend Scott can’t use it but, in your mind, that’s ok. You’re saying that we don’t deserve every feature for which we paid, even though we paid full price, and you’re saying that your personal use cases are more important than the collective use cases of all people with disabilities.

Shooting The Messengers

If one takes a look at the traffic on the Eyes Free mailing list, one would think that my old buddy Marco and I were the most evil villains in the blindness community. What did Marco and I do to provoke such anger? We spent our personal time, entirely without compensation, to research an Android system in as objective a way as possible. You’ll notice that there’s no “donate” button on this blog or on Marco’s either; we do this testing so as to inform readers of the results of our findings. I test against published standards, guidelines and objective measures; Marco did a functional testing process based on actually using the device.

When we each started our efforts, we both hoped that Android would be an accessibility giant; we both wanted to write really positive pieces. Instead, based on the data we gathered, we wrote articles telling the truth: Android, based on objective measures and more subjective functional testing, failed on nearly every count. The reaction of the Eyes Free community, though, was to dig in and, without correcting a single fact in any of the articles we’ve published on the matter, toss ad hominem at us. We spent a lot of time and personal energy actually testing these systems and reported the results. So, I suppose, if you don’t like the news, you’ll shoot the reporter.

What amazes me, as a blind technology consumer, is that Marco and I received far more anger than these same people, who paid full price for devices on which they could only use a subsection of the features, have ever aimed at Google, a company they obviously worship with some sort of religious fixation, for not being 100% accessible in the way that Apple has been with iOS/7. You can shout at Marco and me all you like; it still doesn’t change the situation: you pay full price for Android, you don’t get a full feature set.

Sure, I used fairly inflammatory language in “Amish” but Marco wrote everything without the sarcasm readers expect from Gonz Blinko. Marco is a truly and incredibly nice person; the same is rarely said of me. Marco engaged with the Eyes Free community during his testing (something someone critical of my piece commented positively about yesterday); I did my testing in a black box. Even with two very different approaches, we concur: a blind person gets a subset of the features for which they paid.

Anyway, feel free to call me as many names as you like but, please, lay off Marco. I write using vocabulary that may incite; Marco does not. Be fair, he’s worked for a lifetime in accessibility and has delivered a whole lot of the software blind people enjoy today, including the terrific accessibility experience you Android fans have in Firefox. Read my profile, I’m a self proclaimed crackpot, stoner and loudmouth; Marco is the real deal, he works his ass off to make the world a more accessible place nearly every hour of every day.

Bias?

Marco and, more so, I have been accused of having a pro-iOS bias. This is true but it’s not based in a “belief” that Apple does a better job; rather, having tested both systems extensively, gathered our data and added it all up, voila! we find that one system is more accessible than the other. We report with a highly fact based bias. So, if we have a true “bias,” it is for reporting on actual testing results and not on how we feel. Data matters!

To those of you who have accused either or both of us of bias, I have a single challenge. Take an iOS device and an Android device and, as I did, test every aspect of every app on each, scoring one point for each item that meets every aspect of the iOS or Android accessibility API (some controls will have six or more items to test) and zero points when any aspect fails. Then, divide by the total number of tests performed to get a score. Apple will get an A+ with 100% (in integers) and Android will get a failing grade. Don’t take my word for it, don’t be lazy; do the work and you’ll see the results yourself. My work can be replicated, and repeating an experiment is at the crux of finding the truth.
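The scoring arithmetic I describe is simple enough to sketch in a few lines of code. The function and the data below are invented purely for illustration; real results would come from hand testing every control against the platform's accessibility API.

```python
# Hypothetical sketch of the pass/fail scoring described above.
# Each test result is True (the item meets every aspect of the
# platform's accessibility API) or False (any aspect fails).

def accessibility_score(results):
    """Return the percentage of tests that passed, as an integer."""
    if not results:
        raise ValueError("no tests were performed")
    passed = sum(1 for ok in results if ok)
    return round(100 * passed / len(results))

# Invented example data: one boolean per item tested.
ios_results = [True] * 240                      # every test passes
android_results = [True] * 150 + [False] * 90   # many failures

print(accessibility_score(ios_results))       # 100
print(accessibility_score(android_results))   # 62
```

The point of spelling it out is that there is no judgment call hidden anywhere: a single failing aspect zeroes the item, and the final grade is just passes over total tests.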

Dueling Mobile?

Years ago, there was an annual event at CSUN called “Dueling Windows.” On stage, there would be a JAWS user, a Window-Eyes user and users of a few of the long forgotten screen readers. The users were not employees of the screen reader companies but each company was allowed to approve its user as an expert. Then, side by side, on identical PCs running all of the same software (excluding the different screen readers), they were asked by a panel to perform tasks. The users were not given the tasks in advance and the tasks were always designed to test a very wide range of use cases. This was really fun and, for those of us working on screen readers back then, it was incredibly informative. Many times, we would see something happen with our user on stage and return to the office to make it better in the future. It also gave consumers a good taste of what worked well and what did not with each screen reader so they could make a buying choice. Sadly, after JAWS won the event five or six years in a row, they stopped doing it as it was like watching the New York Yankees play against a Long Island Little League team.

I’d like to propose a “Dueling Mobile” event that works similarly. On stage, we could have users hand picked by Apple, Google and Microsoft to represent them. A panel of experts could compile a list of tasks common to mobile computing. One at a time, the users on the stage would try to accomplish the tasks. Success would be judged on the amount of time it took each to accomplish the task, the number of gestures necessary to perform the same task and, I’m sure, my more scholarly friends would come up with a number of additional metrics against which the contestants could be judged. The CSUN call for papers went out last week; if someone wants to work on this as a proposal, I’ll be happy to help.
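To make the judging concrete, the two metrics I name (time per task and gestures per task) could be tabulated per platform in a few lines. The platforms, task names and numbers below are invented for illustration; real judges would add the further metrics I allude to.

```python
# Hypothetical tabulation for a "Dueling Mobile" event.
# Each run is (platform, task, seconds_taken, gestures_used).

def tabulate(runs):
    """Sum seconds and gestures per platform across all tasks.

    Returns {platform: (total_seconds, total_gestures)}; lower
    totals suggest a faster, less laborious experience."""
    totals = {}
    for platform, _task, seconds, gestures in runs:
        t, g = totals.get(platform, (0, 0))
        totals[platform] = (t + seconds, g + gestures)
    return totals

# Invented example data for two tasks on two platforms.
runs = [
    ("iOS", "send a text", 42, 9),
    ("Android", "send a text", 61, 14),
    ("iOS", "set an alarm", 30, 6),
    ("Android", "set an alarm", 55, 12),
]
print(tabulate(runs))
```

Publishing raw per-task numbers like these, rather than a single winner, would let consumers weight the tasks they personally care about.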

What will be accomplished by such an event? We will have another data set based in an objective measure that we can publish and blind consumers will be better informed when they hope to make a purchasing decision.

Why Do I Write This Blog?

One commenter asked why I would take the time to write my blog. I enjoy writing, I studied writing in graduate school at Harvard, writing is what I do. I write this blog because I enjoy working through ideas in written form. I enjoy the process and I enjoy the conversation that my articles sometimes provoke.

I suppose a fair number of readers like it too as virtually all of my pieces get hundreds of hits and, this year, a half dozen or so have gotten more than a thousand, with one over 5000. In the past month, my blog has been featured on the front pages of Daring Fireball and TechCrunch so I suppose people in the mainstream are enjoying it too.

What I cannot answer is why readers come back to my blog as frequently as they do. When I write a piece, I never know if it will be a hit or not. I had thought, for instance, that two of my recent articles, one critical of the VoiceOver support in Safari on OS X and the other about how hard it is to find the history of access technology online, would be big hits (by my lowly standards of a big hit); instead, they were two of the worst performing articles I published this year. Other articles, like “Remembering GW Micro,” felt self serving even to me as it discusses my own role in AT history but it is one of the articles that has gotten more than a thousand hits. So, I never know; I just write what comes to mind, toss it out there and hope some people enjoy my work.

Conclusions

If we don’t get access to every feature for which we paid, we are being ripped off.

Discrimination through technological segregation, especially now that web sites, under a ruling by the US Department of Justice, are, indeed, places of public accommodation, is identical to segregation in the world of bricks and mortar. We don’t tolerate it there, so why tolerate it in our technology?

Use data to drive your arguments and you won’t be accused of ad hominem attacks and other logical fallacies.

And, if you want to shout about accessibility in Android, put up or shut up. Do the testing like Marco and I did. Test everything like I did. Then, publish your results. If you are unwilling to do the work Marco and I did but insist you’re right, I just ask: where’s the data?

The Amish User Experience

Introduction

Last month, I attended HopeX, the Hackers on Planet Earth conference, in New York City. It was a terrific event and I encourage all of my readers to come to the next Hope conference when it happens in 2016. At HopeX, I enjoyed a lot of different talks and I had a lot of fun hanging out in the Lock Picking Village, where I was taught how to pick simple locks, a fun hobby for a blind person as everything one needs to do is entirely tactile, though hearing the little “clicks” helps too.

The first talk I attended at HopeX was presented by a terrific woman whom I would later get the chance to meet. Her name is Gus Andrews (@gusandrews on Twitter). Gus described a talk at a previous Hope conference given by Eben Moglen, the founder and head of the Software Freedom Law Center (SFLC), a man I know reasonably well and someone whom I respect greatly. Moglen described Apple as a “vampire” that lures unsuspecting technology consumers into using its products by providing “sexy” user experiences that make its technology easy to use while it takes away your information freedoms. Andrews, in her talk, responded to Moglen’s statement by asking the question, “Are free software proponents, Stallman, Moglen and others who insist on using GNU/Linux systems, the ‘Amish’ of the computer using public?”

Andrews’s thesis suggests that some people will eschew a nice, comfortable and simple user experience purely because they have some sort of religious obsession or philosophical bent that causes them to choose what is metaphorically similar to an “Amish” experience. They give up a nice and easy user experience, and they even pronounce that they prefer a user experience that is less efficient and less “pleasant,” as they seem to believe that doing things in a simple and intuitive manner somehow offends their religious fixation with living in the technological equivalent of a hand-built, survivalist-type cabin in the woods where they can live out their fantasies of technological and moral superiority.

This article intends to explore this notion as it may apply to the community of blind people using computational devices.

Corrections and An Apology

In the text that follows, I state that the Microsoft mobile phone platform remained inaccessible. A regular reader sent me a correction on Twitter telling me of the recent release of Windows 8.1 Phone and that the Narrator it comes with is quite a credible screen reader. I haven’t seen one of these yet so I won’t write more about it.

I would like to apologize to readers for using the phrase “Ted Kaczynski cabin” as a metaphor for someone who lives without the standard amenities of modern life. When I wrote that, I was thinking “off the grid, survivalist sort” and not specifically of a man who committed murderous acts of terror, one of which severely injured a man no more than a few blocks from where I sit writing this in Cambridge, Massachusetts. I apologize for using this metaphor; it was insensitive and I’ve changed the article to reflect what I actually meant with that phrase.

Another Expert Gives Up On Android

Marco Zehe works as an accessibility engineer at the Mozilla Foundation. He has personally worked on the excellent accessibility solution provided in Firefox on Android, so he knows that operating system both as a user and as a developer. Prior to joining the Mozilla Foundation, Marco worked at Freedom Scientific for a number of years and, in my opinion, was the single most important contributor to the excellent design that the braille support in JAWS has today.

In the summer of 2013, Marco wrote a blog article describing his experience attempting to use an Android based phone for thirty days. In that article, Marco detailed a number of fundamental showstoppers in his use cases; this month (August 2014), Marco tried to repeat the exercise to determine if, indeed, he could use an Android device accessibly, conveniently and effectively in his personal use cases. Marco’s series starts on his blog and, from there, you can find links to the entire series.

Earlier this year, I ran a series of three articles: “Testing Android Accessibility: I Give Up!”, “Testing Android Accessibility: A Deaf-Blind Perspective” and “Testing Android Accessibility: The Programmers’ Perspective.” I wrote “I Give Up” and “The Programmers’ Perspective” and a really terrific deaf-blind fellow named Scott wrote the third.

Marco and I took very different approaches to our testing. I followed a system based entirely on objective measures, standards (take a look at the BBC Mobile Accessibility Checklist; it was written by the same team that created the mobile checklist for Section 508 and it will be US law when GSA completes its final acceptance process) and the basics of the science of human factors. I tested every control in every app that ships on a standard tablet from Google, looking for anything from unlabeled graphics to objects out of the swipe order and so on. Marco took a very personal, use case approach and attempted to fulfill all of his mobile computing needs with an Android phone. My approach was rigid and refused to take into account applications from third parties; Marco used every resource he could find to try to create a usable experience for himself. I would flag every unlabeled graphic or control and everything that wasn’t in the swipe order as a failure; Marco would accept that sometimes a blind Android user may need to label controls for himself and poke around with “explore by touch” to find items that aren’t in the tab order. I slammed Google for refusing to include accessibility in its automated testing processes as virtually all of the problems I had found could be discovered by an automated testing tool simply and corrected easily and inexpensively; Marco took an approach that ignored Google’s failed software engineering processes and only explored the user experience itself.
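As an aside, the sort of automated check I fault Google for omitting is not exotic. The sketch below is purely illustrative, not Google’s actual tooling: the node structure and field names are invented, and a real tool would query the platform’s accessibility API rather than a dictionary. It walks a UI tree and flags the two failure classes I kept finding, unlabeled controls and controls missing from the swipe (focus) order.

```python
# Hypothetical sketch only: the node structure and field names are
# invented; a real tool would query the platform's accessibility API.

def audit(node, problems=None, path="root"):
    """Recursively collect basic accessibility failures in a UI tree."""
    if problems is None:
        problems = []
    if node.get("focusable") and not node.get("label"):
        problems.append(path + ": focusable control has no label")
    if node.get("interactive") and not node.get("in_focus_order"):
        problems.append(path + ": control unreachable by swipe navigation")
    for i, child in enumerate(node.get("children", [])):
        audit(child, problems, "%s/%s[%d]" % (path, child.get("role", "?"), i))
    return problems

# A toy screen with one unlabeled graphic and one control out of the
# swipe order, the two failure classes discussed above.
screen = {
    "role": "window",
    "children": [
        {"role": "button", "focusable": True, "label": "Send",
         "interactive": True, "in_focus_order": True},
        {"role": "image_button", "focusable": True, "label": None,
         "interactive": True, "in_focus_order": True},
        {"role": "checkbox", "focusable": True, "label": "Remember me",
         "interactive": True, "in_focus_order": False},
    ],
}

problems = audit(screen)
for problem in problems:
    print(problem)
```

A check like this could sit in a continuous integration pipeline and fail a build whenever a new screen ships with an unlabeled graphic, which is exactly why I find its absence at Google so hard to excuse.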

Quite obviously, Marco and I viewed the task of testing Android accessibility very differently. I did as I do and stuck to published requirements and known best practices; Marco took a user centric approach and listened to advice from people on the Eyes Free mailing list on third party applications that can, in their opinion, replace the broken apps that carry the Google brand name. The most surprising thing is that, given the radically different ways Marco and I looked at the platform, we came to the same conclusion and even used nearly identical vocabulary to describe Android’s failed accessibility experience, namely, “I give up” in my case and “I quit” in Marco’s.

I wrote the first article in my series with the title, “I Give Up!” Marco wrote the eighteenth in his series and titled it, “I Quit!” My articles were widely criticized by blind Android enthusiasts for taking a standards based approach (something I documented in an article called “Standards Are Important”) as they all asked why I didn’t try a variety of third party apps that I could use in a moderately to very accessible manner. My answer was that I was only testing the out-of-the-box system sold by Google.

Marco’s work should end this controversy. Android accessibility failed both objective and subjective testing procedures performed and reviewed by noted accessibility experts.

The Blind Amish

In preparation for writing this article today, I went to Marco’s blog yesterday, reread each of his eighteen days of trying to live with an Android phone for thirty days before giving up and read all of the comments written by his readers. One comment jumped out at me. Its author stated very eloquently that the people who hang out on the Eyes Free mailing list often say that blind Android users need to “be patient” and to “wait for Google to catch up.” The person who posted the comment and I seem to share the same opinion: why suffer an inefficient and unpleasant user experience when there are good and excellent alternatives? Why decide to be Amish in your technology choices? Blind users of mobile devices already have two good choices for tablets (Apple and Microsoft) and, now, with the release of Windows 8.1 Phone, there’s choice on a mobile phone.

Let’s consider the notion that, indeed, Android is an actual choice when its accessibility is so fundamentally broken, even when one allows for using a bunch of third party apps to do what a sighted person can do with their Nexus or other Android device in the first few seconds of ownership. Is living without indoor plumbing, a hot water heater and all of the other comforts the Amish reject a true lifestyle choice? Is going through what Marco and I experienced in our testing of an Android device, when compared to iOS and Windows 8.1 on a tablet, really a “choice” when the interface is so inefficient? As Marco demonstrated, it is impossible for an expert level blind technology user, a person who invented a whole lot of the things the rest of us blind people use every day, to live with an Android device for a month, let alone as a permanent solution. Given all of this, I can only conclude that, no, Android is not a choice at all.

If you like living in the technological equivalent of an Amish community, a system that requires far more effort and provides profoundly less comfort than the alternative, please go right ahead and do so; just don’t tell the rest of the world that your choice is “accessible” when, in fact, it’s really only marginally useful when compared to the state-of-the-art. When you claim that something is “accessible” when, in reality, it is not, you only encourage companies with a history of poor accessibility to remain poor, as they will find some blind Amish willing to state that virtually anything that talks at all is accessible. By claiming that Android is accessible, you make the work of accessibility professionals and advocates much harder, as we then need to convince our clients that the truth of the Android accessibility experience is that they will fail all known regulations unless they do profoundly more work than they would need to make an app accessible on the Apple and Microsoft operating systems.

What About Other OS?

Almost every day, I have cause to use iOS/7 on my phone, OS X Yosemite beta on my laptop, Windows 8.1 on a convertible I got about a year ago and Ubuntu GNU/Linux, via a command line over ssh and in a virtual machine on my Macintosh. I tried to live with an Android device for three months and, finally, I gave up on it. Of these operating systems, I feel that, regarding accessibility, iOS/7 is the most comprehensive, with 100% (when rounded to integers) of its features, when measured objectively, accessible not just to a blind person but to people with a panoply of disabilities. Windows 8.1 comes in second, but I take away points for a relatively small number of stock items I found with accessibility problems and, more so, because its built-in screen reader, Microsoft Narrator, remains sorely substandard when compared to the state-of-the-art coming from third party screen readers like NVDA and JAWS. I’d put OS X in third place based upon the official Mavericks release (it’s what I’ve been using for most of the past year) and, for now, I’ll reserve comment on the Yosemite beta as I signed its non-disclosure agreement (NDA) and I’m a bit of a stickler for obeying contracts I’ve signed. This leaves the GNU/Linux experience in dead last place, as I’d also say that my Android experience was more pleasant than most of what I deal with daily in Gnome with Orca.

Gus Andrews said that GNU/Linux users seem to be the Amish of the mainstream technology community. I’ll say that, in addition to Android users, GNU/Linux-using blind people, especially those who use the Gnome windowing system, are the Amish of our community.

Conclusions

No matter how an expert tests accessibility on Android or GNU/Linux, whether it’s me taking an objective, standards based approach, Marco doing a user based, subjective set of tests or the programmers whom I used as sources for the “Programmers’ Perspective” article just trying to do their jobs, Android and GNU/Linux are accessibility outposts.

Some will argue that Android can save a user some money, but so can going off the grid and living in a survivalist cabin in the hills. There are many ways people can save money, but why go Amish on us to save a few bucks? If you believe your time is valuable, why spend so much more of it configuring a system when Microsoft and Apple provide excellent choices out-of-the-box?

So, don’t live in a cabin in the woods, come into modernity and enjoy mobile devices from companies who take accessibility so seriously that they actually deliver it today.

Apple and the Accessible Internet

Introduction

Any regular reader of this blog (both of you) will already know that I enjoy using a bunch of different products from Apple. I use an iPhone 5S running iOS/7, a Macbook Air running OSX/Mavericks with all of its updates, we have an AppleTV set-top box and we use an Apple TimeCapsule router. The first thing one notices on getting any of these devices is that the interface is 100% accessible in the iOS case and nearly 100% accessible on OSX right out-of-the-box. For this reason alone, Apple is by far the leader among mainstream companies trying to solve the problems of accessibility for people with vision and other print impairments. Apple continues to make its accessibility better with each release but, while it may be #1, Apple still has a lot of work ahead to be truly competitive with third party screen readers on the Internet.

Any user of a popular Windows screen reader (JAWS and NVDA) or even those with less popularity (Window-Eyes, SystemAccess, ChromeVox and Orca) will, for a variety of reasons, be entirely underwhelmed with the functionality of VoiceOver on a Macintosh with the Safari web browser.

This piece started as a bug report I wrote up for some contacts I have at Apple. For all intents and purposes, I have changed very little between the email I sent to friends there and this article. I’ve removed the names of some individuals who are not public figures, added a bunch of links and did a bit of other clean-up, removing some personal comments and such. This article is specifically about how VoiceOver works with Safari on OSX and may not be applicable in any way to iOS/7 or any other Apple products. Internet support, in my mind, is the single aspect of using a Macintosh with a screen reader that remains substandard which, as Apple is setting the standard in so many other areas, makes me sad.

My Specific Use Cases

It’s possible that each user has his or her own set of cases that are important to them. Like everyone else, I use the Internet for a lot of different things but, most importantly, I write a blog. My blog tends to use other Internet sites as source materials, so being able to copy and paste from sites is really important to me. Sometimes, when I go to a site and hit VO+ENTER to start selecting text, I hear the “scratching” sound and it actually selects text; sometimes, I just hear a ding and it refuses to select text using this method. On some occasions when the VO web site text selection facility doesn’t work, I can just use SHIFT+navigation keys and the text will be selected; on other occasions, the only way I can select text on a web site is by doing a “select all,” copying and pasting the entire page into a text editor and finding the piece I’m looking for there. This is, in my mind, one of the worst problems with VO on OSX.

The Overall User Experience

Most other popular screen readers (JAWS and NVDA) and some less popular ones (Window-Eyes, ChromeVox, SystemAccess and Orca) allow the user to navigate around the page using only cursor keys, as if in a word processing document. Originally, Orca’s Firefox support, also designed by the person who is now the lead UI developer for VoiceOver, functioned similarly to the VoiceOver design, where arrow keys are virtually meaningless except when combined with a modifier key. Orca, not known for being terribly competitive with other screen readers nor for a pleasant user experience, took a step back and changed its UI design to be like JAWS, the screen reader that set the standard for Internet accessibility (if you disagree, I can provide a pile of links to actual testing scorecards that, quite objectively, demonstrate JAWS’s superiority in all of these areas, including the WAI user agent guidelines). Apple, quite obviously, has infinitely more resources than the Orca project (as far as I can tell, Orca has exactly one developer, Joanie Diggs, working part time on the effort) and can certainly make this happen.

Navigating on a Web Site

As far as I can tell, all other screen readers on general purpose computers (desktops, laptops) allow for single character navigation of a web page. In fact, all but Window-Eyes use the same standard set of keystrokes (h for next heading, t for next table, etc.) and, with all other screen readers, navigating a web page is profoundly more efficient. With NVDA (I don’t use JAWS), I go to a web page and hit “h” and I’m brought to the first heading, I hit “h” again and I’m at the next one and, if I follow that by typing a “t,” I then go to the next table and so on. With VO, I load a web page and, if I want to go to the next heading, I need to hit VO+u first to make sure it’s set for heading navigation and then either find the heading I’m looking for in the list box or, after setting the utility dialogue to headings, use VO+down arrow to find the next one; then, when I want to find the table, I need to go back into the utility dialogue, change to tables and start over. Hence, finding the object I’m seeking requires far more keystrokes and far more cognitive processing but, worse, it makes switching from any other screen reader to VO much more difficult. I need to use OSX, iOS, Windows and GNU/Linux nearly every day, so anything that improves the similarity of screen readers is important based entirely on the HCI concept called “discoverability.”

On a personal use case note, I cannot tell you how many times I’ve been using VoiceOver, used Command+TAB to switch to another application, returned to Safari and found that I’m hitting the keystroke to go to the next object only to find that I had forgotten to set the granularity back to headings and hear something entirely useless like, “No more tables” which could have been avoided entirely if Apple would just implement the same sort of system as exists in the more popular Windows screen readers. Maybe I’m a bit of a stoner and, therefore, forget which granularity I had VoiceOver set to but I’m willing to bet that lots of other users make this mistake frequently as well. The rotor for granularity changes works reasonably well on iOS but changing granularity on OSX is unnecessarily cumbersome.

Correction: When I wrote the two prior paragraphs this morning, I did so in the absence of any awareness of the QuickNav Commander now available in VoiceOver. For all intents and purposes, if you go into the VoiceOver Utility (VO+F8 if you’re running VO), go to Commanders and select the QuickNav tab, you can turn on “Single character navigation” there and have an experience similar to that available in Windows screen readers. Back when I worked on JAWS, we had something of an unwritten rule: if we added a cool new feature, we made sure it was turned on by default in the next release of the screen reader so that users would find it right away. I don’t tend to read a lot of release notes and, until my friend and accessibility jock Donal Fitzpatrick (@fitzpatrickd on Twitter) pointed this feature out to me, I didn’t know it was there. So, for all intents and purposes, you can ignore the two paragraphs preceding this one as, given this feature, they are just not true.

Performance and Time

VoiceOver is ridiculously slow on “noisy” web pages (those with lots of objects). Go to this site about harmonica playing, search on a popular artist (Bob Dylan has a lot of stuff up there), bring up the Item Chooser (VO+I) and count the seconds it takes for its list box to appear; if you’re using the same 2012 model Macbook Air as me, you’ll see that this takes a little more than 7 seconds. Now, using NVDA on a cheap Windows laptop, hit NVDA+F7 to bring up its analogue of the Item Chooser and you will find that its list box is on the screen and talking in less than a single second. NVDA is also using cross application communication via an API to gather its data but, using caching and other performance enhancing techniques, it actually responds in a functional amount of time; in 2014, waiting 7 seconds for a computer to do anything other than downloading something big from an online source is simply absurd.

When, in September 1999, we at FS released JAWS 3.31, we used Jamal Mazrui’s EmpowermentZone web site as our favorite reference page. Jamal has something like 1700 links on the home page and, according to VO, it has 3789 objects in all. Back then, we were running on 60 MHz Pentium processors with megabytes of RAM and JAWS 3.31 could load its object list dialogue on this page in about 20 seconds (compared with about 25 minutes using Window-Eyes). Just now, when I went into Safari to test this page, it took about 30 seconds for VO to load its Item Chooser on hardware more than a decade newer, a quad-core system whose speed is measured in GHz and which has thousands of times more RAM. We solved this problem on Windows 98, effectively a 16-bit system; certainly, Apple can solve it now that much faster hardware is available.

The Broken Item Chooser

If a user hits VO+i to bring up the Item Chooser before a page has finished loading, it will bring up the list box but, when one hits ENTER on an item, it will just ding and not bring the user to the point he had requested. VO seems to load all of its data much more slowly than any other screen reader (if I bring up the NVDA analogue of this dialogue by hitting the keystroke immediately after requesting a page, it appears immediately and is never out of sync with the rest of NVDA). I’m going to guess that this is a threading issue, which is hard to fix, but this bug has been present for years now and has been reported by me; I’m also certain that others have reported this problem to Apple.

For no reason apparent to anyone outside of Apple, it seems that the Item Chooser information isn’t cached anywhere. Hence, when one hits VO+i on the same page twice, VO takes as much time to build the list the second time as the first. If the page hasn’t changed, the Item Chooser information should all be present either in memory or cached on a disk and should, even given the other VO constraints, load virtually instantly the second time through. 
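To illustrate the caching argument, here is a minimal sketch, with all names invented, of how an item list could be keyed by a hash of the page source so that the expensive traversal runs only when the page actually changes:

```python
import hashlib

_item_cache = {}

def item_chooser(page_source, build_item_list):
    """Return the item list, rebuilding only when the page changes."""
    key = hashlib.sha256(page_source.encode("utf-8")).hexdigest()
    if key not in _item_cache:
        # The expensive traversal runs only on a cache miss.
        _item_cache[key] = build_item_list(page_source)
    return _item_cache[key]

# Demonstration: the slow builder runs once for two requests on the
# same, unchanged page.
calls = []
def slow_builder(source):
    calls.append(source)      # stand-in for the slow page traversal
    return source.split()     # pretend every word is an "item"

first = item_chooser("heading link table", slow_builder)
second = item_chooser("heading link table", slow_builder)
print(len(calls))  # prints 1: the list was built only once
```

On an unchanged page, the second request would then cost a lookup rather than a rebuild, which is exactly the behavior users expect when they hit VO+i twice.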

What About “Clickable” Items?

When VO describes an item as a link, using VO+SPACEBAR will always open it. When, however, VO reports an item as being “clickable,” more often than not, VO+SPACEBAR does absolutely nothing and hitting VO+SHIFT+SPACEBAR to send a mouse click to the object works infrequently. I have my VO configuration set to have the mouse cursor follow focus and I also try hitting VO+SHIFT+F5 to route the mouse cursor to the object I’m on, but that seems to rarely work as well. [Note: while here in TextEdit, routing the mouse cursor works properly but, while using Apple Mail, with my mouse cursor set to follow VO, I hit VO+SHIFT+F5 and wasn’t brought to the word I had just typed nor was I even brought to the edit area where I’m typing this but, rather, I clicked something on the Dock, a seriously bad outcome.]

Compared to JAWS

People who have read articles on this blog like “I Give Up” or “An Open Letter to Mark Riccobono” will know that I’m not just a user of Apple products but, based entirely on accessibility, I’m something of an advocate often recommending their hardware to other blind users. Now that Apple seems to have made iWork, their office suite, mostly accessible, the Internet is the only aspect of VoiceOver that I still don’t like much. Readers of this blog would have heard me say, “Apple has set the gold standard for out-of-the-box accessibility” which is true, for almost everything, except for the Internet. Online, JAWS remains the king with NVDA a close second. This is the one area where Apple really needs to do some massive improvements.

If I actually picked up the user agent guidelines and tested each item separately, I would find a ton more bugs, a really big list of defects that one would not encounter using JAWS or NVDA. It’s pretty much Internet access that causes me to use Windows to do most of my research and a lot of my writing these days. On all scorecards regarding screen reader functionality published online (I’m working on an article about these reports coming soon to this blog), JAWS remains the gold standard for using a speech interface to read the web. Apple may have set the bar for nearly everything else but, if Apple wants to be the best online as well, they have a lot of work ahead of them.

The Object Model

VoiceOver arranges its web information by object but doesn’t also include a simpler navigation metaphor. Hence, as I wrote above, it uses a different system for moving from object to object so a separate keystroke is needed for each separate object. If a web site contains the sentence (including the links):

“Let’s compare the number of keystrokes necessary to read this sentence with JAWS, NVDA, SystemAccess, VoiceOver and a few other bits of access technology.”

an NVDA or JAWS user might read the entire sentence by issuing a single keystroke (a down arrow, perhaps) but, with VoiceOver reading each object separately, one needs to issue a keystroke for each link plus each chunk of text separating them, a total of eight keystrokes to read a single sentence. Also, while doing a “read all” of an entire web page, the user will hear pauses caused by VoiceOver adding a tone for each link, making the entire reading experience sound really choppy. This is massively inefficient for users and should be corrected immediately.
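To make the arithmetic concrete, here is my own toy breakdown of the quoted sentence; the exact per-object count depends on how a screen reader chunks the text runs between links, so treat the numbers as illustrative:

```python
# My own toy chunking of the quoted sentence: the four screen reader
# names are links, everything else is plain text runs.
chunks = [
    ("text", "Let's compare the number of keystrokes necessary to read this sentence with "),
    ("link", "JAWS"), ("text", ", "),
    ("link", "NVDA"), ("text", ", "),
    ("link", "SystemAccess"), ("text", ", "),
    ("link", "VoiceOver"),
    ("text", " and a few other bits of access technology."),
]

# One keystroke per object (link or text run) versus one per line.
per_object_keystrokes = len(chunks)
line_based_keystrokes = 1

print(per_object_keystrokes, line_based_keystrokes)  # prints: 9 1
```

However the separating text is counted, the per-object model costs roughly one keystroke per link plus one per run of text, against a single keystroke in a line-based virtual buffer.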

The Interaction Model

When a user accesses a web site with JAWS or NVDA, the information is pretty much organized like a word processing document with all of the same keystrokes for navigating through such. With VoiceOver, Apple introduced a model that attempts to group chunks of a web site into larger blocks that a user can navigate between and, when they are in a place they want more detail, the users can “interact” with that portion of the web page. In theory, this system should allow for greater efficiency as it permits the user to easily jump past information in these groups.

Very unfortunately, though, the interaction model only seems to improve efficiency on tremendously well organized web sites and, more often than not, actually requires the user to issue more keystrokes than in the “virtual buffer” model presented by most other screen readers out there. For a quick example of this, if you’re using a Macintosh running VoiceOver to read this page, find the place where one can follow me on Twitter and you’ll notice that you need to interact (and, therefore, stop interacting when you’re done) with an item that contains very little actual information. With JAWS or NVDA, however, simply moving from line to line with arrow keys gets you everything you want.

In my experience, the interaction model causes far more efficiency problems than it solves.

Conclusions

This isn’t my most well organized piece and it gets repetitive in places. I wanted to show, however, how even Apple, the world’s leader in out-of-the-box accessibility, still needs to continue improving. I’m certain that this item will gather me a bunch of comments (either public or private through the contact form on this site) about other problems blind users encounter with VoiceOver on Macintosh. If you have other bugs to report, sending them to me may make a future version of this article more comprehensive but, at the same time, I urge you to report any problems you encounter to Apple’s accessibility email address so they may both know about the bugs and understand just how many people are badly affected by defects and design flaws in their accessibility software.

Preserving Our History

Introduction

Recently, I wrote an article called “Job Access With Bugs?,” in which I explore some of the generally accepted notions around access technology for PWVI. That article came as part of my ongoing attempt to record the history of the screen reader in the years following 1998, when I joined Henter-Joyce as Director of Software Engineering. These articles have been popular with our readers and I’m happy that some of our history is preserved in them even if my work tends to be loaded with opinion, conjecture and is based largely in anecdote rather than serious historical inquiry.

In these articles, I try to include a link to every proper noun when it first appears in the pieces and I try to include links to concepts that may be unfamiliar to our readers. When doing so, as I wrote in the conclusion of “Job Access,” I try to find links to objective materials, mostly Wikipedia, rather than personal blogs or marketing oriented company web sites. While writing “Job Access,” however, I realized that little of our history, the history of technology for PWVI, has been recorded in the public record.

This piece intends to encourage people to write and edit Wikipedia pages about the technology we use and have been using for a few decades now; it proposes an idea for gathering an oral history describing our use cases and how such technology has affected our lives over the years; and, lastly, it mentions a computer museum interested in curating a collection of AT hardware.

The Blazie Engineering Braille ’N Speak

Arguably, the Braille ’N Speak (BNS) from Blazie Engineering may be the single most important bit of access technology for PWVI in history. I know literally hundreds of blind people for whom the BNS was their first piece of access technology and who, using this once remarkable device, were able to attend school, go to university and perform a lot of professional functions. I also know dozens of blind software professionals who got their start programming by first learning BNS Basic. This device is certainly an important part of our history but it has no Wikipedia entry, nor is there an entry for Dean Blazie, the inventor of the BNS, or for Blazie Engineering, the very important AT company that built the product.

In my mind, this should be the first item corrected on Wikipedia. Someone who knows a lot more about the BNS than I do (I never owned or used one myself) should write up an article about it. It would also be important to add entries for Dean the man (again, he’s someone I’ve met a few times so someone with a greater level of familiarity with Dean should write such) and for Blazie Engineering, a very important manufacturer of braille devices as well.

Henter-Joyce, GW Micro and Window-Eyes

While JAWS and Ted Henter have token Wikipedia articles about them (something that we really must improve and something I might edit myself), two companies important to our history, Henter-Joyce and GW Micro, do not. Window-Eyes, for many years the second most popular screen reader and the first to embrace an API-based strategy for gathering data, regrettably also has no Wikipedia entry. Doug Geoffray, the most visible member of the GW Micro team, is also without one.

I can probably write an article about Henter-Joyce as I can call Ted to get the story right, but someone other than me would need to write up articles about GW Micro, Window-Eyes, Doug Geoffray and the others there who helped invent our future.

Less Prominent Technology

I know a great deal about a few things but virtually nothing about many of the other technologies that PWVI have used over the years. My own braille skills are horrible, so I’ve never actually used a braille display, nor have I done much with a braille keyboard. While I managed MAGic and WYNN (a product for users with learning disabilities), I’ve never used them myself and, beyond the theoretical side of this sort of technology, I can’t really speak to such.

It’s important that our history is preserved so, please, if you’re so inclined, make yourself a Wikipedia account and start documenting our history.

An AT Oral History

If our readers think it’s a good idea, I will set up a wiki on this site where PWVI can write up their stories about how they use access technology and how it has affected their lives. Here, in an informal way, individuals can tell the stories that I hear from people daily. That an individual got a copy of JAWS, spent time learning it and was able to use these skills to advance a career, further an education or do something else productive is a major part of our history that remains unrecorded. An “accessibility stories” collection would provide a single place on the Internet where these stories could be gathered and made available to others.

Personal stories are a major aspect of history and, if we launch such a wiki, we’d have a place where such stories could be found, studied and organized in a manner that doesn’t exist today. Of course, such a wiki would be useless if no one is willing to write stories for it. So, if you think this is a good idea and are willing to post at least one story about how you’ve used AT, please tell me so and, if I hear from enough people, I’ll add the functionality to this web site.

What About the Hardware?

While the BNS may be the most important piece of hardware this community has ever enjoyed using, it is certainly not the only one with great importance. At the same time, as far as my research could tell, the only museum on Earth that has a BNS in its collection is the Smithsonian where, along with JAWS for Windows 3.20, it is one of only two pieces of access technology in the collection.

Recently, I attended the HOPE X conference in New York City. There, I had the opportunity to meet with a group of people involved with a vintage computer museum. I’m currently in negotiations with them about launching an access technology area in their collection. To this end, if you have old access technology hardware around that you would like to donate to the museum, please connect with me through the contact form on this site and we can make arrangements to have your old hardware shipped to their museum. I will be writing up stories about each device and will, of course, require your input to ensure their accuracy so people visiting the museum will understand why the device was important to our community when it was current.

Conclusions

I personally feel that it is a tragedy that our history has not been properly preserved. Our heroes, people like Ted Henter, Dean Blazie, Glenn Gordon and others, are simply not remembered online the way that those who made far less important mainstream technology are. The devices that our community used to get an education, to work in industry and elsewhere are not remembered either. Adding and improving Wikipedia entries is easy and, if you have old hardware, I seem to have found a home for it as well. I’d like to have the oral history wiki too, but I’m uncertain that we’ll get enough volunteers writing up their stories to make it worthwhile.

So, please help our community preserve its history. We’re at an interesting time when we can write our own history and, in my opinion, we really should be doing so.

An Open Letter To Mark Riccobono

A Note To Our Readers

Mark Riccobono is the new president of NFB, the nation’s largest group advocating for people with vision impairment. I find him an interesting choice as president of the organization. This is a letter I’ve drafted to him regarding NFB, technology and its recent resolution asking Apple to require accessibility for submission to its App Store.

The Letter

Dear Mark,

If you don’t know who I am, as a matter of introduction, I’ve been working in access technology and accessibility since 1998. I’m a former VP of Software Engineering at Freedom Scientific and have been something of an accessibility researcher, advocate, activist, gadfly, loudmouth and crackpot since. You can learn all about me by reading the blog where this letter has been posted and in the archive of BlindConfidential, the very popular blog I wrote for many years.

To start, please accept my sincere congratulations on your election to the presidency of the National Federation of the Blind (NFB). I’m highly encouraged that NFB now has someone at the top of the organization who seems (to me at least) to have a grasp of the world of technology, research and the tools that a blind person needs to compete in professional settings, in schools and to enjoy a connected life in the information age. I look forward to seeing how your insights affect NFB policy and actions as we move into the future.

I would also like to congratulate you on having successfully led the project that resulted in your being able to drive a car at Daytona. While I live in a big city and rarely need to get into a car for any reason, I recognize the profound level of freedom that could be accorded blind people if they could operate a motor vehicle independently. That NFB could work with Virginia Tech to make such an amazingly innovative system is, indeed, a tremendous achievement and I look forward to seeing it evolve into the future.

As I’m a technology specialist, I read with great interest the NFB resolution stating that it will work with Apple to improve the accessibility of third party applications in its App Store. I also read the piece you wrote further explaining the resolution and found it informative as well.

Having read both, it is my understanding that the resolution states that NFB believes Apple should require accessibility compliance as a condition of inclusion in its App Store. I agree with this assertion entirely.

I was also happy to read the whereas clauses in the resolution and enjoyed reading how you summarized such in your article. It is heartening to hear NFB state publicly, in a resolution, that Apple is the clear leader in accessibility and that Apple has done more than any other OS vendor to accommodate our needs in their technology. I agree with these statements entirely as well.

If, however, I attended the NFB convention as a delegate (an incredibly unlikely event as I’m not an NFB member), I would, although I agree entirely with the language of the resolution, have had to vote against its passing. While everything the resolution says is excellent, the problems are with what it doesn’t say and how its passing was perceived in the community of blind technology experts.

If this resolution, instead of saying, “We resolve that Apple…” instead said, “We resolve that Amazon, Apple, Google, Microsoft and all OS vendors with an online software store…” I would be writing an article celebrating its passing. Singling out Apple, however, even with the statements that they already do a better job with accessibility than any other vendor, is not in my mind an acceptable statement to make.

Specifically, this resolution asks Apple to ensure the accessibility of third party applications, something NFB has not resolved to ask Amazon, Google and Microsoft to do even for applications that carry their own brand names. As the whereas clauses clearly state, Apple is already the best in this space. Asking Apple to do something regarding software over which it has no control is, as I said above, an excellent idea but, without first insisting that the rest of the industry reach parity with Apple on software over which they have complete control, raising the bar for Apple while exempting its competitors will, at best, cause confusion in the world of technology for people with vision impairment. The resolution, because of what it didn’t include, is perceived as a criticism of the best player in the game while ignoring similar and much worse problems at Amazon, Google and Microsoft.

Please also realize that this is 2014, a time in history when a long written resolution followed by an article explaining it by the president of NFB will be read by very few people interested in the subject. I write long form essays and I’m an accessibility nerd of the highest order, hence, I’m the kind of guy who actually reads such things. I’m sad to report, though, that few of my peers in the world of blindness and technology would take the time to read through such material or to fully understand the nuances therein. This is the age of 140 character conversations and, while not true, the perception of this latest NFB resolution on Twitter is, “NFB slams Apple again” and “Riccobono doubles down on Apple slam.” I agree that these summaries are unfair but we don’t live in a fair world where everyone takes the time to read the details. Perception is tremendously important and, speaking on behalf of other blind technology professionals to whom I’ve spoken in the past week, NFB has yet another major problem in the hearts and minds of this community. I don’t know how to fix this problem but it’s something about which NFB needs to be aware if it hopes to regain credibility among this admittedly elite class of blind professionals.

I believe that you can be an agent of change within NFB. I’m happy to hear that NFBCS has a new leader and I hope to see NFB improve its statements on technology as we move forward. While I may or may not agree with NFB resolutions that are passed in the future, speaking for myself and others to whom I’ve spoken, simply making NFB public statements consistent when addressing technology vendors will help substantially with this credibility issue. If you’re going to resolve that one technology vendor do something, please resolve that they all do it and you’ll find that some people like me, vocal critics of NFB in the past, will start paying attention and, perhaps, join and become active in NFB in the future.

In conclusion, I’m very happy to see you as the new president of NFB. I’m excited about progress on your automobile project. I agree with the text of the aforementioned resolution but I’m concerned with seeing Apple singled out for technological developments out of its control without a similar standard being applied to Amazon, Google and Microsoft for software entirely under their control. It seems very inconsistent to me and those to whom I’ve spoken. NFB has a perception problem among blind technology professionals and consistency in statements about technology would go a long way to allowing NFB to regain credibility in this community.

Sincerely,
Chris “Gonz blinko” Hofstader
7/23/2014

Job Access With Bugs?

Introduction

For years, I’ve heard anecdotal reports that JAWS, the world’s most popular screen reader, has more bugs, is less reliable, less stable and of a generally poorer quality than some of its competitors. In that same period, starting in 1998 and continuing until today, I have never seen a single bit of quantitative evidence demonstrating that this is true. I hear people around the community make these claims based on personal experience, experience that is certainly valid, but no one has published a scorecard listing every feature in every application supported by each screen reader, tested each and published the results. I’ve also never seen any detailed reports of reliability, only the same sorts of personal stories.

In this article, I want to explore some of the generally accepted notions about screen reader quality and functionality and ask why, if JAWS is such a bad piece of software, it maintains a market share of over 50% and still dominates in most professional settings. Furthermore, I want to explore some of the issues discussed in my article, “Remembering GW Micro,” that I published last month.

As a matter of disclosure, I don’t use JAWS. For the most part, my primary system is a MacBook Air running OS X Mavericks with the VoiceOver screen reader. I do use Windows with some frequency but, on that system, I use NVDA because I really like how it works in Firefox. This points to a second theme I hope to explore in this piece: is the opportunity for career advancement, education and other advanced computer usage provided by JAWS more valuable than having fewer bugs if, indeed, JAWS does have more bugs than its competitors?

Ted Henter

Before there was a JAWS, Ted Henter, its inventor and leader for many years, came to a realization. Specifically, while some talking computer technology had already emerged, none of it was vocationally oriented. In those days, Ted worked for Dean Blazie, a close friend of his to this day, where they made the Braille ’N Speak (BNS), a truly remarkable device in its day. A blind user could do a lot with a BNS but it provided no access to the programs one might use in a job or at university.

To solve this problem, Ted found an investor and started working on the DOS version of a program he called Job Access With Speech. From day one, the defining value behind JAWS was to provide access to professional situations and, to this day, it remains the dominant access technology for blind people in professional settings.

GW Micro Marketing

I joined Henter-Joyce in October of 1998. Among the first things I noticed was that the GW Micro web site claimed that Window-Eyes was “rock solid.” I’ve heard this claim repeated in their marketing materials and in reports from their users. What I’ve never seen is the scorecard I mention in the introduction of this article. I try to base my opinions on evidence; when I did my evaluation of Android, I tested every single feature that came out of the box on my Nexus 7. Before I make a claim of quality or lack thereof, I try to perform as full an evaluation as I can or find a published report containing such written by a credible source. In the 16 years I’ve been following screen readers, I’ve never seen a single report card of this sort for Windows screen readers, just lots of personal reports, lots of anecdote without evidence.

Does the lack of quantitative evidence mean that the assertions that JAWS is less stable than its competitors are untrue? Absolutely not, it just means that there is no data that can answer this question, so I’ll leave it unanswered. It’s not unreasonable for someone making a purchasing decision to rely on the anecdotal reports written by other users as, in the absence of real data, it’s all a blind consumer might have.

Regarding Window-Eyes, when Microsoft announced that one could get a copy at no extra cost if they owned Office, I grabbed a copy. I did not perform an extensive evaluation of the product as the reliability problems I found in the first half hour of using it convinced me that continuing was a waste of time. Specifically, on the Windows login screen, if one mistypes their password, Window-Eyes reads nothing in the resulting error box beyond its “OK” button, so a user doesn’t know what he’s saying “OK” to. Then, I discovered that when a user launches Window-Eyes, it may not read applications that were opened before it was started, a problem that exists in neither JAWS nor NVDA. Others whom I trust have reported other major bugs as well. If Window-Eyes is, indeed, “rock solid,” I don’t see it.

Meanwhile, Window-Eyes remains the only screen reader on Windows that still supports neither touch gestures for navigation nor ARIA on the Internet (yeah, I know, GW Micro says it’s coming but it took them a decade to get Java supported so “is coming” may mean 2025). Window-Eyes, in my mind, remains highly buggy and as feature-poor as anything on the market today.

Let’s Look At Some Numbers

According to the 2014 WebAIM survey, JAWS holds a market share in excess of 50% with NVDA approaching 20% and Window-Eyes falling in at about six points. To make the arithmetic easier, let’s say that JAWS has 8 times the number of users as does Window-Eyes. Hence, it is run on 8 times as wide a variety of hardware, in 8 times as many combinations of personal settings, setups and Windows configurations. Let’s also assume that there are 8 times as many JAWS users discussing their problems online and, therefore, it’s 8 times as likely that a JAWS bug will be seen by the Internet reading public as would a bug in Window-Eyes. Is it possible that JAWS’ much broader user base and much larger exposure in online media (formal and otherwise) may lead one to believe that it is actually more buggy? In the absence of the aforementioned scorecard, we cannot know.
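The back-of-the-envelope reasoning above can be sketched as a quick calculation. The shares are the rounded 2014 WebAIM figures mentioned in the text; the equal bug rate and the population size are my own assumptions, chosen purely for illustration:

```python
# Toy model: if every screen reader had the SAME underlying bug rate,
# how many bug reports would we expect to see circulating for each,
# given market shares alone? (Shares rounded from the 2014 WebAIM
# survey; bug rate and user population are illustrative assumptions.)

market_share = {"JAWS": 0.50, "NVDA": 0.20, "Window-Eyes": 0.06}

bugs_per_user = 1.0      # assumed identical for every product
total_users = 100_000    # hypothetical population size

expected_reports = {
    name: share * total_users * bugs_per_user
    for name, share in market_share.items()
}

ratio = expected_reports["JAWS"] / expected_reports["Window-Eyes"]
print(expected_reports)
print(f"JAWS reports per Window-Eyes report: {ratio:.1f}x")
```

Even with identical quality, this sketch predicts roughly eight JAWS bug reports for every Window-Eyes report, which is exactly why raw anecdote volume can’t settle the quality question.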

JAWS’ Broader Feature Set

No one questions that JAWS is more feature rich than any other screen reader. It became so because of Ted’s commitment to providing a tool that blind people could use in professional settings. As far as anyone can tell, JAWS is still dominant in these settings because of its feature set, features which are absolutely necessary for many people to hold a job or further their education.

After I wrote the article describing my memories of GW Micro, a reader posted a comment reasserting, without any evidence, that GW won’t release a feature until “it’s rock solid,” parroting Window-Eyes marketing literature. The person who posted the comment continued by stating that GW didn’t add Java support to Window-Eyes until version 8.0 and suggested that the near decade it took them to catch up to JAWS in this area was because of their commitment to quality. This implies that GW Micro had been working on their Java support for all of that time but chose not to release it until it was “rock solid” which, of course, is false. GW Micro didn’t add Java support until they were absolutely forced to do so by market demands.

What if the JAWS team had also decided to wait many years before adding Java support? A year after JAWS first supported the Java Access Bridge, the University of Florida (one of the top twenty public engineering colleges in the US) decided to change its computer science and computer engineering curriculum from being based on the Scheme programming language (a Lisp-like language developed at MIT in the seventies) to Java. A blind student in that program could have, if he so chose, used Window-Eyes, which was among the approved AT provided by the university, but, had he made that choice, he would have had to drop out of the program as, using Window-Eyes, he could not possibly have done his class work. I suppose that the person who wrote the comment considered this when he posted his statement and I suppose also that he thinks waiting a decade for your AT to catch up to the reality of the technological world is a good idea. Our hypothetical blind student had no choice; he either chose JAWS or he failed out of college.

Personally, I think that saving that student’s college career is the most important thing a screen reader team can do with its time but, as always, I’d like to hear your comments.

A Data Point I’d Like To See

The WebAIM statistics are nice, especially because they run year to year and allow us to observe trends. It’s also a self-selecting survey which, like all self-selecting surveys, is fraught with problems. Is one screen reader underrepresented in the report while another is overrepresented? This is a question the WebAIM report cannot answer. It would be impractical to expand the WebAIM survey to include more personal information about screen reader users. Unfortunately, there is very little other published data that can tell us much about the makeup of the screen reader using public.

The data points I’d like to see, from a real, well constructed study, would help us learn much more about the efficacy of a particular screen reader. Specifically, I’d like to learn the median income of employed JAWS users versus the median income of users of other screen readers. I’d also like to learn the average level of education attained by users of JAWS versus the other screen readers. Based purely on anecdote and in the complete absence of real statistical data, I’m willing to bet anyone $100 that JAWS users are A: more likely to be employed, B: likely to make more money and C: better educated than users of any other screen reader except, perhaps, NVDA. Of course, it would cost much more than a hundred bucks to do the study properly so the bet is probably not worth taking.

As I wrote in “Remembering,” I believe this is why Window-Eyes failed in the market and is why GW Micro is no longer a going concern. JAWS did everything possible to build a base in employment sectors, NVDA came along and grabbed a whole lot of the more technical blinks and SystemAccess grabbed the novice users while Window-Eyes offered nothing special at all.

Fanboyism

Earlier this year, when I published the three Android reviews, I expected and received a spanking from its loyal enthusiasts. Years ago, when I wrote BlindConfidential articles with titles like “Apple Just Sucks,” I got spanked by Apple’s fanboys. When I write critically about Window-Eyes, I hear from its loyal users as well. I understand that people love the things they use, the technology in which they’ve invested a lot of time and energy learning and they respond to criticism of their favorite things. I admit, I cringe when I hear some of my favorite things criticized as well.

What I didn’t expect from the Android series, though, was the celebration thrown by the iOS fans. In my mind, celebrating accessibility failures is never a good idea. I really like my MacBook Air and my iPhone 5S but I want all devices to be equally or more accessible. I take no joy in writing a review showing that a device’s accessibility, based upon testing I’ve done or published reports from credible sources, is substandard. That you chose a device my blog suggests is “better” is a bad reason to celebrate that other devices may not be as good. This isn’t a game; Apple ain’t the Red Sox and Google ain’t the Yankees and there’s no reason to root for one massively profit generating corporation over another.

When I write a critical piece, I do so to inform my readers of results I have learned about some bit of technology. I do not do so to “gloat” that I made a particular purchasing decision over another. I have no skin in this game; if a new device comes out tomorrow that I think I will like, I’ll go get it no matter the vendor. I view technology as tools and nothing more and I don’t root for Craftsman versus Snap-on either.

Conclusions

In general, I think that the access technology business needs much more real data driving the opinion pieces that are so rampant in this community. We all have our favorite things and it’s good that some people write about such, create tutorials and do all of the other things that make using computing devices much simpler for our community but it’s also essential that we try to stick to facts, find the data to support our assertions and view all marketing literature with a very skeptical approach.

While editing this piece, I went through my usual process of adding links to as many of the proper nouns in this article as possible. I usually add a link to the first occurrence of any proper noun I use in an article. I always prefer including a link to a Wikipedia entry instead of a company or personal web site as Wikipedia’s crowdsourced manner of creating content is far more likely to be objective than are web sites written by businesses as marketing tools or by individuals about themselves. In this piece, I found a Wikipedia article I could link to about Ted Henter but not one about Dean Blazie. Some popular screen readers have Wikipedia entries; some do not.

Perhaps it’s a result of poor accessibility in the Wikipedia interface one uses to add or edit an article but, no matter the reason, the history of access technology, the products, the people who created them and the steady improvement of such is hardly reflected on Wikipedia. This is the one forum where we, as consumers, advocates, developers and users can write our own history and it’s something that we should do as soon as possible.