Regular readers of this blog (both of you) would already know that I’m a vocal skeptic, humanist, atheist and science enthusiast. When I have the opportunity, I enjoy meeting up with other like-minded people at conferences, “pub” events, local dinners and other similar gatherings. In our winter residence in Florida, I enjoy the South Pinellas Skeptics meet-ups and, in our Cambridge, Massachusetts home during the summer, I try to attend Mary Brock’s Boston Skeptics Book Club meetings in Harvard Square.

Conferences, however, require planning, travel and a reasonably large expense. I’d love to attend a bunch of conferences around the world; alas, I need to be picky, as I can only afford to attend a few events per year, and those include technology events unrelated to skepticism in any way.

Last year, I attended two skeptical/humanist/science sorts of events, QED in Manchester, England and Women in Secularism (WiS) in Washington, DC. So far this year, I’ve only attended the One Web For All hackathon in San Francisco but, next month, I’ll be flying back to the UK to attend QED for the second year in a row.

What Makes QED So Special?

Before attending QED 2013, I wrote a somewhat tongue-in-cheek blog post called “Gonz and the X-Dog at QED” which provided a remedial tutorial in how people can and should engage with a blind person and his dog while at a conference. When I wrote that piece, almost exactly one year ago, I thought it would provide an amusing look at social relationships and a person with vision impairment. What would happen to my blind friend and me at WiS would, however, teach me that the wonderful time I had at QED 2013 might not be a reality I could expect elsewhere.

QED and Accessibility

When, in autumn 2012, I purchased my tickets for QED, I followed up with an email to the Merseyside Skeptics Society (MSS) telling them of potential accessibility problems at a conference. I received a very friendly reply from Mike Hall, one of the guys on the terrific “Skeptics With A K” podcast and the person responsible for the technological portions of the conference (web site, handouts, etc.). In his note, Mike said that, closer to the conference date, he’d send me the files and, if there were any problems, he would remediate them before the actual event. About eight weeks before QED 2013, I got a nice email from Mike containing the PDF files with the conference handouts and, as he had already looked up in advance how to properly tag a PDF for accessibility, they were all fully accessible before I even saw them. This is exactly how accessibility should be handled: if there’s a standard to follow, just do as Mike did and follow the guidelines, and you won’t need anyone to help with remediation because there won’t be anything to remedy.

Social Models at QED

Most people who attended QED 2013 had not read my blog article describing how to act around a blind person and his dog. Nonetheless, while in Manchester last year, not a single QED conference attendee touched either me or my dog without first announcing their presence. Perhaps the public school systems in the UK do a better job of educating their society about human relationships with people with disabilities or, perhaps, the QED-attending population is far more “with it” than those who might attend a CFI event in DC. I honestly don’t know why QED attendees are so nice to be around; all I can say is that I deeply appreciate the culture of the event.

Other Blind People at QED?

Last year, as QED wound down, I promised that I’d do whatever I could to at least double the number of blind people in attendance. Mike Hall and the MSS gang did such a great job with accessibility in 2013 and, this year, did an even better job with the QED web site than before. When I bought my QED tickets this year, I found a single, very minor accessibility bug on the site; I reported it to Mike and it was fixed less than half an hour later. Mike’s commitment to accessibility is stronger than that of some people who work on the technical side of accessibility/disability oriented conferences, and QED is, by far, the most accessible mainstream conference I’ve ever been around.

Unfortunately, although nearly a year ago, I promised to pay for a QED ticket for any other blind person who cared to attend, I’ve received zero requests for such. A few friends toyed with the idea but, for a variety of pretty good reasons, they couldn’t attend this year. This makes me sad as Mr. Hall has done a lot of excellent work to make the conference accessible to our population but I’m the only one who seems capable of enjoying his work. If you’re blind and enjoy science, humanism, atheism, skepticism and related topics, please do consider attending QED. I’m told there are very few tickets left but, if you’re blind and want to attend, write to me through the contact form and, if you’re serious, I’ll get you a ticket if any remain. I’d just love to be able to thank Mike for his terrific efforts by showing that more blind people than just me will attend.

How Is QED Special Otherwise?

Unlike some skeptical events, TAM for instance, QED has no green room for the speakers and other celebrities to hide in. In 2013, I enjoyed chatting with Lawrence Krauss as if he were just another attendee. During an overflow panel, a lot of people had to sit on the floor; Richard Dawkins himself was on the floor beside the X-Dog, showing a side of the controversial and often difficult man that one would rarely otherwise have the opportunity to witness. I was able to meet and hang out with as many of the speakers as I had hoped to and enjoyed establishing friendships with people like Carrie Poppy, Michael Marshall and the Pod Delusion people.

The QED 2014 speaker list contains a broad selection of sub-topics from the entire spectrum of scientific skepticism. I’m really looking forward to hearing a lot of the talks and panels announced so far.

More than the presentations, though, I look forward to hanging out with friends I had made at QED last year and before then as well. I can’t wait to see skeptical notables like Hayley Stevens and Rhys Morgan but also all of the nice people whose names few of you would recognize. At QED 2013, I felt that I had “found my tribe” and I look forward to meeting more friends whom I haven’t met yet.

Getting Involved In Skepticism

Since attending QED 2013, I’ve continued making the occasional contribution to Pod Delusion and have done a handful of guest posts for Skepchick as well. Most interestingly, though, I’m a founding contributor to a new site called Skeptability, a Skepchick sister site about disability. Skeptability isn’t online yet so please follow this blog to learn when we’ve launched.


Whether you are blind or have another disability or not, I recommend you attend QED next month. Check out the QED site and you’ll undoubtedly find aspects of it you would like to hear. Come to reward MSS for doing a terrific job of accessibility for their event but, mostly, attend this event to meet amazing people with interests similar to mine. If you like this blog, you’ll love QED.

Good Books, Bad Books, You Know I’ve Had My Share


Regular readers of this blog, which is to say people who read the articles I write unrelated to disability as well as those about accessibility and such (about 10 of you), would know that I’m also passionate about music, literature, poetry and a lot of other artistic endeavors. At any given time, I’m usually reading two or three books concurrently. Typically, I’ll have one silly book, something by Terry Pratchett for instance, a non-fiction book about some topic I find interesting and a work of “art literature” often from the past.

Over the past week, I read a non-fiction book that I thought was so terrible that I had to reach for one of the greatest books in the canon of American literature just because I needed a strong dose of beauty and genius to rebalance a brain punished by the non-fiction work I had read immediately prior.

The Blues

The blues, a style of music that derived from spirituals sung by African slaves in 18th and 19th century America, is the genre that inspired most other styles of American music. Rock and roll, a lot of jazz, country, rockabilly, R&B and other American inventions have their roots in whole or in part in the blues.

The blues is also the style of music I enjoy playing. By no known definition of the word can I be called a “musician” but I can blow blues harmonica well enough to have a lot of fun jamming with friends. It was when I played acoustic blues with a friend then working on his PhD in aero-astro engineering at MIT (an actual rocket scientist) that I acquired the nickname Blind Christian as we thought that a pair of nerds playing American roots music called “Blind Christian and Chunder” sounded like a couple of old guys from blues history.

As I do with many things I love, I’ve spent a lot of time over the years reading about the history of the blues and listening to the music chronologically, both to learn about its evolution and to see which musicians influenced those who came later. I enjoy the history of artistic movements and observing how, as each generation stood on the shoulders of the giants who came before it, they would take the art in a different direction.

Most recently, I’ve been listening to a lot of blues-rock from the sixties and seventies, largely acts from England. I reached the British blues-rock era, music built upon the work of performers like Buddy Guy, Muddy Waters, B.B. King and a few others. I found myself listening to a lot of Cream, Jimi Hendrix (sure, he was born in America and learned his craft here, but he had to go to the UK to find the other members of his band and to develop an audience he couldn’t find in the States), The Yardbirds and a few others. This brought me to Led Zeppelin, an act I ignored during their peak years while I was in high school.

The Bad Book

As I spent my time listening to early recordings by Led Zeppelin, I found that I was tremendously impressed by their skill as musicians. Jimmy Page could play guitar as well as any blues-rock player, including Hendrix. John Bonham followed in the tradition of drumming masters like Buddy Rich, Elvin Jones, Gene Krupa, Ginger Baker, Art Blakey and others who brought the drum kit into the forefront of their music rather than just keeping time and banging away. John Paul Jones played bass lines that often needed to be highly complex so as to allow the guitar and drums to sound musical during wild improvisational moments. Robert Plant had a perfect voice for blues-rock while, in his prime, also looking like a Greek statue of male perfection. I wanted to learn more about this band, so I got a biography of the band from Audible.com.

“When Giants Walked The Earth: A Biography of Led Zeppelin” was the top search result on Audible when I looked for “Led Zeppelin.” The description of the book on the Audible site sounded interesting, so I bought it with one of my Audible credits.

Excuses For Crimes

Over the years, I’ve read a number of rock biographies. In general, the authors tend to canonize their subjects and describe even their worst moments in glowing terms. In “When Giants Walked The Earth”, however, the relatively juvenile passages about the wild behavior of the members of Led Zeppelin during their peak years often sound like the author is suggesting that their horrible treatment of women, their acts of sadism and possibly rape, were “a sign of the times” and not the reprehensible violence that they really were. A lot of bands lived wild lives during the seventies, but this was the only book in which I’ve ever read such bogus excuses for what I think is actually criminal behavior.

Incredibly Pretentious

“Giants” contains a bunch of flashbacks to periods in the lives of the Led Zeppelin members. These, in the fashion of a creative high-school-level writer, are described in the second person: “You’ve wanted to be a singer since your fifth birthday…” I’m the reader, and that’s to whom the second person is usually addressed but, in this book, the author seems to think his readers are either actual former members of Led Zeppelin (a total of five people in its history) or would be entertained by long passages in the second person. I found this aspect of the book entirely annoying, with all of the immature notions of a teenaged writer. Of course, as this is a rock and roll biography, high-school-aged boys are probably the target audience; hence, maximizing the faux artistic stylings may allow them to think they’re being all intellectual and shit. The technique falls flat on its face.

The author also seems obsessed with Aleister Crowley, the philosopher of modern satanism. While Jimmy Page was and likely remains a member of the OTO, and some Led Zeppelin album covers contain symbols derived from OTO imagery, the implication that “Stairway to Heaven,” possibly the most popular song in rock history, could only have been written with “supernatural, satanic magic” flowing through Jimmy Page and Robert Plant, the Led Zeppelin members who wrote the song, is so ridiculous that, while reading these passages, I found myself yelling at the narrator of the audio version of the book. If an author wants to assert that something satanic occurred, it is incumbent upon said author first to prove that Satan, Lucifer or whatever supernatural actor is at work exists at all. Art doesn’t come from supernatural inspiration; it comes from hard work, lots of practice and a level of individual creativity based purely in the human condition.

Awful People

“When Giants Walked The Earth” is about five people: Jimmy Page, founder of Led Zeppelin and its guitar player; Robert Plant, lead singer; John Bonham, one of the greatest drummers of all time; John Paul Jones, bass and keyboard player and Peter Grant, the band’s manager. With the exception of Jones, the quiet member of Led Zeppelin who described his wild years as “I participated in the fun parts but when things got too ugly, I would disappear,” the people described in this book are not individuals whom I would ever want to meet in person.

Jimmy Page was a sadistic misogynist. John Bonham was a violent drunk and, based on descriptions of events, a rapist as well. Robert Plant was a self-absorbed tyrant. Peter Grant turned into a paranoid and abusive individual who acted horribly toward nearly everyone outside of the band.

Over the years, I’ve learned to separate the art from the artist. Musical giants Ludwig van Beethoven, Miles Davis and Charlie Parker, for instance, were all horrible people. Beethoven abused nearly everyone in his life; Miles beat his wives, including Frances Taylor and Cicely Tyson; Charlie Parker, a heroin junky of the worst sort, would steal from even his closest friends. Artists are all humans and some humans do very bad things. Their art, however, stands on its own and these three, at least, created works of such profound beauty that, separate from the awful men who wrote them, they will stand as beautiful forever.

The members of Led Zeppelin did horrible things while also creating some of the most lasting rock and roll music of all time. I can despise them as individuals while admiring their creations.

Major Factual Problems

Perhaps the author doesn’t know about BitTorrent and the availability of Led Zeppelin bootlegs online. In one passage of the book, the author describes a concert performed at a venue called “Earls Court” in London. The band had left the UK to become “tax refugees” so as to avoid the then incredibly high British tax rates, up to 95% on income earned abroad. In this section of the book, the author puts a whole lot of words into Plant’s mouth; when I listened to the bootleg recording of the concert, I found that he had made much of it up entirely. Robert Plant made a few snarky comments about the UK government during the concert, but didn’t say about half of the things the author attributed to him. In fact, no one at all can be heard saying about half of the statements attributed to Plant, so the author must simply have made them up.

Don’t Get This Book

If you want to learn more about Led Zeppelin, find a different source. The over the top level of pretentiousness in this book makes it nearly unreadable at times. The author tells a story that doesn’t seem compatible with other sources and does so in a manner that seems designed to amuse high school aged boys.

Rebooting My Brain

When I finished “Giants,” I desperately needed to find literature of the highest level of artistic expression. While I enjoy literature from around the world, my greatest literary passions are for 20th century American writers. As I was looking for something I knew in advance would contain prose written with the highest levels of skill, a book with stark poetic beauty, a work with rhythmic properties that can leave one’s jaw dropped and one that touches the heart of the human condition, I decided to read, for about the tenth time, William Faulkner’s “Light In August.”

For this blog article, I’m going to write a bit about this incredibly important novel and why I decided to read it now. Do not consider this an adequate review of “Light In August.” The novel has been dissected, discussed, reviewed, studied, analyzed and written about by thousands of literary experts; I’m an advanced reader but haven’t the skills to even begin to write a proper review of a real literary masterpiece. Please, if you enjoy reading great literature and are not already familiar with the works of William Faulkner, take the time to read his three masterpiece novels: “Light In August,” “The Sound and The Fury” and “Absalom, Absalom!” All three of these novels are considered by people who study such things to be among the greatest works in the English language. Faulkner is often compared favorably to literary legends like Chaucer, William Shakespeare, James Joyce, Marcel Proust and others. Unfortunately, likely due to Faulkner’s adult themes of sex, violence, racism, hatred and the fundamental condition of southern Americans, his work is rarely taught in high schools and, if you, my loyal readers, are like most people, you probably managed to graduate from college having read few masterworks during your four years of vocational education.

The Great American Novel

Throughout the 20th century, a notion called “the great American novel” persisted. In short, the idea was that there would someday be a single novel that would, better than any other, describe the American condition in prose on par with that of British writers. When many discuss this idea today, they don’t assume it’s a single novel but, rather, a collection thereof. In general, though, there is an assumption that “American” literature didn’t really exist until the late 19th century; hence, works like “Moby Dick” are often categorized as “English language literature” rather than “American literature” as, stylistically, they are far more similar to English literature than American.

Likely the earliest work to gain international attention as a great American novel was Mark Twain’s “Adventures of Huckleberry Finn.” It described the pain, in all Americans, black and white, caused by racism, slavery and hatred. Twain, however, hadn’t the luxury of including topics of sexuality and “white on white” violence in his works, as 19th century readers wouldn’t have accepted such and the censors would have banned books about such topics.

In the 20th century, authors like Sherwood Anderson, Ernest Hemingway and others would raise the artistic bar for American literature and expand the subjects covered. William Faulkner would be at the forefront of this movement.

In the decades since Faulkner died, the influences of his works are readily apparent in other masterpiece novels as well. If one reads novels by another American recipient of the Nobel Prize for Literature, Toni Morrison, one will hear echoes of Faulkner throughout. Morrison’s character Milkman, in her novel “Song of Solomon,” may not have been possible without Faulkner having created Joe Christmas, the protagonist of “Light In August.”

If you are looking for the great American novel, “Light In August” is a terrific place to start learning about the art of American literature.

Joe Christmas

Of all of the characters in American literature, including Huckleberry Finn himself, Joe Christmas may be the most well studied. This character is so complex, so wonderful and horrible, so deeply American that, decades after the book was written, he remains one of the most important characters ever created in English language fiction. If you read “Light In August” you will recognize him, love him, despise him, pity him, fear him and, quite definitely, learn to think differently about him.

The Story

“Light In August” functions on multiple levels. One can read it as a crime/mystery novel. It can be read as a novel about the American south in the post-civil-war and pre-civil-rights era. It is kind of a love story and a story of hatred and violence. The story, therefore, is far more complex than you are likely to find familiar. Compared to other modernist masterworks, however, novels like Marcel Proust’s multi-volume “Remembrance of Things Past,” James Joyce’s “Finnegans Wake” and Faulkner’s own “Absalom, Absalom!”, “Light In August” is also very “accessible.”

In “Light In August,” Faulkner uses a tremendous number of what were then largely experimental literary techniques but does so in a fashion that never distracts from the meaning of his words. If you enjoy audio books, you will undoubtedly notice the tremendously effective use of meter and rhythm in the author’s prose, which adds to the overall beauty of the work without ever sounding clunky. In some passages, Faulkner writes prose in iambic pentameter, the rhythmic pattern popular in the Elizabethan era, using southern and African American dialect. Perhaps Faulkner did this to say “take that, Shakespeare” and show off his own skills; others have suggested he did it to lend dignity to otherwise very poor and ignorant characters. No matter the motivation, it’s a delight to hear.


If you’re really interested in Led Zeppelin, find a source to satisfy your curiosity other than “When Giants Walked the Earth.” It’s a terrible book.

If you want to learn about American “art” literature, William Faulkner and “Light In August” would be a terrific place to start your education. And, if you’ve just read something so terrible that you need a brain cleansing, the works of William Faulkner are a great place to look for tremendous beauty.

Testing Android Accessibility: The Programmers’ Perspective


[This is an edited and corrected version of the article I posted on 2/21/2014. As always, when a factual correction is presented, I go back and fix the problems. I’ve listed the two factual corrections in a section following the Introduction section. I’ve also made a few grammatical changes but these change nothing at all in the theme of the article.]

Thus far, we’ve explored Android accessibility from my personal perspective (a totally blind user who doesn’t read braille) and from the view of a deaf-blind person who accesses his computational devices using braille only. The results of these extensive bits of research (I spent three months using a Google Nexus 7 on a daily basis and Scott tested as much as possible with braille only) demonstrate that, from the perspective of these two classes of user, the out-of-the-box Android accessibility experience is dismal. This article, the third and probably last entry in my Android series, demonstrates why, according to a number of blind Android developers, this is the case.


I was under the impression that the Android GUI was based on Gnome’s GTK. I had heard this and had it confirmed by a Google employee and a pair of experts in such things. After posting the article, Matt Campbell (author of the System Access screen reader as well as all of the other programs from Serotek) sent me an email saying that I had got this detail wrong. Later in the day, Peter Korn, the guy behind the excellent accessibility framework in GTK, posted a comment correcting me as well and, well, if anyone would know, it is Peter. So, I’ve removed references to the relationship between Android and GTK from this article and replaced them with a generic paragraph on the accessibility APIs Google could have used to, to misquote Isaac Newton, “stand on the shoulders of giants” like Peter, the Apple and Microsoft teams, Aaron Leventhal and all of us who participated in some way in working on the excellent accessibility APIs on the other platforms.

Steve Nutt, the guy from Serotek’s “That Android Show,” posted a largely erroneous comment (see below) in which he stated that Kindle on Android is “accessible” which, given my high standard for the definition of the word “accessible,” it is not. He is correct in his assertion that Kindle does indeed exist on Android, so I added the word “accessibly” after “exists” in the paragraph where I mention it. In a comment below, I’ve asked Steve (a really good guy in spite of his fan-droid fixation) to write a specific definition of the word “accessible” so I might understand how he can conclude that Kindle is “accessible” on Android. I sincerely hope he does, as I honestly do not understand how anyone can use the word “accessible” to describe something that does nothing more than talk and is completely, 100% inaccessible to our deaf-blind friends like Scott.

Lastly, I decided to remove the word “Kindle” from this article entirely. When I tried using it on Android, I found a pile of accessibility problems. There are workarounds to the biggest problems I found and, frankly, including Kindle in this article is just a distraction as, while some find it accessible enough to use, I don’t want to spend any more time thinking about it.

Who Am I?

For readers who do not know my background, I’ve been programming computers since 1971, when I wrote my first program at age eleven in the computer science area of Lawrence Berkeley Labs (LBL) in California. I became a professional computer programmer when I accepted my first job in the business in 1979 at Lincoln Savings (don’t blame me, I was only 19 years old and never met Charles Keating). In 1983, I moved to the Boston area and jumped into personal computer software development. In 1998, after taking a couple of years off from making software as I had lost the last of my vision, I accepted a job as Director of Software Engineering at Freedom Scientific, where I would later be promoted to a VP position. Needless to say, I understand how to make software as I’ve been doing so for the majority of my life and, baby, we’re not so young anymore.

Nonetheless, I have never even tried to write a program on the Android platform. I do not know the Java programming language, the language in which most Android software is written. I’m mostly a C/C++ and assembly language programmer and I don’t know a whole lot about more modern programming systems, languages, UI and so on. Thus, for the purposes of this article, I took a number of posts from the accessibility@google.com mailing list written by blind programmers whom I know personally, and called a few blind Android programmers on the phone for further clarification. These other programmers have all made fully accessible software on Windows, iOS, OSX and Gnome, and have done so across multiple versions of these operating systems over the years. These people work at APH, in the PhD program at NC State University and at EZ Fire; one of them worked on the popular “Q Read” book reading software for Windows.

My personal expertise in this subject comes from having a solid understanding of how software is made and from having worked on the committee whose work product was intended to be a cross-platform accessibility API but that, instead, led to the Gnome, Apple and Microsoft (UIA) accessibility APIs which, while incompatible with each other, provide a full set of features that developers can use to easily make their software accessible.

A Generic Look at Accessibility API

Over the years, there have been a number of books, articles, web sites and other sorts of publications that provide a list of all of the controls needed in a graphical user interface (GUI). I will not take the time here to summarize them as this isn’t an article on GUI design but, suffice it to say, all modern GUIs with any popularity (Windows, OSX, Gnome, iOS and Android) provide all of these different types of controls for programmers to use in their software.

On Windows (XP or newer), OSX (Tiger or newer), Gnome (2.0 or higher) and iOS (version 3 or higher), every one of the 20 or so controls available in the GUI has a corresponding element in the accessibility API. The most complicated of the popular controls is the “web view” which a programmer might use to add HTML information to a piece of software and, of course, the web control should behave identically to a web browser when a user of access technology encounters such.
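As a toy sketch of the idea above, the control-to-role pairing an accessibility API provides might be modeled like this in Java. The role names here are invented for illustration; each real API (MSAA, UIA, ATK, UIAccessibility) defines its own role vocabulary:

```java
import java.util.HashMap;
import java.util.Map;

public class RoleMapDemo {
    // Hypothetical role names; real accessibility APIs define their own enums.
    static final Map<String, String> CONTROL_TO_ROLE = new HashMap<>();
    static {
        CONTROL_TO_ROLE.put("Button", "ROLE_PUSH_BUTTON");
        CONTROL_TO_ROLE.put("CheckBox", "ROLE_CHECK_BOX");
        CONTROL_TO_ROLE.put("TextField", "ROLE_EDITABLE_TEXT");
        CONTROL_TO_ROLE.put("WebView", "ROLE_DOCUMENT_WEB");
    }

    public static void main(String[] args) {
        // A screen reader queries the role of whatever control has focus and
        // uses it to decide how to present and interact with that control.
        System.out.println("WebView -> " + CONTROL_TO_ROLE.get("WebView"));
    }
}
```

The point of the one-to-one mapping is that a screen reader never needs app-specific knowledge: knowing the role is enough to know how the control behaves.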

The Android Accessibility API

For a long time, most of the three months I spent with the Nexus 7, I made the wild assumption that most of the accessibility problems Scott and I had encountered and documented in the previous articles were the result of the Google-branded apps ignoring the accessibility API, a programming crime if one intends to do no evil, as such programming decisions enforce discrimination against our population. While this action is horribly wrong, especially at a company that can easily afford to get it right, it is easy to fix: putting all controls into the tab/swipe order and ensuring that all controls have labels, help text and the like is nothing more than typing and can be remediated easily by an intern with the IDE in hand.
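To make the “nothing more than typing” point concrete, here is a minimal sketch in plain Java. This is a toy model, not actual Android code: on real Android the label would be supplied via `View.setContentDescription()` or `android:contentDescription` in the layout XML, and the `Control` class below is invented for illustration:

```java
import java.util.Arrays;
import java.util.List;

public class LabelAudit {
    // Hypothetical model of a UI control with an optional accessibility label.
    static class Control {
        final String id;
        final String label;   // null means the developer never typed a label
        Control(String id, String label) { this.id = id; this.label = label; }
    }

    // What a screen reader can announce: the label if present, otherwise
    // nothing better than the fact that an unlabeled control exists.
    static String announce(Control c) {
        return (c.label == null || c.label.isEmpty())
            ? c.id + ": unlabeled control"
            : c.label;
    }

    public static void main(String[] args) {
        List<Control> screen = Arrays.asList(
            new Control("btnPlay", "Play"),
            new Control("btnNext", null));  // the kind of bug described above
        for (Control c : screen) System.out.println(announce(c));
    }
}
```

The unlabeled control fails not because the framework can’t announce it, but because nobody typed the label, which is why this class of bug is intern-fixable.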

Then, my friend Tyler Spivey tried to port Q Read, a Windows-based epub reader, to Android, and I started to learn that the Android accessibility API, the infrastructure necessary to ensure a fully accessible GUI, was sorely substandard when compared with Windows, iOS, OSX and Gnome.

What makes this all very confusing is that the Android accessibility API is the newest in a line of increasingly powerful accessibility frameworks on the market today. When MSAA was the only accessibility API, it was the “best” one by default. MSAA wasn’t very good, but it was the first, and developers concerned with accessibility learned a lot from it. The next generation of accessibility frameworks (Gnome/GTK, IAccessible2, UIA, OSX in Tiger) showed dramatic improvements, with concepts like “relationships” and “contexts” making their first appearance; these were also the first accessibility APIs to support large blocks of text and complex objects like web controls. The latest generation of accessibility APIs come from Apple on iOS and Google on Android. Having had the shoulders of giants on which to stand, how did Apple continue to show progress and innovation in the iOS accessibility API while Google, given mountains of reference materials and access to every accessibility professional on Earth (we all have our price), produced a major step backward in this type of technology?

Making An Accessible Book Reader for Android

Tyler Spivey, a blind Canadian hacker of tremendous skill, with a reputation (among those in the know) for doing what other programmers cannot, tried to port Christopher Toth’s popular epub reader, Q Read, to Android. The first step in this process is to create a text control into which the app can stuff its data. As this is 2014, and every other major operating system (Microsoft’s Windows, Apple’s iOS and OSX and GNU/Linux running the Gnome desktop) provides a text control that is fully accessible out-of-the-box, Tyler expected that Android would also provide such; sadly, he was greatly disappointed.

In any operating system other than Android, one gets an accessible text control “for free.” Using pseudo code, to add a text control that allows a screen reader (for instance) to provide features like reading by object type (headings and the like), reading by semantic element (word, sentence, paragraph, character), filling out forms and performing every other task available in virtually all other systems, one writes code that looks like the following:

MyAccessibleTextControl = new GenericTextControl;

That’s right folks, a programmer who wants a fully accessible text control in his app only needs to write a single line of code and it will work with any useful screen reader on that platform. For those of you who enjoy attacking me with ad hominem, repeating that my opinion is colored by a “love” I have for iOS, I’ll remind you that this example is true not just on iOS but also on Windows, OSX and Gnome.
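By contrast, making a similarly capable text control accessible on Android requires the developer to wire up much of the accessibility machinery by hand. The sketch below is hedged pseudo code; the callback names are loosely modeled on Android’s accessibility hooks and are illustrative, not a literal API reference:

MyTextControl = new AndroidTextControl;
// The developer must populate the accessibility information manually:
MyTextControl.onInitializeAccessibilityNodeInfo = /* expose the text, class and state */;
MyTextControl.onPopulateAccessibilityEvent = /* describe changes as they happen */;
// ...plus handlers for movement by character, word, sentence and paragraph,
// none of which come “for free” as they do on the other platforms.

The point is not the exact syntax but the ratio: one line elsewhere versus dozens, at minimum, on Android.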

The substandard Android accessibility framework does allow for making a text control accessible, with considerable effort. I don’t know every type of control used by Barnes and Noble in their Nook app, so I’m uncertain whether the main reading window is a standard Android text control or entirely custom, but B&N have obviously spent a tremendous amount of time and money to make an accessible book reader for Android, time and money that is not required on any other OS.

An Inaccessible Web Control

In current software, especially on mobile platforms, many programmers like including a web control in their app so they can display HTML information and use the web control to drive features in the app itself. Like every other popular OS, Android provides a web control in its UI library. Unfortunately, the Android web control is built on top of the Chrome code base (this only became true when Google released the Kit Kat revision of Android) and Chrome, whether as a stand-alone app or as a web control, is simply not accessible in the way that web controls are in Windows, Gnome, iOS or OSX.

Again, using pseudo code to illustrate the issue, making an accessible web control on the other popular operating systems would look something like:

MyAccessibleWebControl = new genericWebControl; 

Windows programmers may need to add a couple of lines of code to ensure that the control properly receives focus, but this detail brings the total to fewer than five lines of code on any OS other than Android. On Android, providing a fully accessible web control in an app can require hundreds of lines of code, including special JavaScript to tell only one screen reader on Earth what to say and, even in the best circumstances, the result will be a sorely substandard experience for end users.
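Again in hedged pseudo code (the method names here are illustrative, not any real API), the Windows case described above might look something like this:

MyAccessibleWebControl = new genericWebControl;
MyAccessibleWebControl.setFocus();           // ensure the control receives focus
MyAccessibleWebControl.notifyFocusChanged(); // tell assistive technology it did

Still fewer than five lines in total, compared with the hundreds required on Android.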

What We Lose Due to Having No Accessible Text or Web Control

Let’s take a look at some popular software used by blind people on other operating systems. We have programs like Q Read, VoiceStream, NLS BARD Mobile and others that depend on having an accessible web and/or text control. These useful and popular programs either do not exist or are not fully accessible on Android. In two cases, I’ve had the opportunity to communicate directly with the developers of these programs and asked them why they hadn’t ported their software to Android. The answer was identical in both cases: “We tried porting to it; we do not have the resources to make the web or text control accessible.”

If you want to point to some of the very accessible apps available on Android, programs like the popular Nearby Explorer for instance, I need only remind you that these are usually “self voicing” applications that work around the broken accessibility framework by talking directly to a speech synthesizer. These “ghetto” apps are used exclusively by blind people and dash the notion of universal design by ignoring its principles. Ghetto software was a necessary evil in the days before accessibility APIs, when making a mainstream app fully accessible was very difficult. The excellent accessibility frameworks on iOS, Windows, Gnome and OSX obviate this requirement as, with the technology common to all platforms other than Android, one can make a mainstream app as accessible as anything targeted specifically to the population of people with print impairment.

Some apps designed specifically for our community, apps that perform special tasks most mainstream users wouldn’t care for, should have their UI designed using the same principles of universal design as mainstream apps. Some of these apps will be useful to people with print disabilities unrelated to blindness, and the AT they employ needs the same compliance as do programs used by blind people.

Is Making An Accessible Web Control Too Hard For Google?

If one simply takes a look at the Android version of the Firefox browser, developed by the Mozilla Foundation, a group whose budget, when rounded to integers, comes to 0% of the Google revenue stream, they will find as accessible a browser window as exists anywhere, on any OS, ever. If a team with no money at all (when compared to Google’s dollars) and a comparatively tiny staff can do it, making a fully accessible web control and, indeed, making Chrome into a useful solution cannot possibly be too difficult for a company like Google to accomplish. Similar companies, Microsoft and Apple, have done it for years now, and the Gnome Foundation does it really well. What, therefore, is the cause of Google’s failure to comply with generally accepted accessibility practices?

What Does Google Need to Do to Remedy These Problems?

At the API level, to make Android as accessible as is the generally accepted standard, they must:

  • Make the accessibility API more “automatic,” as it is on the other OSes. It should be nearly impossible to use a standard control, whether something simple like a button or complex like a web view, without getting accessibility for free, beyond perhaps typing a few words into edit boxes in the IDE.
  • Fix Chrome, both as a stand-alone browser and as a control in other apps, to be as accessible as IE, Firefox or Safari on various other platforms, using the same generic accessibility techniques as the other OSes.
  • Fix the text control in the same manner as the web one.

To make Google branded apps accessible, Google must:

  • Enforce its own internal accessibility standards on its own product teams as part of the other requirements for releasing software to the public. This is, perhaps, the most frustrating part of Google’s poor accessibility, as accomplishing this simple goal would be very inexpensive in terms of engineering hours, testing and so on. That Google doesn’t already do this demonstrates that they do not have a corporate-wide accessibility policy but, instead, leave the decision as to whether to enforce segregation against this population up to individual product teams.
  • Make sure that all of its web apps, Google Docs, Analytics and such, are fully compliant with the Web Content Accessibility Guidelines (WCAG) 2.0 at the AA level. Soon, this will be the requirement for all US federal government sales under the Section 508 refresh so, if they hope to sell to America’s largest customer, they should probably get started on this effort now. Of course, if “doing no evil” is part of their actual strategy, Google should do all of this because discrimination is evil and they should stop doing it on purely moral grounds.


The only conclusion I can come to regarding Google’s accessibility, from the perspective of a career software engineer, is that it is a failure. It is probably not irreparable but, given that they are one of the wealthiest and most powerful technology companies on Earth, that they claim to have an accessibility strategy, that they’ve released products with the claim that they are accessible, that zero Google branded apps on a Nexus/7 running Kit Kat are entirely accessible, that Google Docs, Analytics and others of their web apps completely ignore WCAG, and a long laundry list of other Google accessibility failures, I must conclude that poor accessibility in Google products is intentional as, otherwise, what is the logical conclusion?


I’m sure this article, although entirely based in fact, data and the impressions of experts, will draw me a pile of ad hominem from the fan-droids. If you are one of these people and you want to tell me that I write these articles only because I “love” iOS or Apple in general, I ask that you first read an article I wrote when I did the BlindConfidential blog. It’s called “Apple Just Sucks” and describes the opinions I then held of that company before they set out on their excellent accessibility effort. When I write that something has excellent accessibility, it is because I’ve tested it extensively and when a new player comes along with something better, I revise my opinions based on the new realities. In the BlindConfidential days, if one goes back and reads all of the articles about screen readers I wrote then, they would notice that, when I started, I wrote that JAWS in Windows XP was the best solution and, at the end, I had switched to Macintosh. I had used a Windows Mobile T-Mobile Dash running the Code Factory screen reader for a few years as it was the best thing out there, then, when Apple released the 3GS with built-in screen reader and all, I switched.

The opinions I hold are not based in my own personal preferences, that would mean I’m speaking from a statistically insignificant sample size of one. To make matters worse, my personal preferences tend toward UNIX like command line shells, using emacs for a whole lot of things and, generally, having a very “nerdy” view of technology. When I write about accessibility, I do so with one eye on generally accepted standards and guidelines and the other eye on usability when compared to systems people with vision impairment are already accustomed to using.

People yell at me, “You just want Android to be the same as iOS,” which is true on the macro level: I want Android to be (rounded to integers) 100% compliant with its own internal accessibility API, defined entirely by people at Google, the way that iOS already is. Apple even takes iOS accessibility a step further in that, along with being 100% compliant with its internal accessibility API in all of the apps that come out-of-the-box on an iOS/7 device, all Apple branded iOS apps that one can download from its AppStore are also fully compliant with its accessibility framework. I do not insist that Android have the same apps, the same functionality, the same user interface or any of the features that, competitively, would make a difference; I just insist that it become fully compliant with its own standards the way that Apple and, to a lesser extent, Microsoft and the Gnome Foundation have already done. That isn’t “bias” toward Apple but, rather, a strong inclination toward generally accepted standards for accessibility.

I thank the people on the “Eyes Free” blind Android user mailing list for setting me straight about so many facts that I had gotten wrong in my early analysis of the Nexus/7 I bought back in October. They also pointed out which of my assertions were, in fact, purely opinion and, as a result, forced me to work much harder on these articles than on anything I had previously written for this blog. Typically, when I do a technology review, I spend between a few hours and a few days evaluating the product; with the series of Android articles, I spent more than three months doing the user research, plus many hours reading email posts, doing quick and dirty statistical analysis of the types of problems reported on Eyes Free versus user lists for iOS and Windows and much more. That these articles have become some of the most popular in the history of this blog reflects the extra work, as the articles are fact and data driven and, since I published the revised and corrected versions of the articles in the series (I’ve fixed a few relatively minor things people have reported as incorrect), not a single fact has been disputed by thousands of readers.

So, if you love Android so much and want to yell at me, feel free. I’ve never edited a comment for anything but to remove racist and sexist epithets and, as you can read here and in the BlindConfidential archives, you can see that I’ve allowed any commenter to say whatever they want about me. I do request (not insist), though, that you try to attack my articles with some level of intellectual clarity. Saying, “Well, I love my Android device,” is a nice statement of opinion but it neglects any notion of best practices, standards and guidelines and, frankly, it’s an entirely selfish view of things as the standards were developed in such a manner as to address issues faced by people with a panoply of different disabilities and, just because you can use something, doesn’t mean that it is accessible to everyone else. This specific point is illustrated greatly in Scott’s article about the experience of a deaf-blind user of Android which, unlike my first article, received absolutely no criticism. Sadly, if Google’s accessibility API actually worked properly and they were faithful to using it in their apps, Scott would be enjoying using an Android tablet today, alas, it is impossible for him to use.

If you insist that Android is accessible just because you can enjoy it, you are, by making that claim, doing no more than dooming others, whether due to disabilities other than yours or a different aptitude for technology, to a miserable experience compared to that which they could enjoy on iOS, Windows, GNU/Linux/Gnome or OSX. The fact is, whether you like it or not, “accessibility,” due to laws like the CVAA and the ADA Restoration Act, is acquiring a legal definition and, if the level of out-of-the-box accessibility in an Android system is allowed to pass as “accessible,” we will be lowering the bar for the definition of “accessible” when compared to all other major operating systems. It is essential that we, as a community, push for the highest bar that currently exists as the minimum standard for accessibility as, otherwise, we’ll never get anything better.

Chicken Nugget, Accessible Twitter Client Released

Some people send me press releases when they do something new in accessibility. I tend not to publish them and have never published one verbatim. As I edited this one as a favor to its author, though, I’ll let it stand on its own. I use the Chicken Nugget Twitter client when I’m on my Windows 8.1 box and I like it a lot. Having switched from Macintosh where I’ve used the same Twitter client for years, I found that my “muscle memory” caused me to hit incorrect keystrokes but, as this is a different interface, I can’t blame the authors for not making a clone of my favorite Twitter client as they have done so much more in Nugget than I can currently enjoy using on Macintosh. So, here’s the Chicken Nugget press release, please, if you are a Windows user who enjoys Twitter, go to the Get Accessible Apps web site and buy this software.

Accessible Apps announces Chicken Nugget, powerful new Twitter client for the Windows Operating System

Chicken Nugget features unprecedented access to Twitter at an affordable price

Denver, Colorado, US, January 9, 2014: Accessible Apps today announced Chicken Nugget, a powerful new Twitter client for the Windows platform featuring instant access to the social networking service through an intuitive and highly responsive interface. Christopher Toth, head of Accessible Apps and primary author of Chicken Nugget, states, “We set out to build a Twitter client that we, as blind people, could use conveniently while also including all of the features people without disabilities might want.”

Along with all of the features users of many different Twitter clients enjoy, Chicken Nugget users can send and receive tweets while focused in any application on their computer, using its innovative global hotkeys. These and all other Chicken Nugget features automatically speak through any installed screen reader, the software used by blind and otherwise print impaired computer users to access information in their daily lives. Users with diminished vision can enjoy Chicken Nugget’s features through its well designed and very simple graphical user interface, providing the ideal hybrid of both worlds.

“Chicken Nugget represents the next step in our continuing plan to provide people with vision and other print impairments affordable accessibility to services enjoyed by our sighted peers”, says Christopher Toth, lead programmer at Accessible Apps.

With nearly a year of active development, Chicken Nugget is the most polished, powerful, and well-designed accessible Twitter client available for blind and otherwise print impaired users. Meanwhile, Chicken Nugget can be used by any person who prefers a clean, convenient and uncluttered interface to Twitter.

Unlike other accessible Twitter clients, Chicken Nugget is optimized to minimize its impact on system resources. It will run well even on older netbooks, and special care was taken during the development process to ensure that it will be kind to battery life for laptop users.

Chicken Nugget provides all of the features of the Twitter website and much more, including translating tweets into the user’s language, playing audio other users post, and locating tweets with geo info.

It doesn’t matter if you are new to Twitter, or are an experienced user. If you have several Twitter accounts, Chicken Nugget is here to grow with you.

Chicken Nugget users can:

  • Seamlessly control an unlimited number of Twitter accounts while never leaving your active application.
  • Share news stories with your followers by simply pressing a single global hotkey and tweeting from within your web browser or feed reader.
  • Store and archive an unlimited number of tweets without ever causing Chicken Nugget to become sluggish, no matter how many tweets fill your timelines.
  • Easily search through your timelines, making locating a link someone may have shared with you months earlier a snap.
  • Perform real-time Twitter searches, so you can stay up to date with exactly what’s going on in the world, as it happens.
  • Mark a given timeline to be automatically read so you can, for example, have a Twitter search read aloud as the items arrive.

About Accessible Apps

Accessible Apps creates high quality applications for multiple computing platforms which give people who are blind and vision impaired access to the modern computing world that the sighted take for granted. They offer low-cost solutions for many of the problems which face people with vision impairment on the web today, including access to ebooks, social networking, music, podcasts, and more. Formed in 2010, Accessible Apps is made up of a group of dedicated and talented developers with a passion for accessibility.

Testing Android: A Deaf-Blind Perspective

[Editor’s note: This guest post was written by my new friend Scott. He’s a deaf-blind technology consumer who uses computational devices via a braille interface alone. He is also an adaptive technology instructor teaching those who are deaf-blind, and a member of the AppleVis editorial Team. This post is written purely from the perspective of a consumer, and does not reflect the views of any agency or organization he is affiliated with. My own braille skills are so poor that I cannot test such interfaces myself to gather any data that may be useful to you, our loyal readers. I’m happy that Scott volunteered to write this piece and I hope he makes more contributions to this blog in the future.]

A Deaf-Blind Person’s Take on Android BrailleBack


When I first read that Google had made substantial improvements to its BrailleBack accessibility service, I hoped that, as a deaf-blind person who relies almost entirely on braille access, this could be a viable option for me. I had read various posts on Android mailing lists related to blindness regarding how TalkBack works quite well for those who are blind. While the selection of accessible apps isn’t quite what it is on iOS, I had heard it was expanding rapidly. With the lower cost of equipment, coupled with the fact that I wouldn’t need to use iTunes, this was starting to seem like an option worth considering. After the release of iOS 7, which introduced a number of new braille bugs, I was looking for something different.

The following review is based on my use of a Nexus 7, 2013 model, running Android 4.4 and the latest build of BrailleBack. The stock apps were used to conduct this evaluation, with the exception of Firefox. Many of the people I work with do not wish to configure many different options to get a system to work, so I decided to take this route when conducting this evaluation of braille. No real mention of TalkBack will be given, since a thorough review has already been written regarding its functionality on this blog.

Setting up

If you wish to use braille on your Android system, you must first find a hearing person to install BrailleBack for you in order to get any sort of braille feedback. Unlike TalkBack, it does not come with the device, but it’s a no-cost download from the Google Play Store. After your hearing helper downloads BrailleBack, they must go into Settings, then Bluetooth, to pair the braille device. Then, the person helping the user must go to Settings, then Accessibility, and turn BrailleBack on. This is also where your hearing helper sets your braille preferences, such as which table to use, braille input and output, etc. In order to use keyboard input with a supported display, you’ll need to go to Language and input settings and enable the braille keyboard. Note that if you are able to understand the speech offered by TalkBack, you can still complete this process independently; for a deaf-blind user, however, this is not easy to do at all. Instead of just getting assistance to enable VoiceOver and then pairing the device, as one would on iOS, you must complete all of the steps listed above to get your braille device to function. It seems more complicated than setting up braille displays with Windows, Gnome, OSX and iOS.

Using BrailleBack

Once a user has completed the setup, they can start using BrailleBack with their Android device.

Here, we’ll explore some specific details regarding BrailleBack and how it works or, in most cases, doesn’t work.

An Unfamiliar Interface

Most screen readers that interact with braille displays, whether it’s VoiceOver on Apple platforms or JAWS, NVDA or any other screen reader on Windows, export a fairly consistent set of keyboard commands on all braille devices. There is a published standard for braille keyboard layouts that has been accepted and, to the largest extent, implemented on virtually all systems other than Android.

For example, virtually all other braille systems allow the user to go to the top of the current screen by pressing space with L and to go to the bottom by pressing space with dots 4-5-6. With BrailleBack, space with L launches the keyboard help file, and space with dots 4-5-6 does nothing. One can scroll to the top of a window with some displays but not others.

Another example is that space with dot 1 on the Braille Edge will move to the previous item, but this does not work with the RefreshaBraille 18; instead, you must use joystick left. So, in terms of navigation, you’ll need to be very aware of whichever braille device you’re using, as there appears to be some compliance but many differences. With some basic navigation lacking on some devices, one must really be careful about which display they use if they wish to have any level of productivity on the Android platform.

Why Google would reject a generally accepted standard for braille keyboard input, a standard that has been with us since before the Blazie Braille N Speak, remains an open question.

An Open Source System

The open source nature of BrailleBack ostensibly permits braille display manufacturers to customize commands for each device. This is good because it can allow each braille display to make full use of its unique features. However, it can also be a disadvantage, as the commands are so customizable that BrailleBack’s functionality can vary greatly from device to device. I have no information whatsoever about how easy it is to get a manufacturer’s changes “upstream,” nor do I know how much of this code was written as part of BRLTTY and how much was written by Google itself, so placing blame may be difficult.

Because it appears that developers are not following the conventional keyboard command structures found in the standard, the transference of knowledge for users will go way down, and the learning curve will be much steeper. While I do not have an issue learning different commands, I can assure you that the average user doesn’t want to relearn an entirely new system. This is most likely why manufacturers of notetakers have followed what has become a conventional set of keyboard commands, first introduced in the original Braille N Speak nearly two decades ago. If the user wishes to learn the new command structure, will they be successful? More aptly, why expose an unfamiliar interface without publishing a reason for making such UI decisions?

Access to Books

The news for those wishing to read books in braille using BrailleBack is not good. Whether it’s Google Play Books, Kindle, or the Nook app, the menus are manageable to some degree in that they are mostly labeled. Once you open a book, however, even though TalkBack will read Google Play books and Nook books aloud, neither app will display the contents of these books in braille, so they are entirely useless for deaf-blind users.


Email also has an issue similar to the one you find when trying to read books in braille. You can access the menus of the Gmail app, but you cannot actually read the contents of messages. Editing text when replying to messages works fine, and I was able to send messages successfully. Forwarding messages also works, but scrolling down to read the original message content still is not possible with BrailleBack. To a braille user who can have this access on either the Windows Surface Pro or any of the supported iDevices, this makes Android a very unattractive option.

[Editor’s note: Please do not send in the most standard of all Android accessibility excuses. Namely, that Android is “younger” than iOS and hasn’t had the time to catch up yet. As Chris posted in the corrected version of “I Give Up,” Android is exactly 1.24 years newer than iOS which would suggest that, if this is a valid idea, it should be up to par with iOS 6 which, regarding braille, it falls far short of.]

Issues Specific To One Who Is Deaf-Blind

On top of what I’ve said thus far, which I would describe as far less than substandard support for braille, there remains another issue for those who are deaf-blind: TalkBack uses sounds to convey certain types of information that are conveyed neither in speech nor in braille. The braille user who cannot hear these sounds is left without any way to access this information. Blind people who can hear enjoy TV Raman’s “earcons” but we who are deaf-blind get no equivalent in BrailleBack.

Some users may enjoy that BrailleBack does not display just one item per line. For example, on a screen such as the one in the Mail app, several options may appear at once, and pressing a cursor routing button will activate any of these items. This is good in terms of being able to activate items quickly, but could be an issue in apps where things are not labeled as buttons, headings, links, etc., as one can easily activate a control without meaning to.

Web Browsing

With web browsing using Firefox, activating links, moving by headings, typing text into a search field, etc., all work fine using braille alone, thus offering a pleasant web browsing experience. Most content on web pages is readable, and it’s the one area where I feel a braille user actually has a chance of using an Android device effectively.

Reactions to Criticism of Android Accessibility

Many people have criticized those who do not like Android by stating that, if you’re looking for the user experience found on iOS, you won’t have it. If by a similar user experience these people mean that I’d like to read my email and some books, they’re right. I do expect to be able to perform basic functions in some way or another, just as I would on any number of other devices. It’s not that the new way of operating the device intimidates me; it’s that the option simply doesn’t exist. The same goes for the new Kindle HDX, which runs a modified version of the Android operating system. The Kindle is, primarily, a book reader, and this part of the device is not accessible to braille users. That’s right: you can’t read books in braille on the eBook reader from the bookseller.

While many of those who are blind may consider braille a luxury, for some, it’s a necessity. One may ask whether braille users are even worth considering, since they make up such a small portion of the population. Is there money to be made in such a market? Apple clearly seems to think so, as do most screen reader manufacturers on the Windows platform. There is also a developing push for braille literacy in the field of education. My question to Google is: do you want a stake in the education market? If so, you may wish to give braille access some further effort. More importantly, though, if you hope to “do no evil,” you might consider the evil of discriminating against people like me and, using a universal design approach, allow me to enjoy your products as fully as anyone else.

Many deaf-blind people, the people who absolutely must use braille to access a computer, won’t even consider Android an option at this point, because there is little to no accessibility functionality for this population. While the screen customization may be good for low vision users, it doesn’t help those who require braille.


In conclusion, I’m sad to have to report such negative findings with regard to braille access on the Nexus 7. As someone who would like to have choices, and who is very passionate about using technology to help level the playing field for those with disabilities, I am very disappointed in the small amount of braille access offered by BrailleBack. I hope that the developers at Google will work to make this a more comprehensive option, so that braille users like myself are one day able to actually use the technology as effectively as our hearing blind and sighted counterparts.

Testing Android Accessibility: I Give Up

A few months ago, a friend of mine who prefers Android accessibility to that available from Apple on its iOS devices sold me a Google Nexus/7 tablet so I could try it out and, perhaps, write an article about its accessibility. Since early October, I’ve tried to use the Nexus/7 to perform the same tasks that I enjoy on my iPhone 5S. I can say that, while accessibility on Android is better than I expected, it is still far from being a solution I can use full time the way I can a device from Apple or Microsoft.

This article isn’t up to my usual writing standards. It is a compilation of a lot of notes I’ve taken during my time with the Nexus/7. It contains a bit of repetition, some clunky sentences and could be better organized. Unfortunately, I haven’t the time to do my usual amount of editing. As far as I know, all of the “facts” I present in this article are true; I’ve tested all of this software myself and these are the descriptions of my results. Obviously, the opinions in this piece are my own and may not reflect the impressions held by others.

Usually, I provide links to many things in my articles. I didn’t do so in this one, both to save some time and because this isn’t based on articles I’ve read but, instead, on my actual experience testing accessibility on the Nexus/7.


Back in October, I got a package from UPS containing a 2012-edition Google Nexus/7 tablet. A friend had sold me the unit for $50 so I could have an Android device in hand to test its claims of accessibility.

For research on this article, I tested only with synthesized speech, as my braille skills are abysmal and I cannot write with any authority on a braille interface to anything. I am also speaking only to issues encountered by me, a person with a total vision impairment. I’d like to be able to write about low vision and other print impairments but, in this article, I’m looking only at accessibility for speech-only users with profound to total vision impairment. In preparing this article, I did the testing alone and will only report on things I actually encountered myself.

For the most part, in this article, I focus on the accessibility of the system in its out-of-the-box condition. As we’ll see, a blind person can turn a substandard out-of-the-box experience into a better than adequate one on an Android device, but only by installing a bunch of software to replace the stock apps and various parts of the system, including the home screen. That Android is so customizable is one of its strongest points; that a blind person really cannot use the device without a lot of customization is an outrage. To make my Nexus/7 at all usable, I had to install a third party home screen (Apex Launcher) and a whole lot of apps that, ostensibly, offer functionality similar to the inaccessible equivalents shipped by Google on the device.

Corrections and Clarifications

After I originally posted this article, I got a little criticism (far less than I had anticipated) about its content. Some readers asked what version of Android I was running and I received one factual correction regarding a gesture I had not included in the original version’s list. Here are the clarifications and the single correction:

I used the Android Jelly Bean release for almost all of this testing but, during the final two weeks, I had the KitKat release installed. I accepted all of the latest updates when I received the notification telling me to update my Nexus/7. The first thing I noticed about KitKat is that the default keyboard does some things, like suggesting spelling corrections and guessing at the next word you might want to type, that are popped onto the screen with no feedback from TalkBack when you are in a web view. When I tried to enter my account information in the Nook app, for instance, TalkBack said, “Submit button” and, when I double tapped, it replaced the text I had written with another word and ignored my attempt at pressing the button. In its default configuration, the default KitKat keyboard is not usable in web views and, perhaps, elsewhere. You can turn off the annoying word suggestions or replace the keyboard entirely if you like but, out-of-the-box, the default keyboard has these inaccessible features turned on and they would be a huge problem for anyone unaware of these issues, especially someone who does not know how to turn them off.

The KitKat release fixed a few things, so I removed them from the long list of defects at the end of the article. Google Play, for instance, while still not perfect, has fixed some of its tab order problems. Nonetheless, I still couldn’t find a single default app without at least one accessibility failure in either version.

Later in the article, I include a list of TalkBack related gestures. I found the list on the Google web site and, as far as I could tell, it’s the latest and most comprehensive documentation Google has published. So, when I copied and pasted it into the article, I assumed that, as it was the most current documentation I could find on the software author’s own site, it was complete.

It turns out there are two gestures that use more than one finger: two finger swipe up and two finger swipe down, which allow a TalkBack user to scroll some things. As I’ve since given my Nexus/7 to a sighted friend who really likes it, I can’t try them myself, but I’m told that scrolling lists in Android using these gestures is yet another example of Google implementing accessibility features in a manner entirely unfamiliar to anyone who has used any other screen reader before.

Here, I’d also like to add a response to some of the apologetics for Android accessibility offered by some other blind people. It was posted by a friend and accessibility expert to the accessibility@google.com mailing list. It is unedited and pasted “as is” because I can’t find anything to change about it; it’s clear, to the point and definitely true:

“I think that facts are very important. When I hear things like Android is young or Android is open source, it saddens me. That’s not an excuse to do worse, that’s an excuse to do orders of magnitude better. Let’s get out of this systemic complacency mode that Android accessibility is perpetually stuck in, shall we? But, more on that in a second.

“So, having mentioned facts, I present the following:

“1. Android was released on September 23, 2008
2. IOS was released on June 29, 2007

“I’ll save you the math: that’s 452 days. Using an approximation of 365.25 days per year, that’s 1.24 years, rounded.

“So, the below argument [an argument that states that the community of people with vision impairment should be patient and wait for Android to catch up to iOS], if I may summarize it, is that because Android is 1.24 years younger than IOS, we should not compare them. This argument already implicitly accepts that the comparison is unfavorable in the first place, otherwise there’s no point in making the unfairness claim, but let’s move on past that.

“I put forth, that with all due respect, this is a silly and vacuous argument, and I request that we, as a community, please stop making it.

“Tesla Motors is almost a century younger than Ford. Are you seriously claiming that they get a 100 year pass to suck until their cars can be compared against the modern market?

“There are thousands of little startups in the bay area and Research Triangle Park, just to name two U.S. hotspots, that are less than 2 years old. Are you seriously suggesting that they can’t be expected to compete against the likes of Microsoft, Google, Apple, Dropbox, etc. because they are younger? Never mind the fraction of the percent of budget they have compared to their competitors.

“This is not how the world works, I’m sorry.

“To paraphrase a brilliant statement that Isaac Newton made hundreds of years ago, if I can see farther than others, it is only because I stand on the shoulders of giants.

“So, please, let’s not seriously claim that Google is at some sort of disadvantage because of a 1.24 year difference in the age of an operating system.

“Also, I really don’t understand why this argument is ever made. If we subtract 1.24 years from now, and then compare modern day Android to 1.24 year old IOS, it still comes out unfavorably for Android.

“I didn’t count them each, but there are definitely a few dozen cited and evidence-based points made in Chris’s post. If someone wishes to rebut the post, then can we try using the same techniques, or should we stick to parroting marketing literature and claiming that everything will be better 1.24 years from now?”

Here’s the original article with a few inline corrections:

My Definition of Accessible

Like anyone who hangs around the world of access technology, I hear the word “accessible” used to mean everything from “fully complies with standards, guidelines, best practices and the OS level accessibility API” to “one blind person can figure out how to use it, no matter how inefficiently, no matter what portion of the features are available to them, no matter how many items are unlabeled.” I accept only the former and reject the latter entirely.

It is definitely true that virtually every blind computer user, including me, must use software with substandard accessibility. This is the reality of the world today but it should not be the goal. The accessible technology community was asked for standards because mainstream developers were entirely baffled by the “how to” parts of accessibility before such standards existed. Now, for any web app, web site or application on virtually every currently popular OS, it’s all documented in excruciating detail. There is no reason whatsoever for any major company, and most small companies, not to be accessible, using my 100% compliance definition of accessibility. Doing or accepting anything less is tantamount to endorsing discrimination.


Typically, I put the conclusions section at the end of my articles but, in this case, as the rest of this article is a list of specific defects and other issues with Android accessibility, I thought putting the conclusions at the top might give those who don’t care to read the details a quick idea of my experience.

  • In comparison to an iOS 7 device, which ships with zero out-of-the-box accessibility failures and only a few accessibility bugs, the Nexus/7 ships with zero built-in apps free of accessibility failures; every one contains between one and many. The accessibility issues on Android, therefore, are not just bugs but a systemic problem at Google.

  • In some standard apps shipped on the Nexus/7, apps like those for playing movies and television, for reading books and for browsing the web, the out-of-the-box accessibility is poor at best.

  • A blind person can use an Android tablet to accomplish a lot of tasks but will encounter lots of accessibility failures along the way. A blind user with very strong technical skills and/or a whole lot of patience can find fully accessible software from third parties to replace those shipped by Google and the people on the “Eyes Free” blind Android user mailing list can and will help individuals navigate this process.

  • The PDF manual for the Nexus/7 is “untagged” and, therefore, not actually accessible. As there are all sorts of programs out there that can test the accessibility of a PDF, Google has either chosen to leave its documentation inaccessible or, at best, is guilty of willful negligence.

  • The TalkBack screen reader is, while feature poor, actually pretty good. Unfortunately, the Google developers making the apps shipped with the device seem to ignore Google’s internal standards and refuse to use the accessibility API properly. Investigation of the Android accessibility API also shows that some very standard accessibility tasks, like creating a fully accessible web view, are far more difficult than on other OS, a likely source of inaccessibility on this platform: even an app developer who wants to be fully accessible and fully complies with the Android API is not guaranteed an accessible app in the end. Making accessibility more difficult on one OS than on others is not going to promote accessibility on that platform.

  • If you like playing around with work arounds, puzzling out an interface that fails the basics of human computer interaction (HCI) and, indeed, enjoy the inconveniences encountered by early adopters, Android may be for you; if, however, you are looking for a pleasant accessibility experience, you’ll avoid Android for now.

  • Google insults the community of people with print impairments by claiming that this device is accessible. The accessibility is, at best, a functional prototype of something better to come in the future but Google seems to believe that this is an adequate solution.

  • People with profound to total vision impairment must eschew all devices that provide less than the “gold standard” of 100% out-of-the-box accessibility, as accepting less than 100% will only encourage corporations to provide us with incomplete solutions in the future. If we reward Google with our dollars today, we’re telling them that this state of affairs is acceptable, a tremendously dangerous statement to make.

My Experience With Android

After receiving my Nexus/7 from UPS, I turned it on and ran the setup routines. Then, as its battery was low, I charged it up and on the next day started actually exploring it.

This is my story:

General Issues

Human Factors

  • The first thing a human factors student learns is “leverage what the user already knows.” As all other notable screen readers and every published “best practices” guideline for accessibility say, everything must be available in the software’s tab order. As this simple, easily tested requirement is ignored in some parts of virtually every app that carries Google’s brand name, Google is ignoring its own accessibility API in a systemic manner. This is something that could easily be added to Google’s automated test procedures but, apparently, that isn’t a priority for Android accessibility.

A developer can make UI decisions that do not comply with “best practices” if and when they have done the research to demonstrate that a different approach is actually more efficient or more enjoyable for their users; this is how UI innovations come about. Google, on Android and in many of its web apps, ignores published standards, guidelines and best practices regarding accessibility, including standards it has set for itself in its own API. It does all of this without publishing an iota of data demonstrating that people with vision impairment would indeed prefer a departure from the best practices followed on every other system.

  • Another major departure from best practices on the Nexus/7 also results from the tab order being ignored. This one concerns the human factors concept of “discoverability”: it is essential that users be able to find features as easily as possible. On iOS 7 and Windows 8.1 (I haven’t tried using Gnome with a touch screen) one can find everything by just swiping. If you don’t know what you’re looking for and you miss it while exploring by touch, then there is a feature you cannot easily discover – a violation not just of accessibility but of general design principles for software.
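For developers wondering what fixing the swipe order actually involves, the remedy is usually a line or two of layout XML. This is a hypothetical sketch (the custom view class name is invented for illustration); the attributes themselves are standard Android ones that place a view in TalkBack’s traversal order:

```xml
<!-- Hypothetical custom control made reachable by swiping. Marking the
     view focusable and important for accessibility, and giving it a
     contentDescription, is what puts it in TalkBack's swipe order. -->
<com.example.CustomSliderView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:focusable="true"
    android:importantForAccessibility="yes"
    android:contentDescription="Playback position" />
```

The android:importantForAccessibility attribute dates from Jelly Bean (API 16), the very release I tested, so none of this is exotic or new.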


This article is about the out-of-the-box experience I had with a Nexus/7. As no hardware keyboard comes with the device, I did no testing with a physical keyboard. I’ve heard reports that Android works poorly with an external keyboard but I cannot comment as I haven’t tried it.

Needless to say, a gesture interface is the core of how most people access a tablet. It is the fundamental interface blind people use on Apple’s iOS devices and it seems to be the interface of choice on the Nexus/7 Android tablet.

This is what I found regarding gestures on this tablet:

  • TalkBack uses right angle gestures, like swipe up then left or down then right. I use them without a problem but, given the amount of chatter on the Eyes Free blind Android user list from people struggling to use them efficiently, I can only conclude that Google thrust this UI decision on TalkBack users without doing any actual usability testing. On iOS 7 and Windows 8.1 blind user mailing lists, I never hear people complain that they cannot figure out how to make a gesture work after using the same device for a few weeks, the way I do on the Eyes Free mailing list from a fair number of blinks trying to use Android. If more than a tiny fraction of a user population has problems with a gesture, it must be considered a field failure.

  • I had trouble finding a gesture to perform a “SayAll” like the two finger swipe down in iOS. I did find a setting in the TalkBack gesture settings that would let me assign “read from top” or “read from next object.” As SayAll is about as common as screen reader features get, why, without publishing an evidence-based reason for the decision, is this not on by default as it is in every other screen reader that has ever existed? One can do a SayAll by shaking the device but this seems an inelegant replacement for a gesture.

  • Gestures seem to be recognized slowly. I’m told this is a function of the Nexus/7 hardware and not of TalkBack but I’d like to learn how quickly things happen for sighted users as a comparison.

  • Why would, by default, swiping up do the same thing as swiping left, and swiping down the same thing as swiping right? In general, there seem to be far too few accessibility related gestures available to the user and wasting some of the simplest ones seems bizarre.

  • Accessing some other very common screen reader functions, like changing granularity, requires three gestures: a right angle followed by two circles. A bit of snark: “Draw the pentagram, dance in a circle, light some incense – was this interface designed by a coven of witches?”

Default Android Gestures

While VoiceOver on iOS provides a user with a rich set of one, two and three finger gestures, Android/TalkBack provide only the following:

  • Two-part vertical gestures (swipe up then down, or down then up): Cycle through granularities
  • Swipe up then right: Open local context menu
  • Swipe up then left: Home button
  • Swipe down then right: Open global context menu
  • Swipe down then left: Back button
  • Swipe right then down: Open notifications
  • Swipe right then up: Unassigned
  • Swipe left then down: Unassigned
  • Swipe left then up: Recent apps
  • Two finger swipe up or down: Scroll a list

Why do all but one of these gestures use only one finger? Is it physically impossible for Android to handle multiple finger touches? Why are there two unassigned gestures when one can turn on two variations of “say all” in the TalkBack settings dialogue? Why do most of the TalkBack gestures require one to draw a right angle on the screen while so many possibilities are left unassigned? Until someone at Google answers these questions publicly, I will maintain the impression that this UI was designed according to a programmer’s personal notions and not science.

Editor’s note: In one of the criticisms of the original version of this article, people said that Apple has all of its gestures wrapped up in patents and that Google could not, therefore, use them with TalkBack. As Microsoft, in Windows 8.x, has added a set of Narrator gestures nearly identical to those in iOS, I doubt this is actually a problem for Google. There’s no technical nor HCI reason for such a radical departure from the gold standard.


Here is what I found trying to navigate around the Nexus/7 system:

  • TalkBack, in its default configuration, makes it impossible to read non-editable text by word, character, line, sentence, etc. One can turn on the vertical swipe gestures to allow for switching granularity but must find their way into TalkBack’s settings to do so.

  • In many places in the apps shipped by default with the device, there are controls one cannot reach by swiping. This makes learning what’s available to a user pretty difficult: exploration by touch is simple but, if we don’t know what we’re looking for, how do we know what’s there?

  • The HCI concept of “discoverability” seems to be ignored entirely as finding objects on the screen is often a “hunt and peck” process that feels a bit like playing an adventure game.


Here are some notes on basic operation of the Nexus/7:

  • The default synthesizer doesn’t speak fast enough. I know one can install third party synthesizers but the one that comes out-of-the-box, on its fastest setting, is too slow for my taste. Unlike iOS, though, where one cannot choose a third party synthesizer, I could buy one I like more for Android, a definite benefit to the more open system.

  • Most buttons and some other controls aren’t announced as buttons, so speech says only the text in them and I at first thought they must be headings. Unlabeled controls vastly outnumber those announced with something meaningful like “button.” I understand that, in some places, TalkBack uses an earcon, a tone played instead of a word, to convey information. Earcons are a terrific idea, similar to the Speech and Sounds Manager in JAWS but, as TalkBack seems to use them in some places, text descriptions in others and nothing at all to indicate a control type in many more, the system is so inconsistent that none of the indicators are of much value; they seem to occur randomly.
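The fix for an unlabeled control is similarly trivial. As a hypothetical sketch (the id and drawable names here are invented for illustration), a single standard attribute on a stock ImageButton gives TalkBack something meaningful to say:

```xml
<!-- Without android:contentDescription, TalkBack has nothing useful to
     announce for an image button; with it, the control is spoken with
     a real name instead of being left unlabeled. -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="Play" />
```

A lint pass over every layout file for image controls missing this attribute is exactly the kind of automated check a company of Google’s size could run on every build.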

Documentation and Resources

  • The TalkBack documentation is either non-existent or entirely out of date when one searches for it on Google. There is an article about accessibility gestures on the Nexus/7 published by Google itself that contains some accurate and some completely wrong information.

  • It seems that the only way to learn the various accessibility gestures available in TalkBack is to go to the settings dialogue and read through the settings, as there is no other place this is written down.

Compared to the documentation for VoiceOver, JAWS, Window-Eyes and the free NVDA and Orca screen readers (written mostly by volunteers), this looks entirely like a project led by a kid in a dorm room somewhere and not by a professional development team.


JustSpeak is a new voice recognition accessibility service from Google. It allows one to speak commands upon which TalkBack will then act. This software is incomplete and incredibly buggy. Here are my notes on it:

  • With JustSpeak running, one cannot control TalkBack volume. Why doesn’t it allow the volume rocker to work while watching a video?

  • JustSpeak would be a useful tool if it had a command that would tell the user what can be said in a specific context. Especially in an app where controls aren’t in the swipe order, it would be useful to be able to say, “List controls,” and hear something like, “Controls on screen: Play, Fast Forward, Rewind…” or whatever is there in an application. This would be similar to the list of objects that JAWS has via JAWS+F7 or the Item Chooser with VoiceOver.

  • JustSpeak seems to do really bad things to the hardware volume control. I’ve heard TalkBack say, “Volume set to zero” without the actual volume of the device dropping at all.

  • JustSpeak seems to have crashed while I was writing this piece but restarting it in Settings got it working again without much problem.

I think Google refers to JustSpeak as a beta but, in my mind, it’s less than an alpha and, at best, can be described as a functional demo.

The Default Android Apps


After turning the device on for the first time (the friend who sold me the device had already turned the screen reader on), I found myself in the setup routine and encountered the following list of issues:

  • The on screen keyboard is not in the tab (swipe) order so, for a person new to the system who may not know that they need to explore the screen with a finger to find it, the on screen keyboard’s location is not obvious.
  • The email one gets for setting up the device is loaded with accessibility failures. I can only say that, if Google indeed cared about accessibility, it could make its entire system compliant with web accessibility standards and guidelines.
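The sorts of failures involved are basic ones. As a hypothetical sketch, not Google’s actual markup, the difference between an inaccessible and an accessible image or form field in an HTML email is a single attribute or element:

```html
<!-- Inaccessible: a screen reader announces a filename, or nothing at all. -->
<img src="banner.png">
<input type="text" name="email">

<!-- Accessible: alt text and an explicit label give speech something to say. -->
<img src="banner.png" alt="Welcome to your new tablet">
<label for="email">Email address</label>
<input type="text" name="email" id="email">
```

These requirements have been part of the web accessibility guidelines for many years; nothing here is new or difficult.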

Home Screen

This is what I found exploring the default Android home screen. After a few days of using it, at the recommendation of some helpful people on the Eyes Free mailing list, I installed a replacement home screen called “Apex Launcher,” which is far more accessible:

  • Within ten seconds, I found an unlabeled graphic announced as “Image 53, Unlabeled.” If a device claims to be “accessible,” it should have zero such defects. Finding and fixing such problems is trivial and could be included in the Android automated test process but, as Google obviously doesn’t care to become fully accessible, it ignores such insults to our community.


I tried to go through every app that shipped by default with the device. As I found none without serious accessibility defects, I stopped trying; these are the notes I made on the various apps I did try:

  • The Google Chrome app seems to be entirely usable but, as TalkBack is fairly feature free in the browser, it isn’t actually accessible to anyone who considers “efficiency” to be part of accessibility.

  • Having no ability to navigate a web page by semantically useful “chunks” like headings and form controls, with nothing available other than swiping by object, makes browsing excessively cumbersome. As this feature has been in JAWS for more than a decade and in VoiceOver on iOS, I ask: why is Google still pushing a Model T on us?

  • The default Chrome home screen contains unlabeled images. Again, how difficult or expensive would it be for a massively wealthy company like Google to fix?

  • Trying to read Google search results without being able to navigate by heading is very time consuming and cumbersome.

  • I like the “earcons,” audio effects played for TalkBack users on various web objects.
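To make the heading complaint concrete, here is a hypothetical sketch of why heading markup matters: a screen reader with heading navigation can jump between the h2 elements below with a single command, while object-by-object swiping forces the user through every link and snippet in between. The URLs and titles are invented for illustration:

```html
<!-- Each result title is a heading, so a user can skim from result to
     result directly instead of swiping past every element on the page. -->
<h2><a href="/result-one">First search result</a></h2>
<p>Snippet describing the first result.</p>
<h2><a href="/result-two">Second search result</a></h2>
<p>Snippet describing the second result.</p>
```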

After a little while, I stopped using Chrome on the Nexus/7 and installed Mozilla Firefox. While not a default app, I can highly recommend Firefox to any blind Android user. Mozilla’s team obviously spent a huge amount of effort making Firefox very accessible on this platform. It’s unfortunate that developers need to do so much to make a fully accessible browser here, but Mozilla has demonstrated that a browser on Android can be profoundly more accessible than what Google attempted with Chrome.


  • This app seems to be mostly usable but has major accessibility problems.

  • Everything is in the swipe order and reading stories can be pleasant when an article doesn’t contain too many unlabeled things.

  • In a read all (started by shaking the device) in a Popular Science article, speech got stuck on “Heading Image” and read it repeatedly until I stopped it.

  • Tapping on the title of an article most often opened a different article than the one I wanted.

Play Books

  • This app seems to be marginally usable but it contains some areas that a user must find by moving a finger around the screen. Many items seem not to be in the “swipe” (tab) order. It’s easy to get lost.

  • Reading a book is difficult as navigation seems to break in many places.

  • I seem to have found “temporary” buttons that TalkBack announces but that disappear before one can act upon them. This violates every published accessibility standard and guideline and slaps best practices in the face.

Play Movies and TV

  • It was really hard getting gestures recognized while playing a movie.

  • With JustSpeak running, I couldn’t change the volume of the video using the rocker bar on the device.

  • Finding the “pause” button proved impossible for me while playing a video.

  • Lots of items seem not to be in the swipe order.

  • App contains at least one unlabeled image.

  • No buttons are labeled as such.

I can only describe the accessibility of this app as a total failure.


  • The YouTube app is more usable than Play Movies and TV but describing it as “accessible” would be a stretch.

  • When a video is running, swiping from control to control is very slow.

  • Many swipes resulted in a sound playing with no voice feedback. It was unclear if I was actually moving around the interface, if there was an object on which I had landed, etc.

  • The volume rocker, likely due to JustSpeak, causes TalkBack to announce that the volume has been set to zero (after I hit the volume-down side of the rocker a bunch of times) but the actual volume of the video doesn’t change.

  • I found at least four unlabeled buttons in this app.


  • The default Android calculator app seems to be usable but has a number of curious aspects to its UI.

  • The swipe order seems somewhat random: sometimes it loops back around a group and other times it visits everything in the interface, and I could not tell you why.

  • The calculator does not honor the TalkBack setting that lets one type by lifting a finger from the last item spoken.


I was really looking forward to trying out an Android tablet. Friends like Aaron and Josh speak so highly of Android accessibility that I wanted to give it a whirl. Sadly, what I found after spending a few months with the tablet is that it isn’t actually accessible the way iOS 7, Windows 8.1 and even Gnome are. The Google accessibility team must have little or no authority within Google corporate, as these problems would be simple to remedy if the company wanted to ship a fully accessible solution.

Fighting Plate Tectonics

“Continental drift causes an enormous level of financial hardship, personal tragedy, property damage and destruction. Its earthquakes, volcanoes and tsunami kill indiscriminately. Plate tectonics must be stopped!” said Doctor Genevieve Sitarski of the University of New Mexico, Peyote Campus, in her keynote address at the Conference on Ending Plate Tectonics, held earlier this year at the Shangri-La hotel on Rosa Sentosa island, Singapore.

I had heard of the conference and asked my friend and gonzo journalist, Gonz Blinko to fly to Singapore to gather as much information on this emerging movement as he could. Gonz, as always, had iPhone in hand and recorded what he could under the veil of secrecy surrounding the event. This report was built from those recordings.

Gonz, “This may be the best place on Earth for us blind people. I think they’re Buddhists or some group that reveres us. It’s weird, I sit at a restaurant and they make me something special and won’t let me pay, I go to the mall and strangers give me money, I think I’ll try the brothel and see what I can get for free there.”

[Editor’s note: The next few hours of audio are far too graphic to include here.]

Gonz (with giggling Asian sounding woman in background), “Conference starts in 20 minutes and we’re forty minutes across the island. Shit, think we’ll make it?”

“Don’t know,” says giggling Asian babe.

The recordings become strange at this point: Gonz yells at a taxi driver a whole lot over sounds of screeching wheels, brakes and high-speed automotive engines, punctuated by Gonz continuing to yell at the driver to go faster while the woman laughs, screams and sounds, in general, like she’s having the time of her life.

Gonz, “We made it on time and I’m sitting at a conference table, drinking coffee and acting like I’m actually interested.”

The voice of Doctor Sitarski from the PA dominates the recordings. “We are all gathered here, on this volcanic island, to stop plate tectonics. We now have the technology…” The recording breaks off, dominated by Gonz’s sipping sounds and a few plates clanking against the microphone, presumably because he was eating a stale conference danish as he made the recording.

Sitarski, “Our next speaker is Doctor Hiroshi Udon of the Tokyo Institute of Technology, known to its students as ‘the big TIT.’ He’s speaking about potential engineering solutions to avoid another Fukushima-like disaster. Doctor Udon…”

We hear some polite applause and Udon clears his throat. “How many plate tectonics experts does it take to change a lightbulb?” he asks and, hearing no response from the audience at all, adds, “We don’t change lightbulbs, we wait for the plates to bring us under the sun!” He exclaims this with a huge laugh, laughter unshared by his audience.

“Uh, ok, then…” added Udon, “Let’s look at the problem.

“We have a tectonic plate in the Sea of Japan. It moves violently now and then. The solution is silicon gel. If we pump a few thousand megatons of silicon gel into the crack in the Earth, we will buffer the violence of the plate motion by making the edges softer and more fluid, hence, the energy of the system will go into the soft silicon instead of rock, softening the blow substantially.”

Gonz, “Hmm…” and his pocket calculator starts talking over the speech as he tries to do the calculations following Professor Udon’s description of the system. “Shit, either he’s way off or I slipped a digit somewhere,” says Gonz as the speaker grows soft and Gonz leaves the room for more coffee and another stale danish.

I don’t know how long Gonz was away from the conference; the recordings suggest he may have spent a few hours on the beautiful Singapore beach with the Asian woman friend who seems to punctuate every sentence she speaks with a giggle. The next relevant audio follows:

“In our analysis, silicone gel cannot absorb enough energy to stop the major effects of continental drift,” said an unidentified speaker with a thick German accent. “We believe we need something more like an epoxy, a substance that has both plastic and elastic realms of deformation. Silicone will certainly allow the plates to glide more easily but an epoxy, like the O-rings on the space shuttle, will provide both lubrication and slow the motion of the plates. Our computer models…”

Gonz, whispering, “Is this all real?”

Unidentified whispering male voice, “I don’t know, the Japanese think it’ll work.”

“You are the New York Times,” whispered Gonz. “Here, take a look at my numbers.” Sounds of one person handing an object to another, followed by a long sip of coffee, followed by, “Ah….”

Sitarski, “Our next speaker is a Russian mathematician from the Romanov Polytechnic Institute. Please give a warm welcome to the rock star of the movement to stop geological drift, Professor Vladimir Throbaum.”

“Thank you, Doctor Sitarski,” said Vlad above the only applause noted on the recording. “I am here to present the most perfect solution to the problem of tectonic drift, and it’s neither silicone nor epoxy. The ideal solution is ball bearings, trillions of ball bearings dumped into the cracks in the crust. Trillions and trillions of ball bearings…”

“Shit,” says Gonz as we hear the sounds of objects being thrown into a gear bag. “Fucking ball bearings, trillions of ball bearings. I’m going back to the brothel.” And, suddenly the recordings stop.

Spam From The Ghetto

A few days ago, I received an advertisement email from the Korean access technology company, HIMS. The email had the subject line, “Next Generation Braille Notetaker Just Released” and it described a $4000 device running some oddball version of Windows which had no more than a minimal set of features.

Recently, former FS executive Jonathan Mosen wrote an article on his blog entitled, “There’s No Such Thing as a Blind Ghetto Product,” in which he criticizes people like Mike Calvo and me for referring to proprietary devices designed specifically for blind users as the agents that segregate our population into a technological ghetto. Anyone who has read this blog or its predecessor, BlindConfidential, will already know my position on such products and my desire to see them replaced by mainstream solutions. I won’t bore you with the details of why I think this is the case; you can search on the word “ghetto” on this blog or in the BlindConfidential archive to see the analysis I’ve done of this business sector and why it must be avoided by the majority of users.

This article is specifically about the email I received from HIMS and not the greater philosophy of accessible mainstream versus ghetto devices. These are my findings:

An Email From HIMS

At the very top of the email I received from HIMS advertising their new notetaker, I heard my screen reader say, “order-top.jpg.” This is, of course, the sign of an unlabeled graphic. HIMS is, ostensibly, an access technology company. If a company attempting to sell a four-thousand-dollar device to users with vision impairment cannot even spend the time to get the accessibility of the HTML in its advertisements right, how can we expect it to build and sell a credible product to this community? If this fundamental level of adherence to standards was ignored, what else did they get wrong in the product?
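The error here is trivially detectable: an img tag with no usable alt attribute is what makes a screen reader fall back to announcing the filename. A minimal sketch of such a check, using only the Python standard library (the sample HTML fragment below is hypothetical, modeled on the "order-top.jpg" graphic described above):

```python
# Minimal sketch: flag <img> tags that lack alt text -- the error that makes a
# screen reader announce a filename like "order-top.jpg" instead of a label.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no usable alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt", "").strip():
                self.missing.append(a.get("src", "(no src)"))

# Hypothetical fragment: one unlabeled image, one properly labeled one.
checker = AltTextChecker()
checker.feed('<img src="order-top.jpg"><img src="logo.jpg" alt="HIMS logo">')
print(checker.missing)  # only the unlabeled image is reported
```

Running a check like this over an outgoing marketing email takes seconds; there is no excuse for an access technology vendor skipping it.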

The Braille Sense U2 MINI

In the first paragraph describing the device, they claim that teachers, trainers and others will be impressed by the hardware and by “Excel Viewer, YouTube and Dropbox,” and all I can say is, “These people must be really easy to impress, as blind users have been enjoying these features for years on our Apple and Android mobile devices.” In fact, YouTube was one of the first apps I enjoyed using back when I bought my first iPhone, four and a half years ago. Ghetto devices will never keep up with the mainstream and, in 2013, a proud announcement that one now has support for things we’ve been using for years is proof that these companies are a generation behind the technological curve.

HIMS then tells me, “Users will enjoy the power and performance of a 1 GHz Mobile CPU, a 32 GB storage capacity and enhanced features such as extended battery run time, improved GPS receiver sensitivity and the addition of a vibration motor.” Oh boy! Oh boy! Oh boy! These technical specifications sound so 2004 when compared to the Nexus/7 I bought used from a friend for $50, with its quad core processor, gigabyte of RAM and 32 GB of storage, or my iPhone 5S with its quad core A7, a supercomputer compared to this new thing from HIMS. Apple sells a device that is 100% accessible out-of-the-box and Google, with its Nexus/7 running Android 4.3, provides a device that one can make tremendously accessible with third-party software for far less money out of your pocket. When the new iPad Mini hits the stores soon, a blind consumer can get one for $329, and the Nexus/7 costs $239 at BestBuy. The much less powerful hardware from HIMS costs $4000.

Next, HIMS boasts that this device runs “an optimized Windows-based operating system with a familiar, Windows-like user interface.” First off, this was done in the PAC Mate years earlier and, sadly, it failed, as it was nearly impossible to find accessible off-the-shelf apps to run on the device that would let users improve and customize its functionality to better meet their needs. A blind person using an iPad or Android device has myriad options for apps to install. HIMS chose a mobile operating system with virtually zero accessible third-party apps. They could have used Android and provided their users with thousands of accessible apps as options; instead, they chose to limit the possibilities a user of this device can have – a decision completely against the goals of achieving universal accessibility and compatibility with the same software that our sighted friends can use.

But, It Has A Keyboard and Braille Line

These days, one can find a small Bluetooth braille display online for under $1000 new, and used for far less. One can have any variety of Bluetooth keyboards for an iOS or Android device. One can buy a talking battery extender on Amazon for $50.

Let’s do some math

  • 1 Apple iPad Mini: $330
  • 1 Google Nexus/7: $240
  • 1 Apple Macbook Air: $1100
  • 1 Toshiba Windows laptop: $300
  • 1 external keyboard: $50
  • 1 braille display: $1000

  • Total: $3020

These are all of the accessible devices I own and use on most days. I have four operating systems, a panoply of different screen readers and as many accessible applications as I could possibly ever want. For nearly a thousand dollars less than a single HIMS Braille Sense U2 MINI, I have all of its functionality plus much, much more. For about $3000, a blind person can literally have everything the mainstream technology world can enjoy and, no matter what Mosen asserts, mainstream solutions for blind users are both more functional and, yes, a lot less expensive.
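The tally above is easy to verify (prices as quoted in this post):

```python
# Prices quoted in the list above, in US dollars.
gear = {
    "Apple iPad Mini": 330,
    "Google Nexus/7": 240,
    "Apple MacBook Air": 1100,
    "Toshiba Windows laptop": 300,
    "external keyboard": 50,
    "braille display": 1000,
}

total = sum(gear.values())
print(total)         # 3020
print(4000 - total)  # 980 -- nearly $1000 less than the single HIMS device
```

In other words, the entire multi-device, multi-OS kit comes in $980 under the one HIMS notetaker.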


  • HIMS insults the blind community by attempting to sell an underpowered, low functionality device on which it will be impossible to install third party applications.

  • HIMS is obviously earning a windfall profit by selling this bit of technology for $4000 when the parts in it cannot possibly cost more than a few hundred dollars.

  • The new HIMS device attempts to force blind people into a segregated technological ghetto in which it is difficult or impossible to compete in the workplace or in school with our friends who do not self-identify as having a disability.

  • All such ghetto devices must be eschewed by this community or we’ll never achieve technological equity.

– End –

Chris, with maniacal grin, wearing long braids sticking out of either side of his head

I Left My Hair in San Francisco

It’s been a long time since I last wrote a random musings article. In fact, I haven’t written anything like this since the BlindConfidential days. Lots of things cross my mind that may be interesting or amusing to our readers so I will write them down here and hope you people enjoy them.

Locks of Love

Last year, Mia Lipner, my best friend, had a rare cancer (leiomyosarcoma), possibly linked to the retinoblastoma that caused her to lose her vision as a child. While visiting her at a UCSF hospital in San Francisco, I had a conversation with a young woman working as a cancer navigator there. She noticed that I had relatively long hair then and suggested that I grow it out and donate it to Locks of Love, an organization that collects human hair and makes wigs for underprivileged children who’ve lost their hair to cancer. Growing my hair out cost nothing and, hopefully, some kid got a nice wig to wear while she is going through chemotherapy.

If you have long hair and want to get rid of it, I recommend sending it to Locks of Love. It’s really easy. I went to a hair salon on Haight St. in San Francisco, the stylist to whom I was assigned put my hair into three separate braids and cut them off as close to my head as possible. She said it was nice to have a man with long hair come in as she could cut it really short and the kids get more hair for their wigs than they would if I had chosen to keep it longer than I did.

I’m happy to have helped in my little way. If you want a really easy way to help kids with cancer, do the same; growing one’s hair long is a simple and very inexpensive way to help out.

A Nexus/7 at BestBuy

The other day, I went to BestBuy to look at, and possibly buy, a Google Nexus/7 tablet. I wanted to have an Android device in my hands just for the sake of being able to test its claims of accessibility for people with vision impairment. Unfortunately, when I asked the salesman if he could turn on TalkBack (the Android screen reader), I was informed that such was impossible when the unit is in “demo mode.” When I got home, I double-checked to make sure it wasn’t a mistake made by the kid selling mobile devices at a retail store and learned that, indeed, Android devices in demo mode cannot be used by a person with a vision impairment. Meanwhile, Apple devices in the same section at the same store allow users to turn their screen reader on and off.

This is exactly what I’m referring to when I describe iOS devices as the only truly accessible ones available on the market. It is also what I’m referring to when I speak of Google’s hubris: Google decides what we blind people want or need. They control the OS, the accessibility API, the engineering teams, everything, yet they refuse to make their demo units talk (something that would have been included in the design if they cared about accessibility) while also refusing to make accessible 100% of the apps Google ships by default in Android.

Basic accessibility isn’t hard. Google, however, seems to believe that partial accessibility is good enough, but I don’t accept that assertion.

Shredder Chess

A few weeks ago, another blind iOS user suggested that I might enjoy a game called Shredder Chess. I’m not a good chess player; I know the rules but that’s about it. I must say, though, that this app is 100% accessible and I’ve been enjoying it for a few weeks now. You can get it from the iOS AppStore, where there is a no-cost “light” version, and the full game is available for eight dollars. I paid for my copy, as I’ll pay up to $10 for almost any accessible mainstream app just to thank the developers for the effort of making their product accessible.

By the way, does anyone know of an accessible way to learn chess basics? There are some chess tutorial programs for iPhone but I’ve no idea which may be usable with VoiceOver.


I listen to baseball on the radio a lot. Tropicana orange juice ads on Tampa Bay Rays games say, “Picking an orange is just like throwing a baseball…” I ask, “Then, why do migrant orange pickers make so much less than baseball players?”

I really enjoy listening to the disclaimers at the end of commercials for pharmaceuticals. They tend to sound like a long list of the worst things that can possibly happen to someone. They usually sound far worse than the symptoms they are trying to cure. Meanwhile, supplements, most of which are made by the same big drug companies, “big pharma” if you like, don’t even need to publish the levels of active ingredients or test results regarding the nonsense they sell. Go figure.

Shelley Segal

Last week, Shelley Segal performed at my home in St. Petersburg, Florida. I advertised the event mostly on MeetUp.com on atheist, humanist, skeptical and other local groups. I had never met any of the people who attended other than Shelley herself.

Shelley turned her concert into a conversation with the attendees. She made certain that she had spoken to each guest and, far more than just chatting with people, she made everyone in the room feel very special. Shelley’s generosity of spirit is amazing and, if you don’t know this awesome gal, seek her out, as she’s such an amazing friend to have.


As ever, I’m uncertain what the future may hold for people with vision impairment. I can say that I’m working on some really interesting things. Keep reading this blog, as we’ll be announcing some really cool things here in the near future.

Progress in Screen Reading: Android and iOS

Since my middle school days, I did all of my computing either with a braille notetaking device or on a computer. People told me that, with the limited vision I had, a screen magnifier would be the best access technology for me to use. So, for years, I was convinced that screen magnification should be my tool of choice. Halfway through middle school, I noticed that my totally blind classmates could get around the internet, e-mail, documents and the whole computer system a lot faster than I could. When I saw this, I wanted to try my hand at this screen reader thing. I was fed up with some of the inefficiencies inherent in using a screen magnifier: having to move my mouse in circles just to find what I wanted, making sighted people “sea sick” when I tried to show them what I was looking at, and various other aspects that wasted my time and energy.
I was introduced to and fell in love with the JAWS screen reader. I could navigate my Windows computer efficiently and perform every task I needed to do without getting headaches and having to stop for periods of time to let my eyes rest. I could now work as rapidly as my classmates at typing and everything else they were doing at the time.
Over the years, I enjoyed innovations from Freedom Scientific, the authors and publishers of JAWS, as they developed it into an excellent piece of software. As I got older, I continued to use screen readers as my primary modality for interacting with computers. I was finding, though, that JAWS got slower and less innovative and required more system resources. Also, JAWS has more features than you can shake a stick at; you probably will never know most of them are there and will probably never use them. I’m not saying that a company isn’t innovating by adding new features with every upgrade. However, why not look at what your product is doing to a computer’s system and fix the bloat and cruft that a lot of people will never use? Or, better yet, make two versions of the software: one for the average consumer and another for the professional in a workplace environment.

After high school, I continued to use JAWS. The vocational/rehab services in my state all seemed to live by the notion that if you don’t know JAWS, then you’re dumb. A few years ago, I tried Macintosh OS X Snow Leopard with VoiceOver, the Apple screen reader included with all of their devices, and I didn’t care for it. Even though I didn’t like it much personally, at the time I had to acknowledge that Apple was making progress.
In 2009, I swore by Apple’s iPhone. It was the best mobile device ever; Android, while I had heard it was becoming accessible, wasn’t even a factor in my mind. There I was with the device, loving that, for the first time, I could use a touch-screen device. Apple was once again innovating and it was amazing. A friend of mine had an Android device back then. His HTC G2 ran Android OS 2.3 and it felt foreign to me. He would talk about things I didn’t understand. He’d discuss rooting, modding, unlocking, bootloaders and the like. The list was endless and it was unfamiliar to me.
Fast forward to the present: with each upgrade of Apple’s iOS devices, I observe that few bugs get fixed and that iOS is becoming stale. iOS 7 brings only a new UI and a few new things in VoiceOver. Android, in the meantime, progresses rapidly and its market share among our community grows, while VoiceOver remains largely unchanged.

TalkBack, Google’s Android screen reader, continues to evolve and is shaped by both professional software engineers at Google and independent, volunteer hackers who, due to its open-source philosophy, have access to its code and, as a result, can work on the software themselves. When he took on the role of Director of Access Technology at the Free Software Foundation (FSF), Chris Hofstader, the owner of this site, an iOS accessibility supporter and a prominent critic of Android accessibility, wrote that the free software, open source model was the only way that people with print disabilities can “own their own technological destiny.” With Google’s open model, programmers in this community can participate in the development of the tools they need to use; with Apple’s pathological secrecy, it’s impossible to even know what they are working on and, without any way to effect change in VoiceOver in a substantive manner, we haven’t the freedom to even participate in shaping its future.

In my opinion, innovation in screen reading with Apple products is an unlikely outcome. Sure, some things may get fixed, but when? And what else is going to be broken? How long is Apple going to take on fixing bugs? No one will ever know.
Technology is changing every single day. New devices, services and cloud computing are no longer the stuff of a Star Trek era but the here and now. Most everything is done in the cloud and can be accessed online through as many different computers as any of us can get our hands on. Why be stuck in the past when you can be pulled and led into the future?