In response to all the billing and cooing over Automattic’s announcement/release of Calypso and the general excitement over WordPress’s upcoming REST API, Andrey “Rarst” Savchenko wrote an article decrying the lack of progressive enhancement being discussed or used in many of the examples.
Triggered (I presume) by Rarst’s citing the A Day of REST conference site as an example of a site not using progressive enhancement (now removed), Joe Hoyle responded at length.
Here is my contribution to the discussion (I originally started it as a comment on Rarst’s site, but it outgrew that):
I presume, from Joe Hoyle’s response and Rarst’s mention of removing specifics, that Joe’s REST conference site was highlighted as a bad example, so I understand the offended/defensive tone of his reply. Further, given the audience for that particular event, and thus for that site, I could regard it as a reasonable compromise to forgo support for those without working JavaScript.
However, that does not apply to the rest of the web.
People not numbers
Joe, you quote some figures, quite small figures really: 1.1% of users don’t have JavaScript available; 5% of users run very old versions of Internet Explorer; you even refer to the 2.4% of screen reader users who don’t have access to JavaScript.
But the problem with those small numbers is that they are small percentages of very large numbers. So 1% of all web users is 30 million. And those users are people, not numbers.
But let’s look at just one of those numbers: the 2.4% of screen reader users who don’t have access to JavaScript. All of those people, by definition, have difficulties accessing the web: likely their only method of accessing the web is through a screen reader. And you imply that it’s OK to prevent them from accessing the web in the name of the latest shiny trends, and to save some development time and cost. Actually, I’m being polite: you don’t imply it’s OK, you said “I don’t care…”, but I’ll accept you may be talking about one specific site.
Just from the RNIB figures in this 2014 report (PDF), there are 2 million people in the UK living with sight loss. If only half of them want to use the web, then that 2.4% is really 24,000 people you just excluded from the web.
Don’t forget: for many people with sight loss the internet is a lifeline helping them feel less isolated.
Blind users are not the only ones
And of course, visual difficulties are not the only reason people need accessible websites. Those with motor difficulties, who can only use keyboards or special devices (braille readers, head wands, sip-and-puff controllers, and so on), or simply those with unsteady hands who find a mouse difficult to use, like the ever-increasing aged population, all need to be able to use the web too.
Time for honesty
These are real people you may walk past in the street this weekend. Be honest, do you really want to join Facebook, Netflix, American Express, and yes, Automattic and boldly say to them “I don’t care that you can’t access all of the internet”?
Hi Mike,
Thanks for writing this article. I’ll go read the other pieces you’ve linked to and comment there as well, but I wanted to point out here that, while I’m not disagreeing with you that people who can’t use JavaScript should not be excluded from the web, you can in fact use JavaScript accessibly. You have to be circumspect about how things are coded (e.g. use native elements where possible, ARIA when you can’t), and you have to put some thought into which JS frameworks you use (React, for instance, is the only JS framework that I know of that has an accessibility API, which will make creating accessible JavaScript apps easier), but you can definitely use JavaScript. This is probably the biggest reason I’m pouring free time into Calypso. I want to see it become accessible, and as close to the ground up as possible. As part of the accessibility team for the WordPress project, and as someone who has extensively tested Jetpack for accessibility, I know full well that engaging Automattic on accessibility is an uphill battle. And yet, it has to be done, and it has to be done without any kind of activism, because if we’re activist about accessibility when it comes to Automattic, we’ll continue to lose.
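To make the “native elements where possible, ARIA when you can’t” point concrete, here is a minimal, hypothetical sketch (the function name and structure are illustrative, not from any library mentioned here) of what the “ARIA when you can’t” path involves: everything a native `<button>` provides for free has to be recreated by hand on a generic element, which is exactly why native elements should come first.

```javascript
// Hypothetical sketch: recreating native <button> behaviour on a <div>.
// A native button gets all of this for free.
function makeDivActLikeButton(el, onActivate) {
  el.setAttribute("role", "button"); // tell assistive tech it is a button
  el.setAttribute("tabindex", "0");  // make it reachable by keyboard
  el.addEventListener("click", onActivate);
  el.addEventListener("keydown", function (event) {
    // Native buttons activate on Enter and Space; replicate that.
    if (event.key === "Enter" || event.key === " ") {
      event.preventDefault();
      onActivate(event);
    }
  });
}
```

Forget any one of those steps and the control silently becomes unusable for keyboard or screen reader users, which is the sort of thoughtfulness about coding that accessible JavaScript demands.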
Hi Amanda,
I agree that JavaScript web sites/apps can be accessible and I support that. My main point is that unless we create websites and apps using progressive enhancement techniques, those who cannot use JavaScript have nothing to fall back on.
Comparing a site built using progressive enhancement principles to one which is JavaScript only; you go from a poorer user experience without all the bells and whistles provided by JavaScript, to no experience at all. For me, that is a step too far.
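That fallback principle can be sketched in a few lines. This is a hypothetical illustration, not code from any site discussed here: the plain HTML form submits to the server on its own, and the script only takes over once it has verified it can actually do the job.

```javascript
// Hypothetical progressive-enhancement sketch. The underlying HTML
// form works with no JavaScript at all; this function merely upgrades
// it, and refuses to do so if anything it needs is missing.
function enhanceSearchForm(form, fetchImpl) {
  // No usable form, or no fetch implementation? Do nothing: the
  // unenhanced form still submits to the server the traditional way.
  if (!form || typeof form.addEventListener !== "function") return false;
  if (typeof fetchImpl !== "function") return false;

  form.addEventListener("submit", function (event) {
    event.preventDefault(); // take over only now that we know we can
    fetchImpl(form.action).catch(function () {
      form.submit(); // enhancement failed: fall back to a full page load
    });
  });
  return true;
}
```

If this script never loads, loads with an error, or runs in a browser without the required API, users simply get the slower, plainer server-rendered experience rather than a blank page.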
I didn’t explicitly mention Calypso as a bad example, but if, as I understand it, it is a purely JavaScript solution, then by definition, it excludes some users.
However, as long as Calypso, or a descendant of it, does not completely replace the old WordPress admin screens, I’m not too bothered by that. (I know they are not 100% accessible, but they are getting better through your and others’ contributions.)
Meanwhile, keep up the good work on the accessibility team. I know it’s an uphill struggle (I’ve contributed a little myself) but it is a worthwhile one. WordPress’s pledge to democratise publishing sounds hollow if that democracy excludes a whole class of people.
To be honest, I want to note that I didn’t even focus my post on all of the Internet. I talked (but apparently didn’t use enough bold 🙂) primarily about content sites.
I think genuine web apps do have different expectations, and an application (!) running in the browser might have no choice but to rely on runtime code.
Content sites, however, not only have that choice, but it’s natural for them to choose progressive enhancement, since they work perfectly without JS in the first place.
Content sites are the bread and butter of WordPress. Let’s be honest: those 25% of the Internet are not web apps. I find the eagerness to discard their basic functionality and turn them into apps for the sake of it very worrying.
There is no relation between having JS disabled and having a disability. The study pointing out that 98% of screen reader users have JS enabled aims to break down a huge misconception among developers: that screen readers can’t deal with JS. It looks like this article seeks to bring back that false idea.
Why do people have JS disabled? Because of lots of misconceptions: the misconception that JS uses too much bandwidth and hurts speed; the misconception that enabling JS brings security issues; the misconception that JS-driven websites are not accessible. All of these prejudices are wrong. Of course, you will find plenty of examples of websites bloated with insecure scripts and terrible UI libraries that don’t care at all about accessibility. But these problems are not directly attributable to JavaScript.
It is time to accept JavaScript as a fundamental part of the web alongside HTML and CSS.
Sylvain,
Neither I nor the study referenced claims that no screen readers can cope with JavaScript. However, the study does show that 2.4% cannot cope with JavaScript. Note that whilst that figure may include some people who have JavaScript disabled, there are many, many old versions of screen readers in use, especially JAWS, that cannot cope.
In this country (the UK) at least, it is prohibitively expensive to buy or upgrade JAWS software; it can cost more than a month’s wages (over $1,000). And because most people obtain it through social/welfare benefit schemes, which do not include any provision to update software or hardware, people end up falling behind.
In my direct experience, many blind or visually impaired people are still using 5 or even 10-year old equipment with correspondingly old software because the social services do not have money in their budget to update equipment that is “still working”.
You are also missing the point that some people do not choose to have JavaScript disabled. There are organisations that disable it at the network level, or that inject additional JavaScript content into every page, which can then break the JavaScript on any site (I have directly experienced this). And the problem with JavaScript in the browser is that when one script has an error, the scripts that depend on it stop working too.
Similarly, on slow or intermittent network connections one piece of JavaScript may fail to load, stopping all the rest of the scripts from working. Again, I have experienced this myself.
I do accept JavaScript as a fundamental part of the web. I love sites that are enhanced by JavaScript and give a faster, richer experience. But there is almost no functionality enhanced by JavaScript that cannot also be implemented in plain old semantic HTML. And that is all I am asking: if the JavaScript fails, a website should continue to perform its fundamental role. Even if that is then an uglier, slower, poorer experience, it is infinitely preferable to no experience at all.
I’m going to agree with Mike’s comments. In regard to screen reader lag, that is definitely a thing. And the pricing for JAWS is that high across the board, not just in the UK. When I worked for Freedom Scientific, the company that develops the JAWS screen reader and the MAGic screen magnifier, I often found myself supporting not only the latest version of JAWS, but also versions that went as far back as running on Windows 98. While the Windows 98 versions and support calls were edge cases, versions at least two or three behind were commonplace. JAWS releases on a schedule of once every year, and has not been innovating for the past ten years at least. Most of the new features that have been added have nothing to do with where the web is currently going. So while, as I commented above, you can indeed have accessible JavaScript apps and websites, and while I agree that the belief that JavaScript is by default inaccessible is harmful and wrong (thanks, National Federation of the Blind, for coming up with and continuing to spread that one), I don’t think that JavaScript should be implemented without some sort of noscript fallback.