Apple’s Ecosystem and Accessibility


Earlier this year, my AirPods Pro began making a clicking sound in Noise Cancellation and Transparency modes. I didn’t think much of it, and kept using them regularly, until the sound began distorting after a while of listening. I’ve simply stopped using them, as I shudder to think how much a cab ride to the nearest Apple Store, potentially an hour away, would cost. This is just one problem with the Apple ecosystem: being locked into Apple’s wireless headphones, other Bluetooth headphones, or other workarounds, with Apple Stores far away, and it is that ecosystem I’ll be focusing on in this article. In the following paragraphs, I will show how Apple’s handling of its ecosystem affects its hardware and software where accessibility is concerned. These matters may affect some in the general population, but people with disabilities are affected much more acutely.


Hardware


Apple’s hardware has usually been very well built, and reviewers often talk about little else. From the iPhone’s camera and the iPad’s screen to the Mac’s CPU and RAM, the Watch’s health sensors, and the AirPods’ H1 chip, hardware is a big part of Apple’s products, and reviewers focus on it. But how does that help or hinder accessibility?


The Touch Bar on the Mac


In late 2016, Apple’s


MacBook Pro (HTTP)


gained the


Touch Bar (HTTP)


a touch strip across the top of the keyboard that replaced the function keys. The idea was to show changeable icons whose functions could visibly change across the operating system. Many people may have liked this, as they could use hand-eye coordination to perform actions they would otherwise have reached through the trackpad and menus. These users may not have known about keyboard shortcuts, the function keys, and other easy ways of doing the same things without yet another touch input.


Blind people, however, are a bit different. We usually know many keyboard shortcuts, use the function keys without a problem, and do not always need a touch screen. The Touch Bar can be used, but it is much slower, as we have no tactile way of finding a single distinct item on it, like the play/pause button or the volume slider. Once we have found the function we want, we must tap it twice to activate it, much as a sighted person clicks once to focus an item and again to activate it. In fact, VoiceOver, the screen reader for the Mac, had to adopt a command to raise or lower the volume from the keyboard, since doing so on the Touch Bar is slower. On the other hand, most operating system and application features can be reached via keyboard commands, so I only need the Touch Bar for system functions like volume, screen brightness, and media playback when I’m not in the media player.


If a blind person also wants to use their Mac as a Windows machine through Boot Camp, they must attach an external keyboard, or simply not use the function keys. Windows screen readers have no notion of a Touch Bar function-key row, so they will not read what a user is selecting, and will not let a user explore the Touch Bar to find a function before activating it; one touch activates an item, even if it isn’t the one the user wants. See


this AppleVis forum post (HTTP)


for more information on this.


I feel that Apple should have made this change on the MacBook Air, for regular consumers, and left the Pro machines alone. Yes, they could still have made the power button into a Touch ID button on the Pro machines, and I hope that, just as they revived the scissor-switch keyboard, they revive the function keys as well. It would make even simple tasks easier, like pausing, skipping, and rewinding audio, and handling volume and brightness more quickly.


There is still hope, however. This year, Apple released the MacBook Air refresh with the new keyboard. It has an Escape key, at least. Now, they just need to add back the other twelve keys on that row, and things will be back to normal.


The headphone jack


With 2016’s iPhone 7 and 7 Plus, Apple removed the headphone jack, replacing it with their own AirPods, other Bluetooth headphones, and Lightning audio. They did not add a second Lightning port onto the phone so that one could listen to wired headphones and charge the phone at the same time; instead, as with the Touch Bar on the MacBook Pro, they left people to choose between wireless options if they wanted to listen and charge at once.


For most people, this isn’t an issue. They don’t usually need headphones, only using them when listening to music or movies, or playing games. Even then, some people just listen on the speakers built into their phone, or use external speakers, like the HomePod. They also do not have to worry about latency: music is not affected by it, and video is usually delayed so that the picture synchronizes with the audio.


For blind people, however, headphones are important. In order to use an iPhone, most blind people use a screen reader, which speaks information out loud using a voice like the one Siri uses. Using a screen reader without headphones means that anyone nearby can hear what the user’s phone is saying, which can reveal sensitive information like the phone numbers of people who call or text the person, user passwords, and even the passcode to their phone. This means that headphones are quite necessary. Some blind people own braille displays, which take output from a screen reader and present it in braille, but these devices are expensive, starting at around $600 and running to nearly $6,000, so they are out of most blind people’s price range.


Wireless headphones, using Bluetooth, often have noticeable lag. If you play a game using them, you’ll surely notice it. A blind person who uses Bluetooth headphones must deal with that lag in every interaction with the phone. Imagine having to use a phone whose screen lags behind everything you do, even by 300 milliseconds. Some Bluetooth headphones are better, but none can match wired ones. Apple’s AirPods 2 and AirPods Pro come closer, but have their own problems: they still must be charged, their battery life is shorter, and they cost a lot for the sound quality they offer.


To solve all of these problems, I bought a $10 Lightning to 3.5 mm headphone adapter, and use it with the headphones I already have. Sure, I have to take my iPhone with me in my pocket wherever I go, but I usually do that anyway now that my Apple Watch is broken too. Sure, my Lightning connector isn’t free, but I have a charging mat that I use to charge the phone. There is no lag when using VoiceOver, the sound quality is very good, and I don’t have to charge my headphones.


Hope is not lost, however. There is a


rumor (HTTP)


that iPhones could become completely wireless. Of course, one must still plug the iPhone into a computer at times, so it could work like the older MacBook models, with a magnetic spot to attach dongles to. In that case, a third-party dongle could add the Lightning port and headphone jack back to the iPhone.


The Home button and Touch ID


In 2017, Apple shipped the iPhone X, the first iPhone without a Home button. This was meant to extend the screen across the entire front of the phone, even though it had to be notched at the top. Along with the removal of the Home button, Apple added Face ID, which replaced Touch ID as the authentication method for unlocking the device in everyday use.


Most users do not have a problem with Face ID. They raise the phone to look at it, and as they look at the camera, the phone unlocks. They can then swipe up from the bottom to dismiss the lock screen, revealing the home screen. For sighted users, this is a quick, easy, and intuitive motion.


For blind people, it isn’t so simple. We do not have to look at our phones in order to use them. In fact, users with braille displays or hardware Bluetooth keyboards do not have to touch their phone at all. Those users can easily and quickly enter their passcodes, however, so they usually are not affected by this. Most users must pick up the phone, wait for the unlock sound from the screen reader, then put it back down on the surface they were using it on before. If Face ID doesn’t work, they must angle the phone away and back again for another try. If it fails a few more times, they must enter their passcode, with headphones in if they want to preserve their privacy around others.


Hope is not lost, however. There is a


rumor (HTTP)


that a new iPhone SE-style device, the iPhone 9, could be released this year with a Home button and Touch ID, while still sporting the A13 CPU. This is something I may purchase myself, as I doubt there will be many greater features in the iPhone 12, due later this year.


Software


Apple’s software usually comes last in reviews. Reviewers may talk about the smooth animations, machine-learning camera effects, or updates to apps. For users of Apple’s accessibility services, however, software is the core experience of a device, and what sets macOS apart from Windows and Linux, and iOS apart from Android. I have covered Apple’s accessibility options extensively


elsewhere


so I will use this section to highlight parts of the software which affect accessibility indirectly.


Gatekeeper on macOS


For a pro machine, the Mac has lately become a mess of confirmation dialogs and hindrances to opening software not blessed by Apple or its notarization process. For most users, even most blind users, this won’t be much of an issue: if you use Apple’s apps, or apps from the App Store, you’ll be fine. But what happens when you want to use, say, Emacs for editing text, or RetroArch for playing video games?


Blind people sometimes use specialized software to complete tasks. We use apps on our phones for recognizing pictures, money, and images of text, since these are not usually accessible to us. On the Mac, I use


Emacs (HTTP)


for editing text, using the


Emacspeak (HTTP)


extension, because I find it much easier and more enjoyable than TextEdit, Pages, and the other alternatives. In fact, I am using Emacs right now to write, and publish, this blog post. However, this program is not notarized through Apple’s process, so instead of just being able to open it, I must open it from the contextual menu, press “Cancel,” then open it again, and press “Open.” My laptop is a pro machine; I should be treated as a professional. These features, as with the Touch Bar, should be left to MacBook Air users, or to iPad users, when, or if, the iPad becomes a general-purpose computer.
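For those comfortable with the terminal, there is a quicker way around this. macOS marks downloaded programs with a quarantine attribute, and removing that attribute stops Gatekeeper from intercepting the app at launch. Here is a minimal sketch, assuming Emacs lives at the usual path; adjust the path to wherever your copy is installed.

```
# Check whether the app bundle carries the quarantine attribute.
xattr -l /Applications/Emacs.app

# Remove com.apple.quarantine recursively from the app bundle,
# so Gatekeeper no longer blocks it when it opens.
xattr -rd com.apple.quarantine /Applications/Emacs.app
```

Once the attribute is gone, the app should open normally, with no extra dialogs.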


Conclusion


In this article, I’ve explored how some of Apple’s decisions across its ecosystem have affected accessibility. The hardware has changed a great deal, while the software remains mostly usable, apart from accessibility bugs and overbearing security. More about direct accessibility in software and services can be found in my other articles. Other, smaller issues include the lack of Apple Stores in smaller cities; an iPhone producing no vibration, sound, or other cue for a blind person to immediately know it has powered on; and the Mac’s startup chime being disabled by default.


Now, what do you think, readers? I’d love to hear your feedback, and thank you for reading.


