A few weeks with Android


So, I’ve had a few weeks now with a new phone, and this time, it’s not an Apple phone. I now have the Samsung Galaxy S20 FE (maybe the 5G version). It’s been a pretty smooth, and fun, ride! In this post, I’ll try to detail as much as I remember about all this, and why I chose an Android phone at all. After all, haven’t I, throughout this blog’s short history, said that Android’s accessibility is awful?


Android accessibility


Android accessibility started out as little more than “use a physical keyboard, trackball, or other physical device to move around and select things.” It was very primitive, and the iPhone’s jumping into the touch screen with both feet made Android look puny. The excuse back then was that physical keyboards and such were better for blind folks anyway than that iPhone touch screen. And, in a small way, they were right. But modern devices come with “modern” input interfaces, and the keyboard on a phone didn’t last long.


Enter Android 4, which brought with it preliminary support for touch screens. Android 4.1 brought a little more with the whole “explore by touch” thing; at least, I think that was added later. The screen reader, TalkBack, then gained menus for accessing different reading options, or actions within apps, like dismissing notifications or deleting email. VoiceOver has had this since, I’d say, iOS 5 or so, with the Rotor, which is, in my view, still a bit easier to use than a whole GUI menu for stuff. But that’s just me.


Android 8 brought a few new things. With fingerprint sensors having been a thing on phones for the past three years or so, TalkBack gained the ability to use the sensor as a sort of second touch surface for gestures, so you could use it to adjust reading modes and such. But that didn’t include the actions menu, unfortunately. Google TTS gained the ability to detect the language of text, so TalkBack could switch languages automatically whenever text in a language other than the default showed up. The iPhone has had that since around iOS 4.


TalkBack 9 introduced, with Android 11, multi-finger gestures. You see, unlike the iPhone, some older Android phones apparently could only register one finger in gestures. Well, now Android phones, unlike the iPhone, support six or more fingers of input at once, making writing in braille, whenever the thing works right, a joy. So, TalkBack now has a few more gestures, including one to pause and play media from anywhere. See, Google partnered up with Samsung, who had their own accessibility enhancements, to build this. I suspect Google was like “Hey Samsung ol’ buddy ol’ pal, best phone manufacturer of them all, would you please, please with $1000000 on top, give me the code you added on for that, uh, access... accessibility stuff whatsit? Our A123456XYZ team needs something cool to give to the lusers, so we need, like, that code stuff. Please? We’ll advertise for you!” And Samsung was like “Okay, we’ll get our one single accessibility person right on it. Pleasure doing business with you, Google. Oh by the way, we’ll need another $1000000 to pay this person’s salary. Well ours too but you know how it is, comrade.”


So, with Google being Google, they just plopped Samsung’s fixes onto the accessibility API, and TalkBack can now do these things. They don’t always work, especially the three- and four-finger gestures. But I am glad to have them, especially the two-finger ones. They also added voice commands. I didn’t see that one coming, but it’s pretty cool. And, speaking of braille from earlier, they added a braille input method that, unlike the iPhone’s, can actually use all six fingers for brailling. This means that I can type the FOR sign, dots 1-2-3-4-5-6, on Android, whereas I can’t on the iPhone, having to type “for” instead. I know, not much of a problem, but still, a technical deficiency is a technical deficiency. There shouldn’t be that issue on iOS in the first place.
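To make the six-dot thing concrete: Unicode encodes braille cells starting at U+2800, with one bit per dot, so a full six-finger chord maps cleanly to a single character. Here’s a minimal Python sketch of that mapping (my own illustration of the encoding, not TalkBack’s actual code):

```python
def dots_to_cell(dots):
    """Map a set of pressed dot numbers (1-6) to a Unicode braille cell.

    Unicode braille patterns start at U+2800; dot n sets bit 2**(n-1).
    """
    value = 0
    for d in dots:
        value |= 1 << (d - 1)
    return chr(0x2800 + value)

# The FOR sign is the full cell, dots 1-2-3-4-5-6:
print(dots_to_cell({1, 2, 3, 4, 5, 6}))  # ⠿ (U+283F)
```

A braille keyboard that can only register fewer fingers at once simply can’t produce that full-cell chord in one press, which is the deficiency described above.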


Now, the TalkBack braille keyboard isn’t perfect. First, it sometimes puts in dots that I didn’t press, like a stray dot 5 or dot 6, when I’m sure I didn’t touch the screen, or when I’m about to press other dots. And sometimes it doesn’t input dots that I do type. So it has some growing to do still. Even then, I use it, and like it, a lot.


There’s also a new screen search feature, but I just use the voice commands feature and just say “find <thing>” to search for stuff. Not even VoiceOver has that voice commands feature, yet.


Why Android?


So, we’ve established that Android is accessible, to roughly the point iOS was at about three years ago. No text recognition, no screen recognition, and still no braille output (although BRLTTY has TalkBack beat in that arena anyway). So why would I want it? The main points for me are exploration, customization, and owning my own darn device.


Exploration


As an Assistive Technology Technical Assistant (opinions are my own and not those of my employer and all that bullcrap), and reluctant Guru (I hate that term so much), I need to be able to use a lot of different devices. Yes, even the Orcam, which is more like the Orscam to me, since Seeing AI can detect text much better in my experience. Again, opinions my own; I’ll recommend whatever the student needs, but *always* give alternatives beforehand. So, I need to be able to use Android phones, since a student *will* eventually come in with an Android phone and need training on it. It’s definitely helpful for me to have one.


Also, I wanted to see how accessibility was moving along. I’d been on iOS for a year or two after trading in my Pixel for an iPhone 7... or was it the XR? Nah, I think it was the 7. Anyway, I hadn’t been on Android in a while, so accessibility could have improved quite a bit since 2017 or so. And it kind of has, a little. At least, it doesn’t feel as frustrating to me as it did back then, and I find myself really liking reading email in AquaMail. I can read through threads without VoiceOver’s weird bugs when dealing with threaded mail in the Mail app. And my goodness, this phone has a fingerprint sensor! No more needing to hold my phone up just to unlock the thing!


Customization


An iPhone is a very powerful pocket computer. And yet, we spend so much of our time on it just scrolling through Facebook, reading email, listening to music, and otherwise not really doing much with it. That, I feel, is very wasteful. Why use such a powerful device for so little? And it’s not the users’ fault, not entirely. Apple’s App Store is full of tiny, attention-seeking apps that just waste users’ time on little arcade-like experiences, when the user’s iPhone *could* replace their entire computing experience! And Apple doesn’t allow more freely-running software, like emulators and virtual machine managers, so we’re left with whatever the big companies create, which are often, if not always, “mobile” apps; mobile, in this context, meaning stripped-down, almost useless apps for actually getting things done. And believe me, when an iPhone is the *only* computer a blind person has, getting things done on it is important. Sure, they can check mail, write notes, and even browse much of the web. But what about writing essays or documentation, or dealing with large spreadsheets? When a phone is more expensive than many laptop and desktop computers, it should do as much as them, or more.


But no: rather than mobile computing being elevated to the height of a machine used for productivity, especially in the ongoing COVID-19 mess, the phone is just a Netflix machine that can sometimes act as a Facebook interface or a Twitter screen. It’s awful, and I hate that these powerful computers are so locked down and used for such foolishness, an abuse of what could be so much greater. But Apple has to make money, and big companies have to make money.


Android, on the other hand, is a lot more customizable. Granted, Google is cracking down on a lot of stuff, like apps no longer being able to access the whole file system. But we can still install alternative app stores, we can still sideload apps, we can still install terminal emulators and console emulators and just about anything we want. And we can develop on any computer we want, even Linux. Blind folks aren’t limited to the TalkBack screen reader: anyone can develop an app as an accessibility service, and any user can install that app. Of course, there’s only one other screen reader out there, but it’s not too bad. Anyone can also develop different text-to-speech engines for Android, so we have a good few options there as well.
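That “anyone can develop an accessibility service” bit is literal: a third-party screen reader is just an ordinary app that declares an accessibility service in its manifest, and the system offers it to the user in the accessibility settings. A rough sketch of that declaration, with the class and resource names being made up for illustration:

```xml
<!-- In AndroidManifest.xml: declares an accessibility service the user
     can enable from Settings. Service and resource names are illustrative. -->
<service
    android:name=".MyScreenReaderService"
    android:label="My Screen Reader"
    android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
    <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
    </intent-filter>
    <!-- Points to an XML resource describing what events the service wants. -->
    <meta-data
        android:name="android.accessibilityservice"
        android:resource="@xml/accessibility_service_config" />
</service>
```

No special blessing from Google required, which is exactly why alternative screen readers and TTS engines can exist on Android at all.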


Now, I know, Android is Google’s domain, where they can watch you and all that. But Apple is just as controlling. Instead of setting you free and watching what you do, Apple just takes all that freedom away, so they don’t *have* to watch you. Either way, you are either watched so Google can sell ads, or you’re kept like a pet in a cage. Well, I’m not a pet. And while I don’t *like* what Google does, I don’t really care either. I use Google, and Google uses me, probably. It’s better than being handed a device and not being able to do what I want with it.


Owning my Own Darn Device


So, when I bought the iPhone XR, I viewed it as just another upgrade, really. The neural coprocessor should make the Siri voices more responsive; the better speakers, stereo recording, all that fun stuff. But then I wanted to play games on my phone. Not the stupid “you’re a blind man and you have to do this, **only with your ears**(TM)” kind. I wanted to play real video games, fighting games, whatever I could play! Stuff that is actually freaking fun, not exasperating because I’m already blind and I don’t want to be blind in a darn game that’s basically a more advanced Bop It. But no. The App Store doesn’t allow those unless the developer of the game puts them up there. And Square Enix doesn’t care at all about Dissidia Final Fantasy, especially not the original version for PSP. So, even though I’d bought the game from the PSN store, I couldn’t play it on my mobile device.


So that, along with not being able to have a good, quick terminal (until iSH was released), made me think about Android. Now, you may say, “well, what about AltStore? You were all in love with that before.” Well, now I’m running Linux, and Linux doesn’t support the stuff AltStore needs to run. So that’s a flop for me. I did try out the BuildStore again, but I gave up on it. I just didn’t have the time to grab the PPSSPP config file from the iPhone, configure it, find the games in the file system (hopefully that’d be easier), and then reinstall everything whenever the BuildStore’s certificate got revoked.


Gosh, at least on Android, I know that once I set something up, it’ll stay there. I don’t have to worry about reinstalling PPSSPP, or Dolphin, or anything else. And yes, even Dolphin, with Wii games, runs amazingly well on this phone, even better than on my computer.


The bad points


Android does, particularly in accessibility, have some bad points. For one, dealing with actions is slower than on iOS. For iOS:



For Android:



This is on Android 11, with Samsung’s version of TalkBack, which *should* be pretty close to regular TalkBack, but who knows. I’ll try installing Google’s TalkBack at some point to see if anything is easier. There used to be a way to dismiss a notification by swiping on it with two fingers, though it was sometimes hard to keep the swipe in a straight horizontal line, and it doesn’t seem to work at all now in Android 11. It’s not *too* big of an issue, but when I hear iPhone users clear notifications so quickly, I’m kind of jealous.


Another thing is opening apps quickly. With the iPhone, I would just unlock my phone, use braille screen input to type the first few letters of an app, then swipe right with one finger to open it. Now, I know, that’s something sighted people cannot do, and it’s a workaround for a good, easy search feature, but it was very fast and efficient. I’ve worked around this by putting my most used apps on my home screen, like everyone else has to do. Still, when I’m looking for my banking app, for example, it is annoying not to be able to just invoke the Investiture of braille screen input and... flip it open. I know that I can also tell Google to open it, but I shouldn’t *have* to. I don’t particularly like talking to my devices, especially when other people are around. But I can get used to that, I think. Having the full power of the Google Assistant, rather than messy, simple-minded Siri, will be something I’ll have to get used to as well. As I’ve said, I’ve only had this thing for a few weeks, but I can see myself keeping it for a long while yet.



devinprater.flounder.online/