In this episode
Jeremy and Mark have a fun discussion about the accessibility of the iPhone X, especially as it relates to the facial recognition feature.
The Interactive Accessibility Podcast (IAP) is an entertaining approach to accessibility. We enjoy sharing our discussions on accessibility and how it relates to technology, real-life issues, information, businesses, and people with disabilities.
Links of Interest
Presenter: Welcome to the "IAP, the Interactive Accessibility Podcast" bringing you the people, technology, and ideas, helping to make your world accessible to everyone.
Mark Miller: Hey, welcome to the IAP. I'm your host, Mark Miller, thanking you for helping us keep it accessible. Do us a favor, if you're enjoying the IAP, share it, tell someone about it. Hey, even link to it from your accessible website.
My co-host with me today is Jeremy Curry. Jeremy, how are you doing?
Jeremy Curry: I'm doing awesome, my friend. How are you?
Mark: I'm doing really well, thanks for asking. I wanted to go over something with you today. There's this article that came out in "Global Accessibility News." The article is entitled, "Apple launches the iPhone X accessibility features known so far."
Basically, it goes over a few things in terms of just some basic accessibility concerns that are already being addressed about the iPhone X, and the fact that Apple's really responding to these pretty well. The one thing that it talked about was facial recognition.
Jeremy: Oh yes.
Mark: Here's the perspective I wanted to approach this from with you: the fact that I'm a sighty, as you would call me, right?
Jeremy: [laughs] Yes, that is true.
Mark: Which means I'm some guy who has no idea what it's like to be blind, and you, as somebody who's blind or with low vision, I think is the way you would say it in your case, have this perspective on all this stuff which is really interesting to me.
On the surface, it seems like something such as facial recognition would be a huge technology for people who are blind. Then this article alludes to the fact that your head has to be positioned correctly. While it may be a decent technology, there are some obvious concerns that pop up that I as a sighty wouldn't think of.
Mark: I'm looking to you to help me out here, and just tell me, when something like this does come out, that seems really interesting and useful for people who are blind, what are some of the things that you guys immediately are concerned with?
Jeremy: I guess maybe we should start with the fact that Apple's done an amazing job of always providing for people with disabilities, well, I shouldn't say always, but since the iPhone 3G or 3GS came out, providing VoiceOver and making sure that mainstream technology was accessible to people with disabilities.
The iOS platform has really grown to be extremely popular within the blind community. The percentages are just enormous on how many people who are blind have an iOS device. Anytime there's something new like this, there's always concerns about "How is it going to work for people like myself who are either blind, visually impaired, or low vision, whatever you want to call it?"
It was interesting as I was watching the Apple keynote, he's holding the phone out in front of his face, so as you mentioned there are some spatial issues there. If you can't see the phone, are you holding it correctly? Then it's scanning your face essentially and using what they call their neural network in order to be able to assess if it's you or not.
That's great, but what about if you're blind? First of all, as you mentioned, if you're holding it, are you holding it correctly? Are you holding it too far away? Are you holding it too close? Is it oriented slightly one way or the other and that creates a problem?
Mark: Sorry to interrupt, but they did say that they were going to address some of that by directing the person who's blind with voice.
Jeremy: Yep, which is awesome. I think that Apple's headed in the right direction there, but there are other concerns too. People like myself, sometimes I'm using VoiceOver or sometimes I'm using Zoom, the magnification on the iPhone. I tend to look at things really close if I'm magnifying.
If the phone, say I'm trying to see if it's on, is just two inches from my face, well, then is the facial recognition going to be able to get an accurate picture of that? Or is it even going to be useful? Apparently, from what I've read, you can actually turn that off if you want to and just use VoiceOver. That would be helpful.
Mark: Yeah, they say that you can turn off the Face ID feature and enable VoiceOver instead, which seems like a great solution. But from my ignorant sighty standpoint, I look at that and go, "But the facial recognition is so cool."
I understand we're talking about unlocking the phone in this one instance, but it seems to me, and I know there's apps out there that have done this and tried to do this, but that there's a whole play here in terms of being able to use your phone to recognize people.
Because I know one of the things with people who are blind is that when somebody initially walks up to them and starts talking, they don't have the benefit of knowing who that person is as quickly as somebody with sight would, because we'd see them coming from across the room and be like, "Oh, hey, there's Jeremy and Darren, the guide dog, coming across the room."
I know, way ahead of you getting up to me, that it's you. In your case, we'd have to get maybe close enough for you to see with low vision, or probably, I'm guessing, you'd need to hear our voice for you to know who we are. But if you had something on your phone that was like, "Hey, so-and-so is approaching," or "These three people are in the group."
Because that's the other thing. I've been fortunate to be in groups of people where there's a lot of blind people around. One of the things I've noticed is that the person who's blind may not necessarily know all the people that are in a group, like in some sort of an event, something like that, or social thing where we're all standing around talking.
It seems to me that they're like, "Oh, that person's here? I didn't know that," five minutes after the conversation started. Anyway, I'm sure I'm just thinking about it on the surface, but it just seems like there are other things that might be useful.
Jeremy: Absolutely. You're definitely right. When people come up to me, I usually ask them to say, "Hey, it's Mark," so that I know who it is, especially if it's completely out of context. Like, you live about a country away from me. If you were in Indiana, I wouldn't expect that. If I saw you at the grocery store, I'd probably ask you to say, "Hey, it's Mark Miller," because why would I assume a Mark, you being Mark...
Mark: ...isn't some other local Mark. That makes sense. I've run into that circumstance, because we go to these conventions and stuff, and I'll see somebody who I haven't seen in a year, "Hey, Joe," and you can see the person turn around, start spinning, and try to orient towards the voice, and I'll say, "It's Mark Miller."
As I get closer, we start to talk. I'll say, "We met at..." or "I saw you last at..." I try to just give some context, because I think even my full name, especially my name, because it's a little generic, is hard enough. I see people's faces, and I'm like, "All right, I met them somewhere." I would appreciate...
Mark: ...that level of detail sometimes. I would just imagine if you don't have sight, it's harder. I don't know. What do you think about something like the iPhone or facial recognition, in general, being somewhat of an answer to those types of situations?
Jeremy: I could certainly see that, especially if some of the augmented reality stuff that is now supported on iOS 11 starts to take off, like Microsoft's HoloLens.
I guess Google Glass is now defunct, but having that type of hardware where you could have something recognize when somebody comes up to you and say, "Mark is approaching," or "So-and-so is approaching," or "There are this many people in this group."
There's all sorts of things you could do with that type of recognition. That would be really beneficial to people with all sorts of visual impairments.
Mark: I would imagine that you and I probably could sit here for an hour on a podcast, and not come up with a fraction of the applications that something like facial recognition could ultimately lead to.
The other point I think you bring up is that it really might be something that requires more of a wearable, because otherwise you're waving your phone around in a crowd and pointing it at people, which tends to make them nervous anyway, because they think they're going to end up on YouTube.
Mark: Something like wearable glasses with a camera that makes sense of the situation might be more appropriate. Talk to me a little bit more about the iPhone and Apple in accessibility, because as you said at the top of the podcast, and I know this from experience being around a lot of people who are blind, it really is a preferred device.
If I see a blind person hanging out on their own in a terminal, at an event, or something like that, 9 out of 10 times they have an iPhone, either plugged into their ear with an earbud or held up really close. I know that this is an important device. Talk to me a little bit about the other Apple accessibility issues.
Jeremy: Back when the iPhone 7 was released, one of the things that became a concern, like we're speculating with Face ID being an issue right now, was that they wanted a smaller, thinner footprint, which meant getting rid of the headphone jack. That was actually quite a significant change within the blind community, because we're listening all the time to everything.
If we don't want people hearing everything we're doing, we're plugging in headphones and trying to make sure stuff works. Losing that ability, even though there's an adapter for the Lightning port, was a big thing that people were worried about.
I can see, if phones start to go towards this concept of face recognition, that it probably will be something that resolves itself over time. By the iPhone 8 and iPhone X, not having a headphone port wasn't as big of a deal.
With facial recognition, those things are probably going to get worked out too. There are some other things even with facial recognition, for example, I have several friends who have prosthetic eyes and they keep their eyes closed all the time because there's no reason for them to open their eyes. It wouldn't make a difference.
With facial recognition, the way that Apple has done it from a security perspective is, if you hold the device in front of your face and your eyes are closed, it won't unlock. I guess that's to protect you if somebody is robbing you.
Mark: Security measure.
Jeremy: Yeah, exactly. If you're blind and you have your eyes closed, well, what's that going to do? There are some concerns like that that are going to work themselves out, I think, over time. We've seen so many different things from Apple just get better and better, such as the Apple Watch.
For example, there's a new Apple Watch now that has cellular connectivity without having to have an iPhone near it. When I first heard of the Apple Watch, I reacted like I did when I first heard of the iPhone. I thought, "Well, how is that going to work for blind or low-vision users?"
In fact, Zoom is on it, VoiceOver is on it. It can help you navigate with haptic feedback while you're walking, telling you whether you should turn left or turn right, wherever you're going.
Even though I think there are some issues with facial recognition, just concerns about where this is going to head, I tend to believe that Apple has a pretty good hold on what they should be doing as they move forward and will figure those things out.
When I very first heard about the iPhone, and I think you and I have talked about this before, somebody said, "Oh, you're going to navigate the phone via speech and a touchscreen." I just laughed. I thought that's never going to work.
As you pointed out, almost everybody who's blind or visually impaired has an iOS device now. I could foresee a future where all smartphones have face ID and all these kinks are worked out. It not only works but makes the world more accessible, which is what iOS has really done over the decade that the iPhone has been out.
Mark: Well, I think the other real thing that I realize when I look at all this and listen to you talk is that you can't launch something and have it be perfect, and that Apple's focus really is on accessibility.
You can have all the usability testing you want, you can have all the people with disabilities jump on this in a controlled environment that you want. Until it's released out in the world, until people actually try to use it in the context of their daily lives, you're not really going to discover all of the things that may be an issue.
The other interesting thing with something like the iPhone is that the accessibility really moves in two different directions. One is the accessibility as it relates to the usability of the phone itself. We have facial recognition, so how do we make sure that's accessible to people with a variety of disabilities, so that they can use the facial recognition feature as well? Or whatever the case is.
It also made me think, when you were talking about the haptic feedback in that wearable device, that they figured out the accessibility needs of the wearable itself, in terms of Zoom and all these other assistive technologies that are now usable on that device. Also, the device itself can become a new type of accessibility feature that enables people to navigate their world and their environment better with that haptic feedback. Somebody said, "Hey, we can use this haptic feedback to make things better." There are really these two directions I think accessibility moves in.
It creates new ATs and new features like that, but it also has to be looked at in terms of the usability of the product itself.
Jeremy: Yeah, there's this whole realm of possibilities that has been opened ever since iOS started adding accessibility features, because before that, there were just Windows smartphones, and there was another phone I can't think of the name of at the moment. You had to install special software on it, and now everything is built in, with the accessibility just right there at your fingertips.
Even the new Apple TV. The Apple TV has been accessible with VoiceOver out of the box, but Apple just added support for braille with the Apple TV. For people who are deaf-blind, you can get the captions of a TV program and actually read those captions on your braille display.
Mark: No way.
Jeremy: It's awesome, it's just mainstream stuff.
Mark: Well and again, not something I would have thought of. Not even recognized as a need. It's really interesting to see as these things hit the world where these needs pop up and how they're addressed.
Jeremy: Yeah, it's really exciting. I always tell myself if there was a good time to be blind or visually impaired, now is the best time ever. To have that [inaudible 16:27] case.
Mark: I think it's only going to get better. I'm on the phone a lot with people who aren't familiar with accessibility, at least not in the way that you and I are. I'm claiming a lot of ignorance here, but just because you run into these challenges, you have a much better idea of these things.
People who don't have even the advantages I do, of being around people with disabilities and being in the accessibility community, really don't think about these things.
There's one woman I was on the phone with who did have a little bit of exposure to that. She was very excited, and she related a lot of it to the progression we've made in technology, going back to her days watching "Star Trek: The Next Generation" and seeing the visor that Geordi wore, which seems to be the holy grail.
The cool thing about that is that he actually has more ability. He could see in different spectrums and stuff with that visor. Listening to you talk, I'm like, "Wait a minute. We're headed there." I don't know if we're going to see that in your lifetime or my lifetime, but that's the kind of direction we're headed in. The assistive technologies are going to be so sophisticated that there's no longer an issue.
Jeremy: Universal design is just going to be regular design. It won't even have to be called universal anymore.
Mark: Just design.
Jeremy: In it.
Mark: Yeah, it's just design. What do you think? Would you like a pair of those Geordi glasses? Do you think if the iPhone progresses to that point, you'd go ahead and pick up a pair?
Jeremy: Only if I could be on a starship with warp-speed.
Mark: [laughs] What would Darren do? I'd feel bad for Darren. He'd be like...
Jeremy: Maybe I'll get him a pair of those glasses. Maybe he'd be able to talk. He'd get a translator so he can just talk back.
Mark: That would be perfect. [laughs] You can see, he can talk, it's all good.
Mark: Great. Well, listen, thanks for jumping on and talking about this. This has been an interesting subject. I'm always amazed every time I jump into these things, just what I don't think of personally. It's always interesting to hear from you, and to read these things and find out how people are solving issues, you know.
Jeremy: Yeah, that's cool.
Mark: That's really what it is. Kudos to Apple. It seems like they've got this figured out, they address things quickly and really have created something that's helped a lot of people. Any last comments?
Jeremy: I would say, live long and prosper.
Mark: [laughs] All right. Well, this is Mark, thanking Jeremy and reminding all of you to keep it accessible.
Presenter: The IAP, Interactive Accessibility Podcast, is brought to you by Interactive Accessibility, the accessibility experts. You can find the Access Matters blog at InteractiveAccessibility.com/blog.