This week on the IAP, we discuss a new camera product, aimed at being a sort of seeing aid for people with blindness and low vision.
Show Notes & Links
- Subscribe to the podcast on iTunes
- Download Podcast as mp3
- OrCam image tech for people with vision disabilities
Announcer: This is the IAP - Interactive Accessibility Podcast with Mike Guill and Mark Miller. Introducing Mike Guill and Mark Miller.
Mark Miller: Hey, welcome to the IAP. Thanks for helping us keep it accessible. Do us a favor. If you’re enjoying the IAP, share it. Tell someone about it. Hey, even link to it from your accessible website. Mike, I’ve done it again. I found another one for us to talk about today.
Mike Guill: I know. I can’t believe it. I’m not the one coming up with the article this week.
Mark: Well I know why you’re not coming up with an article this week – you’ve been flying around for Thanksgiving.
Mike: Yeah, it’s kind of a busy travel time of the year for everybody.
Mark: Yeah. And last night, I saw the pictures of you flying with your wife sleeping in the passenger's seat next to you.
Mike: Yeah, that’s supposed to be the co-pilot.
Mark: I think you might want to review the duties of a co-pilot with her.
Mike: I know! Normally, your co-pilot operates the radios, maybe talks to you or hands you a map if you need one, that kind of thing.
Mark: Well, she wasn't doing that. She was – uhm, resting.
Mike: Yes, "resting," also known as sleeping.
Mark: Also known as sleeping. Well, today on the IAP, we're going to talk about the OrCam. This little bad boy is basically a video camera that you can clip onto the side of your glasses. It's for people with some sort of vision disability – I'd imagine it applies pretty well even to people who are completely blind – but basically, it's an aid. I kind of view it almost like a hearing aid, but for vision disabilities. So you're navigating around a grocery store, you pick up an item and you're not sure what it is, you're having trouble reading the text – this camera will read that text for you and let you know what it says. And it can also go so far as to identify different objects, which is really exciting to me. So it's really specific. I mean, Google Glass has a lot of the same functions, but this is really honed to being an aid, an assistive technology for vision disabilities.
Mike: Right! Also, you can't talk about something like this without discussing Google Glass. The difference here is that the Google device has all kinds of extra things built into it, such as social sharing and the ability to do a lot more. This thing is more like something that you clip onto your own glasses, and it can identify things and give you audio feedback about what those things are. The interesting thing to me is that you can sort of train it, right, from what I've read. So you can train it to identify things you encounter frequently, especially situations where you're required to make a choice between things. The example they use on their website is the crosswalk, right? Walk and don't walk. A lot of cities have audible tones and things like that designed so that people who are blind or low vision get an audio signal about the crosswalk status. But a device like this just seems to enhance that, as a back-up and as more information than you would normally get from a couple of beeps and signals. So I thought that was pretty cool. Another thing is that you could train it yourself. If you're going to the grocery store to get a can of black beans, you can train it to identify your favorite brand of black beans as opposed to the off-brand that you don't want to buy, that kind of thing.
Mark: Well, here's the thing. When we look at something like Google Glass, Google Glass is an awesome bit of tech. But right now, from a research and development standpoint, from a marketing standpoint, from a features standpoint, it's all-encompassing. And I think it's important, if we look at anything out there in the world – like when phones were being developed, and tablets, and PDAs. Remember PDAs? I had a Dell Axim, and it was this big, thick PDA that essentially looked like what a phone looks like now, but thicker and clunkier. Those things developed in their own silos, and then eventually the world brought them together. So while Google Glass is really cool, it's very important from a developmental standpoint to have these silos that are focused on specific things, like the OrCam is focused on aiding people specifically with a vision disability. I think it's really cool that this thing learns. We always run into this issue of the database, you know what I mean? If it's not in the database, you'll really have trouble. Well, human beings don't have that issue because they learn. We may know what a handful of objects look like, but we go into a new environment that has new objects and we quickly learn and adapt. So I think that's important for something like this to really be usable. I completely agree with you. Remember, Mike, a couple of podcasts back when we were talking about that 2D refreshable 'printer' – essentially that's what it was?
Mike: Yeah, right.
Mark: So you could put an object underneath the cameras. It was like this little bar that had a set of cameras on it so it could capture the image in 3D. And then these columns – pins, really, more like rectangular columns – would pop up and recreate not a full 3D image, but something like a topographical image on a 2D surface. We started talking about the possibilities that would have for blind people if you could have a refreshing map of a room that took into account not only the layout of the room, but also how the furniture was laid out at any given time. A blind person, before entering that room, could run their hand over it and really have an understanding of the layout they were about to walk into. The buffet is in the left-hand corner of the room, the tables you can eat at are in the center of the room, and the bar is at the far right corner, you know what I mean? So they'd have so much information walking in that might be difficult for them to get otherwise. We speculated that if a blind person could walk into, say, a bathroom and scan the room with a device similar to this – now imagine this thing in conjunction with that 2D refreshable device – all of a sudden, he knows right where the sink is, right where the soap is, right where the stalls are, whatever he's looking for. You have to imagine that one of the frustrating experiences for a blind person is relying on somebody else to go, "Hey, if you're looking for this, it's over there. If you're looking for that, it's over here" all the time.
Mike: Right, yeah. And of course I think of renovations. What if you get used to something being a certain way – say, the coffee area – and you're gone on a vacation, and they've renovated it by the time you come back?
Mark: Right, when you come back, yeah.
Mike: Of course I joke with my blind friends that when they come to visit my house, I’ll keep moving the furniture around...
Mark: Keep them on their toes.
Mike: Keep them on their toes. But yeah, it's a good point. When it comes to mobility training and things like that, they're taught not to get in the habit of being too specific – like counting the number of stairs in a place – because things can change. You go into a different building, or you forget exactly which building you're in and how many steps there are. That can be kind of embarrassing. So it's more reliable to go by your go-to navigation aids: the cane, the handrails, the walls, things like that.
Mark: I read about this a long, long time ago, Mike, in a book on neuroplasticity. I don't know if we've discussed that on this podcast.
Mike: We haven't. I've got to say, that's got to be some sort of a record, because we've now said the word neuroplasticity, I guess, five or six times on the podcast. I don't think there are too many podcasts...
Mark: What's really amazing is that either one of us has the cognitive ability to even remotely understand something like that. I think it's an important area of science because it's the brain science behind the adaptability it takes to learn something visual without the benefit of sight, or to learn something auditory without the benefit of hearing. That's what neuroplasticity says. It's like, "Hey, yeah, we've got specific places in our brain for specific things, but when push comes to shove, we can modify other places in our brains to do the same functions." That's probably the worst and most cryptic explanation ever, but that's essentially how I see it. But in one of these tests that they did to sort of prove – you can't see the air quotes that I'm doing here – to demonstrate neuroplasticity...
Mike: I can hear the air quotes.
Mark: You can’t see the air quotes?
Mike: I can hear the air quotes.
Mark: Oh, you can hear them. That's right, it's a podcast. You can't see stuff on a podcast. But anyway, they actually put a device across a blind person's back that had little pins that popped out and could give a kind of 2D equivalent of an image. I'm thinking of that refreshable 2D device we were talking about – it was sort of that, but stuck to the back. And then out of these pins, a camera would create an image. So if they were little LEDs or something, you and I would see an image we would recognize, but these were pins that the blind person could feel on their back, created by this camera. They had great success with the blind person being able to react to that image the way a sighted person would react to something they see. I think it was a pretty crude device, but with this OrCam coming down the pike along with a technology like that 2D refreshable technology, I think we're going to see these things combine, and that's where we're going to see assistive technology for vision impairment sit.
Mike: Yeah. And yes, of course, just like with everything else, it's going to start out being ridiculously expensive, and then prices are going to come down over the years as more people get into the marketplace and more people find uses for it. It's the kind of device that's not just for someone who is blind or really low vision. It's for somebody who's maybe just aging and can't see as clearly. They even make a case for it on their website – I think I saw something about people with dyslexia or memory loss benefiting from it. And I can kind of see that.
Mark: But you said it was going to start off expensive, Mike. This thing is only $2500. That’s like you made that today already, right?
Mike: Whatever. That’s pretty expensive. The Google Glass...
Mark: Yeah, you know, any of this stuff starts off in the stratosphere when it comes to cost, and then the costs come down. It's just the nature of all the R&D that goes into a product before it really hits the market.
Mike: Well, it's just like all the rest of this stuff. When cellphones were only carried around by the super-elite docs and lawyers, you very rarely saw someone with a cell phone. The technology existed for many, many years before they became really available at retail.
Mark: Yeah, accessible to the average person, right.
Mike: I mean, the technology existed even before the rich people got them.
Mark: Yeah, but look, $2,500 sounds like a lot of money, but if my vision started going, I'd go pay $2,500 for LASIK surgery if that was going to solve the problem.
Mike: No doubt.
Mark: How many people fork that over for that kind of thing anyway? And I understand that doesn't make it accessible to everyone, but it is a start. You know what I mean?
Mike: I hear you. I hear you, but there are a whole lot of people who are very, very – they're having such a difficult time making ends meet as it is. It's going to be hard for them to get...
Mark: This could be the difference between having to schedule time with a friend or a family member to help you shop at the grocery store and, "Hey, I can take the bus there myself now and do my own grocery shopping. And guess what? It takes me the time it takes everybody else, because I can do it on my own." I mean, it's a pain. We don't necessarily stop to think about it, but when you have to involve other people in everyday functions, it's a pain. Grocery shopping is a pain to begin with, never mind adding to it.
Mike: No, you’re right.
Mark: You know? But I think you're right. I think the prices will come down, and I think you'll also see this technology, especially from a software standpoint, start to blend into broader products like Google Glass. It'll just be an opportunity for Google Glass to have an accessibility feature – really, that's what we're talking about, right? A person can't necessarily see well enough to use the HUD aspects of it, so they use the auditory aspect instead, and guess what? The camera is now really good at identifying objects because these guys at OrCam perfected it and turned the software over, you know?
Mark: Alright! Anything else to add, Mike?
Mike: I don’t think so. I think we’ve covered this one pretty well although I do want to add that I still cannot actually say Google Glass very clearly. I think that’s a poor naming choice.
Mark: A little bit of a tongue twister?
Mike: It is! There are so many Gs and Ls all running together, which is kind of funny, of course, because my last name is all Gs and Ls.
Mark: I was going to say, too bad your name isn't George. It'd be like: George Guill got his Google Glasses gunked up. Say that ten times fast. Alright! Well, thanks again, Mike. We'll talk to you next week.
Mike: Yeah! Thanks, Mark. It was a good one.
Mark: Alright! Well this is Mark Miller...
Mike: ...and this is Mike Guill...
Mark: ...reminding you to keep it accessible.
Announcer: The IAP - Interactive Accessibility Podcast brought to you by Interactive Accessibility, the accessibility experts. You can find their Access Matters Blog at Interactiveaccessibility.com/blog