ATU249 – Orcam, Free Web Accessibility Webinar, Wearable Robot Arms for Drummers, Microsoft’s Accessibility Plans for 2016


Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Show Notes: Orcam | Steven Goldstein | www.orcam.com
A11y- Web Accessibility for Developers – Assistive Technology at Easter Seals Crossroads http://buff.ly/1QhntnW
Wearable Robot Transforms Musicians into Three-Armed Drummers http://buff.ly/21kg69J
Microsoft are stepping up their accessibility efforts considerably http://buff.ly/21uAjX6

App: Read and Write | www.BridgingApps.org
——————————
Listen 24/7 at www.AssistiveTechnologyRadio.com
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

——-transcript follows ——

STEVEN GOLDSTEIN: Hi, this is Steven Goldstein. I’m the area sales manager with Orcam Technologies, and this is the Assistive Technology Update.

WADE WINGLER: Hi, this is Wade Wingler with the INDATA Project at Easter Seals crossroads in Indiana with your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Welcome to episode number 249 of Assistive Technology Update. It’s scheduled to be released on March 4 of 2016.

***

Today I have an in-studio guest named Steven Goldstein who shows us a pretty cool technology that rides on your glasses and is designed to help people who are blind or visually impaired get more information about the world around them. It’s called Orcam, and we actually plugged the device right into the studio soundboard and got some direct feedback from the device.

***

We’ve got an announcement for a free web accessibility training that we are hosting; a story about a wearable robot arm that transforms musicians into three-armed drummers; also some information about what Microsoft is doing in terms of stepping up their accessibility efforts; and our friends over at BridgingApps tell us about the app called Read and Write.

***

We hope you’ll check out our website at www.eastersealstech.com, shoot us a note on Twitter @INDATAProject, or call our listener line and give us some feedback. That number is 317-721-7124.

***

It’s that time of year again. We are going to be hosting our webinar, Web Accessibility for Developers. It will be almost a full day of training, five hours of content. It will be on May 11, 2016, from 11 AM to 4 PM Eastern time. We are excited to have Dennis Lembree as our instructor to teach people who develop web content how to do so in a way that works well for users of assistive technology. The webinar is going to begin with a background on disability guidelines and law, and then jump into a bunch of techniques for designing and developing accessible websites. It will include basic through advanced levels, and he will talk about all kinds of things like content structure, images, forms, tables, CSS, and ARIA, and cover techniques for writing for accessibility and testing for accessibility. There is no cost to attend, but we do need you to register in advance. If you head on over to www.eastersealstech.com/A11Y, you will find the registration form right there, or at least a link to it, and you can sign up for that free webinar that we are hosting on May 11. I will also pop a link in the show notes in case that URL was just too much to remember. We would love to see you on May 11 if you are a web developer and interested in web accessibility.

***

[Drumming]

SPEAKER: We believe that if you augment humans with technology, humans should be able to or would be able to do much more.

WADE WINGLER: What you’re hearing there is a clip from a Georgia Tech blog post. The headline reads, “Wearable Robot Transforms Musicians Into Three-Armed Drummers.” Okay, so that totally grabbed my attention. The video is pretty interesting. It shows a prototype unit on a two-armed, able-bodied drummer who is drumming, and he is using sort of shoulder gestures and different kinds of physical movement to control where that robotic third arm is going, whether it’s riding the ride cymbal on the drum set or reaching over and hitting the tom drum at the right-hand side of his drum set. According to the article, the researchers there are working on making this a smart arm so that it learns behaviors, reacts to the drummer’s actual movements, and kind of learns to drum along like the drummer is doing with his normal two arms.

Obviously this isn’t an assistive technology story right now, but as we talk about robotic prostheses and some of the things that can be done in that situation, I think it certainly might apply to individuals who have had arms amputated and other kinds of things that create mobility challenges. A little bit futuristic, a little bit interesting, maybe even a little bit creepy if you watch the video. You’ll have to check it out. I’ll pop a link in the show notes, and you can watch the video and learn about the work they are doing at Georgia Tech to create robotic prosthetic arms to play the drums.

***

Marco Zehe over at Marco’s Accessibility Blog has noted that Microsoft seems to be stepping up their accessibility efforts a lot here in the recent past. He talks about the fact that Microsoft’s chief accessibility officer, Jenny Lay-Flurrie, is working with Microsoft CEO Satya Nadella on their reaffirmation of accessibility and the importance of Microsoft products being accessible for users of assistive technology. He mentions that Jenny Lay-Flurrie has created a roadmap for upcoming accessibility efforts and also talks about the fact that they have just very recently reaffirmed their commitment to Office 365 and expanding the accessibility there. In fact, he links over to a Microsoft blog post written by the chief accessibility officer that talks about some of the things they’re doing in this accessibility roadmap. They talk about the fact that in 2016 they want to make sure that users have access to the Windows Start menu, lock screen, Settings, Cortana, the Music and Store apps, videos, and other kinds of stuff. They’re talking about making Microsoft Edge more accessible. They are talking about Windows 10 Mail and how screen reader support will be improved in that application. He spends a lot of time talking about Office 365 and some of the ways it has been made more accessible on computers as well as mobile devices. He even talks about specific things like making high contrast mode work better in apps, and some changes in OneNote that make things a little bit easier for folks who have learning disabilities or dyslexia.

In the show notes I will link over to Marco’s blog post. There are also a couple more links over to the Microsoft documents that really do illustrate some of the changes that Microsoft seems to be making now and gearing up for later in 2016 to make sure that accessibility is good. I encourage you to check out Marco’s blog and follow the Microsoft blogs that are covering these topics as well. Again, I will have a link in our show notes.

http://buff.ly/21uAjX6

Each week, one of our partners tells us what’s happening in the ever-changing world of apps, so here’s an App Worth Mentioning.

AMY BARRY: This is Amy Barry with BridgingApps, and this is an App Worth Mentioning. In this segment, I am sharing the Read and Write for iPad app by Texthelp. The Read and Write app is also available for Android. This app is an alternative keyboard and an incredibly useful toolbar. The app is designed for maximum literacy support by allowing students to read and complete literacy assignments independently. Typing on a device while using the app assists with word prediction, spell check, and dictionary lookup, and it is very helpful with writing assignments, documents, email, and more. The text-to-speech and speech-to-text features can be really helpful for students who may struggle with written expression.

The Read and Write app is an extension of Texthelp’s literacy software, and we highly recommend using Read and Write as a Google extension and add-on for Google Apps for Education. The Read and Write app and its software are useful for learners and those diagnosed with dyslexia, autism spectrum disorder, or ADD, and they are also very helpful for typically developing students and adults. We really like the simplified toolbar. It cleans up the screen, eliminating a lot of junk that often distracts users.

The Read and Write for iPad app is available to trial for free for 30 days at the iTunes and Google Play stores, and it works on iOS and Android devices. The trial version of Read and Write for iPad supports all of the features described for 30 days; after the trial period, text-to-speech remains free forever, while some of the premium features are available via an in-app purchase. For more information on this app and others like it, visit BridgingApps.org.

***

WADE WINGLER: I love it when I have an in-studio guest. I’ve got one today. We actually met a gentleman named Steven Goldstein who is the area sales manager for Orcam Technologies. Frankly he just spent an hour or two with our staff showing us this cool new technology designed to help people who are blind or visually impaired get more information about the world around them. We’ve got the device here in the studio. Steven and I are going to have a conversation about this cool thing, Orcam. First of all, Steven, welcome to the show.

STEVEN GOLDSTEIN: Thank you.

WADE WINGLER: Tell me a little bit about yourself and the history of this product and kind of how you got to the point where we are here today having this conversation.

STEVEN GOLDSTEIN: I’ve been in the medical device business for years and was introduced to Orcam about three years ago. They really weren’t at the point of bringing folks on, and then I stayed in touch. Eventually they were. Orcam has a little office in New York City. They are an Israeli company. Orcam is designed to assist the visually impaired with reading, recognizing products or objects, as well as recognizing faces. It is a wearable, accessible OCR, optical character recognition, technology. The neat part about it is you can wear it and take it with you just like a cell phone or any other type of device like that. It’s worn by clipping onto your glasses.

WADE WINGLER: As we were messing around with the product in the lab, it was just a regular pair of glasses with a sort of low-profile, unobtrusive device on them. It was kind of cool. Are we talking about people who have low vision, or who are totally blind, or a little bit of both? Who are we talking about in terms of the customer?

STEVEN GOLDSTEIN: They are both our customers potentially. We have provided or sold products to both. The answer is yes.

WADE WINGLER: So why do we need a product like this? Why does this thing need to exist?

STEVEN GOLDSTEIN: Some of the challenges of some of the other assistive technology devices, particularly ones that are OCR, are that they are stationary or they are a little cumbersome to work with —

ORCAM: Suspended.

STEVEN GOLDSTEIN: That’s our device talking to us. There is a place for a device that you can bring to work and bring home, so you don’t have to have two devices. There is a place for a device you can bring to the restaurant. It is the most natural because you’re wearing the device. Again, this is a featherweight camera that sits up on your glasses. You’re wearing it up there, so it is essentially seeing whatever your nose is pointed at. In that way, there are some intuitive aspects to it. That’s why we call it an intuitive device.

WADE WINGLER: I think that it is, based on the demo. I love the fact that Orcam started talking here just a few seconds ago. It obviously wants some attention. Why don’t we do that? Why don’t we do a little bit of a demo? Let’s get the device out. Maybe you can talk through what we are going to do and how it will work. We have it plugged into our board here so that when it talks we will hear the speech output —

ORCAM: Waking up. Battery is 90 percent charged.

STEVEN GOLDSTEIN: I’m putting the device on and trying to wedge it between the headphones here. Let’s see if I can do that. I’m wearing a pair of glasses with the Orcam on it, and I can look at our brochure and turn my head sideways. I’ll take an image of our brochure by just using a tiny trigger button on our base unit —

ORCAM: Volume down.

STEVEN GOLDSTEIN: Sorry about that.

ORCAM: [Camera Click] Easy to use. All you have to do is point. Orcam responds to simple gestures, making it easy to use, whether it is to read, find an item, or recognize a face or product. No need to search for audiobooks, learn new software, or use other tools. Reading next text block. O, Orcam. Reading next text block. The Orcam device can read printed text in real time. You can read newspapers.

STEVEN GOLDSTEIN: So I’ll pause it or stop it there. As it sees text blocks separated by white space, it informs you that it is moving on to the next one. It sees our Orcam logo as an O, so that’s why it sometimes says “O, Orcam.” It does not read logos; it is not a picture recognizer. You can teach it to recognize products and faces, but in this case, since it is reading text, it saw the logo as an O.

WADE WINGLER: So I will back us up a little bit because we are in a radio format. We did a YouTube video on this, so our listeners should check out our website and see that YouTube video as well. Basically, you put on a pair of glasses, and it is connected to a small hand control that looks a little bit bigger than a deck of cards or something like that. It has some wires so that it is connected from the glasses to the thing in your hand. Then you simply held up the brochure, looked at it, and clicked the trigger, and it started reading right away. It looks like some of the other OCR technologies I’ve seen, except it was all wearable.

STEVEN GOLDSTEIN: Correct. The base unit is designed to have basically three buttons. There is the trigger button, which makes the camera take an image. It has a volume up/down, which is a toggle switch. I will call that one button even though it has an indent for volume down and a bump for volume up. It’s got an off/on button. So it is really three buttons. If you compare that to what a cell phone might have, it is a lot less complex. The trigger button will prompt the camera to take an image. With a document in front of your face, it will read as a result of taking that image. There is a second way that you can take an image, and that is if you use a pointing gesture. The camera actually will respond to an upward pointing finger. It actually sees your nail bed. Wherever you are pointing, it will take an image. If there is text behind it, it will begin reading that text as well.

WADE WINGLER: It is hard to do that actual gesture here with a big microphone stuck in front of your face, but it was fascinating when I saw you do this, because you simply took your index finger, touched the tip of your nose, and then pointed right at whatever you wanted it to read, whether it was a sign on the wall or a brochure or even a candy bar. It followed your finger and sort of knew what to take a picture of and read the text from that.

STEVEN GOLDSTEIN: One of the issues for folks who are visually impaired is the orientation between where they are looking, or where their head is facing, and where the document is that they want to read, or where the product is that they want to look at or memorize or teach. So oftentimes, what we suggest during the teaching phase of Orcam is that they launch their pointing finger from their nose. That way they have the best chance of actually pointing right at the document. In fact, even for holding documents, we recommend that they sometimes actually touch the document right to their nose and move it out to about 10 to 12 inches, or sometimes halfway to being a straight arm. That way they know it is pretty much in front of them and is at the right distance.

ORCAM: Suspended.

STEVEN GOLDSTEIN: So I will wake up the device, as it puts itself to sleep.

ORCAM: Waking up. Battery is 90 percent charged.

STEVEN GOLDSTEIN: I will turn my head and try a pointing gesture here.

ORCAM: [Camera click] Welcome to Orcam. We are pleased to present you with your new Orcam device. Orcam is very intuitive. All you have to do is point. See for yourself. [Bell Ring]

STEVEN GOLDSTEIN: I just pointed at the back of our instruction booklet, which has a nice section to read; we can introduce new users by using that back page. That last ding you heard just lets the wearer know that it is done reading.

WADE WINGLER: So we have seen it read a whole lot of text, but it also does product and facial recognition. Can you tell me a little bit about that?

STEVEN GOLDSTEIN: It can go into a learning phase. Say you hold a product in front of it that, for instance, might be too graphic in its print. The device doesn’t really recognize graphics or images. It is not going to read print that is heavily italicized or scripted, and it’s not going to read handwriting. So if you have a product that has graphic print on it, you can force the unit to go into a product learning phase. By holding the trigger button down for about three seconds, it goes into a product learning phase that allows you to educate it on the product. I’ve done so with a product here in front of me, and I will point at it right now.

ORCAM: [Camera click] [Voice Recording of Steven]: Nature Valley protein bar. [Bell Ring]

STEVEN GOLDSTEIN: So that is my lunch. That’s a Nature Valley protein bar. I carry these with me all the time. As you could hear, that is my voice. By putting it into the product learning phase, it allows me to say the name of it. And once again, when I look at it and point at it, it will recognize it and play back my own voice. So there is a microphone within our system.

WADE WINGLER: And it can learn multiple products like that, so with your regular grocery list or the things that are in your pantry, it is easy to add a number of those, right?

STEVEN GOLDSTEIN: Right. So right now we have the ability, I believe, and I hope I get this right, 150 products and 100 faces. That is what our capacity is at the moment, or any mix of the above.

WADE WINGLER: So tell me a little bit about faces. You mentioned that as well.

STEVEN GOLDSTEIN: The unit itself has facial recognition software built in. So if you were to stand in front of me and I was right in front of you, I could take a picture and it would indicate to me that one person was in front of me. It sees two eyes, a nose, and a mouth. If I put it into the learning mode, where I hold the trigger button down for about three seconds, it will ask me to say the name of the person in front of me. Just like I named the protein bar, I would say your name. It would record me, and every time I looked at you, it would know who it is and would play it back.

WADE WINGLER: That is super helpful, especially in an environment where you are seeing the same faces over and over again but you might not exactly know who it is just based on the context of the conversation.

STEVEN GOLDSTEIN: Correct. You can even set it up to say the face automatically, or set it to manual the way I have it set up. So there are settings within the device that allow it to automatically say the faces it sees or to do it on a manual basis.

WADE WINGLER: There are products that do optical character recognition, and I have sort of seen bits and pieces of this kind of technology throughout the years I’ve been in the field. Tell me a little bit about why Orcam is different from the other products that are addressing some of the same or similar issues.

STEVEN GOLDSTEIN: So there are a few reasons. First of all, all of those other products play a role. We don’t see them as necessarily competitors or anything. I see this more as a continuum. There are products that do certain things and don’t do other things. It’s important that we share with our potential users what we can do and what we can’t do. So the neat thing about the Orcam is that it is wearable and is accessible. It’s intuitive because you are essentially “seeing,” finger quotes, what it is you’re looking at, because the camera is pointing where your nose is pointing. It is wearable in that you can bring it with you to work. You can bring it home, so you don’t need multiple devices in multiple places. You can bring it to a restaurant. You can use it in your easy chair. You can use it in your bedroom. All of these things are sometimes limitations with other devices. Another aspect is that many devices out there will take advantage of whatever vision is —

ORCAM: Suspended.

STEVEN GOLDSTEIN: There we go again.

WADE WINGLER: Saving that battery life.

STEVEN GOLDSTEIN: Exactly. It takes advantage of whatever vision is left to an individual by maximizing that vision in some form or fashion. The Orcam, since it is not really affecting what you’re seeing, can work with you right along the pathway if you’re going through vision loss and it is becoming more pronounced over time. It is never going to be obsolete to you, whereas those other devices will play a role for a period of time and then they may not play a role. In that regard, it is sort of unique. The other neat part about the Orcam is that the software and upgrades that are coming are only going to make the product more robust. Right now the product recognizes currency by itself; that is already built in. And I can teach it products like my protein bar, but in the future they expect the software to already carry about 100,000 products so that you won’t necessarily have to teach it. From what I understand, a pause button is also on the to-do list. We don’t have that at the moment.

WADE WINGLER: For reading text for example?

STEVEN GOLDSTEIN: Exactly. So if you have a phone call and you want to answer it, you could pause what was going on. Even though right now it will often recognize the text once again and will even inform you that it is starting from where it left off, it is still not a real pause button at the moment.

WADE WINGLER: Talk to me a little bit about the technical details of the product, the size, the weight, the battery life, camera, those kinds of things.

STEVEN GOLDSTEIN: The actual head unit, the thing you clip on to your glasses, is featherweight. You would probably have to measure it in grams, so it is almost weightless. The actual device that you carry with you, put on your belt or put in your pocket or use in a carrying pouch, I should know how much that weighs, but I actually don’t. It weighs about the same as my cell phone, give or take. It is shaped a little bit differently, but it weighs about the same as my cell phone. So for those folks who carry a cell phone, that’s what it weighs.

WADE WINGLER: Battery life and camera?

STEVEN GOLDSTEIN: The battery life is about four hours of continuous use. For all-day intermittent use, it should last you the full day. But if it doesn’t, you can always plug it in and use it while the charger is plugged in. The camera itself is an eight-megapixel camera. If you think about high definition, that is one megapixel. So it is a really powerful camera, and it is tiny, about the diameter of a pencil.

WADE WINGLER: Again, I am looking at it right now on the side of your glasses, and it is amazing how small it is and the amount of power that comes out of that camera. Talk to me a little bit about the learning curve. You make it look easy, but does it take a while to get used to it? Who does the training? How do people get this device?

STEVEN GOLDSTEIN: I’ll answer it backwards. The device is acquired directly through Orcam, though we do have folks we work with who assist with that or will do trainings for us. Generally, following the curriculum that we have, training for a user of Orcam should be somewhere between 90 minutes and three hours. If it is a young person who has a familiarity with assistive technology or a smartphone or anything like that, they are going to take to this like a duck to water. If it is an older person who hasn’t really been using a smartphone or maybe isn’t computer savvy, it may take a little bit longer for them to understand the menu and other aspects about it. Certainly that second case is not a barrier at all.

WADE WINGLER: We have about a minute left in the interview, and I want to make sure that we know what the pricing is on the device and then how can people learn more, contact information, that kind of stuff.

STEVEN GOLDSTEIN: So for the pricing, we have two products. We have a product that is full-featured: it has all of the product, facial, and currency recognition, the ability to be taught other products, and the reading ability, and it will read pretty much any text, just not handwriting. That is what we call the Orcam My Eye. That is $3,500. And then we have a product that is just a reader and doesn’t have those product and currency recognition qualities. That is $2,500. But with a quick SD card exchange, that reader can become an Orcam My Eye. We call it the Orcam My Reader.

WADE WINGLER: So if you start with the less featured model and change your mind, then you can pay for an upgrade, and it is a software upgrade as opposed to a whole new unit?

STEVEN GOLDSTEIN: Right. It is a chip swap, very easy to do.

WADE WINGLER: That’s great. Contact information if people want to reach out to you, learn more, what would you recommend?

STEVEN GOLDSTEIN: They can certainly go to the website of www.orcam.com, or they can call our United States office. The number is 800-713-3741.

WADE WINGLER: Steven Goldstein is the area sales manager for Orcam Technologies and has blown my mind this morning with this amazing technology here in our labs. Steven, thank you so much for hanging out with us today.

STEVEN GOLDSTEIN: My pleasure.

WADE WINGLER: Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? Call our listener line at 317-721-7124. Looking for show notes from today’s show? Head on over to EasterSealstech.com. Shoot us a note on Twitter @INDATAProject, or check us out on Facebook. That was your Assistive Technology Update. I’m Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana.

 
