ATU523 – Transcribe Glass with Madhav Lavakare


Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Special Guest:
Madhav Lavakare – Creator of Transcribe Glass
Comcast Accessibility Story: https://bit.ly/3oVOAhI
Earswitch Story: https://bit.ly/3hUxJut
Tech for Good Canada Story: https://bit.ly/2RcRkLq
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
—————- Transcript Starts Here ———————
Madhav Lavakare:
Hi, this is Madhav Lavakare, and I’m the Founder of TranscribeGlass, and this is your Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads, in beautiful Indianapolis, Indiana. Welcome to episode 523 of Assistive Technology Update. It’s scheduled to be released on June 4th, 2021. On today’s show, we’re super excited to have Madhav Lavakare on to talk all about TranscribeGlass. We also have stories about Comcast accessibility, a new ear switch, and a Tech For Good Program in Canada. We thank you so much for listening. Let’s go ahead and get on with the show.

Josh Anderson:
One of the cool things that we get to do here at the INDATA Project, besides putting on these great podcasts that you all enjoy listening to, is full-day trainings. Now, of course, with the pandemic and some other things, all these trainings have been moved online, at least for this year and maybe the foreseeable future, but this has also afforded us the opportunity to really be able to present to folks in a lot of different places, not just those that can make it here to central Indiana. So if there are any listeners out there who are interested in attending any of our full-day free trainings, I want to make sure that you know how to do that. So what I’ll do is I’ll put a link in the show notes to Eastersealstech.com/fulldaytraining, and this will bring you to a list of all of our trainings.

Josh Anderson:
Now, our next one is coming up on June 10th, so coming up here pretty darn soon, and that training is Mobile Apps for Everyday Life. I’m really excited about this one, mostly because it’s put on by my clinical team, so I’m always excited to hear them present. They’re all wonderful presenters, and they do a great job, both live and over Zoom, which is the way we’re doing pretty much all trainings these days, it seems. If you attend one of these trainings, you can get CEUs, so if you need those for any sort of certification or anything like that, you can pick those up for attending the training. I do believe that registration is open for that training on June 10th, so make sure to go ahead and register as soon as possible. For other ones coming up, we have Assistive Technology and Remote Supports for Independent Living on August 12th, as well as Assistive Technology and Social Isolation on November 4th. Sometime later this year, we’ll also have a list of events coming out for next year, 2022, which somehow is already on the horizon.

Josh Anderson:
So I’ll put a link to that page over in the show notes, so that you can go check it out, register for future trainings, and just see what’s coming up with the full-day trainings here at the INDATA Project. After all these months of lockdown, maybe you’re looking for some new podcasts to listen to. Well, make sure to check out our sister podcasts, Accessibility Minute and ATFAQ, or Assistive Technology Frequently Asked Questions. If you’re super busy and don’t have time to listen to a full podcast, be sure to check out Accessibility Minute, our one-minute-long podcast that gives you just a little taste of something assistive technology based, so that you’re able to get your assistive technology fix without taking up the whole day. Hosted by Tracy Castillo, the show comes out weekly. Our other show is Assistive Technology Frequently Asked Questions, or ATFAQ, where Brian Norton leads our panel of experts, including myself, Belva Smith, and our own Tracy Castillo, as we try to answer your assistive technology questions.

Josh Anderson:
This show does rely on you, so we’re always looking for new questions, comments, or even your answers on assistive technology questions. So remember if you’re looking for more assistive technology podcasts to check out, you can check out our sister shows, Accessibility Minute and ATFAQ, wherever you get your podcasts, now including Spotify and Amazon Music.

Josh Anderson:
Our first story today comes to us from Forbes. It’s written by Steven Aquino, and it’s called Comcast Accessibility Chief Tom Wlodkowski Talks How the Cable Giant Hopes to Make a Giant Impact for Customers with Disabilities. Now, we’ve actually had folks from Comcast on the show before to talk a little bit about their accessibility and some of the things they have, but just for folks who haven’t heard that, I thought I would put a link to this story over in the show notes. They can check out some things that Comcast does in order to make things a little bit more accessible for their customers.

Josh Anderson:
So the individual being interviewed here is Tom Wlodkowski, and he’s Comcast’s Vice President of Accessibility and Assistive Technologies and runs the company’s accessibility lab. He also happens to be blind himself, so he actually can kind of test and see how these things actually help folks access their television, their cable box, and everything else. It gets into a little bit of the laws and the different things that have been put in place to kind of help folks with disabilities access different technologies, but then it really gets down into their approach for building accessibility. It says they take a customer-first approach. They’ve added a screen reader to their guide, so that you can have that information read to you and actually be able to access it. They’ve also built in an Alexa-like voice remote, so that you don’t actually have to try to find the buttons, you can just tell it what you want to do, and it can take you and do those things.

Josh Anderson:
It says they’ve also added a dedicated call center for their deaf and hard of hearing customers, which employs representatives who speak American Sign Language. And Comcast is also working with different broadcast networks, especially NBC, which is owned by Comcast, to provide audio descriptions of traditionally televised events, such as the Macy’s Thanksgiving Day Parade in New York City. The story also gets into a lot of the different things that we talk about on this show, like ableism, and how building these tools in to help individuals with disabilities ends up benefiting everyone and really makes everything more accessible for all individuals, not just individuals with disabilities. So I’ll put a link to this over in the show notes, but just a couple of things that Comcast is working on to make sure that everyone can access their technology regardless of ability.

Josh Anderson:
Our next story comes to us from newatlas.com. It’s titled Assistive Tech May Let Locked-In Users Communicate via “Ear-Clicks,” and it’s written by Ben Coxworth. So I do want all listeners to stop for just a moment and try to move your ear’s tensor tympani muscle. Go ahead. I’ll give you a minute. If you’re listening to this on headphones, that might be a little bit harder, especially because it’s one of the tiniest muscles in your body, but it’s right there in the ear, so conceivably, it could still be usable by people who may have lost control over their other muscles. So this story talks about something called an ear switch, and that’s not its technical name, but just kind of what it’s called. The technology is being developed at Britain’s University of Bath, and it’s really designed first and foremost for users who are locked in, and they kind of say that that means that they’re paralyzed and unable to speak.

Josh Anderson:
Sometimes, some people really don’t like that term, but really it means individuals who may have had a stroke or are in the late stages of motor neuron disease; really, it’s individuals who aren’t able to access technology in other modes. So if we think of other muscle movements, eye gaze, eye movements, or these other kinds of things, it’s thought that perhaps, as this one muscle is so small, it could still be controlled when those other access methods are not available. Now, the whole time I’ve been talking, I’ve been sitting here trying to tense mine, and I must admit, I’m not even positive that I can completely find it, but what this does is it uses an earpiece camera that can detect this tiny little bit of movement, and then it will trigger the computer to select different lines of keys. Other than that, it uses basically a kind of simple scanning.

Josh Anderson:
So, a row containing the desired letter is highlighted, the user selects it by tensing this muscle, the earpiece detects that and selects that line of keys, and then, of course, the highlights change sequentially, so it’s kind of a scan-and-click kind of thing. In that way, the individual could type out messages if they’re not able to use switches or sip-and-puffs or eye gaze or any of the other kinds of access methods that folks use today. I will put a link to this over in our show notes. It’s definitely still really being developed, but really anything that opens up that accessibility or gives us another access method can open up the world to a whole other group of individuals. And for folks who maybe can move some other muscles, maybe their eyes, maybe a sip-and-puff or something, it can give them another way to access the world around them, a backup way or a way to more easily interact with things.
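For readers following along with the transcript, the row-then-key scanning scheme described above can be sketched in a few lines of code. This is an illustrative sketch only, not the University of Bath implementation: the row layout and the way switch activations are represented are assumptions for the example.

```python
# A single-switch row/column scanning keyboard, as described in the
# ear-switch story: rows of keys are highlighted one after another; a
# switch activation (here, a detected muscle twitch) selects the
# current row, then the highlight steps through that row's keys and a
# second activation selects the letter.

ROWS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ_ .,"]  # hypothetical layout

def scan_select(switch_events):
    """Turn a sequence of scan-step counts into typed characters.

    `switch_events` is a list of integers: how many highlight steps
    pass before each switch activation. Events come in pairs:
    (steps to the desired row, steps to the desired key in that row).
    """
    typed = []
    for i in range(0, len(switch_events) - 1, 2):
        row = ROWS[switch_events[i] % len(ROWS)]       # first click picks a row
        typed.append(row[switch_events[i + 1] % len(row)])  # second picks a key
    return "".join(typed)

# Wait 1 row-step, click (row "GHIJKL"), wait 1 key-step, click ("H"),
# then repeat with 2 key-steps to get "I".
print(scan_select([1, 1, 1, 2]))  # → HI
```

The trade-off with scanning access methods like this is speed versus effort: typing is slow, but each letter costs only two activations of a single muscle.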

Josh Anderson:
Again, pretty cool technology. Hopefully we’ll see something out of it in the future, but again, anything that opens up access for individuals is always good in our book.

Josh Anderson:
I found a story about our brothers and sisters up north in Canada titled TELUS Tech for Good Program Expands Across Canada to Help Connect and Empower Canadians with Disabilities. This is about a partnership between two different organizations: TELUS, that’s T-E-L-U-S, Communications Inc., and March of Dimes Canada. It says here that TELUS is kind of a leading communications company up there in Canada, and of course March of Dimes Canada is a nonprofit that helps individuals with disabilities. It says this is a national expansion of a program that was first launched in Alberta and British Columbia in 2018, and it’s there to support Canadians with disabilities who require professional assistance to independently use or control their mobile devices.

Josh Anderson:
It says the program can offer customized recommendations, support, and training on mobile devices and, based on the individual’s needs, the assistive technology required for individuals with disabilities to use those mobile devices. The program, called Tech for Good, enables equitable access to mobile devices, empowering people with disabilities to live, work, and play in the digital world. It looks like a great program that kind of mimics some of the ones that we do have here in the States and probably around the world, but it’s really getting that out there and also giving folks the ability to not only access the technology, but also access professionals who can help them with the training, the implementation, the recommendations, and actually getting the right technology into folks’ hands. As we all know, as listeners of this show or AT professionals, there is no magic bullet, no one thing that can help everyone, so really finding that right piece that can help an individual is always a very important part of assistive technology and implementing such technology for folks.

Josh Anderson:
I’ll put a link to this over in the show notes, so you can find out a little bit more about it, but it sounds like a really great program. And we look forward to kind of hearing more about it and seeing how everything turns out.

Josh Anderson:
Having captions is a great accommodation for individuals with hearing impairments. We’re all used to having these on our TV programs, YouTube videos, and even at the movie theater, but what about during normal conversation? Well, our guest today is Madhav, and he’s here to tell us all about their solution called TranscribeGlass. Madhav, welcome to the show.

Madhav Lavakare:
Thank you very much for having me, Josh. It’s a pleasure.

Josh Anderson:
Yeah. I am really excited to kind of hear about this technology, but before we get into that, could you tell our listeners just a little bit about yourself?

Madhav Lavakare:
Yeah, absolutely. So I’m Madhav. I’m based out of New Delhi, which is the capital of India. I am 19 years old, turning 20 in just a couple of days.

Josh Anderson:
All right. Well, thank you for starting off the show by making me feel old, but that’s okay. We’ll keep on going. So Madhav, what is TranscribeGlass?

Madhav Lavakare:
So TranscribeGlass is an affordable, comfortable, wearable device that takes closed captions from any source and projects them in your field of vision. So it’s basically real-time, comfortable captioning glasses that you can use in any situation, and you can snap it onto any glasses frames, whether you wear prescription glasses or you don’t wear glasses at all.

Josh Anderson:
Nice. I got to ask, where did this idea come from?

Madhav Lavakare:
Yeah. So I actually came up with the idea from something that happened in my life. Four years ago, when I was in my junior year of high school, I had a friend who was hard of hearing, and one day I just didn’t see him at school. It turns out he had dropped out of school really suddenly, and that kind of took me aback. The next time I saw him, I asked him, well, what’s going on? And he was really unable to understand what teachers were saying. He was even unable to understand what his friends were saying. It was very difficult for me to communicate with him as well, and our school in India didn’t really have a lot of accommodation services, and there wasn’t anything that could really help him out, and so I asked him, okay, but what about hearing aids?

Madhav Lavakare:
And he said, no, those are really expensive, but even if I do get an expensive pair of hearing aids, they don’t help me in a lot of situations, especially when there’s a lot of background noise or there are multiple people speaking. So it’s just really not a good solution for me. Then I started looking at other solutions, and I thought about cochlear implants and asked him, well, what about a cochlear implant? And he said, well, A, that’s extremely expensive, and B, it’s just really intrusive, and there’s a surgery required, and I really don’t want to go through all of that. And so, fair enough. I kept looking at other solutions, and the last solution that I could think of was closed captions, and I asked him, well, what about using a mobile application where you can see speech-to-text in real time and you can see the transcription?

Madhav Lavakare:
Why don’t you use that? And he said, well, I’ve actually tried using an app, in fact, many apps like this before, but it’s really inconvenient, because I depend on lip reading. I look at the speaker, I need to look at visual cues, like hand gestures and facial expressions, and what she’s writing on the board. So I can’t be looking at a phone screen all the time, and it’s really inconvenient to have this back and forth between the phone and the speaker, and I just end up being very confused and not knowing what’s going on. So that’s when I kind of just had a eureka moment of, okay, why can’t we take closed captions and put them in your field of vision, so that you can look at everything that you need to look at, and you can also see a transcript of what’s being said in real time?

Josh Anderson:
Well, that’s awesome. And you bring up such a great point, because, yeah, I do know they have the accommodation where you can look at the phone, but you miss so much of the conversation, of the facial expressions and everything else that really makes up the whole communication. It’s something that so many of us who may not have a hearing impairment really take for granted. So I love that idea. Well, how does it work?

Madhav Lavakare:
So, TranscribeGlass, you can think of it as a second screen for your phone or your laptop, where you can look at captions. So how it works is there are a lot of different caption sources. I may use automatic speech recognition; I may use apps like Live Transcribe or Otter, and these apps usually use automatic speech recognition services that are usually provided by big companies like Google, Microsoft, Amazon. Then there’s also human captioning, which is CART services, Communication Access Real-time Translation, where there’s a real human with a steno machine, and they’re typing what’s going on in real time. It’s usually a bit more accurate and reliable. And then, of course, there are digital subtitle files for movies or YouTube videos. So what TranscribeGlass does is it gets the captions from any caption source, so we’re independent of caption source. You’re not just stuck with Google’s automatic speech recognition or something else; you can choose any caption source.

Madhav Lavakare:
And we take the captions and we stream them to our wearable device, and that projects the text as a floating image in your field of vision. When you snap it onto your glasses, you can basically look at someone who’s talking to you and position the captions next to their lips or on their forehead. We’re getting the captions from the caption source, and they’re being streamed in real time onto this augmented reality display.
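The caption-source independence Madhav describes can be sketched as a simple pipeline: any source that yields text segments feeds the same streaming loop. This is a hypothetical illustration, not TranscribeGlass’s actual API; the function names, the simplified subtitle parsing, and the 40-character display width are all assumptions for the example.

```python
# The design idea: the wearable display doesn't care where captions
# come from. An ASR service, a human CART captioner, or a subtitle
# file can each act as a source, as long as it yields text segments.

from typing import Iterable, Iterator

def subtitle_file_source(srt_text: str) -> Iterator[str]:
    """Yield caption lines from a (very simplified) SubRip-style file,
    skipping cue numbers, timestamp lines, and blank lines."""
    for line in srt_text.splitlines():
        line = line.strip()
        if line and not line.isdigit() and "-->" not in line:
            yield line

def stream_to_display(source: Iterable[str], max_chars: int = 40) -> list[str]:
    """Wrap each caption segment to the display width and return the
    frames that would be pushed to the glasses, in order."""
    frames = []
    for text in source:
        while text:
            frames.append(text[:max_chars])
            text = text[max_chars:]
    return frames

srt = "1\n00:00:01,000 --> 00:00:03,000\nHello, welcome to the show.\n"
print(stream_to_display(subtitle_file_source(srt)))
# → ['Hello, welcome to the show.']
```

Swapping in a live ASR feed would only mean providing a different generator; the display loop stays unchanged, which is the point of decoupling the captions from their source.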

Josh Anderson:
Oh, that’s awesome. And you kind of already brought this up, but can you change the position and the size of the caption? So if I maybe have a little bit of a visual impairment or something, can I change where they are and maybe how large they are?

Madhav Lavakare:
Yes. So you can change the size of the captions. We have five different font sizes. Of course, the smaller the font size, the more text you get, and the larger the font size, the less text, but you can change the font size, and you can position it wherever you want in your field of vision. Another thing, related to a point that you brought up: in the testing that we’ve done, we have realized that there are people who, for example, may have blind spots in one eye, or weaker vision in their left eye or their right eye. So with the product that we’re making, the aim is a symmetrical design, so that this one device can attach onto the right side of your glasses or onto the left side, whichever eye you feel more comfortable viewing it from.

Josh Anderson:
And you mentioned a lot of different kinds of transcription services, I mean, from CART to the ones that do it automatically, and I know that this has to connect to something. Is it going to be able to connect to both Android and iOS, or is it just one, or are you kind of working through all that?

Madhav Lavakare:
So we have quite a bit of functionality working on Android, and we’ve just started developing an iOS app as well, so we’re pretty confident that we’ll be able to have apps on both Android and iOS. You just need to open up the TranscribeGlass companion app on any mobile device, iOS or Android, set it up one time, and connect it to the device, and then after that, pretty much everything happens automatically. You can select whatever caption source you want and hit play, and it starts streaming to the device. You can keep your phone away. You can even use the app in the background, or use another app if you want. Basically, you use your phone as a remote control if you want to change any settings on the hardware device, but it kind of works fluently once you set it up.

Josh Anderson:
Excellent. And what stage of development are you currently in?

Madhav Lavakare:
So we’re at a late proof-of-concept stage, which means that we’ve built quite a few prototypes, actually. I say we, but it’s been pretty much me for the first three years. I’ve been fortunate enough to have a few more people working with me in the last year, but I’ve been able to build five different proof-of-concept prototypes. We’ve gotten a lot of feedback from deaf and hard of hearing people who’ve tried it out, and we’re now designing what we’re calling the beta version of TranscribeGlass, which is not going to be a perfect product, but it’s going to be something that we can ship out to real people who want to try it out early. That’s what we’re planning to do, and we plan to ship it out by late fall of this year, 2021.

Josh Anderson:
Madhav, how does this differ from maybe some other accommodations?

Madhav Lavakare:
Yeah. So when we talk about smart eyewear or headset accommodations that can give you closed captions, a lot of people might have tried out Epson smart glasses or Sony glasses, especially at the cinema and the movie halls, Regal and Broadway. But the problem with those, and we’ve spoken to a lot of people who’ve tried them out, is, one, they’re just really big and heavy, and people find them really uncomfortable. Even sitting through a show for a couple of hours ends up being really uncomfortable.

Madhav Lavakare:
Sometimes there’s eye strain and there are headaches, so we’re trying to fix the problems with those things. One of the things we’re focusing on with TranscribeGlass is comfort and wearability and convenience; it’s just going to work out of the box. To give some numbers to that, if you look at a lot of existing headsets or smart glasses, they weigh, and I’m going to tell you in metric terms, around 80 grams to 150 grams and above, and that is a lot of strain around your nose and around the back of your ears, especially if you’re wearing hearing aids. But the current prototype version of the product that we are building weighs just 11.1 grams, which is literally the weight of a plastic ballpoint pen.

Madhav Lavakare:
People aren’t even able to notice that they’re wearing something on their face. We also have something to reduce headache and eye strain, where you can actually adjust the focus of the text that you’re seeing, so it can be at a hundred feet or 500 feet or even five feet away from you, so that there’s no eye strain.

Josh Anderson:
Very cool. And I know you’re still in some of the early stages, but you’d mentioned making this an affordable accommodation. What is the price point that you’re aiming for?

Madhav Lavakare:
Yeah. That’s a great question. So, in India, which is where I came across this problem, another big problem is purchasing power, right? In fact, there was a study that showed that deaf families in India on average earn $250 to $300 a year, which is very, very low purchasing power. So with hearing aids and cochlear implants that cost thousands of dollars, there’s no way anybody’s going to be able to afford that. The target price for TranscribeGlass, and this is a flat price that we’re doing for the TranscribeGlass beta, is $55, which is 4,000 Indian rupees, less than even a very cheap smartphone, so it’s pretty affordable. That’s what we’re targeting for the beta version, and I’m kind of guaranteeing that price, and hopefully the final product will be able to be around the same price point.

Josh Anderson:
Oh, that’s excellent. So many people don’t take into consideration that if you make this great accommodation, but no one can afford it, then is it really an accommodation at all? So I’m glad that you did that research and actually thought about that at the very beginning. Now, I know that you’re just in the beta stages and getting these out to folks to get them in their hands, try them out, and see how they’re going. When is your goal to have your final product fully available? And I know that’s kind of a looking-into-the-future kind of thing, but do you have a goal or a date set when you’d really like to have it completely available for market?

Madhav Lavakare:
Yeah. So, I mean, I’d start off by saying hardware is hard, right? You hear that everywhere, and it takes a lot of time. It takes a lot of engineering. It takes a lot of refining and testing, and it’s not like software, where you can change a couple of lines of code and you’ve got a good product. So it’s a really long process, which is why, when I say I’ve been working on this for four years, people think, wow, you’re probably already selling a product. But no, we’re still in a beta stage. We’re planning to launch the beta by fall 2021, and then the optimistic estimate for when we can get out the first production version of TranscribeGlass would be the end of 2022 or early 2023, and that’s optimistic. So to be safe, I’d probably say Q2 or Q3 of 2023.

Josh Anderson:
Well, and with something like this, I mean, you do have to take your time, because folks are going to be relying on this for communication and for so many different things. You really don’t want to rush it, because then folks are going to try it, it’s not going to work, and they’re going to just throw it by the wayside. So I do like that you’re taking all of that into consideration. If our listeners want to find out more about TranscribeGlass, or maybe even get into the beta program, what’s the best way for them to do that?

Madhav Lavakare:
Yeah. So you can visit our website, www.transcribeglass.com. We’ve got information about the features of the product, what’s going to be in the beta version, and the team; you can check all of that out on the website. We’ve even got a form where you can sign up to potentially be part of the beta version, and we’d be able to ship out a device to you. So you can do all of that from our website. And if you want to stay in touch with us and stay up to date with what’s going on, you can follow us on social media. We’re on Instagram, Facebook, and Twitter, and the username everywhere is just TranscribeGlass.

Josh Anderson:
Excellent. We’ll put a link to that over in our show notes. Besides making me feel a little older, like I didn’t do enough in my teenage years and early twenties, I really do appreciate you coming on the show and talking about this accommodation, which will just be great for individuals with hearing loss, especially in that face-to-face communication, so thank you so much.

Madhav Lavakare:
Thank you very much for having me on the show. It was a really great opportunity to talk about my work, and I enjoyed speaking with you.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at 317-721-7124, shoot us a note on Twitter @INDATAProject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. For more shows like this, plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA Project. This has been your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads, in Indianapolis, Indiana. Thank you so much for listening, and we’ll see you next time.
