ATU508 – Waymo with Clem Wright

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Special Guest: Clem Wright – Product Manager – Waymo
Samsung TV Accessibility Story: https://bit.ly/3a2j0cj
INDATA Web Accessibility Webinar: https://bit.ly/3cWgAxL

——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

———————  Transcript Starts Here ———————-

Clem Wright:
Hi, my name is Clem. I’m the product manager for accessibility at Waymo. And this is your Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update. A weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 508 of Assistive Technology Update, scheduled to be released on February 19th, 2021. On today’s show, we are super excited to have Clem Wright on from Waymo, and he’s here to talk about not only self-driving cars, but also how Waymo is keeping individuals with disabilities at the forefront of the development of their technology. We have a quick story about some accessibility features being built into Samsung televisions to help individuals with hearing loss better access information on their television. We are also very excited to say that we are now on Amazon Music, so if you like getting your podcasts from Amazon Music, you can find us there as well. Just search for Assistive Technology Update.

Josh Anderson:
Also, if you have a great idea for someone we should interview on the show, a question, a comment, or really anything that you would like to say to us, there are many different ways that you can reach out. You can call our listener line at (317) 721-7124, shoot us an email at tech@eastersealscrossroads.org, or drop us a line on Twitter @INDATAproject. Thank you so much for taking time out of your busy week to listen to us, and let’s go ahead and get on with the show.

Josh Anderson:
After all these months of lockdown, maybe you’re looking for some new podcasts to listen to. Well, make sure to check out our sister podcasts, Accessibility Minute and ATFAQ, or Assistive Technology Frequently Asked Questions. If you’re super busy and don’t have time to listen to a full podcast, be sure to check out Accessibility Minute, our one-minute-long podcast that gives you just a little taste of something assistive technology based so that you’re able to get your assistive technology fix without taking up the whole day. Hosted by Tracy Castillo, this show comes out weekly. Our other show is Assistive Technology Frequently Asked Questions, or ATFAQ. On Assistive Technology Frequently Asked Questions, Brian Norton leads our panel of experts, including myself, Belva Smith, and our own Tracy Castillo, as we try to answer your assistive technology questions. This show does rely on you, so we’re always looking for new questions, comments, or even your answers to assistive technology questions. So remember, if you’re looking for more assistive technology podcasts to check out, you can check out our sister shows Accessibility Minute and ATFAQ wherever you get your podcasts now, including Spotify and Amazon Music.

Josh Anderson:
Are you a developer interested in learning more about web accessibility? We’re joined by renowned web accessibility professional Dennis Lembree for a full day of training. This webinar training begins with a background on disability guidelines and law. Many techniques for designing and developing accessible websites are then explained. Basic through advanced levels are covered. The main topics include content structure, images, forms, tables, CSS, and ARIA. Techniques for writing for accessibility and testing for accessibility are also covered. If you’re involved in web design or development, don’t miss this wealth of practical knowledge. This webinar is put on by the INDATA Project in Indianapolis, Indiana, and will take place on May 12th, 2021, beginning at 11:00 AM Eastern time. So again, if you’re a developer or involved in web design or development, don’t miss out on this wonderful training to learn how to make sure that everything you create is a little bit more accessible. We’ll put a link to the webinar registration over in the show notes.

Josh Anderson:
Our first story today comes to us from over at Samsung Newsroom. It’s titled “Expanding Accessibility with Samsung, Part Two: Audio.” It talks about some different things that Samsung is beginning to build into their TVs to really help with accessibility, and the ones in this story are really targeted toward individuals who might be deaf or hard of hearing. The first feature it talks about is something called sign language zoom. If you think about a lot of press conferences and news stories, especially during the pandemic as elected officials are having more press conferences, we always see the little box with the sign language interpreter, usually in the upper or lower right-hand corner. For those who know a lot about sign language, you know that while it does have hand gestures, the body language, the facial expression, all these things are very important in conveying the message of sign language and what you’re trying to get across.

Josh Anderson:
So sometimes that little box just isn’t quite enough. So something they’re building is called sign language zoom, and it uses an artificial intelligence algorithm to automatically recognize sign language and magnify the person performing it by up to 200%. It really makes that the focal point over on the side of the screen, so if you’re using it for comprehension, you’re going to have a whole lot more to work with and a whole lot more to be able to see. The story says you can even manually choose an area on the screen to magnify, so if the algorithm doesn’t find the interpreter, you can have it magnify that area on your own. Or you can alter the magnification ratio to get a closer look at other things, such as questions on a quiz program or even a scoreboard at a sporting event.

Josh Anderson:
So you can use it for some different things, but what it’s really made for is to change that zoom level on the interpreter so that they’re much more accessible. The next thing it talks about is something called separate closed caption. I didn’t really know that this was a thing or something that was a problem, but as I’ve talked to some folks, it is a problem in some situations. As you may know, broadcasters have to provide closed captions for all their different programs to help deaf consumers absorb that content. But sometimes the TV program itself also has captions, so those two sets of words can overlap and you’re not really going to get anything from either of them. For this situation, there’s something called separate closed caption, and it lets you move the closed captions to a different viewing area on the TV.

Josh Anderson:
You can also change background and text colors in order to make them a little bit easier to see. But not only that, this can also move around where those closed captions are on the screen. So if you have something like the score ticker whenever you’re watching a sports game, or maybe there’s other information on the part of the screen where normally those closed captions would be, you can just move them out of the way so that you’re still able to get the information showing on the screen as well as read the closed captions and understand exactly what’s going on. And the last thing the story talks about is something called multi output audio. If you have more than two, maybe three people in a room listening to the TV, it’s usually too loud for someone and maybe too quiet for someone else.

Josh Anderson:
What this does is it actually lets you change the volume for different users. So if an individual has a hearing impairment, they can wear a pair of Bluetooth headphones and turn up that volume as much as they want in order to be able to hear it, but everyone else listening to the TV won’t be affected; their volume stays the same until they change it. So you can actually change the volume for each listening device, and you can have more than one person, on different listening devices, all listening at the same time.

Josh Anderson:
Some really cool built-in technology that Samsung is trying to put in their televisions to make them a little bit more accessible to folks who are deaf and hard of hearing. Hopefully we’ll hear more about this kind of stuff and see even more accessibility features added to televisions from Samsung and other companies in the future, but we’ll definitely keep an eye out. For now, we’ll put a link to this story right over in our show notes.

Josh Anderson:
Transportation probably has to be one of the biggest barriers for individuals with disabilities. Finding accessible ride-shares, buses, or cabs can be a bit of a hassle and in some cases can even lead to discrimination. Well, our guest today is Clem Wright, product manager for Waymo, and they’re out to make driving and riding a lot easier, more accessible, and maybe even more enjoyable, and they’re really just removing one major component: the driver. Clem, welcome to the show.

Clem Wright:
Thanks so much for having me.

Josh Anderson:
Yeah. I’m really excited to talk about this technology and just all that it can do for folks. But before we get into that, could you tell our listeners a little bit about yourself?

Clem Wright:
Sure, as you said, I’m a product manager at Waymo. I’ve been with the company almost three years now, and I focus on our rider experience and accessibility, making sure that all of our Waymo riders, including those with disabilities, can enjoy safe, comfortable, and convenient rides in our fully driverless service.

Josh Anderson:
And that’s excellent. So many times disability and accessibility is something that people just tack on at the end. So I love that you guys are kind of thinking of that straight from the beginning. So let’s kind of start at the beginning. What is Waymo?

Clem Wright:
So Waymo is an autonomous driving technology company. Our mission is to make it safe and easy for people and things to get where they’re going. So we are really focused on building the driver, what we call the Waymo Driver, which is the underlying technology that allows our cars to drive fully autonomously. This includes moving people with our Waymo One ride-hailing service and moving things with our Waymo Via transportation-for-goods service.

Josh Anderson:
Oh, excellent. Excellent. Well, and then how was Waymo actually started?

Clem Wright:
We started back in 2009 as part of Google X, which is Google’s Moonshot Factory, and we started developing Level 4 autonomous driving technology. Level 4 means that you don’t need a human driver anymore. There are Levels 1 through 3 where it’s more of an assistive technology, but we really felt it was important to take the human driver fully out of the equation to provide the safest experience. And we’ve built up a bunch of driving experience over the past decade and more. We have driven over 20 million miles on public roads and over 20 billion miles in the simulated world that we use to test our technology.

Josh Anderson:
So, Clem, I got to admit, I have no idea really how a self-driving car works. Can you give me kind of a, I don’t know, what do they call it? The 30,000 foot view of kind of how the self-driving car, how that technology actually works?

Clem Wright:
For sure. I’ll give you the 300-yard view, which is the distance that our cars can see with LiDAR, which uses lasers out in front of the vehicle and in all 360 degrees around the vehicle to help identify objects along the road. So our vehicles are outfitted with a number of different sensor types. We have LiDAR, like I said, these lasers that help us identify different objects. We have high-definition cameras, which also help us identify objects and identify color better. And radar, which helps us identify how fast objects are moving. And through all of these, which again cover all 360 degrees, we can perceive the world around us. We call this our perception: we can perceive a truck in the lane next to us, we can perceive a ball rolling out in front of the car.

Clem Wright:
We then have a very large, powerful computer onboard the car, which can predict what each of these objects is going to do. So things that you as a driver may not even be able to see because you have sort of a cone of vision out in front of you, our car can perceive them and then predict what we think is most likely for those objects to do. Is that ball going to continue rolling? Is that car going to pass us on the left? And then based on that information, we plan our route through the world.

Clem Wright:
So we can say, we think that car is about to emerge out in front of us; we’re going to switch into the other lane so that we avoid it. And all of this is built on a foundation of high-definition maps. We map, in great detail, all the areas that we drive, and this allows us to be very confident about where we’re going. It’s really combining this amazing technology with many, many miles of experience on public roads and seeing new driving situations. So yeah, I hope that gives you a bit of a window into how Waymo works.
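
For readers who want a more concrete picture of that perceive, predict, and plan loop, here is a minimal, purely illustrative sketch in Python. It is not Waymo’s software; every name, number, and threshold in it is hypothetical, and it only shows how fused sensor detections might flow into a simple constant-velocity prediction and a lane-keeping decision.

# Purely illustrative sketch with hypothetical names and values, not Waymo's software:
# a toy perceive -> predict -> plan loop.
from dataclasses import dataclass

@dataclass(frozen=True)
class TrackedObject:
    kind: str                      # e.g. "vehicle", "pedestrian", "ball"
    position: tuple[float, float]  # (x, y) in meters, relative to our car
    velocity: tuple[float, float]  # (vx, vy) in meters per second

def perceive() -> list[TrackedObject]:
    """Stand-in for fusing LiDAR, camera, and radar returns into tracked objects."""
    return [TrackedObject("ball", position=(12.0, 1.5), velocity=(0.0, -0.5))]

def predict(objects: list[TrackedObject], horizon_s: float = 3.0) -> dict:
    """Naive constant-velocity guess at where each object will be in horizon_s seconds."""
    return {
        obj: (obj.position[0] + obj.velocity[0] * horizon_s,
              obj.position[1] + obj.velocity[1] * horizon_s)
        for obj in objects
    }

def plan(predictions: dict, current_lane: int) -> str:
    """Pick a maneuver that avoids predicted conflicts (greatly simplified)."""
    for _obj, (x, y) in predictions.items():
        # If anything is predicted to end up in our lane just ahead of us, slow down.
        if 0.0 < x < 30.0 and abs(y) < 1.8:
            return "slow_down"
    return f"keep_lane_{current_lane}"

if __name__ == "__main__":
    objects = perceive()
    futures = predict(objects)
    print(plan(futures, current_lane=1))  # the rolling ball ends up ahead of us -> "slow_down"

A real system would, of course, work from raw sensor data, use learned prediction models rather than constant velocity, and plan full trajectories instead of a single maneuver; this sketch only illustrates the perceive, predict, and plan structure described above.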

Josh Anderson:
No, that’s perfect. That’s absolutely perfect. You have a company commitment to accessibility. Clem, can you tell us a little bit more about that?

Clem Wright:
Yeah, like I said, it’s our mission to help people get where they’re going, and for us that’s all people. We’re here to improve mobility for everyone. And we really think of it at every stage in our development process. So when we’re first researching a new feature, for example, we want to make sure that people with disabilities are represented in those research groups. Then as we go into developing the product requirements and prototyping and designing and building the actual feature, we run testing with it to make sure that it’s working as intended for those folks. And we keep going throughout this cycle, making sure that we start with this foundational research and then continue to build. And we’ve seen some really interesting effects of this, where features that were originally intended for users with a particular disability ended up being very valuable and applicable to our broader rider population. So we really try to espouse universal design and inclusive design and think of those more extreme use cases to make our product better for everyone.

Josh Anderson:
Excellent. Clem, can you tell me a few examples of those times when you had something made for accessibility that ended up being great for kind of everyone?

Clem Wright:
Yeah, for sure. So through our internal users at Google and Waymo, and through folks we were researching with in Phoenix, where we operate our service, one of the things that we found was challenging for people was when the car, in order to save on the routing time it takes to get to you, would maybe switch to the other side of the street. It might take a bit longer to loop all the way around to get right to your front door, so maybe the car pulls up on the other side of the street. For some folks, that’s an optimization that improves their experience because they have a shorter trip overall and the car gets there faster. But for blind folks, for example, crossing a street can be a really big challenge. And so we added a feature that would let you select a setting where you could say, “You know what, even if it takes a bit longer, I’d really prefer you to bias the car towards picking up on my side of the street to minimize the walking overall and to minimize the side-of-street swaps so I can get to the car.”

Clem Wright:
And we built this with folks with disabilities in mind, blind folks, folks with ambulatory disabilities who might have trouble crossing the road as well. And we found that lots of other people, it turns out, want to make that trade-off too. Other people also want to minimize their walking, and it might even be a temporary disability, like someone who broke a leg and wants to make that trade-off for a certain time. So that’s just one example. I have a couple more if you’re interested.

Josh Anderson:
Yeah, definitely. Yeah. ’Cause that’s a great one. Like you said, you didn’t really think about it, but I even think about it: I live in Indiana, and in winter some days I don’t want to cross the entire street to wait on a ride. If you could somehow come to my house and get me, that’d be great. But what are some of the other ones?

Clem Wright:
We haven’t figured that one out yet. We haven’t figured out coming into your house just yet. Yeah. So another example was wayfinding to the car. We’ve seen a really big challenge with a self-driving service. We have these cars, they have state-of-the-art sensors on them, they can perceive the world around them, predict what it’s going to do, and plan their way through it, and we have lots of experience, but there are still areas where it’s safer for us to maybe pull around the corner of your house versus double-park directly in front of your apartment. And in those cases, that trade-off of finding a safer place to park sometimes makes it a little harder to find your car. And we are particularly sensitive to this for our blind users. You walk out your front door and you want to be able to find that car; if you’re blind, that can be a particular challenge if it’s not where you expect it to be.

Clem Wright:
So we built a feature that lets you honk the horn from your mobile app. When the car is parked, you get a button, and when you press it you just hear the car honk, and our blind testers love this. It was such a big improvement over just walking out and not being able to make the car make a noise. But then I was running a usability study for a totally unrelated feature, and we happened to have this new honking feature enabled. One of our sighted users saw it there and said, “Oh cool, I can honk the horn because the car didn’t pull up where it’s supposed to be,” and then they pressed it. And that was a real light bulb moment for me of, “Oh, this other person, who we didn’t design this for, is benefiting from this feature and was able to find the car.”

Clem Wright:
So there’s a lot more to improve on that feature, and we can talk about that more too. But that was another case where something designed for this more extreme case of a user with a disability ended up benefiting everyone in our user base.

Josh Anderson:
Oh sure. And as more and more of these get on the road, that’d be great. I think of concerts, big events, things where you’re going to have six, seven of them show up; how do you possibly know which one is yours? But with that horn, you can know exactly which one you’re supposed to go get in. So that’d be great. And I know I’ve worked with a lot of folks who are blind or visually impaired who have gotten in the wrong car before because of that exact same thing you’re talking about. Someone pulls up for a ride for someone else, they get in, and suddenly they’re going to the wrong spot. So that is a great feature, and I can see how it can help a lot of folks. Now, you guys have actually partnered with some different places to work on all this technology. Who are some of your partners?

Clem Wright:
Yeah, so we have partnered with the Foundation for Blind Children, which is located in the territory where we drive in Phoenix, and they’ve been super helpful to work with on these questions. Also the Foundation for Senior Living has been a really great partner for us, helping us find research participants and advising us. And more recently we’ve gotten advisory partners from more national groups like the AAPD and the NFB, so, broadly, people with disabilities, and the National Federation of the Blind.

Josh Anderson:
Nice. That’s great. That’s great that you’re actually using folks for real world kind of experience. Cause I know, and no offense to any other kinds of companies or anybody else, but a lot of folks were like, “Oh, we’ll make this because it will be great for individuals with disabilities,” but until you really test it with individuals with disabilities, you’re just, I guess, throwing stuff at the wall and hoping something kind of sticks. So that’s great that you are kind of taking them along for the ride I guess, for lack of better terms. Kind of talking about Phoenix a little bit, kind of what stage of development and kind of deployment are you at in Phoenix right now?

Clem Wright:
Yeah. I’m glad you asked, because a lot of people might not realize this, but we have a fleet of fully autonomous vehicles, so no one in the driver’s seat. You can pull out an app on your phone. Anyone can go to our territory in Phoenix, which is about 50 square miles. Anyone can go there, download the Waymo app from the app store, sign in, and hail a ride, and the car that shows up will have no one inside it. And it can take you from A to B within our territory. So it’s pretty exciting. It’s a pretty amazing milestone we’ve reached, and obviously 50 square miles isn’t the entire world; there’s a lot further we want to go, but we’re really proud of having achieved that. And we’ve already seen the difference it makes for some people; just being able to have that personal space and have the ride to themselves is pretty transformative.

Josh Anderson:
I’m sure a lot of people don’t realize that. I know when I first kind of found out, I was like, “Whoa, that’s a lot farther along than I thought we were going to be talking about today.” So that’s very, very cool. Clement, I’m sure you could probably go on for days and days about this, but could you tell me some of the challenges that you guys have kind of come across as you’ve been developing all this stuff?

Clem Wright:
Yeah, for sure. So, one, you can imagine when there is no one in that vehicle… Ride hailing today, over the past 10 years or so, has for some folks with disabilities been a real boon and been really, really helpful, because they’re able to get access to transportation that they wouldn’t have otherwise. And in the traditional ride-hailing sense, there’s a driver in that seat who can provide some assistance in those cases and maybe get a little bit closer or even, in some cases, hop out and assist the person into the vehicle. So I think that’s one of the big challenges for us. There are obviously pros to taking the driver out of the equation: from a safety perspective we believe we can be much safer, and from a personal comfort perspective you have that space to yourself. But there are definitely challenges with not having that human in there right now.

Clem Wright:
And what we’re trying to do is make sure we can cover those throughout the user experience we’re building into the app, through the technology we’re building for the passenger experience while you’re riding in the car, and through our remote rider support agents. So at any point in the ride, you can call in and speak to a live human who can look through the cameras on the interior and outside of the car if you need help with something, who can talk with you, and who can also call any assistance for you that you might need. One example of that, one of the challenges: if you’re a blind user in one of our vehicles and the car stops for a period of time, you might not know why it stopped. There isn’t a driver right there you can ask, “Hey, what’s going on outside?”

Clem Wright:
And we developed a feature that allows for more assistive audio in the car. So you can opt in to say, you know what, give me a little bit more of a cue about what’s happening. For example, if you’re stopped for a period of time, our assistive audio cues might come on and say, “Hey, we’re yielding for a pedestrian. That’s why we’re stopped for an extra minute here,” or just help you know where you are in your journey: “Hey, we’re merging onto the 101,” which is a larger highway. So you can have a sense of, okay, this is how far along I am, and this might be why I’m hearing more traffic noises outside. So we do think there are a lot of great ways we can fill that need.

Josh Anderson:
Clem, what’s next for Waymo?

Clem Wright:
We are going to continue thinking about the next places we can drive. In terms of our rider experience, I mentioned some of the wayfinding features we developed to make it easier to find your car at pickup and get to your destination at drop-off. And those have been great, but we also believe there’s tons of room to improve on that. Honking the horn is really helpful, but people don’t always want to hear a horn honk; it can be a little abrasive of a noise, particularly if you don’t want to bother the neighbors. So some of the feedback we’ve heard from riders is it’d be great if the car could play a different sound, a more soothing sound, or a sound that was designed explicitly for wayfinding. For folks who can’t hear, could we flash the lights of the car so they can see it across a far parking lot?

Clem Wright:
Maybe that’s another way to help wayfinding for people with hearing impairments. Maybe there are things that we can do in the app itself, so you don’t even need the car to make noise or flash its lights, but the app can help you wayfind better to the car. And maybe we can do this even before you press a button in the app; maybe our cars, which can recognize all sorts of things, can recognize that you’ve come out and you’re getting closer to the car, and then give you a honk or a flash of the lights to say, “Here I am.”

Clem Wright:
So we have a bunch of these ideas from feedback we’ve gotten, and we submitted them to the Department of Transportation’s Inclusive Design Challenge, the USDOT challenge. And we were selected as semi-finalists in stage one, which we’re very proud of, and that is going to allow us to refine a bunch of those ideas and prototype them and research them with our partners that I mentioned earlier. And also to be in the company of a really amazing crew of other academics and folks from industry who are working on this problem. It’s a really cool challenge that has inspired a bunch of people to look ahead to autonomous driving and solve issues for people with disabilities.

Josh Anderson:
Definitely. For our listeners who don’t know about this challenge, can you tell us just a little bit?

Clem Wright:
Yeah. So a bunch of different groups applied, and there were 10 semi-finalists selected. Now that we’ve been selected, we have a year and a half to design, develop, and research prototypes. Then in summer of 2022 we’ll be sharing those prototypes, and the top three winners will receive a cash prize. For us, it’s really about being part of this community and contributing, and there are design share-outs and other things throughout the time period that will help us work more closely with those groups.

Josh Anderson:
Oh definitely. I’m sure some collaboration will definitely come out of that and be able to kind of, yeah, put all your heads together and really come up with some great things for folks. Clem, if our listeners want to find out more kind of about Waymo, what’s the best way for them to do that?

Clem Wright:
Best way is waymo.com. And you can also check out our blog at waymo.com/blog. We’re also on all of your favorite social channels so feel free to follow us there for quick updates. But yeah, there’s a ton of information on our website where you can learn more about what we’re doing and what we’re aiming to do in the future.

Josh Anderson:
Well, Clem, thank you so much for coming on today. I feel like I learned a heck of a lot. And again, thank you so much for building all this in as you make this amazing technology that’s really going to change the world for a lot of folks. Thanks for keeping individuals with disabilities front of mind as you develop these things.

Clem Wright:
Yeah. I mean, I think we take to heart what we’ve heard from a lot of different folks: nothing about us without us. So we’re trying to make sure that we’re bringing people along for that ride, as you said. And honestly, it really is something that’s benefiting our entire product for everyone who uses it. So it’s really a win-win.

Josh Anderson:
Awesome. Well, thank you again for coming on the show.

Clem Wright:
Thank you. Thanks for having me.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at (317) 721-7124, shoot us a note on Twitter @INDATAproject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. For more shows like this, plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA Project. This has been your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in Indianapolis, Indiana. Thank you so much for listening, and see you next time.
