ATU582 – Touchless Computing with Pippa Chick – Global Account Director – Intel

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Special Guest:
Pippa Chick – Global Account Director – Intel

Download and more info: www.touchlesscomputing.org

Or: www.ucl.ac.uk – search for touchless computing

National Suicide Prevention Lifeline:
1-800-273-8255 or call, text or chat 988
More info: https://suicidepreventionlifeline.org/current-events/the-lifeline-and-988/

Stories:
Digital Access Story: https://bit.ly/3O21ANM

 

——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

—– Transcript Starts Here —–

Pippa Chick:
Hi, this is Pippa Chick, and I am the global account director on the health and life sciences team here at Intel. And this is your Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 582 of Assistive Technology Update. It’s scheduled to be released on July 22nd, 2022.

Josh Anderson:
On today’s show, we’re super excited to have Pippa Chick, the global account director for Intel, on to tell us about touchless computing, a whole new way to access the computer. We’ve got a story about digital access and the bias that can come from data collection, as well as some information about an update to the National Suicide Prevention Lifeline. We thank you so much for listening. Now, let’s go ahead and get on with the show.

Josh Anderson:
Folks, it’s July here in Indiana, and that means a couple of different things. A, it’s really hot. But also it means the State Fair is right around the corner. This year, at the State Fair on Sunday, July 31st, they’re celebrating Ability Awareness Day sponsored by Easterseals Crossroads. The Indiana State Fair is showcasing another year of iconic Hoosier tradition, and now celebrating the first ever Ability Awareness Day, a day to help everyone become more compassionate and understanding of the challenges faced by individuals with disabilities and some of the tools to overcome those challenges.

Josh Anderson:
Easterseals Crossroads, along with many community partners will take over State Fair Boulevard on the 31st to recognize the accomplishments and challenges faced by individuals with disabilities. You can enjoy sensory-friendly midway hours, visit a sensory garden, play an adaptive sport, grab a funnel cake, or meet with yours truly who will be recording for the podcast right there on the midway. Whatever you choose to do, there will be something fun for everyone of every ability this year at the fair.

Josh Anderson:
So if you are heading to the fair on Sunday, July 31st, come out and celebrate Ability Awareness Day with Easterseals Crossroads and our partners. You can find us on State Fair Boulevard, and again, I will be there recording different things for the podcast. You’ll be hearing those recordings coming out sometime in August. So grab yourself a turkey leg, get yourself a funnel cake, loosen up those pants for all the great food, and come on out and check us out on July 31st at the Indiana State Fair at the Indiana State Fairgrounds. Can’t wait to see you there.

Josh Anderson:
Folks, today’s story comes to us from VentureBeat. It’s written by Kevin Mar-Molinero, and it’s titled Digital Accessibility: What To Do When the Data Says You Don’t Exist. So we talk a lot about digital accessibility and making sure that things are accessible for folks. This digs a little bit deeper into the data-driven world that we live in today. It talks about how many people, solutions, and decisions are based on data, and how so many different businesses, governments, and other organizations make decisions based on it. Then it talks about some of the problems, not least of which is that data can become a source of inequality.

Josh Anderson:
Now, it doesn’t sit there and say how terrible data is. Sometimes it’s in the collection process and sometimes we’re relying a whole lot on algorithms and other things that may or may not actually capture a true picture of what’s actually happening. It talks about, and I really kind of like this part, it says the data says I don’t exist. And it talks about data collection, relying on things like user profiles and personas. So if there’s not enough information or that user persona isn’t there, well, then they’re just excluded from any data set that pulls any information. So they simply just don’t exist. And it gives a great example.

Josh Anderson:
Imagine you’re a shopkeeper and your store is upstairs. When asked whether you need a ramp for access, you might say, “Well, no, I’ve never had any customers that use a wheelchair.” So the data is correct, the shopkeeper hasn’t had anyone that uses a wheelchair, but he still has an accessibility problem, and of course he hasn’t had those customers. No one can get in there with a wheelchair, so of course they’re not going to come, and the data doesn’t really help the shopkeeper make any kind of decision toward making things better. And it talks about the catch-22 that this kind of creates: if you believe certain groups aren’t using your products, then where’s the motivation to build products for those groups? So really it’s just talking much more about the way you interpret the data and the way you actually look at the tools that you have.

Josh Anderson:
It talks about bridging the gap, and it talks about incomplete data. One thing would be that typical tools track things like clicks, page views, and time spent on a page. Well, if the individual can’t access the website in the first place, guess what, they’re not clicking on anything and they’re not accessing pages. So they’re not counted there at the very beginning.
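
To make that gap concrete, here is a tiny hypothetical sketch, not from the article or any real analytics product, of how interaction-based analytics can report zero assistive technology users even when such a user tried to visit:

```python
# A toy illustration of the "the data says I don't exist" problem described above.
# All field names here are hypothetical; this is not code from the article or from
# any real analytics tool.

visits = [
    {"uses_screen_reader": False, "page_loaded_accessibly": True,  "clicks": 5},
    {"uses_screen_reader": True,  "page_loaded_accessibly": False, "clicks": 0},
    {"uses_screen_reader": False, "page_loaded_accessibly": True,  "clicks": 3},
]

# Typical analytics only records sessions where the visitor actually interacted.
recorded_sessions = [v for v in visits if v["clicks"] > 0]

# The report built from that data shows no assistive-technology users at all,
# even though one tried to visit and was turned away by the inaccessible page.
at_users_in_report = sum(v["uses_screen_reader"] for v in recorded_sessions)
print(f"AT users visible in the analytics report: {at_users_in_report}")  # -> 0
```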

Josh Anderson:
It also talks about some of the other things that kind of go on, like how you’re not allowed to capture data on whether someone’s trying to use assistive technology to access something. And that’s probably a good thing, I guess; you don’t want to target them. But at the same time, you don’t know that people are trying to access it. And maybe if you could have that data, you’d see that every time a user of this assistive technology accesses your site, they spend less time than the average person. Maybe they only access one or two pages because it’s not accessible and they’re not getting the information, they’re not able to get the kind of things that they need.

Josh Anderson:
Now, I don’t think that you’re probably ever going to have anything that can detect if someone’s using assistive technology to access your stuff. Again, that could be used for bias and other things as well. But the article makes a pretty good argument for maybe allowing some of that, just so that the individuals who make these sites can know that folks are trying to access their information, and maybe get some better data in order to make some real decisions on just how accessible their things are or need to be.

Josh Anderson:
Then towards the end, it gets into the ethics of data collection, and just some of the things that you really do have to think about. It gives the example that a user test run over Zoom or Microsoft Teams can end up being more a test of the remote software than of your product or design, because you’re doing it through there. So it’s really about making sure you’re using the right tools and the right means, thinking about the dangers of collecting data, and thinking about the unforeseen consequences that can happen from data collection.

Josh Anderson:
Really a great kind of think piece, just something that I happened to stumble upon. And like I said, I know we talk about website accessibility, document accessibility, just plain old accessibility, but we really don’t think about data collection that much, or maybe what kind of information’s collected or how it’s used. Or more importantly, which people are excluded just because they can’t access the places where the data is collected. So pretty interesting piece. And I will put a link to that down in the show notes so that you can go and check it out for yourself.

Josh Anderson:
You’re probably aware of and familiar with 9-1-1, the number that you call when you need emergency services here in the States. Other countries have their own version of this three-digit code that you dial for emergency services. But on July 16th here in the United States, 9-8-8 was launched. So I’m going to read a little bit about 9-8-8 from suicidepreventionlifeline.org.

Josh Anderson:
And here it says 9-8-8 has been designated as the new three-digit dialing code that will route callers to the National Suicide Prevention Lifeline. While some areas may be currently able to connect to the Lifeline by dialing 9-8-8, this dialing code will be available to everyone across the US starting on July 16th, 2022. When people call, text, or chat 9-8-8, they will be connected to trained counselors that are part of the existing National Suicide Prevention Lifeline network. These trained counselors will listen, understand how their problems are affecting them, provide support, and connect them to resources if necessary.

Josh Anderson:
The current Lifeline phone number, 1-800-273-8255, will always remain available to people in emotional distress or suicidal crisis even after 9-8-8 is launched nationally. The Lifeline’s network of over 200 crisis centers has been in operation since 2005 and has been proven to be effective. It’s the counselors at these local crisis centers who answer the contacts the Lifeline receives every day. Numerous studies have shown that callers feel less suicidal, less depressed, less overwhelmed, and more hopeful after speaking with a Lifeline counselor.

Josh Anderson:
This is a big step and can become an amazing resource for folks who are in crisis, whether due to mental health, due to life factors, or due to just a myriad of things. And I think over the course of the last few years, hopefully the stigma around mental health crises has really kind of washed away a little bit, as I’m sure we can probably all admit that at some time in the last few years, during the pandemic and everything else that’s gone on, we all have probably experienced some sort of anxiety or had some kind of crisis. But when it gets bad, when it gets to the point where you just can’t really handle it on your own, there are resources out there.

Josh Anderson:
And with the launch of this lifeline, dialing three digits can put you in touch with somebody who can listen, who can really listen and really talk to you, and hopefully get you set up with the services that you need if they’re not able to provide them themselves.

Josh Anderson:
Something that really does kind of hit home. I’ve lost some friends, family, and others to suicide. You don’t always know just how dark that darkness is that some folks can fall into. And this is definitely a service that’s there to give assistance, to provide that help that’s needed that maybe isn’t in the support system the individual has, or maybe they don’t want to talk to that support system.

Josh Anderson:
So again, if you are in any kind of mental health, substance abuse crisis, or contemplating suicide, please, please, please reach out. And you can do that either by dialing 9-8-8, or 1-800-273-8255.

Josh Anderson:
Down in the show notes, I’ll put a link over to suicidepreventionlifeline.org with a little bit more information about the Lifeline, about 9-8-8, and about other services. There are also links on there if you want to become involved; that’s where you can apply to actually become one of these counselors, if you would like.

Josh Anderson:
So again, just a really great resource for folks. And again, if you are feeling like you’re in crisis, just remember that if you don’t feel like there’s anybody there who you can speak to, just dial that 9-8-8, dial the 1-800-273-8255, and get connected to somebody who you can talk to and who will listen to those concerns, who will listen to what’s going on, and who can hopefully help you on that road to recovery.

Josh Anderson:
Listeners, accessing and controlling technology in different ways is super important to allow access for all individuals, regardless of their ability. Our guest today is Pippa Chick. She’s from Intel, and she’s here to tell us about an exciting new project that they’re working on called Motion Input. And we cannot wait to hear all about it. Pippa, welcome to the show.

Pippa Chick:
Thank you very much. Pleasure to be here.

Josh Anderson:
It’s a pleasure to speak to you. And before we get into talking about the tech, could you tell our listeners a little bit about yourself?

Pippa Chick:
Yeah, sure. So my name is Pippa. I work for Intel in their health and life sciences team, and I have the privilege of working with the NHS on multiple technology projects.

Josh Anderson:
Excellent, excellent. And we’re here to talk about one of those today. So let’s talk about Motion Input. Can you tell us what is it?

Pippa Chick:
Yeah. So in short, Motion Input is a piece of software, but the longer answer is that Motion Input is an answer to the pandemic. So back approaching three years ago, University College London had a look at, hey, how do we change the way that we interact with technology to move it into the touchless space? So in a pandemic environment, that means that you could be at a distance, or there’s less issue with infection control, because you’re not actually touching something. And then it’s kind of developed from there.

Pippa Chick:
So here we are now on version three, and it’s just wonderful in terms of what it makes possible, and the interactions that it makes possible using really standard equipment, using a standard webcam. You can really change accessibility levels that maybe weren’t there before. So I’m really proud of the students that worked on this project. I think it’s fab.

Josh Anderson:
Awesome. And you kind of said you use what’s already there. You kind of use the webcam that’s already there. What all kinds of things can I control using that camera and the Motion Input software?

Pippa Chick:
Yeah. So there’s actually six ways you can interact with it… You can either do this on your phone, or you can do it on a standard laptop or on a tablet device. So imagine you are looking at said screen, whatever size you have chosen; you can use facial navigation. So once you’ve downloaded it, and you just download a piece of software essentially, so you go to UCL’s website or you go to touchlesscomputing.org, and you download the code. And it’s free for non-commercial use. And it’s available now on the UCL app store.

Pippa Chick:
And you can then choose to… Option one is to use facial navigation. So you may choose to, instead of clicking the mouse, make a fish face, or wiggle your nose. It’s really fun to watch. I don’t know if you’ve seen the video with Paris in it, but it sort of makes you smile as you watch.

Pippa Chick:
There’s also hand gestures. So picture Minority Report. You’re not just waving your hand; it’s not one dimensional. You can actually pinch and drag and zoom with multi-touch points. And that’s actually very complex underneath in terms of the math required to do that. So the students have done a fantastic job there, and it really is sort of today’s Minority Report when you watch it work.

Pippa Chick:
You can use eye gaze. So as you make a command, it will track as you scroll along. Now, that one seems sort of really simple, but if you think about how much that changes what you could do from a hospital bed. If you just want to watch Netflix and you don’t have the capability to click a mouse, use a remote control, what have you, but you can just navigate by genuinely reading down with your eyes and then clicking on to watch your favorite show. I just think that’s wonderful.

Pippa Chick:
Move around a bit more, and there’s full body tracking. So the most fun versions of this are interacting with computer games. You don’t need additional equipment; again, it just tracks your full body, and then you can be immersed in a computer game that you’re playing. Or if we think about it in a physiotherapy session, we could be following a set of exercises that you need to do, and if your physiotherapist can’t be with you, they can still be tracking whether you managed to lift your knee higher in this session, or whether your wrist mobility is improving versus the last three or four sessions. So that’s got some wonderful use cases there.

Pippa Chick:
And then there’s speech. So you can say track left, stop, track right, select, and it will do that for you. And then there’s an in-air joypad. So again, back into the sort of gaming space, but you’re doing it with your hands. You don’t have to actually hold a joystick, but you can be doing the sort of A-B, X-Y type of game play. You can’t see me, but I’m doing it with my thumbs as I speak to you.

Pippa Chick:
So yeah, those are the six current use cases. And what the team are constantly doing is improving the user interface and usability, so the setup should constantly feel easier and more intuitive.
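
For a rough sense of the multi-touch math Pippa alludes to, here is a minimal hypothetical sketch of turning tracked fingertip positions into a pinch-zoom factor; the function names and pixel values are illustrative assumptions, not the Motion Input implementation:

```python
# A hypothetical sketch of the kind of multi-touch math behind pinch-to-zoom:
# compare the distance between thumb and index fingertips across frames and
# turn the ratio into a zoom factor. Not the actual Motion Input code.
import math

def distance(p, q):
    """Straight-line distance between two (x, y) fingertip positions in pixels."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_zoom_factor(prev_thumb, prev_index, thumb, index):
    """>1.0 means the fingers spread apart (zoom in); <1.0 means they pinched (zoom out)."""
    before = distance(prev_thumb, prev_index)
    after = distance(thumb, index)
    return after / before if before > 0 else 1.0

# Example: fingertips move from 50 px apart to 100 px apart -> zoom in by a factor of 2.
print(pinch_zoom_factor((100, 100), (150, 100), (80, 100), (180, 100)))  # 2.0
```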

Josh Anderson:
Nice. And you mentioned intuitive. I know a lot of times in any kind of, just eye gaze, for example, kind of systems, you have to really train it to the user. It takes quite a while to really get it up and working. Do you have to do that with this as well, or does it just kind of figure it out as it goes along?

Pippa Chick:
Mostly option B. But it’s interesting the way you word that question, because what we are having a look at… That’s a very royal we. Gosh, what the university students are having a look at, I should say. I’m a non-technical person, but it’s wonderful what they’re doing. So what they’re trying to do next is can we understand your personal range?

Pippa Chick:
So at the moment, to properly answer your question, it is fairly intuitive, and constantly being improved, as all pieces of software are. But what we want to do next is have a look at, well, okay, if you are a stroke victim, for example, then your range of movement would be very different to mine. So I can move my head all the way to the left and all the way to the right and all the way up and all the way down. That range of motion is different for someone who may have suffered a stroke. And if the software can capture that, then it understands that when someone who has less mobility asks it to track all the way across, it can measure their all the way across. And that might be far fewer degrees than I could move.

Pippa Chick:
So it’s really about personalizing that range of movement so that it interacts with you and it doesn’t feel forced, so that everything’s not having to be a full stretch for your head or your neck or your nose, whatever it’s measuring.
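
To make that calibration idea concrete, here is a minimal hypothetical sketch of mapping a user’s personal head-movement range onto the full width of a screen; the yaw values and helper names are assumptions for illustration, not the actual Motion Input code:

```python
# A minimal sketch of the "personal range" idea: measure how far a particular user
# can actually turn their head, then map that smaller range onto the full screen.
# Hypothetical values and function names; not the Motion Input implementation.

def calibrate(yaw_samples):
    """Record the user's own minimum and maximum head yaw during a short calibration."""
    return min(yaw_samples), max(yaw_samples)

def yaw_to_cursor_x(yaw, user_min, user_max, screen_width=1920):
    """Map the user's personal range of motion onto the full screen width."""
    span = max(user_max - user_min, 1e-6)   # avoid division by zero
    t = (yaw - user_min) / span             # 0.0 .. 1.0 within *their* range
    t = min(max(t, 0.0), 1.0)               # clamp anything outside the calibrated range
    return int(t * (screen_width - 1))

# Example: a user who can only turn their head between -10 and +15 degrees
# still reaches both edges of the screen.
lo, hi = calibrate([-10, -4, 3, 15])
print(yaw_to_cursor_x(-10, lo, hi))  # 0     (far left)
print(yaw_to_cursor_x(15, lo, hi))   # 1919  (far right)
```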

Josh Anderson:
Sure, sure. And with all these different kind of input methods and everything, can they be used simultaneously? Do you just use one at a time depending on your needs and abilities, or how does that work?

Pippa Chick:
Yeah, you can use a combination. I think that’s the bit that’s really exciting. So for example, if you are doing in-air gestures, eventually my arms and shoulders would get tired. So yeah, you can then switch and say, hey, actually I’ll have it respond to my voice now. Or if you’re somewhere that’s quite loud, then it might not be appropriate to use voice. So you go back to facial tracking. So yeah, you can mix and match between them and use combinations.

Josh Anderson:
Very cool. We mentioned games, we mentioned Netflix, does this work with pretty much anything that I would normally use a keyboard, a joystick, or something else to access?

Pippa Chick:
Yeah. And that’s what we want to keep getting to as well. So yes, that is absolutely the intent. It has been tested with all the resources and time that the university has access to. It is free in a non-commercial sense, but they just want to keep expanding this. What else could we do? What else can we measure? So Catherine Cummings, who’s the… I don’t want to get her job title wrong. I think she’s the president. Sorry, she’s the director of the ALS/MND Alliances. And also Nick Goldup, who directs alliances for the MND in the UK. They’re really excited about this and what it could do for MND sufferers, what it can do to continue to make their interactions with technology easier, and just keep them involved in family life and in society for much longer without needing lots of additional equipment or expense.

Josh Anderson:
Oh, exactly. Yeah, just the simple fact that it’s software already using what you’ve got, and it’s free to use, is a huge game changer. Not just the access, but being able to do that. I know you said that this is kind of on version three already. What are they working on? We can say we, but we’ll say, what are they working on next?

Pippa Chick:
Yeah. So there’s a couple. So next, and very imminent, is the range of capabilities which I described to you. So really understanding how much you can move, and reacting to how much you move when you ask it to make commands.

Pippa Chick:
There’s another piece that they’re looking at. Now, what did they call it? It was a fun name. Crash Protection. It’s not as it sounds; there’s nothing in the autonomous driving space. But if you are interrupted, so if the camera is looking at you and you are interacting with your piece of technology and your five year old runs in, and I’ll say that because my five year old runs in, and I’m chuffed he hasn’t done it for this call. If your five year old runs in across the screen, it identifies that that has crashed the interaction, and it doesn’t respond to any movements that other person makes. So it’s making sure that it’s more robust and it’s just paying attention to you. And I think the better that becomes, the more you could use this technology kind of on the move when you’re out and about; you don’t need to be in such a controlled environment.

Pippa Chick:
One more piece, because they’re adding so much all the time. It’s really cool and it’s difficult to keep up with. So I gave Dean a call this morning to say, “What else are you doing?” They’re also looking at detecting multiple cameras. So that gives you additional depth. It’s grid-based anyway, so it’s pretty good at depth perception, but detecting multiple cameras just gives you more options, more accuracy, and better interaction. So yeah, they’re just working on that right now.

Josh Anderson:
Very, very cool. Pippa, tell me a story about somebody who’s been able to use this and it really opened up the world for them.

Pippa Chick:
Okay. I’ll tell you a fun how-they-tested-it story, and then I’ll tell you a bit more of a life-changing one.

Pippa Chick:
So Professor Dean and his wife Ati, Ati is a doctor, they were very determined that this would be easy for everyone to use. And they’re very conscious that they are clinicians or computer scientists, depending on who you’re speaking about, and all the students are very, very good at using technology. So I said, okay, we’re going to have to find someone who isn’t, to see if it really works. Is it as easy as we think it is? Or are we sort of blind to how much we already know about tech?

Pippa Chick:
So fast forward to Dean’s mom in the kitchen making, I think she was making roti bread, and they wanted her to interact with the recipe without touching the screen. So normally she would have it on a book. So I think she was already frustrated it wasn’t on a piece of paper. So they’ve got it downloaded. So they say, “Mom, can you use this? Let us know what you think.”

Pippa Chick:
And they said within five minutes, she’s not just using it successfully, but she’s smiling while she’s doing it. So I think, good on them. That’s proper user experience testing with folks who maybe aren’t as okay with tech as University College London computer science students. That is the risk.

Pippa Chick:
On the other side, and on a much more serious note, it’s really important that it has been picked up quickly by the Motor Neurone Disease Association, and it will be by other parts of the NHS that have a look at people with mobility issues. And I think there are some nice quotes, and I’ll do them an injustice, so I won’t get them exactly right. But on touchlesscomputing.org, there are some really nice pieces that come from Catherine Cummings, and I think from Nick Goldup and others. But the video that we made with Paris illustrates that you go from being disconnected, because this disease has got in the way, to being reconnected and smiling and having fun again. And computer games are not as frivolous as they might sound when it means you’ve reconnected with your family and you’re moving and laughing again.

Josh Anderson:
Oh definitely, definitely. Yeah, spending time with family and being able to do things together is a huge thing, whether it’s doing games or really anything. But opening that door for folks can really be a huge help. You’ve mentioned a few of these before, but how do our listeners find out more and even try this out for themselves?

Pippa Chick:
So this is a lovely question because it’s super easy and it’s free, which I will keep saying. So we want it out as far as possible. So if you go to touchlesscomputing.org, or if you go to ucl.ac.uk and then search for touchless computing or Motion Input, you will find Motion Input V3. So Motion Input, version three, and that is your latest and greatest. And there are just two pieces of code to download. And then you’re off and running regardless of your hardware.

Josh Anderson:
That is perfect. That is perfect. So Pippa, I know there were kind of a lot of parties involved in getting this all put together. Can you tell us who some of the contributors were?

Pippa Chick:
Yeah, certainly. So Professor Dean Mohamedally of UCL pulled the project together. And we at Intel contributed, so Costas Stylianou is the technical specialist who mentored the students throughout the three years. But there were also significant contributions from John McNamara of IBM, Lee Stott of Microsoft, and Joseph Connor of the NHS.

Pippa Chick:
There’s a fun fact. It took 54 students to pull this together, led by Sinead and Carmen. But yeah, a group of 54 students.

Josh Anderson:
Pippa, thank you so much for coming on today, telling us all about this amazing software that can just really open the door. And it’s such a, oh, I don’t know. It sounds so easy. I mean, I can’t wait to… As soon as we’re done here, I’m going to go play with it too and see if maybe I can even edit part of the show playing around with it, just to try it out and see how it works. But used to be, you had to buy so much equipment, it took so long to train. You almost had to have that computer engineering degree to really get things up and working, and then every update, everything falls off and you have to recalibrate. So changing that can just open up just so much and being able to do things. And you mentioned a lot of those today. So thank you again for coming on and really helping us get the word out about it.

Pippa Chick:
You are so welcome. I suppose, one of the things that made this development faster, and I should mention, it would be remiss of me if I didn’t, is that it sits on OpenVINO. So again, free and open source in the tech for good vein. Intel has a set of tools and frameworks that are pre-trained and pre-written, and you can build on top of that code. So OpenVINO is particularly good at AI on computer vision and imaging. And that’s what the students used as the basis for their project.

Pippa Chick:
But because all of those tools that Intel put out there in the AI space are open source, it means that what’s produced on top of them can remain open source, and it really works for the ecosystem. So I’m particularly proud of the fact that we’ve got those things that make this kind of project happen faster.
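
For readers curious what building on OpenVINO looks like, here is a minimal hypothetical sketch of running a pre-trained computer vision model on a single webcam frame with OpenVINO’s Python API; the model file name is a placeholder and the preprocessing assumes a standard NCHW image input, so treat it as an illustration rather than the Motion Input code:

```python
# A minimal, hypothetical sketch of OpenVINO inference on one webcam frame.
# "face_landmarks.xml" is a placeholder for whatever pre-trained model you use;
# this is not the actual Motion Input code.
import cv2                      # assumes opencv-python is installed
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("face_landmarks.xml")   # placeholder pre-trained model
compiled = core.compile_model(model, "CPU")

cap = cv2.VideoCapture(0)                       # the standard webcam Pippa mentions
ok, frame = cap.read()
cap.release()

if ok:
    # Resize the frame to the model's expected input size and add a batch dimension
    # (assumes the model takes a single static NCHW image input).
    input_shape = compiled.input(0).shape
    h, w = int(input_shape[2]), int(input_shape[3])
    blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)
    result = compiled([blob])[compiled.output(0)]
    print("Model output shape:", result.shape)
```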

Josh Anderson:
Oh, definitely. And I’m sure it’ll help make even more projects happen faster. And we’ll hopefully be able to have you on some time to hear about the next great thing that you all are working on.

Pippa Chick:
Oh yes, please. Thank you very much.

Josh Anderson:
All right. Thank you very much. Thanks again, Pippa.

Pippa Chick:
Thanks Josh.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at (317) 721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAproject.

Josh Anderson:
Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation or InTRAC. You can find out more about InTRAC at relayindiana.com.

Josh Anderson:
A special thanks to Nicole Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fretted over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update, and I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.
