AT Update Logo

ATU589 – Enabled Play with Alex Dunn

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Special Guest – Alex Dunn – Founder – Enabled Play
Twitter: @enabledplay

Website: https://enabledplay.com/

——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

—– Transcript Starts Here —–

Alex Dunn:
Hi, this is Alex Dunn and I’m the founder of Enabled Play. And this is your Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 589 of Assistive Technology Update. It is scheduled to be released on September 9th, 2022. Today, we’re super excited to welcome Alex Dunn, the founder of Enabled Play. He’s on to talk about their device and how it can completely and totally change the way that you control your devices. If you’re looking for a transcript of today’s show, it’s available at eastersealstech.com. Our transcripts are generously sponsored by InTRAC, and you can find out more about InTRAC at relayindiana.com. As always, we thank you so much for listening. Now, let’s go ahead and get on with the show.

Josh Anderson:
Make sure to check out our sister podcasts, Accessibility Minute and ATFAQ, or Assistive Technology Frequently Asked Questions. If you’re super busy and don’t have time to listen to a full podcast, be sure to check out Accessibility Minute, our one-minute-long podcast that gives you just a little taste of something assistive technology based. Our other show is Assistive Technology Frequently Asked Questions, or ATFAQ. On Assistive Technology Frequently Asked Questions, Brian Norton leads our panel of experts, including myself, Belva Smith, and our own Tracy Castillo, as we try to answer your assistive technology questions. Check out our sister shows, Accessibility Minute and ATFAQ, wherever you get your podcasts now, including Spotify and Amazon Music.

Josh Anderson:
Listeners, accessing technology is really paramount to any AT intervention or service. It’s one of the main barriers that we encounter on a daily basis for individuals with disabilities. So whenever we here at AT Update hear about a new method of input, well, we get a little bit excited. Today I’m excited to introduce Alex Dunn from Enabled Play, and he’s here to tell us all about their solution to this ongoing and huge problem. Alex, welcome to the show.

Alex Dunn:
Great. Thanks for having me.

Josh Anderson:
Yeah. I’m really excited to talk about the cool tech and everything else. But before we do that, could you tell our listeners a little bit about yourself?

Alex Dunn:
Yeah. So I’m Alex Dunn. I’m from the Boston area in the States, on the East Coast. For the last 10 years, my background has mostly been in applied AI in the enterprise space. And for the last three or so years now, I’ve been really focused on using those same types of approaches, and scaling AI in new ways, to try to tackle some of the input challenges. Making the human-computer interaction paradigm more inclusive has been a big goal of mine. So I started Enabled Play with that mission in mind, to help level the playing field by really letting anyone control their technology in whatever ways are preferred or work for them. So that’s me.

Josh Anderson:
Excellent. And you started mentioning Enabled Play and talked about when it was started, but what made you want to start Enabled Play? What made you want to take that experience and that AI knowledge and use it in a whole new way?

Alex Dunn:
So I had a member of my family with disabilities who’s in his early teens. And when COVID hit, we found that really the only way that we could stay connected was playing games online, games like Minecraft. And with his disabilities, I saw right away that just controlling basic inputs for games became a challenge for him. Of course he could do it, but maybe a little bit slower or with a little bit more frustration than I’d expect someone else to have playing the game. And I looked at him and thought, wow, it’s really just because doing many controls quickly at the same time with fine motor skills is just not easy for everyone. So I originally started exploring ways to apply voice AI, giving voice commands as an alternate input, not to replace everything, but as a way to supplement essentially, which worked out great.

Alex Dunn:
And then as I started to grow that use case beyond the isolated problem that I saw with my family, I realized that the issues are pretty systemic in gaming, as well as in education and the workplace too. This model that we have of a keyboard and mouse or a game controller, they’re just not built for everyone. And that’s really where I started to take off in exploring what would work for everyone, and really found that building new inputs that learn how you want to use them, rather than building something that you have to learn how to use, has really been working out pretty well.

Josh Anderson:
Excellent. That’s a very different model, because a lot of times you do have, hey, here’s a different kind of input device and this is how you use it. But actually having it learn from the individual is a new way of thinking. And I’m sure maybe the AI makes that a little bit easier than it was in the past.

Alex Dunn:
Yeah, exactly. It’s something that I think is missing in a lot of the lower-level things in tech. We use AI and machine learning at the macro level of, hey, I’m going to ingest a whole bunch of data that I’m getting from some other system and then use it to make inferences. It’s exactly how voice recognition works: we take a whole bunch of data, audio and text, and more or less turn that into a model that can be processed. But I’m not really seeing a lot of people apply it at the lowest level, which is just how we actually interact in general with technology. So it’s been a really fun journey to take that different approach.

Josh Anderson:
Excellent. So you’ve already mentioned this, but let’s really dig in, tell us about the Enabled Play controller.

Alex Dunn:
Yeah. So the Enabled Play controller is referred to by some of our users as a magic black box, because it’s not a traditional controller. It doesn’t have buttons on it. It doesn’t have a joystick on it. It doesn’t have keyboard keys on it or a mouse tracking sensor on it. It’s a box, and it’s powered by the Raspberry Pi Compute Module.

Alex Dunn:
So it’s essentially a little computer that you plug into your own computer or your game console. And it acts to your computer just as if it were a keyboard, mouse, and controller. But the way that you communicate with the controller is what’s different. Our goal is to enable essentially anything to be an input. And right now, what that means is offline voice commands. There are two wide-range microphones built into the controller, so they act as if they were a person’s ears listening for commands. They do the speech recognition offline, so you don’t have to be online, and it’s not streaming your audio to any service or anyone else that’s going to keep that recording.

Alex Dunn:
And it continues to learn how you speak as well, in order to give you the highest accuracy and the most predictive way of executing commands. What that really means is that, for example, in gaming, if you were to say that jump should be the same as the space bar, it knows that you are trying to say jump before you even emphasize the P sound at the end of the word, because it knows what to be looking for and is predictive. So you get really, really fast inputs. So that’s one input, voice commands.

Alex Dunn:
We do also have dictation, which is the ability to just type what you’re saying. That’s still a little bit on the experimental side, and again, that’s still all offline. We also have tilt controls, where you can turn your phone’s accelerometer into an input. So think of that like moving your phone left, right, up, and down, and angling it. That can act as a mouse, that can act as a joystick, or it can just send specific commands when it hits a certain threshold. You can control all sorts of sensitivity settings with it. So if you’re only able to, or only want to, move the phone in tiny, tiny movements, that can still actually execute commands, or if you want to do big, sweeping, slow movements, that works as well.

Alex Dunn:
We also have what we call expression and gesture controls, where we use your phone’s camera, and you can mount it anywhere or even just put it somewhere facing you. It doesn’t have to be perfectly dead centered and calibrated or anything like that. It’ll start to learn how you move your face and your body, and you can use those movements for commands. So for example, if you raise your eyebrows, or you smile, or you do a bite, or you tilt your head left and right, those can send different commands. We can also use your head and body position as a joystick or a mouse movement as well. So think of it almost like your head, if you’re able to rotate it, acting as if it were a joystick on a controller or moving a mouse left, right, up, and down.

Alex Dunn:
And you can combine all of these things at the same time. But with expression controls, and doing all these multiple expressions, it’s important to also support the wide range of mobility in both the face and body. And the way that we did that was by implementing sensitivity settings for every single expression and every single gesture.

Josh Anderson:
Okay.

Alex Dunn:
And what that means is that if, for example, you can’t turn your head left as far as you can turn it right, then you can just change a setting and basically say that you don’t have to turn your head left as far in order to trigger it the same way that you would by turning your head right. Or if you have, for example, spastic movements, and you still want to be able to use those gestures, you can decrease the sensitivity so that you have to move very intentionally one way or the other. And that’s true for everything from head movements to body movements, to eyebrow movements and mouth movements and everything in between, where you get total control over it. And it understands basically where you are. It doesn’t take a whole lot to set up; as long as you’ve got a camera pointing at you, you’re basically good to go. Still a couple more inputs, so bear with me.

Josh Anderson:
Oh yeah.

Alex Dunn:
We also have what we call virtual buttons, or hot keys, which essentially let you turn your phone or tablet or your computer into a grid of buttons that can run macros or basically act as single inputs. So a common setup that we’ll see is someone using either their webcam or their phone or tablet’s camera for face expressions, using voice commands on the device, and then having either another phone or a tablet act as basically that grid of buttons, so that they’re able to do everything, whether that’s in games or in schools and so on.

Alex Dunn:
And then we also have the ability to remap physical inputs. So you could plug a controller into the Enabled Play controller, or plug in a keyboard, and remap that keyboard’s input. So if you wanted, for example, a key that you wouldn’t use as often, like the backslash, to do something like run a full macro, then you can do that. Or you can also use other adaptive controllers through it and basically get the power of Enabled Play’s automation, macros, and profiles, and just be able to bring your existing tools to the table as well.

Josh Anderson:
Wow. So you really did make it so there are multiple input methods, and all of these can be used at the same time. So really, whatever the individual’s skill level or ability level is, it can work right into that, I would guess.

Alex Dunn:
Exactly. So even if you have essentially no mobility, you can’t use a regular controller, and you can only, for example, move your mouth and your eyes, you can still get basically 18 different inputs out just using different combinations of those things, which unlocks a lot more capability for people than other tools do. And again, you can still combine that with other devices as well. You can still use your eye gaze tools, or you can still use your other adaptive controllers. Enabled Play just slots in to unlock a lot more capabilities for you.

Josh Anderson:
You know, so many times in AT, and this isn’t really a shot at anybody who does it, because I love that they’re making things, the approach is, I have this great idea, this is going to help everybody access things, and this is how it works. And you went the other way: I have this great idea, I’m going to help people access things, and we’re going to work it to what they need. And that’s a little backwards, but it is the way it should be in the long run.

Alex Dunn:
Right. Exactly. It’s a challenge, working with so many more people across the spectrum, from people who are entirely paralyzed other than being able to move their eyes and maybe their mouth, to people who say, I just can’t move my arm as far, so using the mouse is a little bit harder, and everything in between. Plus we also have users who are able-bodied that use it at work.

Josh Anderson:
Sure.

Alex Dunn:
Just because using voice commands is super fast. So it’s one of those things where we realized very, very quickly that we want to build stuff that just starts from the ground up at that lower level. And in order to do that, you have to build something that’s flexible. And I think, to some extent, my background aids in making that easier. I don’t come from the space of being a mechanical or electrical engineer, building AT devices all the time and trying to solve each individual person’s problem as it comes up. I come from the side of building platforms that serve millions and millions of people. And to me, it’s using those same approaches, just on a different problem that I think needs more solutions.

Josh Anderson:
That’s excellent. And Alex, you mentioned this a little bit, but what kind of devices is Enabled Play compatible with?

Alex Dunn:
So it’s compatible with anything that uses and supports USB HID, the Human Interface Device standard, because it basically tells the computer that it’s plugged into that it is a mouse, keyboard, and controller.

Josh Anderson:
Nice.

Alex Dunn:
So Mac, Windows PCs, certain tablets and Chromebooks. You can plug it in via your Xbox Adaptive Controller and then use it that way; there’s a nuance to that in terms of power management, but it’s still very possible to use it with Xbox, and then other game consoles that support HID. And for the ones that don’t, for example, the Nintendo Switch or the PS5, you can always use another adapter, just like people do with, for example, the QuadStick. You can use a Cronus or a GameSir converter to take the generic gamepad, or to take the keyboard and mouse, and turn that into a proper PlayStation 5 or Nintendo Switch controller too.

Josh Anderson:
Nice. No, that’s excellent. And I love the way that it actually learns from the individual. Like you said, you train it as opposed to it teaching you how to use it.

Alex Dunn:
Exactly.

Josh Anderson:
I’m sure that really opens up the door for a lot of folks who maybe don’t have as good of a support system to be able to do that for themselves and really be independent in the setup and everything else.

Alex Dunn:
Yeah. The only physical setup that you need is just to plug it in to wherever you’re using it.

Josh Anderson:
Nice.

Alex Dunn:
Which is a limitation just within the current version. In future versions, we’re hoping to be able to make that completely accessible too. One of the things I’ve noticed in the AT space, and also the AAC space, is that even if you’re someone with a disability who wants to use any of these different devices, they usually require one or more people to basically get it configured for you every time you want to use it. It’s not something that’s just there. So we wanted to really minimize the amount of setup and just give the control to the person that’s really going to use it. We’ve seen a lot of success in that, and there’s still more work to do. It’s not perfect. You still do have to plug it in, and some people still prefer to have someone help set up their phone camera, but we’re getting a lot closer to providing full independence in digital access, which is exciting.

Josh Anderson:
Oh sure. And I can tell you, just from doing this for a while, just plugging it in is so much better of an access method. It used to be you’d help someone set something up, and every time there was an update or something changed, you’d help them set it up again, and it would just continuously go on like that. So yeah, just the fact that you’re already looking at fixing that part is leaps and bounds above a lot of different things I’ve seen.

Alex Dunn:
Yeah. Actually, speaking of updates, that’s one of the areas we wanted to focus on. The device itself, you can connect it to the internet, so you can do remote commands too. You don’t have to be right next to it for a Bluetooth connection. But it’ll also just get updates as we roll out new features or new support, whether those are changes to the app side, the actual Enabled Play app on iOS and Android and Windows, or to the device itself. As long as it’s connected and plugged in, you’re going to get all those updates automatically. You don’t have to go through a whole process where you plug it in and it’s disabled while it’s doing some really long firmware update, or have someone else do that. It’s just continuously rolling.

Josh Anderson:
Excellent, excellent. Alex, in looking through the website and stuff, I saw that you have a pre-order available for a light version. What’s different about this device?

Alex Dunn:
So with the light version, there are two main things. One, it requires less power, which means you can use it, for example, on phones. You can just plug it right into your phone, and now you’ve got offline voice commands, tilt control access, and distributed remote control. The other side of it is that it doesn’t have microphones built in, so you’d only be able to use voice commands through the app itself, again, still offline. But that’s basically the limitation. And the lower power means that we can use cheaper hardware to power it, which means we can sell it at a much cheaper price point too.

Josh Anderson:
Awesome. Awesome. Alex, you’ve probably got tons of these, but tell me a story about somebody that’s used Enabled Play and how it made a difference for them. Or maybe even a way that they used it that surprised you.

Alex Dunn:
Oh my gosh, I have so many that surprised me. The story of how Enabled Play keeps developing is that so many people will use it and be like, wait, can I use it for this instead? And I’m sitting there thinking, it’s not on the website saying you can use it for that, then I take a step back and go, absolutely, you can. Let’s figure out how to make that even easier to do. But in terms of stories, my favorites come from the schools that I work with. So I can’t necessarily name students or the schools that they’re in.

Josh Anderson:
That’s fine.

Alex Dunn:
But a big thing that we saw pretty early on, and again, we started out focusing on gaming because that was the original problem, is that limitations on digital access in schools create a systemic problem. We introduce tech to students at earlier and earlier ages, which I think is a great thing. But if you’re a student in a special education program who doesn’t have the same physical and digital access as the other students, then you fall behind at the accelerated rate that they’re moving forward by getting that access earlier. So that was a big problem that we set out to start solving, and we’ve been doing some great work in the education space to do that.

Alex Dunn:
So my favorite stories come from students, especially at the elementary school and middle school age, where people start to really see the problems come to life. A lot of times, as sad as it is, by the time some of these students are in high school, it’s almost too late to get them the same digital access, whereas we can solve the problem a lot earlier. And of course we do this in high school too, and we do help solve the problem, but it’s a lot more challenging to catch someone up who hasn’t been using technology.

Alex Dunn:
So anyway, that’s a long setup for the actual story, which could be as simple as a student who has hand tremors and goes to their first computer lab class, where they’re learning to type and starting to learn how to use PowerPoint and Word, or using Google Classroom for the first time. When you’re in those typing exercises and you have something as invisible as hand tremors, in terms of the visibility of the disability to the people around you, to you it just feels like you can’t go as fast as everyone else. And it’s simply because the keyboard is harder for you to use. We’ve heard this story a number of times in a number of different states now, where students who have these relatively small challenges with mobility are starting to actually be able to accelerate by using Enabled Play, because they’re able to offset the things that are harder with their mouse and keyboard while still actually learning how to type as well.

Josh Anderson:
Nice.

Alex Dunn:
And the other side, where that problem starts to grow, is around that middle school age, when students start to write big reports. It’s the first time you’re going to write an eight-page essay. And if you have a harder time doing the research, because navigating around in the browser and finding primary sources is harder for you than for your friend, well, your friend is going to be done and then able to go do their after-school activities, like playing sports or joining the robotics club and things like that, while you have to spend so much more time just working on the same exact report, sometimes up to 10 times longer.

Alex Dunn:
That’s a problem that we want to solve, and are solving, with a number of our students, where giving them profiles for the devices and platforms that they use, like Microsoft Word and the browser, and helping them navigate and get information faster, has really empowered and, shamelessly, enabled them to keep up. And in some cases, even accelerate faster than students who aren’t using the devices, because you don’t have to have a disability to get the advantage of voice commands, of just saying what you want and the thing happens. So those are some of our favorite stories for students.

Alex Dunn:
And then in terms of where things came up that we weren’t expecting, we’ve been doing a lot of work recently on the speech therapy side of things. But again, we weren’t building a speech therapy or AAC device. We were building a digital access device. We basically found speech therapists saying, hey, can we use your same face expression detection, which can detect different bites and mouth movements and things like that, in our therapy sessions, to help the client or the student or the patient actually do things, rather than just having them repeat the motion while someone watches and tells them they’re doing it right?

Alex Dunn:
And that was one of those big moments where I went, huh? I guess, yeah, we can. Let’s go do that. Let’s try it out. And we’ve seen just so much success with this, where we’ve gamified what is normally a pretty tedious exercise for the person going through those sessions. Instead of just doing it in front of someone, they’re using it to play a game. For example, they’re doing the right bite movement, or their teeth gritting, or they’re using different vowel sounds to actually play games like Minecraft, which has come full circle, or even just very simple games too. And it’s also then something that the parents can take home and continue to work on with their child, where otherwise, when the student left the session, no one was continuing that practice and exercise. So instead of coming back the next week, or two weeks later, to that two steps forward, one step back, now we can just keep running and move things forward a lot faster, and the exercises are way more fun.

Josh Anderson:
Oh, definitely. And that’s something I would’ve never thought of, just being able to practice those movements and everything. And really, I’m sure that shows amazing growth, because anytime you know that you can do something, and it’s fun and a little bit more engaging, you’re going to try harder. Any person is, with or without a disability, for sure. You make it interesting and engaging, they’re definitely going to try harder, and I’m sure those results are just amazing. And that’s super cool. Again, something that was created just to help people have access to gaming and level the playing field there has so many more uses, and I love that you guys are taking that all into account. So I guess I have to ask, Alex, what’s next? Or do you even know what’s next? I guess maybe that might be the question.

Alex Dunn:
We definitely have a plan for what’s next, but there are always things that come up when people say, I want to be able to do this, and we try to find a way. So I’ll tell you what I know is coming, and also some things that we’re thinking about too. A big part of what we’re trying to do, outside of creating these new inputs and more digital access capabilities, is also trying to cut down on the pricing in the current assistive tech market, where things are outrageously expensive. Anything going through the CMS program is going to be, because that’s the way it is. But even outside of that, people are paying hundreds and hundreds or thousands of dollars for basic things. We want to be able to package just so much stuff into a reasonable price point. And right now, I think we’re at the point where it’s somewhat reasonable, but we want to get that even lower in new ways, where it’s even more accessible.

Alex Dunn:
So the light controller is a big part of what we’re working on, to be able to basically cut that price in half. And then we’re also working on what we’re referring to right now as virtual devices, which is the ability to basically run a device inside your PC or Mac or Chromebook, where you don’t actually need the physical device at all. You just run this little program in the background, and it acts as if it were another device. So then you can use your webcam and your phone and your tablet and all these things as controls, and we’re trying to do that in a way that’s significantly cheaper upfront and that works with you as you go. There are other limitations.

Josh Anderson:
Sure.

Alex Dunn:
Like, that doesn’t really work well in games, because games actually look for the USB input. So it’s a little bit challenging there. But for basically everything on the education and workplace side that we focus on, there’s a ton of opportunity to cut down that price, from our current retail price for Enabled Play down to a couple bucks a month, essentially.

Josh Anderson:
Nice.

Alex Dunn:
And then there are also some things that have come up around controlling physical things with Enabled Play, things like toys, quadcopters, and stuff like that, for people using Enabled Play in developmental therapies, but also just for people to be able to play with toys. That’s something we’re starting to explore a little bit more. It’s come up from a few different people who were like, hey, I use this on the digital side and it’s great, but I’d love to let this person fly a quadcopter or an RC plane just using their face and body gestures. It would be amazing for them. So we’re starting to explore some new ways to do that as well.

Josh Anderson:
Ah, that’ll be awesome. Alex, if our listeners want to find out more, what’s the best way for them to do that?

Alex Dunn:
Yeah. You can find us on Twitter @enabledplay, or go to our website at enabledplay.com. There are tons of resources there. We’ve got tons of tutorials, and you can pick up a device from there. And then for the organizations, rather than individuals, listening to this, we have no-cost partner programs in education, gaming, the workplace, and just for AT makers in general, where we can give device discounts, help you with bulk orders, give you custom prebuilt profiles, and really support you so that you can then support the people that you’re working with too. There’s some more of that information on the website, but you can always reach out to us on Twitter or LinkedIn or anywhere else.

Josh Anderson:
Excellent. We’ll put links to all that down in the show notes. Alex Dunn, thank you so much for coming on today, for telling us about Enabled Play and all the amazing things it can do, and all the lessons learned. We can’t wait to see what’s next.

Alex Dunn:
Great. Thank you so much for having me.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at (317) 721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAproject. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation, or InTRAC. You can find out more about InTRAC at relayindiana.com. A special thanks to Nikol Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fretted over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update, and I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.
