ATU433 – Jabberwocky App with Aaron Chavez and Jon Hoag


Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Show Notes: www.jabberwockyapp.com

Check out on Twitter, Facebook and Youtube: jabberwockyapp


——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

 

———–Transcript Starts Here—————————————

Aaron Chavez:
Hey, this is Aaron Chavez and I’m the co-founder and CEO of Swiftable.

Jon Hoag:
And this is Jonathan Hoag. I’m the co-founder and CTO at Swiftable.

Aaron Chavez:
And this is your Assistive Technology Update.

Josh Anderson:
Hello and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 433 of Assistive Technology Update. It’s scheduled to be released on September 13th, 2019. Welcome to our Friday the 13th episode. And today we come face to face with the Jabberwocky. No, not the nonsensical creature from the Lewis Carroll poem, but instead an app made by our guests today, Aaron Chavez and Jon Hoag, the co-founders of Swiftable. They’re on to talk about a couple of different apps under the Jabberwocky name that use the facial recognition in newer iPhones as communication devices and as browsers. And you know what, we’ll let them kind of get into that as we get on with the show.

Josh Anderson:
Are you looking for more podcasts to listen to? Do you have questions about assistive technology? Are you really busy and only have a minute to listen to a podcast? Well, guess what? You’re in luck, because we have a few other podcasts that you should really check out. The first one is Assistive Technology Frequently Asked Questions, or ATFAQ, hosted by Brian Norton and featuring myself, Belva Smith and a bunch of other guests. What we do is sit around and take questions about assistive technology, either about accommodations, about different things that are out there, or about different ways to use things. We get those questions from Twitter, online, on the phone and in many other ways. We’re also trying to build a little bit of a community because, believe it or not, sometimes we don’t have all the answers, so we reach out to you to answer some of those questions and help us along.

Josh Anderson:
You can check that out anywhere you get your podcasts, wherever you find this podcast. We also have Accessibility Minute. Accessibility Minute is hosted by Laura Metcalf, and if you’ve never heard her voice, it is smooth as silk and you should really listen to that podcast. She’s going to give you just a one-minute blurb about some different kinds of assistive technology, kind of whet your whistle a little bit and let you know some of the new things that are out there, so that you can go find out a little bit more about them yourself. So again, check out our other shows, Assistive Technology Frequently Asked Questions and Accessibility Minute, available wherever you get your podcasts.

Josh Anderson:
So folks who have a newer iPhone, the iPhone X or above, or a newer iPad may know that it can recognize your face, which allows you to unlock your phone and do other cool things, like maybe talk to your friends through an alien or unicorn avatar. But can this technology be used as an accommodation? Well, our guests today are Aaron Chavez, the co-founder and CEO of Swiftable, and Jon Hoag, the co-founder and CTO of Swiftable. And they’re here to tell us about an app that can do just that, the Jabberwocky. Aaron, Jon, welcome to the show.

Aaron Chavez:
Hey. Good to be here.

Jon Hoag:
Hey.

Josh Anderson:
Yeah, guys, I’m really excited to kind of talk about this app and all that it can do. But before we get started, could you guys tell our listeners a little bit about yourselves and your background?

Aaron Chavez:
Hey, I’m Aaron Chavez. I’m the co-founder and CEO of Swiftable. I’ve been on this project for about a year or two here. My background is in artificial intelligence. I did a PhD at Kansas State University and then ended up part of a small startup in Denver called AlchemyAPI that got rolled into Watson in 2015. I was pretty involved in that acquisition and then spent a lot of time working with natural language processing and computer vision teams over at Watson, trying to figure out how to bring artificial intelligence to a lot of different business applications.

Aaron Chavez:
It was fun, but I’ve always had an interest to kind of strike out on a bit more of a personal entrepreneurial venture. I’ve known Jon for a long time. And I’m sure he’ll tell you, we’ve been talking about doing something for a while and this area of assistive technology, it was one that was personal to him. And when we got to talking about it, just in terms of being something where you could imagine getting up every day and feeling like you’re doing something worthwhile and making a positive social impact, it wasn’t really hard for him to sell me on it. So I’ll let him tell you a bit exactly how that came to pass.

Jon Hoag:
Yeah. So I’m Jonathan, I’m a co-founder and CTO at Swiftable. My mom is a professor of speech pathology at Kansas State University and she is really the inspiration for why I’m interested in this particular field. Throughout my college and beyond, she was always kind of pressuring me to think about how I could apply some of my computer science skills to make better apps. And I got to see firsthand with her how hard it is to use some of these apps and how long it takes to even use things like switch control.

Jon Hoag:
I think one of the things that was the most astounding to me was just how long it took for people who used AAC apps to get their first sentence out. That’s really what inspired me down this path. I didn’t actually get a chance to take it up until about a year and a half ago, when Aaron was like, “Let’s do this thing.” I had worked at a subsidiary of Amazon for about seven years and thought that it was about time to do something different: stop selling people stuff and start providing a service to people who really need it.

Josh Anderson:
Awesome. Well, I’m glad you guys got together and decided to do that. So I know that the Jabberwocky app has a few different features, but let’s start by just talking about what the app is and maybe the AAC component of it.

Aaron Chavez:
Sure. Yeah. So there are a couple of different apps under the umbrella of Jabberwocky, the first of which we put out being the AAC app last November. The AAC app is an app for iOS, running on a compatible phone, that lets you type on a keyboard by moving your head and, for somebody who isn’t able to use typical speech, uses text-to-speech so they can type and talk that way.

Josh Anderson:
Nice. And it will actually do some predictive text as well, is that right?

Aaron Chavez:
That’s correct. Yeah. So as you start to type, we have a language model present that’s going to serve up predictions, both of words and even complete phrases that you may want to speak. Coming back to Jon’s earlier point about just how long it takes to compose the things you want to say, we wanted to build something that could be a little bit flexible and real-time, versus something where somebody had to prepare everything in advance. Not to say that you don’t want to prepare a lot, but we wanted to support that type of communication.

Josh Anderson:
And with that, because I know with facial recognition and other things, does it work with blinks? With a kind of dwelling? Both? How does that work?

Aaron Chavez:
Yep. We have a couple of modes of selection right now with both dwell and blink being the ones we support at the moment.
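
Dwell selection generally means committing a target once the cursor has rested on it for a set time. A hypothetical sketch of that logic (the timings and names here are illustrative, not taken from the app):

```python
DWELL_TIME = 1.0  # seconds the cursor must rest on one key before it is selected

def dwell_select(samples, dwell_time=DWELL_TIME):
    """samples: list of (timestamp, key) cursor readings in time order.
    Return the first key the cursor rests on for dwell_time seconds."""
    current, start_t = None, None
    for t, key in samples:
        if key != current:
            current, start_t = key, t      # cursor moved to a new key; restart the timer
        elif key is not None and t - start_t >= dwell_time:
            return key                     # held long enough: select it
    return None

samples = [(0.0, "A"), (0.3, "A"), (0.5, "B"), (1.0, "B"), (1.6, "B")]
print(dwell_select(samples))  # "B" — held from 0.5 to 1.6 s, past the 1.0 s threshold
```

Blink selection replaces the timer with a discrete event (eyes closed longer than some threshold), which is why apps often offer both: dwell needs no extra action but forces a wait, while blink is immediate but requires reliable blink control.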

Josh Anderson:
Very nice. And you guys kind of talked about ease of setup, and I know I watched a little bit about this. About how long would it take the average person to set this up and get up and running?

Aaron Chavez:
Yeah. Barrier to entry is kind of the main focus that we think about literally every day, and how we can reduce it. There are actually a variety of elements to that I’ll definitely want to talk about, but certainly just setting up the app is one of those things. So we have a tutorial and sort of a default setup that we’ve found to work well for a lot of people, certainly not everyone. But our goal, and I think we’re fairly successful at that now, is to get somebody running in minutes. And not only having configured their device in that amount of time, but actually understanding it and being able to meaningfully use the app to compose a sentence that they want to speak.

Jon Hoag:
I think the other piece there that’s interesting is we’re focusing really hard on commodity hardware, like iPhones, and our new venture will be into Android. What we want to do is use just the stuff that’s on the device and not any peripherals, right? So Jabberwocky actually works with the onboard front-facing camera and uses that to track your face. So you don’t have to do anything from a physical perspective to set this up, other than download the app and start playing around with it.
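
Camera-based head tracking of this kind typically reduces to mapping the head's yaw and pitch angles (which iOS exposes through ARKit face tracking on Face ID devices) to a screen position. A simplified, hypothetical mapping — the gains, screen size, and rotation ranges below are invented for illustration:

```python
def head_to_cursor(yaw_deg, pitch_deg, screen_w=390, screen_h=844,
                   max_yaw=20.0, max_pitch=15.0):
    """Map head rotation (degrees) to screen coordinates.
    Turning the head to the edge of its usable range sweeps the full screen."""
    # Clamp to the usable rotation range, then normalize to [-1, 1]
    nx = max(-1.0, min(1.0, yaw_deg / max_yaw))
    ny = max(-1.0, min(1.0, pitch_deg / max_pitch))
    # Screen center is (w/2, h/2); positive yaw moves right, positive pitch moves down
    x = screen_w / 2 + nx * screen_w / 2
    y = screen_h / 2 + ny * screen_h / 2
    return (x, y)

print(head_to_cursor(0, 0))     # centered head lands at the center of the screen
print(head_to_cursor(20, -15))  # full right + full up lands at the top-right corner
```

A real implementation would also smooth the raw angles (e.g. with an exponential or one-euro filter) to keep the cursor from jittering, and expose the ranges as the calibration settings Aaron mentions.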

Aaron Chavez:
Yeah. I want to reiterate that and actually contrast it with some of the historical things that people had to work with when they were using assistive technology. Head tracking and even eye tracking have been around for a while, and there are definitely some pretty good solutions out there once you’re in the realm of specialized hardware, like if you want to have glasses or some sort of headset or things like that. But for sure, in the process whereby somebody comes to use that, you’re probably going to have to have an advocate, somebody who’s a professional you’re working with who is very aware of the technology space. You can come into a technology lab and do a demo, and hopefully you can understand very quickly how to make it work, and then it turns out that you’re able to quickly figure out whether this technology is suitable for you.

Aaron Chavez:
And there’s probably going to be a large upfront investment to be able to use that solution. We’re talking about doing something on a commodity platform. Everybody has a phone, and most folks have tablets too. So you have an app that somebody can download onto a device that they, or somebody in their close circle, already has, and then have a free trial. With all the considerations we’ve made for ease of use, we really believe we’re getting to the point where the learning curve is very kind now: somebody can pick it up in just a couple of minutes and say, “Okay, I think I understand how this works,” with sensible default settings but also the ability to calibrate and tinker. We’re really trying to re-imagine that entire process, because we think there are a lot of folks that could be using it that just aren’t today, for reasons that don’t have to do with what you would expect.

Josh Anderson:
Yeah. Aaron, you kind of touched on it a couple of times there, what are some of the other barriers to access? Besides just having to have another device, having to have another piece, the training involved and stuff like that, what are some of the other ones you see that this app is able to kind of break down?

Aaron Chavez:
Yeah. I think a big one is definitely being able to understand the way something works. Whenever you’re introducing a new way of interacting with technology, there’s sort of a conversation between the technology and the user that has to take place before you’re really understanding each other. You can have something that’s very well thought out in terms of how an expert user might use it, but you have to communicate that, and part of that is tutorials and guides and online stuff. But part of it is just being designed in a way that provides the sort of visual feedback and other sensory feedback, so it’s teaching you, unconsciously as well as consciously, the correct way to use it.

Aaron Chavez:
That’s something we had a lot of stumbling steps on, and even in the last six months I’ve seen pretty dramatic changes in the way people have that first experience with the app, just with the large number of things that we had to change. But there are other barriers too. Certainly cost is always something that comes up. For this type of technology, given the amount of work that has to go into developing these sorts of things, these tend to be things that people charge thousands of dollars for, which can be fine when you’re able to figure out the right way to subsidize that. I know a lot of public programs and things can do that. So we’ve been spending some time looking into how various programs, such as telecommunications access, which is something that’s becoming fairly widely available through different state-level programs, can help people subsidize the cost of their device and things like that.

Aaron Chavez:
Other than that, I think just being aware of what’s out there. It’s a tough space to be on top of every app and every technology that’s out there, because it’s not ubiquitous. Something like your standard iPhone gets built for a larger target audience, so there’s a bit more ubiquity in being aware of what’s out there. Whereas with this, you really have to have your ear to the ground, or at least have a friend who does and is out there ferreting out what’s happening and coming to light, things that didn’t even exist a couple of years ago, or being aware of what’s coming out.

Jon Hoag:
Barrier to entry is a piece that we focus really hard on, and cost is one of them. I think cost comes up in a lot of different ways; there are different levels to the costs that are important. One is, even on commodity hardware, the iOS or Android stuff, you still have to buy a phone, right? And the iPhone X, for example, is what we need to run Jabberwocky; it’s the only thing with the Face ID technology that we’re using to do this. And those are between 800 and a thousand dollars, right? That’s a price point at which somebody’s not just going to go out and purchase it to try it. So if you have prices that are that high, then just trying something out to see if you like it is sometimes not even an alternative.

Jon Hoag:
So it’s good that there are a lot of programs out there, state funded and university funded, where people can come in and go to the device labs and stuff, but then you have to get to the device labs, right? And sometimes those device labs are focused on people with a different, specific disability. So just being available for somebody to test and try out is one of the biggest barriers from our side, and probably from the consumer side as well.

Josh Anderson:
Oh definitely. But I do like the one thing that you guys talked about: it’s intuitive. You’re using what’s already there. If somebody already has the iPhone or that kind of device, they’re used to maybe looking at it to unlock it or do that kind of stuff. So they’re just doing those natural reactions, those natural things, but in a new way in order to be able to access communication.

Jon Hoag:
Yeah.

Josh Anderson:
And then it also has word draw on there. Tell me about that.

Aaron Chavez:
So word draw is a lot like swipe typing, as I think it sometimes gets called on a more conventional phone keyboard, where you stick your finger on the screen and then move around the keyboard until you’ve reached the end of the word, rather than hitting each key individually. We came up with a head-tracking equivalent of that, where you start on one letter and then quickly glance over all the other letters to where you want to end up. There’s a little bit of extra algorithmic secret sauce that has to go in to do that, but it can be more efficient for composing messages, for sure.
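
Gesture-typing decoders generally work by comparing the traced path against each dictionary word. The "secret sauce" Aaron mentions is certainly more sophisticated, but a toy sketch of the idea might constrain candidates by the first and last key of the trace and score them by how many of their letters the path passed over:

```python
def decode_trace(trace, vocabulary):
    """trace: ordered list of keys the cursor passed over, e.g. ['c','d','a','s','t'].
    Pick the vocabulary word whose letters best match the trace."""
    passed = set(trace)
    best, best_score = None, -1
    for word in vocabulary:
        # Hard constraint: a swipe starts on the first letter and ends on the last
        if word[0] != trace[0] or word[-1] != trace[-1]:
            continue
        # Soft score: how many of the word's distinct letters the path covered
        score = sum(c in passed for c in set(word))
        if score > best_score:
            best, best_score = word, score
    return best

vocab = ["cat", "cast", "cost", "coat"]
print(decode_trace(["c", "d", "a", "s", "t"], vocab))  # "cast" — starts c, ends t, covers a and s
```

Real decoders score the geometry of the path (how closely it sweeps past each key, in order) and combine that with a language model, which is also how they tolerate the stray keys a moving cursor inevitably crosses, like the 'd' in the example.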

Josh Anderson:
Oh, I could see how for some individuals that could really speed them up, especially during conversation when you’re actually reacting to folks in real time.

Aaron Chavez:
Yeah. It’s way faster once you get good at it, if you’re willing to sort of put in the effort to do that.

Josh Anderson:
Which, I think anyone that’s ever used one of those swipe keyboards knows, is kind of the same thing. It takes a little bit of getting used to, but once you get good at it, it’s pretty darn quick. So we talked a little about the AAC. Now you guys have a new feature, a new app, with a browser. How does that work?

Aaron Chavez:
Yep. So we also have a browser, and the tracking technology, everything around how the head tracking works, is fairly similar to what’s going on in the AAC app. But you’re moving your head around and controlling a cursor to browse the web. You can move your head up and down to scroll on pages, and blink to click and follow links, and you’re doing it all hands-free. And since most things are accessible via a website these days, having even just a browser gets you quite a long way in terms of being able to do everything you’d want to do. You can email, you can watch a video, you can get on Facebook, do whatever you want to do.

Josh Anderson:
Well, that’s really helpful. Because I know individuals may have to use Switch Access or other things like that, and navigating the internet, while possible, takes a lot of time. Just trying to get through all the links and get to where you actually want to be, or get to the edit box or anything like that, can really take a ton of time and a lot of training. So the fact that you can do it just with your face and your eyes would really be a giant help.

Aaron Chavez:
Websites aren’t really designed to be… I mean, it is good that people are doing a lot better job of putting those classic accessibility [Alamo 00:19:49] in their websites. But even when you have that, many of the classic accessibility strategies don’t really work well with really dense, two-dimensional websites. You have theoretically hundreds of things that could be a candidate for switch control or something; it’s just a mess. So we let you navigate in a way that’s a little closer to a mouse cursor, for example, which is closer to the way those things were designed to be efficiently navigated. So you can be very effective, and we’ve seen people who are very effective using it this way.

Jon Hoag:
The other piece there, another feature that we have for the browser, is what we call click assist or touch assist. We go over the page and find the clickable elements, and we make hit boxes that are bigger for them. So you don’t have to touch the exact spot; you can still identify the thing you want to click and click a little bit off of it. It makes those hard-to-reach things a little bit easier, which we’ve found to be extremely helpful.
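
Click assist, as Jon describes it, can be thought of as snapping an imprecise click to the nearest clickable element within an enlarged hit box. A hypothetical sketch, with element rectangles and the slop margin invented for illustration:

```python
def snap_click(click, elements, slop=20):
    """click: (x, y). elements: dict name -> (left, top, right, bottom) rect.
    Return the element whose rect, grown by `slop` px on every side,
    contains the click and whose center is nearest to it."""
    cx, cy = click
    best, best_d2 = None, float("inf")
    for name, (l, t, r, b) in elements.items():
        if l - slop <= cx <= r + slop and t - slop <= cy <= b + slop:
            mx, my = (l + r) / 2, (t + b) / 2
            d2 = (cx - mx) ** 2 + (cy - my) ** 2  # squared distance to the center
            if d2 < best_d2:
                best, best_d2 = name, d2
    return best

links = {"home": (0, 0, 60, 20), "about": (80, 0, 140, 20)}
print(snap_click((65, 10), links))  # a near-miss between the links snaps to "home"
```

Resolving overlapping grown boxes by distance to the nearest center is one plausible tie-breaker; an in-browser version would pull the rectangles from the DOM (e.g. via `getBoundingClientRect`) rather than a hand-built dictionary.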

Josh Anderson:
Oh yeah. Especially on those pages that you were talking about where there’s just hundreds of things going on. Even trying to click one thing without getting the other would be a challenge. So making it a little bit bigger and easier to access really makes it a lot more accessible.

Jon Hoag:
Yeah.

Josh Anderson:
You guys talked about this a little bit. How much does Jabberwocky cost? Are there different prices, different kind of plans or how does that all work?

Aaron Chavez:
Both of those apps are free on the App Store today, and we plan on them being free for the foreseeable future. We have some other things coming down the pipeline that we’ll talk about later, but from our perspective, it’s very important to us that as many people as possible are using these, and we just want to remove any possible barrier to somebody doing that. So yeah.

Josh Anderson:
Nice.

Aaron Chavez:
Both apps are available for download today.

Josh Anderson:
Perfect. That is great. You kind of already jumped into my next question. What do you guys have coming down the pipeline?

Aaron Chavez:
I’m glad you asked. So we’re also working on an Android application as well, and this is very exciting to us. Android’s model of working with developers on accessibility is, I think, a little more advanced than what you’ll see from iOS these days. They have the notion of an accessibility service, which is like literally telling developers, “Please, please build accessibility apps,” and building in some of the considerations that you need for that, giving you more control of what you can do on your device. Versus saying, “Oh, you know, I’m afraid about the security of generating a key press that didn’t come from a finger.” They’re just a lot more forward thinking, I think, in terms of where they’re at on that right now.

Aaron Chavez:
So we’re going to have an app on Android that is a total device unlock. You’ll be able to do literally anything that you can do on your phone. You’ll be able to make a phone call. You’ll be able to do your email. You’ll be able to do all of the apps that you have on your phone and use our head tracking technology to do it. It’s still going to be extremely intuitive and extremely efficient and easy to understand.

Josh Anderson:
Oh wow. That’ll be great. That will be absolutely great for individuals. And I know sometimes just the entry point to Android can be a little less expensive, because sometimes the devices are a little less expensive, so…

Aaron Chavez:
Yeah. That’s exactly right. Like Jon was saying, the reality is that what we built on iOS uses the special devices that can handle Face ID, putting us at an entry-level price point of $600 or something like that, even when some of the devices are on sale. Whereas the way we’re doing the head tracking on Android is supported on a pretty wide range of devices, pretty commodity. There are devices that I think are about a hundred dollars that, in principle, should be supported by what we’re going to put out, which is very nice. Maybe it’s something that more people own already, and if you don’t, then it’s less daunting to go purchase a device that might be specific for that.

Josh Anderson:
Can you guys tell me a story about someone that this app has been able to help?

Aaron Chavez:
Sure. We have a friend who, without getting into too many details, has quadriplegia due to a neck injury, and he’s really been enjoying using the browser. Just to stay connected with people, to be able to watch videos, and to still have a bit of a presence in what’s going on in the outside world, when maybe getting out of the house isn’t something that’s practical every day.

Josh Anderson:
No, that’s great. And I know it used to be, you had to go out to really kind of do anything or answer the phone and stuff, but now with social media, with videos, with being able to share pictures and everything, you can really kind of stay very well-connected and not have to really have somebody there to help you along all the time. So that’s really great that you guys have made this and are offering it to everybody. How can our listeners find out more information about Jabberwocky, about you guys, about Swiftable and what you guys are working on?

Aaron Chavez:
Yeah. Certainly first and foremost, just to be aware of what’s going on, and things change quickly with us, following us on social media, Facebook and Twitter, is a great way to be very up to date on the new stuff. Especially, you’re going to want to do that for the upcoming Android app. In terms of what’s already out there, look on our website, jabberwockyapp.com. That has all the relevant links to get you seeing what apps we have, and how you can go download them and try them out.

Josh Anderson:
Excellent. We’ll put links to that over in our show notes. Well, Aaron and Jon, thank you both so much for coming on the show today and telling us all about the Jabberwocky app. We’re very excited to get out there and try it out.

Jon Hoag:
It’s really nice to be here.

Aaron Chavez:
Yeah. Thank you very much.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at (317) 721-7124. Shoot us a note on Twitter at INDATAproject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. For more shows like this, plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA Project. This has been your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in Indianapolis, Indiana. Thank you so much for listening and we’ll see you next time.