ATU452 – Nous with Dmitry Selitskiy


Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Show Notes:
Dmitry Selitskiy – Founder and CEO of Thought-wired, the makers of Nous
Check out Nous: https://getnous.app
Handiplanet Story: http://bit.ly/2QHVKXF
Comcast and NuEyes Story: https://comca.st/2si98Za
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
——————————
Transcript Starts Here
——————————

Dmitry Selitskiy:
Hi, this is Dmitry Selitskiy. I’m the founder and CEO at Thought-wired, the makers of Nous and this is the Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update. A weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 452 of Assistive Technology Update. It is scheduled to be released on January 24th, 2020.

Josh Anderson:
On today’s show, we’re very excited to have Dmitry Selitskiy on to talk about Nous, the new access device available for individuals with disabilities. We also have a story about Handiplanet, like a TripAdvisor, but for individuals with disabilities and accessibility. There’s also a story about some different things that Uber is now offering to assist individuals with disabilities in their travels, and a new partnership between Comcast and NuEyes to help individuals with visual disabilities access their television. Now let’s go ahead and get on with the show.

Josh Anderson:
Are you heading to Orlando next week for ATIA? Myself and many members of my team will be down there running around learning all kinds of new stuff, learning about new assistive technology, new ways to use it and new ways to help those that we work with on a daily basis. I will also be presenting on Saturday, February 1st at 8:00 AM at ATIA on Working It Out, Accommodations For Full Workplace Inclusion. We’ll go over some case studies. We’ll talk about some best practices. We’ll talk about some other things. Hopefully, we’ll get to even have a little fun. That is the coveted Saturday morning, last day of the conference at 8:00 AM time slot. So I’m not real sure how I got that one, just born under a good sign, I guess. But if you do happen to actually be awake by that time in the morning on Saturday, and you’re still hanging out for that last half day of the conference, come on over and check it out.

Josh Anderson:
Again, that’s Working It Out, Accommodations For Full Workplace Inclusion and I’ll be presenting Saturday morning, February 1st at 8:00 AM. Really, really hope to see you there. And if you can’t make it to the presentation, still maybe keep an eye out for me. I’ll be running around the vendor hall and everywhere else at the conference, just trying to find some new interviews, check out some new technology. So if you happen to see me, or a member of my team walking around, please feel free to come up and say hello.

Josh Anderson:
Are you looking for more podcasts to listen to? Do you have questions about assistive technology, about an accommodation or maybe how something works? Are you really busy and only have a minute to listen to a podcast? Well, guess what? You’re in luck, because we have a few other podcasts that you should really check out. The first one is Assistive Technology Frequently Asked Questions or ATFAQ, hosted by Brian Norton and featuring myself and Belva Smith and a bunch of other guests. What we do is sit around and take questions about assistive technology, either about accommodations, about different things that are out there or about different ways to use things. We get those questions from Twitter, online, on the phone and in many other ways. We’re also trying to build a little bit of a community as sometimes, believe it or not, we don’t have all the answers. So we reach out to you to answer some of those questions and help us along. You can check that out anywhere you get your podcasts, wherever you find this podcast.

Josh Anderson:
We also have Accessibility Minute. So Accessibility Minute is hosted by Laura Metcalf. And if you’ve never heard her voice, it is smooth as silk and you should really listen to that podcast. She’s going to give you just a one minute blurb about some different kinds of assistive technology, kind of whet your whistle a little bit and just let you know some of the new things that are out there so that you can go out and find out a little bit more about them yourself. So again, check out our other shows, Assistive Technology Frequently Asked Questions and Accessibility Minute, available wherever you get your podcasts.

Josh Anderson:
A lot of folks out there may have used TripAdvisor. So you can kind of look up different things that you want to do, maybe different places that you want to go. But if you have a disability, TripAdvisor doesn’t tell you maybe everything that you need to know. Well, I found a story called Handiplanet, the TripAdvisor For People With Accessibility. So really this was created for when folks are trying to look for a place to stay, to give them a little bit more information. This was started by Melina and Emmanuel Kouratoras, and they created this just because they realized that they needed some special things whenever they went and traveled. They needed wide corridors, a larger bed, an Italian shower if possible, maybe a pharmacy next door, some other things like that.

Josh Anderson:
And there was really no way to find out this information. So Handiplanet is actually a social network dedicated to the accessibility of leisure sites and things like that for people with reduced mobility, as well as folks with visual impairments and maybe other disabilities. It says it already has more than 10,000 members, and it is totally free and collaborative. And this can tell things like the number of steps, if there are adapted sidewalks, the size of the toilets, or really anything that other travelers kind of find, in order to pass that information along to maybe the next folks who will need some sort of adaptation or just need a place to be fully accessible for them. So, very cool. Again, folks making an app that can really help out a lot of folks, and then folks working together to make it work. We’ll put a link to this story over in our show notes.

Josh Anderson:
Just kind of sticking with stories about travel and mobility, we’ve got a story over at businessinsider.sg titled, How to Use Uber Assist or WAV For Riders Who May Need Additional Help to Enter and Exit a Vehicle. It’s by Steven John. And it talks about two programs that Uber has: one being Assist and the other being WAV or W-A-V. So Uber, a long time ago, got in a lot of trouble for not being accessible, not allowing individuals to bring their animals in the car if they were support animals. So these are just a few programs that they do have out there, and we’ll kind of talk about both of them a little bit. So Uber Assist is a service that connects riders who need extra assistance with drivers who are trained and certified to help passengers enter and exit the vehicle.

Josh Anderson:
And this could be anything from maybe just needing a hand up, to someone to perhaps fold up your wheelchair after transfer, or being able to fold up your walker or whatever you might need and get that in the vehicle. But they can actually help you get from whatever your mobility device is and then into the vehicle. There’s also Uber WAV or Uber W-A-V, I think it’s pronounced wave. And this stands for wheelchair accessible vehicles. So this service features rides with ramps or lifts for passengers that are in motorized wheelchairs or something you’re not going to be transferring an individual out of and actually picking up and moving. Both these programs cost the exact same as a standard UberX ride. But of course, like anything it seems like with transportation and disability, it may have a little bit longer wait time, but really that’s just because there’s probably not as many trained drivers actually doing this kind of program.

Josh Anderson:
This is not offered in every city, not available everywhere, but it is something that they’re really trying to expand so they can accommodate consumers with physical disabilities, mobility and vision challenges. Folks who do the Uber Assist have completed certified third-party training in assisting those with disabilities. So they’ve actually had training to be able to help folks and make sure that they do these things okay. It says that currently Uber Assist is available in about 40 cities worldwide, so not really everywhere. It mentioned San Francisco, Toronto, Sydney, and Taipei. So let’s hope that it gets into a few more places. It says that the folks who do the Uber W-A-V or Uber WAV program are also certified by a third party in safely assisting and driving passengers in wheelchairs. Again, the service costs the exact same. It says this one is actually being tested in some US cities, including New York, Chicago, Los Angeles, Boston, and San Francisco, and in some parts of the UK and Australia.

Josh Anderson:
It even gives some instructions here in the story on how to get an Uber Assist or Uber WAV. It’s actually really, really easy. You just open the Uber app on your iPhone or Android phone. You should probably already have an account set up; if you don’t, you’ll need to do that. Select the “Where to?” box, type in your destination, confirm your destination and your pickup location, then scroll down. When you scroll down, you should get some different choices. Usually there are some things like maybe an Uber XL, if you need a bigger one, or UberX, but if this is available, it should go down and show Assist and WAV, so that you can get those and order them instead. So if they’re available, you just click on that and go with it, just like you would do any other Uber.

Josh Anderson:
So at least Uber is kind of working ahead and trying to make these accessible options more readily available. But of course, as anyone knows, there just aren’t a ton of accessible vans out there and maybe not a lot of folks with this training. So I can really see how that Uber Assist can definitely become big. I mean, especially if Uber is willing to pay for that third party training for those individuals. The Uber WAV, again, you just have to find someone who has that accessible van and perhaps that’s something Uber could do is purchase those and then hire drivers to do it. I don’t know if that’s part of their business model or not, but I did find this story and I know in the past, Uber had been in trouble for not really trying to help out individuals with disabilities and it looks like they’re at least trying to make that effort.

Josh Anderson:
And hopefully that will end up being available in more cities. But we will put a link over to this story over in our show notes so that you can look back and see if perhaps, you can use Uber Assist or Uber WAV in your city.

Josh Anderson:
Found a press release over from Comcast. And it says Comcast partners with NuEyes to enable consumers with visual disabilities to see their television. So Comcast is a cable provider here in the United States. They provide cable services, internet services, voice, those kinds of things. They also happen to be owners of NBC. So a very big kind of conglomerate. And NuEyes, if you’ve never seen them or heard of them before, it is kind of an augmented reality vision device. So it’s a wearable, looks a lot like a virtual reality headset, but it’s got some different features in there where it can kind of read to you, it can manipulate print, make it larger, change the color aspects, can magnify things around you and other stuff like that.

Josh Anderson:
But it says here that they’ve actually kind of partnered together in order to install the Xfinity Stream app onto the NuEyes e2 smart glasses. So what that means is that individuals can basically take live TV and on-demand content straight to these smart glasses, and then use those enhancements in order to be able to see the television. It says this Xfinity Stream app actually comes pre-installed on the newest version of the NuEyes e2, so that way you get access to everything available on Comcast Xfinity Stream, which is, I believe, pretty much anything you can watch on television. So you can use all these different enhancements, and plus it’s right there, you’re not trying to look across the room to the television. Because if you’ve ever tried to maybe look at a computer screen or a television screen with a magnifier, you do usually get some challenges there and some difficulties with actually being able to see things.

Josh Anderson:
So very cool that they’re kind of putting this on there. I know NuEyes. I’ve tried it out a few times. It’s very neat, just some of the features that it has. And of course, as with anything with any visual impairment, some things work great for certain visual impairments, for certain individuals, and some things don’t work. So it’s always good to be able to kind of try those out. So if you happen to be here in the United States, you can always go to your local Tech Act project and hopefully they’ve got maybe a NuEyes that you can try out or even have loaned out to you. But we’ll put a link to this over in our show notes so you can find out a little bit more about Comcast Xfinity Stream being available on the NuEyes.

Josh Anderson:
Switches, joysticks, cameras, voice, there are just so many ways to access our devices out there for individuals with disabilities. But many of these require a lot of setup or training, may not meet the needs of some individuals, or need to be recalibrated whenever any movement is necessary. They also don’t always allow for comfort. Well, Dmitry Selitskiy, founder and CEO of Thought-wired, is here to tell us about nous, a new access solution. Dmitry, welcome to the show.

Dmitry Selitskiy:
Hi, thanks for having me.

Josh Anderson:
Yeah. We’re really excited to talk all about nous. But before we do that, could you tell our listeners a little bit about yourself?

Dmitry Selitskiy:
Yeah, absolutely. So my name is Dmitry and I come all the way from New Zealand. I’ve been in the assistive technology and overall technology industry for over 10 years now, and specifically looking at assistive tech for nearly 10 years. And in my previous life, I was in boring old business IT sort of thing. Yeah. That’s a very, very quick introduction.

Josh Anderson:
Oh yeah. Well that’ll kind of work. I’m sure other things will probably come out as we get into nous and stuff. But go ahead and tell us what is nous.

Dmitry Selitskiy:
Nous is an assistive access solution and is a wearable eye blink switch or an eye blink detection system. And in plainer English, it just means that it is a device that allows you to control a computer using your intentional eye blinks. You blink your eyes and something happens on the screen, like an option is selected, or a button is clicked, anything like that.

Josh Anderson:
Perfect. Now I know a lot of other kind of eye blink systems rely on a camera on the screen or on the device, or set up kind of somewhere. This is actually fully wearable, is that right?

Dmitry Selitskiy:
That’s correct. So in the nous system, there is no camera. We detect eye blinks or the movements of the eyes that happen during an eye blink using a wearable sensor. So specifically, when you blink your eyes, your eyelids close, but your eyes actually roll at the same time. And that eye roll creates a distinct electrical signal and that is what we detect using a wearable sensor, which is a headband that people wear on their foreheads.
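The signal Dmitry describes, a distinct electrical pulse produced by the eye roll during a blink and picked up by a forehead sensor, is similar in spirit to electrooculography. As a rough, hypothetical illustration only, and not Thought-wired’s actual algorithm, a simple threshold-based blink detector over a sampled signal might look like this:

```python
def detect_blinks(signal, threshold=200.0, refractory=50):
    """Return sample indices where the signal crosses the threshold.

    A refractory window (in samples) keeps one blink from being
    counted repeatedly while the signal stays elevated.
    """
    blinks = []
    last = -refractory
    for i, value in enumerate(signal):
        if value >= threshold and i - last >= refractory:
            blinks.append(i)
            last = i
    return blinks

# Synthetic trace: flat baseline with two blink-like spikes.
trace = [0.0] * 300
for i in range(100, 110):
    trace[i] = 250.0   # first "blink"
for i in range(220, 230):
    trace[i] = 260.0   # second "blink"

print(detect_blinks(trace))  # → [100, 220]
```

A real system would have to reject artifacts like head movement and muscle noise, which is presumably where the product’s engineering effort goes; this sketch only shows the basic idea of turning a continuous sensor signal into discrete switch events.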

Josh Anderson:
Very cool. So kind of where did this idea come from?

Dmitry Selitskiy:
So originally, I and our team at Thought-wired, we got together some years back and we started researching and developing what’s called a brain-computer interface, which is the type of technology that is designed to let people control computers and other systems using brainwaves. And as we worked in that area with these wearable sensors that could pick up on the electrical activity of the brain, we also developed this technology that allowed us to detect eye blinks. And as we went through the process and involved the community of professionals and end users, people with access needs, this eye blink detection capability came to the front and that’s what we decided to package up as our first product. So it all kind of came from that area of brain-computer interface research, something that we’re still putting time and effort into, but the eye blink stuff was just the first thing that manifested itself into an actual product that came to the market.

Josh Anderson:
And what all will nous work with? I mean, what kinds of technology can it control?

Dmitry Selitskiy:
So at this stage, it’s essentially anything that runs on Windows powered devices. So you could control a specialized speech generating device or a communication system that runs on Windows and either replace or augment your switch access set up with nous. Or, you can use things that just run in the browser again, whether it’s communication related or it’s education type content, essentially anything that can be controlled with a switch or even just a button on the keyboard, you can plug nous into it and have access to it, using your eye blinks.
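Since, as Dmitry says, nous can drive anything controllable by a switch or a single key, one common way such a switch is used is scanning: the software highlights options one at a time and a single switch event, here an intentional blink, selects the current one. A minimal, hypothetical sketch of that pattern (not Thought-wired’s code):

```python
class SwitchScanner:
    """Cycle through options; a single switch event selects the current one."""

    def __init__(self, options):
        self.options = options
        self.index = 0

    def advance(self):
        # Called on a timer: move the highlight to the next option.
        self.index = (self.index + 1) % len(self.options)
        return self.options[self.index]

    def select(self):
        # Called when the switch (e.g. an eye blink) fires.
        return self.options[self.index]

scanner = SwitchScanner(["Yes", "No", "Help", "More"])
scanner.advance()          # highlight moves to "No"
scanner.advance()          # highlight moves to "Help"
print(scanner.select())    # → Help
```

Communication packages like Grid 3 implement far richer scanning than this, but the division of labor is the same: the access device only has to deliver a reliable "switch pressed" event, and the software on top turns it into selections.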

Josh Anderson:
Not to always talk about other kinds of eye blink systems, but I know a lot of them need a lot of calibration and a lot of training to really get an individual to kind of be successful with them. Does nous need a lot of calibration and training?

Dmitry Selitskiy:
So there is some training involved and it’s very individual. For somebody, it could be very quick; for somebody for whom it’s kind of a new skill to learn, they might need to learn it for a little bit longer. There is some calibration, but now we’re getting to a point where we use machine learning in our systems that allows you to teach the system how you blink as an individual person. So then, you only need to calibrate [inaudible 00:17:39] for where we need to calibrate somewhere between 10 or 15 times, not in a row, but over a period of sessions, at which point the system will know, okay, this is what Josh’s blinks look like. And then, you’d only need to recalibrate when things change drastically. That’s kind of the point we’re at right now.
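The calibration flow Dmitry outlines, roughly ten to fifteen example blinks collected over several sessions that teach the system what one person’s blinks look like, could be sketched very crudely as below. Everything here (the class name, reducing calibration to a per-user amplitude threshold) is an assumption for illustration; the real system reportedly uses machine learning on the sensor signal:

```python
import statistics

class BlinkCalibrator:
    """Learn a per-user detection threshold from example blinks.

    Each calibration pass contributes the peak amplitude of one
    intentional blink; once enough samples are collected, the
    threshold is set a bit below the user's typical peak.
    """

    def __init__(self, required_samples=10, margin=0.8):
        self.peaks = []
        self.required = required_samples
        self.margin = margin

    def add_sample(self, peak_amplitude):
        self.peaks.append(peak_amplitude)

    def ready(self):
        return len(self.peaks) >= self.required

    def threshold(self):
        if not self.ready():
            raise ValueError("need more calibration samples")
        return self.margin * statistics.mean(self.peaks)

cal = BlinkCalibrator(required_samples=3)
for peak in [240.0, 260.0, 250.0]:   # three calibration blinks
    cal.add_sample(peak)
print(cal.threshold())  # mean 250.0 * 0.8 → 200.0
```

This also shows why recalibration is only needed "when things change drastically": the stored per-user profile keeps working until the underlying signal shifts enough to invalidate it.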

Josh Anderson:
Oh, good. I can see how that’s a whole lot easier. I could also see how it could allow for kind of movement. I know when I’ve worked with other systems with the camera, if the individual needs to change position or anything like that, you have to reposition the camera, recalibrate and everything. But, being wearable, I would assume that’s something that definitely doesn’t need to be done as much.

Dmitry Selitskiy:
Oh, absolutely. That’s one of the key benefits of not having a camera involved, is you can be in any position in relation to your device that you’re interacting with. And again, because there is no kind of camera, the distance to the screen or the size of the screen that you’re using is completely flexible. You can use a tiny screen, you can use a projector on the wall. You don’t even need to use a screen really. We have some users that have low vision or vision impairments, and they use audio feedback in their communication systems and they just don’t rely on the screen at all. As long as you can blink your eyes, it will work.

Josh Anderson:
So, you kind of touch on it a little bit here, Dmitry, but tell me a story about someone maybe that you’ve worked with or that you’ve kind of heard about that’s been assisted by nous.

Dmitry Selitskiy:
Sure. Let me tell you about our very first user here in New Zealand. So she’s had the nous system for over a year now, and this person is in her late twenties. She has severe cerebral palsy. So she was born with severe and complex access needs. And over the previous two decades, she and her family, they tried, I think, six different access methods, different possible solutions, and nothing worked for her. And then, we involved her and the family and the professionals that work with her in the product development process, even before nous was available on the market, and kind of gathered all of that feedback. But now, over a year later, she uses nous on a nearly daily basis and she has access to a communication solution, the Grid 3 system, and they also use Grid 3 for other activities, whether it is putting together artwork or playing games or taking photos.

Dmitry Selitskiy:
She sent us some really funny photos that she took of her dad falling asleep on the couch. It’s pretty hilarious. But basically, it’s really been transformative for that person and then for the family, because now she has access to all of these different things. And it’s by no means at the peak of what is possible. They’re constantly learning, constantly improving, so there’s still a long road to go in terms of gaining access to these different types of activities and applications. But yeah, the progress has been amazing for them.

Josh Anderson:
Excellent. You kind of touched on this a little bit, but what does the future hold for nous?

Dmitry Selitskiy:
Well, we think about it on kind of two different tracks, I guess. So one is closer to the present, and that’s just improving the current product and the blink detection itself, making the calibration easier and making sure that the experience is as user friendly and smooth as possible. That’s why we’re always keen to hear feedback, both the positive and the negative, from people that are using the system or trying the system. So that’s one area of work. And other than that, we’d just see it becoming more reliable and easier to set up and all of those types of things. But the really exciting area for us is, as I briefly mentioned earlier, this area of brain-computer interfaces. So to put it into context, the sensors that we use currently with nous pick up on different kinds of electrical activity that the body and the brain produce. So while we pick up the electrical activity of the eye movements during eye blinks, we actually also see these other signals, like the brain signals.

Dmitry Selitskiy:
And so, we’re continually working in that direction of providing control using, or through, brain activity. And so, that’s what we’re excited about for the future. Just giving people the ability to interact with, not just their systems, but obviously their families and people around them, just using brainwaves. That’s the really exciting stuff. There are still some years of work, most likely, in that. There are a lot of challenges in terms of both the devices and the form factors and the usability of the devices, as well as the software, but things are moving in that direction.

Josh Anderson:
Oh, I think that would be great. And I know it’s something that’s been talked about for a long time and kind of tried, but I feel like with all the technology that we’ve got now, we are definitely getting closer. Wouldn’t it be just great if those physical disabilities really no longer mattered? Because as long as you could think it, then with the different technologies, you’d be able to do anything that an able-bodied individual would be able to do, just by having the brainwaves go. That would be great. That would be really great. Dmitry, if someone would get nous and would need training, is there training available for them?

Dmitry Selitskiy:
Yes. So, whether it’s us or some of our partners offering nous, we definitely offer support and training. Everything starts with the resources that we provide, even before you purchase the system. If people go to the nous website, they can see all the resources they will have access to, whether it’s the user guide or the training and support videos that we have; all of that is available. And then beyond that, when people purchase the system, either we or our local partners offer the initial kind of onboarding and training, and then even some remote support as well, because the level of training and support that people get is equally as important as the product itself, to make sure that they are successful.

Josh Anderson:
Oh, definitely. Otherwise, it just kind of ends up being an expensive paperweight if they don’t know how to use it. So I know you said it’s available in New Zealand. Is nous available everywhere?

Dmitry Selitskiy:
Yes. So for us, it doesn’t really matter where you are physically in the world. And we, like I said, either we can support people directly over the internet or wherever we have local partners, whether it’s in the US or in the UK, or in some other places, people can get access to the systems and help, locally as well.

Josh Anderson:
Perfect. And if our listeners would want to purchase nous or find out more, what’s the best way for them to do that?

Dmitry Selitskiy:
The easiest way to find out more is to go on to the nous website, which is getnous.app. That’s G-E-T-N-O-U-S dot app, A-P-P. And all of the information is there, our contact details are there, the list of our partners around the world is there. Yeah, that’s the best place to start.

Josh Anderson:
Perfect. Dmitry, thank you so much for coming on the show today and telling us all about nous. We can’t wait to kind of see where it goes. And I really can’t wait to have you on sometime in the future to find out about that brain interface and how that’s going.

Dmitry Selitskiy:
Excellent. Thanks so much for having me.

Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at (317) 721-7124, shoot us a note on Twitter @INDATAproject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. For more shows like this, plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA Project. This has been your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in Indianapolis, Indiana. Thank you so much for listening and we’ll see you next time.
