AT Update Logo

ATU635 – EyeTech Digital Systems with Kia Canteen

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Special Guest – Kiuanta Canteen, MA CCC-SLP/TSSLD – Clinical Education Specialist – EyeTech Digital Systems
Bridging Apps: bridgingapps.org
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
—– Transcript Starts Here —–
Kia Canteen:

Hi. My name is Kia Canteen. I’m a clinical education specialist with EyeTech Digital Systems, and this is your Assistive Technology Update.

Josh Anderson:

Hello and welcome to your Assistive Technology Update, your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson with the InData Project at EasterSeals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 635 of Assistive Technology Update. It’s scheduled to be released on July 28th, 2023. On today’s show, we’re very excited to welcome Kia Canteen from EyeTech Digital Systems to the show. She’s going to tell us all about the AAC devices and services that they offer to help individuals with communication and living more independently. We also welcome back Amy Barry from Bridging Apps with an app worth mentioning.

Listeners, as always, we want to thank you for taking time out of your day to listen to us, and don’t forget to reach out. You can always send us an email at tech@eastersealscrossroads.org. Call our listener line at (317) 721-7124. Hit us up on Twitter @InDataProject, or drop us a line through our website at eastersealstech.com. Again, we thank you for giving us a listen today. Now, let’s go ahead and get on with the show.

Next up on the show, please join me in welcoming back Amy Barry from Bridging Apps with an app worth mentioning.

Amy Barry:

This is Amy Barry with Bridging Apps and this is an app worth mentioning. This week’s featured app is called Capti Voice. Capti Voice is an assistive technology app that allows users to browse the web and add content to a playlist to be read out loud, listen to documents, homework, the news, et cetera. Capti will improve your productivity and make your reading more enjoyable. This is a great tool for students, language learners, commuters, busy professionals, retirees, people with dyslexia or other learning disabilities, and many more. Capti is your literacy and reading support tool.

It’s currently available for iOS devices and it’s free to download with optional in-app purchases. The free features that are available are text to speech. You can switch seamlessly between your devices. You can save books, documents and web articles, customize text color and fonts. There’s also support for print disabilities like voiceover and special font, and you can also skip to the next or previous sentence, paragraph, heading, page, bookmark. And then additionally, if you choose to upgrade, the premium plan benefits include PDF is displayed with images, fonts, colors and layout. You can look up word translations and definitions, and it also helps you to stay organized by saving tracks into separate playlists. For more information on this app and others like it, visit bridgingapps.org.

Josh Anderson:

Listeners, on today’s show, we’re excited to welcome Kia Canteen from EyeTech Digital Systems to discuss their communication devices and services and how they’re helping individuals be more independent and accomplish their communication goals. Kia, welcome to the show.

Kia Canteen:

Thank you for having me, Josh.

Josh Anderson:

Yeah. I am really excited to get into talking about the technology, but before we do that, could you tell our listeners a little bit about yourself?

Kia Canteen:

A little bit about me, I am a speech language pathologist. I began my career in 2003. I’m celebrating my 20th year this year. I have worked the entire lifespan, so from birth to 104 years old, my oldest client, and have done everything from the NICU to hospice, and a lot of in between with adult neuro and brain injury, ALS, and a lot of other neurodegenerative diseases.

Josh Anderson:

Wow. So you’ve seen it all in those 20 years from a speech language pathologist standpoint. That is awesome. I’d almost want to spend the whole time just hearing some stories about that, but that’s not why we had you on today. We had you on to talk about EyeTech Digital Systems, and let’s just start with the big picture. What is EyeTech Digital Systems?

Kia Canteen:

EyeTech Digital Systems makes communication devices, AAC, or augmentative and alternative communication, devices. We utilize eye gaze technology with two of our devices. We currently have three for different types of users. There is one device that is non-eye-gaze, and it's direct access for users who don't need eye gaze. But for users who need a little more customization, we do offer two devices that do have eye gaze. We have Windows and Android tablets and a wide range of software applications, some proprietary and some partnered, that we utilize on our devices.

Josh Anderson:

Excellent. Let’s dig in a little bit to the kind of different devices. Let’s start with the Windows tablet device. Tell me a little bit about it and about the eye gaze system with it, and just describe it to me.

Kia Canteen:

So Josh, our largest device is our 14-inch speech device. It is called our EyeOn Elite, and so our EyeOn Elite comes with our OnBright app that helps users stay connected, utilizes eye gaze, environmental controls, and hands-free calling and texting. We have then our 12-inch EyeOn Air device, which is an Android tablet. It is a lighter weight option that's compatible with multiple access methods as well, including eye gaze or switch. You can use direct select or touch. The eye gaze is a perfect compact option that still offers the functionality of what our users need. And lastly, we have our EyeOn Go, which is a 10-inch speech device. We like to tout that it has up to 38 hours of battery life on our EyeOn Go. It offers lots of mobility for our ambulatory users who are using touch and switch as access methods, and it also includes access to your favorite app stores, like the Google Play Store, as an Android device.

Josh Anderson:

Oh, nice. You can do a little bit more with just communication.

Kia Canteen:

Absolutely. We want to definitely utilize our devices as life function devices. So we understand that we don’t just communicate in certain aspects or certain times of the day, communication happens everywhere and anywhere, and so we like to have our users integrate our devices as a part of their regular life function.

Josh Anderson:

Excellent, excellent. Want to talk just a little bit kind of about the eye gaze system. What does setup look like that for an individual?

Kia Canteen:

Our eye gaze devices utilize our proprietary cameras and we're able to calibrate for the user's needs. And the nice thing about our calibration is that we are able to calibrate on a lot of complex eye users. There are different settings that we can do as far as dwell or blink. We can also set the timing for those functions to work, and we're able to also use user-driven or standard calibration. If there's a little bit of trickier calibration, we are able to calibrate on either one eye or both eyes. So we definitely have options for calibration, which opens up the application for different types of users who may be able to use our eye gaze devices.

Josh Anderson:

Oh, definitely. If you have better control kind of over one, and I know sometimes those can be kind of hard to calibrate, so I like that there’s some different choices there to really be able to assist folks.

Kia Canteen:

Absolutely.

Josh Anderson:

Now I know these can be pretty well tailored to the individual. So besides just the input method, what are some different ways that the devices can be tailored to help individuals with different needs?

Kia Canteen:

We are able to customize our communication boards. Our boards can go anywhere from a one cell board to 88 cells. We also have picture boards where it’s symbol based communication, or for more literate users to use text-based communication with word or phrase prediction. We can also store words or phrases and use a combination of the two. All of our devices are able to utilize functions with keyboards where we’re able to customize those keyboards from a QWERTY keyboard to an alpha keyboard, two-step keyboard and really just meet the needs of what that user’s able to do or a combination of all of them.

In addition to the communication features that we have, we're also able to integrate environmental control access and to utilize… We call it long range communication, and so our long range communication would incorporate being able to utilize an Android or an iPhone to make voice calls or to text. In addition to those features that we have, because we do want to make sure that our users are able to integrate their device as a part of their regular ADL tasks, we can pair our devices with other features like an Alexa or a Google Home so that we're able to control remotes, light switches. I have many users who just like for Alexa to tell them a joke.

Josh Anderson:

Oh, for sure. For sure. The things that we all pretty much end up using them for after we’ve had them for a while. Of course an individual absolutely would definitely want to use it in the same kind of way.

Kia Canteen:

I’m sorry. And beyond our just basic needs, we have integrated a lot of our socialization or social networking or social access is how we like to put it, where you’re able to go onto Facebook, you’re able to go onto Instagram, you can use WhatsApp, you can go streaming if you like to. So we’re able to utilize some other features with applications in addition to the communication to also be able to go on Netflix and be able to stream or things of that nature to make sure that we’re encompassing everything that person may want to do or want to have that independence to have the control over, utilizing through their device, so that’s nice.

Josh Anderson:

That is really nice. Especially if you’re using an alternative access method, it’s great to not have to calibrate many devices or have someone change them out for you or anything like that. If you can access it all from one device, then it’s set up and you’re just pretty much going and can do whatever you want from that one.

Kia Canteen:

Absolutely. In our world today, when we look at just us who are not using AAC devices, we’re utilizing our phones and our computers as multifunctional devices. And so the goal here is to make sure that our users don’t have to choose. And so we never want you to have to make a decision as to whether or not you’re choosing speech or you’re choosing to live the other functions of your life. We want you to be able to choose that one device and it’s able to meet the needs of everything that you want to do.

Josh Anderson:

For sure. Without having to have a caregiver, family member or somebody else sit there and switch things out. That just gives you much, much, much more independence.

Kia Canteen:

Absolutely.

Josh Anderson:

Kia, for listeners who maybe don’t know a whole lot about AAC, what disabilities or needs do these kind of devices usually assist with?

Kia Canteen:

We have a wide variety of needs and users that we do support. Most typically when we think about eye gaze, I know the first thing that may come up in most people's minds would be people with ALS, people who have brain injuries, stroke, dementia, developmental disabilities like autism, [inaudible 00:12:22] syndrome, and the list goes on and on and on. I think we're looking at different indications of users who may be applicable for AAC. I know one area that we are definitely going into and has been brought about by the global pandemic are [inaudible 00:12:41] and vents. And so when we're looking at just medical access and the ability to be an active participant in healthcare, we want to make sure that we're providing practitioners and providers a way to have a good two-way exchange so that everyone, regardless of their ability to communicate or to speak, has a way to be an active participant in their healthcare.

So definitely looking at other indications outside of some of our typical users. As I mentioned before, ALS, stroke, autism, cerebral palsy, brain injury, and some other mental health conditions that are concomitant. We're also looking at just sort of those medical emergency types of situations where there is not really a lot of assistance to help those patients. And when we're looking at patients who are unable to communicate, they've been considered silent patients, so we definitely want to change that outlook, because if it were you or I, we would definitely want a way to be able to be an active participant to dictate the course of my care during a medical emergency or crisis.

Josh Anderson:

Oh, most definitely. And those are kind of the things that people don’t ever really think about. You just think about normal… Or not normal communication, I guess that’s not the right word, but our day-to-day little conversation, and people always neglect those kind of things, the big stuff, the big things that don’t come around all the time, but just being part of that process of life is so important. I know sometimes the individuals who may be using this, their caregivers, their families may not be real super tech-savvy. Can you tell us about the training and other services available with EyeTech Digital Systems that kind of just helps ensure the individuals are successful with their communication devices?

Kia Canteen:

Absolutely. One of the main reasons why I chose EyeTech Digital Systems as a company is because they support the user and they also support the community of caregivers that that user has to support them as well. We have our success coaches, and our success coaches are a team of individuals that are available to remote in or to call in or to schedule a face-to-face with our regional sales consultants to come onsite with to provide that level of support.

In addition, you also have myself, a clinical education specialist. With EyeTech Digital Systems, there is a three-tiered support system. We have our regional sales consultants who are going to be that first line of support for an interaction for each user. And beyond that, once they actually have their device or we're training for the device, we have a team of success coaches, and our success coaches are individuals who do have some AAC background. We have a speech language pathologist assistant. In addition, we also have an assistive technology specialist who supports clinical and technical questions and concerns. There's a click of a switch that they're able to do, just a button that's on the device. They're able to click that button and be connected to schedule a success coach or to reach out directly to someone to provide that level of support.

We understand that just getting the device is not the end all, be all; getting the device is the first step. And so after the device to ensure success and to try and reduce device abandonment, we want to provide that level of support for customization. We want to make that device yours. So our success coaches are available for not just technical support, the device isn’t working right or I don’t know how to change these settings, but they’re also there to customize those devices to meet the needs of the user. If a user only wants to have basic communication, and that’s where we’re starting, we may start with a very simple board, and as their needs change or their needs grow, we can work with either their clinical team or their family to customize those boards to meet the needs at home, at school, at work, or wherever that user’s going to be.

Josh Anderson:

And then beyond the devices, do you have mounts and other things like that available as well?

Kia Canteen:

We do. We partner with Rehadapt and they provide our mounting needs. If there are any specific additional mounting needs that Rehadapt is unable to fulfill, we will work with that family, that caregiver, that user, that provider to seek out the appropriate mounts. Those mounts can range from tabletop mounts, floor mounts, or wheelchair mounts.

I did also want to mention, I know we talked a little bit about the access methods and other things beyond speech. We also have just recently partnered with Ability Drive. And so now our users, again, when we're talking about having to choose what we want to do, they have the ability to control their motorized wheelchair with their eyes. That's a very, very, very cool feature that we're able to do now. So we're able to incorporate the ability to drive motorized wheelchairs through Ability Drive via your communication device.

Josh Anderson:

Kia, you’ve been doing this for a little while. Could you tell me a story maybe about an individual and their success working with EyeTech Digital Systems, maybe one that sticks out to you?

Kia Canteen:

Oh, absolutely. There’s so many. Oh my.

Josh Anderson:

Maybe a couple. We can throw a couple.

Kia Canteen:

Oh sure, a couple. We can absolutely do a couple. I have not done AAC my entire career. AAC is probably where I have been for the last five years or so of my career. However, throughout my career, I have seen the challenges of the frustration of communication and that inability to be able to express what you want and what you want to say and to have someone else make decisions for you.

We have a user who's 30 years old. She was two months away from graduating nursing school. While in her car she was struck by a gunshot, and so as a result of that gunshot wound, she was paralyzed from C3 down, unable to really use any of her limbs for switch access or anything of that nature. We were able to, while she was in ICU, go in with our device, put the device in front of her, and one of the very first things that she typed was, "I want to live". And so you think about that and you think about that person being you, I was just like, wow, this is phenomenal. And it gives a voice to users that may not have had a voice.

I remember back in my career, just kind of trying to think of other methods that I've used to try to communicate with patients who are in that predicament, blink once for yes, blink twice for no, thumbs up, thumbs down, and just looking at the effectiveness of that. In working with her, we trained her care staff and came in for a check-in visit with her and her nurse, and just looking at the patient-provider interaction during something as simple as a pain assessment, the nurse came in to assess pain, she was able to communicate what pain she had, what type of pain, where it was located. The nurse gave her some pain medication, came back, and she was able to type on there that she no longer experienced any pain. She was able to get the appropriate dosage. She felt good and that this thing was awesome to get her needs met. And so for me, it was just chills.

Josh Anderson:

Oh yeah, for sure. Well, yeah, like you said, it’s so much more information than just yes, no. Even just in something… I don’t want to trivialize it, but as simple as that pain management. You could probably look at a scale and say it’s at a seven or a three or maybe something like that, but being able to say where it is, describe it and give so much more information to the doctors and nurses just helps everybody all the way around.

Kia Canteen:

Absolutely. Another one that comes up, and there are things that I guess we don't think about, and we think about basic communication as just being able to give a response. To maybe in an emergency, be able to immediately respond and meet what we anticipate or what we interpret as the important piece of what's being communicated.

Another veteran who received one of our devices was at a VA hospital, and the very first thing that he typed on the device, he had been trying to communicate, I guess, the adjustments for the temperature in his room, and they weren’t quite getting it right and there was a misinterpretation that he wanted the air on and not off. And so once he was able to go in and we customized his device where he had his settings, he was able to then control the temperature settings with Alexa. So, one of the first things that he typed on there is “Now I won’t be frozen”. It was just… And I know we laugh at it, but if you’re thinking of you’re that person and you’re in a room and the lights are on all the time or the air is at a temperature that’s not comfortable for you, just being able to express that and have those needs met in addition to some emergent needs, it’s just kind of a basic need that we all would like to be fulfilled.

Josh Anderson:

Oh, for sure. We think of these big grandiose things and if these little things aren’t taken care of, then I can’t move on to those other things. Just those basic comforts, those basic things we need that are wholly important but kind of taken for granted when you can just go and maybe hit the thermostat or something.

Kia Canteen:

Absolutely. And so of course our users are wide ranges. I could be here for days. We do have other non-typical verbal users, and I’d like to just talk a little bit about the misconception of just because someone speaks that AAC may or may not be appropriate for them. And I’d like for us all to remember that we want to choose communication over speech, and so there’s sometimes a lot more to the message than that person’s able to get out when we’re thinking about planning, organizing our thoughts, choosing the right words, getting them in the right sequence, making sure that that message is delivered.

I have a user who is a verbal user. She does speak. She has cerebral palsy. She’s pretty independent where she does live alone, but because of some of her thought organization and some of her physical challenges with just being able to access touch, she utilizes eye gaze. And so she uses her device as kind of a combination between speech and assistive technology for access. She made a mention that it’s sort of an extension of her body, and if she didn’t have that device, simple things like being able to call 9-1-1 or utilize those type of features with our connection with the phone make a big difference, because she lives alone, and that might have been the difference between her living alone and not living alone, having a way to be able to access that type of communication.

Josh Anderson:

Nice. And real quick, just while we got a little bit of time, just tell us the difference, because I’ve had SLPs tell me this before, between just speech and communication.

Kia Canteen:

Verbal speech doesn't have to always be the goal. We want to have good communication as the goal. And there are many reasons why someone may choose to use AAC in addition to their verbal speech, because AAC provides us with a toolbox. There are some users who have multiple communication challenges beyond just being able to produce the words. We call it sometimes a mind-body disconnect. These atypical motor planning movements affect how someone, let's say with autism, might plan, control and execute a variety of their movements. And so in many ways, they may feel that their mouth words don't match their brain words. For other users, there's an auditory processing or an oral motor coordination challenge that sometimes symbol-based AAC can build upon, giving their relative strengths with visual cues and aids for them to be able to express the desired communication message that they'd like to, so we won't get that mixed up, if that makes sense.

Josh Anderson:

Nope, that makes total sense. And thank you for kind of clarifying that for folks, because I know sometimes the two words and the two thoughts of speech and communication, we’ll jumble together as if they’re the same kind of thing, and I know there’s so much more that goes into it than that. Well Kia, if our listeners want to find out more about EyeTech Digital Systems and everything that they offer, what’s the best way for them to do that?

Kia Canteen:

The best way for them to do that is to visit our website and to definitely click on the button to schedule a demonstration or to schedule a time to have further discussion with one of our AAC consultants.

Josh Anderson:

And Kia, what is that website?

Kia Canteen:

So our website is EyeTech, E-Y-E-T-E-C-H-D-S.com. So eyetechds.com.

Josh Anderson:

Awesome. We’ll put a link to the website down in our show notes. Well Kia, thank you so much for coming on today, for telling us… Well, for talking about communication with us for one thing and for telling us about all the great things that EyeTech Digital Systems offers.

Kia Canteen:

Absolutely.

Josh Anderson:

Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on an Assistive Technology Update? If so, call our listener line at (317) 721-7124. Send us an email at tech@eastersealscrossroads.org or shoot us a note on Twitter @InDataProject. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation, or InTRAC. You can find out more about InTRAC at relayindiana.com. A special thanks to Nicole Prieto for scheduling our amazing guests and making a mess of my schedule. Today's show was produced, edited, hosted, and fraught over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the InData Project, EasterSeals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update, and I'm Josh Anderson with the InData Project at EasterSeals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.
