AT Update Logo

ATU574 – Cognixion with Meaghan Azlein

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Special Guest:

Meaghan Azlein – Marketing & Consumer Growth Coordinator – Cognixion

Find out more about Cognixion:
Link Tree: https://linktr.ee/cognixion
LinkedIn: https://www.linkedin.com/company/cognixion/
Facebook: https://www.facebook.com/cognixion/

Stories:
Apple GAAD Story: https://bit.ly/3wFmuwZ

——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

—– Transcript Starts Here —–

Meaghan Azlein:
Hi, this is Meaghan Azlein. I am in the Growth & Customer Success department here at Cognixion, and this is your Assistive Technology Update.

Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 574 of Assistive Technology Update; it’s scheduled to be released on May 27th, 2022. On today’s show, we’re super excited to have Meaghan Azlein on from Cognixion. She’s here to talk about some of the really cool things they’re doing, including how to control a communication device using just your brain. We also have a quick story about some of the things that Apple announced during Global Accessibility Awareness Day. We thank you so much for listening, and let’s go ahead and get on with the show.

Josh Anderson:
Maybe you’re looking for some new podcasts to listen to. Well, make sure to check out our sister podcasts, Accessibility Minute and AT FAQ, or Assistive Technology Frequently Asked Questions. If you’re super busy and don’t have time to listen to a full podcast, be sure to check out Accessibility Minute, our one-minute-long podcast that gives you just a little taste of something Assistive Technology based, so that you’re able to get your Assistive Technology fix without taking up the whole day. Hosted by Tracy Castillo, this show comes out weekly. Our other show is Assistive Technology Frequently Asked Questions, or AT FAQ. On Assistive Technology Frequently Asked Questions, Brian Norton leads our panel of experts, including myself, Velva Smith, and our own Tracy Castillo, as we try to answer your Assistive Technology questions. The show does rely on you, so we’re always looking for new questions, comments, or even your answers on Assistive Technology questions. So remember, if you’re looking for more Assistive Technology podcasts, you can check out our sister shows, Accessibility Minute and AT FAQ, wherever you get your podcasts, including Spotify and Amazon Music.

Josh Anderson:
So folks, our story today comes to us from over at Forbes. It’s written by Steven Aquino, and it’s titled “Apple Honors This Year’s Global Accessibility Awareness Day by Previewing Innovative New Assistive Technologies.” I know we mentioned on the show last week that Global Accessibility Awareness Day was celebrated on the third Thursday in May. This is a day just to really raise awareness and let folks know about accessibility and making everything that they do accessible. Every year around this time, Apple usually comes out and talks about some new features and some new things that are going to be built into their devices, and this year was no different. So I’ll put a link to this story over in the show notes so that you can read it yourself, but I’m going to scroll on down to where it talks about some of the new things that Apple highlighted on Global Accessibility Awareness Day.

Josh Anderson:
The first one is Door Detection. Door Detection will be part of the Magnifier app on iOS and is an offshoot of the People Detection feature, which was introduced back in iOS 14.2 or thereabouts. It uses LiDAR and machine learning and can aid blind and low vision individuals in identifying doors. It says that these features, Door Detection and People Detection, live in a new place in Magnifier called Detection Mode. The software is even able to tell someone whether a door is open or closed, how it should be opened, and even read any signage that might be on it, such as accessible restrooms. This feature is supported on any iPhone or iPad that has LiDAR, so usually you have to go with the pro models in order to get that. The next thing it talks about is Apple Watch mirroring. This uses AirPlay technology, and it allows people to control their Apple Watch through their paired iPhone. So if you’ve got different mobility challenges, if you’re using switches or Voice Control or things like that, you can actually do everything from your watch on your phone.

Josh Anderson:
So you can control the entire thing from the phone, using the different gestures and ways that you normally would access your phone. The next one it talks about is Live Captions. This is new for iPhone, iPad, and Mac; Live Captions are available for audio-based text messages and FaceTime calls. Apple says these transcriptions are private and secure, since the processing takes place locally and nothing is transmitted to iCloud or another server. So this could be a really great way to communicate with folks who have hearing impairments via text message or FaceTime calls. They’ve also added new VoiceOver languages; it says that the new languages have full support of the Speak Selection and Speak Screen accessibility features. And it looks like they’re going to have about 20 new ones, including Bulgarian, Ukrainian, and Vietnamese. So even more folks will be able to access VoiceOver, which can really open up a whole new world to individuals. But this wasn’t all that they talked about.

Josh Anderson:
One thing that I’m really excited about from this launch was that they announced miscellaneous new things like Siri Pause Time, which forces Siri to wait before responding to queries and commands. This has always been kind of an issue for folks: I ask Siri something, and I’m not done talking and she’s already answering. It says here that users are able to adjust the amount of time that Siri waits. So this is great for folks who have any sort of speech impairment or just need a few extra seconds to get that thought out before she starts answering. That’s something I am very, very excited about. It also says here that sound detection is going to be able to pick up some different things, and that Apple Books is going to let you customize options like bold formatting, change the word spacing, or other things so that it’s a little bit easier for folks to read.

Josh Anderson:
So there’s even more information in this article, and we’ll put a link to it so that you can go check it out. I think we’ve had quite a few of Steven’s articles from Forbes on the show here before. But again, just some cool things that Apple highlighted to celebrate Global Accessibility Awareness Day, and we’re looking forward to trying them all out. Listeners, on today’s show we get to see what happens when you mix AI, AR, and AAC. Our guest today is Meaghan Azlein from Cognixion, and she’s here to tell us about some pretty cool devices that they’re working on. Meaghan, welcome to the show.

Meaghan Azlein:
Hi, thanks for having me.

Josh Anderson:
Yeah. I am really excited to talk all about Cognixion and all the cool things you guys are doing, but before we do, could you tell us a little bit about yourself?

Meaghan Azlein:
Yeah, absolutely, I’d love to. So my background is interesting and a little bit diverse. You wouldn’t think I’d end up in the Assistive Technology field, but here I am. My background is actually in secondary science education. So originally, way back when, I used to be a junior high science teacher, because I love a challenge and that is just the best age for teaching. I think everyone says I’m crazy, but I don’t think that’s true. So about seven years ago, I was at a Startup Weekend competition in Santa Barbara, California, at UCSB, and our CEO, Andreas Forsland, who I knew at the time through a mutual friend and colleague, was actually there as one of the judges on the panel for that weekend.

Meaghan Azlein:
And on the team that I was on, I took over our marketing role, and we actually ended up winning that Startup Weekend. So after that I got in touch with Andreas and ended up joining the team part-time for marketing, and it kind of grew from there. The message that we have at Cognixion and the community that we work with really spoke to me. I have a background in education, but also in special education, and for me it just seemed like such an amazing opportunity to really change the world. So I actually left my teaching career and joined Cognixion full time.

Josh Anderson:
That’s awesome. It’s amazing how these jobs and callings can kind of find us, isn’t it? It’s not always what you go out looking for.

Meaghan Azlein:
It’s so true. Yeah, and my dad is actually a paraeducator. So before I became a teacher, I actually got to go volunteer in his moderate-to-severe classroom. Just working with that community of students really drove me to have a passion for working in that area, especially because so much of the community who needs Assistive Technology is so underserved. So seeing a company that wants to put those needs first, and not only make the technology that’s so needed, but make it look cool and make it something that you’d want to use, really spoke to me. Even people who might not need AAC are like, “Hey, what is that?” Like, “Where do I buy that? I want one.”

Josh Anderson:
Oh, exactly. And it’s kind of nice: as technology becomes more, I guess, ingrained in our lives, it seems like it’s a little bit easier for individuals with disabilities to use it and not stand out as much.

Meaghan Azlein:
Absolutely, yeah.

Josh Anderson:
Well, the real reason I had you on here today was to talk about first Cognixion. Can you tell us what is Cognixion?

Meaghan Azlein:
Yeah, absolutely. So Cognixion is a neuroscience and tech company, and we are working on pioneering new technology within the field of Assistive Reality and what we call wearable speech. So we develop Assistive Technology using multiple interaction modalities, including an AI-powered, non-invasive BCI, or what most people call a brain-computer interface. We are based in Santa Barbara, California, and we also have a team up in Toronto, Canada, that develops software that helps people with communication challenges or a variety of needs, including those who might be affected by some type of neurodevelopmental disorder or a brain or spinal cord injury, to express themselves in a much faster way than any solution that’s currently available. Just to really help increase that ability to communicate with others and provide a different type of tool. We also have something that no one else currently has, which is Alexa integration within our AAC app.

Josh Anderson:
Nice, all right. Well you just opened up a whole floodgate of questions for me. So let’s start off by talking about a few of your different kind of solutions. Could you start off by telling me about Speakprose?

Meaghan Azlein:
Yeah, absolutely. So Speakprose is actually an app that’s available in the Apple App Store. You can download it for free to try it out, because our belief at Cognixion is that we want to make the best tools possible, but not everyone uses the same tool and not every tool is right for every person. So we want to make sure that everyone can try it out for free before making that commitment of purchasing anything. Then it’s got a monthly subscription and a yearly subscription, and we really believe that accessibility goes so far beyond just the accessible tool, to also offering accessible pricing. So that’s definitely a great option for a lot of people to try out. It is an augmentative communication app for people who have the ability to use switch access, or who are able to type or tap on a screen or swipe. So it has gestures, it has basic yes/nos, it also has a sentence builder, and it has a keyboard with predictive text that actually connects to Amazon Alexa.

Meaghan Azlein:
So you don’t need to buy a Dot, you don’t need to buy an Echo. Sorry to everyone whose Alexa just went off during the podcast. You can actually send it straight to Alexa and she can send you back an answer, whether that’s asking what the weather is, wanting to check on a package, or turning your lights on and off, which my coworkers love to do when they’re not in the office; it freaks everyone out, it’s great. You don’t actually have to be in the location where those smart devices are. So if you set something within your Speakprose app, say you want to turn a light on or off at work in your office, you can actually do that from home, which my CEO does to his wife all the time from the office. You can control a lot of really fun things with your Alexa-enabled options through the app, and you don’t have to go out and purchase anything extra to do that.

Josh Anderson:
Oh, that is super cool. Because I know all the smart home technology, everything that, I guess I’ll say “Alexa” too, we’ve already set off everyone’s devices enough, but-

Meaghan Azlein:
I know, I should have put an Alexa trigger warning at the beginning of this, come to think about it.

Josh Anderson:
… but all the different things it can do can really help folks, especially those with maybe some more severe disabilities. Normally, you really have to have the voice or be able to fully access the app, which can be a little tricky and everything. So I love that you built that in, so that I’m not going app to app if I can’t really access everything in a different way. It’s awesome that you open up that whole world of smart home technology to folks who can really use it.

Meaghan Azlein:
And it’s really great too, with my background in education and teaching, it’s really great for independence goals. Whether it’s a student trying to reach the independence goals that their teachers have laid out on their IEPs, which is so essential and so important for growth, or anyone at home who might have limited mobility. For example, my husband is disabled, and sometimes it’s really difficult for him to get up and go in the morning, and the temperature inside our home can really affect his mobility. So for him, being able to change the temperature from his phone in bed, to make life just a little bit easier, is so important. He doesn’t need the communication portion of an AAC app, but being able to have Alexa on it, to change the temperature and turn the lights on and off when he’s not able to get to the switch or the temperature control, is just super helpful.

Josh Anderson:
Oh, definitely. And now another solution that you guys have is Cognixion ONE, and I’m really excited to learn about this. So just start off by telling our listeners kind of what is it?

Meaghan Azlein:
So we get a lot of questions, because it’s definitely something that’s not seen out there very often. For example, when we’re at a trade show or things like that, now that everything’s starting to open back up, they see we have a brain sitting on the table and people are like, “That looks suspicious. Why do you guys have a brain? You guys are an AAC company.” But we actually are working on the world’s very first brain-computer interface that has an augmented reality, wearable speech aspect. It’s completely non-invasive.

Meaghan Azlein:
It looks very similar to your typical augmented reality headset, but it’s actually able to generate speech. Now, we have an augmented reality version that uses head movement. So if you are not fully locked in, you can have a different version that doesn’t need the BCI portion of it. But for example, someone who is fully locked in, who is in a later stage of ALS or has had some type of stroke or traumatic brain injury, and isn’t able to move their head and isn’t even able to use an eye tracking option anymore, for them this would be something that really isn’t currently out there in the world right now. Once you get to that stage of ALS or different conditions, you completely lose the ability to communicate, but you still have so many important things you need to share with the world.

Josh Anderson:
Oh, definitely. And you kind of brought this up: whenever I think of a brain interface, I think of putting electrodes inside the brain and things like that. I know you said that this wasn’t invasive. So how does that work? Well, how does it work? I guess I’ll ask you, because you’ll know a lot better than I will.

Meaghan Azlein:
Absolutely. So immediately when people hear brain-computer interface, they think of something similar to mind control, like, can it read my thoughts? Which I will tell you right now, it can’t; I make sure everyone knows that right up front. And then they also think of things like invasive surgery, or they think of wearing those caps that you see often, like the old-school swimming caps from the fifties, with a lot of conductive gel, so your head is really goopy, or you have to shave your head. But actually, the Cognixion ONE Axon model, which is what we call the version that has the brain-computer interface in it, has dry electrodes that sit on your head. It’s completely non-invasive. You don’t have to shave your head for it; it’s actually able to move those electrodes in and out. We have a wheel on the back that can coordinate that, and you don’t have to wear gel.

Meaghan Azlein:
So we’ve got six electrodes on the back portion of the headset, and then there’s one on each side of the head, because it goes around your head in a circle. It uses those to sense the SSVEP signals in your brain. Essentially, picture looking at a computer screen, and you are presented with a couple of options on the screen; let’s just say yes, no, and maybe. Visually, those words look exactly the same, but on a very minute level, each one of those words is flashing at a different frequency in Hertz. So each one has a different speed of flickering, and what happens is, as you’re looking at whatever word you’re focusing on, say “no,” your brain’s signals sync to the item in front of you that is flashing at that speed. So what the electrodes do is detect which flashing speed your brain is responding to, translate that into the choice that you’re making, and then vocalize it out loud through the speakers in the actual headset.
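[Editor’s note: the SSVEP idea Meaghan describes, where the brain’s signal syncs to the flicker rate of whatever you’re looking at, can be sketched in code. This is not Cognixion’s actual implementation; the sampling rate, the flicker frequencies, and the simple FFT band-power classifier are all illustrative assumptions.]

```python
import numpy as np

FS = 250  # assumed EEG sampling rate in Hz
TARGETS = {"yes": 8.0, "no": 10.0, "maybe": 12.0}  # hypothetical flicker rates

def classify_ssvep(eeg: np.ndarray, fs: int = FS) -> str:
    """Pick the on-screen option whose flicker frequency dominates the EEG signal."""
    # Window the signal and compute its power spectrum
    windowed = eeg * np.hanning(len(eeg))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def band_power(f0: float, width: float = 0.5) -> float:
        # Sum spectral power in a narrow band around the target frequency
        mask = (freqs >= f0 - width) & (freqs <= f0 + width)
        return spectrum[mask].sum()

    # The option whose flicker frequency carries the most power wins
    return max(TARGETS, key=lambda label: band_power(TARGETS[label]))

# Simulate 2 seconds of a user attending the option flashing at 10 Hz ("no"),
# modeled as a 10 Hz sine plus broadband noise
t = np.arange(0, 2, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg))  # → no
```

Real systems refine this with multiple electrodes, harmonics of each flicker frequency, and more robust classifiers, but the core idea is the same: the visual cortex entrains to the flicker rate of the attended stimulus, and that rate can be read out of the frequency spectrum.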

Josh Anderson:
Oh, nice. Nice, I’m so glad you said that it can’t just read your thoughts because-

Meaghan Azlein:
That is always [inaudible 00:17:40].

Josh Anderson:
…a machine that sits on my head and tells everyone around me exactly what I’m thinking sounds like a nightmare.

Meaghan Azlein:
I know, I’m like, I don’t know if I really want to do that. I mean, I would want to put it on my dog if it could read thoughts, that would be super cool. But no thought reading and no invasive surgery, which is really great. You don’t have to shave your head, and it’s able to move in and out. We’re of course always building and growing to make it more adaptive and more inclusive of all different types of hairstyles. We actually have an advisory council specifically to work on things like that, to make sure it works for a variety of people and a variety of needs.

Josh Anderson:
Oh, definitely. And I love that you kind of brought that up; it was going to be a question, but you already completely answered it. For individuals with that advanced ALS, where eye gaze isn’t working and head movement or switches aren’t really going to work, unless you make something completely custom for that individual, there’s not really a solution out there. So having that new access method is absolutely excellent. So where are you currently at with the development of the Cognixion ONE?

Meaghan Azlein:
So we have two models of our Cognixion ONE. The first one I mentioned, our augmented reality version that uses head movement to make your selections on the screen, is called the Nexus; that should be available at the end of the year and is currently in development. Then our version that has the brain-computer interface is the Cognixion ONE Axon, like the axon in your brain. That one is also currently in development, so if you guys want to connect with us on LinkedIn or Facebook or any of those types of things, we’ll be announcing more timeline information as we get closer.

Josh Anderson:
Excellent, excellent. Meaghan, you probably have quite a few of these, but can you tell me a story, or maybe even a couple of stories, about someone who has used Speakprose or Cognixion ONE and how it’s made a positive impact on their life?

Meaghan Azlein:
Absolutely. Like I mentioned, we have a really wonderful advisory council that is filled with really amazing, inspirational people who make so much impact and have so many important things to say. For example, one of our Brainiacs is a wonderful gentleman who is here in Santa Barbara. He has cerebral palsy and he’s a DJ. He uses Speakprose every once in a while at his gigs, and Cognixion ONE, to communicate over his loudspeakers while he’s doing a DJ set. His name is DJ of Abilities, and he goes out and DJs weddings and parties. We just talked the other day and he’s booking up like crazy; he’s fantastic. He uses Speakprose to communicate over the loudspeakers because sometimes it’s difficult for people to hear what he’s trying to communicate and say, and he has really important things to say. He’s actually also very hilarious.

Meaghan Azlein:
So another one of our Brainiacs is Tim Jin, and he actually did a TED Talk with our CEO, Andreas. Tim is a rock star. He’s super cool; if you guys have ever heard of the Aquabats, he’s actually friends with them. He’ll go on stage with them every once in a while, he goes skydiving, and things like that. Tim also has cerebral palsy and is in a wheelchair, and he actually uses his toes as his access method for his AAC app.

Josh Anderson:
Meaghan, tell us a little bit more about the Brainiac Council.

Meaghan Azlein:
Yeah, I would love to. Here at Cognixion, like I mentioned earlier, we believe that having accessibility in so many forms is really important, but accessibility tools really need to be driven by the people who are going to be using them. The people who are going to be needing them know much better than I do, as someone who doesn’t use AAC or the communication devices that we are making. I can only offer so much insight compared to someone who is actually using those devices. It’s so essential, so important, and really integral to what we do at Cognixion that, based on our Brainiac Council’s feedback, we actually created this version of Cognixion ONE, the Nexus, because so many Brainiacs were saying, well, I might not necessarily need the Axon version with the brain-computer interface, but I still want to have wearable speech.

Meaghan Azlein:
And I think that would be a great option. So we actually went and changed what we were doing based on the users within our Brainiac Council. Our Brainiac Council is made up of about 175 individuals, whether they are users of AAC technology, experts within the community, SLPs, or assistive technologists; we have some neuroscientists, we also have some neurosurgeons, or they’re just really into the tech space within augmented reality and things like that. We have that council and it’s growing. So if any of you out there listening are interested in possibly learning more about our Brainiac Council, or want to join, or even want to test out Cognixion ONE or our Speakprose app, please reach out. I would love to get to know you, and we are always looking to hear and include more voices in the technology we develop.

Meaghan Azlein:
It’s so important to us for everyone’s voice to be heard. The voices of those who are going to be using Cognixion ONE and Speakprose are just essential; without those voices, we wouldn’t be where we’re at today. For example, one of our Brainiacs, who is no longer with us, got to be one of the very first people to try on Cognixion ONE Axon. When we got there, she was using a switch access modality on the device she currently used and was trying to communicate her name and just some other information, super quick, a couple of words. But it took a substantially long amount of time; I believe it took over 20 minutes just to say, “Hi, my name is so and so.” When we put Cognixion ONE on her, the whole room was in tears, because within just a few minutes she had communicated and said, “Wow, I wish my parents could see me now.” Just knowing that, this type of technology needs to have so much more focus, so much more effort behind it, and so much more voice.

Meaghan Azlein:
So we can get it to people, so that the parents of people who might not verbally be able to communicate “I love you” can hear that at some point. Because there are so many thoughts and so many ideas in the minds of so many people, and it’s so often presumed that people who might not vocally be able to say things like everyone else in their community or in their pod can’t do that. It’s assumed that they might not have anything important to say, and that is just so not true. There are so many brilliant ideas and so many brilliant things that are just waiting to be said, and that’s what we’re hoping to do by unlocking a million-plus voices of people around the world who are non-speaking, because there really are that many people in the world who might be in need of an AAC tool. I just can’t imagine how many ideas and how many scientific breakthroughs we’re going to have when that ability becomes a possibility.

Josh Anderson:
Oh, definitely. Meaghan, if our listeners want to find out more about Cognixion and all the great solutions you guys have, what’s the best way for them to do that?

Meaghan Azlein:
Yeah, absolutely. So the best way for you guys to connect with us about the solutions we offer and things like that would probably be heading to our LinkedIn and giving us a follow. That’s where you’ll get all of the latest updates, the things that are happening, and the events we’re going to, and also when we have any opportunities to join our Brainiac Council, do any testing, or ask more questions, that’s usually the best place to go. You can also go to our website and things like that, but if you follow us on Facebook, Instagram, and LinkedIn, you’ll get our updates as they’re coming.

Josh Anderson:
Excellent. We’ll put all of that stuff down in the show notes so that folks can easily find you. Well, Meaghan, thank you so much for coming on today and talking about Cognixion and all the great solutions that you guys have available.

Meaghan Azlein:
Thank you guys so much for having me.

Josh Anderson:
Do you have a question about Assistive Technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our Listener Line at 317-721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAproject. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation, or InTRAC. You can find out more about InTRAC at relayindiana.com. A special thanks to Nikol Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fraught over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update, and I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.

 
