Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
RERC on AAC – David McNaughton | http://rerc-aac.psu.edu
AT&T unveiling Internet of Things-linked wheelchair :: Editor’s Blog at WRAL TechWire http://buff.ly/1Qd4efO
ChromeVox Screen Reader for the Blind – A11yNYC Demo http://buff.ly/1FIJwyU
Flying Blind, LLC http://buff.ly/1FIJqYb
What’s New in iOS 9 for Accessibility http://buff.ly/1KOxhs5
App: HeyYouADHD www.BridgingApps.org
Listen 24/7 at www.AssistiveTechnologyRadio.com
If you have an AT question, leave us a voice mail at: 317-721-7124 or email firstname.lastname@example.org
Check out our web site: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
——-transcript follows ——
DAVID McNAUGHTON: Hi, this is Dave McNaughton, and I’m the co-leader of training and dissemination for the RERC on AAC, and this is your Assistive Technology Update.
WADE WINGLER: Hi, this is Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana with your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Welcome to episode number 225 of Assistive Technology Update. It’s scheduled to be released on September 18 of 2015.
Today we talk with David McNaughton who is with the Rehabilitation Engineering Research Center on Augmentative and Alternative Communication, or the RERC on AAC. We talk about an Internet of Things linked wheelchair that AT&T is working on with Permobil; a demo of the ChromeVox screen reader; an interesting story about what’s coming down the pike in terms of accessibility for the iPhone operating system, iOS 9; and an app from BridgingApps.
We hope you’ll check out our website at www.eastersealstech.com, shoot us a note on Twitter @INDATAproject, or give us a call on our listener line. We’d love to hear from you. That number is 317-721-7124. Leave a message. We may feature your call on a future show.
I don’t know about you guys, but I’m constantly hearing about the Internet of Things. Our friends over at RESNA have kind of turned me onto a story here. The title reads, “AT&T Unveiling Internet of Things Linked Wheelchair.” What happened was, not too long ago in Research Triangle Park in North Carolina, there was a conference focused on wireless technology. AT&T and Permobil have announced a new Internet of Things proof-of-concept chair that basically connects the wheelchair to the Internet to send some helpful information.
Some of the features they talk about include a thing that will monitor seat position and cushion pressure and let people know if the seat cushion pressure is out of range. That can lead to some problems like pressure sores or ulcers. It also monitors battery level and maintenance requirements, even GPS location, so that your wheelchair can essentially tell the wheelchair dealer that your battery is low, it needs some maintenance, and even where it is. I have to assume that that’s all permission-based information, and right now they’re talking about how this is just kind of a proof of concept. It is a fascinating thing to think about how your wheelchair might be able to send critical information to the people who can help you deal with those problems.
I’m going to pop a link in the show notes over to the TechWire blog over at WRAL so you can read more about this wheelchair, and a nod to RESNA for letting us know about the story.
SPEAKER: I’ve been using a very handy control key which silences speech. So now I’m on ChromeVox.com. I can navigate as I did in the system tray and just use the arrow keys to navigate.
WADE WINGLER: So what you’re hearing there is a little clip from a ChromeVox screen reader demo. Some gentlemen named Peter and David from Google’s Chrome OS and ChromeVox team went to A11yNYC, a New York City accessibility meetup. They gave a demo of how ChromeVox, which is the built-in screen reader for the Chrome browser and the Chrome operating system, can be used to navigate the web. It can also do things like Google Docs, and it offers accessibility to people who use a screen reader on those Google-related sorts of products. They even did a little bit of a touch interface demo where, on a laptop running Chrome OS, the Chrome operating system, they were able to use navigation gestures reminiscent of what you do on an iPhone or those kinds of things. It’s about a 17-minute demo. It’s kind of nice. They show the basics of ChromeVox. They show us gesture interfaces and kind of do a pretty good job of explaining how this might be a solution for somebody who needs a low-cost or no-cost screen reader to do some basic computing. I’ll pop a link in the show notes, and a nod over to the folks at Flying Blind who included this in one of their newsletters and helped me find it so I can share it with you guys. If you’re not familiar with the newsletter over at Flying Blind, I would recommend that you go and subscribe. Head on over to flying-blind.com and sign up for their newsletter. You’ll get all kinds of new information. I’ll pop a link in the show notes to those guys.
It’s that time of year when Apple gives us some teasers and then some real information about what’s coming down the pike in terms of their products. Not too long ago, there was a big announcement where they talked about the new iPhones, the new iPad Pro version that’s coming out, some changes to the Apple TV as well as updates to the Mac operating system and iOS 9 for the iPhone.
As I was scouring around the web to help figure out what this means for folks who use assistive technology, I stumbled upon the website of Luis Perez. He’s got an article there titled just what you would think, “What’s New in iOS 9 for Accessibility.” It’s a great article and gives a lot of information about some of the things that are coming down the pike in iOS 9, which should be out about the time this podcast hits the air.
One of the things he talks about is a new feature called Touch Accommodations. It allows you to adjust the way that somebody physically interfaces with their iPhone or whatever device is running the iOS operating system. It allows you to adjust the hold duration, that is, how long the screen needs to be touched before that gesture is recognized. It also has an option to ignore repeated touches, which kind of reminds me of slow keys on the keyboard, so that if you unintentionally hit the screen too many times, it will allow you to adjust what it takes before a touch is actually registered and impacts your interface with the device.
For those who have a hard time with hitting the screen multiple times, there is a thing called Tap Assistance. It allows you to set it up so that your initial touch location or your final touch location tells the phone where you meant to touch. There are some adjustments that are related to delays that allow you to kind of fine-tune that just a little bit.
There’s also some new stuff related to Switch Control, which was new a couple of operating systems ago and is now being improved to allow something called single-switch step scanning. It allows you to hold down a switch and let go of it when you want it to actually activate. You can adjust the dwell timing so that you can change how long the item needs to be highlighted before you make a selection. There’s a thing called long press that allows you to specify the amount of time that you hold down a switch to perform a special action.
And then a thing called recipes which allows you to kind of create a little interface that will let you deal with things like iBooks. If you need it to always hit the middle of the screen for example to activate something, it allows you to create those special recipes on your switch interface.
For those who use assistive touch, it will allow you to change the menu so that if there are features you want to have on the top menu of assistive touch, you can put those there. You can also create some customization there within your secondary menus for the things you use the most often.
There are a couple of little tweaks for users of VoiceOver and Zoom. They have now made text selection and typing mode options available on the rotor instead of having to go into other menus to change that. Also, there are some things that you can do to change your VoiceOver keys on the keyboard.
Another one of the features that I find interesting, and I kind of run into this occasionally, is that now you can disable the shake-to-undo feature and some of the other vibration-related options that are available under Interaction in that accessibility pane. A lot of times I have my phone on a mounting system on the dash of my car so that as I’m driving along, I can do navigation and those kinds of things. When I hit a bump, my phone will shake and try to undo the last thing that I did on the phone. It will be nice for me to be able to shut that off, and also for folks who have issues with their hands or tremors and need to turn off the shake-to-undo feature.
It looks like there is some pretty cool stuff coming down the pike as it relates to iOS 9 and accessibility. I’m going to pop a link in the show notes over to Luis Perez’s website where you can read all of the nitty-gritty details about what he covered. Let us know what you think about iOS 9 and what it means for you in terms of accessibility. We’d love it if you would call our listener line. That number is 317-721-7124. Leave us a message there. You can leave those messages 24 hours a day, and if we really like your comment or question, you might just show up on the show.
Each week, one of our partners tells us what’s happening in the ever-changing world of apps, so here’s an App Worth Mentioning.
AMY BARRY: This is Amy Barry with BridgingApps, and this is an App Worth Mentioning. This segment’s app is called HeyYouADHD.
HeyYouADHD is a musical app designed to help children with daily routines such as getting dressed, bathing, and bedtime. We all know how frustrating that can be with children diagnosed with ADHD. The app presents the children with a song that incorporates directions for simple tasks. It provides the parents with instructions and rewards to use with the children. We found that this app is super useful for children with difficulties in establishing routines and remembering the steps of the routines. The words and beat of the music are entertaining and very engaging.
We reviewed this app using the Getting Dressed song. The song includes five verses: Getting Up, Taking Pajamas Off, Getting Dressed, Bathroom Chores, and Getting Going. There is a chorus that’s repeated between each verse. The child can learn the chorus and verses with repeated use. The child is also reminded of the steps and can complete the activity without a parent’s repeated verbal or visual instructions.
HeyYouADHD is available at the iTunes Store for free, and it’s compatible with iOS devices. The Getting Dressed song is included with the app, and other songs can be purchased for $1.99. For more information on this app and others like it, visit BridgingApps.org.
WADE WINGLER: Augmentative and alternative communication is a term that I’m going to guess almost everybody in my audience is at least a little bit familiar with. RERC, however, may be a new acronym for people. We are going to unpack both of those today and talk with David McNaughton, who is the co-leader of the training and dissemination team for the RERC on AAC. He also happens to be a professor of special education at the Penn State College of Education. I think, Dr. McNaughton, you are still on the line?
DAVID McNAUGHTON: Yes, I am.
WADE WINGLER: Good. I’m so glad that you decided to take some time out of your day today, because we are going to talk about the RERC on AAC. Before we kind of get into that, I would like to know a little bit about yourself and how you became interested in augmentative and alternative communication.
DAVID McNAUGHTON: Thank you. I’m very happy to be here with you today. I’ve been working in this field for what seems to be about 30 years. I first got involved working with both young children and adults with complex communication needs. By that we mean individuals who, for whatever reason, be it cerebral palsy or autism or Down syndrome, have difficulty using speech to communicate. In my early years, I was working with both young children and adults who were experiencing difficulties in this area, and I was involved in both developing augmentative communication systems for their use and also trying to work with their communication partners to develop ways that they could best support the communication of those individuals.
WADE WINGLER: Excellent. So then you ended up at Penn State in the school of education?
DAVID McNAUGHTON: Right. So over time I had the opportunity to develop my own skills in this area by going back to school for a period of time. I eventually transitioned into the position I have now, which is both conducting research to try to get a better sense of what we should be doing in the area of assistive technology for communication, and also having an opportunity to work with some very enthusiastic and engaged young people who are developing their clinical skills as well.
WADE WINGLER: I’m going to ask another question here in a minute about your kind of day-to-day activities, but I am pretty sure this is the first time we have had an RERC on our show. Can you explain sort of in broad terms what is an RERC?
DAVID McNAUGHTON: Sure. RERCs — and that stands for Rehabilitation Engineering Research Center — are centers that are funded by the federal government, the Department of Health and Human Services, to do research in a particular area of assistive technology. We are the RERC on Augmentative and Alternative Communication. There is also an RERC on assistive technology to support cognitive enhancement. There are RERCs dealing with wheelchair design and other types of assistive technology.
WADE WINGLER: I don’t have the number, but there are like a dozen. There are several, right?
DAVID McNAUGHTON: At least. As the federal government identifies new priority areas, they develop new RERCs for those areas.
WADE WINGLER: Talk to me a little bit about how AAC fits into the RERC’s goals and activities, and maybe even tell me what a day in the life is for you in your role as co-leader of training and dissemination.
DAVID McNAUGHTON: Sure. We like to think the reason that AAC is sort of on the radar for the federal government is because of the important role that communication plays in supporting the goals that I think all of us have for ourselves as individuals, which is supporting increased participation in society, supporting independent living, all those things we would hope for ourselves are the same things that the government in a sense is targeting for people with disabilities. It would seem that communication plays a central role in making sure that people have that ability to participate on a day-to-day basis.
Your second question was what does this mean in terms of how I approach day-to-day?
WADE WINGLER: What’s your day in the life like in your role there?
DAVID McNAUGHTON: Sure. I think, like everybody in the assistive technology field, I wear a couple of different hats. One of those is working on ways to share the information that the RERC on AAC is developing with a broader audience. We are quite active in developing webcasts and other web-based materials in an effort to communicate what we’ve learned about effective practices, share our research findings, and also we’ve spent quite a bit of time trying to, in a sense, share the stories of people who use augmentative and alternative communication themselves. Some of our most popular webcasts have been presentations by people who use AAC, reflecting upon their use of AAC, reflecting upon some of the things that AAC has enabled them to do in their lives.
WADE WINGLER: That’s great. Can you talk to me a little bit about kind of the frequency of the webinars and the materials that you are putting out there and how people might find those?
DAVID McNAUGHTON: So we produce about two to three major webcasts a year. Once those are done, they are permanently available for free on our website. We don’t do so many scheduled, live webinar kinds of things; rather, we produce a webcast which we then release and share information about. You can find those webcasts by going to our website, which is RERC-AAC.psu.edu, or if you just Google RERC on AAC, that will pull up our website as well. We have a bank now of about 20 webcasts that we’ve developed over time, and as I mentioned, we add webcasts to that collection each year.
WADE WINGLER: So it sounds like your webcasts are, in a way, a little bit like our podcast. We do them, and then they are out there and they sort of stay for ongoing consumption.
DAVID McNAUGHTON: Absolutely. Very much like the work that you do, we hope that people would treat these as a resource that can be built into educational programs. Some of our webcasts also have quizzes that are associated with them. If faculty wanted to use them within an undergraduate class or something and both have a discussion and a quiz activity, we found that that’s been a popular option for a number of our webcasts as well.
WADE WINGLER: So how long has the RERC on AAC been around, and who are the people that are making it happen?
DAVID McNAUGHTON: The most current iteration of the RERC on AAC began when we received word that we had been funded in October 2014. We are just coming up, in a sense, on the first-year anniversary of this particular RERC. This is an area that the federal government has supported in the past, so there have been a number of these RERCs on this topic before, and partly we’ve taken advantage of some of the work that came out of those groups in the materials that we share now. Truth in advertising: some of the webcasts that we are sharing with people now were actually part of those previous RERCs. We want to continue that work.
The individuals that are currently involved with our center: we have individuals from the Madonna Rehabilitation Hospital in Lincoln, Nebraska, David Beukelman and Susan Fager. We have faculty from Oregon Health and Science University, Melanie Fried-Oken. Tom Jacobs is a rehabilitation engineer who is the director of a company called InvoTek. He has assisted us with a lot of the commercial translation of our research work to the private field. That’s also another main driving point of the RERCs from the federal government’s point of view: one of the goals for an RERC is to support the translation of university-based research to the field — the consumers. Again, Tom Jacobs plays a critical role in that. And another individual, and she’s actually the principal investigator on the project, is Janice Light, who is also here at Penn State University.
WADE WINGLER: I guess I haven’t really thought about that, but one of the likely outcomes of your work is that people who are using AAC products are going to be using stuff that had its genesis in your research. Is that likely?
DAVID McNAUGHTON: Right, absolutely. From the federal government’s point of view, one of the major goals of the RERC is what they call tech transfer, so the transfer of the technology developed in a university setting into the marketplace. That’s something you hope you can always do even more of, but that’s something that we’ve been pleased with.
To date, one of the major developments that’s come out of our past work has been an approach called visual scene displays. Within devices that are being used for augmentative and alternative communication, one of the areas in which we’ve been especially interested is how we can design interfaces that are easier for the person with complex communication needs to learn. Rather than developing technologies where people have to spend an awful lot of time learning how they can access vocabulary, our goal is to develop interfaces that are much more intuitive, so that we can introduce those types of devices both for individuals with more severe cognitive challenges and also for very young children. You think about when young children first start to make use of speech; really, that’s the age period, and even before that, when we would like to be able to introduce augmentative communication to very young children, where we know that the chances are excellent that they’re going to experience difficulty with speech. When we meet a very young child with cerebral palsy or a very young child with Down syndrome, we have good reason to think that, all other things being equal, this person is probably going to experience difficulty with speech, so it’s in everybody’s best interest to introduce augmentative and alternative communication as soon as possible.
Frequently the AAC that is out there has really been developed more with adults in mind. One of the challenges that we have taken on is how do we develop augmentative communication technology that will be more intuitive for very young children. Again, one of the research efforts that came out of that was the use of something called visual scene displays. That’s actually something that we’ve seen excellent uptake from the AAC industry in terms of building this approach of visual scene displays into their augmentative communication technology.
WADE WINGLER: I’m sitting here in my studio with a big smile because I have never heard the term visual scene displays, but I know what you’re talking about, because I’ve seen them. I’ve seen them on the products that are out there in the wild. I kind of get that. That makes a lot of sense. You may have just answered my next question, but what do you see as some of the most promising approaches, tools, or technologies that are kind of emerging from your work. What’s on your radar?
DAVID McNAUGHTON: There are probably two major areas in which we are working. One is trying to find ways that we can enable people with more severe disabilities to access technology. So we have work going on in brain-computer interfaces. We have work going on in how we take advantage of multiple modalities in operating assistive technology. Traditionally, for individuals with more severe physical disabilities, we have a single switch location, or we try to make use of speech recognition, or we try to make use of eye gaze technology. But typically there is a single modality that we’ve been trying to use for people, and clearly that’s better than no modality at all, but sometimes there are challenges associated with using a single modality. We can get into repetitive motion disorders, where somebody is always hitting a particular switch in a particular way, all day long. That’s not really where we want to be. Also, that can be fairly limiting in terms of the number of things you can do with a single switch. It poses some inherent limitations. One of the areas we are investigating is whether there are ways that we can collect information from multiple sources and use it as a more powerful way of controlling assistive technology. So that is sort of one major strand: can we develop better ways to access assistive technology.
The other major strand goes back a little bit to that visual scene display research topic, but carries it forward. What can we do to make the interface that an individual uses to access the vocabulary within their device as intuitive and as powerful as possible? We are continuing that work on visual scene displays.
We are also starting to investigate whether the same approach that we’ve used with visual scene displays with photos might also be used with video. You think about how much video is now a part of people’s lives in terms of sharing YouTube clips, and just how people are continually sharing those brief little things that they’ve captured on video. We are trying to think of ways that we can map communication onto that activity and that experience. That’s one strand.
The other thing we are thinking there is using that as a way to support literacy development as well. Ultimately, where we would like to support people as much as we can is the movement from using pictures as a method of communication to literacy. Once you can spell, your world changes in terms of augmentative and alternative communication. We are trying to think of ways we can support that transitional process as much as we can.
WADE WINGLER: I sit here and continue to smile because I’m getting excited, and my wheels are turning. I have to tell you, I’m a little bit jealous of people who get to think big thoughts about the direction that assistive technology is going to take. This is really good. Sadly, we are about out of time for the interview. What I’m going to ask is, if people want to continue this conversation, if they want to learn more, if they want to see some of the results of what you’re doing there at the RERC, what would you recommend? How can people learn more?
DAVID McNAUGHTON: First and foremost, we have all of this information on our website, as I mentioned. RERC-AAC.psu.edu. We have an e-blast. We have a Twitter account. We have a Facebook account. If you visit that website, you’ll have an opportunity to sign up for the different information tools. For people that are interested in more specialized information, we’ve developed two additional websites. One is called AACkids.psu.edu. As you might imagine, that provides specialized information on the use of augmentative communication for very young children. We have another specialized website called AACliteracy.psu.edu. Again, that’s a website that focuses on some of the techniques we’ve been working on for supporting literacy for individuals who use augmentative and alternative communication.
WADE WINGLER: I will drop those links into the show notes so that people can get directly to them.
DAVID McNAUGHTON: Thank you.
WADE WINGLER: Dr. David McNaughton is the co-leader of the training and dissemination team for the RERC on AAC and has been our guest today. Dr. McNaughton, thank you so much for taking time out of your day to be on our show.
DAVID McNAUGHTON: Thank you very much. I enjoyed it.
WADE WINGLER: Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? Call our listener line at 317-721-7124, shoot us a note on Twitter @INDATAProject, or check us out on Facebook. Looking for a transcript or show notes from today’s show? Head on over to www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. Find more shows like this, plus much more, over at accessibilitychannel.com. That was your Assistive Technology Update. I’m Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana.