Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Visual Thinking, Autism and Artificial Intelligence with Dr. Maithilee Kunda, Vanderbilt University
If you have an AT question, leave us a voice mail at: 317-721-7124 or email firstname.lastname@example.org
Check out our web site: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
——-transcript follows ——
MAITHILEE KUNDA: Hi, I’m Maithilee Kunda. I’m an assistant professor of computer science at Vanderbilt University, and this is your Assistive Technology Update.
WADE WINGLER: Hi, this is Wade Wingler with the INDATA Project at Easter Seals crossroads in Indiana with your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Welcome to episode number 332 of Assistive Technology Update. It’s scheduled to be released on October 6, 2017.
Today I spend most of my time with Dr. Maithilee Kunda of Vanderbilt University. She has created a computer system that thinks like a person with autism. Our conversation is called Visual Thinking, Autism and Artificial Intelligence.
We have a story about accessible college campuses and also an app from our folks at BridgingApps called GasBuddy.
Check out our website at www.eastersealstech.com, give us a call on our listener line at 317-721-7124, or drop us a note on Twitter at @INDATAproject.
It’s that time of year in the US when the fall semester is starting at most colleges. I’m looking at a blog post from Syracuse University, from their student life column. It’s an interesting piece written by Joanna Orland. Her headline is, “Technology makes college accessible for students with disabilities but only if professors allow it.” It’s an interesting look into some challenges that students with disabilities might face simply because, at least in the stories illustrated by this article, professors don’t allow students to use their laptops in class because there are studies saying that taking notes with pen and paper is better for students than using laptops. They also talk about how some instructors are refusing to provide slides or notes to students with disabilities because they say it gives those students an unfair advantage over the students who don’t have disabilities.
Obviously, as someone who lives and breathes the world of disability and accommodation, and as someone who also teaches for a few universities as an adjunct, I find this challenging and disturbing. I’m going to pop a link in the show notes over to the Daily Orange, which is where I found this column. I want to ask you guys in the audience to get back with me and let me know what you are seeing in the classrooms. Are you finding universities doing a great job in terms of providing accommodations to students who use assistive technology? Or do you agree with Joanna Orland in this blog post that there is still a lot of work to be done? She talks about some attitudinal barriers and also talks about some of the accessibility issues, like universities using registration software that is inherently inaccessible just based on the software it is built on.
I’d like to hear from you guys. Perhaps we could do some interviews with people who have some feelings or thoughts or developments and successes in this area. Give us a call on our listener line. That number is 317-721-7124. Or send an email to tech@EasterSealscrossroads.org. Let me know what you see at universities in terms of students using assistive technology, or universities that are doing a great job of making things more accessible.
Each week, one of our partners tells us what’s happening in the ever-changing world of apps, so here’s an app worth mentioning.
AMY BARRY: This is Amy Barry with BridgingApps, and this is an app worth mentioning.
This week I am sharing an app called GasBuddy. The GasBuddy app is a crowdsourcing tool that helps you save money and time. Using the app, you will never pay full price for fuel again. Simple and easy to use, the app helps you locate gas stations near you or in a distant location. You can filter the search results by price, location, brand, and even amenities like carwashes, restaurants, and restrooms. App users review the gas stations and report gas prices in real time to help others.
BridgingApps reviewers have found the GasBuddy app to be very handy for daily use and also for long-distance travel. Prior to or during travel, you can put in a location to find gas stations and find the one with the lowest gas price, cleanest restroom, or whatever option it is you are seeking.
The GasBuddy app has been an invaluable resource in emergency situations. During hurricanes Harvey and Irma, BridgingApps users reported using the app to locate gas stations that had not run out of fuel.
To use the app, simply download it, and upon opening it, tap yes to turn on location services, then tap “find gas stations near you.” At the top of the screen, you can choose to show the available gas stations in a list or on a map. At the top of the screen, there are also filter options for smart sort, fuel type, and amenities. Tap a gas station on the list or map to find more details like ratings, prices, amenities, and even reviews by other users. At the bottom of the screen is a green button for directions. Tap that and you will be redirected to a map with directions to the gas station. BridgingApps highly recommends the GasBuddy app as a resource for all drivers.
GasBuddy is available for free at the iTunes Store and Google play store and is compatible with both iOS and Android devices. For more information on this app and others like it, visit BridgingApps.org.
WADE WINGLER: I may be one of the luckiest people in the world, or at least I think I am, because I get paid to watch the world and the Internet to see what cool stuff is happening when it comes to people with disabilities and technology.
As I do that, I see all kinds of interesting things. Not long ago, I saw some research coming out of Vanderbilt University. The keywords that grabbed my attention were “artificial intelligence” and “autism,” and also some things about geometric patterns and Temple Grandin, things I was immediately drawn to and found myself fascinated with.
We reached out to the folks at Vanderbilt and are so excited today to have Dr. Maithilee Kunda who is an assistant professor of computer science at Vanderbilt and is going to be our guest today. Dr. Kunda, welcome to the show.
MAITHILEE KUNDA: Thank you so much for having me. It’s great to be here.
WADE WINGLER: We are excited to have you. I have to say I’m fascinated with your work just from a little bit I’ve learned preinterview. I’m so excited to talk to you today and do a little bit of a deeper dive. I know the audience will love this.
MAITHILEE KUNDA: I’m also excited to tell you what we are doing.
WADE WINGLER: Before we talk about your work, I like to start interviews by learning a little bit about the person. Tell me a little bit about yourself and your academic and professional background. How did you get to the point where your research interests landed here?
MAITHILEE KUNDA: I was always interested in science and technology growing up. I’m a huge reader of science fiction. I was always fascinated with this idea of other intelligences: robots, aliens, all kinds of things that might exist in this universe. I started to see myself as an artificial intelligence researcher when I was an undergraduate student. I had a professor who was teaching us about AI and what’s possible. I will never forget, he used to say that with AI, we can work on unlocking the mysteries of the human mind. That to me was such a compelling scientific question that I ended up going to grad school in computer science to do AI research. That’s how I got interested in using computers to study people.
WADE WINGLER: I totally relate as somebody who has been doing technology since I was a child and has a slightly unusual infatuation with the world of Star Trek. I totally get that.
Tell me a little bit about the concept for your research, the idea that somebody with autism might think differently than you and I and there might be a technological application to that. Take me to school on your concept.
MAITHILEE KUNDA: I’ll tell you how this all began. When I was in graduate school, I read a book called “Thinking in Pictures” by Temple Grandin, which I’m sure many of your listeners might be familiar with. If not, Temple Grandin is a professor of animal science at Colorado State and also happens to be on the autism spectrum. She talks about how, because she is a visual thinker, because she thinks in pictures, she has benefits and strengths in a lot of her work as a designer of complicated equipment for the livestock industry and is able to do some amazing things. She also talks about how this visual thinking style caused her problems in other areas. She would often have trouble understanding abstract concepts when people were talking to her.
I read this book, and as I mentioned, I was just getting started in grad school. It struck me that this was a fascinating way to think about reasoning and intelligence, this idea that as people we can use mental images, pictures in our head, to actually think through complicated concepts. Of course, we don’t see this just in autism. There are all kinds of stories from people like Albert Einstein and Richard Feynman about how they used visual thinking to make physics discoveries. Computer programmers will talk about how, when they design a piece of software, they are actually seeing these things as animated machines in their heads.
I got interested in asking the question of what that type of thinking looks like from a computational perspective. Could we actually build AI systems that would think in that kind of visual way? It turns out that in AI, this idea is not new. People have been talking about it since the 70s. But it never gained much traction, so most of the AI systems in use today don’t think with pictures in that way. Most AI systems that are designed and built have internal abstract symbols. It’s analogous to thinking with language in people. It turns out that this was an uncharted frontier in AI, so that’s how I started on this research.
WADE WINGLER: That makes sense, because my experience with computer systems and technology is that they don’t think in pictures, and geometric patterns in particular. Are AI systems particularly strong at doing this, or are they going to be strong at doing this, or is there a gap?
MAITHILEE KUNDA: I think there is currently a gap, certainly a gap between what is currently possible and what may be possible in the future in terms of these visual thinking AI systems. That’s one part of my research: to uncover what these reasoning mechanisms that people use are and how they work. How do we learn to think visually, as children or as adults, as we go about our lives? For AI systems, lots of the aspects of human intelligence are currently sort of out of reach. We are starting to have AI systems that are creative and can make new designs and discoveries on their own, but there is still a gap between what they can do and what people can do. I think this is one of many AI techniques under development that may advance the capabilities of these systems.
WADE WINGLER: Fascinating. The Star Trek bells in my head are going off again for the record. That’s awesome. Tell me, does your current research have a name? At a very basic level, what does it do?
MAITHILEE KUNDA: I guess a name I could use to describe this research could be computational mental imagery. The lab I run at Vanderbilt is called the Laboratory for AI and Visual Analogical Systems, which is a fancy way of saying the same thing. We build AI systems that think visually. We build AI systems where the internal data structures are images, and the system has the ability to manipulate those images: to rotate them, to combine one image with another. There are some famous examples of what people can do with mental imagery. For example, I could ask you to imagine the letter “D”, an uppercase “D”, and rotate it counterclockwise 90 degrees. Now take an uppercase letter “V” and put it under what you just imagined. What do you get?
WADE WINGLER: An ice cream cone?
MAITHILEE KUNDA: Yeah. Some people also say a parachute. That’s an interesting example of how, as people, we can take these internal mental images and manipulate them. That is a simple problem to reason through. We build AI systems that do those same kinds of things. Our AI systems have images; they can rotate them and combine them. I guess I already mentioned that we are really hoping to push capabilities in AI using these systems. Another important piece of our work is looking at these systems as a way to understand what people are doing. For someone like Temple Grandin, who feels that she thinks in images, if we have an AI system that thinks in images, we can start to study processes of learning and problem solving and figure out how we can understand how people think in different ways, and how we can support different types of thinking and learning in people.
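The image operations Dr. Kunda describes here, rotating a mental image and superimposing one image on another, can be sketched in a few lines of code. This is an illustrative toy, not her lab’s actual system; the 5x5 shapes below are made-up stand-ins for the rotated “D” (an arc) and the “V” in her example.

```python
# A "mental image" here is a grid of 0/1 pixels (1 = ink). The two shapes
# below are invented stand-ins for the rotated "D" and the "V".

ARC = [[0, 1, 1, 1, 0],
       [1, 0, 0, 0, 1],
       [1, 0, 0, 0, 1],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]]

VEE = [[0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [1, 0, 0, 0, 1],
       [0, 0, 1, 0, 0]]

def rotate_ccw(img):
    """Rotate an image 90 degrees counterclockwise."""
    # Transpose via zip, then reverse the row order.
    return [list(col) for col in zip(*img)][::-1]

def superimpose(a, b):
    """Combine two images: a pixel is inked if it is inked in either one."""
    return [[p | q for p, q in zip(ra, rb)] for ra, rb in zip(a, b)]

def subtract(a, b):
    """Erase b's ink from a and see what is left."""
    return [[1 if p and not q else 0 for p, q in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# "Put the V under what you just imagined": the composite (an ice cream
# cone, more or less) contains the ink of both parts, and erasing the V
# recovers the arc.
cone = superimpose(ARC, VEE)
assert subtract(cone, VEE) == ARC

# Four quarter turns bring any image back to where it started.
img = ARC
for _ in range(4):
    img = rotate_ccw(img)
assert img == ARC
```

The point of the sketch is that every intermediate result is itself an image, never a symbolic description of one, which is the constraint the interview keeps coming back to.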
WADE WINGLER: That’s fascinating. For my audience and the record, I am definitely more of an ice cream person than a parachute person. That being said, in some of the research and materials I read, it talked about the Raven’s progressive matrices test and those geometric patterns. Can you tell me how that figures into your work?
MAITHILEE KUNDA: We became interested in the Raven’s progressive matrices test because of a very interesting research paper that came out in 2007, authored by a research group in Montréal, Canada. The paper was led by a researcher named Michelle Dawson, who is herself on the autism spectrum. This paper was about intelligence testing for individuals on the autism spectrum. Usually, if you want to measure someone’s intelligence, you can bring them into a clinic and sit them down and give them an IQ test. A lot of times these IQ tests have lots of different subtests. If any of your listeners have taken an IQ test, they would know there are tests of memory and concept learning and visual-spatial skills. There are lots of things. You take the scores and average them together, and that’s an estimate of someone’s intelligence.
The Raven’s progressive matrices test is a different test of intelligence that was developed in the 1920s and 30s by a guy named John Raven, which is where the name comes from. This test only has problems of one type. It’s the kind of problem called a geometric analogy or matrix reasoning problem, where you will see, say, one circle going to two circles, then one square, and there will be a blank spot. You have to pick what goes in that blank spot. You have to understand the geometric pattern and decide how best to complete it.
This test is interesting because it turns out this is one of the best measures of intelligence we have in people other than sitting them down and giving them a four-hour IQ test. It turns out if you give people these types of problems, it is a very good proxy for measuring someone’s intelligence. Sorry, I’m being long-winded here.
WADE WINGLER: That’s fine.
MAITHILEE KUNDA: This paper by Michelle Dawson was looking at how people on the autism spectrum perform on the Raven’s test. In this particular study, they studied both children and adults. What they found was actually quite surprising. What they found was that for many people on the autism spectrum, not all but many, the scores that individuals got on the Raven’s test would actually end up being much higher than the scores they would get on the traditional IQ tests. This was a startling finding because, prior to this, we were treating these tests as equivalent. If you want to measure someone’s intelligence, you can give them one or the other and it will give you the same measurement. This was a groundbreaking finding to say that, in fact, with people with autism, we might be underestimating their intelligence if we are giving them traditional IQ tests.
I found out about this work and had just read “Thinking in Pictures” around this time. It was one of those confluences of events that happen sometimes. Immediately I thought, I wonder if the reason some of these individuals might be so good at the Raven’s test is because they are visual thinkers. If you have somebody who is a strongly visual thinker like Temple Grandin, maybe that gives them a better ability to deal with these shapes and patterns mentally.
To test that hypothesis, we ended up trying to build an AI system that would solve Raven’s test problems purely visually. It turns out there had actually been several AI systems in the past that were built to solve Raven’s test problems. As AI researchers, we always love when we find hard, interesting problems for people to work on. It was no secret that this test is a very interesting intelligence test; lots of people had worked on it and built AI systems. But all these previous systems used symbolic representations to reason. You would give the system an image, and then it would pull out all of these abstract symbols: the number two, squares, the texture; it would be a list of features. We tried to build our AI system differently, so that it would actually take a scanned image of each test problem, and we restricted its internal reasoning to only use those images. It could do the kinds of things we gave examples of earlier: rotate the images, combine them, lay one on top of the other and subtract pixels and see what was left. We built the system and tested it against an actual version of the Raven’s test; there is a version called the standard progressive matrices which is intended for children and adults. We were quite surprised to find that our system, which was using only visual images, ended up performing on this test at the level of a typical 16- or 17-year-old person.
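As a rough illustration of what reasoning only with pixels can mean, here is a minimal sketch of a visual analogy solver. It is not the lab’s actual system, which uses a much richer set of image transformations; this sketch uses just one operation, a pixel-wise difference, transfers the A-to-B change onto C, and picks the answer choice whose pixels best match the prediction. The tiny 3x3 images below are invented for the example.

```python
# Images are grids of 0/1 pixels. Reasoning stays entirely at the pixel
# level: no shapes, counts, or other symbols are ever extracted.

def xor(a, b):
    """Pixel-wise difference of two images."""
    return [[p ^ q for p, q in zip(ra, rb)] for ra, rb in zip(a, b)]

def similarity(a, b):
    """Fraction of pixels on which two images agree."""
    total = sum(len(row) for row in a)
    agree = sum(p == q for ra, rb in zip(a, b) for p, q in zip(ra, rb))
    return agree / total

def solve_analogy(a, b, c, choices):
    """A is to B as C is to ?  Transfer the A->B pixel change onto C,
    then return the index of the best-matching answer choice."""
    predicted = xor(c, xor(a, b))
    scores = [similarity(predicted, choice) for choice in choices]
    return scores.index(max(scores))

# Toy problem: the A->B change erases the center pixel.
A = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
B = [[1, 1, 1], [1, 0, 1], [1, 1, 1]]
C = [[0, 1, 0], [1, 1, 1], [0, 1, 0]]
choices = [
    [[0, 1, 0], [1, 1, 1], [0, 1, 0]],  # C unchanged
    [[0, 1, 0], [1, 0, 1], [0, 1, 0]],  # C with center erased (correct)
    [[1, 1, 1], [1, 1, 1], [1, 1, 1]],  # fully filled
]
assert solve_analogy(A, B, C, choices) == 1
```

Scoring every answer choice against a pixel-level prediction, rather than checking a symbolic rule, is the essential move; the real system applies the same idea with rotations, reflections, and compositions as well.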
WADE WINGLER: That’s fascinating. There is a whole lot to unpack. That makes sense. Tell me a little bit about the current state of your research. How far along are you: beginning, middle, or end? Or do you not know yet at this point?
MAITHILEE KUNDA: I’ll tell you, we are working on a few different threads right now. As I mentioned, a core part of our research is using these AI systems that we are building as a way to understand human cognition, especially the different ways people might be thinking. Following this early work on the Raven’s progressive matrices test, we have a few different projects. We are looking at different tests of intelligence that are widely used in clinics or schools whenever you are trying to assess somebody’s cognitive abilities. One of our big realizations was that when you give somebody a test problem, a lot of times there are different ways to solve it. You can talk about whether somebody is right or wrong, but beyond whether or not they can solve the problem, a lot of times there are different routes somebody might take to even get to the same answer.
One group of projects we are working on is taking a bunch of these different tests and trying to come up with ways to measure not just right and wrong, which is easy to measure with the current tests, but the strategy differences that people show. Can we measure when two people arrive at the same answer but are doing it in different ways? We do that basically by building these AI systems, studying how different variations of the AI systems might arrive at the same answer in different ways, and using the systems to find behavioral clues from people that we can use to answer the same question.
WADE WINGLER: If you look into the future a little bit, what might some practical applications of this technology look like after it is a smashing success in 5 to 10 years?
MAITHILEE KUNDA: I hope that eventually this helps when somebody goes to a clinic to have some cognitive testing done. This might be a child who is going in to get a diagnosis for a learning disability, or an adult who is going in to have a cognitive test after having a stroke, or any time you want to understand something about how a person thinks.
I think I would love to see this system in place in a clinic, so that when you walk in, there are a few sensors around the room that are recording your behavior while you are taking the test, and there is an AI system running in the background that is interpreting what you are doing in terms of how you might be doing it. Then, in addition to getting a score on the test, you also get a description generated by the system that says, “This person appears to have a really high memory capacity, and they’re using this particular type of problem solving strategy, and we think this problem is related to this type of neural mechanism.” I really see this technology as something that in the next 5 to 10 years can be deployed in clinical studies to help clinicians get these strategy measurements in a reliable and efficient way.
WADE WINGLER: What kind of feedback are you getting, either from people with autism or experts in the field, about your approach so far?
MAITHILEE KUNDA: I think most of the people I’ve talked to are pretty excited about it. When you look at cognitive testing as a field, it has been around for decades. We have neuropsychologists who do this for a living. They bring a lot of their own expertise to bear when they are assessing a person. But I think one of the things that gets people excited is that giving a cognitive test to a person is so time-consuming for experts. It is something where experts wish they could devote four or five hours to each patient to be able to evaluate them and think through what’s happening, but a lot of times they just don’t have time to give that much individualized attention. I’m hoping that this kind of technology can start to fill that need. I’ve heard some exciting feedback from neuropsychologists about the idea.
WADE WINGLER: We are about out of time for the interview today. If listeners want to follow your journey or learn more about your research, how would you suggest they go about doing that?
MAITHILEE KUNDA: Here at Vanderbilt, we have just recently launched a new Center for Autism and Innovation that is bringing together not just myself but a lot of different researchers across campus who are using technology in different ways to look at questions relating to autism, helping people find cognitive strengths. The center is focused a lot on employment, how can we assess people’s cognitive abilities and find a good match in the workforce. The website for the center would be a good place to start. The website is my.vanderbilt.edu/AutismAndInnovation.
WADE WINGLER: Excellent. I’ll be sure to drop that link in the show notes so that listeners can just go directly to that. Dr. Maithilee Kunda is an assistant professor of computer science and computer engineering at Vanderbilt University and has been our guest today talking about computational mental imagery. Thank you very much for being on our show today.
MAITHILEE KUNDA: Thank you very much for having me.
WADE WINGLER: Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? Call our listener line at 317-721-7124, shoot us a note on Twitter @INDATAProject, or check us out on Facebook. Looking for a transcript or show notes from today’s show? Head on over to www.EasterSealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. Find more shows like this plus much more over at AccessibilityChannel.com. That was your Assistive Technology Update. I’m Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana.
***Transcript provided by TJ Cortopassi. For requests and inquiries, contact email@example.com***