
Hi, this is Liam, and I’m the CEO of TouchPulse, and this is your Assistive Technology Update.
Josh Anderson:
Hello and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 756 of Assistive Technology Update. It is scheduled to be released on November 21st, 2025. On today’s show, we are joined by Liam from TouchPulse, and he’s here to tell us all about their new tool for navigation. Thank you so much for giving us a listen today, and let’s go ahead and get on with the show.
Do you have questions about assistive technology? Well, don’t forget to check out our sister show, Assistive Technology Frequently Asked Questions, or ATFAQ, available wherever you get your podcasts. You have assistive technology questions? We try to have the answers. Check out ATFAQ wherever you get your podcasts.
Listeners, navigating the world is a barrier for a lot of individuals with disabilities. From pedestrian areas to mass transit, these spaces can be foreboding and just inaccessible for some individuals. Well, today we’re joined by Liam of TouchPulse, and he’s here to tell us about how their new tool will assist individuals with blindness and low vision with navigating the world around them. And we are very excited to learn more. Liam, welcome to the show.
Liam Geschwindt:
Thank you so much, Josh, for having me. I’m really, really excited to be talking to you today and to share what it is that we’re building. I think the listeners tuning in right now will also be excited about what we have to present today.
Josh Anderson:
Yeah, I’m excited. I know they definitely will be too. But before we get into talking about TouchPulse, could you tell our listeners a little bit about yourself?
Liam Geschwindt:
Yes, of course. So my name is Liam, as you heard in the beginning. I’m originally from South Africa, based here in the Netherlands. We have a company in a city called Eindhoven, a little bit south of Amsterdam. And yeah, we’re technologists, AI specialists. We’re also really into mobility, trying to see what kind of solution there can be by leveraging what I would say is the technology of this century, and seeing what we can do with it to help people navigate with greater ease and independence.
Josh Anderson:
Awesome. Awesome. And that just kind of leads us right straight into it. So on to TouchPulse, what is it?
Liam Geschwindt:
That’s a great question. It’s a loaded question.
Josh Anderson:
I know it is.
Liam Geschwindt:
That’s, of course, what we’re here today to talk about. Yeah. In a nutshell, TouchPulse is a company on a very simple mission, and it’s to make urban environments equally accessible for everybody. And that’s about equitable access, meaning offering solutions and tools that make it possible for people to move independently regardless of visual impairment. We are a company that’s two years old. We come out of the Eindhoven University of Technology. We have AI engineers, mobility specialists, and we’ve come together basically to build what is the first AI-powered navigation application, and that’s what we’re here to talk about.
Josh Anderson:
Awesome. And that, again, just leads us right straight into the product, the thing that you’re working on. And that is, am I pronouncing it right, is it Navis?
Liam Geschwindt:
Exactly, exactly. So the idea there is navigation with low vision. And I thought Navis was a nice way to put it all together, because some product names can be a little bit complicated. And I thought, okay, let’s just simplify it. And it is Navis: navigation with low vision.
Josh Anderson:
Awesome. And I guess let’s start at the very beginning, Liam. Where did the idea come from?
Liam Geschwindt:
So whenever I talk about what we’re building, I always have to think about the problem first. And that’s really where we started. Actually, in all honesty, that’s not where we started. We started as a project out of the university here, and we were building this sleeve that you could wear on your arm. The whole idea was, okay, what information could we convey through vibration? We thought, okay, for people with a visual impairment, maybe they could feel their environment rather than having it spoken to them, or, where vision is absent, not being able to see it at all; maybe we could replace that sense with the skin.
But what’s typical of these kinds of deep-tech university projects is that you build a solution for a problem that potentially is not real. So we started from scratch as a company, and we asked ourselves, okay, what are the issues that people actually experience in navigation? We did a research study and interviewed a lot of people here in the Netherlands, and we just tried to ask, okay, what are the issues? Is it public transportation? Public transit was a big issue. GPS directions were a massive one: inaccurate, unspecific. Google Maps, sure, it’s a great application if you have vision, but a lot of the assumptions there just do not meet the needs of this community.
So we knew there was a big problem. And it was good timing because, of course, 2023 was really the AI boom, the AI kickoff, and we were really at the forefront of it. We looked at each other and said, look, there are these tools like BlindSquare, there are tools like Lazarillo and similar options, but none of them included AI. What could we do if we were able to take information about where the user is? If we have their GPS location, okay, sure, you can tell them to turn left in 25 meters. But what if we could also tell them that, along that route, there’s a bench over there, or that there’s some sort of obstacle along the route that we know about because of a combination of open source data on OpenStreetMap and satellite imagery?
What could we do to almost create it as if you had somebody looking down on you from above? The technology we developed, we call it the Navis HALO, and it’s sort of there in a 360-degree view, looking down and trying to get the best understanding it can of the things that are around you, without you having to take out your phone and do a video recording or a scan using a Seeing AI kind of service, or some other sort of AI vision. What could we do that was low cost, high tech, and ultimately super accessible? That was the concept from the beginning, and we’ve just been spending our time building that out for the last year or so now.
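As a rough illustration of the open-data side of that idea (not TouchPulse’s actual pipeline), the sketch below pulls benches and pedestrian crossings around a GPS fix from OpenStreetMap via the public Overpass API; the coordinates, search radius, and feature tags are only example choices.

```python
# Sketch only: Navis' pipeline isn't public. This shows how nearby
# OpenStreetMap features could be fetched around a GPS fix using the
# public Overpass API. Point, radius, and tags are invented examples.
import math
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

def nearby_features(lat, lon, radius_m=40):
    """Return benches and pedestrian crossings within radius_m of a point."""
    query = f"""
    [out:json][timeout:25];
    (
      node["amenity"="bench"](around:{radius_m},{lat},{lon});
      node["highway"="crossing"](around:{radius_m},{lat},{lon});
    );
    out body;
    """
    resp = requests.post(OVERPASS_URL, data={"data": query})
    resp.raise_for_status()
    return resp.json()["elements"]

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between two WGS84 points."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

if __name__ == "__main__":
    user_lat, user_lon = 52.0705, 4.3007  # example point in The Hague
    for el in nearby_features(user_lat, user_lon):
        kind = el["tags"].get("amenity") or el["tags"].get("highway")
        d = distance_m(user_lat, user_lon, el["lat"], el["lon"])
        print(f"{kind} about {d:.0f} m away")
```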
Josh Anderson:
That’s awesome. And like you said, I don’t have to have my phone out, I don’t have to be kind of going around. So I guess you brought up a little bit of this, but what kind of navigation information is available using that kind of Navis HALO, as you called it?
Liam Geschwindt:
Yeah. So it’s sort of a combination of open source data, of course. In OpenStreetMap, you can find a lot of information about all sorts of features, whether it be benches or even the texture of surface types. The amount of data that’s available, especially in developed countries, the United States, Canada, most of Western Europe, is plentiful. It’s just about how we can communicate that information to people. So that was really the original concept. That was the simplistic start, and then it got pushed further, because there’s one real big issue here, and that’s GPS inaccuracy: it doesn’t matter how much information I can find out about where you are if I can’t really find out where you are. I mean, it’s-
Josh Anderson:
That’s the important part. [inaudible 00:08:38] starting point for sure.
Liam Geschwindt:
Absolutely. Absolutely. I mean, we did some tests here with the City of The Hague, and we found that in between skyscrapers, you have these massive urban canyons, and GPS inaccuracy can be up to 25 meters. Imagine telling somebody to turn left when it’s not actually the turn but maybe, in this case, a canal, for example. That poses a massive danger.
So we knew that we had to find some other solution to the GPS problem, and we did. It’s called visual positioning. The way you can imagine it is that a large amount of the world has now been scanned. And one of the funnier sources is, I don’t know if you remember, there was this craze a few years ago: Pokémon Go.
Josh Anderson:
Yeah. Yeah.
Liam Geschwindt:
So what that led to was the visual elements of the world being attached to GPS coordinates. This created an entire database where, using the camera of your device, you can get a location. This technology is not super novel, but it’s never been applied in the way that we’re planning to apply it. It’s never been used to find the location of someone with a visual impairment in such a way that we can get centimeter accuracy and give them very specific directions.
So great technology, but the question is, okay, what are we going to do with that? And one of the main issues that people talked about all the time was, okay, yeah, I used Google Maps, I followed the instructions, I’ve reached my destination, but now where’s the door? Is this even the right door number? At least here, a lot of the houses are squashed together. Of course in the US, I know you have massive suburbs, and it’s a little bit more spacious, I guess, in some areas, but here it’s all really packed together. So a lot of people would just end up maybe two, three, four houses away from their real destination.
So with this technology, now, when you know that you’ve arrived, you simply take out your phone, scan for three, four, five seconds, and get your location. And then from there, what our technology is going to do is use the vibrations of the device so you can find your orientation and get the correct heading to move in an accurate direction. Because that sort of vibration-based orientation doesn’t work if you don’t have the right location for the user. We tried it and it was awful. So we said, okay, we had to do something better here.
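A minimal sketch of that vibration-orientation idea, assuming a precise position fix and a compass heading are already available; the bearing math is standard great-circle geometry, but the tolerance and cue wording are invented for illustration.

```python
# Sketch of the orientation step, not TouchPulse's code. Given the user's
# position, the next waypoint, and the phone's compass heading, compute the
# bearing to the waypoint and map the heading error to a coarse haptic cue.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in 0-360 degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def haptic_cue(user_heading_deg, target_bearing_deg, tolerance_deg=15):
    """Map the signed heading error to a coarse vibration cue (placeholder names)."""
    error = (target_bearing_deg - user_heading_deg + 180) % 360 - 180
    if abs(error) <= tolerance_deg:
        return "steady pulse: facing the right way"
    return "short pulses right: turn right" if error > 0 else "short pulses left: turn left"

# Example: waypoint roughly to the northeast while the user faces southeast.
b = bearing_deg(52.0700, 4.3000, 52.0702, 4.3003)
print(haptic_cue(user_heading_deg=120, target_bearing_deg=b))
```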
So you can almost imagine Navis and what we’re building as kind of a patchwork of novel technologies with the ultimate goal of really just being the first kind of super-powered navigation service. And I say service because we’re not building a product, we’re building a support mechanism. I can ramble on about it forever, but it’s of course something that our team and I are super passionate about, this idea that, by 2027, we want to provide 24/7 global support. And that’s through AI, but it’s also through people.
And this is something I haven’t touched on yet, which is that AI is great, massively incredible, world-changing, paradigm-shifting technology, but at this moment we don’t believe the current state-of-the-art tech is going to be enough. And we always think that having a human fallback as support is essential. So that’s what we plan to do: whenever there’s a real issue, Navis can basically prompt a call to one of our operators, a team member, and then you’ll always have support no matter what, whether it’s AI-powered or human-powered.
Josh Anderson:
Nice. And I like that you have that extra piece. And yeah, just talking about finding a need and addressing it. I mean, for years as I’ve done this, anything in navigation, it’s always that last 10 feet, last three meters, that last part. Like you said, yeah, I’m at my location, but I’m three doors down. I’m facing the wrong way. I’m at the wrong building. It’s always that last bit that’s been a challenge. So I love that you brought up that that’s really how it addresses it. But then also, kind of in that same vein, AI can get you so far, but that last little bit might need that human assistance. So I love that that’s offered as well.
Liam Geschwindt:
Yeah, yeah, absolutely. I mean, it’s essential because navigation, yes, it’s A to B, I’m going to walk along the streets and take certain directions, but it’s also navigating inside. In a blog post as well, I had a conversation with a guy and I was like, “You’re also navigating to find the ketchup in the fridge.” If you’ve misplaced something in the house, that’s also a form of navigation, whether you need AI for that because you don’t have a partner with you at that moment, or whether you need a human fallback support as well. Our plan is to provide a really all-inclusive toolkit, and you can just take out the elements that you need when you need them.
Josh Anderson:
Yeah. No, that’s absolutely great. So I know I can get tons of information from that. Can I control the amount of information shared with me, or the important information I guess to me on my trip?
Liam Geschwindt:
So you mean in terms of the frequency of instructions?
Josh Anderson:
Yeah. The frequency of instructions, or we talked about benches, or maybe what’s along your trip. Can I kind of control the information that’s important to me, especially for whatever trip or journey I might be getting assistance with?
Liam Geschwindt:
Yeah. So alongside that kind of problem set I mentioned earlier, one of the things we noticed was that there were no real solutions that were super adaptive. I have a strong belief that every visual impairment is different and every person is different, of course. And I think that visual impairments do not define people, and everyone has their own individual wishes about the way they want to interact with technology.
So we knew from the get-go that it had to be 100% configurable, 100% customizable, and that’s what we’re enabling. So you can basically filter the types of objects that you want to know about. We actually did a test this morning. I was just looking at the server output while testing a particular route here, and it was even telling me about security cameras and their positions, and even railway signals, a lot of information.
And I was like, okay, either we just leave all of this in here and allow people in the testing to say what they want and don’t want, or we build a massive settings page where you can just toggle different categories of objects on and off, because I don’t think all of it is relevant. So to answer your question, fundamentally, one of the core propositions we provide is the ability to customize to your needs. And we set up a really cool feedback system where, if you send an email to feedback@touchpulse, we get an immediate request on our development board, where we can start to build out features based on direct requests from users, which is really cool.
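As a small illustration of that toggle idea, announcements could be filtered against per-user preferences before anything is spoken; the category names and defaults here are invented, not Navis’ actual settings.

```python
# Hypothetical per-user category toggles; not the real Navis settings schema.
DEFAULT_PREFS = {
    "bench": True,
    "crossing": True,
    "security_camera": False,  # detected in testing, rarely useful to announce
    "railway_signal": False,
}

def filter_announcements(detections, prefs=DEFAULT_PREFS):
    """Keep only detections whose category the user has switched on."""
    return [d for d in detections if prefs.get(d["category"], False)]

route_output = [
    {"category": "bench", "distance_m": 12},
    {"category": "security_camera", "distance_m": 8},
]
print(filter_announcements(route_output))  # only the bench is announced
```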
Josh Anderson:
No, that’s really cool. And I mean that’s how you’re going to make it usable [inaudible 00:16:18] actually gets used, the folks that are actually going to benefit from it are giving you the input and everything. So that’s really awesome. So I know that I can definitely see how this is helpful as I’m kind of traveling and everything else. Could I use Navis to plan out trips? Maybe I have to go somewhere tomorrow, it’s somewhere I’ve never been. Could I use it to get information that kind of way as well?
Liam Geschwindt:
So it’s a developing technology, but at least in the phase we have it in now, when the beta launches, you’ll be able to just enter a destination, whether you put that in via text or through interaction with the voice assistant. You basically just put in your destination, you get a preview of the route, and then you can save that destination and click on it whenever you want to.
So we won’t have the ability to make custom routes yet, unfortunately; we’ve had this request as well, because people want to do either breadcrumbing or set beacons. This is on our roadmap. But for the beta, we really just want to understand the navigation, the AI-powered navigation steps, so the route planning section we’ll leave for later. For now, it’ll really be saved places and destinations.
Josh Anderson:
Nice. No, that’s awesome. I know you said I can actually kind of talk to it to interact. I guess what sort of questions can I ask or information can I get through that, I don’t even know the right words, conversational interface, or whatever that might be called. What kind of info can I get or back and forth can I have with the AI?
Liam Geschwindt:
So the conversation that we are having right now is the kind of conversation that we want to enable with our technology as well, so conversational interface is a perfect way to describe it. It’s actually quite funny, because in the beginning it was really, really intelligent, sort of too much so. I remember, in early, early testing, I was on a route in the City of The Hague before our testing day there, and I asked, “What’s around me?”, which is one question you can ask Navis. And it started to go into a massive amount of detail about the buildings. It’s a very historical area, and you could even ask it about the history of the area.
And it kind of sparked a side idea for the future, a kind of tourism mode, where you could just walk around and ask, what’s around me? Explain the history of the city and streets to me. I think that’s really cool. Not beta-worthy unfortunately, but definitely something for the idea board; it would be really, really cool.
So what you can imagine now in terms of capability, basically for the beta version of Navis, is that you’ll have the opportunity to just say, hey, I would like to go to a Target, or I would like to go home, where home is a saved address, et cetera. You can basically search any location like you would with Google Maps, but totally via voice. You can also ask, where am I? So you’ll get your street, the nearest street number to where you are, the city, et cetera, although I assume you would probably know that. And you can ask, what’s around me? The what’s-around-me feature is really more in the BlindSquare style: okay, there’s a restaurant here, there’s this and that, whatever you prefer in terms of the type of category you’re looking for.
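A rough sketch of how those three beta queries might be routed once speech has been transcribed to text; the keyword matching and canned responses are placeholders, not the real voice pipeline.

```python
# Placeholder intent routing for the three beta queries mentioned above.
def handle_query(text: str) -> str:
    q = text.lower().strip()
    if "where am i" in q:
        return describe_position()      # street, nearest number, city
    if "around me" in q:
        return describe_surroundings()  # BlindSquare-style points of interest
    return plan_route_to(q)             # treat anything else as a destination

def describe_position() -> str:
    return "You are on Example Street, near number 12, in The Hague."

def describe_surroundings() -> str:
    return "There is a restaurant about 20 meters ahead and a tram stop to your left."

def plan_route_to(destination: str) -> str:
    return f"Previewing a route to {destination}."

print(handle_query("Where am I?"))
print(handle_query("I would like to go to the supermarket"))
```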
Josh Anderson:
Awesome, awesome. I know you said you’re currently in beta. Can you share maybe some of the feedback that you’ve gotten, or maybe any surprises as folks are kind of out there using it, or some of the information you’re getting from them?
Liam Geschwindt:
Yeah, absolutely. So we did this pilot project with the City of The Hague, as I mentioned, and that was really, really insightful, because our mission is of course independence and really empowering navigation. And we always had this idea that, okay, we’re going to develop Navis and we’re going to make navigation safer; the information we provide is going to bring greater safety. But what that actually meant in reality, we weren’t too sure of at the time.
And it was recently, also through that testing, that we realized that what Navis actually provides is something that one user coined in an interview as cognitive offload. It’s this idea that when I’m navigating, I have to think about so many things at the same time, and what Navis can do is take care of just one of those things for you, provide you some reassurance there, and enable you to really focus on the tools that you already have.
And I thought this was a real turning point for us, in the sense that it wasn’t about intruding or pushing information at people; it was actually just about providing really actionable info that’s accurate, reliable, and on time, in a manner that’s kind of bite-sized, so that you can really focus on what you’re currently using, whether it’s both a guide dog and a cane, or just the cane, for example. So that’s kind of been our philosophy ever since: to create a tool which just minimizes cognitive load. Any stress and strain, we try to reduce by really providing sort of an AI assistant to take you along your route.
Josh Anderson:
So Liam, what’s next?
Liam Geschwindt:
Lots of things. We just secured some funding from the Dutch government and are about to secure an investment round as well. So we’ve got a long roadmap, which is really great, because I am personally super in love with the project, and my team is as well. I don’t think there’s anything else I would really want to do with my time, and all of our time is currently spent building out this technology.
And yeah, I’ve talked about a lot of things that we’re building today, but where we’re focused at least for the next few months, until the end of the year, is building out the technology with around 100 core testers: introducing new features, new user interfaces, really trying to get as much info back as possible from more experienced testers. And then at the beginning of next year, in January, we are opening up the beta for a certain period, probably around four weeks. This is totally free, completely unlimited. We just offer the opportunity for people in the United States, Canada, and the UK to use the app for free.
And honestly, it’s just a great opportunity for anybody who wants to have a say, get involved, and give their input to build out the first AI navigation solution. For anybody listening, if any of the problems we’ve talked about or the ideas we’ve mentioned ring a bell for you or strike a chord, then we invite you to really get involved. And like I say, there are no strings attached. We just appreciate anyone who’s curious and wants to help us out.
So the plan is, yeah, open it up, get as many people involved as possible. You can actually sign up on the website, and I will share that information at the end of the podcast. And then we say, okay, well, we’ve got a load of information about the AI navigation, we close the beta, and we just build. Build, build, build based on what we’ve heard, and then we’re going to look at more problems. Okay, yeah, outdoor navigation, A to B, is a massive issue, and so is orientation, trying to understand waypoints, a big problem. But so is public transportation, and that’s an entire other problem set that we’re going to look at. And then there’s providing the human fallback support as well. We’ve got a lot of things in the timeline, with the ultimate goal, as I mentioned, of being able to provide global 24/7 support by the end of next year.
Josh Anderson:
Nice. That will be awesome. Well, Liam, if our listeners want to find out more, can you share the website and ways that they can find out more, learn about the beta and everything else?
Liam Geschwindt:
Yes, absolutely. So we’ve got all the information on the website. It’s www.touchpulse, as in pulse like in your arm, .nl, so NL for the Netherlands, and then it would be /navis. There you can find everything about what we’re building in terms of the technology. If you want to learn more about us as a company, of course you can just explore around the website. There’s plenty of info there.
And if you want to send me an email at my personal address, I would actually really appreciate hearing from anybody who’s curious and has listened today. You can always reach out to me at Liam, just L-I-A-M, @touchpulse, same as the website, and then .nl again for the Netherlands. Or info@touchpulse, whatever you prefer. We’d love to hear from you. And like I said, feedback and input from the community is what is going to build out a solution that will finally solve this problem.
Josh Anderson:
Yeah, I am just absolutely excited to see where it goes and everything else. We’ll put all those links down in the show notes so that folks can easily reach out, and get access and just find out more and just kind of watch the journey as it expands. So Liam, thank you so much for coming on today, for telling us all about TouchPulse, and Navis and just the whole journey. And again, we just can’t wait to see where it goes. Thank you so much.
Liam Geschwindt:
Well, Josh, thank you so much for the opportunity as well to share everything that we’re so excited about, and I hope you guys are all excited about it too. And yeah, it’s the beginning of something, and it’s going to be great. And I’m looking forward to having you all along with us on this journey.
Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at (317) 721-7124. Send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAproject. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation or InTRAC. You can find out more about InTRAC at relayindiana.com.
A special thanks to Nicole Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fraught over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update. And I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.


