Touch is an “often overlooked sense”, according to Ultrahaptics CTO Tom Carter. “We value our sight and our hearing really highly, but we don’t really think about the value the sense of touch gives us.”
Bristol-based Ultrahaptics will be presenting its mid-air haptics technology at Pioneers Festival in Vienna next month. The technology uses ultrasound to create the sensation of touch, and Carter envisions Ultrahaptics in virtual reality and in everyday use around the home.
The startup raised £10.1 million in funding last year and has inked partnership deals with a number of universities to collaborate on research. tech.eu caught up with Carter ahead of his talk at Pioneers Festival.
tech.eu: How exactly does mid-air haptics work and why did you start working on this?
Carter: We’ve created a technology that lets you feel things in mid-air without touching anything. It lets you feel buttons, dials, and feedback for gestures, as well as 3D shapes, objects, and textures in the air. It works by focusing ultrasound into the air and gently vibrating the surface of the skin; we can change that vibration to create different sensations, textures, and shapes.
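The focusing Carter describes is, in general terms, the phased-array principle: each transducer’s emission is delayed so that all the waves arrive at the focal point in phase and reinforce each other. Here is a minimal illustrative sketch of that geometry, assuming a 40 kHz carrier and a hypothetical 2x2 transducer array; this is not Ultrahaptics’ actual hardware or algorithm.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0     # Hz; an assumed, commonly cited ultrasonic frequency
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def phase_delays(transducers, focal_point):
    """For each transducer position (x, y, z) in metres, compute the phase
    offset (radians) that makes its wave arrive at the focal point in phase
    with all the others: the farther a transducer is, the earlier it fires."""
    distances = [math.dist(t, focal_point) for t in transducers]
    farthest = max(distances)
    # Express each transducer's head start as a phase within one cycle.
    return [(2 * math.pi * (farthest - d) / WAVELENGTH) % (2 * math.pi)
            for d in distances]

# A tiny hypothetical 2x2 array in the z=0 plane, 1 cm spacing,
# focusing 20 cm above its centre.
array = [(0.0, 0.0, 0.0), (0.01, 0.0, 0.0),
         (0.0, 0.01, 0.0), (0.01, 0.01, 0.0)]
delays = phase_delays(array, (0.005, 0.005, 0.2))
```

With the focal point directly above the array’s centre, all four transducers are equidistant from it, so all four delays come out as zero; moving the focal point off-centre produces non-zero offsets for the nearer transducers.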
The technology originated at the University of Bristol. I did my PhD there, largely working on this technology, and then continued that work within the university for two or three years. We had enough interest from potential customers and investors that the time was right, so I and two others spun the company out in November 2013.
What do you see as the main applications or uses for Ultrahaptics’ technology?
I think there are two categories of applications. One is controlling devices without touching them: interacting with real physical devices around your home, whether it’s your kettle, your car, your cooker, or your television, by holding your hand out, feeling controls projected onto your hand, and gesturing to interact with those devices.
The other potential is in virtual reality. I think the main purpose of Ultrahaptics in virtual reality is to give back that sense of touch, the tactile properties of the objects you interact with in the virtual world, so that you prolong the immersion the user is experiencing.
If you look back at the history of virtual reality, it’s such a new field and difficult to get right. The ultimate goal is this nirvana where you are completely immersed in a different world and you don’t realise that it’s not real. That means the story of VR has always been: where does the immersion break down? First there were big, heavy headsets; then the graphics were too low resolution and you felt nauseous when you moved. Now the head tracking is really good, so you can lose yourself looking around, but then you look down and you don’t have a body. Now you can have a body.
I did a VR demo of flying a helicopter and looked down: I had my hands in my lap, and the pilot was holding the control stick. It all looked normal, but when I moved my arm the pilot in the game didn’t, and that sort of jarring break suddenly reminds you that you’re sitting in a room with a headset on and it’s not real. Now you have hand tracking, so you can move your hands.
I think the next place it’s going to break down is this: you now have these hands you can move around, but when you reach out and put your hand into an object, it doesn’t feel of anything. You get no tactile feedback to let you know that you’ve touched it, grabbed it, or manipulated it.
Just how strong are the ultrasound vibrations and the sensation of touch?
The feedback is a vibration on your hand. It’s not a strong resistive force, so we can’t stop your hand: you’ll still be able to put your hand through objects, but we can give you a sensation to let you know you’ve touched them, and we can provide certain vibrations to give you an idea of the texture.
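Skin cannot perceive an ultrasonic carrier directly, so mid-air haptics work described in the research literature modulates the ultrasound’s amplitude at the low frequencies skin mechanoreceptors respond to; varying that modulation is one way to suggest different textures. A simplified sketch of such a modulated signal at the focal point, with the carrier frequency, sample rate, and envelope shape all assumptions for illustration rather than Ultrahaptics’ actual parameters:

```python
import math

CARRIER_HZ = 40_000.0    # assumed ultrasonic carrier, imperceptible on its own
SAMPLE_RATE = 400_000.0  # samples per second for this simulation

def modulated_pressure(t, modulation_hz, depth=1.0):
    """Normalised acoustic pressure at the focal point at time t: a 40 kHz
    carrier whose amplitude follows a slow envelope the skin can feel.
    Different modulation frequencies would feel like different sensations."""
    # Raised-cosine envelope, oscillating between 0 and `depth`.
    envelope = 0.5 * depth * (1 - math.cos(2 * math.pi * modulation_hz * t))
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)

# One full cycle of a 200 Hz "buzz" sensation, sampled for simulation.
samples = [modulated_pressure(n / SAMPLE_RATE, 200.0)
           for n in range(int(SAMPLE_RATE / 200.0))]
```

The envelope starts at zero and peaks mid-cycle, so the vibration pulses 200 times per second even though the underlying carrier oscillates 200 times faster.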
You recently signed an agreement with the University of Tokyo to conduct research together. Tell us about that.
We came out of the University of Bristol, so we came from the academic community and we want to keep that relationship with where we came from. We have partnerships with a number of universities; the University of Tokyo is the latest, alongside the University of Glasgow, the University of Sussex, Bristol, and a few others.
We don’t want to shut the research community out and do everything ourselves. We want to engage with them and keep working with them. By forming these partnerships, we can share information and progress so that we don’t try to do the same thing at the same time; there’s no point in duplicating effort to advance some aspect of mid-air haptics. When a university or an academic has a project, there may be something we can bring that accelerates their work by six months. It’s about speeding everything up for all parties.
You’re also in talks with companies to explore possible commercial uses for your technology in the future. How is that coming along?
It’s moving on very quickly. We launched an evaluation programme that gives companies the opportunity to get hands-on with the technology and develop concepts and prototypes in their own offices. It has sold well, particularly in the automotive sector, where I think most of the automotive companies are working on this kind of technology in their labs.
We have a number of companies moving through the pipeline with target products identified, and we’re working towards finishing off the integration work for future products ready for launch. We’re moving along very quickly.