Giving sight to the blind is some kind of miracle. Though that feat continues to elude us, a Bucharest-based Intel® Innovator, Silviu-Tudor Serban, has set out to do the next best thing. Leveraging a background in computer vision and artificial intelligence, he and his team have designed Helios, which uses Intel® RealSense™ 3D cameras to give those with severe visual impairments a clearer view of the world. And for those with total blindness, Helios features like reading assistance, facial recognition, and a haptic interface provide a greatly enhanced sense of their surroundings.
Motherboard spoke with Serban about his project, and how next-gen technology can give greater independence to the 285 million people across the world who have low vision, or no vision at all.
Tell me about Helios.
What we wanted to do with Helios is use next-gen technologies to radically improve accessibility for visually impaired people. It works similarly to a VR headset, but instead of displaying computer-generated content, the user receives real-time visual data captured by the Intel® RealSense™ 3D camera. Our software processes the data to provide the most relevant information for people with low vision. For example, if there are obstacles nearby, our software can highlight them and tell the user he or she is going to bump into something. It can also recognize people's faces and assist with reading.
How did you come up with the idea?
We took part in the Intel® RealSense™ App Challenge, which was a global competition organized by Intel to create the first applications for RealSense. It had a one million dollar prize pool, and we were actually one of the winners. We built a game called Drill Sergeant Simulator, which is when we first got to explore the full range of capabilities for the Intel® RealSense™ camera and SDK. It allowed us to get to know the technology inside out. That eventually led us towards the “Aha moment,” where we knew this camera could be the secret sauce for building something to help the visually impaired.
As someone who's followed emerging technologies within the field of computer vision, what are some things we can do now that we couldn't do five years ago?
We can do more on the go. Five years ago you couldn't do much on mobile. Laptop cameras were of much lower quality. We had Kinect, which is like a big hulk of equipment that had 3D vision. But the Intel® RealSense™ camera is so small, it can fit on your phone or tablet. This is the major difference from five years ago. You have the computing power and the sensors, and it's all so small and power-efficient you can actually place it in the product.
What do you hope to be able to do in a few years that we just can't do yet?
What we're waiting for is the technology to shrink even more. Now we have a wearable prototype, but what we want is to have it smaller and lighter, while keeping all the performance. I think in a couple of years we'll get even better mobility from these kinds of devices.
What kind of equipment is the user carrying?
The current prototype features an Intel® RealSense™ SR300 3D camera, which is mounted on a headset, and an Intel 2-in-1 hybrid laptop, which the user carries in a backpack.
What was the most challenging part of making Helios work?
From my point of view, the most challenging part was to see the world from the perspective of the people we're trying to help: the visually impaired. That's vital for designing the product and for giving the software features that will make sense for the user, not me or you.
To that end, we've been working with a couple of people who have vision impairments. We have a team member named Mihai -- he's featured in our videos demoing Helios -- who was born with microphthalmia and cataract-microcornea syndrome. Basically, he can see with only one eye, and that’s up to 50 cm. We relied a lot on his insight.
Our meeting was rather serendipitous, actually. I was on the train, returning home after a Helios hardware hacking session with my team. Mihai was carrying a walking stick, and someone was helping him find his seat, which happened to be in the same compartment with me. It was incredibly random. I think the universe is particular in that way, making sure we wind up exactly where we should be.
We started talking. First, about his vision impairment and the challenges he faces every day, then about vision aids and his Master's thesis on accessibility in museums. Finally, I told him about Helios, and he said he would love to try it and help out if possible. He's been part of our team ever since.
It sounds like he has a very specific type of vision impairment. Do you have to calibrate it so it works with someone who has a different impairment?
There are some afflictions that cover most people: cataracts, glaucoma, retinitis pigmentosa, things like that. For the most part, they struggle to see a distant point. For them, Helios works right out of the box. The headgear has an adjustable lens, so that fixes some minor problems. We also have a few presets people can switch between in real time, to choose whichever is best for them. For example, some people can't see at night; for them we have a night-vision mode.
Was there a particular moment when you knew this was going to work?
I believe we have had several breakthrough moments while developing the software and hardware, but one of the most meaningful ones was when Mihai tried out the prototype for the first time. He just put the headset on and started using it with zero learning curve. He began describing objects in his line of sight, like tables, chairs, the closet or the door, sometimes touching them for confirmation. Then he gained confidence and went on exploring the area. Meanwhile, I was trying to hold back an ecstatic grin. I just knew we were on the right track.
What if someone can’t see at all? Is there anything Helios can do for them?
We actually have two versions of Helios we’re working on: Helios Light is the current working prototype, which aims to enhance sight for people suffering from low vision; and Helios Touch is our concept for improving spatial awareness for the blind. The secret ingredient for both implementations is Intel® RealSense™ technology, which is used to capture visual data.
The idea behind Helios Touch is to help people with complete blindness by using software that can describe their nearby environment and then communicate with them via haptic impulses. It uses vibration patterns to convey a level of information about what’s around the user, somewhat like painting a picture in broad strokes. If, for instance, an obstacle is encountered, the user can feel it is there before they run into it.
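The "broad strokes" mapping described above — nearer obstacles producing stronger haptic feedback — could be sketched like this. Helios Touch is only a concept at this stage, so everything here (the function, the 3 m range, the four discrete intensity levels) is a hypothetical illustration of the idea, not the team's design.

```python
def vibration_level(distance_m, max_range_m=3.0, levels=4):
    """Map an obstacle's distance to a discrete vibration intensity.

    Closer obstacles produce stronger vibration; anything at or
    beyond max_range_m produces none. Returns an int in 0..levels,
    where `levels` is the strongest pulse. All parameters are
    assumptions for this sketch.
    """
    if distance_m <= 0:
        return levels                      # touching: strongest pulse
    if distance_m >= max_range_m:
        return 0                           # out of range: no vibration
    # Divide the range into equal bands; the nearest band is strongest.
    band = int(distance_m / max_range_m * levels)
    return levels - band

for d in (0.2, 1.0, 2.9):
    print(d, "m ->", vibration_level(d))   # 4, 3, 1
```

The point of discretizing is the "broad strokes" idea from the answer: a handful of clearly distinguishable vibration strengths conveys proximity without overwhelming the user with continuous detail.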
Those with total blindness can also take advantage of other features our software has right now, like reading assistance. That allows them to do things like go to a restaurant on their own and know what's on the menu. We also have a special feature we call Argos, which is named after the all-seeing guardian giant from Greek mythology. Argos is for remote caregiving. Basically, the user can connect to a friend or relative whom they trust, and ask for help and guidance whenever they need it. That friend or relative will have access to a direct visual stream of what the user sees, which they get on their phone.
The Intel® RealSense™ technology also allows for almost instant learning of a person's facial features. So a known person, such as a friend or relative, can thus be easily recognized when they are in the line of sight of the user. More than that, the user can better understand facial cues, such as happiness or surprise, when talking to others.
What's the next step?
Our long-term plan is not only to develop the best possible solution, but also to make it affordable. In terms of areas we're working on, we've been focusing a lot on the human-computer interface, improving our computer vision and AI, and refining and reorganizing the algorithms.
Though we're developing several other core projects, our greatest focus is on Helios, because we think it has the capacity to make the greatest impact. For people with low vision it’s difficult, or even impossible, to do simple things like reading, writing, shopping, watching television, or recognizing faces. With Helios, all these types of activities become achievable. What we want is to empower people. Imagine people with vision impairment being able to better integrate in the workplace, or more easily access knowledge, or have improved mobility or an easy method of asking a friend for help -- this could change the lives of millions.
The Intel® Developer Zone offers tools and how-to information to enable cross-platform app development through platform and technology information, code samples, and peer expertise in order to help developers innovate and succeed. Join communities for the Internet of Things, Intel® RealSense™ Technology, Modern Code, Machine Learning and Game Dev to download tools, access dev kits, share ideas with like-minded developers, and participate in hackathons, contests, roadshows, and local events.
from How Helios Is Using Intel 3D Cameras to Assist People with Vision Impairments