Digitally enhancing the visually impaired




By Bram Heijligers and Dan Hudson




Recently, we found ourselves asking, “Why are we always looking at our smartphones?”



The smartphone extends your capabilities by giving you access to a vast pool of information, allowing you to interact with your world more efficiently. It gives you complex information about almost any topic, helps you complete daily tasks and can even help you navigate a new environment. But the smartphone relies on a visual interface.



This is not so helpful to people with visual impairments, which got us thinking: “What interfaces exist that help blind and partially sighted people extend their capabilities in daily life?” For them, a different kind of interface, a non-visual one, is required.



We took to the internet to investigate what is already available and whether it is adequate.



There are several solutions that allow blind people to access digital and online information, or to have enhanced interactions with the world through the Internet of Things (IoT). Several applications have been developed, for example, to enable the visually impaired to use smartphones. Most of these applications are based on audio and haptic feedback. These are sensible channels for communicating information to the user, but there are limitations to the ways these senses are commonly used:



One problem is that a predominantly audio-based interface prevents the user from focusing on other sounds, and makes the interface more likely to interrupt activities that involve sound.


Apart from that, smartphone applications for blind people require them to occupy their hands, meaning the user cannot use their hands to sense and interact with their environment.

The concern about occupied hands is even more relevant for braille e-reader devices, which are also more cumbersome to carry around: being larger than smartphones, they can’t easily be pulled out and tucked away.


There are also devices which help the user navigate the environment using echolocation and camera systems, such as the Wayband and Sunu-band. These devices communicate how close a person is to an object using haptic feedback, but mainly serve as an extension of existing tools such as a guide dog or the well-known white cane. They extend the user’s capabilities in only a limited way. Other sensing devices, such as the Eye See, Brainport or the OrCam, rely on audio communication and seem to overload the auditory sense; they are not task-based and have only one or a limited number of functions.



The Dot wristband seems to provide an excellent, intuitive means of non-visual communication, relying mainly on braille and haptic feedback (notifying the user about a message). Its physical design and layout are, in our opinion, the best out there. However, it doesn’t help the user navigate or get to know their environment more thoroughly, nor does it provide other important benefits of the smartphone, such as rapidly processable visual information.



With all these issues in mind, the available devices have frustrating limitations. We therefore envision a device with which the visually impaired can naturally use their non-visual senses to acquire information and interact with their environment. To solve this problem as intuitively as possible and to ease adoption, we can leverage the senses and techniques other products already rely upon and combine them in a single device, so users don’t have to carry around a myriad of gadgets. We propose a wristband that uses physical and auditory input and output such as vibration, pressure, temperature, braille and, of course, sound. These input and output signals could communicate information in a way similar to braille (Dot-watch-like fashion) and Morse code; differences in location, rhythm and timing could be combined to convey different kinds of information, such as messages, notifications and instructions.
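To make the Morse-like idea concrete, here is a minimal sketch of how a short text could be translated into a timed vibration pattern. The timings, the pulse representation and the three-letter alphabet are all our own assumptions for illustration; a real device would drive vibration motors rather than return a list.

```python
# Hypothetical sketch: encode text as haptic pulses, Morse-style.
# All durations (in milliseconds) and the tiny alphabet are assumptions.

MORSE = {
    "S": "...",
    "O": "---",
    "K": "-.-",
}

DOT_MS = 100         # short pulse
DASH_MS = 300        # long pulse (3x a dot, as in Morse timing)
GAP_MS = 100         # pause between pulses within a letter
LETTER_GAP_MS = 300  # longer pause between letters

def to_pulses(text):
    """Translate text into (vibrate_ms, pause_ms) pairs."""
    pulses = []
    for ch in text.upper():
        pattern = MORSE[ch]
        for j, symbol in enumerate(pattern):
            vibrate = DOT_MS if symbol == "." else DASH_MS
            is_last_in_letter = (j == len(pattern) - 1)
            pause = LETTER_GAP_MS if is_last_in_letter else GAP_MS
            pulses.append((vibrate, pause))
    return pulses
```

For example, `to_pulses("SOS")` yields nine pulse/pause pairs: three short pulses, three long ones, then three short ones again, with a longer pause marking each letter boundary.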



The idea behind the wristband is that it provides you with information quickly and comfortably while you perform tasks. It is intended to increase the independence and comfort of blind and partially sighted people. It will feature apps (different modes of functioning) for everyday activities that are otherwise difficult for visually impaired people. It is focused on tasks, providing the feedback needed to complete those tasks through simple cues such as a short vibration or a squeeze on the wrist.



Let’s call the device the Smart-Sense and walk through some hypothetical situations to illustrate how it would be used. We’ll do one in text and two in audio, in case someone in our target audience doesn’t have access to a reading tool or braille text converter:




Imagine you’re visually impaired and you walk into a waiting room where you need to register for your appointment. You can neither see the computer used to sign in, nor operate it.



You give a signal to your Smart-Sense wristband, which scans the area ahead and communicates what is in front of you using vibration (haptic feedback) and pressure.



It indicates there is a hallway directly ahead of you.


You turn 20 degrees to the right and scan again (this could also happen automatically).


The Smart-Sense communicates via vibration and pressure that there’s a computer in front of you. You ask it what kind of computer, and it responds with a flash of heat, indicating that you should read the braille generated on the wristband’s surface. It reads: Regular PC - title: Register appointments here.
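The scenario above implies a routing decision: short confirmations go to simple haptic cues, longer text goes to the braille surface, and urgent information interrupts via sound. Here is a minimal sketch of that decision, assuming the channel names and the length threshold ourselves; none of this is a real device API.

```python
# Hypothetical sketch: pick an output channel based on how complex or
# urgent a message is. Channel names and the threshold are assumptions.

def pick_channel(message, is_urgent=False):
    """Return the output channel a message should be routed to."""
    if is_urgent:
        return "audio"    # urgent info interrupts via sound
    if len(message) <= 3:
        return "haptic"   # e.g. a short "ok" becomes a confirmation buzz
    return "braille"      # longer text is rendered on the braille cells
```

So a confirmation like `"ok"` would arrive as a buzz, while a label such as “Register appointments here” would be rendered in braille.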



You now know where to go and walk to the computer.



Sadly, the computer isn’t blind-friendly, but your Smart-Sense is equipped with an NFC chip and starts vibrating to indicate its proximity to the scan area. You move your hand towards the sensor, following the vibrations, until you receive haptic and audio confirmation that your chip has been scanned. You’re now registered for your appointment.
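The guidance in this step boils down to one mapping: the closer the wrist gets to the scan area, the stronger the vibration. A minimal sketch of that mapping, assuming a hypothetical ranging sensor and a 30 cm guidance range of our own choosing:

```python
# Hypothetical sketch: map distance to the scan area onto vibration
# intensity (0.0 = off, 1.0 = touching). The 30 cm range is an assumption;
# `distance_cm` would come from the device's ranging sensor.

MAX_RANGE_CM = 30.0  # assumed distance at which guidance kicks in

def vibration_intensity(distance_cm):
    """Linear ramp: stronger vibration as the hand closes in."""
    if distance_cm >= MAX_RANGE_CM:
        return 0.0
    clamped = max(distance_cm, 0.0)
    return round(1.0 - clamped / MAX_RANGE_CM, 2)
```

At 30 cm or beyond the band stays silent; halfway in, it vibrates at half strength; at contact it reaches full intensity and the confirmation cue can fire.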



Here are the two audio plays we created to give you more detail about the interface we’ve designed and how we think it could help a blind or partially sighted person live their life with greater ease and comfort. Please give these a listen!



Scenario one - Cleaning up the house




Scenario two - Meeting friends at the bar




The transcripts for these audio plays are here.



We’ve also prepared a concept document for those interested in a more detailed design of the device and its functionality.



Smart-Sense product design document:





So, to sum it up, what makes the Smart-Sense so special?



- One multi-functional, task-based device

- Brings the advantages of the smartphone to the visually impaired

- Communicates intuitively and doesn’t overload the auditory sense

- Environment & object scanning enables easy navigation & perception

- Upgradable software

- Hands-free & comfortable

- Intuitive design which doesn’t limit everyday use

- Useful in noisy & crowded environments

- Different kinds of intuitive communication for different levels of information complexity

- Internet of Things capability providing a seamless experience



Digitally enhancing the visually impaired is a monumental challenge. There is a myriad of devices out there, each solving a specific problem, which in turn would require the user to wear a whole collection of them. With the Smart-Sense wristband we aim to solve that problem. So, what do you think? Did we overlook anything? Do you like the design? Feel free to send us an e-mail with your thoughts!



Bram Heijligers

Game Designer, Producer, Academic

Rotterdam, The Netherlands