UK-based designer Emilios Farrington-Arnas has created a wearable for blind and low-vision users, with an emphasis on merging fashion with technology.
Maptic uses small feedback units, which can be clipped onto clothing or worn around the wrist, and a visual sensor worn as a necklace. The necklace sensor detects obstacles ahead, helping the user protect the upper body. Together, the sensor and feedback units read the user's surroundings and, using GPS and a voice-controlled iPhone app, provide step-by-step directions. The feedback units deliver haptic feedback (vibrations) on the left or right side of the body to let the wearer know when to turn.
The iPhone app uses voice commands to free up the user's hands, while the haptic feedback units guide the user with vibration.
From the developer:
Maptic is a system of wearable sensory devices for the visually impaired, consisting of a visual sensor and vibrating feedback units.
These customisable, personalisable modules can be worn without attracting the stigma that current assistive products harbour, while still accurately detecting objects in the visual field and transmitting them into intuitive vibrations on the body.
Maptic won the James Dyson Foundation Scholarship for innovation in design solutions.
Quality of life

Visual impairment affects approximately 1 in 28 people over 40, with 2 million people in the UK living with sight loss. Contrary to popular belief, full blindness is relatively uncommon: 18% (360,000) of visually impaired people are registered as legally blind, and the remaining 1.64 million retain partial sight.
For the visually impaired, retaining independence in day-to-day life is important. However, navigation can be a long-winded, potentially unsafe process, especially in new environments or at night.
Simple tasks like travelling to work can become arduous and time-consuming and, more often than not, need planning beforehand.
While interviewing and observing those with visual impairments provided a good basis for research, to get a true sense of being partially sighted, simulation glasses were constructed and tested over extended periods of time.
Hearing becomes the dominant sense for detecting immediate dangers, and unfortunately most current devices rely purely on audio-based feedback.
While visually impaired people may not feel handicapped and often believe their impairment doesn't hinder them in most tasks, the general public frequently holds incorrect assumptions about them. There is often a feeling of pity that comes with blindness, one that is unwelcome to partially sighted people.
Current assistive products tend to look overly medical, with emphasis on function rather than aesthetics or ergonomics. Touchpoints cater to any remaining eyesight and tend to use high-contrast, vibrant colours – often garish and even ugly.
As the number of assistive products and technology breakthroughs increases, so does product abandonment: 29.3% of all visual impairment devices in use end up abandoned. It is the user who decides to stop using a product, and user experience can be a major factor in that decision.
When interviewed, users of assistive products expressed a dislike that most are unattractive, both to the touch and to any remaining sight. Many complained of undesirability and feeling self-conscious when using the products, especially if they looked medical.
A good example of an assistive product done right is the Bradley watch by Eone. It has become so desirable, in fact, that what was initially designed for partially sighted people has been widely adopted by the fully sighted, and was even available on the Dezeen Watch Store.
A discreet solution, not a medical one
With stigma as a focal point, the initial design stage incorporated designing concepts that would be invisible or concealed to the public.
Along with aesthetics, key points involved intuitiveness, ease of use and adequate feedback, and were investigated through journey maps.
A major part of the project was developing the effectiveness of hazard perception and the intuitiveness of tactile feedback. This was tackled through iterative electronics design prototyping and sensor testing.
This was as much an exercise in learning electronics as it was in design prototyping, and over the course of 30+ prototypes, the complexity of the electronics grew.
Current assistive devices and sensing products (think parking sensors) detect via ultrasonic sensors: sound is pinged out, reflected off an object and returned to the sensor as a signal. These components are cheap; however, they tend to be bulky and can often be unreliable.
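The ultrasonic principle is simple to sketch: the sensor times the round trip of the ping and converts it to a distance using the speed of sound. The snippet below is an illustrative calculation, not Maptic's firmware:

```python
# Converting an ultrasonic echo time into a distance, as parking sensors
# (and the early Maptic prototypes) do. Illustrative sketch only.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def echo_to_distance_m(round_trip_s: float) -> float:
    """The ping travels to the obstacle and back, so halve the path."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

# A 10 ms round trip corresponds to roughly 1.7 m to the obstacle.
print(echo_to_distance_m(0.01))
```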
Initial prototyping and testing commenced with ultrasonic sensors but during the project, a new form of sensor was released: the long-range, time-of-flight sensor. This sensor is a fraction of the size of an ultrasonic sensor, and is much more reliable.
The smaller sensor allowed for a drastically smaller form factor, making the design look even less cumbersome and, in the long run, less like an assistive product.
As the design incorporates a sensor and feedback devices, ensuring that the user gets good quality, intuitive haptics is important.
Prototype testing explored designs such as inflation bands and vibrating insoles, but ticking vibrations (similar to sonar) proved to give the best response for judging distances; these were prototyped with mini vibration motors and specialised haptic driver chips.
Through iteratively prototyping and designing hand in hand, the project ended up as a series of wearable units, consisting of a central sensor unit worn on the chest, and feedback units worn on the left and right sides of the body.
All interactions with the units have been designed with touch in mind. At any given point, by running a finger down a device, the user can detect the state of the system: whether it is on or off, set for indoors or outdoors, or which orientation the sensor is in. The sensor (circle), left (square) and right (triangle) units were given different profiles so the user can immediately tell which one they are touching.
The units are versatile: unclipping the top section allows the user to choose an attachment that suits their needs and style. Charged through a simple 3.3V headphone jack on a custom dock, the units can also plug into one another, creating bracelet or necklace forms.
‘Maptic’ (a play on map and haptic) was chosen as the brand name, one that doesn’t have immediate medical connotations and would be more closely linked to a lifestyle brand.
Turn by Turn Navigation
A major part of the project is turn-by-turn navigation, where the user is directed to a destination via vibrations rather than the spoken directions current solutions rely on.
Rather than building a GPS module into the circuitry, which would increase the size of the units while decreasing battery life, the Maptic units link to a smartphone, using the phone's existing GPS capabilities and the Google Maps API to provide navigation data.
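The core of vibration-based guidance is deciding which side should buzz: compare the phone's compass heading with the bearing of the next manoeuvre and cue left or right. The sketch below is assumed logic for illustration, not the actual Maptic app:

```python
# Illustrative sketch (assumed logic, not Maptic's app): turning the
# difference between the user's heading and the bearing of the next
# manoeuvre into a left/right haptic cue.
def turn_cue(heading_deg: float, target_bearing_deg: float,
             tolerance_deg: float = 15.0) -> str:
    """Return which feedback unit should vibrate: 'left', 'right', or 'none'."""
    # Normalise the signed difference into (-180, 180].
    diff = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= tolerance_deg:
        return "none"  # on course, stay silent
    return "right" if diff > 0 else "left"

# Facing north (0°) with the route bearing east (90°): buzz the right unit.
print(turn_cue(0, 90))  # right
```

The modular wrap-around at 360° matters: facing 350° with a target of 10° is a small right turn, not a near-full left rotation.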
While the function is already part of the phone, the interaction is limited to voice input and output. To complement this, a custom app was built with high-contrast colours and shapes, allowing the user to reach home with a single tap, or a chosen destination with a couple of swipes. The app also lets the user check the status of the units and whether they are paired correctly.
These high contrast colours and geometric shapes ultimately dictated the design language of the branding of the project.