Researchers at Goldsmiths, University of London are using haptic technology to help visually impaired audio engineers ‘feel’ sound waves.
The Haptic Wave prototype – developed by researchers at Goldsmiths – consists of a wooden board with a slider built into it. As the user moves the slider from left to right to scroll through time, a dial rises and falls with the amplitude of the waveform at that point in the recording.
The louder the audio, the higher the dial rises; it drops to the bottom for the quiet parts. “It’s an immediate, intuitive indication,” said Atau Tanaka, a professor of media and computing at the university, who worked on the EPSRC-funded research.
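The mapping the researchers describe – slider position selecting a moment in time, dial height tracking loudness – can be sketched in a few lines. The function below is a hypothetical illustration, not the Haptic Wave’s actual firmware: it takes an audio buffer and a slider position between 0 and 1, and returns a normalised dial height based on the local peak amplitude.

```python
import numpy as np

def dial_height(samples, slider_pos, window=1024):
    """Map a slider position (0.0 = start, 1.0 = end of the recording)
    to a dial height (0.0 = silent, 1.0 = loudest point in the clip).
    Illustrative sketch only; window size and peak-based loudness are
    assumptions, not details from the Haptic Wave prototype."""
    center = int(slider_pos * (len(samples) - 1))
    start = max(0, center - window // 2)
    end = min(len(samples), center + window // 2)
    local_peak = np.max(np.abs(samples[start:end]))   # loudness near the slider
    global_peak = np.max(np.abs(samples))             # normalise to full scale
    return float(local_peak / global_peak)

# Example: a 1-second 440 Hz tone that fades in, sampled at 44.1 kHz.
# Scrolling the slider to the right should raise the dial.
t = np.linspace(0, 1, 44100)
audio = t * np.sin(2 * np.pi * 440 * t)
print(dial_height(audio, 0.1) < dial_height(audio, 0.9))  # True
```

A real device would smooth this signal and drive a motorised fader rather than print a number, but the core idea – loud passages push the dial up, quiet ones let it fall – is the same.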
Before developing the Haptic Wave, which is about 30cm long and 12cm tall, Adam Parkinson, who co-authored the research, consulted a number of visually impaired audio engineers about the kind of device they wanted. “Whether you’re visually impaired or not, this technology frees you up and you can take that information in through the hands,” he said. In the future, the same technology could potentially be used to show whether a vocalist is in tune.
Parkinson said the device, which is being trialled in music studios and recording facilities across the United States and England, could be useful for audio engineers, musicians, radio producers and voiceover artists.
The Haptic Wave prototype was showcased recently at the Royal Academy of Engineering’s Innovation in Haptics event.
The prototype was just one of a number of haptic technologies on show at the event, which also featured a demonstration from Bristol-based start-up Ultrahaptics, whose technology uses ultrasonic waves to simulate the sensation of touch.
Speakers also discussed the potential for haptics in areas such as surgery and healthcare, and entertainment.