Augmented reality seems cool, unless you are blind. How can blind people function in an environment that does not physically exist and is based entirely on visual perception? Enter Ultrahaptics, a technology that uses ultrasound to create perception based on touch rather than vision. The French company Immersion is combining Ultrahaptics with Microsoft's HoloLens to create an augmented reality populated with objects that can be touched. Early applications are expected in medicine, to reduce unnecessary contact with equipment; in the kitchen, to control appliances without spreading foodborne bacteria; and in automobiles, to let drivers operate controls without reaching for them and taking their eyes off the road. Anyone with a bit of imagination can think of countless other ways to apply this technology, but cost will likely limit how widely it is adopted.