Apple has included a new accessibility feature called People Detection in the latest iOS 14.2 beta; it is part of the Magnifier app.
It uses augmented reality and machine learning to detect where humans and objects are in space. The addition was first spotted in a September report by Juli Clover of MacRumors.
The purpose of People Detection is to aid blind and low-vision users in navigation; this type of application is well suited to the LiDAR sensor in the iPhone 12 Pro. The goal is to help the visually impaired understand their surroundings: knowing how many people are in the checkout line at the grocery store, how close one is standing to the end of the platform at the subway station, or finding an empty seat at a table. Another use case is in this era of social distancing, since the software can tell you if you're within six feet of another person in order to maintain courtesy and safety.
Users can set a minimum distance for alerts (say, six feet for the aforementioned social distancing) and can choose to receive those notifications as haptic feedback. There is also audible feedback: a person wearing one AirPod will be notified when they are in close proximity to another person. People Detection is fully compatible with VoiceOver, Apple's screen-reader technology.
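Apple has not published the internals of People Detection, but the building blocks it describes (LiDAR depth, on-device people detection, haptic alerts) are all exposed in public frameworks. The Swift sketch below shows one way a similar distance alert could be approximated using ARKit's scene depth, Vision's human-rectangle detection, and a UIKit haptic generator; the class name, the six-foot threshold constant, and the single-pixel depth sampling are illustrative assumptions, not Apple's implementation.

```swift
import ARKit
import Vision
import UIKit

// Hypothetical sketch of a People Detection-style proximity alert.
// Assumes a LiDAR-equipped device (e.g. iPhone 12 Pro); not Apple's actual code.
final class ProximityAlerter: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)
    private let alertDistance: Float = 1.83   // roughly six feet, in meters (assumed threshold)

    func start() {
        // Per-pixel LiDAR depth is only supported on Pro-class devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = [.sceneDepth]   // request the LiDAR depth map each frame
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Use Vision to find people in the current camera image.
        let request = VNDetectHumanRectanglesRequest()
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right, options: [:])
        try? handler.perform([request])
        guard let people = request.results, !people.isEmpty else { return }

        // Sample the LiDAR depth map at the center of each detected person.
        // A real implementation would handle image/depth orientation and noise more carefully.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        for person in people {
            let center = CGPoint(x: person.boundingBox.midX, y: person.boundingBox.midY)
            let x = min(Int(center.x * CGFloat(width)), width - 1)
            let y = min(Int((1 - center.y) * CGFloat(height)), height - 1)
            let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
            let meters = row[x]   // depth in meters at that pixel

            if meters > 0, meters < alertDistance {
                haptics.impactOccurred()   // buzz when someone is inside the threshold
            }
        }
    }
}
```

Sampling a single depth pixel per person keeps the sketch short; a production version would average depth across the bounding box, respect device orientation, and add the audible and VoiceOver feedback the feature describes.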
Note: A limitation of People Detection is that it does not work in the dark.