Apple added People Detection to the Magnifier app in the iOS 14.2 update. It uses the camera and LiDAR sensor in the iPhone 12 Pro and 2020 iPad Pro, and could change how low-vision users navigate space.

“Even after the pandemic, I foresee uses for such technology,” Aaron Preece, editor-in-chief of the American Foundation for the Blind’s AccessWorld, told Lifewire via email. “For example, searching for a path through a massive crowd of people where even a guide dog can’t find a way through.”

LiDAR

People Detection relies on two key features of the iPhone 12 Pro. One is the LiDAR sensor, a kind of laser radar built into the iPhone’s camera array. It lets the iPhone measure the position of objects around it, and is also used to improve the camera’s background-blurring portrait mode, for example. The other essential part is the A14 chip’s machine-learning hardware, which takes up roughly a quarter of the space on the chip. This lets the iPhone combine the spatial data from the LiDAR sensor with the camera image to recognize the people around you. One point to note: because it depends on the camera image, People Detection does not work in the dark.
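To get a feel for how those two ingredients fit together, here is a minimal sketch using Apple’s public ARKit and Vision frameworks. This is not the Magnifier app’s actual code; the class name PeopleDistanceEstimator is made up, the coordinate alignment between the color image and the depth map is simplified, and a real app would throttle the per-frame work. It simply illustrates the idea: machine-learning person detection on the camera image, combined with LiDAR depth to estimate distance.

```swift
// Illustrative sketch only -- not Apple's Magnifier implementation.
// Assumes a LiDAR-equipped device (iPhone 12 Pro / 2020 iPad Pro).

import ARKit
import Vision

final class PeopleDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth requires a LiDAR-equipped device.")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth   // per-pixel depth from LiDAR
        session.delegate = self
        session.run(configuration)
    }

    // Called for every camera frame ARKit captures.
    // A production app would run this off the main thread and skip frames.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Ask Vision to find people in the color image.
        let request = VNDetectHumanRectanglesRequest { [weak self] request, _ in
            guard let people = request.results as? [VNDetectedObjectObservation],
                  let nearest = people
                    .compactMap({ self?.depth(at: $0.boundingBox, in: depthMap) })
                    .min()
            else { return }
            print(String(format: "Nearest person is roughly %.1f m away", nearest))
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }

    // Samples the LiDAR depth map at the center of a normalized bounding box.
    // Color-image-to-depth-map alignment is simplified for this sketch.
    private func depth(at boundingBox: CGRect, in depthMap: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

        let x = Int(boundingBox.midX * CGFloat(width))
        let y = Int((1 - boundingBox.midY) * CGFloat(height))  // Vision uses a bottom-left origin
        guard (0..<width).contains(x), (0..<height).contains(y) else { return nil }

        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        return row[x]  // depth in meters
    }
}
```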

How Can People Detection Help?

Walking around without being able to see isn’t just about avoiding walls, traffic, and other hazards. Even if you know a place well, joining a queue can be tricky. Finding an empty seat on a bus or train is equally hard: how do you know which seats are free, and which are occupied? People Detection can’t replace human interaction, but it can certainly augment it.

“I think this will be so valuable when trying to navigate areas, especially in airports, hotel lobbies, and restaurants,” AFB’s major gifts specialist Melody Goodspeed told Lifewire via email. “I am excited to use the feature and get back some freedom and safety.”

The iPhone’s People Detection feature can be set with a minimum-distance alert, and can deliver warnings as a haptic vibration or through an AirPod, both of which convey augmented-reality information without interrupting your flow. It can even, according to MacStories’ Alex Guyot, detect non-human objects and “communicate how close objects or other people are.”

“The goal is to help the visually impaired understand their surroundings,” writes accessibility expert Steven Aquino at Forbes. “Examples include knowing how many people there are in the checkout line at the grocery store, how close one is standing to the end of the platform at the subway station, and finding an empty seat at a table.”
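That minimum-distance-plus-feedback pattern is easy to picture in code. The sketch below is hypothetical, not the Magnifier app’s logic: the ProximityAlerter class, its 2-meter threshold, and the choice of a haptic tap plus spoken distance are all illustrative assumptions built on standard UIKit and AVFoundation APIs.

```swift
// Illustrative sketch of distance-based alerts -- not Apple's Magnifier code.

import UIKit
import AVFoundation

final class ProximityAlerter {
    var minimumDistance: Float = 2.0   // alert when a person is closer than this (meters)

    private let haptics = UIImpactFeedbackGenerator(style: .heavy)
    private let speech = AVSpeechSynthesizer()

    /// Call with each new distance estimate (for example, from the estimator sketched earlier).
    func update(nearestPersonDistance distance: Float) {
        guard distance < minimumDistance else { return }

        // Haptic tap: stronger feedback the closer the person is.
        haptics.impactOccurred(intensity: CGFloat(max(0.3, 1 - distance / minimumDistance)))

        // Spoken distance, which routes to AirPods when they are connected.
        if !speech.isSpeaking {
            speech.speak(AVSpeechUtterance(string: String(format: "%.1f meters", distance)))
        }
    }
}
```

The point of the design is that neither channel requires looking at the screen: the alert arrives as touch or audio, which is what makes it usable while walking.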

Hang In There

There are some caveats to this technology. One is that you need to have the iPhone out and the Magnifier app active to use it, which could drain the battery pretty fast. You’ll also need a way to hold the iPhone so it can survey the scene ahead, unless you’re happy carrying it in your hand the whole time.

“Early adopters of hardware who are blind or visually impaired will doubtless benefit from this new feature,” AFB’s national aging and vision loss specialist, Neva Fairchild, told Lifewire via email. “It will be vital that they have a way to hang their iPhone from a neck strap to keep hands-free for using a cane or a dog and carrying personal items. This proves to be a challenge for this type of technology.”

Despite these barriers, People Detection is an example of Apple at its best. Not only is the company using the brand-new LiDAR sensor to augment photography, it has already folded it into the impressive range of accessibility tools on iOS. Camera, sensor, haptic feedback, audio feedback via AirPods: it’s all there, almost on day one. Now, imagine how cool this will be when integrated with Apple’s long-rumored glasses.