Apple has added a new accessibility feature to the newly released iOS 14.2. The feature uses augmented reality and machine learning to detect how close people and objects are to the iPhone. The aim is to help blind and low-vision individuals better navigate the world.
The feature grew out of Apple’s ARKit “People Occlusion” capability, which determines whether someone is in the camera’s field of view and estimates how far away the object or person is. The LiDAR sensor in the iPhone 12 Pro and Pro Max makes these estimates even more accurate.
The detection works by sending out a short burst of light across the camera’s field of view and measuring how long the light takes to return to the LiDAR scanner. The feature’s primary limitation is that it does not work in dark or low-light environments.
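The underlying time-of-flight arithmetic is simple: distance is half the round-trip travel time multiplied by the speed of light. The sketch below is purely illustrative and is not Apple's implementation:

```python
# Illustrative time-of-flight arithmetic (not Apple's actual code):
# a LiDAR scanner measures the round-trip time of a light pulse, and
# one-way distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(seconds: float) -> float:
    """Return one-way distance in meters for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# A pulse returning after roughly 13.3 nanoseconds corresponds to about 2 meters.
print(round(distance_from_round_trip(13.3e-9), 2))  # ~1.99
```

The nanosecond scale of these round trips is why a dedicated sensor is needed rather than the camera alone.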
The feature is implemented as part of the Magnifier app and can provide feedback to the user in several ways. It can verbally announce when someone is nearby; if the user is browsing the supermarket, it will say how close the person or object is in feet or meters, updating regularly as the user moves.
Unique tones can also be set to correspond to different distances, alerting the user through a changing tone as an object gets closer. Users can instead receive a haptic pulse that indicates an object's distance. Both options are ideal for users with visual or hearing impairments.
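Conceptually, this kind of feedback is a mapping from measured distance to tone and pulse rate. The sketch below is a hypothetical illustration; the thresholds, tone labels, and pulse rates are invented here and are not Apple's actual values:

```python
# Hypothetical distance-to-feedback mapping; thresholds and labels
# are illustrative assumptions, not Apple's actual implementation.

def feedback_for_distance(meters: float) -> dict:
    """Map a measured distance to a tone pitch and haptic pulse rate.
    Closer objects produce a higher tone and faster haptic pulses."""
    if meters < 1.0:
        return {"tone": "high", "haptic_pulses_per_sec": 4}
    elif meters < 2.0:
        return {"tone": "medium", "haptic_pulses_per_sec": 2}
    else:
        return {"tone": "low", "haptic_pulses_per_sec": 1}

print(feedback_for_distance(0.5)["tone"])  # a close object maps to the high tone
```

The key design idea is that the feedback channel (audio or haptic) is interchangeable, which is what lets the same distance data serve users with either visual or hearing impairments.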
This feature is a timely release given the measures governments have introduced to tackle the COVID-19 pandemic. Users with visual impairments can set an alert at six feet to help them follow social distancing protocols. It is also practical for day-to-day activities such as queuing at the grocery store checkout, walking in the park, or finding empty seats on the subway or at a restaurant.
The feature is available with the iOS 14.2 update; users can find it by tapping the people icon on the far right of the Magnifier toolbar.