Seeing AI is the name of Microsoft’s artificial intelligence technology for recognizing objects and people, first shown four years ago and later launched as an app on the App Store. With the update to version 4.0, its recognition technology gains a new ace up its sleeve: support for the LiDAR scanner on the iPhone 12 Pro and Pro Max.
With Seeing AI’s recognition technology, people with low vision can recognize people and facial expressions, read text, identify objects and even identify currency. Until now the app relied solely on visual recognition through the camera, but version 4.0 adds the LiDAR scanner of the iPhone 12 Pro to the mix, allowing users to explore the world in 3D with the help of spatial audio.
LiDAR at the service of accessibility
Seeing AI version 4.0 is loaded with new features. The most important of them is the new World channel, exclusive to those who use the app on an iPhone 12 Pro or Pro Max running iOS 14, or, presumably, on future Apple devices that include a LiDAR scanner.
The LiDAR can be used in conjunction with headphones to “explore an unknown place in 3D”: surrounding objects are announced through the headphones, with spatial audio indicating their location in the room. Another novelty is the ability to search for a specific object in the room, with a virtual audio beacon attached to it to guide the user.
A final use for the LiDAR of the iPhone 12 is the haptic proximity sensor: point the phone’s camera at the surroundings and feel, through vibration, how far away nearby objects are.
The update also redesigns the app’s interface with higher contrast and a wider camera field of view, improves descriptions when browsing photos on the phone, increases the accuracy of text recognition, and adds seven more languages: Czech, Danish, Finnish, Greek, Hungarian, Polish and Swedish.
Microsoft Seeing AI
Via | MSPowerUser