Apple has two new patent applications out today (via AppleInsider) that show it has been working on boosting the iPhone's navigation abilities with augmented reality features. The filings describe a way the iPhone could use its camera, in combination with on-device software, to generate virtual maps of your surroundings, overlay them on a live camera feed, and do things like provide a look inside nearby buildings.
The patents describe a way to use GPS, Wi-Fi, and sensor information to pinpoint a user's position and then download a 3D model of their surroundings, including points of interest. To compensate for the fact that sensor data alone would likely produce an imperfect match between the virtual and real environments (as has been the case in similar AR mapping apps), Apple's system uses the live video feed from the iPhone's camera and lets the user actively line up virtual elements with their real-world counterparts in the resulting image. Once a match is locked in, users can do things like virtually "peel back" the outer layers of buildings to reveal their interiors.
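The filings are light on implementation detail, but the broad flow they describe (coarse position fix, model download, user-guided alignment) maps onto existing iOS building blocks. Below is a minimal Swift sketch of that flow, assuming a hypothetical model server; the SceneModel type, the fetchSceneModel placeholder, and the applyUserAlignment hook are invented for illustration and are not part of the patents or any Apple API.

```swift
import CoreLocation
import simd

// Hypothetical sketch of the flow the patents describe: get a rough
// device position, fetch a 3D model of the surroundings, then let the
// user nudge the overlay until it lines up with the camera feed.
// SceneModel, fetchSceneModel, and applyUserAlignment are invented
// names for illustration only.

struct SceneModel {
    var buildings: [String]          // point-of-interest identifiers
    var transform: simd_float4x4     // pose of the model in camera space
}

final class ARNavigator: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private var scene: SceneModel?

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
    }

    // Step 1: GPS/Wi-Fi gives a coarse fix, typically good to tens of meters.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // Step 2: download a 3D model of the area around the fix.
        scene = fetchSceneModel(near: fix.coordinate)
    }

    // Placeholder for a network call to a hypothetical model server.
    private func fetchSceneModel(near coordinate: CLLocationCoordinate2D) -> SceneModel {
        SceneModel(buildings: [], transform: matrix_identity_float4x4)
    }

    // Step 3: sensor data alone leaves the overlay slightly off, so the
    // user drags virtual edges onto their real counterparts; the drag is
    // folded into the model's pose as a correction transform.
    func applyUserAlignment(_ correction: simd_float4x4) {
        guard var scene = scene else { return }
        scene.transform = correction * scene.transform
        self.scene = scene
    }
}
```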
Apple already offers its "Flyover" feature in iOS Maps, which provides a 3D representation of satellite imagery for a more interactive cruise through city streets, but it has yet to employ AR tech. AR apps with functionality similar to what these patents describe have been built before, by companies including Layar and other startups, but Apple's system is clearly designed to be less rough around the edges, more of a utility than a neat tech trick.
iOS Maps could use some flashy features to help it stand out, but I wouldn't expect this one to make its way into shipping software anytime soon. Still, it's an interesting direction, and it could pair nicely with Apple's indoor positioning ambitions.