Today Apple was awarded a patent that puts them on the augmented reality map. Though, from the sounds of it, it won’t be directly competing with Google Glass… for now. According to AppleInsider, the patent is titled “Synchronized, interactive augmented reality display for multifunction devices.” Apple customers using iOS will have information overlaid on top of their video feed, viewable in real time on the device. They will also be able to tag or add annotations to a real object viewed through the screen.
The patent describes the following:
A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device.
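The pipeline the abstract describes can be sketched in a few lines: identify objects in a live frame, build an information layer that holds user annotations, and composite the layer with the video for display. This is a minimal illustrative sketch, not Apple's actual API; every name here (`InformationLayer`, `identify_objects`, `composite`) is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    label: str       # user-entered text for a tagged object
    position: tuple  # (x, y) touch location on the screen

@dataclass
class InformationLayer:
    annotations: list = field(default_factory=list)

    def add_annotation(self, label, position):
        # In the patent's terms: an annotation made by the user
        # through the touch sensitive surface.
        self.annotations.append(Annotation(label, position))

def identify_objects(frame):
    # Placeholder for on-device object recognition; a real system
    # would run computer vision here. For the sketch we pretend the
    # frame already carries detected object names.
    return frame.get("objects", [])

def composite(frame, layer):
    # Combine the live video frame with the information layer
    # for display on the device.
    return {
        "video": frame["pixels"],
        "overlay": [(a.label, a.position) for a in layer.annotations],
    }

# Usage: tag an object seen through the camera.
frame = {"pixels": "<raw camera data>", "objects": ["Eiffel Tower"]}
layer = InformationLayer()
for obj in identify_objects(frame):
    layer.add_annotation(obj, (120, 240))

display = composite(frame, layer)
```

The interesting design point is the separation: the information layer is generated independently of the video and only merged at display time, which is what would let annotations be stored, shared, or synchronized separately from the footage itself.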
This is an interesting development. It appears as if Apple wants to take the reverse approach to gathering data. If I had to take a guess based on the current competitive landscape, I think they plan on building a user-generated database of real objects tagged with descriptive information. Let’s face it… augmented reality through a mobile device is still largely impractical. It makes sense for Apple to gather preliminary data that would enhance the user experience when they do enter the augmented reality hardware battle to compete with Google Glass. The question is… will this be enough for Apple to catch Google in the race for future augmented reality market share?