An Apple patent application discovered on Thursday hints that the company is looking to bring a virtual navigation system based on panoramic location data to mobile devices, much like the popular "Street View" feature in Google Maps.
Published by the U.S. Patent and Trademark Office, Apple's "3D Position Tracking for Panoramic Imagery Navigation" describes a graphical user interface that leverages an iPhone or iPad's onboard sensors to navigate panoramic imagery.
Source: USPTO
Instead of the traditional approach, Apple proposes that the tracking subsystems and onboard sensors within a mobile device be used to translate a user's physical motion into a panoramic navigation UI. In the examples that follow, data from accelerometers, cameras, gyroscopes and other sensors is used to "move" a user through virtual street-level panoramic space.
The invention notes that a user must first enter the street-level view, which can be accomplished by "pinching in" on a map or by selecting a dropped pin icon. Once in street view mode, a user can move their device up, down, left or right to view panoramic imagery supplied either from built-in storage or streamed wirelessly over cellular data networks. Movement through the scene is controlled by moving the device forward and back.
Illustration of device translational movement from original position (104) with informational overlay (103b).
Throughout the process, onboard sensors collect movement data, including linear motion and velocity, and translate those motions into the GUI.
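The filing keeps the mechanics of that sensor-to-GUI translation abstract, but the general idea maps naturally onto the motion APIs already available on iOS. The sketch below is purely illustrative and assumes a hypothetical `PanoramaScene` rendering layer: Core Motion's device-motion updates turn attitude changes into panning, and forward/back acceleration nudges the virtual position, roughly as the application describes.

```swift
import CoreMotion

// A minimal sketch, not Apple's actual implementation. `PanoramaScene` is a
// hypothetical rendering layer; only its interface matters here.
protocol PanoramaScene: AnyObject {
    var yaw: Double { get set }      // horizontal look direction, radians
    var pitch: Double { get set }    // vertical look direction, radians
    func advance(byMeters meters: Double)
}

final class PanoramaMotionController {
    private let motionManager = CMMotionManager()
    private let scene: PanoramaScene

    init(scene: PanoramaScene) {
        self.scene = scene
    }

    func startTracking() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            // Rotating or tilting the device pans the panoramic imagery.
            self.scene.yaw = motion.attitude.yaw
            self.scene.pitch = motion.attitude.pitch
            // Pushing the device forward or pulling it back (linear acceleration
            // along the device's z-axis) moves the virtual position down the
            // street; the 0.05 scale factor is an arbitrary placeholder.
            self.scene.advance(byMeters: -motion.userAcceleration.z * 0.05)
        }
    }

    func stopTracking() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```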
Further, the filing notes that informational bubbles can be overlaid on the virtual environment to point out places of interest such as buildings or shops. Information is stored in layers, an example being "businesses," and can be displayed according to a user's preferences. In some embodiments, the bubbles can be hidden to reduce clutter on smaller device screens.
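For illustration only, a layered overlay like the one described might be modeled along these lines; the type names, fields and the "businesses" example are assumptions rather than anything spelled out in the filing.

```swift
// Illustrative only: each layer can be toggled per user preference, and a
// hidden layer contributes nothing to what gets drawn on screen.
struct InfoBubble {
    let title: String                                     // e.g. a shop or building name
    let coordinate: (latitude: Double, longitude: Double)
}

struct InfoLayer {
    let name: String        // e.g. "businesses"
    var isVisible: Bool     // user preference; hide to reduce on-screen clutter
    var bubbles: [InfoBubble]
}

struct BubbleOverlay {
    var layers: [InfoLayer]

    /// Only bubbles belonging to layers the user has enabled get rendered.
    var visibleBubbles: [InfoBubble] {
        layers.filter { $0.isVisible }.flatMap { $0.bubbles }
    }
}
```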
In an alternative implementation, the system can derive movement data from an imaging sensor using a technique called "optical flow," which reads the apparent motion of objects in a panoramic image relative to an observer. By scaling that distance data, the device can display the user's appropriate virtual location within the environment.
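The filing doesn't detail how the flow field itself would be computed, but the scaling step can be sketched in isolation: average the apparent per-feature motion between camera frames and convert it into a change in virtual position. The flow vectors and the calibration constant below are assumed inputs, not something taken from the patent.

```swift
import simd

// Sketch of the scaling step only: the per-feature flow vectors would come
// from an image-processing stage (feature tracking between camera frames)
// not shown here, and `metersPerPixel` is an assumed calibration value.
struct OpticalFlowEstimator {
    let metersPerPixel: Double

    /// Averages the apparent motion of tracked features (in pixels) and
    /// scales it into the observer's change in virtual position (in meters).
    func virtualDisplacement(from flowVectors: [SIMD2<Double>]) -> SIMD2<Double> {
        guard !flowVectors.isEmpty else { return .zero }
        let average = flowVectors.reduce(SIMD2<Double>.zero, +) / Double(flowVectors.count)
        // The scene appears to move opposite to the observer, so negate the flow.
        return -average * metersPerPixel
    }
}
```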
One particularly intriguing idea is the use of multiple displays to increase the visible area of a panoramic image. Devices can communicate wirelessly to display concurrent information regarding the virtual environment.
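As a rough sketch of how that extra screen area might be divided up, each participating device could render an adjacent angular slice of the shared panorama. How the devices discover and talk to one another over the wireless link is assumed and left out; the function and names below are illustrative only.

```swift
// Illustrative only: splits a panorama centered on the user's heading into
// contiguous angular slices, one per participating device.
struct PanoramaSlice {
    let startAngle: Double   // degrees
    let endAngle: Double     // degrees
}

func panoramaSlices(centeredOn heading: Double,
                    totalFieldOfView: Double,
                    deviceCount: Int) -> [PanoramaSlice] {
    guard deviceCount > 0 else { return [] }
    let sliceWidth = totalFieldOfView / Double(deviceCount)
    let leftEdge = heading - totalFieldOfView / 2
    return (0..<deviceCount).map { index in
        let start = leftEdge + Double(index) * sliceWidth
        return PanoramaSlice(startAngle: start, endAngle: start + sliceWidth)
    }
}

// Example: three devices side by side share a 210° view around a 90° heading.
let perDevice = panoramaSlices(centeredOn: 90, totalFieldOfView: 210, deviceCount: 3)
```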
Finally, the application mentions that some implementations could make use of interior imaging data, allowing users to "walk into" a building using their device. Once inside a structure, other actions can be performed, such as "selecting an object for purchase," though further detail on that level of functionality is not discussed.
Apple's iOS Maps currently lacks a street-level viewing option as it simply doesn't have the imaging data. This feature, which is available on Google's mapping service thanks to its Street View initiative, was sorely missed by some iOS device users with the introduction of the Maps app in iOS 6.
It is unclear if and when Apple will implement the invention in a future iteration of Maps, but the filing shows the company is at least actively investigating a competitor to Google's solution.
Apple's patent application was filed in September of 2011 and credits Patrick Piemonte and Billy Chen as its inventors.
Data source: Apple Insider (By Mikey Campbell)