I saw this yesterday at Christian Spanring’s blog, and didn’t really understand it. He suggests Apple is getting into pedestrian navigation.
After reading this from TechRadar, I’m still confused. It suggests that mapping would be delivered like podcasts and somehow keyed to location, yet with no Wi-Fi, GPS or network connection required. Huh?
I tracked back to Electronista. It spins it a bit differently:
The implementation would see an application or service turn map data into a series of audio and video elements based on location; driving directions and other maps could be spoken aloud with a view of the map at that location as a guide.
My take is that it’s another way to store and deliver LBS information (directions, content and ads) outside of satnav devices (which frankly are doing much the same thing - just not on Apple hardware). Further, it suggests that GPS, while not yet available on Apple devices, might be coming, and that this system could fill in when GPS or another positioning method was not available.
Next up, the actual patent. Here’s the abstract:
Improved techniques to facilitate generation, management and delivery of personalized media items for users are disclosed. Users are able to influence or control content within a personalized media item. According to one aspect, personalized media items can pertain to generation and delivery of map-based media items. These media items are playable by a media playback device. For example, when a map-based media item is played by a media playback device, an audio output and/or a visual output can be provided. The audio output can be provided by digital audio, and the visual output can be provided by at least one digital image that is associated with at least a portion of the digital audio.
The claims seem to indicate that you request the data (a route), it’s delivered from a server to your device, and playback includes audio and at least one image. It’s fairly broad and seems to be far more about media delivery than anything else.
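Read literally, the claimed flow is simple enough to sketch: a route goes in, a "map-based media item" (digital audio plus at least one image tied to a portion of that audio) comes back, and the device plays it. Here's a minimal, hypothetical model in Python - every name below is mine, not from the patent:

```python
from dataclasses import dataclass, field

# Hypothetical model of the claimed flow: a "map-based media item"
# bundles digital audio with at least one image, each image
# associated with a portion of the audio. None of these names or
# structures come from the patent itself.

@dataclass
class MediaItem:
    audio: bytes                 # stand-in for spoken directions
    images: list                 # at least one map image per the abstract
    # offsets (seconds into the audio) at which each image is shown
    image_offsets: list = field(default_factory=list)

def build_route_media(route):
    """Server side (hypothetical): turn a requested route into
    audio plus one map image per step."""
    audio = " then ".join(route).encode()   # placeholder for synthesized speech
    images = [f"map_tile_for({step})" for step in route]
    offsets = [i * 5.0 for i in range(len(route))]
    return MediaItem(audio=audio, images=images, image_offsets=offsets)

def play(item):
    """Client side: audio output plus a visual per audio portion."""
    for img, t in zip(item.images, item.image_offsets):
        print(f"t={t:>4.1f}s  show {img}")
    print(f"play audio ({len(item.audio)} bytes)")

item = build_route_media(["turn left on Main St", "merge onto I-90"])
play(item)
```

The point of the sketch is how little is actually specified: the "media item" is just packaged audio with images keyed to it, which is why the claims read more like media delivery than navigation.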