Map data are overlaid on satellite imagery. A road segment within the map data is identified, and the satellite imagery indicates that the road segment is at a different geographic position than a geographic position indicated by the map data. The endpoints of the road segment in the map data are aligned with the corresponding positions of the endpoints in the satellite imagery. A road template is applied at an endpoint of the road segment in the satellite imagery, and the angle of the road template that matches the angle of the road segment indicated by the satellite imagery is determined by optimizing a cost function. The road template is iteratively shifted along the road segment in the satellite imagery. The geographic position of the road segment within the map data is updated responsive to the positions and angles of the road template.
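The patent abstract above is dense, so here is a toy sketch of the idea in Python. This is absolutely not Google's implementation, just an illustration under simple assumptions: the "satellite imagery" is reduced to a 2D grid of edge strengths, the "road template" is a short line segment, the cost function is the negative edge strength under the template, and angles are searched over a discrete set.

```python
import math

def template_cost(edges, x, y, angle, length=5):
    """Cost of placing a short line template at (x, y) with the given angle.
    Cost is the negative sum of edge strengths under the template, so a
    lower cost means the template lies along a stronger road edge."""
    total = 0.0
    for step in range(length):
        px = int(round(x + step * math.cos(angle)))
        py = int(round(y + step * math.sin(angle)))
        if 0 <= py < len(edges) and 0 <= px < len(edges[0]):
            total += edges[py][px]
    return -total

def best_angle(edges, x, y, n_angles=36):
    """Optimize the cost function over a discrete set of candidate angles,
    as the patent describes matching the template angle to the imagery."""
    candidates = [2 * math.pi * i / n_angles for i in range(n_angles)]
    return min(candidates, key=lambda a: template_cost(edges, x, y, a))

def trace_road(edges, x, y, steps=3, stride=4):
    """Iteratively shift the template along the road, recording the
    positions and angles that would then drive the map-data update."""
    path = []
    for _ in range(steps):
        a = best_angle(edges, x, y)
        path.append((x, y, a))
        x += stride * math.cos(a)
        y += stride * math.sin(a)
    return path
```

A real system would of course work on actual imagery with a learned or gradient-based cost function, but the skeleton (place template at endpoint, optimize angle, shift, repeat) mirrors the claimed process.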
Now before you get your pitchforks, let's look at exactly what Google is proposing here. This is a computer-automated process, not one that most GIS people have ever done by hand. Read the claims section to learn more about what exactly this process is. It is interesting that they use TIGER as an example of a dataset that could be improved.
They could simply donate their map updates to OSM. Right, my bad; TIGER is a great example of a dataset that doesn't line up with satellite imagery.
As a long-time Mac user I've used AppleScript to automate many workflows. Now AppleScript is pretty powerful, but it's unique (well, I've always thought it was like HyperTalk, but that's pretty unique too).
During WWDC Apple noted that there are no fewer than 680,000 apps in the App Store that use location data.
That's a lot, and most of them end up using Apple's Map API, mostly because it's easy (or lazy; I would be lazy). Last week Mapbox announced a solution that might be easier than Apple's and a whole lot better:
We just released Mapbox GL — a new framework for live, responsive maps in every iOS app. Now developers can have the most detailed maps sourced from ever-updating OpenStreetMap data, as well as the ability to fully control the style and brand to design maps that perfectly match their app. This is all done using our new on-device vector renderer, which uses OpenGL ES 2.0 technology for pixel-perfect map design, from antialiased fonts to polygon blurring, all hardware-accelerated and optimized for mobile devices — and all on the fly.
Of course it is open source and available on GitHub. I've not had a chance to play with it yet, but I hope to sit down with it soon. It looks great, is built on OSM, and is released as open source, so no matter what happens to Mapbox¹ I'll still have access to the mapping library. Really amazing stuff.
¹ Not that I know a thing, but it's always a good question ↩
Indoor positioning is hard. The amount of effort that has been put into it since smartphones first started to take off has given us marginal results. WiFi, LTE/3G, and GPS get close, but they're still not accurate enough to tell me that the butter is right in front of my face. I've experienced Apple's iBeacon technology in the Apple Store and at AT&T Park in San Francisco, but that's a tad proprietary. Possibly Apple's got a solution though (still proprietary, but what do I care, I'm all iOS):
In iOS 8, Apple is adding some new Core Location features that let app developers get precise indoor positioning data from an iOS device's sensors, and it's even letting venues contribute by signing up to get help enabling indoor positioning.
Yeah, so it's iOS only, but get a load of this:
Up until now, CoreLocation has been using a combination of Cellular, GPS, and WiFi technology in order to provide developers with location information from their users. Those technologies can get you within a city block or in some cases close to or inside a venue, but they aren’t enough to provide accurate positioning indoors or features like indoor navigation. That’s why with iOS 8, Apple is introducing new features for the CoreLocation API that will let developers tap into an iPhone’s M7 processor and motion sensors in order to get accurate indoor location, navigation, and floor numbers.
It's actually very interesting. To save power, once the location is determined, GPS is turned off and Core Location uses the motion chip to figure out where you are walking and what's nearby. The drawback, according to 9to5Mac, is that venues are going to be required to provide Apple with their floor plans and RF information. Not exactly open…
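The GPS-off trick described above is classic pedestrian dead reckoning: take a known fix, then advance it by one stride per detected step along the sensor-reported heading. This toy Python sketch is not Apple's implementation (Core Location does this on-device with the M7), just an illustration of the concept under an assumed fixed stride length:

```python
import math

def dead_reckon(start, step_headings, stride_m=0.7):
    """Advance a known (x, y) fix, in meters, by one stride per detected
    step, each step moving along its reported heading in radians."""
    x, y = start
    for heading in step_headings:
        x += stride_m * math.cos(heading)
        y += stride_m * math.sin(heading)
    return (x, y)
```

Drift accumulates with every step, which is why a real system would keep correcting against WiFi/beacon fixes and, as the article notes, the venue's own floor plans and RF survey data.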
I can't help but laugh at this news a bit. I'm no educator, but read up.
The dominant player in the world of geographic information systems is making free accounts to its advanced mapping software available to an estimated 100,000 K-12 schools in the U.S.
So, free accounts for something that adults can't even figure out how to use. I'm sure that will work out great. My son already thinks GIS is too complicated; now he might have to figure out how to manage Esri credits with ArcGIS Online? I admit, though, I've been really busy lately so I haven't kept up on the cost of AGOL:
In its release, Esri says the value of an account for the software is $10,000, leading to the $1 billion valuation for the entire donation ($10,000 × 100,000 U.S. K-12 schools).
AGOL costs $10,000? Oh, well played, Esri! You're a "big player"!