Uber Technologies Inc. announced that it has entered into a master agreement with Google under which the ride-hailing company will get access to Google Maps Platform rides and deliveries services.
I mean, today Uber uses Google Maps within its app, even on iOS. This is basically a continuation of the previous agreement with some changes that better align with how Uber does business. Rather than being based on the number of requests Uber makes to Google Maps services, pricing is now based on billable trips booked through Uber, a much more manageable deal for Uber. Last year, it came out that Uber had paid Google $58 million over the previous three years for access to Google Maps. This quote really strikes me as bold:
“We do not believe that an alternative mapping solution exists that can provide the global functionality that we require to offer our platform in all of the markets in which we operate. We do not control all mapping functions employed by our platform or Drivers using our platform, and it is possible that such mapping functions may not be reliable.”
For as much money as Uber has invested in mapping, they don’t believe their technology is reliable enough to roll out to the public. That is mapping services in a nutshell: when your business depends on the best routing and addressing, you pick Google every time. All that time and effort to build a mapping platform, and they still pay another company tens of millions of dollars.
I’ve read so much about how Uber is just about ready to release its own mapping platform built on OSM. But in the end the business requires the best mapping platform and routing services, and clearly nobody has come close to Google in this regard. Google Maps is not just the standard but practically a requirement at this point.
I started blogging in May of 2005, right before Katrina hit and everything we knew about GIS disaster response changed. Katrina was the moment where the static PDF image of a map gave way to a map service that ran in almost any modern (at the time) web browser. Immediately, every GIS map server out there became irrelevant at best, dead to the world at worst. Remember, though, Google bought Keyhole (which became Google Earth) almost a year before Katrina, and Google Maps didn’t launch until early 2005. The tools that created this disaster response revolution were in place, but not too many people used them or had even heard of them. Yet less than six months after Google Maps hit the web, the Katrina response was almost entirely driven by Google’s tools.
If you look at my blog entries from September and October, you can see attempts by Esri, Microsoft, Yahoo! and others to address this new paradigm of mapping, but none of them stuck. Everyone, and I mean everyone, was using Google. Esri ArcScripts back then probably had 50 tools to convert SHP to KML or MXD to KML. We had tools like Arc2Earth that specialized in making maps easier with Google. And while Esri tools were still being used to generate the data, the display was happening on other platforms.
This of course gave rise to the Neogeography revolution. I’ll spare you the bare-breasted Andrew Turner graphic, but at this time we had so many people doing things with GIS who had no idea what GIS was, let alone what Esri was. The barriers to getting started with mapping dropped, and all you needed was a computer and a text editor to make a map. My blog is littered with examples of Neogeography, from EVS Islands to all that great Flickr mapping that Dan Catt and crew did back then. People didn’t ask for permission; they just did it. It all culminated in what I consider the greatest crowdsourced disaster mapping effort, the wildfires in San Diego back in 2007 (feel free to choose the Haiti response over this, that’s fine; I really like this example of using Google My Maps in your own backyard).
But something happened after this, and it isn’t that people stopped mapping. Look at OSM growth: the amount of crowdsourced data continues to grow exponentially. But responses to disasters seemed to be run by Google and Microsoft themselves. Tools like Google My Maps continue to exist, but I truly can’t recall using one in the past 10 years. Or if the disaster was not interesting enough for Google, you’d see people using government websites to get that information. Esri mapping had finally caught up, to the point that people would use the fire maps from the DOI and other three-letter agencies without complaining. The citizen effort moved to Twitter, where it continues to show great promise, just not as a Google My Map. Take a look at the Bush Fire here in Arizona on Twitter: so many great posts by people, but the maps are either shared static images or links to traditional InciWeb maps.
This brings us full circle to COVID-19 mapping. Think of the best and most up-to-date COVID websites: they are built on Esri technology. Google has websites, and Microsoft has them too, but the Esri dashboard has finally had its moment in the sun. I wonder if this is because the market has matured, because the tools have matured, or because the data set lends itself to a more scientific approach to display rather than simple lines and points. The Johns Hopkins COVID-19 Maps & Trends website is the bible for this epidemic.
GIS is no longer a sideshow in this response. I’m guessing that because this is more structured government data, Esri is uniquely positioned to be in the middle of it, but even then, their tools have come a long way from the ArcIMS/ArcWeb madness we dealt with during Katrina. The COVID-19 dashboard is the opposite of Neogeography, and that is OK. The influence of citizens on mapping is clearly shown in the Esri tools we deal with today. They still drive me nuts from time to time, but let’s be honest, they really do work for this situation. As we close out the first half of 2020, hopefully we can keep the need for disaster response to a minimum.
I’ve been working on this blog post all weekend and I’ve rewritten it many times. It comes back to the confusion about why Mapillary and Facebook are now part of the same team. I wrote down about 10 guesses as to why Facebook decided it needed Mapillary, and needed it now, but Joe Morrison did such a good job outlining many of them that I’ll just share his post here. Go read it and come back when you’re done; I’ll wait.
Welcome back. Now what do I think about this? Hard to say, honestly; I can talk myself out of any idea. Get back at Google? I don’t think things are that emotional. Sure, Facebook probably should own its own mapping solution, since sending all its users onto another platform is leaking out the secret sauce and is probably a boon for Google. But this is something they have already been working on, and I can’t see how Mapillary, as amazing as it is, moves the needle on this at all. Any work toward a Facebook Maps platform has been done and is probably close to shipping. I could see that amazing Mapillary team being an acqui-hire that helps in the long term, given their expertise with OpenStreetMap.
Computer vision, AR/VR and the rest *could* be a reason, but remember that Facebook owns Oculus and has done so much in AR that, again, Mapillary is a rounding error here. While Oculus has not paid out the way I’m sure Facebook hoped it would, the engineering and development teams there have clearly influenced Facebook. Mapillary, as amazing as those guys are, just doesn’t have the horsepower that the existing AR/VR/CV teams at Facebook do. Again, maybe an acqui-hire.
The places database is of course the holy grail of mapping. The maps are a commodity, but the places are not. And let’s be honest, there are very few companies that have better place data than Facebook. They might not have had street-level imagery, but they sure had more pictures of these venues than almost anyone else. I get that people like street view data, but how often do people really say, “let me see a street view image from 2011” when they are looking at directions? THEY DON’T. Street view is the coffee shop mapping example: it sounds interesting and looks great in demos, but in the end it is not as important as a 3D world built from satellite imagery and lidar. But wait, that’s where Mapillary does come in.
The most likely reason, I feel, that Facebook bought Mapillary is their expertise with OpenStreetMap and OpenSfM. Facebook is one of the largest users of OSM out there, so bringing in a group that is as experienced with OSM, if not more so, helps move the needle on their mapping efforts. The second thing Mapillary brings is their skill at making 3D worlds out of imagery. As I said, who has better pictures of venues than Facebook? Start stitching those together and you get an amazing 3D city that is updated quicker than driving stupid cars down streets. Encourage people to take pictures and they update the 3D world for you. That, and they get some of the best OSM ninjas out there, all at once.
Now what happens to the crowdsourced data? Will people continue to participate, given there are few companies more reviled for their data practices than Facebook? That is what I’m most interested in: Mapillary the product, does it continue? Time will tell.
If you search my blog you’ll find an interesting post titled earthgoogle. Well, it really isn’t that interesting; it just has a link to download Google Earth and a link to my blog. So what is this thing and why does it have such a weird title?
For those who might not remember, 2005 was a crazy time for GIS blogs. Katrina brought satellite imagery to everyone, and people searched the internet for ways to find out more. Google Earth was probably the easiest and best way for the average person to learn more about satellite imagery and get some really helpful tools to mark up an area.
About this time in September 2005, I noticed a lot of people arriving at my blog via the search term “earthgoogle”. Like most people who blogged back then, I loved to talk about blogging, so I created a simple blog post asking what this was all about.
To all those reaching this site using MSN search with the term “earthgoogle” hello. You’ve been filling up my server logs with this request. I’m curious why you’ve typed this in to only MSN search and not Google/Yahoo/other search engines.
So obvious, right? MSN users, not typing a URL correctly. Anyway, what this blog post of mine actually did was make that page the number one result in Google for the search term “earthgoogle”. I got so much traffic by being the way most people who didn’t understand how URLs work found Google Earth. Eventually I changed the page to what you see now.
I put Google AdSense on that page too. I mean everywhere (I really wish I had taken a screenshot because it was so tacky). The result of that tackiness was that I was making over $1,000 a month in ad revenue from that blog post alone. People who wanted to find “earthgoogle” apparently also liked to click on ads.
Eventually the page died down and people stopped being directed to my blog via searches for “earthgoogle”. I probably pulled the ads off the blog in 2006 and couldn’t have cared less. But the page remains, a reminder of how crazy Google Earth was back in 2005.
Google Fusion Tables – Are you kidding me? This stuff is “teh awesome”. Fusion Tables are going to be more “killer” than Google Maps was. Yup, pay attention.
“teh awesome”? Seriously, who says that? Well, I guess I did, and that’s OK. Was it more “killer” than Google Maps? Obviously not. It’s not that Fusion Tables was wrong; it’s just that there are now so many alternatives that it no longer matters the way it did when it first arrived.
Well, if you’re like me, you probably have a lot of data in Fusion Tables, and Google just sent out an email explaining how to get it out.
If you created many tables over the years, we’ve made it easy to download all your data in one step with a new dedicated Fusion Tables option in Google Takeout. You can save the rows, metadata and geometries of any base tables that you own, and export this data in the following formats: JSON, CSV and KML.
It’s a really nice tool; I just tried it myself on some baseball data that I had in there. Google explains the tool as follows:
The data for each table is saved to its own “archive”. The data will be saved in a Google Sheet; for datasets beyond the size limits of Sheets, you’ll get a CSV. This archive is stored in a top level folder called “ft-archive” in your Drive.
A Google Maps visualization is automatically created with the archived data. This map preserves many of the original Fusion Tables styling configurations. Any changes you make to the Sheet or CSV will appear in the map visualization.
A listing of all archived tables is stored in a Sheet. This handy Sheet is called “ft-archive-index” and lives within the “ft-archive” folder. The index Sheet summarizes each run of the archive tool and preserves the visualization URLs with encoded styles. Each time you run the archive tool, you will get additional archives based on the current data in your tables along with corresponding new rows in the archive directory.
You have until December 3, 2019 to get your data out. Google Takeout makes it easy, which is really nice.
Yes, everyone knows about What3Words. It was an attempt to come up with an easy way to assign addresses to places that have none. In the end, a proprietary addressing system will never gain traction, and of course the inevitable eventually happened. My personal feeling is that What3Words never really got us beyond x/y numbering, and the logic behind a true addressing system was not there. Enter Plus Codes, which come at this problem from a different perspective. There is a very detailed analysis of existing methods and why they chose to go this direction, which I’ll leave to you to read.
Probably the biggest reason to pay attention is that this open addressing system was developed by Google. In fact, they are already implementing it in India as we speak, which goes a very long way toward making this happen.
All these systems are built on the idea that the world is a grid, and how deeply you drill down into that grid is your address. Things need not be a single point; they can be an area, which opens up many exciting ideas for addressing, especially outside of North America and Europe. Check out the GitHub project to learn more.
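To make the grid idea concrete, here is a minimal sketch of the core Plus Code (Open Location Code) encoding, written from the published spec. It is illustrative only; the reference library in that GitHub project also handles padding, code shortening, extra precision, and validation.

```python
# Base-20 digit set from the Open Location Code spec; lookalike characters
# (0/O, 1/I, etc.) are deliberately excluded.
ALPHABET = "23456789CFGHJMPQRVWX"

def encode(lat, lng):
    """Encode lat/lng into a standard 10-digit Plus Code (8 digits, '+', 2 digits)."""
    lat = min(max(lat + 90.0, 0.0), 180.0 - 1e-9)  # shift latitude into [0, 180)
    lng = (lng + 180.0) % 360.0                    # shift/wrap longitude into [0, 360)
    code = ""
    resolution = 20.0                              # first digit pair covers a 20x20 degree cell
    for _ in range(5):                             # 5 pairs of digits
        lat_digit, lat = divmod(lat, resolution)
        lng_digit, lng = divmod(lng, resolution)
        code += ALPHABET[int(lat_digit)] + ALPHABET[int(lng_digit)]
        resolution /= 20.0                         # each pair drills 20x deeper into the grid
        if len(code) == 8:
            code += "+"                            # separator after the 8th digit
    return code

print(encode(0.0, 0.0))  # -> 6FG22222+22
```

Each additional digit pair divides the cell by 20 in each direction, which is exactly the drill-down-into-a-grid idea: a short code names a region, a long code names a doorstep.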
Well, I’m not sure how much this has to do with Waze being owned by Google, but PhantomAlert is suing Waze.
Before the advent of GPS and navigation apps, cartographers sneaked “paper towns” and “trap streets” into their maps—fake points of interest that they used to detect plagiarism. If someone copied their map, it would be easily identifiable through the inclusion of those locations. That same trick has found its way into modern-day mapping systems: A new lawsuit brought against Google and its traffic app Waze cites sham points of interest as evidence that the Google-owned service copied from a competitor’s database.
Apparently these two companies tried to make a deal before Google snapped up Waze, and PhantomAlert is alleging that Waze used their database to “boost its profile”. One of the biggest concerns in the OpenStreetMap community is allowing these intentional mistakes into their database. The concept of the Copyright Easter Egg is well documented on the OSM website.
A Copyright Easter Egg, in terms of mapping, is a feature that is drawn in a distinctive way in order to help identify its original author. It may be a nonexistent, or slightly or heavily distorted, map feature, or its name may be wrongly or unusually spelt.
The supposed main purpose of such a feature is to strengthen the author’s case in a copyright dispute. If he can show that his own unique feature appears in the defendant’s work, it is easier to prove that the defendant’s work is a copy of his.
Hey look, I got to use the new Google logo already!
Yeah, so if this is true, PhantomAlert has a pretty good idea that Waze stole their data, and it could mean big trouble for Google. Having a closed database like this opens Waze up to these kinds of lawsuits because they are unable to have the community police the data. The big question is whether this data was imported into Waze intentionally or by accident. I don’t think the latter will get them off the hook, but if there was intent, it could be costly. We’ll have to see. The Waze byline about “outsmarting traffic, together” might not be too smart.
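The trap-entry mechanism itself is trivially easy to check for: seed fake POIs into your own database, then look for them verbatim in a suspect database. This is a hypothetical sketch; the names, coordinates, and matching tolerance are all invented for illustration (Agloe being the classic “paper town”).

```python
def find_trap_matches(trap_pois, suspect_db, tolerance=1e-4):
    """Return names of planted fake POIs that also appear in suspect_db.

    Both arguments map a POI name to a (lat, lng) tuple.
    """
    matches = []
    for name, (lat, lng) in trap_pois.items():
        for other_name, (other_lat, other_lng) in suspect_db.items():
            if (name.lower() == other_name.lower()
                    and abs(lat - other_lat) <= tolerance
                    and abs(lng - other_lng) <= tolerance):
                matches.append(name)
    return matches

# A nonexistent speed trap planted in one database shows up in the other.
traps = {"Agloe Speed Trap": (41.9512, -74.9044)}
suspect = {
    "Agloe Speed Trap": (41.9512, -74.9044),      # copied trap entry
    "Main St Red Light Cam": (40.7128, -74.0060), # legitimate-looking entry
}
print(find_trap_matches(traps, suspect))  # -> ['Agloe Speed Trap']
```

One match might be coincidence; dozens of fabricated features reproduced exactly is what makes this evidence in a copyright dispute.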
Indoor mapping is the white whale of our Spatial IT industry. We’re always reading about how our smartphones will lead us to the best deals, or how I can find the specific nail I need in Home Depot without having to ask anyone or walk down every aisle. The key to all this is essentially iBeacon.
You can search Google News for all the latest excitement on the concept, but essentially it is a way for your phone to know where things are, and for vendors to know where your phone is, via Bluetooth. Imagine walking into a store, getting an alert that your favorite beer is on sale, and then being able to navigate directly to it. Sexy, right? Plus we’ve been anticipating this happening for years. Except…
Google was set to launch a new product that added context to one of its most successful apps, Google Maps. But earlier this year, it was shut down by Alphabet CEO Larry Page, according to people familiar with the project.
Google Here worked by sending a notification to a smartphone user’s lock screen within five seconds of their entering a partner’s location. If the user clicked on the notification, a full screen HTML5 “app” experience would launch. Google Here would know when to send the notification via Google Maps and beacons placed in the stores of participating partners. Google planned to supply the beacons to partners for the launch, according to the document. The experience could also be found by going to the Google Maps app.
Exactly what we thought everyone wanted. In testing, the application was deemed too invasive, and Google feared no retailers would sign up. That’s right: Google didn’t think it could get its partners to install cheap beacons in their stores AND it feared the product was too Big Brother. Seems weird, doesn’t it? If there is one company that can get companies to spend money on ads, it is Google. And since when did Google ever think pushing ads on us was “invasive”?
The magic of Google Here (Here as in not the Here that was owned by Nokia) was that you didn’t need an app running for it to work. Think about that for a minute: ads would appear on your phone based on where you were, and you didn’t need to opt in to get them. Now we see why Google was very concerned that Here was going to get a large backlash. Being able to push ads on users is something they really could have sold well to companies; I’m not sure there was any fear of companies not wanting to push ads on us.
Beacons are still very important to Google. Their Eddystone project describes lots of uses for beacons, but not ad delivery. Clearly there was feedback on this project, and it jolted Google out of their normal sell-more-ads business model. I think beacons will be very valuable as they start appearing in more areas, but I for one don’t need to get an ad for fabric softener every time I walk into a Target.
If there is one thing that gets the GeoMarketers excited, it is natural disasters. Every year at this time we see hurricane trackers, 3D storm viewers, and other “exciting” products to help protect us from the wrath of Mother Nature. This year Google is promoting its Personalized Storm Tracker.
The safety recommendations you receive will be tailored to reflect the current status of the event and your context. For example, if you search for a specific storm when it’s still several days away, you may see a map of the developing weather event and a recommendation to start preparing an emergency kit. If the storm is only hours away from your location, you might receive a reminder to start charging your phone in case power goes out. And if you search when the storm is nearby, you’ll get the most urgent information, like how to avoid injury from fast-moving water or flying debris.
I feel like these things are more trouble than they are worth. Last Sunday I was at Sky Harbor Airport when a dust storm alert went off on all the iPhones in the baggage claim area. The whole place erupted in noise as the alerts went off, warning us that wind and dust were headed our way. The result of this great warning? People making jokes. Sure, an alert was issued, and sure, it probably is a safety benefit that we all get these alerts on our phones. But the delivery isn’t personalized; it’s a broadcast message followed by 10 minutes of people joking about the alerts.
At least in the USA, storms don’t sneak up on anyone. These products make great press but are of little value.
Well, good news for those who want to help a down-on-its-luck company like Google update its maps.
Google Map Maker, the tool which allows anyone around the world to contribute information to Google’s worldwide map, has re-opened in 45 countries after going live again in 6 countries two weeks ago. The product was temporarily shut down in May after it was discovered that some nefarious edits to the map, like geographic polygons shaped to depict an Android peeing on what is ostensibly an Apple logo, were being approved.
If you want to help Google, just go to Google Map Maker and start editing. Just know that your edits will get locked up and used to make a ton of money. Here in the USA you can’t create polygons yet, but I suppose that capability will be back soon.