Categories
Thoughts

Uber and Google Sign 4 Year Agreement on Google Maps

This is one of those surprised/not surprised things.

Uber Technologies Inc. announced that it has entered into a master agreement with Google under which the ride-hailing company will get access to Google Maps Platform rides and deliveries services.

I mean, today Uber uses Google Maps within their app, even on iOS. This is basically a continuation of the previous agreement with some changes that better align with how Uber does business. Rather than being priced on the number of requests Uber makes to Google Maps services, the deal is based on billable trips booked through Uber, a much more manageable arrangement for Uber. Last year, it came out that Uber paid Google $58 million over the previous three years for access to Google Maps. This quote really strikes me as bold:

“We do not believe that an alternative mapping solution exists that can provide the global functionality that we require to offer our platform in all of the markets in which we operate. We do not control all mapping functions employed by our platform or Drivers using our platform, and it is possible that such mapping functions may not be reliable.”

For all the money Uber has invested in mapping, they don’t believe their technology is reliable enough to roll out to the public. That is mapping services in a nutshell: when your business depends on the best routing and addressing, you pick Google every time. All that time and effort to build a mapping platform, and they still pay another company tens of millions of dollars.

I’ve read so much about how Uber is on the verge of releasing their own mapping platform built on OSM. But in the end the business requires the best mapping platform and routing services, and clearly nobody has come close to Google in this regard. Google Maps is no longer just the standard; it is almost a requirement.


It is Different With COVID-19…

I started blogging in May of 2005, right before Katrina hit and everything we knew about GIS disaster response changed. Katrina was the moment when the static PDF image of a map gave way to a map service that ran in almost any modern (at the time) web browser. Immediately, every GIS map server out there became irrelevant at best, dead to the world at worst. Remember, though, Google bought Keyhole (which became Google Earth) almost a year before Katrina, and Google Maps didn’t launch until early 2005. The tools that created this disaster response revolution were in place, but not many people used them or had even heard of them. Yet less than 6 months after Google Maps hit the web, Katrina response was almost entirely driven by their tools.

Remember this? Don’t try and pan!

If you look at my blog entries from September and October of that year, you can see attempts by Esri, Microsoft, Yahoo! and others to address this new paradigm of mapping, but none of them stuck. Everyone, and I mean everyone, was using Google. Esri ArcScripts back then probably had 50 tools to convert SHP to KML or MXD to KML. We had tools like Arc2Earth that specialized in making maps easier with Google. And while Esri tools were still being used to generate the data, the display was happening on other platforms.

This of course gave rise to the Neogeography revolution. I’ll spare you the bare-breasted Andrew Turner graphic, but at this time we had so many people doing things with GIS who had no idea what GIS was, let alone what Esri was. The barriers to getting started with mapping came down, and all you needed was a computer and a text editor to make a map. My blog is littered with examples of Neogeography, from EVS Islands to all that great Flickr mapping that Dan Catt and crew did back then. People didn’t ask for permission, they just did it. It all culminated in what I consider the greatest crowdsourced disaster mapping effort, the wildfires in San Diego back in 2007 (feel free to choose the Haiti response over this, that’s fine; I really like this example of using Google My Maps in your backyard).

In all fairness, Andrew wasn’t literally saying it killed GIS.

But something happened after this. It isn’t that people stopped mapping; look at OSM’s growth. The amount of crowdsourced data continues to grow exponentially. But responses to disasters seemed to be run by Google and Microsoft themselves. Tools like Google My Maps continue to exist, but I truly can’t recall using one in the past 10 years. And if a disaster wasn’t interesting enough for Google, you’d see people using government websites to get that information. Esri’s mapping had finally caught up to the point that people would use the fire maps from the DOI and other three-letter agencies without complaining. The citizen effort moved to Twitter, where it continues to show great promise, just not as a Google My Map. Take a look at the Bush Fire here in Arizona on Twitter: so many great posts by people, but the maps are either shared static images or links to traditional InciWeb maps.

This brings us full circle to COVID-19 mapping. Think of the best and most up-to-date COVID-19 websites: they are built on Esri technology. Google has websites, and Microsoft has them too, but the Esri dashboard has finally had its moment in the sun. I wonder if this is because the market has matured, the tools have matured, or the data set lends itself to a more scientific approach to display rather than simple lines and points. The Johns Hopkins COVID-19 Maps & Trends website is the bible for this epidemic.

GIS is no longer a sideshow in this response. I’m guessing that because this is more structured government data, Esri is uniquely positioned to be in the middle of it, but even then, their tools have come a long way from the ArcIMS/ArcWeb madness we dealt with during Katrina. The COVID-19 dashboard is the opposite of Neogeography, and that is OK. The influence of citizens on mapping is clearly visible in the Esri tools we use today. They still drive me nuts from time to time, but let’s be honest, they really do work for this situation. As we reach the halfway point of 2020, hopefully we can keep the need for disaster response to a minimum.


Waze sued for allegedly stealing data from another navigation app

Well, I’m not sure how much this has to do with Waze being owned by Google, but PhantomAlert is suing Waze.

Before the advent of GPS and navigation apps, cartographers sneaked “paper towns” and “trap streets” into their maps—fake points of interest that they used to detect plagiarism. If someone copied their map, it would be easily identifiable through the inclusion of those locations. That same trick has found its way into modern-day mapping systems: A new lawsuit brought against Google and its traffic app Waze cites sham points of interest as evidence that the Google-owned service copied from a competitor’s database.

Apparently these two companies tried to make a deal before Google snapped up Waze, and PhantomAlert is alleging that Waze used their database to “boost its profile”. One of the biggest concerns in the OpenStreetMap community is allowing these intentional mistakes into their database. The copyright Easter egg is well documented on the OSM website.

Copyright Easter Egg, in terms of mapping, is a feature that is drawn in a distinctive way in order to help identify its original author. It may be a nonexistent, or slightly or heavily distorted, map feature, or its name may be wrongly or unusually spelt.

The supposed main purpose of such a feature is to strengthen the author’s case in a copyright dispute. If he can show that his own unique feature appears in the defendant’s work, it is easier to prove that the defendant’s work is a copy of his.
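To make the trap-street idea concrete, here is a minimal sketch of how a plaintiff might check whether planted POIs resurface in a competitor’s database. All the data, names, and the matching rule here are hypothetical illustrations, not anything from the actual lawsuit:

```javascript
// Illustrative sketch (hypothetical data): if fabricated POIs from one
// database show up verbatim in another, that's strong evidence of copying
// rather than independent survey work.

function findCopiedTraps(trapPois, suspectPois) {
  // Key each POI by lowercased name plus rounded coordinates so tiny
  // precision differences don't hide an otherwise exact copy.
  const key = (p) => `${p.name.toLowerCase()}|${p.lat.toFixed(4)}|${p.lon.toFixed(4)}`;
  const suspectKeys = new Set(suspectPois.map(key));
  return trapPois.filter((p) => suspectKeys.has(key(p)));
}

// Hypothetical example: one planted POI appears in both databases.
const traps = [
  { name: "Agloe General Store", lat: 41.9556, lon: -74.9059 }, // fictional place
  { name: "Trap Street Cafe", lat: 33.4484, lon: -112.074 },
];
const suspect = [
  { name: "Agloe General Store", lat: 41.9556, lon: -74.9059 },
  { name: "Real Diner", lat: 33.45, lon: -112.07 },
];

console.log(findCopiedTraps(traps, suspect).length); // prints 1
```

Real-world matching would of course be fuzzier (distorted geometry, misspelled names), but the principle is the same: the odds of two teams independently inventing the same nonexistent place are essentially zero.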

Hey look, I got to use the new Google logo already!

Yeah, so if this is true, PhantomAlert has a pretty good case that Waze stole their data, and it could mean big trouble for Google. Having a closed database like this opens Waze up to these kinds of lawsuits because the community can’t police the data. The big question is whether this data was imported into Waze intentionally or by accident. I don’t think the latter will get them off the hook, but if there was intent it could be costly. We’ll have to see. The Waze byline about “outsmarting traffic, together” might not look too smart.


Google Maps Gets Lost in Photos

Look, I love iOS, but I still use Google Maps as much as possible because it works better than any other mapping service out there. But I’m beginning to wonder what Google is thinking with some of these new features.

Now Google is looking to capitalize on this ongoing trend with a new feature in Google Maps that encourages users to share their “foodie pics” with others by posting the photo to Google Maps itself.

It could be that I live in a car town and navigation is the reason I use Google Maps, but the idea that I would use my mapping app to take pictures of food is a bit out there. I mean, don’t they have their own social media network to handle this? Oh, right.


SpatialTau v2.6 – When Maps Got Slippy

SpatialTau is my weekly newsletter that goes out every Wednesday. The archive shows up in my blog a month after the newsletter is published. If you’d like to subscribe, please do so here.


It seems like just yesterday, but Google Maps was born 10 years ago.

If you hopped in your DeLorean for a trip back to before 2005, you’d remember the days when we were all dependent on paper maps, print-outs, post-its and sometimes even a compass for directions! Getting from point A to B is something we do all day, every day—from finding the fastest way to get to work, to dropping the kids off on a carpool route, to meeting friends for drinks at a new spot—so it should be as easy as possible. That’s why we created Google Maps and why we’ve spent the last 10 years figuring out better ways for you to get around.

Re/code has a great article on the birth and evolution of Google Maps that I encourage you to read. For those of us in the industry, the biggest thing we remember is the disruption of how we visualize maps on the Internet. I didn’t start blogging until May of that year (boy, almost 10 years for me), but early on there was much discussion about Google Maps (and Google Earth). Today, most of our mapping libraries mimic Google Maps in their API, their tile structure, or even their look and feel. The days of panning and then waiting for a map to redraw, or of re-centering on a map click, are over. Tiling maps is as common as performing buffers on linear features.

“Google Maps like” is a phrase we see all the time in RFPs and marketing materials. Google Maps has so profoundly impacted our visualization workflows that we almost delineate between BGM (Before Google Maps) and AGM (After Google Maps). We compare all new mapping applications against Google Maps, in accuracy and in function. Projects like OpenStreetMap are successful because Google Maps changed how we navigate and discover people and places. Companies such as Mapbox and CartoDB exist because as a society we want to view information on maps quickly and easily. Legacy GIS companies such as Esri have pivoted and become web-centric because Google Maps became the visualization method for GIS data. Legacy GIS companies such as MapInfo and Intergraph have been pushed aside because they couldn’t adapt to this new dynamic.

Before Google Maps we created online maps in VB6, C#, Java and other complicated languages. Now whole applications are built with nothing but JavaScript (for the better). Mapping APIs all look and feel like Google Maps. There are no more weird, siloed methods to create and display mapping data.

map.data.loadGeoJson(url)

Just look at that. You know what it means and what it does. The impact of Google Maps is so complete we seem to forget it is even there.

Even outside of Spatial IT we see the impact of Google Maps. How long will it take to get to work? Where is the nearest bar? When does the next bus arrive? How do I get to the airport? What is the best place to get a taco near me? These are questions we type into Google, and we get a map showing us the information. Even though our cars may have some proprietary Navteq navigation system, we prefer to use our smartphones to find out where we are going.

I’ve been thinking about this all week: my professional life has changed so much since 2005 because of one product. Google Maps started as a simple, USA-centric car navigation application and has become Navteq/Yelp/Yellowpages/Fodor’s/Michelin Guide/Zagat/AAA all in one. But with the Google Maps API, it becomes a GIS visualization tool that everyone can use. I can connect it to PostGIS without much effort and display database information that would have taken a complex Java/.NET middleware component to handle.
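As a sketch of that PostGIS-to-Google-Maps path: the rows below are hard-coded stand-ins for what a query using PostGIS’s `ST_AsGeoJSON` would return (the table and column names are hypothetical), reshaped into a GeoJSON FeatureCollection that the Maps API’s `map.data.loadGeoJson()` can consume directly:

```javascript
// Hypothetical stand-in for rows returned by something like:
//   SELECT name, ST_AsGeoJSON(geom) AS geom FROM places;
const rows = [
  { name: "Phoenix", geom: '{"type":"Point","coordinates":[-112.074,33.4484]}' },
  { name: "Tempe",   geom: '{"type":"Point","coordinates":[-111.94,33.4255]}' },
];

// Reshape database rows into a GeoJSON FeatureCollection.
function rowsToFeatureCollection(dbRows) {
  return {
    type: "FeatureCollection",
    features: dbRows.map((row) => ({
      type: "Feature",
      geometry: JSON.parse(row.geom), // ST_AsGeoJSON emits a geometry object
      properties: { name: row.name },
    })),
  };
}

const fc = rowsToFeatureCollection(rows);
console.log(fc.features.length); // prints 2

// Serve `fc` as JSON from any endpoint and the client side is one line:
//   map.data.loadGeoJson("/places.geojson");
```

That one-line client call is the whole point: the “middleware” shrinks to a query and a JSON serializer.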

Google Maps is the most disruptive force on GIS that ended up being exactly what we all needed.  I can’t wait to see what we do in the next decade!


SpatialTau v1.4 – Backend Irrelevance


What We See

Do you care about the Google Maps backend? I mean, do you really think about how their server stack is run or managed? Of course not. Google has so successfully abstracted away the server side of Google Maps that we just assume it will always be available and running. But it isn’t just Google; Esri’s ArcGIS Online (or whatever they call it now) just runs. Sure, it has its idiosyncrasies that make us all angry and frustrated, but as with Google, we just assume it will always be available and running. Back when I worked at WeoGeo, our SLA and our data providers’ SLAs were central to how we did business. Most people I talked to who wanted to build upon our platform were interested in what our SLA was. I know Google and Esri have SLAs available, but I rarely see people curious about what they are or building contracts around them. They just assume the services will always be there and always be available.

To the Nines

High availability is something that is always talked about with these services. How many “nines” of availability does your service offer? That was a point of pride. Of course, if you built upon Amazon or another provider, you could only offer an SLA as good as theirs. You picked providers with “high nines” of availability so that you could pass that SLA on to your own customers. Heck, it was why you outsourced the hosting; even three nines is an incredible uptime level. What about Esri’s SLA? Well, I found this PDF dated from last year:

Esri will use commercially reasonable efforts to make the Covered Services available with a Quarterly Uptime Percentage of ninety-nine point nine percent (99.9%) (“Service Commitment”).

That’s “three nines”, which works out to just under 9 hours of downtime a year, not bad for most GIS applications. But I wonder how many of us have looked at it. OK, what about the Google Maps for Work SLA (Google renames this product way too much)?

Google will use reasonable commercial efforts to provide Maps API web and mobile interfaces that are operating and available to Customers 99.9% of the time in any calendar month.

Gee, pretty much exactly the same as Esri’s. Now, I have no idea what Esri’s or Google’s true availability has been over the course of the year. I haven’t heard anyone complain about ArcGIS Online lately, nor have I ever heard someone complain about Google Maps for Work. One has to assume they are both running at 99.9% availability or better.
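The arithmetic behind the nines is worth spelling out, along with a point implicit above: if your app depends on several hosted services in series, their availabilities multiply, so your effective SLA is always worse than any single provider’s. A quick sketch (a 365.25-day year assumed; function names are mine, not from any SLA document):

```javascript
// Allowed downtime per year for a given availability percentage.
function downtimeHoursPerYear(availabilityPct) {
  const hoursPerYear = 365.25 * 24; // 8766 hours
  return hoursPerYear * (1 - availabilityPct / 100);
}

// Effective availability of services chained in series: they multiply.
function compositeAvailability(pcts) {
  return pcts.reduce((acc, p) => acc * (p / 100), 1) * 100;
}

console.log(downtimeHoursPerYear(99.9).toFixed(1));  // "8.8" hours/year, three nines
console.log(downtimeHoursPerYear(99.99).toFixed(2)); // "0.88" hours/year, four nines
// Two 99.9% services stacked together:
console.log(compositeAvailability([99.9, 99.9]).toFixed(2)); // "99.80"
```

So a hosted map API at 99.9% behind your own 99.9% app server already puts you below three nines before anything else goes wrong.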

Should We Care?

I think the answer is absolutely. As we move more and more of our services into hosting, we need to be more and more aware of what we are getting ourselves into. I’d wager that 8 to 9 hours of downtime a year is fine for most applications. But that needs to be taken into consideration when migrating your legacy self-hosted GIS applications into “cloud-like” environments. As a contractor, I’m always keeping my clients aware of what it means to be hosted in Google’s, Microsoft’s or Amazon’s cloud, including the SLAs. Eventually Google, Microsoft, Amazon, Rackspace, etc. will have downtime that affects your applications. Death, taxes and servers going down are the only guarantees in life. We need to plan for this inevitability and have a way to alert our users and clients to the possibility. Sticking your head in the sand and ignoring the problem is not going to help; being proactive and having a plan will. Take a look at your hosted services, the SLA for each, and what you can do to mitigate the failure of one or more of them. You’ll be glad you did!


SpatialTau v1.3 – Just Like Google


The RFP

I ran into an RFP a couple weeks ago that was a small business set-aside, so URS couldn’t go after it. I like to read most RFPs that come across my desk, though, because they help me understand what customers are looking for. This was a traditional “enterprise GIS” RFP where the client wanted someone to come in and clean up their geodatabases, migrate from SDE to Oracle Spatial, and then create a new web front end (the old one was a classic Esri big-button nightmare) that they wanted to be “just like Google”. At the time I let that go, but it stuck in my consciousness. I mean, what does that even mean?

Just like Google

Now, we all have an idea of what it means to be Google, especially with mapping and data. I think of a slippy map with some points on it, a search bar at the top that magically finds the results you want, some pretty neat JavaScript on the front end, and an overall responsiveness that makes you not even think about some clunky GIS server on the back end.

Reality Bites

The problem with that description is that what I just described is mostly front-end client design, not the real power of Google. There are plenty of great-looking, very functional websites out there that are performant, but not at Google’s scale or responsiveness. When someone writes that they are looking for something “just like Google”, they are also asking for the infrastructure that goes with it. And that infrastructure rarely is “enterprise GIS”. I can use the same tools that Google does for their applications, but clearly their army of engineers is better than me. I can leverage many of their tools, but they don’t interact with enterprise solutions as deeply as many RFPs require. There is a spirit of “Google-like” that many try to deliver, but to actually deliver something “just like Google” is virtually impossible.

Bad RFP or Bad Contractor

I’ll be clear: I respond to many RFPs with “Google-like” features, but I try to set expectations and constraints on what the result can do. So whose fault is the expectation of “just like Google”? One could say the RFP writer doesn’t understand what the statement means. But at the same time, it is contractors, who love to throw out buzzwords such as “big data”, “cloud ready” and “responsive design”, who ultimately should be blamed. Software vendors routinely oversell their software and leave their users unhappy. Contractors do this as well, and it hurts long-term relationships with their clients.

Simple Always Wins

I keep telling myself, for each proposal I submit, that simple always wins. Simple isn’t claiming that something is “Google-like”; simple is spelling out what it means to be such a thing. When we rely on buzzwords to describe what we do, or what our product does, or even our job titles, it obscures why we do what we do. A simple road map of what your solution requires goes miles beyond a confusing, copy-and-paste RFP that contradicts itself on every page. For contractors, delivering a simple road map of what your solution does helps just as much. We all see great presentations and proposals every day. They all have one thing in common: clear, concise recognition of the problem, and then a solution that is easily understood and actionable. I can only hope that I accomplish that every time.


Google Maps API Data Quality Issues

Link – GMaps API data quality deteriorating?

the basemap data you get via the API is only from TeleAtlas, but if you look at the maps through Google’s branded gateway, they are enhanced with NavTech data too. As rich pointed out, there’s a long discussion about this on the Google Maps API Google Group, or Usenet group as it was once known.

We heard a little bit about this a couple weeks ago. I’ve been saying since day one that the problem with a free API for web mapping is that you need to either reduce your costs as low as possible or have another revenue stream (advertisements). One of the biggest arguments for a paid service like ArcWeb is that you get a great choice of data. We’ve already seen that the satellite imagery in ArcWeb is much better than Google Maps’, and I don’t think people mind paying for a service if the quality is better.

Conspiracy theories fly! Do people really care enough about very high quality base maps to pay for a premium API service? Or are geodata licensing costs driving this decision on the part of GMaps? If quality of service continues to deteriorate, will this provide a boon to collaborative mapping in the land of the free geodata, augmenting the accuracy and currency that Google’s maps may be losing?

So there is an opportunity for ArcWeb 2005. The question is how soon we’ll hear about or see it (with a name like ArcWeb 2005, you’d think we’d see it soon). I’ll tell you this: as soon as the new ArcWeb 200X is out, I’m going to replace my blog map with it.


Google Maps + ArcSDE + ArcIMS + ArcWeb =

Link – Google Maps + ESRI’s ArcWeb Services

  • Users can geocode by city (Lawrence), address (1930 Constant Avenue, Lawrence), zipcode (66047), or the intersection of streets (9th & Iowa, Lawrence). We are using the Public Services category of ESRI’s ArcWeb Services.
  • The black and white imagery is coming from ESRI’s ArcSDE through ArcIMS. We are using the ArcIMS cache on demand system that I mentioned here. The reason for this is the lack of quality imagery data that Google provides for most of Kansas.
  • The Map client is the Google Maps ajax client.
  • The Road data is from Google Maps (Teleatlas).

Quite a novel way to get around the limitations of all these products. You’d think that with so many sources and different servers this would be slow, but it looks quite snappy and is a big improvement over the standard Google Maps version. Great job, guys!


The Battle to be Google Maps’s Data Provider

Link – Google Maps and Their Data Providers

NAVTEQ spends a lot of money to get the most accurate data on the streets and roads, and they make most of their money selling routing (directions) through in-car navigation systems. I bet NAVTEQ wish they had a dollar for every time a prospective customer came to them expecting Google Maps-style driving directions to be free. Oh wait, they do. Every set of driving directions you get from Google Maps (or Yahoo! Maps, or MapQuest) represents real money in the pocket of NAVTEQ-they charge per route. Google, Yahoo!, MapQuest, and others are all eating those charges when they offer the service to you for free, planning to make it back on advertising and related travel services.

It is pretty easy to see who is being squeezed here. The data providers hold all the cards, but competition is driving the market and no one wants to be left holding the bag. One almost has to wonder, though, when these data providers might strike back at Google/MapQuest/Yahoo! and start charging them more. For now they seem willing to undercut each other, and maybe that business model will work. Still, I have to wonder if NAVTEQ, TeleAtlas and others might look at the fight the RIAA is having with Apple and start wondering if they should control more of the delivery of their datasets.