Plus Codes: Another Attempt at Addressing Places Without Street Addresses

Yes, everyone knows about What3Words. It was an attempt to come up with an easy way to assign addresses to places where there are none. In the end, a proprietary addressing system will never gain traction, and of course the inevitable eventually happened. My personal feeling is that What3Words never really got us beyond x/y numbering, and the logic behind an addressing system was not there. Enter Plus Codes, which comes at this problem from a different perspective. There is a very detailed analysis of existing methods and why they chose to go this direction that I’ll leave up to you to read.

Probably the biggest reason to pay attention is that this open addressing system was developed by Google. In fact, they are already implementing it in India as we speak, which goes a long way toward making this happen.

All these systems are built on the idea that the world is a grid, and how deeply you drill down into that grid is your address. Things need not be a single point; they can be an area, which opens up many exciting ideas for addressing, especially outside of North America and Europe. Check out the GitHub project to learn more.
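To make the grid idea concrete, here is a minimal sketch using Google's open-source Python implementation of Open Location Code (the openlocationcode package). The coordinates and code lengths are my own picks for illustration, and the exact attribute names on the decoded area may vary slightly between releases.

```python
# pip install openlocationcode  (Google's reference Python implementation)
from openlocationcode import openlocationcode as olc

# Encode a point; a longer code means a smaller, more precise grid cell.
lat, lng = 28.6139, 77.2090            # a point in New Delhi, chosen for the example
full_code = olc.encode(lat, lng)       # 10-digit code, roughly a 14 m x 14 m cell
area_code = olc.encode(lat, lng, 8)    # 8-digit code, a much coarser cell (an *area*)

# Decode a code back into the bounding box of its grid cell.
cell = olc.decode(full_code)
print(full_code, area_code)
print(cell.latitudeLo, cell.longitudeLo, cell.latitudeHi, cell.longitudeHi)
```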

Focus on Data

When you think geospatial you think data, right? You imagine GIS professionals working their butts off making normalized datasets with wonderful metadata. Nah, that’s just some slide at the Esri UC where “best practices” become the focus of a week away from the family in the Gaslamp. For some reason, GIS has become more about how we do something and less about why we do something. I guess all that “hipster” and “technologist” thinking that goes into these “best practices” loses focus on why we do what we do: the data.

At Cityzenith, the first question a customer asks me is what data we have available. See, that’s because they aren’t GIS technologists; they’re just working folk who have to solve a problem. That problem requires the same thing an accountant requires: accurate data. The last question these people care about is “Should I script this with JavaScript, Python or Ruby?”. They’re just looking for data that they can combine with their proprietary company data to make whatever decisions they need to make.

Finding Data is Hard

So much of what we do in our space is wasted on the tools to manage the data. Sure, in the 90s we needed to create these tools, or improve them enough that we could rely on them to get our work done. But the analysis libraries are basically a commodity at this point. I can probably find 100 different ways on GitHub to perform a spatial selection. Personally, I can’t even recall the last time I opened ArcGIS or QGIS to solve a problem. There just isn’t a need to do so anymore. These tools have become so prevalent that we don’t need to fight battles over which one to use anymore.
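As a throwaway illustration of how commoditized this has become, here is a hedged sketch of a spatial selection with Shapely. The coordinates and the library choice are mine, not a recommendation; just one of those hundred ways.

```python
# pip install shapely  (one commodity library, of many, that can do this)
from shapely.geometry import Point, Polygon

# A rough area of interest and a handful of candidate points.
aoi = Polygon([(-112.2, 33.3), (-111.8, 33.3), (-111.8, 33.6), (-112.2, 33.6)])
points = [Point(-112.07, 33.45), Point(-111.65, 33.40), Point(-112.0, 33.55)]

# A "spatial selection" is just a predicate call per feature.
selected = [p for p in points if aoi.contains(p)]
print(f"{len(selected)} of {len(points)} points fall inside the AOI")
```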

Your TIGER WMS is available

Thanks to Google and OpenStreetMap, base maps are now commoditized to the point that we rarely pay for them. For that part, we can be sure we’ve got the best data. (Disclosure: Cityzenith uses Mapbox for our base mapping.) But everything else is still lacking. I won’t pick on any vendor of data, but generally it works the same way: you either subscribe to a WMS/WFS feed (or worse, some wacky ArcGIS Online subscription) or, if you’re “lucky”, download a zip file of shapefiles. Neither lends itself to how data is managed or used in today’s companies.

Back to our customers: they expect a platform that can visualize data and one that is easy to use. But I know the first question they ask before signing up for our platform is, “What data do you have?”. They want to know more about our IoT data, data from our other partners (traffic, weather, demographics, etc.) and how they can combine it with their own data. They will ask about our tech stack from time to time, or how we create 3D worlds in the browser, but that is rare. It’s:

  1. What do you have?
  2. Where do you have it?

There are so many choices people have for how they can perform analysis on data. Pick and choose; it’s all personal preference. But access to the most up-to-date, normalized, indexed and available data for their area of interest is what matters. That’s why our focus has been partnering with data providers who have the datasets people need and presenting them to our users in formats and ways that are useful to them. Nobody wants a shapefile. Get over it. They want data feeds that they can bring into their workflows that have no GIS software in them whatsoever.
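For a sense of what that looks like, here is a hedged sketch of pulling a GeoJSON feed straight into a plain tabular workflow. The feed URL is hypothetical, and the snippet assumes point geometries purely to keep the example short.

```python
# pip install requests pandas  -- no GIS software anywhere in this workflow
import requests
import pandas as pd

# Hypothetical feed URL; any vendor endpoint that returns GeoJSON works the same way.
FEED_URL = "https://example.com/feeds/traffic-incidents.geojson"

features = requests.get(FEED_URL, timeout=30).json()["features"]

# Flatten each feature's properties (plus its coordinates) into an ordinary
# DataFrame that can be joined against a company's own tables.
df = pd.json_normalize([
    {**f["properties"],
     "lon": f["geometry"]["coordinates"][0],
     "lat": f["geometry"]["coordinates"][1]}
    for f in features
])
print(df.head())
```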

As I sit and watch the news from the Esri UC, it is a stark reminder that the future of data isn’t in the hands of niche geospatial tools; it’s in the hands of everyone. That’s what we’re doing at Cityzenith.

Esri Arcade

When we think of Esri scripting and authoring languages, we think Python. Esri jumped in with both feet with Python and we were all much better off for it. But alas, as awesome as Python is, it isn’t as portable across the Esri ecosystem as they would like. To solve the problem, either you choose another, more portable language to use (JavaScript) or you write your own expression language and make it appear like Python and JavaScript had a baby.

At the Esri Arcade

Well, that’s what Esri did: take Python, take JavaScript, and create a new expression language. Now, in an open world this would be great because anything I wrote in Arcade would be usable anywhere else. But this is an Esri-only solution, as I can’t imagine other companies jumping in on it. In Esriland, though, that’s OK because the ecosystem is large enough to support learning a proprietary language.

I don’t use Esri software anymore so I can’t play with it, but it is a logical solution to their problem of having to write code to work with data in different platforms. Theoretically, one can now use Arcade to author and render maps and let the Esri software handle the rest. I’d wait to see what happens with Arcade and the eventual 1.x release. Maybe it’s the Esri Web ADF talking, but…

So another proprietary scripting language…

GIS Software has to be Hard to Use

Seriously though, right? GIS has been defined by those who create much of it as “Scientific Software”. Because of that, it needs to be:

  1. Expensive
  2. Difficult to use
  3. Poorly documented
  4. Buggy
  5. Slow

ArcGIS Toolbars

Professional GIS*

GIS software is literally the kitchen sink. Most GIS software started out as a project for some company and then morphed into a product. They are collections of tools created for specific projects, duct-taped together and sold as a subscription. We’ve talked about re-imagining how we work with spatial data, but we rarely turn the page. The GIS Industrial Complex (open source and proprietary, everything is awful) is built upon making things hard to do. There have been attempts to solve the problem, but they themselves are usually built for a project rather than a product. Somewhat cynical, but you have to wonder if this is true.

Tools such as Tableau are the future, and as they add more spatial capability, GIS specialists will be out of a job. Being a button pusher seems more and more like a dead-end job.

GIS is Easy, Visualization is Hard

Yesterday I said this:

When you get to the core of GIS, it is database management. Managing spatial data and performing analysis is what most GIS people do. But when it comes time to present that information, GIS people generally struggle with it. If you are like me, you can tell which maps are made with Esri’s ArcMap pretty easily. They just have those elements that nobody changes. That’s because it is hard to pick the right colors, get labeling just right and convey to the reader what the map is all about. Most good cartographers take years perfecting their maps, and most of us don’t have that kind of time1.

There has been a ton of effort put into making it easier to produce good-looking visualizations without having to know what you are doing. Just look at CartoDB and Mapbox and you can see some nice out-of-the-box visualizations. We are getting there, but too often we fumble our message with poor mapping. Even in the 3D world, getting the lighting just right on your features (dense buildings, for example) can make or break how your data is viewed. Too often we work so hard on the analytical data side of things just to fall down the stairs while we rush to get the product out the door. Do not be a Pete Campbell.

  1. now let me be clear: I could never produce a map like Tom does, even if you gave me forever to do it in. 

Curves in Open Data

Last week I talked about data formats, and we continued the conversation on Twitter.

No curves. It’s a good point. GeoJSON and TopoJSON don’t support curves. But neither do Shapefiles. All three formats are meant to handle simple features: points, lines and polygons. While TopoJSON handles topology, it still can’t describe true curves. But what’s the implication here? To share data that requires curves (it’s an edge case, but still an important one), do you have to use a proprietary format? Enter WKT. Well-known text supports many more vector types than the previous formats, including curves. Following up on sharing data in common file formats, WKT fits the bill perfectly. Share your data as GeoJSON/TopoJSON, KML and Shapefile if needed, then use WKT for complex features. It is still completely open and well supported by most open and proprietary software packages.
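As a hedged sketch of what that looks like in practice, here are a couple of curve types written as WKT and parsed with GDAL's OGR bindings, which understand curve geometry types. The geometries themselves are made up for illustration.

```python
# pip install gdal  (OGR's Python bindings handle curve geometry types)
from osgeo import ogr

# A circular arc and a curve polygon written as well-known text.
arc = ogr.CreateGeometryFromWkt("CIRCULARSTRING (0 0, 1 1, 2 0)")
ring = ogr.CreateGeometryFromWkt(
    "CURVEPOLYGON (CIRCULARSTRING (0 0, 4 0, 4 4, 0 4, 0 0))"
)

print(arc.GetGeometryName())   # CIRCULARSTRING
print(ring.GetGeometryName())  # CURVEPOLYGON

# If a downstream tool only speaks simple features, the curves can be
# densified into plain line segments.
print(arc.GetLinearGeometry().GetGeometryName())   # LINESTRING
```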

Sometimes you need to use curves, and generally it works out.

Data Formats and the Datastore

Yesterday’s post generated some email, mostly in agreement, but I wanted to highlight one question.

Finding data for me is only half the problem, it’s formats that come and go. I’m archiving data in formats that I have no idea if they’ll be supported or not. What’s the point of indexing if you can’t view the file?

That’s a big issue, of course. I mean, think about it this way: what if I saved out my site plan in Manifold GIS format 5 years ago and wanted to open it today? Either I find a copy of Manifold or I don’t use the file. The solution isn’t as easy as you might think, unfortunately. Software such as Safe Software’s FME can rescue many file formats, but it’s a risk you run, hoping that Safe will support your format. One thing I try to do is save data in common format types. While I might use SDE and FGDB in production, I make an effort to save these layers off as Shapefiles and TIFF. Over beers a couple of years ago we termed these “pragmatic file formats”. SHP, KML, TIFF, JPG and GeoJSON were all mentioned as ones that we thought were widely supported1. At WeoGeo, while we could support the 250+ file formats that FME supports, we left it at about 10 of the most requested formats.

But that brings up one thing we pushed with the WeoGeo Library2. You could load a WeoGeo-supported file format, even a “pragmatic file format” type, and because we used FME on the backend, know that it would be usable in the future. That’s a true “Library” environment, one where you can not only find what you are looking for, but know that it will be readable.

GIS by its very nature is file-format verbose3, and we have to deal with this more than other professions. My recommendation today, as it has been for years, is to try to do the following:

  1. Save in common file formats over niche proprietary ones (a quick sketch of this follows the list)
  2. Safe Software’s FME pays for itself
  3. Index your data. If you know what you have, you know what you need to support4
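For the first item, here is a hedged sketch of what saving pragmatic copies might look like with GeoPandas. The layer name and paths are hypothetical, and the library is just one of many that will do the job.

```python
# pip install geopandas  (one of many ways to keep "pragmatic" copies around)
import os
import geopandas as gpd

# Hypothetical production layer sitting in a file geodatabase.
layer = gpd.read_file("production.gdb", layer="parcels")

os.makedirs("archive", exist_ok=True)

# Save widely supported copies alongside the production data.
layer.to_file("archive/parcels.shp")                         # Shapefile
layer.to_file("archive/parcels.geojson", driver="GeoJSON")   # GeoJSON
```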

Simple enough, right? Don’t get turned upside down with your file formats.

  1. though you could argue GeoJSON back then wasn’t exactly supported well 

  2. now WeoGeo is Trimble Data 

  3. right now I can look at this MXD in front of me and see that, with 8 layers, I have 6 format types 

  4. that is why I have a copy of Corel Draw