No curves. It’s a good point. GeoJSON and TopoJSON don’t support curves, but neither do Shapefiles. All three formats are meant to handle simple features: points, lines and polygons. While TopoJSON handles topology, it still can’t draw true curves. But what’s the implication here? To share data that requires curves (it’s an edge case but still an important one) you have to use a proprietary format? Enter WKT. Well-known text supports many more vector types than the previous formats, including curves. Following up on sharing data in common file formats, WKT fits the bill perfectly. Share your data as GeoJSON/TopoJSON, KML and Shapefile if needed, then use WKT for complex features. It’s still completely open and well supported by most open and proprietary software packages.
Sometimes you just need curves, and generally it works out.
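And when a downstream tool can’t read curves, you can always densify them yourself. Here’s a minimal sketch, using nothing but the standard library, that approximates the three-point arc of a WKT `CIRCULARSTRING` with a plain `LINESTRING`-style list of vertices. The example arc coordinates are made up for illustration.

```python
import math

def circle_from_3_points(p1, p2, p3):
    """Return (center, radius) of the circle through three points."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) +
          (bx * bx + by * by) * (cy - ay) +
          (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) +
          (bx * bx + by * by) * (ax - cx) +
          (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def densify_arc(p1, p2, p3, segments=16):
    """Approximate the arc p1 -> p2 -> p3 (CIRCULARSTRING vertex order)
    with straight line segments."""
    (ux, uy), r = circle_from_3_points(p1, p2, p3)
    a1 = math.atan2(p1[1] - uy, p1[0] - ux)
    a2 = math.atan2(p2[1] - uy, p2[0] - ux)
    a3 = math.atan2(p3[1] - uy, p3[0] - ux)
    two_pi = 2 * math.pi
    # Sweep counterclockwise from a1 unless the middle point says otherwise.
    sweep_mid = (a2 - a1) % two_pi
    sweep_end = (a3 - a1) % two_pi
    total = sweep_end if sweep_mid <= sweep_end else sweep_end - two_pi
    return [(ux + r * math.cos(a1 + total * i / segments),
             uy + r * math.sin(a1 + total * i / segments))
            for i in range(segments + 1)]

# CIRCULARSTRING (0 0, 1 1, 2 0): a half circle centered at (1, 0).
vertices = densify_arc((0, 0), (1, 1), (2, 0))
wkt = "LINESTRING (" + ", ".join(f"{x:.3f} {y:.3f}" for x, y in vertices) + ")"
```

It’s lossy, of course, which is exactly the argument for keeping the original curve in WKT and only flattening on the way out.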
There are plenty of 3D globes for desktop and for the web that support above-ground objects (mostly buildings) on the globe, but there are few that support features underground (such as wells). The only one that really has good support is Esri’s CityEngine. You can render scenes such as this in the browser.
Now the problem is that this all requires CityEngine, which is neither inexpensive nor easy to use. I’ve got a great database of wells with GeoJSON attributes that I’d love to map in a 3D browser view, but most of the effort so far has gone into 2.5D solutions. Most of my current project work is 3D but underground, which means I can’t view it in Google Earth or other web solutions.
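For what it’s worth, storing the underground part isn’t the problem; GeoJSON allows an optional third coordinate for elevation, and a negative value encodes depth below the surface just fine. A minimal sketch (the well ID, location and depth are all made up):

```python
import json

# A hypothetical well. A GeoJSON position is [longitude, latitude, elevation],
# and a negative elevation can represent depth below the surface.
well = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [-112.074, 33.448, -152.4],  # bottom of hole, 152.4 m down
    },
    "properties": {"well_id": "MW-01", "depth_m": 152.4},  # made-up attributes
}
geojson = json.dumps(well)
```

The trouble is rendering, not storage: almost nothing will draw that third coordinate below the globe’s surface.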
I get all excited to map wells and then disaster strikes.
You may or may not have seen, but there is a Cuban/Tampa Bay Rays game going on today. Given the love of baseball between the two countries I’m sure we’ll see much more Cuban baseball over the next couple years. It just so happens that the GeoJSON-Ballparks project has all the professional baseball stadiums in Cuba already mapped in GeoJSON format, including Estadio Latinoamericano where the game is being played today. Enjoy!
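If you want to pull out just the Cuban parks, filtering a GeoJSON FeatureCollection is a one-liner. A sketch against a made-up two-feature collection — I haven’t checked GeoJSON-Ballparks’ actual property names, so treat `country` and `name` as assumptions (and the coordinates as approximate):

```python
import json

# Stand-in for the GeoJSON-Ballparks file; property names here are guesses.
collection = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"name": "Estadio Latinoamericano", "country": "Cuba"},
     "geometry": {"type": "Point", "coordinates": [-82.359, 23.114]}},
    {"type": "Feature",
     "properties": {"name": "Tropicana Field", "country": "USA"},
     "geometry": {"type": "Point", "coordinates": [-82.653, 27.768]}}
  ]
}""")

# Keep only the features tagged as Cuban.
cuban = [f for f in collection["features"]
         if f["properties"].get("country") == "Cuba"]
```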
Last week Apple announced it was moving some of its iCloud storage to Google’s cloud from Amazon. Earlier this month Dropbox moved from AWS to their own cloud hardware. The motive for both these moves is to get off of someone else’s cloud and on to their own1. But it does bring up a good point for all hosted development: try to be as cloud agnostic as possible, because you never know what you might have to do.
Early on in WeoGeo’s life, we had a product called WeoCEO. You won’t find too much about it, but it was WeoGeo’s way of managing S3, EC2 instances and balancing traffic. WeoGeo created it because Amazon’s EC2 tools weren’t yet robust enough to run a business on, and WeoGeo didn’t know how AWS might turn out. The point of WeoCEO was that it could be staged in any CentOS environment and handle the WeoGeo database, geoprocessing and websites. In theory that would have allowed WeoGeo to easily move off of AWS and on to Azure, Google or Rackspace. WeoGeo eventually abandoned WeoCEO and started using AWS’s native tools because that made it easier to work with new technology such as CloudFront, new storage solutions and database options. While there was zero chance WeoGeo would move off of AWS, having such a tool could have made it easier for the WeoGeo platform to be integrated into other technology.
All this got me thinking about my current hosted GIS systems. I’ve got some on AWS and some on Azure. Could I move from one provider to the other, and how much work would it be? Most of my stack isn’t proprietary to one system or the other2. Node.js, PostgreSQL and other open technology run really well on just about any hosted system out there. But there is somewhat proprietary cloud technology out there you can lock yourself into.
I don’t think that everyone needs to develop their own WeoCEO-type management system3, but making pragmatic choices about how to deploy cloud applications can pay itself back in spades. I have clients who want their applications on AWS or Azure, and I can deploy the same application to either with little effort, but keeping it that way requires planning and a will to be cloud agnostic. I’ve always liked the term and I’ve always tried to prototype and develop applications that aren’t locked into one infrastructure. I’d love to keep it that way and you should too.
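Being cloud agnostic mostly comes down to small decisions like this one: read connection details from the environment instead of hardcoding a provider-specific endpoint, so moving PostgreSQL from AWS RDS to Azure is a config change rather than a code change. A minimal sketch using libpq’s standard `PG*` environment variable names (the defaults are my own choices):

```python
import os

def database_url():
    """Build a PostgreSQL connection URL from the environment.

    The same code runs against AWS RDS, Azure Database for PostgreSQL,
    or a local instance; only the environment changes per deployment.
    (Password handling is omitted to keep the sketch short.)
    """
    host = os.environ.get("PGHOST", "localhost")
    port = os.environ.get("PGPORT", "5432")
    name = os.environ.get("PGDATABASE", "gis")
    user = os.environ.get("PGUSER", "postgres")
    return f"postgresql://{user}@{host}:{port}/{name}"
```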
Finding data, for me, is only half the problem; it’s formats that come and go. I’m archiving data in formats that I have no idea whether they’ll be supported or not. What’s the point of indexing if you can’t view the file?
That’s a big issue of course. I mean, think about it this way: what if I saved out my site plan in Manifold GIS format 5 years ago and wanted to open it today? Either I find a copy of Manifold or I don’t use the file. The solution isn’t as easy as you might think, unfortunately. Software such as Safe Software’s FME can rescue many file formats, but you’re running a risk hoping that Safe will add your format. One thing I try to do is save data in common format types. While I might use SDE and FGDB in production, I make an effort to save these layers off as Shapefiles and TIFF. Over beers a couple of years ago we termed these “pragmatic file formats”. SHP, KML, TIFF, JPG and GeoJSON were all mentioned as ones we thought were widely supported1. At WeoGeo, while we could have supported the 250+ file formats that FME supports, we kept it to about 10 of the most requested formats.
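You don’t need FME for the everyday cases, either; GDAL/OGR covers the pragmatic formats. A sketch of saving production layers off as Shapefile and GeoTIFF — the paths and layer name are hypothetical:

```shell
# Export a layer from a file geodatabase to Shapefile
# (uses GDAL's OpenFileGDB driver; paths and layer name are made up).
ogr2ogr -f "ESRI Shapefile" parcels.shp production.gdb parcels

# Convert a raster to plain GeoTIFF.
gdal_translate -of GTiff elevation.img elevation.tif
```

A nightly script doing exactly this is a cheap insurance policy against a format going away.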
But that brings up one thing we pushed with the WeoGeo Library2. You could load a WeoGeo-supported file format, even a “pragmatic file format” type, and because we used FME on the backend, know that it would be usable in the future. That’s a true “Library” environment: one where you can not only find what you are looking for, but know that it will be readable.
GIS by its very nature is file-format verbose3, and we have to deal with this more than other professions do. My recommendation today, as it has been for years, is to try to do the following:
Save in common file formats over niche proprietary ones
Safe FME pays for itself
Index your data. If you know what you have, you know what you need to support4
Simple enough, right? Don’t get turned upside down with your file formats.
though you could argue GeoJSON back then wasn’t exactly supported well ↩
Ninety percent of the world’s data has been generated over the last two years.
Unlike the “80% of Data is Spatial” claim, I have to admit this one is totally believable, and I can find the source. Most of this data is pure junk, but the biggest problem with it is that it is literally unsearchable. Even in the age of Google, we can’t begin to aggregate this data and sort through it.
On the BLM GPM project that I was part of at AECOM/URS, we teamed with Voyager to attempt to find all their spatial data and share it. The good news is that I hear the BLM Navigator will be rolling out soon, so at least we know the BLM is indexing its data and attempting to share it. But that is one organization out of billions.
This unaccounted-for data can’t be leveraged by users and goes to waste. We all know GIS is great for making informed decisions about just about anything, yet we are most likely uninformed ourselves because the data just doesn’t happen to be at our fingertips. We’re a society that loves to create data, but not one that likes to organize it. If we’re truly going to change the world with GIS, we need to make sure we have all the information available to do so. Smart Cities, GeoDesign and all the rest are big data use cases. Let’s figure out how to start pumping them full of it.
The White House released a draft policy yesterday for sharing source code among federal agencies, including a pilot program that will make portions of federal code open source.
This policy will require new software developed specifically for or by the Federal Government to be made available for sharing and re-use across Federal agencies. It also includes a pilot program that will result in a portion of that new federally-funded custom code being released to the public.
The policy outlined here highlights a draft policy proposal of a pilot program requiring covered agencies to release at least 20 percent of their newly-developed custom code, in addition to the release of all custom code developed by Federal employees at covered agencies as part of their official duties, subject to certain exceptions as noted in the main body of the policy.
Many Federal GIS consultants just had a bad morning.
So last week I was talking about how you can now use ArcGIS Pro with “Classic Licensing”. Well, after following the directions on Esri’s website, which resulted in no new licenses, we finally realized the process isn’t quite what Esri says on their support page. The original suggestion was to just use the ArcGIS Desktop license for Pro 1.2. What you actually need to do is find your ArcGIS Pro 1.2 license in My Esri and use that. Makes sense when you think about it, but the directions from Esri before were just to use your ArcGIS Desktop license.
The disconnect was that you get an ArcGIS Pro license code with your ArcGIS Desktop license. You just need to run the licensing wizard and then point ArcGIS Pro to that license server. Then it works without an issue.
ArcGIS Pro has always had somewhat of a non-standard way of being licensed. I’ve never really gotten into it, mostly because it revolves around “provisioning” and “logging in” to ArcGIS Online. Even if I felt a real need to get it to work, it just seems like a very annoying way to license software. Now, since technically we aren’t paying for ArcGIS Pro licenses just yet, I suppose it doesn’t really matter1. But as I do want to at least get an idea of what Pro is, how it works and what it means to GIS workflows when/if it replaces ArcGIS for Desktop, licensing matters. I haven’t been to an Esri conference in almost a year, so the ins and outs of Pro licensing have been lost on me, but this tidbit yesterday about ArcGIS Pro moving forward was interesting.
So there you go: I’m guessing this means when 1.2 arrives this week, I can just point it at my existing license manager and away we go. I’ll install ArcGIS Pro, be impressed with the new UI and then realize it’s a dog and buggy as sin2. But 64-bit is a big carrot, so depending on how the geoprocessing works, I can see myself embracing Pro, Python 3.x and 64-bit.
Before the 1.2 release, the only licensing option available for ArcGIS Pro was through Named User licensing. This license model required authorization through your organization administrator on Portal for ArcGIS or ArcGIS Online. At 1.2, you now have two new licensing models available that don’t require you to go through a Portal for ArcGIS or an ArcGIS Online organization: Single Use and Concurrent licensing. With a Single Use license, ArcGIS Pro points to a file for authorization. The file is stored on the same machine that runs ArcGIS Pro. With Concurrent licensing, a given number of licenses are hosted on a License Manager (the ArcGIS License Server Administrator). ArcGIS Pro is then configured to allow organization members to check out an available license from the pool of licences hosted on the License Manager.
While I did spend a lot of time photoshopping the splash screen above, here is the ArcGIS Pro 1.2 splash screen.
Beta software has always been sort of a different beast when it comes to licensing. ↩
I’m thinking it will be ArcGIS Desktop 8.0.1 all over again. ↩