Basically, you use your mouse to tilt a 3D cube so a ball travels down the transportation networks (roads, trails, subways, etc.). Since we in GIS all know about topology in these networks, the game isn’t as hard as you might expect. Still, it’s a pretty amazing example of using HTML + WebGL for web mapping. Open or not, the Google Maps API is clearly well ahead of the others. If you can stomach the licensing, you’ll continue to have access to tools that other mapping libraries can only dream of.
As I said a couple of months ago, this is very impressive. The fact that it runs in at least three browsers (Chrome, Safari, and Firefox) gives me hope that WebGL apps have a future. I had half expected this to be Chrome-only…
This doesn’t change SketchUp’s awesomeness, but I’m wondering what the future holds. Trimble’s press release talks about an “enterprise solution”. The tea leaves say SketchUp will transition away from free, and the cheapskates will need to pony up. Their FAQ says they’ll continue to support free customers, but I just can’t see that continuing the way it did under Google. A brave new world is upon us, one where Google doesn’t give everything away for free.
Everyone is addicted, but now what? I have to pay my supplier?
I get asked a ton about what software I choose to use in my daily workflows. I’m all over the place on stuff, but I figured I’d write it down and see what it looks like. This shouldn’t take long!
For my data storage needs, I use WeoGeo Library as my content management system. I like how it keeps all my data organized and available no matter where I am in the world. I use WeoGeo Utility to keep that hosted library organized and to download the data I need. Locally, I use PostGIS to store and manage the data. I’ll occasionally use Microsoft SQL Server, but PostGIS does all I need it to do, and since I’m never on Windows anymore, I really don’t feel like licensing a Windows instance just to run a database.
Working with this data means I need a couple of choices for getting data in and out of PostGIS. I probably use FME Desktop most of the time because it can do just about everything, but I’ll also use GDAL/OGR because it runs on my MacBook Pro. I’m actually finding myself using GDAL/OGR more these days because I just can’t be bothered to start up a VM with Windows. Dan Dye has me using GDAL almost all the time now, so it has become second nature to me. I don’t bother with MapServer or GeoServer because the thought of consuming data out of PostGIS as WxS seems very dirty. You can use either of them, but count me out.
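As a sketch of that GDAL/OGR round trip, here’s the kind of ogr2ogr call I use to load a shapefile into PostGIS, wrapped in a little Python so it slots into scripts. The database name, user, table, and file path below are placeholders, not anything from a real setup:

```python
def shp_to_postgis_cmd(shp_path, table, db="gisdata", user="me"):
    """Build an ogr2ogr command that loads a shapefile into PostGIS.

    All connection details here are hypothetical -- swap in your own.
    """
    return [
        "ogr2ogr",
        "-f", "PostgreSQL",                 # write via the PostgreSQL/PostGIS driver
        f"PG:dbname={db} user={user}",      # OGR connection string
        shp_path,                           # input shapefile
        "-nln", table,                      # target table name
        "-lco", "GEOMETRY_NAME=geom",       # name the geometry column 'geom'
    ]

cmd = shp_to_postgis_cmd("parcels.shp", "parcels")
print(" ".join(cmd))
# With GDAL installed, run it with: subprocess.run(cmd, check=True)
```

Building the command as a list keeps it easy to run with `subprocess` or to tweak per dataset.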
I haven’t mentioned Esri yet. I do use ArcGIS for Desktop, but again, I’m just not interested in running GIS software in a VM, so it doesn’t get spun up much anymore. I do have an older Dell desktop on my desk, but even that seems like such a pain to get working that I just stick to software that runs on my Macintosh. I might check out ArcGIS Online, or whatever it is called these days, when Esri gets some licensing costs out there, but for now I’m sitting tight with my EDN licensing.
Did I just say I don’t use ArcMap that much anymore?
Now on the desktop I’m rolling with QGIS because I’m very familiar with it. I keep trying to get gvSIG into my workflow, but its resemblance to ArcView 3.x keeps turning me off. That said, having two great desktop GIS tools means they push each other to be better. I use Mapnik as my cartography engine, with Quantumnik integration into QGIS. Mapnik just makes beautiful maps, and I don’t feel like I can get that control anywhere else. I’m also playing around with TileMill, but migrating my old Mapnik XML files to Carto has been more work than I hoped. I’ll get there, though.
I picked the wrong week to convert XML to CSS
I’ve also been using AutoCAD on my Mac since Autodesk came out with a native version again. It just goes to show you can have a great native OS X version of CAD software. Shame Esri can’t do the same, or I’d be running ArcGIS Desktop much more than I currently am.
Looking at my workflow, it seems I don’t like or need a map server of any type. I’d rather talk directly to PostGIS than put more middleware in my way. Tiling engines have really changed how I work. Top to bottom, though, I rely on Python to script almost everything (from ArcGIS and FME to GDAL and QGIS). I’m not seeing much uptake of File Geodatabases in my workflow, but maybe when support for them becomes more widespread, we’ll see that happen. Shapefiles still seem to be the standard, and as much as I dislike them, that just won’t be changing soon.
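Since shapefiles aren’t going anywhere, a lot of my Python glue ends up being housekeeping around them. A “shapefile” is really a bundle of sidecar files, and a little script like this sketch (the directory layout is invented for the demo) catches broken bundles before they hit a conversion step:

```python
import os
import tempfile

REQUIRED = {".shp", ".shx", ".dbf"}   # a shapefile is incomplete without these
NICE_TO_HAVE = {".prj"}               # a missing .prj means an unknown CRS

def check_shapefiles(folder):
    """Group files by basename and report shapefiles missing sidecars."""
    bundles = {}
    for name in os.listdir(folder):
        base, ext = os.path.splitext(name)
        bundles.setdefault(base, set()).add(ext.lower())
    problems = {}
    for base, exts in bundles.items():
        if ".shp" in exts:
            missing = (REQUIRED | NICE_TO_HAVE) - exts
            if missing:
                problems[base] = sorted(missing)
    return problems

# Demo with a throwaway directory: one complete shapefile, one missing its .prj
with tempfile.TemporaryDirectory() as d:
    for fname in ["roads.shp", "roads.shx", "roads.dbf", "roads.prj",
                  "parcels.shp", "parcels.shx", "parcels.dbf"]:
        open(os.path.join(d, fname), "w").close()
    print(check_shapefiles(d))   # {'parcels': ['.prj']}
```

Nothing fancy, but running a check like this before a batch load saves the usual “where did the projection go?” surprises later.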
I’m guessing most people have workflows like mine, looking for software that makes their life easier and the work more efficient.
I’m really excited about our latest free dataset on WeoGeo Market. The National Wetlands Inventory is available as a national layer or as individual states. Hopefully this will be a great resource for those of us needing the NWI in our projects. Remember, too, that you don’t have to take this data as File Geodatabases; you can request it as SHP, TAB, DWG, and other spatial formats. Brilliant!
PostGIS 2.0 hit the shelves (so to speak) this week. One of my projects this weekend is to get it up and running on my MacBook Pro and start working with all the new features. There are a ton of them, but I’m going to pull these from the announcement because I think they are some of the best:
Raster data and raster/vector analysis in the database
Topological models to handle objects with shared boundaries
PostgreSQL typmod integration, for an automagical geometry_columns table
3D and 4D indexing
Index-based high performance nearest-neighbour searching
Many more vector functions
Integration with the PostgreSQL 9.1 extension system
Improved commandline shapefile loader/dumper
Multi-file import support in the shapefile GUI
Multi-table export support in the shapefile GUI
A geo-coder optimized for free US Census TIGER (2010) data
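To make a couple of those items concrete, here’s roughly what the typmod integration and the KNN searching look like in SQL. The table and column names are invented, and you’d need a PostGIS 2.0 database to actually run this. Note also that in 2.0 the <-> operator orders by bounding-box distance via the index, so exact distances should be re-checked for the final ranking:

```sql
-- Typmod integration: declare the geometry type and SRID inline;
-- geometry_columns is now a view that stays in sync automatically,
-- so no more AddGeometryColumn() two-step.
CREATE TABLE cafes (
    id   serial PRIMARY KEY,
    name text,
    geom geometry(Point, 4326)
);

CREATE INDEX cafes_geom_idx ON cafes USING GIST (geom);

-- Index-based nearest-neighbour: <-> lets the GiST index drive the
-- ordering instead of computing a distance for every row.
SELECT name
FROM cafes
ORDER BY geom <-> ST_SetSRID(ST_MakePoint(-122.4, 37.8), 4326)
LIMIT 5;
```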
The BostonGIS blog has more details on the new GUI shapefile loader, which has been improved quite a bit. Loading data into PostGIS is easier than ever these days. You can download it and get started right here.
OK, ignore the fact there were two games in Japan last week and they had Opening Night last night. Today is the best day of the year, Opening Day! Alas, my Giants don’t play until tomorrow night so I’m going to have to live through watching the Padres beat the Dodgers tonight.
Last week, Directions Magazine had a podcast about sharing data. The question was APIs or downloads. Of course I’m partial to data downloads since I work for WeoGeo, but even before I worked here, I was a big proponent of raw data. Personally, I believe it is one of the best ways for citizens to keep track of their government (local to federal). Not that I’m wearing a tin foil hat, but stats are built to lie, and APIs tend to deliver what their “owners” want them to deliver. Raw data means everyone has an opportunity to check each other’s work. Of course, raw data can be manipulated as well, but it is harder to obscure.
Data in APIs can make a nice story, but they don’t always tell the truth
But APIs do serve a purpose: they allow developers to work with data they might not understand. Let’s say, for example, that there is a great dataset of Superfund cleanup locations in Esri File Geodatabase format. Most of us reading this blog know exactly what to do with it: either load it into our Esri tools or use OGR to convert it to another format. But Joe Developer wouldn’t know to do either, and even if the instructions are there, GIS tools are very difficult to use. So even if you offer raw downloads, they might not be useful to everyone.
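For what it’s worth, the OGR route is a one-liner once you know the incantation, which is exactly the knowledge Joe Developer doesn’t have. A sketch with made-up file names (reading an FGDB also requires a GDAL build with the FileGDB driver):

```python
import shlex

# ogr2ogr can read an Esri File Geodatabase and write plain CSV,
# carrying the geometry along as WKT. File names are hypothetical.
cmd = (
    "ogr2ogr -f CSV superfund.csv superfund.gdb "
    "-lco GEOMETRY=AS_WKT"      # include geometry as a WKT column
)
parts = shlex.split(cmd)
print(parts)
```

Trivial if you know OGR, invisible if you don’t — which is the whole argument for services that do the conversion for you.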
One of the biggest reasons I joined WeoGeo was to work on how we could let organizations (commercial or government) share data in raw formats while allowing users to convert it into formats that are actually useful for their needs. A user presented with an Esri FGDB could have it delivered as KML or CSV and then use the tools they are familiar with to get the data ready. Because this happens on the server side at WeoGeo, the end user doesn’t have to worry about the native format, just the format or formats they care about.
We’ve worked really hard with our partners to enable the power of enterprise-strength ETL. We use our APIs to deliver raw data downloads, not formatted, structured data that you must shoehorn into your workflows. This is important because it gives you the power to use the data as you see fit, not as some developer of the APIs thinks you should. Clearly we at WeoGeo are focused on location data, but there are tons of other datasets that organizations should also make available as bulk downloads.
Now I don’t want to make too much of a stink about APIs. They do serve a purpose, and there is nothing wrong with having them as long as the data is available for download first. In fact, it is probably a great idea to offer both: data downloads for those who want to work with the data, and APIs for those who just want to stick points on a map.
If you are looking for a great and simple way to deliver data downloads on multiple platforms, you can get started with WeoGeo and share your data right away.
That’s a great, simple way to start sharing your data with other users. If you’ve been looking for a tutorial on using TileMill outside of the MBTiles format, take a look at Dan’s post. It’s a great starting point.