No GIS Professional can make a map without a legend. Probably because we create such complex maps, they can’t stand on their own. Anyway, ESRI has added some new features that should help us GIS Professionals enter the world of Web 2.0. The key new features, as I see them, are:
International Business Machines Corp. (IBM) agreed to acquire data specialist Netezza Corp. (NZ) in a deal valued at $1.7 billion, as Big Blue expands its analytics business.
Netezza provides technology that allows companies and government agencies to quickly analyze huge amounts of data, something IBM says will give it an advantage in its analytics business.
Netezza offers “a much simpler way to get started on analytics and data warehousing than anyone else in the industry,” said Arvind Krishna, IBM general manager for information management. He told Dow Jones Newswires Netezza’s system can be operated by one person instead of “an army of people” and that it provides increased performance at a lower cost.
Of course, Netezza does spatial as well so it will be interesting to see what happens in this space with the IBM R&D behind it.
The big news tonight though is Hawaii Five-0 is back!
This afternoon, I sat through the Collaboration Panel discussion. The panel was made up of a few people representing state, regional, and local governments, as well as utilities and academia. Almost uniformly, there was a fear (yes, I mean fear) of crowdsourcing that was best summed up by the following statement:
Crowdsourcing presents a vulnerability to us.
Fear of accurate maps? I wish I had been there to see who holds this backward stance, but it doesn’t really matter. The tide is against them and in time they will be washed away. If there was ever a better reason to have SOTM 2011 in Denver, I can’t think of one. These folks need help, and if the Front Range is the great hope for the USA, we are screwed.
Can’t find my way home because the map isn’t crowdsourced!
So we’ve got yet another blog touting the future of the SpatiaLite format as being the next Shapefile. Now, don’t you dare look over at that search feature on the right side of my blog and type in SpatiaLite, because you’ll probably see the same thing (though honestly, I can’t recall if I was of sound mind when writing it). The simple fact is SpatiaLite is a favorite format for those of us with nothing better to do than tell the rest of the world what they should be looking at.
Ah! But like most things, just because a bunch of bloggers think it is a good idea doesn’t mean it will actually matter. In this case, SpatiaLite is dying a slow death because no one is actually implementing it. Now yes, OGR, FDO, and other libraries support it, but you don’t see that making its way into mainline software (QGIS aside, and even its support is poor) and in turn, you rarely see it in the real world. Offhand I can only think of the “beta” format that GeoCommons has on their service (and they’ve had beta attached to it for almost a year).
Now yes, I think we all need a better format than the venerable shapefile (and its three amigos), which as a transmission format fails miserably. But there doesn’t seem to be any indication that this is a problem people actually want solved. I’ve seen much more effort put into KML, GeoJSON, and LAS by the community than SpatiaLite or even SQLite. This isn’t because the SpatiaLite project hasn’t given us the tools to implement it; it’s that the community couldn’t care less about it. SHP works for them and there isn’t any reason to change.
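Part of SpatiaLite’s pitch, for what it’s worth, is that it’s just SQLite underneath: a single file with spatial metadata kept in ordinary tables like geometry_columns, readable without any GIS stack at all. A minimal sketch of that access pattern, using nothing but Python’s built-in sqlite3 (the table contents here are fabricated for illustration; a real database would be opened from a .sqlite file):

```python
import sqlite3

# A SpatiaLite database is a plain SQLite file. Here we fake a tiny
# geometry_columns metadata table in memory just to show the pattern --
# with a real file you'd use sqlite3.connect("somefile.sqlite") instead.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE geometry_columns ("
    " f_table_name TEXT, f_geometry_column TEXT,"
    " geometry_type INTEGER, srid INTEGER)"
)
# A single hypothetical layer: a polygon table in WGS84.
conn.execute(
    "INSERT INTO geometry_columns VALUES ('parcels', 'geom', 3, 4326)"
)

# List every spatial layer the database claims to contain.
layers = conn.execute(
    "SELECT f_table_name, srid FROM geometry_columns"
).fetchall()
print(layers)  # -> [('parcels', 4326)]
```

That ease of inspection is exactly why the bloggers like it; it just hasn’t been enough to move anyone off SHP.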
So what is going to change things? Well, it will be web services, not GIS formats that matter for users moving forward. So I say let’s stop focusing on SpatiaLite as a consumer format and actually work harder at making better web services for these users (like stop it already with the WxS please). SpatiaLite still has its place in the world, but does anyone really want to bother downloading GIS files anymore? Of course not…
Shapefile can’t #FAIL!
Oh and the FGDB API – just assume it is dead as well. ESRI can’t get it out the door and in reality, no one gives a hoot other than federal agencies that have to provide open data, but are locked into the ESRI stack.
The new, improved Ovi Maps will offer live traffic flow information, a new drive assist mode, public transportation maps, a redesigned places page, and social check-ins. The public transportation maps will be available via a map layer for over 80 cities around the world, and check-ins will allow you to broadcast your location via SMS or to your social network of choice.
OK, so does anyone actually use Ovi Maps on purpose? I mean, MapQuest learned how to fit in with the new world order, but Nokia seems to still think we are all ready to jump on their platform. Part of why Where 2.0 doesn’t interest me anymore is they keep getting up there pushing this platform like it is viable or something. But hey, it doesn’t matter, right? Name one product of value that ever launched at Where 2.0. [editor’s note: I must remind Mr. Fee that his passion launched at Where 2.0]
I wonder where Ovi Maps is? At least Nokia is consuming their own dogfood.
I found this blog post on basemaps over at the 41Latitude blog (if you aren’t following this blog you need to start right now) to resonate with me.
perhaps, in trying to make a basemap that’s optimized for everything, we’re actually creating one that’s optimized for nothing.
We all see it quite a bit these days. Some data overlaid on a default Google Map and you can’t read a darn thing. Working for the GNOCDC, we picked the Terrain map as our basemap (even though there is no “terrain” in NOLA) because it was the least cluttered basemap.
Over in the ESRI world, I’ve had a couple of people ask me to put their data on the Esri Topographic web map service because it looks so good. Now I do agree, it is a beautiful basemap, but it isn’t one that lends itself to being a basemap. Esri should be offering a muted basemap that allows the most important part, the information being overlaid, to stand out.
Wait, what happened to 3-day weekends? I guess you get one and then you expect them all the time. Oh well…
Some interesting reading for a Monday morning:
ArcGIS Editor for OSM – Randal looks at the ArcGIS Editor for OSM and concludes it is complicated but powerful. As with all Esri tools (they are “scientific” mind you), nothing is ever simple, but if you can get your hands around it, powerful results happen.
FOSS4G 2010 Final Answer – Apparently there was a Geospatial conference going on somewhere. They all kind of start blending into each other, don’t they?
Making a Data Portal With WordPress – Content management is content management, right? (bless his heart for trying to do this with WordPress) Just goes to show that if you can hack your way around code, there isn’t anything you can’t accomplish (assuming your billable time isn’t an issue).
Gearing up for GIS in the Rockies – Time for the fall conference season to kick into high gear. Front Range GIS is a unique community that does some really great things with both proprietary and open-source tools (usually in combination). Bummed I can’t go.
Why not GeoJSON? – Looks like France was good to Sean. He’s got a great post up on ESRI’s use of JSON in their RESTful API.
As you examine the specification, you’ll probably notice that it looks like the ArcGIS REST API. This is deliberate. The pattern we have used at Esri for exposing REST-ful GIS services has been embraced by thousands of developers who use the ArcGIS Web APIs. It is a simple and intuitive way of structuring and talking to GIS Web services. We wanted you to feel free to implement services that follow the same pattern.
Whether or not this is truly an open spec (and not opening the debate as to how “RESTful” this spec really is), the rush begins for everyone to implement this spec on their own apps so they can be used with ESRI clients.
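To see what “looks like the ArcGIS REST API” means in practice, compare how the two JSON flavors encode a simple point: Esri JSON uses x/y members plus a spatialReference, while GeoJSON uses a type/coordinates pair. A hedged, point-only sketch of bridging the two (real geometries also involve polylines, polygons, nulls, and so on, which this ignores):

```python
# Convert an Esri JSON point to its GeoJSON equivalent. This is a
# minimal sketch covering only the point case -- not a full converter
# for the geometry model either spec defines.
def esri_point_to_geojson(esri):
    return {"type": "Point", "coordinates": [esri["x"], esri["y"]]}

# A point in Denver, in WGS84 (wkid 4326), as the ArcGIS REST API
# would serialize it.
esri_pt = {"x": -104.99, "y": 39.74, "spatialReference": {"wkid": 4326}}

geojson_pt = esri_point_to_geojson(esri_pt)
print(geojson_pt)  # -> {'type': 'Point', 'coordinates': [-104.99, 39.74]}
```

The shapes are close enough that implementing one gets you most of the way to the other, which is presumably why everyone is rushing to support the spec.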
Jack opens his secret to getting RESTful with ArcGIS