The ESRI Developer Summit and the .NET SIG

That Was Then

Way back in 2005, at the ESRI International User Conference, there was a .NET SIG that essentially started something great. In that room there were some great folks (Scott Morehouse, Art Haddad, Brian Golden, Rob Elkins, Jithen Singh, Brian Flood, Dave Bouwman, and others) who talked about where we should take the developer community at ESRI. In my opinion the biggest thing to come out of that meeting was what became the Developer Summit.

I’m sure most ESRI developers feel the same as I do in saying that the DevSummit is probably the biggest thing to come out of ESRI in the last 10 years. It has grown into probably the must-attend event for many ESRI users. At the DevSummit, the .NET SIG (as well as the other ones) became sort of a place to reconnect. It didn’t matter if you did web or desktop development, used Desktop or Server, or worked in C# or VB.NET; you could talk about what the .NET community was doing at ESRI and how ESRI could continue improving it.

A Brave New World?

Well, looking at the 2010 ESRI Developer Summit Agenda, I can see the SIGs have been dropped. I asked a couple of contacts at ESRI if this was just an oversight on the website, and they confirmed that the SIGs are no longer part of the program. I guess the idea is that you’d rather “Meet the Teams” to talk about what you are doing directly with them. Of course, most folks probably won’t bother because they’ll be meeting the teams at the ESRI Islands and talking with them all week.

What I think I’ll miss is the strategic talk about how ESRI can improve their developer community. I thought this feedback was valuable to ESRI, but I guess these days it is better captured through contact-us forms than face-to-face discussion. Part of what makes the Developer Summit so great is that it isn’t like any other ESRI event, and I’m afraid this is just the start of it losing its “Woodstock” feel. Of course maybe change is inevitable, but I can’t help but note it sucks.

February 8, 2010 Thoughts






Data.gov is already broken — just like everything before it

Like most people (I assume), I was doing a little GIS project on Super Bowl morning. Needing some data, the first place I thought of going was the new Data.gov site. After a quick and simple search, I had the dataset I wanted ready to download. But as with every government data repository before it, it is broken. Posted datasets’ download links are all too often 404:

Broken download

It isn’t just the download that’s broken, but the metadata as well. I know, some datasets still work, and who knows, maybe this one will again one day. But for Data.gov to be valuable it needs to ping the data sources to let users know when they are down (and, for web services, what percentage of the time they are down). It also wouldn’t hurt to let the owners of the data know that their datasets are no longer linked correctly on the Data.gov website. Otherwise we’ll just get link rot, and that can kill a project.
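
(To be clear about what I mean by “ping”: something as simple as the sketch below, run on a schedule against the catalog, would catch most of this. The dataset URLs are made-up placeholders, not real Data.gov endpoints.)

    # Minimal sketch of a catalog link-health check.
    # The URLs below are hypothetical placeholders.
    import urllib.request

    DATASET_URLS = [
        "https://example.gov/datasets/foo.csv",
        "https://example.gov/datasets/bar.zip",
    ]

    def link_is_alive(url, timeout=10):
        """Return True if the download link answers with HTTP 200."""
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.status == 200
        except Exception:
            # DNS failures, timeouts, 404s, and 500s all count as down.
            return False

    for url in DATASET_URLS:
        if not link_is_alive(url):
            print(f"BROKEN: {url} -- flag it in the catalog and notify the owner")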

If projects are going to be built on data discovered through Data.gov, much more has to be done to ensure that this data is available consistently, not just when people get around to updating broken links. If things don’t change, it is just another waste of taxpayer money, and we’d have been better off sticking with the previous government data boondoggle.

February 7, 2010 Thoughts






Increasing U.S. Census Participation

One of the biggest issues with the U.S. Census, and probably the one that wastes the most money, is trying to count those who are hard to count. My personal fix would be to use sampling to solve the problem, but for now the task of the Census takers is to try and count everyone. My attention was brought to a project called Census Hard to Count 2010, which maps the “hard to count” population nationwide (based on the Census Bureau’s analysis) to help local and national organizations target their outreach efforts for the 2010 Census and customize messages to communities at risk of being undercounted.

It features interactive maps at the state, metro, county, and tract level, along with detailed statistics for each area. You can search in various ways, and also add overlays showing Congressional districts, ZIP Codes, tract-level maps of 2000 Census mail return rates, and recent foreclosure risk. There’s a FAQ that goes into detail about the data and their methodology.

Clearly larger states have a bigger problem with hard-to-count populations, but Alaska, Hawaii, and New Mexico suggest that there are socioeconomic factors at work as well. Using the demographic layers available in the web app shows that this problem is very difficult to pinpoint, and my hat is off to those trying to crack it.

The UI from the Census Hard to Count 2010 Application

February 4, 2010 Thoughts






GDAL/OGR 1.7.0 Released

Good news from the gdal-announce email list:

The GDAL/OGR Project is pleased to announce the release of GDAL/OGR 1.7.0.

Yep, you can stop there and get your GDAL/OGR on. Or maybe you want to know what is new, copied directly from Frank’s email:

  • New Raster Drivers: BAG, EPSILON, Northwood/VerticalMapper, R, Rasterlite, SAGA GIS Binary, SRP (USRP/ASRP), EarthWatch .TIL, WKT Raster
  • GDAL PCIDSK driver using the new PCIDSK SDK by default
  • New Vector drivers: DXF, GeoRSS, GTM, PCIDSK and VFK
  • New utilities: gdaldem, gdalbuildvrt now compiled by default
  • Add support for Python 3.X. Compatibility with Python 2.X preserved
  • Remove old-generation Python bindings.
  • Significantly improved raster drivers: GeoRaster, GeoTIFF, HFA, JPEG2000 JasPer, JPEG2000 Kakadu, NITF
  • Significantly improved vector drivers: CSV, KML, SQLite/SpatiaLite, VRT

I did a little highlighting up there to call out what I think is noteworthy, at least for me. You can either build it yourself or keep an eye out for an update of FWTools.
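
One gotcha worth flagging from that list: with the old-generation Python bindings removed, any script still doing a bare import gdal has to move to the osgeo namespace. Here’s a rough sketch of the new-style usage (the file names are hypothetical):

    # The new-generation bindings live under the osgeo namespace;
    # the bare "import gdal" style is gone as of 1.7.0.
    from osgeo import gdal, ogr

    gdal.UseExceptions()  # raise exceptions instead of returning None

    # Raster side: open a file and report its shape and band statistics.
    ds = gdal.Open("elevation.tif")  # hypothetical file
    print(ds.RasterXSize, ds.RasterYSize, ds.RasterCount)
    band = ds.GetRasterBand(1)
    print(band.GetStatistics(True, True))  # [min, max, mean, stddev]

    # Vector side: the new DXF driver means OGR can list CAD layers too.
    vector = ogr.Open("drawing.dxf")  # hypothetical file
    for i in range(vector.GetLayerCount()):
        print(vector.GetLayer(i).GetName())

And if you want to kick the tires on the new gdaldem utility, a hillshade is as simple as gdaldem hillshade elevation.tif hillshade.tif.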

January 30, 2010 Thoughts