Reflections on the 2010 ESRI FedUC
Cloud Ready
Well I’m sure you’ve all heard the news. ESRI is now an Amazon Independent Software Vendor.
This means of course that we'll see some ArcGIS Server in the Amazon Cloud very soon. In fact, if you are an ESRI ELA customer, you can take advantage of this right now using one of the pre-built AWS AMIs. Licensing still hasn't been outlined by ESRI (which is probably why the ELA is required), but it seems like we could be close to hourly ArcGIS Server instances by next year. The AMI isn't anything special at this point, just a Windows Server AMI with ArcGIS Server on it.
The WeoGeo booth was right next to Amazon (or maybe Amazon was right next to WeoGeo, hmmm) and there seemed to be some traffic and lots of questions. The answers weren't that concrete from what I heard, and Amazon looked like they had been rushed into being there, but people did make an effort to seek them out and talk about GIS in AWS. At this point the ESRI/Amazon relationship is so new publicly that we'll have to wait for the BPC/DevSummit, or most likely the International UC, to get the real details.
The Plenary
OK, so ESRI in the cloud didn't knock your socks off; the inevitability of the whole thing at this point seemed to make it feel anticlimactic to many. Tough world we live in.
Jack's plenary talk was, as always, razor sharp on what ESRI is doing for its Federal customers and, as always, it set the stage for the year. As I alluded to earlier, the phrase "Cloud Ready" is something we'll be hearing a ton about with ArcGIS 10. This means a couple of things: first, it integrates with other cloud services via the REST API (something many have already been doing for years); second, they've got this Amazon AWS AMI which you can license to run a full ArcGIS Server (without any scaling, of course) in Amazon's cloud; and third, I think it means that ESRI's web services are going to essentially make even private or internal clouds "GIS Ready" (that's my term, in the spirit of Cloud Ready).
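Since "Cloud Ready" mostly boils down to plain HTTP and JSON, here is a minimal sketch of what that REST integration looks like from the consuming side. The host and service name are hypothetical placeholders; the /arcgis/rest/services layout is the standard ArcGIS Server REST pattern.

```python
# A minimal sketch: any HTTP client can talk to an ArcGIS Server REST
# endpoint and get JSON back. "example-server" and "SampleService" are
# hypothetical placeholders, not real services.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://example-server/arcgis/rest/services/SampleService/MapServer"

# Ask the map service to describe itself as JSON.
with urlopen(BASE + "?" + urlencode({"f": "json"})) as resp:
    info = json.load(resp)
print(info.get("serviceDescription", ""))

# Query layer 0 for features, again as plain JSON any client can consume.
params = urlencode({"where": "1=1", "outFields": "*", "f": "json"})
with urlopen(BASE + "/0/query?" + params) as resp:
    features = json.load(resp).get("features", [])
print(len(features), "features returned")
```

Nothing ESRI-specific is needed on the client side, which is exactly why this plays well with other cloud services.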
I think the Plenary was well received by the crowd, but they seemed quiet. I'd probably feel the same way if Mother Nature dumped a ton of snow on me for a couple of weeks. One interesting take-away from the talk is ESRI's focus on private clouds, which I think aligns very well with the FedUC crowd. Their focus on mobile was very apparent, and I think at this point every reference to a Windows Mobile device has been removed from Jack's slides and replaced with an iPhone. ESRI's focus on web services means they can transition to mobile devices with their mobile APIs (ah, here is the iPhone API ready to work).
Jack focused on the big-picture architecture of ArcGIS 10 and then it was time for others on the ESRI team to come out and demo. We saw a good overview of ESRI's ArcGIS Online map services. This world topo map ESRI has been working on is really special. The cartography just catches your eye, and the fact that it goes down to 1:1k scale (off by a little scale factor from my first guess of 1:1000k) in large cities really makes me want to use it instead of street map services. ArcGIS Explorer Online is a really slick Silverlight app that seems to emulate much of ArcGIS Explorer (except 3D, of course), which might be a good general GIS web services browser for ESRI users. They keep hiding the URL so I can't share it, but it was something like http://explorerweb.arcgis.com or similar. We'll see it soon enough, I guess.
It’s About Servers
Then the most surreal part of the whole FedUC occurred. John Calkins ran through his overview of ArcGIS Desktop 10 as he always does. If you've never seen John give this talk, you can view one here. John, as always, did a really good job, and some of the refinement in ArcGIS Desktop 10 is simply amazing. The editing environment, threaded geoprocessing, and symbology improvements really put ArcGIS Desktop way beyond anything any other GIS vendor is doing. But what caught me off guard was the crowd's reaction to it. As I said earlier, the crowd seemed tired and not into things, but during the Desktop demo I heard some things that really amazed me.
- “Why isn’t this demo in Flash (or Silverlight)?”
- “Why isn’t he using an iPhone to do this?”
- “Do people still use ArcMap?”
Here was a crowd that I thought would eat Desktop alive because they spend all day in it, and many just didn't care anymore. (Note: I don't have super hearing, so I could only listen to those in front of or behind me.) Could we finally be at a big shift in mentality, where we are breaking away from these large legacy desktop clients and toward lightweight mobile and web clients for analysis? Are users finally listening to our "the web is where the magic happens" talks and taking them to heart? Not sure, but it was interesting.
Now before everyone declares desktop GIS dead, let's be realistic here. Content creation tools on mobile devices and web clients are still not developed to the point where you can get the accuracy you need, so for many users Desktop is still a required element and probably will be for decades to come. But I do think that average users of GIS, even those institutionalized in the federal service, are ready for this mobile, crowdsourcing future that we are just about to enjoy.
Crowdsourcing? ESRI?
Yep, Jack talked quite a bit about VGI (Volunteered Geographic Information), which of course is a term used to describe crowdsourcing/neogeography/participatory GIS or whatever else is the term of the hour. Dave Smith did a really good job of summarizing crowdsourcing and ESRI on his blog, so I'd like to point you there for some reading. ESRI has put thought into ArcGIS' place in VGI and how users will want to get information in and out. I think as ArcGIS 10 progresses we'll see much more on this and on how ESRI users can edit things such as OpenStreetMap directly from their ArcMap clients. The International UC should show us much more detail on how this is all going to work.
On the Floor of the Expo
We at WeoGeo were of course on the floor showing what we are doing with ArcGIS and the cloud, but so were many others. Amazon was there, as I said. GeoEye was, but DigitalGlobe wasn't. NAVTEQ and DeLorme were, but TeleAtlas wasn't. Friends at VoyagerGIS and Arc2Earth (who was at the New Light Technologies booth) were there showing their latest products. I ran into Stu Rich of PenBayMedia showing off some of their very impressive building interior modeling, and of course everyone else from SAIC to lone GIS professionals stopped by to say hello.
2010 in the ESRI Community
So as always the FedUC kicks off the ESRI year. We’ll see much more at the BPC and DevSummit next month, but the message is simple. ArcGIS 10 will interact with “the cloud” no matter what that term means to you. The more I see with ArcGIS 10, the more I can see why they named it 10 rather than 9.4. It really is a break from what ESRI was doing in the past on both the Desktop and the Server. ArcGIS 10 should arrive early Summer (not to jinx anything of course), probably before the International UC so we can all give it a test run before we show up in San Diego.
I hadn’t been to a FedUC in more years than I can recall. It was really great to see how much this conference has grown and how many more people are interested in geospatial technology as well as how people have embraced the concept of web services, web clients and mobile GIS as more than just a display tool. Should be a very exciting year.
Off to the 2010 ESRI Federal User Conference.
Well I’m leaving warm sunny Arizona for some crazy reason to head off to the 2010 ESRI Federal User Conference in “balmy” Washington D.C. I haven’t been to the FedUC in more years than I care to share, but I’m excited to go this year. I’ll of course be hanging at the WeoGeo booth most of the day showing the cool ArcGIS Desktop integration we’ve been working on and going to as many talks as I can squeeze in.
I'm also going to see what is going on with this #geoglobaldomination stuff the Mid-Atlantic folks seem so keen on repeating in every tweet. As with most things, east coasters seem to make a big deal about everything, so this is their chance to impress me. Heck, they even schedule #geoglobaldomination, so it must be good.
GeoGlobalDomination
“Shall we play a game?”
Let’s Save Metadata
Metadata
When you see the word metadata I’m sure you begin to sweat. You get that lump in your throat and suppressed memories bubble to the surface (none of which are good).
They can get you at any time
Now it isn't hard to see why: metadata as we've been exposed to it is just not human readable, and thus barely human usable. Working in the government sector as a consultant exposed me to the two worst words any DoD consultant can hear: "metadata required".
We deal with four-letter acronyms all the time, right? FGDC. Even the website is built on Plone, which of course feels more like an Ivy League research project than the traditional SharePoint site we'd all expect from a government agency. One should be scared navigating it and trying to find information. Anyway, what is it about metadata as we've been using it (FGDC or ISO) that is just so painful?
Machine Readable vs. Human Readable
So FGDC and ISO metadata are complex, but there could be good reasons for this. They both try to address every conceivable possibility that might need describing in geo-data. If both were primarily designed to allow servers to talk with each other, I'm not sure any of us would have any problem with them (nor would we really be looking at them). But servers rarely read and write metadata on their own without human interaction. Thus the reality of the situation is that we poor humans have to ingest and parse metadata regularly.
Yikes
Well, this brings me to what I see as the biggest problem with metadata: it is almost always in XML format. Now don't get me wrong, XML has its purpose. In fact, I could probably list thousands of times that XML is the right answer. Sometimes it works and works well; other times you end up with a whole bunch of brackets and text that blends together. With a good eye you can parse out what you need, but there is so much noise that it almost feels like a "Where's Waldo" exercise. XML does a good job of organizing data for machines, but it doesn't do it in ways that are easily readable by humans.
What Human Readable Metadata Should Focus on
So someone sends you a dataset for a project you are working on. There are some questions you want answered before you commit to using it:
- Who is responsible for the dataset?
- What is the dataset representing?
- When was it created?
- Where are its extents (projection, datum, etc)?
- How was it created?
- Why was it created?
The problem with metadata today is that those questions are hard to parse out of it. If you know what to search for you might be able to find them relatively quickly, but the simple fact is that if I want the answers above for a dataset, they should be exposed to me first.
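To make that concrete, here is a sketch of digging the six answers out of a record, assuming standard FGDC CSDGM element names (origin, abstract, pubdate, bounding, procdesc, purpose) and a trimmed-down sample record of my own invention:

```python
# A sketch, assuming FGDC CSDGM tag names: pull the six answers out of a
# metadata record so a human never has to play "Where's Waldo" in the XML.
import xml.etree.ElementTree as ET

SAMPLE = """<metadata>
  <idinfo>
    <citation><citeinfo>
      <origin>Example County GIS Department</origin>
      <pubdate>20100211</pubdate>
    </citeinfo></citation>
    <descript>
      <abstract>Road centerlines for Example County.</abstract>
      <purpose>Created to support E911 dispatch.</purpose>
    </descript>
    <spdom><bounding>
      <westbc>-112.5</westbc><eastbc>-111.0</eastbc>
      <northbc>34.0</northbc><southbc>33.0</southbc>
    </bounding></spdom>
  </idinfo>
  <dataqual><lineage><procstep>
    <procdesc>Digitized from 1:1200 orthophotography.</procdesc>
  </procstep></lineage></dataqual>
</metadata>"""

root = ET.fromstring(SAMPLE)

def find(path):
    # findtext returns None when the element is missing.
    return (root.findtext(path) or "").strip()

print("Who:  ", find("idinfo/citation/citeinfo/origin"))
print("What: ", find("idinfo/descript/abstract"))
print("When: ", find("idinfo/citation/citeinfo/pubdate"))
print("Where:", [find("idinfo/spdom/bounding/" + t)
                 for t in ("westbc", "eastbc", "northbc", "southbc")])
print("How:  ", find("dataqual/lineage/procstep/procdesc"))
print("Why:  ", find("idinfo/descript/purpose"))
```

If those six lines printed at the top of every metadata page, half the battle would be won.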
Metadata Style Sheets
One way people have tried to make FGDC metadata (and ISO to some extent) more readable is through the use of style sheets. Many ESRI users are exposed to this inside ArcCatalog: that drop-down list that lets you choose different ways of viewing the metadata is a style sheet selector. This means you can take that ugly XML metadata and render it in ways that are easier to read. I've not seen much in the way of usability improvements on this front. At WeoGeo we offer human readable metadata on our dataset information pages. Others are doing it as well, but there is really no standard as to how this should be organized.
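For the curious, here is a toy version of what that drop-down is doing under the hood: push the raw XML through an XSL transform and out comes something readable. This sketch assumes lxml (any XSLT 1.0 processor would do) and uses a deliberately tiny record and stylesheet of my own; real FGDC sheets are far more involved.

```python
# A toy style sheet transform: raw FGDC-ish XML in, readable text out.
# The record and stylesheet are minimal examples, not real FGDC assets.
from lxml import etree

DOC = etree.XML(b"""<metadata><idinfo>
  <citation><citeinfo><origin>Example County GIS</origin></citeinfo></citation>
  <descript><abstract>Road centerlines for Example County.</abstract></descript>
</idinfo></metadata>""")

SHEET = etree.XML(b"""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/metadata">
    <xsl:text>Source:   </xsl:text>
    <xsl:value-of select="idinfo/citation/citeinfo/origin"/>
    <xsl:text>&#10;Abstract: </xsl:text>
    <xsl:value-of select="idinfo/descript/abstract"/>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(SHEET)
print(str(transform(DOC)))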
So Who Cares About FGDC/ISO?
Honestly, you really shouldn't care. You should care, though, about getting information describing the data you are working with. I think most of the issue with both metadata standards is that they are just too hard to put information into and too hard to get the relevant information out of. Committee-designed standards such as these always end up being way too much for real world use. We need to make sure we get the who, what, when, where, how, and why of the dataset, and to do this we need to look at the geo-data creation tools and how they help us input metadata. Data creators should have an easy time filling out those six things about their data. The issues are in the weeds of the metadata standards. But out on the fringes of the metadata requirements, creation tools (such as ArcCatalog) can help us manage things. Databases should be tracking who created the data (their name/address/etc.), when it was last modified, any lookup tables, aliases for field names, links to additional information, and anything else that is being used for that dataset. Not having to track all that down gives the creator of the data enough focus to make the who, what, when, where, how, and why so much better than if they had to enter everything.
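As a thought experiment, here is roughly what a creation tool's "six questions first" record might look like. The names are mine, not from any standard; the point is only that the creator fills in six fields while the tool tracks everything else.

```python
# A hypothetical "dataset card": the six questions as first-class fields,
# with the long tail of standard metadata relegated to a catch-all dict
# that the creation tool or database fills in automatically.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetCard:
    who: str            # responsible party; a database can prefill this
    what: str           # what the dataset represents
    when: date          # creation or last-modified date, tracked automatically
    where: str          # extent, projection, datum
    how: str            # lineage: how it was created
    why: str            # purpose
    extras: dict = field(default_factory=dict)  # everything "below the fold"

card = DatasetCard(
    who="Example County GIS",
    what="Road centerlines",
    when=date(2010, 2, 11),
    where="NAD83 / UTM 12N, Example County extent",
    how="Digitized from 1:1200 orthophotography",
    why="E911 dispatch support",
)
print(card.who, "-", card.why)
```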
And on the display end of things, I'd like to see UI experts work on creating better human readable metadata style sheets that hide the details you don't need to see at first glance and expose what we as users of data do need at first glance. It is easy enough to expand the details "below the fold" of a metadata page.
What Now?
It is up to all of us. We are stuck with the metadata standards, so changing them at this point isn't feasible. At WeoGeo we're committed to bringing complex, detailed FGDC/ISO metadata to users in easy-to-digest forms. What I'd like, though, is to hear from others trying to crack this same nut, and to see if we can collaborate on this more so that, in this age of NSDIs, we still have usable metadata for people to make decisions with.
Google Maps Labs Finally Improves Navigation Features - Sort Of
Yes, FINALLY! I can't tell you how frustrating it has been for me since the day Google Maps arrived. I always wanted to hold down the shift key and draw a box to zoom in, like in every other modern mapping API. With Google you had to use your mouse wheel, and really, who has a mouse wheel anymore with our notebooks and touch mice? Something had to be done.
Enter Google Maps Labs. You should now see that little green beaker in the upper right hand corner of your Google Maps screen.
The new labs icon
Click that icon and you are presented with some new features:
The labs options
Two of note:
Drag ‘n’ Zoom
Now this was the one I was most excited about until I saw its implementation.
The navigation
See that little square below the zoom bar? You are supposed to click on that if you want to zoom in. You can't do what is completely obvious to everyone: hold down the shift key. I wouldn't mind if they had both, but not enabling the shift key at all is totally baffling. Even worse, you can't use the escape key to get out of Drag 'n' Zoom; you have to move your mouse all the way back over to the left and turn it off.
Aerial Imagery
I don't agree with what Google calls this, because I'm sure there is "Aerial Imagery" in their "Satellite" images too, but they've got $50 billion in cash and I'm underwater on my mortgage, so what do I know? Anyway, this is the Google oblique imagery we've read about. It is only available in a few small areas, but we can now see it outside of the Google Maps API. When you zoom to an area that has oblique imagery, you'll see the new oblique aerials button that turns it on. You can use Drag 'n' Zoom to quickly get to an area you want.
Google Maps Oblique
The Others
The rest aren't, in my opinion, that newsworthy, but they probably address small needs of some users. I think this is a good way for Google to get new features into Maps quicker than their normal release schedule allows. I just wish they'd get on board with existing UI and naming conventions.
via GigaOM
The ESRI Developer Summit and the .NET SIG
Play this below to set the mood:
That Was Then
Way back in 2005, at the ESRI International User Conference, there was a .NET SIG that essentially started something great. In that room there were some great folks (Scott Morehouse, Art Haddad, Brian Golden, Rob Elkins, Jithen Singh, Brian Flood, Dave Bouwman, and others) who talked about where we should take the developer community at ESRI. In my opinion the biggest thing to come out of that meeting was what became the Developer Summit.
I'm sure most ESRI developers feel the same as I do in saying that the DevSummit is probably the biggest thing to come out of ESRI in the last 10 years. It has grown to be the must-attend event for many ESRI users. And at the DevSummit, the .NET SIG (as well as the other ones) became sort of a place to reconnect. It didn't matter if you did web or desktop development, used Desktop or Server, or worked in C# or VB.NET; you could talk about what the .NET community was doing at ESRI and how ESRI could continue improving it.
A Brave New World?
Well, looking at the 2010 ESRI Developer Summit agenda, I can see the SIGs have been dropped. I asked a couple of contacts at ESRI if this was just an oversight on the website, and they confirmed that the SIGs are no longer part of the program. I guess the idea is that you'd rather "Meet the Teams" to talk about what you are doing directly with them. Of course, most folks probably won't bother, because they'll be meeting the teams at the ESRI Islands and talking with them all week.
What I think I'll miss is the strategic talk about how ESRI can improve their developer community. I thought this feedback was valuable to ESRI, but I guess these days it is better captured through contact-us forms than face-to-face discussion. Part of what makes the Developer Summit so great is that it isn't like any other ESRI event, and I'm afraid this is just the start of it losing its "Woodstock" feel. Maybe change is inevitable, but I can't help noting that it sucks.