I’m off to the 2012 Esri Federal GIS Conference tomorrow morning. The one thing the “FedUC” does more than anything else is set the tone for Esri’s marketing push. From the Plenary description:
Esri president Jack Dangermond will provide an update on the latest innovations in ArcGIS, new patterns in cloud GIS, and a vision for the future of GIS.
The Esri cloud story has been half told. Basically it’s a concept that still needs to be fleshed out. I’m still waiting for the new ArcGIS architecture to come down the road, where Esri can really start treating “ArcGIS” as a true scalable hosted technology and not just some enterprise server software that happens to run on some IaaS platform.
The “new pattern” as Esri sees it is probably in this article written by Victoria Kouyoumjian. The problem with this vision is that I’m not sure ArcGIS.com as it exists today is anything close to what GIS professionals (the ones who use Esri software) want or need. ArcGIS.com might “grow the brand” for Esri, but it keeps them stuck in the past, having to support the world’s largest COM ecosystem out there: ArcGIS.
We’ll just have to see what happens when Jack takes the stage in about 36 hours and tells us what we need to know about Esri and hosted GIS. Personally, I’m still hopeful that at some point Esri will figure out a license model that lets us at WeoGeo integrate their products into our infrastructure without costing $40,000 an instance (we scale our servers on Amazon to the point where we could never afford that kind of license).
ArcGIS.com is a smokescreen for Esri to keep talking about hosted GIS until they announce their new ArcGIS backend. For all we know, ArcGIS.com is the basis for this new ArcGIS that scales, works on non-Windows servers, and is priced realistically. I’d jump at a chance to leverage more Esri technology in our stack at WeoGeo, but for now we’ll sit and wait with the rest of you for whatever this new pattern is that Jack is going to talk about.
Licensing based on how people were doing business in 1988 doesn’t work in 2012. That’s what the “new pattern in cloud computing” should be telling us.
I saw today that Paul Ramsey posted the 2013 FOSS4G RFP. Highlights according to Paul:
- 2013 is a “Europe year”
- Like last year, using a two-stage bidding process, with letters-of-intent, followed by full proposals for selected bidders.
- Letters of intent due March 31.
Since this is a “Europe Year”, clearly we need to be thinking about a FOSS4G in Europe next year. Because Europeans are always looking to the USA for suggestions (right?), I’ll gladly pass this free advice on to our brothers and sisters in the Eurozone… Don’t put it in Northern or Eastern Europe. The only acceptable choices are the French or Italian Riviera.
I’m sure Prague, Budapest or Warsaw are wonderful cities and should be at the top of any trip to Europe. But to me that’s like having a conference in Chicago. Sure, there are wonderful things to do in Chicago, but eventually you wake up one morning and you realize the best thing about Chicago is that it isn’t St. Louis or Milwaukee. You don’t want that to happen to FOSS4G 2013, right?
Plus who doesn’t want to see old men in speedos pull a fishing boat up on the beach (isn’t that what Italy is all about)?
I’ve been trying to catch up on some reading during lunch today and this interview that Joe Francica did with Pitney Bowes Software’s John O’Hara caught my eye:
[John] O’Hara mentioned that Autodesk saw Esri moving into the design space and therefore saw an opportunity to work with PB as a partner in situations where they go up against Esri. At the same time, Autodesk and PB each have some dependence on desktop software but largely don’t play in the same space. PB is focused on business applications of GIS in markets like insurance, banking and retail, while Autodesk plays in the planning, engineering and energy space. Autodesk’s go-to-market model is through a huge network of partners; PB has a more direct sales organization.
All this “GeoDesign” talk clearly caught Autodesk’s eye. I’m guessing that they’ve had enough and we could be seeing a larger commitment from Autodesk in the “Geo” (big G) space in the coming year.
On the way to battle the Wicked Witch of the West Redlands, Autodesk found Pitney Bowes on the side of the road.
Yesterday in the Geowanking mailing list (yes it seems to still be alive), Steve Coast posted about a new project he’s been working on.
I figured this is a good group to give a peek at something I worked on last month:
The premise is that a typical geocoder uses a large chunk of code to import a large database into a large geocodable database. Then another large chunk of code is used to actually take a string and geocode it against this large imported dataset. At the end of all of this, all you’re typically doing is showing some bbox for some string like “London” which the user typed in.
So wait, another geocoder? Of OpenGeocoder.net, Steve says:
The major differentiators against other sites are that the IP licensing is clear, all bboxes are derived from imagery we have rights to, the bbox & string data is put in the Public Domain. It’s trivial to use. The API saves misses for later fixing. It’s hard to find a site that does 2 or 3 of those.
Basically, Microsoft is donating the imagery to the project so that anything created is in the Public Domain. Still, people seem nervous about a Microsoft license. I’m of the mindset to give Microsoft the benefit of the doubt here, especially given that Steve has put his name on this project. The API seems dirt simple:

[Simple OpenGeocoder API call]
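To make the premise concrete, here’s a toy Python sketch of the string-to-bbox lookup (and the miss-saving behavior) Steve describes. This is not the actual OpenGeocoder API, just an illustration, and the place name and bbox values are made up:

```python
class ToyGeocoder:
    """In-memory sketch of a string -> bbox geocoder that records misses."""

    def __init__(self):
        self.bboxes = {}   # normalized place name -> (west, south, east, north)
        self.misses = []   # queries we couldn't answer, saved for later fixing

    def add(self, name, bbox):
        """Register a bounding box for a place name."""
        self.bboxes[name.strip().lower()] = bbox

    def geocode(self, query):
        """Return the bbox for a query string, or None (and record the miss)."""
        key = query.strip().lower()
        if key in self.bboxes:
            return self.bboxes[key]
        self.misses.append(query)  # saved misses are what make the data fixable
        return None


geocoder = ToyGeocoder()
geocoder.add("London", (-0.51, 51.28, 0.33, 51.69))

print(geocoder.geocode("london"))    # (-0.51, 51.28, 0.33, 51.69)
print(geocoder.geocode("Atlantis"))  # None -- recorded in geocoder.misses
```

The interesting design choice is the miss list: instead of failing silently, every unanswered query becomes a work item for the community to fix.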
And you can export everything out as a data dump:

[OpenGeocoder data dump]
It will be interesting to see what the community does with this new resource. A geocoder is only as good as the data inside it, and if the community is required to make this one work, we’d better get cooking.
OK, so maybe Esri doesn’t call the FedUC the FedUC anymore. If you called it something else, I might not know what you were talking about. Anyway, I’ll be in Washington Tuesday through Friday next week to present a talk on how WeoGeo uses Amazon’s infrastructure to do the awesome stuff we do.
Amazon has invited us to be with them at their booth showcasing how WeoGeo uses AWS to integrate location-based enterprise data into predictive analytical systems such as Business Intelligence tools. Drop by and let me know if you’d like to talk about WeoGeo or just email me and we can set up a time to meet.
It is hard to believe it has been over two years since I was last in D.C. I’m looking forward to it.
The GeoMonkey goes to Washington and all he finds is lots of paperwork. You guys really know how to throw a party inside the beltway.
This blog has moved more times than I can recall. In its current form it started on Blogger, then MovableType, and then WordPress, where it’s resided since about late 2005. Times change, though, and WordPress is more of a content management system than a simple blog tool. I’ve gotten tired of hacking PHP just to get a nice simple blog.
I’ve written my blog posts in Markdown for at least the past four years, so when we migrated the WeoGeo website over to Markdown and Jekyll, I was in love. Jekyll is a great static site generator built on Ruby that does one thing really well: generate a website full of static webpages. Unlike tools such as WordPress, which render everything as it’s requested, a Jekyll website is pre-generated, so the server just hands the browser a plain HTML page. While there are WordPress plugins that basically cache the PHP pages, the pile of plugins slows the website down and is huge overkill for what I want to do: write.
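A quick aside for anyone who hasn’t seen Jekyll: a post is just a Markdown file with a small YAML front-matter block on top, which Jekyll compiles into static HTML at build time. The title and date below are made up for illustration:

```markdown
---
layout: post
title: "Goodbye WordPress, Hello Jekyll"
date: 2012-02-20
---

The body of the post is plain Markdown. Running the
`jekyll` command turns a folder of these files into a
complete static site.
```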
Now, we rolled our own Jekyll website at WeoGeo, but I wanted to simplify the process for myself since I was only creating a blog. That’s where Octopress came in [I wish I could give a hat tip here to whomever pointed Octopress out to me first]. The Octopress framework abstracts out much of the blogging process (connecting to Twitter, Facebook, Disqus comments, GitHub), freeing me to basically just write. I had to do a little work getting Mint to work, but if you use Google Analytics, it basically works out of the box.
I’ve still got a couple of things I need to do:

- Clean up old blog posts. Because many of these have been migrated ten times or so back and forth, the formatting is messed up. I’m doing this by hand because I want them to look good, but with over 2,000 blog posts, this takes time.
- URLs are still a little whacked out. WordPress did some non-standard things, so rather than write some weird .htaccess rules, I’m basically converting the URLs by hand. I figure maybe 1% of them still fail at this point, but I should have that cleaned up soon.
- The template is still stock, and I want to get it back to the simple white background I used to use. That’s simple enough, but the number one priority was freeing my content from MySQL and WordPress.
I’d like to thank Dan Dye for helping me out with some of the Python work. He’s become such a Python ninja that I can’t help but use him as a resource. Seriously, Levenshtein is awesome and I probably wouldn’t have found it without him.
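For context, Levenshtein distance is the minimum number of single-character edits (insertions, deletions, substitutions) between two strings, which makes it handy for matching old WordPress URLs to their closest new Jekyll permalinks. Here’s a minimal pure-Python sketch; the URLs below are made up for illustration, and Dan likely used a library rather than hand-rolling it:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance, keeping two rows at a time."""
    if len(a) < len(b):
        a, b = b, a  # ensure b is the shorter string to keep rows small
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        current = [i]
        for j, cb in enumerate(b, 1):
            current.append(min(
                previous[j] + 1,                # deletion
                current[j - 1] + 1,             # insertion
                previous[j - 1] + (ca != cb),   # substitution (0 if chars match)
            ))
        previous = current
    return previous[-1]


def closest(url, candidates):
    """Pick the candidate permalink with the smallest edit distance."""
    return min(candidates, key=lambda c: levenshtein(url, c))


old = "/2012/02/14/arizona-centennial/"
new_urls = ["/blog/2012/02/14/arizona-centennial/",
            "/blog/2012/02/13/foss4g-2013-rfp/"]
print(closest(old, new_urls))  # /blog/2012/02/14/arizona-centennial/
```

Fuzzy matching like this beats exact string comparison when the migration has mangled slugs or date prefixes in slightly different ways across posts.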
100 years ago today Arizona became the 48th and last of the contiguous 48 states to gain statehood.
It only took about 500 years after the Spanish first set foot in Arizona looking for gold. Much more at the Arizona Republic.