Author: James

  • The ESRI Flex API vs the JavaScript API

    OK, I’ll come clean. While Flex is a great tool, I can’t see how you’d not use the JavaScript API instead. Flex is still not available on every platform (I’m an iPhone elitist) and isn’t easily picked up by everyone. But more than one person said in last week’s thread on the JavaScript API that Flex was the way they are going. I’m curious: is there a rising groundswell behind Flex, or are ESRI developers just outliers to the overarching movement toward JavaScript? FlexBuilder 3 is at least $250 (Pro is almost $700), and I just can’t see people bothering to buy an IDE just to put a map on a page.

    What part of the Flex API makes you choose it over the JavaScript API? If people are willing to block Flash, don’t you limit your marketplace by going that route over JavaScript?
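
    For context, here is roughly all it takes to put a map on a page with the JavaScript API. This is a minimal sketch from memory; the hosted API version and the basemap service URL are illustrative and may not match what ESRI serves today.

    ```html
    <html>
      <head>
        <script type="text/javascript"
                src="http://serverapi.arcgisonline.com/jsapi/arcgis/?v=1.2"></script>
        <script type="text/javascript">
          dojo.require("esri.map");

          function init() {
            // Create the map and add a cached ArcGIS Online basemap (illustrative URL).
            var map = new esri.Map("mapDiv");
            map.addLayer(new esri.layers.ArcGISTiledMapServiceLayer(
              "http://server.arcgisonline.com/ArcGIS/rest/services/ESRI_StreetMap_World_2D/MapServer"));
          }
          dojo.addOnLoad(init);
        </script>
      </head>
      <body>
        <div id="mapDiv" style="width:600px; height:400px;"></div>
      </body>
    </html>
    ```

    No IDE, no compile step, no plugin to block; just a script tag and a div.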

    Flex…Savior of the Universe

  • 2009 ESRI User Conference Abstract Submission Deadline Extended

    If you were like me today and were working hard at getting your abstract in, this is probably not a big deal to you. But if you weren’t able to get your submission done, you now have some extra time to get your ducks in a row for your 2009 abstract submission.

    Professionals across industries and with all levels of GIS experience are encouraged to submit an abstract for possible presentation at the 2009 gathering. The deadline for submissions has been extended to November 14, 2008.

    You still have time to get that submission in!

  • The ESRI WebADF and the ArcGIS JavaScript API

    I find it interesting that most of the work I’m seeing these days is with the JavaScript API that ESRI released at ArcGIS 9.3. I assumed a couple of months ago that people would really be looking at moving off of the WebADF (.NET or Java) to the JavaScript API, and it appears that this trend is beginning to happen. Now before you think that I’m really sticking a fork in the WebADF, think again. The WebADF will continue to grow and be used where it makes sense, but probably not as the “default” mapping front end for ESRI web servers. The simplicity of the JavaScript API and the way it works make the classic WebADF and HTML viewers obsolete for most users (I’m still waiting to see what ESRI does with Silverlight, but that discussion is for another day).

    Also, coupled with the JavaScript Extenders for Google Maps and Virtual Earth, there are probably very good reasons to be looking this way instead of deploying the WebADF. I’ve also seen people abandoning third-party “helpers” for the WebADF such as Geocortex Essentials (I guess we’ll see JavaScript API tools from these companies soon, eh?) to move back to simpler JavaScript front ends. There are times and places for .NET or Java server solutions, but what the JavaScript API has done is allow ESRI customers and implementers to go with a more lightweight solution, which in turn brings them to more cutting-edge RESTful and JavaScript technologies that can be leveraged outside of the ESRI silo.
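
    The REST piece is worth dwelling on for a second: every ArcGIS Server 9.3 service is just a URL that can hand back JSON, so you can hit it with plain Dojo (or any HTTP client) without ever touching the WebADF. A minimal sketch, assuming a same-origin or proxied map service; the server name, service name, and the POP2000 field are placeholders:

    ```javascript
    // Query layer 0 of a map service through the ArcGIS Server REST API.
    // "myserver" and "MyService" stand in for a real 9.3 server and service.
    dojo.xhrGet({
      url: "http://myserver/ArcGIS/rest/services/MyService/MapServer/0/query",
      content: {
        where: "POP2000 > 100000",  // hypothetical attribute filter
        outFields: "*",
        returnGeometry: false,
        f: "json"                   // ask the endpoint for JSON instead of HTML
      },
      handleAs: "json",
      load: function (featureSet) {
        console.log(featureSet.features.length + " features returned");
      },
      error: function (err) {
        console.error(err);
      }
    });
    ```

    Nothing on the client side is ESRI-specific; it is an HTTP GET and some JSON, which is exactly what makes it usable outside the ESRI silo.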

    I’ve really started to try and point my clients (and anyone else who asks) away from the Java and .NET WebADF and toward the lightweight ESRI JavaScript API. Everyone who has moved in that direction has been really satisfied, and after the 9.2 release of ArcGIS Server, that is really turning things around.

    James seems to be pushing ArcGIS Server again

  • A Week Off

    I’m heading down into the basement for a week. Enjoy the greatness that is David Hasselhoff, Gary Coleman and of course KITT.

    What a moment of zen…

  • GeoCommons Maker! – the next day

    Well, kudos to FortiusOne for getting the word out on Maker!, especially since the launch was delayed from the original PR blitz. As with most GeoBloggers, I’ve had access to Maker! since last week and have been really impressed with its output. Sean has been teasing us for months, it seems, with the cartographic output of Maker! in his blog posts, so I was glad to finally get my hands on Maker!. (Side note: do you put a period after a product name that ends in a punctuation mark?)

    Maker! is the map production portion of GeoCommons, and Finder! is the search engine for geospatial data. Together they allow users to create web maps that can be shared with the world. To get information into Maker!, you first upload your data to Finder! and then add it to your map. The byproduct of this workflow is that more data gets added to Finder!, and in turn more data is available to the community at large. Freely sharing data is one of the core components of GeoCommons (compared to WeoGeo, which is more of a marketplace).

    Stefan Geens does a good job of showing how the map is created and how you set what we usually refer to as the symbology of layers. What I like about this approach is that you can bring the data to light in ways that, before Maker!, required custom programming to achieve good-looking results (if it was even possible). FortiusOne, according to Sean, worked with cartographic professionals to create the rich (I’m sorry) map production tools. These tools are so good, in fact, that I’ve heard a couple of GIS professionals lament that they’ll be out of a job soon (of course we all know that Maker! will only increase our workloads to produce data for public consumption). What we have here are two really simple tools that allow anyone to upload geospatial content, combine that information with other datasets, and then create a wonderful-looking map that visually tells a story.
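
    To put “custom programming” in perspective, here is a sketch of the kind of hand-rolled thematic symbology you would otherwise write against the ArcGIS JavaScript API to get a comparable result. The service URL, the POP2000 field, and the map variable are hypothetical, standing in for whatever your own application uses.

    ```javascript
    dojo.require("esri.map");
    dojo.require("esri.tasks.query");

    // Pull features from a (hypothetical) cities layer and size each marker by population.
    var queryTask = new esri.tasks.QueryTask(
      "http://myserver/ArcGIS/rest/services/Demographics/MapServer/0");
    var query = new esri.tasks.Query();
    query.where = "1=1";
    query.outFields = ["POP2000"];
    query.returnGeometry = true;

    queryTask.execute(query, function (featureSet) {
      dojo.forEach(featureSet.features, function (feature) {
        var pop = feature.attributes.POP2000;
        var symbol = new esri.symbol.SimpleMarkerSymbol();
        symbol.setSize(pop > 500000 ? 14 : 7);               // crude two-class size ramp
        symbol.setColor(new dojo.Color([200, 50, 50, 0.7]));
        feature.setSymbol(symbol);
        map.graphics.add(feature);                           // assumes an existing esri.Map named "map"
      });
    });
    ```

    Maker! wraps those classification and color decisions in a UI instead, which is the whole point.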

    You can argue all day and night about what the GeoWeb is or isn’t, but I think we have an excellent example of what the GeoWeb should be right here. Finder! has discoverable web services of data (with metadata to boot), and Maker! allows you to leverage those services together to create derivative, value-added content to share with the world. Moving forward, the data in GeoCommons should support more OGC services (beyond KML) for those who need that support, and the maps created with Maker! should be more easily shared beyond just a web map. But the groundwork is there for sharing data with the world.

    Despite the lack of monkey maps, the GeoMonkey approves of Maker!

  • Amazon brings Windows (and SQL Server) to the cloud

    The Amazon Web Services Blog says that Amazon will be bringing Microsoft Windows to EC2 this fall.

    The 32 and 64 bit versions of Windows Server will be available and will be able to use all existing EC2 features such as Elastic IP Addresses, Availability Zones, and the Elastic Block Store. You’ll be able to call any of the other Amazon Web Services from your application. You will, for example, be able to use the Amazon Simple Queue Service to glue cross-platform applications together.

    This is on the heels of the Oracle/Amazon EC2 release from a couple weeks ago. Now that the tools are here, we’ll have to see how well they are adopted by corporate IT administrators who aren’t always open to giving up control of their servers to others.

    Mr. Gates saw the value of the cloud early on

  • Stopping Over-engineered GIS Applications

    I was thinking this past week about a project that we will start working on soon. Simply put, it is updating a MapObjects IMS application we deployed almost 10 years ago that is still working. When I saw that it was not only still running but still a critical part of their business workflow, it started me thinking about why such an application was so successful. It obviously wasn’t the technology. Sure, the back end runs on Oracle, but even the most ardent MOIMS supporter can’t claim that the Visual Basic application was cutting edge even back then. So that must mean there was something else going on that kept it running when most MOIMS sites are long gone.

    Won’t someone please think of the users?

    The history of GIS applications tells us one story that repeats itself again and again. There is a horrible habit of pushing over-engineered applications that are not used by the target audience because no one has time to figure out complicated tools. GIS vendors have not discouraged such habits and in some cases encourage them. The GIS world is really good at writing GIS applications for GIS professionals. I think this used to work before GIS and mapping became important in our everyday lives, but now that everyone everywhere is looking at deploying spatial applications, the focus needs to be on what the end users are going to be doing with the application.

    So back to that old MapObjects application: it did a really good job of doing what it was supposed to do. It displayed information in a context the users were comfortable working with (the interface was familiar to them), met their requirements (which were obviously well developed), fit within their websites, scaled well (even Visual Basic does that, apparently), and wasn’t an obstacle to their workflows. With MOIMS deprecated and the need to connect to more modern ESRI servers and Oracle databases, the application needs to be updated, but not because it restricts their business practices and workflows.

    Foisting this application on the users of a bus system was poorly thought out, but the Google Transit version released a few weeks ago hits the target users right on. The heavy GIS website might meet the needs of users inside the organization, but externally it really highlights missed opportunities and wasted resources. I’m personally really excited to see if we can replicate the success of the earlier MOIMS application with JavaScript APIs, KML downloads, and other new technology and still keep it simple. The key is to listen to what the client really wants and be agile enough to deliver simple, focused, and fast products.

  • On the Mississippi

    I’ve been quiet this week because I’ve been at the fiscal year end management meetings. The RSP Minneapolis office is located in the old Grain Belt Brewhouse which is just incredible office space.

    [googlemaps https://www.google.com/maps/embed?pb=!1m0!3m2!1sen!2sus!4v1484086064282!6m8!1m7!1sETpzK2QSiy-RxuJ4c-B_qw!2m2!1d44.99981173447936!2d-93.26984603232123!3f241.04912418888696!4f34.70625265220684!5f0.7820865974627469&w=600&h=450]

    The weather has been just so hit and miss. Flying in on Tuesday, we almost had to divert to Omaha, NE, and once we did get on the ground, the rain made it almost impossible to drive. But since then the weather has been very enjoyable (I suppose some might say warm, but being from Arizona I have a different definition).

  • Oracle enters the cloud (MySQL Enterprise too)

    Oracle and Amazon today announced that Oracle would be offering some of their products inside Amazon’s EC2 cloud.

    The Oracle Database 11g, Oracle Fusion Middleware, and Oracle Enterprise Manager can now be licensed to run in the cloud on Amazon EC2. Customers can even use their existing software licenses with no additional license fees.

    While I see nothing specifically about Oracle Spatial, I assume it can be licensed on the cloud as well. The benefit to everyone, outside of licensing costs, is the ability to launch the Oracle AMIs on EC2 and be up and running in no time. That, plus the scalability of EC2 (and thus Oracle), means that you don’t have to worry about hardware limitations with your applications. RSP Architects uses SQL Server as our database of choice, and while I would have been able to run Oracle in a virtual server, I no longer have to worry about hardware constraints on our development. Just license it (which of course I realize is a problem for some people) and start loading the database. I’m anxious to see how ArcGIS connects to Oracle Spatial on EC2 and what kind of performance I can expect.

    Cloudzilla could be unbeatable with Oracle in his hands

    Now, for those who want to avoid Oracle, MySQL Enterprise is in the Amazon cloud as well.

  • Navteq out, TeleAtlas in

    So Google has finally gotten around to making sure both the Google Maps API and the Google Local Search API are using the same underlying data.

    Google Maps has now switched its map data completely over to TeleAtlas from Navteq. Now Google Maps, the Google Maps API, and Google Maps for Mobile all use the same underlying data. This switch was only a matter of time given Nokia’s acquisition of Navteq.

    I’m curious to see if the change will affect any mapping applications out there that were using the Navteq data, given that the TeleAtlas and Navteq datasets are probably different. Time for Peter Batty to revisit his Google Maps vs. Google Local Search blog posts.