Today’s HWJF planning staff meeting was full of new ideas. The biggest one was that HWJF becomes a podcast. One of the biggest feedback requests has been to offer an audio-only version of the hangouts for those who want to listen on their smartphones offline. I’ve explored this many times and never really got a good plan in place. But given that HWJF isn’t really visual in nature (looking at my grill for an hour has to be taxing), we’re going to convert HWJF into a podcast for season 4, arriving in October. If there is a visual need to have video, we’ll have special hangouts on the YouTube. What this means is it won’t be live anymore, so you can’t point out how wrong I am until after the faux pas has passed.
What it will mean is the podcast should be more consumable and usable by everyone. This is an experiment so we’ll see how it goes moving forward. It will also allow more flexible scheduling of the podcast so we can have guests on who can’t attend during work hours.
The crazy thing about GIS is that we never really take version control of our data into consideration. Well, we have a workflow; it usually entails putting a “1” at the end of a file name or calling it “temp” or “trash” while it works through analysis. Whenever I used to take over a project from someone, I’d take a quick look at the project folder and see hundreds of seemingly orphaned GIS datasets just littering up the folder structure. And of course no documentation as to why they are there, how they were created, or what derivative products were created from them.
When I joined WeoGeo many moons ago, that was one thing Paul Bissett always harped on and was something WeoGeo was trying to solve. WeoGeo was approaching it from a data sales marketplace end, where data providers wanted to know what derivative works were being created from their data. But for users, the same process was needed. We tried to pitch it to users, but generally they didn’t see the value in keeping track of their datasets. I think this was shortsighted, and I still believe that WeoGeo should have been the choice of every GIS professional to maintain an authoritative data library. But alas, GIS professionals didn’t care.
We’ve watched GeoGig by Boundless for years (it appears they passed it on to the Eclipse Foundation last year; I hadn’t heard this, nor does the website show the update), waiting to see if it will succeed. It provides the kind of revision control we’re used to with programming. Boundless has a pretty good graphic below that shows what’s going on from a GIS perspective.
Bingo, right? Well, wrong… While GeoGig is actually very impressive and takes into consideration all those weird things we GIS folks do, it won’t ever go anywhere. Without being integrated into QGIS and ArcGIS Desktop, users won’t be able to fold it into their workflows. It is the same problem we ran into with WeoGeo Library: it’s just too hard to integrate into ArcGIS Desktop without Esri doing it themselves.
But the QGIS tie is interesting. Boundless is behind GeoGig. They are also a big supporter of QGIS. GeoGig seems more tied in with GeoServer right now, but editing is the big reason for GeoGig, and let’s be honest: most editing happens on the desktop. Boundless has been showing that QGIS and GeoGig work together, but as I said above, unless it is natively integrated into QGIS, it won’t see more than niche uptake. At this point, though, GeoGig is really our only big hope.
Esri has versioning in their geodatabase, but it’s a nightmare. I’ve never had good luck with it, though I’m willing to chalk that up to me not knowing a thing about what I’m doing. Geo-data is complex; Git works because it is so simple, tracking changes to plain text files. GitHub has a hard enough time working with GeoRSS, but you can see that it does work. I often have to wonder if GitHub might have GIS version control working before GIS people do. Honestly, that is what we want, right? GitHub for GIS?
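To make the “Git works because it tracks plain text” point concrete, here is a minimal sketch of one way the idea could apply to geodata: if features are serialized deterministically (sorted keys, one feature per line), an ordinary text diff tool like git can show exactly which features changed between edits. This is an illustration of the concept, not GeoGig’s actual storage model, and the feature data is made up.

```python
import json

def canonicalize(feature_collection):
    """Serialize a GeoJSON FeatureCollection deterministically:
    sorted keys, compact separators, one feature per line.
    A plain-text diff then shows exactly which features changed."""
    return "\n".join(
        json.dumps(feature, sort_keys=True, separators=(",", ":"))
        for feature in feature_collection["features"]
    )

# Two versions of the same (made-up) dataset, differing in one property
before = {"type": "FeatureCollection", "features": [
    {"type": "Feature", "id": 1,
     "geometry": {"type": "Point", "coordinates": [-112.07, 33.45]},
     "properties": {"name": "Phoenix"}},
]}
after = {"type": "FeatureCollection", "features": [
    {"type": "Feature", "id": 1,
     "geometry": {"type": "Point", "coordinates": [-112.07, 33.45]},
     "properties": {"name": "Phoenix, AZ"}},
]}

print(canonicalize(before) == canonicalize(after))  # False
```

Because the serialization is stable, the only line that differs between the two versions is the edited feature, which is exactly what a `git diff` would surface.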
The B4UFLY app, aimed primarily at model aircraft enthusiasts, is designed to give users information about restrictions or requirements in effect at their current or planned flight location. The FAA expects the beta test will yield valuable data on how well B4UFLY functions, as well as uncovering any software bugs.
So first, it’s a private beta, so you probably aren’t invited. You can email email@example.com if you want to give it a shot and ask to be in the private beta. Second, the app looks simple, but I suspect the rules are going to be complex and thus the app will be less than easy to use. Still, if you want to fly your drone, this could be the start of something good. Tomorrow’s my birthday; feel free to buy me one.
If there is one thing that gets the GeoMarketers excited, it is natural disasters. Every year at this time we see hurricane trackers, 3D storm viewers and other “exciting” products to help protect us from the wrath of mother nature. This year Google is promoting their Personalized Storm Tracker.
The safety recommendations you receive will be tailored to reflect the current status of the event and your context. For example, if you search for a specific storm when it’s still several days away, you may see a map of the developing weather event and a recommendation to start preparing an emergency kit. If the storm is only hours away from your location, you might receive a reminder to start charging your phone in case power goes out. And if you search when the storm is nearby, you’ll get the most urgent information, like how to avoid injury from fast-moving water or flying debris.
I feel like these things are more trouble than they are worth. Last Sunday I was at Sky Harbor Airport when a dust storm alert went off on all the iPhones in the baggage claim area. The whole place erupted as the alerts went off, warning us that wind and dust were headed our way. The result of this great warning? People making jokes. Sure, an alert was issued; sure, it probably is a safety thing that we all get these alerts on our phones. But the delivery isn’t personalized; it’s a broadcast message followed by 10 minutes of people joking about the alerts.
At least in the USA, storms don’t sneak up on anyone. These products are great press but of little value.
I never “celebrated” 10 years of Spatially Adjusted, mostly because I forgot about it. I was cleaning up the site earlier this week and noticed there was some good content back then; it definitely had a different tone, but hey, I’m 10 years older now. I’m going to post a “best of” link every week to a 10-year-old article for the rest of the year. Some of it will be thought-provoking[footnote]disclaimer: probably not[/footnote] and some of it will be laughable. At any rate, 10 years ago this week there were a couple of posts about hurricane tracking that were interesting given that they were about Katrina, but this one caught my eye.
All the openness in the world won’t make any product successful, but listening to your customers will. The feeling that I’ve gotten from ESRI over the past year is that they have finally begun to realize that their road to continued success is supporting users like us. Don’t confuse the hype surrounding Google Maps/Earth with them being open and listening to their customers. There is no company that likes to hide behind their logo more than Google and they will do whatever it takes to not have to be open. There is a reason people are beginning to realize that Google is the next Microsoft (while Microsoft seems to have become the next IBM). Believe me, ESRI has a LONG WAY TO GO before they are as open as we’d all like them to be, but they do listen to their customers and that is a start.
Well, the whole post is sort of like that: me claiming that Esri has been more open than Google or others. The context is that Esri had started allowing their employees to blog and contact people directly, which was a big shift from the traditional call-a-phone-number support. So we were all excited to see Esri employees blogging and responding to our articles. Eventually it all collapsed into a corporate marketing blog cycle, but at that moment it felt like Esri was changing.
Years ago in the Arc/Info world, we used to perform most of our geoprocessing in ArcInfo Workstation on Windows. But when we needed to really get work done, we’d use an HP-UX beast of a server to handle some of the more complex geoprocessing. It was really easy to do; Esri even used to have some tools to help you accomplish this. I remember thinking that very soon we’d be able to offload most geoprocessing to remote servers and just get back the results. My personal workstation wouldn’t be bogged down with processing, and the server would be doing what we paid good money for.
Well, what we didn’t know at the time was that we were talking about “GIS as a Service.” Mostly because we didn’t think of clouds as anything more than rain makers. But the idea of offloading our geoprocessing was something we would have wagered, to a person, would be built into GIS by now. Of course products like ArcGIS Server and FME Server can run processing remotely, but it is not built into workflows. You have to go out of your way to author scripts that can handle this. I’m curious why things worked out this way.
It could be that with Arc/INFO on Unix going away there weren’t servers that could handle geoprocessing. Or it could be that workstations these days are so fast that you don’t need to process remotely. Maybe I’m just old and stuck in my ways, wanting to use a Unix server for processing, maybe put a couple of Perl scripts in there and call it a day. But I think I’m disappointed that we just haven’t seen that much uptake of remote geoprocessing. The only workflow I’ve used where this was supported by the software is authoring in FME Desktop and running those workbench scripts on FME Server.
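The pattern we wanted baked into the tools can be sketched in a few lines: author the job locally, submit it to a worker pool, keep working, and collect results when they arrive. Here Python’s `concurrent.futures` executor stands in for the remote geoprocessing server, and `buffer_point` is a made-up stand-in for a heavy task (it approximates a circular buffer as a polygon); no real GIS server API is implied.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def buffer_point(job):
    """Stand-in for a heavy geoprocessing task: approximate a circular
    buffer around a point as a polygon with n vertices."""
    (x, y), radius, n = job
    return [(x + radius * math.cos(2 * math.pi * i / n),
             y + radius * math.sin(2 * math.pi * i / n))
            for i in range(n)]

jobs = [((0.0, 0.0), 1.0, 64), ((10.0, 5.0), 2.5, 64)]

# The executor plays the role of the remote server: submit the jobs,
# let them run elsewhere, collect the finished geometries.
with ThreadPoolExecutor(max_workers=4) as remote:
    results = list(remote.map(buffer_point, jobs))

print([len(ring) for ring in results])  # [64, 64]
```

Swap the executor for an HTTP client talking to ArcGIS Server or FME Server and the shape of the workflow is the same; the complaint in the post is that the desktop tools never made that swap transparent.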
I guess we always assume there will be flying cars and houses on the moon but we’re left with airport departure TVs that show the blue screen of death, smartphones that can be hacked with SMS and our credit cards being stolen left and right. The reality of GIS in 2015 is it is still enterprise work being done in a workgroup fashion. GIS isn’t taken seriously by IT because we don’t take ourselves seriously. Hiding in a corner “doing GIS” is how we’re seen by others. Time to break the mold.
Something I started in 2006 is still widely used. I created it originally when I was trying to create ArcGIS 9.1 Personal Geodatabases with ArcGIS 9.2. It wasn’t possible then to create older Geodatabases, but Esri eventually added functionality to create older versions. The reason we need these is that you can use older Geodatabases in newer versions of ArcGIS, but not the other way around. So if you are on ArcGIS 10.2 and your client is on ArcGIS 9.3, you’ll have a problem sharing data. But if you have a 9.3 version Geodatabase, you can save your data to that version and share away.
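The compatibility rule is simple enough to state as code. This is just an illustration of the rule described above (versions compared numerically, newer ArcGIS opens older Geodatabases but not vice versa), not an Esri API:

```python
def parse_version(v):
    """'9.3' -> (9, 3), so '10.2' compares as newer than '9.3'
    instead of sorting alphabetically before it."""
    return tuple(int(part) for part in v.split("."))

def can_open(gdb_version, arcgis_version):
    """A Geodatabase opens in any ArcGIS release at least as new as
    the release that created it -- never the other way around."""
    return parse_version(arcgis_version) >= parse_version(gdb_version)

print(can_open("9.3", "10.2"))   # True: newer ArcGIS reads the older Geodatabase
print(can_open("10.2", "9.3"))   # False: hence saving back to 9.3 to share
```

Note the tuple comparison: naive string comparison would rank "9.3" above "10.2", which is exactly the kind of bug these version checks invite.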
I like this archive because each one of these Geodatabases was created with that version of the software. They will work perfectly since they were natively created. So next time you need an 8.3 Geodatabase (you totally know that day will come), you’ll have a native Geodatabase to work with. Bookmark and use!
Special thanks to @GIS_katie for providing the updated blank ArcGIS 10.3 File and Personal Geodatabases.