Yesterday I posted about Chris Hogan’s walk-through of generalizing data in PostGIS to make it usable in a web app. Basically he went through the process of finding the sweet spot between quality and speed. But there are other ways to accomplish this. Mapbox happened to post about a new library called geojson-vt.
Let’s see if Mapbox GL JS can handle loading a 106 MB GeoJSON dataset of US ZIP code areas with 33,000+ features shaped by 5.4+ million points directly in the browser (without server support):
Wait, what?! A few seconds loading the data, and you can browse the whole data set smoothly and seamlessly. But how exactly does that work? Let’s find out.
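The short version: geojson-vt slices the raw GeoJSON into vector tiles right in the browser, simplifying and clipping each tile as you pan and zoom, so the renderer only ever touches what’s on screen. Here’s a minimal sketch of the library’s API, assuming a hypothetical zipcodes.geojson file; the option values are illustrative, not the ones from the Mapbox demo:

```typescript
// A minimal sketch of using geojson-vt to slice a large GeoJSON file into
// vector tiles in the browser. The zipcodes.geojson URL and option values
// are made up for illustration.
import geojsonvt from "geojson-vt";

const geojson = await fetch("zipcodes.geojson").then((r) => r.json());

// Build the tile index once. geojson-vt projects the data, then cuts and
// simplifies features lazily as individual tiles are requested.
const tileIndex = geojsonvt(geojson, {
  maxZoom: 14,            // max zoom to preserve detail on
  indexMaxZoom: 5,        // max zoom to precompute in the initial index
  indexMaxPoints: 100000, // max points per tile in the initial index
  tolerance: 3,           // simplification tolerance (higher = simpler)
  buffer: 64,             // tile buffer to avoid edge artifacts
});

// Pull a single tile; features come back in tile coordinates (0..4096),
// ready to hand to a renderer like Mapbox GL JS.
const tile = tileIndex.getTile(6, 12, 25); // z, x, y
console.log(tile ? tile.features.length : "empty tile");
```

The trick is laziness: only a coarse index is built up front, and deeper tiles are sliced on demand, which is how the initial load stays at a few seconds even for a 100 MB file.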
So that’s actually pretty amazing. We all know what raw GeoJSON does in the browser and how it impacts the speed of map drawing. 100 MB+ of data rendering this quickly? Impressive. Read the whole post to see how they do it and the details on how to start using it. The only limitation is that it requires mapbox-gl-js or Mapbox Mobile[footnote]which is actually a big limitation if you think about it[/footnote].

UPDATE: Per Tom MacWright:
Still, this comes down to using tools that make your mapping products better. Maybe Mapbox does that cheaper and quicker than you could on your own. This kind of on-the-fly simplification is what we’ve all been asking for, and Mapbox is really pushing the envelope. This could be what gets people to start using their platform.
10 years ago this week Katrina rolled in, and there were lots of posts on Spatially Adjusted about DigitalGlobe and Google Maps imagery being updated for the flooding. But the post that caught my attention was this one on ArcScripts:
Can someone at ESRI please clean up the ArcScripts site? Plain as day on the ESRI ArcScripts upload page it says “Not for samples or demos of products sold at Web sites”. There are way too many products that are commercial in there and this latest one takes the cake. 15 days and then you have to buy it, what a joke. If you have to advertise, do it by buying ad space, not polluting the ArcScripts gallery. Geospatial Enterprises is off my list of companies I’ll deal with. XTools Pro 3.0 is also a commercial product that tries to get around by offering some free tools, but it too is just a demo. Someone over at ESRI needs to get serious about cleaning this junk up and off the ArcScripts.
I mean, how shady was XTools Pro anyway? The original XTools on ArcView 3.x was open, free, and a great tool. Then some guys basically rebranded it for ArcGIS Desktop and started charging money for it. Oh well, the madness of ArcScripts is over, as is the need for tools like XTools. Still, it’s funny to think this was how we shared scripts and applications back then. No GitHub or other platforms to help. Life was so hard back then and we didn’t realize it!
Indoor mapping is the white whale of our Spatial IT industry. We’re always reading about how our smartphones will lead us to the best deals or how we can find the specific nail we need in Home Depot without having to ask anyone or walk down every aisle. The key to all this is essentially iBeacon.
You can search Google News for all the latest excitement on the concept, but essentially it is a way for your phone to know where things are, and for vendors to know where your phone is, via Bluetooth. Imagine walking into a store, getting an alert that your favorite beer is on sale, and then being able to navigate directly to it. Sexy, right? Plus we’ve been anticipating this happening for years. Except…
Google was set to launch a new product that added context to one of its most successful apps, Google Maps. But earlier this year, it was shut down by Alphabet CEO Larry Page, according to people familiar with the project.
Google Here worked by sending a notification to a smartphone user’s lock screen within five seconds of their entering a partner’s location. If the user clicked on the notification, a full screen HTML5 “app” experience would launch. Google Here would know when to send the notification via Google Maps and beacons placed in the stores of participating partners. Google planned to supply the beacons to partners for the launch, according to the document. The experience could also be found by going to the Google Maps app.
Exactly what we thought everyone wanted. In testing, the application was deemed too invasive and Google feared no retailers would sign up. That’s right: Google didn’t think they could get their partners to install cheap beacons in their stores, AND they feared they were being too Big Brother. Seems weird, doesn’t it? If there is one company that can get companies to spend money on ads, it is Google. And since when did Google ever think pushing ads on us was “invasive”?
The magic of Google Here (Here as in not the Here that was owned by Nokia) was that you didn’t need an app running for it to work. Think about that for a minute: ads would appear on your phone based on where you were, and you didn’t need to opt in to get them. Now we see why Google was very concerned that Here was going to get a large backlash. Being able to push ads on users would have been something they could have sold well to companies; I’m not sure there would have been any fear of companies not wanting to push ads on us.
Beacons are still very important to Google. Their Eddystone project talks about lots of uses for beacons, but not ad delivery. Clearly there was feedback on this project, and it jolted Google out of their usual sell-more-ads business model. I think beacons will be very valuable as they start appearing in more areas, but I for one don’t need an ad for fabric softener every time I walk into a Target.
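For the curious, an Eddystone beacon is nothing more than a tiny Bluetooth LE advertisement payload your phone can pick up passively. Here’s a rough sketch of decoding an Eddystone-URL frame, following the published spec; the sample frame bytes at the bottom are invented for illustration:

```typescript
// A rough sketch of decoding an Eddystone-URL advertisement frame per the
// published Eddystone spec; the sample bytes below are made up.
const SCHEMES = ["http://www.", "https://www.", "http://", "https://"];
const EXPANSIONS = [
  ".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
  ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov",
];

function decodeEddystoneUrl(frame: Uint8Array): string {
  if (frame[0] !== 0x10) throw new Error("not an Eddystone-URL frame");
  // frame[1] is the calibrated TX power at 0 m, used for distance estimates.
  let url = SCHEMES[frame[2]];
  for (const byte of frame.slice(3)) {
    // Low byte values are shorthand codes for common URL fragments;
    // everything else is plain ASCII.
    url += byte < EXPANSIONS.length ? EXPANSIONS[byte] : String.fromCharCode(byte);
  }
  return url;
}

// 0x10 = URL frame, 0xba = TX power, 0x03 = "https://", then "goo.gl/".
const sample = Uint8Array.from([0x10, 0xba, 0x03, 0x67, 0x6f, 0x6f, 0x2e, 0x67, 0x6c, 0x2f]);
console.log(decodeEddystoneUrl(sample)); // https://goo.gl/
```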
I am working on a project that needs to display all the neighborhood polygons in Baltimore City at one time. The file is relatively detailed… which means that tons of unnecessary polygon nodes are being sent from the backend when, at the zoom level and the level of detail the map users need, the high level of detail is a total waste.
While there are some great hosted options to serve up complex GeoJSON, most of the time you’re better served (no pun intended) by simplifying your data. Unless you’re surveying or involved with some sort of lawyer, even a bit of generalization is a good idea with online mapping. Chris does a great job showing how you can modify the tolerance to get your results looking great while saving lots of bandwidth. If you’re a generalization newbie, you should read his example and get a better understanding of how it works.
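Chris runs the experiment in PostGIS, but the same tolerance test is easy to do in JavaScript land. A minimal sketch using Turf’s simplify module (built on simplify-js, a Douglas-Peucker style implementation); the file name and tolerance values are my own illustration, not from his post:

```typescript
// A minimal sketch of the tolerance experiment with @turf/simplify.
// The baltimore-neighborhoods.geojson file and tolerances are made up.
import simplify from "@turf/simplify";

const neighborhoods = await fetch("baltimore-neighborhoods.geojson").then(
  (r) => r.json()
);

// Tolerance is in the units of the data (degrees for WGS84). Nudge it up
// until the map still looks right at your target zoom; watch the payload
// size fall as nodes are removed.
for (const tolerance of [0.0001, 0.001, 0.01]) {
  const simplified = simplify(neighborhoods, { tolerance, highQuality: false });
  console.log(tolerance, JSON.stringify(simplified).length, "bytes");
}
```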
And if you’re an Esri user, the same concepts can be used in their stack as well.
Yesterday I had a long post about GIS and version control. I mentioned Git in the article, saying that maybe in the future Git would work with GIS files. A couple of people mentioned an article written by Gretchen Peterson titled “Huge increase in shareability by combining Git and QGIS”.
This isn’t the version control you’re looking for…
Gretchen does a great job showing how you can manage GIS projects with Git, and I encourage everyone to read it. But keep in mind it isn’t version control in the sense I was talking about. Git doesn’t understand shapefiles and other binary GIS files. It will show that a shapefile was updated, but it won’t show what you updated in it, nor will it help reconcile updates. Gretchen is using Git to help her share projects with others, which it does a great job of, but it isn’t geodata version control, and she outlines that clearly in the post.
We’d all love to see GitHub support shapefiles, but I seriously doubt it will ever happen. GitHub does, however, render GeoJSON files, which isn’t half bad.
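In the meantime, the practical workaround is to keep a text representation next to the binaries so Git has something to diff. A hedged sketch using Mike Bostock’s shapefile npm package; the file names are made up:

```typescript
// Convert a binary shapefile to newline-delimited GeoJSON so Git can diff
// it feature by feature. File names are illustrative.
import { writeFileSync } from "node:fs";
import * as shapefile from "shapefile";

const collection = await shapefile.read("parcels.shp", "parcels.dbf");

// One feature per line with stable serialization keeps diffs small and
// readable; re-serializing with a different key order would create noise.
const lines = collection.features.map((feature) => JSON.stringify(feature));
writeFileSync("parcels.ndjson", lines.join("\n") + "\n");

// Now `git diff parcels.ndjson` shows exactly which features changed,
// even though the .shp itself remains an opaque binary blob.
```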
Today’s HWJF planning staff meeting was full of new ideas. The biggest one was that HWJF becomes a podcast. One of the biggest feedback requests has been to offer an audio-only version of the hangouts for those who want to listen on their smartphones offline. I’ve explored this many times and never really got a good plan in place. But given that HWJF isn’t really visual in nature (looking at my grill for an hour has to be taxing), we’re going to convert HWJF into a podcast for season 4, arriving in October. If there is a visual need for video, we’ll have special hangouts on the YouTube. What this means is it won’t be live anymore, so you can’t point out how wrong I am until after the faux pas has passed.
What it will mean is the podcast should be more consumable and usable by everyone. This is an experiment so we’ll see how it goes moving forward. It will also allow more flexible scheduling of the podcast so we can have guests on who can’t attend during work hours.
The crazy thing about GIS is that we never really take version control into consideration with our data. Well, we have a workflow; it usually entails putting a “1” at the end of a file name or calling it “temp” or “trash” while it works through analysis. Whenever I used to take over a project from someone, I’d take a quick look at the project folder and see hundreds of seemingly orphaned GIS datasets just littering up the folder structure. And of course no documentation as to why they are there, how they were created, or what derivative products were created from them.
When I joined WeoGeo many moons ago, that was one thing Paul Bissett always harped on and something WeoGeo was trying to solve. WeoGeo was approaching it from the data sales marketplace end, where data providers wanted to know what derivative works were being created from their data. But for users, the same process was needed. We tried to pitch it to users, but generally they didn’t see the value in keeping track of their datasets. I think this was shortsighted, and I still believe that WeoGeo should have been the choice of every GIS professional to maintain an authoritative data library. But alas, GIS professionals didn’t care.
We’ve watched GeoGig by Boundless (it appears they passed it on to the Eclipse Foundation last year; I hadn’t heard this, nor does the website show the update) for years, waiting to see if it will succeed. It provides the kind of revision control we’re used to with programming. Boundless has a pretty good graphic below that shows what’s going on from a GIS perspective.
Bingo, right? Well, wrong… While GeoGig is actually very impressive and takes into consideration all those weird things us GIS folks do, it won’t ever go anywhere. Without it being integrated into QGIS and ArcGIS Desktop, users won’t be able to fit it into their workflows. It’s the same problem we ran into with WeoGeo Library: it’s just too hard to integrate into ArcGIS Desktop without Esri doing it themselves.
But the QGIS tie is interesting. Boundless is behind GeoGig, and they are also a big supporter of QGIS. GeoGig seems more tied in with GeoServer right now, but editing is the big reason for GeoGig, and let’s be honest: most editing happens on the desktop. Boundless has been showing that QGIS and GeoGig work together, but as I said above, unless it is natively integrated into QGIS, it won’t see more than niche uptake. At this point, though, GeoGig is really our only big hope.
Esri has versioning in their geodatabase, but it’s a nightmare. I’ve never had good luck with it, though I’m willing to chalk that up to me not knowing a thing about what I’m doing. Geodata is complex; Git works because it is so simple, tracking simple changes in text files. GitHub has a hard enough time working with GeoRSS, but you can see that it does work. I often have to wonder if GitHub might have GIS version control before GIS people get it working. Honestly, that is what we want, right? GitHub for GIS?
The B4UFLY app, aimed primarily at model aircraft enthusiasts, is designed to give users information about restrictions or requirements in effect at their current or planned flight location. The FAA expects the beta test will yield valuable data on how well B4UFLY functions, as well as uncovering any software bugs.
So first, it’s a private beta, so you probably aren’t invited. You can email b4ufly@faa.gov if you want to give it a shot and ask to be in the private beta. Second, the app looks simple, but I suspect the rules are going to be complex and thus the app will be less than easy to use. Still, if you want to fly your drone, this could be the start of something good. Tomorrow’s my birthday; feel free to buy me one.
If there is one thing that gets the GeoMarketers excited, it is natural disasters. Every year at this time we see hurricane trackers, 3D storm viewers, and other “exciting” products to help protect us from the wrath of Mother Nature. This year Google is promoting their Personalized Storm Tracker.
The safety recommendations you receive will be tailored to reflect the current status of the event and your context. For example, if you search for a specific storm when it’s still several days away, you may see a map of the developing weather event and a recommendation to start preparing an emergency kit. If the storm is only hours away from your location, you might receive a reminder to start charging your phone in case power goes out. And if you search when the storm is nearby, you’ll get the most urgent information, like how to avoid injury from fast-moving water or flying debris.
I feel like these things are more trouble than they are worth. Last Sunday I was at Sky Harbor Airport when a dust storm alert went off on all the iPhones in the baggage claim area. The whole place erupted in noise as the alerts went off, warning us that wind and dust were headed our way. The result of this great warning? People making jokes. Sure, an alert was issued, and sure, it probably is a safety thing that we all get these alerts on our phones. But the delivery isn’t personalized; it’s a broadcast message followed by 10 minutes of people joking about the alerts.
At least in the USA, storms don’t sneak up on anyone. These products are great press but of little value.
I never “celebrated” 10 years of Spatially Adjusted, mostly because I forgot about it. I was cleaning up the site earlier this week and noticed there was some good content back then; it definitely had a different tone, but hey, I’m 10 years older now. I’m going to post a “best of” link every week to a 10-year-old article for the rest of the year. Some of it will be thought-provoking[footnote]disclaimer: probably not[/footnote] and some of it will be laughable. At any rate, 10 years ago this week there were a couple of posts about hurricane tracking that were interesting given that it was about Katrina, but this one caught my eye.
All the openness in the world won’t make any product successful, but listening to your customers will. The feeling that I’ve gotten from ESRI over the past year is that they have finally begun to realize that their road to continued success is supporting users like us. Don’t confuse the hype surrounding Google Maps/Earth with them being open and listening to their customers. There is no company that likes to hide behind their logo more than Google and they will do whatever it takes to not have to be open. There is a reason people are beginning to realize that Google is the next Microsoft (while Microsoft seems to have become the next IBM). Believe me, ESRI has a LONG WAY TO GO before they are as open as we’d all like them to be, but they do listen to their customers and that is a start.
Well, the whole post is sort of like that: me claiming that Esri has been more open than Google or others. The context here is that Esri had started allowing their employees to blog and contact people directly; it was a big shift from the traditional call-a-phone-number support. So we were all excited to see Esri employees blogging and responding to our articles. Eventually it all collapsed into a corporate marketing blog cycle, but at that moment we really felt like Esri was changing.