JavaScript instead of Python
As a long-time Mac user I’ve used AppleScript to automate many workflows. Now AppleScript is pretty powerful, but it’s unique (well, I’ve always thought it was a lot like HyperTalk, but that’s pretty unique too).
Well, with the next version of Mac OS X, Yosemite, Apple is looking at allowing JavaScript to be used for automation instead of AppleScript. So I mused on Twitter this morning:
https://twitter.com/jamesmfee/status/476382675251367938
Now let’s be honest, GIS and Python have a huge love affair going right now. But despite all of JavaScript’s “issues” (as everyone continues to point out, it has floating-point rounding quirks), there are great workarounds for most of them. Using JavaScript on both the server and the front end of an application is so simple and logical that Python starts to look almost niche, the way FORTRAN did in the 1990s.
I’m not sure I would tell anyone in GIS not to learn Python; it is critically important today and most likely will be for years. Just that you should be putting as much time into JavaScript as into Python, and be ready for the jump really soon. You’ll be talking about Python in a couple of years the way I talk about Perl.
Mapbox GL for iOS
So Apple Maps is still a disaster. But:
During WWDC Apple noted that there are no less than 680,000 apps in the App Store that use location data
That’s a lot, and most of them end up using Apple’s Maps API, mostly because it’s easy (or lazy; I would be lazy). Last week Mapbox released a solution, though, that might be easier than Apple’s and a whole lot better:
We just released Mapbox GL — a new framework for live, responsive maps in every iOS app. Now developers can have the most detailed maps sourced from ever-updating OpenStreetMap data, as well as the ability to fully control the style and brand to design maps that perfectly match their app. This is all done using our new on-device vector renderer, which uses OpenGL ES 2.0 technology for pixel-perfect map design, from antialiased fonts to polygon blurring, all hardware-accelerated and optimized for mobile devices — and all on the fly.
Of course it is open source and available on GitHub. I haven’t had a chance to play with it yet, but I hope to sit down with it soon. It looks great, it’s built on OSM, and it’s released open source, so no matter what happens to Mapbox¹ I’ll still have access to the mapping library. Really amazing stuff.
¹ Not that I know a thing, but it’s always a good question ↩︎
Indoor Positioning with Apple’s M7 processor
Indoor positioning is hard. The amount of effort that has been put into it since smartphones first took off has given us marginal results. WiFi, LTE/3G, and GPS get close, but they’re still not accurate enough to tell me that the butter is right in front of my face. I’ve experienced Apple’s iBeacon technology in the Apple Store and at AT&T Park in San Francisco, but that’s a tad proprietary. Apple may have a solution, though (still proprietary, but what do I care, I’m all iOS):
In iOS 8, Apple is adding some new Core Location features that let app developers get precise indoor positioning data from an iOS device’s sensors and it’s even letting venues contribute by signing-up to get help enabling indoor positioning.
Yeah, so it’s iOS only, but get a load of this:
Up until now, CoreLocation has been using a combination of Cellular, GPS, and WiFi technology in order to provide developers with location information from their users. Those technologies can get you within a city block or in some cases close to or inside a venue, but they aren’t enough to provide accurate positioning indoors or features like indoor navigation. That’s why with iOS 8, Apple is introducing new features for the CoreLocation API that will let developers tap into an iPhone’s M7 processor and motion sensors in order to get accurate indoor location, navigation, and floor numbers.
It’s actually very interesting. To save power, once your location is determined GPS is turned off and Core Location uses the motion coprocessor to figure out where you are walking and what’s nearby. The drawback, according to 9to5Mac, is that venues are going to be required to provide Apple with their floor plans and RF information. Not exactly open…
“Big Player” Gives Free Helpings of ArcGIS Online to Kids
via APB
I can’t help but laugh at this news a bit. I’m not an educator, but read up:
The dominant player in the world of geographic information systems is making free accounts to its advanced mapping software available to an estimated 100,000 K-12 schools in the U.S.
So free accounts of something that adults can’t even figure out how to use. I’m sure that will work out great. My son already thinks GIS is too complicated; now he might have to figure out how to manage Esri credits with ArcGIS Online? I admit, though, I’ve been really busy lately, so I haven’t kept up on the cost of AGOL:
In its release, Esri says the value of an account for the software is $10,000, leading to the $1 billion valuation for the entire donation ($10,000 × 100,000 U.S. K-12 schools).
AGOL costs $10,000? Oh, well played, Esri! You’re a “big player”!