We’ve been pushing along with our latest release of Smart World Professional, and we’ve got some elegant tools coming online that will showcase why we went the route we did. One of the biggest requests from the AEC industry is shadow/solar analysis. There are tons of tools that do this, but they can be expensive, hard to use, and even harder to find data for. That’s why I really like the Mapbox Unity SDK and its 3D buildings.
You can, of course, load your own buildings into Smart World Professional, but this is a great way to see, out of the box with our tools, how your projects will impact the surrounding city. The Unity SDK really shines with this kind of work.
I was just thinking about this on my flight back from Cityzenith HQ. In the spring of 1993 I was first exposed to what I later learned was called GIS, because some guy in Canada had the smarts to label what we were all learning to do. My exposure in college came mostly because the pen-and-paper cartography I was learning didn’t work out for me; I’d been a computer geek since my Dad brought home a TI-99/4A and I discovered you could write your own applications.
Of course, I bought my stupid pen set for a considerable amount of money (at least to a college student back then) and tried to draw Puget Sound with any degree of accuracy. Seriously though, if you’ve ever seen me draw even a circle or a square, you can pretty much guess what this “Puget Sound” looked like (I recall it looked like an eggplant crossed with a maple leaf). I remember sitting in the computer lab for my statistics class and noticing a copy of Aldus Freehand on one of the Macintosh computers. I fired it up, drew a damn good-looking Puget Sound, and submitted it to the teacher. I didn’t even get an F; he gave me an incomplete, which was to be expected back then. Computers were for spreadsheets and reports, not cartography.
But while this was going on, I had zero idea that this problem had already been solved by many people, including those Canadians I mentioned earlier. It took two more years, the mid-decade US census, and an internship at the City of Mesa, AZ planning department before I was exposed to Arc/INFO 5 and ArcView 2 on Unix and saw the world of points, polylines, and polygons. There I was, creating maps in Freehand and throwing SPSS tables on top of them: GIS as we know it today, but I had zero idea what the heck I was doing. Having started as an Economics major, I had the concept of table plus pie chart/line graph down pat. I just replaced the chart with a hand-drawn map.
This week has been about sun angles, shadow analysis, and time sliders in Unity, but it’s hard not to think back to a time when my struggle to draw a map by hand introduced me to computer maps and eventually to a career spanning basically 25 years. I can see an alternate universe where 1993 James sucked it up, worked hard to draw the water body by hand, and became a city planner in some small midwestern city. Thank god that didn’t happen, and I was able to grow from Arc/INFO -> ArcView -> PC ARC/INFO -> ArcInfo Workstation -> ArcGIS Desktop -> FME Desktop -> uDig -> QGIS and then on to the multitude of open source libraries I use in my day-to-day workflows.
While that is a fun path to think about, I get excited about my next 25 years in spatial. Working with Unity, OSM, Elastic, and AWS Lambda, I can see what an exciting future lies ahead for what we do. The world loves what we do, and we’re lucky enough to be able to do it every day. Don’t let anyone make you feel bad for using software or products because they’re not “hipster” or “open.” If you’re pushing the bar forward, that’s good enough.
I was thinking this morning about how much of my professional life has been about vector data. From the moment I started using Macromedia Freehand in college in the early 90s (before I had even heard of GIS) to make maps, to the 3D work I’m doing with Unity and Cityzenith, I’ve used vector data. I wasn’t genuinely introduced to raster data until I started using Arc/INFO 5 at my first internship and working with grids, and even then it was still about coverages and typing “build” or “clean” again and again. We did a bunch of raster analysis with Arc, but mostly it was done in Fortran by others (I never was able to pick up Fortran for some reason, probably for the best in the long run).
It’s easy to see and use vectors in professional spatial work, for sure. I always feel like Neo from The Matrix, looking at features in the world and mentally classifying them as vectors:
- Bird -> point
- Electrical transmission line -> line
- House -> polygon
Heck, think of how a bird might be a point (a sighting), a line (a migratory pattern), or a polygon (a range). So damn nerdy, and my wife fails to see the fun in any of this. Again, like Neo when he finally sees the Matrix for what it truly is, we see things as the basic building blocks of vector data.
As I fly to Chicago this morning and stare out the window of the airplane, I can’t help but think of rasters, though. Sort of like that hybrid background we throw on maps, the world beneath me is full of opportunities to create vectors. Plus I bet we could run some robust agriculture analysis (assuming I even knew what that was) to boot. The world is not full of 1s and 0s but full of rasters and vectors.
As I’m a point, traveling a line on my way to a polygon, I can’t help but appreciate the spatial world that has been part of my life for over 20 years. I can’t help but think the next 20 is going to be amazing.
Yes, everyone knows about What3Words. It was an attempt to come up with an easy way to assign addresses to places that have none. In the end, a proprietary addressing system will never gain traction, and of course the inevitable eventually happened. My personal feeling is that What3Words never really got us beyond x/y numbering, and the logic behind an addressing system just wasn’t there. Enter Plus codes, which come at this problem from a different perspective. There is a very detailed analysis of existing methods and why they chose to go this direction that I’ll leave to you to read.
Probably the biggest reason to pay attention is that this open addressing system was developed by Google. In fact, they are already implementing it in India as we speak, which goes a very long way toward making this happen.
All these systems are built on the idea that the world is a grid, and how deeply you drill down into that grid is your address. Things need not be a single point; they can be an area, which opens up many exciting ideas for addressing, especially outside of North America and Europe. Check out the GitHub project to learn more.
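To make the grid idea concrete, here’s a rough sketch of how a standard 10-digit Plus code gets built: shift latitude and longitude into positive ranges, slice the world into a base-20 grid, and interleave the digit pairs. This is a simplified illustration of the open spec, not Google’s reference library (which lives in the GitHub project); the function name is my own.

```python
# Simplified sketch of Open Location Code ("Plus code") encoding.
# Illustrative only; use the reference library for real work.

ALPHABET = "23456789CFGHJMPQRVWX"  # the base-20 digit set from the spec
CELL = 0.000125  # degrees per cell for a 10-digit code (20 / 20**4)

def encode_plus_code(lat, lng):
    """Encode a lat/lng into a standard 10-digit Plus code string."""
    # Shift into positive ranges: latitude to [0, 180), longitude to [0, 360)
    lat_i = int((min(max(lat, -90.0), 90.0) + 90.0) / CELL)
    lat_i = min(lat_i, 180 * 8000 - 1)  # clamp lat = 90 into the last row
    lng_i = int(((lng + 180.0) % 360.0) / CELL)
    # Peel off five base-20 digit pairs, least significant first
    chars = []
    for _ in range(5):
        chars.append(ALPHABET[lng_i % 20])
        chars.append(ALPHABET[lat_i % 20])
        lat_i //= 20
        lng_i //= 20
    code = "".join(reversed(chars))  # most significant pair first
    return code[:8] + "+" + code[8:]  # '+' separates the final pair
```

Each extra digit pair shrinks the cell by a factor of 20 on each axis, which is exactly the “how deeply you drill down into the grid” idea: a short code names a region, a long code names a doorstep.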
This time of year, nothing says March more than March Madness (well, maybe Easter, when Easter falls in March). Lots of companies play off the theme, and it appears Mapbox is jumping on the sports bandwagon itself with Map Madness.
I’ve clicked the play button myself, and I’m waiting for the start. I have no idea what the challenge will be, but I’m up for any of the prizes. The Mapbox team is full of brilliant people, so I can only assume the challenges will be more than just guessing where a satellite photo was taken.
Side note, my Sun Devils somehow made the tourney. Forks up!
Good news for us Elastic users:
> Several geospatial systems use Well Known Text (WKT) as their preferred/only format for geospatial objects. What if you wanted to use Elasticsearch for your geospatial data though? Until 6.2, Elasticsearch has only provided the option of providing shapes in GeoJSON format. To get your WKT data into Elasticsearch, you may have to go through a complicated export + conversion process. No longer! You can now index a shape in a WKT string directly to Elasticsearch.
I’ve been using WKT quite a bit because it supports curves, and now I can load it natively into Elasticsearch without converting it beforehand. There is much here to think about for sure!
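As a sketch of what this looks like in practice: with a field mapped as `geo_shape`, the document you index can carry the WKT string as-is. The index and field names below (`parcels`, `boundary`) are made up for illustration, and the snippet only builds the JSON body you would send; it doesn’t talk to a cluster.

```python
import json

def wkt_polygon(ring):
    """Build a WKT POLYGON string from (lon, lat) pairs, closing the ring if needed."""
    pts = list(ring)
    if pts[0] != pts[-1]:
        pts.append(pts[0])  # WKT rings must end where they start
    coords = ", ".join(f"{lon} {lat}" for lon, lat in pts)
    return f"POLYGON (({coords}))"

# With a mapping like {"boundary": {"type": "geo_shape"}}, this document can be
# indexed directly in Elasticsearch 6.2+; no WKT-to-GeoJSON conversion step.
doc = {
    "name": "Downtown parcel",
    "boundary": wkt_polygon([(-112.07, 33.45), (-112.06, 33.45),
                             (-112.06, 33.46), (-112.07, 33.46)]),
}
print(json.dumps(doc))
```

Before 6.2, the `boundary` value would have had to be a GeoJSON geometry object; now either form works in the same field.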
I know, I used the buzzword IoT in my title above. Stay with me though! We think of IoT as a link between a physical device (your Nest thermostat, for example) and the digital world (the Nest app on your iPhone), but it is so much more. While we have been working with many IoT providers, such as Current by GE, we’ve also fundamentally changed how our backend APIs work to embrace this messaging and communication platform.
Using AWS IoT services, everything that happens in our backend API can alert our front-end apps to its status. This ties very nicely into our Unity front-end Smart World Professional application because it can tell you exactly what is happening to your data. Uploading a detailed Revit model? The conversion to glTF occurs in the background, but you know exactly where the process is and exactly what is going on. Those throbber graphics web apps throw up while they wait for a response from an API are worthless. Is the conversion process two-thirds of the way through, or just 10%? Makes a big difference, don’t you think?
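As a sketch of what that looks like on the wire, here is the kind of status message a long-running conversion job might publish over an MQTT-style topic as it progresses. The topic layout and field names here are my own invention for illustration, not Cityzenith’s actual schema.

```python
import json

def job_status_message(job_id, step, percent):
    """Build a topic and JSON payload for a long-running backend job.

    Hypothetical schema: any client subscribed to jobs/<id>/status can
    render real progress instead of an indeterminate throbber.
    """
    topic = f"jobs/{job_id}/status"
    payload = json.dumps({
        "job": job_id,
        "step": step,        # e.g. "revit-to-gltf"
        "percent": percent,  # 0-100, so a client can show 66% vs 10%
    })
    return topic, payload

# The backend publishes this as the conversion advances; subscribers
# (our Unity app, or anyone else's tool) just listen on the topic.
topic, payload = job_status_message("abc123", "revit-to-gltf", 66)
```

Because the message is a plain topic plus a small JSON payload, nothing about it is specific to our front end; that is what makes the status stream consumable from other tools.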
Where this really starts to matter is our analytics engine, Mapalyze. If I’m running a line-of-sight analysis for a project in downtown Chicago, there is a ton going on, from the 3D models of all the buildings to the trees, cars, and everything else that affects what you can and can’t see. Or take detailed climate analysis, where there are so many variables, from the sun to weather (wind, temperature, rain) to human impacts, that these models can take a very long time to run. By building the AWS IoT platform into our backend, we can provide status updates to any app, not just ours. So if you want to call Smart World Professional’s Mapalyze from within Grasshopper or QGIS, you won’t get a black box.
In the end, what this means is that Smart World Professional is “just another IoT device” that you will be able to bring into your own workflows. That’s really how this is all supposed to work, isn’t it? For those who want to dig deeper into how we’re doing this, read up on MQTT; there is a standard under here that everyone can work with, even if you’re not on the AWS platform.