Tag: aws

  • IoT is not About Hardware

    When you think about IoT, you think about little devices everywhere doing their own thing: Nest thermostats, Ring doorbells, Honeywell environmental controls, Thales biometrics. You imagine hardware. Sure, the “I” part of IoT conveys some sort of digital aspect, but we picture the “things.” The simple truth of IoT, though, is that the hardware is a commodity; the true power is in the “I” part, the messaging.

    IoT messages can inundate us but they are the true power of these sensors

    IoT messages usually travel over HTTP, WebSockets, or MQTT, or some derivative of them. MQTT is the one I’m always most interested in, but anything works, which is what is so great about IoT as a service. MQTT is leveraged heavily by both AWS IoT and Azure IoT, and both services handle messaging so well that you could use either in place of something like RabbitMQ (which my daughter loves because of the rabbit icon). I could write a whole post on MQTT, but we’ll leave that for another day.
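
    As a concrete sketch of the publish/subscribe pattern MQTT uses, here is a minimal example with the paho-mqtt Python library; the broker address and topic name are placeholders, not a real deployment.

    ```python
    # pip install paho-mqtt  (1.x-style API shown)
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.com"                 # placeholder broker
    TOPIC = "sensors/thermostat-01/temperature"   # hypothetical topic

    def on_message(client, userdata, msg):
        # Every message is just a topic plus a small payload; no human involved.
        print(f"{msg.topic}: {msg.payload.decode()}")

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.subscribe(TOPIC)

    # Publish the kind of short machine-to-machine packet IoT runs on.
    client.publish(TOPIC, "72.5")
    client.loop_forever()
    ```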

    IoT itself is built on this messaging. Individual hardware devices have UIDs (unique identifiers) that make each one uniquely addressable. The packets of information sent back and forth between the device and the host are short messages that require no human interaction.

    The best part of this is that you don’t need hardware for IoT at all. Everything you want to interact with can be an IoT message, whether it is an email, a data query, or a text message. Looking at IoT as more than just hardware opens up connectivity opportunities that were much harder to realize in the past.
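
    To illustrate, here is a minimal sketch of what a non-hardware “thing” might look like as a message; the field names and values are my own invention, not any particular schema.

    ```python
    import json
    import uuid
    from datetime import datetime, timezone

    # A "thing" doesn't have to be hardware: here a nightly database query
    # plays the role of the device. The names below are invented.
    message = {
        "uid": "query-runner-01",          # stable UID for the "thing"
        "message_id": str(uuid.uuid4()),   # each packet uniquely identified
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload": {"rows_returned": 1284, "status": "complete"},
    }

    # Short, self-describing, and needs no human in the loop.
    print(json.dumps(message))
    ```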

    Digital twins require connectivity to work. A digital twin without messaging is just a hollow shell; it might as well be a PDF or a JPG. But connecting all the infrastructure of the real world to a digital twin replicates the real world in a virtual environment. Networks collect data and store it in databases all over the place. Sometimes these are SQL-based, such as Postgres or Oracle; other times they are as simple as SQLite or flat text files. Either way, the data should be treated as messages passing back and forth between clients.

    All I see is IoT messages

    When I look at the world, I see messaging opportunities: ways to connect devices to each other. Seeing the world this way opens up new ways to bring data into digital twins (think of GIS services as IoT devices) and much easier ways to get more out of your digital investments.

  • Long Term Storage of Spatial Data

    Following on from yesterday’s blog post, I’m also concerned about where I’m storing the data. Until this month I stored the data in Dropbox. I can’t recall when I signed up for Dropbox, but I’ve probably paid them over $1,000 for the privilege of using their service. As with most SaaS products, they started out trying to help consumers and then pivoted to the enterprise. That’s what Dropbox is doing, and I’m tired of it. Their client software is just a hack, and there are too many other solutions that better fit my budget needs than a stand-alone cloud storage service.

    So as of May 2020, I no longer pay Dropbox $99/year. I’ve moved all my data to iCloud because I already pay for 2 TB of storage there (on the family plan) and it integrates better with my workflows. I could have put it in Google Drive too, but I’ve never liked how it works, which is a shame because it is easy to share with other users. But this isn’t archival by any means. All I’m doing is putting data on a hard drive, albeit a virtual hard drive in the cloud. Sure, it gets backed up, but there isn’t any check to make sure my daughter doesn’t drag the data to the trash and click empty. A true archival service makes the data much safer than just storing it in a folder.

    Now back in the old days, we used to archive off to DLT tapes and then send those offsite to a place like Iron Mountain. Eventually you’d realize you needed a restoration, and the IT guy would request that the tape (or tapes) come back from offsite and restore them to a folder you could access. Hopefully they were in a format you could read; that generally wasn’t too much of a problem, though there is a reason we kept a Sun workstation around in case we needed to restore data from ARC/INFO on Solaris. The good thing about this system is that the data was always a copy. Sure, the tape could get damaged, but it was offsite and not prone to being messed with. If I needed data from October 2016, I could get it. Of course, old tapes were eventually destroyed because of space needs, but generally it was a great system.

    I’m doing the math in my head as to the cost of those DLT tapes

    Now I’m not thinking I need to get a DLT tape drive and pay Iron Mountain for the privilege, but I do need to get data offsite, and by offsite I mean off my easy-to-access cloud services (iCloud, Google Drive, AWS S3, etc.). I have been working with Amazon S3 Glacier, and it has been a really great service. I’ve been moving a ton of data there, not only to clean up my local drives and iCloud storage but to ensure that the data is backed up and stored in a way that makes it much safer than having it readily available. Now, Glacier is easy enough to use, especially if you are familiar with S3, but you don’t want to throw data in there that you need very often because of how it is priced. Uploading is free, and storage costs $0.004 per GB per month, which is insanely low (a full terabyte runs about $4/month). Retrieval is 3 cents per GB, which is reasonable, and after 90 days you can delete data for free.
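
    For the mechanics, this is roughly the shape of an archive upload through the S3 API with boto3; the bucket, key, and file names are placeholders for illustration.

    ```python
    # pip install boto3
    import boto3

    s3 = boto3.client("s3")

    # Placeholder bucket and file names; GLACIER is a real S3 storage class.
    s3.upload_file(
        Filename="project-2016-10.tar.gz",
        Bucket="my-spatial-archive",
        Key="archives/project-2016-10.tar.gz",
        ExtraArgs={"StorageClass": "GLACIER"},
    )

    # Retrieval later needs a restore request before the object can be read:
    # s3.restore_object(Bucket="my-spatial-archive",
    #                   Key="archives/project-2016-10.tar.gz",
    #                   RestoreRequest={"Days": 7})
    ```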

    Glacier isn’t new by any means; I had been using it to archive my hard drives with Arq, just not for individual projects like this. I only started doing this over the weekend, so we’ll see how it goes, but I like that the data is in a deep freeze, ready to be retrieved if needed but not taking up space where it isn’t needed. I’ve also set a reminder for two years out to evaluate the data storage formats and ensure that they are still the best method moving forward. If I do decide to change formats, I’ll continue to keep the original files in there as well, just in case the archival formats turn out to be a bad decision down the road. Storing this all in Glacier means that space is cheap, and I can keep two copies of the data without problems.

  • Cityzenith Smart World Professional IoT

    I know, I used the buzzword IoT in my title above. Stay with me though! We think about IoT as a link between a physical device (your Nest thermostat, for example) and the digital world (the Nest app on your iPhone), but it is so much more. While we have been working with many IoT providers such as Current by GE, we’ve also fundamentally changed how our backend APIs work to embrace this messaging and communication platform.

    Using AWS IoT services, everything that happens in our backend API can alert our front-end apps to its status. This ties very nicely into our Unity front-end Smart World Professional application because it can tell you exactly what is happening to your data. Uploading a detailed Revit model? The conversion to glTF occurs in the background, but you know exactly where the process is and exactly what is going on. Those throbber graphics web apps put up while they wait for a response from the API are worthless. Is the conversion process two-thirds of the way through or just 10%? Makes a big difference, don’t you think?
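
    As a sketch of what that kind of backend progress reporting can look like (illustrative, not our actual API), here is AWS IoT’s data-plane publish call via boto3; the topic layout and job ID are made up.

    ```python
    # pip install boto3
    import json
    import boto3

    iot = boto3.client("iot-data")  # AWS IoT data-plane client

    def report_progress(job_id: str, percent: int, step: str) -> None:
        """Publish a progress update that any subscribed app can watch."""
        iot.publish(
            topic=f"jobs/{job_id}/status",  # hypothetical topic layout
            qos=1,
            payload=json.dumps({"percent": percent, "step": step}),
        )

    # e.g. while converting an uploaded Revit model:
    report_progress("revit-upload-42", 66, "converting to glTF")
    ```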

    Where this really starts to matter is in our analytics engine, Mapalyze. If I’m running a line-of-sight analysis for a project in downtown Chicago, there is a ton going on, from the 3D models of all the buildings to the trees, cars, and everything else that affects what you can and can’t see. Or take detailed climate analysis, where there are so many variables, from the sun to weather (wind, temperature, rain) to human impacts, that these models can take a very long time to run. By building the AWS IoT platform into our backend, we can provide status updates to any app, not just ours. So if you want to call Smart World Professional Mapalyze from within Grasshopper or QGIS, you won’t get a black box.
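
    And because it is MQTT underneath, an outside tool could subscribe to those same status topics directly. Here is a hedged sketch using paho-mqtt against an AWS IoT endpoint; the endpoint, certificate paths, and topic are all placeholders.

    ```python
    # pip install paho-mqtt  (1.x-style API shown)
    import paho.mqtt.client as mqtt

    ENDPOINT = "your-endpoint-ats.iot.us-east-1.amazonaws.com"  # placeholder
    TOPIC = "jobs/+/status"   # hypothetical: wildcard over all job statuses

    def on_message(client, userdata, msg):
        print(f"{msg.topic}: {msg.payload.decode()}")

    client = mqtt.Client()
    # AWS IoT requires mutual TLS; these certificate paths are placeholders.
    client.tls_set(ca_certs="AmazonRootCA1.pem",
                   certfile="device.pem.crt",
                   keyfile="private.pem.key")
    client.on_message = on_message
    client.connect(ENDPOINT, 8883)
    client.subscribe(TOPIC)
    client.loop_forever()
    ```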

    In the end, what this means is that Smart World Professional is “just another IoT device” that you will be able to bring into your own workflows. That’s really how this is all supposed to work, isn’t it? For those who want to dig deeper into how we’re doing this, read up on MQTT; there is a standard under all of this that everyone can work with, even if you’re not on the AWS platform.