
IoT is not About Hardware

When you think about IoT, you think about little devices everywhere, each doing its own thing. From Nest thermostats and Ring doorbells to Honeywell environmental controls and Thales biometrics, you imagine hardware. Sure, there is the “I” part of IoT that conveys some sort of digital aspect, but it is the “things” part we picture. The simple truth of IoT, though, is that the hardware is a commodity; the true power of IoT is in the “I” part, the messaging.

IoT messages can inundate us, but they are the true power of these sensors

IoT messages usually travel over HTTP, WebSockets, or MQTT, or some derivative of them. MQTT is the one I’m always most interested in, but anything works, which is what is so great about IoT as a service. MQTT is leveraged heavily by AWS IoT and Azure IoT, and both services handle messaging so well that you can use either in place of something like RabbitMQ, which my daughter loves because of the rabbit icon. I could write a whole post on MQTT, but we’ll leave that for another day.
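To make that concrete, here is a minimal sketch of an MQTT round trip using the open source paho-mqtt library (1.x-style API). The broker host and topic are placeholders, not a real AWS IoT or Azure IoT endpoint; both of those would also require TLS and authentication.

```python
# Minimal MQTT publish/subscribe sketch with paho-mqtt (1.x-style API).
# Broker and topic are placeholders, not a real deployment.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"  # placeholder host
TOPIC = "sensors/thermostat-42/telemetry"

def on_message(client, userdata, msg):
    # Each message is a short, self-describing payload; no human in the loop.
    print(msg.topic, json.loads(msg.payload))

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.publish(TOPIC, json.dumps({"temp_c": 21.5}))
client.loop_forever()
```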

IoT itself is built on this messaging. Individual hardware devices have UIDs (unique identifiers) that distinguish them on the network, and the packets of information sent back and forth between device and host are short messages that require no human interaction.
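As a hypothetical illustration of that shape, a device-to-host message might be nothing more than a few hundred bytes of JSON keyed by the device’s UID:

```python
# Sketch of a typical device-to-host payload: short, machine-readable,
# keyed by the device's unique identifier. Field names are hypothetical.
import json, time, uuid

DEVICE_UID = str(uuid.uuid4())  # in practice fixed in the device or a registry

message = {
    "uid": DEVICE_UID,
    "ts": int(time.time()),
    "reading": {"temp_c": 21.5, "humidity": 40},
}
payload = json.dumps(message)  # a few hundred bytes, no human interaction needed
```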

The best part of this is that you don’t need hardware for IoT. Everything you want to interact with should be an IoT message, whether it is an email, a data query, or a text message. Looking at IoT as more than just hardware opens up connectivity opportunities that were much harder to realize in the past.

Digital twins require connectivity to work. A digital twin without messaging is just a hollow shell; it might as well be a PDF or a JPG. But connect all the infrastructure of the real world to a digital twin and you replicate the real world in a virtual environment. Networks collect data and store it in databases all over the place, sometimes SQL-based such as Postgres or Oracle, other times as simple as SQLite or flat text files. Either way, the data should be treated as messages passing back and forth between clients.
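Here is a sketch of that idea, under the same paho-mqtt assumption as above: a subscriber that lands incoming messages in SQLite, so the database is just the downstream end of the message stream. Broker, topic, and table schema are all hypothetical.

```python
# Sketch: the database as the downstream end of a message stream.
# Broker, topic, and schema are hypothetical.
import json
import sqlite3
import paho.mqtt.client as mqtt

db = sqlite3.connect("twin.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (uid TEXT, ts INTEGER, body TEXT)")

def on_message(client, userdata, msg):
    m = json.loads(msg.payload)
    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
               (m["uid"], m["ts"], json.dumps(m["reading"])))
    db.commit()

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.subscribe("sensors/+/telemetry")  # one topic per device UID
client.loop_forever()
```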

All I see is IoT messages

When I look at the world, I see messaging opportunities: ways to connect devices to each other. Seeing the world this way opens up new ways to bring data into digital twins (think of GIS services as IoT devices, as in the sketch below) and much easier ways to get more out of your digital investments.
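For example, here is a minimal sketch of a GIS service acting as an “IoT device”: poll an ArcGIS REST feature service and re-publish each feature’s attributes as a message. The service URL and topic are hypothetical.

```python
# Sketch: a GIS service as an "IoT device". Poll a feature service and
# re-publish features as messages. URL and topic are hypothetical.
import json
import requests
import paho.mqtt.client as mqtt

FEATURES_URL = "https://gis.example.com/arcgis/rest/services/Assets/FeatureServer/0/query"

client = mqtt.Client()
client.connect("broker.example.com", 1883)
client.loop_start()  # background thread to flush outgoing messages

resp = requests.get(FEATURES_URL,
                    params={"where": "1=1", "outFields": "*", "f": "json"})
for feature in resp.json().get("features", []):
    # each feature becomes a message, just like a hardware sensor reading
    client.publish("gis/assets", json.dumps(feature["attributes"]))

client.loop_stop()
```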


Open Environments and Digital Twins

The GIS world has no idea how hard it is to work with data in the digital twin/BIM world. Most GIS formats are open, or at worst readable enough to import into a closed system. But in the digital twin/BIM space there are too many closed datasets, which makes the data hard to work with. The hoops one must jump through to import a Revit model are legendary, and mostly come down to getting your data into IFC without giving up all the intelligence. At Cityzenith we were able to work with tons of open formats, but dealing with Revit and other closed formats was difficult to the point that it required a team in India to handle the conversions.

All of the above is maddening, because if there is one thing a digital twin should do, it is talk with as many other systems as possible: IoT messages, GIS datasets, APIs galore, and good old-fashioned CAD systems. That’s why open data formats are best, the ones that are understood and can be extended in any way someone needs. One of the biggest formats we worked with was glTF. It is widely supported these days, but it really isn’t a great format for BIM models or other digital twin layers because it is a visual format rather than a data storage model. Think of it like a JPEG: great for final products, but not something you want to work with as your production data.
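To see why, here is a minimal glTF 2.0 document sketched as a Python dict. The BIM attributes are hypothetical, and notice that the only place they can live is the free-form extras property:

```python
# Minimal glTF 2.0 document as a Python dict. There is no first-class slot
# for BIM attributes; anything non-visual gets stuffed into "extras", which
# is why glTF works better as a delivery format than a production data store.
gltf = {
    "asset": {"version": "2.0"},
    "scenes": [{"nodes": [0]}],
    "nodes": [{
        "name": "Wall-001",
        "extras": {  # free-form catch-all; hypothetical BIM attributes
            "ifc_type": "IfcWall",
            "fire_rating": "2HR",
        },
    }],
}
```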

IFC, which I mentioned before, is basically an open BIM standard. It is actually a great format for BIM, but companies such as Autodesk don’t do a great job supporting it, so it becomes more of an interchange file, except where governments require its use. I also dislike the format because it is unwieldy, but it does a great job of interoperability and is well supported by many platforms.
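If you do need to poke at IFC programmatically, the open source ifcopenshell library is one example of that platform support. A minimal sketch, where the file name is a placeholder:

```python
# Sketch: reading an IFC file with the open source ifcopenshell library.
# "model.ifc" is a placeholder path.
import ifcopenshell

model = ifcopenshell.open("model.ifc")
for wall in model.by_type("IfcWall"):
    # every IFC entity carries a GlobalId, the BIM equivalent of a device UID
    print(wall.GlobalId, wall.Name)
```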

IFC and glTF are great, but they hark back to older format structures and don’t take advantage of modern cloud-based systems. Lately I’ve been looking at DTDL (Digital Twins Definition Language) from Microsoft. What I like about DTDL is that it is based on JSON-LD, so many of the IoT services you are already working with can take advantage of it. Microsoft’s digital twin platform was slow to take off, but many companies, including Bentley Systems, are leveraging it to give their customers the cloud-based open platform they all want. Plus you can use services such as Azure Functions (a very underrated service, in my opinion) to work with your data once it is in there.
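Here is a sketch of what a DTDL v2 interface looks like. The model id and its contents are hypothetical, but the JSON-LD structure is the point; it serializes straight into the JSON those IoT services already speak.

```python
# Sketch of a DTDL v2 interface as a Python dict. The model id and the
# telemetry/property names are hypothetical.
import json

pump_model = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:Pump;1",
    "@type": "Interface",
    "displayName": "Pump",
    "contents": [
        {"@type": "Telemetry", "name": "flowRate", "schema": "double"},
        {"@type": "Property", "name": "serialNumber", "schema": "string"},
    ],
}
print(json.dumps(pump_model, indent=2))  # plain JSON-LD on the wire
```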

Azure Digital Twins

The magic of digital twins happens when you can connect messaging (IoT) services to your digital models. That’s the holy grail: the real world connected to the digital world. Sadly, most BIM and digital twin systems aren’t open enough, and they require custom conversion work or custom coding to enable even simple integrations with SAP, Salesforce, or Maximo. That’s why these newer formats, based mostly on JSON, seem to fit the bill, and why I expect we will see exponential growth in their use.
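As a sketch of that holy grail, assuming the azure-digitaltwins-core and azure-identity Python packages, an incoming IoT reading can be applied to a twin as a plain JSON Patch. The instance URL, twin id, and property path are placeholders:

```python
# Sketch: applying an incoming IoT reading to an Azure Digital Twins
# instance as a JSON Patch. Endpoint, twin id, and path are placeholders.
from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

client = DigitalTwinsClient(
    "https://my-instance.api.wus2.digitaltwins.azure.net",  # placeholder
    DefaultAzureCredential(),
)

# JSON Patch: the message from the physical pump updates its digital twin
patch = [{"op": "replace", "path": "/flowRate", "value": 42.0}]
client.update_digital_twin("pump-001", patch)
```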


Using the Esri ArcGIS Server Cloud Builder

I’ve been playing with ArcGIS for Server 10.3.1 at Matrix, and we’re all about running things with hosted services. So rather than spec out some hardware and install ArcGIS for Server on local legacy machines, we’re doing it all in the cloud. Because I’m new here, there wasn’t any legacy AWS use, so I was able to pick Azure for deployment. My logic:

  • While I’m experienced with AWS, Azure is mostly an unknown world to me.  Given we’re running Windows servers with SQL Server, why not go native.
  • I really want to give SQL Azure a spin.
  • The portal for Azure is much nicer than AWS’s. They have those stupid panels in places[1], but mostly it makes logical sense.
  • Esri has Cloud Builder to simplify installation, which I thought would be great for spinning up prototypes quickly.

So logical, no? Well, late yesterday I sent out this tweet.

I was stuck here:

You can literally hear the sad trombone sound. Now Sam Libby was helping troubleshoot, but things were still a bit weird. Basically, as you can see in the error above, I needed to accept an EULA. Of course I went into the Azure Marketplace and followed the instructions to allow the Esri VM to be deployed programmatically, which is what Cloud Builder requires. But each time it errored out the same way.

Sam offered this:

Basically, he hit upon it. Microsoft did something with the Marketplace, and for whatever reason the Cloud Builder app won’t install an Esri ArcGIS for Server VM until you actually create one yourself first.

The workaround to get the Cloud Builder app to run is actually just to create a VM using the Azure Portal and then delete it.

After that, the Esri Cloud Builder app runs perfectly without trouble.

Philip Heede basically confirms everything.

So the ArcGIS for Server Cloud Builder[2] works great. While I don’t like wizards in general, it automates the processes that take time and lets you focus on the ArcGIS for Server settings you want to change. I honestly hadn’t installed ArcGIS for Server since it was ArcGIS Server (without the “for”) 9.3.1, and it was interesting to see both how things have changed and how little has actually changed.

  [1] Is that what they’re called?

  [2] Seriously, why no “for” in the title? Consistency, folks!