Internet of Things (IoT) solutions used to send a shiver down my spine.  Insecure devices, configuration management issues, and how on earth do you store, process and analyse all that data?

Microsoft’s IoT solutions are starting to mature to the point where there is a lot less pain involved in implementing these types of data solutions.  Azure gives us a rapidly deployable, almost infinitely scalable platform, and offers maturing Software as a Service and Platform as a Service solutions.  As such, I felt the time was right to develop my own Azure-hosted IoT solution.  It’s only by going through the pain of things not working, trying different configurations and trial and error that you learn the valuable lessons of how to do it properly.  The documentation only tells you so much, after all.

So, armed with an Azure MSDN subscription, my aim was to deliver an end-to-end IoT solution, comprising four phases:

  1. Hardware configuration and device deployment
  2. Data ingestion into Azure – pipeline and storage (cold & warm)
  3. Data analytics
  4. Digital Twin Model

Microsoft publish a Reference Architecture for IoT solutions, which is always a good place to start when defining a new solution.  A reference architecture offers a template of what a typical solution looks like: its components, and an overall description of how the solution works.

Azure IoT Reference Architecture


This blog covers the first phase, hardware configuration and device deployment.  For my solution, I chose to use the Azure AZ3166 IoT Development Kit.

This is a small development board based on the MXChip chipset, with a bunch of different sensors on a single PCB.  These include:
  • Temperature
  • Pressure
  • Accelerometer
  • Gyroscope
  • Humidity
  • Programmable LED
  • Microphone

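To give a feel for the data those sensors produce, here is a rough sketch of a device-to-cloud telemetry payload for the environmental readings.  The field names and values below are illustrative assumptions, not the DevKit's exact schema:

```python
import json

def build_telemetry(temperature_c, humidity_pct, pressure_hpa, device_id="AZ3166"):
    """Assemble an illustrative JSON telemetry message.

    Field names here are assumptions for the sake of the sketch; the real
    DevKit sample sketch defines its own payload shape.
    """
    return json.dumps({
        "deviceId": device_id,
        "temperature": round(temperature_c, 2),  # degrees Celsius
        "humidity": round(humidity_pct, 2),      # relative humidity %
        "pressure": round(pressure_hpa, 2),      # hectopascals
    })

# Made-up sensor readings for demonstration
message = build_telemetry(21.37, 48.1, 1013.25)
print(message)
```

Each message like this lands in the IoT Hub as a device-to-cloud event, ready to be routed onwards by the data pipeline in the next phase.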
Azure IoT Devkit AZ3166

As such, this is quite a useful board for capturing lots of different types of data, and it can form the basis of multiple IoT use cases.  My initial thoughts were to capture data feeds for Smart Room type scenarios.  I followed the getting started procedure below, and ended up with the device securely sending data into an Azure IoT Hub.

https://docs.microsoft.com/en-gb/azure/iot-hub/iot-hub-arduino-iot-devkit-az3166-get-started

However, things didn’t go completely smoothly.  I hit the following issues while working through the getting started procedure and configuring the device.

  1. The procedure requires PowerShell to be run in Admin mode, which is not possible on my company laptop.  Administrative access is needed to install the Azure PowerShell IoT plugins used for deployment, configuration and monitoring.  I worked around the problem by building a virtual machine in my Azure MSDN subscription.
  2. The device shipped with an out-of-date firmware version (1.6.3).  I always like to run on the latest codebase when deploying solutions, so I tried to upgrade the firmware to 1.6.5 (the latest), but the upgrade wouldn’t install successfully.  I had to go to the main firmware site (https://microsoft.github.io/azure-iot-developer-kit/versions/) and download the same version from there (a different-sized binary file), which applied successfully.  The firmware from the original getting started procedure then also applied successfully.
  3. When connected to the device in Access Point (AP) mode to set the Wi-Fi credentials and the Azure IoT Hub connection string, only the Wi-Fi configuration would display on the web page.  I had to use PuTTY to connect to the serial interface on the device using this procedure (https://microsoft.github.io/azure-iot-developer-kit/docs/use-configuration-mode/) and manually set the Azure connection string using the set_az_iothub command.
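The string passed to set_az_iothub is the device connection string from the Azure portal, which takes the form HostName=&lt;hub&gt;.azure-devices.net;DeviceId=&lt;id&gt;;SharedAccessKey=&lt;key&gt;.  A quick sketch for sanity-checking that shape before pasting it over the serial link (the example values are made up, not a real key):

```python
def parse_connection_string(cs):
    """Split 'HostName=...;DeviceId=...;SharedAccessKey=...' into a dict."""
    parts = {}
    for segment in cs.split(";"):
        # partition on the FIRST '=' only: base64 keys often end in '='
        key, _, value = segment.partition("=")
        parts[key] = value
    required = {"HostName", "DeviceId", "SharedAccessKey"}
    missing = required - parts.keys()
    if missing:
        raise ValueError(f"connection string missing: {sorted(missing)}")
    return parts

# Hypothetical example values for illustration only
cs = "HostName=myhub.azure-devices.net;DeviceId=AZ3166;SharedAccessKey=abc123="
info = parse_connection_string(cs)
print(info["DeviceId"])
```

Catching a truncated or mangled connection string locally is far quicker than waiting for the device to fail to authenticate against the hub.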

I now have the DevKit working and connected to the Azure IoT Hub.  I can see telemetry coming into the hub, showing temperature, humidity, pressure and other readings from the sensors.  This should have taken around 30 minutes, but ended up taking about three hours with the troubleshooting required.
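A handy way to confirm telemetry is arriving, without writing any code, is to tail the device-to-cloud events with the Azure CLI (this needs the azure-iot extension, and requires a live subscription; the hub name below is a placeholder):

```shell
# One-time: add the IoT extension to the Azure CLI
az extension add --name azure-iot

# Stream device-to-cloud messages from the hub (Ctrl+C to stop)
az iot hub monitor-events --hub-name <your-iot-hub-name> --device-id AZ3166
```

Once messages scroll past in the console, you know the device, network and hub are all talking to each other.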

The next step is to build a data pipeline to ingest the data into warm and cold data stores in Azure and prepare it for analytics.