Rebuilding the Enterprise - Software, Hardware and Peopleware Migrations for the Systems Architect


Dockerize the Azure IoT-Edge Gateway

Like most developers in the IoT space, I write code that is typically destined for Linux run-times, but I spend my days typing away on a Surface Book running Windows.  In days past, when I needed to do some hacking, I'd fire up a Raspberry Pi or jump onto one of a few Ubuntu VMs running on Hyper-V, sucking up disk space, processor time and RAM.  Thankfully, technology has progressed, and these days, with Docker being all the rage, I just spin up a container and get to work.

One of my common tasks has been developing workflow modules for Azure IoT-Edge, our open source IoT gateway.  These modules provide functionality like data compression, aggregation, protocol translation, etc.  Easily spinning up a consistent development environment has been critical to accelerating this work, so I figured I'd share the love and show you how to get the gateway up and running in a Linux container on your Windows box using Docker.

Step one is installing Docker on your Windows 10 machine (Anniversary Edition or higher).  Docker provides great setup instructions that will walk you through the whole process.  Though you can switch to using "Windows Containers" in the Docker settings, let's leave it on the default "Linux Containers" setting.  With Docker installed, check that everything is working properly by issuing:

`docker version`

in an elevated PowerShell prompt.  This command should return data for both the client and server.  If it doesn't, there are copious fixes blogged about across the web.  I've also found that selecting "Restore Docker" from the Docker Settings pane is a useful nuke option.

Head over to the Azure Portal and set up a "Free" tier IoT Hub.  Use the integrated device manager to add two test devices, select 'Symmetric Key' and have the keys auto-generated.  Record the device names and primary keys for use in our dockerfile.

Now create a folder that we can put a dockerfile into.  `mkdir iot-edge-container` sounds nice.

`cd` into the directory and create a text file called `dockerfile.txt`.

The first step with our dockerfile will be to declare the base OS we want to use for the image.  In this case we'll do Ubuntu:

`FROM ubuntu`

Boy that was super hard; glad we got it out of the way.

Next we'll need to add some environment variables to the image.  There are a thousand more secure ways to do this than putting our secrets into our dockerfile; but time is of the essence, and that was a disclaimer to encourage you to do the right thing.  Please see Docker's documentation here for detailed options.  So, on to those env vars:

```dockerfile
# ENV vars for setup
ENV IoTHubName {iot_hub_name}
ENV IoTHubSuffix {iot_hub_suffix}
ENV device1 {device1_name}
ENV device1key {device1_key}
ENV device2 {device2_name}
ENV device2key {device2_key}
```

With the environment variables set up, we now need to make sure that the base image is up-to-date and all the IoT-Edge project dependencies are installed.  `apt-get` will be our friend ...

```dockerfile
# Update image
RUN apt-get update
RUN apt-get --assume-yes install curl build-essential libcurl4-openssl-dev git cmake pkg-config libssl-dev uuid-dev valgrind jq libglib2.0-dev libtool autoconf autogen vim
```

Please take note that I've intentionally started a flame war by installing vim ... I have a hard enough time exiting vim, so emacs is off the table completely.  Also note that we are installing `jq` - this will be used to dynamically populate the Gateway's simulator JSON config file.

With the image all updated, we can turn our attention to cloning the IoT-Edge repository and kicking off the build:

```dockerfile
# Checkout code
WORKDIR /usr/src/app
# Repo URL and build script name assumed from the iot-edge project layout
RUN git clone --recursive https://github.com/Azure/iot-edge.git

# Build
WORKDIR /usr/src/app/iot-edge/tools
RUN ./build.sh --disable-native-remote-modules
```

Take note of the flag on the build script; it may or may not be necessary depending on the base OS you are using.  Since we are running on Ubuntu, this flag keeps libuv from blowing up during the build.

Finally, we can turn our attention to getting the container's run-time commands ready.  The following big block of code will get us into the right directory to modify the JSON config file, echo the config to the console to make sure it's all correct, and then kick off the Gateway.  The `ENTRYPOINT` instruction ensures that the container does not exit immediately after starting.  Also note the last two `jq` commands, which set the loop time for the simulated devices ... 2-second intervals will chew through your 8K free messages quickly when you can't figure out how to kill your container :-).

```dockerfile
WORKDIR /usr/src/app/iot-edge/build

## cat config file into env var
ENTRYPOINT J_FILE=$(cat /usr/src/app/iot-edge/samples/simulated_device_cloud_upload/src/simulated_device_cloud_upload_lin.json) \
    # cd into sample dir
    && cd /usr/src/app/iot-edge/samples/simulated_device_cloud_upload/src/ \
    # update settings based on env vars
    && echo "$J_FILE" \
    # configure iot hub
    | jq '.modules[0].args.IoTHubName="'$IoTHubName'"' \
    | jq '.modules[0].args.IoTHubSuffix="'$IoTHubSuffix'"' \
    | jq '.modules[0].args.Transport="AMQP"' \
    # configure device 1
    | jq '.modules[1].args[0].deviceId="'$device1'"' \
    | jq '.modules[1].args[0].deviceKey="'$device1key'"' \
    # configure device 2
    | jq '.modules[1].args[1].deviceId="'$device2'"' \
    | jq '.modules[1].args[1].deviceKey="'$device2key'"' \
    # set device 1 message period
    | jq '.modules[2].args.messagePeriod=10000' \
    # set device 2 message period
    | jq '.modules[3].args.messagePeriod=10000' \
    > replaced.json \
    # print updates
    && cat replaced.json \
    # cd back up to build dir
    && cd /usr/src/app/iot-edge/build/ \
    # run gateway
    && ./samples/simulated_device_cloud_upload/simulated_device_cloud_upload_sample ../samples/simulated_device_cloud_upload/src/replaced.json
```
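If the chain of `jq` edits is hard to follow, the same transformation can be sketched in Python. The module layout below is a minimal stand-in mirroring the sample's JSON config, and the env values are placeholders, not real settings:

```python
import json

# Minimal stand-in for simulated_device_cloud_upload_lin.json
config = {
    "modules": [
        {"args": {"IoTHubName": "", "IoTHubSuffix": "", "Transport": ""}},
        {"args": [{"deviceId": "", "deviceKey": ""},
                  {"deviceId": "", "deviceKey": ""}]},
        {"args": {"messagePeriod": 2000}},
        {"args": {"messagePeriod": 2000}},
    ]
}

# Placeholder values standing in for the container's ENV vars
env = {"IoTHubName": "my-hub", "IoTHubSuffix": "azure-devices.net",
       "device1": "dev1", "device1key": "key1",
       "device2": "dev2", "device2key": "key2"}

# The same edits the jq pipeline performs, one field at a time
config["modules"][0]["args"]["IoTHubName"] = env["IoTHubName"]
config["modules"][0]["args"]["IoTHubSuffix"] = env["IoTHubSuffix"]
config["modules"][0]["args"]["Transport"] = "AMQP"
config["modules"][1]["args"][0]["deviceId"] = env["device1"]
config["modules"][1]["args"][0]["deviceKey"] = env["device1key"]
config["modules"][1]["args"][1]["deviceId"] = env["device2"]
config["modules"][1]["args"][1]["deviceKey"] = env["device2key"]
config["modules"][2]["args"]["messagePeriod"] = 10000
config["modules"][3]["args"]["messagePeriod"] = 10000

replaced = json.dumps(config, indent=2)  # what ends up in replaced.json
```

Each `jq` invocation in the ENTRYPOINT performs exactly one of these assignments and passes the whole document down the pipe.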


With the dockerfile scripted out, we can now create our complete image.  From the previously opened elevated PowerShell prompt, issue the following command:

`Get-Content .\Dockerfile.txt | docker build -t iot-edge -`

This will read the dockerfile in and pass it to the Docker build command.  Also note that the image will be tagged 'iot-edge' for easy identification.  This command will take about 10-15 minutes to run initially, but subsequent runs should leverage caching and be much faster.

Now for the pièce de résistance!

`docker run -ti iot-edge`

The container will fire up, print out the JSON config file and begin sending telemetry data to Azure!  Wait a few minutes and refresh the portal to see your simulated data arriving in IoT Hub from your fancy containerized IoT-Edge Gateway!

The complete docker file can be found at this gist.

Happy Coding!


Getting Started with Webtask and IoT

6/06/2017 Posted by William Berry
I was in a very interesting workshop this past weekend on Webtask, presented by Auth0's Glenn Block.  Webtask brings to the table a thoughtfully designed in-browser editor, a powerful CLI and impressive startup times that live up to the messaging of "run(ning) code in 30 seconds".  While the tool chain appears to favor reactive implementations akin to AWS Lambda or Azure Functions, it also offers the capability to run jobs on a cron-like schedule.  What better way to explore this feature set than to implement a trivial IoT device simulator ... so let's get started.

Get started by installing the Webtask CLI available here.  

Open a PowerShell terminal, create a new folder in your source directory and cd into it: `mkdir webtasks; cd webtasks`

Kick off `npm init` and set up a basic package/project scaffold.
  • package name: `webtask`
  • version: `1.0.0`
  • description: `My First Webtask`
  • entry point: `main.js`
Add the Azure IoT Device SDK for Node: `npm install azure-iot-device-amqp --save`

Open VS Code in the project folder and create a new file called `main.js`

In main.js, add a shell function for the webtask that includes a callback to indicate function completion.

Remove the body of the above function and add the Azure IoT Device library, a connection string and create a device client.  You can follow these instructions to learn how to create an IoT Hub, add a test device and get a device connection string.

We'll now define an array to hold logging data, create a simple logging method and alias the function completion callback passing it our log data.

The last elements we need are a callback for the client connect function and a call for the client to open its connection to IoT Hub.

Open a PowerShell command prompt and create a scheduled Webtask: `wt cron schedule "*/1 * * * *" .\main.js`.  This will create a new Webtask that will send our device data to Azure IoT Hub every minute.

The newly created Webtask can then be monitored via the CLI by simply issuing `wt logs`.  Take note however that this is a feed of all Webtasks running under your account.  If everything is configured correctly you should see output similar to the following:

To clean up the Webtask, simply issue `wt cron rm webtask`

The full demo source code is available here.

Happy Coding!


Getting started with Azure CLI v2 and IoT Hub on Windows

Version 2 of the Azure CLI was released recently and with the added power of PowerShell, we can accomplish some truly amazing things!

To get started, download Python 3.5 and install it using the relevant platform link here.  I prefer to put Python at the root of my C: drive, usually in a directory called C:\Python35, which is, not surprisingly, right next to an installation of Python 2.7 in C:\Python27.  As you work through the installation prompts, be sure to install pip, the Python package manager, and have Python added to your path.

After installation, you'll likely need to upgrade PIP to the latest version which can be done by opening a PowerShell administrative terminal and issuing the following command:

> python -m pip install --upgrade pip

With PIP updated, you can now install the Azure CLI v2 using the same PowerShell terminal window by issuing the following command:

> pip install azure-cli

Once the installation completes, you can now type Azure CLI commands at the PowerShell prompt.  This command will bring up the help for the CLI:

> az -h

Let's log into our account using:

> az login

The CLI will present a token and a URL to visit to authenticate your machine.  Follow the onscreen instructions to complete the authentication procedure.

Now, list all your available subscriptions:

> az account list

Set the subscription to use in creating a new IoT Hub.  I've chosen to use my Visual Studio Enterprise subscription to take advantage of the free credits:

> az account set  --subscription "Visual Studio Enterprise"

Before we can create an IoT Hub, we'll need a Resource Group to put it in; this will make for easy clean-up later.  Use the following command at the PowerShell prompt to create a new resource group:

> az group create -l westus -n MyResourceGroupName

We can now create a new free tier IoT Hub in that Resource Group:

> az iot hub create -g MyResourceGroupName -n DemoIoTHub --sku F1

We can explore all the relevant IoT Hub CLI commands with

> az iot hub -h

Let's view the iothubowner connection string:

> az iot hub show-connection-string

With only a few easy commands, we've now got ourselves an IoT Hub up and running in Azure!

One added benefit of using the PowerShell terminal in this case is that we can easily mix and match CLI commands and PowerShell commands.  To prove this out, I cloned the Azure IoT Protocol Gateway to my machine and, using the following commands, pushed my IoT Hub's connection string into a configuration file:

> $file = (Get-Content .\FrontEnd.IotHubClient.json) | ConvertFrom-Json
> $file.ConnectionString = (az iot hub show-connection-string | ConvertFrom-Json | Select-Object -first 1).connectionString
> (ConvertTo-Json $file) | Out-File .\FrontEnd.IotHubClient.json -Encoding ascii

Go forth and whip up some of your own Azure CLI & PowerShell magic!

Happy Coding!


This Week In IoT ~January 6, 2017~

1/06/2017 Posted by William Berry
In the wake of CES, `This Week in IoT` has some interesting highlights.  We saw the public announcement of Microsoft's Connected Car initiative, built from an initial partnership with Renault-Nissan.  The Kissenger iPhone device allows you to "kiss" strangers over the internet.  And though not expressly `IoT`, it turns out that ultrasound tracking can be used to bypass anonymity systems.  So here's this week's roll:

In The News:

Kissenger - kissing people over the internet with your iPhone ... 


Security Spotlight:

Ultrasound Tracking Could Be Used to Deanonymize Tor Users

Who to Follow:

Book of the Week:

- Collects interdisciplinary and comprehensive analyses of theoretical and applied problems concerning group profiling, Big Data and predictive analyses.



Using Azure IoT Hub and PowerBI to Visualize Plant Floor Data

1/04/2017 Posted by William Berry
In my previous post on the Azure IoT Gateway SDK, we put together a PowerShell quick start script for Modbus Gateway projects.  In this post we'll continue the exploration by connecting a Modbus-compatible Beckhoff BK9100 to Azure IoT Hub, shaping our data with Azure Stream Analytics and then visualizing the current position of a Dynapar optical encoder with Power BI.

We'll begin by opening the Azure Portal; click on the plus icon in the upper left corner and search for `IoT Hub`.  Once the resource is located, select `Create` in the lower left corner.

Configure your IoT Hub with a unique name, select an appropriate scale tier, and make sure to put the hub into its own resource group for easy clean-up later.

Navigate to the `Shared access policies` tab of the IoT Hub and paste the `Primary key` of the `service` policy into a text file.

Back in the Hub's primary blade, select the `Endpoints` tab under messaging. In the new blade select the `Events` entry under `Built-in endpoints`.  Once the new blade opens, copy the `Event Hub-compatible name` value and the first segment of the `Event Hub-compatible endpoint` value (e.g. `ihsuprodbyres043dednamespace`) into your text file.  We'll use these values in a moment to hook up our Azure Stream Analytics Job.

Next, we'll need an Azure Stream Analytics(ASA) Job to perform some data scaling on the path toward PowerBI.  While this operation could be performed in the gateway, ASA provides us an opportunity to sniff/validate our data and opens the door to easy persistence, should we so choose.  Again click the plus (+) icon in the upper left corner and search for `Stream Analytics`. Once the resource is located, select `Create` in the blade's lower left corner.

 Configure the ASA job with a unique name and place it in the same resource group as the IoT Hub.

We now need to hook up the IoT Hub as the input for the ASA Job.  In the overview pane, select the input section under `Job Topology` and press `Add` at the top of the new blade.  Enter an alias that will be used to reference the input in the ASA query.  Make sure that the `Source Type` is set to `Data Stream` and the `Source` is marked as `Event hub`.  Unfortunately, the Event Hub input blade will not automatically recognize the IoT Hub's Event Hub endpoints, so we'll need to set the `Subscription` field to `Provide event hub settings manually`.  Paste in the `Service bus namespace`, the `Event hub name` and the `Event hub policy key` from your text file.  Finish up by setting the `Event hub policy name` to `service` and press `Create`.  There is no need to define a non-default consumer group, and the data will be JSON formatted, encoded as UTF-8.

Before we define our ASA query, we should set up the output hook to Power BI.  Back under the ASA Job's overview tab on the main blade, select the `Output` section under `Job Topology` and press `Add` at the top.  Enter `power-bi` as your `Output alias`, set the `Sink` to `Power BI` and press authorize.  You'll be asked to enter your Power BI credentials at login.  If you don't already have a Power BI account, you can sign up for one here.  Once login completes, enter a `Dataset Name` and target `Table Name`.

The last bit of ASA configuration we need to handle is setting up the query.  For now, we'll build the query based on values that might end up being slightly different when you wire up your device.

Navigate to the `Query` section under `Job Topology` and select it.  A new blade will open that lists the input and output aliases previously defined, along with a code editor for entering the ASA query.  We'll start by selecting the observation time stamp and the device type value.  As I noted earlier, we need to do some math to range our encoder value and get the data in shape for Power BI visualization.  To range the encoder value, cast the string to a float, divide by the range's max count (in this case 65535) and multiply by 100 to convert to a percentage.
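The query itself was embedded as a gist; a minimal sketch, assuming an input alias of `modbus-input`, the `power-bi` output alias from earlier, and the field names (`datetimestamp`, `device-type`, `cnt`) used in the Power BI section below, would look something like:

```sql
SELECT
    datetimestamp,
    [device-type],
    CAST(cnt AS float) / 65535.0 * 100.0 AS cnt
INTO
    [power-bi]
FROM
    [modbus-input]
```

The alias names in brackets are assumptions; substitute whatever you entered when wiring up the input and output.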

Once the query is entered, select `Save` at the top and back on the ASA home blade, press `Start` to kick off the job.

We can now turn our attention back to our Modbus device and complete the wiring of it to the gateway.  As I noted earlier, I'm using a Beckhoff BK9100, Modbus-capable bus coupler along with a pretty standard Dynapar 2-channel optical incremental encoder (no Z full-revolution channel).  The Process Image for the coupler includes 8 channels of DIO in front of the mapping space for the encoder module, leaving our counter value located at the second word of the process image.  For the astute, the coupler is also configured for IP address assignment using BootP via Beckhoff's TCBootP application.

The next step in the process is to register our device with the IoT Hub.  While there are a number of ways to accomplish this, I suggest you check out our `iot-samples` repository on GitHub.  Begin by either cloning the repo or just downloading the code directly.  Navigate to Device Management -> csharp and open the solution file in Visual Studio (note, you might also want to check out my Introduction to Azure IoT with Fsharp post!).  With the solution open, set the `Create Device Identity` project as the startup project.  In the config folder, copy the `config.default.yaml` file to `config.yaml`.  Enter the Host Name for your IoT Hub and the `iothubowner` primary key connection string, which can be found under the IoT Hub's `Shared access policies` tab.  Finish up by modifying the `Nickname` and `DeviceId` values in the config file; they can be any value you want.  After you run the project, your device will be registered and a device key will be populated in the config file; we'll need that value shortly.

Open the script from my previous blog post in an Administrative PowerShell ISE session.  You'll need to set at least the following values:

  • rootProjectPath
  • iotHubName
  • deviceName - same as the "deviceId" you registered earlier
  • deviceKey - the value from the config file in the Device Management Project
  • deviceMac - the mac address of your modbus device
  • modbusServerIp - the IP Address of your bus coupler/plc/IO device
  • modbusPollingInterval - the number of milliseconds between device polls
  • modbusDeviceType - this is the value that is plumbed to ASA
  • modbus * - note in the sample script, I'm reading unit 0, with function code 4 (read input registers), reading starting at register 2 (1 based), and only reading 1 word of data ... you can see more about Modbus function codes on page 22 of the protocol spec
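For the curious, those Modbus settings translate into a small request frame on the wire. Here's a sketch of the Modbus/TCP request for function code 4 (read input registers), unit 0, starting at register 2 (1-based), reading one word; the field layout follows the Modbus protocol spec, and the helper name is mine:

```python
import struct

def read_input_registers(unit, start_register, count, tid=1):
    """Build a Modbus/TCP request: MBAP header + function-code-4 PDU.
    start_register is 1-based, matching the blog's polling config."""
    # PDU: function code (0x04), 0-based start address, register count
    pdu = struct.pack(">BHH", 0x04, start_register - 1, count)
    # MBAP: transaction id, protocol id (0), remaining byte length, unit id
    mbap = struct.pack(">HHHB", tid, 0x0000, len(pdu) + 1, unit)
    return mbap + pdu

frame = read_input_registers(unit=0, start_register=2, count=1)
```

Note the off-by-one: the script's "register 2 (1 based)" becomes address 1 on the wire, which is a classic source of Modbus confusion.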

Once you fill out your data and run the script, navigate to {projectRootPath} -> azure-iot-gateway-sdk -> build -> samples -> modbus -> Debug.  Open the `modbus_win.json` file in a text editor and double check that the document looks well formed.

You can now open a PowerShell command prompt in the same folder and issue the following command to begin reading from the device and sending data to Azure:
> .\modbus_sample.exe .\modbus_win.json
You should see output similar to the following:

Assuming you've set your polling to something reasonable, like 2000 (2 seconds), you should be able to run the application for a few hours on a day's allowance for the IoT Hub free scaling tier.
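To put numbers on that, a quick back-of-the-envelope check, assuming the free tier's daily quota of 8,000 messages (the "8K free messages" mentioned in the first post):

```python
# Rough runtime budget on the free tier at a given polling interval
DAILY_QUOTA = 8000           # messages/day on the F1 (free) tier
poll_ms = 2000               # one message every 2 seconds

messages_per_hour = 3_600_000 // poll_ms
hours_of_runtime = DAILY_QUOTA / messages_per_hour
```

At a 2-second poll that works out to roughly four and a half hours of runtime per day, which is why "a few hours" is about all you get before the hub starts rejecting messages.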

With data flowing to the IoT Hub and onto ASA, log in to Power BI.  Though it can take some time to finally show up, the ASA dataset should be visible in `Datasets` group.

Select the Modbus dataset, and expand Visualization and Fields from the toggles on the right of the display.

Add a `Line Chart` to the page and set the following values:

  • Axis - datetimestamp
  • Legend - device-type
  • Values - cnt

In the Filters menu for the chart, select the datetimestamp field, set the filter type to `Top N` and show the last 20 items.  This will produce a real-time graph of data streaming from the device.  The default polling interval should be around 3-4 seconds.  The resulting data will look like this!

Happy Coding!


Using PowerShell to Fast Track Azure IoT Gateway SDK Projects

12/30/2016 Posted by William Berry
I've been working with the Azure IoT Gateway SDK quite a bit lately and the project setup experience is not only repetitive, but also somewhat error prone.  Make a dozen settings changes here. Forget a setting there. Edit a few CMakeLists files. Watch the build explode. Fix. Build AGAIN. Rinse and repeat.

I figured it was time to not only document the whole process, but encode it with PowerShell; 'cause "infrastructure as code", and if computers do anything well, it's to at least make the same mistakes in a repeatable fashion.  So let's whip up a script that will perform the following tasks:

  • Create a new project directory
  • Clone the Azure IoT Gateway SDK from GitHub
  • Clone the Modbus sample module for the Gateway from GitHub
  • Edit the relevant CMake files to include the sample Modbus module
  • Build the SDK
  • Create a Modbus module configuration file with all the correct settings to just `work`
While this example is tuned to building the Gateway SDK's Modbus Sample module, it could quite easily be adapted to build and configure any of the SDK's included samples or other add-on sample modules like the OPC-UA Client.

I'm a fan of constrained and declarative input so let's start by defining an Enum with the allowable transports for the IoT Hub.  The available options are 'AMQP', 'HTTP' and 'MQTT'.

We can then set up all the pieces of configuration for the project, the IoT Hub, the device mapping and the Modbus Read module.

After some trouble getting the environment variables set up for the SDK's build script, I turned to Stack Overflow for some assistance and was well rewarded.  Bill Stewart stepped up with a few functions to make the process easier; his reply to my question includes his great Windows IT Pro blog post on the topic of PowerShell environment variable imports ... well worth a read.  For the sake of simplicity, we'll just add them directly to the script.

Next, we'll create the project directory and use `git` to clone the relevant repositories there.  Note that git uses stderr output for some output that is not really an error and the use of the `--quiet` or `-q` option for `clone` does not truly silence the stderr output.  There are some details in this Posh-Git issue.

We can now move the Modbus module and associated sample into the SDK repository and amend the relevant CMake files to have them built when we build the SDK.

Let's move into the SDK directory and copy the current state of the session's environment variables for restore after the build.

To build the SDK on Windows, the documentation notes that we need to run from a Visual Studio Command Prompt.  Behind the scenes, this special command prompt calls a batch file that sets up IDE options and tooling for building, debugging and deploying.  If you right click and check the properties for the shortcuts you'll find a call that looks like this:

`%comspec% /k ""C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat"" amd64_x86`

The operational gist here is to open a command prompt and keep it open, using the `/k` switch, after running the `vcvarsall` batch script with, in this case, the amd64_x86 argument.  We'll leverage the functions I noted earlier to call this batch file directly, adding the environment variables to our current PowerShell session, and then call the SDK's build script.  Note that we'll skip running the SDK's unit tests and finish by restoring the environment variables to their original state.
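The core trick in those helper functions is to run the batch file in a child `cmd.exe`, dump the resulting environment with `set`, and copy each `NAME=VALUE` pair back into the PowerShell session. The parsing step, sketched here in Python for clarity (the real helpers do this in PowerShell, and the sample values are made up):

```python
def parse_env_dump(dump: str) -> dict:
    """Parse the NAME=VALUE lines produced by cmd's `set` command."""
    env = {}
    for line in dump.splitlines():
        name, sep, value = line.partition("=")
        if sep:                      # skip blank or malformed lines
            env[name] = value
    return env

# Hypothetical fragment of `set` output after running vcvarsall.bat
sample = "INCLUDE=C:\\VC\\include\nLIB=C:\\VC\\lib\n\nPlatform=x86"
vars_found = parse_env_dump(sample)
```

Each parsed pair is then written into the current session's environment, which is what lets the SDK's build script find the compiler toolchain.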

As of the writing of this post, the build will complete with a few warnings from the compiler, as noted in this Modbus module issue.

We can now turn our attention to setting up the gateway's configuration file for execution.  We'll want to reference the template configuration file for the Modbus gateway module and read it into a PowerShell object.  

From here, we can pick out the various parts of the configuration file that need values and conclude by serializing the configuration to a file in the SDK's build directory.  Take note that the parsing library that supports the gateway can only parse ASCII and UTF-8 encoded configuration files.  PowerShell's default encoding for the Out-File function is Unicode, and without specifying the encoding, the gateway will blow up with a non-descriptive error at run-time.

You can now navigate to the compiled Modbus sample (`cd .\build\samples\modbus\Debug`) and run the compiled Modbus gateway sample with the following command:

`.\modbus_sample.exe .\modbus_win.json`

The complete script can be found here.  Open up an Administrator PowerShell ISE session, paste the code, modify the various settings for your demo project and run.

Happy Coding!


Introduction to Azure IoT with Fsharp

11/28/2016 Posted by William Berry


  • IDE/Editor with Fsharp capabilities, e.g. Visual Studio or VS Code with Ionide plugin. 
  • Azure Subscription.
  • Nuget or Paket

Estimated Completion Time: 2-3 hours

A Brief Introduction 

Beating the drum of strongly typed functional programming in the land of IoT is the textbook definition of counterculture.  Embedded systems have been written in "high-level" languages like C/C++ forever.  New players to the IoT market yearn for broad-based adoption and think the only way to drive developer adoption is to JavaScript All The Things!  While precedent and low barrier to entry are certainly compelling, neither is helping us build better, more robust, provably secure or correct systems.  As such, I think there is a very strong case for languages like F#, especially when leveraging open-source, cross-platform run-times and SDKs like .NET Core.

The goal of this introductory tutorial will be to show how F# fits into the world of IoT while simultaneously providing a broad architectural overview of an Azure IoT solution.

Happy Coding,

Other Azure F# Resources

Since this guide primarily covers the use of F# with Azure resources, you might find the following link helpful:

Guide - Cloud Data, Compute and Messaging with F# - from
Using F# on Azure - from Microsoft

Data Simulation  

For this tutorial, we'll be simulating wind speed measurements taken from an array of devices.  The data will include nested objects like geo-coordinates and observation times.  We'll transmit this data from a device simulator that will act as a field gateway device and publish the data to an Azure IoT Hub.  Further post-processing steps will leverage an array of Azure PaaS offerings and harness the power and simplicity of F# all the way through to data visualization.

Though we will be hand rolling the data generators, one could just as easily leverage community libraries like FsCheck, which include wonderful APIs for randomized data generation.
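Although the tutorial's generators are written in F#, the shape of a hand-rolled observation, nested geo-coordinates and all, can be sketched quickly. All field names below are illustrative, not the tutorial's actual schema:

```python
import json
import random
from datetime import datetime, timezone

def wind_speed_observation(device_id: str) -> dict:
    """Generate one simulated wind-speed reading with nested objects."""
    return {
        "deviceId": device_id,
        "observationTime": datetime.now(timezone.utc).isoformat(),
        "windSpeed": round(random.uniform(0.0, 25.0), 2),  # m/s
        "location": {"latitude": 41.88, "longitude": -87.63},
    }

# A field gateway would batch readings from several devices like this
payload = json.dumps([wind_speed_observation(f"device-{i}") for i in range(3)])
```

A property-based library like FsCheck would replace `random.uniform` with composable generators, but the payload shape stays the same.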

Project Scaffold

To complete this tutorial, we'll need two (2) empty projects created in a Visual Studio Solution.  The solution name is up to you; but, I would suggest the following names for the projects as they align with Microsoft's iot-samples library.
  • `RegisterDevices` - the project that will be used to register simulated devices with our Azure IoT Hub.
  • `DeviceSimulator` - the application that will simulate our IoT device(s) field gateway. 


To save ourselves from hard coding connection strings and keys, let's build a configuration file that can be used across all the applications, and have fun with an F# Type Provider while we are at it.

In your solution, add a folder called `config` and create a new file in that folder called `config.yaml`.  We'll need two primary groups of configuration information: one group for our Azure cloud settings and one group for the simulated device(s).  The cloud settings section will need to store the URI of our IoT Hub, the IoT Hub's Event Hub-compatible endpoint for reading device-to-cloud messages, and a connection string to the IoT Hub, which will be used for device registration and other tasks.  I should note at this point that you can obviously build up the connection string from its elements, removing the copy pasta, but that will be left as an exercise for you.

The following text can be pasted into your config.yaml file, replacing the `{foo}` parts with your IoT Hub's settings which we'll collect in the next section.  Also, don't worry about the Device `Key` yet, we'll get that filled in via registration code in a subsequent section.
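The settings referenced later in this post (`IoTHubUri`, `ConnectionString`, the device `Key`) suggest a layout along these lines; the exact keys in the iot-samples repo may differ, so treat this as a sketch:

```yaml
AzureIoTHub:
  IoTHubUri: "{iot_hub_name}.azure-devices.net"
  ConnectionString: "HostName={iot_hub_name}.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey={key}"
  EventHubEndpoint: "{event_hub_compatible_endpoint}"
Devices:
  - Nickname: "{nickname}"
    DeviceId: "{device_id}"
    Key: ""   # populated by the registration app
```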

Creating an Azure IoT Hub

Log into the Azure Portal, if you don't have an account you can sign up for a free one here that will supply you with $200 of free credit.  This demo solution is very light on Azure resources, so don't worry about draining your free credits, even if you leave it running for a few days.

Once you are logged into the portal select the `+` icon in the top left corner of the screen and search for `IoT Hub`.

After selecting the resource press `Create` in the lower left corner of the newly presented blade.

You'll be subsequently prompted to enter configuration information for the IoT Hub.  There are only a few settings here worth mentioning:

  • In the `Pricing and Scale Tier` menu, be sure to click into it and select the `Free` tier.  This will provide you with more than enough of a daily messaging rate to complete this tutorial and continue exploring on your own.
  • Select one (1) IoT Hub Units, if it's not already populated.
  • Change the `Device to Cloud Partitions` count to two (2).  This setting helps with scale out for the Hub and having fewer partitions will ease experimentation with reading Device to Cloud messages later.  For further reading, check out this introductory article on Event Hubs to understand the mechanics behind partitions.  
  • Make sure to select `create new` for the Resource Group setting, this will allow for easy resource clean up later.
  •  Select an available region that is within your legal jurisdiction and/or close to your geographic locale.  Please note that data generated by your IoT solutions, such as latitude/longitude, city/state/province, postal code, occupancy or facility egress, and/or other pieces of end user information, may be considered Personally Identifiable Information (PII).  As such, many countries govern where this data can be transmitted and where and how it can be persisted, even temporarily.  It is up to you, the developer, to maintain compliance with these regulations - consult legal aid if you do not fully understand these requirements.

After entering the IoT Hub configuration information, press `Create` - you will be returned to the your Portal Dashboard while Azure sets up the Hub.  Now would be a great time for an espresso!

(a few minutes later)

With an espresso in hand, navigate to the newly created IoT Hub.  While it's worth exploring all the good information presented in the Hub's main portal blade, we'll need to make note of a few specific things before writing the application code.

In the section labeled `Overview`, copy the IoT Hub's `host name` value into the config.yaml file's `IoTHubUri` setting.  My IoTHubUri value will be ``.

Scroll down the list of sections until you find the `Shared access policies` entry and click on it.  The blade will be extended with access accounts - select the `iothubowner` account.  Please note that for anything beyond toy solutions, fine-tuned access controls that restrict user and subsystem permissions are imperative. Giving an application or other user the `iothubowner` permission level is a recipe for a security disaster!

Once the `iothubowner` entry is selected, a new blade will be presented with security information.  Copy the `Connection string - primary key` value into the config.yaml file's `ConnectionString` setting.

Continuing with the laundry list of disclaimers ... note that the portal has provided you with two (2) keys and two (2) corresponding connection strings which include those keys in their bodies.  All applications that connect to the IoT Hub should have the capability to fail over between these keys to ensure application up-time.  Also note that you'll want to develop a method for key rotation that meets your security requirements.  Though the posts are a bit old (2012), I suggest reviewing Bruce Kyle's awesome Windows Azure Security Best Practices series to help you develop a cloud security mindset.

With our configuration set up, let's get to writing some F#!

Device Registration 

The next step along this IoT journey will be to write a small application that registers the simulated device with the IoT Hub; this process will generate a key that will subsequently be stored in the config.yaml file.

With the solution open in Visual Studio, open the Package Manager Console (Tools > NuGet Package Manager > Package Manager Console), select the `DeviceIdentity` project in the console's project drop-down and run the following commands to install the application's dependencies:
  • Install-Package FSharp.Configuration
  • Install-Package Microsoft.Azure.Devices
We are pulling in the Fsharp.Configuration package because it includes a YAML type provider that we'll use to easily parse the config.yaml file.

The application code will start simply by opening the dependent libraries, creating a `Config` type using the YAML Type Provider and then printing out to the console the Hub's connection string.
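A minimal sketch of that shell might look like the following (the `ConnectionString` key matches the setting populated earlier; the exact generated property access is illustrative of the YAML Type Provider):

```fsharp
open FSharp.Configuration
open Microsoft.Azure.Devices

// The YAML Type Provider generates a strongly-typed view over config.yaml.
type Config = YamlConfig<"config.yaml">

[<EntryPoint>]
let main argv =
    let config = Config()
    // Prove we can read the config by echoing the Hub's connection string.
    printfn "Connection String: %s" (config.ConnectionString.ToString())
    0
```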

With the shell of the registration application reading from the config file, we now need code to create an IoT Hub Registry Manager, add devices, upgrade our key printing capabilities and persist the Azure generated Device Key to the config.yaml file.  So in that order:
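In sketch form, and assuming config.yaml carries `ConnectionString`, `DeviceId` and `DeviceKey` entries (names and the `Save` call are illustrative of the FSharp.Configuration provider):

```fsharp
open FSharp.Configuration
open Microsoft.Azure.Devices

type Config = YamlConfig<"config.yaml">
let config = Config()

// The registry manager is our handle into the IoT Hub device registry.
let registryManager =
    RegistryManager.CreateFromConnectionString(config.ConnectionString.ToString())

// Register a device with the Hub; Azure generates the symmetric key.
let addDevice (deviceId: string) =
    registryManager.AddDeviceAsync(Device(deviceId))
    |> Async.AwaitTask
    |> Async.RunSynchronously

// Print the generated key and persist it back to config.yaml.
let persistDeviceKey (device: Device) =
    let key = device.Authentication.SymmetricKey.PrimaryKey
    printfn "Generated device key: %s" key
    config.DeviceKey <- key
    config.Save("config.yaml")

config.DeviceId.ToString() |> addDevice |> persistDeviceKey
```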

The above code should replace the existing `printfn` call in the `Program.fs` file.  Notice that we've also run the `addDevice` function to kick the whole process off.

Now, barring compilation errors, set the DeviceIdentity project as the default startup project and run the application; the config.yaml file will be updated with the Azure-generated Device Key.  But, we have a problem ... running the application a second time will result in a runtime `DeviceAlreadyExistsException`; so let's handle that. We'll start with adding a function that can `Get` a device's configuration from the IoT Hub based on its Device Id in the event that it already exists in the device registry.  Additionally, we'll enhance the `addDevice` function to properly handle the already-exists exception.
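Sketched out, the pair of functions might look like this (the exception type lives in `Microsoft.Azure.Devices.Common.Exceptions`; the registry manager is passed in as a parameter to keep the sketch self-contained):

```fsharp
open Microsoft.Azure.Devices
open Microsoft.Azure.Devices.Common.Exceptions

// Fetch an existing device's registry entry by its Device Id.
let getDevice (registry: RegistryManager) (deviceId: string) =
    registry.GetDeviceAsync(deviceId)
    |> Async.AwaitTask
    |> Async.RunSynchronously

// Try to add the device; fall back to a Get when it already exists.
let addDevice (registry: RegistryManager) (deviceId: string) =
    try
        registry.AddDeviceAsync(Device(deviceId))
        |> Async.AwaitTask
        |> Async.RunSynchronously
    with
    | :? DeviceAlreadyExistsException -> getDevice registry deviceId
```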

This code uses the simple `try ... with` expression to attempt the `addDevice` call, falling back to the new `getDevice` function in the event that the application encounters the aforementioned already exists exception.  Deleting the Device Key in the config.yaml file and a re-run should now properly demonstrate our intended behavior.  Oh, and congratulations - you've successfully added a device to your Azure IoT Hub using your cunning wits, some copy pasta and a bit of friendly F#!

Device Simulator 

The next step in the process will be to create a simulated device.  For this tutorial, we are going to simulate a field gateway device collecting data from wind-speed sensors that have been placed at random geographic intervals in the area surrounding the Microsoft campus in Redmond, WA.

We'll need to initialize the project by installing the required dependencies.  Run the following commands in the package manager console after selecting the `DeviceSimulator` project in the console's project drop-down:
  • Install-Package FSharp.Configuration 
  • Install-Package Microsoft.Azure.Devices.Client 
The device simulator application layout should be familiar after coding up the registration application.  It begins simply enough by opening the required dependencies, again creating the configuration type using the YAML Type Provider (though this time we'll set the ReadOnly flag to `true` to prevent accidental changes), extracting some config data and building a device client for the IoT Hub.
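A sketch of that shell (the property names on the config type are illustrative):

```fsharp
open FSharp.Configuration
open Microsoft.Azure.Devices.Client

// ReadOnly=true prevents the provider from exposing mutable properties.
type Config = YamlConfig<"config.yaml", ReadOnly=true>
let config = Config()

let deviceId = config.DeviceId.ToString()
let deviceKey = config.DeviceKey.ToString()
let iotHubUri = config.IoTHubUri.ToString()

// Build a device client authenticated with the registry symmetric key.
let deviceClient =
    DeviceClient.Create(
        iotHubUri,
        AuthenticationMethodFactory.CreateAuthenticationWithRegistrySymmetricKey(deviceId, deviceKey))
```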

Though occasionally controversial in some circles, I am a strong advocate for pulling out data as types and there is a prime opportunity for that with the data simulator.  We are in need of a record type that can express a simulated wind-speed measurement.  This record type should include not only the measurement information but also the unique Device Id, some geo-coordinate data and an observation time that we can use further down the line for monitoring or graphing.  Let's add this new record type to our Device Simulator's `Program.fs` file just after the config type definition.
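Something like the following record fits the bill (`GeoCoordinate` comes from the `System.Device` assembly):

```fsharp
open System
open System.Device.Location

// One wind-speed observation from a simulated sensor site.
type telemetryDataPoint =
    { deviceId : string
      windSpeed : float
      location : GeoCoordinate
      obsTime : DateTime }
```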

With the measurement type defined we'll need some functions to assist with mocking the field array.  I prefer to work these types of development tasks from the top down, effectively starting with the result and refining the functionality at progressively lower levels.  So let's give that a shot here and look over our requirements:
  • Send a stream of measurement events to the IoT Hub.  
  • Events/measurements should have some temporal spacing between them, i.e. we'll take measurements every N seconds. 
  • Model several devices producing data and concatenate their results such that the simulator application functions more as a field gateway than single measurement device.  
  • Sample data stream should be effectively infinite. 
  • Communicate with IoT Hub in an asynchronous way.
So how are we going to accomplish this?  Let's begin by saying that we'll have an infinite sequence of strings, that are themselves delimited measurements, that we'll pass to some function that will transmit the string to the IoT Hub on 5 second intervals. Breaking the problem in half, let's define two further functions, one that creates an infinite sequence of measurement data and another function that takes a string and sends it to IoT Hub.

The data send task is rather straightforward.  We'll create a new Message based off the conversion of the string data to a byte array and then pass that message onto the Device Client for transmission to the IoT Hub.  The function will finish with a side effect, by printing the transmitted message to the console.
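A sketch of the send task (taking the device client as a parameter keeps the snippet self-contained):

```fsharp
open System
open System.Text
open Microsoft.Azure.Devices.Client

// Wrap the delimited measurement string in a Message and ship it to the Hub.
let dataSendTask (deviceClient: DeviceClient) (data: string) =
    async {
        let message = new Message(Encoding.UTF8.GetBytes(data))
        do! deviceClient.SendEventAsync(message) |> Async.AwaitTask
        // Side effect: log what we just transmitted.
        printfn "%O > Sent message: %s" DateTime.Now data
    }
```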

Creating the (nearly) infinite stream of data is just as trivial, thanks to a few helper functions that F# brings to the table.  If you are coming from C# and are familiar with Linq, then the F# Sequence should be familiar territory, as its mental model maps nicely onto IEnumerable ... a (potentially) infinite series of elements that are lazily evaluated.

F# makes data generation a non-issue, as we can create an infinite sequence of elements using `Seq.initInfinite`.  `Seq.initInfinite` must be passed a function with the signature `(int -> 'T)` that is used to generate the sequence element `'T` for each `int` passed in. The astute reader will notice that it is possible to run out of integers, so we won't technically have an "infinite" sequence.  But given that we are spacing our data out in 5 second increments, the simulator should be able to run for roughly 340 years before the sequence runs out of elements.
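A quick illustration of the laziness at play:

```fsharp
// The generator (int -> 'T) is only invoked for elements we actually consume.
let evens = Seq.initInfinite (fun i -> i * 2)

// Taking a finite slice forces evaluation of just those five elements.
let firstFive = evens |> Seq.take 5 |> Seq.toList
printfn "%A" firstFive   // [0; 2; 4; 6; 8]
```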

In this case, we'll pass `Seq.initInfinite` a concatenated string of randomized wind-speed measurements based on an array of pre-initialized sites by using:

`msftSites |> Array.mapi (fun idx site -> windSpeedMessage site idx)`

Mapping the windSpeedMessage function over the collection of `msftSites`, along with an indexer, using `Array.mapi` will allow us to randomize the site data and ultimately generate an Array of `telemetryDataPoint` records. To generate our list of sites, let's do a naive port of this Stack Overflow code over to F# and initialize an Array of 10 `GeoCoordinates`, priming the computation with the Lat/Long for Microsoft Way in Redmond, WA.

Similarly, we can create a wind-speed message function that will return a `telemetryDataPoint` record built up from the randomized site data, and a randomized wind-speed centering on 10 (units, could be mph).
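Taken together, the site generation and message construction might be sketched like so (the radius, the wind-speed spread, the device-naming scheme, and the approximate campus Lat/Long are all illustrative; the random-offset math is the naive Stack Overflow port mentioned above):

```fsharp
open System
open System.Device.Location

// One wind-speed observation, as described earlier.
type telemetryDataPoint =
    { deviceId : string
      windSpeed : float
      location : GeoCoordinate
      obsTime : DateTime }

let rnd = Random()

// Naive port of the "random point within a radius" snippet.
let randomSite (lat0: float) (lon0: float) (radiusMeters: float) =
    let r = radiusMeters / 111300.0           // meters -> degrees, approximately
    let u, v = rnd.NextDouble(), rnd.NextDouble()
    let w = r * sqrt u
    let t = 2.0 * Math.PI * v
    let x = w * cos t / cos (lat0 * Math.PI / 180.0)
    let y = w * sin t
    GeoCoordinate(lat0 + y, lon0 + x)

// Ten sites scattered around the Microsoft campus (approximate Lat/Long).
let msftSites = Array.init 10 (fun _ -> randomSite 47.6423 (-122.1392) 1000.0)

// Build a telemetryDataPoint with a wind speed randomized around 10.
let windSpeedMessage (site: GeoCoordinate) idx =
    { deviceId = sprintf "simulator-%d" idx
      windSpeed = 10.0 + (rnd.NextDouble() * 4.0 - 2.0)
      location = site
      obsTime = DateTime.UtcNow }

let measurements = msftSites |> Array.mapi (fun idx site -> windSpeedMessage site idx)
```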

And here is all of our code put together:

Message Compression

If there is anything we can count on, it's that requirements change.  Unfortunately for us, our Partner has an additional constraint around message size.  They would like to compress the data we send to the IoT Hub to save on gateway-to-cloud bandwidth.  Low bandwidth situations often call for data compression in one form or another, so let's revisit the Device Simulator and enhance it with the ability to perform data compression.

The functional nature of the Simulator application makes adding additional behavior, particularly additional data processing, a snap!

Open the Program.fs file of the Device Simulator project and add the following open:

`open System.IO.Compression`

Now we'll do a naive port of Mads Kristensen's gzip compression blog post, to F#.  We'll also need to update the `dataSendTask` to compress the delimited string of measurements and decompress the compressed string for a console print - just to prove that we have compression & decompression working!
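A hedged F# port of that compress/decompress pair (gzip the UTF-8 bytes, then Base64-encode so the payload stays a string):

```fsharp
open System
open System.IO
open System.IO.Compression
open System.Text

// Gzip a string and return it Base64-encoded.
let compress (s: string) =
    let bytes = Encoding.UTF8.GetBytes(s)
    use output = new MemoryStream()
    // leaveOpen = true so the MemoryStream survives closing the gzip stream,
    // which is what flushes the final gzip footer.
    let gzip = new GZipStream(output, CompressionMode.Compress, true)
    gzip.Write(bytes, 0, bytes.Length)
    gzip.Close()
    Convert.ToBase64String(output.ToArray())

// Reverse of compress: Base64-decode, then gunzip back to a string.
let decompress (s: string) =
    use input = new MemoryStream(Convert.FromBase64String(s))
    use gzip = new GZipStream(input, CompressionMode.Decompress)
    use reader = new StreamReader(gzip, Encoding.UTF8)
    reader.ReadToEnd()

// Round-trip sanity check; prints the original string.
printfn "%s" (decompress (compress "wind-speed|payload"))
```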

Moving Data with Azure Event Hubs

Given the change in requirements that added compression, we'll need to enhance our solution architecture to not only shred the delimited measurement data, but also to decompress the messages.  There are a handful of ways to accomplish this in Azure, and given that Functions recently entered General Availability, let's give that path a shot.

While IoT Hubs are a distinctly different service from Event Hubs, they do provide an Event Hub compatible interface.  We'll leverage the IoT Hub's Event Hub interface to wire up an Azure Function that will decompress our messages, split them on the `|` delimiter and forward them onto a new Event Hub for further processing on our way toward PowerBI visualization.

Let's begin by building an Event Hub that we'll target from our Azure Function. Log into the Azure Portal and search for `Event Hubs`.  The selection you are making is for the Event Hubs service (the namespace), to which we'll then add an Event Hub for the project.

After pressing `Create`, you'll see the main overview panel for the Event Hub Service.  Scroll down to `Event Hubs`, press the `+ Event Hub` tab and enter in a name for the new Event Hub.  All the other settings can be left defaulted.  Note that this process will automatically add a storage account with a name that is part hub name and part GUID.

The new event hub will take a few minutes to deploy and will show up in the center pane of the image above.  Once the event hub is displayed, select it and scroll down to `Shared access policies`.  A new pane will open, select `+ Add` and create a new policy with `Manage` claims.  The blade should now refresh and present primary and secondary tokens as well as connection strings for those tokens. Select the primary connection string and paste it into a text editor - we'll need to modify it slightly before using it in our application.

The connection string should look like this:


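The string follows the standard Event Hub shape (placeholders below, not actual values):

```
Endpoint=sb://{namespace}.servicebus.windows.net/;SharedAccessKeyName={policy-name};SharedAccessKey={primary-key};EntityPath={event-hub-name}
```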
Split the string at the last semicolon (`;EntityPath=...`) and place it on a second line for later use.

While we are gathering connection string data, let's pull the IoT Hub's Event Hub interface connection information.  Navigate back to the Portal Dashboard and select the IoT Hub.  Scroll down to `Messaging` which will open a second pane containing the Event Hub interface information for the IoT Hub.  Copy both the `Event Hub-compatible name` and the `Event Hub-compatible endpoint` strings and save them off to the aforementioned text file.

Navigate back to `Shared access policies`, select the `iothubowner` policy and copy the `Primary key` value into the text file.

Azure Function 

With batched and compressed data flowing from the device simulator to IoT Hub, we now need an Azure Function that can decompress the message, shred the concatenated sensor data and re-post each individual message onto the new event hub we created in the previous section.  While Azure Functions are relatively straightforward, there are a number of steps to this process and many features are marked as being in `Preview` and/or `Experimental` - keep in mind that some things may be slightly different than shown below.

In the Portal, select the `+` icon in the top left and search for `Function App`.

Press `Create` to kick off the deployment - the app should only take a few moments to create.

Once the Function App is deployed, a quick-start blade will present options to create C# and JavaScript functions.  Use the `+ New Function` tab in the upper left corner to reveal the full template list.  Using the language drop-down, filter for only F# templates and select the `EventHubTrigger-FSharp` template.

With the `EventHubTrigger-FSharp` template selected, a pane will show up below the templates prompting for input data.

Give the function a name.  In the text box for `Event Hub name`, enter the `Event Hub-compatible name` from the IoT Hub that was saved off to your text file in the previous section. Continue by pressing the `new` button next to the `Event Hub connection` text box.  This will present a new blade where we'll enter the connection string for the Event Hub interface of the IoT Hub.

In the text file paste this template connection string and add the values saved off earlier:

`Endpoint={Event Hub-compatible endpoint};SharedAccessKeyName=iothubowner;SharedAccessKey={iothubowner_primary_key}`

The result should look like this:


Paste the connection string into the `Connection string` field and press `OK`.

Back in the template pane, press the `Create` button at the bottom of the blade.  The portal will present a run.fsx file, and likely some error messages that can safely be ignored for now.

Select the `Integrate` tab under the Function and update the `Event parameter name` to `input` and press `Save`.

Click back to the `Develop` tab and update the Run function's first parameter name, as well as its use in the log statement, to `input`.  Press `Save and run`.  The Function should compile and execute.

In order to post the shredded messages to our Event Hub, we'll need the WindowsAzure.ServiceBus NuGet package.  Thankfully, the Functions service provides an easy mechanism to add dependencies.  In the upper right corner of the Function work-space, select `View Files` and press `+Add` at the bottom of the newly presented pane.  Enter `project.json` and press `enter`.  Much like ASP.NET Core projects, we can add project metadata, and dependencies, to the Function app using the project.json file.  The text below can be pasted into the project.json file, edited and saved, which will kick off the NuGet package restore process. 

Flip back to the run.fsx file and let's get working on the code for decompressing, shredding and re-posting of the simulated sensor data.

Delete the existing contents of the `run.fsx` file and add in our reference directives and open expressions:

Bind two identifiers that will hold the target Event Hub name and connection string information from the previous section (the connection string we split on `EntityPath`).

Add in the `decompress` function we used in the `DeviceSimulator` project and start the binding for the Function's `Run` function like so:

The `Run` function needs to create an Event Hub client, decompress the input string, shred the batched sensor data and re-post each sensor measurement using the Event Hub client.  We can easily bind the decompressed data to an identifier in the run function and create the Event Hub client like so:

The last thing we need to do is split the grouped data on the `|` delimiter, iterate over the array result of that operation and ask the eventHubClient to `Send` each JSON payload. Here is the complete function code including the split and re-post.
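Put together, run.fsx might look like this sketch (the hub name and connection string placeholders stand in for the values saved off earlier; the `TraceWriter` signature and `#r` directive follow the classic Functions F# hosting model and may differ slightly in your runtime version):

```fsharp
#r "Microsoft.ServiceBus"

open System
open System.IO
open System.IO.Compression
open System.Text
open Microsoft.Azure.WebJobs.Host
open Microsoft.ServiceBus.Messaging

// Values saved off in the previous section (placeholders here).
let eventHubName = "{your-event-hub-name}"
let eventHubConnection =
    "Endpoint=sb://{namespace}.servicebus.windows.net/;SharedAccessKeyName={policy};SharedAccessKey={key}"

// Base64-decode, then gunzip back to the delimited measurement string.
let decompress (s: string) =
    use input = new MemoryStream(Convert.FromBase64String(s))
    use gzip = new GZipStream(input, CompressionMode.Decompress)
    use reader = new StreamReader(gzip, Encoding.UTF8)
    reader.ReadToEnd()

let Run (input: string, log: TraceWriter) =
    let eventHubClient = EventHubClient.CreateFromConnectionString(eventHubConnection, eventHubName)
    // Shred the batched payload and re-post each measurement individually.
    (decompress input).Split('|')
    |> Array.iter (fun measurement ->
        log.Info(sprintf "Posting: %s" measurement)
        eventHubClient.Send(new EventData(Encoding.UTF8.GetBytes(measurement))))
```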

Notice the added debug log statement that we can now use to test our function.  In the upper right corner of the Function page press `Test` to reveal a test pane.  Paste the following text into the `Request body` and press `Save and run`.

The function app will re-compile and execute on the test data, producing a log output like so:

Azure Stream Analytics 

With the Azure Function properly decompressing and shredding the IoT Hub data, and posting the results to our Event Hub, we can now focus on aiming our sensor data at PowerBI for display.  The easiest way to set up a properly shaped streaming dataset for PowerBI is to pass the Event Hub events through an Azure Stream Analytics Job (ASA).

Back in the Portal, select the `+` icon in the upper left corner and Search for `Stream Analytics`.  Select `Stream Analytics Job` and press `Create` in the new blade.

 The Portal will present a new configuration blade that requires a `Job name`; be sure to add the job to the existing resource group for cleanup later. Press `Create` to kick off the deployment of the Stream Analytics Job.

Once the deployment completes, select the `Inputs` tab of the ASA job.  Press the `+ Add` button at the top of the new pane and enter the following information:
  •  Input Alias - this will be the value we reference in the `from` field of the ASA query
  • Source Type - set to `Data Stream`
  • Source - select `Event Hub` from the drop down
  • Subscription - select `Use event hub from current subscription`
  • Service bus name - select the event hub service name created a few sections ago
  • Event hub name - select the event hub name created in the previous event hub service
  • Event hub policy name - select the policy that maps to the `Manage` policy 
  • Event hub consumer group - leave blank to default to the `$Default` consumer group
  • Event serialization format - select JSON from the dropdown
  • Encoding - leave it set to `UTF-8`

Press `Create` to complete the input definition.

Select the `Outputs` tab and press the `+ Add` button at the top of the pane.  Give the output alias a name and set the Sink to `Power BI`.  The portal will ask for Authorization to wire itself up to a PowerBi subscription.  If you don't already have a PowerBI account you can create one for free on the PowerBI Getting Started page.  

`Authorize` the Portal to connect to PowerBI which will re-direct you to an MSA login screen.  Once the login process is completed, the Portal will redirect you to complete wiring up the ASA job output.  For the `Group Workspace` drop-down select `My Workspace` and enter new names for the `DataSet Name` and `Table Name` fields.

With the output defined we can complete the ASA job set up by building the query that will shape our data for PowerBI consumption.  Remember that our JSON sensor data is a complex data structure with the GeoCoordinate sub-type that will need to be flattened for PowerBI consumption. Select the `Query` tab of the ASA Job which will open a new pane with some default SQL'ish code.  Delete the existing query and enter the following:
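A query along these lines does the flattening (the `INTO`/`FROM` aliases must match the output and input aliases defined above, and the field casing must follow whatever your JSON serializer emitted — both are illustrative here):

```sql
SELECT
    deviceId,
    windSpeed,
    obsTime,
    location.Latitude AS latitude,
    location.Longitude AS longitude
INTO
    [powerbi-output]
FROM
    [iothub-input]
```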

This query will  create a new data object that flattens the location data, extracting just the Latitude and Longitude values along with the top level DeviceId, Wind Speed, and Observation Time values.

Navigate back to the ASA `Overview` tab and press `Start` at the top of the overview pane.  Note that ASA jobs are notoriously slow to start and stop ... be patient, it will eventually start.  

Flip back to Visual Studio, set the Device Simulator as the startup application and run it.  After a few minutes you should start to see Monitoring Events on the ASA overview page.

Power BI

The final step in our F# & IoT exploration is to visualize our sensor data.  We'll leverage PowerBI to display geographic information and a historical line chart for the simulated sensors.

Log into PowerBI and in the left pane scroll down to `Datasets`, further selecting `Streaming datasets`.  This will bring up a menu of the available streaming datasets, one of which should be the output of the ASA job.

On the far right of the IoT dataset, press the `Create Report` icon.  You will be redirected to a new blank report.  From the Visualizations fly-out on the right, select the regular "Map" visualization.

To create the geographic map:

  • Drag the `deviceId` Field into the Legend of the visualization
  • Drag latitude to Latitude
  • Drag longitude to Longitude
  • Drag windSpeed to Size, open the field's dropdown and set the value to `Average` 
The resulting graph will look like this:

To generate the historical speed chart, add a line chart to the report and set the following values:
  • Axis - obsTime
  • Legend - deviceId
  • Values - Average of windSpeed
With a bit of filtering you'll end up with a report like so:


I hope this tutorial has illuminated some of the ways that F# fits nicely into the world of IoT, especially in the context of Cloud solutions. We've gone from data generation through transmission, on to data post-processing and finally visualization.  At each one of these steps are opportunities where F# and the community's F# tooling can play a deeper and more meaningful role.  And the best part of all this? ... Our community is only getting started.  We still have so much to say about topics like application correctness, developer productivity, and nearly every aspect of security.

My final call to action, equally for those new and old to the language alike, is to stay involved, come listen to people speak or speak yourself, try out the libraries, unit test your C# code with F#, build pet projects, build complex systems, hell build the next Jet; but most of all, remember to enjoy writing code.  We write F# because it makes coding fun again, it pushes us to be better, it enables us to be better engineers/coders/developers.    

Further Exercises

  • Create functions across the demo applications that will build the connection string from its elements.
  • Use Fable to create a custom PowerBI Visual.
  • Create a simulator application and run on Raspbian on a Raspberry Pi
  • Explore the Azure IoT Gateway SDK and compile a series of F# modules to run in the Gateway on Windows IoT Core
  • Create an app that will tap the IoT Hub Event Hub interface and pull off a sampling of messages using EventProcessorHost 
  • Test out the Cloud to Device Messaging, Device Management and Device Twin features of the Azure IoT SDK. 


Azure IoT Gateway SDK Build Problems

9/08/2016 Posted by William Berry , , , 3 comments
I've had the opportunity to start exploring the Azure IoT product suite lately and overall, first impressions are very positive.  I've rolled a few F# solutions for device management and simulation with good success.  The next step of this project has led me to the Azure IoT Gateway SDK in an attempt to implement a custom filtering/batching/grouping/compression gateway for a field array.  Unfortunately, after adding the C++ options to Visual Studio, installing CMake a few times and adding it to my path, I was stuck.  Running the included `build.cmd` or CMake by hand led to a string of errors like:

After some lamenting that this happens every time someone lets me near C code I checked the issues list and found that I cannot follow directions.  Aside from making sure that your path to the checked out repo is under 20 characters, because `Windows`, you need to make sure to include the recursive flag in the clone from GitHub.  Here is the command for the lazy:

`git clone --recursive`

Happy coding!

Friendly Disclaimer: I now work for Microsoft in DX 


EasyNetQ Advanced Bus from F#

4/21/2016 Posted by William Berry , , No comments
I've been working lately on our company's first production F# application, a simple logging service that spans between our transaction processing system and our Hadoop cluster.  Initially we'll be logging some simple transaction metadata for internal reporting applications, but hope to expand it to other types of logging/audit data eventually.  Given that the application could be reactive in nature, I thought it would be fun to play with both the MailboxProcessor and the .Net Reactive Extensions library in this implementation.

The basic architecture of the application uses EasyNetQ to consume a RabbitMq queue, an async recursive function to push data to HBase and an Rx Subject backed by a ConcurrentQueue to span the producer and consumers.

I initially wrote an async recursive function to process the rabbit queue using the RabbitMq .Net client library.  I quickly realized, however, that the yak shaving required to do everything other than consume the queue was generally a waste of my time, SO ... EasyNetQ.  The nice thing about EasyNetQ is that it wraps up the whole process nicely in a C# library.  The flip side is that it's a tried-and-true C# library and, as such, is not as idiomatic to use from F# as one would like.

Since I am new to the F# world, I struggled a bit to get the type signatures together to implement the Advanced Bus from the EasyNetQ library.  The following gist is an example of setting up a synchronous message handler that bypasses the built in message de/serializer, instantiating a queue and then wrapping the Advanced Bus in a async block that we can kick off in the main body of our service. See here for details on the advanced API of RabbitMq.
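A sketch of what that wiring can look like against the (pre-v7) advanced API — the queue name and connection string are illustrative, and the `Action<_,_,_>` handler is the raw-bytes overload that bypasses EasyNetQ's de/serializer:

```fsharp
open System
open System.Text
open EasyNetQ

// Connect and grab the advanced bus (illustrative connection string).
let advancedBus = RabbitHutch.CreateBus("host=localhost").Advanced

// Declare (or assert) the queue we'll consume from.
let queue = advancedBus.QueueDeclare("transaction.log")

// Synchronous raw-bytes handler: no message de/serialization involved.
let handler =
    Action<byte[], MessageProperties, MessageReceivedInfo>(fun body _ _ ->
        printfn "Received: %s" (Encoding.UTF8.GetString(body)))

// Wrap consumption in an async block we can kick off from the service body.
let startConsuming = async { advancedBus.Consume(queue, handler) |> ignore }
```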


Making a Sensory Swing

4/20/2016 Posted by William Berry , , , No comments
In a bit of a departure from tech, here's some instructions on making a slick sensory swing for your sensory starved little one.

Start with a pull-up bar, preferably a bolt together one like this one from Dicks:
We'll take advantage of the extensive number of bolts used to assemble this bar to provide hanging points for the swing.

The hardware shopping list is pretty straightforward:

(2) 3/16" quick links
(2) 1/4" x 2" fully threaded eye bolts (prefer welded or forged)
(2) 1/4" nylon nuts (nylocks)
(4) 1/4" washers
(2) 1/4" lock washers
(2) 32lbs long, automotive extension springs
(1) 15' x 1" nylon (or similar) truck strap

Begin assembly by removing the forward and inside carriage bolts from the pull-up bar.  Pre-assemble the eye bolts by running a nut up the threads to the seat.  Add the lock washer and then one of the flat washers.

Push the eye bolt through the empty hole from the bottom, drop the second washer and then the nylock.  To tighten things up, stick a screwdriver through the eye bolt and run the nylock down until there is a thread or two of the eye bolt past the nylon.  Follow up by running the seated nut back the other direction until the lock washer fully closes.

Using a quick link, attach the spring to the eye bolt, and then hang the swing from the spring.  Unfurl the truck strap, cut off the hook and seal the fray with a lighter.  Fold the webbing in half and cut and seal the midpoint, making (2) 7' runs of webbing.  With both straps in hand, step into the swing and press the bucket to the floor.  From the top, slide a length of the webbing down the center of the spring and pull through.  Take the tail of the webbing at the top and tie a simple clove hitch with a locking half hitch to the pull-up bar.  Now step out of the swing, you look like an idiot ;-)

Load up your little one and watch the magic unfold; don't forget to show them how to use the webbing to pull themselves up for more bouncy fun.

Assembled Eye Bolt
Tightening the Nylock
Seat the Nylock a thread or two

Webbing exiting spring
Webbing tied off & fished through spring

The swing fully assembled