Author: René Hézser

Azure IoT Edge not starting

Sometimes a permission denied is a permission denied 🙁

[INFO] - Starting Azure IoT Edge Security Daemon
[INFO] - Version - 1.0.10~rc1
[INFO] - Using config file: /etc/iotedge/config.yaml
[INFO] - Configuring /var/lib/iotedge as the home directory.
[INFO] - Configuring certificates…
[INFO] - Transparent gateway certificates not found, operating in quick start mode…
[INFO] - Finished configuring provisioning environment variables and certificates.
[INFO] - Initializing hsm…
[INFO] - Finished initializing hsm.
[INFO] - Provisioning edge device…
[INFO] - Starting provisioning edge device via manual mode using a device connection string…
[INFO] - Manually provisioning device "iotedgedevice" in hub ""
[INFO] - Finished provisioning edge device.
[INFO] - Initializing the module runtime…
[INFO] - Initializing module runtime…
[INFO] - Using runtime network id azure-iot-edge
[WARN] - Could not initialize module runtime
[WARN] - caused by: Container runtime error
[WARN] - caused by: error trying to connect: Permission denied (os error 13)
[ERR!] - The daemon could not start up successfully: Could not initialize module runtime
[ERR!] - caused by: Could not initialize module runtime
[ERR!] - caused by: Container runtime error
[ERR!] - caused by: error trying to connect: Permission denied (os error 13)

This is the output I got via `journalctl -u iotedge -f` on a test installation.

For troubleshooting purposes I went through the guide, but nothing solved my problem. I then disabled HTTP and MQTT support as suggested. Still not starting.

Finally I got it up and running by creating a docker group, adding the iotedge user to it, and changing the group ownership of the /var/run/docker.sock file: `sudo chown root:docker /var/run/docker.sock`
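The full sequence, as a sketch (assuming the socket lives at the default path; the daemon is restarted afterwards so it picks up the new group):

```shell
# create the docker group, add the iotedge service user to it,
# and give that group ownership of the Docker socket
sudo groupadd docker
sudo usermod -aG docker iotedge
sudo chown root:docker /var/run/docker.sock
sudo systemctl restart iotedge
```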

This post is meant to be found via search engines if you (or I, again) have the same startup problems.

My context: Ubuntu 20.04 with snap-installed Docker.

Properties for IoT Messages in Azure Stream Analytics

In this post I want to show how to use, in Stream Analytics, the properties that IoT devices add to the messages they send to Azure IoT Hub. And while we are talking about properties, let’s use message enrichment as well 🙂

Stream Analytics Architecture

Sample Message

The green properties will be added by the Message Enrichment feature of IoT Hub, as that data is most likely not known on the IoT device or does not need to be transferred with each message.
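As a sketch, such a message could look like this in JSON. All values are illustrative, and the properties are shown next to the body for readability; on the wire they travel as message metadata, with customerno and customerid added by IoT Hub rather than by the device:

```json
{
  "body": {
    "temperature": 27.3,
    "humidity": 62.1
  },
  "properties": {
    "alert": "false",
    "customerno": "0815",
    "customerid": "4711"
  }
}
```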

Sample IoT Device

This message is sent by a sample C# client. I used this one:

The code that sends the message with the alert property has been adjusted to this:
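A minimal sketch of what that adjustment can look like with the Azure IoT device SDK; deviceClient, json, and currentTemperature are assumed names from the sample client:

```csharp
// serialize the telemetry and attach "alert" as an application property,
// so it travels as message metadata instead of inside the body
var message = new Message(Encoding.UTF8.GetBytes(json));
message.Properties.Add("alert", (currentTemperature > 30) ? "true" : "false");
await deviceClient.SendEventAsync(message);
```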

Configure IoT Hub

Device Twin

In most cases the IoT (Edge) device does not know which customer it is associated with, as it does not need to know. For further processing of the data – or for device management – this information is relevant. Therefore we add it to the device twin in Azure IoT Hub.

The property names do not need to match the desired properties that will be added via message enrichment. You can choose a structure that fits best.

Message Enrichment

We want to add the customer number and id from the device twins to the message before it is being passed along to an endpoint.

Message Enrichment settings in IoT Hub

As you can see the name of the property that is added does not need to match the name of the twin properties. Make sure you add the message enrichment to the right endpoint(s). You can decide to add different properties to messages that are routed to different endpoints.

Azure Stream Analytics

In the Stream Analytics job we use a SQL-like query to filter the incoming message stream and route the messages to endpoints. The query will work fine as long as you use only the columns that are in the body of the messages (like “temperature” or “humidity” in this example).

To be able to use the values in the properties, we need the GetMetadataPropertyValue function. Please take note of the sentence on the docs page: “This function cannot be tested on the Azure portal using sample data.”
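A query along these lines does the trick; this is a sketch, and IoTHubInput, BlobOutput, and the property names are placeholders for your own input, output, and enrichment names:

```sql
SELECT
    GetMetadataPropertyValue(IoTHubInput, '[User].[customerid]') AS customerid,
    GetMetadataPropertyValue(IoTHubInput, '[User].[customerno]') AS customerno,
    GetMetadataPropertyValue(IoTHubInput, '[User].[alert]')      AS alert,
    *
INTO
    BlobOutput
FROM
    IoTHubInput
```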


The first three columns are our property and message-enrichment columns, while all the other columns are added as well.


Let’s assume we want to write all messages to a storage account where the customer id is part of the path.

Stream Analytics Blob storage output

This will work because we added the customerid column in the query, so it can be used in the path. Remember, this is a demo; we only use the customerid as part of the path.

In the architecture diagram at the beginning of the post an Alert route is drawn. You can achieve this by adding a second query to the job which routes certain messages to that output.

VisionAI DevKit won’t deploy a module

Today my VisionAI DevKit was not deploying a module. In the logs (sudo journalctl -u iotedge -f) I could see the deployment was received:

Successfully pulled image
Creating module VisionSampleImagenet…
Could not create module VisionSampleImagenet
caused by: No such image:

Strange. During troubleshooting I ran docker images and saw a lot of older images and versions. After deleting a lot of them with docker image rm xyz, the deployment succeeded and the module started. 🙂
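The cleanup itself is standard Docker housekeeping; `docker image prune` does roughly the same in one step:

```shell
docker images                  # list all images with their tags and sizes
docker image rm <image-id>     # remove a single image
docker image prune -a          # or remove every image not used by a container
```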

Learning: Clean up the mess…

Configure Azure IoT Edge for downstream devices

A lot of documentation and posts are available to setup an Azure IoT Edge to act as an IoT Hub for downstream devices. In order to get it up and running in a dev environment, I had to do some more research.

My setup is a Raspberry Pi 3 with Raspbian Stretch and an Azure IoT DevKit, which looks like this. Please remember that the setup I used is for development only: I’ve used symmetric-key authentication for the IoT device, whereas in a production scenario you would probably use certificate-based authentication and no self-signed certificates for the TLS encryption.

Transparent Gateway

Some starting points for reading are:

And here are my findings, with the solutions that worked for my setup:

  1. The downstream IoT devices should be able to connect to port 443 on the Edge module. But that port was not open/listening.
  2. How to verify the gateway certificate after the connection has been established?

To make a device connect to the gateway instead of directly to an IoT Hub, append ;GatewayHostName=hostname to its connection string; the device will then go through the gateway. Take note of the hostname and make sure it matches the name you specified when you created the certificates.

Looking at the serial output of the DevKit, I noticed it could not connect to the gateway. A quick analysis revealed that the Pi did not accept connections on port 443. Hmm, maybe a firewall on the Pi? As it turned out, you have to tell the edgeHub container to listen on 443 if you want to use it as a gateway.

Port bindings for the Edge module
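In the portal this goes into the Container Create Options of the edgeHub module; the JSON below is the shape documented for a transparent gateway (5671 and 8883 are the AMQP and MQTT ports):

```json
{
  "HostConfig": {
    "PortBindings": {
      "443/tcp": [{ "HostPort": "443" }],
      "5671/tcp": [{ "HostPort": "5671" }],
      "8883/tcp": [{ "HostPort": "8883" }]
    }
  }
}
```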

This will allow incoming connections not only for HTTPS. After the change was pushed to the Edge device, I could connect to it on port 443. Hurray.

The next challenge was to get the downstream device to accept the certificate that the gateway offered. To be able to verify it, the device has to trust the root certificate. In my case this was the file from the ~/certificates/certs directory. Open it with an editor, paste the content into the .ino file, and use the certificate.

Now the IoT device should be able to connect to the gateway. Have fun with IoT 😉

Azure SQL with AAD authentication

I thought this had to be an easy task. Well, actually it is. If you find the right documentation and read it in the correct order 🙂

Basically I wanted to be able to login with my AAD (Azure Active Directory) user.

In the first step, the database needs to be configured for Azure Active Directory in order to add users in the second step.

Configure an Administrator

In the Azure portal, go to the SQL server and search for “active directory” to add an Active Directory admin.

After you’ve added an admin and saved the setting, you will be able to use SSMS (SQL Server Management Studio) to log on to the server. SSMS will probably prompt you to add a firewall exception.

Use SQL Management Studio to add users and grant permissions

For other users (not the administrator we configured above) to be able to log on, access has to be granted just as with an on-premises SQL Server.

Add a user to the master DB

Create a new query on the master database.
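In that query, the AAD user is created from the external provider; a sketch, with alice@contoso.com as a placeholder for your own AAD user:

```sql
-- run in the master database while connected as the AAD admin
CREATE USER [alice@contoso.com] FROM EXTERNAL PROVIDER;
```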

Next grant permissions to the user on the database itself.

Add user to database

Open another query on the database.
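In that query, create the user in the database itself and grant permissions via database roles (again with a placeholder user; pick the roles your scenario needs):

```sql
-- run in the user database
CREATE USER [alice@contoso.com] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [alice@contoso.com];
ALTER ROLE db_datawriter ADD MEMBER [alice@contoso.com];
```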

That should be it.

Some documentation I used:

23 Jun

Two Hackathons in a week

What a week. Two hackathons (‘hack’+marathon) in a row. That was exhausting.

  • A three-day hackathon with my colleagues from Arvato Systems and a customer. We used Cognitive Services with 8 different programming languages and created great PoCs.

  • The second hackathon was about Azure Stack with Microsoft.

Thanks to all participants and to the organizers. It has been fun and a great experience. Now I am looking forward to seeing how the results will influence decisions for follow-up projects.

Besides the work, I enjoyed the opportunity to get to know you all better and had some interesting networking. Let’s see what events the future brings 😉


What is Cloud-native? – The native language of the cloud

In the age of digital transformation we are surrounded by countless new buzzwords. “Cloud-native”, however, is by no means another floor on the Tower of Babel, but rather an essential part of digitalization: the native language of the cloud.

I have written another post on the Arvato Cloud Blog:—die-muttersprache-der-cloud.html

Azure Table Storage REST API not returning data

Today I wanted to query entities of an Azure Table via the REST API and did not get any results.

Looking over the query again and again did not solve the problem. Sometimes I did not get any items back.

The “sometimes” depended on the query. I checked each part: partition key, string and date columns. Everything looked all right. And then it hit me.

I did not get a result if the query matched too much data. Setting the $top option to 1000 will always return data.

Learning for today: if a filtered query would return too much data, it will not return anything until you implement paging.
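The paging itself boils down to reading the continuation headers of each response and feeding them back as query options. A small helper, as a sketch (the header and option names are the documented ones; everything else is illustrative):

```python
def continuation_params(headers):
    """Build the query options for the next page from the continuation
    headers of an Azure Table response; None means there are no more pages."""
    next_pk = headers.get("x-ms-continuation-NextPartitionKey")
    if next_pk is None:
        return None
    params = {"NextPartitionKey": next_pk}
    next_rk = headers.get("x-ms-continuation-NextRowKey")
    if next_rk is not None:
        params["NextRowKey"] = next_rk
    return params

# example: a response carrying both headers yields the options for the next request
print(continuation_params({
    "x-ms-continuation-NextPartitionKey": "pk1",
    "x-ms-continuation-NextRowKey": "rk1",
}))  # {'NextPartitionKey': 'pk1', 'NextRowKey': 'rk1'}
```

Keep requesting pages until the helper returns None.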

Meetup #4 – Topic: Azure IoT Hub, MQTT

On January 24 it is that time again.


René Hézser – Arvato Systems

Introduction to Azure IoT Hub; connecting an ESP8266 with an LED and a sensor to the IoT Hub

Dennis Hering – Microsoft Deutschland GmbH

Basics and inner workings of MQTT,
“Last Will and Testament (LWT)” – best practices and code patterns

Please remember to register via the website so that we can announce you at the gate.

Resistor values for a blinking Christmas tree

I bought a DIY blinking Christmas tree. Unfortunately it did not contain any assembly instructions 🙁

So I searched for the part number CTR-30B, which is printed on both parts of the tree, and found a couple of instructions. After I had soldered the tree, I saw that the colors were not evenly bright. I adjusted the values of the resistors and want to share them.

R2: 330 Ω
R4: 560 Ω
R6: 2 kΩ

For R1, R3, R5 and R7 I used the provided 10 kΩ resistors.
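For reference, such adjustments follow the usual series-resistor formula; here is a small sketch where the supply and forward voltages are assumed example values, not measurements from this kit:

```python
def led_series_resistor(v_supply, v_forward, i_led):
    """Series resistance in ohms needed to drop the excess supply voltage
    across the resistor at the target LED current."""
    return (v_supply - v_forward) / i_led

# e.g. a 4.5 V battery pack and a red LED (~2.0 V forward) at 5 mA:
print(round(led_series_resistor(4.5, 2.0, 0.005)))  # 500
```

A dimmer color gets a smaller resistor (more current), a brighter one a larger resistor.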

Upload files to NodeMCU from Windows Bash

Uploading files to a NodeMCU ESP8266 can be done with the Java tool ESPlorer. If you want to automate this process, you’ll want to use something else.

A quick search brought up NodeMCU-Uploader, which is a Python script. On my Windows machine I have the Bash installed. Naturally I want to use it 🙂

Fortunately, the Bash allows access to the COM ports. You have to modify the permissions for the device, though.

  • sudo chmod 666 /dev/ttyS3, where “3” is the COM port number you can see in the Windows Device Manager

After that, the COM port can be accessed. The uploader can be installed with pip.

  • sudo apt install python-pip
  • pip install nodemcu-uploader

After everything has been set up, files can be uploaded by specifying the port and the file:

nodemcu-uploader --port /dev/ttyS3 upload application.lua
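For more than one file, a small loop does the trick (a sketch, with the same assumed port as above):

```shell
# upload every Lua file in the current directory
for f in *.lua; do
    nodemcu-uploader --port /dev/ttyS3 upload "$f"
done
```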

No Default Subscription?

Set-AzureWebsite : No default subscription has been designated. Use Select-AzureSubscription -Default <subscriptionName> to set the default subscription.

*doh* Again I’ve used PowerShell cmdlets for Azure classic instead of Resource Manager 🙁

Reminder: always check for the magic “Rm” characters in the command if a resource cannot be found.

Azure Meetup OWL

Don’t forget: the Azure Meetup on build, test, and deployment with Azure takes place tomorrow in Bielefeld.

Meetup #2 – Build, Test and Deployment with Azure

Wednesday, Oct 11, 2017, 7:00 PM

Arvato Bielefeld / Sennestadt
Fuggerstraße 11 Bielefeld, DE


Dear Azure OWL community, on [masked] our second Azure OWL meetup will take place. This time it will mainly be about build, test, and deployment on the Azure platform. Tyler from Microsoft will give a talk on build, test, and deployment, focusing in particular on delivery pipelines with Docker, Kubernetes, and Visual Studio Team…



HowTo use Azure cmdlets in Azure Schedule

A Runbook schedule can be triggered at most every hour. If you need a smaller interval, like every minute, you can use the Azure Scheduler instead.
So I went to the Azure portal, created an Azure Scheduler instance (with a job collection tier of at least Basic, to be able to create schedules that are triggered every minute) and called a Runbook via a webhook.

The Runbook contains a cmdlet that results in an error 🙁
Get-AzureRmMetric : The term 'Get-AzureRmMetric' is not recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try

Azure cmdlets can be made available through the Automation Account the Runbook is using. The “Browse Gallery” link will let you find and add the necessary modules.

The error message above appears because a) the cmdlet was not installed and b) the referenced version of AzureRM.profile was too old. Fortunately, the problem can be resolved easily by upgrading the Azure modules.

After all modules were up to date, I could add the desired module and my Runbook wasn’t complaining anymore 🙂

Azure SQL – Standard Tier IDs

In case you need the ServiceObjectiveId for SQL standard tiers, here is the list for you.

Tier name       ServiceObjectiveId
Standard (S0)   f1173c43-91bd-4aaa-973c-54e79e15235b
Standard (S1)   1b1ebd4d-d903-4baa-97f9-4ea675f5e928
Standard (S2)   455330e1-00cd-488b-b5fa-177c226f28b7
Standard (S3)   789681b8-ca10-4eb0-bdf2-e0b050601b40
Standard (S4)   3cf14e1a-0a5d-408c-bbc7-f63c5282f735
Standard (S6)   ab69b4e3-d7cc-4aa5-87a6-f8b50615a03c
Standard (S7)   b6ca0894-d2f0-4e40-99f5-0f8a93cc2437
Standard (S9)   0efa88e9-99ff-4e36-a148-8c4b20c0826c
Standard (S12)  98100e8b-2f8a-4a81-9eb5-4d1e675c5a29

Usually you would change the tier in the Azure portal. To change it via PowerShell, you can use the IDs above.
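As a sketch with the AzureRM module (resource group, server, and database names are placeholders; here the tier is requested by name, which maps to the IDs above):

```powershell
# scale the database to Standard S2
Set-AzureRmSqlDatabase -ResourceGroupName "my-rg" -ServerName "myserver" `
    -DatabaseName "mydb" -RequestedServiceObjectiveName "S2"
```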

Connection Problems to a Secure Service Fabric Cluster

To be able to connect to a secure Service Fabric cluster via PowerShell, you need to import the specified certificate into your personal certificate store; otherwise an exception will be thrown. Unfortunately, the exception does not point in the right direction 🙁

So in case you get an Exception like this

Connect-ServiceFabricCluster : An error occurred during this operation. Please check the trace logs for more details.
At line:1 char:1
+ Connect-ServiceFabricCluster -ConnectionEndpoint xyz-sf-de …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Connect-ServiceFabricCluster], FabricException
+ FullyQualifiedErrorId : CreateClusterConnectionErrorId,Microsoft.ServiceFabric.Powershell.ConnectCluster

you need to import the certificate with its private key (*.pfx) into the personal certificate store of the PC you are running PowerShell on.
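The import can be done in PowerShell as well; a sketch, with the file path and password as placeholders:

```powershell
# import the cluster certificate including its private key
# into the personal store of the current user
$pfxPassword = ConvertTo-SecureString -String "<pfx password>" -AsPlainText -Force
Import-PfxCertificate -FilePath .\cluster-cert.pfx `
    -CertStoreLocation Cert:\CurrentUser\My -Password $pfxPassword
```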


Specifying -Verbose for PowerShell will print additional information, which does not help a lot either.

PS C:\WINDOWS\system32> Connect-ServiceFabricCluster -ConnectionEndpoint -X509Credential -FindType FindByThumbprint -FindValue xyz -StoreLocation CurrentUser -StoreName My -ServerCertThumbprint xyz -Verbose
VERBOSE: System.Fabric.FabricException: An error occurred during this operation. Please check the trace logs for more
details. —> System.Runtime.InteropServices.COMException: Exception from HRESULT: 0x80071C57
at System.Fabric.Interop.NativeClient.IFabricClientSettings2.SetSecurityCredentials(IntPtr credentials)
at System.Fabric.FabricClient.SetSecurityCredentialsInternal(SecurityCredentials credentials)
at System.Fabric.Interop.Utility.<>c__DisplayClass25_0.<WrapNativeSyncInvoke>b__0()
at System.Fabric.Interop.Utility.WrapNativeSyncInvoke[TResult](Func`1 func, String functionTag, String
— End of inner exception stack trace —
at System.Fabric.Interop.Utility.RunInMTA(Action action)
at System.Fabric.FabricClient.InitializeFabricClient(SecurityCredentials credentialArg, FabricClientSettings
newSettings, String[] hostEndpointsArg)
at Microsoft.ServiceFabric.Powershell.ClusterConnection.FabricClientBuilder.Build()
at Microsoft.ServiceFabric.Powershell.ClusterConnection..ctor(FabricClientBuilder fabricClientBuilder, Boolean
at Microsoft.ServiceFabric.Powershell.ConnectCluster.ProcessRecord()
Connect-ServiceFabricCluster : An error occurred during this operation. Please check the trace logs for more details.
At line:1 char:1
+ Connect-ServiceFabricCluster -ConnectionEndpoint xyz-sf-de …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Connect-ServiceFabricCluster], FabricException
+ FullyQualifiedErrorId : CreateClusterConnectionErrorId,Microsoft.ServiceFabric.Powershell.ConnectCluster

Guest Post on the Arvato Systems Cloud Blog

I’ve published an article on the Arvato Systems Cloud Blog (German) and Arvato Systems Blog (English).

Saving money with serverless architecture

Is your company planning a large-scale digital campaign to market a new product? Are you not sure whether your current architecture can meet the demands of the planned campaign? Would expanding your infrastructure mean high costs?

How a serverless architecture can help you scale easily and save costs during your next marketing campaign

Is scaling up your existing infrastructure very expensive? Are you not sure if your infrastructure can meet the demands of your next big marketing campaign? Get to know an alternative to scale up your current environment for a short period.

Often there is no need to set up infrastructure such as web servers, database servers, and load balancers. Let us take a look at some cloud providers and how they can serve our content more cost-efficiently and deliver websites to visitors with high performance.

Azure Meetup OWL

The first meeting of the Azure Meetup OWL will take place on July 13. We are currently still looking for a venue and for topics 🙂
It will probably be about chatbots and machine learning. We will also take your feedback into account when choosing topics for the next meetings.

If you would like to attend, please register via the Meetup page.


Location: Arvato Systems, An der Autobahn 100, Gütersloh, Tower I

Time: 19:00 – 21:00

Please register by name, as visitors have to be announced at the gate. Either register via Meetup, or send me an e-mail.