Automatic Input Blob Binding in Azure Functions from Queue Trigger Message Data

Reading additional blob content when an Azure Function is triggered can be accomplished with an input blob binding: define a parameter in the function's Run method and decorate it with the [Blob] attribute.

For example, suppose you have a number of blobs that need converting in some way. You could initiate a process whereby the names of the blobs that need processing are added to a storage queue, with each queue message containing the name of one blob. This would allow the conversion function to scale out and convert multiple blobs in parallel.

The following code demonstrates one approach to do this. The function is triggered from a queue message that contains text representing the input blob filename that needs reading, converting, and then outputting to an output blob container.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run([QueueTrigger("capitalize-names")] string inputBlobPath)
        {
            string originalName = ReadInputName(inputBlobPath);

            var capitalizedName = originalName.ToUpperInvariant();

            WriteOutputName(inputBlobPath, capitalizedName);
        }

        private static string ReadInputName(string blobPath)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-in");

            var blobReference = container.GetBlockBlobReference(blobPath);

            string originalName = blobReference.DownloadText();

            return originalName;
        }

        private static void WriteOutputName(string blobPath, string capitalizedName)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-out");

            CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(blobPath);

            // Write the converted name to the output blob
            cloudBlockBlob.UploadText(capitalizedName);
        }
    }
}

In the preceding code, there is a lot of blob access code (which could be refactored out). The function could, however, be greatly simplified by using one of the built-in binding expression tokens. Binding expression tokens can be used in binding expressions and are specified inside a pair of curly braces {…}. The {queueTrigger} binding token extracts the content of the incoming queue message that triggered the function.

For example, the code could be refactored as follows:

using System.IO;
using Microsoft.Azure.WebJobs;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run(
            [QueueTrigger("capitalize-names")] string inputBlobPath,
            [Blob("names-in/{queueTrigger}", FileAccess.Read)] string originalName,
            [Blob("names-out/{queueTrigger}")] out string capitalizedName)
        {
            capitalizedName = originalName.ToUpperInvariant();
        }
    }
}

In the preceding code, the two [Blob] binding paths make use of the {queueTrigger} token. When the function is triggered, the queue message contains the name of the file to be processed, and in the two [Blob] binding expressions the {queueTrigger} token will automatically be replaced with the text content of the incoming message. For example, if the message contained the text “File1.txt”, then the two blob bindings would be set to names-in/File1.txt and names-out/File1.txt respectively. This means the input blob will automatically be read into the originalName string when the function is triggered, and the capitalizedName output will automatically be written to the output blob when the function completes.

To learn more about creating precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.

Dynamic Binding in Azure Functions with Imperative Runtime Bindings

When creating precompiled Azure Functions, bindings (such as a blob output binding) can be declared in the function code. For example, the following sketch (the parameter and queue names here are illustrative) defines a blob output binding:
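
[FunctionName("WriteRandomBlob")]
public static void Run(
    [QueueTrigger("blob-requests")] string message,
    // Declarative blob output binding: the built-in {rand-guid} binding
    // expression resolves to a new random GUID for each invocation
    [Blob("output-blobs/{rand-guid}")] out string newBlobContent)
{
    newBlobContent = message;
}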


This binding creates a new blob with a random (GUID) name. This style of binding is called declarative binding: the binding details are declared as part of the binding attribute.

In addition to declarative binding, Azure Functions also offers imperative binding. With this style of binding, the details of the binding can be chosen at runtime. These details could be derived from the incoming function trigger data or from an external source such as a configuration value or database item.

To create imperative bindings, rather than using a specific binding attribute, a parameter of type IBinder is used. At runtime, a binding can be created (such as a blob binding, queue binding, etc.) using this IBinder. The Bind<T> method of the IBinder can be used with T representing an input/output type that is supported by the binding you intend to use.

The following code shows imperative binding in action. In this example, blobs are created and the blob path is derived from the incoming JSON data, namely the category.

public static class CreateToDoItem
{
    [FunctionName("CreateToDoItem")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequestMessage req,
        IBinder binder,
        TraceWriter log)
    {
        ToDoItem item = await req.Content.ReadAsAsync<ToDoItem>();
        item.Id = Guid.NewGuid().ToString();

        // Choose the output blob path at runtime, based on the incoming item's category
        BlobAttribute dynamicBlobBinding = new BlobAttribute(blobPath: $"todo/{item.Category}/{item.Id}");

        using (var writer = binder.Bind<TextWriter>(dynamicBlobBinding))
        {
            writer.WriteLine(item.Description);
        }

        return req.CreateResponse(HttpStatusCode.OK, "Added " + item.Description);
    }
}

If the following two POSTs are made:

{
    "Description" : "Lift weights",
    "Category" : "Gym"
}

{
    "Description" : "Feed the dog",
    "Category" : "Home"
}
Then two blobs will be output with the following paths; note the random (GUID) filenames and the imperatively bound path segments Gym and Home:

Using the Actor Model and Akka.NET for IoT Systems

In addition to cloud-based offerings such as Microsoft’s Azure IoT Suite, it’s also possible to create representations of Internet of Things devices using the Actor Model and Akka.NET. As well as hosting an Akka.NET actor system in the cloud, you can also run one on-premises; for example, you may have an actor system monitoring an underground railway network for a city. In this case you may already have the infrastructure in place and decide that hosting in the cloud is unnecessary or risky, and instead deploy to servers geographically close to the underground, with some disaster recovery/backup servers located elsewhere.

The Actor Model is a good fit for IoT scenarios due to the inherent concurrency controls, fault-tolerance, and performance/scalability. Akka.NET is a “toolkit and runtime for building highly concurrent, distributed, and fault tolerant event-driven applications on .NET & Mono”[1].

In the Actor Model, the smallest unit of computation is the actor. An actor can receive messages from other actors, perform computations, manage its own state, and send messages to other actors.

For IoT scenarios, actors can model the system in a number of ways. For example, for every physical device in the real world there can be an actor instance to represent that device; this means you may have many actor instances, one for each physical device. When a sensor registers a new reading (temperature, proximity, pressure, etc.) it can communicate with the actor responsible for it: the actor receives a message and can update its internal state to represent the latest reading. The actor representing the device can also respond to another type of message to provide access to the last updated value. The device actor may also be responsible for sending messages to the physical device, to update its configuration for example, or this functionality may be broken out into its own actor definition depending on the exact requirements. The device actor may communicate with another actor that is purely responsible for handling the network interfacing required to communicate with devices. In this way, if the “network actor” crashes and restarts, the actors representing the devices don’t crash and lose their state.
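
As a minimal sketch of what such a device actor might look like in Akka.NET (the message and class names here are illustrative assumptions, not taken from the course):

using Akka.Actor;

// Illustrative messages: one to report a new sensor reading,
// one to request the last known value
public class UpdateReading
{
    public UpdateReading(double value) { Value = value; }
    public double Value { get; }
}

public class RequestLastReading { }

// One instance of this actor per physical device
public class DeviceActor : UntypedActor
{
    private double _lastReading;

    protected override void OnReceive(object message)
    {
        switch (message)
        {
            case UpdateReading update:
                _lastReading = update.Value; // update internal state to the latest reading
                break;
            case RequestLastReading _:
                Sender.Tell(_lastReading); // reply with the last updated value
                break;
            default:
                Unhandled(message);
                break;
        }
    }
}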

Groups of “device actors” can be created with the supervision hierarchy that the Actor Model provides, to protect groups of actors/devices from crashing other groups of actors/devices. These hierarchies can also provide device management semantics, such as registering a new device that has just come online. Actors can also represent more “abstract” concepts, such as a dedicated actor that performs querying of groups of actors.

To learn more about using Akka.NET with IoT scenarios, check out my Representing IoT Systems with the Actor Model and Akka.NET Pluralsight course.

Investing In You

I grew up in humble surroundings; my family was for the most part “working class”. I moved around a bit as a kid, moved schools a few times, and lived in state/council housing. At one point as a child (due to some unfortunate circumstances) we lived for a short time in a “homeless” hostel – a transitional place whilst waiting for state housing to be allocated. In one area where I lived as a child I had a knife pulled on me outside a local shop; I learned then how quickly I could run! Today I live in a nice safe suburb, drive a decent car, and generally don’t have to worry too much about personal safety or not having a roof over my head. This is due to some kindnesses I’ve been shown along the way and also to investing. Investing in myself…

I recently finished reading Tony Robbins’ Money Master the Game; it is a good book for those new to investing, with a few chapters being somewhat US-centric. (Other books you may find interesting if you’re just starting out on your investing journey include The Little Book of Common Sense Investing and A Random Walk Down Wall Street.) While Money Master the Game contains a lot of information about how to attempt to maximize your financial returns and ways to diversify your portfolio, in it Tony also talks about how you can add more value.

One way to improve your financial investments is by investing in yourself.

One nice idea is that by investing in yourself you can add more value; if you can add more value, you can earn more; and if you can earn more, you can invest more.

I had some help and kindnesses shown to me on my journey, and like everyone I’ve also had some challenges to deal with along the way. Even though I come from a somewhat humble background (and, as a white heterosexual male, I’ve never had to deal with prejudice), I am lucky that I have always loved to learn. I became fascinated by computers and programming from an early age and was lucky enough to borrow a machine for a time when I was younger. Eventually my interest and enthusiasm meant I was lucky enough to get my own machine.

Over the many years I continued to learn and was eventually privileged enough to be able to attend university to study computing. Even after starting my first job I continued to learn in my own time, in the evenings and at weekends, always interested in learning more.

At the time I was just following my natural curiosity, but looking back, what I was really doing was investing in myself.

About two and a half years ago I stepped into a gym for the first time in my life. I look back now and smile; my first experience was not pleasant. I didn’t know what exercises to do, and I tried bench pressing with an empty bar and wobbled all over the place while the muscular guy next to me hoisted 50kg dumbbells to the sky. I went home feeling awful and a little stupid. Two days later I went back, and I kept going back. I devoured Arnold Schwarzenegger's Encyclopedia of Modern Bodybuilding and eventually paid for some personal training sessions to learn how to clean and press and bench press properly. Whilst I am not a shredded muscular bodybuilder, I did lose 14kg over two years and add some amount of muscle mass and some strength. This is another example of investing in you, this time the physical you. Oftentimes, as developers we don’t always take the best care of ourselves, but I believe investing in the physical you carries over to the work/business you.

As the adage goes, "if you want better answers, ask better questions". One question I’m asking myself this year is: how can I continue to add more value than anyone else? As a software developer and a “techie-minded” person, in the past I would have thought of a question like this as big-headed or management-speak. But if you want to help others you need to help yourself, and if you want to help yourself you need to offer value to others.

If you want better answers, ask better questions

It’s good to take a step back sometimes and ask ourselves some questions, especially as we get laser-focused on the test we’re writing, the feature we’re working on, the sprint we’re in, or the next project that might be coming along.

I’m grateful for the opportunities I’ve been given in life, I’m grateful for the challenges and failures and what I’ve learned from them, and I’m grateful for the gift of my lifelong love of learning.

Whilst somewhat dramatic, there is some truth to the phrase “if you’re not growing you’re dying” and if you want to grow you have to invest in you.

New Free eBook: C# 7.2: What's New Quick Start

My new free eBook is now available for download.

The book covers the following new features of C# 7.2 (a few quick sketches follow the list):

  • Leading Digit Separator
  • Reference Semantics With Value Types
  • Non-trailing Named Arguments
  • Private Protected Access Modifier
  • Span<T> and ReadOnlySpan<T>
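
To give a flavour of a few of these features, here are some quick sketches (these examples are mine, not taken from the book):

using System;

class CSharp72Examples
{
    // Leading digit separator: an underscore can now follow the 0x or 0b prefix
    int binaryMask = 0b_1010_1010;
    int hexValue = 0x_FF_12;

    // Private protected: accessible only from derived types in the same assembly
    private protected int internalState;

    // Non-trailing named arguments: a named argument can appear before
    // positional arguments when it is in its correct position
    static int Add(int first, int second, int third) => first + second + third;
    int Sum() => Add(first: 1, 2, 3);

    // Reference semantics with value types: 'in' passes a read-only reference,
    // avoiding a copy of the struct at the call site
    static double Length(in (double X, double Y) point)
        => Math.Sqrt(point.X * point.X + point.Y * point.Y);
}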

Free C# 7.2 eBook

You can get the book for free (or pay what you’re able to) in PDF, EPUB, and MOBI formats.

Create Precompiled Azure Functions With Azure Event Grid Triggers

Visual Studio can be used to create precompiled Azure Functions using standard C# classes and tools/techniques, and these can then be published to Azure.

This article assumes you’ve created the resources (resource group, Event Grid Topic, etc.) from this previous article.

In Visual Studio 2017, create a new Azure Functions project.

Next update the pre-installed Microsoft.NET.Sdk.Functions NuGet package to the latest version.

To get access to the Azure Event Grid function trigger attribute, install the Microsoft.Azure.WebJobs.Extensions.EventGrid NuGet package (this package is currently in preview/beta).
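
For example, from the Package Manager Console (package versions will vary over time; -Pre is needed while the Event Grid package is prerelease):

Update-Package Microsoft.NET.Sdk.Functions
Install-Package Microsoft.Azure.WebJobs.Extensions.EventGrid -Pre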

Add a new class to the project with the following code:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Azure.WebJobs.Host;

namespace DCTDemos
{
    public static class Class1
    {
        [FunctionName("SendNewLeadWelcomeLetter")]
        public static void SendNewLeadWelcomeLetter([EventGridTrigger] EventGridEvent eventGridEvent, TraceWriter log)
        {
            log.Info($"EventGridEvent" +
                $"\n\tId:{eventGridEvent.Id}" +
                $"\n\tTopic:{eventGridEvent.Topic}" +
                $"\n\tSubject:{eventGridEvent.Subject}" +
                $"\n\tType:{eventGridEvent.EventType}" +
                $"\n\tData:{eventGridEvent.Data}");
        }
    }
}

Notice in the preceding code that the method name SendNewLeadWelcomeLetter is the same as the name specified in the [FunctionName] attribute; this may be required due to a bug in the current preview/beta implementation. If these differ, your function may not be executed when an event occurs.

Right-click the function project and choose Publish. Follow the wizard to create a new Function App, selecting the resource group where your Event Grid Topic is. Select West US 2 if you need to create any new Azure resources/storage accounts/etc.

Once deployed, head over to Azure Portal, open your new function app and select the newly deployed SendNewLeadWelcomeLetter function:

Adding an Azure Event Grid subscription for an Azure Function

At the top right, select Add Event Grid subscription and follow the wizard to create a new subscription; this will enable the new function to be triggered by an Event Grid subscription. As part of the subscription we’ll limit the event type to new-sales-lead-created:

Adding an Azure Event Grid subscription for an Azure Function

Next, go to the function app’s Platform features tab and select Log streaming. We can now use Postman to POST the following JSON to the Event Grid Topic we created earlier.

        "id": "1236",
        "eventType": "new-sales-lead-created",
        "subject": "myapp/sales/leads",
        "eventTime": "2017-12-08T01:01:36+00:00",
            "firstName": "Amrit",
            "postalAddress": "xyz"

Head back to the streaming logs and you should see your precompiled Azure Function executing in response to the Event Grid event:

2017-12-08T06:38:25  Welcome, you are now connected to log-streaming service.

2017-12-08T06:38:49.841 Function started (Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840)

2017-12-08T06:38:49.841 EventGridEvent

  "firstName": "Amrit",

  "postalAddress": "xyz"


2017-12-08T06:38:49.841 Function completed (Success, Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840, Duration=0ms)


To learn how to create precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.

Getting Started with Azure Event Grid

In a previous article we got an introduction to Azure Event Grid; if you’re new to Event Grid you should check it out first to familiarise yourself with some basic concepts.

In this article we’ll create an Azure Event Grid topic and subscription and see it in action.

First off, if you need to, you can create a free Azure account; then log into the Azure portal.

Next, create a new resource group. Azure Event Grid is currently in preview and only available in selected locations, such as West US 2.

Creating a new resource group

Once the resource group is created, head down to the More services option and search for Event Grid.

Navigating to Event Grid Topics

There are topics provided by Azure services (such as blob storage) and there is also the ability to create your own custom topics for custom applications/third parties/etc.

Click Event Grid Topics and this will take you to a list of all your topics. Click the +Add button to begin creating a custom topic. Give the topic the name sales-leads and choose the resource group created earlier, once again choosing West US 2.

Creating a new Azure Event Grid Topic

Click Create, wait for the deployment to complete, and hit refresh in the topics list to see your new topic:

Azure Event Grid topic added

Click on the newly added sales-leads topic and notice the overview showing publish metrics:

Event Grid Topic details

At the top right, hover over the Topic Endpoint and click the button to copy it to the clipboard (we’ll use this later):

Getting Event Grid Topic endpoint

In this example the copied endpoint is:

We’ll also need an access key to be able to HTTP POST to this custom topic later; to get this, click the Access keys option and copy Key 1 for later use:

Getting access key for Azure Event Grid topic

Click back on Overview and click the +Event Subscription button:

Creating a new Azure Event Grid Subscription

In this example we’ll create a subscription that will call an external (to Azure) service that will mail a conference brochure to all new sales leads. Here we are simulating a temporary extension to the sales system for a limited period in the run-up to a sales conference. This is one use case for Azure Event Grid: it allows extension of a core system without needing to modify that system (assuming events are being emitted).

To simulate this external service we’ll use RequestBin, which you can learn more about in this article. Once you’ve created your request bin, take a note of the Bin URL.

Creating a RequestBin URL

Fill out the new event subscription details:

  • Name: send-upcoming-conference-brochure
  • Subscribe to all event types: Untick
  • Event Types: new-sales-lead-created
  • Subscriber endpoint: (this is the RequestBin URL created above)

Event subscription details

Click Create.

To recap, there is now a custom topic called sales-leads that we can publish events to at its URL (copied earlier). There is also an event subscription set up for this topic, but it is limited to only those events published with the type new-sales-lead-created. This event subscription uses the Azure Event Grid WebHooks event handler to HTTP push events to the RequestBin URL.

To see this in action, open Postman, select POST, and paste in the topic URL copied earlier. Add a header called aeg-sas-key and paste in the key that was also copied earlier:

Basic Postman setup

The final thing to do is define the event data that we want to publish:

        "id": "42",
        "eventType": "new-sales-lead-created",
        "subject": "myapp/sales/leads",
        "eventTime": "2017-12-07T01:01:36+00:00",
            "firstName": "Jason",
            "postalAddress": "xyz"

Event JSON data

And then click Send in Postman. You should get a 200 OK response.
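
Equivalently, the event could be published from C# code; the following is a minimal sketch (the topic endpoint and key values are placeholders for the ones copied earlier):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PublishSalesLeadEvent
{
    static async Task Main()
    {
        // Placeholders: use your own topic endpoint and Key 1 value
        const string topicEndpoint = "https://your-topic.westus2-1.eventgrid.azure.net/api/events";
        const string sasKey = "your-key-1-value";

        using (var client = new HttpClient())
        {
            // Event Grid authenticates posts to custom topics via this header
            client.DefaultRequestHeaders.Add("aeg-sas-key", sasKey);

            const string json = @"[{
                ""id"": ""42"",
                ""eventType"": ""new-sales-lead-created"",
                ""subject"": ""myapp/sales/leads"",
                ""eventTime"": ""2017-12-07T01:01:36+00:00"",
                ""data"": { ""firstName"": ""Jason"", ""postalAddress"": ""xyz"" }
            }]";

            var response = await client.PostAsync(topicEndpoint,
                new StringContent(json, Encoding.UTF8, "application/json"));

            // Expect 200 OK on success
            response.EnsureSuccessStatusCode();
        }
    }
}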

Heading back to the RequestBin window and refreshing the page shows the subscription working and the event being pushed to RequestBin:

RequestBin receiving Azure Event Grid event

Because the event subscription is filtered on an event type of new-sales-lead-created, if we send a different event type from Postman (e.g. "eventType": "new-sales-lead-rejected"), the subscription won’t activate and the event won’t be pushed to RequestBin.

Understanding Azure Event Grid

Azure Event Grid (currently in preview) is a managed publisher-subscriber service that pushes events to registered subscribers.

Azure Event Grid does not replace other services such as Azure Service Bus; it has a different focus. Whereas Azure Service Bus might be employed where you need very high reliability, message ordering, etc., Azure Event Grid is more about emitting notifications of things that have happened.

Azure Event Grid uses a push model (with some retry logic built in) to push events to subscribers both inside and outside of Azure.

Messages and Events

One way to differentiate when Azure Event Grid may be more appropriate is to think of the publisher’s expectations. First, let’s use a very general definition of a message as being a single piece of information that is produced somewhere and is (possibly) consumed somewhere. In this case we’re thinking about messages as individual “datagrams” as opposed to an ongoing/continuous stream of data.

If the sender of the message has an expectation when the message is sent we can think of this as a “message with intent”.

If the sender of the message has no expectation of what happens when the message is sent, we can think of this as a “message with no intent”.

For the sake of this article, we’ll call a “message with intent” a command and a “message with no intent” an event. Commands are messages instructing the consumer to do something, and that may return a result to the sender; events are messages that represent a fact about something that has happened in the system.
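
As a quick illustrative sketch (these type names are made up for this article, not part of any SDK):

using System;

// A command: a message with intent - the sender expects a consumer to act on it
public class ConvertSalesLead
{
    public string LeadId { get; set; }
}

// An event: a message with no intent - a fact about something that has happened;
// the publisher has no expectation about who (if anyone) consumes it
public class SalesLeadCreated
{
    public string LeadId { get; set; }
    public DateTimeOffset OccurredAt { get; set; }
}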

Use Case

One use case for Azure Event Grid is to allow easier system extensibility beyond the core business functionality. For example, in a sales system, new sales leads are captured and stored in a database; this is the core system. This core system could also publish events with no expectation or knowledge about who may be responding to them. An example of an event could be when a new sales lead is added: when the new lead is entered into the core system, a “new-lead” event is published into Azure Event Grid. Now anyone who is interested in knowing when a new lead has been added can subscribe to this type of event and do something with it.

Azure Event Grid Terminology

An Azure Event Grid event describes what happened in the system represented as JSON data. It contains the custom data specific to the type of event, in addition to information that is contained in all events such as the event source, event time, and a unique event identifier. The full Azure Event Grid event schema is available as part of the docs.
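
For example, an event might look something like the following (a hand-written illustration; see the docs for the authoritative schema):

[{
    "id": "42",
    "topic": "/subscriptions/.../topics/sales-leads",
    "subject": "myapp/sales/leads",
    "eventType": "new-lead",
    "eventTime": "2017-12-07T01:01:36+00:00",
    "data": {
        "firstName": "Jason"
    }
}]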

An event source is the place the event happened, for example the sales system, Azure Storage, etc. Event sources publish events. At present the docs list the following supported event sources (with more to be added in the future):

  • Resource group management operations
  • Azure subscriptions management operations
  • Event Hubs
  • Storage Blobs
  • Custom Topics (HTTP POST-able endpoints)

A topic is an arbitrary categorization of events. Once created, topics have endpoints that event sources publish events to. Events of different types can be sent to the same topic; for example, a “sales-leads” topic could hold both “new-lead” and “converted-lead” events.

Event subscriptions wire up topics to event handlers. A topic can have 0, 1, or many subscriptions.

An Event Grid event handler is the place where a subscription sends an event to, for example sending the event using the HTTP webhook event handler. At present the docs list the following Azure event handlers (with more to be added in the future):

  • Azure Functions
  • Logic Apps
  • Azure Automation
  • WebHooks
  • Microsoft Flow

An event consumer (while not explicitly stated in the docs) can be thought of as the thing that the event handler pushes the event to. The event consumer receives the pushed event and uses it, for example an Azure Function responding to a “new-lead” event and sending out a conference invitation letter/email.


Pricing

The cost of using Azure Event Grid is based on usage at the “operation” level, with an “operation” being defined as “all ingress events, advanced match, delivery attempt, and management calls”. At the time of writing, the preview cost is USD $0.30 per million operations, with 100,000 free operations per month. You can find the latest pricing here.

To learn more, check out the Azure Event Grid docs.

Inspecting HTTP Requests With RequestBin

RequestBin is a free community project from Runscope. It allows you to generate a test URL that will capture requests sent to it and allow you to view the details of those requests. This can be useful when developing push functionality, webhooks, etc. You should of course not send sensitive data, passwords, etc., and use only non-real test data.

You start by heading over to the RequestBin site and creating your own “request bin”, which gives you a unique URL that you can send HTTP requests to:

Creating a RequestBin

Once you’ve clicked the “Create a RequestBin” button you’ll be given a “Bin URL” to send requests to (you can also restrict viewing to your current browser, which uses a cookie behind the scenes):

RequestBin created

Now that the bin is set up and you have your bin URL, you can send HTTP requests to it, for example by setting up a test webhook to call the bin URL, setting up your test application to push to the bin URL, or, as in the example here, using Postman:

Sending a request using Postman

Heading back to the browser window and refreshing it will show you the last (several) requests captured, including any form/POST data that was sent and header information such as:

  • Content-Type
  • Cf-Ipcountry
  • Postman-Token
  • Total-Route-Time
  • Cache-Control
  • Host
  • User-Agent
  • Cf-Connecting-Ip
  • Connection
  • Content-Length
  • Cf-Visitor
  • Connect-Time
  • Cf-Ray
  • Accept-Encoding
  • Cookie
  • Via
  • X-Request-Id
  • Accept

And any raw body content such as:

    "arbitrary" : "Jason",
    "data" : "Roberts"

You can also check out the project on GitHub.

New Pluralsight Course: Writing and Testing Precompiled Azure Functions in Visual Studio 2017

Azure Functions have come a long way in a short time. With newer releases you can now create functions in Visual Studio using standard C# class files along with specific attributes to help define triggers, bindings, etc. This means that all the familiar, powerful Visual Studio tools, workflows, NuGet packages, etc. can be used to develop Azure Functions. Visual Studio also provides publish support, so you can upload your functions to the cloud once you are happy with them. Another feature that makes developing functions in Visual Studio easier is the local functions runtime that lets you run and debug functions on your local development machine, without needing to publish to the cloud just to test them.

In my new Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course you will learn how to:

  • Set up your local development environment
  • Develop and test Azure Functions locally
  • Publish functions to Azure
  • Create functions triggered from incoming HTTP requests
  • Trigger functions from Azure Storage queues and blobs
  • Trigger functions from Azure Service Bus and Azure Event Hubs
  • Trigger functions periodically on a timer
  • Unit test Azure Function business logic

Check out the full course outline for more details.