Creating a Tweet Buffer with Azure Queues and Microsoft Flow

There are apps and services that allow Tweets to be scheduled or buffered before being sent. Using the features of Microsoft Flow, it’s possible to create a solution where Tweets can be quickly created as simple text files in a OneDrive folder and then buffered to be sent every 15 minutes (or whatever schedule you fancy). The actual buffering mechanism used below is an Azure Queue.

There are two Flows in this solution: Flow 1 picks up text files from OneDrive, extracts the content, and writes a new message to an Azure Queue. Flow 2 runs on a schedule, picks a message off the queue, grabs the message content, and sends it as a Tweet.

Flow 1: Queuing Tweets

The first step is to create an Azure Storage account and an Azure Queue. The easiest way to create a new queue is to use Azure Storage Explorer. Once it’s installed and connected, creating a queue is a simple right-click operation:

Using Azure Storage Explorer to create a new Azure Queue

We’ll call the queue “tweet-queue”.
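As an alternative to Storage Explorer, the queue could also be created from code. The following is a minimal sketch using the classic WindowsAzure.Storage NuGet package (the connection string is a placeholder for your own storage account’s):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class CreateQueue
{
    static void Main()
    {
        // Placeholder - substitute the connection string from your own storage account
        var connectionString = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...";

        var account = CloudStorageAccount.Parse(connectionString);
        var queueClient = account.CreateCloudQueueClient();

        // Create "tweet-queue" if it doesn't already exist
        var queue = queueClient.GetQueueReference("tweet-queue");
        queue.CreateIfNotExists();
    }
}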

We’ll also create the OneDrive folder structure OneDrive\FlowDemo\TweetQ\In.

Now we can create a new Flow that grabs files from this path and adds them to tweet-queue, as the following screenshot shows (notice we're also deleting the file after adding it to the queue):

Microsoft Flow reading a file from OneDrive and adding to Azure Queue

Now if we create a .txt file (for example with the content “Testing - this Tweet came from Microsoft Flow via OneDrive and an Azure Queue”) in the OneDrive\FlowDemo\TweetQ\In directory, wait for the Flow to run, and check the queue in Storage Explorer, we can see a new message as the following screenshot shows:

Azure Storage Explorer showing Azure Queue message content
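For reference, the Flow’s queue action is doing roughly the equivalent of the following SDK call (a sketch only; the connection string is again a placeholder and the message text stands in for the content read from the OneDrive file):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class EnqueueTweet
{
    static void Main()
    {
        // Placeholder connection string
        var connectionString = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...";

        var queue = CloudStorageAccount.Parse(connectionString)
                                       .CreateCloudQueueClient()
                                       .GetQueueReference("tweet-queue");

        // The message body is the text content of the OneDrive file
        queue.AddMessage(new CloudQueueMessage(
            "Testing - this Tweet came from Microsoft Flow via OneDrive and an Azure Queue"));
    }
}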

Now that we have a way of queuing Tweets, we can create a second Flow to send them on a timer.

Flow 2: Sending Tweets

The second Flow will be triggered every 15 minutes, grab a message from the queue, use the message body as the Tweet content, then delete the message from the queue.
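In SDK terms, the queue side of this Flow is roughly equivalent to the sketch below (placeholder connection string; the actual Tweet is sent by the Flow’s Twitter action rather than from code):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class DequeueTweet
{
    static void Main()
    {
        // Placeholder connection string
        var connectionString = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...";

        var queue = CloudStorageAccount.Parse(connectionString)
                                       .CreateCloudQueueClient()
                                       .GetQueueReference("tweet-queue");

        // Retrieve a single message (it becomes temporarily invisible to other consumers)
        var message = queue.GetMessage();

        if (message != null)
        {
            // The message body is the Tweet content
            Console.WriteLine(message.AsString);

            // Delete the message once it has been processed
            queue.DeleteMessage(message);
        }
    }
}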

The following screenshot shows the first two steps:

Getting Azure Queue messages on a timer

Even though we’ve specified one message, when we add the next action in the Flow, an “Apply to each” will automatically be added, as the following screenshot shows:

Posting Tweet from Azure Queue

Notice in the preceding screenshot that we also need to add an action to delete the message from the queue.

Now once we save this Flow, every 15 minutes a message will be retrieved and posted as a Tweet.


Triggering a Microsoft Flow from an HTTP Post

Microsoft Flow allows the building of workflows in the cloud. One way to trigger a Flow is to set up an HTTP endpoint that can be posted to.

For example, a Flow can be created that takes some JSON data and writes it out to OneDrive or Dropbox.

The first step is to create a new Flow and add a Request trigger. When the Flow is saved, a URL will be generated that can be posted to.

As part of this Request trigger, a JSON schema can be specified that allows individual JSON properties to be surfaced and referenced by name in later actions.

For example the following JSON schema could be specified:

{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "title": "Customer",
  "description": "A generic customer",
  "type": "object",
  "properties": {
    "id": {
      "description": "The unique identifier for a customer",
      "type": "integer"
    },
    "name": {
      "description": "The full name of the customer excluding title",
      "type": "string"
    }
  },
  "required": [
    "id",
    "name"
  ]
}

Now in later steps, the “id” and the “name” properties from the incoming JSON can be used as dynamic content.

Next, actions can be added that make use of the data in these properties when an HTTP post occurs. For example, we could create a file in OneDrive where the filename is {id}.txt and the contents are the customer name. This is a simple example, but it serves to demonstrate the flexibility.

The following screenshot shows the full flow and the JSON schema properties in use:

Microsoft Flow using request trigger and OneDrive

We can now post to the generated URL. For example the following screenshot shows a test post using Postman and the resulting file that was created in OneDrive:

HTTP post to trigger a Microsoft Flow
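The same post could also be made from code. The following is a minimal sketch using HttpClient; the Flow URL is a placeholder for the one generated when the Flow is saved, and the body matches the schema defined above:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PostToFlow
{
    static async Task Main()
    {
        // Placeholder - use the URL generated by the Request trigger when the Flow is saved
        var flowUrl = "https://prod-00.westus.logic.azure.com/workflows/...";

        // A body that satisfies the JSON schema on the Request trigger
        var json = "{ \"id\": 42, \"name\": \"Jane Smith\" }";

        using (var client = new HttpClient())
        {
            var response = await client.PostAsync(
                flowUrl, new StringContent(json, Encoding.UTF8, "application/json"));

            Console.WriteLine(response.StatusCode);
        }
    }
}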


Serverless Computing and Workflows with Azure Functions and Microsoft Flow

Microsoft Flow is a tool for creating workflows to automate tasks. It’s similar in concept to If This Then That, but feels like it sits more towards the business-user end of the spectrum than the consumer end – though both have a number of channels/services in common. Flow also has a number of advanced features such as conditions, loops, timers, and delays.

Flow has a number of services including common ones such as Dropbox, OneDrive, Twitter, and Facebook. There are also generic services for calling HTTP services, including those created as Azure Functions. Essentially, services are the building blocks of a Flow.

Screenshot of Microsoft Flow Services

Once the free sign-up is complete, you can create Flows from existing templates or create your own from scratch.

Screenshot of Microsoft Flow pre-built templates

To create a new custom Flow, the web-based workflow designer can be used.

Integrating a Flow with Azure Functions

In the following example, a Flow will be created that picks up files with a specific naming convention from a OneDrive folder and sends the text content to an Azure Function that simply converts it to uppercase and returns the result to the Flow. The Flow then writes out the uppercase version to another OneDrive folder.

Reading Files From OneDrive

The first step in the Flow is to monitor a specific OneDrive folder for new files.

A Flow triggered by new OneDrive files

As an example of conditions, an “if statement” can be added to only process files whose names contain the word “data”:

Microsoft Flow condition

Now if the filename is correct we can go ahead and call an Azure Function (or other HTTP endpoint).

Calling an Azure Function from Microsoft Flow

Now that we are reading specific files, we want to call an Azure Function to convert the text content of the file to upper case.

The following code and screenshot show the function that will be called – for simplicity, the code is stripped down and doesn’t contain any error checking/handling:

Azure Function app screenshot

using System.Net;

// HTTP-triggered Azure Function (C# script) that returns the uppercase
// version of the "text" property of the posted JSON body.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");

    // Read the request body as a dynamic object, e.g. { "text": "some content" }
    dynamic data = await req.Content.ReadAsAsync<object>();

    string text = data.text;

    // Return 200 OK with the uppercase version of the text
    return req.CreateResponse(HttpStatusCode.OK, text.ToUpperInvariant());
}

We can test the API in Postman:

Calling Azure Function from Postman
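The equivalent test can also be made from code. The sketch below posts a JSON body with a “text” property, which is the shape the function above reads; the function URL (including its key) is a placeholder:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TestFunction
{
    static async Task Main()
    {
        // Placeholder - your function's URL including its function key
        var functionUrl = "https://myfunctionapp.azurewebsites.net/api/MyFunction?code=...";

        // The function reads data.text from the posted JSON body
        var json = "{ \"text\": \"hello from a test client\" }";

        using (var client = new HttpClient())
        {
            var response = await client.PostAsync(
                functionUrl, new StringContent(json, Encoding.UTF8, "application/json"));

            // Expecting the uppercase version of the text back
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}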

Now that we have a working function, we can add a new action of type “HTTP” to the Flow and pass the contents of the OneDrive file as JSON data in the request. The final step is to take the response from calling the Azure Function and write it out to a new file in OneDrive, as the following screenshot shows:

Calling Azure Function passing OneDrive file content as JSON data

Now if we create a file “OneDrive\FlowDemo\In\test1data.txt”, the Flow will be triggered and the output file “OneDrive\FlowDemo\Out\test1data.txt” created.

Output file

Microsoft Flow also has a really nice visual representation of runs (individual executions) of Flows:

Microsoft Flow run visualization

Microsoft Flow by itself enables a whole host of workflow scenarios, and combined with all the power of Azure Functions (and other Azure features) could enable some really interesting uses.

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.
