Creating Versioned APIs with Azure Functions and Proxies

One of the interesting possibilities with the (currently in preview) Azure Function Proxies is the ability to create HTTP APIs that can be versioned and also deployed/managed independently.

For example, suppose there is an API that lives at the root “https://dctdemoapi.azurewebsites.net/api”. We could have multiple resources under this root such as customers, products, etc.

So to get the product with an id of 42 we’d construct: “https://dctdemoapi.azurewebsites.net/api/products?id=42”.

If we wanted the ability to version the API we could construct “https://dctdemoapi.azurewebsites.net/api/v1/products?id=42” for version 1 and “https://dctdemoapi.azurewebsites.net/api/v2/products?id=42” for version 2, etc.

Using proxies we can expose URLs in the format “https://dctdemoapi.azurewebsites.net/api/[VERSION]/[RESOURCE]?[PARAMS]”.

Now we can create 2 proxies (for example in a Function App called “dctdemoapi”) that forward the HTTP requests to other Function Apps (in this example dctdemoapiv1 and dctdemoapiv2).

Screenshots of the proxies are as follows:

Azure Function proxy settings for API version 1

Azure Function proxy settings for API version 2

And the respective proxies.json config file:

{
    "proxies": {
        "v1": {
            "matchCondition": {
                "route": "api/v1/{*restOfPath}"
            },
            "backendUri": "https://dctdemoapiv1.azurewebsites.net/api/{restOfPath}"
        },
        "v2": {
            "matchCondition": {
                "route": "api/v2/{*restOfPath}"
            },
            "backendUri": "https://dctdemoapiv2.azurewebsites.net/api/{restOfPath}"
        }
    }
}

Notice in the proxy config the use of the wildcard term “{*restOfPath}” – this will pass the remainder of the path segments to the backend URL, for example “products”, meaning a request to “https://dctdemoapi.azurewebsites.net/api/v1/products?id=42” will be sent to “https://dctdemoapiv1.azurewebsites.net/api/products?id=42”; and “https://dctdemoapi.azurewebsites.net/api/v2/products?id=42” will be sent to “https://dctdemoapiv2.azurewebsites.net/api/products?id=42”.

Now versions of the API can be updated, monitored, and managed independently because they are separate Function Apps. Code duplication is a potential problem, however; common business logic could be compiled into a shared assembly and referenced from both Function Apps, as sketched below.
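
For illustration, such a shared library might look something like the following. This is only a sketch: the namespace and type names are hypothetical and not from the original example.

namespace DctDemoApi.Core
{
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class ProductRepository
    {
        // Simulated lookup; a real implementation might query Table storage or a database
        public static Product GetById(int id) => new Product { Id = id, Name = "Demo product" };
    }
}

Each versioned Function App can then reference this assembly and keep only its HTTP surface (routes and parameter handling) version specific.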

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.

Using Azure Functions and Microsoft Flow to Send Notifications for NuGet Package Downloads

One of the NuGet packages I maintain is approaching 100,000 downloads. I thought it would be nice to get a notification on my phone when the number of downloads hit 100,000.

To implement this I installed the Flow app on my iPhone and wrote an Azure Function that executes on a timer and calls into Flow.

Creating a Flow

The first step is to create a new Microsoft Flow that is triggered by an HTTP POST being sent to it.

The Flow uses a Request trigger, and a URL by which the Flow can be initiated is auto-generated.

The second step is the Notification action that results in a notification being raised in the Flow app for iOS.

Azure Function calling Microsoft Flow

Creating an Azure Function with Timer Trigger

Now that there is a URL to POST to in order to create notifications, a timer-triggered Azure Function can be created.

This function screen scrapes the NuGet page (I’m sure there’s a more elegant/less brittle way of doing this) and grabs the HTML element containing the total downloads. If the total number of downloads is >= 100,000, then the Flow URL will be called with a message in the body. The timer schedule runs once per day. I’ll have to manually disable the function once the 100,000 download threshold has been passed.
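
For reference, a once-per-day schedule would be configured in the timer binding of the function's function.json, along these lines (a sketch only; the binding values and CRON expression here are assumptions rather than taken from the actual function):

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 8 * * *"
    }
  ],
  "disabled": false
}

Here “0 0 8 * * *” means 08:00 every day (the schedule format includes a seconds field).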

The function code:

// Note: several of these namespaces are auto-imported for C# script functions;
// they're listed explicitly here for clarity.
// HtmlAgilityPack comes from a NuGet package reference added to the Function App.
using System;
using System.Globalization;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using HtmlAgilityPack;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{       
    try
    {
        string html = new WebClient().DownloadString("https://www.nuget.org/packages/FeatureToggle");

        HtmlDocument doc = new HtmlDocument();
        doc.LoadHtml(html);     
                            
        HtmlNode downloadsNode = doc.DocumentNode
                                    .Descendants("p")
                                    .First(x => x.Attributes.Contains("class") &&  
                                                x.Attributes["class"].Value.Contains("stat-number"));                        

        int totalDownloads = int.Parse(downloadsNode.InnerText, NumberStyles.AllowThousands);
        
        bool thresholdMetOrExceeded = totalDownloads >= 1; // 1 for test purposes, should be 100000

        if (thresholdMetOrExceeded)
        {
            var message = $"FeatureToggle now has {totalDownloads} downloads";

            log.Info(message);

            await SendToFlow(message);            
        }        
    }
    catch (Exception ex)
    {
        log.Info(ex.ToString());
        await SendToFlow($"Error: {ex}");
    }
}

public static async Task SendToFlow(string message)
{
    const string flowUrl = "https://prod-16.australiasoutheast.logic.azure.com:443/workflows/[redacted]/triggers/manual/paths/invoke?api-version=2015-08-01-preview&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=[redacted]";

    using (var client = new HttpClient())
    {
        var content = new StringContent(message, Encoding.UTF8, "text/plain");
        
        await client.PostAsync(flowUrl, content);
    }
}

Manually running the function (with the test threshold of 1 download) results in the following notification from the Flow iOS app:

This demonstrates the nice thing about Azure Functions, namely that it’s easy to throw something together to solve a problem.

iOS Flow App Notification

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.

Cross Function App Proxies and Proxy Request Parameters in Azure Functions

In a previous article we saw how to get started with Azure Function Proxies. In addition to creating proxy URLs for HTTP functions in the same Function App, it is also possible to specify a backend URL that exists in a different Function App (or even a URL somewhere completely different). This essentially allows a single HTTP API to be presented to clients that, behind the scenes, routes traffic to multiple Function Apps, enabling a single large Function App to be decomposed into multiple smaller ones. It also enables isolation between different Function Apps for deployment, testing, management, monitoring, etc.

For example, suppose we have an HTTP function defined in a Function App called myazurecloudfunctions that GETs a customer using an id querystring parameter; a request for customer 42 would look like the following (with function authorization enabled):

https://myazurecloudfunctions.azurewebsites.net/api/LoadCust?id=42&code=SKmM3Hr0IFEdJp9QnQ5ztu/mClYScUxjdjQw4TvSQ9OXm0xrgXQUpg==
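
A minimal sketch of what such a LoadCust function could look like follows (the function's code isn't shown in this post, so this implementation is an assumption):

using System.Net;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    // Read the id querystring parameter (error checking omitted)
    string id = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "id", true) == 0)
        .Value;

    // Simulate loading the customer, e.g. from Table storage or a database
    return req.CreateResponse(HttpStatusCode.OK, new { Id = int.Parse(id), Name = "Amrit" });
}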

In another Function App called dontcodetireddemos a proxy can be created to enable the following URL to be used instead:

https://dontcodetireddemos.azurewebsites.net/customer/42?code=SKmM3Hr0IFEdJp9QnQ5ztu/mClYScUxjdjQw4TvSQ9OXm0xrgXQUpg==

Notice that in this proxied URL the id 42 is specified as a path segment, not as a querystring parameter. This is another feature of Azure Function Proxies that allows request parameters to be mapped from the proxy URL to the backend URL. In this example the proxy route template is “customer/{custId}” and the backend is set to “https://myazurecloudfunctions.azurewebsites.net/api/LoadCust?id={custId}” – notice the token {custId}.

The proxies.json configuration is:

{
    "proxies": {
        "CustomerGet": {
            "matchCondition": {
                "route": "customer/{custId}",
                "methods": [
                    "GET"
                ]
            },
            "backendUri": "https://myazurecloudfunctions.azurewebsites.net/api/LoadCust?id={custId}"
        }
    }
}

And a screenshot of the configuration:

Azure Function Proxy settings

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.

Azure Functions Proxies Preview

Azure Functions allows the creation of HTTP-triggered code. A new feature of Functions is the ability to define proxies. (Note: at the time of writing this feature is in preview.)

For example, a function can be created that responds to HTTP GETs to retrieve a customer, as the following code demonstrates:

using System.Net;

public class Customer 
{
    public int Id {get; set;}    
    public string Name {get; set;}
}

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // error checking omitted

    string id = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "id", true) == 0)
        .Value;

    return req.CreateResponse(HttpStatusCode.OK, GetById(int.Parse(id)));
}

public static Customer GetById(int id)
{
    // simulate fetching a customer, e.g. from Table storage, db, etc.
    return new Customer
        {
            Id = id,
            Name = "Amrit"
        };
}

This function (called “GetCustomer”) could be available at the following URL (with function authorization enabled): https://dontcodetireddemos.azurewebsites.net/api/GetCustomer?id=42&code=rEQKsObGFDRCCiVuLhOnZ1Rdfn/XGgp5tfC1xHrhAqxvWqzHSQszCg==

Notice in the URL that the name of the function is prefixed with “api”, i.e. “api/GetCustomer”.

Calling this URL would result in the following JSON being returned:

{
  "Id": 42,
  "Name": "Amrit"
}

Function proxies allow the “api” part to be replaced, for example to create a URL: “https://dontcodetireddemos.azurewebsites.net/customer?id=42&code=rEQKsObGFDRCCiVuLhOnZ1Rdfn/XGgp5tfC1xHrhAqxvWqzHSQszCg==”

Azure Function Proxy

This proxy defines a name “CustomerGet”, a route template of “customer”, and the backend URL it maps to (“https://dontcodetireddemos.azurewebsites.net/api/GetCustomer”), and is limited to the GET HTTP verb.

If we had another function, e.g. called PostCustomer, we could also set up a POST proxy for that function:

Azure Function Proxy

Now we can issue both POSTs and GETs to the customer “resource” at: https://dontcodetireddemos.azurewebsites.net/customer (code key omitted).
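
For illustration, a minimal sketch of what the PostCustomer function might look like follows (its code isn't shown in the original post, so this body is an assumption):

using System.Net;

public class Customer
{
    public int Id {get; set;}
    public string Name {get; set;}
}

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Read the posted customer from the request body (validation omitted)
    Customer customer = await req.Content.ReadAsAsync<Customer>();

    // Simulate persisting the customer, e.g. to Table storage or a database
    log.Info($"Saving customer {customer.Name}");

    return req.CreateResponse(HttpStatusCode.Created, customer);
}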

Using Kudu for example, we can see a proxies.json file created in the wwwroot folder with the following contents:

{
    "proxies": {
        "CustomerGet": {
            "matchCondition": {
                "route": "customer",
                "methods": [
                    "GET"
                ]
            },
            "backendUri": "https://dontcodetireddemos.azurewebsites.net/api/GetCustomer"
        },
        "CustomerPost": {
            "matchCondition": {
                "route": "customer",
                "methods": [
                    "POST"
                ]
            },
            "backendUri": "https://dontcodetireddemos.azurewebsites.net/api/PostCustomer"
        }
    }
}

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.

 

Push Notifications and Buttons with Microsoft Flow: Part 3

In part 1 we created a Flow to enable/disable the sending of push notifications and in part 2 we created an Azure Function to generate random phrases of positivity.

In this third and final part of this series we’ll go and create the second Flow that sends the push notifications to the phone.

This Flow is automatically triggered every 15 minutes by using a Recurrence trigger, followed by an action to get the blob content that records whether or not we should send a notification, as the following screenshot shows:

Running a Microsoft Flow every 15 minutes

Now that we have the blob content, we can examine it in an If condition. If the content of the blob is “enabled” we can continue the Flow:

If condition in Microsoft Flow

If the condition is satisfied (“IF YES”), two actions are performed: the first is an HTTP GET to the Azure Function, and the second is a push notification action whose content is what was returned in the body of the HTTP GET. Notice that in the HTTP GET, the URI includes the function authorization key as a query string parameter.

Microsoft Flow HTTP GET and push notification actions

After saving the new Flow, we can head back to the Flow app and hit the button to enable “positivity pushes”:

Microsoft Flow iOS app button

Then every 15 minutes (until we turn them off by hitting the button again) we’ll get a positivity notification on the phone:

iOS push notifications from Microsoft Flow

Push Notifications and Buttons with Microsoft Flow: Part 2

In part 1 we created a Flow to toggle the sending of push notifications on and off by storing the configuration in Azure blob storage.

Now that we have a way of enabling/disabling notifications we can start to build the second Flow.

Before jumping into the Flow designer, we need to consider how to generate random positivity phrases and how to integrate this into the second Flow. One option is to create a simple Azure Function with an HTTP trigger. The Flow can then use an HTTP action to issue a GET to the function, which will return the string content to be sent via push notification.

In the Azure Portal function editor a new function can be created with an HTTP trigger configured to allow GET only, as the following screenshot shows:

Creating an Azure Function with a HTTP trigger

Notice in the preceding screenshot that the authorization level has been set to “function”. This means the key needs to be provided when the function is called.

We can now write some code in the function code editor window as follows:

using System.Net;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    string phrase = GeneratePhrase();

    return req.CreateResponse(HttpStatusCode.OK, phrase);
}

public static string GeneratePhrase()
{
    var phrases = new string[]
    {
        "Don't worry, be happy :)",
        "All is well",
        "Will it matter in 100 years?",
        "Change what you can, don't worry about what you can't"
    };

    var rnd = new Random();
    
    return phrases[rnd.Next(phrases.Length)];
}

Azure Function code editor window

Clicking the Run button will test the function and we can see the random phrases being returned with a 200 status as shown in the following screenshot:

Azure function HTTP test output

In the final part of this series we’ll go and create the second Flow that uses this function and the configuration value created in the previous article to actually send random positivity push notifications on a 15-minute schedule.

To jump-start your Azure Functions knowledge check out my Azure Function Triggers Quick Start Pluralsight course.

Push Notifications and Buttons with Microsoft Flow: Part 1

Microsoft Flow allows the creation of serverless cloud workflows. It is similar to services such as If This Then That and has more of a business focus. It allows custom Flows to integrate with Azure services such as blob storage, to call arbitrary HTTP services, and to connect to a whole host of other services such as Facebook, Dropbox, OneDrive, etc.

In addition to executing in the cloud, Flows can send push notifications to the Flow app on iOS and Android.

Once installed, the Flow app can be used to design/edit Flows, view Flow activity/recent executions, and initiate the execution of flows via “buttons” as shown in the following screenshot.

Microsoft Flow iOS app

Tapping this software button will trigger the Flow in the cloud.

Example Scenario

To see buttons and push notifications in action, imagine a scenario where you sometimes want cheering up with regular messages of positivity.

In this scenario, when enabled, you’ll get a push notification on your phone every 15 minutes with a random positive phrase such as “Don't worry, be happy :)”.

To accomplish this, two separate (but related) Flows can be created. The first Flow uses a button in the phone Flow app to toggle whether the “positivity pushes” will be sent. The second Flow is triggered automatically every 15 minutes and, if enabled, sends a push notification.

Creating the Toggle Positivity Push Flow

This Flow will enable/disable the push notifications. To do this, a manual button trigger that can be pressed on the phone will be added to the Flow. To hold the enabled/disabled state, we can use the content of a blob in Azure blob storage. When triggered, the Flow will retrieve the content of the blob, which can be the string “enabled” or “disabled”.

Microsoft Flow reading content from blob storage

Once the blob content has been retrieved, it can be examined in an If condition. If the content of the blob is currently “enabled” it will be updated to “disabled”, and vice versa. Finally we’ll send a push notification to confirm the state.

Microsoft Flow examining blob content

Pressing the button in the app a couple of times results in the expected push notifications:

Microsoft Flow sending push notifications to iOS

The blob content also gets toggled as expected:

Blob content being toggled from Microsoft Flow

In part 2 of this series we’ll start the process of creating another Flow to actually send the random positivity phrases to the phone.

Custom Session Logging in Marten

Marten is a .NET document database library that uses an underlying PostgreSQL database to store objects as JSON. The library has a variety of features for logging the SQL statements issued to the underlying PostgreSQL database, including previewing the SQL for LINQ queries. Another of the logging features allows custom logging to be created for individual session operations such as successfully issued SQL commands, failed commands, and changes that were saved. There are also numerous other logging/extension points that can be utilized, such as logging schema change SQL and automatically using a logger for all sessions.

The following code shows a console application that writes two customers and then retrieves them:

using System;
using static System.Console;
using Marten;

namespace MartenFKDemo
{
    class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public override string ToString() => $"{Id} {Name}";
    }

    class Program
    {
        static void Main(string[] args)
        {
            const string connectionString = "host = localhost; database = OrderDb; password = g7qo84nck22i; username = postgres";

            var store = DocumentStore.For(connectionString);

            WriteLine("Creating new customer");
            using (var session = store.OpenSession())
            {
                session.Store(new Customer {Name = "Amrit"}, new Customer {Name = "Sarah"});

                session.SaveChanges();
            }


            WriteLine("All customers:");
            using (var session = store.QuerySession())
            {
                foreach (Customer customer in session.Query<Customer>())
                {
                    WriteLine(customer);
                }
            }
            
            ReadLine();
        }
    }
}

Running the application results in the following output:

Creating new customer
All customers:
9001 Amrit
9002 Sarah

To create a custom session logger, the IMartenSessionLogger interface can be implemented; a simple version that logs to the console is shown as follows:

class ColorConsoleLogger : IMartenSessionLogger
{
    public void LogSuccess(Npgsql.NpgsqlCommand command)
    {
        ForegroundColor = ConsoleColor.Green;

        WriteLine(command.CommandText); // additional properties (e.g. SQL parameters) are available
    }

    public void LogFailure(Npgsql.NpgsqlCommand command, Exception ex)
    {
        ForegroundColor = ConsoleColor.Red;

        WriteLine(command.CommandText); // additional properties (e.g. SQL parameters) are available
        WriteLine(ex);
    }

    public void RecordSavedChanges(IDocumentSession session, IChangeSet commit)
    {
        ForegroundColor = ConsoleColor.Gray;

        foreach (object insertedItem in commit.Inserted) //  updated/deleted are also available
        {
            WriteLine(insertedItem);
        }
    }
}

To configure individual sessions to use this logger, the session's Logger property can be set, as the following modified code demonstrates:

ResetColor();
WriteLine("Creating new customer");
using (var session = store.OpenSession())
{
    // Set logger for this session
    session.Logger = new ColorConsoleLogger();

    session.Store(new Customer {Name = "Amrit"}, new Customer {Name = "Sarah"});

    session.SaveChanges();
}

ResetColor();
WriteLine("All customers:");
using (var session = store.QuerySession())
{
    // Set logger for this session
    session.Logger = new ColorConsoleLogger();

    foreach (Customer customer in session.Query<Customer>())
    {
        ResetColor();
        WriteLine(customer);
    }
}

This produces the following output:

Creating new customer
select public.mt_upsert_customer(doc := :p0, docDotNetType := :p1, docId := :p2, docVersion := :p3);select public.mt_upsert_customer(doc := :p4, docDotNetType := :p5, docId := :p6, docVersion := :p7);
13001 Amrit
13002 Sarah
All customers:
select d.data, d.id, d.mt_version from public.mt_doc_customer as d
13001 Amrit
13002 Sarah

screenshot of Marten custom logger
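
As mentioned at the start of this post, a logger can also be applied automatically to all sessions rather than being set per session. The following is only a hedged sketch of what that registration might look like, assuming Marten's StoreOptions Logger() method and its built-in ConsoleMartenLogger (check the documentation for the exact API):

// Sketch: register a store-wide logger when configuring the DocumentStore
// so that sessions are logged without setting session.Logger manually
var store = DocumentStore.For(_ =>
{
    _.Connection(connectionString);
    _.Logger(new ConsoleMartenLogger());
});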

To learn more about the document database features of Marten check out my Pluralsight courses: Getting Started with .NET Document Databases Using Marten and Working with Data and Schemas in Marten or the documentation.

What Would Easy Be Like?

As the end of another (Gregorian) year edges ever closer, it’s a period that offers a natural time to reflect on what happened in the past year and to forge, often weakly, some “New Year's Resolutions”.

One interesting question to ask for your work (or personal) life is: “what would easy be like?”.

For example, looking back over the past year, where was the pain, sadness, frustration, long hours, constant overtime, feelings of falling behind in new technologies, failed deployments, the same bugs recurring over and over again, unsupportive management, mean co-workers, feelings of inadequacy, disappointed customers, etc.?

you should feel like you’re doing your best work

Take some time. Make a list. Write them all down.

Now imagine what it would feel like if all of the things on the list did not exist in the upcoming year – what would easy be like?

Fixing some of the things may mean radical decisions such as quitting your job and moving employer.

Some things may not be quite as radical: better automated testing to catch bugs earlier.

You probably won’t be able to fix everything on your list. Starting from the point of view that things should be easy, that you should feel like you’re doing your best work, can be a powerful way to frame where you’re currently at and where you want to be by this time next year.

Previewing the Generated PostgreSQL SQL for a Query in Marten

Marten is a .NET document database library that uses an underlying PostgreSQL database to store objects as JSON. The library has a variety of features that allow the logging of SQL statements issued to the underlying PostgreSQL database in addition to being able to do things such as get the PostgreSQL query plan for a given LINQ query.

One simple way to get the generated SQL for a Marten LINQ query is to use the ToCommand() extension method.

As an example, suppose we are developing some query code as follows (this code uses the Include method to include the related documents in a single database round-trip):

Customer customer = null;

List<Order> orders = session.Query<Order>()
                            .Include<Customer>(joinOnOrder => joinOnOrder.CustomerId, includedCustomer => customer = includedCustomer)
                            .Where(x => x.CustomerId == 4001).ToList();

If we want to get an idea of what SQL Marten will generate for this LINQ query, we can change the code as shown in the following:

Customer customer = null;

IQueryable<Order> orders = session.Query<Order>()
                                  .Include<Customer>(joinOnOrder => joinOnOrder.CustomerId, includedCustomer => customer = includedCustomer)
                                  .Where(x => x.CustomerId == 4001);

// Get the SQL command that will be issued when the query executes
NpgsqlCommand cmd = orders.ToCommand();

// Output some selected command info
Console.WriteLine(cmd.CommandText);

foreach (NpgsqlParameter parameter in cmd.Parameters)
{
    Console.WriteLine($"Parameter {parameter.ParameterName} = {parameter.Value}");
}

// Ensure included customer variable is populated
List<Order> orderResults = orders.ToList();

Console.WriteLine(customer.Name);
foreach (Order order in orderResults)
{
    Console.WriteLine($" Order {order.Id} for {order.Quantity} items");
}

Running the preceding code results in the following console output:

select d.data, d.id, d.mt_version, customer_id.data, customer_id.id, customer_id.mt_version from public.mt_doc_order as d INNER JOIN public.mt_doc_customer as customer_id ON d.customer_id = customer_id.id where d.customer_id = :arg0
Parameter arg0 = 4001
Sarah
 Order 3001 for 42 items
 Order 4001 for 477 items
 Order 5001 for 9 items
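
Related to the query plan capability mentioned at the start of this post, Marten also has an Explain() extension method that asks PostgreSQL for the plan of a LINQ query. The following is only a hedged sketch; the QueryPlan property names used are assumptions, so consult the documentation for the exact shape:

// Sketch: execute EXPLAIN for the same LINQ query and inspect the returned plan
QueryPlan plan = orders.Explain();

// Property names assumed for illustration (e.g. TotalCost, PlanRows)
Console.WriteLine($"Total cost: {plan.TotalCost}, estimated rows: {plan.PlanRows}");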

To learn more about the document database features of Marten check out my Pluralsight courses: Getting Started with .NET Document Databases Using Marten and Working with Data and Schemas in Marten.