Azure Functions Continuous Deployment with Azure Pipelines: Part 2 - Getting Started

This is the second part in a series demonstrating how to set up continuous deployment of an Azure Functions App using Azure DevOps build and release pipelines.

In the previous instalment we got an overview of Azure DevOps and of the app we’ll be building/deploying.

Demo app source code is available on GitHub.

Creating Azure Function Apps

Two Function Apps will be created in Azure. One will be a self-contained test environment (with a separate Azure Storage account); the other will be the production version. To keep things simple they both belong to the same resource group, and we won’t be using slots, proxies, etc.

The two Function Apps are:

  • InvestFunctionAppDemo (production)
  • InvestFunctionAppDemoTest (test)

The test environment will be used to check the deployment artifacts and as a target to execute functional end-to-end tests against.

Once these two Function Apps have been created in the Portal, all of the deployments will happen automatically from the release pipeline, including setting test/production-specific Function App settings.

If you don’t have an Azure account you can currently sign up for free.

Signing Up for Azure DevOps

Now that there are some Function Apps created in Azure, you can deploy to them from an Azure release pipeline. To do this you’ll need to sign up to Azure DevOps.

Azure DevOps contains a number of services; in this series we’ll be using the Azure Pipelines service.

An Overview of Azure Pipelines

Azure DevOps is currently free for open source projects and small teams (with some limitations on things such as parallel job execution), with paid plans scaling all the way up to 1,000 users for $7,833.26 USD/month. You can find the latest pricing information as part of the product information and compare the features of the different plans.

You can use Azure Pipelines on its own; you don’t have to use all of the DevOps services (Boards, etc.).

Some of the features of Azure Pipelines include:

  • Build, test, and deploy for multiple languages including .NET, Python, Java, iOS, etc.
  • Container build and publish support
  • Deploy to clouds including Azure, AWS, and Google Cloud Platform
  • 10 free parallel jobs and unlimited build minutes for all open source projects
  • Advanced workflow features, testing, reporting, gates, etc.

Azure DevOps Organizations and Projects

Once you’ve signed up for Azure DevOps you’ll need an organization and a project.

“..an organization is a mechanism for organizing and connecting groups of related projects. Examples are business divisions, regional divisions, or other organizational structure. You can choose one organization for your entire company, or separate organizations for specific business units, or an organization just for you.” [Microsoft]

“Each organization contains one or more projects. Each project contains a set of features: boards and backlogs for agile planning, pipelines for continuous integration and deployment, repos for version control and management of source code and artifacts, and continuous test integration throughout the life cycle.” [Microsoft]

Creating a new Azure DevOps Project

Once the project is created, you can navigate to the Pipelines section to create and manage build and release pipelines.

In the next part of this series we’ll create an initial build pipeline and get an introduction to defining source-controlled build definitions using YAML.


Azure Functions Continuous Deployment with Azure Pipelines: Part 1 - Overview

This is the first part in a series demonstrating how to set up continuous deployment of an Azure Functions App using Azure DevOps build and release pipelines.

In this series we’ll explore (among other things) how to:

  • Build an Azure Functions App when code is pushed to GitHub
  • Run unit tests as part of the build
  • Create multiple Azure Pipeline artifacts
  • Release to a test Function App environment in Azure
  • Run automated functional end-to-end tests
  • Release to a production Function App environment in Azure
  • Run smoke tests after a production deployment

…in addition to a whole host of other things such as: YAML build files, Azure Pipeline variables, setting Azure Function app settings from an Azure release pipeline, and using post-deployment gates.

An Overview of Azure DevOps

Azure DevOps is a collection of cloud services to facilitate team DevOps practices and includes:

  • Azure Boards
  • Azure Pipelines
  • Azure Repos
  • Azure Test Plans
  • Azure Artifacts

In this series we’ll be focussing on Azure Pipelines to build, test, and deploy the Function App automatically when new changes are pushed to GitHub.

InvestFunctionApp Overview

The source code for the demo app can be found on GitHub.

The InvestFunctionApp contains a number of functions that allow money to be invested in a portfolio. The initial entry point is an HTTP-triggered function that allows the investor ID to be specified along with how much to add to the investor’s portfolio. Depending on the investor’s target allocations, the amount will be invested in either bonds or shares (a sketch of this decision follows the list below).

There are 4 functions involved in this workflow:

  1. Portfolio function: HTTP trigger, starts the workflow, creates a queue message
  2. CalculatePortfolioAllocation function: queue trigger from the output of (1), calculates where to invest and writes to either buy-stocks queue (3) or buy-bonds queue (4)
  3. BuyStocks function: triggered from buy-stocks queue, simulates buying stocks and updates the investor’s portfolio in Azure Table storage
  4. BuyBonds function: triggered from buy-bonds queue, simulates buying bonds and updates the investor’s portfolio in Azure Table storage
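
As a rough illustration of the decision made in step 2, the core allocation logic might look something like the following minimal sketch (the class name, signature, and threshold logic here are illustrative assumptions, not the actual demo app code):

// Illustrative sketch only: decide which queue a deposit should be routed to,
// based on whether the investor is currently under their target bond allocation.
public static class AllocationDecisionSketch
{
    public static string ChooseQueue(decimal stockValue, decimal bondValue, decimal targetBondPercentage)
    {
        decimal total = stockValue + bondValue;
        decimal currentBondPercentage = total == 0 ? 0 : (bondValue / total) * 100;

        // Under-allocated in bonds? Buy bonds; otherwise buy stocks.
        return currentBondPercentage < targetBondPercentage ? "buy-bonds" : "buy-stocks";
    }
}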

The app is designed to demonstrate the facilities of Azure Pipelines and is simplified in many ways, including minimal error checking, auditing, transactions, security, etc.

Azure DevOps Build Pipeline

The Azure Pipelines build will be triggered automatically when changes are pushed to the GitHub repository.

The build pipeline will have the following steps:

  1. Build the solution
  2. Run unit tests
  3. Publish test results
  4. Package the Function App
  5. Publish the packaged Function App as a build artifact
  6. Copy the functional end-to-end tests
  7. Publish the functional end-to-end tests as a separate build artifact

Assuming all these steps run without error, the two build artifacts can be passed to and used by the release pipeline.

Azure DevOps Release Pipeline

The release pipeline will be triggered automatically when the build pipeline completes without error.

The following diagram shows what the resulting release pipeline will look like:

Azure DevOps Release Pipeline example

The release pipeline will:

  1. Deploy the Function App to a test Function App in Azure
  2. Run functional end-to-end tests (written in xUnit.net) against this test deployment
  3. If the tests in (2) pass, deploy to production
  4. After deploying to production, call a smoke test function to verify the deployment was successful

In the next part of this series we’ll create the Function Apps, sign up to Azure DevOps, create a new DevOps project, and take a closer look at the demo function app.


Mocking HttpRequest Body Content When Testing Azure Function HTTP Trigger Functions

When creating Azure Functions that are triggered by an HTTP request, you may want to write unit tests for the function Run method. These unit tests can be executed outside of the Azure Functions runtime, just like testing regular methods.

If your HTTP-triggered function takes as the function input an HttpRequest (as opposed to an automatically JSON-deserialized class) you may need to provide request data in your test.

As an example, consider the following code snippet that defines an HTTP-triggered function.

[FunctionName("Portfolio")]
[return: Queue("deposit-requests")]
public static async Task<DepositRequest> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = "portfolio/{investorId}")]HttpRequest req,
    [Table("Portfolio", InvestorType.Individual, "{investorId}")] Investor investor,
    string investorId,
    ILogger log)
{
    log.LogInformation($"C# HTTP trigger function processed a request.");

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();

    log.LogInformation($"Request body: {requestBody}");

    var deposit = JsonConvert.DeserializeObject<Deposit>(requestBody);

    // etc.
}

If the preceding code is executed in a test, some content needs to be provided to be used when accessing req.Body. To do this using Moq, a mock HttpRequest can be created that returns a specified Stream instance for req.Body.

If you want to create a request body that contains a JSON payload, you can use the following helper method in your tests:

private static Mock<HttpRequest> CreateMockRequest(object body)
{
    // Serialize the body object to JSON and write it to a MemoryStream
    var ms = new MemoryStream();
    var sw = new StreamWriter(ms);

    var json = JsonConvert.SerializeObject(body);

    sw.Write(json);
    sw.Flush();

    // Rewind the stream so the code under test reads it from the beginning
    ms.Position = 0;

    var mockRequest = new Mock<HttpRequest>();
    mockRequest.Setup(x => x.Body).Returns(ms);

    return mockRequest;
}

As an example of using this method in a test:

[Fact]
public async Task ReturnCorrectDepositInformation()
{
    var deposit = new Deposit { Amount = 42 };
    var investor = new Investor { };

    Mock<HttpRequest> mockRequest = CreateMockRequest(deposit);

    DepositRequest result = await Portfolio.Run(mockRequest.Object, investor, "42", new Mock<ILogger>().Object);

    Assert.Equal(42, result.Amount);
    Assert.Same(investor, result.Investor);
}

When the preceding test is run, the function run method will get the contents of the memory stream that contains the JSON.

To learn more about using Moq to create/configure/use mock objects check out my Mocking in .NET Core Unit Tests with Moq: Getting Started Pluralsight course, or to learn more about MemoryStream and how to work with streams in C# check out my Working with Files and Streams course.


Remote Debugging Azure Functions V2 "The breakpoint will not currently be hit. No symbols have been loaded for this document"

Sometimes it can be tricky to attach the Visual Studio debugger to a deployed Azure Functions app. For example, if you use Cloud Explorer you can right-click on the deployed Azure Function and choose Attach Debugger, as the following screenshot shows:

Using Cloud Explorer to debug an Azure Function

While this may seem to work at first, you may find that your breakpoints are not actually hit when function app code is executing, with the message “The breakpoint will not currently be hit. No symbols have been loaded for this document”.

The breakpoint will not currently be hit. No symbols have been loaded for this document

As an alternative to attaching via Cloud Explorer, you can try the following approach:

1 Log in to Azure Portal and navigate to your deployed function app.

2 Download the publish profile

Downloading publish profile for Azure Function app

3 Open the downloaded file

Make a note of the userName, userPWD and destinationAppUrl values.

4 Attach Visual Studio Debugger to Azure Function App

  1. Make sure your function app project is open in Visual Studio
  2. Make sure that you have deployed a debug version of the function app to Azure
  3. On the Debug menu choose Attach to Process...
  4. For the Attach to value, click the Select... button and un-tick everything except Managed (CoreCLR) code
  5. In the Connection target enter the destinationAppUrl (without the preceding http) followed by :4022 – for example: investfunctionappdemotest.azurewebsites.net:4022 – and hit Enter
  6. You should now see an Enter Your Credentials popup box; use the userName and userPWD from step 3 and click OK
  7. Wait a few seconds for Visual Studio to do its thing
  8. Click the w3wp.exe process and then click the Attach button (see screenshot below). Be patient after clicking as it may take quite a while for all the debug symbols to load.

Attaching to the Azure Functions w3wp.exe process

5 Set your breakpoints as desired

Breakpoint set in function run method

6 Invoke your function code and you should see your breakpoint hit

Once again this may take a while, so be patient; you may also see “a remote operation is taking longer than expected”.

Azure Function breakpoint being hit in Visual Studio


Azure Functions Dependency Injection with Autofac

This post refers specifically to Azure Functions V2.

If you want to write automated tests for Azure Functions methods and want to be able to control dependencies (e.g. to inject mock versions of things) you can set up dependency injection.

One way to do this is to install the AzureFunctions.Autofac NuGet package into your functions project.

Once installed, this package allows you to inject dependencies into your function methods at runtime.

Step 1: Create DI Mappings

The first step (after package installation) is to create a class that configures the dependencies. As an example suppose there was a function method that needed to make use of an implementation of an IInvestementAllocator. The following class can be added to the functions project:

using Autofac;
using AzureFunctions.Autofac.Configuration;

namespace InvestFunctionApp
{
    public class DIConfig
    {
        public DIConfig(string functionName)
        {
            DependencyInjection.Initialize(builder =>
            {
                builder.RegisterType<NaiveInvestementAllocator>().As<IInvestementAllocator>(); // Naive

            }, functionName);
        }
    }
}

In the preceding code, a constructor is defined that receives the name of the function that’s being injected into. Inside the constructor, types can be registered for dependency injection. In the preceding code the IInvestementAllocator interface is being mapped to the concrete class NaiveInvestementAllocator.

Step 2: Decorate Function Method Parameters

Now the DI registrations have been configured, the registered types can be injected in function methods. To do this the [Inject] attribute is applied to one or more parameters as the following code demonstrates:

[FunctionName("CalculatePortfolioAllocation")]
public static void Run(
    [QueueTrigger("deposit-requests")]DepositRequest depositRequest,
    [Inject] IInvestementAllocator investementAllocator,
    ILogger log)
    {
        log.LogInformation($"C# Queue trigger function processed: {depositRequest}");

        InvestementAllocation r = investementAllocator.Calculate(depositRequest.Amount, depositRequest.Investor);
    }

Notice in the preceding code the [Inject] attribute is applied to the IInvestementAllocator investementAllocator parameter. This IInvestementAllocator is the same interface that was registered earlier in the DIConfig class.

Step 3: Select DI Configuration

The final step to make all this work is to add an attribute to the class that contains the function method (that uses [Inject]). The attribute used is the DependencyInjectionConfig attribute that takes the type containing the DI configuration as a parameter, for example: [DependencyInjectionConfig(typeof(DIConfig))]

The full function code is as follows:

using AzureFunctions.Autofac;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace InvestFunctionApp
{
    [DependencyInjectionConfig(typeof(DIConfig))]
    public static class CalculatePortfolioAllocation
    {
        [FunctionName("CalculatePortfolioAllocation")]
        public static void Run(
            [QueueTrigger("deposit-requests")]DepositRequest depositRequest,
            [Inject] IInvestementAllocator investementAllocator,
            ILogger log)
        {
            log.LogInformation($"C# Queue trigger function processed: {depositRequest}");

            InvestementAllocation r = investementAllocator.Calculate(depositRequest.Amount, depositRequest.Investor);
        }
    }
}

At runtime, when the CalculatePortfolioAllocation function runs, an instance of NaiveInvestementAllocator will be supplied to the function.

The library also supports features such as named dependencies and multiple DI configurations; to read more, check out the project on GitHub.
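
As a rough sketch of what named dependencies might look like, registrations can be given names and then requested by name at the injection site. Note that AggressiveInvestementAllocator is a hypothetical class for illustration, and the exact syntax should be checked against the AzureFunctions.Autofac documentation:

using Autofac;
using AzureFunctions.Autofac.Configuration;

namespace InvestFunctionApp
{
    public class NamedDIConfig
    {
        public NamedDIConfig(string functionName)
        {
            DependencyInjection.Initialize(builder =>
            {
                // Register two implementations of the same interface under different names
                builder.RegisterType<NaiveInvestementAllocator>().Named<IInvestementAllocator>("naive");
                builder.RegisterType<AggressiveInvestementAllocator>().Named<IInvestementAllocator>("aggressive"); // hypothetical class
            }, functionName);
        }
    }
}

A function parameter could then request a specific registration by name, for example: [Inject("naive")] IInvestementAllocator investementAllocator.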


Testing Precompiled Azure Functions Overview

Just because serverless allows us to quickly deploy value, it doesn’t mean that testing is now obsolete.

If we’re using Azure Functions as our serverless platform we can write our code (for example C#) and test it before deploying to Azure. In this case we’re talking about precompiled Azure Functions as opposed to earlier incarnations of Azure Functions that used .csx script files.

Working with precompiled functions means the code can be developed and tested on a local development machine. The code we write is familiar C# with some additional attributes to integrate the code with the Azure Functions runtime.

Because the code is just regular C#, we can use familiar testing tools such as MSTest, xUnit.net, or NUnit. Using these familiar testing frameworks it’s possible to write tests that operate at different levels of granularity.

One way to categorize these tests is into:

  • Unit tests to check core business logic/value
  • Integration tests to check function run methods are operating correctly
  • End-to-end workflow tests that check multiple functions working together

To enable effective automated testing it may be necessary to write functions in such a way as to make them testable, for example by allowing function run method dependencies to be automatically injected at runtime, while at test time mock versions can be supplied using a framework such as AzureFunctions.Autofac.
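
As a minimal sketch of this idea, using the InvestFunctionApp types shown in the dependency injection post above (and assuming DepositRequest exposes settable Amount and Investor properties), a test could supply a mock IInvestementAllocator directly to the function run method and verify that the business logic is invoked as expected:

using Microsoft.Extensions.Logging;
using Moq;
using Xunit;

public class CalculatePortfolioAllocationTests
{
    [Fact]
    public void CallsAllocatorWithDepositDetails()
    {
        // Arrange: create a deposit request and a mock allocator
        var investor = new Investor();
        var depositRequest = new DepositRequest { Amount = 42, Investor = investor };
        var mockAllocator = new Mock<IInvestementAllocator>();

        // Act: call the function run method directly, outside the Functions runtime
        CalculatePortfolioAllocation.Run(depositRequest, mockAllocator.Object, new Mock<ILogger>().Object);

        // Assert: the core business logic was called with the expected values
        mockAllocator.Verify(x => x.Calculate(42, investor), Times.Once());
    }
}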

There are other tools that allow us to more easily test functions locally such as the local functions runtime and the Azure storage emulator.

To learn more about using these tools and techniques to test Azure Functions, check out my Pluralsight course Testing Precompiled Azure Functions: Deep Dive.


Automatic Input Blob Binding in Azure Functions from Queue Trigger Message Data

Reading additional blob content when an Azure Function is triggered can be accomplished by using an input blob binding by defining a parameter in the function run method and decorating it with the [Blob] attribute.

For example, suppose you have a number of blobs that need converting in some way. You could initiate a process whereby the list of blob files that need processing are added to a storage queue. Each queue message contains the name of the blob that needs processing. This would allow the conversion function to scale out to convert multiple blobs in parallel.

The following code demonstrates one approach to do this. The code is triggered from a queue message that contains text representing the input blob filename that needs reading, converting, and then outputting to an output blob container.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run([QueueTrigger("capitalize-names")]string inputBlobPath)
        {
            string originalName = ReadInputName(inputBlobPath);

            var capitalizedName = originalName.ToUpperInvariant();

            WriteOutputName(inputBlobPath, capitalizedName);
        }
        
        private static string ReadInputName(string blobPath)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-in");

            var blobReference = container.GetBlockBlobReference(blobPath);

            string originalName = blobReference.DownloadText();

            return originalName;
        }

        private static void WriteOutputName(string blobPath, string capitalizedName)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-out");

            CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(blobPath);
            cloudBlockBlob.UploadText(capitalizedName);            
        }

    }
}

In the preceding code, there is a lot of blob access code (which could be refactored). This function could however be greatly simplified by the use of one of the built-in binding expression tokens. Binding expression tokens can be used in binding expressions and are specified inside a pair of curly braces {…}. The {queueTrigger} binding token will extract the content of the incoming queue message that triggered a function.

For example, the code could be refactored as follows:

using System.IO;
using Microsoft.Azure.WebJobs;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run(
            [QueueTrigger("capitalize-names")]string inputBlobPath,
            [Blob("names-in/{queueTrigger}", FileAccess.Read)] string originalName,
            [Blob("names-out/{queueTrigger}")] out string capitalizedName)
        {
            capitalizedName = originalName.ToUpperInvariant();
        }
    }
}

In the preceding code, the two [Blob] binding paths make use of the {queueTrigger} token. When the function is triggered, the queue message contains the name of the file to be processed. In the two [Blob] binding expressions, the {queueTrigger} token part will automatically be replaced with the text contents of the incoming message. For example if the message contained the text “File1.txt” then the two blob bindings would be set to names-in/File1.txt and names-out/File1.txt respectively. This means the input blob originalName string will automatically be read when the function is triggered.
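
As a quick way to try this out, a message naming the blob can be added to the capitalize-names queue. The following is a minimal sketch assuming the local development storage account and the same Microsoft.WindowsAzure.Storage package used above:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class EnqueueConversionSketch
{
    public static void EnqueueName()
    {
        // Connect to the local development storage account, as in the function above
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("capitalize-names");
        queue.CreateIfNotExists();

        // The message text becomes the {queueTrigger} token value, e.g. "File1.txt"
        queue.AddMessage(new CloudQueueMessage("File1.txt"));
    }
}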

To learn more about creating precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.


Dynamic Binding in Azure Functions with Imperative Runtime Bindings

When creating precompiled Azure Functions, bindings (such as a blob output binding) can be declared in the function code; for example, the following code defines a blob output binding:

[Blob("todo/{rand-guid}")]

This binding creates a new blob with a random (GUID) name. This style of binding is called declarative binding; the binding details are declared as part of the binding attribute.
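
As a fuller illustration, a declaratively-bound function might look something like the following minimal sketch (the queue name and the use of a return-value binding here are illustrative assumptions):

using Microsoft.Azure.WebJobs;

public static class CreateToDoItemDeclarative
{
    // The blob path is fixed at compile time; only the {rand-guid} token varies,
    // producing a random GUID blob name at runtime.
    [FunctionName("CreateToDoItemDeclarative")]
    [return: Blob("todo/{rand-guid}")]
    public static string Run([QueueTrigger("todo-items")] string queueItem)
    {
        return queueItem; // the returned string becomes the blob content
    }
}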

In addition to declarative binding, Azure Functions also offers imperative binding. With this style of binding, the details of the binding can be chosen at runtime. These details could be derived from the incoming function trigger data or from an external place such as a configuration value or database item.

To create imperative bindings, rather than using a specific binding attribute, a parameter of type IBinder is used. At runtime, a binding can be created (such as a blob binding, queue binding, etc.) using this IBinder. The Bind<T> method of the IBinder can be used with T representing an input/output type that is supported by the binding you intend to use.

The following code shows imperative binding in action. In this example blobs are created and the blob path is derived from the incoming JSON data, namely the category.

public static class CreateToDoItem
{
    [FunctionName("CreateToDoItem")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req,
        IBinder binder,
        TraceWriter log)
    {
        ToDoItem item = await req.Content.ReadAsAsync<ToDoItem>();
        item.Id = Guid.NewGuid().ToString();

        BlobAttribute dynamicBlobBinding = new BlobAttribute(blobPath: $"todo/{item.Category}/{item.Id}");

        using (var writer = binder.Bind<TextWriter>(dynamicBlobBinding))
        {
            writer.Write(JsonConvert.SerializeObject(item));
        }

        return req.CreateResponse(HttpStatusCode.OK, "Added " + item.Description);
    }
}

If the following 2 POSTs are made:

{
    "Description" : "Lift weights",
    "Category" : "Gym"
}
{
    "Description" : "Feed the dog",
    "Category" : "Home"
}

Then 2 blobs will be output with the following paths – note the random filenames and the imperatively-bound paths Gym and Home:

http://127.0.0.1:10000/devstoreaccount1/todo/Gym/5dc4eb72-0ae6-42fc-9a8b-f4bf646dcd28

http://127.0.0.1:10000/devstoreaccount1/todo/Home/530373ef-02bc-4200-a4e7-948448ac081b


Create Precompiled Azure Functions With Azure Event Grid Triggers

Visual Studio can be used to create precompiled Azure Functions using standard C# classes and tools/techniques, and then publish them to Azure.

This article assumes you’ve created the resources (resource group, Event Grid Topic, etc.) from this previous article.

In Visual Studio 2017, create a new Azure Functions project.

Next, update the pre-installed Microsoft.NET.Sdk.Functions NuGet package to the latest version.

To get access to the Azure Event Grid function trigger attribute, install the Microsoft.Azure.WebJobs.Extensions.EventGrid NuGet package (this package is currently in preview/beta).

Add a new class to the project with the following code:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Azure.WebJobs.Host;

namespace DCTDemos
{
    public static class Class1
    {
        [FunctionName("SendNewLeadWelcomeLetter")]
        public static void SendNewLeadWelcomeLetter([EventGridTrigger] EventGridEvent eventGridEvent, TraceWriter log)
        {
            log.Info($"EventGridEvent" +
                $"\n\tId:{eventGridEvent.Id}" +
                $"\n\tTopic:{eventGridEvent.Topic}" +
                $"\n\tSubject:{eventGridEvent.Subject}" +
                $"\n\tType:{eventGridEvent.EventType}" +
                $"\n\tData:{eventGridEvent.Data}");
        }
    }
}

Notice in the preceding code that the method name SendNewLeadWelcomeLetter is the same as specified in the FunctionName attribute; this may be required due to a bug in the current preview/beta implementation – if these are different, your function may not be executed when an event occurs.

Right-click on the function project and choose Publish. Follow the wizard, create a new Function App, and select the resource group where your Event Grid Topic is. Select West US 2 if you need to create any new Azure resources/storage accounts/etc.

Once deployed, head over to Azure Portal, open your new function app and select the newly deployed SendNewLeadWelcomeLetter function:

Adding an Azure Event Grid subscription for an Azure Function

At the top right select Add Event Grid subscription and follow the wizard to create a new subscription – this will enable the new function to be triggered by an Event Grid subscription. As part of the subscription we’ll limit the event type to new-sales-lead-created:

Adding an Azure Event Grid subscription for an Azure Function

Next, go to the function app Platform features tab and select Log Streaming. We can now use Postman to POST the following JSON to the Event Grid Topic we created earlier.

[
    {
        "id": "1236",
        "eventType": "new-sales-lead-created",
        "subject": "myapp/sales/leads",
        "eventTime": "2017-12-08T01:01:36+00:00",
        "data":{
            "firstName": "Amrit",
            "postalAddress": "xyz"
        }
    }
]

Head back to the streaming logs and you should see your precompiled Azure Function executing in response to the Event Grid event:

2017-12-08T06:38:25  Welcome, you are now connected to log-streaming service.

2017-12-08T06:38:49.841 Function started (Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840)

2017-12-08T06:38:49.841 EventGridEvent
    Id:1234
    Topic:/subscriptions/797e1c4e-3fd4-4cd6-84b8-ef103cee8b6b/resourceGroups/DCTEGDemo/providers/Microsoft.EventGrid/topics/sales-leads
    Subject:myapp/sales/leads
    Type:new-sales-lead-created
    Data:{
      "firstName": "Amrit",
      "postalAddress": "xyz"
    }

2017-12-08T06:38:49.841 Function completed (Success, Id=ec927bc1-fa15-4211-a7bd-8e593f5d4840, Duration=0ms)

To learn how to create precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.


New Pluralsight Course: Writing and Testing Precompiled Azure Functions in Visual Studio 2017

Azure Functions have come a long way in a short time. With newer releases you can now create functions in Visual Studio using standard C# class files along with specific attributes to help define triggers, bindings, etc. This means that all the familiar powerful Visual Studio tools, workflows, NuGet packages, etc. can be used to develop Azure Functions. Visual Studio also provides publish support so you can upload your functions to the cloud once you are happy with them. Another feature that makes developing functions in Visual Studio easier is the local functions runtime that lets you run and debug functions on your local development machine, without needing to publish to the cloud just to test them.

In my new Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course you will learn how to:

  • Set up your local development environment
  • Develop and test Azure Functions locally
  • Publish functions to Azure
  • Create functions triggered from incoming HTTP requests
  • Trigger functions from Azure Storage queues and blobs
  • Trigger functions from Azure Service Bus and Azure Event Hubs
  • Trigger functions periodically on a timer
  • Unit test Azure Function business logic

Check out the full course outline for more details.
