Unit Testing C# File Access Code with System.IO.Abstractions

It can be difficult to write unit tests for code that accesses the file system.

It’s possible to write integration tests that read in an actual file from the file system, do some processing, and check the resulting output file (or result) for correctness. There are a number of potential problems with these types of integration tests, including the potential for them to run more slowly (due to real I/O access overheads), additional test file management/setup code, etc. (This does not mean that some integration tests wouldn’t be useful, however.)

The System.IO.Abstractions NuGet package can help to make file access code more testable. This package provides a layer of abstraction over the file system that is API-compatible with existing code.

Take the following code as an example:

using System.IO;
namespace ConsoleApp1
{
    public class FileProcessorNotTestable
    {
        public void ConvertFirstLineToUpper(string inputFilePath)
        {
            string outputFilePath = Path.ChangeExtension(inputFilePath, ".out.txt");

            using (StreamReader inputReader = File.OpenText(inputFilePath))
            using (StreamWriter outputWriter = File.CreateText(outputFilePath))
            {
                bool isFirstLine = true;

                while (!inputReader.EndOfStream)
                {
                    string line = inputReader.ReadLine();

                    if (isFirstLine)
                    {
                        line = line.ToUpperInvariant();
                        isFirstLine = false;
                    }

                    outputWriter.WriteLine(line);
                }
            }
        }
    }
}

The preceding code opens a text file and writes its contents to a new output file, with the first line converted to uppercase.

This class is not easy to unit test, however: it is tightly coupled to the physical file system via the calls to File.OpenText and File.CreateText.

Once the System.IO.Abstractions NuGet package is installed, the class can be refactored as follows:

using System.IO;
using System.IO.Abstractions;

namespace ConsoleApp1
{
    public class FileProcessorTestable
    {
        private readonly IFileSystem _fileSystem;

        public FileProcessorTestable() : this (new FileSystem()) {}

        public FileProcessorTestable(IFileSystem fileSystem)
        {
            _fileSystem = fileSystem;
        }

        public void ConvertFirstLineToUpper(string inputFilePath)
        {
            string outputFilePath = Path.ChangeExtension(inputFilePath, ".out.txt");

            using (StreamReader inputReader = _fileSystem.File.OpenText(inputFilePath))
            using (StreamWriter outputWriter = _fileSystem.File.CreateText(outputFilePath))
            {
                bool isFirstLine = true;

                while (!inputReader.EndOfStream)
                {
                    string line = inputReader.ReadLine();

                    if (isFirstLine)
                    {
                        line = line.ToUpperInvariant();
                        isFirstLine = false;
                    }

                    outputWriter.WriteLine(line);
                }
            }
        }
    }
}

The key thing to notice in the preceding code is the ability to pass in an IFileSystem as a constructor parameter. The calls to File.OpenText and File.CreateText are now redirected to _fileSystem.File.OpenText and _fileSystem.File.CreateText respectively.

If the parameterless constructor is used (e.g. in production at runtime), an instance of the real FileSystem will be used; at test time, a mock IFileSystem can be supplied instead.

Handily, the System.IO.Abstractions.TestingHelpers NuGet package provides a pre-built mock file system that can be used in unit tests, as the following simple test demonstrates:

using System.IO.Abstractions.TestingHelpers;
using Xunit;

namespace XUnitTestProject1
{
    public class FileProcessorTestableShould
    {
        [Fact]
        public void ConvertFirstLine()
        {
            var mockFileSystem = new MockFileSystem();

            var mockInputFile = new MockFileData("line1\nline2\nline3");

            mockFileSystem.AddFile(@"C:\temp\in.txt", mockInputFile);

            var sut = new FileProcessorTestable(mockFileSystem);
            sut.ConvertFirstLineToUpper(@"C:\temp\in.txt");

            MockFileData mockOutputFile = mockFileSystem.GetFile(@"C:\temp\in.out.txt");

            string[] outputLines = mockOutputFile.TextContents.SplitLines();

            Assert.Equal("LINE1", outputLines[0]);
            Assert.Equal("line2", outputLines[1]);
            Assert.Equal("line3", outputLines[2]);
        }
    }
}

To see this in action or to learn more about file access, check out my Working with Files and Streams in C# Pluralsight course.

Customizing C# Object Member Display During Debugging

In a previous post I wrote about Customising the Appearance of Debug Information in Visual Studio with the DebuggerDisplay Attribute. In addition to controlling the high-level debugger appearance of an object, we can also exert a lot more control over how the object appears in the debugger by using the DebuggerTypeProxy attribute.
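As a quick recap, the DebuggerDisplay attribute takes a format string that can reference an object’s members to control the one-line summary shown in debugger windows; a minimal sketch (hypothetical class):

using System.Diagnostics;

[DebuggerDisplay("{Name} ({Age})")]
class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}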

For example, suppose we have the following (somewhat arbitrary) class:

class DataTransfer
{
    public string Name { get; set; }
    public string ValueInHex { get; set; }
}

By default, in the debugger it would look like the following:

Default Debugger View

To customize the display of the object members, the DebuggerTypeProxy attribute can be applied.

The first step is to create a class to act as a display proxy. This class takes the original object as a constructor parameter and then exposes the custom view via public properties.

For example, suppose that we wanted a decimal display of the hex number that is stored as a string property in the original DataTransfer object:

class DataTransferDebugView
{
    private readonly DataTransfer _data;

    public DataTransferDebugView(DataTransfer data)
    {
        _data = data;
    }

    public string NameUpper => _data.Name.ToUpperInvariant();
    public string ValueDecimal
    {
        get
        {
            bool isValidHex = int.TryParse(_data.ValueInHex, System.Globalization.NumberStyles.HexNumber, null, out var value);

            if (isValidHex)
            {
                return value.ToString();
            }

            return "INVALID HEX STRING";
        }
    }
}

Once this view object is defined, it can be selected by decorating the DataTransfer class with the DebuggerTypeProxy attribute as follows:

[DebuggerTypeProxy(typeof(DataTransferDebugView))]
class DataTransfer
{
    public string Name { get; set; }
    public string ValueInHex { get; set; }
}

Now in the debugger, the following can be seen:

Custom debug view showing hex value as a decimal

Also notice in the preceding image that the original object view is still available by expanding the Raw View section.

To learn more about C# attributes and even how to create your own custom ones, check out my C# Attributes: Power and Flexibility for Your Code course at Pluralsight.

MSTest V2

In the (relatively) distant past, MSTest was often used by organizations because it was provided by Microsoft “in the box” with Visual Studio/.NET. Because of this, some organizations trusted MSTest over open source testing frameworks such as NUnit. This was at a time when the .NET open source ecosystem was not as advanced as it is today and before Microsoft began open sourcing some of their own products.

Nowadays MSTest is cross-platform and open source and is known as MSTest V2. As the documentation states, it “is a fully supported, open source and cross-platform implementation of the MSTest test framework with which to write tests targeting .NET Framework, .NET Core and ASP.NET Core on Windows, Linux, and Mac.”

MSTest V2 provides typical assert functionality, such as asserting on the values of strings, numbers, collections, thrown exceptions, etc. Also like other testing frameworks, MSTest V2 allows customization of the test execution lifecycle, such as running additional setup code before each test executes. The framework also allows the creation of data-driven tests (a single test method executing multiple times with different input test data) and the ability to extend the framework with custom asserts and custom test attributes.
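For example, a minimal sketch of a data-driven MSTest V2 test (class and values hypothetical):

using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace MSTestV2Demo
{
    [TestClass]
    public class CalculatorShould
    {
        // The test method runs once for each DataRow supplied
        [DataTestMethod]
        [DataRow(1, 2, 3)]
        [DataRow(-1, 1, 0)]
        public void AddTwoNumbers(int a, int b, int expected)
        {
            Assert.AreEqual(expected, a + b);
        }
    }
}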

You can find out more about MSTest V2 at the GitHub repository, the documentation, or check out my Pluralsight course: Automated Testing with MSTest V2.

Prevent Secrets From Accidentally Being Committed to Source Control in ASP.NET Core Apps

One problem when dealing with developer “secrets” in development is accidentally checking them into source control. These secrets could be connection strings to dev resources, user IDs, product keys, etc.

To help prevent this from accidentally happening, the secrets can be stored outside of the project tree/source control repository. This means that when the code is checked in, there will be no secrets in the repository.

Each developer will have their secrets stored outside of the project code. When the app is run, these secrets can be retrieved at runtime from outside the project structure.

One way to accomplish this in ASP.NET Core projects is to make use of the Microsoft.Extensions.SecretManager.Tools NuGet package, which enables use of the command line tool. (Also, if you are targeting .NET Core 1.x, install the Microsoft.Extensions.Configuration.UserSecrets NuGet package.)

Setting Up User Secrets

After creating a new ASP.NET Core project, add a tools reference to the NuGet package to the project; this will add the following item to the project file:

<DotNetCliToolReference Include="Microsoft.Extensions.SecretManager.Tools" Version="2.0.0" />

Build the project and then right click the project and you will see a new item called “Manage User Secrets” as the following screenshot shows:

Managing user secrets in Visual Studio

Clicking the menu item will open a secrets.json file and also add an element named UserSecretsId to the project file. The content of this element is a GUID; the GUID is arbitrary but should be unique for each and every project.

<UserSecretsId>c83d8f04-8dba-4be4-8635-b5364f54e444</UserSecretsId>

User secrets will be stored in the secrets.json file which will be in %APPDATA%\Microsoft\UserSecrets\<user_secrets_id>\secrets.json on Windows or ~/.microsoft/usersecrets/<user_secrets_id>/secrets.json on Linux and macOS. Notice these paths contain the user_secrets_id that matches the GUID in the project file. In this way each project has a separate set of user secrets.

The secrets.json file contains key value pairs.

Managing User Secrets

User secrets can be added by editing the JSON file or by using the command line (from the project directory).

To list user secrets, type: dotnet user-secrets list. At the moment this will return “No secrets configured for this application.”

To set (add) a secret: dotnet user-secrets set "Id" "42"

The secrets.json file now contains the following:

{
  "Id": "42"
}

Other dotnet user-secrets commands include:

  • clear - Deletes all the application secrets
  • list - Lists all the application secrets
  • remove - Removes the specified user secret
  • set - Sets the user secret to the specified value
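For example, to remove the secret that was added above and then confirm it has gone (run from the project directory):

dotnet user-secrets remove "Id"
dotnet user-secrets list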

Accessing User Secrets in Code

To retrieve user secrets in code, in the startup class, access the item by key, for example:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    var secretId = Configuration["Id"]; // returns the string "42"
}
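In ASP.NET Core 2.0, the user secrets configuration source is added automatically when the app runs in the Development environment; in earlier versions (or when building configuration manually) it can be added explicitly. A sketch, assuming a typical Startup class:

public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json", optional: true);

    if (env.IsDevelopment())
    {
        // Pull in key-value pairs from this project's secrets.json
        builder.AddUserSecrets<Startup>();
    }

    Configuration = builder.Build();
}

public IConfigurationRoot Configuration { get; }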

One thing to bear in mind is that secrets are not encrypted in the secrets.json file. As the documentation states: “The Secret Manager tool doesn't encrypt the stored secrets and shouldn't be treated as a trusted store. It's for development purposes only. The keys and values are stored in a JSON configuration file in the user profile directory.” and “You can store and protect Azure test and production secrets with the Azure Key Vault configuration provider.”

There’s a lot more information in the documentation and if you plan to use this tool you should read through it.

Testing Precompiled Azure Functions Overview

Just because serverless allows us to quickly deploy value, it doesn’t mean that testing is now obsolete.

If we’re using Azure Functions as our serverless platform, we can write our code (for example in C#) and test it before deploying to Azure. In this case we’re talking about precompiled Azure Functions, as opposed to earlier incarnations of Azure Functions that used .csx script files.

Working with precompiled functions means the code can be developed and tested on a local development machine. The code we write is familiar C# with some additional attributes to integrate the code with the Azure Functions runtime.

Because the code is just regular C#, we can use familiar testing tools such as MSTest, xUnit.net, or NUnit. Using these familiar testing frameworks it’s possible to write tests that operate at different levels of granularity.

One way to categorize these tests is:

  • Unit tests to check core business logic/value
  • Integration tests to check function run methods are operating correctly
  • End-to-end workflow tests that check multiple functions working together
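As an illustration of the first of these categories, business logic that has been factored out of the function run method can be tested directly, with no Azure dependencies; a minimal xUnit.net sketch (names hypothetical):

using Xunit;

public static class GreetingLogic
{
    // Core business logic, free of any Azure Functions runtime types
    public static string MakeGreeting(string name) => $"Hello, {name.Trim()}!";
}

public class GreetingLogicShould
{
    [Fact]
    public void GreetByName()
    {
        Assert.Equal("Hello, Sarah!", GreetingLogic.MakeGreeting(" Sarah "));
    }
}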

To enable effective automated testing, it may be necessary to write functions in such a way as to make them testable, for example by allowing function run method dependencies to be injected automatically at runtime. At test time, mock versions of these dependencies can be supplied instead, for example by using a framework such as AzureFunctions.Autofac.
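Whatever injection framework is chosen, the underlying shape is broadly the same: the function logic depends on an interface rather than a concrete type, so a test can substitute a mock implementation. A hand-rolled illustration of the idea (names hypothetical; this is not the AzureFunctions.Autofac API):

public interface IOrderValidator
{
    bool IsValid(string orderJson);
}

public class OrderProcessor
{
    private readonly IOrderValidator _validator;

    // At runtime a DI framework supplies the real validator;
    // at test time a mock implementation can be passed in directly.
    public OrderProcessor(IOrderValidator validator)
    {
        _validator = validator;
    }

    public string Process(string orderJson)
    {
        return _validator.IsValid(orderJson) ? "accepted" : "rejected";
    }
}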

There are other tools that allow us to more easily test functions locally such as the local functions runtime and the Azure storage emulator.

To learn more about using these tools and techniques to test Azure Functions, check out my Pluralsight course Testing Precompiled Azure Functions: Deep Dive.

Stack Overflow Developer Survey 2018 Overview for .NET Developers

The 2018 Stack Overflow Developer Survey was recently released.

This article summarizes some points that .NET developers may find interesting, in addition to some other general items of potential interest.

.NET Points of Interest

  • C# is the 8th most popular programming language among professional developers at 35.3%.
  • TypeScript is the 12th most popular language among professional developers at 18.3%.
  • VB.NET is the 18th most popular programming language at 6.9%.
  • .NET Core is the 4th most popular framework among professional developers at 27.2%.
  • SQL Server is the 2nd most used database among professional developers at 41.6%.
  • Windows Desktop or Server is the 2nd most developed-for platform among professional developers at 35.2%.
  • Azure is the 10th most developed-for platform among professional developers at 11.4%.
  • TypeScript is the 4th most loved language at 67%.
  • C# is the 8th most loved language.
  • VB.NET is the 4th most dreaded language.
  • .NET Core is the 5th most loved framework, the 8th most dreaded framework, and the 5th most wanted framework.
  • Azure is the 5th most loved database (Tables, CosmosDB, SQL, etc.).
  • SQL Server is the 10th most loved database.
  • Visual Studio Code and Visual Studio are the top two most popular development environments respectively (among all respondents).
  • Professional developers primarily use Windows (49.4%) as their development operating system.
  • F# is associated with the highest salary worldwide, with C# 16th highest.
  • .NET development technologies cluster around C#, Azure, .NET Core, SQL Server etc.

General Points of Interest

  • 52.7% of developers spend 9-12 hours per day on a computer.
  • 37.4% of developers don’t typically exercise.
  • 93.1% of professional developers identify as male.
  • 74.3% of professional developers identify as white or of European descent.
  • 85.9% of professional developers use Agile development methodologies.
  • 88.4% of professional developers use Git for version control.

The survey offers a wealth of additional information and you can find the full set of results over at Stack Overflow.

Automatic Input Blob Binding in Azure Functions from Queue Trigger Message Data

Reading additional blob content when an Azure Function is triggered can be accomplished by using an input blob binding: defining a parameter in the function run method and decorating it with the [Blob] attribute.

For example, suppose you have a number of blobs that need converting in some way. You could initiate a process whereby the names of the blobs that need processing are added to a storage queue. Each queue message contains the name of a blob that needs processing. This would allow the conversion function to scale out to convert multiple blobs in parallel.

The following code demonstrates one approach to doing this. The code is triggered from a queue message that contains text representing the input blob filename that needs reading, converting, and then outputting to an output blob container.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run([QueueTrigger("capitalize-names")]string inputBlobPath)
        {
            string originalName = ReadInputName(inputBlobPath);

            var capitalizedName = originalName.ToUpperInvariant();

            WriteOutputName(inputBlobPath, capitalizedName);
        }
        
        private static string ReadInputName(string blobPath)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-in");

            var blobReference = container.GetBlockBlobReference(blobPath);

            string originalName = blobReference.DownloadText();

            return originalName;
        }

        private static void WriteOutputName(string blobPath, string capitalizedName)
        {
            CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("names-out");

            CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(blobPath);
            cloudBlockBlob.UploadText(capitalizedName);            
        }

    }
}

In the preceding code there is a lot of repetitive blob access code (which could be refactored). This function can, however, be greatly simplified by using one of the built-in binding expression tokens. Binding expression tokens are specified inside a pair of curly braces {…} within a binding expression; the {queueTrigger} binding token will extract the text content of the incoming queue message that triggered the function.

For example, the code could be refactored as follows:

using System.IO;
using Microsoft.Azure.WebJobs;

namespace FunctionApp1
{
    public static class ConvertNameCase
    {
        [FunctionName("ConvertNameCase")]
        public static void Run(
        [QueueTrigger("capitalize-names")]string inputBlobPath,
        [Blob("names-in/{queueTrigger}", FileAccess.Read)] string originalName,
        [Blob("names-out/{queueTrigger}")] out string capitalizedName)
        {
                capitalizedName = originalName.ToUpperInvariant();         
        }
    }
}

In the preceding code, the two [Blob] binding paths make use of the {queueTrigger} token. When the function is triggered, the queue message contains the name of the file to be processed. In the two [Blob] binding expressions, the {queueTrigger} token part will automatically be replaced with the text contents of the incoming message. For example, if the message contained the text “File1.txt” then the two blob bindings would be set to names-in/File1.txt and names-out/File1.txt respectively. This means the input blob originalName string will automatically be read when the function is triggered.

To learn more about creating precompiled Azure Functions in Visual Studio, check out my Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course.

New Pluralsight Course: Writing and Testing Precompiled Azure Functions in Visual Studio 2017

Azure Functions have come a long way in a short time. With newer releases you can now create functions in Visual Studio using standard C# class files along with specific attributes to help define triggers, bindings, etc. This means that all the familiar powerful Visual Studio tools, workflows, NuGet packages, etc. can be used to develop Azure Functions. Visual Studio also provides publish support so you can upload your functions to the cloud once you are happy with them. Another feature that makes developing functions in Visual Studio easier is the local functions runtime that lets you run and debug functions on your local development machine, without needing to publish to the cloud just to test them.

In my new Writing and Testing Precompiled Azure Functions in Visual Studio 2017 Pluralsight course you will learn how to:

  • Set up your local development environment
  • Develop and test Azure Functions locally
  • Publish functions to Azure
  • Create functions triggered from incoming HTTP requests
  • Trigger functions from Azure Storage queues and blobs
  • Trigger functions from Azure Service Bus and Azure Event Hubs
  • Trigger functions periodically on a timer
  • Unit test Azure Function business logic

Check out the full course outline for more details.

FeatureToggle v4 Released

Version 4 of FeatureToggle is now released. This release adds initial support for .NET Core.
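As a reminder of the basic usage pattern (a sketch only; the feature and class names here are hypothetical, so see the example code linked below for v4/.NET Core specifics), a toggle is a class deriving from one of the provided toggle types and is queried via its FeatureEnabled property:

using FeatureToggle;

public class PrintReceiptFeature : SimpleFeatureToggle { }

// Elsewhere in the application:
// if (new PrintReceiptFeature().FeatureEnabled) { /* new behavior */ }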

Example code.

Release notes.

Breaking Changes:

  • Min framework now 4.6.1 / .NET Standard 1.4
  • Windows 8.n, Windows Phone 8.n, and Windows Phone Silverlight 8.n no longer supported
  • Namespace changes: most types needed for application developers are now under root FeatureToggle namespace
  • Types not usually required by client code moved to FeatureToggle.Internal
  • Windows UWP now supported explicitly from build 14393

.NET Core Limitations/Specifics

This is in some ways an interim release; it is envisaged that when version 5 comes around, the implementation will move to a pure .NET Standard implementation.

New Free C# 7.1: What's New Quick Start eBook

My new free eBook “C# 7.1: What’s New Quick Start” is now complete and available for download.

C# 7.1: What’s New Quick Start Cover Page

The book has the following chapters:

  • Enabling C# 7.1 Features
  • Asynchronous Main Methods
  • Tuple Name Inference
  • Target-typed “default” Literal
  • Better Pattern-matching with Generics
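For a flavor of the features covered, the following sketch combines two of them: an asynchronous Main method and the target-typed default literal:

using System;
using System.Threading.Tasks;

class Program
{
    // C# 7.1: Main can now be async and return Task
    static async Task Main()
    {
        // C# 7.1: target-typed default literal (no need to write default(int))
        int retries = default;

        await Task.Delay(100);
        Console.WriteLine($"Done (retries = {retries}).");
    }
}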

You can download it now for free or pay whatever you can.