ASP.NET Core – adding controllers directly from integration tests

1. Introduction

From time to time you might need to test certain parts of your ASP.NET Core configuration without hitting publicly visible, business-related controllers. For instance, in my case, I wanted to make sure that my ASP.NET Core API behaved consistently with the original API written in Nancy. As you might expect, there are quite a lot of differences between the two frameworks, so let's focus on testing one thing, namely non-nullable reference type handling. Long story short, in order to get the same behavior in ASP.NET Core as in Nancy, I had to add the following line of configuration:
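A minimal sketch of that configuration, assuming ASP.NET Core 3.x, where `MvcOptions` exposes the relevant switch:

```csharp
// In Startup.ConfigureServices: stop MVC from treating non-nullable
// reference type properties on request models as implicitly [Required].
services.AddControllers(options =>
    options.SuppressImplicitRequiredAttributeForNonNullableReferenceTypes = true);
```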

This prevents ASP.NET Core from marking non-nullable reference type properties in requests as required. Having that configuration ready, I wanted to cover it with an end-to-end test. Because there was no business-related logic yet, I needed to figure out a way of adding API controllers to my application directly from the integration tests. Here is how you can achieve that.

2. Accessing ApplicationPartManager from integration tests

ASP.NET Core is able to compose your API from different parts thanks to ApplicationPartManager. By default, you don't use it directly but rather through the IMvcBuilder.AddApplicationPart extension method while setting up your application:
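For example (a sketch; `ExternalController` is a hypothetical controller living in another assembly):

```csharp
// Pull controllers in from an additional assembly at startup.
services
    .AddControllers()
    .AddApplicationPart(typeof(ExternalController).Assembly);
```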

However, there is no easy way of getting IMvcBuilder from the integration tests (at least I didn't find a proper way of doing that without tampering too much with the original application pipeline), so we need to access ApplicationPartManager directly. Inspecting the source code of the framework, you will find that ApplicationPartManager can be retrieved directly from IServiceCollection with the following code:
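A sketch of that lookup, mirroring what the framework itself does internally (the manager is registered as a singleton instance, so it can be fished out of the service collection before the container is built):

```csharp
// Retrieve the ApplicationPartManager instance from IServiceCollection.
var partManager = (ApplicationPartManager)services
    .Last(descriptor => descriptor.ServiceType == typeof(ApplicationPartManager))
    .ImplementationInstance;
```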

Applying similar code to the WebApplicationFactory allows us to add controllers directly from tests:
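A sketch of such a factory; `TestController` is a hypothetical controller defined in the test assembly:

```csharp
using System.Linq;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc.ApplicationParts;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.DependencyInjection;

public class TestApplicationFactory : WebApplicationFactory<Startup>
{
    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        builder.ConfigureServices(services =>
        {
            // Same trick as above: grab the manager instance directly.
            var partManager = (ApplicationPartManager)services
                .Last(d => d.ServiceType == typeof(ApplicationPartManager))
                .ImplementationInstance;

            // Register the test assembly as an additional application part.
            partManager.ApplicationParts.Add(
                new AssemblyPart(typeof(TestController).Assembly));
        });
    }
}
```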

3. Handling private controllers

The solution presented above works nicely; however, it has a couple of drawbacks. First of all, it only discovers public, non-nested controllers. Second of all, it always adds all controllers from the given assembly, which might affect other integration tests. In order to get rid of these drawbacks, we need to tell ApplicationPartManager to include only selected controllers. We can achieve that by adding an additional IApplicationFeatureProvider&lt;ControllerFeature&gt; to the list of FeatureProviders.
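A sketch of such a provider (the class name is illustrative); it replaces the discovered controllers with an explicitly supplied list, which also allows private or nested controller types:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using Microsoft.AspNetCore.Mvc.ApplicationParts;
using Microsoft.AspNetCore.Mvc.Controllers;

public class ExplicitControllerFeatureProvider
    : IApplicationFeatureProvider<ControllerFeature>
{
    private readonly Type[] _controllerTypes;

    public ExplicitControllerFeatureProvider(params Type[] controllerTypes)
        => _controllerTypes = controllerTypes;

    public void PopulateFeature(
        IEnumerable<ApplicationPart> parts, ControllerFeature feature)
    {
        // Drop whatever the default providers discovered and keep only
        // the controllers we were explicitly given.
        feature.Controllers.Clear();
        foreach (var type in _controllerTypes)
        {
            feature.Controllers.Add(type.GetTypeInfo());
        }
    }
}
```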

Putting it all together and applying some refactoring, we end up with the following code:
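One possible shape for the refactored helper (a sketch with hypothetical names; it assumes a whitelisting feature provider like the one described above, here called `ExplicitControllerFeatureProvider`):

```csharp
using System;
using System.Linq;
using Microsoft.AspNetCore.Mvc.ApplicationParts;
using Microsoft.Extensions.DependencyInjection;

public static class ServiceCollectionExtensions
{
    public static IServiceCollection WithControllers(
        this IServiceCollection services, params Type[] controllerTypes)
    {
        var partManager = (ApplicationPartManager)services
            .Last(d => d.ServiceType == typeof(ApplicationPartManager))
            .ImplementationInstance;

        // Make the controllers' assembly known to MVC...
        partManager.ApplicationParts.Add(
            new AssemblyPart(controllerTypes.First().Assembly));

        // ...but expose only the explicitly listed controllers.
        partManager.FeatureProviders.Add(
            new ExplicitControllerFeatureProvider(controllerTypes));

        return services;
    }
}
```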

Source code for this post can be found here


Running ASP.NET Core together with Nancy

1. Introduction

Our current API runs on Nancy which, in my opinion, is past its prime. Recent news from the GitHub issue tracker seems to confirm that thesis, which is why we started looking for a migration path from the Nancy-based API to ASP.NET Core. Because the codebase is quite large, we didn't want to do a Big Bang Rewrite; instead, we wanted to gradually replace the old API with a new one, so both could coexist next to each other.

2. Legacy API

Before I jump into the implementation, here is a sample Nancy API with two endpoints – products and variants:
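A minimal sketch of the legacy modules (Nancy 2.x syntax assumed; handlers are illustrative):

```csharp
using Nancy;

public class ProductsModule : NancyModule
{
    public ProductsModule() : base("/products")
    {
        Get("/", _ => "products served by Nancy");
    }
}

public class VariantsModule : NancyModule
{
    public VariantsModule() : base("/variants")
    {
        Get("/", _ => "variants served by Nancy");
    }
}
```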

The goal is to replace the products endpoint with an ASP.NET Core implementation, while the variants endpoint should still be served by Nancy.

3. Combining Nancy with ASP.NET Core

The solution is based on the pipeline-branching feature which is available starting from ASP.NET Core 2.1. Long story short, it is possible to configure different pipelines for different route paths thanks to the IApplicationBuilder.Map extension method. Having that in mind, we can Map the products path to run through the ASP.NET Core pipeline, whereas the rest goes through Nancy.
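A first-cut sketch of that wiring in Startup.Configure (Nancy is plugged in via its OWIN adapter from the Nancy.Owin package):

```csharp
// Requests to /products branch into the MVC pipeline;
// everything else falls through to Nancy.
app.Map("/products", branch => branch.UseMvc());
app.UseOwin(pipeline => pipeline.UseNancy());
```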

At this point we are almost there; however, running a request through the Map pipeline removes the path prefix, meaning that our controller would be accessible under the / path instead of /products. In order to bypass this limitation, we have to restore the original prefix with the rewrite middleware. Once we put it all together, we are able to replace multiple Nancy endpoints with ASP.NET Core ones using the following piece of code:
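A sketch of the final wiring (the helper name is hypothetical); inside each branch the stripped prefix is put back with the rewrite middleware, so attribute routes like `[Route("products")]` keep working:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Rewrite;

public static class NancyMigrationExtensions
{
    public static IApplicationBuilder MapAspNetCore(
        this IApplicationBuilder app, string path)
    {
        return app.Map($"/{path}", branch =>
        {
            // Restore the prefix that Map stripped off.
            branch.UseRewriter(new RewriteOptions()
                .AddRewrite("(.*)", $"{path}/$1", skipRemainingRules: true));
            branch.UseMvc();
        });
    }
}

// Usage in Startup.Configure:
//   app.MapAspNetCore("products");
//   app.UseOwin(pipeline => pipeline.UseNancy());
```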


Source code for this post can be found here


.NET Core – missing currency symbol in docker alpine image

During the process of moving a Scala-based API to .NET Core, we encountered an interesting localization issue when running our code in a Docker container based on an Alpine image. The code itself formatted a currency amount based on some culture. It looked more or less as below:
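A minimal sketch of that logic (the helper name and cultures are illustrative):

```csharp
using System.Globalization;

public static class PriceFormatter
{
    // Format an amount as currency for the given culture,
    // e.g. FormatPrice(9.99m, "de-DE") should yield "9,99 €"
    // when full ICU data is available.
    public static string FormatPrice(decimal amount, string cultureName)
    {
        var culture = CultureInfo.GetCultureInfo(cultureName);
        return amount.ToString("C", culture);
    }
}
```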

We also had some integration tests for that piece of logic
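A sketch of such a test (xUnit assumed; it exercises a hypothetical `FormatPrice(amount, cultureName)` helper wrapping `decimal.ToString("C", culture)`):

```csharp
using Xunit;

public class PriceFormatterTests
{
    [Theory]
    [InlineData("de-DE", "€")]
    [InlineData("en-US", "$")]
    public void Formatted_price_contains_the_culture_currency_symbol(
        string cultureName, string expectedSymbol)
    {
        var result = PriceFormatter.FormatPrice(9.99m, cultureName);

        Assert.Contains(expectedSymbol, result);
    }
}
```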

The tests were run during a CI build in a container with an image defined as below
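Roughly along these lines (a sketch; the exact SDK image tag from back then may have differed):

```dockerfile
# CI build image – the SDK image ships with ICU, so tests pass here.
FROM mcr.microsoft.com/dotnet/core/sdk:2.1-alpine
WORKDIR /app
COPY . .
RUN dotnet test
```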

At this point, we were sure that everything worked fine, as the tests were green and everything was also working correctly on our local machines. However, after deployment to a testing environment, we started getting invalid currency symbols:

As you can see, the response contains ¤ (the invariant currency symbol) instead of the expected €. It took us some time to figure this out, but it finally turned out that the aspnet:2.1.11-alpine image (the one we used for running the application), contrary to the SDK image (used for building and running the tests), is missing the icu-libs package. Under default conditions, the application should throw the following exception during startup:
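The well-known ICU startup failure reads roughly like this (wording varies slightly between .NET Core versions):

```
Process terminated. Couldn't find a valid ICU package installed on the system.
Set the configuration flag System.Globalization.Invariant to true if you want
to run with no globalization support.
```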

However, the aspnet:2.1.11-alpine image has the DOTNET_SYSTEM_GLOBALIZATION_INVARIANT flag set to true by default, so the missing package was not detected during startup. In the end, to fix the issue we had to install the icu-libs package and set DOTNET_SYSTEM_GLOBALIZATION_INVARIANT back to false. This was done with these two lines in the Dockerfile:
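The standard fix for Alpine-based .NET images:

```dockerfile
# Install ICU and re-enable full globalization support.
RUN apk add --no-cache icu-libs
ENV DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=false
```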

Once the lines were added, the application started working as expected.

Source code for this post can be found here


MongoDB.Driver – class-based server side projection

1. Introduction

When working with NoSQL databases, your documents might be quite heavy, and in some cases you would like to get only a slice of the original data. For instance, let's assume we have an Account document which, among other things, contains a list of transactions:
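A sketch of such a document class (property names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using MongoDB.Bson;

public class Account
{
    public ObjectId Id { get; set; }
    public string Number { get; set; }
    public string Owner { get; set; }

    // Potentially hundreds of entries – the heavy part of the document.
    public List<Transaction> Transactions { get; set; }
}

public class Transaction
{
    public decimal Amount { get; set; }
    public DateTime Date { get; set; }
}
```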

As there might be hundreds of transactions in the account object, you might want to occasionally work on a subset of the original data – say, an AccountSlim object – to improve the performance:
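The slim counterpart keeps only the cheap properties (again, a sketch with illustrative names):

```csharp
using MongoDB.Bson;

public class AccountSlim
{
    public ObjectId Id { get; set; }
    public string Number { get; set; }
    public string Owner { get; set; }
    // No Transactions list – this is the whole point of the slim type.
}
```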

2. Exploring existing options

MongoDB.Driver has a couple of ways of defining projections so that you can operate on a slimmer object instead of the default one. For instance:
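Two such options sketched below (illustrative; the exact driver surface varies by version):

```csharp
// 1. Register the slim class against the same collection.
var slimCollection = database.GetCollection<AccountSlim>("accounts");
var viaCollection = await slimCollection
    .Find(x => x.Number == "123")
    .ToListAsync();

// 2. Change the result type of an existing query with As<TResult>().
var viaAs = await collection
    .Find(x => x.Number == "123")
    .As<AccountSlim>()
    .ToListAsync();
```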

Unfortunately, those are client-side projections. This means that the entire object is returned from the database and is simply deserialized into a different class. You can check it on your own by examining the request and result commands sent to Mongo:

In both cases the requested query doesn't contain a "projection" section, so the entire document is returned.

Of course, it is possible to manually create a server-side projection, for instance:
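For example, with a raw BsonDocument projection that excludes the heavy field (a sketch):

```csharp
// { "Transactions": 0 } – exclude the heavy field on the server.
var projection = new BsonDocument("Transactions", 0);

var result = await collection
    .Find(x => x.Number == "123")
    .Project<AccountSlim>(projection)
    .ToListAsync();
```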

or with a strongly typed version:
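The same projection via the typed builder (sketch):

```csharp
// Strongly typed exclusion of the heavy field.
var projection = Builders<Account>.Projection.Exclude(x => x.Transactions);

var result = await collection
    .Find(x => x.Number == "123")
    .Project<AccountSlim>(projection)
    .ToListAsync();
```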

However, in my opinion, this is error-prone, and it would be better to generate the server-side projection automatically based on the properties of the slim object. As I didn't find anything like that in the official driver, here is my approach for handling this.

3. Class-based server-side projection

In order to create a custom projection, all we have to do is extend the ProjectionDefinition&lt;TSource, TResult&gt; class and provide a RenderedProjectionDefinition containing all properties which are present in both the "heavy" and the "slim" object:
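A sketch of such a definition (the class name is hypothetical; the `Render` signature shown matches driver versions around 2.x):

```csharp
using System.Linq;
using System.Reflection;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;

public class ClassBasedProjectionDefinition<TSource, TResult>
    : ProjectionDefinition<TSource, TResult>
{
    public override RenderedProjectionDefinition<TResult> Render(
        IBsonSerializer<TSource> sourceSerializer,
        IBsonSerializerRegistry serializerRegistry)
    {
        // Include every public property of the slim type by name.
        // StringFieldDefinition keeps class maps / attribute maps in play.
        var includes = typeof(TResult)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Select(p => Builders<TSource>.Projection.Include(
                new StringFieldDefinition<TSource>(p.Name)));

        var document = Builders<TSource>.Projection
            .Combine(includes)
            .Render(sourceSerializer, serializerRegistry);

        return new RenderedProjectionDefinition<TResult>(
            document, serializerRegistry.GetSerializer<TResult>());
    }
}
```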

As you can see, we use MongoDB.Driver's built-in projections to render our custom projection consisting of the necessary properties. Note that, as we are using StringFieldDefinition instead of defining the BSON document manually, the projection will take into account potential class mappings or attribute mappings applied to your object.

Having the projection ready, we can make it a bit easier to use by introducing some extension methods. The first one looks as follows:
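A sketch (names are hypothetical; it assumes the custom definition from the previous section is a class called `ClassBasedProjectionDefinition`):

```csharp
using MongoDB.Driver;

public static class ProjectionDefinitionBuilderExtensions
{
    // Expose the class-based projection through the builder API.
    public static ProjectionDefinition<TSource, TResult> FromClass<TSource, TResult>(
        this ProjectionDefinitionBuilder<TSource> builder)
        => new ClassBasedProjectionDefinition<TSource, TResult>();
}
```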

which allows you to use this projection similarly to the built-in ones – by accessing the Builders class:
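For instance (`FromClass` being the hypothetical extension from above):

```csharp
var projection = Builders<Account>.Projection.FromClass<Account, AccountSlim>();

var accounts = await collection
    .Find(x => x.Number == "123")
    .Project(projection)
    .ToListAsync();
```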

The second method extends the IFindFluent&lt;TDocument, TProjection&gt; interface:
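A sketch of that extension (again assuming the hypothetical `ClassBasedProjectionDefinition` from the previous section):

```csharp
using MongoDB.Driver;

public static class FindFluentExtensions
{
    // Switch a query to the slim result type with a server-side projection.
    public static IFindFluent<TDocument, TResult> ProjectTo<TDocument, TResult>(
        this IFindFluent<TDocument, TDocument> find)
        => find.Project(new ClassBasedProjectionDefinition<TDocument, TResult>());
}
```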

and thanks to it we end up with an even nicer syntax:
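For example (`ProjectTo` being the hypothetical extension above):

```csharp
var accounts = await collection
    .Find(x => x.Number == "123")
    .ProjectTo<Account, AccountSlim>()
    .ToListAsync();
```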

One way or another, we end up with a proper projection definition which results in a smaller document being returned from the database.

Source code for this post can be found here
