Generating PDB files with ReSharper

The majority of the libraries developers use in their projects these days are open source. So if something crashes inside a third-party library, or you just want to know how it works, you can get the PDB files from the Microsoft Symbol Servers and debug it. If for some reason a PDB cannot be found on the servers, you can always grab the source code from GitHub and add it to your project manually. Unfortunately, when you use commercial libraries, neither of these options is available. Luckily, with ReSharper you can generate PDB files from an assembly and use them later in Visual Studio for debugging.
Let’s assume we would like to generate PDB files for EntityFramework. First of all, we have to locate the EntityFramework assembly in the Assembly Explorer. Go to Solution Explorer, right-click the assembly you are interested in and select View in Assembly Explorer.
[Screenshot: View in Assembly Explorer]
In Assembly Explorer, right-click EntityFramework once again and select “Generate Pdb…”.
[Screenshot: Generate Pdb context menu]
In the window that opens, select the destination folder for the files.
[Screenshot: PDB generation options]
Once you click “Generate”, ReSharper will process the assembly and generate the PDBs.
[Screenshot: PDBs generated]
Once the files are generated, we have to tell Visual Studio to use them. In order to do that, run the app and stop execution at some breakpoint, then go to Debug -> Windows -> Modules, locate EntityFramework.dll, right-click it, select “Load Symbols” and choose the file(s) generated by ReSharper.
[Screenshot: Load Symbols]
At this point we have the PDB files ready, but we are not able to set any breakpoints, as we don’t have the source code of EntityFramework. Fortunately, ReSharper once again saves the day, as it is able to navigate to decompiled sources. Just make sure that your settings (ReSharper -> Options -> External Sources) are the same as in the picture below
[Screenshot: External Sources settings]
and you can navigate to external libraries’ source code just as if it were part of your project. The very last step is to disable the “Enable Just My Code” option in Tools -> Options -> Debugging -> General
[Screenshot: Enable Just My Code option]
and from now on you can debug the external library.
[Screenshot: debugging DbContext]
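For example (a hypothetical fragment; the BlogContext and Blog types are made up for illustration), with the generated PDBs loaded and Just My Code disabled, you can step from your own code straight into EntityFramework internals:

```csharp
using (var context = new BlogContext())
{
    context.Blogs.Add(new Blog { Name = "test" });

    // F11 (Step Into) here descends into the decompiled
    // implementation of DbContext.SaveChanges
    context.SaveChanges();
}
```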


Debugging Cake scripts

We’ve been using Cake in our project for quite some time and it has been working great. However, from time to time the lack of debugging support was really annoying for me. Fortunately, it seems that those times are gone. During NDC Oslo 2016, Gary Ewan Park showed that it is possible to attach a debugger to the build process. Here are the steps to achieve that. Let’s assume that we have the following simple Cake script.

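The original snippet is not preserved here, so this is just a representative sketch (task names are illustrative):

```csharp
var target = Argument("target", "Default");

Task("Build")
    .Does(() =>
{
    Information("Building...");
});

Task("Default")
    .IsDependentOn("Build");

RunTarget(target);
```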
In order to make debugging possible, it is necessary to add the #break preprocessor directive to the task we want the debugger to stop in.

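Applied to the sketch above, the directive goes inside the task body:

```csharp
Task("Build")
    .Does(() =>
{
    #break  // execution stops here once a debugger is attached

    Information("Building...");
});
```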
This operation doesn’t affect the default build process, so if you run .\build.ps1 in the console, the build will run as usual. In order to be able to attach the debugger, we have to call Cake with the debug flag. You can’t pass it directly via the build.ps1 bootstrapper script, so assuming that you didn’t change the directory structure created by the bootstrapper, go to the Tools/Cake folder and run the following from PowerShell (the single-dash flag matches the Cake version of the time; newer versions use --debug):

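```powershell
.\Cake.exe ..\..\build.cake -debug -target=Default
```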
where ..\..\build.cake is the path to your Cake script and Default is the task you want to start. Once you run the command, Cake will wait for a debugger to attach
[Screenshot: Cake waiting for debugger]
Now launch Visual Studio, go to Debug -> Attach to Process…, find the process id listed in the console
[Screenshot: Attach to Process]
and click Attach. After a couple of seconds, Visual Studio will load your Cake script and you will be able to debug it like a normal application.
[Screenshot: debugger attached]


NLog – tracking flow of requests using MappedDiagnosticsLogicalContext

1. Problem

Everyone who has ever had to analyze logs from a production environment knows that quite often it is not an easy task. Usually the application flow is deduced from log entries matched by thread id, which is problematic when you deal with multithreading scenarios. Say, for example, we have the following piece of code.

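The original snippet is not preserved; a representative example (names are illustrative) is an MVC action that awaits something, so a single request’s log entries can end up on different threads:

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;
using NLog;

public class HomeController : Controller
{
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    public async Task<ActionResult> Index()
    {
        Logger.Info("Index called");

        await Task.Delay(100); // the continuation may resume on another thread

        Logger.Info("Index finished");
        return View();
    }
}
```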
The default log file usually looks something like this.

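An illustrative excerpt (timestamps and thread ids are made up), with ${threadid} as the third column of the layout:

```
2016-08-01 10:00:01.123|INFO|12|HomeController|Index called
2016-08-01 10:00:01.125|INFO|19|HomeController|Index called
2016-08-01 10:00:01.224|INFO|25|HomeController|Index finished
2016-08-01 10:00:01.229|INFO|31|HomeController|Index finished
```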
Having multiple simultaneous Index requests makes it almost impossible to figure out which log entry is part of which call. Fortunately, with NLog it is possible to solve this problem without too much hassle.

2. Proposed solution

The whole idea is to leverage the MappedDiagnosticsLogicalContext “storage” to keep a unique identifier of the request for the entire call. Note that this identifier will also be accessible from other threads created from within a given call. In order to set the unique request id (also known as a correlation id) I will use a custom HttpModule. The very first implementation looks like this.

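A minimal sketch of such a module (the class name is illustrative; the item key has to match the one used in the NLog layout later):

```csharp
using System;
using System.Web;
using NLog;

public class LoggingHttpModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        // Each request gets its own correlation id for the whole logical call
        MappedDiagnosticsLogicalContext.Set("correlationid", Guid.NewGuid().ToString());
    }

    public void Dispose()
    {
    }
}
```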
Once the HttpModule is ready, it is necessary to register it in web.config.

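For the IIS integrated pipeline, the registration goes into system.webServer (the namespace and assembly name are illustrative):

```xml
<system.webServer>
  <modules>
    <add name="LoggingHttpModule"
         type="MyApp.LoggingHttpModule, MyApp" />
  </modules>
</system.webServer>
```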
From now on, every request goes through the LoggingHttpModule and the proper correlationId is set in MappedDiagnosticsLogicalContext.
As we want to have the correlationId included in the log files, it is necessary to modify NLog.config. In my case it looks like this.

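A sketch of the relevant parts (the file target and layout are illustrative; the important bit is the mdlc layout renderer):

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target name="file" xsi:type="File"
            fileName="${basedir}/logs/app.log"
            layout="${longdate}|${level:uppercase=true}|${mdlc:item=correlationid}|${logger}|${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="file" />
  </rules>
</nlog>
```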
Note that we can access our correlationId variable via ${mdlc:item=correlationid}.
The log file after the changes looks like this.

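Again an illustrative excerpt; every entry now carries the correlation id of the request that produced it, except the one written during EndRequest:

```
2016-08-01 10:00:01.123|INFO|c4d1a2|HomeController|Index called
2016-08-01 10:00:01.125|INFO|9f07b3|HomeController|Index called
2016-08-01 10:00:01.224|INFO|9f07b3|HomeController|Index finished
2016-08-01 10:00:01.229|INFO|c4d1a2|HomeController|Index finished
2016-08-01 10:00:01.231|INFO||LoggingHttpModule|EndRequest
```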
This is almost perfect; however, for some reason the correlationId is lost in the EndRequest event handler. I am not sure if this is a bug in NLog (I’ve posted a question on NLog’s GitHub) or if it is by design; however, there is an easy workaround. We can store our correlationId in the HttpContext.Items collection and reassign it to MappedDiagnosticsLogicalContext in the EndRequest event handler.

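A sketch of the workaround (Init additionally has to subscribe OnEndRequest to context.EndRequest):

```csharp
private void OnBeginRequest(object sender, EventArgs e)
{
    var correlationId = Guid.NewGuid().ToString();

    // Keep a copy in HttpContext.Items so it can be restored later
    HttpContext.Current.Items["correlationid"] = correlationId;
    MappedDiagnosticsLogicalContext.Set("correlationid", correlationId);
}

private void OnEndRequest(object sender, EventArgs e)
{
    // The logical context value is gone by now, so reassign it
    MappedDiagnosticsLogicalContext.Set("correlationid",
        (string)HttpContext.Current.Items["correlationid"]);
}
```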
After these changes, our log file finally looks OK: every entry, including those written during EndRequest, carries its correlation id.

Source code for this post can be found here


Console2 – multiple tabs on startup

I’ve been using Console2 for quite some time and for now it is my favorite command line terminal. The thing I like the most is the ability to use multiple tabs you can easily switch between. By default, Console2 starts with only one tab opened, which was a bit of a pain for me, mainly because I work a lot with Git and I also like to have a Visual Studio command prompt open. So basically every time I launched the command line, I had to open one or two additional tabs. Fortunately, it is quite easy to launch multiple tabs on startup. All you have to do is create a shortcut to Console2 with the proper flags. Say, for example, I have the following tabs defined
[Screenshot: Console2 tab settings]

I can start Console2 with all these tabs opened on startup using the following command.

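A sketch of the command; the tab names are illustrative and must match the ones defined in the settings dialog (the -t switch opens a tab by name and can be repeated):

```
Console.exe -t "Console2" -t "Git Bash" -t "VS2015 Command Prompt"
```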
Of course, it is easier to define a shortcut to Console2 than to remember the command above. My configuration looks like this:
[Screenshot: shortcut properties]


Mocking downstream services with Mountebank

1. Introduction

I’m part of a team which creates a large enterprise platform for the banking sector. Because we operate in the financial area, our application consumes a large number of downstream services. One of the major problems we are facing at the moment is the inaccessibility of those services in the DEV and UAT environments. It often happens that one of those services is down for quite some time, so our development speed is impaired. The bigger problem, however, is that our end-to-end tests are red. Having a red test on TeamCity for a long time is really annoying, because at that point we just automatically assume that all of the failures are caused by issues with downstream systems. Fortunately, we finally decided to get rid of this problem, and this is the solution we came up with.

2. Background

Before I get to the implementation details, take a look at a sample implementation of the situation I described at the very beginning. Let’s assume that we have a WCF service which contains the business logic of our application and is consumed by the UI side.

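A sketch of what such a facade might look like (the contract is illustrative; GetTradingDates reappears in the mountebank stubs below):

```csharp
using System;
using System.Collections.Generic;
using System.ServiceModel;

public interface IPricer
{
    IEnumerable<DateTime> GetTradingDates();
}

[ServiceContract]
public interface IBookingFacade
{
    [OperationContract]
    IEnumerable<DateTime> GetTradingDates();
}

public class BookingFacade : IBookingFacade
{
    private readonly IPricer pricer;

    public BookingFacade(IPricer pricer)
    {
        this.pricer = pricer;
    }

    public IEnumerable<DateTime> GetTradingDates()
    {
        // Delegates to the downstream Pricer service
        return pricer.GetTradingDates();
    }
}
```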
BookingFacade uses the Pricer service, which is basically just a wrapper for an external/downstream service we have to use in order to retrieve some data (and we do not have ownership of that service).

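A sketch of the wrapper; the key point is that the downstream address comes from configuration (the setting name and the JSON handling are assumptions):

```csharp
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Net;
using Newtonsoft.Json;

public class Pricer : IPricer
{
    // This is the address the app.config transformations below will swap out
    private readonly string endpoint =
        ConfigurationManager.AppSettings["PricerAddress"];

    public IEnumerable<DateTime> GetTradingDates()
    {
        using (var client = new WebClient())
        {
            var json = client.DownloadString(endpoint + "/GetTradingDates");
            return JsonConvert.DeserializeObject<IEnumerable<DateTime>>(json);
        }
    }
}
```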
A very simple end-to-end test for BookingFacade can look like this.

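A sketch using NUnit and a generated WCF client proxy (both are assumptions):

```csharp
using NUnit.Framework;

[TestFixture]
public class BookingFacadeTests
{
    [Test]
    public void GetTradingDates_ReturnsDatesFromDownstreamService()
    {
        var facade = new BookingFacadeClient(); // WCF client proxy

        var dates = facade.GetTradingDates();

        Assert.That(dates, Is.Not.Empty);
    }
}
```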
As I mentioned before, everything is fine as long as all of the downstream systems are working. The moment one of them goes down, our tests start failing.
[Screenshot: failing tests]

3. Implementation

The general idea for fixing this problem is to use fake downstream systems during end-to-end testing. In order to do that, we use a combination of a custom compilation profile, app.config transformations and mountebank test doubles.

3.1 Setting up transformations

First of all, we will create a custom compilation profile called test.integration, which will be used for running tests with mocked downstream systems (we want to preserve the normal use of downstream systems in Debug and Release mode). In order to do that, go to Configuration Manager
[Screenshot: Configuration Manager]
and follow the steps presented in the pictures below
[Screenshots: creating the test.integration profile in Configuration Manager]

Now it is time to prepare our app.config for transformations. We could do it manually, but for the sake of simplicity I will use the SlowCheetah extension. Go to Tools -> Extensions and Updates and install SlowCheetah – XML Transforms. Once the extension is installed, you will see an Add Transform item in the config files’ context menu.
[Screenshot: Add Transform menu item]
Clicking Add Transform will create three files: App.Debug.config, App.Release.config and App.Test.Integration.config. What is more, some additional changes will be made to your csproj file, which are responsible for applying the transformations during the build process. These three additional files hold the information about the config transformations. The default App.config file looks like this.

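A sketch of the relevant fragment (the actual settings content is omitted):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- the empty file attribute will be filled in by the transforms -->
  <appSettings file="">
  </appSettings>
</configuration>
```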
Nothing special in here, but note the empty file attribute on the appSettings node. The value of this attribute will be replaced with the proper filename during the transformation process. In order to do that, let’s implement transforms for each of the three config files we use.

App.Debug.config

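Sketches of the three transforms; apart from the referenced filename they are identical (the filenames match the ones mentioned below):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings file="app.settings.debug" xdt:Transform="SetAttributes" />
</configuration>
```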
App.Release.config

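```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings file="app.settings.release" xdt:Transform="SetAttributes" />
</configuration>
```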
App.Test.Integration.config

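```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings file="app.settings.tests.integration" xdt:Transform="SetAttributes" />
</configuration>
```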
As you can see, I apply the SetAttributes transform to the appSettings node, which means that the transformation will replace the appSettings node attributes from the app.config file with the appSettings node attributes from the transform file. You can preview the results of a transformation by clicking the Preview Transform context menu option, available for each of our transform files.
[Screenshot: Preview Transform]
As you’ve probably already figured out, the app.settings.debug and app.settings.release files contain the real downstream endpoints, and app.settings.tests.integration contains the addresses of the fake endpoints. Of course, you have to create those files and copy them to the output directory during the build (for instance by using the Copy to Output Directory option).
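A sketch of what the fake variant could contain (the key name matches the Pricer sketch above, and the port matches the mountebank imposter defined below):

```xml
<appSettings>
  <!-- app.settings.tests.integration: fake endpoint served by mountebank -->
  <add key="PricerAddress" value="http://localhost:4545" />
</appSettings>
```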

3.2. Creating fake downstreams with mountebank

Setting up fake downstream systems completely from scratch would be a very painful and time-consuming process. Fortunately, there are tools which make this task easier. The tool I use is called mountebank – you can easily install it via npm.

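```
npm install -g mountebank
```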
With mountebank on board, it is time to configure it so that it acts as a fake Pricer service. It can be done in a couple of ways, but I will go with file-based configuration. You can run the tool using the following command.

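```
mb --configfile impostors.ejs
```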
[Screenshot: imposters listening]
where impostors.ejs is a configuration file with the following structure.

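The top-level file just lists the imposters (mountebank’s spelling), pulling each one in from its own file:

```
{
  "imposters": [
    <% include pricer.json %>
  ]
}
```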
<% include pricer.json %> is EJS syntax which (in this case) allows you to split the configuration into multiple files. As you’ve probably already figured out, pricer.json contains the actual mock configuration for the Pricer service. The initial mock object consists of a couple of properties which set up the name, port and protocol for the given mock.

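A sketch of pricer.json before any stubs are added (the port is illustrative, but it has to match the fake endpoint address in app.settings.tests.integration):

```json
{
  "port": 4545,
  "protocol": "http",
  "name": "Pricer",
  "stubs": []
}
```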
Note that we do not specify the address of the mock, just a port. Mountebank itself runs by default on http://localhost:2525, and the mocks listen on http://localhost:{port}. The actual magic happens in the stubs property.
Here is an example of a stub/mock for the GetTradingDates method.

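A sketch of an entry in the stubs array (the response body is illustrative):

```json
{
  "responses": [
    { "is": { "body": "[\"2016-08-01\", \"2016-08-02\"]" } }
  ],
  "predicates": [
    { "equals": { "method": "GET", "path": "/GetTradingDates" } }
  ]
}
```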
As you can see, the stub object consists of two properties:

  • responses
  • predicates

The responses property is an array of objects that mountebank will use to create a response. If you insert multiple items into the array, they will be served using a round-robin algorithm. Note that you don’t have to create a full response (with all the headers etc.); just use the “is” property, which will fill in the default values for you. More about that topic can be found here, but in most cases setting a value for the “body” property will be enough.

The predicates property is an array of predicates which will be used during the matching process. A predicate object can be quite complex and supports lots of different matching techniques. In this case, for a simple GET method, I just use the equals operator to match the HTTP verb and the path.
The same technique can be used for stubbing POST methods; however, in that case you can also perform matching against the body of the request. The one thing you have to remember is that the body property is not an object but a string. So in my case, if I want to make a stub for GetPrice which takes the following request object as params

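An illustrative request body:

```json
{ "ticker": "MSFT", "currency": "USD" }
```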
I can write, for instance, this stub

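Note that the body in the predicate is the serialized string, not a nested object (values are illustrative):

```json
{
  "responses": [
    { "is": { "body": "42.17" } }
  ],
  "predicates": [
    {
      "equals": {
        "method": "POST",
        "path": "/GetPrice",
        "body": "{ \"ticker\": \"MSFT\", \"currency\": \"USD\" }"
      }
    }
  ]
}
```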
Of course, for more complex requests it is pointless to write every possible combination of params. Fortunately, you can create a fallback/default stub which will match everything.

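A stub with no predicates matches every request, so a sketch of the fallback looks like this:

```json
{
  "responses": [
    { "is": { "body": "0.0" } }
  ]
}
```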
If you put this at the very end of the stubs array, it will not interfere with the other stubs.
This is of course just the tip of the iceberg when it comes to predicate matching. An explanation of all the possible matching options can be found here.

4. Results

From now on, with mountebank working in the background, tests run against the test.integration compilation profile will be green.
[Screenshot: successful tests]
Source code for this post can be found here
