Implementing Dependency Replication with TFS Team Build

A very common question from people is how to handle dependencies between projects/applications/team projects in TFS source control. A typical scenario is that you have a common library/framework tucked away nicely somewhere in TFS source control, and now you have some applications that, in some way, need to reference this project.

My colleague Terje has written an article on what he calls “Subsystem branching”, in which he talks about different ways to organize your source code in order to solve the above problem. The article can be found here:

I won’t go through all the different scenarios again, but thought that I’d show how we do it. We normally use Terje’s solutions 3 and 3b, namely binary deployment branching with or without merging. In short, this means that we set up a team build for our common library that we start manually when we have checked in changes that need to be replicated to the applications that depend on the library. This build (in addition to compiling, testing and versioning) checks the library outputs (typically *.dll and *.pdb) in to a Deploy folder. This folder is branched to all dependent applications. After the check-in, we merge the folder to the application(s) that will be built against the new version of the library.

As Terje mentions, another approach to this problem is the TFS Dependency Replicator, which is a very nice tool that automates copying the dependencies between different parts of the source control tree. The main objection we have to that approach is that plain copying gives you no traceability: you have no easy way to see which applications use which version of which library.

In this post, I thought I would show how to implement this using TFS Team Build. We will implement solution 3b from Terje’s post, which means that after we check in the binaries from the library build, we will automatically merge those binaries to the dependent projects.

Custom Task or “plain” <Exec>?
I considered implementing a custom task for this kind of dependency replication. The problem, however, is that once you start wrapping functionality in the TFS source control API, you find that you often end up reimplementing lots of stuff to keep the task from being too simplistic. There are myriads of options for the tf.exe commands, and different scenarios often require different usage of the commands. So to keep it flexible, I suggest that you use the command line tool tf.exe instead when you are working against TFS source control. On the downside, you need to learn a bit more MSBuild…. 🙂

Sample Scenario


We have one CommonLibrary project, which just contains a ClassLibrary1 project. In addition, we have the Deploy folder that is used for the resulting binary. Then we have two applications (Application1 and Application2) that each simply contains a WpfApplication project. In addition, each application has a Libs folder that is a branch of the Deploy folder. (The Deploy/Libs names have become a naming convention for us.) So, we want a release build for CommonLibrary that builds the ClassLibrary1 assembly, checks it in to the Deploy folder, and then merges it to Application1/Libs and Application2/Libs.
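To make the layout concrete, the source control structure can be sketched like this (folder names as described above):

```
$/CommonLibrary
    ClassLibrary1          (the shared library)
    Deploy                 (build output is checked in here)
$/Application1
    WpfApplication         (the application solution)
    Libs                   (branch of $/CommonLibrary/Deploy)
$/Application2
    WpfApplication
    Libs                   (branch of $/CommonLibrary/Deploy)
```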

Workspace Mappings

Now, before starting to go all MSBuild crazy, we need to discuss what the workspace for this build definition should look like. First of all, the workspace for the CommonLibrary build should not include anything from the dependent applications. This means that we must dynamically add the Libs folders to the build workspace as part of the build, to be able to perform the merge. Also, we really don’t want the Deploy folder to be part of the workspace for the build. If it is, the changesets that are created by the build will show up as associated changesets for the build, which is really not relevant since they contain the outputs of the build. So, the workspace mapping for our build definition looks like this:
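For reference, the mappings can be sketched like this (server paths are examples based on the structure above):

```
Status    Source control folder      Local folder
Active    $/CommonLibrary            $(SourceDir)
Cloaked   $/CommonLibrary/Deploy
```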


Implementing the Build
The steps that we need to implement in our team build are:

  1. Decloak the Deploy folder in the current workspace and perform a checkout
  2. Copy the build output to the Deploy folder and check it back in
  3. Add the Libs folders to the current workspace
  4. Merge the Deploy folder to Application1/Libs and Application2/Libs and check everything in

All these steps use the Team Foundation source control command-line tool (tf.exe) to perform operations against TFS source control.

We start off by defining some properties and items for the source and destination folders:

<PropertyGroup>
  <!-- Path to tf.exe; adjust to your Visual Studio installation -->
  <TF>&quot;$(TeamBuildRefPath)\..\tf.exe&quot;</TF>
  <ReplicateSourceFolder>$(SolutionRoot)\Deploy</ReplicateSourceFolder>
</PropertyGroup>

<ItemGroup>
  <!-- The LocalMapping values below are examples; adjust to your local folder layout -->
  <ReplicateDestinationFolder Include="$(BuildProjectFolderPath)/../../Application1/Libs">
    <LocalMapping>$(SolutionRoot)\..\Application1\Libs</LocalMapping>
  </ReplicateDestinationFolder>
  <ReplicateDestinationFolder Include="$(BuildProjectFolderPath)/../../Application2/Libs">
    <LocalMapping>$(SolutionRoot)\..\Application2\Libs</LocalMapping>
  </ReplicateDestinationFolder>
</ItemGroup>
Note the LocalMapping metadata that we define for each ReplicateDestinationFolder item. This will be used later on when modifying the workspace.

Step 1:

<Target Name="AfterEndToEndIteration">
  <!-- Get and checkout deploy folder-->
  <MakeDir Directories="$(ReplicateSourceFolder)"/>
  <Exec Command="$(TF) workfold /decloak ." WorkingDirectory="$(ReplicateSourceFolder)" />
  <Exec Command="$(TF) get &quot;$(ReplicateSourceFolder)&quot; /recursive"/>
  <Exec Command="$(TF) checkout &quot;$(ReplicateSourceFolder)&quot; /recursive" />


We put the logic in the AfterEndToEndIteration target, which is executed when the main part of the build (compiling, testing and dropping) has completed.

Step 2:

<!-- Copy build output to deploy folder and check in -->
   <Copy SourceFiles="@(CompilationOutputs)" DestinationFolder="$(ReplicateSourceFolder)"/>
   <Exec Command="$(TF) checkin /comment:&quot;Checking in file from build&quot; &quot;$(ReplicateSourceFolder)&quot; /recursive"/>

We use the nice CompilationOutputs item group that was added in TFS 2008, which contains all output from every configuration that is built. Note that this won’t give you the *.pdb though.
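If you also want the *.pdb files replicated, one option is to derive a second item list from the outputs. A minimal sketch (the SymbolFiles item name is just an example, not from the original build script):

```
<!-- Sketch: also copy the .pdb located next to each output assembly -->
<CreateItem Include="@(CompilationOutputs->'%(RootDir)%(Directory)%(Filename).pdb')">
  <Output TaskParameter="Include" ItemName="SymbolFiles"/>
</CreateItem>
<Copy SourceFiles="@(SymbolFiles)" DestinationFolder="$(ReplicateSourceFolder)"/>
```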

Step 3:

<!-- Add destination folders to current workspace -->
    <Exec Command="$(TF) workfold /workspace:$(WorkspaceName) &quot;%(ReplicateDestinationFolder.Identity)&quot; &quot;%(ReplicateDestinationFolder.LocalMapping)&quot;"/>

Here we use MSBuild batching to add a workspace mapping for each destination folder to the current workspace. We pass %(ReplicateDestinationFolder.Identity) as the server path, and %(ReplicateDestinationFolder.LocalMapping), which we defined previously, as the local path.
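With the items defined earlier, the batched &lt;Exec&gt; expands to one tf.exe call per destination folder, roughly like this (workspace name and local paths are illustrative):

```
tf workfold /workspace:MyBuildWorkspace "$/Application1/Libs" "C:\Build\Sources\Application1\Libs"
tf workfold /workspace:MyBuildWorkspace "$/Application2/Libs" "C:\Build\Sources\Application2\Libs"
```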

Step 4:

<!-- Merge to destinations and check in-->
<Exec Command="$(TF) merge &quot;$(ReplicateSourceFolder)&quot; &quot;%(ReplicateDestinationFolder.LocalMapping)&quot; /recursive"/>
<Exec Command="$(TF) checkin /comment:&quot;Checking in merged files from build&quot; @(ReplicateDestinationFolder->'&quot;%(LocalMapping)&quot;', ' ') /recursive"/>


So, every build will result in two checkins: first the check-in of the file(s) to the Deploy folder, and then a check-in for all merged binaries.

Note: I haven’t added any error handling. Typically you would add an OnError element to the target that performs a tf.exe undo /recursive to undo any checkouts.
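A minimal sketch of such error handling (the target name UndoReplication is arbitrary) could look like this:

```
<!-- Placed as the last element inside the AfterEndToEndIteration target: -->
<OnError ExecuteTargets="UndoReplication"/>

<!-- Sketch: undo any pending changes if one of the steps fails -->
<Target Name="UndoReplication">
  <Exec Command="$(TF) undo &quot;$(ReplicateSourceFolder)&quot; /recursive" ContinueOnError="true"/>
</Target>
```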

Writing a Code Coverage Checkin Policy

The source code for this policy is available here:

Checkin policies are a great tool in TFS for keeping your code base clean and adhering to your company standards and policies. The checkin policies that are included are very useful, but don’t stop there! Implementing your own custom checkin policy is pretty straightforward and can soon pay off by stopping people from doing silly things (on purpose or not…).

At our company (Osiris Data) we have developed several small checkin policies that both stop people from breaking our standards and help them do the right thing. We all make mistakes from time to time, and if a tool can help us avoid them, then that’s pretty good… 🙂

For example, we have a checkin policy that stops people from checking in binaries into TFS. Of course there are occasions when people are allowed to do this (3rd party dll:s, binary references), so in those cases we check that the binaries are placed in folders that are named according to our naming policies, thereby enforcing standards across the team projects.

I recently saw a post in one of the MSDN forums asking for a checkin policy that would check coverage as part of a check-in. That is, if the latest test run either does not have code coverage at all, or the total code coverage percentage is below a certain threshold, the policy would stop the check-in. I couldn’t find any such checkin policy on the net, so I decided that it would be fun to write one.

The following things must be solved:

1) Locating the latest test run and code coverage information
2) Analyzing the code coverage information

The first part was simple to implement. Unfortunately, there does not seem to be anything in the VS.NET extensibility API that lets you locate the test runs or the code coverage information, so I basically had to walk the folder structure beneath the current solution to locate the folder with the latest test run. Simple and rather boring, so I won’t show that code here.
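For illustration, a sketch of such a lookup could look like this (CoverageLocator and FindLatestCoverageFile are hypothetical names, not from the actual policy source):

```
// Sketch: find the data.coverage file from the most recent test run.
using System.IO;
using System.Linq;

static class CoverageLocator
{
    public static string FindLatestCoverageFile(string solutionDir)
    {
        string testResults = Path.Combine(solutionDir, "TestResults");
        if (!Directory.Exists(testResults))
            return null;

        // Newest test run folder first, then pick its data.coverage (if any)
        return Directory.GetDirectories(testResults)
                        .OrderByDescending(Directory.GetCreationTime)
                        .SelectMany(d => Directory.GetFiles(d, "data.coverage", SearchOption.AllDirectories))
                        .FirstOrDefault();
    }
}
```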

The second part was a bit worse, since the API for running and analysing code coverage is totally undocumented and, frankly, not supported by MS. However, the following blog post by Joe contained the information I needed in order to load and analyse the code coverage information. As always with unsupported stuff, there is no guarantee that the code will work with new versions of VSTS or even service packs. This code has been tested on VSTS 2008 SP1.

The code coverage result is stored in a proprietary binary format and is located beneath the test run result. The local folder structure looks like this:

    TestResults
        TestRun1
            In
                data.coverage
            Out
                (binaries from the instrumented assemblies)

To programmatically access and analyse the code coverage results, we need a reference to the Microsoft.VisualStudio.Coverage.Analysis assembly, which is located in the private assemblies folder of VSTS. In this assembly, we use the CoverageInfoManager class to load the coverage file, which gives us an instance of the CoverageInfo class. This class contains a method that returns a typed dataset (the method is appropriately called BuildDataSet), from which we can easily read the information.

The code snippet for loading the coverage file calculating the total code coverage in percent looks like this:

CoverageInfoManager.ExePath = binariesFolder;
CoverageInfoManager.SymPath = binariesFolder;
CoverageInfo ci = CoverageInfoManager.CreateInfoFromFile(codeCoverageFile);
CoverageDS data = ci.BuildDataSet(null);
uint blocksCovered = 0;
uint blocksNotCovered = 0;
foreach (CoverageDS.ModuleRow m in data.Module)
{
    blocksCovered += m.BlocksCovered;
    blocksNotCovered += m.BlocksNotCovered;
}
return GetPercentCoverage(blocksCovered, blocksNotCovered);

Note that we must set the ExePath and SymPath properties to the folder where the instrumented assemblies are located. If not, the BuildDataSet method will throw a CoverageException.

So all we have to do then is implement the PolicyBase.Evaluate method and compare the total code coverage with the configurable threshold. This threshold is configured by implementing the CanEdit and Edit methods. See the source code for how this is done; it is all standard checkin policy stuff.
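As a rough sketch (member names like _threshold and GetTotalCodeCoverage are illustrative; see the actual source for details), the Evaluate method boils down to something like this:

```
// Sketch of PolicyBase.Evaluate: fail the check-in when coverage is too low.
public override PolicyFailure[] Evaluate()
{
    double totalCodeCoverage = GetTotalCodeCoverage();  // parses data.coverage as shown above
    if (totalCodeCoverage < _threshold)
    {
        string message = string.Format(
            "Code coverage is {0:F1}%, which is below the required {1:F1}%",
            totalCodeCoverage, _threshold);
        return new PolicyFailure[] { new PolicyFailure(message, this) };
    }
    return new PolicyFailure[0];
}
```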

Hopefully this checkin policy will be useful for some people; let me know about any problems and I will try to fix them asap.