Avoiding TF237124 when Creating Work Items in New Areas

At my company we write a lot of tools and extensions that use the TFS API to automate things for us. A very common task to automate is the creation of work items and the area and iteration structure.

Creating a work item using the TFS API is simple: connect to TFS, get the WorkItemStore service object, create a new work item and set any fields that you want:


Creating a work item

//Connect to TFS and get the WorkItemStore object
var tfs = new TfsTeamProjectCollection(new Uri("http://localhost:8080/tfs"));
var wis = tfs.GetService(typeof(WorkItemStore)) as WorkItemStore;

//Get team project
var teamProject = wis.Projects["Demo"];

//Get the Bug Work Item Type
var wit = teamProject.WorkItemTypes["Bug"];

//Create a new Bug work item and set the title field
WorkItem wi = new WorkItem(wit);
wi.Title = "New Bug In New Area";
wi.Save();

 

Creating an area or iteration is equally simple:



Creating an area

//Connect to TFS and get the ICommonStructureService object
var tfs = new TfsTeamProjectCollection(new Uri("http://localhost:8080/tfs"));
var css = tfs.GetService(typeof(ICommonStructureService)) as ICommonStructureService;

//Get the root path of the new area
string rootNodePath = @"\Demo\Area";
var pathRoot = css.GetNodeFromPath(rootNodePath);

//Create the new area, in this case it will be a new root area
css.CreateNode("NewRootArea", pathRoot.Uri);

 

BUT (yes there is a but, you could sense it coming), when you combine these two fellows into one task, e.g. create a new area (or iteration) and then create a new work item in that area, chances are high that you will receive the following exception:

Microsoft.TeamFoundation.WorkItemTracking.Client.ValidationException: TF237124: Work Item is not ready to save

   at Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItem.Save(SaveFlags saveFlags)

   at Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItem.Save()


The meaning of the error is not obvious, but if you call the Validate() method before calling Save() (which you should, of course), you will see that it returns the Area field, indicating that this field is the problem.
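
As a minimal sketch (continuing the snippet above, where wi is the new Bug work item), validating before saving makes the offending field visible:

//Validate returns an ArrayList with the Field objects that are currently invalid
var invalidFields = wi.Validate();

foreach (Field field in invalidFields)
{
    Console.WriteLine("Invalid field: {0} = '{1}' (status: {2})",
        field.Name, field.Value, field.Status);
}

//Only save if validation passed
if (invalidFields.Count == 0)
{
    wi.Save();
}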

The underlying problem is that TFS persists work items and areas/iterations in two different stores, and these stores need to be synchronized before you can reference any newly added nodes. You'll notice the same issue in Visual Studio: when you create a new area or iteration in Team Explorer, you need to refresh Team Explorer before you can use the new nodes in work items.

But how do we do this programmatically? There are actually two things that need to be done:

  1. Request that the work item store is synchronized with the Common Structure store
  2. Refresh the local cache

This translates into the following code:



Synchronize External Stores

//Synchronize the work item store with external stores (e.g. CSS)
private static void SyncExternalStructures(TfsTeamProjectCollection tfs, WorkItemStore wis,
    ICommonStructureService css, string teamProject)
{
    //Get work item server proxy object
    WorkItemServer witProxy = (WorkItemServer)tfs.GetService(typeof(WorkItemServer));

    //Get the team project
    ProjectInfo projInfo = css.GetProjectFromName(teamProject);

    //Sync External Store
    witProxy.SyncExternalStructures(WorkItemServer.NewRequestId(), projInfo.Uri);

    //Refresh local cache
    wis.RefreshCache();
}

 

Call this method after creating the areas/iterations and before creating the new work item, and everything will work as expected.
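
Putting the pieces together, a minimal sketch of the full sequence (reusing tfs, wis and css from the snippets above, plus the hypothetical NewRootArea) could look like this:

//Create the new root area under the Demo team project
var pathRoot = css.GetNodeFromPath(@"\Demo\Area");
css.CreateNode("NewRootArea", pathRoot.Uri);

//Make the new node visible to the work item store (avoids TF237124)
SyncExternalStructures(tfs, wis, css, "Demo");

//Now a work item can be created in the new area
var wit = wis.Projects["Demo"].WorkItemTypes["Bug"];
var wi = new WorkItem(wit)
{
    Title = "New Bug In New Area",
    AreaPath = @"Demo\NewRootArea"
};
wi.Save();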

Introducing: Community TFS Build Manager

The latest release of the Community TFS Build Extensions includes a brand new tool called the Community TFS Build Manager, which has been created for two reasons:

  1. To provide an implementation of the Team Foundation Build API, which is referenced by the Rangers Build Customization Guidance V2 (available H1 2012).
  2. To provide a solution to a real problem. The Community TFS Build Manager is intended to ease the management of builds in medium to large Team Foundation Server environments, though it also provides a few features that all users may find useful.

The first version of the tool has been implemented by myself and Mike Fourie. You can download the extension from the Visual Studio Gallery here:
http://visualstudiogallery.msdn.microsoft.com/16bafc63-0f20-4cc3-8b67-4e25d150102c


Note 1:
The full source is available at the Community TFS Build Extensions site.

Note 2: The tool is still considered alpha, so you should be a bit careful when running commands that modify or delete information in live environments; try it out first in a non-critical environment.

Note 3: The tool is also available as a standalone WPF application. To use it, you need to download the source from the CodePlex site and build it.

 

Getting Started

You can either install the extension from the above link, or just open the Visual Studio Extension Manager and go to the Online gallery and search for TFS Build:

image

 

After installing the build manager, you can start it either from the Tools menu or from the Team Explorer by right-clicking on the Builds node on any team project:

 

image

This will bring up a new tool window that by default shows all build definitions in the currently selected team project.


View and sort Builds and Build Definitions across multiple Build Controllers and Team Projects
This has always been a major limitation when working with builds in Team Explorer: they are always scoped to a single team project. It is particularly annoying when viewing queued builds and you have no idea what other builds are running on the same controller. In the TFS Build Manager, you can filter on one or all build controllers and one or all team projects:

image

 

The same filters apply when you switch between Build Definitions and Builds. In the following screen shot, you can see that three builds from three different team projects are running on the same controller:

image

You can easily filter on specific team projects and/or build controllers. Note that all columns are sortable; just click a column header to sort it ascending or descending.
This makes it easy to, for example, locate all build definitions that use a particular build process template, or to group builds by team project.

 

Bulk operations on Build Definitions

The main functionality that this tool adds on top of what Team Explorer already offers is the ability to perform bulk operations on multiple build definitions or builds. You often need to modify or
delete several build definitions in TFS, and there is no way to do this in Team Explorer.

 

In the TFS Build Manager, just select one or more builds or build definitions in the grid and right-click. The following context menu will be shown for build definitions:

image

 

Change Build Process Templates

This command lets you change the build process template for one or more build definitions. It will show a dialog with all existing build process templates in the corresponding team projects:

image

 

Queue

This will queue a “default” build for the selected build definitions, meaning that they will be queued with their default parameters.

 

Enable/Disable

Enables or disables the selected build definitions. Note that disabled build definitions are not shown by default. To view disabled build definitions, check the Include Disabled Builds checkbox:

 

image

 

Delete

This lets you delete one or more build definitions in a single click. In Team Explorer this is not possible; you must first delete all builds and then delete the build definition. Annoying!

TFS Build Manager will prompt you with the same delete options as in Team Explorer, so no functionality is lost:

image

 

Set Retention Policies

Allows you to set retention policies for several build definitions in one go. Note that only retention policies for Triggered and Manual builds can be updated, not for Private builds.
This feature also gives you the same options as in Team Explorer:

image

 

Clone to Branch

My favorite feature! Often the reason for cloning a build definition is that you have created a new source code branch and now want to set up a matching set of builds for the new branch. When using the Clone build definition feature of the TFS Power Tools, you must manually update several parameters of the cloned build definition afterwards, including:

  • Name
  • Workspace mappings
  • Source control path to Items to builds (solutions and/or projects)
  • Source control path to test settings file
  • Drop location
  • Source control path to TFSBuild.proj for UpgradeTemplate builds

All this is done automagically when using the Clone to Branch feature! When you select this command, the build manager looks at the Items to Build path (i.e. the solutions/projects), finds all child branches of this path and displays them in a dialog:

image

When you select one of the target branches, the new name defaults to the name of the source build definition with the target branch name appended; of course, you can modify the name in the dialog. After pressing OK, a new build definition is created and all the parameters listed above are adjusted to the new branch.

 

Bulk operations on Builds

You can also perform several actions on builds, and more will be added shortly. In the first release, the following features are available:

 

Delete

This will delete all artifacts of the build (details, drops, test results etc.). It should show the same dialog as the Delete Build Definition command, but currently it deletes everything.

Open Drop Folders

Allows you to open the drop folder for one or more builds.

 

Retain Indefinitely

Set one or more builds to be retained indefinitely.

 

Bonus Feature – Generate DGML for your build environment

This feature was outside the original scope, but since I was playing around with generating DGML it was easy to implement, and it is actually rather useful. It quickly gives you an overview of your build resources, e.g. which build controllers and build agents exist in the current project collection and on which hosts they are running. The command is available in the small toolbar at the top, next to the Refresh button:

image

Here is an example from our lab environment:

image

The dark green boxes are the host machine names and the controller and agents are contained within them.

Note: Currently the only way to view DGML files is with Visual Studio 2010 Premium or Ultimate.

 

I hope that many of you will find this tool useful, please report issues/feature requests to the Community TFS Build Extensions CodePlex site!

December 2011 TFS Power Tools Release

Brian Harry just posted an update on the latest version of the TFS 2010 Power Tools. This will most likely be the last Power Tools release for TFS 2010; the next version will target Dev11!

The main improvements in this release are:

  • Team Foundation Server Power Tools for Eclipse
  • MSSCCI Provider for 64-bit IDEs
  • VS 2010 Power Tools update
    • Improved Work item Search
    • Best Practice Analyzer now also analyzes the integration with Project Server, if you are using it

Check out the early Christmas gift here:

http://blogs.msdn.com/b/bharry/archive/2011/12/16/december-2011-tfs-power-tools-release.aspx

TF237165: Team Foundation could not update the work item because of a validation error on the server.

I often use the VS 2010/TFS 2010 evaluation virtual machines that Microsoft publishes every six months with the latest bits. It's a great timesaver to use an image where everything is already set up and that also contains some sample data, which is useful when you want to demo something for customers.

 

There is one small, but still very annoying, problem though: the builds always partially fail when you start using the image. When you want to demo the powerful feature of associating work items with a build, you'll find yourself with your pants down, since the build fails when trying to update the associated work item! Even when looking at the historical builds for the Tailspin Toys project, you will notice that they also partially failed:

image

 

If you look at the error message in the build details, you’ll see the following error:

 

image

The work item ‘XX’ could not be updated: ‘TF237165: Team Foundation could not update the work item because of a validation error on the server. This may happen because the work item type has been modified or destroyed, or you do not have permission to update the work item.’

The problem here is that the build service by default runs as the NT AUTHORITY\SYSTEM account, which does not have permission to modify work items. Your best option here is to switch to the Network Service account instead. Open the TFS Administration Console, select the Build Configuration node and press the Stop link in the Build Service section:

 

image

 

Select Properties and select NT AUTHORITY\NetworkService as the credentials:

 

image

 

Press Start to start the build service with the new credentials.

 

If you queue a new build now, it will fail because of conflicting workspace mappings. The reason is that we haven't changed the working folder path for the build agents, so when the build agent tries to create a new workspace, the local path will conflict with the workspace previously created by NT AUTHORITY\SYSTEM.

To resolve this we can do one of two things:

  1. (Preferred). Delete the team build workspaces previously created by the SYSTEM account. To do this, start a Visual Studio command prompt and type:

    tf.exe workspace /delete "<workspacename>;NT AUTHORITY\SYSTEM"


    If you need to list the workspaces to get the names, you can type:

    tf workspaces /owner:"NT AUTHORITY\SYSTEM" /computer:<buildserver>

  2. (Less preferred, but good if you want to switch back later to the old build service account) 

    You can also modify the working folder path for the build agents, so that they don't conflict with the existing workspaces. Click Properties on the build agent(s) and modify the Working Directory property:

    imageb

    In this case, you can for example change it to $(SystemDrive)\Builds\NS\$(BuildAgentId)\$(BuildDefinitionPath), where NS = Network Service.

Compatibility Problem with Microsoft Test Manager 2010 and Visual Studio 2011

UPDATE 10.01.2012:

The issue has been resolved by Microsoft and will be addressed in a patch soon. Here is the full description from the Connect site:

“We’ve identified the rootcause. This bug was introduced in the compatibility GDR patch released for VS 2010 to work against 2011 TFS Server. We shall be releasing a patch soon. Till then, please follow the workaround mentioned to unblock yourselves. “

When setting up a physical environment for a new test controller on our TFS 2010 server, I ran into a problem that seems to be related to having the Visual Studio 2010 SP1 TFS Compatibility GDR and/or the Visual Studio 2011 Developer Preview installed on the same machine as Visual Studio 2010 (SP1).

 

The problem occurs when trying to add a test agent to the physical environment; MTM gives the following error:


Failed to obtain available machines from the selected test controller.


Clicking on the View details link shows the following error dialog:

image

Error dialog: Cannot communicate with the Controller due to version mismatch

 

I have investigated the problem together with Microsoft, and they are working on finding out why this is happening. I have posted the issue on the Connect site here:
https://connect.microsoft.com/VisualStudio/feedback/details/712290/microsoft-test-manager-2010-can-not-communicate-with-test-controllers-when-visual-studio-11-is-installed-on-the-same-machine

 

Workaround

Fortunately, we found a workaround that is not too bad. When facing this problem, go to the Controllers tab, which lists all the controllers. If you select the controller in the list, it will actually show the test agent.

 

image

Then go back to the Environments tab and voilà, the test agent now appears in the list.

I'll post an update when the issue has been resolved by Microsoft.

TFS 2010 Build – Troubleshooting the TF215097 error

Anyone developing custom activities for TFS 2010 Build has run into the following dreadful error message when running a build:

TF215097: An error occurred while initializing a build for build definition \TeamProject\MyBuildDefinition: Cannot create unknown type ‘{clr-namespace:[namespace];assembly=[assembly]}Activity’

What the error means is that when the TFS build service loads the build process template XAML for the build definition, it can't create an instance of the custom workflow activity that is referenced from it.

The problem here is that there are several steps that all need to be done correctly for this process to work.

Make sure that:

  • When developing custom workflows, you keep the XAML build process templates in one project, and the custom activities in another project. The template project should reference the custom activities project.
    This setup also makes sure that your custom activities show up in the toolbox when designing your workflow.
  • You have checked in the modified version of the XAML workflow (easy to forget)
  • Your custom activity has the BuildActivityAttribute (a minimal example follows after this list):

    image

  • Your custom activity is public (common mistake…)
  • You have configured your build controller with the path in source control where the custom activities are located

    image

  • Verify that all dependencies for the custom activity assembly/assemblies have been checked into the same location as the assembly
    NB: You don’t need to check in TFS assemblies and other references that you know will be in the GAC on the build servers.
  • The reference to the custom activity assembly in the XAML workflow is correct:
  • xmlns:obc="clr-namespace:Inmeta.Build.CustomActivities;assembly=Inmeta.Build.CustomActivities"
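
To make the attribute and visibility requirements concrete, here is a minimal sketch of a custom build activity (the class name and message argument are placeholders; the logging call uses the TrackBuildMessage helper from Microsoft.TeamFoundation.Build.Workflow.Activities):

using System.Activities;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Build.Workflow.Activities;

//The activity must be public and marked with the BuildActivity attribute,
//otherwise the build service refuses to create it (TF215097)
[BuildActivity(HostEnvironmentOption.All)]
public sealed class HelloBuildActivity : CodeActivity
{
    public InArgument<string> Message { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        //Write a message to the build log
        context.TrackBuildMessage(Message.Get(context));
    }
}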
    


But even if you have done all these steps right, you can still get the error. I ran into this recently when working on the code metrics activities for the http://tfsbuildextensions.codeplex.com/ community project.

The thing that saved me that time was the Team Foundation Build Service Events event log. This is a somewhat hidden “feature” that is very useful when troubleshooting build problems. You find it under Custom Views in the event log on the build servers:


image

In this case I had the following message there:

Service ‘LT-JAKOB2010 – Agent1’ had an exception:

Exception Message: Problem with loading custom assemblies: Method ‘get_BuildAgentUri’ in type ‘TfsBuildExtensions.Activities.Tests.MockIBuildDetail’ from assembly ‘TfsBuildExtensions.Activities.Tests, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null’ does not have an implementation. (type Exception)

This made me realize that I had accidentally checked in one of the test assemblies to the custom activity source control folder in TFS.


Unfortunately, this whole process of developing your own custom activities is problematic and error prone; hopefully this will be better in future versions of TFS. Once you have your setup working, however, changing and adding custom activities is easy. And deployment is a breeze thanks to the automatic downloading and recycling of build agents that the build controller handles.

First stable release of the Community TFS 2010 Build Extensions

Today the first stable release of the Community TFS 2010 Build Extensions shipped on the CodePlex site. Visual Studio ALM MVP Mike Fourie (aka Mr MSBuild Extension Pack) has been the leader of this project and has done a tremendous job, both in contributing functionality and in coordinating the work for the first release. Great work, Mike! I (as well as several others) have contributed a small part of the activities, and I plan to keep working on the upcoming releases as well.

The build extensions contain approximately 100 custom activities that cover several different areas, such as IIS7, Hyper-V, StyleCop, NUnit, PowerShell etc., as well as some core functionality (assembly versioning, file management, compression, email and so on). In addition to more activities, the plan is to include build process templates for different scenarios in upcoming releases.

Please download the release and try it out, and give us feedback!

To give you a hint of the content, here is a class diagram that shows the content of the “Core” activities project (there are several other projects included as well):

image

Automatically Merging Work Items in TFS 2013

** Source available at http://mergeworkitems.codeplex.com/ **

Half a year ago I wrote about merging work items with a custom check-in policy. The policy evaluated the pending changes and, for all pending merges, traversed the merge history to find the associated work items and let the user add them to the current changeset.

I promised to post the source of the check-in policy (and I've received a lot of requests for it), but I never did, primarily for two reasons:

  1. The technical solution turned out to be a bit complicated. The problem was/is that it is not possible to modify the list of associated work items in the current pending changes using the API. This was a show stopper and the only way around it was to add another component that was executed on the server after the check-in that did the actual association. The information about the selected work items was temporarily stored in the comments of the changeset. This worked, but complicated the deployment.
  2. The feedback internally at Inmeta was: why should the developer be allowed to select which work items should be associated? If a work item was associated with a changeset in the Main branch, it should always be associated with the merge changeset when merging to a Release branch, so the association should be done automatically.

For these reasons I changed the implementation and converted the check-in policy into a TFS server-side event handler instead (for more info on working with these event handlers, check out my post about them here: Server Side Event Handlers in TFS 2010).

By having the association done server side, the process is very smooth for the developer. When a changeset that contains merges is checked in, the event handler evaluates all merges and associates the work items. If a work item has already been associated by the developer, it is of course not associated again. Since it is implemented with a server side event handler, it runs instantly without the user ever really noticing it.
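
For reference, a server-side check-in handler in TFS 2010-2013 is a class that implements the ISubscriber plug-in interface. The skeleton below is a stripped-down sketch, not the actual Inmeta implementation: the class and helper names are mine, and the work item association itself is only hinted at in a comment (see the Implementation section below).

using System;
using Microsoft.TeamFoundation.Framework.Server;
using Microsoft.TeamFoundation.VersionControl.Server;

public class MergeWorkItemsEventHandler : ISubscriber
{
    public string Name
    {
        get { return "MergeWorkItemsEventHandler"; }
    }

    public SubscriberPriority Priority
    {
        get { return SubscriberPriority.Normal; }
    }

    //Subscribe to check-in notifications from version control
    public Type[] SubscribedTypes()
    {
        return new Type[] { typeof(CheckinNotification) };
    }

    public EventNotificationStatus ProcessEvent(
        TeamFoundationRequestContext requestContext,
        NotificationType notificationType,
        object notificationEventArgs,
        out int statusCode,
        out string statusMessage,
        out ExceptionPropertyCollection properties)
    {
        statusCode = 0;
        statusMessage = string.Empty;
        properties = null;

        //Only act on the asynchronous Notification stage, after the check-in is committed
        if (notificationType == NotificationType.Notification &&
            notificationEventArgs is CheckinNotification)
        {
            var checkin = (CheckinNotification)notificationEventArgs;

            //The real handler walks the merge history of the checked-in changeset
            //and associates the originating work items through the client object
            //model (see the Implementation section below). Hypothetical helper:
            //AssociateMergedWorkItems(requestContext, checkin);
        }

        return EventNotificationStatus.ActionPermitted;
    }
}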

Let's look at how this works. Let's say that we have the following branch hierarchy:

image

Now, one of our finest developers has both fixed a bug and added a new user story in the Main branch. That makes two changesets, each associated with a corresponding work item:

imageimage

Now, we want to push these changes to the 2.0 Release branch. So the developer performs a merge from Main to the 2.0 branch.

image

At this point, instead of manually adding the work items, the developer just checks in the changes. After this, let's take a look at the source history:

image

Double-clicking on the latest changeset, we can see the following information on the work items tab:

image

The work items associated with the original changesets that were merged to the R2.0 branch have been associated with the new changeset. Also, if we double-click one of the work items, we can see that the server-side event handler has added a link to the changeset for this work item:

image

Note the following:

  • The changeset has been linked in the same way as when you associate a work item manually.
  • The change was done by the developer account (myself in this sample), and not the TFS service account. This is because the event handler uses the new TFS Impersonation API to impersonate the user who committed the check-in.
  • The history comment is a bit different from the usual (“Associated with changeset XXX”), just to highlight the reason for the change.
  • If the changeset contains both merges and other types of pending changes, the merges are still traversed; the other changes are just ignored.

 

Deployment
Deploying the server-side event handler couldn't be easier: just drop the Inmeta.TFS.MergeWorkItemsEventHandler.dll assembly into the plugin directory of the TFS application tier server. This path is usually %PROGRAMFILES%\Microsoft Team Foundation Server 12.0\Application Tier\Web Services\bin\Plugins. Note: This will cause TFS to recycle in order to load the new assembly, so in production you might want to schedule this to minimize problems for your users.
See my post for more details on this.

image

Implementation
I have uploaded the source code to CodePlex at http://mergeworkitems.codeplex.com/, so you can check out the details there. One thing worth mentioning is that I had to resort to the TFS client API to access and modify the associated work items. The TFS server-side object model is not well documented, and according to Grant Holliday (in this post) the server object model for work item tracking is not very useful. So instead of going down that road, I used the client object model to do the work. Not as efficient, but it gets the job done.
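
As an illustration of that approach (a simplified sketch, not the exact code from the CodePlex project; the work item ID and changeset ID are placeholders, and the link type used is the standard "Fixed in Changeset" registered link type):

//Connect to the project collection and get the work item store
var tfs = new TfsTeamProjectCollection(new Uri("http://localhost:8080/tfs"));
var wis = tfs.GetService(typeof(WorkItemStore)) as WorkItemStore;

//Get the work item that was associated with the original (merged) changeset
WorkItem workItem = wis.GetWorkItem(42);   //placeholder work item id

//Link it to the merge changeset using the registered "Fixed in Changeset" link type
var changesetLinkType = wis.RegisteredLinkTypes["Fixed in Changeset"];
var changesetUri = "vstfs:///VersionControl/Changeset/1234";   //placeholder changeset id
workItem.Links.Add(new ExternalLink(changesetLinkType, changesetUri));

//Add a history comment and save
workItem.History = "Associated with merge changeset 1234 by the merge work items event handler";
workItem.Save();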

Note that the event handler hooks into the Notification stage, which is executed asynchronously after the check-in has been committed, and therefore it doesn't have a negative impact on the overall check-in process.

Hope that you find the event handler useful, contact me either through this blog or via the CodePlex site if you have any questions and/or feature requests.

TFS 2010 Inmeta Build Explorer

This weekend we at Inmeta released a free Visual Studio 2010 Team Explorer extension that solves the problem of the Builds node in Team Explorer not being hierarchical. For some reason, this part of Team Explorer didn't get the nice hierarchical folder structure that the Work Items node got in 2010. The result is that, for a company with several hundred builds in the same team project, it becomes very hard to navigate.

The solution that we implemented is very simple and uses a naming convention to group the build definitions into folders. The default separator is ‘.’ (dot), which is probably the most common convention used anyway. As it turns out, Microsoft DevDiv uses this convention internally, as posted by Brian Harry. And they have a _lot_ of build definitions…
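
Just to illustrate the convention (a minimal sketch, not the extension's actual code), splitting a definition name on the separator yields the folder path and the leaf name:

using System;
using System.Linq;

class BuildFolderConvention
{
    //Split a build definition name into its folder path and leaf name,
    //using '.' as the separator (the extension's default convention)
    static void Main()
    {
        var definitionName = "Inmeta.TFS Exception Reporter.Production";
        var parts = definitionName.Split('.');

        var folderPath = string.Join("/", parts.Take(parts.Length - 1));
        var leafName = parts.Last();

        Console.WriteLine("Folder: {0}", folderPath);   //Inmeta/TFS Exception Reporter
        Console.WriteLine("Build:  {0}", leafName);     //Production
    }
}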

This is what the build explorer looks like:

 

As you can see, if you have a multi-part name, such as Inmeta.TFS Exception Reporter.Production, you get two folders in the hierarchy.

The Build Explorer is available in the Visual Studio Gallery, either download it from http://visualstudiogallery.msdn.microsoft.com/35daa606-4917-43c4-98ab-38632d9dbd45, or use the Visual Studio Extension Manager directly (search for Inmeta):

image

 

The extension was developed mostly by Lars Nilsson, with some smaller additions by myself and Terje Sandström.

The source code is available at http://tfsbuildfolders.codeplex.com. Let us know what you think and if you want to contribute, contact me or Terje at the Codeplex site.

Integrating Code Metrics in TFS 2010 Build

The build process template and custom activity described in this post is available here:
http://cid-ee034c9f620cd58d.office.live.com/self.aspx/BlogSamples/Inmeta%20TFS%20Build%20Sample.zip

Running code metrics has been available since VS 2008, but only from inside the IDE. Yesterday Microsoft finally released the Visual Studio Code Metrics Power Tool 10.0, a command line tool that lets you run code metrics on your applications. This means that it is now possible to perform code metrics analysis on the build server as part of your nightly/QA builds. In this post I will show how you can run the metrics command line tool from a build, together with a custom activity that reads the output, appends the results to the build log, and fails the build if the metric values exceed certain (configurable) threshold values.

The code metrics tool analyzes all the methods in the assemblies, measuring cyclomatic complexity, class coupling, depth of inheritance and lines of code. Then it calculates a Maintainability Index from these values, a measure of how maintainable the method is, between 0 (worst) and 100 (best). For information on how this value is calculated, see http://blogs.msdn.com/b/codeanalysis/archive/2007/11/20/maintainability-index-range-and-meaning.aspx. After this it aggregates the information and presents it at the class, namespace and module level as well.


Running Metrics.exe in a build definition
Running the actual tool is easy: just add an InvokeProcess activity last in the Compile the Project sequence, reference the metrics.exe file and pass the correct arguments, and you will end up with a result XML file in the drop directory. Here is how it is done in the attached build process template:

image

In the above sequence I first assign the path of the code metrics result file ([BinariesDirectory]\result.xml) to a variable called MetricsResultFile, which is then passed to the InvokeProcess activity in the Arguments property.
Here are the arguments for the InvokeProcess activity:

image

Note that we tell metrics.exe to analyze all assemblies located in the Binaries folder. You might want to do some more intelligent filtering here; you probably don't want to analyze all 3rd party assemblies, for example.
Note also the path to metrics.exe: this is the default location when you install the Code Metrics power tool. You must of course install the power tool on all build servers.

Using the standard output logging (in the Handle Standard Output/Handle Error Output sections), we get the following output when running the build:

image

Integrating Code Metrics into the build
Having the results available next to the build result is nice, but we want the results integrated into the build result itself, and we also want them to affect the outcome of the build. The point of having QA builds that measure, for example, code metrics is to make it very clear how the code being built measures up to the standards of the project/company. Just having an XML file available in the drop location will not cause the developers to improve their code, but a (partially) failing build will!

To do this, we need to write a custom activity that parses the metrics result file, logs it to the build log and fails the build if the metric values fall outside some predefined threshold values.

The custom activity performs the following steps:

  1. Parses the XML. I'm using LINQ to XSD for this. Since the XML schema for the result file ships with the power tool, it is very easy to generate code that lets you query the structure using standard LINQ operators.
  2. Runs through the metric result hierarchy, logs the metrics for each level and verifies the maintainability index and cyclomatic complexity against the threshold values. The thresholds are defined in the build process template and are passed as arguments to the custom activity.
  3. If the threshold limits are exceeded, the activity either fails or partially fails the current build.

For more information about the structure of the code metrics result file, read Cameron Skinner’s post about it. It is very simple and easy to understand. I won’t go through the code of the custom activity here, since there is nothing special about it and it is available for download so you can look at it and play with it yourself.
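
To give a rough idea of the shape of such an activity, here is a simplified sketch. It is not the downloadable implementation: it uses plain LINQ to XML instead of LINQ to XSD, it only reports a flag instead of logging to the build, and it assumes the result file exposes the metrics as Metric elements with Name and Value attributes.

using System.Activities;
using System.Linq;
using System.Xml.Linq;

//Simplified sketch of a threshold-checking activity (not the downloadable one)
public sealed class CheckCodeMetrics : CodeActivity
{
    public InArgument<string> ResultFilePath { get; set; }
    public InArgument<int> MaintainabilityIndexThreshold { get; set; }
    public InArgument<int> CyclomaticComplexityThreshold { get; set; }

    //True when any metric falls outside the thresholds, so the workflow
    //can decide to fail or partially fail the build
    public OutArgument<bool> ThresholdsExceeded { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        var doc = XDocument.Load(ResultFilePath.Get(context));
        int miThreshold = MaintainabilityIndexThreshold.Get(context);
        int ccThreshold = CyclomaticComplexityThreshold.Get(context);

        //Match elements by local name to stay independent of any XML namespace
        bool exceeded = doc.Descendants()
            .Where(e => e.Name.LocalName == "Metric")
            .Any(m =>
            {
                var name = (string)m.Attribute("Name");
                int value;
                if (!int.TryParse((string)m.Attribute("Value"), out value))
                {
                    return false;   //skip values we cannot parse (e.g. "1,234")
                }

                return (name == "MaintainabilityIndex" && value < miThreshold) ||
                       (name == "CyclomaticComplexity" && value > ccThreshold);
            });

        ThresholdsExceeded.Set(context, exceeded);
    }
}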

The threshold values for Maintainability Index and Cyclomatic Complexity are defined in the build process template and can be modified per build definition:

image

I have chosen the default values for these settings based on a post from my colleague Terje Sandström: Code Metrics – suggestions for appropriate limits. When you think about it, this is quite an improvement compared to using code metrics inside the IDE, where the Red/Yellow/Green limits are fixed (and the default values are somewhat strange; see Terje's post for a discussion on this).

This is the first version of the code metrics integration with TFS 2010 Build; I will probably enhance the functionality and the logging (the “tree view” structure in the log becomes quite hard to read) soon.
I will also consider adding it to the Community TFS Build Extensions site when it becomes a bit more mature.

Another obvious improvement would be to extend the TFS data warehouse, push the metric results into it and make them visible in the reports.