Wednesday, December 19, 2012

Tfs Automation - Automatically creating queries when new areas are created

Our testers got sick of setting up new queries whenever they created a new area in TFS, so they asked me if I could automate it.
The solution is broken into two parts:

AreaCreatedSubscriber

This is an ISubscriber that watches for StructureChangedNotification events and queues a one-time job to create the queries.
The reason I queued a job instead of doing the work directly in the plugin is that TFS was throwing a validation exception when creating the query (TF51011: The specified area path does not exist.). It seems there is a slight delay between adding areas and them becoming valid for use in queries. I thought it was a caching issue, but it still happened even when explicitly clearing the cache on the WorkItemStore. I found a post (New Area Path values are not available in work items) suggesting that TFS subscribes to its own events, which then triggers the process of making the area available. This meant I needed to delay creating the queries until after all the events had completed, hence the job queue.
It is also fairly good practice to push any semi-long-running work out of an event handler and into a job, as processing in the handler delays actions for end users.
This gets installed to C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\Web Services\bin\Plugins
public class AreaCreatedSubscriber : ISubscriber
{
    public string Name
    {
        get { return "Create Default Queries"; }
    }

    public SubscriberPriority Priority
    {
        get { return SubscriberPriority.Normal; }
    }

    public EventNotificationStatus ProcessEvent(TeamFoundationRequestContext requestContext, NotificationType notificationType, object notificationEventArgs, out int statusCode, out string statusMessage, out Microsoft.TeamFoundation.Common.ExceptionPropertyCollection properties)
    {
        statusMessage = string.Empty;
        statusCode = 0;
        properties = null;
        try
        {
            if (notificationType == NotificationType.Notification)
            {
                var eventArgs = notificationEventArgs as StructureChangedNotification;
                if (eventArgs != null)
                {
                    ICommonStructureService commonStructureService = requestContext.GetService<CommonStructureService>();
                    var changedNodes = commonStructureService.GetChangedNodes(requestContext, eventArgs.SequenceId - 1);
                    XDocument xml = XDocument.Parse(changedNodes);
                    XElement changedNode = (from XElement n in xml.Descendants("StructureElement") select n).First();
                    if (changedNode.Attributes("Deleted").First().Value == true.ToString())
                        return EventNotificationStatus.ActionPermitted;
                    string nodeId = changedNode.Attributes("Id").First().Value;
                    var node = commonStructureService.GetNode(requestContext, nodeId);
                    if (node.StructureType != StructureType.ProjectModelHierarchy)
                        return EventNotificationStatus.ActionPermitted;
                    var reader = XmlReader.Create(new StringReader("<AreaPath>" + node.Path + "</AreaPath>"));
                    var xmlData = new XmlDocument().ReadNode(reader);
                    // Handle the notification by queueing the information we need for a job
                    var jobService = requestContext.GetService<TeamFoundationJobService>();
                    jobService.QueueOneTimeJob(requestContext, "Create Default Queries", typeof(CreateDefaultQueriesJob).ToString(), xmlData, false);
                }
            }
        }
        catch (Exception e)
        {
            statusMessage = e.Message;
        }
        return EventNotificationStatus.ActionPermitted;
    }

    public Type[] SubscribedTypes()
    {
        return new[] { typeof(StructureChangedNotification) };
    }
}

CreateDefaultQueriesJob

The second part is the actual job. It's also fairly simple (although crudely written): it reads the area path out of the event data and creates a folder hierarchy for the queries to live in, then checks whether the queries already exist and creates them if not.
This job should probably be extended/cleaned up to read the queries from an external source, whether that's a WIQL file on disk, metadata stored on the team project, or a special folder in source control.
This is installed into C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\TFSJobAgent\plugins
public class CreateDefaultQueriesJob : ITeamFoundationJobExtension
{
    public TeamFoundationJobExecutionResult Run(TeamFoundationRequestContext requestContext, TeamFoundationJobDefinition jobDefinition, DateTime queueTime, out string resultMessage)
    {
        resultMessage = "";
        try
        {
            TeamFoundationLocationService service = requestContext.GetService<TeamFoundationLocationService>();
            Uri selfReferenceUri = service.GetSelfReferenceUri(requestContext, service.GetDefaultAccessMapping(requestContext));
            TfsTeamProjectCollection tfsTeamProjectCollection = new TfsTeamProjectCollection(selfReferenceUri);
            var workitemStore = tfsTeamProjectCollection.GetService<WorkItemStore>();
            var jobDataXmlNode = jobDefinition.Data;
            // Expects a node like <AreaPath>\Project\Area\SubArea</AreaPath>, queued by the subscriber above
            var areaPath = (from wi in XDocument.Parse(jobDataXmlNode.OuterXml).Elements() select wi.Value).First();
            var nodes = areaPath.Split(new[] { '\\' }, StringSplitOptions.RemoveEmptyEntries);
            var queryHierarchy = workitemStore.Projects[nodes[0]].QueryHierarchy;
            QueryFolder currentLocation = queryHierarchy.FirstOrDefault(i => i.Name == "Shared Queries") as QueryFolder;
            foreach (var path in nodes.Skip(2))
            {
                if (currentLocation.FirstOrDefault(i => i.Name == path) as QueryFolder != null)
                {
                    currentLocation = currentLocation.FirstOrDefault(i => i.Name == path) as QueryFolder;
                }
                else
                {
                    var folder = new QueryFolder(path);
                    currentLocation.Add(folder);
                    currentLocation = folder;
                    queryHierarchy.Save();
                }
            }
            queryHierarchy.Project.Store.RefreshCache();
            if (currentLocation.FirstOrDefault(i => i.Name == "Open Bugs") == null)
            {
                var query = new QueryDefinition("Open Bugs", string.Format("SELECT [System.Id], [System.Title], [System.AssignedTo], [System.State], [Microsoft.VSTS.Common.Priority] FROM WorkItems WHERE [System.TeamProject] = @project and [System.WorkItemType] = 'Bug' and [System.State] = 'Open' and [System.AreaPath] under '{0}' ORDER BY [System.Id]", areaPath.Substring(1).Replace("\\Area", "")));
                currentLocation.Add(query);
            }
            if (currentLocation.FirstOrDefault(i => i.Name == "Closed Bugs") == null)
            {
                var query = new QueryDefinition("Closed Bugs", string.Format("SELECT [System.Id], [System.Title], [System.AssignedTo], [System.State], [Microsoft.VSTS.Common.Priority] FROM WorkItems WHERE [System.TeamProject] = @project and [System.WorkItemType] = 'Bug' and [System.State] = 'Closed' and [System.AreaPath] under '{0}' ORDER BY [System.Id]", areaPath.Substring(1).Replace("\\Area", "")));
                currentLocation.Add(query);
            }
            queryHierarchy.Save();
        }
        catch (RequestCanceledException)
        {
            return TeamFoundationJobExecutionResult.Stopped;
        }
        catch (Exception exception)
        {
            resultMessage = exception.ToString();
            return TeamFoundationJobExecutionResult.Failed;
        }
        return TeamFoundationJobExecutionResult.Succeeded;
    }
}
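The "read the queries from an external source" extension suggested above could be sketched roughly as follows, assuming the query definitions are kept as *.wiql files in a folder on the job agent machine. The folder convention, the file-name-becomes-query-name rule, and the {0} placeholder for the area path are all my invention, not part of the original job:

```csharp
// Sketch only: each *.wiql file in wiqlDirectory becomes a query named after
// the file, with "{0}" in the WIQL replaced by the area path. The conventions
// here are hypothetical.
private static void AddQueriesFromDisk(QueryFolder folder, string areaPath, string wiqlDirectory)
{
    foreach (var file in Directory.GetFiles(wiqlDirectory, "*.wiql"))
    {
        var name = Path.GetFileNameWithoutExtension(file);
        if (folder.FirstOrDefault(i => i.Name == name) != null)
            continue; // query already exists, leave it alone
        var wiql = string.Format(File.ReadAllText(file), areaPath);
        folder.Add(new QueryDefinition(name, wiql));
    }
}
```

This would replace the two hard-coded "Open Bugs"/"Closed Bugs" blocks, and the testers could then add new default queries without a redeploy.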

Debugging

I had a bunch of issues trying to get this to work.
Firstly, the TF51011 error: folders were appearing in the Shared Queries, so I knew the plugin was doing something. Luckily the exception was being logged to the event log, so it was really easy to find.
After moving the logic to a one-time job I tried testing it, only to find nothing happened: no folders created and nothing logged in the event logs. I figured that the event was still being fired, so the job was probably being queued. After a quick google I came across a blog article from Martin Hinshelwood on debugging the TFS Active Directory sync job, which gave me a bunch of helpful debugging tips.
This led me to the TFS project collection database:
SELECT * FROM [Tfs_Enlighten].[dbo].[tbl_JobDefinition] where ExtensionName like '%CreateDefaultQueriesJob'
This returned rows, but they weren't much help: jobs were being queued, but I had no idea whether they were successful. I eventually discovered another table, in the Tfs_Configuration database, which contains the job history:
select * from tfs_configuration.dbo.tbl_JobHistory where jobid in (SELECT [JobId] FROM [Tfs_Enlighten].[dbo].[tbl_JobDefinition] where ExtensionName like '%CreateDefaultQueriesJob')
This had one row for every time I queued the job, each with a result status of 6. This time I needed to do some diving with ILSpy, which led me to this enum:
namespace Microsoft.TeamFoundation.Framework.Common
{
    public enum TeamFoundationJobResult
    {
        None = -1,
        Succeeded,
        PartiallySucceeded,
        Failed,
        Stopped,
        Killed,
        Blocked,
        ExtensionNotFound,
        Inactive,
        Disabled,
        JobInitializationError
    }
}

ExtensionNotFound = 6 - TFS couldn't find my job! I double- and triple-checked all my code, and started diving through the JobRunner code in TFS trying to figure out why it wasn't loading. I noticed a few hints of MEF in the code, so I tried adding an [Export] attribute to the class, but still no luck. After enabling the trace log by editing C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\TFSJobAgent\TfsJobAgent.exe.config, I started seeing a curious exception about a plugin being added to a dictionary twice.
Detailed Message: There was an error during job agent execution. The operation will be retried. Similar errors in the next five minutes may not be logged.
Exception Message: An item with the same key has already been added. (type ArgumentException)
Exception Stack Trace: at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at Microsoft.TeamFoundation.Framework.Server.TeamFoundationExtensionUtility.LoadExtensionTypeMap[T](String pluginDirectory)
at Microsoft.TeamFoundation.Framework.Server.JobApplication.SetupInternal()
at Microsoft.TeamFoundation.Framework.Server.JobServiceUtil.RetryOperationsUntilSuccessful(RetryOperations operations, Int32 maxTries, Int32& delayOnExceptionSeconds)

This is especially interesting because the code appears to check whether the plugin is already registered in the dictionary before adding it, which makes me think they have a threading bug.
After removing the [Export] attribute and restarting the TFS job service a few times it seemed to come right.

Update: Apparently there are some new Web Access interfaces for viewing the status of TFS jobs (and more), which might have meant less database diving had I known about them at the time.

Friday, December 7, 2012

Tfs Automation - Create Builds Automatically for New Solutions

I recently wrote a TFS server plugin that creates new builds automatically whenever you add or branch a solution file. It's fairly simple, but worth sharing.
using System;
using System.IO;
using Microsoft.TeamFoundation.Framework.Server;
using Microsoft.TeamFoundation.Common;
using Microsoft.TeamFoundation.VersionControl.Client;
using Microsoft.TeamFoundation.Client;

public class BuildCreator : ISubscriber
{
    public string Name
    {
        get { return "Automated Build Creator"; }
    }

    public SubscriberPriority Priority
    {
        get { return SubscriberPriority.Normal; }
    }

    public EventNotificationStatus ProcessEvent(TeamFoundationRequestContext requestContext, NotificationType notificationType, object notificationEventArgs, out int statusCode, out string statusMessage, out Microsoft.TeamFoundation.Common.ExceptionPropertyCollection properties)
    {
        statusCode = 0;
        statusMessage = string.Empty;
        properties = null;
        if (notificationType == NotificationType.Notification && notificationEventArgs is Microsoft.TeamFoundation.VersionControl.Server.CheckinNotification)
        {
            var checkinNotification = notificationEventArgs as Microsoft.TeamFoundation.VersionControl.Server.CheckinNotification;
            TeamFoundationLocationService service = requestContext.GetService<TeamFoundationLocationService>();
            Uri selfReferenceUri = service.GetSelfReferenceUri(requestContext, service.GetDefaultAccessMapping(requestContext));
            TfsTeamProjectCollection tfsTeamProjectCollection = new TfsTeamProjectCollection(selfReferenceUri);
            var versionControl = (VersionControlServer)tfsTeamProjectCollection.GetService(typeof(VersionControlServer));
            var changeSet = versionControl.GetChangeset(checkinNotification.Changeset);
            foreach (var change in changeSet.Changes)
            {
                if ((change.ChangeType & (ChangeType.Add | ChangeType.Branch)) == 0)
                    continue;
                // Case-insensitive comparison so .SLN files are picked up too
                if (!Path.GetExtension(change.Item.ServerItem).Equals(".sln", StringComparison.OrdinalIgnoreCase))
                    continue;
                var buildDefinitionHelper = new BuildDefinitionHelper(tfsTeamProjectCollection);
                buildDefinitionHelper.CreateBuildDefinition(change.Item.ServerItem);
            }
        }
        return EventNotificationStatus.ActionPermitted;
    }

    public Type[] SubscribedTypes()
    {
        return new Type[] { typeof(Microsoft.TeamFoundation.VersionControl.Server.CheckinNotification) };
    }
}


It takes advantage of another piece of code I had previously written (or perhaps found online, I honestly can't remember now) to create builds using the TFS API.

Installing it is very simple: build the class into an assembly and copy it to C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\Web Services\bin\Plugins. You may also need to copy a couple of the referenced DLLs to the same folder.

Creating a build using the Tfs API

Here is a snippet of code to create TFS builds using the API, with a few sensible default settings.
using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Build.Workflow;
using Microsoft.TeamFoundation.Build.Workflow.Activities;

// _buildServer is an IBuildServer, e.g. obtained via tfsTeamProjectCollection.GetService<IBuildServer>()
public void CreateBuildDefinition(string solutionFile)
{
    //todo pull from config
    string dropLocation = @"\\tfsserver\builddrop";
    string teamProject = solutionFile.Substring(2, solutionFile.Replace("$/", "").IndexOf('/'));
    string solutionDirectory = solutionFile.Substring(0, solutionFile.LastIndexOf('/'));
    string buildName = GetBuildNameForSolution(solutionFile); //Insert algorithm here to convert solution name (and folder) to a build name
    //First we create an IBuildDefinition object for the team project and set a name and description for it
    var buildDefinition = _buildServer.CreateBuildDefinition(teamProject);
    buildDefinition.Name = buildName;
    buildDefinition.Description = "Auto Genned Build for " + solutionFile;
    //Trigger - set to Individual, which corresponds to the "Continuous Integration - Build each check-in" trigger option
    buildDefinition.ContinuousIntegrationType = ContinuousIntegrationType.Individual;
    //Workspace - map the solution directory. Note the use of the $(SourceDir) variable, which Team Build expands to the sources directory when running the build
    buildDefinition.Workspace.AddMapping(solutionDirectory, "$(SourceDir)", WorkspaceMappingType.Map);
    //Build Defaults - set the build controller and the drop location
    buildDefinition.BuildController = _buildServer.QueryBuildControllers().First();
    buildDefinition.DefaultDropLocation = dropLocation;
    //Get default template
    var defaultTemplate = _buildServer.QueryProcessTemplates(teamProject.Replace("/", "").Replace("$", "")).Where(p => p.TemplateType == ProcessTemplateType.Default).First();
    buildDefinition.Process = defaultTemplate;
    //Set process parameters
    var process = WorkflowHelpers.DeserializeProcessParameters(buildDefinition.ProcessParameters);
    //Set BuildSettings properties
    BuildSettings settings = new BuildSettings();
    settings.ProjectsToBuild = new StringList(solutionFile);
    settings.PlatformConfigurations = new PlatformConfigurationList();
    settings.PlatformConfigurations.Add(new PlatformConfiguration("Any CPU", "Release"));
    process.Add(ProcessParameterMetadata.StandardParameterNames.BuildSettings, settings);
    process[ProcessParameterMetadata.StandardParameterNames.CreateWorkItem] = false;
    process[ProcessParameterMetadata.StandardParameterNames.AssociateChangesetsAndWorkItems] = true;
    process[ProcessParameterMetadata.StandardParameterNames.BuildNumberFormat] = "$(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.r)";
    TestAssemblySpec testSpec = new TestAssemblySpec();
    testSpec.AssemblyFileSpec = "**\\*test*.dll";
    process.Add(ProcessParameterMetadata.StandardParameterNames.TestSpecs, new TestSpecList(testSpec));
    buildDefinition.ProcessParameters = WorkflowHelpers.SerializeProcessParameters(process);
    //The other build process parameters of a build definition can be set using the same approach
    //Retention Policy - clear the default settings and set our own
    buildDefinition.RetentionPolicyList.Clear();
    buildDefinition.AddRetentionPolicy(BuildReason.Triggered, BuildStatus.Succeeded, 10, DeleteOptions.All);
    buildDefinition.AddRetentionPolicy(BuildReason.Triggered, BuildStatus.Failed, 10, DeleteOptions.All);
    buildDefinition.AddRetentionPolicy(BuildReason.Triggered, BuildStatus.Stopped, 1, DeleteOptions.All);
    buildDefinition.AddRetentionPolicy(BuildReason.Triggered, BuildStatus.PartiallySucceeded, 10, DeleteOptions.All);
    //Save it!
    buildDefinition.Save();
    _buildServer.QueueBuild(buildDefinition);
}

Wednesday, November 14, 2012

Web Code Coverage for Manual Tests

Found a great little tool which I think may be new in VS/TFS 2012 (it was possible in previous versions but seemed much harder to set up):

 C:\Program Files (x86)\Microsoft Visual Studio 11.0\Team Tools\Dynamic Code Coverage Tools\CodeCoverage.exe collect /iis /output:c:\outputfile.coverage

 It will restart IIS and record a coverage file which can later be opened in Visual Studio. Great to set going while testers are doing manual testing in a development environment.

Very Simple TFS 2012 Web Access Plugin

Made a very basic TFS Web Access plugin to turn the TFS logo into a link back to the homepage.

Manifest.xml
<WebAccess version="11.0">
  <plugin moreinfo="http://bzbetty.blogspot.co.nz/2012/11/very-simple-tfs-2012-web-access-plugin.html" name="Make Logo Clickable" vendor="Betty" version="0.0.0.3">
    <modules>
      <module loadAfter="TFS.Core" namespace="Clickable"></module>
    </modules>
  </plugin>
</WebAccess>
clickable.min.js
TFS.module("Clickable",[],function(){ });
$(function() { $(".logo").replaceWith("<a class='header-item logo header-logo-onpremise' href='/tfs/'>"); });
Then zip the two files together and upload the zip in the TFS admin section.

Obviously not the best plugin in the world, considering we actually bypass the whole plugin system, but it works and may be of help to someone.

Saturday, November 10, 2012

TFS Extensibility

In this post I hope to outline all the extensibility points in TFS 2012 and come up with reasons I might use them at the company I work for. I would call extensibility both a major strength and a weakness of TFS: there's a lot of powerful stuff exposed to developers, but there's not a lot of documentation and a fair bit of it seems half complete, which all results in very little adoption. The community around customizing TFS seems fairly small in my opinion, but it could be a major win for Microsoft.

Tfs Client API - .NET
A set of .NET classes for interacting with the various web services exposed by TFS. This is the main interaction point for Visual Studio and for any component that doesn't need to be installed on the TFS server directly, and for that reason it is extremely powerful.

We mainly use it to build a custom TFS web frontend for our clients for logging bugs, which is a bit more cut down than TFS Web Access. The need for this has reduced drastically with the release of TFS 2012; however, it does mean we can integrate other systems, giving our clients a one-stop site for all their needs (rather than requiring them to log in to 3-4 different systems). We are careful that users can only see work items they created themselves, so that they don't require a full TFS CAL. The built-in boards and backlog in TFS 2012 require a higher-level CAL, but you can build your own backlog/board that works under the limited CAL.
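Not our frontend's actual code, but the heart of that kind of limited-CAL view boils down to a client API query like this; the server URL, project name, and field list below are placeholders of mine:

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class LimitedCalQuerySample
{
    static void Main()
    {
        // Sketch: fetch only the work items the signed-in user created, so a
        // limited ("Work Item Only") CAL is sufficient. URL/project are placeholders.
        var collection = new TfsTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var store = collection.GetService<WorkItemStore>();
        var wiql = string.Format(
            "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems " +
            "WHERE [System.TeamProject] = '{0}' AND [System.CreatedBy] = '{1}' " +
            "ORDER BY [System.Id]",
            "MyProject", store.UserDisplayName);
        foreach (WorkItem item in store.Query(wiql))
            Console.WriteLine("{0}: {1}", item.Id, item.Title);
    }
}
```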

We also make use of the API in a few ways to help manage the TFS server. For example, automatically granting permissions to TFS teams for any areas/iterations they're associated with; that way I can just assign people to project teams and they'll have the permissions they need. Another example is a set of sanity checks which do things like making sure every solution in source control has an associated build.

Tarun Arora, Neno Loje, and Shai Raiten all blog extensively on this subject.

TFS Build Templates - XAML, .NET
TFS builds in 2010 and higher are controlled by .NET workflows; this includes a lot of the Lab Management functionality. There are lots of built-in activities you can add to your templates, and you can write your own activities in Visual Studio and include them (or use one of the many activities that are part of the Community TFS Build Extensions).

I'm not a fan of the editor, and using custom assemblies causes all kinds of pain just to get the custom activities into Visual Studio so you can add them. You can't easily grab a component off someone's blog, add it to your template, test it, and deploy it. This gets even more complex when you've already customized your template, as you can't just replace the entire template and have to merge their changes in manually.

For the above reasons I typically try to keep template customizations to a minimum. I believe the only customization I currently have is to set the default build quality of a successful build, so that I can use TFS Deployer to deploy the same binaries in turn to each environment for testing. TFS Deployer then makes use of PowerShell scripts, which I find much easier to customize and test.

TFS Soap Subscriptions - SOAP
The TFS API (mentioned above) allows you to subscribe to a number of events via SOAP web services, so your app can get notified when any of the following occurs in TFS:
  • Work item changes
  • Check ins
  • Build completes
  • Build quality changes (this is what TFS Deployer uses)
  • Project is created or deleted
  • Branch moved
  • Security Acl changes
  • Iteration/Area changes
Note: The last four may be client events only, meaning only the client that triggers the event gets notified.
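For completeness, subscribing to one of these events through the client API's eventing service looks roughly like this; the event name, filter expression, and delivery address are illustrative, not taken from a real setup:

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Client;

class SoapSubscriptionSample
{
    static void Main()
    {
        // Sketch: register a SOAP subscription for check-in events. TFS will
        // POST the event XML to the given endpoint. URLs are placeholders.
        var collection = new TfsTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var eventService = collection.GetService<IEventService>();
        var preference = new DeliveryPreference
        {
            Type = DeliveryType.Soap,
            Schedule = DeliverySchedule.Immediate,
            Address = "http://myserver/NotificationHandler.asmx"
        };
        // Filter expressions narrow which events you receive
        int subscriptionId = eventService.SubscribeEvent(
            "CheckinEvent", "\"TeamProject\" = 'MyProject'", preference);
        Console.WriteLine("Subscribed with id {0}", subscriptionId);
    }
}
```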

Back in 2005/2008 we used this notification service to send out emails when a work item was assigned (instead of getting every developer to create a subscription themselves); in 2010 we migrated this to server events, and finally replaced it with a single out-of-the-box team subscription in TFS 2012.

These days I don't make use of any of these events myself (except, obviously, build quality changes). I can imagine subscribing to area/iteration changes to automatically create folders in SharePoint, much like the proposed TFS Automation Platform from Martin Hinshelwood; sadly that project never seemed to get off the ground.

TFS Server-side Event Handlers - .NET
Location: %Program Files%\Microsoft Team Foundation Server 11.0\Application Tier\Web Services\bin\Plugins

TFS 2010 introduced the ability to create server-side event subscribers by implementing the ISubscriber interface and dropping the resulting assembly into the plugin directory, giving access to the same events as the SOAP subscriptions.

These seemed a lot more reliable than the SOAP subscriptions, but they have to process quickly because they block TFS from finishing what it was doing. As mentioned earlier, we used this to send email alerts for assigned work items; if you ever did a bulk update (100 work items), TFS would lock up and not respond for up to a minute. The solution was either to spin up a background thread (which I believe had a chance of being killed) or to queue a custom TFS job.

Because TFS waits for you to process the event, you can use events as decision points to allow or deny certain actions from occurring. This means you can add extra custom validation to work items and check-ins; there are better ways to do both, but sometimes you need access to code to do more advanced checks.
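As a rough illustration (not code from any of my plugins), a decision point is just an ISubscriber whose ProcessEvent handles NotificationType.DecisionPoint and returns ActionDenied; the no-comment rule below is made up for the example:

```csharp
// Fragment from inside an ISubscriber.ProcessEvent implementation whose
// SubscribedTypes() includes typeof(CheckinNotification). Sketch only:
// rejects check-ins with no comment; statusMessage is shown to the user.
if (notificationType == NotificationType.DecisionPoint)
{
    var checkin = notificationEventArgs as CheckinNotification;
    if (checkin != null && string.IsNullOrWhiteSpace(checkin.Comment))
    {
        statusMessage = "Please supply a check-in comment.";
        return EventNotificationStatus.ActionDenied;
    }
}
return EventNotificationStatus.ActionPermitted;
```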

Apart from email notifications, I've seen a plugin that associates work items with check-ins based on the comments (and heard of someone doing the reverse: setting comments based on the associated work items).

Update: I created a few sample server-side event plugins that automatically create builds for new solution files and automatically create work item queries for new areas.

MTM/TestCase server notifications are also available in 2012.

Check In Policies - .NET
Location: Developer machines

Code that can ensure particular requirements are met before a check-in can occur. Examples:
  • Project Compiles
  • Tests pass
  • Comments added
  • WorkItem associated
  • Passes StyleCop checks
  • WorkItems associated are in particular areas/queries
  • Regex patterns on filenames/contents
New policies can be created by extending the PolicyBase class, and deploying them to your development machines.
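A minimal policy skeleton, sketched from the PolicyBase abstract members; the class name and messages are illustrative:

```csharp
using System;
using Microsoft.TeamFoundation.VersionControl.Client;

// Sketch: a check-in policy that requires a comment. Must be serializable so
// Visual Studio can persist the policy configuration with the team project.
[Serializable]
public class RequireCommentPolicy : PolicyBase
{
    public override string Description
    {
        get { return "Requires a comment on every check-in."; }
    }

    public override string Type
    {
        get { return "Require Comment"; }
    }

    public override string TypeDescription
    {
        get { return "Blocks check-ins that have no comment."; }
    }

    // This policy has no configuration UI
    public override bool Edit(IPolicyEditArgs policyEditArgs)
    {
        return true;
    }

    public override PolicyFailure[] Evaluate()
    {
        if (string.IsNullOrWhiteSpace(PendingCheckin.PendingChanges.Comment))
            return new[] { new PolicyFailure("Please provide a check-in comment.", this) };
        return new PolicyFailure[0];
    }
}
```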

These are annoying to deploy as they get installed on each developer's machine, not the TFS server. They became easier in VS2010 with the ability to install using a VSIX, and in Visual Studio 2012 it became possible to create private extension galleries to help deploy extensions within your enterprise. However, as far as I'm aware there is still no automated way to deploy one of these when your developers connect to a TFS server.

I try not to discourage my developers from checking in, so I keep the policies fairly light. We currently require a comment, but encourage other practices just by word of mouth. I think it would be possible to prompt the user on check-in to create a code review request in 2012, which may be a good way to introduce users to code review; however, it would likely get annoying fast and is probably still harder than just informing developers manually.

TFS Job Agent - .NET
Location: %ProgramFiles%\Microsoft Team Foundation Server 11.0\Application Tier\TFSJobAgent\plugins\

TFS runs a number of jobs in the background using its job agent and service (processing the cube, email subscriptions, AD sync). You can either write your own custom tasks by implementing the ITeamFoundationJobExtension interface, or fire off existing tasks whenever you like.

Custom jobs are a good choice when you need to do a lot of work in the background; the problem I mentioned earlier, with TFS hanging when trying to update 100+ work items due to the email plugin, could be solved by writing an event handler that queues a custom job.
Update: I created a sample server-side job plugin that automatically creates work item queries for new areas.

Warehouse/Cube Customization - .NET
Location: %ProgramFiles%\Microsoft Team Foundation Server 11.0\Application Tier\TFSJobAgent\plugins\

TFS builds its own data warehouse from scratch, likely because work item fields can be configured to push information through to the warehouse/cube as either a dimension or a measure. TFS supports the IWarehouseAdapter interface, which allows the various parts of TFS to add fields to the warehouse as it is being built. Developers can also take advantage of this interface and write their own custom data warehouse adapter.

The above example is for TFS 2008, and I have heard it does not work in 2010; unfortunately there seems to be no real documentation from Microsoft and very few people trying to build a custom adapter. I would suggest that a tool like ILSpy, and looking at the built-in adapters, would be the way to go.

In 2010 it looks like warehouse processing was merged into the job service, and any IWarehouseAdapters have been converted to extend the abstract class WarehouseAdapter. There's also a notion of WebhouseJobExtension, which are the ITeamFoundationJobExtension jobs that presumably kick off the schema and data updates in the adapters.

Microsoft.TeamFoundation.WorkItemTracking.Adapter.WorkItemTrackingWarehouseAdapter looks to be a good example of a warehouse adapter.

Deployment Services - REST, .NET
TFS 2012 Update 1 adds a new build definition and services that allow you to deploy websites to Azure. The code behind this actually looks a lot more powerful, and gives me the impression that they're going to open it up for others to start publishing not only deployment methods but other services that can interact with TFS using OAuth.

Using WCF Storm I discovered the following webservices.

http://tfs:8080/tfs/{TeamProjectCollection}/services/v1.0/ConnectedServicesService.asmx

  • CreateConnectedService
  • QueryConnectedServices
  • GetConnectedService

http://tfs:8080/tfs/{TeamProjectCollection}/Build/v4.0/BuildDeploymentService.asmx

  • CreateDeploymentEnvironment
  • GetDeploymentEnvironments

http://tfs:8080/tfs/{TeamProjectCollection}/services/v1.0/StrongBoxService.asmx

  • GetDrawerContents
  • GetString

Once you register a deployment service (e.g. Azure, or a custom-built one) you can add environments to it with a custom set of properties (any key/value pairs). You can then query all environments for a given service and pull properties out of its strong box drawer.

I haven't looked at the workflow activities at all to see if they would support calling a custom service.


Custom WorkItem Controls - .NET, Javascript
It's fairly well known that you can customize the fields on any work item, but you can also write your own custom controls to display on work items.

With the totally new Web Access in TFS 2012 there's a new way to write custom controls.


There's a CodePlex project that contains many work item tracking custom controls along with their source, and it looks like they may be updated for TFS 2012 (the Visual Studio integration side, anyway).

Web Access Extensibility - Javascript
Web Access allows you to upload new JavaScript-based plugins; apparently the team were not happy with the state of this at the time of release and are currently reworking it before they make many details public.

It also appears that the different sections of Web Access are built using the same plugin mechanism: two controllers (one for the view, one for JSON) and an AreaRegistration to register them. It may be possible to create new sections just by creating a new AreaRegistration and dropping it in the bin directory.


Update - below are some extensibility points I was either unaware of at the time of writing, or didn't think worth mentioning (Visual Studio extensions) as everyone already knows about them.

Test Controller Plugins
I've never seen anyone else use one of these, but they appear to exist and work well. I've written one to help filter the available test agents by their machine tags for the current test configuration.


Diagnostic Data Adapters
Custom Adapter – Part 1


MTM Extensions
Example Extension TestScribe for MTM

Visual Studio Extensions
Developing Visual Studio Extensions

Summary
There are lots of extensibility points in TFS, but a number of them seem half finished or undocumented. It could be that most of them are only designed for internal Microsoft use, but it wouldn't take much to make them very useful for other developers.

Wednesday, October 3, 2012

TFS Deployer - Marking builds as deployed in tfs 2012

TFS 2012 allows you to mark builds as deployed. This is meant to work hand in hand with the Azure deployment, but I figured it should work with whatever TFS Deployer does too.

Add-Type -AssemblyName 'Microsoft.TeamFoundation.Client, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'
Add-Type -AssemblyName 'Microsoft.TeamFoundation.Build.Client, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'
$tfs = New-Object -TypeName Microsoft.TeamFoundation.Client.TfsTeamProjectCollection -ArgumentList $TfsServer
try {
    $deployServiceType = [Microsoft.TeamFoundation.Build.Client.DeploymentService]
    $deployService = $tfs.GetService($deployServiceType)
    $deployService.CreateBuildDeployment($Uri, $Uri, $Quality)
}
catch [Microsoft.TeamFoundation.Build.Client.BuildServerException]
{
}
TFS always seems to throw an error afterwards which I haven't quite figured out yet, so I'm just wrapping the call in a try/catch. Even if it didn't throw this exception, I think it would throw another one if you ever tried to mark the same build as deployed twice.

Exception calling "CreateBuildDeployment" with "3" argument(s): "TF246021: An
error occurred while processing your request.
Technical information (for administrator):
SQL Server Error: 2601"
At E:\projects\BuildProcessTemplates\Deployment\Update-TfsBuildDetails.ps1:24
char:1
+ $deployService.CreateBuildDeployment($Uri, $Uri, $Quality)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : BuildServerException
This adds a new row in the "Deployed" tab under Builds in TFS 2012 Web Access (it looks like Visual Studio doesn't support this quite yet).

It has a redeploy button which just triggers a new build, so it won't actually set the correct build quality to be auto deployed by TFS Deployer.

Added a discussion to TFS Deployer's discussions tab http://tfsdeployer.codeplex.com/discussions/397983

Friday, September 28, 2012

TFS 2012 - set user images to be same as active directory

I posted a question on Stackoverflow hoping that someone has an idea of how to use the user image api in tfs 2012.

http://stackoverflow.com/questions/12633732/api-to-update-users-image-identity-extended-properties-not-saving

Update
Figured it out myself

Microsoft.TeamFoundation.Client.TeamFoundationServer teamFoundationServer = new Microsoft.TeamFoundation.Client.TeamFoundationServer("http://UrlToTFS");
FilteredIdentityService service = teamFoundationServer.GetService<FilteredIdentityService>();
IIdentityManagementService2 service2 = teamFoundationServer.GetService<IIdentityManagementService2>();
foreach (var identity in service.SearchForUsers(""))
{
    var user = UserPrincipal.FindByIdentity(new PrincipalContext(ContextType.Domain), identity.UniqueName);
    if (user == null) continue;
    var de = new System.DirectoryServices.DirectoryEntry("LDAP://" + user.DistinguishedName);
    var thumbNail = de.Properties["thumbnailPhoto"].Value as byte[];
    identity.SetProperty("Microsoft.TeamFoundation.Identity.Image.Data", thumbNail);
    identity.SetProperty("Microsoft.TeamFoundation.Identity.Image.Type", "image/bmp");
    identity.SetProperty("Microsoft.TeamFoundation.Identity.Image.Id", Guid.NewGuid().ToByteArray());
    identity.SetProperty("Microsoft.TeamFoundation.Identity.CandidateImage.Data", null);
    identity.SetProperty("Microsoft.TeamFoundation.Identity.CandidateImage.UploadDate", null);
    service2.UpdateExtendedProperties(identity);
}

Monday, September 24, 2012

TFS 2012 - Groups Reviewed

When the company I work at first started using TFS 2005, we decided to have only one Team Project due to the limit of ~250 Team Projects. Today our root source folder for our main Team Project has a few hundred folders (clients) in it, as do our areas and iteration trees. While it is a bit annoying to maintain, I am certainly thankful for that decision when it comes to upgrading versions of TFS.

So understandably, when I heard about the new Team concept in TFS 2012 I got rather excited. After playing with Teams for a little while in 2012, I find they are nice but don't help with management as much as I had hoped.

Things I think should be tweaked

  • Work Items - "Assigned To Me" should filter by current team
  • When creating an alert in a team, I should be able to filter work items by my team iteration or areas rather than selecting it manually
  • Alerts that can be reused between teams (Email all team members when a work item is created in an area their team is associated to)
  • Iteration/Area tree views in the admin section should default to filtered to the default area/iteration and what is below it
  • Source view should select the first team favourite
  • Builds should filter queued/completed by team favourites
  • Easier way to assign security to areas based on groups (eg if a group is associated to an area then allow the group a particular access level)
  • Allow personal favourites per team
  • At least some level of integration with reports
  • Way to change the path of the area that is automatically created for a team
Obviously the extended view is required to set up favourites in the first place, but this could be a toggle on all views. Nothing major, I just feel that it might be nice to restrict my default view a bit more when I'm meant to be looking at a team.

Maybe I'll look into the plugin framework a bit more and see what I can do.



TFS 2012 - Web Access Plugins

Update 1

Found a blog series on how to write custom work item controls (plugins) in JavaScript (which is quite old, oops). Custom controls don't seem all that exciting to me, but there may be enough information there to develop something else.
  1. http://blogs.msdn.com/b/serkani/archive/2012/06/22/work-item-custom-control-development-in-tf-web-access-2012-development.aspx
  2. http://blogs.msdn.com/b/serkani/archive/2012/06/22/work-item-custom-control-development-in-tf-web-access-2012-deployment.aspx
  3. http://blogs.msdn.com/b/serkani/archive/2012/07/12/adding-content-files-to-work-item-custom-control-extension-package.aspx

Update 2


Tiago Pascoal has released a Plugin that enhances the Task Board
http://pascoal.net/2012/09/team-foundation-task-board-enhancer-version-0-3-released

Update 3


Wrote my own very basic plugin to make the logo a clickable link
http://bzbetty.blogspot.co.nz/2012/11/very-simple-tfs-2012-web-access-plugin.html
Introduction
While using the new Web Access for TFS 2012 I noticed an interesting tab in the administration section named Extensions, but it appears Microsoft hasn't made the API public yet (1), so I thought I might have a quick dig to see what I could figure out. Because the information has not been made public by Microsoft, there's a good chance it will all change in upcoming TFS builds.

The Plugins
Plugins appear to be uploaded as a zip file containing an xml manifest file which defines the plugin's name, version, vendor and a few other properties. The author can also specify which other modules it loads after, suggesting it has some form of dependency management.

<WebAccess>
<plugin name="helloworld" vendor="betty" version="0.1" moreinfo="http://bzbetty.blogspot.com" />
</WebAccess>
All files included in the archive appear to be uploaded to a filestore somewhere, but the only required file that I could see was the manifest.xml. Unfortunately, whenever I tried making an extension I got the error message "TF400900: Invalid zip archive file : manifest.zip" upon upload; no resolution as yet.

Registered plugins are added to a common xml file and serialized to a database or other persistent storage. Neither my TFS_Configuration nor Team Collection databases appeared to contain this file; more than likely it is created when the first plugin is uploaded.

There are a couple of properties and an ActionFilter that make it look like JavaScript plays a big part in these plugins. The website itself does reference a file named builtin-plugins.js, which then references a bunch of other JavaScript files by name.

var _builtinPlugins = [{"namespace":"TFS.Diag.TracePoints","loadAfter":"TFS.Diag"},{"namespace":"TFS.TestManagement.Controls","loadAfter":"TFS.WorkItemTracking.Controls"},{"namespace":"TFS.TestManagement.Setup","loadAfter":"TFS.OM"},{"namespace":"TFS.VersionControl.Setup","loadAfter":"TFS.OM"},{"namespace":"TFS.Requirements.Setup","loadAfter":"TFS.OM"}];
The builtin JavaScript plugins align very nicely with the areas defined earlier, although it is not a 1-to-1 mapping.

I was curious to see a lack of NuGet; it would have made a great plugin repository, as seen in a number of projects now, and would have provided a few major benefits such as better dependency management and a distribution channel. Although given Microsoft's recent announcements of other app stores, maybe they plan to take the same route with TFS.

Another point of interest is that TFSPreview does not appear to have this tab, which would be a curious omission if they are just JavaScript files.

Portable Areas

TFS 2012 also seems to make heavy use of MVC areas; each area seems to be separated into its own assembly containing the relevant controllers, viewmodels and other related code. Some form of portable area code must have been developed, although I'm guessing nothing was taken from either MvcContrib or Orchard.

Interestingly, they don't appear to take advantage of the new Web API, nor do they use Razor for their view engine. They do typically contain a basic controller for serving views with little logic and an API controller to query via JSON.


While I haven't had the time to dig into this whole thing properly, it looks like it should be possible to create completely new tabs/areas using MVC in the new Web Access, or to augment the existing ones using JavaScript plugins. Considering how much of the UI appears to be created on the client side, this could be a very powerful little framework. I hope Microsoft starts releasing information about it soon; if not, I'm sure a blogger will.

References
(1) http://social.msdn.microsoft.com/Forums/en-US/TFSvnext/thread/7696732e-767a-43c7-81d2-c318aeff41ed/

Sunday, June 3, 2012

Using AutoMapper to copy Metadata from Entities to ViewModels


On my crusade to eliminate common mistakes causing bugs in my projects, I found that the metadata on my entities didn't always match my viewmodels. I wrote two fairly simple providers to copy the metadata from the entity to the ViewModel at runtime, leveraging my AutoMapper configuration so it supports fields that are renamed.


I use the approach below to automatically copy data annotations from my entities to my view model. This ensures that things like StringLength and Required values are always the same for entity/viewmodel.

It works using the Automapper configuration, so works if the properties are named differently on the viewmodel as long as AutoMapper is setup correctly.

You need to create a custom ModelValidatorProvider and custom ModelMetadataProvider to get this to work. My memory on why is a little foggy, but I believe it's so both server and client side validation work, as well as any other formatting you do based on the metadata (eg an asterisk next to required fields).

Note: I have simplified my code slightly as I added it below, so there may be a few small issues.

Metadata Provider

public class MetadataProvider : DataAnnotationsModelMetadataProvider
{
    private IConfigurationProvider _mapper;

    public MetadataProvider(IConfigurationProvider mapper)
    {
        _mapper = mapper;
    }

    protected override System.Web.Mvc.ModelMetadata CreateMetadata(IEnumerable<Attribute> attributes, Type containerType, Func<object> modelAccessor, Type modelType, string propertyName)
    {
        //Grab attributes from the entity columns and copy them to the view model
        var mappedAttributes = _mapper.GetMappedAttributes(containerType, propertyName, attributes);
        return base.CreateMetadata(mappedAttributes, containerType, modelAccessor, modelType, propertyName);
    }
}
Validator Provider
public class ValidatorProvider : DataAnnotationsModelValidatorProvider
{
    private IConfigurationProvider _mapper;

    public ValidatorProvider(IConfigurationProvider mapper)
    {
        _mapper = mapper;
    }

    protected override System.Collections.Generic.IEnumerable<ModelValidator> GetValidators(System.Web.Mvc.ModelMetadata metadata, ControllerContext context, IEnumerable<Attribute> attributes)
    {
        var mappedAttributes = _mapper.GetMappedAttributes(metadata.ContainerType, metadata.PropertyName, attributes);
        return base.GetValidators(metadata, context, mappedAttributes);
    }
}

Helper Method Referenced in above 2 classes

public static IEnumerable<Attribute> GetMappedAttributes(this IConfigurationProvider mapper, Type sourceType, string propertyName, IEnumerable<Attribute> existingAttributes)
{
    if (sourceType != null)
    {
        foreach (var typeMap in mapper.GetAllTypeMaps().Where(i => i.SourceType == sourceType))
        {
            foreach (var propertyMap in typeMap.GetPropertyMaps())
            {
                if (propertyMap.IsIgnored() || propertyMap.SourceMember == null)
                    continue;

                if (propertyMap.SourceMember.Name == propertyName)
                {
                    foreach (ValidationAttribute attribute in propertyMap.DestinationProperty.GetCustomAttributes(typeof(ValidationAttribute), true))
                    {
                        if (!existingAttributes.Any(i => i.GetType() == attribute.GetType()))
                            yield return attribute;
                    }
                }
            }
        }
    }

    if (existingAttributes != null)
    {
        foreach (var attribute in existingAttributes)
        {
            yield return attribute;
        }
    }
}
Other Notes
If you're using dependency injection, make sure your container isn't already replacing the built-in metadata provider or validator provider. In my case I was using the Ninject.MVC3 package, which bound one of them after creating the kernel, so I had to rebind it afterwards for my class to actually be used. I was getting exceptions about Required only being allowed to be added once; it took most of a day to track down.


My StackOverflow Post: http://stackoverflow.com/questions/9989785/technique-for-carrying-metadata-to-view-models-with-automapper/10100042#10100042

Dependency Injection for RouteConstraints


Often I've wanted to check whether something exists in the database in a route constraint. Unfortunately MVC doesn't support dependency injection for RouteConstraints out of the box, and because they are alive for the entire life of the application it isn't that easy to add. If the dependency is request scoped, for example, it will work for the first request and then fail for every request after that.

You can get around this by using a very simple DI wrapper for your route constraints.

public class InjectedRouteConstraint<T> : IRouteConstraint where T : IRouteConstraint
{
    private IDependencyResolver _dependencyResolver { get; set; }

    public InjectedRouteConstraint(IDependencyResolver dependencyResolver)
    {
        _dependencyResolver = dependencyResolver;
    }

    public bool Match(HttpContextBase httpContext, Route route, string parameterName, RouteValueDictionary values, RouteDirection routeDirection)
    {
        return _dependencyResolver.GetService<T>().Match(httpContext, route, parameterName, values, routeDirection);
    }
}
Then create your routes like this:
var _dependencyResolver = DependencyResolver.Current; //Get this from a private variable that you can override when unit testing
routes.MapRoute(
    "Countries",
    "countries/{country}",
    new {
        controller = "Countries",
        action = "Index"
    },
    new {
        country = new InjectedRouteConstraint<CountryRouteConstraint>(_dependencyResolver)
    }
);

 

My StackOverflow Post

Pagination with AutoMapper

I recently posted a method of using AutoMapper in an ActionFilter (http://bzbetty.blogspot.co.nz/2012/06/thoughts-on-actionfilters.html). This became an issue when I was trying to return a paginated model to the view, as I wanted to do pagination in the database (on the entities) but AutoMapper had no idea how to map the IPagination result to the ViewModel.

Result - Build a custom IObjectMapper to do the mapping


Resulting Action

[AutoMap(typeof(EventListViewModel))]
public ActionResult Index(int? page)
{
    page = page ?? 1;
    var entities = //load data from database
        .AsPagination(page.Value, 20); //pagination method from MVC Contrib
    return View(entities);
}
IObjectMapper
public class PaginationMapper : IObjectMapper
{
    private IMappingEngineRunner _mapper;

    public T Map<T>(object source)
    {
        TypeMap typeMap = _mapper.ConfigurationProvider.FindTypeMapFor(source, source.GetType(), typeof(T));
        MappingOperationOptions mappingOperationOptions = new MappingOperationOptions();
        ResolutionContext resolutionContext = new ResolutionContext(typeMap, source, source.GetType(), typeof(T), mappingOperationOptions);
        return (T)_mapper.Map(resolutionContext);
    }

    public PagedList<T> CreatePagedList<T>(IPagination source)
    {
        var result = Activator.CreateInstance<PagedList<T>>();
        result.TotalItems = source.TotalItems;
        result.TotalPages = source.TotalPages;
        result.HasNextPage = source.HasNextPage;
        result.HasPreviousPage = source.HasPreviousPage;
        result.PageNumber = source.PageNumber;
        result.PageSize = source.PageSize;
        result.FirstItem = source.FirstItem;
        result.LastItem = source.LastItem;
        foreach (var item in source)
        {
            result.Add(Map<T>(item));
        }
        return result;
    }

    public object Map(ResolutionContext context, IMappingEngineRunner mapper)
    {
        _mapper = mapper;
        Type destinationType = context.DestinationType.GetGenericArguments()[0];
        var method = typeof(PaginationMapper).GetMethod("CreatePagedList").MakeGenericMethod(destinationType);
        return method.Invoke(this, new[] { context.SourceValue });
    }

    public bool IsMatch(ResolutionContext context)
    {
        return typeof(IPagination).IsAssignableFrom(context.SourceType) && typeof(IPagination).IsAssignableFrom(context.DestinationType);
    }
}

Updating collections using AutoMapper


By default AutoMapper replaces child lists with a completely new instance containing only the items in the source list. Because of the way EF works, you need to change the existing items in the list for it to track changes as updates instead of adds. This method also means deletes can be tracked successfully.
Basic steps were:
  1. Ensure the destination collection is loaded from the db and attached to the object graph for change tracking: .ForMember(dest => dest.Categories, opt => opt.UseDestinationValue())
  2. Create a custom IObjectMapper for mapping IList<> to IList<T> where T : Entity.
The custom IObjectMapper used some code from http://groups.google.com/group/automapper-users/browse_thread/thread/8c7896fbc3f72514
foreach (var child in source.ChildCollection)
{
    var targetChild = target.ChildCollection.SingleOrDefault(c => c.Equals(child)); //override Equals or replace the comparison with an Id comparison
    if (targetChild == null)
        target.ChildCollection.Add(Mapper.Map<SourceChildType, TargetChildType>(child));
    else
        Mapper.Map(child, targetChild);
}
Finally, one last piece of logic checks that all Ids in targetCollection still exist in sourceCollection, and deletes the ones that don't.
It wasn't all that much code in the end and is reusable in other actions.
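That final delete step can be sketched with plain LINQ; the `Child` type and `Id` comparison below are illustrative stand-ins for the real EF entities, not the exact code from the project:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal stand-in child entity; the real code compares on the EF entity's key.
public class Child
{
    public int Id;
}

public static class CollectionSync
{
    // Remove any target children whose Id no longer appears in the source
    // collection; EF change tracking then records these removals as deletes.
    public static void RemoveMissing(ICollection<Child> target, ICollection<Child> source)
    {
        var missing = target.Where(t => !source.Any(s => s.Id == t.Id)).ToList();
        foreach (var child in missing)
            target.Remove(child);
    }
}
```

Materialising the `missing` list before removing avoids mutating the collection while it is being enumerated.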

My Stackoverflow Post: http://stackoverflow.com/questions/9739568/when-using-dtos-automapper-nhibernate-reflecting-changes-in-child-collections/9856360#9856360
Recently found a library that does just this - https://github.com/TylerCarlson1/Automapper.Collection 

Thoughts on ModelBinders


Validation
Source: http://www.markeverard.com/blog/2011/07/18/creating-a-custom-modelbinder-allowing-validation-of-injected-composite-models/

Custom ModelBinders don’t seem to run validation on the newly created model automatically, so while you may add validation attributes to your model, you need some extra code to force validation to occur.

ModelMetadata modelMetadata = ModelMetadataProviders.Current.GetMetadataForType(() => model, model.GetType());
ModelValidator compositeValidator = ModelValidator.GetModelValidator(modelMetadata, controllerContext);
foreach (ModelValidationResult result in compositeValidator.Validate(null))
bindingContext.ModelState.AddModelError(ValidationPropertyName, result.Message);
There may be a class you can override to get this automatically but I haven’t found it yet.

Entities 
Source: http://lostechies.com/jimmybogard/2011/07/07/intelligent-model-binding-with-model-binder-providers/

An interesting technique I’ve seen is using modelbinding to automatically load an entity based on the id passed in, so your actions ask for an entity instead of an id.

public ActionResult View(User entity) {
    UserEditModel model = //change entity into an edit model
    return View(model);
}
Doesn’t seem like much, but it should remove one line from most actions (loading the entity from the database). However, it requires you to never use your entities as edit models (which you shouldn’t be doing anyway). It also doesn’t allow extra filtering/including of data, as far as I can tell.

This idea could be extended to work on list pages by creating a filter criteria from all route values (page, sort, search)

Dependency Injecting Model Binders
Source: http://iridescence.no/post/Constructor-Injection-for-ASPNET-MVC-Model-Binders.aspx

Out of the box you can’t use dependency injection with ModelBinders; you can, however, pass objects in when you create them at registration time. If your ModelBinders only take singletons then that approach works; otherwise, create a fairly basic modelbinder that takes your kernel as a parameter and, when its binding method is called, uses the kernel to create your real model binder complete with full dependency injection.
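A rough sketch of that wrapper, with the MVC and container types replaced by minimal stand-in interfaces so the idea is self-contained (in the real version the wrapper would implement System.Web.Mvc.IModelBinder and take your Ninject kernel; all names here are illustrative):

```csharp
using System;

// Stand-ins: IResolver plays the part of the Ninject kernel / DependencyResolver,
// and IBinder plays the part of System.Web.Mvc.IModelBinder.
public interface IResolver { object Resolve(Type type); }
public interface IBinder { object Bind(string rawValue); }

// The wrapper registered with MVC: it only holds the container (safe to keep
// for the application's lifetime) and resolves the real, fully-injected binder
// every time binding happens, so request-scoped dependencies behave correctly.
public class InjectedModelBinder<T> : IBinder where T : IBinder
{
    private readonly IResolver _resolver;
    public InjectedModelBinder(IResolver resolver) { _resolver = resolver; }

    public object Bind(string rawValue)
    {
        // Resolved per call, not at registration time.
        return ((T)_resolver.Resolve(typeof(T))).Bind(rawValue);
    }
}

// Trivial resolver and binder used to demonstrate the wrapper.
public class ActivatorResolver : IResolver
{
    public object Resolve(Type type) { return Activator.CreateInstance(type); }
}

public class TrimBinder : IBinder
{
    public object Bind(string rawValue) { return rawValue.Trim(); }
}
```

This is the same pattern as the InjectedRouteConstraint shown earlier, applied to binders.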

DateTimes
For a recent project I didn’t want to use a JavaScript datetime picker, as it was meant to be used on a horrible Android tablet, but I still wanted something a bit friendlier than a textbox. I chose to split dates into day/month/year fields, which would be a lot of effort to do for every date manually.

Instead I made an Editor Template for DateTimes which outputs 3 fields, and a custom ModelBinder that when binding dates looks for 3 fields instead of just 1.
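The core of such a binder is just recombining the three posted values. A minimal sketch, assuming the editor template posts fields with `.Day`/`.Month`/`.Year` suffixes (an assumption for illustration; the real binder would read them through bindingContext.ValueProvider):

```csharp
using System;
using System.Collections.Generic;

public static class SplitDateBinder
{
    // Rebuild a DateTime from the three fields the editor template renders.
    // Returns null when any piece is missing or non-numeric, so normal
    // validation can report the error.
    public static DateTime? Combine(IDictionary<string, string> form, string prefix)
    {
        string d, m, y;
        if (form.TryGetValue(prefix + ".Day", out d)
            && form.TryGetValue(prefix + ".Month", out m)
            && form.TryGetValue(prefix + ".Year", out y)
            && int.TryParse(d, out int day)
            && int.TryParse(m, out int month)
            && int.TryParse(y, out int year))
        {
            // Note: new DateTime still throws for impossible dates (eg 31/02);
            // real binder code would catch that and add a model state error.
            return new DateTime(year, month, day);
        }
        return null;
    }
}
```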

Thoughts on ActionFilters


UnitOfWork 
Source: Ayende/Rob Conery http://wekeroad.com/2011/03/21/using-entityframework-with-aspnet-mvc-3/

I have a very basic UnitOfWork ActionFilter which runs after every action; when no exceptions are thrown it calls SaveChanges on my DbContext (which is set to one per request in Ninject). Mostly because we occasionally forgot to call this method.
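The rule the filter enforces is tiny. Here is a hedged sketch with the MVC and EF types replaced by stand-ins so it is self-contained (the real filter derives from ActionFilterAttribute and receives the request-scoped DbContext; names are illustrative):

```csharp
using System;

// Stand-in for the request-scoped DbContext the real filter gets injected.
public interface IUnitOfWork { void SaveChanges(); }

// Stand-in for the parts of MVC's ActionExecutedContext the filter uses.
public class ActionContext { public Exception Exception; }

// Runs after every action: commit only when the action threw nothing, so a
// failed action never persists half-finished changes.
public class UnitOfWorkFilter
{
    private readonly IUnitOfWork _unitOfWork;
    public UnitOfWorkFilter(IUnitOfWork unitOfWork) { _unitOfWork = unitOfWork; }

    public void OnActionExecuted(ActionContext context)
    {
        if (context.Exception == null)
            _unitOfWork.SaveChanges();
    }
}

// Tiny recording implementation used to demonstrate the behaviour.
public class RecordingUnitOfWork : IUnitOfWork
{
    public int Saves;
    public void SaveChanges() { Saves++; }
}
```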

AutoMapperActionFilter 
Source: http://lostechies.com/jimmybogard/2009/06/30/how-we-do-mvc-view-models/

Because I use AutoMapper to create my ViewModels from my entities, I ended up with a bunch of code in my actions that basically looked like:

public ActionResult View(int id)
{
    var entity = LoadFromDatabase(id);
    var viewModel = Mapper.Map<ViewModel>(entity);
    return View(viewModel);
}
In my crusade to remove repetitive code in my actions, I created an extra ActionFilter that can perform the mapping after the action has finished executing.

[AutoMap(typeof(ViewModel))]
public ActionResult View(int id)
{
    var entity = LoadFromDatabase(id);
    return View(entity);
}
The main benefit of this turned out not to be less code, as it’s still the same number of lines. It was that my tests no longer needed to set up AutoMapper in order to test an action (except for the create/update actions).

Jimmy has actually moved on from using an ActionFilter and started using an ActionResult that decorates another ActionResult and performs the mapping.  I don’t use that approach as I find it hard/impossible to add code between the mapping and the final view being displayed, which I needed to do to populate dropdowns (see ViewModelEnricher).  Jimmy’s new approach looks like the following:

public ActionResult View(int id)
{
    var entity = LoadFromDatabase(id);
    return AutoMap<ViewResult>(View(entity));
}
ViewModelEnricher 
Source: http://ben.onfabrik.com/posts/better-form-handling-in-aspnet-mvc

Another piece of repetitive code was loading the contents of dropdowns and adding them to the ViewModels. I created yet another ActionFilter (run after the AutoMap one) which looks at all the properties on a ViewModel for an attribute telling it where to load the dropdown contents from.

This was added globally to all actions and meant I could no longer accidentally forget to add the code to set the dropdowns on Validation failure of an update/create action (most common place to forget to do it).

I’m still not 100% convinced this is the best approach; I’ve been playing with the concept that dropdowns are dependencies of the ViewModel and should be created by the DI container.  This would either require the AutoMap ActionFilter and ModelBinder to create all ViewModels using the DependencyResolver or a small modification to the ViewModelEnricher ActionFilter to use property injection into an existing object on the way through.

ValidationActionFilter
Source: http://trycatchfail.com/blog/post/Cleaning-up-POSTs-in-ASPNET-MVC-the-Fail-Tracker-Way.aspx

It’s fairly common to check ModelState.IsValid on all postbacks and re-display the view if validation fails. I added another global action filter which performs this check on any POST request. Again, this is so it isn’t accidentally forgotten and to reduce the amount of repetitive code in actions.
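Sketched with a stand-in context so it compiles on its own (the real filter derives from ActionFilterAttribute and sets filterContext.Result to a ViewResult for the posted model; names are illustrative), the check amounts to:

```csharp
using System;

// Stand-in for the pieces of MVC's ActionExecutingContext the filter touches.
public class PostContext
{
    public string HttpMethod;
    public bool ModelStateIsValid;
    public object Result; // setting this short-circuits the action, as in MVC
}

// Global filter: on any POST whose model failed validation, redisplay the
// view instead of running the action at all.
public class ValidationFilter
{
    public void OnActionExecuting(PostContext context)
    {
        if (context.HttpMethod == "POST" && !context.ModelStateIsValid)
            context.Result = "redisplay view"; // real code: a ViewResult for the posted model
    }
}
```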

Thoughts on ActionResults


ActionResults are a good way to make your controllers more testable. For example, HttpContext.Current is not available in unit tests, so a null reference exception occurs when testing code that calls HttpContext.Redirect. However, because the Redirect ActionResult is returned and not executed by the controller, unit tests can check the result returned and ensure that it is a redirect without needing to mock an entire HttpContext.

Note: While MVC does wrap HttpContext which allows you to mock it while testing, it’s a lot easier if you don’t have to deal with it at all.

LoginAction
A similar approach can be used for login/logout. You could abstract the FormsAuth call into a custom service, but again you would need to pass a mock in your tests or risk another null reference; using an ActionResult instead makes testing much easier.

public ActionResult Login(string username, string password) {
    if (username != "user" || password != "123") {
        ModelState.AddModelError("", "username or password incorrect");
        return View();
    }
    return new LoginResult(username); //I typically pass the actual user into the login result
}
[TestMethod]
public void Login_Success() {
    AuthenticationController controller = new AuthenticationController();
    ActionResult result = controller.Login("user", "123");
    Assert.IsTrue(result.GetType() == typeof(LoginResult));
    Assert.AreEqual("user", ((LoginResult)result).Username);
}

File Result
They are also a great method of code reuse. You would never imagine putting the code that reads a view and renders it in every action, so why add the code that outputs a pdf/ical/rss/xls/graph/report inline in the controller? Simply return a Result that knows how to render the pdf and pass the data to it.

public ActionResult Graph() {
    object data = null; //todo load some data
    return new PdfResult(data);
}
Status Results 
Source: https://github.com/MattHoneycutt/Fail-Tracker/blob/master/FailTracker.Web/ActionResults/StatusResult.cs
You can use the decorator pattern to extend what a view does. I like adding a status message at the top of a page after an action has taken place, eg "User created successfully". Instead of adding extra properties to every ViewModel to include a status message, I just put the message into TempData (not ViewBag, as that won't live through a redirect) and consume it in the view.

return new StatusResult(View(viewModel), "user created successfully");

I then use an extension method to make it a bit nicer to use

return View(viewModel).WithSuccessMessage("user created successfully");
return RedirectToAction("Index").WithSuccessMessage("user created successfully");

Unit tests can then check that a StatusResult is returned, containing another ActionResult which performs the redirect.