Category: TFS

Configure MTM 2013 to run automated tests

The Scenario

I have an MTM 2013 installation that is configured in the following way:

image

This is the workflow that is triggered when a developer checks something in:

  1. The code is built by TFS 2013, using a TFS build agent
  2. The agent updates a NuGet package containing the deployed application
  3. Octopus releases the package to our Staging environment
  4. MTM executes remote tests after the build is complete

Configuring MTM 2013

In order to have a successful and pleasant experience with MTM 2013, we need to pre-configure the test environment(s) in the proper way. If you don’t configure the test machines, the environments and/or the test cases properly, you will end up with a lot of troubleshooting activities in your backlog … MTM is quite intricate.

I am writing this article in April 2014, and MTM came out a while ago, so after you install it you may find some values missing from the operating system or browser lists. So, first of all, let’s update these value lists.

Open MTM and choose Testing Center > Test Configuration Manager > Manage configuration variables. In my case I extended the values in the following way:

image

You can also go directly to the source and change the XML entries. In order to locate the correct file to change, I suggest you visit this useful MSDN page:
http://msdn.microsoft.com/en-us/library/ms243856.aspx

Now that I have my value lists updated I can start with the configuration process. I have highlighted below the steps you should follow in order to have a proper MTM configuration.

  1. Define the Environment
    http://msdn.microsoft.com/en-us/library/ee943321(v=vs.110).aspx
  2. Define the Test Configurations
    http://msdn.microsoft.com/en-us/library/dd286643.aspx
  3. Create or Import the Test Cases
    http://msdn.microsoft.com/en-us/library/dd380741.aspx
  4. Create a Test Plan for your backlog
    http://msdn.microsoft.com/en-us/library/dd380763.aspx
  5. Execute a Test Automation and Configure it
    http://msdn.microsoft.com/en-us/library/ee257067(v=vs.100).aspx
  6. Trigger automated tests after a build completes

Let’s have a look at each of these steps, or you can follow the MSDN link I have attached to each one of them.

#01 – Define your Environment

First of all you need to install an MTM Controller. Usually I install it on the same machine as my main TFS 2013 instance (not on the build servers …). After I have installed the Controller I can start registering my agents.

For the controller and agent installation and configuration, follow this link:
http://msdn.microsoft.com/en-us/library/hh546459.aspx

Note: if you don’t have any agents registered with your Controller, you will not be able to configure the environments. In my case I try to keep the machine classification identical across the build, deployment and test tools. So, in my case, I have the following structure:

Staging > Production > Cloud

And this is the expected result in my MTM configuration.

image

After you install a new agent, remember to refresh the dashboard. Also, if you are having trouble registering the agent, try rebooting the Controller and agent machines; sometimes that helped me move forward with the registration.

And this is my environment overview dashboard:

image

One final note here: if you choose an “external” virtualization mechanism and work without SCVMM (i.e. if you are using VMware), you will not have access to some functionality, such as rebooting, cloning and managing environments, because those operations are handled by SCVMM.

#02 – Create some configurations

Configurations are used by MTM to define different test environment scenarios. Let’s assume you are testing a WPF client application with MTM; you probably want to know how it runs on multiple operating systems. For this and many other reasons, you can create multiple configurations inside MTM to test your application across environments, operating systems, browsers and/or SQL Server instances.

The picture below shows some of the configurations I use while testing a WPF client application. I use different operating systems, different languages and different browsers to download the ClickOnce application. It should work exactly the same across all these configurations.

image

When I am done with this part, and before assigning test plans and machines to configurations, I need to complete the setup of my test harness.

#03 – Create or import the Test Cases

After you are done with the configuration of MTM, it’s time to prepare our backlog so that we can manage test execution. MTM requires that your tests are identified by a Test Case work item. In order to do that you have two options:

  • Manually create your test cases and associate each one with an automation if you need to automate it, or create a manual test and register it in your TFS backlog
  • Import your automation from an MsTest class library, using the tcm command: 
    tcm testcase 
      /collection:CollectionUrl 
      /teamproject:MyProject 
      /import 
      /storage:MyAssembly.dll 
      /category:"MyIntegrationTestCategory"

At the end you will have your test cases created automatically for you, as the following screenshot shows:

image

Now open MTM and go to Testing Center > Track > Queries, and you can start to search for your test cases. In this phase you’ll notice how important it is to keep a good, consistent naming convention for your tests and to work with categories:

image

Why? Because with a proper naming convention you can create a query and group your work items much more easily.

#04 – Create a Test Plan

There are multiple ways of creating a test plan. You can create a test plan manually and then add test cases one by one. This is quite useful if you are working on a new project and, sprint by sprint, you simply add the test cases as soon as you create them.

Another option, which I personally love, is to create a test suite composed of multiple test cases and generated by a query. Why is this so useful? Well, first of all you don’t have to touch the suite anymore, because every time you add a new test case it is automatically included. Second, it will force you and your team to use a proper test naming convention.

In my case, I know the Area of my tests, but I want to test only the PostSharp aspects, nothing else, so I can write a query like the following:

image

and associate the generated suite query with a parent one, as I did in my projects. After a while you will end up with a series of test suites (test harnesses) grouped by a certain logic. For example, you can have test suites generated by a DSL expression or by a test requirement created by a PO or a QA:

image

#05 – Run your Automation

Before running the automation you need to inform MTM about a few things. If you think about it for a moment, when you execute local tests you usually have a test settings file, which is used to inform MsTest about the assemblies that need to be loaded, the plugins and other test requirements.

Inside MTM, you can inspect the settings by opening the test plan properties window.

Within this window you can choose settings for a local run but also for a remote run. In my case, when I run a remote test I need to be sure that a specific file is deployed, so this is what I have done in my configuration:

image

And when I manually trigger a test, I just ensure that the right configuration is picked up, like here:

image

and that’s it. Now you know how to prepare MTM for automation, how to configure it, and how to group and manage test suites. With this configuration in place, you should be able to trigger tests automatically after a build completes.

The last piece of the puzzle is “how do I trigger those tests after my build is complete?”, and that brings us to the final part of this tutorial.

#06 – Trigger automated tests after a build completes

With TFS 2013 we got a new workflow template called the LabDefault template. In order to use it, you have to create a new build definition and select this template.

After you have set up the new build you can go to the Process tab and specify how you want to execute your automated tests.

For example, you can choose which environment will be used for your test harness:

image

Which build output will be used for the tests; you can either trigger a new build, get the assemblies from the latest successful build, or even trigger a customized workflow on the fly:

image

And which test plan you want to execute, where and how:

image

Conclusion

I hope you find this post useful, because configuring MTM took me a while, and I truly struggled to find a decent but short post highlighting the steps needed to get MTM working properly.

TFS 2013: Create a local build

With TFS we can have two different types of build, local or remote. A remote build is triggered on a controller that doesn’t reside on your local PC. A local build is triggered on your local dev agent, and it can also be “hidden” from the main build queue.

The scenario

My scenario is the following:

I have to commit a code change, and I want to test the CI build locally before checking in my changes and committing the code to the main repository. I don’t want to work with shelvesets, because I just don’t want to keep the main build controller busy.

image

By default, for every build you queue (local or remote), the build agent will just create a new workspace and download the required files that need to be built.

So on my local PC I will end up with the following situation:

image

This is really inconvenient, because it will just replicate my workspace for each build agent I am running locally, and it won’t include the changes I haven’t committed to the repository.

So, first of all, we want to instruct TFS to use a different strategy when running a local build than when running a remote build.

Second, we want to instruct the build agent to execute the build within the workspace directory, without creating a new workspace and without downloading the latest files from source control, because our local workspace is the source.

How does TFS get the latest sources?

In order to understand my solution, we need to have a look at how TFS builds the workspace and which activities in the workflow are in charge of that. If you open the default build workflow (please refer here if you don’t know what I am talking about) you will find that it starts with the following activities:

image

Initialize environment

This activity sets up the initial values for the target folder, the bin folder and the test output folder. You want to get rid of this activity because it would override your workspace.

Get sources

This activity creates a new workspace locally and downloads the latest code. You can pass a name for the workspace, but unfortunately TFS will always drop the existing one and re-create it, so this activity should also be removed from your local build definition.

Convert the remote path to a local path

At this point we need to inform TFS about the project location. Because we didn’t generate a workspace, when we ask TFS to build $/MyProject/MyFile.cs it will bomb, saying that it doesn’t know how to translate a server path into a local path. Actually, the real error is a bit misleading, because it just says “I can’t find the file …

This error can easily be fixed by converting the projects to build into local paths using the following TFS activities:

image

First I ask TFS to get an instance of my workspace, which is the same one I am using within Visual Studio. Then, for each project/solution configured in my build definition, I update the path. The workspace name is a build parameter in my workflow …
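
If you want to double-check this translation outside the workflow, the TFS client API exposes the same logic the path-conversion activity relies on. Here is a minimal PowerShell sketch; the collection URL and the workspace name are placeholders you would replace with your own values:

# Translate a server path into a local path via the TFS 2013 client API.
# Assumes Team Explorer 2013 is installed (client assemblies version 12.0).
Add-Type -AssemblyName "Microsoft.TeamFoundation.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
Add-Type -AssemblyName "Microsoft.TeamFoundation.VersionControl.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"

$collection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection("http://tfs:8080/tfs/DefaultCollection")
$vcs = $collection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

# The same workspace name I use inside Visual Studio, passed as a build parameter
$workspace = $vcs.GetWorkspace("MyWorkspace", $env:USERNAME)
$workspace.GetLocalItemForServerItem('$/MyProject/MyFile.cs')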

Last piece: we still need to build against a workspace, but the existing one, so in order to accomplish this kind of build we need to change the build path of the local agent in the following way:

image

Now, when you ask the workflow to convert server paths to local paths using your workspace name, it will return a path pointing to the local workspace, which is the same path configured in your build agents.

Note: multiple agents can run on the same workspace path in parallel, which means parallel builds.

Create new Octopus Release from TFS Build

In this article we will have a look at how we can automate Octopus deployments using the TFS build server. Every time a member of the team performs a check-in, I want to execute a continuous build with the following workflow:

image

The first step is to change the default build workflow in TFS. Usually I clone the default build workflow and work with the new copy, because if something goes wrong I can easily roll back to the default build.

First of all we need to create a new version of our build workflow, so I clone my CI build and its workflow:

#01 – Clone the CI build
image

#02 – Clone the Workflow

In order to clone the workflow you just have to press the NEW button and locate the original workflow, or DOWNLOAD an existing one into your workspace:
image

Now you need to locate a specific section of the workflow. We want to create a new release of our app only if everything went fine in the build, but before the gated check-in is committed, because if we can’t publish to Octopus the build still has to fail.

image

In my case I want to obtain the following output from my build in case of success or failure; plus, I don’t want to publish a release if something went wrong in the build:

#01 – Build log
image

#02 – Build summary
image

I also want to output a basic log so that I can debug my build just by reading the log.

Now the fun part: I need to execute the Octo.exe command from TFS in order to publish my projects. I need a few pieces of information, which I provide to my build workflow as parameters:

image

Finally, I have to create a new task in my workflow that will execute the command. How?

image

The trick is inside the InvokeProcess activity. In this activity I simply call Octo.exe and use the Octopus API to publish my project to the Staging environment. This is the environment where I will run my automated tests.

I configured the activity in the following way:

image

You can find more information on how to call the Octopus API using Octo.exe here:
https://github.com/OctopusDeploy/Octopus-Tools/blob/master/readme.md
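
For reference, the command executed by InvokeProcess looks roughly like the sketch below. The project name, server URL and API key are placeholders, and option names can vary between Octo.exe versions, so double-check the readme above against your version:

# Create a release and deploy it to Staging (placeholder values throughout)
& .\Octo.exe create-release `
    --project "MyWebApp" `
    --server http://octopus.local/ `
    --apiKey API-XXXXXXXXXXXXXXXX `
    --deployto Staging `
    --waitfordeployment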

Hope this helps.

Deploy Database Project using Octopus

Octopus is a deployment tool that uses the NuGet packaging mechanism to pack your application and deploy it across multiple environments.

Unfortunately, it does not (yet) have native support for Visual Studio database projects, so I had to introduce a sort of workaround in my project structure to allow Octopus to also deploy a “database project NuGet package”.

Visual Studio .dacpac

A Visual Studio database project is capable of generating diff scripts, a full schema deployment script and also a post-deployment script (in case you need to populate the database with some demo data, for example). When you compile a database project, this is the outcome:

image

As you can see, we have two different .dacpac files: one for the master database and one for my own database. A .dacpac file is what is called a “data-tier application”, and it’s used within SQL Server to deploy a database schema.

Another interesting thing is the schema structure: in every database project you will also have a second output folder with the following structure:

image

And in the obj folder we have an additional output:

image

which contains a Model.xml file. This file can be used to integrate Entity Framework with our database schema. The postdeploy.sql file is a custom script that we generate and execute after the database deployment.

Package everything with NuGet and OctoPack

So, what do we need in order to have a proper NuGet package of our database schema? Well, first of all, let’s see what we should include in our package. Usually I create a package with the following structure:

image

The steps to obtain this structure are the following:

1 – Modify the database project file to run OctoPack, adding these imports just before the closing </Project> tag:

  <Import 
        Project="$(SolutionDir)\.nuget\NuGet.targets" 
        Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />
  <Import 
        Project="$(SolutionDir)\.octopack\OctoPack.targets" />
</Project>

2 – Provide a .nuspec file with the following structure:

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <!-- Your file specifications -->
  </metadata>
  <files>
    <!-- The Database Schema -->
    <file src="\dbo\**\*.sql" 
            target="Content\Schema"/>
    <!-- The deployment script -->
    <file src="\obj\**\*.sql" 
            target="Content\Deploy" />
    <file src="\obj\**\*.xml" 
            target="Content\Deploy" />
    <!-- Your .dacpac location -->
    <file src="..\..\..\..\..\..\bin\**\*.dacpac" 
            target="Content\Deploy" />
  </files>
</package>

And, of course, make sure your build server runs MSBuild with the RunOctoPack variable enabled.
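
On a TFS build definition, a typical way to do this is to pass the property through the MSBuild arguments of the build process, which boils down to an invocation like this (the solution name is a placeholder):

msbuild MySolution.sln /t:Build /p:RunOctoPack=true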

Install the package using PowerShell

The final step is to make the package “digestible” by Octopus using PowerShell. In our specific case we need a PowerShell script that can deploy the .dacpac package and execute the post-deployment script. That’s quite easy.

In order to install a .dacpac with PowerShell we can use these commands:

# load the DacFx assembly
add-type -path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# create a DacServices object, which needs a connection string 
$d = new-object Microsoft.SqlServer.Dac.DacServices "server=(local)"

# load the dacpac from file and deploy it to the target database
# ($true upgrades the database if it already exists)
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($DacPacFile) 
$d.Deploy($dp, $DatabaseName, $true)

In my case, I set some variables in Octopus so that I can dynamically create the database and locate the .dacpac file.

image
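
To give an idea of how the script picks those values up, here is a minimal sketch. It reads the variables from the Octopus variable dictionary and then runs the post-deployment script; the variable names (DacPacFile, DatabaseName, PostDeployFile) are my own choice for this example, so use whatever names you defined in Octopus:

# Read the deployment settings from the Octopus variable dictionary
$DacPacFile     = $OctopusParameters["DacPacFile"]
$DatabaseName   = $OctopusParameters["DatabaseName"]
$PostDeployFile = $OctopusParameters["PostDeployFile"]

# After DacServices.Deploy(...) has run, execute the post-deployment script
# (Invoke-Sqlcmd ships with the SQL Server PowerShell module)
Invoke-Sqlcmd -ServerInstance "(local)" -Database $DatabaseName -InputFile $PostDeployFile | Write-Host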

The final result is available through the Octopus deployment console, because I always pipe my PowerShell commands to Write-Host at the end:

image

Final note: remember that the only way to stop a deployment step in Octopus using PowerShell is to return –1. In my case I wrap the code in a try/catch and return –1 when I want to stop the deployment, but you can find a better explanation here.
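
The wrapper looks roughly like this; any exception stops the step by returning –1:

try {
    # deployment work goes here (dacpac deploy, post-deployment script, ...)
}
catch {
    # surface the error in the Octopus console, then fail the step
    Write-Host $_.Exception.Message
    exit -1
}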

Sharing assembly version in Visual Studio 2010

Last week I came across a fancy requirement that forced me to struggle a little bit to find an appropriate solution. Let’s say that we have a massive solution file, containing something like 100ish projects, and we would like to keep the same assembly version number for all of these projects.

In this article I will show you how the assembly version number works in .NET and what the possible solutions are, using Visual Studio.

Assembly version in .NET

As soon as you add a new project (of any type) in Visual Studio 2010, you get a default template that also contains an “AssemblyInfo.cs” file if you are working with C#, or an “AssemblyInfo.vb” file if you are working with VB.NET.

image

If we look at the content of this file, we discover that it contains a set of attributes used by MSBuild to stamp the assembly file (.dll or .exe) with the information provided here. In order to change this information we have two options:

  1. Edit the AssemblyInfo.cs file using the Visual Studio editor.
    In this case we are interested in the AssemblyVersion and AssemblyFileVersion attributes, which we will need to change every time we want to increase the assembly version number.

  2. Or, we can open the project properties window in Visual Studio using the shortcut ALT+ENTER, or by choosing “Properties” on a VS project file in the Solution Explorer

    image

How does the versioning work?

The first thing that I tried was to understand exactly how this magic number works in .NET.

If you go to the online MSDN article, you will find out that the version number of an assembly is composed of four numbers, and each one has a specific meaning:

  • Major = manually incremented for major releases, such as adding many new features to the solution.
  • Minor = manually incremented for minor releases, such as introducing small changes to existing features.
  • Build = typically incremented automatically as part of every build performed on the build server. This allows each build to be tracked and tested.
  • Revision = incremented for QFEs (a.k.a. “hotfixes” or patches) to builds released into the production environment (PROD). This is set to zero for the initial release of any major/minor version of the solution.

Two different assembly version attributes, why?

I noticed that there are two different assembly-level version attributes, AssemblyVersion and AssemblyFileVersion.

AssemblyFileVersion

This attribute should be incremented every time our build server (TFS) runs a build. Based on the description above, you should increase the third number, the build number. This attribute should be placed in a separate .cs file for each project, to allow full control over it.

AssemblyVersion

This attribute represents the version of the .NET assembly you are referencing in your projects. If you increase this number on every TFS build, you will run into the problem of having to change your binding redirects every time the assembly version is increased.

This number should be increased only when you release a new version of your assembly, and it should be increased following the assembly versioning terminology (major, minor, …).

Control the Versioning in Visual Studio

As I said before, VS allows us to control the version number in different ways, and in my opinion the properties window is the easiest one. As soon as you change one of the version numbers from the properties window, the AssemblyInfo.cs file is automatically updated as well.

But what happens if we delete the version attributes from the AssemblyInfo file? As expected, VS will create an assembly with version 0.0.0.0, like the picture below shows:

image

Note: if we open the Visual Studio properties window for the project and we enter version 1.0.0.1 for both the assembly version and the assembly file version, VS will re-create the two corresponding attributes in the AssemblyInfo.cs file.

Sharing a common Assembly version on multiple projects

Going back to the request I got: how can we set up a configuration in Visual Studio that allows multiple projects to share the same assembly version? A partial solution can be accomplished using shared linked files in Visual Studio.

OK, first of all, what’s a shared linked file? A linked file is a file shortcut that, in multiple projects, points to the same single file instance. A detailed explanation of this mechanism is available on Jeremy Jameson’s blog at this page.

Now, this is the solution I have created as an example, where I share an AssemblyVersion.cs file and an AssemblyFileVersion.cs file across the entire Visual Studio solution.

image

Using this approach we have one single place where we can edit the AssemblyFileVersion and the AssemblyVersion attributes. In order to accomplish this solution you need to perform the following steps:

  1. Delete the assembly version and the assembly file version attributes from all the existing AssemblyInfo.cs files
  2. Create, in one project (the root project), a file called AssemblyFileVersion.cs containing only the AssemblyFileVersion attribute (this is the file a build step can stamp on every build; see the sketch after this list)
  3. Create, in one project (the root project), a file called AssemblyVersion.cs containing only the AssemblyVersion attribute
  4. Add these two files as linked files to all the existing projects
  5. Re-build everything
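
Since there is now a single AssemblyFileVersion.cs, a build step can bump the file version on every build, as suggested earlier. Below is a minimal PowerShell sketch under that assumption; the file path and the build number are placeholders you would wire to your build definition:

# Stamp the build number into the shared AssemblyFileVersion.cs.
# The path and build number below are placeholders for this sketch.
$file = "C:\Source\RootProject\AssemblyFileVersion.cs"
$buildNumber = 123

# Replace only the third (build) component, keeping major, minor and revision
(Get-Content $file) `
    -replace 'AssemblyFileVersion\("(\d+)\.(\d+)\.\d+\.(\d+)"\)', ('AssemblyFileVersion("$1.$2.' + $buildNumber + '.$3")') |
    Set-Content $file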

Final note on Visual Studio properties window

Even though my root project now has two files with the AssemblyFileVersion and AssemblyVersion attributes, when I open the Visual Studio properties window it still searches for these attributes in the AssemblyInfo.cs file; clearly, it can’t find them anymore, so it does not display anything:

image

If you add a value to these textboxes, Visual Studio will re-create the two attributes in the AssemblyInfo.cs file, without taking into account the two new files we have created, and as soon as you try to compile the project you will receive this nice error about duplicate attributes:

image

So, in order to use this solution, you need to keep in mind that you can’t edit the AssemblyFileVersion and the AssemblyVersion attributes from the VS properties window if they are not saved in the AssemblyInfo.cs file!

I believe that MS should change this in the next versions of Visual Studio.
