Rasmus Møller Selsmark

On software test and test automation

Using TFS API and remote registry to keep track of what’s installed on our lab machines

December 20, 2012 23:16 by author rasmus

As described in http://rasmus.selsmark.dk/post/2012/09/26/Levels-of-Software-(Test)-Automation.aspx, we at ScanJour have put a lot of effort into automating our lab, which currently runs approx. 400 virtual machines using TFS 2010 Lab Manager. Running this many machines in SCVMM/Lab Manager is a challenge in itself, which I might come back to in a later post. In order to keep track of all these machines, my colleague Kim Carlsen started developing an internal website (“labstat”) which displays information about:

  • Deployed machines and their state
  • Available RAM and disk space on each SCVMM host
  • Number of machines deployed per TFS project
  • Available disk space in libraries

All of this information is of course available from the SCVMM console, but in order to help people serve themselves, we expose it to all our users through the website. A small part of the site looks like the following:

[Screenshot of the labstat website]

All of this information is pulled from SCVMM using PowerShell scripts. This sample shows how to get machine details using the Get-VM cmdlet:

Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager

# VMs deployed and in library (Out-DataTable and Write-DataTable are helper functions, not built-in cmdlets)
$dt = Get-VM | Select-Object @{N="VmName"; E={$_.Name}}, @{N="LabName"; E={([xml]($_.Description)).LabManagement.LabSystem.InnerText}},
    Owner, Status, Description, HostName, Location, CreationTime, @{N="Project"; E={([xml]($_.Description)).LabManagement.Project}},
    @{N="Environment"; E={([xml]($_.Description)).LabManagement.LabEnvironment.InnerText}}, @{N="Snapshots"; E={$_.VMCheckpoints.Count}},
    HostType, @{N="TemplateName"; E={([xml]($_.Description)).LabManagement.LabTemplate.InnerText}}, Memory | Out-DataTable

# Refresh the VM table with the current data from SCVMM
Invoke-Sqlcmd -Query "DELETE FROM VM" -Database $Global:VMMDatabase -ServerInstance $Global:ServerInstance
Write-DataTable -ServerInstance $Global:ServerInstance -Database $Global:VMMDatabase -TableName "VM" -Data $dt

With approx. 400 machines running, another question that often pops up is “Do we have a test environment with version X of product Y?”. The answer is the following page, which shows detailed information for each environment (apologies for the layout, we’re probably not going to win any design awards for this website…):

[Screenshot of the environment details page on the labstat website]

The information presented for each environment is:

  • Environment name and owner/creator
  • The “InUse” column simply queries TFS for the “Marked In Use” information you can set on an environment. The advantage of using this property is that it can be set without having to shut down the environment, as opposed to e.g. changing the description field, which can only be done when the environment is not running
  • Machines in the environment, showing both the internal machine name (almost all of our environments are network isolated) and the OS
  • Finally the products installed in the environment

We store all of this data in a separate database, which is updated every 10 to 60 minutes depending on the type of information. The SCVMM data (deployed machines, state etc.) is queried every 10 minutes, whereas collecting the information about installed products is more time-consuming and is therefore only done once per hour.

In order to access the remote registry on the lab machines, the firewall must be configured, which is done using this PowerShell function:

function Initialize-FirewallForWMI
{
    Write-Host "Opening Windows Firewall for WMI"
    Start-Process -FilePath "netsh.exe" -ArgumentList 'advfirewall firewall set rule group="windows management instrumentation (wmi)" new enable=yes' -NoNewWindow -Wait
}

Using TFS API for getting information about deployed lab environments

Getting details for environments and machines in the lab is simply a matter of accessing the TFS API. One interesting detail here is that we also get the IP address of each machine, in order to make it easier for people to remote desktop to the machines without necessarily having to open the Microsoft Test and Lab Manager client.
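
For reference, the code below uses the TFS 2010 client object model. A minimal set of using directives covering the types used might look like this (a sketch, assuming the relevant TFS 2010 client assemblies, such as Microsoft.TeamFoundation.Client and Microsoft.TeamFoundation.Lab.Client, are referenced):

// Namespaces assumed by the sample below (sketch; exact references depend on your TFS client installation)
using System;
using System.Data.SqlClient;                // SqlConnection
using System.Diagnostics;                   // Trace
using System.Linq;
using System.Net;                           // Dns, IPAddress
using System.Text;                          // StringBuilder
using Microsoft.TeamFoundation.Client;      // TfsTeamProjectCollection
using Microsoft.TeamFoundation.Lab.Client;  // LabService, LabEnvironment, LabSystem
using Microsoft.TeamFoundation.Server;      // ICommonStructureService, ProjectInfo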

// open database and TFS connection
using (SqlConnection cnn = new SqlConnection(databaseConnectionString))
using (TfsTeamProjectCollection tfs = new TfsTeamProjectCollection(new Uri(tfsUrl)))
{
    tfs.EnsureAuthenticated();
    cnn.Open();

    DataAccess.ResetIsTouchedForLabEnvironmentsAndMachines(cnn);

    LabService labService = tfs.GetService<LabService>();

    ICommonStructureService structureService = (ICommonStructureService)tfs.GetService(typeof(ICommonStructureService));
    ProjectInfo[] projects = structureService.ListAllProjects();

    // iterate all TFS Projects
    foreach (ProjectInfo project in projects)
    {
        LabEnvironmentQuerySpec leqs = new LabEnvironmentQuerySpec();
        leqs.Project = project.Name;
        var envs = labService.QueryLabEnvironments(leqs);

        // Iterate all environments in current TFS project
        foreach (LabEnvironment le in envs.Where(e => e.Disposition == LabEnvironmentDisposition.Active))
        {
            Trace.WriteLine(String.Format("Project: {0}; Environment: {1}", project.Name, le.Name));

            // need to reload in order to get ExtendedInfo data on machines in environment
            LabEnvironment env = labService.GetLabEnvironment(le.Uri);

            DateTime? inUseSince = null;

            if (env.InUseMarker != null)
            {
                inUseSince = env.InUseMarker.Timestamp;
            }

            LabEnvironmentDTO envData = new LabEnvironmentDTO
            {
                Id = env.LabGuid,
                Name = env.Name,
                Description = env.Description,
                ProjectName = env.ProjectName,
                CreationTime = env.CreationTime,
                Owner = env.CreatedBy,
                State = env.StatusInfo.State.ToString(),
                InUseComment = (env.InUseMarker == null ? "" : env.InUseMarker.Comment),
                InUseByUser = (env.InUseMarker == null ? "" : env.InUseMarker.User),
                InUseSince = inUseSince
            };
            DataAccess.Save(cnn, envData);

            // Iterate machines in environment
            foreach (LabSystem ls in env.LabSystems)
            {
                string computerName = String.Empty;
                string internalComputerName = String.Empty;
                string os = String.Empty;
                StringBuilder ip = new StringBuilder();

                if (ls.ExtendedInfo != null)
                {
                    computerName = ls.ExtendedInfo.RemoteInfo.ComputerName;
                    internalComputerName = ls.ExtendedInfo.RemoteInfo.InternalComputerName;
                    os = ls.ExtendedInfo.GuestOperatingSystem;

                    if (!String.IsNullOrWhiteSpace(computerName))
                    {
                        try
                        {
                            IPAddress[] ips = Dns.GetHostAddresses(computerName);

                            foreach (IPAddress ipaddr in ips)
                            {
                                if (ip.Length != 0)
                                    ip.Append(",");
                                ip.Append(ipaddr);
                            }
                        }
                        catch (Exception ex)
                        {
                            ip.Append(ex.Message);
                        }
                    }
                }

                LabMachineDTO machineData = new LabMachineDTO
                {
                    Id = ls.LabGuid,
                    Name = ls.Name,
                    LabEnvironmentId = le.LabGuid,
                    ComputerName = computerName,
                    InternalComputerName = internalComputerName,
                    IpAddress = ip.ToString(),
                    OS = os,
                    State = ls.StatusInfo.State.ToString()
                };

                DataAccess.Save(cnn, machineData);

                string machineDisplayName = String.Format(@"{0}\{1}\{2}", project.Name, le.Name, internalComputerName);
                ExtractPropertiesForLabMachine(cnn, ls, machineDisplayName); 
            } // foreach machine
        } // foreach environment
    } // foreach project

    // TODO: DataAccess.DeleteUntouchedEnvironmentsAndMachines(cnn);
} // using SqlConnection + TFS
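
The LabEnvironmentDTO and LabMachineDTO types, as well as the DataAccess helper, are internal classes that aren't shown in this post. Minimal DTO definitions consistent with how they are used above could look like this (a sketch, not our actual implementation):

// Sketch of the DTO types used above; the actual classes (and the DataAccess helper that persists them) are internal
public class LabEnvironmentDTO
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public string ProjectName { get; set; }
    public DateTime CreationTime { get; set; }
    public string Owner { get; set; }
    public string State { get; set; }
    public string InUseComment { get; set; }
    public string InUseByUser { get; set; }
    public DateTime? InUseSince { get; set; }
}

public class LabMachineDTO
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public Guid LabEnvironmentId { get; set; }
    public string ComputerName { get; set; }
    public string InternalComputerName { get; set; }
    public string IpAddress { get; set; }
    public string OS { get; set; }
    public string State { get; set; }
}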

Querying lab machines for information about installed products

My first attempt at getting information about installed products on a remote machine was to use “Get-WmiObject -Class Win32_Product” in PowerShell. This works, but as described at http://sdmsoftware.com/wmi/why-win32_product-is-bad-news/ it has the unwanted side effect that every MSI product queried on the remote machine is reconfigured/repaired. Because of this, and the fact that repairing each product on the machine took a long time, I decided to implement this using remote registry access instead, reading values from “HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall” where Windows stores information about installed applications. Both the 32- and 64-bit registry hives are queried.
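
For comparison, this is roughly what the abandoned WMI approach looks like in C# using System.Management (a sketch for illustration only; the machine name and credentials are placeholders). Querying Win32_Product like this triggers the MSI repair behaviour described above, which is exactly why we moved away from it:

// Illustration of the abandoned Win32_Product approach (sketch; avoid running this against production lab machines,
// since enumerating Win32_Product causes Windows Installer to reconfigure/repair every queried product)
using System;
using System.Management; // requires a reference to System.Management.dll

class Win32ProductSample
{
    static void Main()
    {
        ConnectionOptions options = new ConnectionOptions
        {
            Username = @"(domain)\Administrator", // placeholder credentials
            Password = "(password)"
        };

        // placeholder machine name
        ManagementScope scope = new ManagementScope(@"\\labmachine01\root\cimv2", options);
        scope.Connect();

        ObjectQuery query = new ObjectQuery("SELECT Name, Vendor, Version FROM Win32_Product");
        using (ManagementObjectSearcher searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject product in searcher.Get())
            {
                Console.WriteLine("{0} {1} ({2})", product["Name"], product["Version"], product["Vendor"]);
            }
        }
    }
}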

The code for accessing the remote registry is shown here:

/// <summary>
/// Populates the products for machine using remote registry access.
/// </summary>
/// <param name="machineName">Name of the machine.</param>
/// <param name="products">Reference to products collection, that will be populated.</param>
/// <param name="registryMode">The registry mode (32- or 64-bit).</param>
/// <returns>False, if e.g. access denied, which means no reason to try subsequent reads</returns>
private static bool PopulateProductsForMachine(string machineName, List<Product> products, RegistryMode registryMode)
{
    string registryPath = @"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall";

    if (registryMode == RegistryMode.SysWow64)
        registryPath = @"SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall";

    try
    {
        using (RegistryKey remoteRegistry = RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, machineName))
        {
            if (remoteRegistry == null)
                return false; // could not open HKLM on remote machine. No need to continue

            using (RegistryKey key = remoteRegistry.OpenSubKey(registryPath))
            {
                if (key == null)
                    return true; // no error, we just couldn't locate this registry entry

                string[] subKeyNames = key.GetSubKeyNames();
                foreach (string subKeyName in subKeyNames)
                {
                    // dispose each subkey after reading its values
                    using (RegistryKey subKey = key.OpenSubKey(subKeyName))
                    {
                        if (subKey == null)
                            continue;

                        string name = GetRegistryKeyValue(subKey, "DisplayName");
                        string vendor = GetRegistryKeyValue(subKey, "Publisher");
                        string version = GetRegistryKeyValue(subKey, "DisplayVersion");

                        if (!String.IsNullOrWhiteSpace(name))
                        {
                            products.Add(new Product { Name = name, Vendor = vendor, Version = version });
                        }
                    }
                }
            } // using key
        } // using remoteRegistry

        return true;
    }
    catch (Exception ex)
    {
        Trace.WriteLine(String.Format("An exception occurred while opening remote registry on machine '{0}': {1}", machineName, ex.Message));
        return false;
    }
}

private static string GetRegistryKeyValue(RegistryKey key, string paramName)
{
    if (key == null)
        throw new ArgumentNullException("key");

    object value = key.GetValue(paramName);
            
    if (value == null)
        return String.Empty;

    return value.ToString();
}
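
The RegistryMode enum used by PopulateProductsForMachine is another internal type not shown here; a minimal definition consistent with how it is used above would be:

// Assumed minimal definition of the RegistryMode enum referenced above (the original definition is not shown in the post)
private enum RegistryMode
{
    Default,   // native registry view
    SysWow64   // 32-bit applications under SOFTWARE\Wow6432Node on 64-bit Windows
}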

Before you can access the remote registry using RegistryKey.OpenRemoteBaseKey(), you need to authenticate against the machine, which is done by connecting to the “C$” administrative share on the machine.
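
The NetworkShare class is one of our internal helpers and is not shown in the post. A minimal sketch of what such a class could look like, mapping the share with explicit credentials via WNetAddConnection2/WNetCancelConnection2 from mpr.dll (an assumption for illustration, not our actual implementation):

// Minimal sketch of a NetworkShare helper (hypothetical; the actual class used in our lab code is not shown here)
using System;
using System.ComponentModel;
using System.Runtime.InteropServices;

public class NetworkShare
{
    [StructLayout(LayoutKind.Sequential)]
    private struct NETRESOURCE
    {
        public int dwScope;
        public int dwType;
        public int dwDisplayType;
        public int dwUsage;
        public string lpLocalName;
        public string lpRemoteName;
        public string lpComment;
        public string lpProvider;
    }

    [DllImport("mpr.dll")]
    private static extern int WNetAddConnection2(ref NETRESOURCE netResource, string password, string username, int flags);

    [DllImport("mpr.dll")]
    private static extern int WNetCancelConnection2(string name, int flags, bool force);

    private readonly string remoteName;
    private readonly string userName;
    private readonly string password;

    public NetworkShare(string machineName, string shareName, string userName, string password)
    {
        this.remoteName = String.Format(@"\\{0}\{1}", machineName, shareName);
        this.userName = userName;
        this.password = password;
    }

    public void Connect()
    {
        // dwType = 1 is RESOURCETYPE_DISK; connect to e.g. \\machine\C$ with the supplied credentials
        NETRESOURCE resource = new NETRESOURCE { dwType = 1, lpRemoteName = remoteName };
        int result = WNetAddConnection2(ref resource, password, userName, 0);

        if (result != 0)
            throw new Win32Exception(result); // non-zero result means the connection failed
    }

    public void Disconnect()
    {
        WNetCancelConnection2(remoteName, 0, true);
    }
}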

// connect to server
NetworkShare share = new NetworkShare(labMachineDnsName, "C$", @"(domain)\Administrator", "(password)");
try
{
    share.Connect();

    // get list of all products installed on machine
    List<Product> products = new List<Product>();

    if (!PopulateProductsForMachine(labMachineDnsName, products, RegistryMode.Default))
        return; // failed to connect to remote registry -> don't try any further on this machine

    PopulateProductsForMachine(labMachineDnsName, products, RegistryMode.SysWow64);

    // filter out products by ScanJour or selected MS apps
    var relevantProducts =
        from p in products
        where (p.Vendor.Equals("ScanJour", StringComparison.OrdinalIgnoreCase))
            || (p.Name.StartsWith("ScanJour", StringComparison.OrdinalIgnoreCase))
            || (p.Name.StartsWith("Microsoft Office Professional"))
            || (p.Name.StartsWith("Microsoft Office Enterprise"))
            || (p.Name == "Microsoft Visual Studio 2010 Premium - ENU")
            || (p.Name == "Microsoft Visual Studio 2010 Professional - ENU")
            || (p.Name == "Microsoft Visual Studio Premium 2012")
        select p;

    foreach (Product product in relevantProducts)
    {
        LabMachinePropertyDTO data = new LabMachinePropertyDTO
        {
            LabMachineId = machine.LabGuid,
            Id = "Product",
            Value = String.Format("{0} ({1})", product.Name, product.Version)
        };

        Trace.WriteLine(String.Format("Adding product '{0}'", data.Value));

        DataAccess.Save(cnn, data);
    }

    // Find Oracle version on machine
    string oracleVersion = GetOracleVersionForMachine(labMachineDnsName);
    if (!String.IsNullOrWhiteSpace(oracleVersion))
    {
        LabMachinePropertyDTO data = new LabMachinePropertyDTO
        {
            LabMachineId = machine.LabGuid,
            Id = "Product",
            Value = oracleVersion
        };

        Trace.WriteLine(String.Format("Adding product '{0}'", data.Value));

        DataAccess.Save(cnn, data);
    }
}
catch (Exception ex)
{
    Trace.WriteLine("An exception occurred while getting lab machine properties:");
    Trace.WriteLine(ex.ToString());
}
finally
{
    // Disconnect the share
    share.Disconnect();
}

As can be seen from the code, we query for our own products (vendor or name “ScanJour”) and for the versions of Visual Studio and Microsoft Office installed on the machines. We also display which version of Oracle is installed in the environment, again simply by querying for a specific registry key.
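
The GetOracleVersionForMachine method is not included in the post; it follows the same remote registry pattern as PopulateProductsForMachine above. A hypothetical sketch is shown below; the registry path and the "KEY_" home-key convention are assumptions for illustration, not necessarily what we query in our lab:

// Hypothetical sketch of GetOracleVersionForMachine; the registry path and naming convention below are placeholders
private static string GetOracleVersionForMachine(string machineName)
{
    // Placeholder registry path; the actual key queried depends on how your Oracle installation registers itself
    const string oracleKeyPath = @"SOFTWARE\ORACLE";

    try
    {
        using (RegistryKey remoteRegistry = RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, machineName))
        using (RegistryKey key = remoteRegistry.OpenSubKey(oracleKeyPath))
        {
            if (key == null)
                return String.Empty; // no Oracle installation found

            // Return the first Oracle home key name, which typically indicates the installed version
            foreach (string subKeyName in key.GetSubKeyNames())
            {
                if (subKeyName.StartsWith("KEY_", StringComparison.OrdinalIgnoreCase))
                    return subKeyName;
            }

            return String.Empty;
        }
    }
    catch (Exception ex)
    {
        Trace.WriteLine(String.Format("Could not read Oracle information from '{0}': {1}", machineName, ex.Message));
        return String.Empty;
    }
}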

Conclusion

In my previous blog post, I described how we have automated deployment of test environments in the lab. In this post we have covered another important aspect of maintaining a large test lab, namely getting an overview of the state of lab environments and machines. It of course requires some effort to build up an automation framework around your lab infrastructure, but once in place, it has given us the following benefits:

  • Minimized the manual time spent on setting up and maintaining lab environments
  • Provided an overview of lab usage for all users
  • Increased predictability when setting up multiple environments, since all configuration of environments is automated
  • Made it easier to roll out changes to base templates. For example, we have one Domain Controller template, which is regularly updated with the latest Windows Updates. By automating environments, we get “fresh” environments more often; earlier, setting up a new environment was a manual process and therefore not done as often.

The main conclusion here is that you should invest some time in automation for your lab. It certainly does cost some time, but now that we have it, I don’t understand how we were able to get any work done in the lab before, when we were setting up environments manually :)




GOTO Aarhus 2012 - Developers *are* writing functional tests :)

September 9, 2012 22:33 by author rasmus

Even though this is part of my “warm-up” blog posts for the GOTO Aarhus 2012 Conference, I’ll start out by referring to a related event this week. At our latest meeting in the Danish TFS User Group, held at the Microsoft office in Hellerup, Rune Abrahamsson from BRFkredit gave a presentation on how they are using http://cuite.codeplex.com/ for developing functional tests for one of their systems. If you are interested, Mads (our agile coach at ScanJour) has also blogged from the user group here.

Although the people attending the TFS User Group meetings are mostly developers, we often have at least one testing-related topic on the agenda, and this time it was clear that many of the developers present do have experience with developing functional tests, often using a UI automation framework like Microsoft Coded UI. So even before attending the GOTO Aarhus 2012 Conference, it seems I can conclude that developers are writing functional tests, which pleases me, as I think the people writing the software are also the best at writing automation for it. The testers can then do actual quality assurance, by ensuring that the customer requirements have been automated, as well as performing exploratory testing to exercise the software in new ways.

[Photo of a slide from the presentation]

Sorry for the bad picture quality; at the bottom the slide shows which areas (including functional UI tests) are covered by developer tests, whereas e.g. exploratory tests are performed by their manual testers (who are actually domain experts, not full-time testers). I find it very positive to see a development team take quality seriously by including test automation, even when they are in a situation where they are lacking testers. I guess this is not an uncommon scenario in the industry, that you have to get test assistance from other teams/departments.

And it makes me wonder whether manual testers should be forced not to do manual scripted tests at all, but only do exploratory testing and quality assurance, since the developers will fill the gap themselves by automating the scripted test cases? :)

 

To put this in the context of this year’s GOTO conference, I have been looking at the biographies of some of the speakers, and found a video by Steve Freeman on Sustainable Test-Driven Development, where he touches on topics like:

  • Production and test code that are too tightly coupled, which makes refactoring difficult (beginning of video)
  • Test code structure (similar to Given/When/Then pattern of e.g. SpecFlow)
  • “You come back to the code after 6 months, and forgot why you did this” (approx. 13:00)
  • Patterns for writing test code. As simple as good variable naming, or using DSL syntax to get more readable test code
  • Prepare for your test to fail at some point, simply by having clear error messages when it happens (“Explain yourself”, “Describe yourself” and “Tracer Objects” around 23 mins into the video).
  • Make your tests robust, e.g. “Only Enforce Order When It Matters” (around 40:00)
  • “Tests Are Code Too” (43:35), which is the last slide and also seems to be the headline for this presentation

The last slide is shown here:

[Slide: “Tests Are Code Too”]

Although all four bullets here are important, I have myself faced the “If it’s hard to test, that’s a clue” problem when developing unit tests, both for production code and for test automation framework code. As an automation tester, I also regularly face this problem when writing automated functional tests against production code, e.g. missing IDs on UI controls, no clear interface for testing etc. When the development team writes automated functional tests as part of the definition of done, it should hopefully result in code better suited for automation. And then there is the general conclusion, that test automation should be treated like any other code activity.

This year Steve Freeman has a session on Raspberry Pi with extra toppings Monday at 13:20, which unfortunately is a timeslot where I also have some other sessions I would like to see (What is value and Mythbusting Remote Procedure Calls), but it might be that I have to change my mind, as it could also be fun to get an introduction to the Raspberry Pi.

After watching this video, I feel confident that I will meet developers with a quality/testing mindset at the GOTO Aarhus 2012 Conference this year, and I’m looking forward to talking with you about your views on test automation, and how I as an automation tester can bring value and increase the quality of our software products, even if it’s no longer a dedicated test automation developer writing the functional tests.

Feel free to comment on this post below.




Microsoft running on TFS 2012

August 8, 2012 22:32 by author rasmus

At the TechEd 2012 session DEV340 - Taking Your Application Lifecycle Management to the Cloud With the Team Foundation Service (which I recommend watching), the presenter mentions that TFS 2012 has been used internally at Microsoft for a while. This is described in further detail at http://blogs.msdn.com/b/buckh/archive/2012/06/08/developer-division-is-running-on-tfs-2012-rc.aspx, where it is also mentioned that they are currently running with 3600+ users. I think (hope…) this will result in a more stable TFS, especially for Lab Management. Looking forward to getting onto TFS 2012 :)




About the author

Team lead at Unity Technologies. Focus on automating any task possible. Author of e.g. http://uimaptoolbox.codeplex.com

Twitter: @RasmusSelsmark
