
Many of us know the benefits of automated UI testing: repeatable tests that can be executed frequently over time increase software quality and reduce overall software development costs.

But knowing the benefits isn’t enough. Knowing how best to create and maintain your automated tests is critically important to maximizing those benefits. Improper practices often lead to team frustration, rework, and even abandonment of the practice.

Let’s start by exploring some of the Microsoft Coded UI automated testing best practices that Imaginet leverages with our customers, including (1) Imaginet’s Coded UI Test Framework, (2) cross-browser and cross-platform testing, and (3) reporting and data collection.

Imaginet’s Coded UI Test Framework

Imaginet’s approach to implementing Coded UI testing is distinctly different from most of the demos and videos available about Coded UI usage: we leverage the Page Object pattern to build up a reusable testing framework.

Coded UI includes a technology called UIMaps that simplifies the creation of a testing framework. For small applications or technology demonstrations, UIMaps may be appropriate; however, they do not scale well for larger applications and projects. On larger projects, Imaginet uses the Page Object pattern (common in Selenium testing) to build up a test harness around each View (a.k.a. page or screen) within the application. These Page Objects use the Coded UI libraries directly, bypassing the need for UIMaps. In turn, the Coded UI tests simply consume the Page Objects, which can be reused by any test that needs to interact with that View. For example, the Page Objects might expose the following methods for use by the tests:

  • LoginPage.InputUserName(string userName)
  • LoginPage.ClickLogin()
  • ShoppingCart.ClickCheckout()
  • ShoppingCart.RemoveItemFromCart(string productName)

By using common coding patterns (e.g. polymorphism, Page Object, Factory), we help you separate the portions of the framework and/or test scenarios that are reusable across different implementations (e.g. web vs. mobile vs. rich client).
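As a rough illustration of this approach, here is a minimal sketch of what a LoginPage Page Object might look like when built directly on the Coded UI libraries rather than a UIMap. The element IDs, class names, and page URL are illustrative assumptions, not part of any real application:

```csharp
// Sketch only: a Page Object built directly on the Coded UI libraries,
// bypassing UIMaps. Element IDs and names below are assumed for illustration.
using System;
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UITesting.HtmlControls;

public class LoginPage
{
    private readonly BrowserWindow _browser;

    public LoginPage(BrowserWindow browser)
    {
        _browser = browser;
    }

    public void InputUserName(string userName)
    {
        // Locate the user name field by its HTML id (assumed to be "userName")
        var userNameBox = new HtmlEdit(_browser);
        userNameBox.SearchProperties[HtmlEdit.PropertyNames.Id] = "userName";
        userNameBox.Text = userName;
    }

    public void ClickLogin()
    {
        // Locate and click the login button (assumed id "login")
        var loginButton = new HtmlButton(_browser);
        loginButton.SearchProperties[HtmlButton.PropertyNames.Id] = "login";
        Mouse.Click(loginButton);
    }
}
```

A Coded UI test then consumes the Page Object instead of touching controls directly, e.g. `new LoginPage(browser).InputUserName("testuser")` followed by `ClickLogin()`, which keeps the tests readable and isolates them from changes to the page’s markup.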

Cross-Browser and Cross-Platform Testing

For teams who want to reuse the same test scenarios across many operating systems, screen resolutions, and web browsers, Imaginet recommends the following:

  • Coded UI supports cross-browser testing by relying on the Selenium WebDriver tooling under the covers. Coded UI supports execution on IE natively and, via Selenium, also supports Chrome and Firefox.
  • Coded UI tests are invoked remotely through TFS Test Agents and Test Controllers. By provisioning multiple VMs, you can create test environments (running the TFS Test Agent) configured with various combinations of OS versions and/or browser versions.
  • Through the use of (1) TFS Builds, (2) environments configured in TFS Lab Center, and (3) appropriate Test Plans, Test Configurations, and Test Settings in Microsoft Test Manager (MTM), the entire process of running a set of test scenarios across many different OS, browser, and machine combinations can be automated and run “lights out”.
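To make the first point concrete, here is a sketch of how a single Coded UI test can be pointed at a different browser. It assumes the “Selenium components for Coded UI Cross Browser Testing” extension is installed; the URL is a placeholder:

```csharp
// Sketch only: switching the target browser for a Coded UI test run.
// Requires the Selenium-based cross-browser extension for Coded UI;
// without it, only "ie" (the default) is available.
using System;
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class CheckoutCrossBrowserTests
{
    [TestMethod]
    public void CheckoutWorksInChrome()
    {
        // "chrome" and "firefox" are routed through Selenium WebDriver;
        // omit this line (or set "ie") for native IE execution.
        BrowserWindow.CurrentBrowser = "chrome";

        // Placeholder URL for illustration
        var browser = BrowserWindow.Launch(new Uri("https://example.com/cart"));

        // ... drive the page via Page Objects and assert on the results ...

        browser.Close();
    }
}
```

Combined with Test Configurations in MTM, the same test method can be scheduled against each browser/OS combination without duplicating test code.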

Reporting and Data Collection

Coded UI tests are intended to be linked to Test Case work items in Microsoft Test Manager (MTM). Through this linking, and the use of Test Agents for remote test execution and data collection, TFS automatically aggregates data for every test execution and surfaces it in a data warehouse and a set of out-of-the-box reports.

The Test Agent that enables remote execution of Coded UI tests not only collects the pass/fail result of each test; it can also automatically capture screenshots of each step within the test case and/or a video of the test execution for review. Lastly, it can be configured to automatically collect code coverage data for each test run.

The data files collected during execution, along with the test results, are published back to TFS and made available via the TFS Data Warehouse for reporting purposes. TFS includes a handful of reports that analyze the test results, and organizations often build custom SSRS reports against the TFS Data Warehouse to satisfy more advanced reporting needs.
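The screenshot, video, and code coverage collectors mentioned above are enabled through a test settings file. The fragment below is a hedged sketch of what such a .testsettings configuration might look like; the agent rule name is a placeholder, and the collector URIs and friendly names should be verified against your Visual Studio/TFS version:

```xml
<!-- Sketch only: a .testsettings fragment enabling video recording and
     code coverage collection for remote runs on Test Agents.
     "LabAgents" is a placeholder rule name; verify collector URIs
     against your Visual Studio/TFS version. -->
<TestSettings name="RemoteUIRun"
              xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Execution>
    <AgentRule name="LabAgents">
      <DataCollectors>
        <DataCollector uri="datacollector://microsoft/VideoRecorder/1.0"
                       friendlyName="Screen and Voice Recorder" />
        <DataCollector uri="datacollector://microsoft/CodeCoverage/2.0"
                       friendlyName="Code Coverage" />
      </DataCollectors>
    </AgentRule>
  </Execution>
</TestSettings>
```

When this settings file is associated with the Test Plan in MTM, the collected videos, screenshots, and coverage files are attached to each test result and published back to TFS automatically.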

Figure 1 – Example TFS Report
Figure 2 – Test Results in MTM
Figure 3 – Sample of Test Log with Screenshot

Ready to Adopt Best Practices?

Successfully adopting automated Coded UI testing and its corresponding best practices is challenging. Without the in-house expertise of experienced Coded UI professionals, adoption and implementation of automated UI testing can be time-consuming and often frustrating. When it comes to your automated UI testing, let Imaginet help you maximize your success. There are two ways we can help you:

Option #1 – Consulting and Training

Automated testing environments – Imaginet provides consulting and implementation services for any ALM environment component required to enable automated testing, delivered by a Sr. ALM Consultant onsite at the client location.

Training and Mentoring – Imaginet understands that the key goal is to build the best skillset within your organization so that your staff can adopt Coded UI technology. Imaginet’s engagements include training and mentoring delivered by a blended team of Imaginet consultant(s) and client staff. We recommend a secondary engagement in which one or more client staff members work alongside Imaginet full-time through another group of test cases. This provides the most effective knowledge transfer through mentoring and results in increased productivity, as client staff become proficient in assisting with delivery. Ideally, the client would have a technical tester (proficient in C# coding) available full-time during the engagement, and a non-technical tester available 30–40% of the time throughout the engagement.

Option #2 – QA Center of Excellence

For automated testing to be successful in any organization, some process and role change always takes place. A common example is when the current test team consists entirely of manual testers: moving to automated testing requires developers to write Coded UI test scripts using the domain knowledge of the current test team. Organizations that attempt to “train” their current test teams to write automated test scripts find that an experienced tester with no .NET software development background cannot simply be “trained” to write Coded UI scripts; there is a large technical skill gap between the two roles. In this case, either new team members with a .NET development background must be added to the team, or the organization has to outsource the actual test script creation to a third party. Imaginet offers outsourced Coded UI development through our QA Center of Excellence (QA CoE). As part of the Imaginet QA CoE, our test experts and engineers provide:

  • Test plan and strategy development
  • Automated test authoring
  • Functional (manual) testing
  • Performance, stress, and load testing
  • Cross platform testing
  • Test automation frameworks
  • Test lab setup and management
  • Test performance optimization
  • On-premises or cloud-hosted test environments
  • Custom test extensions, tooling, and reports

As a Center of Excellence, we cover many edge conditions and scenarios that other consulting organizations cannot. Contact us for more details.


Imaginet is your trusted technology partner who turns your business innovation ideas into reality. 18+ years | 1100+ satisfied customers | 2500+ successful engagements. Located in Dallas (Irving), Winnipeg, and Calgary. Services offered worldwide. Contact us today at 1-800-989-6022.

