Test Synchronization

See the Tasktop Editions table to determine whether your edition includes Test Synchronization functionality.

Introduction

Many organizations have been using Micro Focus ALM (aka Quality Center) for quality management for years. But where once it was the only tool used for testing, today enterprises are augmenting ALM with additional tools to align with their agile and test automation efforts. That includes tools like Tricentis Tosca. Even as new tools are introduced, ALM remains popular and continues to play an important role in test management, especially when it comes to manual testing, defect management, and quality reporting.

The challenge for QA teams and leadership is how to restore visibility into coverage, quality, and cost now that testing data is split across multiple tools.

Tasktop enables users to flow test results into Micro Focus ALM in order to take advantage of ALM's reporting capabilities while using other tools, such as Tricentis Tosca, for their test planning and execution.  

The methods outlined below enable you to flow test results into Micro Focus ALM from Tricentis Tosca or from another ALM instance.  Due to the architectural specificity of each external tool, these methods cannot be used for other endpoints.

Test Architecture

Before you begin configuring your integration, it's important to understand how test artifacts relate to one another.

While the goal of this integration is to flow test results, the architecture required to do so is more complex than one might assume.  Test Results are a field on Test Runs.  To create a Test Run in ALM, a Test Instance must exist.  For a Test Instance to exist, both a Test and a Test Set must exist (the test is 'added' to the test set, and that creates a test instance).  For a Test to exist, you need a Test Folder.  And for a Test Set to exist, you need a Test Set Folder.  That's six different artifacts, just to flow a Test Run!

ALM Test Architecture
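If it helps to see that chain written out, the sketch below models the relationships as hypothetical Python dataclasses. This is illustrative only (it is not an ALM or Tasktop API); it simply shows that the Test Run at the bottom of the chain can only exist once everything above it does.

# A minimal sketch of the artifact chain described above, written as
# hypothetical Python dataclasses. It mirrors the ALM relationships only;
# it is not an ALM or Tasktop API.
from dataclasses import dataclass

@dataclass
class TestFolder:
    name: str

@dataclass
class TestSetFolder:
    name: str

@dataclass
class Test:
    name: str
    folder: TestFolder          # a Test lives in a Test Folder

@dataclass
class TestSet:
    name: str
    folder: TestSetFolder       # a Test Set lives in a Test Set Folder

@dataclass
class TestInstance:
    test: Test                  # created when a Test is added to a Test Set
    test_set: TestSet

@dataclass
class TestRun:
    instance: TestInstance      # a Test Run executes a Test Instance
    result: str                 # the test result you ultimately want to flow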

But don't worry - instead of six complex integrations, Tasktop cuts that configuration in half.  To set up this integration scenario, you will configure three integrations:

Integration      Container A        Container B    Work Item
Test Design      Test Folder        --             Test
Test Planning    Test Set Folder    Test Set       Test Instance
Test Execution   --                 --             Test Run

Test Synchronization Set-Up

Once configured, your integrations will look like the images below.

(lightbulb) To keep your integrations in order, we recommend adding a number to the beginning of each title, e.g. "1 - Test Design," "2 - Test Planning," "3 - Test Execution"

Test Integrations - Landscape View

Test Integrations - List View

Before You Begin

Before you get started, familiarize yourself with the following steps of integration configuration:

Connecting to your External Tools

Creating a Model

Creating a Collection

(lightbulb) Review the details in the sections below to ensure that any required fields are mapped in your collection

Configuring an Integration

Integration 1: Test Design 

The first integration you will configure is a Container + Work Item Synchronization flowing Test Folders/Test Case Folders (container) and Tests/Test Cases (work item). 

Test Design Integration

Containers Supported

  • Micro Focus ALM Test Folders 
    • Parent field must be mapped to preserve folder hierarchy
  • Tricentis Tosca Test Case Folders
    • Parent field must be mapped to preserve folder hierarchy

Artifacts Supported

  • Micro Focus ALM Tests
    • Subject field must be mapped (this points to the Test folder)
    • When flowing tests out of ALM, multiple test configurations are not supported.  Tests must have a single test configuration.
  • Tricentis Tosca Test Cases
    • Test Case Folder field must be mapped

This integration can be run independently, as the Test Folders and Tests do not require any other artifacts to exist before they can be created.   Artifact Creation Flow can be one-way or two-way.

Integration 2: Test Planning

The Test Planning integration is a Container + Work Item Synchronization that utilizes child containers, flowing Test Set Folders/Execution List Folders (container), Test Sets/Execution Lists (child container), and Test Instances/Execution Entries (Work Item).  

Test Planning Integration

To configure this integration, you will use the normal 'Container + Work Item Synchronization' template.  Tasktop has behind-the-scenes magic that allows you to add a child container integration once it detects that the appropriate collections have been created (a compact sketch of these collections follows the lists below):

Container Collections:

  • ALM Test Set Folders 
    • Parent field must be mapped to preserve folder hierarchy
  • Tosca Execution List Folders
    • Parent field must be mapped to preserve folder hierarchy

Child Container Collections:

  • ALM Test Sets
    • Parent field must be mapped to preserve folder hierarchy
  • Tosca Execution Lists
    • Parent field must be mapped to preserve folder hierarchy

Work Item Collections:

  • ALM Test Instances 
    • Test field must be mapped
    • Test Set field must be mapped
  • Tosca Execution Entries
    • Test Case field must be mapped
    • Execution List field must be mapped
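As a quick recap of the lists above, the sketch below summarizes the three collection levels and their required field mappings as a plain Python data structure. This is not an actual Tasktop configuration format, just a compact, hypothetical way to read the requirements at a glance.

# Purely illustrative summary of the Test Planning collections and their
# required field mappings (not an actual Tasktop configuration format).
TEST_PLANNING_COLLECTIONS = {
    "container": {
        "ALM Test Set Folders": ["Parent"],          # preserves folder hierarchy
        "Tosca Execution List Folders": ["Parent"],  # preserves folder hierarchy
    },
    "child container": {
        "ALM Test Sets": ["Parent"],
        "Tosca Execution Lists": ["Parent"],
    },
    "work item": {
        "ALM Test Instances": ["Test", "Test Set"],
        "Tosca Execution Entries": ["Test Case", "Execution List"],
    },
}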

Step 1: Test Set Folders/Execution List Folders

Once your collections have been created and configured, create a Container + Work Item Synchronization.  First, configure the top-level container integration:

  • ALM <=> ALM: Test Set Folder to Test Set Folder, or
  • Tosca <=> ALM: Execution List Folder to Test Set Folder

Container Creation Flow will most likely be one-way into ALM, but this will depend on the use case.

Once this integration has been configured, you'll see an option to create a child container integration.

(lightbulb) Note: The Test Instance (or Execution Entry) collection (i.e. the work item collection for this integration) must exist before the "New Child Container Integration" button will appear while configuring this integration.

Click the 'New Child Container Integration' button.

Step 2: Test Sets/Execution Lists

Your Child Container Integration will be either:

  • ALM <=> ALM: Test Set to Test Set, or
  • Tosca <=> ALM: Execution List to Test Set

This integration will likely not require container mirroring configuration, as it will inherit that from the parent container integration.

Step 3: Test Instances/Execution Entries

Finally, you will configure your Work Item integration.  It will either be:

  • ALM <=> ALM: Test Instance to Test Instance, or
  • Tosca <=> ALM: Execution Entry to Test Instance

Click the 'New Work Item Integration' button to add the integration.

Add New Work Item Integration

Here is what your fully configured integration will look like:

Fully Configured Test Planning Integration

Step 4: Run Integration

(lightbulb) Before you run this integration, you must have the Test Design integration configured and running. This is because in order to create a Test Instance, both a Test Set and a Test (created via the Test Design integration) must exist.  When a Test is added to a Test Set, a Test Instance is created.

To run the integration, click the green 'Run All' button.

Run Integration

Integration 3: Test Execution

The Test Execution integration is a Work Item Synchronization that flows test results located on ALM Test Runs or Tosca Execution Test Case Logs.

Test Execution Integration

Supported Artifacts:

  • ALM Test Run
    • Test Instance field must be mapped
  • Tosca Execution Test Case Log
    • Execution Entry field must be mapped

Artifact Routing and Filtering

Because Test Runs and Execution Test Case Logs live at the project level, Artifact Routing only needs to be configured at the project level.

Since routing is at the project level, you may be asking, "How will I know that only the Test Runs that I care about are synchronizing?  I don't want every single Test Run in this project to flow!"  And you're in luck.  Test Execution Integrations behave a little differently from typical integrations: Tasktop will use built-in magic to only flow the Test Runs or Execution Test Case Logs that are associated with a Test Instance/Execution Entry that is also configured to flow. 

It's that simple!
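If you'd like to picture that behavior, the sketch below illustrates the filtering rule in Python. The data shapes and helper function are hypothetical, not Tasktop internals; the point is simply that a run flows only when its parent Test Instance or Execution Entry is already in scope.

# Rough illustration (hypothetical names) of the Test Execution filtering rule:
# a Test Run or Execution Test Case Log only flows if its parent
# Test Instance / Execution Entry is itself configured to flow.
def should_flow(run, flowing_instance_ids):
    """A run flows only if its parent instance or entry is also in scope."""
    return run["instance_id"] in flowing_instance_ids

# Example: only runs 101 and 103 flow, because instance 55 is not in scope.
flowing_instance_ids = {42, 77}
runs = [
    {"id": 101, "instance_id": 42},
    {"id": 102, "instance_id": 55},
    {"id": 103, "instance_id": 77},
]
print([r["id"] for r in runs if should_flow(r, flowing_instance_ids)])  # [101, 103]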

Because of this dependency, you will see an Issue on the Tasktop Activity screen if you attempt to run this integration without also running the associated Test Planning integration (remember: Test Runs require that an associated Test Instance exist first).

Race Conditions

Due to the interdependencies between the three integrations, the order in which artifacts synchronize matters.

In this Test Management integration scenario, Tasktop won't create an artifact if its parent container or other required artifact does not yet exist.

Examples of when Tasktop won't flow a work item:

  • Trying to create a Test Run without the correct Test Instance already existing in the target system
  • Trying to create a Test Instance without the correct Test already existing in the target system

Here's an example of what you can expect to see in a race condition (a short sketch follows the list):

  • Create a Test Instance and immediately run the test
  • If Tasktop picks up the Test Run first, it will not have the necessary Test Instance on the target to attach to and will error
  • Once Tasktop picks up the Test Instance & synchronizes it, then the Test Run will be able to flow across on retry and the error will clear
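The same sequence can be sketched in a few lines of Python. The names and structures below are hypothetical, but they show why the first attempt errors and why the retry succeeds once the Test Instance has synchronized.

# Hypothetical sketch of the race condition and its resolution: the dependent
# artifact (Test Run) errors until its required artifact (Test Instance) has
# synchronized to the target, after which a retry succeeds.
target_instances = set()            # Test Instances that already exist in the target system

def try_create_test_run(run):
    # Creating a Test Run fails if its Test Instance is missing in the target.
    if run["instance_id"] not in target_instances:
        raise LookupError("required Test Instance does not exist yet in the target")
    print(f"Test Run {run['id']} created against Test Instance {run['instance_id']}")

run = {"id": 1, "instance_id": 42}

try:
    try_create_test_run(run)        # the Test Run is picked up first, so it errors
except LookupError as err:
    print(f"error: {err}")

target_instances.add(42)            # Test Planning integration synchronizes the Test Instance
try_create_test_run(run)            # the retry succeeds and the error clears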

You are most likely to see this condition when first setting up your integrations. For this reason, we recommend setting up the integrations 'from top to bottom'. In other words, start with the Test Design integration. Then move on to the Test Planning integration. And finally, set up the Test Execution integration. If you have the integrations running in that order, you'll be more likely to flow any required artifacts before any dependent artifacts attempt to flow.

(lightbulb) To keep your integrations in order in the Integration List View, we recommend adding a number to the beginning of each title, e.g. "1 - Test Design," "2 - Test Planning," "3 - Test Execution"

This race condition can also occur if your integrations have vastly different change detection intervals, for example, a short interval on your Test Execution integration but a much longer interval on your Test Planning integration.