Submitting build and automated test results from Jenkins to Helix ALM
To get build and automated test result data submitted to Helix ALM, you must use the Jenkins plugin. See Using the Helix ALM Test Management Jenkins plugin. You must also make sure to add post-build steps in Jenkins to submit results data to Helix ALM. See Adding post-build steps to automatically submit build and automated test results from Jenkins to Helix ALM.
You can configure Jenkins to generate a report file after a build completes that contains information about the build and automated tests, including results. The report file must be in JUnit or xUnit XML format. Other report formats are not supported.
The plugin parses data from the XML file and then uploads data to Helix ALM. You can then view the build and automated test data in the Helix ALM web client.
Your automated test scripts need to write the correct data out to the XML report file. The following information explains how data in XML report files is parsed and submitted to Helix ALM so you can understand the data that your scripts need to provide.
Note: Helix ALM supports JUnit and xUnit XML result report formats. Some automated testing tools may not generate a report file that Helix ALM can parse or the JUnit XML file may not contain all of the data you want to upload to Helix ALM. Specifically, if you are using a testing framework that runs tests using Selenium, detailed information about the test environment, such as operating system and browser information, will not be added to Helix ALM. If it is important to include this information in Helix ALM, you may need to handle reading the test report yourself and submit results using the Helix ALM REST API. See Submitting builds and automated test results to Helix ALM using the REST API.
XML report elements that Helix ALM parses
The plugin parses data from the following elements in the XML report file:
- <testsuite> – Contains one or more <testcase> elements and information about the timing and results of those tests. A report may contain multiple <testsuite> elements, but only one build is added to Helix ALM.
- <testcase> – Contains all information about a single automated test. One automated test result is added to Helix ALM for each <testcase> element in the XML report file. A <testcase> element can contain child elements that specify extra information for tests that fail, have errors, or are skipped.
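A minimal report with this structure might look like the following sketch. The suite, class, and test names are placeholders; see the complete example report at the end of this topic.

<?xml version="1.0" encoding="UTF-8"?>
<!-- One build is added to Helix ALM for this report -->
<testsuites name="Example Tests" tests="2" failures="1" time="3.5000">
  <testsuite name="Login tests" timestamp="2022-08-29T17:54:39" tests="2" failures="1" time="3.5000">
    <!-- Each testcase element becomes one automated test result in Helix ALM -->
    <testcase name="Valid login" classname="Login tests" time="1.5000"/>
    <testcase name="Invalid login" classname="Login tests" time="2.0000">
      <!-- Child elements describe failed, errored, or skipped tests -->
      <failure message="Expected error message was not displayed" type="AssertionError"/>
    </testcase>
  </testsuite>
</testsuites>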
Build data added to Helix ALM
One build is added to Helix ALM for each Jenkins build, even if the XML report file contains more than one <testsuite> element. The following information about builds is added to Helix ALM from Jenkins.
Required data
The following data must be specified in Jenkins and in the XML report file:
- Build number
Optional data
If other optional data is specified in Jenkins and in the XML report file, it is displayed in the build details in Helix ALM. See Viewing build summary information in automation suites.
The following values are configured per project by the Jenkins plugin user. In a Freestyle project, they are entered in text fields. In a Pipeline project, you can optionally provide them when invoking the halm_report pipeline step.
- Description
- Branch
- Test run set
The following values are automatically populated by the Jenkins plugin and will always have data.
- External URL
- Run configuration information
- Build properties – Displayed in the Properties area in the build details in Helix ALM. See Viewing build summary information in automation suites.
Build Start date and Duration fields
The build Start date and Duration fields are populated in Helix ALM using the following methods:
- If the XML contains one <testsuite> element with timestamp and time attributes, the build Start date field is populated with timestamp and the Duration field is populated with time.
- If the XML contains multiple <testsuite> elements with timestamp and time attributes, the build Start date field is populated with the earliest reported timestamp value. The Duration field is calculated by adding the difference between the earliest and latest reported timestamp values to the time value specified in the <testsuite> element with the latest timestamp.
- If the <testsuite> elements in the XML file do not contain a timestamp attribute, the build Start date field is not set in Helix ALM. The build Duration is instead inferred from the <testcase> elements, which can optionally include a time value indicating the test duration. Helix ALM totals all reported time values and uses the total for the build Duration field value.
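For example, consider a report that contains two <testsuite> elements with timestamp and time attributes (the suite names and values below are illustrative only):

<testsuites>
  <!-- Earliest timestamp: used as the build Start date -->
  <testsuite name="Suite A" timestamp="2022-08-29T10:00:00" time="120.0" tests="10">
    ...
  </testsuite>
  <!-- Latest timestamp: its time value is added to the timestamp difference -->
  <testsuite name="Suite B" timestamp="2022-08-29T10:05:00" time="60.0" tests="5">
    ...
  </testsuite>
</testsuites>

Here, the build Start date is 2022-08-29T10:00:00, the earliest reported timestamp. The Duration is the difference between the earliest and latest timestamps (300 seconds) plus the time value of the latest <testsuite> (60 seconds), for a total of 360 seconds.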
Automated test result data added to Helix ALM
One automated test result is created for each <testcase> element in the XML report file. Helix ALM automatically maps the following XML attributes to automated test result fields in Helix ALM. You can view this data in the Automated Test Result Details dialog box in Helix ALM. See Viewing automated test result details.
Automated test result value in Helix ALM | Value from XML file used to populate the Helix ALM value | More information |
---|---|---|
Test name | name or uniqueName | Automated test results in Helix ALM must have unique names to ensure that results are correctly associated with test cases. If names are not unique, automated test results and associated test cases could be mismatched when subsequent tests run. The name attribute in the <testcase> element in the XML file is not required to be unique, so we recommend also specifying the uniqueName attribute in the <testcase> element. If uniqueName is not specified, Helix ALM uses SuiteName:ClassName:TestName instead. If a uniqueName value is already used on another result, a number is appended to the existing unique name value. For example, if three items have "automatedTest" as their uniqueName, their name values in Helix ALM are automatedTest, automatedTest.1, and automatedTest.2. uniqueName is not part of the JUnit specification. To set a specific uniqueName in the JUnit report, you need a custom test report generator that creates <testcase> elements with uniqueName attributes. See the example after this table. |
Associated test cases | tags | Used to associate test cases with automated test results. You can specify multiple tags for one automated test result in a comma-separated list. For example, if the XML file specifies <testcase ... tags="TC-1,TC-2" ...>, Helix ALM associates test case 1 and test case 2 with the automated test result. tags is not part of the JUnit specification. See the example after this table. |
Start date | timestamp | -- |
Duration | time | -- |
Properties | Any other attributes in the <testcase> element | The attributes listed in this table are displayed in the corresponding fields in the Automated Test Result Details dialog box in Helix ALM. Any other attributes in the <testcase> element are displayed as name/value pairs in the Properties area in the Automated Test Result Details dialog box. |
Status | Inferred from data | See Automated test result status. |
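For example, a report generator could emit a <testcase> element like the following sketch. The test case numbers, names, and the browser attribute are illustrative, and uniqueName and tags are the non-standard attributes described above:

<!-- uniqueName names the result; tags associates it with test cases TC-1 and TC-2 -->
<testcase name="Valid login" classname="Login tests" time="1.5000"
          uniqueName="LoginSuite:LoginTests:ValidLogin"
          tags="TC-1,TC-2"
          browser="Chrome 104">
</testcase>

Because browser is not one of the mapped attributes, it is displayed as a name/value pair in the Properties area in the Automated Test Result Details dialog box.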
The status of an automated test is inferred from the XML report file. Detailed status information is displayed in the Automated Test Result Details dialog box in Helix ALM. See Viewing automated test result details.
If the <testcase> element contains this child element: | It indicates that: | In Helix ALM: |
---|---|---|
None | The test passed. | The test result is Passed. |
<error> | An error occurred when running the test. | The test result is Failed. If a message is specified in the <error> element in the XML file, it is set as the error message on the test result, which is displayed in the Automated Test Result Details dialog box. If an errorMessage is already specified in the <error> element, the error message is displayed in the Properties area in the Automated Test Result Details dialog box. If type or value attributes are specified in the <error> element, they are displayed as errorType and errorValue in the Properties area in the Automated Test Result Details dialog box. See the example after this table. |
<failure> | The test failed. | The test result is Failed. If a message is specified in the <failure> element in the XML file, it is set as the error message on the test result, which is displayed in the Automated Test Result Details dialog box. If an errorMessage is already specified in the <failure> element, the error message is displayed in the Properties area in the Automated Test Result Details dialog box. If type or value attributes are specified in the <failure> element, they are displayed as failureType and failureValue in the Properties area in the Automated Test Result Details dialog box. |
<skipped> | The test was skipped. | The test result is Skipped. If a message is specified in the <skipped> element in the XML file, it is set as the error message on the test result, which is displayed in the Automated Test Result Details dialog box. If an errorMessage is already specified in the <skipped> element, the error message is displayed in the Properties area in the Automated Test Result Details dialog box. If type or value attributes are specified in the <skipped> element, they are displayed as failureType and failureValue in the Properties area in the Automated Test Result Details dialog box. |
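For example, a <testcase> element that reports an error might look like the following sketch (the test name, class name, and message are illustrative only):

<testcase name="Load settings" classname="Configuration tests" time="0.2500">
  <!-- message becomes the error message on the result; type is shown as errorType in the Properties area -->
  <error message="Could not connect to database" type="ConnectionError"><![CDATA[ConnectionError: Could not connect to database
    at loadSettings (config-tests.js:14:9)]]></error>
</testcase>

Example XML report file
The following example JUnit XML report, generated by a Mocha test run, contains one <testsuite> element with 100 <testcase> elements, including failed and skipped tests.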
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="Mocha Tests" time="100.0000" tests="100" failures="14" skipped="11">
<testsuite name="Root Suite.100 random tests" timestamp="2022-08-29T17:54:39" tests="100" file="C:\SSMainline\Code\\Java\jenkins\halm-test-management\work\workspace\Pipeline Test\dummy-script.js" time="100.0000" failures="14" skipped="11">
<testcase name="Test 1" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 2" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 3" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 4" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 5" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 6" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 7" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 8" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 9" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 10" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 11" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 12" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 13" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 14" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 15" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 16" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 17" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 18" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 19" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 20" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 21" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 22" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 23" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 24" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 25" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 26" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 27" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 28" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 29" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 30" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 31" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 32" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 33" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 34" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 35" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 36" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 37" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 38" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 39" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 40" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 41" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 42" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 43" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 44" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 45" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 46" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 47" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 48" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 49" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 50" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 51" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 52" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 53" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 54" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 55" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 56" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 57" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 58" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 59" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 60" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 61" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 62" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 63" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 64" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 65" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 66" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 67" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 68" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 69" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 70" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 71" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 72" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 73" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 74" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 75" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 76" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 77" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 78" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 79" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 80" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 81" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 82" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 83" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 84" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 85" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 86" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 87" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 88" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 89" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 90" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 91" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 92" time="1.0000" classname="100 random tests">
<skipped/>
</testcase>
<testcase name="Test 93" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 94" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 95" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 96" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 97" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 98" time="1.0000" classname="100 random tests">
</testcase>
<testcase name="Test 99" time="1.0000" classname="100 random tests">
<failure message="Fail!" type="Error"><![CDATA[Error: Fail!
at Context.<anonymous> (dummy-script.js:9:11)
at processImmediate (node:internal/timers:466:21)]]></failure>
</testcase>
<testcase name="Test 100" time="1.0000" classname="100 random tests">
</testcase>
</testsuite>
</testsuites>