Test cases
Overview
Test cases are the means by which a specific testing scenario is implemented. One or more test cases form the content of a test suite. The following example represents a complete, simple test case for the validation of an invoice that is uploaded by a user.
<?xml version="1.0" encoding="UTF-8"?>
<testcase id="testCase1" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>UBL invoice validation 1</gitb:name>
        <gitb:version>1.0</gitb:version>
        <gitb:description>Test case to verify the correctness of a UBL invoice. The invoice is provided manually through user upload.</gitb:description>
    </metadata>
    <imports>
        <artifact type="schema" name="schema">artifacts/UBL/maindoc/UBL-Invoice-2.1.xsd</artifact>
        <artifact type="object" name="schematron">artifacts/BII/BII_CORE/BIICORE-UBL-T10-V1.0.xsl</artifact>
    </imports>
    <actors>
        <gitb:actor id="User" name="User" role="SUT"/>
    </actors>
    <steps>
        <!--
            Step 1. Request the user to upload the UBL invoice.
        -->
        <interact id="userData" desc="UBL invoice upload">
            <request name="invoice" desc="Upload the UBL invoice to validate" inputType="UPLOAD"/>
        </interact>
        <!--
            Step 2. Validate the uploaded invoice.
        -->
        <verify handler="XmlValidator" desc="Validate invoice">
            <input name="xml">$userData{invoice}</input>
            <input name="xsd">$schema</input>
            <input name="schematron">$schematron</input>
        </verify>
    </steps>
    <output>
        <failure>
            <default>"The test session resulted in a failure. Please check the validation reports and apply required corrections."</default>
        </failure>
    </output>
</testcase>
The following table provides an overview of the attributes and child elements that a testcase
may have. Each of these is discussed in more detail in the subsequent sections.
Name | Required? | Description
---|---|---
@id | yes | A string to uniquely identify the test case by. This is referenced in the test suite XML.
@supportsParallelExecution | no | A boolean flag indicating whether this test case may be executed in parallel with other test cases for a given SUT (default is “true”).
@optional | no | A boolean flag indicating whether this test case is optional (default is “false”). Optional test cases may be executed but their results don’t count towards a conformance statement’s status.
@disabled | no | A boolean flag indicating whether this test case is disabled (default is “false”). Disabled test cases cannot be executed and any existing test results don’t count towards a conformance statement’s status.
metadata | yes | A block containing the metadata used to describe the test case.
namespaces | no | An optional set of namespace declarations to define the namespace prefixes used in the test case’s expressions.
imports | no | An optional set of imports used to load additional resources from the test suite.
preliminary | no | An optional set of user interaction steps to display before the test session starts.
variables | no | An optional set of variables that are used in the test case.
actors | yes | The set of actors that this test case refers to.
steps | yes | The sequence of steps that this test case foresees.
output | no | Definition of an output message to display for the overall test session.
scriptlets | no | Optional named groups of test steps which can be used within the test case.
The id
attribute is important in uniquely identifying the test case within a given test suite, and needs to be referenced by the
test suite if it is to be considered. It is not presented to normal users, only administrators, and is used when
uploading a new version of a test suite to determine whether the test case definition serves as an update to an existing
test case.
The supportsParallelExecution attribute is important in determining how the test case is handled in batch background executions
(i.e. not executions that are interactively launched and followed by a tester). If this is set to “true” (the default value considered
if missing), the test case is assumed to be able to function correctly while other test cases are being executed in parallel for the
same SUT. This means that the design of the test case caters for such concurrent sessions and is able to correctly map exchanged
messages to sessions. This is not always possible, especially in scenarios where messaging is initiated by the SUT (not by the test bed)
or is asynchronous in nature.
If the test case cannot correctly handle such concurrency, you need to set supportsParallelExecution to “false”. Doing so instructs the test
engine to always execute the given test case in isolation. Any ongoing test session will first need to complete before the current test case is executed,
and its own test session will itself need to complete before any other test cases are executed. The order of execution of test cases under such
constraints is defined by their declaration order in the test suite.
Note
When supportsParallelExecution
is set to “false”, the test case’s non-parallel execution is honoured only within the context of a single batch
execution of a test suite. The flag becomes ineffective if the tester explicitly launches separate test sessions in parallel.
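For illustration, the root-level attributes discussed above could be combined as follows. This is a sketch with a hypothetical identifier and name, for a test case that is optional and must always run in isolation:

<testcase id="testCaseIsolated" supportsParallelExecution="false" optional="true" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>Isolated optional test case</gitb:name>
        <gitb:version>1.0</gitb:version>
    </metadata>
    ...
</testcase>

Results of such a test case may be recorded but will not affect the conformance statement’s status, and the test engine will not run it alongside other sessions for the same SUT.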
Elements
We will now see how a test case breaks down into its individual sections and discuss the purpose of each.
Metadata
The metadata
element is basically the same as the one defined for the test suite Metadata. Its purpose is to provide basic information
about the test case to help users understand its purpose. Its structure is as follows:
Name | Required? | Description
---|---|---
name | yes | The name of the test case that is used to identify it to users.
type | no | Either “CONFORMANCE” (the default) or “INTEROPERABILITY”. “INTEROPERABILITY” is used when more than one actor is defined as SUT.
version | yes | A string that indicates the test case’s version.
authors | no | A string to indicate the test case’s authors.
description | no | A string to provide a user-friendly description of the test case that is displayed to users.
published | no | A string acting as an indication of the test case’s publishing time.
lastModified | no | A string acting as an indication of the last modification time for the test case.
documentation | no | Rich text content that provides further information on the current test case.
update | no | Instructions determining the default choices when an update of this test case is taking place.
tags | no | Optional tags used to record additional metadata for the test case and visually highlight its attributes.
specification | no | Optional information regarding the test case’s normative specification reference.
Note
GITB software support: The test case type must currently be set to “CONFORMANCE” (the default value) as the
“INTEROPERABILITY” type is not supported. In addition, the version, authors, published and lastModified
values are recorded but never used or displayed.
documentation
The documentation
element complements the test case’s description
by allowing the author to include extended rich text documentation as HTML. The structure of this element is as follows:
Name | Required? | Description
---|---|---
import | no | A reference to a separate file within the test suite archive that defines the documentation content.
from | no | The identifier of a test suite from which the imported file will be looked up. If unspecified the current test suite is assumed.
encoding | no | In case an import is specified, the character encoding to use when reading the file’s content (default is “UTF-8”).
Using the above attributes to specify a reference to a separate file is not mandatory. The documentation’s content can also be provided as the element’s text content,
typically enclosed within a CDATA section if this includes HTML elements (in which case the from
, import
and encoding
attributes are omitted).
When loading documentation from a separate file, it is also possible to look up this file from another test suite. This is
done by specifying the id of the target test suite as the value of the from attribute. The target test suite is looked up as follows:
Look for the test suite in the same specification as the current test case.
If not found in the same specification, look for the test suite in the other specifications of the test case’s domain. If multiple matching test suites are found across specifications, one of them will be picked arbitrarily. To avoid this, you should ensure that test suites used to load shared resources can be uniquely identified.
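Loading the documentation from a separate file could then look as follows. This is a sketch: the file path "documentation/testCase1.html" and the test suite identifier "sharedSuite" are hypothetical:

<testcase id="TS1-TC1" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>Test case 1</gitb:name>
        <gitb:version>1.0</gitb:version>
        <!-- Load the documentation from an HTML file bundled in another test suite's archive. -->
        <gitb:documentation import="documentation/testCase1.html" from="sharedSuite" encoding="UTF-8"/>
    </metadata>
    ...
</testcase>

Omitting the from attribute would resolve the path against the current test suite’s archive instead.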
This documentation can provide further information on the context of the test case, diagrams or reference information that are useful to understand how it is to be completed or its purpose within the overall specification. The content supplied supports several HTML features:
Structure elements (e.g. headings, text blocks, lists).
In-line styling.
Tables.
Links.
Images.
The simplest way to provide such information is to enclose the HTML content in a CDATA section to ensure the XML remains well-formed. The following sample provides an example of this approach:
<testcase id="TS1-TC1" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>Test case 1</gitb:name>
        <gitb:version>1.0</gitb:version>
        <gitb:description>A short description of the test case to offer a short summary of its purpose.</gitb:description>
        <gitb:documentation><![CDATA[
            <p>Extended documentation for <b>Test case 1</b></p>
            <p>This is an example to support the <a href="https://www.itb.ec.europa.eu/docs/tdl/latest">GITB TDL docs</a>.</p>
        ]]></gitb:documentation>
    </metadata>
    ...
</testcase>
Note that documentation such as this is also supported for:
The overall test suite.
Individual test case steps.
update
The update
element allows the test suite’s developer to prescribe what should happen when this test case is being uploaded and
an existing test case with the same identifier is found. Through this you can define if the test case’s existing metadata
(e.g. name, description and documentation) should be updated to match the definitions from the new archive. In addition, you can
specify whether the testing history linked to the test case being updated should be reset. Note that these choices represent the
default selected options during the test suite upload, and can always be verified and replaced by the Test Bed’s operator.
The update element is particularly useful when the test developer is not the person performing the test suite upload. It avoids
the need to provide detailed instructions to operations staff, as the relevant choices are already encoded within the test suite archive itself.
The structure of the update
element is as follows:
Name | Required? | Description
---|---|---
@updateMetadata | no | A boolean value determining whether the existing test case’s metadata should be updated based on the new archive (default is “false”).
@resetTestHistory | no | A boolean value determining whether any previously executed test sessions for the test case being updated should be considered as obsolete (default is “false”).
The following example shows how you can specify that the test case’s metadata should be updated to reflect the new values in the archive
(see attribute updateMetadata). We also specify that any existing test sessions should be considered obsolete, forcing users to
re-execute their tests for the updated version (see attribute resetTestHistory).
<testcase id="TS1-TC1" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>Test case 1</gitb:name>
        <gitb:version>1.0</gitb:version>
        <gitb:description>A short description of the test case to offer a short summary of its purpose.</gitb:description>
        <gitb:update updateMetadata="true" resetTestHistory="true"/>
    </metadata>
    ...
</testcase>
Relevant options to manage updates at test suite level are possible through a similar update
element of the test suite definition.
specification
The specification element is an optional part of a test case’s metadata that allows you to record, in a structured manner, a normative specification
reference for the test case. Besides being present in the test case definition, this information will also be rendered appropriately in the test case’s
on-screen display and in reports.
The structure of the specification
element is as follows:
Name | Required? | Description
---|---|---
reference | no | The reference identifier or code.
description | no | A text describing the referred specification.
link | no | A link to allow navigation to the referred specification’s online documentation.
All the above elements are optional, meaning that you can choose to provide any documentation you see fit for the specification. Depending on what is provided, this information will be displayed accordingly, presenting for example the reference as a link if both are provided, or presenting only a link icon if only the link is present.
The following example illustrates how this metadata could be used to identify the specification section relevant to the test case and point to its online documentation.
<testcase id="TS1-TC1" xmlns="http://www.gitb.com/tdl/v1/" xmlns:gitb="http://www.gitb.com/core/v1/">
    <metadata>
        <gitb:name>Test case 1</gitb:name>
        <gitb:version>1.0</gitb:version>
        <gitb:description>A short description of the test case to offer a short summary of its purpose.</gitb:description>
        <gitb:specification>
            <gitb:reference>Section-1.2.A</gitb:reference>
            <gitb:description>Security requirements</gitb:description>
            <gitb:link>https://my.spec.wiki.org</gitb:link>
        </gitb:specification>
    </metadata>
    ...
</testcase>
Note
Similar specification reference information can also be added to test suites.
Namespaces
The namespaces optional element is used to declare namespace mappings for use within the test case. The primary use case of these namespaces is to allow
the definition of prefixes used in XML and XPath expressions. In principle they could be applied to any type of language or expression that has such a concept
(e.g. JSON-LD, Turtle) but currently their use is limited to XML.
Each namespace to declare is defined as a child ns
element with the following structure:
Name | Required? | Description
---|---|---
@prefix | yes | The namespace prefix that will be used in expressions.
The value to which the prefix
is mapped is provided as the ns
element’s text content.
Namespaces declared using this approach can be used in two cases:
Within any GITB TDL step that supports expressions.
As the expression to apply for the XPathValidator embedded validation handler.
The following example illustrates how namespaces can be used for XML-based processing. The sample test case:
Requests an invoice from the user.
Extracts the invoice’s type using namespaces in an assign step and then logs it.
Validates the invoice’s type using namespaces with the XPathValidator.
<testcase>
    <!--
        Declare the namespaces to be used in XPath expressions.
    -->
    <namespaces>
        <ns prefix="inv">urn:oasis:names:specification:ubl:schema:xsd:Invoice-2</ns>
        <ns prefix="cbc">urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2</ns>
    </namespaces>
    <steps>
        <!--
            Request the user to upload the invoice to validate.
        -->
        <interact id="input" desc="Upload invoice">
            <request desc="File upload" name="xml" inputType="UPLOAD"/>
        </interact>
        <!--
            Use an XPath expression to extract the invoice type as an XML element.
        -->
        <assign to="invoiceTypeElement" type="object" source="$input{xml}">/inv:Invoice/cbc:InvoiceTypeCode</assign>
        <!--
            Log the extracted element.
        -->
        <log>$invoiceTypeElement</log>
        <!--
            Use XPath to validate the invoice.
        -->
        <verify handler="XPathValidator" desc="Check invoice type">
            <input name="xmldocument">$input{xml}</input>
            <input name="xpathexpression">"/inv:Invoice/cbc:InvoiceTypeCode/text() = '380'"</input>
        </verify>
    </steps>
</testcase>
Defining alternative expression languages
A further, experimental use case for the namespaces
element is to define additional expression languages for use within the test case. This needs to
be done when expressions are used that should not be processed using the default XPath language. A detailed discussion on GITB expressions as well
as where and how you can use them is provided in Expressions.
In this case the alternative languages are defined using ns
elements, of which the prefix
attributes define how they are to be referenced. A TDL step
that supports expressions can then define the expression language to consider using its lang
attribute. To illustrate how this works consider a test case
in which we declare to be using expressions as JavaScript. We will use JavaScript for conditional checks on a number to determine a result and
illustrate how this is done in the default XPath.
<testcase>
    <namespaces>
        <ns prefix="JavaScript"/>
    </namespaces>
    <steps>
        <!--
            Assignment using the default XPath.
        -->
        <assign to="result">if ($var = 1) then 'result1' else 'result2'</assign>
        <!--
            Assignment using JavaScript.
        -->
        <assign to="result" lang="JavaScript">if ($var == 1) { return 'result1' } else { return 'result2' }</assign>
    </steps>
</testcase>
Use of alternative expression languages, and their definition through the namespaces
element, is tricky because we need to know exactly how the target test bed refers
to the language (i.e. “JavaScript” in our case) to correctly identify it. Furthermore it must be clear how the test bed will process the expression and how session context
variables are looked up. In the above example we assume that context variables (e.g. “$var”) are looked up in exactly the same way as with XPath expressions and that the
entire expression will be evaluated by first wrapping it in a function, the result of which is returned as the assignment output. Apart from actually supporting JavaScript
for expressions, these additional details need to first be defined unambiguously by the test bed and made known to its users. Only then can we use them in a deterministic
and portable manner.
Note
GITB software support: Using the namespaces
element to define expression languages other than the default XPath is currently not supported.
Imports
The imports
element allows the use of arbitrary resources from the same or another test suite. This can be very useful when a test case needs to send messages
based on a template or load a binary file that is needed as input by a messaging, processing or validation handler (e.g. a certificate). The imports
element
defines one or more artifact
children with the following structure:
Name | Required? | Description
---|---|---
@name | yes | The name with which this artefact will be associated to the test session context for subsequent lookups.
@type | yes | The type as which the artefact needs to be loaded.
@encoding | no | In case the artefact is to be treated as text, this is the character encoding to apply when reading its bytes (default is “UTF-8”).
@from | no | The identifier of another test suite from which this resource will be loaded. If unspecified the current test suite is assumed.
The text value of the artifact
element is the path within the test suite from which the relevant resource will be loaded. This path may be provided as a
fixed value or as a variable reference to determine the imported resource dynamically. In case a variable reference
is provided this should be one of the following:
A reference to a configuration value (i.e. a domain, organisation, system or actor parameter).
A reference to a variable defined in the test case. In this case the value of the variable can even be adapted during the course of the test session resulting in different resources depending on the point at which the import is referenced.
Importing resources is not limited to the current test suite. Using the from attribute it is possible to define another
test suite as the source from which to look up the resource, specifying the identifier of the target test suite as the
attribute’s value. The lookup of the test suite using the from value is carried out as follows:
Look for the test suite in the same specification as the test case being executed.
If not found in the same specification, look for the test suite in the other specifications of the test case’s domain. If multiple matching test suites are found across specifications, one of them will be picked arbitrarily. To avoid this, you should ensure that test suites used to load shared resources can be uniquely identified.
Regarding the type attribute, this needs to refer to an appropriate type from the GITB type system (see Types). Given that in this case we are referring to a file
being loaded, the types that can be used are:
binary: Load the artefact as a set of bytes without additional processing.
object: Load the artefact as an XML Document Object Model. In this case it is best to also explicitly provide the encoding to consider.
schema: Load the artefact as an XML Schema or Schematron file. As in the object case it is best to explicitly provide the encoding to consider.
Regarding the path to the resource, this is the resource’s path within the test suite archive (with or without the test suite ID as a prefix). As an
example consider the following test case fragment where an XML schema is loaded and set in the session context as a variable of type schema
named “ublSchema”. The path specified suggests that the file is named “UBL-Invoice-2.1.xsd” and exists in a folder within the test suite archive
named “resources”. This example also includes another import whose referenced resource is defined dynamically based on an external configuration
parameter (at organisation level in this case).
<testcase>
    <imports>
        <!--
            The "ublSchema" is loaded from a fixed resource within the test suite.
        -->
        <artifact type="schema" encoding="UTF-8" name="ublSchema">resources/UBL-Invoice-2.1.xsd</artifact>
        <!--
            The "organisationSpecificSchema" is loaded dynamically based on an organisation-level configuration property named "xsdToUseForOrganisation".
        -->
        <artifact type="schema" encoding="UTF-8" name="organisationSpecificSchema">$ORGANISATION{xsdToUseForOrganisation}</artifact>
    </imports>
    <steps>
        <verify handler="XmlValidator" desc="Validate invoice against UBL 2.1 Invoice Schema">
            <!--
                Variable $fileContent is loaded in another step.
            -->
            <input name="xml">$fileContent</input>
            <input name="xsd">$ublSchema</input>
        </verify>
    </steps>
</testcase>
It is also possible to import resources from other test suites. To do this you use the from
attribute identifying the
test suite that contains the resource, in which case the provided path is resolved in the context of the other test suite.
<testcase>
    <imports>
        <!--
            The "ublSchema" is loaded from a fixed resource within a test suite with identifier "testSuite2".
        -->
        <artifact type="schema" encoding="UTF-8" name="ublSchema" from="testSuite2">resources/UBL-Invoice-2.1.xsd</artifact>
    </imports>
</testcase>
Note
Test module import: The GITB TDL schema also allows module elements to be defined for the import of test modules (validation,
messaging and processing handlers). This approach is no longer supported as it required the handler implementations to be bundled within
the test bed itself. The preferred and simpler approach now is to define the handler directly in the respective test step (e.g. the verify
step’s handler attribute for validators) without previously importing it.
Preliminary
The preliminary
element allows the test case to interact with the user before the test session begins. The purpose here is to allow the
user to provide preliminary input or be informed of certain actions that need to take place before the test session starts. In terms of structure
and use, the preliminary
element is a UserInteraction
element (see interact). The difference is that the interaction takes place before the
test session actually starts.
The following example shows a test case that prompts the user before starting to initialise their server and upload a configuration file.
<testcase>
    <preliminary desc="Prepare your system" with="User">
        <instruct desc="Preparation instructions" with="User" type="string">"Make sure your system is up and running"</instruct>
        <request desc="Provide your configuration file" with="User" contentType="BASE64">$sutConfigFile</request>
    </preliminary>
    <variables>
        <var name="sutConfigFile" type="binary"/>
    </variables>
    <actors>
        <gitb:actor id="User" name="User" role="SUT"/>
    </actors>
    <steps>
        <!--
            The provided file can be referenced as $sutConfigFile
        -->
    </steps>
</testcase>
Actors
The actors
element is where the test case defines the actors involved in its steps and, importantly, their role. It contains
one or more actor
children with the following structure:
Name | Required? | Description
---|---|---
@id | yes | The actor’s unique (within the specification) ID. This must match an actor ID specified in the test suite.
@name | no | The name to display for the actor. This can differ from the ID to display an actor name specific to the test case. Not specifying this will default to the name for the actor provided in the test suite.
@role | no | The actor’s role in the test case. This is “SUT” if the actor is the focus of the test case, “SIMULATED” (the default value) if the actor is simulated by the test bed, or “MONITOR” if the actor is present for monitoring purposes.
@displayOrder | no | A number indicating the relative positioning that needs to be respected when displaying the actor in the test case’s execution diagram. Setting this here overrides any corresponding setting at test suite level (see Actors for details).
endpoint | no | An optional sequence of configuration endpoints if the actor is simulated.
The main purpose of the actors element in the test case is to identify which of the actors defined in the test suite
is the SUT (the actor for which the target system is being tested). This is done simply by defining the role attribute as follows:
<testcase>
    <actors>
        <gitb:actor id="sender" role="SUT"/>
        <!-- The "SIMULATED" role is considered by default. -->
        <gitb:actor id="receiver"/>
    </actors>
</testcase>
Besides defining the actors involved in the test case, you can also override their presentation by means of the name
and displayOrder
attributes:
<testcase>
    <actors>
        <gitb:actor id="sender" role="SUT" name="Message sender" displayOrder="0"/>
        <gitb:actor id="receiver" name="Message receiver" displayOrder="1"/>
    </actors>
</testcase>
Actor endpoint elements used in test cases require a bit more explanation to understand their use. They serve a niche case: test suites
whose test cases define multiple actors as SUTs, with actor-level configuration properties foreseen for each such actor. In practice,
a simpler and typically more flexible approach is to use several system-level configuration properties.
If you still require actor-level configuration for such cases, you can use the actors’ endpoint
elements to define default configuration values for
simulated actors. Imagine a specification that defines “sender” and “receiver” actors that can both be the SUTs depending on the actor a system selects to test for.
As such, a test suite focusing on the sender will include test cases with the sender as the SUT and the receiver as being simulated. Similarly, a
test suite focusing on the receiver will define the receiver as SUT and the sender as simulated. In terms of configuration properties, the sender
might need to define a “replyToAddress” to receive replies, whereas the receiver simply needs to define its “deliveryAddress”, which is where messages
are expected. In terms of actor configuration in the test suite this would look like this:
<testsuite>
    <actors>
        <gitb:actor id="sender">
            <gitb:name>Sender</gitb:name>
            <gitb:endpoint name="info">
                <gitb:config name="replyToAddress" desc="The address to return replies to" kind="SIMPLE"/>
            </gitb:endpoint>
        </gitb:actor>
        <gitb:actor id="receiver">
            <gitb:name>Receiver</gitb:name>
            <gitb:endpoint name="info">
                <gitb:config name="deliveryAddress" desc="The address to receive messages on" kind="SIMPLE"/>
            </gitb:endpoint>
        </gitb:actor>
    </actors>
</testsuite>
Depending on the test case at hand, the user will be expected to provide the appropriate configuration parameters. For example, in a conformance statement for the sender, the applicable test cases will be those defining the sender actor as the SUT, and the “replyToAddress” parameter will need to be entered before starting the test. How is the “deliveryAddress” then provided for the simulated receiver actor? This can be achieved in two ways:
Dynamically through a custom messaging handler. Using this approach, the test bed, while in its initiation phase, will request configuration properties from the handler that will be mapped to the SUT’s corresponding endpoint (see Endpoints for dynamic configuration values).
Statically by defining the endpoint and one or more of its parameters within the test case itself.
The second option is why we are able to configure endpoint
elements as part of the test case. The values configured here will be used only
if not already specified by the response of the simulated actor’s handler. The below snippet illustrates this considering the sender as the SUT:
<testcase>
    <actors>
        <gitb:actor id="sender" name="sender" role="SUT"/>
        <gitb:actor id="receiver" name="receiver">
            <gitb:endpoint name="info">
                <gitb:config name="deliveryAddress">SIMULATED_ADDRESS</gitb:config>
            </gitb:endpoint>
        </gitb:actor>
    </actors>
    <steps>
        <!--
            receiver's address can be referenced by $sender{receiver}{deliveryAddress}
        -->
    </steps>
</testcase>
Note
GITB software support: The “MONITOR” value for the actor role
is currently not supported.
Variables
Note
Implicit variables: Variables are automatically created for new assignments. You should typically never need to declare a variable explicitly.
The variables
element can be defined to create one or more variables that will be used during the test case’s execution. It contains one
or more var
elements, one per variable, with the following structure:
Name | Required? | Description
---|---|---
@name | yes | The name of the variable. It is with this name that the variable can be referenced.
@type | yes | The type of the variable. One of the GITB data types can be used (see Types).
value | no | One or more values for the variable. More than one value is applicable in case of a map or list type.
Variables can be used to record arbitrary information for the duration of the test session. These can be fixed values defined along with the
variable’s definition or dynamically produced values resulting from test steps. The way to reference variables is defined based on the expression
language in place. Using the default XPath 3.0 expression language a variable named myVar
is referenced as $myVar
. More information on
expressions to reference variable values is provided in Expressions.
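As a minimal sketch of this notation, a variable declared with an initial value can later be referenced in any expression-aware step (the variable name used here is hypothetical):

<testcase>
    <variables>
        <var name="expectedCode" type="string">
            <value>380</value>
        </var>
    </variables>
    <steps>
        <!--
            Reference the variable using the default XPath notation.
        -->
        <log>$expectedCode</log>
    </steps>
</testcase>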
Definition of variables using the variables
element is optional given that test steps resulting in output will automatically create
variables as needed to store the output in the test session context. Such steps include:
Assign steps that define new values or calculate expressions.
User interaction steps that request data from the user.
Messaging steps to record the output of a send or a receive.
Processing steps to record the output of the process.
The type of the automatically created variables in the above cases is inferred from the type of the relevant data or expression result. For example,
when assigning a string to a variable, this will automatically be set with a string type. Considering this, you would use the variables element
to predefine variables in the following cases:
To predefine all variables if you prefer this from the perspective of code organisation.
To explicitly set the type of variables in cases where the automatic determination is not suitable (e.g. force a string type for a numeric value).
To cover exceptional cases where automatic type determination is not possible.
To provide initial values to variables.
For examples of automatic variable definition refer to the corresponding steps as well as the documentation on expressions. Coming back to explicitly defined variables, the following example shows two such cases, one to store a user-uploaded file and another to store a part of it, extracted via XPath:
<testcase>
<imports>
<artifact type="schema" encoding="UTF-8" name="schemaFile">testSuite/artifacts/UBL/maindoc/UBL-Invoice-2.1.xsd</artifact>
</imports>
<variables>
<var name="fileContent" type="object"/>
<var name="targetElement" type="object"/>
</variables>
<steps>
<!--
Store the uploaded result in the fileContent variable.
-->
<interact desc="UBL invoice upload" with="User">
<request desc="Upload the UBL invoice to validate" with="User" contentType="BASE64">$fileContent</request>
</interact>
<!--
Extract a part of it and store in the targetElement variable.
-->
<assign to="$targetElement" source="$fileContent">/*[local-name() = 'testcase']/*[local-name() = 'steps']</assign>
<!--
Pass the targetElement for validation.
-->
<verify handler="XmlValidator" desc="Validate content">
<input name="xml">$targetElement</input>
<input name="xsd">$schemaFile</input>
</verify>
</steps>
</testcase>
Setting a variable’s initial value is achieved using the value
element, with one or more being used in case of a map
or list
type. The following example illustrates setting values for different variable types:
<testcase>
<variables>
<var name="aList" type="list[string]">
<value>List value 1</value>
<value>List value 2</value>
</var>
<var name="aMap" type="map">
<value name="key1" type="string">Map value 1</value>
<value name="key2" type="string">Map value 2</value>
</var>
<var name="aString" type="string">
<value>A string value</value>
</var>
</variables>
</testcase>
Note
List variables: When a list
is defined as a variable it must also specify its internal element type. To do this you
need to specify the type
attribute as list[INTERNAL_TYPE]
. For example a list
of string
elements is defined as
<var name="myList" type="list[string]"/>
.
Steps
The steps
element is where the test case’s testing logic is implemented. It consists of a sequence of test steps realised by means
of a GITB TDL step construct. The structure of the element is as follows:
Name |
Required? |
Description |
---|---|---|
@stopOnError |
no |
A boolean flag determining whether the test session should stop if any step fails (default is “false”). See also Stop a test session upon errors. |
@logLevel |
no |
The minimum logging level that this test case should produce. This can be (in increasing severity) DEBUG, INFO, WARNING or ERROR. |
Test case logging
The test case’s logging level affects log statements produced automatically by the test bed or added explicitly by the test case. While executing a test session, the test bed automatically produces the following log output:
At DEBUG level, information on each step's start, end and latest status.
At INFO level, information on key lifecycle points such as the start and end of the session.
At WARNING level, detected issues that, although not blocking for the test session, could be signs of problems.
At ERROR level, information on unexpected errors that forced the test session to fail.
When using the logLevel
attribute to set the test case’s log level, this defines the minimum level of messages to be added to the
session’s log. A good example is setting the logLevel
to WARNING
which will exclude all DEBUG
and INFO
output while including all output
of levels WARNING
and ERROR
. This can be useful if you want to share only test engine errors and problematic issues signalled by
using the log step while ignoring status updates.
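As a brief sketch of this, a test case logging only warnings and errors could set the level as follows:

```xml
<testcase>
  <steps logLevel="WARNING">
    <!-- Excluded from the session log (below the WARNING threshold). -->
    <log level="INFO">'Progress update'</log>
    <!-- Included in the session log. -->
    <log level="WARNING">'Potential issue detected'</log>
  </steps>
</testcase>
```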
In certain cases you may prefer to set the test case logging level dynamically. This is achieved by using a variable reference as the
logLevel
value, referring either to a configuration property or a predefined test case variable.
Interestingly, when referring to a variable and given that the provided expression is calculated every time, you can adapt the test case’s
logging level during the course of the test session. You may for example start with a WARNING
level but switch to INFO
for a specific set of steps. An example of this is illustrated below:
<testcase>
<variables>
<var name="loggingLevel" type="string">
<value>WARNING</value>
</var>
</variables>
<steps logLevel="$loggingLevel">
<!--
The following log entry is ignored as we only log at WARNING level and above.
-->
<log level="INFO">'An info message'</log>
<!--
For the group that follows switch to INFO level.
-->
<assign to="loggingLevel">'INFO'</assign>
<group>
...
</group>
<!--
Switch back to WARNING level.
-->
<assign to="loggingLevel">'WARNING'</assign>
...
</steps>
</testcase>
Available steps
The test case’s steps are defined as children of the steps
element. The available test steps that can be defined are:
Send a message to an actor
Receive a message from an actor
Listen for exchanged messages between actors
Begin a messaging transaction
End a messaging transaction
Process a set of inputs to get an output
Begin a processing transaction
End a processing transaction
Apply if-else logic to conditionally execute test steps
Loop over a set of steps while a condition is true
Repeat a set of steps while a condition is true (executing at least once)
Execute a set of steps a fixed number of times
Execute sets of steps concurrently
Immediately terminate the test session
Process an expression and assign its output to a variable
Log a message in the test session log
Display a set of steps as a logical group
Validate content
Call a scriptlet
Trigger an interaction with the user
Output
The output
element is an optional means of defining a final result message for a given test session. It is processed once all
steps have completed, checking the data in the test session’s context to display specific success or failure messages. Using this
element allows extended feedback to be returned that may be important to summarise and contextualise the steps’ results.
The output
element supports both success and failure messages defined as different cases, each having a match condition, and an
overall default. The structure of the output
element is as follows:
Name |
Required? |
Description |
---|---|---|
success |
no |
The set of output cases to apply in case the test completes with a success. |
failure |
no |
The set of output cases to apply in case the test completes with a failure. |
The success
and failure
elements share a common structure to define their specific cases and overall defaults:
Name |
Required? |
Description |
---|---|---|
case |
no |
Zero or more specific cases to apply depending on the provided match conditions. |
default |
no |
An optional default if no specific case was found to apply. |
Finally, each case
element shares a common structure as follows:
Name |
Required? |
Description |
---|---|---|
cond |
yes |
Defines a condition expression expected to return a “true” or “false” value. |
message |
yes |
Defines an expression expected to return the output message (as a string). |
The output
section is flexible as it doesn’t require you to define both success and failure messages. In addition, you could
opt only to provide default messages without specific cases or only provide certain specific messages without generic defaults.
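For example, a minimal output section could define only a default success message, with no failure messages and no specific cases:

```xml
<output>
  <success>
    <default>"All checks completed successfully."</default>
  </success>
</output>
```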
It is important to note that the condition expressions tolerate failures, meaning that if a condition cannot
be evaluated its relevant case will simply be skipped. In addition, if any error is raised when creating the text message itself, the output
message will be altogether skipped; under no circumstances will a test session fail due to the evaluation of its output
section
(relevant warnings will however be included in the test session’s log).
The following snippet illustrates how the output
section could be leveraged to return user-friendly failure messages based on the executed
test steps (using the STEP_STATUS variable to determine the cause of the failure):
<testcase>
<steps stopOnError="true">
...
<verify id="checkIntegrity" desc="Validate integrity">
...
</verify>
<verify id="checkSyntax" desc="Validate syntax">
...
</verify>
<verify id="checkContent" desc="Validate business rules">
...
</verify>
...
</steps>
<output>
<!-- We skip the "success" element as we only want failure messages. -->
<failure>
<case>
<cond>$STEP_STATUS{checkIntegrity} = 'ERROR'</cond>
<message>"Please verify the integrity of your data and re-submit."</message>
</case>
<case>
<cond>$STEP_STATUS{checkSyntax} = 'ERROR'</cond>
<message>"Please verify the syntax of your data and re-submit."</message>
</case>
<case>
<cond>$STEP_STATUS{checkContent} = 'ERROR'</cond>
<message>"Please verify your data content and re-submit."</message>
</case>
<!-- The default will be applied if no specific case was found to match. -->
<default>"Your data failed to be processed correctly. Please check the session log to determine the cause of the failure."</default>
</failure>
</output>
</testcase>
Note
Messages are also expressions: Output messages are themselves expressions allowing dynamic output to be returned (e.g. concatenating text with session variable values). For a fixed message make sure to enclose the text in quotes.
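As a sketch of such a dynamic message (the fileName variable is hypothetical), fixed text can be concatenated with session values:

```xml
<message>concat("Validation failed for file ", $fileName, ". Please correct it and re-submit.")</message>
```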
Scriptlets
The scriptlets
element is meant to define reusable blocks of steps that can be called during the test case’s execution.
They resemble function blocks in programming languages considering that they can be called multiple times with different
inputs and produce different outputs.
Scriptlets are typically defined as separate XML documents that can be used across test cases or even across test suites.
Defining a scriptlet within a test case results in it being private to its containing test case. To define such private
scriptlets, include the scriptlets
element with one or more scriptlet
children.
Details on how each scriptlet
element is defined are provided in the scriptlet documentation. This
includes the differences to consider when comparing scriptlets embedded in test cases and ones
that are defined as standalone XML documents.
Calling a scriptlet from a test case is achieved through the call step. The following example illustrates the definition of a scriptlet within a test case to validate XML documents. This is called twice for each of the inputs provided by the user.
<testcase>
<steps>
<!--
Request two files to be uploaded.
-->
<interact id="userData" desc="Upload files">
<request desc="Upload the first file" name="file1" contentType="BASE64"/>
<request desc="Upload the second file" name="file2" contentType="BASE64"/>
</interact>
<!--
Call the scriptlet for the first file and store the result under variable "call1".
-->
<call id="call1" path="validateDocument">
<input name="contentToValidate">$userData{file1}</input>
</call>
<!--
Call the scriptlet for the second file and store the result under variable "call2".
-->
<call id="call2" path="validateDocument">
<input name="contentToValidate">$userData{file2}</input>
</call>
<!--
Log the root element names of the validated files.
-->
<log>concat("File 1: ", $call1{rootName})</log>
<log>concat("File 2: ", $call2{rootName})</log>
</steps>
<scriptlets>
<scriptlet id="validateDocument">
<imports>
<artifact type="schema" encoding="UTF-8" name="schemaToUse">resources/aSchemaFile.xsd</artifact>
<artifact type="object" encoding="UTF-8" name="schematronToUse">resources/aSchematronFile.sch</artifact>
</imports>
<params>
<var name="contentToValidate" type="object"/>
</params>
<steps>
<verify handler="XmlValidator" desc="Validate XML structure">
<input name="xml">$contentToValidate</input>
<input name="xsd">$schemaToUse</input>
</verify>
<verify handler="XmlValidator" desc="Validate business rules">
<input name="xml">$contentToValidate</input>
<input name="schematron">$schematronToUse</input>
</verify>
</steps>
<output name="rootName" source="$contentToValidate">name(/*)</output>
</scriptlet>
</scriptlets>
</testcase>
Note
Using scriptlets across test cases: Scriptlets defined within test cases are private to that test case. If you want to use a scriptlet across several test cases, within the test suite or across test suites, you need to define it in its own XML document. See the scriptlet documentation for details on this.
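As a sketch (the path shown is hypothetical), calling such a standalone scriptlet would reference the XML document's location rather than an internal scriptlet identifier:

```xml
<!-- Illustrative path to a scriptlet defined as a separate XML document within the test suite. -->
<call id="externalCall" path="scriptlets/validateDocument.xml">
  <input name="contentToValidate">$userData{file1}</input>
</call>
```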