Description

The z/OS Utility plugin includes steps for retrieving and deploying IBM z/OS artifacts. It is installed and upgraded as part of the HCL Launch server. The plugin works with all supported versions of the Launch server and agents; however, some newer features require the agent to be upgraded to the version that introduced them. If you attempt to use a feature that your agent version does not support, the process might fail with a message asking you to upgrade the agent. This plugin requires agents that run on the z/OS platform.

Quick Info

Product
HCL Launch
Type
plugin
Created by
HCL Software
Version Name File
56.1098848 launch-zos-56.1098848.zip
56.1102074 launch-zos-56.1102074.zip
57.1121666 launch-zos-57.1121666.zip
57.1121798 launch-zos-57.1121798.zip
57.1121803 launch-zos-57.1121803.zip
58.1122539 launch-zos-58.1122539.zip

Summary

The z/OS Utility plugin includes steps for retrieving and deploying IBM z/OS artifacts. The plugin includes steps that are related to deploying z/OS artifacts, running z/OS commands, submitting and tracking jobs, and working with data sets.

The plugin also includes the Generate Artifact Information step, which scans version artifacts and generates text based on a template. The output text can be used as an input property to subsequent steps. Use the Generate Artifact Information to process data sets or members in a component version. You can also use the Generate Artifact Information step to select a set of artifacts to process, by applying filters on data set names, member names, deployment types, and custom properties.

This plugin includes one or more steps. Click Steps for step details and properties.

Compatibility

This plugin requires HCL Launch agents that run on the z/OS platform.

The Submit Job and Wait For Job steps require the job server component that is included with IBM UrbanCode Deploy, Rational Team Concert, or Rational Developer for System z.

This plugin requires IBM z/OS version 2.1 or later.

This plugin requires Java 8 or above.

Installation

See Installing plugins in HCL Launch for installing and removing plugins.

History

The following table describes the changes made in each plugin version.

Plugin history details
Version Description
56 Fixed newline character parsing in the Generate Artifact Information Template input.

Usage

To use the z/OS Utility plugin, you must install and configure the z/OS deployment tools. For information on the z/OS deployment tools, see Deploying to the z/OS platform in the HCL Launch product documentation.
The job server component must be configured before you run the Submit Job and Wait For Job steps.

Running MVS system commands

The Run MVS Command step uses the Java programming interface to the System Display and Search Facility (SDSF) to run MVS system commands on the agent. To use the Run MVS Command step, you must work with your system administrator to configure security properly for the agent user account. In the following examples, protecting a resource by setting its universal access authority (UACC) to NONE prevents all users, except users with explicit permission, from accessing the protected command.

The agent user account must be authorized to use SDSF from Java and must be authorized to issue MVS slash (/) commands from SDSF. MVS commands are protected by defining a resource name in the SDSF class, as shown in the following table.

Resource name Class Access
ISFOPER.SYSTEM SDSF READ

If the SDSF class is not activated yet, use the following command to activate it.

SETROPTS CLASSACT(SDSF)

To use the Resource Access Control Facility (RACF) to authorize the use of an MVS command, issue commands similar to the commands in the following examples:


RDEFINE SDSF ISFOPER.SYSTEM UACC(NONE)
PERMIT ISFOPER.SYSTEM CLASS(SDSF) ID(userid or groupid) ACCESS(READ)

Additionally, the agent user account must be authorized to use the ULOG command to view command responses. MVS commands can return responses to the user console and to the user log (ULOG). The ULOG command is protected as a resource in the SDSF class, as shown in the following table.

Resource name Class Access
ISFCMD.ODSP.ULOG.jesx SDSF READ

To use the Resource Access Control Facility (RACF) to authorize the use of the ULOG command, issue commands similar to the commands in the following example.


RDEFINE SDSF ISFCMD.ODSP.ULOG.* UACC(NONE)
PERMIT ISFCMD.ODSP.ULOG.* CLASS(SDSF) ID(userid or groupid) ACCESS(READ)

Run the following command to make the profile changes take effect.

SETROPTS RACLIST(SDSF) REFRESH

For more information on setting up SDSF security, see System Display and Search Facility.


Using custom properties in deployments

You can add custom properties to data sets or to members when you create component versions. The Generate Artifact Information step can then use the custom properties to generate commands or other input for subsequent steps in the process.

Before you can use the Generate Artifact Information step, a component version must be deployed by using the Deploy Data Sets step.

In the following example, a custom property is used to generate IBM DB2 database commands.

The following shiplist file adds the DB2 plan name as a custom property on the DBRM data set:


<manifest type="MANIFEST_SHIPLIST">
  <container name="TONY.MORT.DEV.LOAD" type="PDS" deployType="CICS_LOAD">
    <resource name="JKECMORT" type="PDSMember"/>
  </container>
  <container name="TONY.MORT.DEV.DBRM" type="PDS" deployType="DBRM">
    <property name="plan" value="TONY"/>
    <resource name="*" type="PDSMember"/>
  </container>
</manifest>

When you create a component version by using this shiplist file, the custom property is visible in the version artifacts view. Properties added to a data set are also visible to all members of the data set.

In the following deployment process, the FTP Artifacts and Deploy Data Sets steps deploy the members to the target system. The Generate Artifact Information step generates TSO commands that are then used to run the REXX BIND commands. The generated commands contain the DB2 plan name from the custom property. The generated commands are then run by the Run TSO or ISPF Command.



Use ${propertyName} to refer to a custom property. In the previous example, TEST.REXX(BIND) is a REXX script that accepts plan, library, and member values as parameters and then runs the DB2 DSN BIND command.
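For illustration, a template similar to the following could generate one BIND command for each DBRM member. The ${dataset} and ${member} values come from the built-in properties, and ${plan} comes from the custom property in the shiplist; the exact parameters depend on how your REXX script is written, so treat this as a sketch:


EXEC 'TEST.REXX(BIND)' '${plan} ${dataset} ${member}'

With the shiplist shown earlier, this template generates one command line for each member of TONY.MORT.DEV.DBRM, with ${plan} resolved to TONY.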

The Generate Artifact Information step stores the generated text in the text output property. In this example, the text property contains the generated TSO commands.

In this example, the Run TSO or ISPF Command step runs the generated commands by referencing the text output property of the Generate Artifact Information step.

Steps

The following process steps are available in the z/OS Utility plugin.

Allocate Data Set

Allocate a non-SMS-managed data set.

Input properties for the Allocate Data Set step
Name Type Description Required
Average Record Unit Enumeration Select the unit to use when allocating average record length. U specifies single-record units (bytes). K specifies thousand-record units (kilobytes). M specifies million-record units (megabytes). () specifies the system default value. No
Block Size String Specify the number of bytes of data to place in each block, based on the record length. Yes
Data Set Name String Data set name. If the single quotation marks are omitted, the user's data set prefix
from the TSO profile is automatically appended to the front of the data set name.
Yes
Data Set Name Type Enumeration Valid values are LIBRARY, PDS, or Default(). No
Directory Blocks String The number of directory blocks to allocate. Specify zero for a sequential data set.
Specifying LIBRARY in the data set name might override a setting of zero directory
blocks.
No
Primary Quantity String Specify the primary quantity in average record units. Yes
Record Format Enumeration Values are F,B; F; V,B; V; U; F,B,A; V,B,A; F,B,M; F,M; and V,B,M. No
Record Length String Specify the logical record length, in bytes. Yes
Secondary Quantity String Specify the secondary quantity in average record units. Yes
Space Units Enumeration Valid values are BLKS, TRKS, and CYLS. Yes
Volume Serial String Leave blank to use the system default volume. No
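As an illustration, the step properties above correspond roughly to a TSO ALLOCATE command. The following sketch uses a hypothetical data set name and space values:


ALLOCATE DATASET('USERID.SAMPLE.LOAD') NEW SPACE(10,5) TRACKS DIR(20) RECFM(U) BLKSIZE(32760)

Here the data set is allocated with a primary quantity of 10 tracks, a secondary quantity of 5 tracks, and 20 directory blocks.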

Allocate Data Set From Existing

Create a data set with the attributes of an existing model data set.

Input properties for the Allocate Data Set From Existing step
Name Type Description Required
Data Set Name String Data set name. If the single quotation marks are omitted, the user's data set prefix
from the TSO profile is automatically appended to the front of the data set name.
Yes
Like String Specify the name of an existing data set to use as a model. The attributes of this
data set are used as the attributes for the data set being allocated. If the single
quotation marks are omitted, the user's data set prefix from the TSO profile is automatically
appended to the front of the data set name.
Yes
Primary Quantity String Specify the primary quantity in space units. No
Secondary Quantity String Specify the secondary quantity in space units. No
Space Units Enumeration Valid values are BLKS, TRKS, CYLS, or default(). No
Volume Serial String Leave blank to use the system default volume. No

Allocate SMS Managed Data Set

Allocate an SMS-managed data set.

Input properties for the Allocate SMS Managed Data Set step
Name Type Description Required
Data Class String Leave blank to use the default data class. No
Data Set Name String Data set name. If the single quotation marks are omitted, the user's data set prefix
from the TSO profile is automatically appended to the front of the data set name.
Yes
Management Class String Leave blank to use the default management class. No
Storage Class String Leave blank to use the default storage class. No

Copy Artifacts

Load artifacts from a local repository.

Input properties for the Copy Artifacts step
Name Type Description Required
Directory Offset String The working directory to use when running the command. This directory is relative
to the current working directory.
Yes

Copy Data Set

Copy a data set.

Input properties for the Copy Data Set step
Name Type Description Required
Exclude Members String Specify a list of members in the source data set to skip when copying. Separate member
names with newline characters.
No
From PDS String Specify the names of the source data sets, separated by newline characters.
Use the following format: name|name,R. R specifies that all members of the source
data set replace any members with the same name in the target data set.
Yes
Include Members String Specify the members in the source data set to copy, separated by
newline characters. Use the following format: name1|name1,newname1[,R]|name1,,R.
To rename a member, specify the current name of the member, followed by the new name
and optionally the R (replace) parameter. To replace a member, specify
the name of the member and the R parameter, separated by two commas.
No
Load Module Dataset Boolean Select to use the IEBCOPY COPYMOD control statement when copying load modules. No
To PDS String Specify the name of the target partitioned data set. Yes
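For example, assuming hypothetical member names, an Include Members value such as the following copies MEMA unchanged, copies MEMB to the new name MEMB2 and replaces any existing MEMB2, and replaces MEMC in the target data set:


MEMA
MEMB,MEMB2,R
MEMC,,R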

Deploy Data Sets

Deploy data sets and HFS files.

Input properties for the Deploy Data Sets step
Name Type Description Required
Allow Creating Data Set String Specify TRUE to create a data set if the specified target data set does not exist. No
Allow Creating Directory String Specify TRUE to create the directory if the specified HFS target directory does not
exist.
No
Backup for Rollback Boolean Select to make a backup of the data sets and files that are going to be replaced.
A backup must exist to perform a rollback.
No
Check Access Boolean Select to check permission to update the data sets to deploy. No
Container Filter String The filter to limit the source data sets to be deployed. Java regular expression matching is used if the filter starts and ends with a forward slash (/). For example, specify /.*LOAD/ to match any data set container that ends with LOAD. If the filter is not a regular expression, exact matching is used. Separate each filter with a newline character. No
Data Set Mapping String Specify a list of mapping rules for the data set packages, separated by newline characters. Use the following format: Source_Data_Set,Target_Data_Set. Use an asterisk (*) in the Source_Data_Set value to match any characters. If multiple rules specify the same Source_Data_Set value, only the first one is used. Use a Source_Data_Set,DUMMY mapping to skip the backup and deployment of a particular data set. The data set shows as deployed at the environment level, but the backup and deployment are actually skipped. No
Delta Deploy Enumeration Specify the FULL deployment type to replace all artifacts with the artifacts in the current component version.
Specify the INVENTORY deployment type, a delta deployment, to reduce deployment time significantly by deploying only the changes between artifacts. The comparison is based on identity attributes, including lastModifiedTimestamp and custom properties that start with SYS.id.
Specify the RUNTIME deployment type, a delta deployment, to use checksum logic to compare the artifacts to be deployed with the same artifacts in the target environment. This check is done for every artifact. Only artifacts whose checksums do not match are considered changed and are deployed. Note:

  • Two artifacts are considered the same when at least one attribute can be used for comparison and all attributes that are used for comparison match exactly.
  • The attributes used for each delta deployment type are explained in the table below.
Yes
HFS Directory mappings String Specify the target directory mappings for deploying HFS files, separated by newline characters. Use the following format: Source_Directory,Target_Directory. If multiple rules specify the same Source_Directory value, only the first one is used. Specify only one target directory for agents before V7.0.3. For agents V7.0.3 and later, you can specify mappings with source and target directories, similar to PDS mappings. No
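For example, assuming hypothetical data set and directory names, the mapping fields might contain rules such as the following. In the Data Set Mapping field, the first rule maps a source data set to a different target name and the second rule skips a data set entirely:


DEV.APP.LOAD,PROD.APP.LOAD
DEV.APP.TEMP,DUMMY

In the HFS Directory mappings field, a rule maps a source directory to a target directory:


/u/build/scripts,/u/prod/scripts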

For each artifact in a delta deployment, the following attributes are compared to the latest inventory version of the same artifact.

Attributes compared for the delta deployment types
Name Deployment type Description
Last Modified Timestamp INVENTORY IBM UrbanCode Deploy reads the Last Modified Timestamp value when the version is packaged. All load modules that are built by RTC have Last Modified Timestamp values stored in the SSI. If the SSI has no Last Modified Timestamp value, the ZLM4DATE, ZLMTIME, and ZLMSEC statistics are read from ISPF. Note that load modules built by JCL or third-party tools might not have a Last Modified Timestamp value.
Custom properties starting with SYS.id (also called identification properties) INVENTORY These properties provide an open framework for the customer or provider to add attributes that indicate whether two artifacts are the same. Two artifacts are considered the same when all attributes that are used for comparison match exactly.
checksum RUNTIME The checksum value is determined when the version is packaged. During a RUNTIME deployment, the checksum is calculated for the artifact in the target environment and compared with the checksum calculated during version creation. The checksum can be a hash or, for load modules, binder information.

FTP Artifacts

Load artifacts from a remote repository using FTP.

Input properties for the FTP Artifacts step
Name Type Description Required
Directory Offset String The working directory to use when running the command. This directory is relative
to the current working directory.
Yes

Generate Artifact Information

Generate text information for selected version artifacts. The information is sent
to the text output property for use by later steps.
Note: From version 51, Groovy string methods are no longer interpreted in the Template input because the code was rewritten in Java.

Input properties for the Generate Artifact Information step
Name Type Description Required
Container Name Filter String Specify a filter to use on the container name. Container can be data set, directory
or generic artifact group. Java regular expression matching is used if the filter
starts and ends with a forward slash (/). For example, specify /.*LOAD/ to match any
text that ends with LOAD. If the filter is not a regular expression, exact matching
is used.
No
Custom Properties Filter String Specify a list of custom properties filters, separated by newline characters. Use
the following format: propertyName=valueFilter. A property without valueFilter selects
all artifacts that have that property. Java regular expression matching is used if
the filter starts and ends with a forward slash (/). For example, specify developer=/M.*/
to match artifacts with a developer property where the value of the property starts
with M. If valueFilter is not a regular expression, exact matching is used. For example,
developer=Martin matches artifacts where value of the developer property is Martin.
No
Deploy Type Filter String Specify a filter to use on the deploy type. Java regular expression matching is used
if the filter starts and ends with a forward slash (/). For example, specify /.*LOAD/
to match any text that ends with LOAD. If the filter is not a regular expression,
exact matching is used.
No
Fail On Empty Boolean Select to set the step to fail if no text is generated. No
For Each Enumeration Generate information for each artifact of the selected type. Valid values are Member, PDS, Sequential, DeletedMember, DeletedPDS, DeletedSequential, Directory, File, DeletedFile, GenericArtifactGroup, and GenericArtifact. Yes
Order By Enumeration Valid values are ASC, DESC, and SHIPLIST. Yes
Resource Name Filter String Specify a filter to use on resource name. Resource can be data set member, file or
generic artifact. Java regular expression matching is used if the filter starts and
ends with a forward slash (/). For example, specify /.*LOAD/ to match any text that
ends with LOAD. If the filter is not a regular expression, exact matching is used.
No
Target Data Set Name Filter String Specify a filter to use on the target data set name. Java regular expression matching
is used if the filter starts and ends with a forward slash (/). For example, specify
/.*LOAD/ to match any text that ends with LOAD. If the filter is not a regular expression,
exact matching is used.
No
Template String Specify the template to use to generate text. The text output
property contains the generated text from this step. Subsequent
steps can access this text with the ${p:stepName/text} property.
Add separators, including line breaks, in the template as needed.
Use ${propname} to access custom properties. The following built-in
properties are available: ${sourceDataset} for the source data set
name, ${dataset} for the target data set name, ${member} for the
member name, ${deployType} for the deployment type, ${artifactGroup}
for the generic artifact group name, ${artifact} for the generic
artifact name, ${directory} for the directory name, ${file} for the
file name, and ${inputsUrl} for the URL of the inputs. All property
names are case-sensitive. Do not use the built-in names for custom
properties.
Yes

Remove All Versions

Remove all versions in an environment.

Input properties for the Remove All Versions step
This step has no input properties.

Remove Redundant Versions

Remove redundant versions in an environment. Redundant versions are versions that
are completely replaced by subsequent versions.

Input properties for the Remove Redundant Versions step
Name Type Description Required
Dry Run Boolean Select to specify a dry run, which does not delete versions. Instead, the versions
to be deleted are written to the output log for verification.
No

Replace Tokens MVS

Replace tokens in MVS data set using properties.

Input properties for the Replace Tokens MVS step
Name Type Description Required
Allow Wildcard Boolean Select to use an asterisk (*) as a wildcard character in the Include Data Sets field.
The asterisk matches any characters. Using wildcard characters can result in updates
to a large number of data set members, or to unexpected updates.
Yes
End Token Delimiter String The end delimiter character used to identify tokens. No
Exclude Data Sets String Specify a list of data set patterns to exclude from processing. Separate patterns
with commas or newline characters. Use an asterisk (*) to match any characters. For
example: USERID.JCL(ABC*)
No
Explicit Tokens String Specify a list of explicit tokens to replace, separated by newline characters. Use
the following format: token->value. For example, mytoken->new_value will replace
the mytoken string with new_value in all files. This field is not affected by the
delimiter or prefix fields. To replace @token@ with new_value, specify @token@->new_value.
If you specify a value in the Property List field, the explicit tokens are added as
additional values to replace and override any properties that have the same name.
Regular expressions are not supported.
No
Fail On Truncate Boolean Select to set the step to fail if the line exceeds the record length after replacement.
If cleared, the line is truncated to fit the record length.
No
Include Data Sets String Specify a list of patterns that describe data sets to process. Separate patterns with
commas or newline characters. For example, specify USERID.JCL(ABC) for a partitioned
data set, or USERID.DATA for a sequential data set.
Yes
Property List String Specify a value here to use existing property names as tokens to replace in the target
files. For example, specify ${p:environment/allProperties} to use the names of all component
environment properties as tokens and the property values as the replacements. Similarly,
specify ${p:component/allProperties},${p:environment/allProperties} to use all component
and component environment properties for token replacement. The delimiter and prefix
settings above apply. For example, if the start and end token delimiters are the at sign
(@) and a property is called token1, then the step searches for @token1@ to replace.
No
Property Prefix String Specify a prefix to use to determine which properties are included in token replacement.
Leave blank to use all properties.
No
Start Token Delimiter String The start delimiter character used to identify tokens. No
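For example, assuming hypothetical token names, an Explicit Tokens value such as the following replaces the literal strings @HLQ@ and @VOLSER@ in the included data sets. Because the Explicit Tokens field ignores the delimiter and prefix settings, the delimiters are written as part of each token:


@HLQ@->PROD
@VOLSER@->PRD001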

Rollback Data Sets

Roll back data sets and HFS files to a backup that was created in the previous deployment.

Input properties for the Rollback Data Sets step
Name Type Description Required
Check Access Boolean Select to check permission to update the data sets to deploy. No
Delete Backup Data Boolean Select to remove the backup data that was created during deployment for this version. No
HFS Target Directory String Specify a target directory to deploy HFS files. No
Prevent Risky Rollback Boolean Select to prevent a risky rollback. A risky rollback tries to roll back modules that
have been replaced by a subsequent version.
No
Run to Check Risk Only Boolean Select to do a dry run that only checks for risky rollbacks. No actual rollback is
done during a dry run. The step fails when risk is detected; otherwise, the step passes.
No

Run MVS Command

Run MVS system commands.

Input properties for the Run MVS Command step
Name Type Description Required
Fail Message String Specify messages that indicate command failure. The step fails if any of these messages
are in the system responses. Separate multiple messages with newline characters.
No
MVS Commands String Specify a list, separated by newline characters, of MVS system commands to run. Yes
Stop On Fail Boolean Select to stop running commands after a command fails. No
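For example, the MVS Commands field might contain standard operator commands such as the following; the started task name is hypothetical:


D A,L
S MYSTC

The first command displays active jobs and started tasks, and the second starts the hypothetical MYSTC started task.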

Run TSO or ISPF Command

Run TSO and ISPF commands using the ISPF gateway.

Input properties for the Run TSO or ISPF Command step
Name Type Description Required
Command To Run From ISPF String Specify the TSO and ISPF commands to run. Separate multiple commands with newline
characters. Interactive TSO commands are not supported.
Yes
ISPF TSO Profile String Specify an existing ISPF profile to use in the call. No
Run In A Reusable ISPF Session Boolean Select to run commands in a reusable ISPF session that stays active between calls. No
Show Operation Log Boolean No
Stop On Fail Boolean Select to stop running commands after a command fails with a return code > 0. No
TSO Or ISPF Enumeration Valid values are TSO and ISPF. Only ISPF supports return codes. Yes

Submit Job

Submit a job.

Input properties for the Submit Job step
Name Type Description Required
Default Job Statement String Default job statement to use if no job statement is found in the JCL. The job statement
is not validated. Ensure that the job statement contains valid values for your system.
Token replacement rules are not applied to the default job statement.
No
JCL String Enter the JCL to submit. No
JCL Dataset String Submits JCL from a partitioned data set (PDS) member. The input can be a PDS member
name, such as A.B.C(MEM); a PDS member pattern, such as A.B.C(D*X); or a PDS name, such
as A.B.C. When the input is a member pattern, all matching members are submitted. When
the input is a PDS name, all members are submitted. Multiple JCL inputs are submitted
in sequence using the same settings. Multiple JCL inputs cannot be used together with
the Replace Token sets for Each Job field.
No
JCL File String Submits JCL in a file in the UNIX file system. For example: /u/userid/jobname.jcl No
Max Lines String Specify the maximum number of lines to display in the log. No
Max Return Code String Specify the maximum return code for the step. The step fails if the JCL return code
is greater than the specified value.
Yes
Replace Token sets for Each Job String One job is submitted for each set of token replacement rules. Each set must be separated
by a line containing only two forward slash (//) characters. Within a set, each rule
must be on a separate line.
No
Replace Tokens String Specify replacement rules to apply to the JCL before submission. Rules are represented
by a list of explicit tokens to replace in the following format: token->value. Separate
rules with newline characters. For example, mytoken->new_value will replace the mytoken
string with new_value in all files. To replace @token@ with new_value, specify @token@->new_value.
Regular expressions are not supported.
No
Show Output String Specify the output data set to be displayed in the log. Separate multiple data sets
with commas. Specify ALL for all data sets.
No
Stop On Fail Boolean Select to stop submitting jobs after a job fails. Failure is determined by the Max
Return Code and Timeout fields. A JCL error is always considered a failure.
No
Timeout String Specify the timeout in seconds. No
Wait For Job Boolean Select to wait for the job to complete. If cleared, the Timeout, Show Output, Max
Lines, and Max Return Code fields are not used.
No
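For example, assuming hypothetical tokens, a Replace Token sets for Each Job value such as the following submits the JCL twice, once per set, with the sets separated by a line containing only two forward slashes:


@DSN@->PROD.APPA.LOAD
@REGION@->CICSA
//
@DSN@->PROD.APPB.LOAD
@REGION@->CICSB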

Wait For Job

Wait for a submitted job to complete.

Input properties for the Wait For Job step
Name Type Description Required
Job ID String Specify the job ID. For example: JOB06663. Use the ${p:submitStepName/jobId} property
to refer to the job ID from an earlier Submit Job step.
No
Max Lines String Specify the maximum number of lines to display in the log. No
Max Return Code String Specify the maximum return code for the step. The step fails if the JCL return code
is greater than the specified value.
Yes
Show Output String Specify the output data sets to display in the log. Separate multiple data sets with
commas. Specify ALL to display all data sets.
No
Timeout String Specify the timeout in seconds. No

Troubleshooting

Copy Artifacts step limitation

When you use the Copy Artifacts step, you can copy only in the same logical partition (LPAR). To transfer artifacts between different LPARs, use the FTP Artifacts step.

Missing return code for Run TSO or ISPF Command step

If you use the Run TSO or ISPF Command step to run a TSO command, the return code might not be displayed in IBM UrbanCode Deploy because the ISPF gateway does not support passing return codes when in TSO mode. To work around this behavior, in the TSO Or ISPF list, select ISPF instead of TSO.

Repository field for Copy Artifacts and FTP Artifacts steps

The local repository referred to in the Copy Artifacts and FTP Artifacts steps is not the Codestation repository, but rather the z/OS deployment tools artifact repository. You specify this directory when you install the z/OS deployment tools. By default, the artifact repository is the following directory: agent_installation_directory/var/repository.