HCL Launch customers can use their product data to create Elasticsearch reports and Kibana charts. If you already have an ELK environment, you can use it. Directions for setting up an ELK-stack environment are provided later in this post.

Typically, the ELK stack uses Logstash to collect the logs that Elasticsearch needs to build its data model. Instead of using logs, HCL Launch uses data stored in the database that you set up when you installed HCL Launch.
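Under the hood, this kind of database-driven pipeline is typically built on Logstash's jdbc input plugin, which polls the database on a schedule and ships rows to Elasticsearch. The sketch below is illustrative only; the table name, driver, hosts, and index are assumptions, not the shipped HCL Launch configuration:

```
# Illustrative Logstash pipeline: read rows from a relational database
# (instead of log files) and index them into Elasticsearch.
# All names below are placeholders, not the shipped configuration.
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/config/db2jcc4.jar"   # example driver jar
    jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"                  # example driver class
    jdbc_connection_string => "jdbc:db2://launch-db.example.com:50000/launchdb"
    jdbc_user => "launch_db_user"
    jdbc_password => "${JDBC_PASSWORD}"        # resolved from the Logstash keystore
    schedule => "*/5 * * * *"                  # poll every five minutes
    statement => "SELECT * FROM api_requests"  # hypothetical table name
  }
}
output {
  elasticsearch {
    hosts => ["https://es01:9200"]
    user => "elastic"
    password => "${ELASTICSEARCH_PASSWORD}"    # resolved from the Logstash keystore
    index => "launch-api-requests"             # hypothetical index name
  }
}
```

The `${...}` references let the pipeline pull the passwords from the Logstash keystore that the installation steps below create, so no secret is stored in plain text in the configuration file.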

The ELK reports work for all versions of HCL Launch. Access to the reports is set in Kibana; no HCL Launch access is required.

Available reports

API requests

This report contains data about API requests made to the HCL Launch server. The available filters include user, creation date, requesting URL, and average duration.

HCL Launch application processes

This report contains application process requests. Some of the available filters include:

  • count by date
  • requests by application
  • result
  • environment
  • average duration
  • average warning count
  • requests by user

Codestation

This report contains data about Codestation usage. Some of the available filters include:

  • Codestation total size
  • source config plugins
  • usage by component
  • cleanup counts
  • Codestation copies
  • auto imports
  • component lists

Setting up your ELK environment

Prerequisites

  • ELK setup with Elasticsearch host locations, for example https://es01:9200
  • Elasticsearch user ID and password
  • Logstash configuration directory (default: /usr/share/logstash/config/)
  • Logstash binary directory (default: /usr/share/logstash/bin/)
  • HCL Launch database user ID and password
  • JDBC driver jar
  • JDBC URL of the HCL Launch database
  • Kibana URL and user
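Before installing, it can help to confirm that the Logstash directories and the JDBC driver jar are where you expect them. A minimal pre-flight sketch, assuming the default directories above (the driver jar name is a placeholder):

```shell
# Pre-flight check: verify the Logstash directories and JDBC driver jar exist.
# Override any of these variables in the environment before running.
LOGSTASH_CONFIG_DIR="${LOGSTASH_CONFIG_DIR:-/usr/share/logstash/config}"
LOGSTASH_BIN_DIR="${LOGSTASH_BIN_DIR:-/usr/share/logstash/bin}"
JDBC_DRIVER_JAR="${JDBC_DRIVER_JAR:-$LOGSTASH_CONFIG_DIR/jdbc-driver.jar}"  # placeholder name

for path in "$LOGSTASH_CONFIG_DIR" "$LOGSTASH_BIN_DIR" "$JDBC_DRIVER_JAR"; do
  if [ -e "$path" ]; then
    echo "found:   $path"
  else
    echo "missing: $path"
  fi
done
```

Any line reported as `missing` must be fixed before continuing with the installation steps below.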

Installation

1. Move the JDBC driver jar to the Logstash configuration directory.

2. Create an empty logstash.yml file in the Logstash configuration directory if one does not exist.

3. Create a Logstash keystore to contain user secrets if one does not already exist:

<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> create

4. Set the JDBC and Elasticsearch passwords in the Logstash keystore if they are not already set:

<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> add JDBC_PASSWORD
<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> add ELASTICSEARCH_PASSWORD

5. Add the lines from the provided pipelines.yml file to the pipelines.yml file in the Logstash configuration directory, replacing /usr/share/logstash/config with your Logstash configuration directory.
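For reference, entries in pipelines.yml follow this shape (the pipeline id and file name below are illustrative, not the shipped values; point path.config at the *.conf files in your Logstash configuration directory):

```yaml
# Illustrative pipelines.yml entry; id and path are placeholders.
- pipeline.id: hcl-launch-api-requests
  path.config: "/usr/share/logstash/config/api_requests.conf"
```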

6. Copy the *.conf and *_temp.json files into the Logstash configuration directory.

7. Move the set_env and start_logstash scripts to the directory where Logstash runs.

8. Configure the set_env script with the correct values for the environment variables. Replace any placeholders wrapped in brackets <> with real values.
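A set_env script of the kind step 8 describes might look like the following sketch (sh syntax). Every variable name and value here is an assumption; match the names to what the shipped start_logstash script expects, and replace the example values with your real ones:

```shell
#!/bin/sh
# Sketch of a set_env script. All names and values are examples; replace
# them with the values gathered in the Prerequisites section.
export LOGSTASH_CONFIG_DIR="/usr/share/logstash/config"
export LOGSTASH_BIN_DIR="/usr/share/logstash/bin"
export JDBC_DRIVER_JAR="$LOGSTASH_CONFIG_DIR/db2jcc4.jar"            # example driver jar name
export JDBC_URL="jdbc:db2://launch-db.example.com:50000/launchdb"    # example JDBC URL
export JDBC_USER="launch_db_user"
export ELASTICSEARCH_HOSTS="https://es01:9200"
export ELASTICSEARCH_USER="elastic"
# Passwords are NOT set here: JDBC_PASSWORD and ELASTICSEARCH_PASSWORD
# live in the Logstash keystore created in steps 3 and 4.
```

Keeping the passwords out of set_env and in the keystore means the script can be checked into configuration management without exposing secrets.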

9. From a shell, run the start_logstash command.

10. Log on to Kibana and navigate to Stack Management -> Saved Objects.

11. Import the export.ndjson file.

The basic process for installing a plugin is to download the installation file to your computer and then upload it to the HCL Launch server. After a plugin is loaded on the server, it is available for use. You do not need to restart the server after you install a plugin.

Downloading the plugin

To download the latest version of a plugin, click the Download button at the top of the plugin page. If you want to download a specific version, click Version History. A list of available versions displays and you can click Download for the version you want to install.

To complete the download, you must accept the terms and conditions. When you accept the terms and conditions, a dialog opens for you to open or save the file. Plugin installation files are delivered as a compressed file. Select Save File and the compressed file is saved to your Download folder.

After the plugin is downloaded, you are ready to load the plugin on to the HCL Launch server. To complete this task, log into the HCL Launch user interface.

  1. Click Settings.
    • For automation plugins, click Automation Plugins.
    • For source plugins, click Source Config Plugins.
  2. Click Load Plugin, enter the path to the compressed plugin file, and then click Submit.

    You should see the plugin in the appropriate list: Automation Plugins or Source Config Plugins. The plugin is available for use.

Uninstalling a plugin

Before uninstalling a plugin, verify that it is no longer in use. Deleting a plugin that is being used by existing processes invalidates those processes.

To uninstall a plugin, click Settings > Automation Plugins, find the plugin, and then click Delete.

Rolling back plugins

You cannot roll back a plugin version to a previous version. If you mistakenly delete a version of a plugin that is being used in processes, the affected steps display PLUGIN DELETED. The server retains the pertinent steps to avoid breaking persistent processes and snapshots, but you should update the configuration to match your intention. If you intend to use an earlier version of the plugin, perform the following steps:

  1. Delete the later version of the plugin.
  2. In all processes that contain steps from that plugin, those steps are shown with the text PLUGIN DELETED. Do not leave steps in this state.

    Note: The server retains the later version steps to avoid breaking snapshots. However, you cannot add these steps to processes after you delete the plugin. All deleted steps should be updated immediately.

  3. Install the earlier version of the plugin, if not already installed.

    Note: If an earlier plugin version was installed and previously upgraded, this step is not necessary, as the previous version is now usable again.

  4. In each process that used steps from the later plugin version, delete the steps that are labeled PLUGIN DELETED and re-create them with the earlier plugin version. This rolls back the steps from the later to the earlier plugin version. The process steps cannot be changed until you re-create them, even if you install another version of the plugin.

Note: Re-installing the later version of the plugin restores the processes.

Overview

The AccuRev plug-in integrates IBM UrbanCode Build with AccuRev and automates populating an AccuRev workspace as part of a build process. The plug-in provides integration properties that define the connection between the UrbanCode Build server and the AccuRev repository. For details, see Repositories.
The repository is configured based on a workflow.

Installation

No special steps are required for installation. See Installing plug-ins in UrbanCode products.

Compatibility

The plug-in runs on any agents that the IBM UrbanCode Build server supports.
This plug-in runs on all supported IBM UrbanCode Build platforms.

History

Version 6.752929 released on March 9, 2016
Version 6.752929 includes the following features and fixes:
– Added RPX dependency.
– Translation for step information.
– Added support for using the transaction ID in the Changelog step.

Version 5.669681
Version 5.669681 includes the following features and fixes:
– Updated to append the depot name to all change IDs. This prevents an issue where multiple depots can have the same change ID, causing issues to be picked up for depots that do not contain them.

Version 4.604289
Version 4.604289 includes the following features and fixes:
– Fixed a communication issue where the UrbanCode Build server would fail if it was running with an IBM JDK or JRE.