HCL Launch customers can use their product data to create Elasticsearch reports and Kibana charts. If you already have an ELK environment, you can use it. If not, directions for setting up an ELK-stack environment are provided later in this post.

Typically, the ELK stack uses Logstash to collect the logs that Elasticsearch requires to build its data model. Instead of using logs, HCL Launch uses data stored in the database that you set up when you installed HCL Launch.
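
For context, a database-driven Logstash pipeline replaces the usual file or Beats input with the jdbc input plugin. The *.conf files installed later in this post define the real queries and indexes; the sketch below only illustrates the shape, with placeholder connection values in angle brackets:

input {
  jdbc {
    jdbc_driver_library => "<LogstashConfigurationDir>/<jdbc-driver>.jar"
    jdbc_driver_class => "<driver class for your database>"
    jdbc_connection_string => "<JDBC URL of the HCL Launch database>"
    jdbc_user => "<HCL Launch database user ID>"
    jdbc_password => "${JDBC_PASSWORD}"   # resolved from the Logstash keystore
    schedule => "*/5 * * * *"             # example polling interval
    statement => "<SQL query against the HCL Launch tables>"
  }
}
output {
  elasticsearch {
    hosts => ["https://es01:9200"]
    user => "<Elasticsearch user ID>"
    password => "${ELASTICSEARCH_PASSWORD}"
    index => "<target index>"
  }
}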

The ELK reports work for all versions of HCL Launch. Access to the reports is set in Kibana; no HCL Launch access is required.

Available reports

API requests

This report contains data about API requests made to the HCL Launch server. The available filters include user, creation date, requesting URL, and average duration.

HCL Launch application processes

This report contains application process requests. Some of the available filters include:

  • count by date
  • requests by application
  • result
  • environment
  • average duration
  • average warning count
  • requests by user

Codestation

This report contains data about Codestation usage. Some of the available filters include:

  • Codestation total size
  • source config plugins
  • usage by component
  • cleanup counts
  • Codestation copies
  • auto imports
  • component lists

Setting up your ELK environment

Prerequisites

  • ELK setup with the Elasticsearch host locations, for example, https://es01:9200
  • Elasticsearch user ID and password
  • Logstash configuration directory (default: /usr/share/logstash/config/)
  • Logstash binary directory (default: /usr/share/logstash/bin/)
  • HCL Launch database user ID and password
  • JDBC driver jar
  • JDBC URL of the HCL Launch database
  • Kibana URL and user
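
The form of the JDBC URL depends on the database behind your HCL Launch installation. The host names, ports, and database names below are examples only:

jdbc:postgresql://dbhost:5432/launchdb
jdbc:oracle:thin:@//dbhost:1521/launchdb
jdbc:sqlserver://dbhost:1433;databaseName=launchdb
jdbc:mysql://dbhost:3306/launchdb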

Installation

1. Move the JDBC driver jar to the Logstash configuration directory.
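
For example, assuming the default configuration directory and substituting your driver jar name:

cp <jdbc-driver>.jar /usr/share/logstash/config/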

2. Create an empty logstash.yml file in the Logstash configuration directory if one does not exist.
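
For example, on Linux with the default configuration directory:

touch /usr/share/logstash/config/logstash.yml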

3. Create Logstash keystore to contain user secrets if it does not already exist:

<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> create

4. Set the JDBC and Elasticsearch passwords in the Logstash keystore if they are not already set:

<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> add JDBC_PASSWORD
<LogstashBinaryDir>/logstash-keystore --path.settings <LogstashConfigurationDir> add ELASTICSEARCH_PASSWORD
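
Logstash substitutes keystore entries wherever a configuration setting references them with the ${KEY} syntax, so the pipeline configurations can use the passwords without storing them in plain text. For example:

jdbc_password => "${JDBC_PASSWORD}"
password => "${ELASTICSEARCH_PASSWORD}"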

5. Add the lines from the provided pipelines.yml file to the pipelines.yml file in the Logstash configuration directory. Replace /usr/share/logstash/config with your Logstash configuration directory.
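
Each pipelines.yml entry points Logstash at one of the *.conf files. An entry has the following shape; the pipeline ID and file name here are illustrative:

- pipeline.id: hcl-launch-api-requests
  path.config: "<LogstashConfigurationDir>/api_requests.conf"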

6. Copy the *.conf and *_temp.json files into the Logstash configuration directory.

7. Move the set_env and start_logstash scripts to the directory where Logstash runs.

8. Configure the set_env script with the correct values for the environment variables, replacing any placeholders wrapped in angle brackets (<>) with real values.
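
A filled-in set_env might look roughly like the following. The variable names and values here are illustrative; use the names defined in the shipped script:

#!/bin/sh
# Illustrative values only - use the variable names defined in the shipped set_env script.
export JDBC_URL="jdbc:postgresql://dbhost:5432/launchdb"
export JDBC_USER="launch_user"
export ELASTICSEARCH_HOSTS="https://es01:9200"
export ELASTICSEARCH_USER="elastic"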

9. From a shell, run the start_logstash command.
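
In essence, a start script of this kind loads the environment and then launches Logstash against the configuration directory. A hypothetical sketch, not the shipped script:

#!/bin/sh
# Hypothetical sketch - the shipped start_logstash script is authoritative.
. ./set_env
<LogstashBinaryDir>/logstash --path.settings <LogstashConfigurationDir>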

10. Log on to Kibana and navigate to Stack Management -> Saved Objects.

11. Import export.ndjson.