

1. Upgrade From 8.1.0 to 8.1.0.1

  • RDAF Platform: from 8.1.0 to 8.1.0.1

  • OIA (AIOps) Application: from 8.1.0 to 8.1.0.1

  • RDAF Client rdac CLI: from 8.1.0 to 8.1.0.1

1.1. Prerequisites

Before proceeding with this upgrade, please verify that the below prerequisites are met.

  • RDAF Deployment CLI version: 1.4.1

  • Infra Services tag: 1.0.4

  • Platform Services and RDA Worker tag: 8.1.0

  • OIA Application Services tag: 8.1.0

Note

  • Check the disk space on all of the Platform and Service VMs using the below mentioned command; each filesystem's usage (the Use% column in the output) should be less than 80%
    df -kh
    
rdauser@oia-125-216:~/collab-3.7-upgrade$ df -kh
Filesystem                         Size  Used Avail Use% Mounted on
udev                                32G     0   32G   0% /dev
tmpfs                              6.3G  357M  6.0G   6% /run
/dev/mapper/ubuntu--vg-ubuntu--lv   48G   12G   34G  26% /
tmpfs                               32G     0   32G   0% /dev/shm
tmpfs                              5.0M     0  5.0M   0% /run/lock
tmpfs                               32G     0   32G   0% /sys/fs/cgroup
/dev/loop0                          64M   64M     0 100% /snap/core20/2318
/dev/loop2                          92M   92M     0 100% /snap/lxd/24061
/dev/sda2                          1.5G  309M  1.1G  23% /boot
/dev/sdf                            50G  3.8G   47G   8% /var/mysql
/dev/loop3                          39M   39M     0 100% /snap/snapd/21759
/dev/sdg                            50G  541M   50G   2% /minio-data
/dev/loop4                          92M   92M     0 100% /snap/lxd/29619
/dev/loop5                          39M   39M     0 100% /snap/snapd/21465
/dev/sde                            15G  140M   15G   1% /zookeeper
/dev/sdd                            30G  884M   30G   3% /kafka-logs
/dev/sdc                            50G  3.3G   47G   7% /opt
/dev/sdb                            50G   29G   22G  57% /var/lib/docker
/dev/sdi                            25G  294M   25G   2% /graphdb
/dev/sdh                            50G   34G   17G  68% /opensearch
/dev/loop6                          64M   64M     0 100% /snap/core20/2379
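The threshold check above can be scripted instead of reading the table by eye. The snippet below is a small sketch: it skips the read-only snap loop devices (which always report 100%) and prints only filesystems above the 80% threshold.

```shell
# Print any filesystem above 80% usage, skipping loop devices.
# No output means all disks are within the threshold.
df -kh | awk 'NR > 1 && $1 !~ /loop/ { gsub(/%/, "", $5); if ($5 + 0 > 80) print $6 " is at " $5 "%" }'
```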

Warning

Make sure all of the above prerequisites are met before proceeding with the upgrade process.

Warning

Non-Kubernetes: Upgrading RDAF Platform and AIOps application services is a disruptive operation. Schedule a maintenance window before upgrading RDAF Platform and AIOps services to the newer version.

Important

Please make sure a full backup of the RDAF platform system is completed before performing the upgrade.

Non-Kubernetes: Please run the below backup command to take the backup of application data.

rdaf backup --dest-dir <backup-dir>
Note: Please make sure this backup directory is mounted across all infra and CLI VMs.
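Before running the backup, you can confirm on each VM that the shared destination is actually a mount point. The sketch below uses a placeholder path; substitute the backup directory you pass to --dest-dir.

```shell
# Example path only; replace with the <backup-dir> that is shared
# across all infra and CLI VMs.
BACKUP_DIR="/mnt/rdaf-backup"
if mountpoint -q "$BACKUP_DIR"; then
  echo "mounted: $BACKUP_DIR"
else
  echo "NOT mounted: $BACKUP_DIR" >&2
fi
```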

  • Verify that the RDAF deployment CLI (rdaf) version is 1.4.1 on the VM where the CLI was installed to manage the Docker on-premise registry and the Non-Kubernetes deployment.
rdaf --version
RDAF CLI version: 1.4.1
  • Verify that the on-premise Docker registry service version is 1.0.3
docker ps | grep docker-registry
ff6b1de8515f   cfxregistry.CloudFabrix.io:443/docker-registry:1.0.3   "/entrypoint.sh /bin…"   7 days ago   Up 7 days             deployment-scripts-docker-registry-1
  • RDAF Platform services version is 8.1.0

Run the below command to get RDAF Platform services details

rdaf platform status
  • RDAF OIA Application services version is 8.1.0

Run the below command to get RDAF App services details

rdaf app status

Important

Please click on this Link to update the registry credentials.

1.2. Upgrade Steps

1.2.1 Download the new Docker Images

Log in to the VM where the rdaf deployment CLI was installed to manage the Docker on-premise registry and the Non-Kubernetes deployment.

Download the new docker image tags for RDAF Platform and OIA (AIOps) Application services and wait until all of the images are downloaded.

To fetch the new image tags into the on-premise registry, use the below command

rdaf registry fetch --tag 8.1.0.1

Note

If the download of the images fails, please re-execute the above command

Run the below command to verify the above-mentioned tags are downloaded for all of the RDAF Platform and OIA (AIOps) Application services.

rdaf registry list-tags 

Please make sure 8.1.0.1 image tag is downloaded for the below RDAF Platform services.

  • rda-client-api-server
  • rda-registry
  • rda-scheduler
  • rda-collector
  • rda-identity
  • rda-fsm
  • rda-asm
  • rda-access-manager
  • rda-resource-manager
  • rda-user-preferences
  • onprem-portal
  • onprem-portal-nginx
  • rda-worker-all
  • onprem-portal-dbinit
  • cfxdx-nb-nginx-all
  • rda-event-gateway
  • rda-chat-helper
  • rdac
  • bulk_stats

Please make sure 8.1.0.1 image tag is downloaded for the below RDAF OIA (AIOps) Application services.

  • cfx-rda-app-controller
  • cfx-rda-alert-processor
  • cfx-rda-file-browser
  • cfx-rda-smtp-server
  • cfx-rda-ingestion-tracker
  • cfx-rda-reports-registry
  • cfx-rda-ml-config
  • cfx-rda-event-consumer
  • cfx-rda-webhook-server
  • cfx-rda-irm-service
  • cfx-rda-alert-ingester
  • cfx-rda-collaboration
  • cfx-rda-notification-service
  • cfx-rda-configuration-service
  • cfx-rda-alert-processor-companion

Downloaded Docker images are stored under the below path.

/opt/rdaf-registry/data/docker/registry/v2/ or /opt/rdaf/data/docker/registry/v2/

Run the below command to check the filesystem's disk usage on offline registry VM where docker images are pulled.

df -h /opt

If necessary, older image tags that are no longer in use can be deleted to free up disk space using the command below.

Note

Run the command below if /opt occupies more than 80% of the disk space or if the free capacity of /opt is less than 25GB.

rdaf registry delete-images --tag <tag1,tag2>
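The two thresholds in the note can be checked in one step. This is a sketch using GNU df's --output fields; it only reports whether cleanup is recommended, it does not delete anything.

```shell
# Cleanup is recommended when /opt is above 80% used or has less
# than 25 GB free (the thresholds from the note above).
usage=$(df --output=pcent /opt | tail -1 | tr -dc '0-9')
free_gb=$(df -BG --output=avail /opt | tail -1 | tr -dc '0-9')
if [ "$usage" -gt 80 ] || [ "$free_gb" -lt 25 ]; then
  echo "cleanup recommended: /opt is ${usage}% used with ${free_gb}G free"
else
  echo "no cleanup needed: /opt is ${usage}% used with ${free_gb}G free"
fi
```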

1.2.2 Upgrade RDAF Platform Services

Warning

For Non-Kubernetes deployment, upgrading RDAF Platform and AIOps application services is a disruptive operation when rolling-upgrade option is not used. Please schedule a maintenance window before upgrading RDAF Platform and AIOps services to newer version.

Run the below command to initiate upgrading RDAF Platform services with zero downtime

rdaf platform upgrade --tag 8.1.0.1 --rolling-upgrade --timeout 10

Note

The timeout value (10) in the above command is specified in seconds

Note

The rolling-upgrade option upgrades the Platform services running in high-availability mode on one VM at a time in sequence. It completes the upgrade of Platform services running on VM-1 before upgrading them on VM-2, followed by VM-3, and so on.

During this upgrade sequence, RDAF platform continues to function without any impact to the application traffic.

After completing the Platform services upgrade on all VMs, it will ask for user confirmation to delete the older-version Platform service PODs. The user has to answer yes to delete the old Docker containers (in non-Kubernetes deployments).

192.168.133.95:5000/onprem-portal-nginx:8.1.0
2024-08-12 02:21:58,875 [rdaf.component.platform] INFO     - Gathering platform container details.
2024-08-12 02:22:01,326 [rdaf.component.platform] INFO     - Gathering rdac pod details.
+----------+----------------------+---------+---------+--------------+-------------+------------+
| Pod ID   | Pod Type             | Version | Age     | Hostname     | Maintenance | Pod Status |
+----------+----------------------+---------+---------+--------------+-------------+------------+
| 3a5ff878 | api-server           | 8.1.0   | 2:34:09 | 5119921f9c1c | None        | True       |
| 689c2574 | registry             | 8.1.0   | 3:23:10 | d21676c0465b | None        | True       |
| 0d03f649 | scheduler            | 8.1.0   | 2:34:46 | dd699a1d15af | None        | True       |
| 0496910a | collector            | 8.1.0   | 3:22:40 | 1c367e3bf00a | None        | True       |
| c4a88eb7 | asset-dependency     | 8.1.0   | 3:22:25 | cdb3f4c76deb | None        | True       |
| 9562960a | authenticator        | 8.1.0   | 3:22:09 | 8bda6c86a264 | None        | True       |
| ae8b58e5 | asm                  | 8.1.0   | 3:21:54 | 8f0f7f773907 | None        | True       |
| 1cea350e | fsm                  | 8.1.0   | 3:21:37 | 1ea1f5794abb | None        | True       |
| 32fa2f93 | chat-helper          | 8.1.0   | 3:21:23 | 811cbcfba7a2 | None        | True       |
| 0e6f375c | cfxdimensions-app-   | 8.1.0   | 3:21:07 | 307c140f99c2 | None        | True       |
|          | access-manager       |         |         |              |             |            |
| 4130b2d4 | cfxdimensions-app-   | 8.1.0   | 2:24:23 | 2d73c36426fe | None        | True       |
|          | resource-manager     |         |         |              |             |            |
| 29caf947 | user-preferences     | 8.1.0   | 3:20:36 | 3e2b5b7e6cb4 | None        | True       |
+----------+----------------------+---------+---------+--------------+-------------+------------+
Continue moving above pods to maintenance mode? [yes/no]: yes
2024-08-12 02:23:04,389 [rdaf.component.platform] INFO     - Initiating Maintenance Mode...
2024-08-12 02:23:10,048 [rdaf.component.platform] INFO     - Following container are in maintenance mode
+----------+----------------------+---------+---------+--------------+-------------+------------+
| Pod ID   | Pod Type             | Version | Age     | Hostname     | Maintenance | Pod Status |
+----------+----------------------+---------+---------+--------------+-------------+------------+
| 3a5ff878 | api-server           | 8.1.0   | 2:34:49 | 5119921f9c1c | maintenance | False      |
| ae8b58e5 | asm                  | 8.1.0   | 3:22:34 | 8f0f7f773907 | maintenance | False      |
| c4a88eb7 | asset-dependency     | 8.1.0   | 3:23:05 | cdb3f4c76deb | maintenance | False      |
| 9562960a | authenticator        | 8.1.0   | 3:22:49 | 8bda6c86a264 | maintenance | False      |
| 0e6f375c | cfxdimensions-app-   | 8.1.0   | 3:21:47 | 307c140f99c2 | maintenance | False      |
|          | access-manager       |         |         |              |             |            |
| 4130b2d4 | cfxdimensions-app-   | 8.1.0   | 2:25:03 | 2d73c36426fe | maintenance | False      |
|          | resource-manager     |         |         |              |             |            |
| 32fa2f93 | chat-helper          | 8.1.0   | 3:22:03 | 811cbcfba7a2 | maintenance | False      |
| 0496910a | collector            | 8.1.0   | 3:23:20 | 1c367e3bf00a | maintenance | False      |
| 1cea350e | fsm                  | 8.1.0   | 3:22:17 | 1ea1f5794abb | maintenance | False      |
| 689c2574 | registry             | 8.1.0   | 3:23:50 | d21676c0465b | maintenance | False      |
| 0d03f649 | scheduler            | 8.1.0   | 2:35:26 | dd699a1d15af | maintenance | False      |
| 29caf947 | user-preferences     | 8.1.0   | 3:21:16 | 3e2b5b7e6cb4 | maintenance | False      |
+----------+----------------------+---------+---------+--------------+-------------+------------+
2024-08-12 02:23:10,052 [rdaf.component.platform] INFO     - Waiting for timeout of 5 seconds...
2024-08-12 02:23:15,060 [rdaf.component.platform] INFO     - Upgrading service: rda_api_server on host 192.168.133.92

Run the below command to initiate upgrading RDAF Platform services without zero downtime (a non-rolling, disruptive upgrade)

rdaf platform upgrade --tag 8.1.0.1

Please wait until all of the new Platform services are in the Up state, then run the below command to verify their status and make sure all of them are running with the 8.1.0.1 version.

rdaf platform status
+--------------------------+----------------+-------------------------------+--------------+---------+
| Name                     | Host           | Status                        | Container Id | Tag     |
+--------------------------+----------------+-------------------------------+--------------+---------+
| rda_api_server           | 192.168.108.51 | Up 4 hours                    | dc2dd806e6a6 | 8.1.0.1 |
| rda_api_server           | 192.168.108.52 | Up 4 hours                    | a76257df0330 | 8.1.0.1 |
| rda_registry             | 192.168.108.51 | Up 4 hours                    | f23455c6b85b | 8.1.0.1 |
| rda_registry             | 192.168.108.52 | Up 4 hours                    | 3b8deb15ad1f | 8.1.0.1 |
| rda_scheduler            | 192.168.108.51 | Up 4 hours                    | 1864f7e88bfb | 8.1.0.1 |
| rda_scheduler            | 192.168.108.52 | Up 4 hours                    | 62089081e902 | 8.1.0.1 |
| rda_collector            | 192.168.108.51 | Up 4 hours                    | 50c81f436fd9 | 8.1.0.1 |
| rda_collector            | 192.168.108.52 | Up 4 hours                    | 754db49f2804 | 8.1.0.1 |
| rda_identity             | 192.168.108.51 | Up 4 hours                    | 37625fde83e8 | 8.1.0.1 |
| rda_identity             | 192.168.108.52 | Up 4 hours                    | bb60423a47fa | 8.1.0.1 |
| rda_asm                  | 192.168.108.51 | Up 4 hours                    | 5ae15e7d661e | 8.1.0.1 |
| rda_asm                  | 192.168.108.52 | Up 4 hours                    | 80181bb0f80e | 8.1.0.1 |
| rda_fsm                  | 192.168.108.51 | Up 4 hours                    | bfaf7206eacb | 8.1.0.1 |
| rda_fsm                  | 192.168.108.52 | Up 4 hours                    | 8c470b9d7b08 | 8.1.0.1 |
+--------------------------+----------------+-------------------------------+--------------+---------+
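Rather than re-running the status command by hand, you can poll until no row still carries the old tag. The sketch below is keyed to the table format shown above; the pattern matches only the old 8.1.0 tag, not the new 8.1.0.1 rows.

```shell
# Loop until no service row in the status table still shows the
# old 8.1.0 tag.
while rdaf platform status | grep -Eq '\| 8\.1\.0 +\|'; do
  echo "waiting for platform services to finish upgrading..."
  sleep 10
done
echo "all platform services report tag 8.1.0.1"
```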

Run the below command to check that the rda-scheduler service is elected as a leader (shown under the Site column).

rdac pods

Run the below command to check that all services have an ok status and do not report any failure messages.

rdac healthcheck
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------+
| Cat       | Pod-Type                               | Host         | ID       | Site        | Health Parameter                                    | Status   | Message                                                     |
|-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------|
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | minio-connectivity                                  | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-dependency:configuration-service            | ok       | 2 pod(s) found for configuration-service                    |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-initialization-status                       | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | kafka-connectivity                                  | ok       | Cluster=NTc1NWU1MTQxYmY3MTFlZg, Broker=1, Brokers=[1, 2, 3] |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | minio-connectivity                                  | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-dependency:configuration-service            | ok       | 2 pod(s) found for configuration-service                    |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-initialization-status                       | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | kafka-connectivity                                  | ok       | Cluster=NTc1NWU1MTQxYmY3MTFlZg, Broker=3, Brokers=[1, 2, 3] |
| rda_app   | alert-processor                        | c6cc7b04ab33 | b4ebfb06 |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-processor                        | c6cc7b04ab33 | b4ebfb06 |             | minio-connectivity                                  | ok       |                                                             |
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------+
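To avoid scanning the full healthcheck table by eye, you can filter for rows whose Status column is not ok. The awk sketch below assumes the column layout shown above (Status is the 8th pipe-separated column).

```shell
# Print only the rows whose Status column is not "ok".
# Field 1 is empty because each table row starts with "|".
rdac healthcheck | awk -F'|' '
  NF >= 9 { s = $8; gsub(/ /, "", s)
            if (s != "ok" && s != "Status" && s != "") print }'
```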

1.2.3 Upgrade rdac CLI

Run the below command to upgrade the rdac CLI

rdaf rdac_cli upgrade --tag 8.1.0.1

1.2.4 Upgrade RDA Worker Services

Note

If the Worker was deployed in an HTTP proxy environment, please make sure the required HTTP proxy environment variables are added to the /opt/rdaf/deployment-scripts/values.yaml file under the rda_worker configuration section, as shown below, before upgrading the RDA Worker services.

rda_worker:
  mem_limit: 8G
  memswap_limit: 8G
  privileged: false
  environment:
    RDA_ENABLE_TRACES: 'no'
    RDA_SELF_HEALTH_RESTART_AFTER_FAILURES: 3
    http_proxy:  "http://test:[email protected]:3128"
    https_proxy: "http://test:[email protected]:3128"
    HTTP_PROXY:  "http://test:[email protected]:3128"
    HTTPS_PROXY: "http://test:[email protected]:3128"
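A quick way to confirm all four proxy variables are present before upgrading (a sketch; the file path is taken from the note above, and the grep only checks that each key appears somewhere in the file, not that it sits under rda_worker):

```shell
# Warn about any of the four proxy variables missing from values.yaml.
for var in http_proxy https_proxy HTTP_PROXY HTTPS_PROXY; do
  grep -q "^[[:space:]]*${var}:" /opt/rdaf/deployment-scripts/values.yaml \
    || echo "missing proxy setting: $var" >&2
done
```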

Please run the below command to initiate upgrading the RDA Worker Service with zero downtime

rdaf worker upgrade --tag 8.1.0.1 --rolling-upgrade --timeout 10

Note

The timeout value (10) in the above command is specified in seconds

Note

The rolling-upgrade option upgrades the Worker services running in high-availability mode on one VM at a time in sequence. It completes the upgrade of Worker services running on VM-1 before upgrading them on VM-2, followed by VM-3, and so on.

After completing the Worker services upgrade on all VMs, it will ask for user confirmation; the user has to answer yes to delete the older-version Worker service PODs.

2024-08-12 02:56:11,573 [rdaf.component.worker] INFO     - Collecting worker details for rolling upgrade
2024-08-12 02:56:14,301 [rdaf.component.worker] INFO     - Rolling upgrade worker on 192.168.133.96
+----------+----------+---------------+---------+--------------+-------------+------------+
| Pod ID   | Pod Type | Version       | Age     | Hostname     | Maintenance | Pod Status |
+----------+----------+---------------+---------+--------------+-------------+------------+
| c8a37db9 | worker   | 8.1.0.1       | 3:32:31 | fffe44b43708 | None        | True       |
+----------+----------+---------------+---------+--------------+-------------+------------+
Continue moving above pod to maintenance mode? [yes/no]: yes
2024-08-12 02:57:17,346 [rdaf.component.worker] INFO     - Initiating maintenance mode for pod c8a37db9
2024-08-12 02:57:22,401 [rdaf.component.worker] INFO     - Waiting for worker to be moved to maintenance.
2024-08-12 02:57:35,001 [rdaf.component.worker] INFO     - Following worker container is in maintenance mode
+----------+----------+---------------+---------+--------------+-------------+------------+
| Pod ID   | Pod Type | Version       | Age     | Hostname     | Maintenance | Pod Status |
+----------+----------+---------------+---------+--------------+-------------+------------+
| c8a37db9 | worker   | 8.1.0.1       | 3:33:52 | fffe44b43708 | maintenance | False      |
+----------+----------+---------------+---------+--------------+-------------+------------+
2024-08-12 02:57:35,002 [rdaf.component.worker] INFO     - Waiting for timeout of 3 seconds.

Please run the below command to initiate upgrading the RDA Worker Service without zero downtime (a non-rolling, disruptive upgrade)

rdaf worker upgrade --tag 8.1.0.1

Please wait for 120 seconds to let the newer version of the RDA Worker service containers join the RDA Fabric. Then run the below commands to verify the status of the newer RDA Worker service containers.

rdac pods | grep worker
| Infra | worker      | True        | 6eff605e72c4 | a318f394 | rda-site-01 | 13:45:13 |      4 |        31.21 | 0             | 0            |
| Infra | worker      | True        | ae7244d0d10a | 554c2cd8 | rda-site-01 | 13:40:40 |      4 |        31.21 | 0             | 0            |

rdaf worker status

+------------+----------------+------------+--------------+---------+
| Name       | Host           | Status     | Container Id | Tag     |
+------------+----------------+------------+--------------+---------+
| rda_worker | 192.168.108.53 | Up 4 hours | ea187f89505f | 8.1.0.1 |
| rda_worker | 192.168.108.54 | Up 4 hours | a62b3230bbaa | 8.1.0.1 |
+------------+----------------+------------+--------------+---------+
Run the below command to check that all RDA Worker services have an ok status and do not report any failure messages.

rdac healthcheck
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-----------------------------------------------------------------------------------------------------------------------------+
| Cat       | Pod-Type                               | Host         | ID       | Site        | Health Parameter                                    | Status   | Message                                                                                                                     |
|-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-----------------------------------------------------------------------------------------------------------------------------|
| rda_infra | api-server                             | 1b0542719618 | 1845ae67 |             | service-status                                      | ok       |                                                                                                                             |
| rda_infra | api-server                             | 1b0542719618 | 1845ae67 |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_infra | api-server                             | d4404cffdc7a | a4cfdc6d |             | service-status                                      | ok       |                                                                                                                             |
| rda_infra | api-server                             | d4404cffdc7a | a4cfdc6d |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_infra | asm                                    | 8d3d52a7a475 | 418c9dc1 |             | service-status                                      | ok       |                                                                                                                             |
| rda_infra | asm                                    | 8d3d52a7a475 | 418c9dc1 |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_infra | asm                                    | ab172a9b8229 | 2ac1d67a |             | service-status                                      | ok       |                                                                                                                             |
| rda_infra | asm                                    | ab172a9b8229 | 2ac1d67a |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | asset-dependency                       | 6ac69ca1085c | c2e9dcb9 |             | service-status                                      | ok       |                                                                                                                             |
| rda_app   | asset-dependency                       | 6ac69ca1085c | c2e9dcb9 |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | asset-dependency                       | 58a5f4f460d3 | 0b91caac |             | service-status                                      | ok       |                                                                                                                             |
| rda_app   | asset-dependency                       | 58a5f4f460d3 | 0b91caac |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 9011c2aef498 | 9f7efdc3 |             | service-status                                      | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 9011c2aef498 | 9f7efdc3 |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 9011c2aef498 | 9f7efdc3 |             | DB-connectivity                                     | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 148621ed8c82 | dbf16b82 |             | service-status                                      | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 148621ed8c82 | dbf16b82 |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | authenticator                          | 148621ed8c82 | dbf16b82 |             | DB-connectivity                                     | ok       |                                                                                                                             |
| rda_app   | cfx-app-controller                     | 75ec0f30cfa3 | 1198fdee |             | service-status                                      | ok       |                                                                                                                             |
| rda_app   | cfx-app-controller                     | 75ec0f30cfa3 | 1198fdee |             | minio-connectivity                                  | ok       |                                                                                                                             |
| rda_app   | cfx-app-controller                     | 75ec0f30cfa3 | 1198fdee |             | service-initialization-status                       | ok       |                                                                                                                             |
| rda_app   | cfx-app-controller                     | 75ec0f30cfa3 | 1198fdee |             | DB-connectivity                                     | ok       |                          
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-----------------------------------------------------------------------------------------------------------------------------+

1.2.5 Upgrade OIA Application Services

Run the below command to initiate upgrading the RDA Fabric OIA Application services with zero downtime

rdaf app upgrade OIA --tag 8.1.0.1 --rolling-upgrade --timeout 10

Note

The timeout value (10) in the above command is specified in seconds

Note

The rolling-upgrade option upgrades the OIA application services running in high-availability mode on one VM at a time in sequence. It completes the upgrade of OIA application services running on VM-1 before upgrading them on VM-2, followed by VM-3, and so on.

After completing the OIA application services upgrade on all VMs, it will ask for user confirmation to delete the older version OIA application service PODs.

2024-08-12 03:18:08,705 [rdaf.component.oia] INFO     - Gathering OIA app container details.
2024-08-12 03:18:10,719 [rdaf.component.oia] INFO     - Gathering rdac pod details.
+----------+----------------------+---------+---------+--------------+-------------+------------+
| Pod ID   | Pod Type             | Version | Age     | Hostname     | Maintenance | Pod Status |
+----------+----------------------+---------+---------+--------------+-------------+------------+
| 2992fe69 | cfx-app-controller   | 8.1.0   | 3:44:53 | 0500f773a8ff | None        | True       |
| 336138c8 | reports-registry     | 8.1.0   | 3:44:12 | 92a5e0daa942 | None        | True       |
| ccc5f3ce | cfxdimensions-app-   | 8.1.0   | 3:43:34 | 99192de47ea4 | None        | True       |
|          | notification-service |         |         |              |             |            |
| 03614007 | cfxdimensions-app-   | 8.1.0   | 3:42:54 | fbdf4e5c16c3 | None        | True       |
|          | file-browser         |         |         |              |             |            |
| a4949804 | configuration-       | 8.1.0   | 3:42:15 | 4ea08c8cbf2e | None        | True       |
|          | service              |         |         |              |             |            |
| 8f37c520 | alert-ingester       | 8.1.0   | 3:41:35 | e9e3a3e69cac | None        | True       |
| 249b7104 | webhook-server       | 8.1.0   | 3:12:04 | 1df43cebc888 | None        | True       |
| 76c64336 | smtp-server          | 8.1.0   | 3:08:57 | 03725b0cb91f | None        | True       |
| ad85cb4c | event-consumer       | 8.1.0   | 3:09:58 | 8a7d349da513 | None        | True       |
| 1a788ef3 | alert-processor      | 8.1.0   | 3:11:01 | a7c5294cba3d | None        | True       |
| 970b90b1 | cfxdimensions-app-   | 8.1.0   | 3:38:14 | 01d4245bb90e | None        | True       |
|          | irm_service          |         |         |              |             |            |
| 153aa6ac | ml-config            | 8.1.0   | 3:37:33 | 10d5d6766354 | None        | True       |
| 5aa927a4 | cfxdimensions-app-   | 8.1.0   | 3:36:53 | dcfda7175cb5 | None        | True       |
|          | collaboration        |         |         |              |             |            |
| 6833aa86 | ingestion-tracker    | 8.1.0   | 3:36:13 | ef0e78252e48 | None        | True       |
| afe77cb9 | alert-processor-     | 8.1.0   | 3:35:33 | 6f03c7fdba51 | None        | True       |
|          | companion            |         |         |              |             |            |
+----------+----------------------+---------+---------+--------------+-------------+------------+
Continue moving above pods to maintenance mode? [yes/no]: yes
2024-08-12 03:18:27,159 [rdaf.component.oia] INFO     - Initiating Maintenance Mode...
2024-08-12 03:18:32,978 [rdaf.component.oia] INFO     - Waiting for services to be moved to maintenance.
2024-08-12 03:18:55,771 [rdaf.component.oia] INFO     - Following container are in maintenance mode
+----------+----------------------+---------+---------+--------------+-------------+------------+

Run the below command to initiate the upgrade of the RDA Fabric OIA Application services. Note that this procedure involves downtime, since the services listed above were moved into maintenance mode first.

rdaf app upgrade OIA --tag 8.1.0.1

Wait until all of the new OIA application service containers are in the Up state, then run the below command to verify their status and confirm they are running version 8.1.0.1.

rdaf app status
+-----------------------------------+----------------+------------+--------------+---------+
| Name                              | Host           | Status     | Container Id | Tag     |
+-----------------------------------+----------------+------------+--------------+---------+
| cfx-rda-app-controller            | 192.168.108.51 | Up 3 hours | 2f5970c9ba3f | 8.1.0.1 |
| cfx-rda-app-controller            | 192.168.108.52 | Up 3 hours | 831cb384c807 | 8.1.0.1 |
| cfx-rda-reports-registry          | 192.168.108.51 | Up 4 hours | ae6dfcf1fb88 | 8.1.0.1 |
| cfx-rda-reports-registry          | 192.168.108.52 | Up 4 hours | 3387e3ac2e8b | 8.1.0.1 |
| cfx-rda-notification-service      | 192.168.108.51 | Up 4 hours | 757acc39018c | 8.1.0.1 |
| cfx-rda-notification-service      | 192.168.108.52 | Up 4 hours | a14d8ea906f7 | 8.1.0.1 |
| cfx-rda-file-browser              | 192.168.108.51 | Up 4 hours | 1e83162f75ce | 8.1.0.1 |
| cfx-rda-file-browser              | 192.168.108.52 | Up 4 hours | bff3cca26363 | 8.1.0.1 |
| cfx-rda-configuration-service     | 192.168.108.51 | Up 4 hours | 3c6598ce38e2 | 8.1.0.1 |
| cfx-rda-configuration-service     | 192.168.108.52 | Up 4 hours | de793664be3a | 8.1.0.1 |
| cfx-rda-alert-ingester            | 192.168.108.51 | Up 4 hours | 6df94614f4c2 | 8.1.0.1 |
| cfx-rda-alert-ingester            | 192.168.108.52 | Up 4 hours | b1de17f5c587 | 8.1.0.1 |
| cfx-rda-webhook-server            | 192.168.108.51 | Up 4 hours | 6de31a1f5101 | 8.1.0.1 |
| cfx-rda-webhook-server            | 192.168.108.52 | Up 4 hours | e70a6570d922 | 8.1.0.1 |
| cfx-rda-smtp-server               | 192.168.108.51 | Up 4 hours | efcebbe2a1ee | 8.1.0.1 |
| cfx-rda-smtp-server               | 192.168.108.52 | Up 4 hours | 93b36a17f7f3 | 8.1.0.1 |
+-----------------------------------+----------------+------------+--------------+---------+
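Rather than scanning the Tag column by eye, the check can be scripted. The snippet below is only an illustrative sketch: it runs an awk filter over a sample of the table output (the stale 8.1.0 row is hypothetical); in practice you would pipe the real `rdaf app status` output into the same filter.

```shell
# Sample rows in the pipe-delimited format printed by 'rdaf app status'.
# In practice: rdaf app status | awk -F'|' '...'
status_sample='| cfx-rda-app-controller | 192.168.108.51 | Up 3 hours | 2f5970c9ba3f | 8.1.0.1 |
| cfx-rda-smtp-server    | 192.168.108.52 | Up 4 hours | 93b36a17f7f3 | 8.1.0   |'

# Print the name of any service whose Tag column is not 8.1.0.1.
echo "$status_sample" | awk -F'|' '{
  gsub(/ /, "", $6)                      # trim the Tag column
  if ($6 != "" && $6 != "8.1.0.1") {
    gsub(/^ +| +$/, "", $2)              # trim the Name column
    print $2
  }
}'
```

An empty result means every listed container is already on the new tag.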

Run the below command to verify that all OIA application services are up and running.

rdac pods
+-------+----------------------------------------+-------------+----------------+----------+-------------+----------+--------+--------------+---------------+--------------+
| Cat   | Pod-Type                               | Pod-Ready   | Host           | ID       | Site        | Age      |   CPUs |   Memory(GB) | Active Jobs   | Total Jobs   |
|-------+----------------------------------------+-------------+----------------+----------+-------------+----------+--------+--------------+---------------+--------------|
| App   | alert-ingester                         | True        | rda-alert-inge | 6a6e464d |             | 19:22:36 |      8 |        31.33 |               |              |
| App   | alert-ingester                         | True        | rda-alert-inge | 7f6b42a0 |             | 19:22:53 |      8 |        31.33 |               |              |
| App   | alert-processor                        | True        | rda-alert-proc | a880e491 |             | 19:23:21 |      8 |        31.33 |               |              |
| App   | alert-processor                        | True        | rda-alert-proc | b684609e |             | 19:23:18 |      8 |        31.33 |               |              |
| App   | alert-processor-companion              | True        | rda-alert-proc | 874f3b33 |             | 19:22:24 |      8 |        31.33 |               |              |
| App   | alert-processor-companion              | True        | rda-alert-proc | 70cadaa7 |             | 19:22:05 |      8 |        31.33 |               |              |
| App   | asset-dependency                       | True        | rda-asset-depe | bde06c15 |             | 19:47:50 |      8 |        31.33 |               |              |
| App   | asset-dependency                       | True        | rda-asset-depe | 47b9eb02 |             | 19:47:38 |      8 |        31.33 |               |              |
| App   | authenticator                          | True        | rda-identity-d | faa33e1b |             | 19:47:52 |      8 |        31.33 |               |              |
| App   | authenticator                          | True        | rda-identity-d | 36083c36 |             | 19:47:46 |      8 |        31.33 |               |              |
| App   | cfx-app-controller                     | True        | rda-app-contro | 5fd3c3f4 |             | 19:23:09 |      8 |        31.33 |               |              |
| App   | cfx-app-controller                     | True        | rda-app-contro | d66e5ce8 |             | 19:22:56 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-access-manager       | True        | rda-access-man | ecbb535c |             | 19:47:46 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-access-manager       | True        | rda-access-man | 9a05db5a |             | 19:47:36 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-collaboration        | True        | rda-collaborat | 61b3c53b |             | 19:22:18 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-collaboration        | True        | rda-collaborat | 09b9474e |             | 19:21:57 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-file-browser         | True        | rda-file-brows | 00495640 |             | 19:22:45 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-file-browser         | True        | rda-file-brows | 640f0653 |             | 19:22:29 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-irm_service          | True        | rda-irm-servic | 27e345c5 |             | 19:21:43 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-irm_service          | True        | rda-irm-servic | 23c7e082 |             | 19:21:56 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-notification-service | True        | rda-notificati | bbb5b08b |             | 19:23:20 |      8 |        31.33 |               |              |
| App   | cfxdimensions-app-notification-service | True        | rda-notificati | 9841bcb5 |             | 19:23:02 |      8 |        31.33 |               |              |
+-------+----------------------------------------+-------------+----------------+----------+-------------+----------+--------+--------------+---------------+--------------+

Run the below command to check that all services have an ok status and do not report any failure messages.

rdac healthcheck
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------+
| Cat       | Pod-Type                               | Host         | ID       | Site        | Health Parameter                                    | Status   | Message                                                     |
|-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------|
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | minio-connectivity                                  | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-dependency:configuration-service            | ok       | 2 pod(s) found for configuration-service                    |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | service-initialization-status                       | ok       |                                                             |
| rda_app   | alert-ingester                         | 7f75047e9e44 | daa8c414 |             | kafka-connectivity                                  | ok       | Cluster=NTc1NWU1MTQxYmY3MTFlZg, Broker=1, Brokers=[1, 2, 3] |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | minio-connectivity                                  | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-dependency:configuration-service            | ok       | 2 pod(s) found for configuration-service                    |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | service-initialization-status                       | ok       |                                                             |
| rda_app   | alert-ingester                         | f9ec55862be0 | f9b9231c |             | kafka-connectivity                                  | ok       | Cluster=NTc1NWU1MTQxYmY3MTFlZg, Broker=2, Brokers=[1, 2, 3] |
| rda_app   | alert-processor                        | c6cc7b04ab33 | b4ebfb06 |             | service-status                                      | ok       |                                                             |
| rda_app   | alert-processor                        | c6cc7b04ab33 | b4ebfb06 |             | minio-connectivity                                  | ok       |                                                             |
+-----------+----------------------------------------+--------------+----------+-------------+-----------------------------------------------------+----------+-------------------------------------------------------------+
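Any non-ok Status row can be surfaced with a simple filter instead of reading the whole table. The snippet below is a sketch run against sample lines (the failed kafka-connectivity row is hypothetical); in practice you would pipe `rdac healthcheck` into the same grep.

```shell
# Sample healthcheck rows; in practice: rdac healthcheck | grep -v ' ok '
health_sample='| rda_app | alert-ingester  | 7f75047e9e44 | daa8c414 | | service-status     | ok     |                    |
| rda_app | alert-processor | c6cc7b04ab33 | b4ebfb06 | | kafka-connectivity | failed | broker unreachable |'

# Keep only rows whose Status column is not "ok" (candidates for investigation).
echo "$health_sample" | grep -v '| ok '
```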

1.2.6 Upgrade Event Gateway Services

Important

This upgrade applies to Non-K8s deployments only

Step 1. Prerequisites

  • Event Gateway with the 8.1.0 tag should already be installed

Note

If the event gateway was deployed using the RDAF CLI, follow Step 2 and skip Step 3. If the event gateway was not deployed using the RDAF CLI, skip Step 2 and go to Step 3.

Step 2. Upgrade Event Gateway Using RDAF CLI

  • To upgrade the event gateway, log in to the RDAF CLI VM and execute the following command.

    rdaf event_gateway upgrade --tag 8.1.0.1
    

Step 3. Upgrade Event Gateway Using Docker Compose File

  • Log in to the VM where the Event Gateway is installed

  • Navigate to the location where the Event Gateway was previously installed, using the following command

    cd /opt/rdaf/event_gateway
    
  • Edit the docker-compose file for the Event Gateway using a local editor (e.g. vi), update the tag, and save it

    vi event-gateway-docker-compose.yml
    
    version: '3.1'
    services:
      rda_event_gateway:
        image: docker1.cloudfabrix.io:443/external/ubuntu-rda-event-gateway:8.1.0.1
        restart: always
        network_mode: host
        mem_limit: 6G
        memswap_limit: 6G
        volumes:
          - /opt/rdaf/network_config:/network_config
          - /opt/rdaf/event_gateway/config:/event_gw_config
          - /opt/rdaf/event_gateway/certs:/certs
          - /opt/rdaf/event_gateway/logs:/logs
          - /opt/rdaf/event_gateway/log_archive:/tmp/log_archive
        logging:
          driver: "json-file"
          options:
            max-size: "25m"
            max-file: "5"
        environment:
          RDA_NETWORK_CONFIG: /network_config/rda_network_config.json
          EVENT_GW_MAIN_CONFIG: /event_gw_config/main/main.yml
          EVENT_GW_SNMP_TRAP_CONFIG: /event_gw_config/snmptrap/trap_template.json
          EVENT_GW_SNMP_TRAP_ALERT_CONFIG: /event_gw_config/snmptrap/trap_to_alert_go.yaml
          AGENT_GROUP: event_gateway_site01
          EVENT_GATEWAY_CONFIG_DIR: /event_gw_config
          LOGGER_CONFIG_FILE: /event_gw_config/main/logging.yml
    
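Before pulling the new image, it may help to confirm the compose file now references the 8.1.0.1 tag. This is a minimal sketch shown against a sample `image:` line rather than the real file; in practice, grep the `image:` line out of event-gateway-docker-compose.yml.

```shell
# Sample 'image:' line from the compose file; in practice:
#   grep 'image:' event-gateway-docker-compose.yml
image_line='    image: docker1.cloudfabrix.io:443/external/ubuntu-rda-event-gateway:8.1.0.1'

# Everything after the last colon in the image reference is the tag.
tag=${image_line##*:}
echo "$tag"
```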
  • Please run the following commands

    docker-compose -f event-gateway-docker-compose.yml down
    docker-compose -f event-gateway-docker-compose.yml pull
    docker-compose -f event-gateway-docker-compose.yml up -d
    
  • Use the command shown below to ensure that the RDA docker instances are up and running.

    docker ps -a | grep event
    
  • Use the below mentioned command to check the docker logs for any errors

    docker logs -f --tail 200 <event gateway container id>
    
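When scanning the logs, a case-insensitive filter for common failure keywords narrows the output quickly. The snippet below is a sketch over sample log lines (the ERROR line is hypothetical); in practice, pipe the `docker logs` output into the same grep.

```shell
# Sample log lines; in practice:
#   docker logs --tail 200 <container id> 2>&1 | grep -iE 'error|exception|traceback'
log_sample='2024-08-12 03:18:27 INFO  starting event gateway
2024-08-12 03:18:29 ERROR failed to bind syslog UDP port'

# Keep only lines matching common failure keywords, ignoring case.
echo "$log_sample" | grep -iE 'error|exception|traceback'
```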

Tip

In version 3.6.1 or above, the RDA Event Gateway agent introduces enhanced Syslog TCP/UDP endpoints, developed in Go, that significantly boost event processing rates and optimize system resource utilization.

  • New Syslog TCP Endpoint Type: syslog_tcp_go

  • New Syslog UDP Endpoint Type: syslog_udp_go

1.2.7 RDA Studio Upgrade

Open the rda-studio.yml file, update the existing tag version to 8.1.0.1 so that it matches the format shown in the example below, and save the file

services:
  cfxdx:
    image: docker1.cloudfabrix.io:443/external/ubuntu-cfxdx-nb-nginx-all:8.1.0.1
    restart: unless-stopped
    volumes:
      - /opt/rdaf/cfxdx/home/:/root
      - /opt/rdaf/cfxdx/config/:/tmp/config/
      - /opt/rdaf/cfxdx/output:/tmp/output/
      - /opt/rdaf/config/network_config/:/network_config
    ports:
      - "9998:9998"
    environment:
      #JUPYTER_TOKEN: cfxdxdemo
      NLTK_DATA: "/root/nltk_data"
      CFXDX_CONFIG_FILE: /tmp/config/conf.yml
      RDA_NETWORK_CONFIG: /network_config/config.json
      RDA_USER: xxxxxxx
      RDA_PASSWORD: xxxxxxxxxxxx

After updating the rda-studio.yml file to set the tag version to 8.1.0.1, execute the following commands to pull the latest images and start the services

docker-compose -f rda-studio.yml pull
docker-compose -f rda-studio.yml up -d

1.2.8 Upgrade RDAF Bulkstats Services

Note

The RDAF Bulkstats service is optional and is only necessary if the Bulkstats data ingestion feature is required. Otherwise, you may skip the steps below and go to the next section.

Run the below command to upgrade the bulk_stats services

rdaf bulk_stats upgrade --tag 8.1.0.1

Run the below command to get the bulk_stats status

rdaf bulk_stats status
+----------------+----------------+------------+--------------+---------+
| Name           | Host           | Status     | Container Id | Tag     |
+----------------+----------------+------------+--------------+---------+
| rda_bulk_stats | 192.168.108.51 | Up 4 hours | 2b92d66234c8 | 8.1.0.1 |
| rda_bulk_stats | 192.168.108.52 | Up 4 hours | 9bd8564aaa52 | 8.1.0.1 |
+----------------+----------------+------------+--------------+---------+

1.2.8.1 Upgrade RDAF File Object Services

Note

This service is applicable for Non-K8s deployments only. The RDAF File Object service is optional and only necessary if the Bulkstats data ingestion feature is required. Otherwise, you may skip the steps below and go to the next section.

Run the below command to upgrade File Object services.

rdaf file_object upgrade --tag 8.1.0.1

Run the below command to get the file_object status

rdaf file_object status
+-----------------+----------------+---------------+--------------+---------+
| Name            | Host           | Status        | Container Id | Tag     |
+-----------------+----------------+---------------+--------------+---------+
| rda_file_object | 192.168.108.51 | Up 54 seconds | d1733d6d8995 | 8.1.0.1 |
| rda_file_object | 192.168.108.52 | Up 52 seconds | 1e1342ceac1c | 8.1.0.1 |
+-----------------+----------------+---------------+--------------+---------+