
NetBackup IT Analytics Data Collector Installation Guide for the Cloud

Release 11.4
Last updated: 2025-02-03

Legal Notice
Copyright © 2025 Veritas Technologies LLC. All rights reserved.

Veritas and the Veritas Logo are trademarks or registered trademarks of Veritas Technologies
LLC or its affiliates in the U.S. and other countries. Other names may be trademarks of their
respective owners.

This product may contain third-party software for which Veritas is required to provide attribution
to the third party (“Third-party Programs”). Some of the Third-party Programs are available
under open source or free software licenses. The License Agreement accompanying the
Software does not alter any rights or obligations you may have under those open source or
free software licenses. Refer to the Third-party Legal Notices document accompanying this
Veritas product or available at:

https://www.veritas.com/about/legal/license-agreements

The product described in this document is distributed under licenses restricting its use, copying,
distribution, and decompilation/reverse engineering. No part of this document may be
reproduced in any form by any means without prior written authorization of Veritas Technologies
LLC and its licensors, if any.

THE DOCUMENTATION IS PROVIDED "AS IS" AND ALL EXPRESS OR IMPLIED


CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY IMPLIED
WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
NON-INFRINGEMENT, ARE DISCLAIMED, EXCEPT TO THE EXTENT THAT SUCH
DISCLAIMERS ARE HELD TO BE LEGALLY INVALID. VERITAS TECHNOLOGIES LLC
SHALL NOT BE LIABLE FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES IN
CONNECTION WITH THE FURNISHING, PERFORMANCE, OR USE OF THIS
DOCUMENTATION. THE INFORMATION CONTAINED IN THIS DOCUMENTATION IS
SUBJECT TO CHANGE WITHOUT NOTICE.

The Licensed Software and Documentation are deemed to be commercial computer software
as defined in FAR 12.212 and subject to restricted rights as defined in FAR Section 52.227-19
"Commercial Computer Software - Restricted Rights" and DFARS 227.7202, et seq.
"Commercial Computer Software and Commercial Computer Software Documentation," as
applicable, and any successor regulations, whether delivered by Veritas as on premises or
hosted services. Any use, modification, reproduction release, performance, display or disclosure
of the Licensed Software and Documentation by the U.S. Government shall be solely in
accordance with the terms of this Agreement.

Veritas Technologies LLC


2625 Augustine Drive.
Santa Clara, CA 95054
http://www.veritas.com

Technical Support
Technical Support maintains support centers globally. All support services will be delivered
in accordance with your support agreement and the then-current enterprise technical support
policies. For information about our support offerings and how to contact Technical Support,
visit our website:

https://www.veritas.com/support

You can manage your Veritas account information at the following URL:

https://my.veritas.com

If you have questions regarding an existing support agreement, please email the support
agreement administration team for your region as follows:

Worldwide (except Japan) [email protected]

Japan [email protected]

Documentation
Make sure that you have the current version of the documentation. Each document displays
the date of the last update on page 2. The latest documentation is available on the Veritas
website.

Veritas Services and Operations Readiness Tools (SORT)


Veritas Services and Operations Readiness Tools (SORT) is a website that provides information
and tools to automate and simplify certain time-consuming administrative tasks. Depending
on the product, SORT helps you prepare for installations and upgrades, identify risks in your
datacenters, and improve operational efficiency. To see what services and tools SORT provides
for your product, see the data sheet:

https://sort.veritas.com/data/support/SORT_Data_Sheet.pdf
Contents

Chapter 1 Pre-installation setup for Google Cloud Platform .......... 6
    Overview .......... 6
    Pre-installation setup for GCP .......... 7
    Prerequisites for adding Data Collectors (GCP) .......... 7
    Installation GCP .......... 7
    Adding policy .......... 8
    Testing the collection .......... 9
    Creating an IAM role .......... 9
    Billing Data Access Role .......... 11
    Cloud API .......... 15
    Project access .......... 16

Chapter 2 Pre-Installation Setup for OpenStack Ceilometer .......... 17
    Pre-Installation setup for OpenStack Ceilometer .......... 17
    Prerequisites for adding Data Collectors (OpenStack Ceilometer) .......... 17
    Installation Overview (OpenStack Ceilometer) .......... 18
    Adding an OpenStack Ceilometer Data Collector policy .......... 18

Chapter 3 Pre-Installation Setup for OpenStack Swift .......... 23
    Pre-Installation setup for OpenStack Swift .......... 23
    Prerequisites for adding Data Collectors (OpenStack Swift) .......... 23
    Installation overview (OpenStack Swift) .......... 24
    Adding an OpenStack Swift Data Collector policy .......... 24

Chapter 4 Pre-Installation Setup for Microsoft Azure .......... 28
    Pre-Installation setup for Microsoft Azure .......... 28
    Setting up credentials for Microsoft Azure Data Collection .......... 29
    Install the Azure PowerShell client on a Windows computer .......... 29
    Find your tenant and subscription ID .......... 29
    Register a new application for the Data Collector .......... 30
    Create a principal and assign role to the application .......... 31
    Prerequisites for Adding Data Collectors (Microsoft Azure) .......... 32
    Installation overview (Microsoft Azure) .......... 32
    Add a Microsoft Azure Data Collector policy .......... 33

Chapter 5 Installing the Data Collector Software .......... 37
    Introduction .......... 37
    Installing the WMI Proxy service (Windows host resources only) .......... 38
    Testing WMI connectivity .......... 38
    Install Data Collector Software on Windows .......... 41
    Install Data Collector software on Linux .......... 50
    Deploy Data Collector in native Kubernetes environment .......... 53
    Configure Data Collector manually for Veritas NetBackup .......... 55
    Install Data Collector binaries on Windows (without configuration) .......... 57
    Install Data Collector binaries on Linux host (without configuration) .......... 61

Chapter 6 Validating Data Collection .......... 64
    Validation methods .......... 64
    Data Collectors: Vendor-Specific validation methods .......... 65
    Working with on-demand Data Collection .......... 67
        View real-time logging during an on-demand collection .......... 69
        Generating debug level logs during an on-demand collection .......... 70
    Using the CLI check install utility .......... 72
    List Data Collector configurations .......... 74

Chapter 7 Uninstalling the Data Collector .......... 75
    Uninstall the Data Collector on Linux .......... 75
    Uninstall the Data Collector on Windows .......... 76

Chapter 8 Manually Starting the Data Collector .......... 77
    Introduction .......... 77

Appendix A Firewall Configuration: Default Ports .......... 79
    Firewall configuration: Default ports .......... 79
Chapter 1
Pre-installation setup for
Google Cloud Platform
This chapter includes the following topics:

■ Overview

■ Pre-installation setup for GCP

■ Prerequisites for adding Data Collectors (GCP)

■ Installation GCP

■ Adding policy

■ Testing the collection

■ Creating an IAM role

■ Billing Data Access Role

■ Cloud API

■ Project access

Overview
Google Cloud Platform (GCP) provides infrastructure, platform as a service, and
serverless computing environments for business agility.
GCP stores any amount of data, which can be retrieved on request through secure and customizable services. GCP enables you to create and run virtual machines on Google's infrastructure.

Pre-installation setup for GCP


In most cases, a single instance of the Data Collector can support any number of
enterprise objects. However, each environment has its own unique deployment
configuration requirements, so it is important to understand where the Data Collector
software must be installed so that you can determine how many Data Collectors
must be installed and which servers are best suited for the deployment.

Prerequisites for adding Data Collectors (GCP)


■ Must have a Google account to log in to Google Cloud Platform.
■ Must have administrative access to create a new project or use an existing project.
■ Must have an IAM Service Account User and create a key in .JSON format, which is used as the private key during authentication (a command-line sketch follows this list).

Note: For more information about creating the IAM Service Account user, See “Creating
an IAM role” on page 9.

■ Enable Billing Account Access for the service account and enable the billing export
option.
■ The APTARE project requires the billing API to be enabled.

Note: For more information on the billing API, See “Cloud API” on page 15.
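
If you prefer the command line, the following is a minimal sketch for creating the service account and its .JSON key with the gcloud CLI. The project ID, account name, and key file name are placeholder examples, not values from this guide.

# Create a service account in the connector project (names are examples)
gcloud iam service-accounts create aptare-collector \
    --project=<CONNECTOR_PROJECT_ID> \
    --display-name="Aptare Collector"

# Generate the .JSON key used as the private key in the Data Collector policy
gcloud iam service-accounts keys create aptare-collector-key.json \
    --iam-account=aptare-collector@<CONNECTOR_PROJECT_ID>.iam.gserviceaccount.com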

Installation GCP
1. Update the Local Hosts file. This enables Portal access.
2. In the Portal, add a Data Collector, if one has not already been created.
3. In the Portal, add the Google Cloud Platform Data Collector policy.
4. On the Data Collector Server, install the Data Collector software.
5. If collecting from Windows hosts, install the WMI Proxy Service on one of the
Windows hosts.
6. Validate the Data Collector installation.

Adding policy
The following procedure details how to add a Google Cloud Platform Data Collector policy.
1. Navigate to Admin > Data Collection > Collector Administration.
2. Click Add Policy and select Google Cloud Platform from the Cloud column.

Table 1-1 GCP Data Collector Policy parameters

Field Description

Service Email* Client email of the service account that the Data Collector uses to communicate via REST over http/https to collect the respective probe data.

Private Key* Private key from the downloaded .JSON key file, used when communicating with the server for REST calls.

Active Probes The probes which are to be executed. Following are the available options:

■ VM Instances.
■ Storage Bucket Details.
■ Storage Bucket Usage.
■ Billing Details.

Schedule Relevant schedule at which the probe must trigger.

Test Connection Test Connection initiates a Data Collector process that attempts to connect to the subsystem
using the Service email and private key supplied in the policy. This validation process returns
either a success message or a list of specific connection errors. Test Connection requires that
Agent Services are running.

Several factors affect the response time of the validation request, causing some requests to
take longer than others. For example, there could be a delay when connecting to the subsystem.
Likewise, there could be a delay when getting the response, due to other processing threads
running on the Data Collector.

Testing the collection


Verify the connection using Test Connection. Once the connection is successful,
you can test the collection of data using the Run functionality available in
Admin>Data Collection>Collector Administration.
This test run performs a high-level check of the installation, including a check for
the domain, host group and URL, plus Data Collector policy and database
connectivity.

Creating an IAM role


This role is intended to be used at the organization level so that it can be easily
shared between projects.
If the collector is intended to only monitor a single project then the role can be added
at the project level.
Create a new role.

To create an IAM role


1. Choose the organization from the project selector.
2. Navigate to IAM & Admin > Roles.
3. Choose CREATE ROLE.
4. Specify title and description.

Adding permissions to the role


1. After completing the above steps, continue with the following.
2. Select General Availability.
3. Click Add Permissions.
4. Use the Filter permissions by role field to find and add the following permissions:
■ compute.instances.list
■ compute.machineTypes.get
■ compute.snapshots.list
■ compute.zones.list
■ resourcemanager.projects.get
■ resourcemanager.projects.list
■ storage.buckets.list
■ storage.objects.list

Assigning permission to the role

Note: IAM administrative access is required to assign the permissions. A gcloud sketch for creating and assigning this role follows.
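
The same role can also be created and granted from the command line. A hedged gcloud sketch, assuming organization-level scope; the role ID, organization ID, and service account email are placeholders:

# Create the custom read-only role at the organization level
gcloud iam roles create AptareConnectorReadOnly \
    --organization=<ORGANIZATION_ID> \
    --title="Aptare Connector Read Only" \
    --permissions=compute.instances.list,compute.machineTypes.get,compute.snapshots.list,compute.zones.list,resourcemanager.projects.get,resourcemanager.projects.list,storage.buckets.list,storage.objects.list

# Grant the role to the service account at the organization level
gcloud organizations add-iam-policy-binding <ORGANIZATION_ID> \
    --member="serviceAccount:<SERVICE_ACCOUNT_EMAIL>" \
    --role="organizations/<ORGANIZATION_ID>/roles/AptareConnectorReadOnly"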

Billing Data Access Role


The following process covers:
■ Creating the Billing Data Access role
■ Creating an IAM Service Account User
■ Enabling Billing Account Access
■ Enabling Billing Export

Creating Billing Data Access role


To access billing data tables, a role and a user must exist within the master APTARE project. Typically, an administrator user of an individual project will not have permissions to create roles that can access Billing Data records.

1. Select the Aptare Connector Project.


2. Select IAM & ADMIN > Roles.
3. Choose CREATE ROLE.
4. Specify title and description.
5. Select General Availability.
6. Click Add Permissions.
7. Specify bigquery in the Filter permission by role field to find and add the
following permissions.
■ bigquery.datasets.get
■ bigquery.jobs.create
■ bigquery.tables.get
■ bigquery.tables.getData
■ bigquery.tables.list

Creating an IAM Service Account User


Create an account used to access GCP from the connector.
1. Select the previously created Aptare Connector project.
2. Navigate to IAM & Admin > Service Accounts.
3. Choose CREATE SERVICE ACCOUNT.
4. Specify the required details and click CREATE.

5. Choose the Aptare Connector Read Only role.

6. Click ADD ANOTHER ROLE and select the Aptare Billing Read Only role.

7. Click CONTINUE.
8. Click CREATE KEY and choose the default JSON format.
9. A .JSON file will be downloaded.
10. Click CLOSE.

Enable billing account access


The service account needs access to list the available billing accounts. (A CLI alternative is shown after these steps.)
1. Navigate to Billing > Manage Billing Accounts.
2. Select the box next to the billing account, which opens the permissions panel on the right side.
3. Click Add Members.
4. Specify the email address of the service account user.
5. Choose the Billing Account Viewer role.
6. Click Save.
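
A hedged CLI alternative to the console steps above; the billing account ID and service account email are placeholders, and the gcloud billing command group may require a recent Cloud SDK:

# List billing accounts to find the account ID
gcloud billing accounts list

# Grant the Billing Account Viewer role to the service account
gcloud billing accounts add-iam-policy-binding <BILLING_ACCOUNT_ID> \
    --member="serviceAccount:<SERVICE_ACCOUNT_EMAIL>" \
    --role="roles/billing.viewer"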

Enable billing export


To collect billing data, the billing account needs to be exported to the 'Aptare Collector Project' using a BigQuery table.
The following procedure enables the billing export option; a CLI sketch for creating the dataset follows these steps.
1. Navigate to BigQuery (Big Data).
2. Under Resources, select aptare-connector.

3. Choose Create Dataset on the right panel.


4. Specify title and click Create Dataset.
5. From the main menu, select Billing.
6. Link the project to a Billing Account.

7. Select Billing Export from the menu item.


8. Select BIGQUERY Export option.
9. Select the project and the data set.
10. Click SAVE.
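
The destination dataset can also be created with the bq command-line tool; a minimal sketch, with the dataset name and location as example values. The billing export itself (steps 5 through 10 above) still has to be enabled in the console:

# Create a BigQuery dataset in the connector project to receive the billing export
bq mk --dataset --location=US <CONNECTOR_PROJECT_ID>:billing_export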

Cloud API
The 'Aptare Collector Project' requires the billing API to be enabled.
The following procedure details how to enable the cloud APIs (a gcloud sketch follows these steps).
1. Select the Aptare Collector Project.
2. Navigate to API & Services > Library.
3. Search for the Billing APIs.
4. Select Google Cloud Billing API.

5. Click ENABLE.
6. Search for the Resource APIs.
7. Select Google Cloud Resource Manager API.
8. Click ENABLE.
9. Search for the Compute APIs and then select Google Compute Engine API.
10. Click ENABLE.
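
The same APIs can be enabled in one step from the command line; a hedged sketch, assuming the gcloud CLI is authenticated and pointed at the Aptare Collector Project:

# Enable the Billing, Resource Manager, and Compute Engine APIs
gcloud services enable cloudbilling.googleapis.com \
    cloudresourcemanager.googleapis.com \
    compute.googleapis.com \
    --project=<CONNECTOR_PROJECT_ID>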

Project access
The following procedure details how to grant access to the project.

Grant Access to Projects

For each project that the Data Collector is to track, access must be granted to the service account.
1. Select the project that you want to monitor.
2. Navigate to IAM & Admin > IAM.
3. Click ADD.
4. Specify the email for the service account previously created.
5. Select Aptare Collector Read Only role.
6. Click SAVE.

Note: Repeat the above steps for all projects that need to be tracked. The equivalent gcloud command is shown below.
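
The equivalent gcloud sketch for a single project; the project ID, organization ID, and service account email are placeholders, and the role name assumes the organization-level custom role created earlier:

# Grant the custom read-only role to the service account on a monitored project
gcloud projects add-iam-policy-binding <MONITORED_PROJECT_ID> \
    --member="serviceAccount:<SERVICE_ACCOUNT_EMAIL>" \
    --role="organizations/<ORGANIZATION_ID>/roles/AptareConnectorReadOnly"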
Chapter 2
Pre-Installation Setup for
OpenStack Ceilometer
This chapter includes the following topics:

■ Pre-Installation setup for OpenStack Ceilometer

■ Prerequisites for adding Data Collectors (OpenStack Ceilometer)

■ Installation Overview (OpenStack Ceilometer)

■ Adding an OpenStack Ceilometer Data Collector policy

Pre-Installation setup for OpenStack Ceilometer


In most cases, a single instance of the Data Collector can support any number of
enterprise objects. However, each environment has its own unique deployment
configuration requirements, so it is important to understand where the Data Collector
software must be installed so that you can determine how many Data Collectors
must be installed and which servers are best suited for the deployment.

Prerequisites for adding Data Collectors (OpenStack Ceilometer)

■ 64-bit OS. See the Certified Configurations Guide for supported operating
systems.
■ When the NetBackup IT Analytics system collects data from any vendor
subsystem, the collection process expects name/value pairs to be in US English,
and requires the installation to be done by an Administrator with a US English
locale. The server’s language version can be non-US English.

■ Supports Amazon Corretto 17. Amazon Corretto is a no-cost, multi-platform, production-ready distribution of the Open Java Development Kit (OpenJDK).
■ For performance reasons, do not install Data Collectors on the same server as
the NetBackup IT Analytics Portal. However, if you must have both on the same
server, verify that the Portal and Data Collector software do not reside in the
same directory.
■ Install only one Data Collector on a server (or OS instance).

Installation Overview (OpenStack Ceilometer)


As a part of cloud-based data collection, NetBackup IT Analytics supports OpenStack
Ceilometer, which exposes metrics provided by the various other projects within
OpenStack into a common retrieval mechanism for chargeback billing. Note that currently NetBackup IT Analytics captures metrics only for OpenStack Compute (Nova).
Ceilometer views are available for use in the SQL Template Designer and the Data
Collector policy is classified under Cloud.
Use the following list to ensure that you complete each step in the order indicated.
1. Update the Local Hosts file. This enables Portal access.
2. In the Portal, add a Data Collector, if one has not already been created.
3. In the Portal, add the OpenStack Ceilometer data collector policy.
4. On the Data Collector Server, install the Data Collector software.
5. Validate the Data Collector Installation.

Adding an OpenStack Ceilometer Data Collector policy

■ Before adding the policy: A Data Collector must exist in the Portal, to which you
will add Data Collector Policies.
For specific prerequisites and supported configurations for a specific vendor,
see the Certified Configurations Guide.
■ After adding the policy: For some policies, collections can be run on-demand
using the Run button on the Collector Administration page action bar. The Run
button is only displayed if the policy vendor is supported.
On-demand collection allows you to select which probes and devices to run
collection against. This action collects data the same as a scheduled run, plus
logging information for troubleshooting purposes. For probe descriptions, refer
to the policy.

To add the policy


1 Select Admin > Data Collection > Collector Administration. Currently
configured Portal Data Collectors are displayed.
2 Search for a Collector if required.
3 Select a Data Collector from the list.

4 Click Add Policy, and then select the vendor-specific entry in the menu.

5 Enter or select the parameters. Mandatory parameters are denoted by an asterisk (*):

Field Description

Collector Domain The domain of the collector to which the collector backup policy is being added. This is a read-only
field. By default, the domain for a new policy will be the same as the domain for the collector.
This field is set when you add a collector.

Policy Domain The Policy Domain is the domain of the policy that is being configured for the Data Collector. The
Policy Domain must be set to the same value as the Collector Domain. The domain identifies the
top level of your host group hierarchy. All newly discovered hosts are added to the root host group
associated with the Policy Domain.

Typically, only one Policy Domain will be available in the drop-down list. If you are a Managed
Services Provider, each of your customers will have a unique domain with its own host group
hierarchy.

To find your Domain name, click your login name and select My Profile from the menu. Your
Domain name is displayed in your profile settings.

Authentication Server Address* Enter the IP address of the server with the Identity Service and the port number in the format: <ip address>:<port_number>. The port number is NOT required if you are running on the default port 35357. However, if you are running on a port other than the default, you must specify the port number. (A connectivity check is shown after this table.)

Port Ceilometer API service port.

User ID* Enter a user ID that has access to the tenants/projects. This user must have an Admin role, which
has access to all projects.

Password* Enter the password associated with the User ID.

Compute Performance Collect metrics based on the length of time since the last collection cycle, to a maximum of one hour. One collection per hour is the recommended time period.

Notes Enter or edit notes for your data collector policy. The maximum number of characters is 1024.
Policy notes are retained along with the policy information for the specific vendor and displayed
on the Collector Administration page as a column making them searchable as well.

Field Description

Test Connection Test Connection initiates a Data Collector process that attempts to connect to the subsystem
using the IP addresses and credentials supplied in the policy. This validation process returns
either a success message or a list of specific connection errors. Test Connection requires that
Agent Services are running.

Several factors affect the response time of the validation request, causing some requests to take
longer than others. For example, there could be a delay when connecting to the subsystem.
Likewise, there could be a delay when getting the response, due to other processing threads
running on the Data Collector.

You can also test the collection of data using the Run functionality available in Admin>Data
Collection>Collector Administration. This On-Demand data collection run initiates a high-level
check of the installation at the individual policy level, including a check for the domain, host group,
URL, Data Collector policy and database connectivity. You can also select individual probes and
servers to test the collection run.
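
Before running Test Connection, you can confirm from the Data Collector server that the Identity (Keystone) endpoint is reachable. A hedged sketch, assuming the default admin port 35357; adjust the address, port, and API version to match your deployment:

# The Identity service should return version information on its configured port
curl -s http://<AUTH_SERVER_IP>:35357/v3/ | head -c 200

# Basic reachability check
ping -c 3 <AUTH_SERVER_IP>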
Chapter 3
Pre-Installation Setup for
OpenStack Swift
This chapter includes the following topics:

■ Pre-Installation setup for OpenStack Swift

■ Prerequisites for adding Data Collectors (OpenStack Swift)

■ Installation overview (OpenStack Swift)

■ Adding an OpenStack Swift Data Collector policy

Pre-Installation setup for OpenStack Swift


In most cases, a single instance of the Data Collector can support any number of
enterprise objects. However, each environment has its own unique deployment
configuration requirements, so it is important to understand where the Data Collector
software must be installed so that you can determine how many Data Collectors
must be installed and which servers are best suited for the deployment.

Prerequisites for adding Data Collectors (OpenStack Swift)

■ 64-bit OS. See the Certified Configurations Guide for supported operating
systems.
■ When the NetBackup IT Analytics system collects data from any vendor
subsystem, the collection process expects name/value pairs to be in US English,
and requires the installation to be done by an Administrator with a US English
locale. The server’s language version can be non-US English.

■ Supports Amazon Corretto 17. Amazon Corretto is a no-cost, multi-platform, production-ready distribution of the Open Java Development Kit (OpenJDK).
■ For performance reasons, do not install Data Collectors on the same server as
the NetBackup IT Analytics Portal. However, if you must have both on the same
server, verify that the Portal and Data Collector software do not reside in the
same directory.
■ Install only one Data Collector on a server (or OS instance).

Installation overview (OpenStack Swift)


Use the following list to ensure that you complete each step in the order indicated.
1. Update the Local Hosts file. This enables Portal access.
2. In the Portal, add a Data Collector, if one has not already been created.
3. In the Portal, add the OpenStack Swift data collector policy.
4. On the Data Collector Server, install the Data Collector software.
5. Validate the Data Collector Installation.

Adding an OpenStack Swift Data Collector policy


■ Before adding the policy: A Data Collector must exist in the Portal, to which you
will add Data Collector Policies.
For specific prerequisites and supported configurations for a specific vendor,
see the Certified Configurations Guide.
■ After adding the policy: For some policies, collections can be run on-demand
using the Run button on the Collector Administration page action bar. The Run
button is only displayed if the policy vendor is supported.
On-demand collection allows you to select which probes and devices to run
collection against. This action collects data the same as a scheduled run, plus
logging information for troubleshooting purposes. For probe descriptions, refer
to the policy.
To add the policy
1 Select Admin > Data Collection > Collector Administration. Currently
configured Portal Data Collectors are displayed.
2 Search for a Collector if required.
3 Select a Data Collector from the list.

4 Click Add Policy, and then select the vendor-specific entry in the menu.

5 Enter or select the parameters. Mandatory parameters are denoted by an asterisk (*):

Field Description

Collector Domain The domain of the collector to which the collector backup policy is being added. This is a read-only
field. By default, the domain for a new policy will be the same as the domain for the collector.
This field is set when you add a collector.

Policy Domain The Policy Domain is the domain of the policy that is being configured for the Data Collector. The
Policy Domain must be set to the same value as the Collector Domain. The domain identifies the
top level of your host group hierarchy. All newly discovered hosts are added to the root host group
associated with the Policy Domain.

Typically, only one Policy Domain will be available in the drop-down list. If you are a Managed
Services Provider, each of your customers will have a unique domain with its own host group
hierarchy.

To find your Domain name, click your login name and select My Profile from the menu. Your
Domain name is displayed in your profile settings.

Authentication Server* Enter the IP address of the Authentication Server and the port number in the format: <ip address>:<port_number>.

For V1, the port number is NOT required if you are running on the default port. However, if you
are running on a port other than the default, you must specify the port number.

If using V2 authentication, this port number is the Admin port of the authentication server with
the default value of 35757.

Public Port The public port for V2 authentication. Typically, the default value is 5000.

User ID* Enter a user ID that has access to the tenants/projects. This user must have an Admin role, which
has access to all projects. When you click Get Nodes, the credentials are verified to ensure this
is a valid user.

Password* Enter the password associated with the User ID.

Proxy IP* Enter an IP address or host name for the OpenStack proxy server. This address/name may be
the same as what is configured for the Controller.

Proxy Path* This path identifies the location of the OpenStack Swift configuration files. Default: /etc/swift

User ID* Enter a user ID for the Swift Proxy server. This user must have super-user root privileges (sudo,
sesudo, and pbrun are supported). When you click Get Nodes, an SSH connection is made to
the Proxy server and a list of node IP addresses is returned.

Password* Enter the password associated with the Proxy’s User ID.

Field Description

Get Nodes When you click Get Nodes, the Authentication Server credentials are verified. Next, an SSH
connection is made to the Proxy server to return a list of node IP addresses that will be listed in
the table. This process can take up to a minute to complete. When this processing is complete,
click the Details link to list status and any errors, such as authentication failures, that prevented
collection of the list of nodes. Get Nodes requires that Agent Services are running.

Configure When you click Configure, the Data Collector policy is saved and the Host Inventory window is
displayed so that you can take the following actions before collection can take place for the listed
nodes: Manage Credentials, Manage Paths, and Manage Access Control.
Note: Although a list of nodes has been identified, node collection will not complete successfully
until all configurations have been set and collection is activated in the Host Inventory window.
The following message may appear in the metadata log file, if configurations are not correct:
“Could not find a host for this IP address: <ip_address>”

Active Probes

Schedule Click the clock icon to create a schedule. By default, it is collected every 8 hours.

Every Minute, Hourly, Daily, Weekly, and Monthly schedules may be created. Advanced use of
native CRON strings is also available.

Examples of CRON expressions:

*/30 * * * * means every 30 minutes

*/20 9-18 * * * means every 20 minutes between the hours of 9am and 6pm

*/10 * * * 1-5 means every 10 minutes Mon - Fri.


Note: Explicit schedules set for a Collector policy are relative to the time on the Collector server.
Schedules with frequencies are relative to the time that the Data Collector was restarted.

Collection State Values listed for a storage node represent the state of the main Capacity probe for the host: On,
Off, or N/A. After the initial Get Nodes action, the state will always be N/A because the Configure
step must occur before data collection can be attempted.

Status Values listed for a storage node represent the status of all probes for the host: Error, Success,
or N/A. After the initial Get Nodes action, the state will always be N/A because the Configure step
must occur before data collection can be attempted.

Notes Enter or edit notes for your data collector policy. The maximum number of characters is 1024.
Policy notes are retained along with the policy information for the specific vendor and displayed
on the Collector Administration page as a column making them searchable as well.
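
Before clicking Get Nodes, you can confirm from the Data Collector server that the policy values above will work: that the Authentication Server answers on its public port, and that the proxy User ID can reach the configured Proxy Path with super-user rights. A hedged sketch; host names, the user, and the port are placeholders, and sudo is one of the supported privilege-escalation methods:

# Verify the Authentication Server answers on the public port (default 5000 for V2)
curl -s http://<AUTH_SERVER_IP>:5000/ | head -c 200

# Verify SSH access to the Swift proxy and visibility of the configuration path
ssh <PROXY_USER>@<PROXY_HOST> "sudo ls /etc/swift"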
Chapter 4
Pre-Installation Setup for
Microsoft Azure
This chapter includes the following topics:

■ Pre-Installation setup for Microsoft Azure

■ Setting up credentials for Microsoft Azure Data Collection

■ Install the Azure PowerShell client on a Windows computer

■ Find your tenant and subscription ID

■ Register a new application for the Data Collector

■ Create a principal and assign role to the application

■ Prerequisites for Adding Data Collectors (Microsoft Azure)

■ Installation overview (Microsoft Azure)

■ Add a Microsoft Azure Data Collector policy

Pre-Installation setup for Microsoft Azure


In most cases, a single instance of the Data Collector can support any number of
enterprise objects. However, each environment has its own unique deployment
configuration requirements, so it is important to understand where the Data Collector
software must be installed so that you can determine how many Data Collectors
must be installed and which servers are best suited for the deployment.
As a part of cloud-based data collection, NetBackup IT Analytics supports Microsoft
Azure. Policies identify backup, storage accounts, billing and VMs for Azure
resources. To work with Azure, you need one or more Azure subscriptions. Azure
reports are available for billing, capacity and virtual machines. Azure database
views are available in the SQL Template Designer to construct custom reports.
Azure enterprise objects are also defined in the Dynamic Template Designer.

Note: The Data Collector only supports Azure resources deployed with the Resource
Manager model.

Setting up credentials for Microsoft Azure Data Collection

To set up credentials for Azure data collection:
1. See “Install the Azure PowerShell client on a Windows computer” on page 29.
2. See “Find your tenant and subscription ID” on page 29.
3. See “Register a new application for the Data Collector” on page 30.
4. See “Create a principal and assign role to the application” on page 31.

Install the Azure PowerShell client on a Windows computer

Use Azure with Windows PowerShell to access the information required for data collection. You must execute Windows PowerShell as an administrator.
To install the Azure Client with Windows PowerShell
1 Navigate to http://go.microsoft.com/fwlink/p/?linkid=320376&clcid=0x409
2 Install, and at the prompt enter the following to import the modules:

Install-Module PowerShellGet -Force
Install-Module -Name AzureRM -AllowClobber
Import-Module -Name AzureRM

Find your tenant and subscription ID


1. Execute Azure Windows PowerShell as an administrator.
2. Log into your Azure account:

Login-AzureRMAccount

3. View the Tenant ID and Subscription ID in the output:

Get-AzureRmSubscription

Register a new application for the Data Collector


Azure requires a new Application be registered before you can interact with it.
To register a new Application for the Data Collector
You must be logged into your account within Azure Windows PowerShell. The
following steps are performed at the Microsoft Azure PowerShell prompt.
1 Set the context using your Tenant ID and Subscription ID by entering:

Set-AzureRMContext -SubscriptionId <SUBSCRIPTIONID> -TenantId <TENANTID>

2 Set the Password in a SecureString:

$securePwd = ConvertTo-SecureString "<PASSWORD>" -AsPlainText -Force

3 Revise the DisplayName, Hostname, and Azure Default Directory. Copy/paste the following at the prompt:

Note: Azure Default Directory can be found in your Azure account under
Subscription>Overview.

$azureAdApplication = New-AzureRmADApplication -DisplayName "<DISPLAYNAME>" -HomePage "https://<HOSTNAME>.<AZURE-DEFAULT-DIRECTORY>" -IdentifierUris "https://<HOSTNAME>.<AZURE-DEFAULT-DIRECTORY>" -Password $securePwd -AvailableToOtherTenants $true

4 Enter the following to display your application parameters:

$azureAdApplication

5 Write down the Subscription ID, Tenant ID, Application ID, and the Password
you chose. The Application ID is displayed in the output. The Data Collector
requires those four parameters.

Create a principal and assign role to the application

This step enables the newly registered application to have access rights to the
subscription.
To enable access rights to the subscription, you can either create a Contributor role
or create a combination of Reader and Custom roles for the application. Since
Reader and Custom roles have restricted permissions compared to the Contributor
role, you can choose to assign them to the application if the privileges available
through the Contributor role do not comply with your organizational policy.

To create a principal and assign a Contributor role:


1. Create a Principal for the Application:

New-AzureRmADServicePrincipal -ApplicationId <APPLICATIONID>

2. Assign the Contributor role to the application:

New-AzureRmRoleAssignment -RoleDefinitionName Contributor -ServicePrincipalName <APPLICATIONID>

To create a principal and assign Reader and Custom roles:


1. Create a Principal for the Application:

New-AzureRmADServicePrincipal -ApplicationId <APPLICATIONID>

2. Assign the Reader role to the application:

New-AzureRmRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName <APPLICATIONID>

3. Create a custom role with a JSON template.

For example, create a file customrole.json using the following:

{
  "Name": "<Role-Name>",
  "Id": null,
  "IsCustom": true,
  "Description": "<Role Description>",
  "Actions": [ "Microsoft.Storage/storageAccounts/listkeys/action" ],
  "NotActions": [],
  "DataActions": [],
  "NotDataActions": [],
  "AssignableScopes": [ "/subscriptions/<Subscription ID>" ]
}

4. Create the custom role:

New-AzRoleDefinition -InputFile "C:\CustomRoles\customrole.json"

5. Assign the custom role to the application:

New-AzureRmRoleAssignment -RoleDefinitionName <customRoleName> -ServicePrincipalName <APPLICATIONID>

Prerequisites for Adding Data Collectors (Microsoft Azure)

■ 64-bit OS. See the Certified Configurations Guide for supported operating
systems.
■ When the NetBackup IT Analytics system collects data from any vendor
subsystem, the collection process expects name/value pairs to be in US English,
and requires the installation to be done by an Administrator with a US English
locale. The server’s language version can be non-US English.
■ Supports Amazon Corretto 17. Amazon Corretto is a no-cost, multi-platform,
production-ready distribution of the Open Java Development Kit (OpenJDK).
■ For performance reasons, do not install Data Collectors on the same server as
the NetBackup IT Analytics Portal. However, if you must have both on the same
server, verify that the Portal and Data Collector software do not reside in the
same directory.
■ Install only one Data Collector on a server (or OS instance).
■ Requires port 443. (A quick connectivity check is shown after this list.)
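
A quick, hedged way to confirm that outbound port 443 is open from the Data Collector server; management.azure.com is the Azure Resource Manager endpoint for the public cloud, so any HTTP response line indicates the port is reachable:

# Expect an HTTP status line; a timeout suggests port 443 is blocked
curl -sI --max-time 10 https://management.azure.com/ | head -1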

Installation overview (Microsoft Azure)


Use the following list to ensure that you complete each step in the order indicated.
1. Update the Local Hosts file. This enables Portal access.
2. In the Portal, add a Data Collector, if one has not already been created.
3. In the Portal, add the Microsoft Azure data collector policy.
4. On the Data Collector Server, install the Data Collector software.

5. Validate the Data Collector Installation.

Add a Microsoft Azure Data Collector policy


■ Before adding the policy: A Data Collector must exist in the Portal, to which you
will add Data Collector Policies.
For specific prerequisites and supported configurations for a specific vendor,
see the Certified Configurations Guide.
■ After adding the policy: For some policies, collections can be run on-demand
using the Run button on the Collector Administration page action bar. The Run
button is only displayed if the policy vendor is supported.
On-demand collection allows you to select which probes and devices to run
collection against. This action collects data the same as a scheduled run, plus
logging information for troubleshooting purposes. For probe descriptions, refer
to the policy.
To add the policy
1 Select Admin > Data Collection > Collector Administration. Currently
configured Portal Data Collectors are displayed.
2 Search for a Collector if required.
3 Select a Data Collector from the list.

4 Click Add Policy, and then select the vendor-specific entry in the menu.

5 Enter or select the parameters. Mandatory parameters are denoted by an asterisk (*):

Field Description

Collector Domain The domain of the collector to which the collector backup policy is being added. This is a read-only
field. By default, the domain for a new policy will be the same as the domain for the collector.
This field is set when you add a collector.

Policy Domain The Policy Domain is the domain of the policy that is being configured for the Data Collector. The
Policy Domain must be set to the same value as the Collector Domain. The domain identifies the
top level of your host group hierarchy. All newly discovered hosts are added to the root host group
associated with the Policy Domain.

Typically, only one Policy Domain will be available in the drop-down list. If you are a Managed
Services Provider, each of your customers will have a unique domain with its own host group
hierarchy.

To find your Domain name, click your login name and select My Profile from the menu. Your
Domain name is displayed in your profile settings.

Offer ID* Offer ID of your Microsoft Azure Cloud Subscription. To locate your Offer ID, log into your Azure
portal and the Offer ID is listed as a parameter under your Subscription service.

Tenant ID* Tenant ID of your Microsoft Azure Cloud account. Also known as the GUID.

Application ID* Application ID associated with the registered Microsoft Azure application. Azure requires an application for the Data Collector to interact with it.

See “To register a new Application for the Data Collector” on page 30.

Password* Password associated with the Application ID (the registered Microsoft Azure application).

Cloud Name Select cloud name.


Note: Default value is Public Cloud.

Subscription ID* Subscription ID of your Microsoft Azure Cloud account.


Note: A maximum of 105 subscriptions can be selected in a policy.

Subscription Details* Subscription details of your Microsoft Azure Cloud account.

Note: The subscription list does not update automatically. If a subscription is added or deleted, update the list manually by clicking the Get Subscriptions button.

Virtual Machines Select to set a collection schedule for Azure virtual machines.

Storage Accounts Select to set a collection schedule for Azure storage accounts.

Billing Select to set a collection schedule for Azure billing.



Field Description

Azure Backup Select to set a collection schedule for Azure Backup.

Notes Enter or edit notes for your data collector policy. The maximum number of characters is 1024.
Policy notes are retained along with the policy information for the specific vendor and displayed
on the Collector Administration page as a column making them searchable as well.

Test Connection Test Connection initiates a Data Collector process that attempts to connect to the subsystem
using the IP addresses and credentials supplied in the policy. This validation process returns
either a success message or a list of specific connection errors. Test Connection requires that
Agent Services are running.

Several factors affect the response time of the validation request, causing some requests to take
longer than others. For example, there could be a delay when connecting to the subsystem.
Likewise, there could be a delay when getting the response, due to other processing threads
running on the Data Collector.

You can also test the collection of data using the Run functionality available in Admin>Data
Collection>Collector Administration. This On-Demand data collection run initiates a high-level
check of the installation at the individual policy level, including a check for the domain, host group,
URL, Data Collector policy and database connectivity. You can also select individual probes and
servers to test the collection run.
Chapter 5
Installing the Data
Collector Software
This chapter includes the following topics:

■ Introduction

■ Installing the WMI Proxy service (Windows host resources only)

■ Testing WMI connectivity

■ Install Data Collector Software on Windows

■ Install Data Collector software on Linux

■ Deploy Data Collector in native Kubernetes environment

■ Configure Data Collector manually for Veritas NetBackup

■ Install Data Collector binaries on Windows (without configuration)

■ Install Data Collector binaries on Linux host (without configuration)

Introduction
This section includes the instructions for installing the Data Collector software on
the Data Collector Server. Data Collector software is supported in various flavors
of Linux and Windows. On Windows, if you are collecting data from host resources,
you may need to install the WMI Proxy Service. The WMI Proxy Service is installed
by default, as part of the Data Collector installation on a Windows server.
A GUI based version is available for Windows and a console (command line) based
interface is available for Linux.

When the NetBackup IT Analytics system collects data from any vendor subsystem,
the collection process expects name/value pairs to be in US English, and requires
the installation to be done by an Administrator with a US English locale. The server’s
language version can be non-US English.

Note: Log in as a Local Administrator to have the necessary permissions for this
installation.

Installing the WMI Proxy service (Windows host resources only)

To collect data from Windows hosts, choose a Windows host on which to install
the WMI proxy.
■ This is required only if you are collecting data from Windows Host Resources.
■ The WMI Proxy needs to be installed on only one Windows host.
■ If the Data Collector is on a Windows server, the WMI Proxy will be installed
there as part of the Data Collector installation.
■ If the Data Collector is on a Linux server, you must identify a Windows server
on which to install the WMI proxy service.
See “Install Data Collector Software on Windows” on page 41.

Testing WMI connectivity


The Windows Management Instrumentation (WMI) Proxy is used by NetBackup IT
Analytics to collect data from Windows hosts. Should you have connectivity issues,
these steps can be taken to test and troubleshoot connectivity.
To verify that WMI is working properly, take the following steps:
1. Log in to the Data Collector server as an Administrator.
2. From the Windows Start menu, type Run in the search box to launch the
following window where you will enter wbemtest.exe and click OK.

3. In the Windows Management Instrumentation Tester window, click Connect.

4. In the Connect window, preface the Namespace entry with the IP address or
hostname of the target remote server in the following format:

\\<IP Address>\root\cimv2

5. Complete the following fields in the Connect window and then click Connect.
■ User - Enter the credentials for accessing the remote computer. This may
require you to enable RPC (the remote procedure call protocol) on the
remote computer.
■ Password
■ Authority: Enter NTLMDOMAIN:<NameOfDomain>
where NameOfDomain is the domain of the user account specified in the
User field.

6. Click Enum Classes.


7. In the Superclass Info window, select the Recursive radio button, but do not
enter a superclass name. Then, click OK.
8. The WMI Tester will generate a list of classes. If this list does not appear, go
to the Microsoft Developer Network web site for troubleshooting help.
http://msdn.microsoft.com/en-us/library/ms735120.aspx

Install Data Collector Software on Windows


To install your Data Collector software:
1 Login to the Data Collector server as a local administrator.
2 Go to the downloads section under Support on www.veritas.com and click the
relevant download link.
Once downloaded, the Data Collector Installation Wizard launches automatically. If it does not, navigate to its directory and double-click the executable file Setup.exe.
3 Review the recommendations on the welcome page and click Next.
You are advised to close all other programs during this installation.

4 The installation wizard validates the system environment. On successful validation, click Next.

5 Review the End User License Agreement (EULA), select I accept the terms
of the license agreement, and click Next.

6 Specify the directory where you would like to install the Data Collector software and click Next. The default Windows path is C:\Program Files\Veritas\AnalyticsCollector. Accepting the default path is recommended.
If you specify a custom directory, the installer creates the AnalyticsCollector folder within the specified directory.

7 Provide accurate details on the next page as described below and then click Next.

Data Collection Task Select Data Collector (includes WMI Proxy) or WMI Proxy Server (only) from the list.

A single Data Collector can be installed for multiple vendor subsystems on a single server.

Data Collector Registration File Enter the absolute path of the registration file downloaded from the NetBackup IT Analytics Portal.

If a registration file is not available, generate and download it from the Portal and provide its path. This will auto-populate the next three fields.

Data Collector Name Read-only and auto-populated.

Data Collector Passcode Read-only and auto-populated.

Data Receiver URL Read-only and auto-populated.

Proxy Settings 1 HTTP/HTTPS: Enter the hostname or IP address and a port number.
2 UserId: User ID of the proxy server.
3 Password: Password of the proxy server.
4 No Proxy For: Enter the host names or IP addresses, separated by commas, that will not be routed through the proxy.

8 Review the installation summary and the available disk space before you
proceed with the installation.

9 Click Next to initiate the installation.



10 Review the post install details and click Next.



11 To validate the Data Collector installation, run the C:\Program Files\Veritas\AnalyticsCollector\mbs\bin\checkinstall.bat batch file.
Close the terminal window once the validation is complete and then click Next.

If you wish to run checkinstall.bat later, you can run the script from the command prompt.

12 On successful installation of NetBackup IT Analytics Data Collector, click Finish.


Your Data Collector installation is complete.

Install Data Collector software on Linux


To install Data Collector software on Linux:
1 Log in as root on the server where the NetBackup IT Analytics Data Collector is to be installed.
2 If the Data Collector system has low entropy, the performance of cryptographic functions can suffer and some installation steps can take a considerable amount of time to complete. You can identify the entropy level of the system from the content of the /proc/sys/kernel/random/entropy_avail file using the command # cat /proc/sys/kernel/random/entropy_avail. If this value is not consistently above 400, install the rng-tools and start the services as described below on the Data Collector system.
For RHEL or OEL:
■ Access the command prompt.
■ Install the rng-tools.

yum install rng-tools

■ Start the services.

systemctl start rngd

■ Enable the services.

systemctl enable rngd

For SUSE:
■ Access the command prompt.
■ Install the rng-tools.

zypper install rng-tools

■ Start the services.

systemctl start rng-tools

■ Enable the services.

systemctl enable rng-tools
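Once the rngd (or rng-tools on SUSE) service is running, you can re-check the entropy level; this quick verification is a suggestion rather than a required step:

cat /proc/sys/kernel/random/entropy_avail

The value should stay consistently above 400 before you continue with the installation.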

3 Ensure the following rpms are present on the system:
On SUSE: libXrender1, libXtst6, and insserv-compat
On other Linux systems: libXtst, libXrender, and chkconfig
Since the above rpms are essential for proper functioning of the Data Collector, you can run the below commands on the Data Collector server to check whether the rpms are present.
On SUSE: rpm -q libXrender1 libXtst6 insserv-compat
On other Linux systems: rpm -q libXtst libXrender chkconfig
The output of the above commands prints the rpms that are present on the system. If any of these packages are missing, install them as shown in the example below.
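Missing packages might be installed with the platform package manager (a sketch; install only the packages that the rpm -q check reports as missing):

On RHEL or OEL:
yum install libXtst libXrender chkconfig

On SUSE:
zypper install libXrender1 libXtst6 insserv-compat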
4 Go to the downloads section under Support on www.veritas.com and click the
relevant download link.
5 Mount the ISO image that you downloaded.

mkdir /mnt/diska
mount -o loop <itanalytics_datacollector_linux_xxxxx.iso> /mnt/diska

Substitute the name of the ISO image you have downloaded.


6 Start the installer:

cd /
/mnt/diska/dc_installer.sh

7 Review the End User License Agreement (EULA) and enter accept to agree.
8 Provide the install location. The default location is
/usr/openv/analyticscollector. Accepting the default paths is
recommended.
If you specify a custom location, analyticscollector directory is created at
the specified location.
9 The installer requests the following details.
■ Data Collector Registration File Path: Enter the absolute file path of the
registration file generated and downloaded from the NetBackup IT Analytics
Portal.
■ Web Proxy (HTTP) settings can be configured. Enter y to configure proxy.
The installer prompts for:
■ HTTP Proxy IP Address: Enter the hostname or IP address of the HTTP proxy server.
■ HTTP Proxy Port: Enter the proxy port number for HTTP proxy.
■ Proxy UserId and password: Enter the credentials for the proxy server.
■ No Proxy For: Enter the host names or IP addresses separated by
commas that will not be routed through the proxy.

The Data Collector installation is complete. You can run the
<Data_Collector_Install_Location>/analyticscollector/mbs/bin/checkinstall.sh file for verification.
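For example, with the default install location of /usr/openv/analyticscollector, the verification script would typically be run as follows (a sketch; substitute your actual install location if it differs):

/usr/openv/analyticscollector/mbs/bin/checkinstall.sh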
Deploy Data Collector in native Kubernetes environment
This procedure provides the steps to deploy the Data Collector Docker image on a Kubernetes cluster through an operator with the required configuration on Linux hosts. This method enables efficient Data Collector installation and reduces the human errors caused during manual or ISO-based installations.

Prerequisites and dependencies


System requirements and installation dependencies for the system on which Data
Collector will be installed are listed below:
■ Obtain the Docker image generated from the CI/CD build.
■ Kubernetes must be pre-installed on the system.
■ Assume root role on the host system.
■ Kubernetes cluster must be accessible on the system.
■ Ensure that the file system supporting the /data directory has enough free
space as recommended in the NetBackup IT Analytics Certified Configurations
Guide for Data Collector.
The /data directory in the host system will be mounted inside the container as
/usr/openv/analyticscollector.

■ Obtain the following Data Collector details. You are required to supply these
details to the installer during the installation process.
■ Registry: The name of the registry to which you want to push the installer
images.
■ Absolute path of Data Receiver Certificate file: Absolute path of the data
receiver certificate file downloaded from NetBackup IT Analytics Portal.
■ Absolute path of the Data Collector Registration File: Absolute path of the
Data Collector registration file downloaded from the NetBackup IT Analytics
Portal.
■ Proxy settings:
■ Portal IP address: IP address of the system hosting the NetBackup IT
Analytics Portal.
■ Portal HostName: aptareportal.<DOMAIN> or itanalyticsportal.<DOMAIN>
■ Agent HostName: aptareagent.<DOMAIN> or itanalyticsagent.<DOMAIN>
■ StorageClass Name: Name of the Kubernetes storage class to be used.
■ Obtain the itanalytics_k8s_artifacts.tar from the Veritas Download Center. The tarball has the container image, operator image, set of .yaml files, and the scripts.

Deploy the Data Collector in Kubernetes environment


To deploy the Data Collector in Kubernetes environment:
1 Login to the Kubernetes cluster.
2 Run this command on the primary node and label the node on which you want
to deploy the Data Collector.

kubectl label node <worker_node_name> itaDcNodeKey=itaDcDeploymentNode

3 From the itanalytics_k8s_artifacts.tar location, run this command to extract the installation scripts.

tar -xvf itanalytics_k8s_artifacts.tar scripts

This saves a scripts folder at the itanalytics_k8s_artifacts.tar file location.
4 From the scripts folder, run this script.

cd scripts/
sh itanalytics_dc_installer.sh

Note: The installation logs are saved to itanalytics_dc_installer_<time_stamp>.log.

5 Provide the Data Collector configuration details when asked by the installer in
the following order.
■ Registry
After you provide the registry name, the installer asks for confirmation to proceed with pushing the images. Enter y for a fresh installation.
If you need to re-run the installation and this step has already completed successfully for the same cluster node, you can enter n to avoid a rewrite and bypass this step.
■ Absolute path of Data Receiver Certificate file (if you have set an https://
URL for the data receiver)
■ Absolute path of the Data Collector registration file
■ Proxy settings
■ Portal IP address
■ Portal HostName
■ Agent HostName
■ StorageClass Name

6 The installer asks to confirm the configuration details before proceeding with the installation. Enter y to proceed with the Data Collector installation.
After a successful installation, verify whether the Data Collector status appears as Online on the NetBackup IT Analytics Portal.

Connect to the pod instance


Run this command to connect to the pod instance and also to facilitate debugging
when required.

# kubectl exec -it <pod ID> -- bash
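If the pod ID is not known, it can be listed first with kubectl (a sketch that assumes the Data Collector pod was deployed in the currently selected namespace; adjust the namespace if your deployment differs):

# kubectl get pods -o wide
# kubectl exec -it <pod ID> -- bash

The first command lists the pods so you can note the Data Collector pod name; the second opens a shell inside that pod.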

Configure Data Collector manually for Veritas NetBackup
From NetBackup version 10.1.1 onwards, the Veritas NetBackup primary server installation also deploys NetBackup IT Analytics Data Collector binaries automatically on Windows (C:\Program Files\Veritas\AnalyticsCollector) and Linux (/usr/openv/analyticscollector) systems. Also, if the Veritas NetBackup primary server is managed under Veritas Alta, the NetBackup IT Analytics Data Collector is automatically configured with the NetBackup IT Analytics Portal.
This procedure provides the manual steps to configure the Data Collector for Veritas
NetBackup when Veritas NetBackup primary is not managed under Veritas Alta.
Note that NetBackup IT Analytics Portal must be already installed in your data
center and a Data Collector entry must be added via the Collector Administration
screen of the portal for each NetBackup primary server before you perform this
configuration.
Keep the registration file path (generated while creating the data collector on the
Portal and copied to the NetBackup primary server) handy when you configure the
Data Collector.
See Add/Edit Data Collectors section in the NetBackup IT Analytics User Guide
for more information.
To configure the Data Collector manually on Windows:


1 Use the responsefile.cmd received through the installer media for this configuration. You can configure it as described in the steps below.
2 Edit the response file as a batch script responsefile.cmd with the following contents, or create one if required. These are the responses to the user input required to configure the Data Collector.

SET DATACOLLECTOR_REGISTRATION_FILE_PATH=<path to the .json file>


SET HTTP_PROXY_CONF=N
SET PROXY_HTTP_URL=
SET PROXY_HTTP_PORT=
SET PROXY_HTTPS_URL=
SET PROXY_HTTPS_PORT=
SET PROXY_USERID=
SET PROXY_PASSWORD=
SET PROXY_NOT_FOR=

3 Run the command:

"C:\ProgramData\Veritas\NetBackup IT Analytics\DC\configure.cmd"
/RESPFILE:<response_file_path> /INSTALL_TYPE:CONFIG
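For example, if the response file was saved as C:\temp\responsefile.cmd (a hypothetical path used only for illustration), the command would look like:

REM C:\temp\responsefile.cmd is an example path; use your own response file location
"C:\ProgramData\Veritas\NetBackup IT Analytics\DC\configure.cmd" /RESPFILE:C:\temp\responsefile.cmd /INSTALL_TYPE:CONFIG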

To configure the Data Collector manually on Linux:


1 Use the sample [responsefile.cmd] available on the install media and also
available at the <Data collector install location>/installer path on
the system for this configuration. You can configure it as described in the steps
below.
2 Update the response file with the following contents. You can also create one
if required with the following content.

COLLECTOR_REGISTRATION_PATH=<path to the json file>


HTTP_PROXY_CONF=N
HTTP_PROXY_ADDRESS=
HTTP_PROXY_PORT=
HTTPS_PROXY_ADDRESS=
HTTPS_PROXY_PORT=
PROXY_USERNAME=
PROXY_PASSWORD=
PROXY_EXCLUDE=
3 Update the value for each field with appropriate data.


A sample responsefile is available on the install media as well as the <Data
collector install location>/installer path on the system.

4 Run any one of the following commands:

<Install media>/dc_installer.sh -c <responsefile path>

Or

<install location>/installer/dc_installer.sh -c <responsefile path>
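For example, assuming the default install location and a response file saved at /tmp/responsefile (hypothetical values used only for illustration):

# /tmp/responsefile is an example path; use your own response file location
/usr/openv/analyticscollector/installer/dc_installer.sh -c /tmp/responsefile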

Install Data Collector binaries on Windows (without configuration)
This Data Collector installation allows you to install the collector independent of the
portal software installation. The collector remains disconnected from the portal until
you configure it using a response file that contains credentials of the Data Collector
created on the NetBackup IT Analytics Portal and the data receiver.

Install the Data Collector


To install a Data Collector:
1 Download and mount the Data Collector installer ISO file.
2 Install the Data Collector using silentinstall.cmd and follow the installation
prompt.
You can install the Data Collector using one of the following options (a concrete example follows this procedure):
■ Install at default location:

<ISO_MOUNT_DRIVE>:\silentinstall.cmd /INSTALL_TYPE:INSTALL

■ Install at custom location:

<ISO_MOUNT_DRIVE>:\silentinstall.cmd /INSTALL_PATH:<custom
location for dc installation> /INSTALL_TYPE:INSTALL

The independent Data Collector installation is complete.
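For example, if the ISO is mounted as drive E: and the collector should be installed under D:\Veritas (both values are hypothetical and used only for illustration):

REM E: and D:\Veritas are example values; substitute your mount drive and install path
E:\silentinstall.cmd /INSTALL_PATH:D:\Veritas /INSTALL_TYPE:INSTALL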

Configure the Data Collector using responsefile


A sample responsefile is saved when you install the Data Collector. To connect the
Data Collector with the NetBackup IT Analytics Portal, you must configure its
responsefile with the credentials of the Data Collector created on the portal and run
a configuration command as described in the procedure below.
To configure the Data Collector:
1 Obtain the following details from the NetBackup IT Analytics Portal:
■ Registration file downloaded from the Portal.
■ Proxy server configuration details


2 Update the responseFile.cmd with the above values.

@ECHO OFF
REM -------------------------------------------------
SET DATACOLLECTOR_REGISTRATION_FILE_PATH=
REM -------------------------------------------------
REM Description: Enter the Data Collector's Key File path. The
file path must include name of the file that was downloaded from
the Portal.
REM Valid input values: Absolute path of key file
REM Required: True

REM -------------------------------------------------
SET HTTP_PROXY_CONF=N
REM -------------------------------------------------
REM Description: Indicates whether a proxy should be configured or not
REM Valid input values: Y,N
REM Default value: N

REM -------------------------------------------------
SET PROXY_HTTP_URL=
REM -------------------------------------------------
REM Description: IP/hostname for HTTP Proxy
REM Valid input values: 10.20.30.40, localhost

REM -------------------------------------------------
SET PROXY_HTTP_PORT=
REM -------------------------------------------------
REM Description: Port for HTTP proxy
REM Valid input values: Any number between 0 and 65535

REM -------------------------------------------------
SET PROXY_HTTPS_URL=
REM -------------------------------------------------
REM Description: IP/hostname for HTTPS Proxy
REM Valid input values: 10.20.30.40, localhost

REM -------------------------------------------------
SET PROXY_HTTPS_PORT=
REM -------------------------------------------------
REM Description: Port for HTTPS proxy
REM Valid input values: Any number between 0 and 65535
REM -------------------------------------------------
SET PROXY_USERID=
REM -------------------------------------------------
REM Description: Proxy UserId
REM Default value:

REM -------------------------------------------------
SET PROXY_PASSWORD=
REM -------------------------------------------------
REM Description: Proxy user password
REM Default value:

REM -------------------------------------------------
SET PROXY_NOT_FOR=
REM -------------------------------------------------
REM Description: List of IP/hostname which should be excluded
for proxy
REM Default value:

This completes the Data Collector installation without connecting it to the portal.

Configure the Data Collector using responsefile


To configure the installation, run one of the below commands from the command prompt:

<ISO_MOUNT_DRIVE>:\silentinstall.cmd /RESPFILE:<responsefile_path>
/INSTALL_PATH:<Data_Collector_installation_path> /INSTALL_TYPE:CONFIG

or

"<INSTALL_PATH>\DC\configure.cmd" /RESPFILE:<response_file_path>
/INSTALL_TYPE:CONFIG
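For example, using the first form with the ISO mounted as drive E: and the response file saved at C:\temp\responseFile.cmd (hypothetical values used only for illustration):

REM E: and C:\temp\responseFile.cmd are example values; substitute your own paths
E:\silentinstall.cmd /RESPFILE:C:\temp\responseFile.cmd /INSTALL_PATH:<Data_Collector_installation_path> /INSTALL_TYPE:CONFIG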

Uninstall Data Collector


Remove the Data Collector installation from Control Panel > Add and Remove
Programs menu.

Install Data Collector binaries on Linux host (without configuration)
This installation allows you to install the Data Collector independent of the portal
software installation. The collector remains disconnected from the portal until you
configure it using a response file that contains credentials of the Data Collector
created on the NetBackup IT Analytics Portal and the data receiver.
To install a Data Collector:
1 Download and mount the Data Collector installer
itanalytics_datacollector_linux_<version>.iso.

# mount -o loop <ISO file path> <path to mount>

2 Install the Data Collector at a custom location.

# <path to mount>/dc_installer.sh -i <user selected path>

Example:

# <path to mount>/dc_installer.sh -i /usr/openv -n

Configure the Data Collector using response file


A sample response file is saved when you install the Data Collector. To connect
the Data Collector with the NetBackup IT Analytics Portal, you must configure its
responsefile with the credentials of the Data Collector created on the portal and run
a configuration command as described in the procedure below.
To configure the Data Collector:
1 Obtain the following details from the NetBackup IT Analytics Portal:
■ Registration file downloaded from the Portal.
■ Proxy server configuration details

2 Update the above values in the responsefile.sample.

COLLECTOR_REGISTRATION_PATH=<path to the .json file>


HTTP_PROXY_CONF=N
HTTP_PROXY_ADDRESS=
HTTP_PROXY_PORT=
HTTPS_PROXY_ADDRESS=
HTTPS_PROXY_PORT=
PROXY_USERNAME=
PROXY_PASSWORD=
PROXY_EXCLUDE=
3 Configure the data collector using the above response file.

# <path to mount>/dc_installer.sh -c <responsefile path>

or

<install location>/installer/dc_installer.sh -c <responsefile path>

4 Start the Data Collector service:

# <install location>/mbs/bin/aptare_agent start
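For example, assuming the install location is /usr/openv/analyticscollector, you might start the service and then re-run the verification script (a sketch; adjust the path to your actual install location):

# /usr/openv/analyticscollector/mbs/bin/aptare_agent start
# /usr/openv/analyticscollector/mbs/bin/checkinstall.sh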

Uninstall Data Collector


Run this command to uninstall the Data Collector.

<INSTALL_PATH>/UninstallerData/uninstall_dc.sh -r
Chapter 6
Validating Data Collection
This chapter includes the following topics:

■ Validation methods

■ Data Collectors: Vendor-Specific validation methods

■ Working with on-demand Data Collection

■ Using the CLI check install utility

■ List Data Collector configurations

Validation methods
Validation methods are initiated differently based on subsystem vendor associated
with the Data Collector policy, but perform essentially the same functions. Refer to
the following table for vendor-specific validation methods.
■ Test Connection - Initiates a connection attempt directly from a data collector
policy screen that attempts to connect to the subsystem using the IP addresses
and credentials supplied in the policy. This validation process returns either a
success message or a list of specific connection errors.
■ On-Demand data collection run - Initiates an immediate end-to-end run of the
collection process from the Portal without waiting for the scheduled launch. This
on-demand run also serves to validate the policy and its values (the same as
Test Connection), providing a high-level check of the installation at the individual
policy level, including a check for the domain, host group, URL, Data Collector
policy and database connectivity. This is initiated at the policy level from
Admin > Data Collection > Collector Administration.
See “Working with on-demand Data Collection” on page 67.
■ CLI Checkinstall Utility - This legacy command line utility performs both the Test
Connection function and On-Demand data collection run from the Data Collector
server.
See “Using the CLI check install utility” on page 72.

Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand
runs.

Data Collectors: Vendor-Specific validation methods
Table 6-1 Vendor-specific validation requirements.

Vendor Name          Test Connection          On-Demand          CLI Checkinstall Utility
Amazon Web Services (AWS) x x

Brocade Switch x

Brocade Zone Alias x x

Cisco Switch x

Cisco Zone Alias x x

Cohesity DataProtect x x

Commvault Simpana x

Compute Resources x

Dell Compellent x

Dell EMC Elastic Cloud Storage (ECS) x x

Dell EMC NetWorker Backup & Recovery x

Dell EMC Unity x x

EMC Avamar x

EMC Data Domain Backup x x

EMC Data Domain Storage x x


EMC Isilon x

EMC Symmetrix x x

EMC VNX x x

EMC VNX Celerra x

EMC VPLEX x

EMC XtremIO x x

HDS HCP x x

HDS HNAS x

HP 3PAR x

HP Data Protector x

HP EVA x

HPE Nimble Storage x x

Hitachi Block x

Hitachi Content Platform (HCP) x x

Hitachi NAS x x

IBM Enterprise x

IBM SVC x

IBM Spectrum Protect (TSM) x

IBM VIO x x

IBM XIV x

Microsoft Azure x x

Microsoft Hyper-V x x

Microsoft Windows Server x x

NAKIVO Backup & Replication x x


NetApp E Series x

Netapp x

Netapp Cluster Mode x

OpenStack Ceilometer x x

OpenStack Swift x x

Test Connection is included with the Get Nodes function.

Oracle Recovery Manager (RMAN) x x

Pure FlashArray x x

Rubrik Cloud Data Management x x

VMWare x

Veeam Backup & Replication x x

Veritas Backup Exec x

Veritas NetBackup x x

Veritas NetBackup Appliance x x

Working with on-demand Data Collection


Collections can run on a schedule or on-demand using the Run button on the action
bar. On-demand allows you to select which probes and devices to run. The
on-demand run collects data just like a scheduled run plus additional logging
information for troubleshooting. A stopped Policy still allows an on-demand collection
run, provided the policy is assigned to one of the specified vendors and the collector
is online.

Note: On-demand data collection is not available for all policies.

On-Demand data collection serves multiple purposes. You can use it to:
■ Validate the collection process is working end-to-end when you create a data
collector policy
■ Launch an immediate run of the collection process without waiting for the
scheduled run
■ Populate your database with new/fresh data
■ Choose to view the collection logs on the portal while performing an on-demand
run.
To initiate an on-demand data collection
1 Select Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Click Expand All to browse for a policy or use Search.
3 Select a data collector policy from the list. If the vendor is supported, the Run
button is displayed on the action bar.
4 Click Run. A dialog allowing you to select servers and individual probes to test
the collection run is displayed. The following example shows the Amazon Web
Services dialog. See the vendor specific content for details on probes and
servers.

5 Select the servers and probes for data collection.


6 The portal enables the user to log messages at various levels during the collection process. The following options are available:
■ Enable Real-Time Logs: Select this option to log generally useful information in real time while the collection is in progress.
■ Enable Debug Logs: Select this option to log information at a granular level.

7 Click Start. Data is collected just like a scheduled run plus additional logging
information for troubleshooting. Once started, you can monitor the status of
the run through to completion.

Note: If there is another data collection run currently in progress when you
click Start, the On-Demand run will wait to start until the in-progress run is
completed.

See “View real-time logging during an on-demand collection” on page 69.


See “Generating debug level logs during an on-demand collection” on page 70.

View real-time logging during an on-demand collection


By default, real-time logging is enabled when you initiate an on-demand collection
for a data collector. Admin > Data Collection > Collector Administration provides
a window to view the logs in real-time as the collection progresses.
The following steps help you to view the real-time logging:


1 Go to Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Initiate an on-demand data collection as described under Working with
on-demand Data Collection with Enable Real-Time Logs selected.
The Policy State column displays status as Collecting and an icon to open
the Collection Console pop-up.
3 Click the icon next to the Collecting link to view the real-time logs in the
Collection Console. Real-time logs are visible as long as the data collection
is in progress and the Collection Console is open.

You can use the filter on the console to selectively view the logs of your choice.
The Collection Console icon is not visible if the data collection is not in
progress.

Generating debug level logs during an on-demand collection


By default, Enable Debug Logs (Backend only) option is not selected when you
initiate an on-demand collection for a data collector. The Collector Administration
provides a window to generate debug level information as the collection progresses.
The following steps enable debug-level log file generation:


1 Go to Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Initiate an on-demand data collection as described under Working with
on-demand Data Collection with Enable Debug logs (Backend only) option
selected.
Note: The generated log files are saved on the Data Collector server at:
<APTARE_HOME>/mbs/logs/validation/

Using the CLI check install utility


This legacy utility performs both the Test Connection function and On-Demand data
collection run from a command line interface launched from the Data Collector
server.

Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand runs.

The following directions assume that the Data Collector files have been installed
in their default location:
■ Windows: C:\Program Files\Veritas\AnalyticsCollector
■ Linux: /usr/openv/analyticscollector
If you have installed the files in a different directory, make the necessary path
translations in the following instructions.

Note: Some of the following commands can take up to several hours, depending
on the size of your enterprise.

To run Checkinstall
1 Open a session on the Data Collector server.
Windows: Open a command prompt window.
Linux: Open a terminal/SSH session logged in as root to the Data Collector
Server.
2 Change to the directory where you’ll run the validation script.
Windows: At the command prompt, type:

cd C:\Program Files\Veritas\AnalyticsCollector <enter>

Linux: In the SSH session, type:

cd /usr/openv/analyticscollector <enter>

3 Execute the validation script.


Windows: At the command prompt, type: checkinstall.bat <enter>
Linux: In the SSH session, type: ./checkinstall.sh <enter>

The checkinstall utility performs a high-level check of the installation, including a check for the domain, host group and URL, Data Collector policy and database connectivity. For a component check, specifically for Host Resources, run the hostresourcedetail.sh|bat utility.
This utility fails if:
■ The Data Collector policy has NOT been configured in the Portal.
OR
■ The version of the Data Collector doesn’t match with the version of
NetBackup IT Analytics Portal.
To upgrade the Data Collector version to match the version of the NetBackup IT Analytics Portal:
■ Login to the Portal.
■ Navigate to the Admin > Data Collection > Collector Updates page.
■ Trigger the upgrade.
■ Ensure that the Data Collector services are online.
Checkinstall includes an option to run a probe for one or more specific devices.
Note that certain Data Collectors will not allow individual selection of devices.
Typically these are collectors that allow the entry of multiple server addresses
or ranges of addresses in a single text box.
These collectors include: Cisco Switch, EMC Data Domain, EMC VNX arrays,
HP 3PAR, IBM mid-range arrays, IBM XIV arrays and VMware.
Data Collectors that probe all devices that are attached to a management
server also do not allow individual selection of devices: EMC Symmetrix, File
Analytics, Hitachi arrays and IBM VIO.
4 If the output in the previous steps contains the word FAILED and the reason
of failure is NOT because of version mismatch, then contact Support and have
the following files ready for review:
■ Windows: C:\Program
Files\Veritas\AnalyticsCollector\mbs\logs\validation\

■ Linux: /usr/openv/analyticscollector/mbs/logs/validation/
List Data Collector configurations


Use this utility to list the various child threads and their configurations encapsulated
within a data collector configuration. This utility can be used in conjunction with
other scripts, such as checkinstall.[sh|bat].
On Linux: ./listcollectors.sh
On Windows: listcollectors.bat
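For example, on a Linux Data Collector you might capture the output to a file for later review (a sketch; run the script from the directory where it resides, and the redirect target is an arbitrary example path):

./listcollectors.sh > /tmp/collector_config.txt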
Chapter 7
Uninstalling the Data
Collector
This chapter includes the following topics:

■ Uninstall the Data Collector on Linux

■ Uninstall the Data Collector on Windows

Uninstall the Data Collector on Linux


This uninstall process assumes that the Data Collector was installed using the
standard installation process.
To uninstall the Data Collector software from a Linux host:
1 Login to the Data Collector server as root.
2 For NetBackup IT Analytics Data Collector version 10.6 or lower, execute the
Uninstall APTARE IT Analytics Data Collector Agent script located at
<Data Collector home folder>/UninstallerData

For example:

/opt/aptare/UninstallerData/Uninstall APTARE IT Analytics Data Collector Agent

3 For NetBackup IT Analytics Data Collector version 11.0 or later, execute the uninstall_dc.sh script located at <Data Collector home folder>/UninstallerData/uninstall_dc.sh

For example:

/opt/aptare/UninstallerData/uninstall_dc.sh
Uninstall the Data Collector on Windows


This uninstall process assumes that the Data Collector was installed using the
standard installation process.
To uninstall the Data Collector software from a Windows host:
1 Login to the Data Collector server as an administrator.
2 Go to Control Panel > Add and Remove Programs > Programs and Features
and uninstall NetBackup IT Analytics Data Collector.
The uninstaller may not delete the entire Data Collector directory structure.
Sometimes new files that were created after the installation are retained along with
their parent directories. If the Data Collector was upgraded from version 10.6 or
older, you may find entries of Kafka and Zookeeper services on the services panel
(default C:\Program Files\Aptare), even after the uninstallation of the Data
Collector. You must manually delete the services and reboot the system.
Chapter 8
Manually Starting the Data
Collector
This chapter includes the following topics:

■ Introduction

Introduction
The installer configures the Data Collector to start automatically; however, it does not actually start it upon completion of the installation because you must first validate the installation. Follow these steps, for the relevant operating system, to manually start the Data Collector service.
This also starts the Aptare Agent process, Zookeeper, and Kafka services on the
respective systems.

On Windows
The installer configures the Data Collector process as a Service.

To view and start the Data Collector service:


1. Click Start > Settings > Control Panel
2. Click Administrative Tools.
3. Click Services. The Services dialog is displayed.
4. Start the Aptare Agent service.

On Linux
The installer automatically copies the Data Collector “start” and “stop” scripts to the
appropriate directory, based on the vendor operating system.
To start the data collector, use the following command:

/opt/aptare/mbs/bin/aptare_agent start
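The stop command mirrors the start command shown above (the installer copies both the start and stop scripts, as noted earlier), so the collector can be stopped or restarted when needed; the path shown assumes the same /opt/aptare location used above and should be adjusted to your install location:

/opt/aptare/mbs/bin/aptare_agent stop
/opt/aptare/mbs/bin/aptare_agent start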
Appendix A
Firewall Configuration:
Default Ports
This appendix includes the following topics:

■ Firewall configuration: Default ports

Firewall configuration: Default ports


The following table describes the standard ports used by the Portal servers, the
Data Collector servers, and any embedded third-party software products as part of
a standard “out-of-the-box” installation.

Table A-1 Components: Default Ports

Component Default Ports

Apache Web Server http 80

https 443

Jetty Server on Data Collector Server 443

Kafka 9092

Linux Hosts SSH 22

Managed Applications Oracle ASM 1521

MS Exchange 389

MS SQL 1433

File Analytics CIFS 137, 139


Oracle 1521

Oracle TNS listener port

Tomcat - Data Receiver 8011, 8017
Apache connector port and shutdown port for the Data Receiver instance of Tomcat

Tomcat - Portal 8009, 8015
Apache connector port and shutdown port for the Portal instance of Tomcat

Windows Hosts TCP/IP 1248

WMI 135

DCOM TCP/UDP > 1023

SMB TCP 445

ZooKeeper 2181
Note: NetBackup IT Analytics uses
standalone installation of single-node
Apache ZooKeeper server. For secure
communications, ZooKeeper
single-node cluster must be protected
from external traffic using network
security such as firewall. This is
remediated by ensuring that the
ZooKeeper port (2181) is only
accessible on the local host where
NetBackup IT Analytics Portal/Data
Collector is installed (that includes
Apache ZooKeeper).

Table A-2 Storage Vendors: Default Ports

Storage Vendor Default Ports and Notes

Dell Compellent 1433

SMI-S http (5988)

SMI-S https (5989)

Dell EMC Elastic Cloud Storage (ECS) REST API 4443


Dell EMC Unity REST API version 4.3.0 on 443 or 8443

EMC Data Domain Storage SSH 22

EMC Isilon SSH 22

EMC Symmetrix SymCLI over Fibre Channel 2707

EMC VNX NaviCLI 443, 2163, 6389, 6390, 6391, 6392

EMC VNX (Celerra) XML API 443, 2163, 6389, 6390, 6391, 6392

EMC VPLEX https TCP 443

EMC XtremIO REST API https 443

HP 3PAR 22 for CLI

HP EVA 2372

HPE Nimble Storage 5392, REST API Reference Version 5.0.1.0

Hitachi Block Storage TCP 2001
For the HIAA probe: 22015 is used for HTTP and 22016 is used for HTTPS.

Hitachi Content Platform (HCP) SNMP 161

REST API https 9090

Hitachi NAS (HNAS) SSC 206

Hitachi Vantara All-Flash and Hybrid Flash Storage Hitachi Ops Center Configuration Manager REST API: 23450 for HTTP and 23451 for HTTPS.
HIAA: 22015 for HTTP and 22016 for HTTPS

IBM Enterprise TCP 1751, 1750, 1718

DSCLI

IBM SVC SSPC w/CIMOM 5988, 5989

IBM XIV XCLI TCP 7778


Microsoft Windows Server 2016

WMI 135

DCOM TCP/UDP > 1023

NetApp E-Series SMCLI 2436

NetApp ONTAP 7-Mode and Cluster-Mode ONTAP API

80/443

Pure Storage FlashArray REST API https 443

Table A-3 Data protection: Default ports

Data Protection Vendor Default Ports and Notes

Cohesity DataProtect REST API on Port 80 or 443

Commvault Simpana 1433, 135 (skipped files)

445 (CIFS over TCP)

DCOM >1023

Dell EMC Networker Backup & Recovery Port used for Dell EMC NetWorker
REST API connection. Default: 9090.

EMC Avamar 5555

SSH 22

EMC Data Domain Backup SSH 22

HP Data Protector 5555 WMI ports SSH 22 (Linux)

IBM Spectrum Protect (TSM) 1500

NAKIVO Backup & Replication Director Web UI port (Default: 4443)

Oracle Recovery Manager (RMAN) 1521

Rubrik Cloud Data Management REST API 443

Veeam Backup & Replication 9392

Veritas Backup Exec 1433


Table A-4 Network & Fabrics: Default Ports

Network & Fabrics Vendor Default Ports and Notes

Brocade Switch SMI-S 5988/5989

Cisco Switch SMI-S 5988/5989

Table A-5 Virtualization Vendors: Default Ports

Virtualization Vendor Default Ports and Notes

IBM VIO SSH 22

Microsoft Hyper-V WMI 135

DCOM TCP/UDP > 1023

VMware ESX or ESXi,vCenter,vSphere vSphere VI SDK

https TCP 443

Table A-6 Replication Vendors: Default Ports

Replication Vendor Default Ports and Notes

NetApp ONTAP 7-Mode ONTAP API

80/443

Table A-7 Cloud Vendors: Default Ports

Cloud Vendor Default Ports and Notes

Microsoft Azure https 443

OpenStack Ceilometer 8774, 8777

Keystone Admin 35357

Keystone Public 5000

OpenStack Swift Keystone Admin 35357


Keystone Public 5000

SSH 22

Google Cloud Platform https 443
