NetBackup IT Analytics Help
- Section I. Introducing NetBackup IT Analytics
- Section II. What's New
- Section III. Certified configurations
- Introduction
- Portal and database servers
- Data Collector server configurations
- Capacity Manager configurations
- Array/LUN performance Data Collection
- EMC Isilon array performance metrics
- NetApp Cluster-Mode performance metrics
- EMC Symmetrix enhanced performance metrics
- Host access privileges, sudo commands, ports, and WMI proxy requirements
- Cloud configurations
- Virtualization Manager configurations
- File Analytics configurations
- Fabric Manager configurations
- Backup Manager configurations
- ServiceNow configurations
- Internal TCP port requirements
- Section IV. End user
- Understand the portal
- About the Admin tab
- Explore your inventory
- Hierarchy toolbar to organize your data
- Show objects
- Use attributes to organize your data
- Pin reports - saving reports with inventory objects
- Assign attributes in the inventory list view
- Get acquainted with reports
- About badging
- Generate and maintain reports
- Select Report Scope
- Group hosts by attributes
- Search for hosts in the report Scope Selector
- Backup Manager advanced scope selector settings
- Solution reports scope selector settings
- Units of Measure in Reports
- Customize report filter logic
- Sort columns in reports
- Convert tabular report to chart
- Distribute, share, schedule, and alert
- Schedule export of reports and dashboards
- Organize reports
- Work with the dynamic template designer
- Dynamic Template Designer Quick Start
- Converting to a Homogeneous, Product-Specific Template
- Dynamic Template Function Configurations
- Create Fields with the Field Builder
- Scope Selector Component - Custom Filter
- Configure a Bar Chart Dynamic Template
- Steps to Create a Bar Chart Dynamic Template
- Configure an Area/Stacked Area Chart Dynamic Template
- Line Charts for Performance Metrics
- Line Chart Field Requirements
- One Object Per Line Chart, One or More Metrics Per Chart
- Multiple Objects Per Line Chart, One Metric Per Chart
- Example of a Stacked Bar Chart Dynamic Template
- Create a Sparkline Chart in a Tabular Dynamic Template
- Adding or Editing Methods
- Validate and Save a Method
- Work with the SQL template designer
- Configure SQL Template Scope Selector Components
- Sample SQL Queries
- Number, Size, Date, and Time Formatting
- Alignment, Aggregation, Bar Type, and Bar Type Color
- Pipelined functions for report query building
- APTlistOfDates
- aptStringConcat
- getServerAttributeValue
- getObjectAttributeValue
- getChildServerGroupContextById
- getServerGroupContextById
- secsToHoursMinSecs
- APTgetTapeDriveStatusName
- getFullPathname
- listJobSummaryAfterRestart
- listJobSummaryAfterRestartNBW
- listJobSummaryAfterRestart for NetWorker Backup Jobs
- listOfBackupWindowDates
- listChargebackCatByVOLSDetail
- listChargebackCatByNcVolDetail
- listChargebackCatByFSDetail (for HNAS)
- listChargebackCatByFSDetail (for EMC Isilon)
- listChargebackByLUNSummary
- listChargebackByLUNDetail
- listChargebackCatByLUNSummary
- listChargebackCatByLUNDetail
- Alert configuration
- Add/Edit an Alert Policy
- About using SNMP
- Manage hosts, backup servers, and host groups
- NetBackup Primary Servers
- Manage attributes and objects
- Provide Portal access and user privileges
- Setting / Resetting passwords
- Managing user group home pages (Administrator)
- Configure primary schedules and backup windows
- Add, edit, and move policies
- Solutions administration
- Manage and monitor data collection
- About data collection tasks
- Add/Edit Data Collectors
- Review collectors and collection status
- Upgrade Data Collectors
- Work with Capacity Manager host data collection
- Host access requirements
- Manage credentials
- Configure host discovery policies to populate the host discovery and collection view
- Validate host connectivity
- Search and export in host discovery and collection
- Propagate probe settings: Copy probes, paste probes
- Discovery policies for Veritas NetBackup
- View and manage system notifications
- Customize with advanced parameters
- Access control advanced parameters
- General Data Collection advanced parameters
- Cloud data collection advanced parameters
- Host discovery and collection advanced parameters
- Backup Manager advanced parameters
- Capacity Manager advanced parameters
- File Analytics advanced parameters
- Virtualization Manager advanced parameters
- Manage your Portal environment
- Manage ransomware scorecard
- Analyze files
- Troubleshoot the Portal
- Retrieving log files
- Debug
- Attribute inheritance overrides
- Understand report data caching
- Section V. Report Reference
- Introduction to NetBackup IT Analytics
- Alert Reports
- Ransomware reports
- Risk Mitigation Solution Reports
- Risk Mitigation Reports
- Storage Optimization Solution Reports
- System Administration Reports
- Oracle Job Overview
- Capacity Manager Reports
- Application Capacity Reports
- Array Capacity Utilization Reports
- Array Capacity & Utilization (Generic Data)
- Array Capacity & Utilization (IBM SVC View)
- Array Capacity and Utilization (IBM XIV View)
- Array Capacity and Utilization (NetApp View)
- Array Capacity and Utilization (NetApp Cluster)
- NetApp Storage System Detail
- Array Capacity and Utilization (OpenStack Swift)
- IBM Array Site Summary
- IBM Array Detail
- LUN Utilization Summary
- NetApp Aggregate Detail
- NetApp Cluster-Mode Aggregate Detail
- NetApp Plex Details
- NetApp Volume Details
- NetApp Cluster-Mode Volume Detail
- NetApp StorageGrid Tenant Summary
- Available/Reclaimable Capacity Reports
- Capacity at Risk Reports
- Capacity Chargeback Reports
- Host Capacity Utilization Reports
- SnapMirror Reports
- SnapVault Reports
- Capacity Forecasting Reports
- Storage Performance Reports
- Mission Control for Performance Analysis
- Thin Provisioning Reports
- Hitachi Dynamic Provisioning Pool Utilization
- File Analytics Reports
- Virtualization Manager Reports
- Understanding the Datastore
- VM Server Detail
- VM Snapshot Summary
- VM Detail
- Datastore Utilization Summary
- Datastore Detail
- Fabric Manager Reports
- Host to Storage Dashboard
- Backup Manager Management Reports
- Error Log Summary
- Job Duration Report
- Veeam Backup & Replication Job Summary Report (Homogeneous)
- Veeam and RMAN Job Details Report
- Adding a Note to a Job
- Job Volume Summary Report
- NetBackup deduplication to MSDP savings
- Backup Administration Reports
- Host Details
- IBM Spectrum Protect (TSM) Storage Pools Dashboard
- Job Histogram
- EEBs on Primary server, Media server and Clients reports
- Backup Media Management Reports
- TSM Tape Media Detail Table
- Backup Service Level Agreement (SLA) Reports
- Determining and Improving Backup Start Time Performance
- Determining and Improving Backup Success Performance
- Determining and Improving Backup Duration Performance
- Backup Storage Utilization Reports
- Backup Manager Forecasting Reports
- Backup Billing and Usage Reports
- Backup Policies Reports
- HP Data Protector Backup Specification Detail
- Public Cloud Reports
- AWS Reports
- Microsoft Azure Reports
- Section VI. NetBackup IT Analytics Exporter Installation and Configuration
- Section VII. Data Collector Installation and Troubleshooting
- Installing the Data Collector Software
- Validating Data Collection
- Uninstalling the Data Collector
- Manually Starting the Data Collector
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- CRON Expressions for Policy and Report Schedules
- Clustering Data Collectors with VCS and Veritas NetBackup (RHEL 7)
- Clustering Data Collectors with VCS and Veritas NetBackup (Windows)
- Install and configure NetBackup IT Analytics Data Collector on MSCS environment
- Maintenance Scenarios for Message Relay Server Certificate Generation
- Section VIII. Data Collector Installation and Configuration for Cohesity NetBackup
- Introduction
- Configure a NetBackup IT Analytics Distributed Data Collector on a NetBackup Primary Server
- Configure Data Collector on non-clustered NetBackup 10.4 and later primary server
- Configure Data Collector on non-clustered NetBackup 10.1.1, 10.2, 10.2.01, 10.3 or 10.3.0.1 primary server
- Configure a Veritas NetBackup Data Collector Policy
- Configuring file analytics in NetBackup Data Collector policy
- Installing the Data Collector software
- Configure SSL
- Centralized Data Collector for NetBackup - Prerequisites, Installation, and Configuration
- Step-1: Choose operating system and complete prerequisites
- Step-5: SSH/WMI
- Upgrading Data Collector Locally
- Clustering Data Collectors with VCS and Veritas NetBackup (RHEL)
- Clustering Data Collectors with VCS and Veritas NetBackup (Windows)
- Install and Configure NetBackup IT Analytics Data Collector on MSCS
- Data Collector Policy Migration
- Pre-Installation setup for Veritas NetBackup appliance
- Pre-installation setup for Veritas Flex Appliance
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- Appendix B. Configure Appliances
- Appendix C. Firewall configuration: Default ports
- Appendix D. CRON Expressions for Policy and Report Schedules
- Appendix E. Maintenance Scenarios for Message Relay Server Certificate Generation
- Section IX. Data Collector Installation and Configuration
- Introduction
- Install and configure a Data Collector
- Step-1: Choose operating system and complete prerequisites
- Installing the Data Collector software
- Configure SSL
- Section X. Data Protection
- Configuration for Veritas Backup Exec
- Configuration for Cohesity DataProtect
- Configuration for Commvault Simpana
- Open TCP/IP access to the Commvault database
- Set up a read-only user in the CommServe server
- Configuration for EMC Avamar
- Import EMC Avamar server information
- Configuration for EMC Data Domain backup
- Configuration for Dell EMC NetWorker backup & recovery
- Importing generic backup data
- Configuration for generic backup
- CSV format specification
- Configuration for HP Data Protector
- Architecture overview (HP Data Protector)
- Configure the Data Collector server in Cell Manager (HP Data Protector)
- Configuration for IBM Spectrum Protect (TSM)
- Architecture overview (IBM Spectrum Protect -TSM)
- Import IBM Spectrum Protect (TSM) information
- Configuration for NAKIVO Backup & Replication
- Configuration for Oracle Recovery Manager (RMAN)
- Configuration for Rubrik Cloud Data Management
- Configuration for Veeam Backup & Replication
- Appendix F. Load historic events
- Load Veritas NetBackup events
- Section XI. Storage (Capacity)
- Configuration for Compute Resources
- Configuration for DELL EMC Elastic Cloud Storage (ECS)
- Configuration for Dell EMC Unity
- Configuration for EMC data domain storage
- Configuration for EMC Isilon
- Configuration for EMC Symmetrix
- Configuration for EMC VNX Celerra
- Configuration for EMC VNX CLARiiON
- Configuration for EMC VPLEX
- Configuration for EMC XtremIO
- Configuration for FUJITSU ETERNUS Data Collector
- Configuration for Hitachi Block
- Configuring a Hitachi Device manager user
- Configuration for Hitachi Content Platform (HCP)
- Hitachi content platform system management console
- Hitachi content platform tenant management console
- Configuration for Hitachi NAS
- Configuration for Hitachi Vantara All-Flash and Hybrid Flash Storage
- Configuration of Host inventory
- Host Access Privileges, Sudo Commands, Ports, and WMI Proxy Requirements
- Configure host Discovery policies to populate the host Inventory
- Validate host connectivity
- Host Inventory search and host Inventory export
- Configure and edit host probes
- Propagate Probe Settings: Copy Probes, Paste Probes
- Configuration for HP 3PAR
- Configuration for HP EVA
- Configuration for HPE Nimble Storage
- Configuration for HPE StoreOnce
- Configuration for IBM Enterprise
- Configuration for IBM COS
- Configuration for IBM SVC
- Configuration for IBM XIV
- Configuration for Microsoft Windows server
- Configuration for NetApp-7
- Configuration for NetApp StorageGRID
- Configuration for NetApp Cluster
- Configuration for NetApp E-Series
- Configuration for NEC HYDRAstor
- Configuration for Pure Storage FlashArray
- Section XII. Compute (Virtualization and Host Collection)
- Configuration for VMware
- Configuration for IBM VIO
- Configuration for Microsoft Hyper-V
- Section XIII. Cloud
- Configuration for Amazon Web Services (AWS)
- Mandatory probe user privileges
- Link AWS accounts for Collection of consolidated billing data
- Configuration for Google Cloud Platform
- Configuration for OpenStack Ceilometer
- Configuration for OpenStack Swift
- Configuration for Microsoft Azure
- Section XIV. Fabric
- Configuration for Brocade switch
- Configuration for Cisco switch
- Configuration for Brocade Zone alias
- Configuration for Cisco Zone alias
- Section XV. File Analytics
- Configuration for File Analytics
- Host Discovery and Collection File Analytics probe
- Adding a File Analytics Data Collector policy
- File Analytics Export Folder Size and Folder Depth
- Section XVI. Data Collection Validation and Troubleshooting
- Validate data collection
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- Uninstalling the Data Collector
- Section XVII. System Administration
- Preparing for Updates
- Backing Up and Restoring Data
- Monitoring NetBackup IT Analytics
- Accessing NetBackup IT Analytics Reports with the REST API
- Defining NetBackup Estimated Tape Capacity
- Automating Host Group Management
- Categorize host operating systems by platform and version
- Load relationships between hosts and host group
- Automate NetBackup utilities
- Scheduling utilities to run automatically
- Attribute Management
- Importing Generic Backup Data
- Backup Job Overrides
- Managing Host Data Collection
- System Configuration in the Portal
- Custom parameters
- Performance Profile Schedule Customization
- LDAP and SSO authentication for Portal access
- Changing Oracle Database User Passwords
- Integrating with CyberArk
- Tuning NetBackup IT Analytics
- Defining Report Metrics
- Working with Log Files
- Portal and data collector log files - reduce logging
- Data collector log file naming conventions
- Portal log files
- SNMP Trap Alerting
- SSL Certificate Configuration
- Configure virtual hosts for portal and / or data collection SSL
- Keystore on the portal server
- Portal Properties: Format and Portal Customizations
- Data Retention Periods for SDK Database Objects
- Data aggregation
- Troubleshooting
- Appendix G. Kerberos based proxy user's authentication in Oracle
- Appendix H. Configure TLS-enabled Oracle database on NetBackup IT Analytics Portal and data receiver
- Appendix I. NetBackup IT Analytics for NetBackup on Kubernetes and appliances
- Section XVIII. Licensing
- License installation and guidelines
- License overview
- Verify the current license configuration
- Storage suite
- Protection suite
- Backup Manager
- Complete suite
- Managing licenses
- Configure the Data Collector policy to exclude the object
- License management from the command line
- Troubleshooting
- Section XIX. Inventory reports and operations
- Section XX. OpsCenter Transition
Manual steps for database import / export using data pump
Follow the steps to execute the Data Pump Export in a Linux environment
- Log in to the Linux database server and switch to the aptare user.
- Ensure that the file /opt/aptare/database/tools/expdp_scdb.par is owned by the aptare user and has 755 permissions.
- Ensure that the Oracle listener and Oracle services are running.
- Run the following commands:
  su - aptare
  source <INSTALL_PATH>/aptare/bin/aptare_env.sh
  sqlplus / as sysdba
  alter session set container=scdb;
  Note: The alter session set container=scdb; command is required for a container database. Omit it in a non-CDB environment.
  CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
  To use a preferred folder such as new_directory_path instead:
  CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Export the database using the following command:
  /opt/aptare/oracle/bin/expdp parfile=/opt/aptare/database/tools/expdp_scdb.par
- You can also skip the par file and pass the parameters directly on the expdp command line. In that case, the command above can be replaced by the following, also run as the aptare user:
  /opt/aptare/oracle/bin/expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y directory=datapump_dir dumpfile=aptare_scdb.exp logfile=export_scdb.log CONTENT=ALL flashback_time=systimestamp
- After successful completion, the data pump export file aptare_scdb.exp is saved in the /tmp directory of the Linux database server. If you specified a preferred directory, aptare_scdb.exp is saved to that location (such as /new_directory_path).
- This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above: run cp /opt/aptare/datarcvrconf/aptare.ks /tmp to copy the aptare.ks file to the /tmp folder.
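Before running expdp, it can help to confirm the directory object and then watch the job while it runs. A minimal sketch of optional sqlplus checks, using only standard Oracle dictionary views (dba_directories and dba_datapump_jobs); the DATAPUMP_DIR name is the one created above:

  -- Run as sysdba; on a container database, run alter session set container=scdb; first.
  -- Confirm the dump directory object points at the intended folder:
  SELECT directory_name, directory_path
    FROM dba_directories
   WHERE directory_name = 'DATAPUMP_DIR';

  -- While the export is running, monitor its progress from a second session:
  SELECT job_name, operation, job_mode, state
    FROM dba_datapump_jobs;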
Follow the steps to execute the Data Pump Import in a Linux environment
- Place the export file aptare_scdb.exp created by the data pump export in the /tmp directory. If you used a different preferred directory (for example /new_directory_path), place aptare_scdb.exp in that directory instead.
- Ensure that the aptare_scdb.exp file is owned by the aptare user and has 755 permissions.
- Ensure that the files /opt/aptare/database/tools/unlock_portal_linux.sql and /opt/aptare/database/tools/impdp_scdb.par are owned by the aptare user and have 755 permissions.
- As the root user, stop all Oracle and Aptare services by running: /opt/aptare/bin/aptare stop
- As the root user, start the Oracle services by running: /opt/aptare/bin/oracle start
- Ensure the Oracle listener is running. As the aptare user, check the listener status with: lsnrctl status
- Run the following commands:
  su - aptare
  source <INSTALL_PATH>/aptare/bin/aptare_env.sh
  sqlplus / as sysdba
  alter session set container=scdb;
  Note: The alter session set container=scdb; command is required for a container database. Omit it in a non-CDB environment.
  drop user aptare_ro cascade;
  drop user portal cascade;
  CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
  To use a preferred folder such as new_directory_path instead:
  CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Run the following command as the aptare user:
  /opt/aptare/oracle/bin/impdp parfile=/opt/aptare/database/tools/impdp_scdb.par
- You can also skip the par file and pass the parameters directly on the impdp command line. In that case, the command above can be replaced by the following, also run as the aptare user:
  /opt/aptare/oracle/bin/impdp system/aptaresoftware@//localhost:1521/scdb schemas=portal,aptare_ro directory=datapump_dir dumpfile=aptare_scdb.exp logfile=import_scdb.log
- When the import completes in a non-CDB environment, remove the first command 'alter session set container = scdb;' from the file unlock_portal_linux.sql, then run the following command as the aptare user.
  Note: Removing 'alter session set container = scdb;' is required only for a non-CDB environment; no change is needed for a container database.
  sqlplus / as sysdba @/opt/aptare/database/tools/unlock_portal_linux.sql
- After exiting sqlplus, execute the following command as the aptare user:
  sqlplus portal/portal@//localhost:1521/scdb @/opt/aptare/database/tools/validate_sp.sql
- Go to the /tmp directory and check the file import_scdb.log. If you specified a preferred directory, check for import_scdb.log in that location. Check the log file for compilation warnings for these packages: apt_v_solution_history_log view, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user.
  Note: If you are importing the database from 10.4, upgrade the portal to the 10.5 build after the import.
- This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above. Run the following commands to copy the aptare.ks file to the datarcvrconf folder:
  cp /tmp/aptare.ks /opt/aptare/datarcvrconf/
  chown aptare:tomcat /opt/aptare/datarcvrconf/
  chmod 664 /opt/aptare/datarcvrconf/aptare.ks
- Run updateUser.sh to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.sh admin123 newPassword
- As the root user, restart all Oracle and Aptare services by running: /opt/aptare/bin/aptare restart
- Log in to the Portal application using the application account.
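After validate_sp.sql completes, you can confirm from sqlplus that the import recreated both schemas and left no invalid objects behind. A minimal sketch, run as sysdba (with alter session set container=scdb; first on a container database); the schema names are the portal and aptare_ro users dropped and re-imported above:

  -- Confirm the import recreated both application schemas:
  SELECT username, account_status
    FROM dba_users
   WHERE username IN ('PORTAL', 'APTARE_RO');

  -- Any object still listed here after validate_sp.sql deserves a second look:
  SELECT owner, object_type, object_name
    FROM dba_objects
   WHERE status = 'INVALID'
     AND owner IN ('PORTAL', 'APTARE_RO');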
Follow the steps for Windows Data Pump Export
- Log in to the Windows database server.
- Ensure that the Oracle TNS listener and Oracle services are running.
- Ensure that the Aptare user has access to the file c:\opt\oracle\database\tools\expdp_scdb_win.par
- Run the following commands:
  sqlplus system/aptaresoftware@//localhost:1521/scdb
  create or replace directory datapump_dir as 'c:\opt\oracle\logs';
  exit
- After exiting sqlplus, execute the following command:
  c:\opt\oracle\bin\expdp parfile=c:\opt\oracle\database\tools\expdp_scdb_win.par
- You can also skip the par file and pass the parameters directly on the expdp command line. In that case, the command above can be replaced by the following:
  c:\opt\oracle\bin\expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y DIRECTORY=datapump_dir LOGFILE=export_scdb.log DUMPFILE=aptare_scdb.exp CONTENT=ALL FLASHBACK_TIME=systimestamp
- After successful completion, the data pump export file aptare_scdb.exp is saved in the C:\opt\oracle\logs directory of the Windows database server.
- Copy the file c:\opt\datarcvrconf\aptare.ks to the c:\opt\oracle\logs folder.
  Note: This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above.
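If you want a rough sense of how much space the dump file will need in c:\opt\oracle\logs before starting expdp, you can total the segment sizes from sqlplus. A minimal sketch that assumes the portal and aptare_ro schemas hold the bulk of the data; since the export runs with FULL=Y, treat the result as a lower bound:

  -- Approximate on-disk size of the main application schemas, in MB:
  SELECT owner, ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
    FROM dba_segments
   WHERE owner IN ('PORTAL', 'APTARE_RO')
   GROUP BY owner;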
Follow the steps for Windows Data Pump Import
- Log in to the Windows database server.
- The Aptare user should already have access to the import files c:\opt\oracle\database\tools\unlock_portal_win.sql and c:\opt\oracle\database\tools\impdp_scdb_win.par. If the Oracle user does not have read and execute privileges on these files, grant those privileges before starting the import.
- Place the export file aptare_scdb.exp in the c:\opt\oracle\logs directory.
- If the name of the export file is capitalized, change it to lower case. For example, change 'APTARE_SCDB.EXP' to 'aptare_scdb.exp'.
- Stop all Oracle and Aptare services by running stopAllServices from the Windows Services tab.
- Start OracleServicescdb from the Windows Services tab and ensure that the Oracle TNS listener is running.
- Run the following commands:
  sqlplus / as sysdba
  alter session set container = scdb;
  Note: The alter session set container = scdb; command is required only for a container database; otherwise no container switch is needed.
  DROP USER aptare_ro CASCADE;
  DROP USER portal CASCADE;
  CREATE OR REPLACE DIRECTORY datapump_dir AS 'c:\opt\oracle\logs';
  EXIT;
- After exiting sqlplus, execute the following command:
  c:\opt\oracle\bin\impdp parfile=c:\opt\oracle\database\tools\impdp_scdb_win.par
- You can also skip the par file and pass the parameters directly on the impdp command line. In that case, the command above can be replaced by the following:
  c:\opt\oracle\bin\impdp "sys/*@//localhost:1521/scdb as sysdba" SCHEMAS=portal,aptare_ro DIRECTORY=datapump_dir LOGFILE=import_scdb.log DUMPFILE=aptare_scdb.exp
- After the import is complete, execute the following command:
  sqlplus "sys/*@//localhost:1521/scdb as sysdba" @c:\opt\oracle\database\tools\unlock_portal_win.sql
- After exiting sqlplus, execute the following command:
  sqlplus portal/portal@//localhost:1521/scdb @c:\opt\oracle\database\tools\validate_sp.sql
- To check the import logs, go to the datapump_dir location (c:\opt\oracle\logs) and check the file import_scdb.log. Check the log file for compilation warnings for these packages: apt_v_solution_history_log view, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user.
  Note: If you are importing the database from 10.4, upgrade the portal to the 10.5 build after the import.
- Copy the saved file from c:\opt\oracle\logs\aptare.ks to the c:\opt\datarcvrconf\ folder. Ensure that the file is owned by the NetBackup IT Analytics user and that the user has read and write access to the copied file.
  Note: This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above.
- After successful completion of the import process, run stopAllServices from the Windows Services tab.
- Run startAllServices from the Windows Services tab.
- Run updateUser.bat from the utils directory to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.bat admin123 newPassword
- Log in to the Portal application using the application account.
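Once validate_sp.sql has run, the packages called out in the import log should end up valid. A minimal sqlplus sketch to confirm this, assuming those packages live in the portal schema (adjust the owner if your layout differs):

  -- Check the compilation status of the packages the import log warns about:
  SELECT object_name, object_type, status
    FROM dba_objects
   WHERE owner = 'PORTAL'
     AND object_name IN ('CMV_ADAPTOR_PKG', 'AVM_COMMON_PKG', 'SDK_COMMON_PKG',
                         'SERVER_GROUP_PACKAGE', 'LOAD_PACKAGE', 'COMMON_PACKAGE', 'UTIL')
   ORDER BY object_name, object_type;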