NetBackup IT Analytics System Administrator Guide
- Introduction
- Preparing for updates
- Backing up and restoring data
- Best practices for disaster recovery
- Oracle database backups
- File system backups
- Oracle database: Cold backup
- Oracle database: Export backups
- Scheduling the Oracle database export
- Oracle database: On demand backup
- Restoring the NetBackup IT Analytics system
- Import the Oracle database
- Manual steps for database import / export using data pump
- Monitoring NetBackup IT Analytics
- Accessing NetBackup IT Analytics reports with the REST API
- Defining NetBackup estimated tape capacity
- Automating host group management
- About automating host group management
- Task overview: managing host groups in bulk
- Preparing to use PL/SQL utilities
- General utilities
- Categorize host operating systems by platform and version
- Identifying a host group ID
- Move or copy clients
- Organize clients by attribute
- Move host group
- Delete host group
- Move hosts and remove host groups
- Organize clients into groups by backup server
- Merge duplicate backup clients
- Merge duplicate hosts
- Bulk load utilities
- Veritas NetBackup utilities
- Automate NetBackup utilities
- Organize clients into groups by management server
- Set up an inactive clients group
- Set up a host group for clients in inactive policies
- Set up clients by policy
- Set up clients by policy type
- IBM Tivoli Storage Manager utilities
- Set up clients by policy domain
- Set up clients by IBM Tivoli Storage Manager instance
- Scheduling utilities to run automatically
- Host matching identification for single-domain multi-customer environments
- Attribute management
- Attribute bulk load utilities
- Attribute naming rules
- Rename attributes before upgrading
- Load host attributes and values
- Load attributes and values and assign to hosts
- Load array attributes and values and assign to arrays
- Overview of application attributes and values
- Load application database attributes and values
- Load MS Exchange organization attributes and values
- Load LUN attributes and values
- Load switch attributes and values
- Load port attributes and values
- Load Subscription attributes and values
- Importing generic backup data
- Backup job overrides
- Managing host data collection
- System configuration in the Portal
- System configuration in the Portal
- System configuration: functions
- Navigation overview
- System configuration parameter descriptions: Additional info
- Anomaly detection
- Data collection: Capacity chargeback
- Database administration: database
- Host discovery: EMC Avamar
- Host discovery: Host
- Events captured for audit
- Custom parameters
- Adding/editing a custom parameter
- Portal customizations
- Configuring global default inventory object selection
- Restricting user IDs to single sessions
- Customizing date format in the report scope selector
- Customizing the maximum number of lines for exported reports
- Customizing the total label display in tabular reports
- Customizing the host management page size
- Customizing the path and directory for File Analytics database
- Configuring badge expiration
- Configuring the maximum cache size in memory
- Configuring the cache time for reports
- Performance profile schedule customization
- LDAP and SSO authentication for Portal access
- Change Oracle database user passwords
- Integrate with CyberArk
- Tuning NetBackup IT Analytics
- Working with log files
- About debugging NetBackup IT Analytics
- Turn on debugging
- Database logging
- Portal and data collector log files - reduce logging
- Database SCON logging - reduce logging
- Refreshing the database SCON log
- Logging user activity in audit.log
- Logging only what a user deletes
- Logging all user activity
- Data collector log files
- Data collector log file organization
- Data collector log file naming conventions
- General data collector log files
- Find the event / meta collector ID
- Portal log files
- Database log files
- Installation / Upgrade log files
- Defining report metrics
- SNMP trap alerting
- SSL certificate configuration
- SSL certificate configuration
- SSL implementation overview
- Obtain an SSL certificate
- Update the web server configuration to enable SSL
- Configure virtual hosts for portal and / or data collection SSL
- Enable / Disable SSL for a Data Collector
- Enable / Disable SSL for emailed reports
- Test and troubleshoot SSL configurations
- Create a self-signed SSL certificate
- Configure the Data Collector to trust the certificate
- Keystore file locations on the Data Collector server
- Import a certificate into the Data Collector Java keystore
- Keystore on the portal server
- Add a virtual interface to a Linux server
- Add a virtual / secondary IP address on Windows
- Portal properties: Format and portal customizations
- Introduction
- Configuring global default inventory object selection
- Restricting user IDs to single sessions
- Customizing date format in the report scope selector
- Customizing the maximum number of lines for exported reports
- Customizing the total label display in tabular reports
- Customizing the host management page size
- Customizing the path and directory for File Analytics database
- Configuring badge expiration
- Configuring the maximum cache size in memory
- Configuring the cache time for reports
- Configuring LDAP to use active directory (AD) for user group privileges
- Data retention periods for SDK database objects
- Data retention periods for SDK database objects
- Data aggregation
- Find the domain ID and database table names
- Retention period update for SDK user-defined objects example
- SDK user-defined database objects
- Capacity: default retention for basic database tables
- Capacity: default retention for EMC Symmetrix enhanced performance
- Capacity: Default retention for EMC XtremIO
- Capacity: Default retention for Dell EMC Elastic Cloud Storage (ECS)
- Capacity: Default retention for Windows file server
- Capacity: Default retention for Pure Storage FlashArray
- Cloud: Default retention for Amazon Web Services (AWS)
- Cloud: Default retention for Microsoft Azure
- Cloud: Default retention for OpenStack Ceilometer
- Configure multi-tenancy data purging retention periods
- Troubleshooting
- Appendix A. Kerberos based proxy user's authentication in Oracle
- Appendix B. Configure TLS-enabled Oracle database on NetBackup IT Analytics Portal and data receiver
- About Transport Layer Security (TLS)
- TLS in Oracle environment
- Configure TLS in Oracle with NetBackup IT Analytics on Linux in split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Linux in non-split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Windows in split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Windows in non-split architecture
- Configure TLS in user environment
- Appendix C. NetBackup IT Analytics for NetBackup on Kubernetes and appliances
Load port attributes and values
Function: The Load Port Attributes utility provides an efficient method of assigning attributes to a large number of ports. Note that Fabric Manager must be installed, or the load will fail.
To load port attributes and values:
- Create a CSV file of Ports, Attributes and Values.
- Execute the Load Port Attribute Utility.
- Verify the Port Attributes Load.
- Create a report template using the SQL Template Designer.
Once attribute values are assigned to ports, a SQL Template Designer report can query the database to report on those ports.
The loadPortAttributeFile utility assigns attribute values to a list of ports. This utility takes a comma-separated values (CSV) file as input.
Note:
This CSV file becomes the master document of record for Port Attributes and therefore must be preserved in a working directory for future updates.
Create a spreadsheet table, in the format shown in the following example, and save it as a CSV file in a working directory. This file is specific to loading port attributes.
Columns:
The first column lists the Fabric Identifier.
The second column lists the Switch Identifier.
The third column lists the Port Element Name.
Each additional column lists attributes and values that will be applied to the port. Multiple attributes can be assigned to a single port object.
Rows:
First (header) row - Contains the Fabric Identifier, Switch Identifier, and Port Element Name headings, followed by the attribute names. The header row is informational only and is not processed as a data row.
Subsequent rows - List the Fabric Identifier, Switch Identifier, and Port Element Name, followed by the attribute values that you assign to each port.
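As an illustration, the CSV layout above can be sanity-checked before loading. The following Python sketch is a hypothetical helper (the validate_port_csv function, fabric/switch/port names, and attribute columns are examples only, not part of the product):

```python
import csv
import io

def validate_port_csv(text, attr_count):
    """Check that each data row has a Fabric Identifier, Switch Identifier,
    and Port Element Name, plus one value per attribute column."""
    rows = list(csv.reader(io.StringIO(text)))
    expected = 3 + attr_count          # three identifier columns + attributes
    errors = []
    for line_no, row in enumerate(rows[1:], start=2):   # skip the header row
        if len(row) != expected:
            errors.append(f"line {line_no}: expected {expected} columns, got {len(row)}")
        elif any(not field.strip() for field in row[:3]):
            errors.append(f"line {line_no}: empty fabric/switch/port identifier")
    return errors

# Sample file in the format described above (two attribute columns).
sample = """\
Fabric Identifier,Switch Identifier,Port Element Name,Location,Department
FAB_01,SW_CORE_01,port1,London,Finance
FAB_01,SW_CORE_01,port2,London,HR
"""

print(validate_port_csv(sample, attr_count=2))  # an empty list means the file is well formed
```

A check like this catches column-count and blank-identifier mistakes before they cause rows to be skipped by the load utility.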
Before you begin, note that the bulk load utility must be run in SQL*Plus as the APTARE user.
The load_package utility is located in:
Linux:
/opt/aptare/database/stored_procedures
Windows:
\opt\oracle\database\stored_procedures
To assign attributes to ports
- Create a table in a spreadsheet.
- Save the table as a comma-separated file (for example, PortAttributes.csv).
- Log in to the portal server.
- At the command line:
su - aptare
- At the command line, launch sqlplus:
sqlplus <pwd>/<pwd>@//localhost:1521/scdb
Example:
sqlplus portal/portal@//localhost:1521/scdb
- Execute the following at the SQL prompt:
SQL> Execute load_package.loadPortAttributeFile ('pathname_and_filename', 'domain_name', fabric_identifier_col_num, switch_identifier_col_num, port_ele_name_col_num, 'log_path_name', 'log_file_name', 'check_valid_value');
Example:
SQL> Execute load_package.loadPortAttributeFile('/tmp/portAttributes.csv', 'DomainEMEA', 1, 2, 3, '/tmp/logs', 'portAttributes.log', 'Y');
Where:
pathname_and_filename
Full path + filename (enclosed in single straight quotes) of the CSV file.
Windows example: 'c:\temp\PortAttributes.csv'
Linux example: '/tmp/PortAttributes.csv'
domain_name
Name (enclosed in single straight quotes) of the domain in which the ports reside. Example: 'DomainEMEA'
Fabric_identifier_col_num
Column number in the CSV file where the Fabric Identifier is listed; Example: 1
switch_identifier_col_num
Column number in the CSV file where the Switch Identifier is listed; Example: 2
port_ele_name_col_num
Column number in the CSV file where the Port Element Name is listed; Example: 3
log_path_name
Full path (enclosed in single straight quotes) where the log file will be created/updated; verify that you have write access to this directory.
Optional: If a log path and file name are not specified, log records are written to scon.log and scon.err. To omit this parameter, enter: ''. Example: 'c:\temp' or '/tmp'
log_file_name
Log file name enclosed in single straight quotes.
Optional: If a log path and file name are not specified, entries are written to scon.log and scon.err. To omit this parameter, enter: ''. Example: 'PortAttributes.log'
check_valid_value
'Y' or 'N' - enclosed in single straight quotes.
Y - Checks if the attribute value exists. If the utility determines that the attribute value is not valid, it skips this row and does not assign the attribute value to the port object.
N - Updates without checking that the attribute value exists. This option is seldom chosen, but it is available for certain customer environments where attributes may have been created without values (with scripts that bypass the user interface).
- Check the log file for status and errors.
- Restart the portal services so that the newly added attributes become available in the product.
To verify that the attribute load was successful:
- In the portal, go to Reports.
- Select a blue user folder.
- Select New SQL Template.
- With the SQL Template Designer open, click the Query tab.
- Enter the following query in the SQL Template Designer to verify port attributes:
select * from aps_v_swi_port_attribute