NetBackup IT Analytics System Administrator Guide
- Introduction
- Preparing for updates
- Backing up and restoring data
- Best practices for disaster recovery
- Oracle database backups
- File system backups
- Oracle database: Cold backup
- Oracle database: Export backups
- Scheduling the Oracle database export
- Oracle database: On demand backup
- Restoring the NetBackup IT Analytics system
- Import the Oracle database
- Manual steps for database import / export using data pump
- Monitoring NetBackup IT Analytics
- Accessing NetBackup IT Analytics reports with the REST API
- Defining NetBackup estimated tape capacity
- Automating host group management
- About automating host group management
- Task overview: managing host groups in bulk
- Preparing to use PL/SQL utilities
- General utilities
- Categorize host operating systems by platform and version
- Identifying a host group ID
- Move or copy clients
- Organize clients by attribute
- Move host group
- Delete host group
- Move hosts and remove host groups
- Organize clients into groups by backup server
- Merge duplicate backup clients
- Bulk load utilities
- Veritas NetBackup utilities
- Automate NetBackup utilities
- Organize clients into groups by management server
- Set up an inactive clients group
- Set up a host group for clients in inactive policies
- Set up clients by policy
- Set up clients by policy type
- IBM Tivoli Storage Manager utilities
- Set up clients by policy domain
- Set up clients by IBM Tivoli Storage Manager instance
- Scheduling utilities to run automatically
- Attribute management
- Attribute bulk load utilities
- Attribute naming rules
- Rename attributes before upgrading
- Load host attributes and values
- Load attributes and values and assign to hosts
- Load array attributes and values and assign to arrays
- Overview of application attributes and values
- Load application database attributes and values
- Load MS Exchange organization attributes and values
- Load LUN attributes and values
- Load switch attributes and values
- Load port attributes and values
- Importing generic backup data
- Backup job overrides
- Managing host data collection
- System configuration in the portal
- System configuration in the portal
- System configuration: functions
- Navigation overview
- System configuration parameter descriptions: Additional info
- Anomaly detection
- Data collection: Capacity chargeback
- Database administration: database
- Host discovery: EMC Avamar
- Host discovery: Host
- Events captured for audit
- Custom parameters
- Adding/editing a custom parameter
- Portal customizations
- Configuring global default inventory object selection
- Restricting user IDs to single sessions
- Customizing date format in the report scope selector
- Customizing the maximum number of lines for exported reports
- Customizing the total label display in tabular reports
- Customizing the host management page size
- Customizing the path and directory for File Analytics database
- Configuring badge expiration
- Configuring the maximum cache size in memory
- Configuring the cache time for reports
- Performance profile schedule customization
- Configuring AD/LDAP
- Configuring single sign-on (SSO) using security assertion markup language (SAML)
- Change Oracle database user passwords
- Integrate with CyberArk
- Tuning NetBackup IT Analytics
- Working with log files
- About debugging NetBackup IT Analytics
- Turn on debugging
- Database logging
- Portal and data collector log files - reduce logging
- Database SCON logging - reduce logging
- Refreshing the database SCON log
- Logging user activity in audit.log
- Logging only what a user deletes
- Logging all user activity
- Data collector log files
- Data collector log file organization
- Data collector log file naming conventions
- General data collector log files
- Find the event / meta collector ID
- Portal log files
- Database log files
- Installation / Upgrade log files
- Defining report metrics
- SNMP trap alerting
- SSL certificate configuration
- SSL certificate configuration
- SSL implementation overview
- Obtain an SSL certificate
- Update the web server configuration to enable SSL
- Configure virtual hosts for portal and / or data collection SSL
- Enable / Disable SSL for a Data Collector
- Enable / Disable SSL for emailed reports
- Test and troubleshoot SSL configurations
- Create a self-signed SSL certificate
- Configure the Data Collector to trust the certificate
- Keystore file locations on the Data Collector server
- Import a certificate into the Data Collector Java keystore
- Keystore on the portal server
- Add a virtual interface to a Linux server
- Add a virtual / secondary IP address on Windows
- Portal properties: Format and portal customizations
- Introduction
- Configuring global default inventory object selection
- Restricting user IDs to single sessions
- Customizing date format in the report scope selector
- Customizing the maximum number of lines for exported reports
- Customizing the total label display in tabular reports
- Customizing the host management page size
- Customizing the path and directory for File Analytics database
- Configuring badge expiration
- Configuring the maximum cache size in memory
- Configuring the cache time for reports
- Configuring LDAP to use active directory (AD) for user group privileges
- Data retention periods for SDK database objects
- Data retention periods for SDK database objects
- Find the domain ID and database table names
- Retention period update for SDK user-defined objects example
- SDK user-defined database objects
- Capacity: default retention for basic database tables
- Capacity: default retention for EMC Symmetrix enhanced performance
- Capacity: Default retention for EMC XtremIO
- Capacity: Default retention for Dell EMC Elastic Cloud Storage (ECS)
- Capacity: Default retention for Windows file server
- Capacity: Default retention for Pure Storage FlashArray
- Cloud: Default retention for Amazon Web Services (AWS)
- Cloud: Default retention for Microsoft Azure
- Cloud: Default retention for OpenStack Ceilometer
- Configure multi-tenancy data purging retention periods
- Troubleshooting
- Appendix A. Configure TLS-enabled Oracle database on NetBackup IT Analytics Portal and data receiver
- About Transport Layer Security (TLS)
- TLS in Oracle environment
- Configure TLS in Oracle with NetBackup IT Analytics on Linux in split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Linux in non-split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Windows in split architecture
- Configure TLS in Oracle with NetBackup IT Analytics on Windows in non-split architecture
- Configure TLS in user environment
Manual steps for database import / export using data pump
Follow the steps to execute the Data Pump Export in a Linux environment
- Log in to the Linux database server and switch to the aptare user.
- Ensure that the file /opt/aptare/database/tools/expdp_scdb.par is owned by the aptare user and has 755 permissions.
- Ensure that the Oracle listener and Oracle services are running.
- Run the following commands:
su - aptare
sqlplus / as sysdba
alter session set container=scdb;
Note:
The alter session set container=scdb; command is required only for a container database. Ignore it in a non-CDB environment.
CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
To use a preferred folder such as /new_directory_path instead, run:
CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Export the database using the following command:
/opt/aptare/oracle/bin/expdp parfile=/opt/aptare/database/tools/expdp_scdb.par
- You can also choose to skip the par file and include the parameters in the expdp command directly. In other words, the above command can be replaced by the following command, which is also executed as the aptare user (a sketch of typical par file contents follows this procedure):
/opt/aptare/oracle/bin/expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y directory=datapump_dir dumpfile=aptare_scdb.exp logfile=export_scdb.log CONTENT=ALL flashback_time=systimestamp
After successful completion, the Data Pump export file aptare_scdb.exp is saved in the /tmp directory of the Linux database server. If you specified a preferred directory, aptare_scdb.exp is saved to that location instead (such as /new_directory_path).
- This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above. Run the command cp /opt/aptare/datarcvrconf/aptare.ks /tmp to copy the aptare.ks file to the /tmp folder.
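For reference, the expdp_scdb.par parameter file is expected to carry the same Data Pump options as the inline expdp command shown above. The following is only an illustrative sketch of what such a file may contain, assuming it mirrors that command; verify the file shipped with your installation rather than recreating it:
FULL=Y
DIRECTORY=datapump_dir
DUMPFILE=aptare_scdb.exp
LOGFILE=export_scdb.log
CONTENT=ALL
FLASHBACK_TIME=systimestamp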
Follow the steps to execute the Data Pump Import in a Linux environment
- Place the export file aptare_scdb.exp created by the Data Pump export in the /tmp directory. If you have a different preferred directory (for example /new_directory_path), place aptare_scdb.exp in that directory instead.
- Ensure that the aptare_scdb.exp file is owned by the aptare user and has 755 permissions.
- Ensure that the files /opt/aptare/database/tools/unlock_portal_linux.sql and /opt/aptare/database/tools/impdp_scdb.par are owned by the aptare user and have 755 permissions.
- As the root user, stop all Oracle and Aptare services by running the following command: /opt/aptare/bin/aptare stop
- As the root user, start the Oracle services by running the following command: /opt/aptare/bin/oracle start
- Ensure that the Oracle listener is running. As the aptare user, check the listener status with the following command: lsnrctl status
- Run the following commands:
su - aptare
sqlplus / as sysdba
alter session set container=scdb;
Note:
The alter session set container=scdb; command is required only for a container database. Ignore it in a non-CDB environment.
drop user aptare_ro cascade;
drop user portal cascade;
CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
To use a preferred folder such as /new_directory_path instead, run:
CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Run the following command as the aptare user:
/opt/aptare/oracle/bin/impdp parfile=/opt/aptare/database/tools/impdp_scdb.par
- You can also choose to skip the par file and include the parameters in the impdp command directly. In other words, the above command can be replaced by the following command, which is also executed as the aptare user:
/opt/aptare/oracle/bin/impdp system/aptaresoftware@//localhost:1521/scdb schemas=portal,aptare_ro directory=datapump_dir dumpfile=aptare_scdb.exp logfile=import_scdb.log
- When the import completes in a non-CDB environment, remove the first command 'alter session set container = scdb;' from the file unlock_portal_linux.sql, then run the following command as the aptare user (one way to remove the line is sketched after this procedure).
Note:
Removing 'alter session set container = scdb;' is required only for a non-CDB environment; no change is required for a container database.
sqlplus / as sysdba @/opt/aptare/database/tools/unlock_portal_linux.sql
- After exiting from sqlplus, execute the following command as the aptare user:
sqlplus portal/portal@//localhost:1521/scdb @/opt/aptare/database/tools/validate_sp.sql
Go to the /tmp directory and check the file import_scdb.log. If you specified a preferred directory, check for import_scdb.log in that location instead. Check the log file for compilation warnings for the following packages: view apt_v_solution_history_log, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user. (A sample log filter is sketched after this procedure.)
Note:
If you are importing the database from version 10.4, upgrade the portal to the 10.5 build after the import.
This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above. Run the following commands to copy the aptare.ks file to the datarcvrconf folder:
cp /tmp/aptare.ks /opt/aptare/datarcvrconf/
chown aptare:tomcat /opt/aptare/datarcvrconf/
chmod 664 /opt/aptare/datarcvrconf/aptare.ks
Run updateUser.sh to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.sh admin123 newPassword
Restart all Oracle and Aptare services by running /opt/aptare/bin/aptare restart as the root user.
Log in to the Portal application using the application account.
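For the non-CDB cleanup step above, one way to remove the 'alter session set container = scdb;' line from unlock_portal_linux.sql is a one-line sed edit. This is only a sketch, assuming the line appears in the file exactly as quoted; the -i.bak option keeps a backup copy of the original script so you can review the change:
sed -i.bak '/alter session set container = scdb;/d' /opt/aptare/database/tools/unlock_portal_linux.sql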
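When reviewing import_scdb.log, a quick filter with standard tools can help separate the expected compilation warnings from anything unexpected. This is only a convenience sketch; adjust the path if you used a preferred directory:
grep -iE 'ora-|warning|error' /tmp/import_scdb.log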
Follow the steps for Windows Data Pump Export
- Log in to the Windows database server.
- Ensure that the Oracle TNS listener and Oracle services are running.
- Ensure that the Aptare user has access to the file c:\opt\oracle\database\tools\expdp_scdb_win.par
Run the following commands:
sqlplus system/aptaresoftware@//localhost:1521/scdb
create or replace directory datapump_dir as 'c:\opt\oracle\logs';
Exit
- After exiting from sqlplus, execute the following command: c:\opt\oracle\bin\expdp parfile=c:\opt\oracle\database\tools\expdp_scdb_win.par
- You can also choose to skip the par file and include the parameters in the expdp command directly. In other words, the above command can be replaced by the following command (a sketch of typical par file contents follows this procedure): c:\opt\oracle\bin\expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y DIRECTORY=datapump_dir LOGFILE=export_scdb.log DUMPFILE=aptare_scdb.exp CONTENT=ALL FLASHBACK_TIME=systimestamp
- After successful completion, the Data Pump export file aptare_scdb.exp is saved in the C:\opt\oracle\logs directory of the Windows database server.
- Copy the file c:\opt\datarcvrconf\aptare.ks to the c:\opt\oracle\logs folder.
Note:
This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above.
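As with the Linux export, the expdp_scdb_win.par parameter file is expected to mirror the inline expdp command shown above. The following is only an illustrative sketch of what it may contain; verify the file shipped with your installation rather than recreating it:
FULL=Y
DIRECTORY=datapump_dir
DUMPFILE=aptare_scdb.exp
LOGFILE=export_scdb.log
CONTENT=ALL
FLASHBACK_TIME=systimestamp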
Follow the steps for Windows Data Pump Import
- Log in to the Windows database server.
- The Aptare user should already have access to the import files c:\opt\oracle\database\tools\unlock_portal_win.sql and c:\opt\oracle\database\tools\impdp_scdb_win.par. If the Oracle user does not have read and execute privileges on these files, grant them before starting the import.
- Place the export file aptare_scdb.exp in the c:\opt\oracle\logs directory.
- If the name of the export file is capitalized, change it to lower case. For example, rename 'APTARE_SCDB.EXP' to 'aptare_scdb.exp'.
- Stop all Oracle and Aptare services using stopAllServices from the Windows Services tab.
- Start OracleServicescdb from the Windows Services tab and ensure that the Oracle TNS listener is running.
Run the following commands:
sqlplus / as sysdba
alter session set container = scdb; (this command is required only for a container database; otherwise, switching to the container is not required)
DROP USER aptare_ro CASCADE;
DROP USER portal CASCADE;
CREATE OR REPLACE DIRECTORY datapump_dir AS 'c:\opt\oracle\logs';
EXIT;
- After exiting from sqlplus, execute the following command:
c:\opt\oracle\bin\impdp parfile=c:\opt\oracle\database\tools\impdp_scdb_win.par
- You can also choose to skip the par file and include the parameters in the impdp command directly. In other words, the above command can be replaced by the following command: c:\opt\oracle\bin\impdp "sys/*@//localhost:1521/scdb as sysdba" SCHEMAS=portal,aptare_ro DIRECTORY=datapump_dir LOGFILE=import_scdb.log DUMPFILE=aptare_scdb.exp
- After the import is complete, execute the following command: sqlplus "sys/*@//localhost:1521/scdb as sysdba" @c:\opt\oracle\database\tools\unlock_portal_win.sql
- After exiting from sqlplus, execute the following command: sqlplus portal/portal@//localhost:1521/scdb @c:\opt\oracle\database\tools\validate_sp.sql
To check the import logs, go to the c:\opt\oracle\logs directory and check the file import_scdb.log. Check the log file for compilation warnings for the following packages: view apt_v_solution_history_log, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user.
Note:
If you are importing the database from version 10.4, upgrade the portal to the 10.5 build after the import.
Copy the saved file c:\opt\oracle\logs\aptare.ks to the c:\opt\datarcvrconf\ folder. Ensure that the file is owned by the NetBackup IT Analytics user and that the user has read and write access to the copied file.
Note:
This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above.
After successful completion of the import process, run stopAllServices from the Windows Services tab.
Run startAllServices from the Windows Services tab.
Run updateUser.bat from the utils directory to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.bat admin123 newPassword
Log in to the Portal application using the application account.