NetBackup™ for Cloud Object Store Administrator's Guide
- Introduction
- Managing Cloud object store assets
- Planning NetBackup protection for Cloud object store assets
- Enhanced backup performance in 11.0 or later
- Prerequisites for adding Cloud object store accounts
- Configuring buffer size for backups
- Configure a temporary staging location
- Configuring advanced parameters for Cloud object store
- Permissions required for Amazon S3 cloud provider user
- Permissions required for Azure blob storage
- Permissions required for GCP
- Limitations and considerations
- Adding Cloud object store accounts
- Manage Cloud object store accounts
- Scan for malware
- Protecting Cloud object store assets
- About accelerator support
- About incremental backup
- About dynamic multi-streaming
- About storage lifecycle policies
- About policies for Cloud object store assets
- Planning for policies
- Prerequisites for Cloud object store policies
- Creating a backup policy
- Policy attributes
- Creating schedule attributes for policies
- Configuring the Start window
- Configuring the exclude dates
- Configuring the include dates
- Configuring the Cloud objects tab
- Adding conditions
- Adding tag conditions
- Examples of conditions and tag conditions
- Managing Cloud object store policies
- Recovering Cloud object store assets
- Troubleshooting
- Error 5541: Cannot take backup, the specified staging location does not have enough space
- Error 5537: Backup failed: Incorrect read/write permissions are specified for the download staging path.
- Error 5538: Cannot perform backup. Incorrect ownership is specified for the download staging path.
- Reduced acceleration during the first full backup, after upgrade to versions 10.5 and 11.
- After backup, some files in the shm folder and shared memory are not cleaned up.
- After an upgrade to NetBackup version 10.5, copying, activating, and deactivating policies may fail for older policies
- Backup fails with default number of streams with the error: Failed to start NetBackup COSP process.
- Backup fails, after you select a scale out server or Snapshot Manager as a backup host
- Backup fails or becomes partially successful on GCP storage for objects with content encoded as GZIP.
- Recovery for the original bucket recovery option starts, but the job fails with error 3601
- Recovery Job does not start
- Restore fails: "Error bpbrm (PID=3899) client restore EXIT STATUS 40: network connection broken"
- Access tier property not restored after overwriting the existing object in the original location
- Reduced accelerator optimization in Azure for OR query with multiple tags
- Backup failed and shows a certificate error with Amazon S3 bucket names containing dots (.)
- Azure backup jobs fail when space is provided in a tag query for either tag key name or value.
- The Cloud object store account has encountered an error
- The bucket list is empty during policy selection
- Creating a second account on Cloudian fails by selecting an existing region
- Restore failed with 2825 incomplete restore operation
- Bucket listing of a cloud provider fails when adding a bucket in the Cloud objects tab
- AIR import image restore fails on the target domain if the Cloud store account is not added to the target domain
- Backup for Azure Data Lake fails when a back-level media server is used with backup host or storage server version 10.3
- Backup fails partially in Azure Data Lake: "Error nbpem (pid=16018) backup of client
- Recovery for Azure Data Lake fails: "This operation is not permitted as the path is too deep"
- Empty directories are not backed up in Azure Data Lake
- Recovery error: "Invalid alternate directory location. You must specify a string with length less than 1025 valid characters"
- Recovery error: "Invalid parameter specified"
- Restore fails: "Cannot perform the COSP operation, skipping the object: [/testdata/FxtZMidEdTK]"
- Cloud store account creation fails with incorrect credentials
- Discovery failures due to improper permissions
- Restore failures due to object lock
Configuring the Cloud objects tab
The Cloud objects tab lets you select the Cloud object store account that you want to use to connect to cloud resources and protect objects in the desired buckets. NetBackup lets you make a discrete selection of the buckets/containers and objects that you want to protect with the policy. You can use queries to intelligently filter and select the items that you want to protect.
NetBackup supports a single backup host or scale-out server per policy. To distribute the load, create multiple policies and use queries to split the buckets/objects being backed up across multiple backup hosts or scale-out servers.
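As a hypothetical illustration of this load split (the bucket names and the round-robin scheme are invented for the example, not NetBackup behavior), dividing a bucket list into disjoint subsets, one per policy, could look like:

```python
# Illustrative sketch: split a bucket list across N policies (one backup
# host or scale-out server each) so every policy protects a disjoint
# subset. Bucket names below are invented examples.
def partition_buckets(buckets, policy_count):
    """Assign buckets round-robin to policies to balance the load."""
    groups = [[] for _ in range(policy_count)]
    for i, bucket in enumerate(sorted(buckets)):
        groups[i % policy_count].append(bucket)
    return groups

buckets = ["finance-logs", "hr-archive", "media-raw", "sales-exports"]
print(partition_buckets(buckets, 2))
# → [['finance-logs', 'media-raw'], ['hr-archive', 'sales-exports']]
```

In practice you would express each subset as the bucket selection (or query) of its own policy, so the two policies run on different backup hosts.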
To configure cloud objects:
- Select a Cloud object store account and Host. You can see a list of accounts and backup hosts that you are privileged to access. If you use a scale-out server for the account, the Host field is disabled. You cannot change a scale-out server while creating a policy.
- (Optional) Select the Allow dynamic multi-streaming option to let NetBackup divide automatic backups for each bucket/container into multiple concurrent streams. This option can dramatically improve the backup time of the protected buckets/containers.
In the Maximum number of streams per bucket/container field, specify a number between 1 and 64. The default value is 8.
Some streams may be queued if the maximum number of concurrent jobs allowed in the storage unit selected for the policy is less than the total number of streams running for the policy. For optimal performance, keep the Maximum concurrent jobs property of the selected storage unit greater than the total number of streams that you expect the policy to run. Set the Maximum concurrent jobs value for the storage to at least 64.
Optionally, specify a Temporary staging location path, or leave this field blank to use the default path. See Configure a temporary staging location.
Note:
If you enable dynamic multi-streaming, all selected buckets and containers are completely backed up. You cannot define any queries for the buckets or containers that you have selected.
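The stream-versus-concurrent-jobs behavior described above is simple arithmetic; a small sketch (illustrative only, not a NetBackup computation) shows when streams queue:

```python
# Sketch of the queueing arithmetic: if a policy runs more streams than
# the storage unit's "Maximum concurrent jobs" setting allows, the
# excess streams wait in a queue.
def queued_streams(bucket_count, streams_per_bucket, max_concurrent_jobs):
    total_streams = bucket_count * streams_per_bucket
    return max(0, total_streams - max_concurrent_jobs)

# 10 buckets x 8 streams per bucket = 80 streams; with 64 concurrent
# jobs allowed on the storage unit, 16 streams are queued.
print(queued_streams(10, 8, 64))  # → 16
```

This is why the guide recommends keeping Maximum concurrent jobs above the total stream count you expect the policy to generate.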
- To add buckets/containers, click Add near the Buckets/Containers table. In the Add bucket/containers dialog, do any of the following to add buckets/containers.
To add a particular container, enter the name in the Bucket/Container name field, and click Add.
Select one or more buckets/containers from the Buckets/Containers table, and click Add. You can use the search box above the table to filter the list.
If the Cloud object store account credentials do not have permission to list buckets, the bucket list remains empty, but you can still add buckets manually.
In the Cloud objects tab, click Remove in the row of any bucket/container name in the Buckets/Containers table to remove it from the policy. Enter a keyword in the search box to filter the table.
- To add a query to the selected buckets/containers, click Add query under Queries.
- Enter a name for the query, and select the buckets that you want to filter using the query.
- In the Select objects/blobs table, select the option Include all objects/blobs in the selected buckets/containers to back up one or more entire buckets.
- Under Buckets with no queries, select the buckets/containers to which you want to add queries. If a bucket was previously selected to include all objects/blobs, that bucket does not appear in this list. Click Add condition or Add tag condition to add a condition or a tag condition. See Adding conditions and Adding tag conditions, respectively, for more details.
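To make the effect of conditions and tag conditions concrete, here is an illustrative sketch (not NetBackup's implementation; the object keys, tag names, and the `matches` helper are invented for the example) of how a query could filter objects by a key prefix and a tag:

```python
# Illustrative sketch of query-based filtering: an object is selected
# when it satisfies both a prefix condition and a tag condition.
def matches(obj, prefix=None, tag=None):
    """obj: dict with 'key' and 'tags'; tag: (name, value) pair."""
    if prefix is not None and not obj["key"].startswith(prefix):
        return False
    if tag is not None:
        name, value = tag
        if obj["tags"].get(name) != value:
            return False
    return True

objects = [
    {"key": "reports/2024/q1.csv", "tags": {"team": "finance"}},
    {"key": "logs/app.log", "tags": {"team": "ops"}},
]
selected = [o["key"] for o in objects
            if matches(o, prefix="reports/", tag=("team", "finance"))]
print(selected)  # → ['reports/2024/q1.csv']
```

A policy query plays the same role: only objects satisfying all of its conditions are included in the backup.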