NetBackup™ for Cloud Object Store Administrator's Guide
Last Published:
2025-11-24
Product(s):
NetBackup (11.1)
- Introduction
- Managing Cloud object store assets
- Planning NetBackup protection for Cloud object store assets
- Enhanced backup performance in 11.0 or later
- Prerequisites for adding Cloud object store accounts
- Configuring buffer size for backups
- Configure a temporary staging location
- Configuring advanced dynamic multi-streaming parameters
- Permissions required for Amazon S3 cloud provider user
- Permissions required for Azure blob storage
- Permission required for Azure Data Lake Storage
- Permissions required for GCP
- Limitations and considerations
- Adding Cloud object store accounts
- Manage Cloud object store accounts
- Scan for malware
- Protecting Cloud object store assets
- About accelerator support
- About incremental backup
- About dynamic multi-streaming
- About object change tracking
- Configuring object change tracking
- Configuring access permissions for the buckets
- Configuring access policy on the log bucket
- Configuration guidelines for IBM Storage Ceph
- Enable bucket logging for source buckets
- Creating policy for the log bucket
- Additional storage requirements at the staging location
- Configuring bucket logging in IBM Storage Ceph
- Maintaining the log bucket
- Configuring NetBackup for object change tracking
- Configuring NetBackup policy for object change tracking
- Verifying object change tracking in the Activity monitor
- Scenarios when NetBackup overrides object change tracking
- About storage lifecycle policies
- About policies for Cloud object store assets
- Planning for policies
- Prerequisites for Cloud object store policies
- Creating a backup policy
- Policy attributes
- Creating schedule attributes for policies
- Configuring the Start window
- Configuring the exclude dates
- Configuring the include dates
- Configuring the Cloud objects tab
- Adding conditions
- Adding tag conditions
- Examples of conditions and tag conditions
- Managing Cloud object store policies
- Recovering Cloud object store assets
- Troubleshooting
- Error 5549: Cannot validate bucket logging information
- Error 5576: The maximum number of concurrent jobs specified for a storage unit, must be greater than or equal to the number of streams specified in the policy.
- Error 5579: Falling back to object listing for change detection, not considering object change tracking for this bucket, specified in the policy.
- Error 5580: The specified failover strategy for the storage unit group is incompatible with the Cloud object store policy, with dynamic multi-streaming.
- Error 5545: Backup failed as NetBackup cannot parse records from the log object
- Error 5541: Cannot take backup, the specified staging location does not have enough space
- Error 5537: Backup failed: Incorrect read/write permissions are specified for the download staging path.
- Error 5538: Cannot perform backup. Incorrect ownership is specified for the download staging path.
- Reduced acceleration during the first full backup, after upgrade to versions 10.5 and 11.
- After backup, some files in the shm folder and shared memory are not cleaned up.
- After an upgrade to NetBackup version 10.5, copying, activating, and deactivating policies may fail for older policies
- Backup fails with default number of streams with the error: Failed to start NetBackup COSP process.
- Backup fails, after you select a scale out server or Snapshot Manager as a backup host
- Backup fails or becomes partially successful on GCP storage for objects with content encoded as GZIP.
- Recovery for the original bucket recovery option starts, but the job fails with error 3601
- Recovery Job does not start
- Restore fails: "Error bpbrm (PID=3899) client restore EXIT STATUS 40: network connection broken"
- Access tier property not restored after overwriting the existing object in the original location
- Reduced accelerator optimization in Azure for OR query with multiple tags
- Backup failed and shows a certificate error with Amazon S3 bucket names containing dots (.)
- Azure backup jobs fail when space is provided in a tag query for either tag key name or value.
- The Cloud object store account has encountered an error
- The bucket list is empty during policy selection
- Creating a second account on Cloudian fails by selecting an existing region
- Restore failed with 2825 incomplete restore operation
- Bucket listing of a cloud provider fails when adding a bucket in the Cloud objects tab
- AIR import image restore fails on the target domain if the Cloud store account is not added to the target domain
- Backup for Azure Data Lake fails when a back-level media server is used with backup host or storage server version 10.3
- Backup fails partially in Azure Data Lake: "Error nbpem (pid=16018) backup of client
- Recovery for Azure Data Lake fails: "This operation is not permitted as the path is too deep"
- Empty directories are not backed up in Azure Data Lake
- Recovery error: "Invalid alternate directory location. You must specify a string with length less than 1025 valid characters"
- Recovery error: "Invalid parameter specified"
- Restore fails: "Cannot perform the COSP operation, skipping the object: [/testdata/FxtZMidEdTK]"
- Cloud store account creation fails with incorrect credentials
- Discovery failures due to improper permissions
- Restore failures due to object lock
Backup fails, after you select a scale out server or Snapshot Manager as a backup host
Explanation:
This error occurs when a policy has Dynamic multi-streaming enabled, and you update the Cloud object store account that the policy uses so that it uses a scale-out server or NetBackup Snapshot Manager as the backup host.
Workaround:
Do one of the following:
- Disable Dynamic multi-streaming in the policy.
- Do not use a scale-out server or NetBackup Snapshot Manager as the backup host in the Cloud object store account.