NetBackup™ for Cloud Object Store Administrator's Guide

Last Published:
Product(s): NetBackup (11.1)
  1. Introduction
    1. Overview of NetBackup protection for Cloud object store
    2. Features of NetBackup Cloud object store workload support
  2. Managing Cloud object store assets
    1. Planning NetBackup protection for Cloud object store assets
    2. Enhanced backup performance in 11.0 or later
    3. Prerequisites for adding Cloud object store accounts
    4. Configuring buffer size for backups
    5. Configure a temporary staging location
    6. Configuring advanced dynamic multi-streaming parameters
    7. Permissions required for Amazon S3 cloud provider user
    8. Permissions required for Azure blob storage
    9. Permissions required for Azure Data Lake Storage
    10. Permissions required for GCP
    11. Limitations and considerations
    12. Adding Cloud object store accounts
      1. Creating cross-account access in AWS
      2. Check certificate for revocation
      3. Managing Certification Authorities (CA) for NetBackup Cloud
      4. Adding a new region
    13. Manage Cloud object store accounts
    14. Scan for malware
      1. Backup images
      2. Assets by policy type
  3. Protecting Cloud object store assets
    1. About accelerator support
      1. How NetBackup accelerator works with Cloud object store
      2. Accelerator notes and requirements
      3. Accelerator force rescan for Cloud object store (schedule attribute)
      4. Accelerator backup and NetBackup catalog
      5. Calculate the NetBackup accelerator track log size
    2. About incremental backup
    3. About dynamic multi-streaming
    4. About object change tracking
      1. Configuring object change tracking
      2. Configuring access permissions for the buckets
      3. Configuring access policy on the log bucket
      4. Configuration guidelines for IBM Storage Ceph
      5. Enable bucket logging for source buckets
      6. Creating policy for the log bucket
      7. Additional storage requirements at the staging location
      8. Configuring bucket logging in IBM Storage Ceph
      9. Maintaining the log bucket
      10. Configuring NetBackup for object change tracking
      11. Configuring NetBackup policy for object change tracking
      12. Verifying object change tracking in the Activity monitor
      13. Scenarios when NetBackup overrides object change tracking
    5. About storage lifecycle policies
      1. Adding an SLP
    6. About policies for Cloud object store assets
    7. Planning for policies
    8. Prerequisites for Cloud object store policies
    9. Creating a backup policy
    10. Policy attributes
    11. Creating schedule attributes for policies
    12. Configuring the Start window
      1. Adding, changing, or deleting a time window in a policy schedule
      2. Example of schedule duration
    13. Configuring the exclude dates
    14. Configuring the include dates
    15. Configuring the Cloud objects tab
    16. Adding conditions
    17. Adding tag conditions
    18. Examples of conditions and tag conditions
    19. Managing Cloud object store policies
      1. Copy a policy
      2. Deactivating or deleting a policy
      3. Manually backup assets
  4. Recovering Cloud object store assets
    1. Prerequisites for recovering Cloud object store objects
    2. Configuring Cloud object retention properties
    3. Recovering Cloud object store assets
  5. Troubleshooting
    1. Error 5549: Cannot validate bucket logging information
    2. Error 5576: The maximum number of concurrent jobs specified for a storage unit, must be greater than or equal to the number of streams specified in the policy.
    3. Error 5579: Falling back to object listing for change detection, not considering object change tracking for this bucket, specified in the policy.
    4. Error 5580: The specified failover strategy for the storage unit group is incompatible with the Cloud object store policy, with dynamic multi-streaming.
    5. Error 5545: Backup failed as NetBackup cannot parse records from the log object
    6. Error 5541: Cannot take backup, the specified staging location does not have enough space
    7. Error 5537: Backup failed: Incorrect read/write permissions are specified for the download staging path.
    8. Error 5538: Cannot perform backup. Incorrect ownership is specified for the download staging path.
    9. Reduced acceleration during the first full backup, after upgrade to versions 10.5 and 11.
    10. After backup, some files in the shm folder and shared memory are not cleaned up.
    11. After an upgrade to NetBackup version 10.5, copying, activating, and deactivating policies may fail for older policies
    12. Backup fails with default number of streams with the error: Failed to start NetBackup COSP process.
    13. Backup fails, after you select a scale out server or Snapshot Manager as a backup host
    14. Backup fails or becomes partially successful on GCP storage for objects with content encoded as GZIP.
    15. Recovery for the original bucket recovery option starts, but the job fails with error 3601
    16. Recovery Job does not start
    17. Restore fails: "Error bpbrm (PID=3899) client restore EXIT STATUS 40: network connection broken"
    18. Access tier property not restored after overwriting the existing object in the original location
    19. Reduced accelerator optimization in Azure for OR query with multiple tags
    20. Backup failed and shows a certificate error with Amazon S3 bucket names containing dots (.)
    21. Azure backup jobs fail when space is provided in a tag query for either tag key name or value.
    22. The Cloud object store account has encountered an error
    23. The bucket list is empty during policy selection
    24. Creating a second account on Cloudian fails by selecting an existing region
    25. Restore failed with 2825 incomplete restore operation
    26. Bucket listing of a cloud provider fails when adding a bucket in the Cloud objects tab
    27. AIR import image restore fails on the target domain if the Cloud store account is not added to the target domain
    28. Backup for Azure Data Lake fails when a back-level media server is used with backup host or storage server version 10.3
    29. Backup fails partially in Azure Data Lake: "Error nbpem (pid=16018) backup of client
    30. Recovery for Azure Data Lake fails: "This operation is not permitted as the path is too deep"
    31. Empty directories are not backed up in Azure Data Lake
    32. Recovery error: "Invalid alternate directory location. You must specify a string with length less than 1025 valid characters"
    33. Recovery error: "Invalid parameter specified"
    34. Restore fails: "Cannot perform the COSP operation, skipping the object: [/testdata/FxtZMidEdTK]"
    35. Cloud store account creation fails with incorrect credentials
    36. Discovery failures due to improper permissions
    37. Restore failures due to object lock

Examples of conditions and tag conditions

Here is an example to illustrate the use of conditions and tag conditions.

Consider a container or bucket that contains the following files and directories:

  • The following blobs are tagged with the "Project": "HR" tag:

    • OrganizationData/Hr/resumes/resume1_selected.pdf

    • OrganizationData/Hr/resumes/resume2_rejected.pdf

    • OrganizationData/Hr/resumes/resume3_noupdate.pdf

  • The following blobs are tagged with the "Project": "Finance" tag:

    • OrganizationData/Fin/accounts/account1/records1.txt

    • OrganizationData/Fin/accounts/account2/records2.txt

    • OrganizationData/Fin/accounts/account3/records3.txt

    • OrganizationData/Fin/accounts/monthly_expenses/Jul2022.rec

    • OrganizationData/Fin/accounts/monthly_expenses/Aug2022.rec

  • The following blobs are tagged with the "Project": "Security" tag:

    • The blob Getepass.pdf also carries the "TypeOfData": "ID_Cards" tag, so it has two tags ("Project": "Security" and "TypeOfData": "ID_Cards")

    • OrganizationData/newJoinees/tempPassesList.xls

  • The following blobs are tagged with the "Project": "Environment" tag:

    • EnvironmentContribution.xls

    • NewPlantedTrees.xls

Example prefix conditions:

  • Case 1: To back up all resumes from OrganizationData, regardless of their status (for example, selected or rejected), add the query:

    prefix Equal to OrganizationData/Hr/resumes/resume

    Result: All records that start with OrganizationData/Hr/resumes/resume are backed up.

  • Case 2: To back up all resumes and account records from the Hr and Fin directories, add the following queries:

    prefix Equal to OrganizationData/Hr/resumes/resume

    Or

    prefix Equal to OrganizationData/Fin/accounts/account1/rec

    Note:

    You can add multiple prefixes with OR conditions.

Result: All records starting with OrganizationData/Hr/resumes/resume or OrganizationData/Fin/accounts/account1/rec are backed up.
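The prefix cases above can be sketched as a simple starts-with test applied to the bucket listing, with multiple prefixes combined by OR. This is an illustrative sketch (not NetBackup's implementation); the blob names come from the example above:

```python
# Hypothetical sketch: a prefix condition selects every object whose full
# name starts with the configured string; multiple prefixes are ORed.
blobs = [
    "OrganizationData/Hr/resumes/resume1_selected.pdf",
    "OrganizationData/Hr/resumes/resume2_rejected.pdf",
    "OrganizationData/Hr/resumes/resume3_noupdate.pdf",
    "OrganizationData/Fin/accounts/account1/records1.txt",
    "OrganizationData/newJoinees/tempPassesList.xls",
]

def match_prefixes(names, prefixes):
    """Return names matching any of the prefix conditions (OR semantics)."""
    return [n for n in names if any(n.startswith(p) for p in prefixes)]

selected = match_prefixes(
    blobs,
    ["OrganizationData/Hr/resumes/resume",
     "OrganizationData/Fin/accounts/account1/rec"],
)
# The three resumes and records1.txt are selected; tempPassesList.xls
# matches neither prefix and is excluded.
```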

Example object conditions:

To back up a specific object or blob, add the following query:

object Equal to OrganizationData/Fin/accounts/monthly_expenses/Jul2022.rec

Result: Only the blob OrganizationData/Fin/accounts/monthly_expenses/Jul2022.rec is selected.
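As an illustrative sketch (not NetBackup code), an object condition selects by an exact match on the full object name, in contrast to a prefix condition, which matches any name that starts with the given string:

```python
# Hypothetical sketch: an "object Equal to" condition matches the full
# object name exactly. The blob names come from the example above.
blobs = [
    "OrganizationData/Fin/accounts/monthly_expenses/Jul2022.rec",
    "OrganizationData/Fin/accounts/monthly_expenses/Aug2022.rec",
]

def match_object(names, object_name):
    """Exact-match selection for an 'object Equal to' condition."""
    return [n for n in names if n == object_name]

selected = match_object(
    blobs, "OrganizationData/Fin/accounts/monthly_expenses/Jul2022.rec")
# Only Jul2022.rec is selected; Aug2022.rec is not.
```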

Example tag conditions:

  • Case 1: To back up all blobs tagged with "Project": "Finance", add the following query:

    tagKey Equal to 'Project' and tagValue Equal to 'Finance'

    Result: All objects/blobs tagged with "Project" = "Finance" are selected.

  • Case 2: To back up data that matches with project Finance or Security, add the query:

    tagKey Equal to 'Project' and tagValue Equal to 'Finance' OR tagKey Equal to 'Project' and tagValue Equal to 'Security'

    Result: All objects/blobs tagged with "Project": "Finance" or "Project": "Security" are selected.

  • Case 3: To back up data tagged with both "Project": "Security" and "TypeOfData": "ID_Cards", add the queries:

    (tagKey Equal to 'Project' and tagValue Equal to 'Security') AND (tagKey Equal to 'TypeOfData' and tagValue Equal to 'ID_Cards')

    Result: Data with the tags "Project": "Security" and "TypeOfData": "ID_Cards" is selected.
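The three tag cases above can be sketched as follows, assuming each blob carries a dictionary of tags and each query is a (key, value) pair: conditions joined by AND must all hold on one blob, while groups joined by OR are alternatives. This is an illustrative sketch of the matching semantics, not NetBackup code; the tag assignments come from the example above:

```python
# Hypothetical sketch of tag-condition evaluation: OR across groups,
# AND within a group of (tagKey, tagValue) pairs.
blob_tags = {
    "Getepass.pdf": {"Project": "Security", "TypeOfData": "ID_Cards"},
    "OrganizationData/newJoinees/tempPassesList.xls": {"Project": "Security"},
    "OrganizationData/Fin/accounts/account1/records1.txt": {"Project": "Finance"},
}

def matches_all(tags, conditions):
    """AND: the blob must carry every (key, value) pair in the group."""
    return all(tags.get(k) == v for k, v in conditions)

def select(blob_tags, groups):
    """OR across groups: a blob is selected if any group fully matches."""
    return [name for name, tags in blob_tags.items()
            if any(matches_all(tags, group) for group in groups)]

# Case 2: Finance OR Security -> all three blobs above.
case2 = select(blob_tags, [[("Project", "Finance")],
                           [("Project", "Security")]])

# Case 3: Security AND ID_Cards -> only Getepass.pdf, the one blob
# that carries both tags.
case3 = select(blob_tags, [[("Project", "Security"),
                            ("TypeOfData", "ID_Cards")]])
```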