Free NCP-DB-6.5 Exam Braindumps (page: 13)

Page 13 of 38

What is the purpose of Data Access Management policies in NDB Multi-Cluster?

  A. To register multiple Nutanix clusters in NDB
  B. To perform snapshot operations on a single Nutanix cluster
  C. To manage time machine data availability across all registered Nutanix clusters in NDB
  D. To remove data accessibility of a time machine across all registered Nutanix clusters in NDB

Answer(s): C

Explanation:

Data Access Management (DAM) policies are a feature of NDB Multi-Cluster that lets you control the access and availability of time machine data across different Nutanix clusters. You can use DAM policies to specify which clusters can access the time machine data of a source database, and which clusters can replicate that data for backup or disaster recovery purposes. DAM policies help you optimize storage and network resources and ensure the security and compliance of your database workloads.
The purpose of DAM policies is not to register multiple Nutanix clusters in NDB, as this is done with the Add Cluster option on the NDB settings page. Nor is it to perform snapshot operations on a single Nutanix cluster, as this is done with the Time Machine feature in the NDB dashboard. Finally, DAM policies do not remove data accessibility of a time machine across all registered Nutanix clusters in NDB, as this is done with the Delete option on the Time Machine page.
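As a rough illustration only, the Python sketch below shows how a multi-cluster data-access change of this kind might be automated against the NDB REST API. The endpoint path, payload fields, server address, and credentials are assumptions made for this example, not the documented NDB API; consult the NDB API reference for the actual resource names.

  # Hypothetical sketch: make a time machine's data available on another
  # registered Nutanix cluster via the NDB REST API. The endpoint path and
  # payload fields are assumptions, not the documented NDB API schema.
  import requests

  NDB_SERVER = "https://ndb.example.com"      # assumed NDB server address
  AUTH = ("admin", "password")                # assumed credentials
  TIME_MACHINE_ID = "tm-1234"                 # assumed time machine ID
  TARGET_CLUSTER_ID = "cluster-5678"          # assumed Nutanix cluster ID

  payload = {
      "nxClusterId": TARGET_CLUSTER_ID,       # cluster that should access the data
  }

  resp = requests.post(
      f"{NDB_SERVER}/era/v0.9/tms/{TIME_MACHINE_ID}/clusters",  # assumed endpoint
      json=payload,
      auth=AUTH,
      verify=False,                           # lab-only: skip TLS verification
  )
  resp.raise_for_status()
  print(resp.json())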


Reference:

Nutanix Database Management & Automation Training Course, Module 6: Managing NDB Multi-Cluster, Lesson 2: Data Access Management Policies, Slide 3: Data Access Management Policies
Nutanix Certified Professional - Database Automation (NCP-DB) v6.5 Exam, Section 6: Administer an NDB Environment, Objective 6.5: Apply procedural concepts to create Data Access Management (DAM) policies



How does NDB send notifications when alerts are generated?

  A. SNMP
  B. APIs
  C. Pulse
  D. Email

Answer(s): D

Explanation:

NDB sends notifications via email when alerts are generated. The email notifications can be configured to go to one or more recipients and can be customized to include the alert severity, category, description, and resolution steps. These notifications keep the database administrator and other stakeholders informed about the status and any issues of the NDB-managed databases and operations.
NDB does not send notifications via SNMP, APIs, or Pulse. SNMP is a protocol for collecting and organizing information about managed devices on a network. APIs are interfaces for communicating and exchanging data between different applications or systems. Pulse is a feature of the Nutanix cluster that collects and sends diagnostic and usage data to Nutanix for analysis and support.
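For illustration only, the standalone Python sketch below mimics the kind of alert email NDB would send once SMTP settings and recipients are configured. The subject wording, field layout, addresses, and SMTP relay are all assumed values, not NDB's actual implementation.

  # Illustrative only: a standalone sketch of an NDB-style alert email.
  # Subject wording, addresses, and the SMTP relay are assumed values.
  import smtplib
  from email.message import EmailMessage

  msg = EmailMessage()
  msg["Subject"] = "[NDB Alert][Critical] Log catch-up failed"  # assumed wording
  msg["From"] = "ndb-alerts@example.com"
  msg["To"] = "dba-team@example.com"
  msg.set_content(
      "Severity: Critical\n"
      "Category: Time Machine\n"
      "Description: Log catch-up operation failed for database FinanceDB.\n"
      "Resolution: Verify connectivity to the DB server VM and retry.\n"
  )

  with smtplib.SMTP("smtp.example.com", 25) as smtp:  # assumed SMTP relay
      smtp.send_message(msg)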


Reference:

Nutanix Database Management & Automation Training Course, Module 3: Nutanix Era Deployment, Lesson 3.2: Nutanix Era Deployment, slide 11.
Nutanix Database Management & Automation Training Course, Module 5: Nutanix Era Operations, Lesson 5.1: Nutanix Era Operations, slide 6.
Nutanix Database Management & Automation Training Course, Module 5: Nutanix Era Operations, Lesson 5.2: Nutanix Era Alerts and Notifications, slides 5-7.



An administrator is tasked with auditing NDB SLAs.
What data will the administrator be reviewing?

  A. Snapshot schedules
  B. Clone Management
  C. Data retention policies
  D. Recovery Time Objective

Answer(s): C

Explanation:

NDB SLAs are service level agreements that define the data protection and recovery objectives for NDB-managed databases. NDB SLAs consist of data retention policies that specify how long the snapshots and log backups of a database are kept in the Time Machine. Data retention policies can be customized to meet different business and compliance requirements, such as daily, weekly, monthly, or yearly retention periods. NDB SLAs also determine the frequency and schedule of the snapshots and log backups, as well as the storage location and replication options.
An administrator who is tasked with auditing NDB SLAs will therefore be reviewing the data retention policies of each database and Time Machine, as well as the snapshot and log backup history and status. The administrator can also monitor the storage usage and performance of the NDB SLAs, and modify or delete the SLAs as needed.
The other options are not part of the NDB SLAs, but rather separate features or concepts of NDB. Snapshot schedules are the intervals at which NDB takes snapshots of the databases, which are determined by the SLAs. Clone management is the process of creating, refreshing, or deleting database clones from the Time Machine. Recovery time objective (RTO) is the maximum acceptable time for restoring a database after a failure, which is influenced by the SLAs but not defined by them.
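As a rough illustration, the Python sketch below expresses the retention tiers an auditor would review as a create-SLA request. The field names, endpoint, server address, and values are assumptions modelled on NDB's continuous/daily/weekly/monthly/quarterly retention tiers and are not copied from the product documentation; verify the exact schema against the NDB API reference.

  # Hypothetical sketch: the retention tiers an auditor reviews, expressed as
  # a create-SLA request. Field names and the endpoint are assumptions.
  import requests

  sla = {
      "name": "Finance-Gold-SLA",
      "description": "Retention for production finance databases",
      "continuousRetention": 30,   # days of log retention (point-in-time window)
      "dailyRetention": 90,        # days to keep daily snapshots
      "weeklyRetention": 16,       # weeks to keep weekly snapshots
      "monthlyRetention": 12,      # months to keep monthly snapshots
      "quarterlyRetention": 35,    # quarters to keep quarterly snapshots
  }

  resp = requests.post(
      "https://ndb.example.com/era/v0.9/slas",  # assumed NDB server and endpoint
      json=sla,
      auth=("admin", "password"),               # assumed credentials
      verify=False,                             # lab-only
  )
  resp.raise_for_status()
  print(resp.json())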


Reference:

Nutanix Certified Professional - Database Automation (NCP-DB) v6.5, Section 5: Protect NDB-managed Databases Using Time Machine, Objective 5.1: Create, delete, and modify SLA retention policies
Nutanix Database Management & Automation (NDMA) Course, Module 4: Nutanix Database Service (NDB) Data Protection, Lesson 4.1: Data Protection Overview, Topic: SLA Concepts
Nutanix Database Service (NDB) User Guide, Chapter 6: SLAs, Section: SLA Overview



A development team has requested that an administrator provide them with a copy of the production Finance database. The business requires that any financial data be masked before it goes into development.
How should the administrator create a clone with masked data for the development environment?

  A. From the Time Machine, create a clone and paste the masking commands in the post-clone field of the Pre-Post Commands section.
  B. 1. Create a masking script on the source DB VM, Dev VM, or SW Profile VM.
     2. Create the clone from the Time Machine and define the post-clone option with the full path\name of the masking script.
  C. 1. Create a script to mask the data.
     2. Create the clone from the Time Machine and define the post-clone option with the full path\name of the masking script.
  D. From the Time Machine, create a clone and paste the masking commands in the pre-clone field of the Pre-Post Commands section.

Answer(s): B

Explanation:

According to the Nutanix Database Automation (NCP-DB) course, the Pre-Post Commands section allows the administrator to specify custom scripts that are executed before or after the clone operation [1]. The masking script can be created on any of the VMs that have access to the source database, such as the source DB VM, the Dev VM, or the SW Profile VM [2]. The script should contain the commands to mask the sensitive data in the Finance database, such as replacing the real values with dummy values or encrypting the data [2]. The administrator can then create the clone from the Time Machine and define the post-clone option with the full path and name of the masking script [1]. This ensures that the script is executed after the clone is created and that the data is masked before it is available to the development team [1]. The other options are not correct: they either use the wrong field (pre-clone instead of post-clone) or do not specify where to create or store the masking script.
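For illustration, a post-clone masking script for a PostgreSQL-based Finance clone might look like the hypothetical sketch below. The script path, table names, columns, and connection details are placeholders, not taken from the exam or product documentation; in practice the full path to a file like this is what the administrator enters in the post-clone command field.

  #!/usr/bin/env python3
  # Hypothetical post-clone masking script, e.g. /opt/ndb/scripts/mask_finance.py.
  # Table names, columns, and connection details are placeholders; a real script
  # would mask whatever financial fields the business has identified.
  import psycopg2

  conn = psycopg2.connect(
      host="localhost",        # the cloned database runs on this clone DB VM
      dbname="finance",
      user="postgres",
      password="changeme",
  )

  with conn, conn.cursor() as cur:
      # Replace real account numbers and salary data with dummy values.
      cur.execute("UPDATE accounts SET account_number = 'XXXX-' || id::text")
      cur.execute("UPDATE payroll SET salary = 0, bank_iban = 'MASKED'")

  conn.close()
  print("Masking complete")

Referencing a stored script by its full path in the post-clone field, rather than pasting raw commands or using the pre-clone field, is what makes option B the correct workflow.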


Reference:

[1] Nutanix Database Automation (NCP-DB) course, Module 4: Database Cloning, Lesson 4.4: Pre-Post Commands, slide 5
[2] Nutanix Database Automation (NCP-DB) course, Module 4: Database Cloning, Lesson 4.4: Pre-Post Commands, slide 7





