Free AZ-305 Exam Braindumps (page: 8)


What should you include in the identity management strategy to support the planned changes?

  A. Deploy domain controllers for corp.fabrikam.com to virtual networks in Azure.
  B. Move all the domain controllers from corp.fabrikam.com to virtual networks in Azure.
  C. Deploy a new Azure AD tenant for the authentication of new R&D projects.
  D. Deploy domain controllers for the rd.fabrikam.com forest to virtual networks in Azure.

Answer(s): A

Explanation:

Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure and the on-premises network. (This requires domain controllers in Azure.)
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails. (This requires domain controllers on-premises.)
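The recommended option amounts to running additional corp.fabrikam.com domain controllers on Azure VMs inside a virtual network that has connectivity back to the on-premises network. A minimal Azure CLI sketch of that deployment follows; all names (resource group, VNet, VM, address ranges) are hypothetical placeholders, and the site-to-site VPN or ExpressRoute connection to on-premises is assumed to exist already.

```shell
# Hypothetical names throughout; adjust to your environment.
# 1. Create a virtual network (assumed to be connected to on-premises
#    via VPN gateway or ExpressRoute) with a subnet for domain controllers.
az network vnet create \
  --resource-group rg-identity \
  --name vnet-hybrid \
  --address-prefix 10.10.0.0/16 \
  --subnet-name snet-dc \
  --subnet-prefixes 10.10.1.0/24

# 2. Deploy a Windows Server VM that will be promoted to an additional
#    domain controller for corp.fabrikam.com.
az vm create \
  --resource-group rg-identity \
  --name vm-dc01 \
  --image Win2019Datacenter \
  --vnet-name vnet-hybrid \
  --subnet snet-dc \
  --admin-username azureadmin

# 3. Inside the VM, install AD DS and promote it (PowerShell):
#    Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
#    Install-ADDSDomainController -DomainName "corp.fabrikam.com" -InstallDns
```

With a domain controller in the VNet, directory synchronization can continue against a local replica even if the link to on-premises fails, while the on-premises domain controllers keep serving local authentication if the Internet link fails.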



You need to recommend a notification solution for the IT Support distribution group.
What should you include in the recommendation?

  A. a SendGrid account with advanced reporting
  B. an action group
  C. Azure Network Watcher
  D. Azure AD Connect Health

Answer(s): D

Explanation:

An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Note: You can configure the Azure AD Connect Health service to send email notifications when alerts indicate that your identity infrastructure is not healthy. This occurs when an alert is generated and again when it is resolved.


Reference:

https://docs.microsoft.com/en-us/azure/active-directory/hybrid/how-to-connect-health-operations



HOTSPOT (Drag and Drop is not supported)
You plan to migrate App1 to Azure.
You need to recommend a storage solution for App1 that meets the security and compliance requirements.
Which type of storage should you recommend, and how should you recommend configuring the storage? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

  A. See Explanation section for answer.

Answer(s): A

Explanation:


Box 1: Standard general-purpose v2
Standard general-purpose v2 supports Blob Storage.
Azure Storage provides data protection for Blob Storage and Azure Data Lake Storage Gen2.
Scenario:
Litware identifies the following security and compliance requirements:
- Once App1 is migrated to Azure, new data must be writable to the app, and the modification of new and existing data must be prevented for a period of three years.
- On-premises users and services must be able to access the Azure Storage account that will host the data in App1.
- Access to the public endpoint of the Azure Storage account that will host the App1 data must be prevented.
- All Azure SQL databases in the production environment must have Transparent Data Encryption (TDE) enabled.
- App1 must NOT share physical hardware with other workloads.
Box 2: Hierarchical namespace
Scenario: Litware plans to migrate App1 to Azure virtual machines.
Azure Data Lake Storage Gen2 implements an access control model that supports both Azure role-based access control (Azure RBAC) and POSIX-like access control lists (ACLs).
Data Lake Storage Gen2 and the Network File System (NFS) 3.0 protocol both require a storage account with a hierarchical namespace enabled.
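The two answer boxes together describe a Standard general-purpose v2 account with a hierarchical namespace, plus a time-based immutability policy for the three-year retention requirement. A hedged Azure CLI sketch of that configuration is below; the account, resource group, and container names are hypothetical, and the exact flag set may vary with CLI version.

```shell
# Hypothetical names; adjust to your environment.
# Standard general-purpose v2 account with hierarchical namespace enabled
# (Data Lake Storage Gen2); public endpoint access disabled per the scenario.
az storage account create \
  --name stapp1data \
  --resource-group rg-app1 \
  --sku Standard_LRS \
  --kind StorageV2 \
  --enable-hierarchical-namespace true \
  --public-network-access Disabled

# Time-based retention on the container holding App1 data: new blobs can be
# written, but modifying or deleting data is blocked for 3 years (1095 days)
# once the policy is locked.
az storage container immutability-policy create \
  --resource-group rg-app1 \
  --account-name stapp1data \
  --container-name app1data \
  --period 1095
```

Note that the immutability policy must be locked (not merely set) for the retention interval to become non-removable, and private access would be provided through a private endpoint rather than the disabled public endpoint.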


Reference:

https://docs.microsoft.com/en-us/azure/storage/blobs/data-protection-overview
https://docs.microsoft.com/en-us/azure/storage/blobs/immutable-storage-overview



You have an Azure subscription that contains an Azure Blob Storage account named store1.
You have an on-premises file server named Server1 that runs Windows Server 2016. Server1 stores 500 GB of company files.
You need to store a copy of the company files from Server1 in store1.
Which two possible Azure services achieve this goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

  A. an Azure Logic Apps integration account
  B. an Azure Import/Export job
  C. Azure Data Factory
  D. an Azure Analysis Services On-premises data gateway
  E. an Azure Batch account

Answer(s): B,C

Explanation:

B: You can use the Azure Import/Export service to securely import large amounts of data into Azure Blob storage. You prepare disk drives containing your data and ship them to an Azure datacenter; the service copies the data into your storage account and then ships the drives back.
C: Big data requires a service that can orchestrate and operationalize processes to refine these enormous stores of raw data into actionable business insights.
Azure Data Factory is a managed cloud service that's built for these complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects.
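For context on what either solution accomplishes: the end state is simply the Server1 files copied into a container in store1. Outside the listed options, the same one-time copy is often sketched with AzCopy, shown here as an illustration of the target state; the local path, container name, and SAS token placeholder are all hypothetical.

```shell
# Hypothetical local path and container; a valid SAS token for store1
# must be substituted for <SAS-token>.
azcopy copy "D:\CompanyFiles" \
  "https://store1.blob.core.windows.net/backup?<SAS-token>" \
  --recursive
```

Import/Export suits this scenario when the 500 GB cannot be pushed over the network in a reasonable time; Data Factory suits it when a repeatable, orchestrated copy pipeline (via a self-hosted integration runtime on-premises) is preferred.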


Reference:

https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-data-from-blobs
https://docs.microsoft.com/en-us/azure/data-factory/introduction





