Microsoft Azure DP-201 exam exercise questions, DP-201 dumps easy to prepare for passing exams


Posted On Oct 8 2019 by

“Designing an Azure Data Solution” is Exam DP-201. Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet data requirements, and who design data solutions that use Azure data services.

Azure data engineers are responsible for data-related tasks that include designing Azure data storage solutions that use relational and non-relational data stores, batch and real-time data processing solutions, and data security and compliance solutions.

Here you can get the latest DP-201 exam exercise questions and answers for free and easily improve your skills!

DP-201 exam

Candidates for this exam must design data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure SQL Data Warehouse, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.

Follow the link for more information about the DP-201 exam: https://www.leads4pass.com/dp-201.html

Watch the Microsoft Azure DP-201 video tutorial online

Table of Contents:

Latest Microsoft Azure DP-201 pdf

[PDF] Free Microsoft Azure DP-201 pdf dumps download from Google Drive: https://drive.google.com/open?id=1yqED9d-JympHFeYcm3BWSkK24lR2ggOR

Microsoft Certified: Azure Data Engineer Associate: https://www.microsoft.com/en-us/learning/azure-data-engineer.aspx

Skills measured

  • Implement data storage solutions
  • Manage and develop data processing
  • Monitor and optimize data solutions
  • Design Azure data storage solutions
  • Design data processing solutions
  • Design for data security and compliance

Free Microsoft Azure DP-201 Exam Practice Questions

QUESTION 1
A company has locations in North America and Europe. The company uses Azure SQL Database to support business
apps.
Employees must be able to access the app data in case of a region-wide outage. A multi-region availability solution is
needed with the following requirements:
  • Read-access to data in a secondary region must be available only in case of an outage of the primary region.
  • The Azure SQL Database compute and storage layers must be integrated and replicated together.
You need to design the multi-region high availability solution.
What should you recommend? To answer, select the appropriate values in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
[Exhibit: lead4pass dp-201 exam question q1]

Correct Answer:

[Exhibit: lead4pass dp-201 exam question q1-1]

Box 1: Standard
The following table describes the types of storage accounts and their capabilities:

[Exhibit: lead4pass dp-201 exam question q1-2]

Box 2: Geo-redundant storage
If your storage account has GRS enabled, then your data is durable even in the case of a complete regional outage or a disaster in which the primary region isn’t recoverable.
Note: If you opt for GRS, you have two related options to choose from:
  • GRS replicates your data to another data center in a secondary region, but that data is available to be read only if Microsoft initiates a failover from the primary to the secondary region.
  • Read-access geo-redundant storage (RA-GRS) is based on GRS. RA-GRS replicates your data to another data center in a secondary region, and also provides you with the option to read from the secondary region. With RA-GRS, you can read from the secondary region regardless of whether Microsoft initiates a failover from the primary to the secondary region.

[Exhibit: lead4pass dp-201 exam question q1-3]

References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-introduction
https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs
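
To make the RA-GRS behavior concrete, here is a minimal Python sketch, assuming the azure-storage-blob and azure-identity SDKs, that falls back to the read-only secondary endpoint (the documented `<account>-secondary` host suffix) when the primary region is unreachable. The account, container, and blob names are hypothetical.

```python
# Minimal sketch (assumed azure-storage-blob SDK): read a blob from the
# RA-GRS secondary endpoint when the primary region is unavailable.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

ACCOUNT = "mystorageaccount"  # hypothetical account name
PRIMARY_URL = f"https://{ACCOUNT}.blob.core.windows.net"
# RA-GRS exposes a read-only secondary endpoint with the "-secondary" suffix.
SECONDARY_URL = f"https://{ACCOUNT}-secondary.blob.core.windows.net"

credential = DefaultAzureCredential()

def read_blob(container: str, blob: str) -> bytes:
    """Try the primary endpoint first; fall back to the RA-GRS secondary."""
    for endpoint in (PRIMARY_URL, SECONDARY_URL):
        try:
            client = BlobServiceClient(account_url=endpoint, credential=credential)
            blob_client = client.get_blob_client(container=container, blob=blob)
            return blob_client.download_blob().readall()
        except Exception:
            continue  # primary unreachable: retry against the secondary
    raise RuntimeError("Blob unavailable from both primary and secondary endpoints")

# data = read_blob("telemetry", "2019/10/08/readings.json")
```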

 

QUESTION 2
You are designing an Azure SQL Data Warehouse. You plan to load millions of rows of data into the data warehouse
each day.
You must ensure that staging tables are optimized for data loading.
You need to design the staging tables.
What type of tables should you recommend?
A. Round-robin distributed table
B. Hash-distributed table
C. Replicated table
D. External table
Correct Answer: A
To achieve the fastest loading speed for moving data into a data warehouse table, load data into a staging table. Define
the staging table as a heap and use round-robin for the distribution option.
Incorrect:
Not B: Consider that loading is usually a two-step process in which you first load to a staging table and then insert the
data into a production data warehouse table. If the production table uses a hash distribution, the total time to load and
insert might be faster if you define the staging table with the hash distribution. Loading to the staging table takes longer,
but the second step of inserting the rows to the production table does not incur data movement across the distributions.
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/guidance-for-loading-data
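
As an illustration of the recommended design, below is a minimal sketch, assuming pyodbc and the SQL Data Warehouse T-SQL surface, of a staging table defined as a heap with round-robin distribution. The table, column, and connection-string values are hypothetical.

```python
# Minimal sketch (assumed pyodbc driver): create a round-robin heap staging
# table in Azure SQL Data Warehouse for fast loads.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydw;"
    "UID=loader;PWD=<password>"
)

CREATE_STAGING_TABLE = """
CREATE TABLE stg.SaleStaging
(
    SaleId      BIGINT          NOT NULL,
    StoreId     INT             NOT NULL,
    SaleDate    DATE            NOT NULL,
    Amount      DECIMAL(18, 2)  NOT NULL
)
WITH
(
    HEAP,                           -- no clustered index: fastest target for loads
    DISTRIBUTION = ROUND_ROBIN      -- spreads incoming rows evenly, no hash cost
);
"""

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    conn.cursor().execute(CREATE_STAGING_TABLE)
```

After the load, rows would typically be inserted from this staging heap into the hash-distributed production table, as described in the explanation above.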

 

QUESTION 3
A company is designing a solution that uses Azure Databricks.
The solution must be resilient to regional Azure datacenter outages.
You need to recommend the redundancy type for the solution.
What should you recommend?
A. Read-access geo-redundant storage
B. Locally-redundant storage
C. Geo-redundant storage
D. Zone-redundant storage
Correct Answer: C
If your storage account has GRS enabled, then your data is durable even in the case of a complete regional outage or a disaster in which the primary region isn’t recoverable.
References: https://medium.com/microsoftazure/data-durability-fault-tolerance-resilience-in-azure-databricks-95392982bac7

 

QUESTION 4
You plan to use Azure SQL Database to support a line of business app.
You need to identify sensitive data that is stored in the database and monitor access to the data.
Which three actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Enable Data Discovery and Classification.
B. Implement Transparent Data Encryption (TDE).
C. Enable Auditing.
D. Run Vulnerability Assessment.
E. Use Advanced Threat Protection.
Correct Answer: CDE

 

QUESTION 5
You need to design the runtime environment for the Real Time Response system.
What should you recommend?
A. General Purpose nodes without the Enterprise Security package
B. Memory Optimized Nodes without the Enterprise Security package
C. Memory Optimized nodes with the Enterprise Security package
D. General Purpose nodes with the Enterprise Security package
Correct Answer: B
Scenario: You must maximize the performance of the Real Time Response system.

 

QUESTION 6
You need to design a sharding strategy for the Planning Assistance database.
What should you recommend?
A. a list mapping shard map on the binary representation of the License Plate column
B. a range mapping shard map on the binary representation of the speed column
C. a list mapping shard map on the location column
D. a range mapping shard map on the time column
Correct Answer: A
Data used for Planning Assistance must be stored in a sharded Azure SQL Database.
A shard typically contains items that fall within a specified range determined by one or more attributes of the data.
These attributes form the shard key (sometimes referred to as the partition key). The shard key should be static. It shouldn’t be based on data that might change.
References: https://docs.microsoft.com/en-us/azure/architecture/patterns/sharding
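
The shard-map management API itself is the .NET Elastic Database client library, so the following is only a conceptual Python sketch of the distinction the answer relies on: a list mapping ties individual shard-key values to shards, while a range mapping ties contiguous key ranges to shards. All keys and shard names are hypothetical.

```python
# Conceptual sketch only: how list vs. range shard maps resolve a shard key.
# (The real shard-map management API is the .NET Elastic Database client library.)
from bisect import bisect_right

# List mapping: each distinct key value points at a specific shard.
LIST_SHARD_MAP = {
    0xA1B2: "shard-eu-01",   # hypothetical binary license-plate key -> shard
    0xA1B3: "shard-eu-02",
    0xC4D5: "shard-us-01",
}

# Range mapping: contiguous key ranges point at shards.
RANGE_BOUNDARIES = [0x0000, 0x8000, 0xC000]          # lower bounds, sorted
RANGE_SHARDS     = ["shard-01", "shard-02", "shard-03"]

def resolve_list(key: int) -> str:
    return LIST_SHARD_MAP[key]                        # exact-value lookup

def resolve_range(key: int) -> str:
    return RANGE_SHARDS[bisect_right(RANGE_BOUNDARIES, key) - 1]

print(resolve_list(0xA1B3))    # shard-eu-02
print(resolve_range(0x9F00))   # shard-02
```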

 

QUESTION 7
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains
a unique solution that might meet the stated goals. Some question sets might have more than one correct solution,
while
others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not
appear in the review screen.
You are designing an Azure SQL Database that will use elastic pools. You plan to store data about customers in a table.
Each record uses a value for CustomerID.
You need to recommend a strategy to partition data based on values in CustomerID.
Proposed Solution: Separate data into customer regions by using vertical partitioning.
Does the solution meet the goal?
A. Yes
B. No
Correct Answer: B
Vertical partitioning is used for cross-database queries. Instead, we should use horizontal partitioning, which is also called sharding.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-elastic-query-overview

 

QUESTION 8
You need to design the encryption strategy for the tagging data and customer data.
What should you recommend? To answer, drag the appropriate setting to the correct drop targets. Each source may be
used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
[Exhibit: lead4pass dp-201 exam question q8]

All cloud data must be encrypted at rest and in transit.
Box 1: Transparent data encryption
Encryption of the database file is performed at the page level. The pages in an encrypted database are encrypted before they are written to disk and decrypted when read into memory.
Box 2: Encryption at rest
Encryption at rest is the encoding (encryption) of data when it is persisted.
References: https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/transparent-data-encryption?view=sql-server-2017 https://docs.microsoft.com/en-us/azure/security/azure-security-encryption-atrest
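
To illustrate Box 1, here is a minimal sketch, assuming pyodbc, that enables Transparent Data Encryption with T-SQL and checks the encryption state. The database name and connection string are hypothetical; on Azure SQL Database, service-managed TDE is typically on by default for new databases.

```python
# Minimal sketch (assumed pyodbc driver): enable Transparent Data Encryption
# on a database and verify its state. Names and connection string are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=master;"
    "UID=dbadmin;PWD=<password>"
)

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    cursor = conn.cursor()
    # Turn on page-level encryption at rest for the target database.
    cursor.execute("ALTER DATABASE [TaggingData] SET ENCRYPTION ON;")
    # encryption_state 3 means the database is encrypted.
    cursor.execute(
        "SELECT db_name(database_id), encryption_state "
        "FROM sys.dm_database_encryption_keys;"
    )
    for name, state in cursor.fetchall():
        print(name, state)
```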

 

QUESTION 9
You plan to deploy an Azure SQL Database instance to support an application. You plan to use the DTU-based
purchasing model.
Backups of the database must be available for 30 days and point-in-time restoration must be possible.
You need to recommend a backup and recovery policy.
What are two possible ways to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Use the Premium tier and the default backup retention policy.
B. Use the Basic tier and the default backup retention policy.
C. Use the Standard tier and the default backup retention policy.
D. Use the Standard tier and configure a long-term backup retention policy.
E. Use the Premium tier and configure a long-term backup retention policy.
Correct Answer: DE
The default retention period for a database created using the DTU-based purchasing model depends on the service tier:
  • Basic service tier: 1 week.
  • Standard service tier: 5 weeks.
  • Premium service tier: 5 weeks.
Incorrect Answers:
B: Basic tier only allows restore points within 7 days.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-long-term-retention

 

QUESTION 10
You are designing an Azure Databricks cluster that runs user-defined local processes. You need to recommend a cluster configuration that meets the following requirements:
  • Minimize query latency.
  • Reduce overall costs.
  • Maximize the number of users that can run queries on the cluster at the same time.
Which cluster type should you recommend?
A. Standard with Autoscaling
B. High Concurrency with Auto Termination
C. High Concurrency with Autoscaling
D. Standard with Auto Termination
Correct Answer: A

 

QUESTION 11
A company is developing a mission-critical line of business app that uses Azure SQL Database Managed Instance.
You must design a disaster recovery strategy for the solution.
You need to ensure that the database automatically recovers when full or partial loss of the Azure SQL Database
service occurs in the primary region.
What should you recommend?
A. Failover-group
B. Azure SQL Data Sync
C. SQL Replication
D. Active geo-replication
Correct Answer: A
Auto-failover groups is a SQL Database feature that allows you to manage replication and failover of a group of
databases on a SQL Database server or all databases in a Managed Instance to another region (currently in public
preview for Managed Instance). It uses the same underlying technology as active geo-replication. You can initiate
failover manually or you can delegate it to the SQL Database service based on a user-defined policy.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-auto-failover-group

 

QUESTION 12
You need to recommend an Azure SQL Database service tier.
What should you recommend?
A. Business Critical
B. General Purpose
C. Premium
D. Standard
E. Basic
Correct Answer: C
The data engineers must set the SQL Data Warehouse compute resources to consume 300 DWUs.
Note: There are three architectural models that are used in Azure SQL Database: General Purpose/Standard, Business Critical/Premium, and Hyperscale.
Incorrect Answers:
A: The Business Critical service tier is designed for applications that require low-latency responses from the underlying SSD storage (1-2 ms on average), fast recovery if the underlying infrastructure fails, or a need to off-load reports, analytics, and read-only queries to the free-of-charge readable secondary replica of the primary database.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-business-critical

 

QUESTION 13
You need to design the authentication and authorization methods for sensors.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
[Exhibit: lead4pass dp-201 exam question q13]

Correct Answer:

[Exhibit: lead4pass dp-201 exam question q13-1]

Sensor data must be stored in a Cosmos DB database named treydata, in a collection named SensorData. Sensors must have permission only to add items to the SensorData collection.
Box 1: Resource token
Resource tokens provide access to the application resources within a Cosmos DB database. They enable clients to read, write, and delete resources in the Cosmos DB account according to the permissions they’ve been granted.
Box 2: Cosmos DB user
You can use a resource token (by creating Cosmos DB users and permissions) when you want to provide access to resources in your Cosmos DB account to a client that cannot be trusted with the master key.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/secure-access-to-data
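
A minimal sketch of this pattern, assuming the azure-cosmos (v4) Python SDK: a trusted mid-tier service creates a Cosmos DB user, grants it a permission scoped to the SensorData container, and hands the resulting resource token to the sensor so the master key is never exposed. The endpoint, user id, and permission id are hypothetical.

```python
# Minimal sketch (assumed azure-cosmos v4 Python SDK): issue a resource token
# scoped to the SensorData container for an untrusted sensor client.
# Endpoint, user id, and permission id are hypothetical.
from azure.cosmos import CosmosClient

ENDPOINT = "https://treyresearch.documents.azure.com:443/"
MASTER_KEY = "<master-key>"   # held only by the trusted mid-tier service

client = CosmosClient(ENDPOINT, credential=MASTER_KEY)
database = client.get_database_client("treydata")
container = database.get_container_client("SensorData")

# 1. Create a Cosmos DB user to represent the sensor.
user = database.create_user({"id": "sensor-device-001"})

# 2. Grant that user a permission on the SensorData container.
#    Permission modes are "Read" or "All"; adding items requires "All".
permission = user.create_permission({
    "id": "sensordata-write",
    "permissionMode": "All",
    "resource": container.container_link,
})

# 3. The resource token comes back with the permission ("_token" property);
#    the sensor uses it as its credential and never sees the master key.
resource_token = permission.properties["_token"]
print(resource_token)
```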

Related DP-201 Popular Exam resources

  • Exam DP-201: Designing an Azure Data Solution – Microsoft — lead4pass DP-201 dumps pdf, lead4pass DP-201 youtube, https://www.leads4pass.com/DP-201.html, 74 Q&A
  • Exam DP-200: Implementing an Azure Data Solution – Microsoft — lead4pass DP-200 dumps pdf, lead4pass DP-200 youtube, https://www.leads4pass.com/DP-200.html, 86 Q&A
  • Exam MS-301: Deploying SharePoint Server Hybrid (beta) — lead4pass ms-301 dumps pdf, lead4pass ms-301 youtube, https://www.leads4pass.com/ms-301.html, 63 Q&A

Get Lead4Pass Coupons (12% OFF)


What are the advantages of Lead4pass?

Lead4pass employs the most authoritative exam specialists from Microsoft, Cisco, IBM, CompTIA, etc. We update exam data throughout the year. Highest pass rate! We have a large user base. We are an industry leader! Choose Lead4Pass to pass the exam with ease!


Summarize:

It’s not easy to pass the Microsoft exam, but with accurate learning materials and proper practice, you can crack the exam with excellent results. Lead4pass provides you with the most relevant learning materials that you can use to help you prepare.

Last Updated on: October 8th, 2019 at 9:18 am, by admin


Written by admin