You are monitoring the Data Factory pipeline that runs from Cosmos DB to SQL Database for Race Central.

You discover that the job takes 45 minutes to run.

What should you do to improve the performance of the job?
A. Decrease parallelism for the copy activities.
B. Increase the data integration units.
C. Configure the copy activities to use staged copy.
D. Configure the copy activities to perform compression.

Answer: B

Explanation:

Performance tuning tips and optimization features: in some cases, when you run a copy activity in Azure Data Factory, a "Performance tuning tips" message appears at the top of the copy activity monitoring view. The message identifies the bottleneck for the given copy run and guides you on what to change to boost copy throughput. The performance tuning tips currently provide suggestions such as:

Use PolyBase when you copy data into Azure SQL Data Warehouse.

Increase Azure Cosmos DB Request Units or Azure SQL Database DTUs (Database Throughput Units) when the resource on the data store side is the bottleneck.

Remove the unnecessary staged copy.
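To illustrate where the data integration units (DIUs) setting lives, the sketch below builds a copy activity definition as a Python dictionary. The activity, dataset, source, and sink names are hypothetical placeholders, and the DIU and parallel-copy values are illustrative only; the actual pipeline for Race Central would differ.

```python
import json

# Minimal sketch of a copy activity definition (hypothetical names).
# "dataIntegrationUnits" under typeProperties controls how much compute
# Data Factory applies to the copy run; raising it can shorten a
# long-running Cosmos DB -> SQL Database copy.
copy_activity = {
    "name": "CopyCosmosToSql",  # hypothetical activity name
    "type": "Copy",
    "inputs": [{"referenceName": "CosmosDbDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SqlDbDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "CosmosDbSqlApiSource"},
        "sink": {"type": "AzureSqlSink"},
        "dataIntegrationUnits": 32,  # raise DIUs to boost copy throughput
        "parallelCopies": 8          # parallelism can be tuned separately
    }
}

print(json.dumps(copy_activity, indent=2))
```

Note that increasing DIUs only helps if the copy service itself is the bottleneck; if the Cosmos DB source or SQL Database sink is throttling, the tuning tips instead recommend scaling up Request Units or DTUs, as described above.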

References: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance
