What should you include in the Data Factory pipeline for Race Central?
A. a copy activity that uses a stored procedure as a source
B. a copy activity that contains schema mappings
C. a delete activity that has logging enabled
D. a filter activity that has a condition

Answer: B

Explanation:

Scenario:

An Azure Data Factory pipeline must be used to move data from Cosmos DB to SQL Database for Race Central. If the data load takes longer than 20 minutes, configuration changes must be made to Data Factory.

The telemetry data is sent to a MongoDB database. A custom application then moves the data to databases in SQL Server 2017. The telemetry data in MongoDB has more than 500 attributes. The application changes the attribute names when the data is moved to SQL Server 2017.

You can copy data to or from Azure Cosmos DB (SQL API) by using an Azure Data Factory pipeline.

Column mapping applies when copying data from a source to a sink. By default, the copy activity maps source data to the sink by column name. You can specify an explicit mapping to customize the column mapping as needed. More specifically, the copy activity:

✑ Read the data from the source and determine the source schema

✑ Use default column mapping to map columns by name, or apply explicit column mapping if specified.

✑ Write the data to the sink
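The steps above can be expressed in the copy activity's `translator` section. The sketch below is a minimal, hypothetical example of an explicit schema mapping that renames Cosmos DB attributes to SQL column names, as the scenario requires; the dataset references, attribute paths, and column names are illustrative assumptions, not taken from the case study:

```json
{
  "name": "CopyTelemetryToSql",
  "type": "Copy",
  "inputs": [ { "referenceName": "CosmosDbTelemetryDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "RaceCentralSqlDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "CosmosDbSqlApiSource" },
    "sink": { "type": "SqlSink" },
    "translator": {
      "type": "TabularTranslator",
      "mappings": [
        { "source": { "path": "$.vehicleId" }, "sink": { "name": "VehicleId" } },
        { "source": { "path": "$.telemetry.speed" }, "sink": { "name": "SpeedKph" } }
      ]
    }
  }
}
```

Each `mappings` entry pairs a source attribute (a JSON path in the Cosmos DB document) with a sink column name, which is how the attribute renaming previously done by the custom application can be handled inside the pipeline itself.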

References: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping
