Microsoft DP-500 Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Online Training

To help you pass the DP-500 Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI exam, you can prepare for the Microsoft DP-500 exam with less effort. You will gain both a basic and an advanced understanding of the concepts covered by the Microsoft DP-500 certification. This Microsoft DP-500 Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Online Training is designed for your convenience, and you can rely on it without hesitation. With the help of the Exam4Training Microsoft DP-500 Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Online Training for the Microsoft Certified: Azure Enterprise Data Analyst Associate DP-500 exam, you will be able to pass the exam on your first attempt with top marks.


1. What should you configure in the deployment pipeline?

2. HOTSPOT

You need to populate the CustomersWithProductScore table.

How should you complete the stored procedure? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
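The answer options are not reproduced here, but this question centers on scoring rows with a machine learning model from Transact-SQL. The sketch below is only a hypothetical illustration of the PREDICT pattern in a Synapse dedicated SQL pool; the procedure name, model table, input table, and column names are assumptions, not the graded answer.

-- Hypothetical sketch: scoring customers with the T-SQL PREDICT function
-- and an ONNX model stored in a table (all object names are assumptions).
CREATE PROCEDURE dbo.usp_PopulateCustomersWithProductScore
AS
BEGIN
    INSERT INTO dbo.CustomersWithProductScore (CustomerID, ProductScore)
    SELECT d.CustomerID, p.Score
    FROM PREDICT (
             MODEL = (SELECT Model FROM dbo.Models WHERE ModelName = 'ProductScoreModel'),
             DATA = dbo.Customers AS d,
             RUNTIME = ONNX
         ) WITH (Score FLOAT) AS p;
END;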



3. DRAG DROP

You need to create the customized Power BI usage reporting. The Usage Metrics Report dataset has already been created. The solution must minimize development and administrative effort.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.



4. Topic 2, Contoso, Ltd



Overview

Contoso, Ltd. is a company that sells enriched financial data to a variety of external customers.

Contoso has a main office in Los Angeles and two branch offices in New York and Seattle.



Data Infrastructure

Contoso has a 50-TB data warehouse that uses an instance of SQL Server on Azure Virtual Machines.

The data warehouse populates an Azure Synapse Analytics workspace that is accessed by the external customers. Currently, the customers can access all the data.

Contoso has one Power BI workspace named FinData that contains a single dataset. The dataset contains financial data from around the world. The workspace is used by 10 internal users and one external customer. The dataset has the following two data sources: the data warehouse and the Synapse Analytics serverless SQL pool.

Users frequently query the Synapse Analytics workspace by using Transact-SQL.
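For context, a typical ad hoc Transact-SQL query against a serverless SQL pool reads files directly from the data lake with OPENROWSET. The example below is only illustrative; the storage account, container, and folder layout are assumptions.

-- Illustrative serverless SQL pool query; the storage URL and file path are assumptions.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://contosodatalake.dfs.core.windows.net/findata/finance/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;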



User Problems

Contoso identifies the following user issues:

• Some users indicate that the visuals in Power BI reports are slow to render when making filter selections.

• Users indicate that queries against the serverless SQL pool fail occasionally because the size of tempdb has been exceeded.

• Users indicate that the data in Power BI reports is stale. You discover that the refresh process of the Power BI model occasionally times out.



Planned Changes

Contoso plans to implement the following changes:

• Into the existing Power BI dataset, integrate an external data source that is accessible by using the REST API.

• Build a new dataset in the FinData workspace by using data from the Synapse Analytics dedicated SQL pool.

• Provide all the customers with their own Power BI workspace to create their own reports. Each workspace will use the new dataset in the FinData workspace.

• Implement subscription levels for the customers. Each subscription level will provide access to specific rows of financial data.

• Deploy prebuilt datasets to Power BI to simplify the query experience of the customers.

• Provide internal users with the ability to incorporate machine learning models loaded to the dedicated SQL pool.



You need to recommend a solution to add new fields to the financial data Power BI dataset with data from the Microsoft SQL Server data warehouse.

What should you include in the recommendation?

5. You need to recommend a solution for the customer workspaces to support the planned changes.

Which two configurations should you include in the recommendation? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

6. DRAG DROP

You need to integrate the external data source to support the planned changes.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.



7. DRAG DROP

You need to create Power BI reports that will display data based on the customers' subscription level.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
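The answer options are not shown here, but restricting customers to specific rows by subscription level is typically handled with row-level security and a dynamic DAX filter on a security role. The filter below is only a hedged sketch of that pattern; the Financials and CustomerSubscriptions tables and their columns are assumptions, not the graded answer.

// Hypothetical row-level security filter (DAX) for a customer-facing role
// (table and column names are assumptions).
Financials[SubscriptionLevel]
    = LOOKUPVALUE (
          CustomerSubscriptions[SubscriptionLevel],
          CustomerSubscriptions[UserEmail], USERPRINCIPALNAME ()
      )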



8. Topic 3, Misc. Questions



You develop a solution that uses a Power BI Premium capacity. The capacity contains a dataset that is expected to consume 50 GB of memory.

Which two actions should you perform to ensure that you can publish the model successfully to the Power BI service? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

9. DRAG DROP

You have a Power BI dataset that contains the following measures:

• Budget

• Actuals

• Forecast

You create a report that contains 10 visuals.

You need to provide users with the ability to use a slicer to switch between the measures in two visuals only.

You create a dedicated measure named cg Measure switch.

How should you complete the DAX expression for the Actuals measure? To answer, drag the appropriate values to the targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.
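The drag-and-drop values are not reproduced here, but this scenario typically resolves to a SWITCH over SELECTEDVALUE of the dedicated slicer table. The measure below is only a hedged sketch of that pattern; the column name in the cg Measure switch table and the measure name are assumptions, not the graded answer.

// Hypothetical measure-switch sketch using SELECTEDVALUE and SWITCH
// (the slicer column name is an assumption).
Actuals Display =
VAR SelectedOption = SELECTEDVALUE ( 'cg Measure switch'[Measure] )
RETURN
    SWITCH (
        SelectedOption,
        "Budget", [Budget],
        "Forecast", [Forecast],
        [Actuals]
    )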



10. You have a Power BI workspace named Workspace1 in a Premium capacity. Workspace1 contains a dataset.

During a scheduled refresh, you receive the following error message: "Unable to save the changes since the new dataset size of 11,354 MB exceeds the limit of 10,240 MB."

You need to ensure that you can refresh the dataset.

What should you do?