Microsoft DP-500 Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Online Training
The questions for DP-500 were last updated on May 13, 2024.
- Exam Code: DP-500
- Exam Name: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI
- Certification Provider: Microsoft
- Latest update: May 13, 2024
DRAG DROP
You need to ensure that the new process for deploying reports and datasets to the User Experience workspace meets the technical requirements.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
The enterprise analytics team needs to resolve the DAX measure performance issues.
What should the team do first?
- A . Use Performance analyzer in Power BI Desktop to get the DAX durations.
- B . Use DAX Studio to get detailed statistics on the server timings.
- C . Use DAX Studio to review the Vertipaq Analyzer metrics.
- D . Use Tabular Editor to create calculation groups.
You need to optimize the workflow for the creation of reports and the adjustment of tables by the enterprise analytics team.
What should you do?
- A . Add a tenant-level storage connection to Power BI
- B . Create a linked service in workspace1.
- C . Create an integration runtime in workspace1.
- D . From the Tenant settings, enable Use global search for Power BI
You need to meet the technical requirements for deploying reports and datasets to the User Experience workspace.
What should you do?
- A . From the Corporate Data Models and User Experience workspaces, select Allow contributors to update the app.
- B . From the Corporate Data Models and User Experience workspaces, set License mode to Premium per user.
- C . From the Tenant settings, set Allow specific users to turn on external data sharing to Enable.
- D . From the Developer settings, set Allow service principals to use Power BI APIs to Enable.
You need to recommend changes to the Power BI tenant to meet the technical requirements for external data sharing.
Which tenant setting should you recommend disabling?
- A . Allow shareable links to grant access to everyone in your organization
- B . Allow Azure Active Directory guest users to edit and manage content in the organization
- C . Users can reassign personal workspaces
- D . Show Azure Active Directory guests in lists of suggested people
The group registers the Power BI tenant as a data source.
You need to ensure that all the analysts can view the assets in the Power BI tenant. The solution must meet the technical requirements for Microsoft Purview and Power BI.
What should you do?
- A . Create a scan.
- B . Deploy a Power BI gateway.
- C . Search the data catalog.
- D . Create a linked service.
Topic 4, Misc. Questions
You develop a solution that uses a Power BI Premium capacity. The capacity contains a dataset that is expected to consume 50 GB of memory.
Which two actions should you perform to ensure that you can publish the model successfully to the Power BI service? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
- A . Increase the Max Offline Dataset Size setting.
- B . Invoke a refresh to load historical data based on the incremental refresh policy.
- C . Restart the capacity.
- D . Publish an initial dataset that is less than 10 GB.
- E . Publish the complete dataset.
B, D
Explanation:
Enable large datasets
The steps below describe enabling large datasets for a new model published to the service. For existing datasets, only step 3 is necessary.
1. Create a model in Power BI Desktop. If your dataset will become larger and progressively consume more memory, be sure to configure incremental refresh.
2. Publish the model as a dataset to the service.
3. In the service > dataset > Settings, expand Large dataset storage format, set the slider to On, and then select Apply.
4. Invoke a refresh to load historical data based on the incremental refresh policy. The first refresh could take a while to load the history. Subsequent refreshes should be faster, depending on your incremental refresh policy.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models
DRAG DROP
You have a Power BI dataset that contains the following measures:
• Budget
• Actuals
• Forecast
You create a report that contains 10 visuals.
You need to provide users with the ability to use a slicer to switch between the measures in two visuals only.
You create a dedicated measure named cg Measure switch.
How should you complete the DAX expression for the Actuals measure? To answer, drag the appropriate values to the targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: SELECTEDMEASURENAME()
SELECTEDMEASURENAME is used by expressions for calculation items to determine the measure that is in context by name.
Syntax: SELECTEDMEASURENAME()
No parameters.
Example:
The following calculation item expression checks whether the current measure is Expense Ratio and conditionally applies calculation logic. Because the check is based on a string comparison, it is not subject to formula fixup and will not automatically reflect object renames. For a similar comparison that does benefit from formula fixup, see the ISSELECTEDMEASURE function instead.
IF (
    SELECTEDMEASURENAME () = "Expense Ratio",
    SELECTEDMEASURE (),
    DIVIDE ( SELECTEDMEASURE (), COUNTROWS ( DimDate ) )
)
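The text above points to ISSELECTEDMEASURE as the fixup-friendly alternative. As a sketch under the same assumptions (a calculation item in a calculation group, with an existing measure named Expense Ratio), the same logic written with ISSELECTEDMEASURE would be:

```
IF (
    -- Reference the measure directly, so renaming it is reflected automatically
    ISSELECTEDMEASURE ( [Expense Ratio] ),
    SELECTEDMEASURE (),
    DIVIDE ( SELECTEDMEASURE (), COUNTROWS ( DimDate ) )
)
```

Because the measure is referenced as [Expense Ratio] rather than compared as a string, renaming the measure in the model updates this expression through formula fixup.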
Box 2: SELECTEDVALUE()
SELECTEDVALUE returns the value when the context for columnName has been filtered down to one distinct value only. Otherwise returns alternateResult.
Syntax:
SELECTEDVALUE(<columnName>[, <alternateResult>])
columnName – The name of an existing column, using standard DAX syntax. It cannot be an expression.
alternateResult – (Optional) The value returned when the context for columnName has been filtered down to zero or more than one distinct value. When not provided, the default value is BLANK().
Reference:
https://docs.microsoft.com/en-us/dax/selectedmeasurename-function-dax
https://docs.microsoft.com/en-us/dax/selectedvalue-function
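As a sketch of the measure-switch pattern this question describes (the table and column names here are illustrative, not from the exam scenario): a disconnected table feeds the slicer, SELECTEDVALUE reads the user's choice, and the switching measure returns the corresponding base measure.

```
-- Assumes a hypothetical disconnected table MeasureSelection bound to the slicer,
-- with a Selection column containing the rows "Budget", "Actuals", "Forecast"
Measure Switch =
SWITCH (
    SELECTEDVALUE ( MeasureSelection[Selection], "Actuals" ),
    "Budget", [Budget],
    "Forecast", [Forecast],
    [Actuals]  -- default when "Actuals" is selected or nothing is selected
)
```

Using this measure in only the two target visuals (while the other eight visuals reference [Actuals], [Budget], or [Forecast] directly) limits the slicer's effect to those two visuals.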
You have a Power BI workspace named Workspace1 in a Premium capacity. Workspace1 contains a dataset.
During a scheduled refresh, you receive the following error message: "Unable to save the changes since the new dataset size of 11,354 MB exceeds the limit of 10,240 MB."
You need to ensure that you can refresh the dataset.
What should you do?
- A . Turn on Large dataset storage format.
- B . Connect Workspace1 to an Azure Data Lake Storage Gen2 account
- C . Change License mode to Premium per user.
- D . Change the location of the Premium capacity.
A
Explanation:
By default, datasets in a Premium capacity are limited to 10 GB in size. Turning on Large dataset storage format removes this limit, so dataset size is constrained only by the capacity size or a maximum size set by the administrator.
Note: Capacity limits
Workspace storage limits, whether for My Workspace or an app workspace, depend on whether the workspace is in shared or Premium capacity.
* Shared capacity limits
For workspaces in shared capacity:
There is a per-workspace storage limit of 10 GB.
Premium Per User (PPU) tenants have a 100 TB storage limit.
When using a Pro license, the total usage can’t exceed the tenant storage limit of 10 GB multiplied by the number of Pro licenses in the tenant.
* Premium capacity limits
For workspaces in Premium capacity:
There is a limit of 100 TB per Premium capacity.
There is no per-user storage limit.
Workspace storage usage is shown as 0 if the workspace is assigned to a Premium capacity.
Incorrect:
Not C: Workspace1 is already in a Premium capacity. Changing License mode to Premium per user does not by itself raise the 10-GB dataset size limit; that limit is lifted by the large dataset storage format.
Reference:
https://docs.microsoft.com/en-us/power-bi/enterprise/service-premium-capacity-manage-gen2
https://docs.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi
You have a dataset that contains a table named UserPermissions. UserPermissions contains the following data.
You plan to create a security role named User Security for the dataset. You need to filter the dataset based on the current users.
What should you include in the DAX expression?
- A . [UserPermissions] = USERNAME()
- B . [UserPermissions] = USERPRINCIPALNAME()
- C . [User] = USERPRINCIPALNAME()
- D . [User] = USERNAME()
- E . [User] = USEROBJECTID()
D
Explanation:
USERNAME() returns the domain name and username from the credentials given to the system at connection time.
It should be compared to the User column, which in DAX is referenced as [User].
Reference: https://docs.microsoft.com/en-us/dax/username-function-dax
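A minimal sketch of the role, assuming the UserPermissions table has a User column holding the identity values returned at connection time (the column name follows from answer D; everything else is illustrative): the table filter DAX expression applied to UserPermissions in the User Security role would be:

```
-- Row-level security table filter on UserPermissions:
-- keep only the rows belonging to the connecting user
[User] = USERNAME ()
```

Note that when the model is accessed through the Power BI service, USERNAME() and USERPRINCIPALNAME() both return the user's user principal name (for example, an email-style sign-in name), so the values stored in [User] must match that format for the filter to return any rows.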