Exam4Training

Microsoft AZ-204 Developing Solutions for Microsoft Azure Online Training

Question #1

Topic 1, Windows Server 2016 virtual machine

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Current environment

Windows Server 2016 virtual machine

This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:

– Ocean Transport – This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.

– Inland Transport – This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.

The VM supports the following REST API calls:

– Container API – This API provides container information including weight, contents, and other attributes.

– Location API – This API provides location information regarding shipping ports of call and trucking stops.

– Shipping REST API – This API provides shipping information for use and display on the shipping website.

Shipping Data

The application uses a MongoDB JSON document database for all container and transport information.

Shipping Web Site

The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

Proposed solution

The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server.

The Azure architecture diagram for the proposed solution is shown below:

Requirements

Shipping Logic app

The Shipping Logic app must meet the following requirements:

– Support the ocean transport and inland transport workflows by using a Logic App.

– Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.

– Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.

– Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

Shipping Function app

Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

REST APIs

The REST APIs that support the solution must meet the following requirements:

– Secure resources to the corporate VNet.

– Allow deployment to a testing location within Azure while not incurring additional costs.

– Automatically scale to double capacity during peak shipping times while not causing application downtime.

– Minimize costs when selecting an Azure payment model.

Shipping data

Data migration from on-premises to Azure must minimize costs and downtime.

Shipping website

Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

Issues

Windows Server 2016 VM

The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

Shipping website and REST APIs

The following error message displays while you are testing the website:

Failed to load http://test-shippingapi.wideworldimporters.com/: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘http://test.wideworldimporters.com/’ is therefore not allowed access.

HOTSPOT

You need to configure Azure CDN for the Shipping web site.

Which configuration options should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: Shipping website

Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

Tier: Standard

Profile: Akamai

Optimization: Dynamic site acceleration

Dynamic site acceleration (DSA) is available for Azure CDN Standard from Akamai, Azure CDN Standard from Verizon, and Azure CDN Premium from Verizon profiles.

DSA includes various techniques that benefit the latency and performance of dynamic content. Techniques include route and network optimization, TCP optimization, and more.

You can use this optimization to accelerate a web app that includes numerous responses that aren’t cacheable. Examples are search results, checkout transactions, or real-time data. You can continue to use core Azure CDN caching capabilities for static data.
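For illustration only, a matching Standard Akamai profile could be created with the Azure CLI; the resource group and profile names below are assumptions, and the DSA optimization itself is chosen when the endpoint is created:

# Create a Standard-tier CDN profile on the Akamai network
az cdn profile create \
  --resource-group shipping-rg \
  --name shipping-cdn \
  --sku Standard_Akamai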


Question #2

HOTSPOT

You need to secure the Shipping Function app.

How should you configure the app? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: Shipping Function app: Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

Box 1: Function

Box 2: JSON based Token (JWT)

Azure AD uses JSON-based tokens (JWTs) that contain claims.

Box 3: HTTP

How a web app delegates sign-in to Azure AD and obtains a token

User authentication happens via the browser. The OpenID protocol uses standard HTTP protocol messages.

References: https://docs.microsoft.com/en-us/azure/active-directory/develop/authentication-scenarios


Question #3

You need to secure the Shipping Logic App.

What should you use?

  • A . Azure App Service Environment (ASE)
  • B . Azure AD B2B integration
  • C . Integration Service Environment (ISE)
  • D . VNet service endpoint

Correct Answer: C

Explanation:

Scenario: The Shipping Logic App must secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.

You can access to Azure Virtual Network resources from Azure Logic Apps by using integration service environments (ISEs).

Sometimes, your logic apps and integration accounts need access to secured resources, such as virtual machines (VMs) and other systems or services, that are inside an Azure virtual network. To set up this access, you can create an integration service environment (ISE) where you can run your logic apps and create your integration accounts.

Reference: https://docs.microsoft.com/en-us/azure/logic-apps/connect-virtual-network-vnet-isolated-environment-overview

Question #4

DRAG DROP

You need to support the message processing for the ocean transport workflow.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

Step 1: Create an integration account in the Azure portal

You can define custom metadata for artifacts in integration accounts and get that metadata during runtime for your logic app to use. For example, you can provide metadata for artifacts such as partners, agreements, schemas, and maps; all of these store metadata as key-value pairs.

Step 2: Link the Logic App to the integration account

The logic app must be linked to the integration account that holds the artifact metadata you want to use.

Step 3: Add partners, schemas, certificates, maps, and agreements

Step 4: Create a custom connector for the Logic App.

Reference: https://docs.microsoft.com/bs-latn-ba/azure/logic-apps/logic-apps-enterprise-integration-metadata


Question #5

You need to support the requirements for the Shipping Logic App.

What should you use?

  • A . Azure Active Directory Application Proxy
  • B . Point-to-Site (P2S) VPN connection
  • C . Site-to-Site (S2S) VPN connection
  • D . On-premises Data Gateway

Correct Answer: D

Explanation:

Before you can connect to on-premises data sources from Azure Logic Apps, download and install the on-premises data gateway on a local computer. The gateway works as a bridge that provides quick data transfer and encryption between data sources on premises (not in the cloud) and your logic apps.

The gateway supports BizTalk Server 2016.

Note: Microsoft has now fully incorporated the Azure BizTalk Services capabilities into Logic Apps and Azure App Service Hybrid Connections.

The Logic Apps Enterprise Integration Pack brings enterprise B2B capabilities such as AS2 and X12 EDI standards support.

Scenario: The Shipping Logic app must meet the following requirements:

✑ Support the ocean transport and inland transport workflows by using a Logic App.

✑ Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.

✑ Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.

✑ Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

Reference: https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-install

Question #6

You need to migrate on-premises shipping data to Azure.

What should you use?

  • A . Azure Migrate
  • B . Azure Cosmos DB Data Migration tool (dt.exe)
  • C . AzCopy
  • D . Azure Database Migration service

Correct Answer: D

Explanation:

Migrate from on-premises or cloud implementations of MongoDB to Azure Cosmos DB with minimal downtime by using Azure Database Migration Service. Perform resilient migrations of MongoDB data at scale and with high reliability.

Scenario: Data migration from on-premises to Azure must minimize costs and downtime.

The application uses a MongoDB JSON document database for all container and transport information.

Reference: https://azure.microsoft.com/en-us/updates/mongodb-to-azure-cosmos-db-online-and-offline-migrations-are-now-available/

Question #7

HOTSPOT

You need to resolve the Shipping web site error.

How should you configure the Azure Table Storage service? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: AllowedOrigins

A CORS request will fail if Access-Control-Allow-Origin is missing.

Scenario:

The following error message displays while you are testing the website:

Box 2: http://test-shippingapi.wideworldimporters.com

Syntax: Access-Control-Allow-Origin: *

Access-Control-Allow-Origin: <origin>

Access-Control-Allow-Origin: null

<origin> Specifies an origin. Only a single origin can be specified.

Box 3: AllowedOrigins

Box 4: POST

For simple CORS requests, the only allowed methods are GET, HEAD, and POST. In this case POST is used.
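As a hedged sketch, an equivalent CORS rule could be added with the Azure CLI; the storage account name is hypothetical, --services t targets the Table service, and the origin is taken from the error message:

az storage cors add \
  --account-name shippingstorage \
  --services t \
  --methods GET HEAD POST \
  --origins http://test.wideworldimporters.com \
  --allowed-headers '*' \
  --exposed-headers '*' \
  --max-age 200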

"<Corsrule>" "allowedmethods" Failed to load no "Access-control-Origin" header is present

References: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Origin


Question #8

HOTSPOT

You need to correct the VM issues.

Which tools should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Backup and Restore: Azure Backup

Scenario: The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

In-Place restore of disks in IaaS VMs is a feature of Azure Backup.

Performance: Accelerated Networking

Scenario: The VM shows high network latency, jitter, and high CPU utilization.

Accelerated networking enables single root I/O virtualization (SR-IOV) to a VM, greatly improving its networking performance. This high-performance path bypasses the host from the datapath, reducing latency, jitter, and CPU utilization, for use with the most demanding network workloads on supported VM types.
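Both fixes can be scripted; the following Azure CLI sketch assumes a Recovery Services vault, backup policy, and NIC that are named here for illustration only:

# Enable Azure Backup for the VM using an existing vault and policy
az backup protection enable-for-vm \
  --resource-group shipping-rg \
  --vault-name shipping-vault \
  --vm biztalk-vm \
  --policy-name DefaultPolicy

# Enable accelerated networking on the VM's NIC (stop/deallocate the VM first)
az network nic update \
  --resource-group shipping-rg \
  --name biztalk-vm-nic \
  --accelerated-networking true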

Reference: https://azure.microsoft.com/en-us/blog/an-easy-way-to-bring-back-your-azure-vm-with-in-place-restore/


Question #9

HOTSPOT

You need to update the APIs to resolve the testing error.

How should you complete the Azure CLI command? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Enable Cross-Origin Resource Sharing (CORS) on your Azure App Service Web App.

Enter the full URL of the site you want to allow to access your WEB API or * to allow all domains.

Box 1: cors

Box 2: add

Box 3: allowed-origins

Box 4: http://test.wideworldimporters.com/
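Assembled, the command looks like the following sketch; the resource group and app names are assumptions:

az webapp cors add \
  --resource-group test-rg \
  --name test-shippingapi \
  --allowed-origins http://test.wideworldimporters.com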

Reference: http://donovanbrown.com/post/How-to-clear-No-Access-Control-Allow-Origin-header-error-with-Azure-App-Service


Question #10

HOTSPOT

You need to configure Azure App Service to support the REST API requirements.

Which values should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Plan: Standard

The Standard tier supports autoscaling.

Instance Count: 10

The maximum instance count for the Standard tier is 10.
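A sketch of the setup with the Azure CLI, using illustrative names; the Standard (S1) plan also includes staging slots for the testing location, and the autoscale rule lets capacity double from 5 to 10 instances at peak:

# Create a Standard App Service plan
az appservice plan create \
  --resource-group shipping-rg \
  --name shipping-plan \
  --sku S1

# Autoscale between 5 and 10 instances so capacity can double at peak
az monitor autoscale create \
  --resource-group shipping-rg \
  --resource shipping-plan \
  --resource-type Microsoft.Web/serverfarms \
  --name shipping-autoscale \
  --min-count 5 \
  --max-count 10 \
  --count 5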

Scenario:

The REST APIs that support the solution must meet the following requirements:

✑ Allow deployment to a testing location within Azure while not incurring additional costs.

✑ Automatically scale to double capacity during peak shipping times while not causing application downtime.

✑ Minimize costs when selecting an Azure payment model.

References: https://azure.microsoft.com/en-us/pricing/details/app-service/plans/


Question #11

Topic 2, Contoso, Ltd

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background

Overview

You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.

You are developing a solution to monitor uploaded data for inappropriate content.

The following process occurs when users upload content by using the SPA:

• Messages are sent to ContentUploadService.

• Content is processed by ContentAnalysisService.

• After processing is complete, the content is posted to the social network or a rejection message is posted in its place.

The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.

The solution will use eight CPU cores.

Azure Active Directory

Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

Requirements

ContentAnalysisService

The company’s data science group built ContentAnalysisService which accepts user generated content as a string and returns a probable value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.

You must create an Azure Function named CheckUserContent to perform the content checks.

Costs

You must minimize costs for all Azure services.

Manual review

To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React and all pages and API endpoints require authentication. In order to review content a user must be part of a ContentReviewer role. All completed reviews must include the reviewer’s email address for auditing purposes.

High availability

All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

Monitoring

An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

Security

You have the following security requirements:

– Any web service accessible over the Internet must be protected from cross site scripting attacks.

– All websites and services must use SSL from a valid root certificate authority.

– Azure Storage access keys must only be stored in memory and must be available only to the service.

– All Internal services must only be accessible from internal Virtual Networks (VNets).

– All parts of the system must support inbound and outbound traffic restrictions.

– All service calls must be authenticated by using Azure AD.

User agreements

When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track user’s IP addresses.

Information regarding agreements is used by multiple divisions within Contoso, Ltd.

User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

Validation testing

When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

Issues

Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

Code

ContentUploadService

You need to configure the ContentUploadService deployment.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Add the following markup to line CS23:
    types: Private
  • B . Add the following markup to line CS24:
    osType: Windows
  • C . Add the following markup to line CS24:
    osType: Linux
  • D . Add the following markup to line CS23:
    types: Public

Correct Answer: C

Explanation:

Scenario: All Internal services must only be accessible from Internal Virtual Networks (VNets)

There are three Network Location types: Private, Public, and Domain.

Reference: https://devblogs.microsoft.com/powershell/setting-network-location-to-private/

Question #12

You need to store the user agreements.

Where should you store the agreement after it is completed?

  • A . Azure Storage queue
  • B . Azure Event Hub
  • C . Azure Service Bus topic
  • D . Azure Event Grid topic

Correct Answer: B

Explanation:

Azure Event Hub is used for telemetry and distributed data streaming.

This service provides a single solution that enables rapid data retrieval for real-time processing as well as repeated replay of stored raw data. It can capture the streaming data into a file for processing and analysis.

It has the following characteristics:

✑ low latency

✑ capable of receiving and processing millions of events per second

✑ at least once delivery
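As an illustrative sketch (all names are hypothetical), an Event Hub sized for millions of agreements per hour could be provisioned like this:

# Create the Event Hubs namespace
az eventhubs namespace create \
  --resource-group contoso-rg \
  --name contoso-agreements \
  --sku Standard

# Create the hub with enough partitions for parallel consumers
az eventhubs eventhub create \
  --resource-group contoso-rg \
  --namespace-name contoso-agreements \
  --name user-agreements \
  --partition-count 8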

Reference: https://docs.microsoft.com/en-us/azure/event-grid/compare-messaging-services

Question #13

HOTSPOT

You need to implement the bindings for the CheckUserContent function.

How should you complete the code segment? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: [BlobTrigger(..)]

Box 2: [Blob(..)]

Azure Blob storage output binding for Azure Functions. The output binding allows you to modify and delete blob storage data in an Azure Function.

The attribute’s constructor takes the path to the blob and a FileAccess parameter indicating read or write, as shown in the following example:

[FunctionName("ResizeImage")]

public static void Run(

[BlobTrigger("sample-images/{name}")] Stream image,

[Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageSmall)

{



}

Scenario: You must create an Azure Function named CheckUserContent to perform the content checks.

The company’s data science group built ContentAnalysisService which accepts user generated content as a string and returns a probable value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-output


Question #14

DRAG DROP

You need to add markup at line AM04 to implement the ContentReview role.

How should you complete the markup? To answer, drag the appropriate json segments to the correct locations. Each json segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: allowedMemberTypes

allowedMemberTypes specifies whether this app role definition can be assigned to users and groups by setting to "User", or to other applications (that are accessing this application in daemon service scenarios) by setting to "Application", or to both.

Note: The following example shows the appRoles that you can assign to users.

"appId": "8763f1c4-f988-489c-a51e-158e9ef97d6a",

"appRoles": [

{

"allowedMemberTypes": [

"User"

],

"displayName": "Writer",

"id": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f",

"isEnabled": true,

"description": "Writers Have the ability to create tasks.",

"value": "Writer"

}

],

"availableToOtherTenants": false,

Box 2: User

Scenario: In order to review content a user must be part of a ContentReviewer role.

Box 3: value

value specifies the value which will be included in the roles claim in authentication and access tokens.

Reference: https://docs.microsoft.com/en-us/graph/api/resources/approle


Question #15

HOTSPOT

You need to add code at line AM09 to ensure that users can review content using Content Analysis Service.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: "oauth2Permissions": ["login"]

oauth2Permissions specifies the collection of OAuth 2.0 permission scopes that the web API (resource) app exposes to client apps. These permission scopes may be granted to client apps during consent.

Box 2: "oauth2AllowImplicitFlow":true

For applications (Angular, Ember.js, React.js, and so on), Microsoft identity platform supports the OAuth 2.0 Implicit Grant flow.

Reference: https://docs.microsoft.com/en-us/azure/active-directory/develop/reference-app-manifest


Question #16

HOTSPOT

You need to ensure that network security policies are met.

How should you configure network security? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Valid root certificate

Scenario: All websites and services must use SSL from a valid root certificate authority.

Box 2: Azure Application Gateway

Scenario:

✑ Any web service accessible over the Internet must be protected from cross site scripting attacks.

✑ All Internal services must only be accessible from Internal Virtual Networks (VNets)

✑ All parts of the system must support inbound and outbound traffic restrictions.

Azure Web Application Firewall (WAF) on Azure Application Gateway provides centralized protection of your web applications from common exploits and vulnerabilities. Web applications are increasingly targeted by malicious attacks that exploit commonly known vulnerabilities. SQL injection and cross-site scripting are among the most common attacks.

Application Gateway supports autoscaling, SSL offloading, and end-to-end SSL, a web application firewall (WAF), cookie-based session affinity, URL path-based routing, multisite hosting, redirection, rewrite HTTP headers and other features.

Note: Both Nginx and Azure Application Gateway act as a reverse proxy with Layer 7 load-balancing features plus a WAF to ensure strong protection against common web vulnerabilities and exploits.

You can modify Nginx web server configuration/SSL for X-XSS protection. This helps to prevent cross-site scripting exploits by forcing the injection of HTTP headers with X-XSS protection.

Reference:

https://docs.microsoft.com/en-us/azure/web-application-firewall/ag/ag-overview

https://www.upguard.com/articles/10-tips-for-securing-your-nginx-deployment


Question #17

You need to monitor ContentUploadService according to the requirements.

Which command should you use?

  • A . az monitor metrics alert create -n alert -g … --scopes … --condition "avg Percentage CPU > 8"
  • B . az monitor metrics alert create -n alert -g … --scopes … --condition "avg Percentage CPU > 800"
  • C . az monitor metrics alert create -n alert -g … --scopes … --condition "CPU Usage > 800"
  • D . az monitor metrics alert create -n alert -g … --scopes … --condition "CPU Usage > 8"

Correct Answer: B

Explanation:

Scenario: An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores. Because the solution uses eight CPU cores, 80 percent corresponds to an aggregate threshold of 800.
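Expanded with illustrative names, the full command from option B might look like the following sketch; the alert name, resource group, and scope resource ID are assumptions:

az monitor metrics alert create \
  --name content-upload-cpu-alert \
  --resource-group contoso-rg \
  --scopes $containerGroupResourceId \
  --condition "avg Percentage CPU > 800" \
  --description "ContentUploadService above 80 percent of 8 CPU cores"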

Reference: https://docs.microsoft.com/sv-se/cli/azure/monitor/metrics/alert

Question #18

HOTSPOT

You need to ensure that validation testing is triggered per the requirements.

How should you complete the code segment? To answer, select the appropriate values in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: RepositoryUpdated

When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

Box 2: service

Box 3: imageCollection

Reference: https://docs.microsoft.com/en-us/azure/devops/notifications/oob-supported-event-types


Question #19

DRAG DROP

You need to add YAML markup at line CS17 to ensure that the ContentUploadService can access Azure Storage access keys.

How should you complete the YAML markup? To answer, drag the appropriate YAML segments to the correct locations. Each YAML segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: volumeMounts

Example:

volumeMounts:
- mountPath: /mnt/secrets
  name: secretvolume1
volumes:
- name: secretvolume1
  secret:
    mysecret1: TXkgZmlyc3Qgc2VjcmV0IEZPTwo=

Box 2: volumes

Box 3: secret

Reference: https://docs.microsoft.com/en-us/azure/container-instances/container-instances-volume-secret


Question #20

HOTSPOT

You need to add code at line AM10 of the application manifest to ensure that the requirement for manually reviewing content can be met.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: sid

Sid: Session ID, used for per-session user sign-out. Personal and Azure AD accounts.

Scenario: Manual review

To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React and all pages and API endpoints require authentication. In order to review content a user must be part of a ContentReviewer role.

Box 2: email

Scenario: All completed reviews must include the reviewer’s email address for auditing purposes.


Question #21

You need to investigate the http server log output to resolve the issue with the ContentUploadService.

Which command should you use first?

  • A . az webapp log
  • B . az ams live-output
  • C . az monitor activity-log
  • D . az container attach

Correct Answer: C

Explanation:

Scenario: Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

"502 bad gateway" and "503 service unavailable" are common errors in your app hosted in Azure App Service.

Microsoft Azure publicizes each time there is a service interruption or performance degradation.

The az monitor activity-log command manages activity logs.

Note: Troubleshooting can be divided into three distinct tasks, in sequential order:

✑ Observe and monitor application behavior

✑ Collect data

✑ Mitigate the issue
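For example, recent activity-log entries could be listed with the command below; the resource group name is an assumption:

# List activity-log entries from the last 24 hours for the resource group
az monitor activity-log list \
  --resource-group contoso-rg \
  --offset 24h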

Reference: https://docs.microsoft.com/en-us/cli/azure/monitor/activity-log

Question #22

You need to deploy the CheckUserContent Azure function. The solution must meet the security and cost requirements.

Which hosting model should you use?

  • A . Consumption plan
  • B . Premium plan
  • C . App Service plan

Correct Answer: C
Question #23

Topic 3, City Power & Light

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background

City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

Current environment

Architecture overview

The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.

API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.

The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

Architecture diagram

The company has several applications and services that support their business. The company plans to implement serverless computing where possible.

The overall architecture is shown below.

User authentication

The following steps detail the user authentication process:

✑ The user selects Sign in in the website.

✑ The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.

✑ The user signs in.

✑ Azure AD redirects the user’s session back to the web application. The URL includes an access token.

✑ The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience (‘aud’) claim in the access token.

✑ The back-end API validates the access token.

Requirements

Corporate website

✑ Communications and content must be secured by using SSL.

✑ Communications must use HTTPS.

✑ Data must be replicated to a secondary region and three availability zones.

✑ Data storage costs must be minimized.

Azure Database for PostgreSQL

The database connection string is stored in Azure Key Vault with the following attributes:

✑ Azure Key Vault name: cpandlkeyvault

✑ Secret name: PostgreSQLConn

✑ Id: 80df3e46ffcd4f1cb187f79905e9a1e8

The connection information is updated frequently. The application must always use the latest information to connect to the database.

Azure Service Bus and Azure Event Grid

✑ Azure Event Grid must use Azure Service Bus for queue-based load leveling.

✑ Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.

✑ Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

Security

✑ All SSL certificates and credentials must be stored in Azure Key Vault.

✑ File access must restrict access by IP, protocol, and Azure AD rights.

✑ All user accounts and processes must receive only those privileges which are essential to perform their intended function.

Compliance

Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

Issues

Corporate website

While testing the site, the following error message displays:

CryptographicException: The system cannot find the file specified.

Function app

You perform local testing for the RequestUserApproval function. The following error message displays:

‘Timeout value of 00:10:00 exceeded by function: RequestUserApproval’

The same error message displays when you test the function in an Azure development environment by running the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

Logic app

You test the Logic app in a development environment. The following error message displays:

‘400 Bad Request’

Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

Code

Corporate website

Security.cs:

Function app

RequestUserApproval.cs:

You need to correct the RequestUserApproval Function app error.

What should you do?

  • A . Update line RA13 to use the async keyword and return an HttpRequest object value.
  • B . Configure the Function app to use an App Service hosting plan. Enable the Always On setting of the hosting plan.
  • C . Update the function to be stateful by using Durable Functions to process the request payload.
  • D . Update the functionTimeout property of the host.json project file to 15 minutes.

Correct Answer: C

Explanation:

Async operation tracking

The HTTP response mentioned previously is designed to help implement long-running HTTP async APIs with Durable Functions. This pattern is sometimes referred to as the polling consumer pattern.

Both the client and server implementations of this pattern are built into the Durable Functions HTTP APIs.

Function app

You perform local testing for the RequestUserApproval function. The following error message displays:

‘Timeout value of 00:10:00 exceeded by function: RequestUserApproval’

The same error message displays when you test the function in an Azure development environment by running the following Kusto query:

FunctionAppLogs

| where FunctionName == "RequestUserApproval"

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-http-features

Question #24

HOTSPOT

You need to configure the Account Kind, Replication, and Storage tier options for the corporate website’s Azure Storage account.

How should you complete the configuration? To answer, select the appropriate options in the dialog box in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Account Kind: StorageV2 (general-purpose v2)

Scenario: Azure Storage blob will be used (refer to the exhibit). Data storage costs must be minimized.

General-purpose v2 accounts: Basic storage account type for blobs, files, queues, and tables.

Recommended for most scenarios using Azure Storage.

Incorrect Answers:

BlockBlobStorage accounts: Storage accounts with premium performance characteristics for block blobs and append blobs. Recommended for scenarios with high transaction rates, or scenarios that use smaller objects or require consistently low storage latency.

General-purpose v1 accounts: Legacy account type for blobs, files, queues, and tables. Use general-purpose v2 accounts instead when possible.

Replication: Geo-redundant Storage

Scenario: Data must be replicated to a secondary region and three availability zones.

Geo-redundant storage (GRS) copies your data synchronously three times within a single physical location in the primary region using LRS. It then copies your data asynchronously to a single physical location in the secondary region.

Incorrect Answers:

Geo-zone-redundant storage (GZRS) also replicates to a secondary region, but it would be more costly.

Storage tier: Cool

Data storage costs must be minimized.

Note: Azure storage offers different access tiers, which allow you to store blob object data in the most cost-effective manner. The available access tiers include:

Hot – Optimized for storing data that is accessed frequently.

Cool – Optimized for storing data that is infrequently accessed and stored for at least 30 days.
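Taken together, an equivalent Azure CLI sketch (the account and resource group names are assumptions) would be:

az storage account create \
  --resource-group cpandl-rg \
  --name cpandlstorage \
  --kind StorageV2 \
  --sku Standard_GRS \
  --access-tier Cool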

Reference:

https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview

https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers?tabs=azure-portal


Question #25

HOTSPOT

You need to retrieve the database connection string.

Which values should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: cpandlkeyvault

We specify the key vault, cpandlkeyvault.

Scenario: The database connection string is stored in Azure Key Vault with the following attributes:

Azure Key Vault name: cpandlkeyvault

Secret name: PostgreSQLConn

Id: 80df3e46ffcd4f1cb187f79905e9a1e8

Box 2: PostgreSQLConn

We specify the secret, PostgreSQLConn

Example, sample request:

https://myvault.vault.azure.net/secrets/mysecretname/4387e9f3d6e14c459867679a90fd0f79?api-version=7.1

Box 3: Querystring
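The same secret can also be read with the Azure CLI, which wraps this REST call; omitting the version returns the latest value, matching the requirement that the application always use the latest connection information:

az keyvault secret show \
  --vault-name cpandlkeyvault \
  --name PostgreSQLConn \
  --query value \
  --output tsv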

Reference: https://docs.microsoft.com/en-us/rest/api/keyvault/getsecret/getsecret


Question #26

DRAG DROP

You need to correct the corporate website error.

Which four actions should you recommend be performed in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

Scenario: Corporate website

While testing the site, the following error message displays:

CryptographicException: The system cannot find the file specified.

Step 1: Generate a certificate

Step 2: Upload the certificate to Azure Key Vault

Scenario: All SSL certificates and credentials must be stored in Azure Key Vault.

Step 3: Import the certificate to Azure App Service

Step 4: Update line SCO5 of Security.cs to include error handling and then redeploy the code
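A hedged Azure CLI sketch of the first three steps; the certificate, app, and resource group names are assumptions:

# Steps 1-2: Create the certificate directly in Key Vault
az keyvault certificate create \
  --vault-name cpandlkeyvault \
  --name cpandl-ssl \
  --policy "$(az keyvault certificate get-default-policy)"

# Step 3: Import the Key Vault certificate into App Service
az webapp config ssl import \
  --resource-group cpandl-rg \
  --name cpandl-web \
  --key-vault cpandlkeyvault \
  --key-vault-certificate-name cpandl-ssl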

Reference: https://docs.microsoft.com/en-us/azure/app-service/configure-ssl-certificate


Question #27

HOTSPOT

You need to configure API Management for authentication.

Which policy values should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Validate JWT

The validate-jwt policy enforces existence and validity of a JWT extracted from either a specified HTTP Header or a specified query parameter.

Scenario: User authentication (see step 5 below)

The following steps detail the user authentication process:

✑ The user selects Sign in in the website.

✑ The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.

✑ The user signs in.

✑ Azure AD redirects the user’s session back to the web application. The URL includes an access token.

✑ The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience (‘aud’) claim in the access token.

✑ The back-end API validates the access token.

Box 2: Outbound

Reference: https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies


Question #28

You need to authenticate the user to the corporate website as indicated by the architectural diagram.

Which two values should you use? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . ID token signature
  • B . ID token claims
  • C . HTTP response code
  • D . Azure AD endpoint URI
  • E . Azure AD tenant ID

Correct Answer: B, E

Explanation:

Claims in access tokens

JWTs (JSON Web Tokens) are split into three pieces:

✑ Header – Provides information about how to validate the token including information about the type of token and how it was signed.

✑ Payload – Contains all of the important data about the user or app that is attempting to call your service.

✑ Signature – Is the raw material used to validate the token.

Your client can get an access token from either the v1.0 endpoint or the v2.0 endpoint using a variety of protocols.

Scenario: User authentication (see step 5 below)

The following steps detail the user authentication process:

✑ The user selects Sign in in the website.

✑ The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.

✑ The user signs in.

✑ Azure AD redirects the user’s session back to the web application. The URL includes an access token.

✑ The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience (‘aud’) claim in the access token.

✑ The back-end API validates the access token.

Reference: https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies

Question #29

You need to investigate the Azure Function app error message in the development environment.

What should you do?

  • A . Connect Live Metrics Stream from Application Insights to the Azure Function app and filter the metrics.
  • B . Create a new Azure Log Analytics workspace and instrument the Azure Function app with Application Insights.
  • C . Update the Azure Function app with extension methods from Microsoft.Extensions.Logging to log events by using the log instance.
  • D . Add a new diagnostic setting to the Azure Function app to send logs to Log Analytics.

Correct Answer: A

Explanation:

Azure Functions offers built-in integration with Azure Application Insights to monitor functions.

The following areas of Application Insights can be helpful when evaluating the behavior, performance, and errors in your functions:

Live Metrics: View metrics data as it’s created in near real-time.

Failures

Performance

Metrics

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring

Question #30

HOTSPOT

You need to configure the integration for Azure Service Bus and Azure Event Grid.

How should you complete the CLI statement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: eventgrid

To create event subscription use: az eventgrid event-subscription create

Box 2: event-subscription

Box 3: servicebusqueue

Scenario: Azure Service Bus and Azure Event Grid

Azure Event Grid must use Azure Service Bus for queue-based load leveling. Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.

Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.
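Put together, the statement might look like the following sketch; the subscription name and both resource IDs are placeholders:

az eventgrid event-subscription create \
  --name agreement-events \
  --source-resource-id $eventSourceResourceId \
  --endpoint-type servicebusqueue \
  --endpoint $serviceBusQueueResourceId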

Reference: https://docs.microsoft.com/en-us/cli/azure/eventgrid/event-subscription?view=azure-cli-latest#az_eventgrid_event_subscription_create


Question #31

HOTSPOT

You need to correct the Azure Logic app error message.

Which configuration values should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: You test the Logic app in a development environment.

The following error message displays:

‘400 Bad Request’

Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

NOTE: If the inbound call’s request body doesn’t match your schema, the trigger returns an HTTP 400 Bad Request error.

Box 1: function

If you have an Azure function where you want to use the system-assigned identity, first enable authentication for Azure functions.

Box 2: system-assigned

Your logic app or individual connections can use either the system-assigned identity or a single user-assigned identity, which you can share across a group of logic apps, but not both.
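Enabling the system-assigned identity on the function app is a single CLI call; the names in this sketch are illustrative:

az functionapp identity assign \
  --resource-group cpandl-rg \
  --name requestuserapproval-app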

Reference: https://docs.microsoft.com/en-us/azure/logic-apps/create-managed-service-identity


Question #32

HOTSPOT

You need to configure Azure Service Bus to Event Grid integration.

Which Azure Service Bus settings should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Premium

Service Bus can now emit events to Event Grid when there are messages in a queue or a subscription and no receivers are present. You can create Event Grid subscriptions to your Service Bus namespaces, listen to these events, and then react to the events by starting a receiver. With this feature, you can use Service Bus in reactive programming models.

To enable the feature, you need the following items:

A Service Bus Premium namespace with at least one Service Bus queue or a Service Bus topic with at least one subscription.

Contributor access to the Service Bus namespace.

Box 2: Contributor
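For illustration (names assumed), a Premium namespace is created like this:

az servicebus namespace create \
  --resource-group cpandl-rg \
  --name cpandl-servicebus \
  --sku Premium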

Reference: https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-concept


Question #33

HOTSPOT

You need to configure security and compliance for the corporate website files.

Which Azure Blob storage settings should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: role-based access control (RBAC)

Azure Storage supports authentication and authorization with Azure AD for the Blob and Queue services via Azure role-based access control (Azure RBAC).

Scenario: File access must restrict access by IP, protocol, and Azure AD rights.

Box 2: change feed

The purpose of the change feed is to provide transaction logs of all the changes that occur to the blobs and the blob metadata in your storage account.

The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.
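As a sketch with assumed names, the change feed can be switched on from the Azure CLI:

az storage account blob-service-properties update \
  --resource-group cpandl-rg \
  --account-name cpandlstorage \
  --enable-change-feed true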

Reference:

https://docs.microsoft.com/en-us/azure/cdn/cdn-sas-storage-support

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed?tabs=azure-portal


Question #34

You need to ensure that all messages from Azure Event Grid are processed.

What should you use?

  • A . Azure Event Grid topic
  • B . Azure Service Bus topic
  • C . Azure Service Bus queue
  • D . Azure Storage queue
  • E . Azure Logic App custom connector

Correct Answer: B

Explanation:

As a solution architect/developer, you should consider using Service Bus queues when:

✑ Your solution needs to receive messages without having to poll the queue. With Service Bus, you can achieve it by using a long-polling receive operation using the TCP-based protocols that Service Bus supports.
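A minimal sketch, assuming an existing namespace and illustrative names:

az servicebus queue create \
  --resource-group cpandl-rg \
  --namespace-name cpandl-servicebus \
  --name eventgrid-buffer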

Reference: https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted

Question #35

Topic 4, Proseware, Inc

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background

You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware’s internal services, external services, and applications. The application will also provide a shared library for common functionality.

Requirements

Policy service

You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.

The application must include the Event Grid Event ID field in all Application Insights telemetry.

Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

Policies

Log policy

All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

Authentication events

Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

PolicyLib

You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications.

The PolicyLib library must:

✑ Exclude non-user actions from Application Insights telemetry.

✑ Provide methods that allow a web service to scale itself.

✑ Ensure that scaling actions do not disrupt application usage.

Other

Anomaly detection service

You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

Health monitoring

All web applications and services have health monitoring at the /health service endpoint.

Issues

Policy loss

When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

Performance issue

When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

Notification latency

Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

App code

EventGridController.cs

Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.

LoginEvent.cs


You need to resolve a notification latency issue.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Set Always On to true.
  • B . Ensure that the Azure Function is using an App Service plan.
  • C . Set Always On to false.
  • D . Ensure that the Azure Function is set to use a consumption plan.

Correct Answer: AB

Explanation:

Azure Functions can run on either a Consumption Plan or a dedicated App Service Plan. If you run in a dedicated mode, you need to turn on the Always On setting for your Function App to run properly. The Function runtime will go idle after a few minutes of inactivity, so only HTTP triggers will actually "wake up" your functions. This is similar to how WebJobs must have Always On enabled.
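With the function app on a dedicated plan, Always On is a single setting; the resource names in this sketch are assumptions:

az functionapp config set \
  --resource-group proseware-rg \
  --name anomaly-email-func \
  --always-on true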

Scenario: Notification latency: Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

Anomaly detection service: You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

Reference: https://github.com/Azure/Azure-Functions/wiki/Enable-Always-On-when-running-on-dedicated-App-Service-Plan

Question #36

DRAG DROP

You need to implement the Log policy.

How should you complete the Azure Event Grid subscription? To answer, drag the appropriate JSON segments to the correct locations. Each JSON segment may be used once, more than once, or not at all. You may need to drag the split bar between panes to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: WebHook

Scenario: If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

endpointType: The type of endpoint for the subscription (webhook/HTTP, Event Hub, or queue).

Box 2: SubjectBeginsWith

Box 3: Microsoft.Storage.BlobCreated

Scenario: Log Policy

All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

Example subscription schema

{
  "properties": {
    "destination": {
      "endpointType": "webhook",
      "properties": {
        "endpointUrl": "https://example.azurewebsites.net/api/HttpTriggerCSharp1?code=VXbGWce53l48Mt8wuotr0GPmyJ/nDT4hgdFj9DpBiRt38qqnnm5OFg=="
      }
    },
    "filter": {
      "includedEventTypes": [ "Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted" ],
      "subjectBeginsWith": "blobServices/default/containers/mycontainer/log",
      "subjectEndsWith": ".jpg",
      "isSubjectCaseSensitive": "true"
    }
  }
}

Reference: https://docs.microsoft.com/en-us/azure/event-grid/subscription-creation-schema


Question #37

You need to ensure that the solution can meet the scaling requirements for Policy Service.

Which Azure Application Insights data model should you use?

  • A . an Application Insights dependency
  • B . an Application Insights event
  • C . an Application Insights trace
  • D . an Application Insights metric

Correct Answer: D

Explanation:

Application Insights provides three additional data types for custom telemetry:

Trace – used either directly, or through an adapter to implement diagnostics logging using an instrumentation framework that is familiar to you, such as Log4Net or System.Diagnostics.

Event – typically used to capture user interaction with your service, to analyze usage patterns.

Metric – used to report periodic scalar measurements.

Scenario:

Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.
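For illustration, a minimal sketch (the metric name is hypothetical) of reporting a scalar measurement with the Microsoft.ApplicationInsights SDK that an autoscale rule could act on:

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

public class PolicyMetrics
{
    private readonly TelemetryClient _telemetryClient =
        new TelemetryClient(TelemetryConfiguration.CreateDefault());

    // Report the current number of policy actions as a custom metric.
    // GetMetric pre-aggregates values locally before sending them.
    public void ReportPolicyActions(int policyActionCount)
    {
        _telemetryClient.GetMetric("PolicyActionsInProgress").TrackValue(policyActionCount);
    }
}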

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/app/data-model

Question #38

DRAG DROP

You need to implement telemetry for non-user actions.

How should you complete the Filter class? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: Exclude non-user actions from Application Insights telemetry.

Box 1: ITelemetryProcessor

To create a filter, implement ITelemetryProcessor. This technique gives you more direct control over what is included or excluded from the telemetry stream.

Box 2: ITelemetryProcessor

Box 3: ITelemetryProcessor

Box 4: RequestTelemetry

Box 5: /health

To filter out an item, just terminate the chain.
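A minimal sketch of such a processor (the class name is illustrative, not the exam's hidden Filter code), dropping telemetry for the /health endpoint by terminating the chain:

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class HealthRequestFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public HealthRequestFilter(ITelemetryProcessor next)
    {
        _next = next;
    }

    public void Process(ITelemetry item)
    {
        // Drop telemetry for the health monitoring endpoint by not calling
        // the next processor; everything else is passed along the chain.
        if (item is RequestTelemetry request &&
            request.Url?.AbsolutePath == "/health")
        {
            return;
        }

        _next.Process(item);
    }
}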

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling


Question #39

DRAG DROP

You need to ensure that PolicyLib requirements are met.

How should you complete the code segment? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications.

The PolicyLib library must:

✑ Exclude non-user actions from Application Insights telemetry.

✑ Provide methods that allow a web service to scale itself.

✑ Ensure that scaling actions do not disrupt application usage.

Box 1: ITelemetryInitializer

Use telemetry initializers to define global properties that are sent with all telemetry; and to override selected behavior of the standard telemetry modules.

Box 2: Initialize

Box 3: Telemetry.Context

Box 4: ((EventTelemetry)telemetry).Properties["EventID"]
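For illustration, a minimal sketch of a telemetry initializer (the property name and value are hypothetical) that stamps a global property on every telemetry item:

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.Extensibility;

public class ServiceNameInitializer : ITelemetryInitializer
{
    // Initialize is called for every telemetry item before it is sent,
    // so the property is applied globally across the shared library.
    public void Initialize(ITelemetry telemetry)
    {
        telemetry.Context.GlobalProperties["ServiceName"] = "PolicyService";
    }
}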


Question #40

DRAG DROP

You need to add code at line EG15 in EventGridController.cs to ensure that the Log policy applies to all services.

How should you complete the code? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario, Log policy: All Azure App Service Web Apps must write logs to Azure Blob storage.

Box 1: Status

Box 2: Succeeded

Box 3: operationName

Microsoft.Web/sites/write is a resource provider operation. It creates a new Web App or updates an existing one.
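A hypothetical sketch (types and names are illustrative, not the exam's actual EventGridController code) of the check the completed code performs on a deserialized event:

public record ResourceWriteEventData(string Status, string OperationName);

public static class LogPolicyFilter
{
    // Only successful create/update operations on App Service Web Apps
    // should cause the Log policy to be applied.
    public static bool ShouldApplyLogPolicy(ResourceWriteEventData data) =>
        data.Status == "Succeeded" &&
        data.OperationName == "Microsoft.Web/sites/write";
}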

Reference: https://docs.microsoft.com/en-us/azure/role-based-access-control/resource-provider-operations


Question #41

HOTSPOT

You need to insert code at line LE03 of LoginEvent.cs to ensure that all authentication events are processed correctly.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: id

id is a unique identifier for the event.

Box 2: eventType

eventType is one of the registered event types for this event source.

Box 3: dataVersion

dataVersion is the schema version of the data object. The publisher defines the schema version.

Scenario: Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

The following example shows the properties that are used by all event publishers:

[
  {
    "topic": string,
    "subject": string,
    "id": string,
    "eventType": string,
    "eventTime": string,
    "data": {
      object-unique-to-each-publisher
    },
    "dataVersion": string,
    "metadataVersion": string
  }
]
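For illustration, a sketch using the Azure.Messaging.EventGrid SDK (the event name and payload are hypothetical) that populates the required properties when publishing a sign-out event:

using Azure.Messaging.EventGrid;

// The SDK assigns a unique Id to the event automatically; eventType and
// dataVersion are supplied explicitly, matching the schema above.
var signOutEvent = new EventGridEvent(
    subject: "users/user@contoso.com",
    eventType: "Contoso.Auth.UserSignOut",
    dataVersion: "1.0",
    data: new { UserName = "user@contoso.com" });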

Reference: https://docs.microsoft.com/en-us/azure/event-grid/event-schema


Question #42

HOTSPOT

You need to implement the Log policy.

How should you complete the EnsureLogging method in EventGridController.cs? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: logdrop

All log files should be saved to a container named logdrop.

Box 2: 15

Logs must remain in the container for 15 days.

Box 3: UpdateApplicationSettings

All Azure App Service Web Apps must write logs to Azure Blob storage.

Reference: https://blog.hompus.nl/2017/05/29/adding-application-logging-blob-to-a-azure-web-app-service-using-powershell/


Question #43

Topic 5, Litware Inc

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background

You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web API project that is deployed as an Azure Web App.

Overall architecture

Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

Receipt processing

Employees may upload receipts in two ways:

✑ Uploading using an Azure Files mounted folder

✑ Uploading using the web application

Data Storage

Receipt and employee information is stored in an Azure SQL database.

Documentation

Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for Azure File upload, and instructions on how to configure the mounted folder.

Solution details

Users table

Web Application

You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

Processing

Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

Logging

Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled. Application Insights must always contain all log messages.

Requirements

Receipt processing

Concurrent processing of a receipt must be prevented.

Disaster recovery

Regional outage must not impact application availability. All DR operations must not be dependent on application running and must ensure that data in the DR region is up to date.

Security

✑ User’s SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.

✑ All certificates and secrets used to secure data must be stored in Azure Key Vault.

✑ You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.

✑ All access to Azure Storage and Azure SQL database must use the application’s Managed Service Identity (MSI).

✑ Receipt data must always be encrypted at rest.

✑ All data must be protected in transit.

✑ User’s expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.

✑ In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

Issues

Upload format issue

Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal Server error page.

Capacity issue

During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

Log capacity issue

Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

Application code

Processing.cs

Database.cs

ReceiptUploader.cs

ConfigureSSE.ps1

DRAG DROP

You need to add code at line PC32 in Processing.cs to implement the GetCredentials method in the Processing class.

How should you complete the code? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: AzureServiceTokenProvider()

Box 2: tp.GetAccessTokenAsync("..")

Acquiring an access token is then quite easy. Example code:

private async Task<string> GetAccessTokenAsync()
{
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}

Reference: https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity


Question #44

DRAG DROP

You need to ensure disaster recovery requirements are met.

What code should you add at line PC16? To answer, drag the appropriate code fragments to the correct locations. Each code fragment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: Disaster recovery. Regional outage must not impact application availability. All DR operations must not be dependent on application running and must ensure that data in the DR region is up to date.

Box 1: DirectoryTransferContext

We transfer all files in the directory.

Note: The TransferContext object comes in two forms: SingleTransferContext and DirectoryTransferContext. The former is for transferring a single file and the latter is for transferring a directory of files.

Box 2: ShouldTransferCallbackAsync

The DirectoryTransferContext.ShouldTransferCallbackAsync delegate callback is invoked to tell whether a transfer should be done.

Box 3: False

If you want to use the retry policy during a copy, and want the copy to be resumable if it breaks in the middle, use a synchronous copy (isServiceCopy = false).

Note that if you choose to use a service-side copy ('isServiceCopy' set to true), Azure (currently) doesn't provide an SLA for that. Setting 'isServiceCopy' to false will download the source blob locally and then upload it to the destination.
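For illustration, a minimal sketch of the transfer described above, assuming the Microsoft.WindowsAzure.Storage.DataMovement package and that sourceDirectory and destinationDirectory are CloudBlobDirectory instances in the primary and DR regions:

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.DataMovement;

public static async Task CopyToDrRegionAsync(
    CloudBlobDirectory sourceDirectory,
    CloudBlobDirectory destinationDirectory)
{
    var context = new DirectoryTransferContext();

    // Invoked per item to decide whether that item should be transferred.
    context.ShouldTransferCallbackAsync =
        (source, destination) => Task.FromResult(true);

    // isServiceCopy: false performs a synchronous copy, which supports the
    // retry policy and can be resumed if it breaks in the middle.
    await TransferManager.CopyDirectoryAsync(
        sourceDirectory,
        destinationDirectory,
        false,
        new CopyDirectoryOptions { Recursive = true },
        context);
}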

Reference:

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-data-movement-library

https://docs.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.datamovement.directorytransfercontext.shouldtransfercallbackasync?view=azure-dotnet


Question #45

HOTSPOT

You need to add code at line PC26 of Processing.cs to ensure that security policies are met.

How should you complete the code that you will add at line PC26? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: var key = await Resolver.ResolveKeyAsync(keyBundle.KeyIdentifier.Identifier, CancellationToken.None);

Box 2: var x = new BlobEncryptionPolicy(key, resolver);

Example:

// We begin with cloudKey1, and a resolver capable of resolving and caching Key Vault secrets.
BlobEncryptionPolicy encryptionPolicy = new BlobEncryptionPolicy(cloudKey1, cachingResolver);
client.DefaultRequestOptions.EncryptionPolicy = encryptionPolicy;

Box 3: cloudBlobClient.DefaultRequestOptions.EncryptionPolicy = x;

Reference: https://github.com/Azure/azure-storage-net/blob/master/Samples/GettingStarted/EncryptionSamples/KeyRotation/Program.cs


Question #46

You need to ensure the security policies are met.

What code do you add at line CS07 of ConfigureSSE.ps1?

  • A . -PermissionsToKeys create, encrypt, decrypt
  • B . -PermissionsToCertificates create, encrypt, decrypt
  • C . -PermissionsToCertificates wrapkey, unwrapkey, get
  • D . -PermissionsToKeys wrapkey, unwrapkey, get

Correct Answer: B

Explanation:

Scenario: All certificates and secrets used to secure data must be stored in Azure Key Vault.

You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.

The Set-AzureRmKeyVaultAccessPolicy parameter -PermissionsToKeys specifies an array of key operation permissions to grant to a user or service principal. The acceptable values for this parameter are: decrypt, encrypt, unwrapKey, wrapKey, verify, sign, get, list, update, create, import, delete, backup, restore, recover, purge.

Reference: https://docs.microsoft.com/en-us/powershell/module/azurerm.keyvault/set-azurermkeyvaultaccesspolicy

Question #47

You need to ensure receipt processing occurs correctly.

What should you do?

  • A . Use blob properties to prevent concurrency problems
  • B . Use blob SnapshotTime to prevent concurrency problems
  • C . Use blob metadata to prevent concurrency problems
  • D . Use blob leases to prevent concurrency problems

Correct Answer: D

Explanation:

You can acquire a lease on a blob to get exclusive write and delete access to it. While one processor instance holds the lease on a receipt blob, no other instance can modify or delete that blob until the lease is released or expires, which prevents the same receipt from being processed concurrently.

Scenario: Concurrent processing of a receipt must be prevented.

Reference: https://docs.microsoft.com/en-us/rest/api/storageservices/lease-blob
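For illustration, a minimal sketch using the Azure.Storage.Blobs SDK (the method name is hypothetical), acquiring a lease before processing so a second instance cannot work on the same receipt:

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

public static async Task ProcessReceiptExclusivelyAsync(BlobClient receiptBlob)
{
    BlobLeaseClient leaseClient = receiptBlob.GetBlobLeaseClient();

    // Acquire a 60-second lease; a concurrent attempt on the same blob fails.
    await leaseClient.AcquireAsync(TimeSpan.FromSeconds(60));
    try
    {
        // process the receipt here...
    }
    finally
    {
        await leaseClient.ReleaseAsync();
    }
}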

Question #48

You need to resolve the capacity issue.

What should you do?

  • A . Convert the trigger on the Azure Function to an Azure Blob storage trigger
  • B . Ensure that the consumption plan is configured correctly to allow scaling
  • C . Move the Azure Function to a dedicated App Service Plan
  • D . Update the loop starting on line PC09 to process items in parallel

Correct Answer: D

Explanation:

If you want to process the receipts in parallel, you cannot await each item inside the foreach loop; each async call returns a task, so instead start a task for every item and await all of the tasks together (for example, with Task.WhenAll).

Scenario: Capacity issue: During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.
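For illustration, a sketch (ProcessReceiptAsync and receipts are hypothetical stand-ins for the loop body at line PC09) of starting all items and awaiting them together:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class ReceiptBatch
{
    // Start one task per receipt, then await them all; the items are
    // processed in parallel instead of one at a time.
    public static Task ProcessAllAsync(
        IEnumerable<string> receipts,
        Func<string, Task> processReceiptAsync)
    {
        return Task.WhenAll(receipts.Select(processReceiptAsync));
    }
}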

Reference: https://stackoverflow.com/questions/37576685/using-async-await-with-a-foreach-loop


Question #49

You need to resolve the log capacity issue.

What should you do?

  • A . Create an Application Insights Telemetry Filter
  • B . Change the minimum log level in the host.json file for the function
  • C . Implement Application Insights Sampling
  • D . Set a LogCategoryFilter during startup

Correct Answer: C

Explanation:

Scenario, the log capacity issue: Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

Sampling is a feature in Azure Application Insights. It is the recommended way to reduce telemetry traffic and storage, while preserving a statistically correct analysis of application data. The filter selects items that are related, so that you can navigate between items when you are doing diagnostic investigations. When metric counts are presented to you in the portal, they are renormalized to take account of the sampling, to minimize any effect on the statistics.

Sampling reduces traffic and data costs, and helps you avoid throttling.
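A minimal sketch of enabling adaptive sampling in code (the rate is illustrative; for an Azure Function the equivalent setting normally lives in host.json):

using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;

// Rebuild the telemetry processor chain with adaptive sampling so that
// at most ~5 items per second are sent, avoiding lost log messages.
var builder = TelemetryConfiguration.Active.TelemetryProcessorChainBuilder;
builder.UseAdaptiveSampling(maxTelemetryItemsPerSecond: 5);
builder.Build();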

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling

Question #50

Topic 6, Coho Winery

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

LabelMaker app

Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.

Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.

External partners send data to the LabelMaker application to include artwork and text for custom label designs.

Requirements. Data

You identify the following requirements for data management and manipulation:

✑ Order data is stored as nonrelational JSON and must be queried using SQL.

✑ Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

Requirements. Security

You have the following security requirements:

✑ Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

✑ External partners must use their own credentials and authenticate with their organization’s identity management solution.

✑ External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.

✑ Storage of e-commerce application settings must be maintained in Azure Key Vault.

✑ E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).

✑ Conditional access policies must be applied at the application level to protect company content.

✑ The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

Requirements. LabelMaker app

Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).

You must use Azure Container Registry to publish images that support the AKS deployment.

Architecture

Issues

Calls to the Printer API App fail periodically due to printer communication timeouts.

Printer communication timeouts occur after 10 seconds. The label printer must only receive up to 5 attempts within one minute.

The order workflow fails to run upon initial deployment to Azure.

Order.json

Relevant portions of the app files are shown below. Line numbers are included for reference only.

This JSON file contains a representation of the data for an order that includes a single item.


DRAG DROP

You need to deploy a new version of the LabelMaker application to ACR.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

Step 1: Build a new application image by using dockerfile

Step 2: Create an alias of the image with the fully qualified path to the registry

Before you can push the image to a private registry, you have to ensure a proper image name. This can be achieved using the docker tag command. For demonstration purposes, we'll use Docker's hello-world image, rename it, and push it to ACR.

# pull hello-world from the public Docker Hub
$ docker pull hello-world

# tag the image in order to be able to push it to a private registry
$ docker tag hello-world <REGISTRY_NAME>/hello-world

# push the image
$ docker push <REGISTRY_NAME>/hello-world

Step 3: Log in to the registry and push image

In order to push images to the newly created ACR instance, you need to log in to ACR from the Docker CLI. Once logged in, you can push any existing docker image to your ACR instance.

Scenario:

Coho Winery plans to move the application to Azure and continue to support label creation.

LabelMaker app

Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).

You must use Azure Container Registry to publish images that support the AKS deployment.

Reference:

https://thorsten-hans.com/how-to-use-a-private-azure-container-registry-with-kubernetes-9b86e67b93b6

https://docs.microsoft.com/en-us/azure/container-registry/container-registry-tutorial-quick-task


Question #51

You need to access data from the user claim object in the e-commerce web app.

What should you do first?

  • A . Write custom code to make a Microsoft Graph API call from the e-commerce web app.
  • B . Assign the Contributor RBAC role to the e-commerce web app by using the Resource Manager create role assignment API.
  • C . Update the e-commerce web app to read the HTTP request header values.
  • D . Using the Azure CLI, enable Cross-origin resource sharing (CORS) from the e-commerce checkout API to the e-commerce web app.

Correct Answer: C

Explanation:

Methods to Get User Identity and Claims in a .NET Azure Functions App include:

ClaimsPrincipal from the Request Context

The ClaimsPrincipal object is also available as part of the request context and can be extracted from the HttpRequest.HttpContext.

User Claims from the Request Headers.

App Service passes user claims to the app by using special request headers.
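As an illustration, a minimal sketch of reading the injected claim headers in an ASP.NET Core controller (the controller and action are hypothetical; the X-MS-CLIENT-PRINCIPAL-* names are the documented headers):

using Microsoft.AspNetCore.Mvc;

public class ProfileController : Controller
{
    public IActionResult Me()
    {
        // App Service Authentication injects the signed-in user's
        // identity into these special request headers.
        string userName = Request.Headers["X-MS-CLIENT-PRINCIPAL-NAME"];
        string provider = Request.Headers["X-MS-CLIENT-PRINCIPAL-IDP"];
        return Ok(new { userName, provider });
    }
}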

Reference: https://levelup.gitconnected.com/four-alternative-methods-to-get-user-identity-and-claims-in-a-net-azurefunctions-app-df98c40424bb

Question #52

HOTSPOT

You need to configure Azure Cosmos DB.

Which settings should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Strong

When the consistency level is set to strong, the staleness window is equivalent to zero, and the clients are guaranteed to read the latest committed value of the write operation.

Scenario: Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

Note: You can choose from five well-defined models on the consistency spectrum. From strongest to weakest, the models are: Strong, Bounded staleness, Session, Consistent prefix, Eventual.

Box 2: SQL

Scenario: You identify the following requirements for data management and manipulation:

Order data is stored as nonrelational JSON and must be queried using Structured Query Language (SQL).


Question #53

HOTSPOT

You need to retrieve all order line items from Order.json and sort the data alphabetically by the city.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: orders o

Scenario: Order data is stored as nonrelational JSON and must be queried using SQL.

Box 2: li

Box 3: o.line_items

Box 4: o.city

The city field is in the order document, not in the line items.
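Assembling the boxes as listed yields a query of the following shape (this is a reconstruction of the answer above, not a verified query; the projected fields are not shown in the exhibit):

SELECT li
FROM orders o
JOIN li IN o.line_items
ORDER BY o.city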


Question #54

Topic 7, VanArsdel. Ltd

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Background

VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

Current environment

Requirements

The application components must meet the following requirements:

Corporate website

• Secure the website by using SSL

• Minimize costs for data storage and hosting.

• Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).

• Distribute the website content globally for local use.

• Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.

• The website must have 99.95 percent uptime.

Corporate website

The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

Retail store locations

• Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.

• Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

Delivery services

• Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.

• Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

Inventory services

The company has contracted a third party to develop an API for inventory processing that requires read-only access to a specific blob within the retail store storage account for three months.

Security

• All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.

• Authentication and authorization must use Azure AD and services must use managed identities where possible.

Retail Store Locations

• You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.

• Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

HOTSPOT

You need to implement the retail store location Azure Function.

How should you configure the solution? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Scenario: Retail store locations: Azure Functions must process data immediately when data is uploaded to Blob storage.

Box 1: HTTP

Binding configuration example: https://<storage_account_name>.blob.core.windows.net

Box 2: Input

Read blob storage data in a function: Input binding

Box 3: Blob storage

The Blob storage trigger starts a function when a new or updated blob is detected. Azure Functions integrates with Azure Storage via triggers and bindings. Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-trigger


Question #55

You need to secure the Azure Functions to meet the security requirements.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Store the RSA-HSM key in Azure Cosmos DB. Apply the built-in policies for customer-managed keys and allowed locations.
  • B . Create a free tier Azure App Configuration instance with a new Azure AD service principal.
  • C . Store the RSA-HSM key in Azure Key Vault with soft-delete and purge-protection features enabled.
  • D . Store the RSA-HSM key in Azure Blob storage with an Immutability policy applied to the container.
  • E . Create a standard tier Azure App Configuration instance with an assigned Azure AD managed identity.

Correct Answer: CE

Explanation:

Scenario: All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.

Microsoft Azure Key Vault is a cloud-hosted management service that allows users to encrypt keys and small secrets by using keys that are protected by hardware security modules (HSMs).

You need to create a managed identity for your application.

Reference: https://docs.microsoft.com/en-us/azure/app-service/app-service-key-vault-references

Question #56

You need to audit the retail store sales transactions.

What are two possible ways to achieve the goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

  • A . Update the retail store location data upload process to include blob index tags. Create an Azure Function to process the blob index tags and filter by store location.
  • B . Enable blob versioning for the storage account. Use an Azure Function to process a list of the blob versions per day.
  • C . Process an Azure Storage blob inventory report by using an Azure Function. Create rule filters on the blob inventory report.
  • D . Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter the events by store location.
  • E . Process the change feed logs of the Azure Blob storage account by using an Azure Function. Specify a time range for the change feed data.

Correct Answer: DE

Explanation:

Scenario: Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

"Process the change feed logs of the Azure Blob storage account by using an Azure Function. Specify a time range for the change feed data": Change feed support is well-suited for scenarios that process data based on objects that have changed. For example, applications can:

Store, audit, and analyze changes to your objects, over any period of time, for security, compliance or intelligence for enterprise data management.

"Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter the events by store location": Azure Storage events allow applications to react to events, such as the creation and deletion of blobs. It does so without the need for complicated code or expensive and inefficient polling services. The best part is you only pay for what you use.

Blob storage events are pushed using Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even to your own http listener. Event Grid provides reliable event delivery to your applications through rich retry policies and dead-lettering.

Reference:

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview

Question #57

You need to implement a solution to resolve the retail store location data issue.

Which three Azure Blob features should you enable? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Immutability
  • B . Snapshots
  • C . Versioning
  • D . Soft delete
  • E . Object replication
  • F . Change feed

Correct Answer: CDF

Explanation:

Scenario: You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.

Before you enable and configure point-in-time restore, enable its prerequisites for the storage account: soft delete, change feed, and blob versioning.

Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/point-in-time-restore-manage

Question #58

HOTSPOT

You need to reliably identify the delivery driver profile information.

How should you configure the system? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Correct Answer:


Question #59

HOTSPOT

You need to implement event routing for retail store location data.

Which configuration should you use?

Correct Answer:


Question #60

HOTSPOT

You need to implement the delivery service telemetry data.

How should you configure the solution? NOTE: Each correct selection is worth one point.

Correct Answer:


Question #61

You need to reduce read latency for the retail store solution.

What are two possible ways to achieve the goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

  • A . Create a new composite index for the store location data queries in Azure Cosmos DB. Modify the queries to support parameterized SQL and update the Azure Function app to call the new queries.
  • B . Configure Azure Cosmos DB consistency to strong consistency. Increase the RUs for the container supporting store location data.
  • C . Provision an Azure Cosmos DB dedicated gateway. Update blob storage to use the new dedicated gateway endpoint.
  • D . Configure Azure Cosmos DB consistency to session consistency. Cache session tokens in a new Azure Redis cache instance after every write. Update reads to use the session token stored in Azure Redis.
  • E . Provision an Azure Cosmos DB dedicated gateway. Update the Azure Function app connection string to use the new dedicated gateway endpoint.

Correct Answer: AC

Question #62

HOTSPOT

You need to implement the corporate website.

How should you configure the solution?

Correct Answer:


Question #63

You need to test the availability of the corporate website.

Which two test types can you use?

  • A . Custom testing using the Track Availability API method
  • B . Standard
  • C . URL Ping
  • D . Multi-step

Correct Answer: AB

Question #64

You need to secure the Azure Functions to meet the security requirements.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Store the RSA-HSM key in Azure Key Vault with soft-delete and purge-protection features enabled
  • B . Store the RSA-HSM key in Azure Blob storage with an immutability policy applied to the container.
  • C . Store the RSA-HSM key in Azure Cosmos DB. Apply the built-in policies for customer-managed keys and allowed locations
  • D . Create a standard tier Azure App Configuration instance with an assigned Azure AD managed identity.
  • E . Create a free tier Azure App Configuration instance with a new Azure AD service principal.

Correct Answer: BC

Question #65

You need to grant access to the retail store location data for the inventory service development effort.

What should you use?

  • A . Azure AD access token
  • B . Azure RBAC role
  • C . Azure AD ID token
  • D . Shared access signature (SAS) token
  • E . Azure AD refresh token

Correct Answer: D

Question #66

Topic 8, Misc. Questions

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You develop a software as a service (SaaS) offering to manage photographs. Users upload photos to a web service which then stores the photos in Azure Storage Blob storage. The storage account type is General-purpose V2.

When photos are uploaded, they must be processed to produce and save a mobile-friendly version of the image. The process to produce a mobile-friendly version of the image must start in less than one minute.

You need to design the process that starts the photo processing.

Solution: Convert the Azure Storage account to a BlockBlobStorage storage account.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: B

Explanation:

It is not necessary to convert the account; instead, move photo processing to an Azure Function triggered from the blob upload.

Azure Storage events allow applications to react to events. Common Blob storage event scenarios include image or video processing, search indexing, or any file-oriented workflow.

Note: Only storage accounts of kind StorageV2 (general purpose v2) and BlobStorage support event integration. Storage (general purpose v1) does not support integration with Event Grid.

Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview

Question #67

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You develop a software as a service (SaaS) offering to manage photographs. Users upload photos to a web service which then stores the photos in Azure Storage Blob storage. The storage account type is General-purpose V2.

When photos are uploaded, they must be processed to produce and save a mobile-friendly version of the image. The process to produce a mobile-friendly version of the image must start in less than one minute.

You need to design the process that starts the photo processing.

Solution: Move photo processing to an Azure Function triggered from the blob upload.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: A

Explanation:

Azure Storage events allow applications to react to events. Common Blob storage event scenarios include image or video processing, search indexing, or any file-oriented workflow.

Events are pushed using Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even to your own http listener.

Note: Only storage accounts of kind StorageV2 (general purpose v2) and BlobStorage support event integration. Storage (general purpose v1) does not support integration with Event Grid.
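For illustration, a minimal sketch of such a function (the "photos" container name and function name are assumptions):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class PhotoProcessor
{
    // Runs when a new blob appears in the "photos" container; the
    // mobile-friendly version of the image would be produced here.
    [FunctionName("ProcessPhoto")]
    public static void Run(
        [BlobTrigger("photos/{name}")] Stream photo,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing photo {name}");
    }
}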

Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview

Question #68

You are developing an application that uses Azure Blob storage.

The application must read the transaction logs of all the changes that occur to the blobs and the blob metadata in the storage account for auditing purposes. The changes must be in the order in which they occurred, include only create, update, delete, and copy operations and be retained for compliance reasons.

You need to process the transaction logs asynchronously.

What should you do?

  • A . Process all Azure Blob storage events by using Azure Event Grid with a subscriber Azure Function app.
  • B . Enable the change feed on the storage account and process all changes for available events.
  • C . Process all Azure Storage Analytics logs for successful blob events.
  • D . Use the Azure Monitor HTTP Data Collector API and scan the request body for successful blob events.

Correct Answer: B

Explanation:

Change feed support in Azure Blob Storage

The purpose of the change feed is to provide transaction logs of all the changes that occur to the blobs and the blob metadata in your storage account. The change feed provides an ordered, guaranteed, durable, immutable, read-only log of these changes. Client applications can read these logs at any time, either in streaming or in batch mode. The change feed enables you to build efficient and scalable solutions that process change events that occur in your Blob Storage account at a low cost.
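As an illustration, a minimal sketch of reading the feed with the Azure.Storage.Blobs.ChangeFeed package (the connectionString variable is assumed):

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.ChangeFeed;

// Iterate the ordered change events recorded for the storage account.
var serviceClient = new BlobServiceClient(connectionString);
BlobChangeFeedClient changeFeed = serviceClient.GetChangeFeedClient();

await foreach (BlobChangeFeedEvent changeEvent in changeFeed.GetEventsAsync())
{
    Console.WriteLine($"{changeEvent.EventTime}: {changeEvent.EventType} {changeEvent.Subject}");
}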

Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed

Question #69

DRAG DROP

You are developing an application to use Azure Blob storage. You have configured Azure Blob storage to include change feeds.

A copy of your storage account must be created in another region. Data must be copied from the current storage account to the new storage account directly between the storage servers.

You need to create a copy of the storage account in another region and copy the data.

In which order should you perform the actions? To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

https://docs.microsoft.com/en-us/azure/storage/common/storage-account-move?tabs=azure-portal#modify-the-template


Question #70

HOTSPOT

You are developing an ASP.NET Core web application. You plan to deploy the application to Azure Web App for Containers.

The application needs to store runtime diagnostic data that must be persisted across application restarts. You have the following code:

You need to configure the application settings so that diagnostic data is stored as required.

How should you configure the web app’s settings? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: WEBSITES_ENABLE_APP_SERVICE_STORAGE

If the WEBSITES_ENABLE_APP_SERVICE_STORAGE setting is unspecified or set to true, the /home/ directory is shared across scale instances, and files written there persist across restarts.

Box 2: /home

Reference: https://docs.microsoft.com/en-us/azure/app-service/containers/app-service-linux-faq


Question #71

You are developing a web app that is protected by Azure Web Application Firewall (WAF). All traffic to the web app is routed through an Azure Application Gateway instance that is used by multiple web apps. The web app address is contoso.azurewebsites.net.

All traffic must be secured with SSL.

You need to configure the Azure Application Gateway for the app.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . In the Azure Application Gateway’s HTTP setting, enable the Use for App service setting.
  • B . Convert the web app to run in an Azure App service environment (ASE).
  • C . Add an authentication certificate for contoso.azurewebsites.net to the Azure Application gateway.
  • D . In the Azure Application Gateway’s HTTP setting, set the value of the Override backend path option to contoso22.azurewebsites.net.

Correct Answer: AD

Explanation:

D: The ability to specify a host override is defined in the HTTP settings and can be applied to any back-end pool during rule creation.

The ability to derive the host name from the IP or FQDN of the back-end pool members. HTTP settings also provide an option to dynamically pick the host name from a back-end pool member’s FQDN if configured with the option to derive host name from an individual back-end pool member.

A (not C): SSL termination and end to end SSL with multi-tenant services.

In case of end to end SSL, trusted Azure services such as Azure App service web apps do not require whitelisting the backends in the application gateway. Therefore, there is no need to add any authentication certificates.

Reference: https://docs.microsoft.com/en-us/azure/application-gateway/application-gateway-web-app-overview


Question #72

HOTSPOT

You are implementing a software as a service (SaaS) ASP.NET Core web service that will run as an Azure Web App. The web service will use an on-premises SQL Server database for storage. The web service also includes a WebJob that processes data updates.

Four customers will use the web service.

✑ Each instance of the WebJob processes data for a single customer and must run as a singleton instance.

✑ Each deployment must be tested by using deployment slots prior to serving production data.

✑ Azure costs must be minimized.

✑ Azure resources must be located in an isolated network.

You need to configure the App Service plan for the Web App.

How should you configure the App Service plan? To answer, select the appropriate settings in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Number of VM instances: 4

You are not charged extra for deployment slots.

Pricing tier: Isolated

The App Service Environment (ASE) is a powerful feature offering of the Azure App Service that gives network isolation and improved scale capabilities. It is essentially a deployment of the Azure App Service into a subnet of a customer’s Azure Virtual Network (VNet).

References: https://azure.microsoft.com/sv-se/blog/announcing-app-service-isolated-more-power-scale-and-ease-of-use/


Question #73

DRAG DROP

You are a developer for a software as a service (SaaS) company that uses an Azure Function to process orders. The Azure Function currently runs on an Azure Function app that is triggered by an Azure Storage queue.

You are preparing to migrate the Azure Function to Kubernetes using Kubernetes-based Event Driven Autoscaling (KEDA).

You need to configure Kubernetes Custom Resource Definitions (CRD) for the Azure Function.

Which CRDs should you configure? To answer, drag the appropriate CRD types to the correct locations. Each CRD type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Deployment

To deploy Azure Functions to Kubernetes, use the func kubernetes deploy command. The command has several attributes that directly control how our app scales once it is deployed to Kubernetes.

Box 2: ScaledObject

With --polling-interval, we can control the interval used by KEDA to check the Azure Service Bus queue for messages.

Example of ScaledObject with polling interval

apiVersion: keda.k8s.io/v1alpha1
kind: ScaledObject
metadata:
  name: transformer-fn
  namespace: tt
  labels:
    deploymentName: transformer-fn
spec:
  scaleTargetRef:
    deploymentName: transformer-fn
  pollingInterval: 5
  minReplicaCount: 0
  maxReplicaCount: 100

Box 3: Secret

Store connection strings in Kubernetes Secrets.

Example: to create the Secret in our demo Namespace:

# create the k8s demo namespace
kubectl create namespace tt

# grab connection string from Azure Service Bus
KEDA_SCALER_CONNECTION_STRING=$(az servicebus queue authorization-rule keys list \
  -g $RG_NAME \
  --namespace-name $SBN_NAME \
  --queue-name inbound \
  -n keda-scaler \
  --query "primaryConnectionString" \
  -o tsv)

# create the kubernetes secret
kubectl create secret generic tt-keda-auth \
  --from-literal KedaScaler=$KEDA_SCALER_CONNECTION_STRING \
  --namespace tt

Reference: https://www.thinktecture.com/en/kubernetes/serverless-workloads-with-keda/


Question #74

HOTSPOT

You are creating a CLI script that creates an Azure web app and related services in Azure App Service.

The web app uses the following variables:

You need to automatically deploy code from GitHub to the newly created web app.

How should you complete the script? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: az appservice plan create

The az group create command returns a JSON result. Now we can use the resource group to create an Azure App Service plan.

Box 2: az webapp create

Create a new web app..

Box 3: --plan $webappname

with the serviceplan we created in step 1.

Box 4: az webapp deployment

Continuous Delivery with GitHub. Example:

az webapp deployment source config --name firstsamplewebsite1 --resource-group websites --repo-url $gitrepo --branch master --git-token $token

Box 5: --repo-url $gitrepo --branch master --manual-integration

Reference: https://medium.com/@satish1v/devops-your-way-to-azure-web-apps-with-azure-cli-206ed4b3e9b1


Question #75

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You develop a software as a service (SaaS) offering to manage photographs. Users upload photos to a web service which then stores the photos in Azure Storage Blob storage. The storage account type is General-purpose V2.

When photos are uploaded, they must be processed to produce and save a mobile-friendly version of the image. The process to produce a mobile-friendly version of the image must start in less than one minute.

You need to design the process that starts the photo processing.

Solution: Trigger the photo processing from Blob storage events.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: B

Explanation:

You need to catch the triggered event, so move the photo processing to an Azure Function triggered from the blob upload

Note: Azure Storage events allow applications to react to events. Common Blob storage event scenarios include image or video processing, search indexing, or any file-oriented workflow.

Events are pushed using Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even to your own http listener.

Note: Only storage accounts of kind StorageV2 (general purpose v2) and BlobStorage support event integration. Storage (general purpose v1) does not support integration with Event Grid.

Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview

Question #76

HOTSPOT

You are developing a ticket reservation system for an airline.

The storage solution for the application must meet the following requirements:

✑ Ensure at least 99.99% availability and provide low latency.

✑ Accept reservations even when localized network outages or other unforeseen failures occur.

✑ Process reservations in the exact sequence in which they are submitted to minimize overbooking or selling the same seat to multiple travelers.

✑ Allow simultaneous and out-of-order reservations with a maximum five-second tolerance window.

You provision a resource group named airlineResourceGroup in the Azure South-Central US region.

You need to provision a SQL API Cosmos DB account to support the app.

How should you complete the Azure CLI commands? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: BoundedStaleness

Bounded staleness: The reads are guaranteed to honor the consistent-prefix guarantee. The reads might lag behind writes by at most "K" versions (that is, "updates") of an item or by "T" time interval. In other words, when you choose bounded staleness, the "staleness" can be configured in two ways:

The number of versions (K) of the item

The time interval (T) by which the reads might lag behind the writes

Incorrect Answers:

Strong

Strong consistency offers a linearizability guarantee. Linearizability refers to serving requests concurrently. The reads are guaranteed to return the most recent committed version of an item. A client never sees an uncommitted or partial write. Users are always guaranteed to read the latest committed write.

Box 2: –enable-automatic-failover true

For multi-region Cosmos accounts that are configured with a single-write region, enable automatic-failover by using Azure CLI or Azure portal. After you enable automatic failover, whenever there is a regional disaster, Cosmos DB will automatically failover your account.
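A hedged sketch of the completed command follows; the account name and secondary region are assumptions, and the five-second tolerance window maps to --max-interval:

az cosmosdb create \
  --name airlinecosmosaccount \
  --resource-group airlineResourceGroup \
  --default-consistency-level BoundedStaleness \
  --max-interval 5 \
  --max-staleness-prefix 100 \
  --enable-automatic-failover true \
  --locations regionName=southcentralus failoverPriority=0 isZoneRedundant=False \
  --locations regionName=northcentralus failoverPriority=1 isZoneRedundant=False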


Question #77

You develop Azure solutions.

You must connect to a No-SQL globally-distributed database by using the .NET API.

You need to create an object to configure and execute requests in the database.

Which code segment should you use?

  • A . new Container(EndpointUri, PrimaryKey);
  • B . new Database(Endpoint, PrimaryKey);
  • C . new CosmosClient(EndpointUri, PrimaryKey);

Correct Answer: C

Explanation:

Example:

// Create a new instance of the Cosmos Client

this.cosmosClient = new CosmosClient(EndpointUri, PrimaryKey);

//ADD THIS PART TO YOUR CODE

await this.CreateDatabaseAsync();

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql-api-get-started

Question #78

DRAG DROP

You are developing a new page for a website that uses Azure Cosmos DB for data storage.

The feature uses documents that have the following format:

You must display data for the new page in a specific order.

You create the following query for the page:

You need to configure a Cosmos DB indexing policy to support the query.

How should you configure the policy? To answer, drag the appropriate JSON segments to the correct locations. Each JSON segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: compositeIndexes

You can order by multiple properties. A query that orders by multiple properties requires a composite index.

Box 2: descending

Example: Composite index defined for (name ASC, age ASC):

It is optional to specify the order. If not specified, the order is ascending.

{
  "automatic": true,
  "indexingMode": "consistent",
  "includedPaths": [
    { "path": "/*" }
  ],
  "excludedPaths": [],
  "compositeIndexes": [
    [
      { "path": "/name" },
      { "path": "/age" }
    ]
  ]
}
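Since Box 2 calls for a descending sort, the relevant entry would carry an explicit order field. The paths below are illustrative only; the real ones come from the document format shown in the exhibit:

"compositeIndexes": [
  [
    { "path": "/name", "order": "ascending" },
    { "path": "/age", "order": "descending" }
  ]
]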


Question #79

HOTSPOT

You are building a traffic monitoring system that monitors traffic along six highways. The system produces time series analysis-based reports for each highway. Data from traffic sensors are stored in Azure Event Hub.

Traffic data is consumed by four departments. Each department has an Azure Web App that displays the time-series-based reports and contains a WebJob that processes the incoming data from Event Hub. All Web Apps run on App Service Plans with three instances.

Data throughput must be maximized. Latency must be minimized.

You need to implement the Azure Event Hub.

Which settings should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: 6

The number of partitions is specified at creation and must be between 2 and 32.

There are 6 highways.

Box 2: Highway

Reference: https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-features
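A minimal sketch of creating the hub with one partition per highway; the resource group and namespace names are assumptions:

az eventhubs eventhub create \
  --resource-group myResourceGroup \
  --namespace-name trafficNamespace \
  --name traffic-hub \
  --partition-count 6

The partition key itself (Highway) is supplied by the sender on each event, not at hub creation time.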


Question #80

DRAG DROP

You are developing a microservices solution. You plan to deploy the solution to a multinode Azure Kubernetes Service (AKS) cluster.

You need to deploy a solution that includes the following features:

✑ reverse proxy capabilities

✑ configurable traffic routing

✑ TLS termination with a custom certificate

Which components should you use? To answer, drag the appropriate components to the correct requirements. Each component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Helm

To create the ingress controller, use Helm to install nginx-ingress.

Box 2: kubectl

To find the cluster IP address of a Kubernetes pod, use the kubectl get pod command on your local machine, with the option -o wide.

Box 3: Ingress Controller

An ingress controller is a piece of software that provides reverse proxy, configurable traffic routing, and TLS termination for Kubernetes services. Kubernetes ingress resources are used to configure the ingress rules and routes for individual Kubernetes services.

Incorrect Answers:

Virtual Kubelet: Virtual Kubelet is an open-source Kubernetes kubelet implementation that masquerades as a kubelet. This allows Kubernetes nodes to be backed by Virtual Kubelet providers such as serverless cloud container platforms.

CoreDNS: CoreDNS is a flexible, extensible DNS server that can serve as the Kubernetes cluster DNS.

Like Kubernetes, the CoreDNS project is hosted by the CNCF.

Reference:

https://docs.microsoft.com/bs-cyrl-ba/azure/aks/ingress-basic

https://www.digitalocean.com/community/tutorials/how-to-inspect-kubernetes-networking
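A hedged sketch of the two tooling steps; the namespace and release names are assumptions:

# install the NGINX ingress controller with Helm
helm repo add ingress-nginx https://kubernetes.github.io/ingress-nginx
helm repo update
helm install nginx-ingress ingress-nginx/ingress-nginx --namespace ingress-basic --create-namespace

# inspect the controller pods and their cluster IPs with kubectl
kubectl get pods --namespace ingress-basic -o wide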


Question #81

Your company is developing an Azure API.

You need to implement authentication for the Azure API.

You have the following requirements:

✑ All API calls must be secure.

✑ Callers to the API must not send credentials to the API.

Which authentication mechanism should you use?

  • A . Basic
  • B . Anonymous
  • C . Managed identity
  • D . Client certificate

Correct Answer: C

Explanation:

Use the authentication-managed-identity policy to authenticate with a backend service using the managed identity of the API Management service. This policy essentially uses the managed identity to obtain an access token from Azure Active Directory for accessing the specified resource. After successfully obtaining the token, the policy will set the value of the token in the Authorization header using the Bearer scheme.

Reference: https://docs.microsoft.com/bs-cyrl-ba/azure/api-management/api-management-authentication-policies
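For reference, the policy is a single element in the inbound section; the resource value (the backend's app ID URI) below is a placeholder:

<inbound>
    <authentication-managed-identity resource="https://myapi.contoso.com" />
</inbound>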

Question #82

You are a developer for a SaaS company that offers many web services.

All web services for the company must meet the following requirements:

✑ Use API Management to access the services

✑ Use OpenID Connect for authentication

✑ Prevent anonymous usage

A recent security audit found that several web services can be called without any authentication.

Which API Management policy should you implement?

  • A . jsonp
  • B . authentication-certificate
  • C . check-header
  • D . validate-jwt

Correct Answer: D

Explanation:

Add the validate-jwt policy to validate the OAuth token for every incoming request.

Incorrect Answers:

A: The jsonp policy adds JSON with padding (JSONP) support to an operation or an API to allow cross-domain calls from JavaScript browser-based clients. JSONP is a method used in JavaScript programs to request data from a server in a different domain. JSONP bypasses the limitation enforced by most web browsers where access to web pages must be in the same domain.

Reference: https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad
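A minimal sketch of the policy; the tenant ID and audience values are placeholders:

<inbound>
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized">
        <openid-config url="https://login.microsoftonline.com/{tenant-id}/.well-known/openid-configuration" />
        <required-claims>
            <claim name="aud">
                <value>{backend-api-application-id}</value>
            </claim>
        </required-claims>
    </validate-jwt>
</inbound>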

Question #83

DRAG DROP

Contoso, Ltd. provides an API to customers by using Azure API Management (APIM). The API authorizes users with a JWT token.

You must implement response caching for the APIM gateway. The caching mechanism must detect the user ID of the client that accesses data for a given location and cache the response for that user ID.

You need to add the following policies to the policies file:

• a set-variable policy to store the detected user identity

• a cache-lookup-value policy

• a cache-store-value policy

• a find-and-replace policy to update the response body with the user profile information

To which policy section should you add the policies? To answer, drag the appropriate sections to the correct policies. Each section may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content

NOTE: Each correct selection is worth one point

Correct Answer:

Explanation:

Box 1: Inbound.

A set-variable policy to store the detected user identity.

Example:

<policies>

<inbound>

<!-- How you determine user identity is application dependent -->

<set-variable

name="enduserid"

value="@(context.Request.Headers.GetValueOrDefault("Authorization","").Split(‘ ‘)[1].AsJwt()?.Subject)" />

Box 2: Inbound

A cache-lookup-value policy

Example:

<inbound>

<base />

<cache-lookup vary-by-developer="true | false" vary-by-developer-groups="true | false" downstream-caching-type="none | private | public" must-revalidate="true | false">

<vary-by-query-parameter>parameter name</vary-by-query-parameter> <!-- optional, can be repeated several times -->

</cache-lookup>

</inbound>

Box 3: Outbound

A cache-store-value policy.

Example:

<outbound>

<base />

<cache-store duration="3600" />

</outbound>

Box 4: Outbound

A find-and-replace policy to update the response body with the user profile information.

Example:

<outbound>

<!-- Update response body with user profile -->

<find-and-replace

from=’"$userprofile$"’

to="@((string)context.Variables["userprofile"])" />

<base />

</outbound>

Reference:

https://docs.microsoft.com/en-us/azure/api-management/api-management-caching-policies

https://docs.microsoft.com/en-us/azure/api-management/api-management-sample-cache-by-key


Question #84

DRAG DROP

You develop a web application.

You need to register the application with an active Azure Active Directory (Azure AD) tenant.

Which three actions should you perform in sequence? To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

Register a new application using the Azure portal

✑ Sign in to the Azure portal using either a work or school account or a personal Microsoft account.

✑ If your account gives you access to more than one tenant, select your account in the upper right corner. Set your portal session to the Azure AD tenant that you want.

✑ Search for and select Azure Active Directory. Under Manage, select App registrations.

✑ Select New registration. (Step 1)

✑ In Register an application, enter a meaningful application name to display to users.

✑ Specify who can use the application. Select the Azure AD instance. (Step 2)

✑ Under Redirect URI (optional), select the type of app you’re building: Web or Public client (mobile & desktop). Then enter the redirect URI, or reply URL, for your application. (Step 3)

✑ When finished, select Register.
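The same registration can be scripted. A hedged CLI sketch, using flag names from recent az versions; the display name and redirect URI are placeholders:

az ad app create \
  --display-name "my-web-app" \
  --web-redirect-uris "https://localhost:5001/signin-oidc" \
  --sign-in-audience AzureADMyOrg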


Question #85

You are developing an internal website for employees to view sensitive data. The website uses Azure Active Directory (AAD) for authentication. You need to implement multifactor authentication for the website.

What should you do? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . In Azure AD, create a new conditional access policy.
  • B . In Azure AD, enable application proxy.
  • C . Configure the website to use Azure AD B2C.
  • D . In Azure AD conditional access, enable the baseline policy.
  • E . Upgrade to Azure AD Premium.

Correct Answer: A, E

Explanation:

Reference: https://docs.microsoft.com/en-us/azure/active-directory/authentication/howto-mfa-getstarted

Question #86

DRAG DROP

You are developing an application. You have an Azure user account that has access to two subscriptions.

You need to retrieve a storage account key secret from Azure Key Vault.

In which order should you arrange the PowerShell commands to develop the solution? To answer, move all commands from the list of commands to the answer area and arrange them in the correct order.

Correct Answer:

Explanation:

Step 1: Get-AzSubscription

If you have multiple subscriptions, you might have to specify the one that was used to create your key vault. Enter the following to see the subscriptions for your account: Get-AzSubscription

Step 2: Set-AzContext -SubscriptionId

To specify the subscription that’s associated with the key vault, enter:

Set-AzContext -SubscriptionId <subscriptionID>

Step 3: Get-AzStorageAccountKey

You must get that storage account key.

Step 4: $secretvalue = ConvertTo-SecureString <storageAccountKey> -AsPlainText -Force

Set-AzKeyVaultSecret -VaultName <vaultName> -Name <secretName> -SecretValue $secretvalue

After retrieving your secret (in this case, your storage account key), you must convert that key to a secure string, and then create a secret with that value in your key vault.

Step 5: Get-AzKeyVaultSecret

Next, get the URI for the secret you created. You’ll need this URI in a later step to call the key vault and retrieve your secret. Run the following PowerShell command and make note of the ID value, which is the secret’s URI:

Get-AzKeyVaultSecret -VaultName <vaultName>


Question #87

You are developing an ASP.NET Core Web API web service. The web service uses Azure Application Insights for all telemetry and dependency tracking. The web service reads and writes data to a database other than Microsoft SQL Server.

You need to ensure that dependency tracking works for calls to the third-party database.

Which two Dependency Telemetry properties should you store in the database? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A . Telemetry.Context.Operation.Id
  • B . Telemetry.Context.Cloud.RoleInstance
  • C . Telemetry.Id
  • D . Telemetry.Context.Session.Id
  • E . Telemetry.Name

Correct Answer: A, C

Explanation:

Example:

public async Task Enqueue(string payload)

{

// StartOperation is a helper method that initializes the telemetry item

// and allows correlation of this operation with its parent and children.

var operation = telemetryClient.StartOperation<DependencyTelemetry>("enqueue " + queueName);

operation.Telemetry.Type = "Azure Service Bus";

operation.Telemetry.Data = "Enqueue " + queueName;

var message = new BrokeredMessage(payload);

// Service Bus queue allows the property bag to pass along with the message.

// We will use them to pass our correlation identifiers (and other context)

// to the consumer.

message.Properties.Add("ParentId", operation.Telemetry.Id);

message.Properties.Add("RootId", operation.Telemetry.Context.Operation.Id);

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/app/custom-operations-tracking

Question #88

HOTSPOT

You are using Azure Front Door Service.

You are expecting inbound files to be compressed by using Brotli compression. You discover that inbound XML files are not compressed. The files are 9 megabytes (MB) in size.

You need to determine the root cause for the issue.

To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: No

Front Door can dynamically compress content on the edge, resulting in a smaller and faster response to your clients. To be eligible for compression, a file must be of a MIME type on the compression-eligible list and must be between 1 KB and 8 MB in size; the 9-MB XML files exceed that limit.

Box 2: No

Sometimes you may wish to purge cached content from all edge nodes and force them all to retrieve new updated assets. This might be due to updates to your web application, or to quickly update assets that contain incorrect information.

Box 3: Yes

These profiles support the following compression encodings: Gzip (GNU zip), Brotli

Reference: https://docs.microsoft.com/en-us/azure/frontdoor/front-door-caching


Question #89

HOTSPOT

You are developing an Azure App Service hosted ASP.NET Core web app to deliver video on-demand streaming media. You enable an Azure Content Delivery Network (CDN) Standard for the web endpoint. Customer videos are downloaded from the web app by using the following example URL: http://www.contoso.com/content.mp4?quality=1

All media content must expire from the cache after one hour. Customer videos with varying quality must be delivered to the closest regional point of presence (POP) node.

You need to configure Azure CDN caching rules.

Which options should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Override

Override: Ignore origin-provided cache duration; use the provided cache duration instead.

This will not override cache-control: no-cache.

Set if missing: Honor origin-provided cache-directive headers, if they exist; otherwise, use the provided cache duration.

Incorrect:

Bypass cache: Do not cache and ignore origin-provided cache-directive headers.

Box 2: 1 hour

All media content must expire from the cache after one hour.

Box 3: Cache every unique URL

Cache every unique URL: In this mode, each request with a unique URL, including the query string, is treated as a unique asset with its own cache. For example, the response from the origin server for a request for example.ashx?q=test1 is cached at the POP node and returned for subsequent requests with the same query string. A request for example.ashx?q=test2 is cached as a separate asset with its own time-to-live setting.

Incorrect Answers:

Bypass caching for query strings: In this mode, requests with query strings are not cached at the CDN POP node. The POP node retrieves the asset directly from the origin server and passes it to the requestor with each request.

Ignore query strings: Default mode. In this mode, the CDN point-of-presence (POP) node passes the query strings from the requestor to the origin server on the first request and caches the asset. All subsequent requests for the asset that are served from the POP ignore the query strings until the cached asset expires.

Reference: https://docs.microsoft.com/en-us/azure/cdn/cdn-query-string
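For the query-string behavior, an equivalent CLI sketch; the profile and endpoint names are placeholders, and the one-hour override itself is configured as a caching rule on the endpoint:

az cdn endpoint update --resource-group myResourceGroup \
  --profile-name myCdnProfile --name myEndpoint \
  --query-string-caching-behavior UseQueryString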


Question #90

DRAG DROP

You develop a web app that uses a D1-tier App Service plan by using the Web Apps feature of Microsoft Azure App Service.

Spikes in traffic have caused increases in page load times.

You need to ensure that the web app automatically scales when CPU load is about 85 percent and minimize costs.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.

Correct Answer:

Explanation:

Step 1: Configure the web app to the Standard App Service Tier

The Standard tier supports auto-scaling, and we should minimize the cost.

Step 2: Enable autoscaling on the web app

First enable autoscale

Step 3: Add a scale rule

Step 4: Add a Scale condition

Reference: https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-autoscale-get-started
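A hedged CLI sketch of the sequence; the resource names are placeholders, the metric name CpuPercentage is the assumed App Service plan CPU metric, and the portal flow is equivalent:

# move the plan to a tier that supports autoscale (Standard S1 is the cheapest)
az appservice plan update --name myPlan --resource-group myResourceGroup --sku S1

# enable autoscale on the plan
az monitor autoscale create --resource-group myResourceGroup \
  --resource myPlan --resource-type Microsoft.Web/serverfarms \
  --name myAutoscale --min-count 1 --max-count 3 --count 1

# scale out by one instance when average CPU exceeds 85 percent
az monitor autoscale rule create --resource-group myResourceGroup \
  --autoscale-name myAutoscale \
  --condition "CpuPercentage > 85 avg 5m" \
  --scale out 1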


Question #91

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing an Azure Service application that processes queue data when it receives a message from a mobile application. Messages may not be sent to the service consistently.

You have the following requirements:

✑ Queue size must not grow larger than 80 gigabytes (GB).

✑ Use first-in-first-out (FIFO) ordering of messages.

✑ Minimize Azure costs.

You need to implement the messaging solution.

Solution: Use the .Net API to add a message to an Azure Service Bus Queue from the mobile application. Create an Azure Function App that uses an Azure Service Bus Queue trigger.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: A

Explanation:

You can create a function that is triggered when messages are submitted to an Azure Service Bus queue. Service Bus queues provide first-in-first-out (FIFO) ordering and support a maximum queue size of 80 GB, which matches the requirements.

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-trigger

Question #92

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store location has one to five devices that send data.

You must store the device data in Azure Blob storage. Device data must be correlated based on a device identifier. Additional stores are expected to open in the future.

You need to implement a solution to receive the device data.

Solution: Provision an Azure Notification Hub. Register all devices with the hub.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: B

Explanation:

Instead, use Azure Service Bus, which is designed for high-value enterprise messaging such as order processing and financial transactions. A notification hub is designed for sending push notifications to mobile devices, not for receiving device telemetry.

Reference: https://docs.microsoft.com/en-us/azure/event-grid/compare-messaging-services

Question #93

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store location has one to five devices that send data.

You must store the device data in Azure Blob storage. Device data must be correlated based on a device identifier. Additional stores are expected to open in the future.

You need to implement a solution to receive the device data.

Solution: Provision an Azure Service Bus. Configure a topic to receive the device data by using a correlation filter.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: A

Explanation:

A message is raw data produced by a service to be consumed or stored elsewhere. Service Bus is designed for high-value enterprise messaging and is used for order processing and financial transactions. A Service Bus topic with a correlation filter routes each device's messages based on the device identifier, which satisfies the correlation requirement.

Reference: https://docs.microsoft.com/en-us/azure/event-grid/compare-messaging-services
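A hedged sketch of the setup; the namespace, topic, and device names are placeholders, and the flag names are from recent az versions:

# topic plus a subscription whose rule uses a correlation filter on the device id
az servicebus topic create --resource-group myResourceGroup \
  --namespace-name myNamespace --name device-data

az servicebus topic subscription create --resource-group myResourceGroup \
  --namespace-name myNamespace --topic-name device-data --name device-001-sub

az servicebus topic subscription rule create --resource-group myResourceGroup \
  --namespace-name myNamespace --topic-name device-data \
  --subscription-name device-001-sub --name device-001-rule \
  --filter-type CorrelationFilter --correlation-id device-001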

Question #94

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store location has one to five devices that send data.

You must store the device data in Azure Blob storage. Device data must be correlated based on a device identifier. Additional stores are expected to open in the future.

You need to implement a solution to receive the device data.

Solution: Provision an Azure Event Grid. Configure event filtering to evaluate the device identifier.

Does the solution meet the goal?

  • A . Yes
  • B . No

Correct Answer: B

Explanation:

Instead, use Azure Service Bus, which is designed for high-value enterprise messaging such as order processing and financial transactions.

Note: An event is a lightweight notification of a condition or a state change. Event Grid is typically used for reacting to status changes rather than for delivering and storing raw device data.

Reference: https://docs.microsoft.com/en-us/azure/event-grid/compare-messaging-services

Question #95

DRAG DROP

You manage several existing Logic Apps.

You need to change definitions, add new logic, and optimize these apps on a regular basis.

What should you use? To answer, drag the appropriate tools to the correct functionalities. Each tool may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Enterprise Integration Pack

After you create an integration account that has partners and agreements, you are ready to create a business to business (B2B) workflow for your logic app with the Enterprise Integration Pack.

Box 2: Code View Editor

To work with logic app definitions in JSON, open the Code View editor when working in the Azure portal or in Visual Studio, or copy the definition into any editor that you want.

Box 3: Logic Apps Designer

You can build your logic apps visually with the Logic Apps Designer, which is available in the Azure portal through your browser and in Visual Studio.

Reference:

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-b2b

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-author-definitions

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-overview


Question #96

A company is developing a solution that allows smart refrigerators to send temperature information to a central location. You have an existing Service Bus.

The solution must receive and store messages until they can be processed. You create an Azure Service Bus instance by providing a name, pricing tier, subscription, resource group, and location.

You need to complete the configuration.

Which Azure CLI or PowerShell command should you run?

A)

B)

C)

D)

  • A . Option A
  • B . Option B
  • C . Option C
  • D . Option D

Correct Answer: A

Explanation:

A service bus instance has already been created (Step 2 below). Next is step 3, Create a Service Bus queue.

Note:

Steps:

Step 1: # Create a resource group

resourceGroupName="myResourceGroup"

az group create --name $resourceGroupName --location eastus

Step 2: # Create a Service Bus messaging namespace with a unique name

namespaceName=myNameSpace$RANDOM

az servicebus namespace create --resource-group $resourceGroupName --name $namespaceName --location eastus

Step 3: # Create a Service Bus queue

az servicebus queue create --resource-group $resourceGroupName --namespace-name $namespaceName --name BasicQueue

Step 4: # Get the connection string for the namespace

connectionString=$(az servicebus namespace authorization-rule keys list --resource-group $resourceGroupName --namespace-name $namespaceName --name RootManageSharedAccessKey --query primaryConnectionString --output tsv)

Reference: https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-quickstart-cli

Question #97

HOTSPOT

You are developing an application that uses Azure Storage Queues.

You have the following code:

For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: No

The QueueDescription.LockDuration property gets or sets the duration of a peek lock; that is, the amount of time that the message is locked for other receivers. The maximum value for LockDuration is 5 minutes; the default value is 1 minute.

Box 2: Yes

You can peek at the message in the front of a queue without removing it from the queue by calling the PeekMessage method.

Box 3: Yes

Reference:

https://docs.microsoft.com/en-us/azure/storage/queues/storage-dotnet-how-to-use-queues

https://docs.microsoft.com/en-us/dotnet/api/microsoft.servicebus.messaging.queuedescription.lockduration


Question #98

HOTSPOT

You are working for Contoso, Ltd.

You define an API Policy object by using the following XML markup:

For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

Correct Answer:

Explanation:

Box 1: Yes

Use the set-backend-service policy to redirect an incoming request to a different backend than the one specified in the API settings for that operation. Syntax: <set-backend-service base-url="base URL of the backend service" />

Box 2: No

The condition is on 512k, not on 256k.

Box 3: No

The set-backend-service policy changes the backend service base URL of the incoming request to the one specified in the policy.

Reference: https://docs.microsoft.com/en-us/azure/api-management/api-management-transformation-policies


Question #99

You are developing a solution that will use Azure messaging services.

You need to ensure that the solution uses a publish-subscribe model and eliminates the need for constant polling.

What are two possible ways to achieve the goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

  • A . Service Bus
  • B . Event Hub
  • C . Event Grid
  • D . Queue

Correct Answer: A, C

Explanation:

It is strongly recommended to use available messaging products and services that support a publish-subscribe model, rather than building your own. In Azure, consider using Service Bus or Event Grid. Other technologies that can be used for pub/sub messaging include Redis, RabbitMQ, and Apache Kafka.

Reference: https://docs.microsoft.com/en-us/azure/architecture/patterns/publisher-subscriber

Question #100

A company is implementing a publish-subscribe (Pub/Sub) messaging component by using Azure Service Bus. You are developing the first subscription application.

In the Azure portal you see that messages are being sent to the subscription for each topic. You create and initialize a subscription client object by supplying the correct details, but the subscription application is still not consuming the messages.

You need to complete the source code of the subscription client

What should you do?

  • A . await subscriptionClient.CloseAsync();
  • B . await subscriptionClient.AddRuleAsync(new RuleDescription(RuleDescription.DefaultRuleName, new TrueFilter()));
  • C . subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);
  • D . subscriptionClient = new SubscriptionClient(ServiceBusConnectionString, TopicName, SubscriptionName);

Correct Answer: C

Explanation:

Using the subscription client, call RegisterMessageHandler, which is used to receive messages continuously from the entity. It registers a message handler and begins a new thread to receive messages. The handler is invoked each time a new message is received by the receiver.

subscriptionClient.RegisterMessageHandler(ReceiveMessagesAsync, messageHandlerOptions);

Reference: https://www.c-sharpcorner.com/article/azure-service-bus-topic-and-subscription-pub-sub/
