What should you include in the solution?

You have an Azure SQL database.

You need to implement a disaster recovery solution that meets the following requirements:

• Minimizes how long it takes to recover the database if a datacenter fails

• Minimizes administrative effort

What should you include in the solution?
A. Azure Backup
B. active geo-replication
C. Azure Site Recovery
D. auto-failover groups

Answer: D

Explanation:

Auto-failover groups build on active geo-replication and add automatic failover to a secondary region, which minimizes both the time needed to recover the database and the administrative effort required. Active geo-replication requires a manual failover, and Azure Site Recovery does not support Azure SQL Database.

What should you include in the recommendation?

You are developing an application that uses Azure Data Lake Storage Gen 2.

You need to recommend a solution to grant permissions to a specific application for a limited time period.

What should you include in the recommendation?
A. role assignments
B. account keys
C. shared access signatures (SAS)
D. Azure Active Directory (Azure AD) identities

Answer: C

Explanation:

A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:

• What resources the client may access.

• What permissions they have to those resources.

• How long the SAS is valid.
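
For illustration only, here is a minimal sketch using the azure-storage-blob Python SDK (an ADLS Gen2 filesystem is exposed as a blob container); the account name, key, container, and blob path are placeholders, not values from the question:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values for illustration only.
account_name = "mydatalakeaccount"
account_key = "<storage-account-key>"
container_name = "raw"                  # the ADLS Gen2 filesystem
blob_name = "sales/2021/data.json"      # the file (blob) to share

# Issue a SAS that grants read-only access and expires after one hour.
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The application uses this URL until the SAS expires.
url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}?{sas_token}"
)
print(url)
```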

Note: Data Lake Storage Gen2 supports the following authorization mechanisms:

✑ Shared Key authorization

✑ Shared access signature (SAS) authorization

✑ Role-based access control (Azure RBAC)

✑ Access control lists (ACL)

Reference: https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview

What should you do?

You are monitoring an Azure Stream Analytics job.

You discover that the Backlogged Input Events metric is increasing slowly and is consistently non-zero.

You need to ensure that the job can handle all the events.

What should you do?
A. Remove any named consumer groups from the connection and use $default.
B. Change the compatibility level of the Stream Analytics job.
C. Create an additional output stream for the existing input stream.
D. Increase the number of streaming units (SUs).

Answer: D

Explanation:

Backlogged Input Events: the number of input events that are backlogged. A non-zero value for this metric implies that your job cannot keep up with the number of incoming events. If this value is slowly increasing or consistently non-zero, you should scale out your job by increasing the number of streaming units (SUs).

Reference: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-monitoring

What should you include in the solution?

You have an Azure SQL database named DB1.

You need to encrypt DB1.

The solution must meet the following requirements:

• Encrypt data in motion.

• Support comparison operators.

• Provide randomized encryption.

What should you include in the solution?
A. Always Encrypted
B. column-level encryption
C. Transparent Data Encryption (TDE)
D. Always Encrypted with secure enclaves

Answer: D

Explanation:

Always Encrypted on its own allows only equality comparisons, and only on columns that use deterministic encryption; randomized encryption blocks computations entirely. Always Encrypted with secure enclaves keeps the data encrypted in transit while enabling richer operations, including comparison operators, on randomized-encrypted columns inside a server-side secure enclave.

Which windowing function should you use to perform the streaming aggregation of the sales data?

Which windowing function should you use to perform the streaming aggregation of the sales data?
A. Sliding
B. Hopping
C. Session
D. Tumbling

Answer: D

Explanation:

Scenario: The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.

Tumbling window functions are used to segment a data stream into distinct, fixed-size time segments and perform a function against them. The key differentiators of a tumbling window are that it repeats, does not overlap, and an event cannot belong to more than one tumbling window.
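
As a rough illustration (plain Python rather than the Stream Analytics query language, with made-up timestamps and amounts), the defining property of a tumbling window, that every event falls into exactly one fixed-size, non-overlapping bucket, can be sketched as follows:

```python
from collections import defaultdict

WINDOW_SECONDS = 10  # fixed, non-overlapping window size

# (event_time_in_seconds, sale_amount) pairs; the values are made up.
events = [(1, 20.0), (4, 35.5), (9, 12.0), (11, 50.0), (19, 8.25), (21, 14.0)]

totals = defaultdict(float)
for ts, amount in events:
    # Integer division assigns each event to exactly one window, so the
    # windows repeat, never overlap, and never share an event.
    window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
    totals[window_start] += amount

for window_start in sorted(totals):
    window_end = window_start + WINDOW_SECONDS
    print(f"[{window_start}, {window_end}) total sales = {totals[window_start]:.2f}")
```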

Reference: https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/stream-analytics/stream-analytics-window-functions.md

Which type of storage accounts should you use for the backups?

You are building a database backup solution for a SQL Server database hosted on an Azure virtual machine.

In the event of an Azure regional outage, you need to be able to restore the database backups. The solution must minimize costs.

Which type of storage accounts should you use for the backups?
A. locally-redundant storage (LRS)
B. read-access geo-redundant storage (RA-GRS)
C. zone-redundant storage (ZRS)
D. geo-redundant storage

Answer: B

Explanation:

Geo-redundant storage (with GRS or GZRS) replicates your data to another physical location in the secondary region to protect against regional outages. However, that data is available to be read only if the customer or Microsoft initiates a failover from the primary to secondary region. When you enable read access to the secondary region, your data is available to be read if the primary region becomes unavailable. For read access to the secondary region, enable read-access geo-redundant storage (RA-GRS) or read-access geo-zone-redundant storage (RA-GZRS).
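
As a hedged sketch of why read access matters (the account name, key, container, and prefix below are placeholders), with RA-GRS the backups remain readable from the account's secondary endpoint, which follows the <account>-secondary naming convention, even while the primary region is down:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder values for illustration only.
account_name = "backupstorageacct"
account_key = "<storage-account-key>"

# RA-GRS exposes a read-only secondary endpoint named "<account>-secondary".
secondary_client = BlobServiceClient(
    account_url=f"https://{account_name}-secondary.blob.core.windows.net",
    credential=account_key,
)

# List the backup blobs that can still be read during a primary-region outage.
container = secondary_client.get_container_client("sql-backups")
for blob in container.list_blobs(name_starts_with="db1/"):
    print(blob.name)
```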

Reference: https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy

Which four actions should you perform in sequence?

DRAG DROP

You need to implement statistics maintenance for SalesSQLDb1. The solution must meet the technical requirements.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Answer:

Explanation:

Automating Azure SQL DB index and statistics maintenance using Azure Automation.

Which feature should you use to provide customers with the required level of access based on their service agreement?

You are evaluating the business goals.

Which feature should you use to provide customers with the required level of access based on their service agreement?
A. dynamic data masking
B. Conditional Access in Azure
C. service principals
D. row-level security (RLS)

Answer: D

Explanation:

Reference: https://docs.microsoft.com/en-us/sql/relational-databases/security/row-level-security?view=sql-server-ver15

Which audit log destination should you use to meet the monitoring requirements?

Which audit log destination should you use to meet the monitoring requirements?
A. Azure Storage
B. Azure Event Hubs
C. Azure Log Analytics

Answer: C

Explanation:

Scenario: Use a single dashboard to review security and audit data for all the PaaS databases.

Dashboards can bring together the operational data that is most important to IT across all your Azure resources, including telemetry from Azure Log Analytics.

Note: Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.
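
As a hedged sketch (the workspace ID is a placeholder, and it assumes the audit records land in the AzureDiagnostics table with the SQLSecurityAuditEvents category, which is how SQL auditing sent to a Log Analytics workspace is commonly surfaced), the data behind such a dashboard could be pulled with the azure-monitor-query Python SDK:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder workspace ID; replace with the Log Analytics workspace GUID.
workspace_id = "00000000-0000-0000-0000-000000000000"

client = LogsQueryClient(DefaultAzureCredential())

# Assumption: audit records arrive in AzureDiagnostics with this category.
query = """
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| summarize events = count() by Resource, bin(TimeGenerated, 1h)
| order by TimeGenerated desc
"""

response = client.query_workspace(workspace_id, query, timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(row)
```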

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/visualize/tutorial-logs-dashboards

Which two objects should you include in the solution?

You are designing a security model for an Azure Synapse Analytics dedicated SQL pool that will support multiple companies.

You need to ensure that users from each company can view only the data of their respective company.

Which two objects should you include in the solution? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. a column encryption key
B. asymmetric keys
C. a function
D. a custom role-based access control (RBAC) role
E. a security policy

Answer: C,E

Explanation:

Row-level security (RLS) restricts, at query time, which rows each user can read. In a dedicated SQL pool you implement it with two objects: an inline table-valued function that acts as the filter predicate (returning rows only for the caller's company) and a security policy that binds that predicate function to the tables being protected. Azure RBAC roles govern management operations on the workspace and its pools; they do not filter the rows a user can see.

Reference:

https://docs.microsoft.com/en-us/sql/relational-databases/security/row-level-security?view=sql-server-ver15