Which Databricks services should you log?

You have an Azure Databricks resource.

You need to log actions that relate to changes in compute for the Databricks resource.

Which Databricks services should you log?
A . clusters
B . jobs
C . DBFS
D . SSH
E . workspace

Answer: A

Explanation:

The clusters service writes diagnostic log events for cluster lifecycle actions such as creation, editing, resizing, and termination, i.e., changes in compute for the Databricks resource.

Cloud Provider Infrastructure Logs.

Databricks logging allows security and admin teams to demonstrate conformance to data governance standards within or from a Databricks workspace.

Customers, especially in regulated industries, also need records of activities such as:

- User access control to cloud data storage
- Cloud Identity and Access Management roles
- User access to cloud network and compute

Azure Databricks offers three distinct workloads on several VM instances tailored for your data analytics workflow: the Jobs Compute and Jobs Light Compute workloads make it easy for data engineers to build and execute jobs, and the All-Purpose Compute workload makes it easy for data scientists to explore, visualize, manipulate, and share data and insights interactively.
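As a rough sketch of why the `clusters` category is the one to watch: once diagnostic logs are flowing, compute-related changes can be isolated by filtering records on that category. The record shape below is a simplified assumption for illustration, not the exact schema emitted by Azure Monitor.

```python
# Sketch: isolating compute-change events from Azure Databricks
# diagnostic logs. Record fields here are a simplified assumption,
# not the exact Azure Monitor schema.

def compute_change_events(records):
    """Return events from the 'clusters' log category, which covers
    cluster (compute) lifecycle actions such as create, resize, delete."""
    return [r for r in records if r.get("category") == "clusters"]

# Hypothetical sample records for illustration only
logs = [
    {"category": "clusters", "operationName": "resizeCluster"},
    {"category": "workspace", "operationName": "workspaceConfEdit"},
    {"category": "clusters", "operationName": "createCluster"},
]

for event in compute_change_events(logs):
    print(event["operationName"])
```

Filtering on the log category rather than on individual operation names keeps the check stable even as new cluster operations are added to the service.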

Reference: https://databricks.com/blog/2020/03/25/trust-but-verify-with-databricks.html

