Free Databricks-Certified-Data-Analyst-Associate Exam Braindumps (page: 3)


A data analyst wants to create a dashboard with three main sections: Development, Testing, and Production. They want all three sections on the same dashboard, but they want to clearly designate the sections using text on the dashboard.

Which of the following tools can the data analyst use to designate the Development, Testing, and Production sections using text?

  A. Separate endpoints for each section
  B. Separate queries for each section
  C. Markdown-based text boxes
  D. Direct text written into the dashboard in editing mode
  E. Separate color palettes for each section

Answer(s): C

Explanation:

Markdown-based text boxes are useful as labels on a dashboard. The data analyst can add text to a dashboard by entering the %md magic command in a notebook cell and then selecting the dashboard icon in the cell actions menu. The text can be formatted with Markdown syntax and can include headings, lists, links, images, and more. The resulting text boxes can be resized and moved around on the dashboard using the float layout option.
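
For illustration only (the wording below is a hypothetical example, not from the exam), a notebook cell used to label the Development section might contain the following before the dashboard icon is selected in the cell actions menu:

%md
## Development
The queries and visualizations in this section are still under development.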


Reference:

Dashboards in notebooks, How to add text to a dashboard in Databricks



A data analyst needs to use the Databricks Lakehouse Platform to quickly create SQL queries and data visualizations. It is a requirement that the compute resources in the platform can be made serverless, and it is expected that data visualizations can be placed within a dashboard.

Which of the following Databricks Lakehouse Platform services/capabilities meets all of these requirements?

  A. Delta Lake
  B. Databricks Notebooks
  C. Tableau
  D. Databricks Machine Learning
  E. Databricks SQL

Answer(s): E

Explanation:

Databricks SQL is a serverless data warehouse on the Lakehouse that lets you run all of your SQL and BI applications at scale with your tools of choice, all at a fraction of the cost of traditional cloud data warehouses [1]. Databricks SQL allows you to create SQL queries and data visualizations using the SQL Analytics UI or the Databricks SQL CLI [2]. You can also place your data visualizations within a dashboard and share it with other users in your organization [3]. Databricks SQL is powered by Delta Lake, which provides reliability, performance, and governance for your data lake [4].
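
For illustration only (the catalog, schema, and table names below are assumptions, not part of the exam material), a query like this could be written in the Databricks SQL editor on a serverless SQL warehouse, turned into a visualization, and then added to a dashboard:

-- Hypothetical daily-revenue query; its result could back a bar-chart
-- visualization pinned to a Databricks SQL dashboard.
SELECT
  date_trunc('DAY', order_ts) AS order_day,
  sum(order_amount)           AS total_revenue
FROM main.sales.orders
GROUP BY date_trunc('DAY', order_ts)
ORDER BY order_day;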


Reference:

[1] Databricks SQL

[2] Query data using SQL Analytics

[3] Visualizations in Databricks notebooks

[4] Delta Lake



A data analyst is attempting to drop a table my_table. The analyst wants to delete all table metadata and data.

They run the following command:

DROP TABLE IF EXISTS my_table;

While the object no longer appears when they run SHOW TABLES, the data files still exist.

Which of the following describes why the data files still exist and the metadata files were deleted?

  A. The table's data was larger than 10 GB
  B. The table did not have a location
  C. The table was external
  D. The table's data was smaller than 10 GB
  E. The table was managed

Answer(s): C

Explanation:

An external table is a table that is defined in the metastore, but whose data is stored in a location the user specifies and manages, such as S3, ADLS, or GCS.
When an external table is dropped, only its metadata is deleted from the metastore; the data files are not affected. This differs from a managed table, whose data is stored in Databricks-managed storage and whose data files are deleted when the table is dropped. To remove the data files of an external table, the analyst must delete them directly from the underlying storage system.
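
A minimal sketch of the difference (the table names and the storage path below are hypothetical, not from the exam):

-- Managed table: Databricks controls where the data lives, so dropping it
-- removes both the metastore entry and the underlying data files.
CREATE TABLE managed_example (id INT, name STRING);
DROP TABLE IF EXISTS managed_example;

-- External table: the LOCATION points at user-managed storage, so dropping it
-- removes only the metastore entry; the files under the path remain.
CREATE TABLE external_example (id INT, name STRING)
LOCATION 's3://example-bucket/external_example/';
DROP TABLE IF EXISTS external_example;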


Reference:

DROP TABLE, Drop Delta table features, Best practices for dropping a managed Delta Lake table



After running DESCRIBE EXTENDED accounts.customers;, the following was returned:

(Output not reproduced here; as described in the explanation below, it shows that the table is EXTERNAL and its Location is dbfs:/stakeholders/customers.)

Now, a data analyst runs the following command:

DROP TABLE accounts.customers;

Which of the following describes the result of running this command?

  A. Running SELECT * FROM delta.`dbfs:/stakeholders/customers` results in an error.
  B. Running SELECT * FROM accounts.customers will return all rows in the table.
  C. All files with the .customers extension are deleted.
  D. The accounts.customers table is removed from the metastore, and the underlying data files are deleted.
  E. The accounts.customers table is removed from the metastore, but the underlying data files are untouched.

Answer(s): E

Explanation:

The accounts.customers table is an EXTERNAL table, meaning its data is stored outside the default warehouse directory and is not managed by Databricks. When the DROP command is run on such a table, only the metadata is removed from the metastore; the actual data files are not deleted from the file system. The data can still be accessed through the location path (dbfs:/stakeholders/customers), or another table can be created that points to the same location. However, querying the table by its name (accounts.customers) now returns an error because the table no longer exists in the metastore.
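
A hedged sketch of what the analyst can still do after the drop (the path comes from the question; the restored table name is a made-up example):

-- The metastore entry is gone, but the Delta files remain,
-- so a path-based query still succeeds.
SELECT * FROM delta.`dbfs:/stakeholders/customers`;

-- A new table can be registered over the same location
-- (accounts.customers_restored is a hypothetical name).
CREATE TABLE accounts.customers_restored
LOCATION 'dbfs:/stakeholders/customers';

-- Querying the dropped name fails because it no longer exists in the metastore.
-- SELECT * FROM accounts.customers;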


Reference:

DROP TABLE | Databricks on AWS, Best practices for dropping a managed Delta Lake table - Databricks





