Databricks Certified Data Analyst Associate Exam (Databricks-Certified-Data-Analyst-Associate)
Updated on: 02-Jan-2026

A data analyst has set up a SQL query to run every four hours on a SQL endpoint, but the SQL endpoint is taking too long to start up with each run.

Which of the following changes can the data analyst make to reduce the start-up time for the endpoint while managing costs?

  A. Reduce the SQL endpoint cluster size
  B. Increase the SQL endpoint cluster size
  C. Turn off the Auto stop feature
  D. Increase the minimum scaling value
  E. Use a Serverless SQL endpoint

Answer(s): E

Explanation:

A Serverless SQL endpoint is a type of SQL endpoint that does not require a dedicated cluster in the customer's account to run queries. Instead, it uses a shared pool of compute managed by Databricks that can scale up and down automatically based on demand. As a result, a Serverless SQL endpoint starts up much faster than a classic SQL endpoint backed by a cluster, and it can also reduce costs because compute is billed only while it is being used. A Serverless SQL endpoint is well suited to ad-hoc queries and exploratory analysis, though it may not offer the same degree of isolation as a dedicated cluster. The data analyst should therefore weigh the trade-offs between start-up speed, cost, and isolation when choosing between a Serverless SQL endpoint and a SQL endpoint that uses a cluster.
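
For illustration, serverless compute is enabled when a SQL warehouse (endpoint) is created or edited, either in the Databricks SQL UI or through the SQL Warehouses REST API. The Python sketch below is a minimal, hedged example of the API route; the warehouse name, sizing values, and environment-variable names are assumptions, and the exact set of supported fields can vary by workspace.

Python

# Minimal sketch: create a serverless SQL warehouse (endpoint) via the REST API.
# DATABRICKS_HOST (full https:// workspace URL) and DATABRICKS_TOKEN are assumed
# to be set in the environment; all names and sizing values are illustrative.
import os
import requests

payload = {
    "name": "analyst-serverless-wh",    # assumed warehouse name
    "cluster_size": "Small",            # keep the size modest to control cost
    "min_num_clusters": 1,
    "max_num_clusters": 1,
    "auto_stop_mins": 10,               # auto stop still applies; serverless restarts in seconds
    "enable_serverless_compute": True,  # avoids the long classic cluster start-up
}

resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.0/sql/warehouses",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("Created warehouse:", resp.json().get("id"))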


Reference:

Databricks SQL endpoints, Serverless SQL endpoints, SQL endpoint clusters



A data engineering team has created a Structured Streaming pipeline that processes data in micro-batches and populates gold-level tables. The micro-batches are triggered every minute.

A data analyst has created a dashboard based on this gold-level data. The project stakeholders want to see the results in the dashboard updated within one minute or less of new data becoming available within the gold-level tables.

Which of the following cautions should the data analyst share prior to setting up the dashboard to complete this task?

  A. The required compute resources could be costly
  B. The gold-level tables are not appropriately clean for business reporting
  C. The streaming data is not an appropriate data source for a dashboard
  D. The streaming cluster is not fault tolerant
  E. The dashboard cannot be refreshed that quickly

Answer(s): A

Explanation:

A Structured Streaming pipeline that processes data in micro-batches and populates gold-level tables every minute requires a high level of compute resources to handle the frequent data ingestion, processing, and writing. This could result in a significant cost for the organization, especially if the data volume and velocity are large. Therefore, the data analyst should share this caution with the project stakeholders before setting up the dashboard and evaluate the trade-offs between the desired refresh rate and the available budget. The other options are not valid cautions because:

B) The gold-level tables are assumed to be appropriately clean for business reporting, as they are the final output of the data engineering pipeline. If the data quality is not satisfactory, the issue should be addressed at the source or silver level, not at the gold level.

C) The streaming data is an appropriate data source for a dashboard, as it can provide near real-time insights and analytics for the business users. Structured Streaming supports various sources and sinks for streaming data, including Delta Lake, which can enable both batch and streaming queries on the same data.

D) The streaming cluster is fault tolerant, as Structured Streaming provides end-to-end exactly-once fault-tolerance guarantees through checkpointing and write-ahead logs. If a query fails, it can be restarted from the last checkpoint and resume processing.

E) The dashboard can be refreshed within one minute or less of new data becoming available in the gold-level tables: the streaming pipeline already writes a new micro-batch every minute, and the dashboard's refresh schedule can be set to a matching interval. However, refreshing that frequently may not be necessary or optimal for the business use case, as it causes near-constant changes in the dashboard and consumes additional compute resources.
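
For context, the kind of pipeline described above (gold-level tables populated by one-minute micro-batches) typically looks like the PySpark sketch below. It is a minimal illustration under assumed table names, columns, and checkpoint path; only the one-minute trigger comes from the scenario.

Python

# Illustrative micro-batch pipeline: read a silver Delta table as a stream,
# aggregate, and write a gold table once per minute. All names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

silver = spark.readStream.table("silver.orders")            # assumed silver source

gold = (
    silver
    .groupBy(F.window("event_time", "1 hour"), "region")    # assumed columns
    .agg(F.sum("amount").alias("total_amount"))
)

query = (
    gold.writeStream
    .trigger(processingTime="1 minute")                     # micro-batch every minute
    .option("checkpointLocation", "/tmp/checkpoints/gold_orders_by_region")
    .outputMode("complete")                                  # rewrite the aggregate each batch
    .toTable("gold.orders_by_region")                        # the table the dashboard reads
)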


Reference:

Streaming on Databricks, Monitoring Structured Streaming queries on Databricks, A look at the new Structured Streaming UI in Apache Spark 3.0, Run your first Structured Streaming workload



Which of the following approaches can be used to ingest data directly from cloud-based object storage?

  A. Create an external table while specifying the DBFS storage path to FROM
  B. Create an external table while specifying the DBFS storage path to PATH
  C. It is not possible to directly ingest data from cloud-based object storage
  D. Create an external table while specifying the object storage path to FROM
  E. Create an external table while specifying the object storage path to LOCATION

Answer(s): E

Explanation:

External tables are tables defined in the Databricks metastore whose data lives in a cloud object storage location. Databricks does not manage the underlying files for an external table; the table simply provides a schema and a name for querying them. To create an external table, specify the object storage path in the LOCATION clause of the CREATE TABLE statement; any table created with an explicit LOCATION is external (unmanaged). For example, to create an external table named ext_table over Parquet files stored in S3, you can use the following statement:

SQL

CREATE TABLE ext_table (
  col1 INT,
  col2 STRING
)
USING PARQUET
LOCATION 's3://bucket/path/';



Reference:

External tables



A data analyst wants to create a dashboard with three main sections: Development, Testing, and Production. They want all three sections on the same dashboard, but they want to clearly designate the sections using text on the dashboard.

Which of the following tools can the data analyst use to designate the Development, Testing, and Production sections using text?

  A. Separate endpoints for each section
  B. Separate queries for each section
  C. Markdown-based text boxes
  D. Direct text written into the dashboard in editing mode
  E. Separate color palettes for each section

Answer(s): C

Explanation:

Markdown-based text boxes are designed for exactly this purpose: labeling and organizing sections of a dashboard. In a Databricks SQL dashboard, the analyst can add a text box widget whose content is written in Markdown, so section labels such as Development, Testing, and Production can be formatted with headings, lists, links, and images. In a notebook-based dashboard, the same effect is achieved by writing the text in a cell with the %md magic command and then selecting the dashboard icon in the cell actions menu. In either case, the text boxes can be resized and moved around on the dashboard.


Reference:

Dashboards in notebooks, How to add text to a dashboard in Databricks



A data analyst needs to use the Databricks Lakehouse Platform to quickly create SQL queries and data visualizations. It is a requirement that the compute resources in the platform can be made serverless, and it is expected that data visualizations can be placed within a dashboard.

Which of the following Databricks Lakehouse Platform services/capabilities meets all of these requirements?

  A. Delta Lake
  B. Databricks Notebooks
  C. Tableau
  D. Databricks Machine Learning
  E. Databricks SQL

Answer(s): E

Explanation:

Databricks SQL is a serverless data warehouse on the Lakehouse that lets you run all of your SQL and BI applications at scale with your tools of choice, all at a fraction of the cost of traditional cloud data warehouses. Databricks SQL allows you to create SQL queries and data visualizations using the SQL Analytics UI or the Databricks SQL CLI. You can also place your data visualizations within a dashboard and share it with other users in your organization. Databricks SQL is powered by Delta Lake, which provides reliability, performance, and governance for your data lake.
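
As a further illustration, queries against a Databricks SQL warehouse can also be run programmatically. The Python sketch below uses the databricks-sql-connector package; the hostname, HTTP path, access token, and table name are placeholders, and in practice the analyst would typically write the query in the SQL editor and add the resulting visualization to a dashboard.

Python

# Illustrative only: query a Databricks SQL warehouse from Python.
# server_hostname, http_path, access_token, and the table name are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "SELECT region, SUM(amount) AS total_amount FROM gold.orders GROUP BY region"
        )
        for row in cursor.fetchall():
            print(row)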


Reference:

Databricks SQL

Query data using SQL Analytics

Visualizations in Databricks notebooks

Delta Lake





