Google Associate-Data-Practitioner Exam (page: 3)
Google Cloud Associate Data Practitioner
Updated on: 12-Feb-2026

Your company has developed a website that allows users to upload and share video files. These files are most frequently accessed and shared when they are initially uploaded. Over time, the files are accessed and shared less frequently, although some old video files may remain very popular.

You need to design a storage system that is simple and cost-effective.
What should you do?

  1. Create a single-region bucket with Autoclass enabled.
  2. Create a single-region bucket. Configure a Cloud Scheduler job that runs every 24 hours and changes the storage class based on upload date.
  3. Create a single-region bucket with custom Object Lifecycle Management policies based on upload date.
  4. Create a single-region bucket with Archive as the default storage class.

Answer(s): C

Explanation:

Creating a single-region bucket with custom Object Lifecycle Management policies based on upload date is the most appropriate solution. This approach allows you to automatically transition objects to less expensive storage classes as their access frequency decreases over time. For example, frequently accessed files can remain in the Standard storage class initially, then transition to Nearline, Coldline, or Archive storage as their popularity wanes. This strategy ensures a cost-effective and efficient storage system while maintaining simplicity by automating the lifecycle management of video files.
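The lifecycle policy described above can be sketched as a JSON configuration. The 30/90/365-day thresholds and bucket name are illustrative assumptions, not values given in the question; in practice they would be tuned to the observed access pattern.

```python
import json

# Sketch of an Object Lifecycle Management configuration for the bucket.
# Each rule transitions objects to a colder storage class as they age.
# Thresholds (30/90/365 days) are assumptions for illustration only.
lifecycle_config = {
    "rule": [
        {"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
         "condition": {"age": 30}},
        {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
         "condition": {"age": 90}},
        {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
         "condition": {"age": 365}},
    ]
}

# A file like this can be applied with:
#   gsutil lifecycle set lifecycle.json gs://YOUR_BUCKET
print(json.dumps(lifecycle_config, indent=2))
```

Because the rules key off object age, no scheduler or manual intervention is needed once the policy is set on the bucket.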



You recently inherited a task for managing Dataflow streaming pipelines in your organization and noticed that proper access had not been provisioned to you. You need to request a Google-provided IAM role so you can restart the pipelines. You need to follow the principle of least privilege.
What should you do?

  1. Request the Dataflow Developer role.
  2. Request the Dataflow Viewer role.
  3. Request the Dataflow Worker role.
  4. Request the Dataflow Admin role.

Answer(s): A

Explanation:

The Dataflow Developer role provides the necessary permissions to manage Dataflow streaming pipelines, including the ability to restart pipelines. This role adheres to the principle of least privilege, as it grants only the permissions required to manage and operate Dataflow jobs without unnecessary administrative access. Other roles, such as Dataflow Admin, would grant broader permissions, which are not needed in this scenario.



You need to create a new data pipeline. You want a serverless solution that meets the following requirements:

· Data is streamed from Pub/Sub and is processed in real-time.

· Data is transformed before being stored.

· Data is stored in a location that will allow it to be analyzed with SQL using Looker.



Which Google Cloud services should you recommend for the pipeline?

  1. Dataproc Serverless and Bigtable
  2. Cloud Composer and Cloud SQL for MySQL
  3. BigQuery and Analytics Hub
  4. Dataflow and BigQuery

Answer(s): D

Explanation:

To build a serverless data pipeline that processes data in real-time from Pub/Sub, transforms it, and stores it for SQL-based analysis using Looker, the best solution is to use Dataflow and BigQuery. Dataflow is a fully managed service for real-time data processing and transformation, while BigQuery is a serverless data warehouse that supports SQL-based querying and integrates seamlessly with Looker for data analysis and visualization. This combination meets the requirements for real-time streaming, transformation, and efficient storage for analytical queries.
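The transformation step in such a pipeline can be sketched as a plain function. In the real pipeline this logic would live inside an Apache Beam `DoFn` between a Pub/Sub read and a BigQuery write; it is shown here standalone so the shape of the transform is visible. The event field names (`user_id`, `amount`) are illustrative assumptions.

```python
import json
from datetime import datetime, timezone

def transform_event(message_bytes: bytes) -> dict:
    """Turn a raw Pub/Sub message payload into a BigQuery row dict.

    Sketch only: in Dataflow this would run inside a Beam DoFn, and the
    output dict would match the destination table's schema.
    """
    event = json.loads(message_bytes.decode("utf-8"))
    return {
        "user_id": event["user_id"],
        "amount_usd": round(float(event["amount"]), 2),
        # Enrichment added during the transform step.
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }
```

Once rows land in BigQuery, Looker connects directly to the table and issues standard SQL against it.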



Your team wants to create a monthly report to analyze inventory data that is updated daily. You need to aggregate the inventory counts by using only the most recent month of data, and save the results to be used in a Looker Studio dashboard.
What should you do?

  1. Create a materialized view in BigQuery that uses the SUM( ) function and the DATE_SUB( ) function.
  2. Create a saved query in the BigQuery console that uses the SUM( ) function and the DATE_SUB( ) function. Re-run the saved query every month, and save the results to a BigQuery table.
  3. Create a BigQuery table that uses the SUM( ) function and the _PARTITIONDATE filter.
  4. Create a BigQuery table that uses the SUM( ) function and the DATE_DIFF( ) function.

Answer(s): A

Explanation:

Creating a materialized view in BigQuery with the SUM() function and the DATE_SUB() function is the best approach. Materialized views allow you to pre-aggregate and cache query results, making them efficient for repeated access, such as monthly reporting. By using the DATE_SUB() function, you can filter the inventory data to include only the most recent month. This approach ensures that the aggregation is up-to-date with minimal latency and provides efficient integration with Looker Studio for dashboarding.
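A sketch of the SQL involved, with assumed dataset, table, and column names. Note one practical wrinkle: BigQuery materialized view definitions cannot contain non-deterministic functions such as `CURRENT_DATE()`, so a common pattern is to pre-aggregate in the view and apply the `DATE_SUB()` filter in the query that reads from it.

```python
# Assumed names: mydataset.inventory, inventory_date, item_id, quantity.
MATERIALIZED_VIEW = """
CREATE MATERIALIZED VIEW mydataset.daily_inventory AS
SELECT inventory_date, item_id, SUM(quantity) AS total_quantity
FROM mydataset.inventory
GROUP BY inventory_date, item_id
"""

# The Looker Studio data source (or a view on top of it) filters to the
# most recent month at read time using DATE_SUB().
DASHBOARD_QUERY = """
SELECT item_id, SUM(total_quantity) AS monthly_quantity
FROM mydataset.daily_inventory
WHERE inventory_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 MONTH)
GROUP BY item_id
"""
```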



You have a BigQuery dataset containing sales data. This data is actively queried for the first 6 months. After that, the data is not queried but needs to be retained for 3 years for compliance reasons. You need to implement a data management strategy that meets access and compliance requirements, while keeping cost and administrative overhead to a minimum.
What should you do?

  1. Use BigQuery long-term storage for the entire dataset. Set up a Cloud Run function to delete the data from BigQuery after 3 years.
  2. Partition a BigQuery table by month. After 6 months, export the data to Coldline storage. Implement a lifecycle policy to delete the data from Cloud Storage after 3 years.
  3. Set up a scheduled query to export the data to Cloud Storage after 6 months. Write a stored procedure to delete the data from BigQuery after 3 years.
  4. Store all data in a single BigQuery table without partitioning or lifecycle policies.

Answer(s): B

Explanation:

Partitioning the BigQuery table by month allows efficient querying of recent data for the first 6 months, reducing query costs. After 6 months, exporting the data to Coldline storage minimizes storage costs for data that is rarely accessed but needs to be retained for compliance. Implementing a lifecycle policy in Cloud Storage automates the deletion of the data after 3 years, ensuring compliance while reducing administrative overhead. This approach balances cost efficiency and compliance requirements effectively.
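The retention piece of this strategy can be sketched as a Cloud Storage lifecycle rule. Since the exported objects are written directly to Coldline, the only rule needed is deletion once the retention period has passed; the 1095-day figure (3 × 365) is an assumption, and the exact compliance clock should be confirmed before use.

```python
import json

# Sketch of the lifecycle policy on the export bucket: delete objects
# once they are 3 years old (1095 days is an assumed approximation).
retention_policy = {
    "rule": [
        {"action": {"type": "Delete"},
         "condition": {"age": 1095}},
    ]
}

# Applied with: gsutil lifecycle set retention.json gs://EXPORT_BUCKET
print(json.dumps(retention_policy, indent=2))
```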


