Google Associate-Data-Practitioner Exam (page: 1)
Google Cloud Associate Data Practitioner
Updated on: 12-Feb-2026

Your retail company wants to predict customer churn using historical purchase data stored in BigQuery. The dataset includes customer demographics, purchase history, and a label indicating whether the customer churned or not. You want to build a machine learning model to identify customers at risk of churning. You need to create and train a logistic regression model for predicting customer churn, using the customer_data table with the churned column as the target label.
Which BigQuery ML query should you use?

Answer(s): B

Explanation:

In BigQuery ML, a query that creates a logistic regression model to predict customer churn must:

Exclude the target label column (in this case, churned) from the feature columns, because it is the prediction target, not a feature input.

Expose the target column under the name label, because BigQuery ML expects the label column to be named label unless input_label_cols is set in OPTIONS.

The chosen query satisfies both requirements:

SELECT * EXCEPT(churned), churned AS label excludes churned from the features and renames it to label.

OPTIONS(model_type='logistic_reg') specifies that a logistic regression model is being trained.

This setup trains the model on every remaining column as a feature while using churned as the training label.
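
For reference, a minimal sketch of the full statement, assuming the table lives in a dataset named mydataset (the dataset and model names here are hypothetical):

CREATE OR REPLACE MODEL `mydataset.churn_model`
OPTIONS(model_type='logistic_reg') AS
SELECT
  * EXCEPT(churned),
  churned AS label
FROM `mydataset.customer_data`;

Once trained, the model can be called with ML.PREDICT to score current customers for churn risk.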



Your company has several retail locations and tracks the total number of sales made at each location each day. You want to use SQL to calculate the weekly moving average of sales by location to identify trends for each store.
Which query should you use?

Answer(s): C

Explanation:

To calculate the weekly moving average of sales by location, the query must compute AVG(total_sales) over a window defined by:

PARTITION BY store_id, which performs the calculation separately for each store.

ORDER BY date, which ensures the sales are evaluated chronologically.

ROWS BETWEEN 6 PRECEDING AND CURRENT ROW, which limits the window to a rolling 7 rows (one week when each row represents one day of sales).

Together these clauses produce, for every store and day, the average sales over that day and the six preceding days.
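
A minimal sketch of such a query, assuming a daily sales table named mydataset.sales with store_id, date, and total_sales columns (the table name is hypothetical):

SELECT
  store_id,
  date,
  AVG(total_sales) OVER (
    PARTITION BY store_id
    ORDER BY date
    ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
  ) AS weekly_moving_avg
FROM `mydataset.sales`;

Note that ROWS counts physical rows, so the window only spans a true calendar week when every store has exactly one row per day.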



Your company is building a near real-time streaming pipeline to process JSON telemetry data from small appliances. You need to process messages arriving at a Pub/Sub topic, capitalize letters in the serial number field, and write results to BigQuery. You want to use a managed service and write a minimal amount of code for underlying transformations.
What should you do?

  A. Use a Pub/Sub to BigQuery subscription, write results directly to BigQuery, and schedule a transformation query to run every five minutes.
  B. Use a Pub/Sub to Cloud Storage subscription, write a Cloud Run service that is triggered when objects arrive in the bucket, performs the transformations, and writes the results to BigQuery.
  C. Use the "Pub/Sub to BigQuery" Dataflow template with a UDF, and write the results to BigQuery.
  D. Use a Pub/Sub push subscription, write a Cloud Run service that accepts the messages, performs the transformations, and writes the results to BigQuery.

Answer(s): C

Explanation:

Using the "Pub/Sub to BigQuery" Dataflow template with a UDF (user-defined function) is the optimal choice because it combines near real-time processing, minimal code for transformations, and scalability. The UDF implements the custom transformation, such as capitalizing letters in the serial number field, while the managed Dataflow pipeline handles everything else: reading from Pub/Sub, autoscaling, retries, and the BigQuery write.
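
For this template, the UDF is a JavaScript function that receives each message payload as a JSON string and returns the transformed JSON string. A minimal sketch, assuming the field is named serial_number (the real field name depends on the telemetry schema):

/**
 * Uppercases the serial number field of each telemetry message.
 * @param {string} inJson - one Pub/Sub message payload as a JSON string
 * @return {string} the transformed payload as a JSON string
 */
function process(inJson) {
  var obj = JSON.parse(inJson);
  // Capitalize the letters in the (assumed) serial_number field.
  obj.serial_number = obj.serial_number.toUpperCase();
  return JSON.stringify(obj);
}

The file is uploaded to Cloud Storage and referenced through the template's javascriptTextTransformGcsPath and javascriptTextTransformFunctionName parameters; no other pipeline code is needed.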



You want to process and load a daily sales CSV file stored in Cloud Storage into BigQuery for downstream reporting. You need to quickly build a scalable data pipeline that transforms the data while providing insights into data quality issues.
What should you do?

  A. Create a batch pipeline in Cloud Data Fusion by using a Cloud Storage source and a BigQuery sink.
  B. Load the CSV file as a table in BigQuery, and use scheduled queries to run SQL transformation scripts.
  C. Load the CSV file as a table in BigQuery. Create a batch pipeline in Cloud Data Fusion by using a BigQuery source and sink.
  D. Create a batch pipeline in Dataflow by using the Cloud Storage CSV file to BigQuery batch template.

Answer(s): A

Explanation:

Using Cloud Data Fusion to create a batch pipeline with a Cloud Storage source and a BigQuery sink is the best solution because:

Scalability: Cloud Data Fusion is a scalable, fully managed data integration service.

Data transformation: It provides a visual interface to design pipelines, enabling quick transformation of data.

Data quality insights: Cloud Data Fusion's Wrangler and built-in pipeline metrics surface data quality issues both while the pipeline is being designed and after it runs.



You manage a Cloud Storage bucket that stores temporary files created during data processing. These temporary files are only needed for seven days, after which they are no longer needed. To reduce storage costs and keep your bucket organized, you want to automatically delete these files once they are older than seven days.
What should you do?

  A. Set up a Cloud Scheduler job that invokes a weekly Cloud Run function to delete files older than seven days.
  B. Configure a Cloud Storage lifecycle rule that automatically deletes objects older than seven days.
  C. Develop a batch process using Dataflow that runs weekly and deletes files based on their age.
  D. Create a Cloud Run function that runs daily and deletes files older than seven days.

Answer(s): B

Explanation:

Configuring a Cloud Storage lifecycle rule to automatically delete objects older than seven days is the best solution because:

Built-in feature: Cloud Storage lifecycle rules are specifically designed to manage object lifecycles, such as automatically deleting or transitioning objects based on age.

No additional setup: It requires no external services or custom code, reducing complexity and maintenance.

Cost-effective: It directly achieves the goal of deleting files after seven days without incurring additional compute costs.
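
A minimal sketch of such a rule as a JSON lifecycle configuration, applied with gcloud (the bucket name is hypothetical):

{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 7}
    }
  ]
}

gcloud storage buckets update gs://my-temp-bucket --lifecycle-file=lifecycle.json

The age condition is measured in days since object creation, so each temporary file is deleted automatically once it is more than seven days old.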


