Google Professional Cloud Database Engineer Exam Questions in PDF

Free Google PROFESSIONAL CLOUD DATABASE ENGINEER Dumps Questions (page: 5)

Your team uses thousands of connected IoT devices to collect device maintenance data for your oil and gas customers in real time. You want to design inspection routines, device repair, and replacement schedules based on insights gathered from the data produced by these devices. You need a managed solution that is highly scalable, supports a multi-cloud strategy, and offers low latency for these IoT devices.
What should you do?

  1. Use Firestore with Looker.
  2. Use Cloud Spanner with Data Studio.
  3. Use MongoDB Atlas with Charts.
  4. Use Bigtable with Looker.

Answer(s): C

Explanation:

At first glance this scenario has Bigtable written all over it: large amounts of data from many devices, analyzed in real time. One could even argue it qualifies as a multi-cloud solution, given its HBase compatibility. But Bigtable does not support SQL queries, so on its own it is not compatible with Looker. Firestore with Looker has the same problem. Cloud Spanner with Data Studio is at least a compatible pairing, but it does not fit this use case, not least because it is Google-native rather than multi-cloud. By contrast, MongoDB Atlas is a managed solution (just not managed by Google) that is compatible with the proposed reporting tool (MongoDB's own Charts), is specifically designed for this type of IoT workload, and can run on any cloud.



Your application follows a microservices architecture and uses a single large Cloud SQL instance, which is starting to have performance issues as your application grows. In the Cloud Monitoring dashboard, the CPU utilization looks normal. You want to follow Google-recommended practices to resolve and prevent these performance issues while avoiding any major refactoring.
What should you do?

  1. Use Cloud Spanner instead of Cloud SQL.
  2. Increase the number of CPUs for your instance.
  3. Increase the storage size for the instance.
  4. Use many smaller Cloud SQL instances.

Answer(s): D

Explanation:

https://cloud.google.com/sql/docs/mysql/best-practices#data-arch
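The linked best practice is to split data across services rather than funnel every microservice through one large instance. A minimal sketch with gcloud, where the instance names, machine tier, and region are illustrative assumptions:

```shell
# Create one smaller Cloud SQL for MySQL instance per microservice
# (service names, tier, and region below are hypothetical).
for svc in orders inventory billing; do
  gcloud sql instances create "${svc}-db" \
    --database-version=MYSQL_8_0 \
    --tier=db-custom-2-8192 \
    --region=us-central1
done
```

Each service then connects only to its own instance, which removes contention on the shared instance without requiring a major refactor of the application code.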



You need to perform a one-time migration of data from a running Cloud SQL for MySQL instance in the us-central1 region to a new Cloud SQL for MySQL instance in the us-east1 region. You want to follow Google-recommended practices to minimize performance impact on the currently running instance.
What should you do?

  1. Create and run a Dataflow job that uses JdbcIO to copy data from one Cloud SQL instance to another.
  2. Create two Datastream connection profiles, and use them to create a stream from one Cloud SQL instance to another.
  3. Create a SQL dump file in Cloud Storage using a temporary instance, and then use that file to import into a new instance.
  4. Create a CSV file by running the SQL statement SELECT...INTO OUTFILE, copy the file to a Cloud Storage bucket, and import it into a new instance.

Answer(s): C

Explanation:

https://cloud.google.com/sql/docs/mysql/import-export#serverless
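The linked page describes serverless exports: the `--offload` flag creates the dump from a temporary instance so the running primary is barely affected. A sketch of the export/import pair, assuming illustrative instance, bucket, and database names:

```shell
# Serverless export: --offload spins up a temporary instance to produce
# the dump, minimizing load on the running primary in us-central1.
gcloud sql export sql source-instance gs://my-bucket/dump.sql.gz \
  --database=mydb --offload

# Import the dump into the new instance in us-east1.
gcloud sql import sql target-instance gs://my-bucket/dump.sql.gz \
  --database=mydb
```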



You are running a mission-critical application on a Cloud SQL for PostgreSQL database with a multi-zonal setup. The primary and read replica instances are in the same region but in different zones. You need to ensure that you split the application load between both instances.
What should you do?

  1. Use Cloud Load Balancing for load balancing between the Cloud SQL primary and read replica instances.
  2. Use PgBouncer to set up database connection pooling between the Cloud SQL primary and read replica instances.
  3. Use HTTP(S) Load Balancing for database connection pooling between the Cloud SQL primary and read replica instances.
  4. Use the Cloud SQL Auth proxy for database connection pooling between the Cloud SQL primary and read replica instances.

Answer(s): B

Explanation:

https://severalnines.com/blog/how-achieve-postgresql-high-availability-pgbouncer/

https://cloud.google.com/blog/products/databases/using-haproxy-to-scale-read-only-workloads-on-cloud-sql-for-postgresql
This answer is correct because PgBouncer is a lightweight connection pooler for PostgreSQL that can help you distribute read requests between the Cloud SQL primary and read replica instances. PgBouncer also improves performance and scalability by reducing the overhead of creating new connections and reusing existing ones. You can install PgBouncer on a Compute Engine instance and configure it to connect to the Cloud SQL instances using private IP addresses or the Cloud SQL Auth proxy.
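A minimal pgbouncer.ini sketch of this setup, assuming hypothetical private IP addresses and database names. The application connects to PgBouncer on port 6432 and picks `app_rw` for writes or `app_ro` for reads:

```ini
[databases]
; Cloud SQL primary (writes) - illustrative private IP
app_rw = host=10.0.0.2 port=5432 dbname=app
; Cloud SQL read replica (reads) - illustrative private IP
app_ro = host=10.0.0.3 port=5432 dbname=app

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
; transaction pooling releases server connections between transactions
pool_mode = transaction
max_client_conn = 1000
default_pool_size = 20
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
```

Transaction pooling keeps the number of actual PostgreSQL connections low even with many application clients, which is the main scalability benefit described above.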



Your organization deployed a new version of a critical application that uses Cloud SQL for MySQL with high availability (HA) and binary logging enabled to store transactional information. The latest release of the application had an error that caused massive data corruption in your Cloud SQL for MySQL database. You need to minimize data loss.
What should you do?

  1. Open the Google Cloud Console, navigate to SQL > Backups, and select the last version of the automated backup before the corruption.
  2. Reload the Cloud SQL for MySQL database using the LOAD DATA command to load data from CSV files that were used to initialize the instance.
  3. Perform a point-in-time recovery of your Cloud SQL for MySQL database, selecting a date and time before the data was corrupted.
  4. Fail over to the Cloud SQL for MySQL HA instance. Use that instance to recover the transactions that occurred before the corruption.

Answer(s): C

Explanation:

With binary logging enabled, you can identify the point in time at which the data was still good and recover to that point. https://cloud.google.com/sql/docs/mysql/backup-recovery/pitr#perform_the_point-in-time_recovery_using_binary_log_positions
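Point-in-time recovery on Cloud SQL is performed by cloning the instance to a timestamp just before the corruption. A sketch with gcloud, where the instance names and timestamp are illustrative:

```shell
# Clone the instance to its state just before the corruption occurred
# (source/target names and the timestamp are hypothetical).
gcloud sql instances clone prod-mysql prod-mysql-recovered \
  --point-in-time '2023-01-15T08:30:00.000Z'
```

The original instance is left untouched, so you can validate the recovered clone before switching the application over to it.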



Share your comments for Google PROFESSIONAL CLOUD DATABASE ENGINEER exam with other users:

N
Nhlanhla
12/13/2023 5:26:00 AM

just passed the exam on my first try using these dumps.

R
Rizwan
1/6/2024 2:18:00 AM

very helpful

Y
Yady
5/24/2023 10:40:00 PM

these questions look good.

K
Kettie
10/12/2023 1:18:00 AM

this is very helpful content

S
SB
7/21/2023 3:18:00 AM

please provide the dumps

D
David
8/2/2023 8:20:00 AM

it is amazing

U
User
8/3/2023 3:32:00 AM

question 178 about "a banking system that predicts whether a loan will be repaid is an example of the": the answer is classification, not regression. you should fix it.

Q
quen
7/26/2023 10:39:00 AM

please upload apache spark dumps

E
Erineo
11/2/2023 5:34:00 PM

q14 is b&c. to reduce alerts, you switch off mail for every single alert and switch on the daily digest to get a mail once per day. you might even skip the empty digest mail, but i see that as part of the daily digest adjustment.

P
Paul
10/21/2023 8:25:00 AM

i think it is good question

U
Unknown
8/15/2023 5:09:00 AM

good for students who wish to give certification.

C
Ch
11/20/2023 10:56:00 PM

is there a google drive link to the images? the links in questions are not working.

J
Joey
5/16/2023 5:25:00 AM

very promising, looks great, so much wow!

A
alaska
10/24/2023 5:48:00 AM

i scored 87% on the az-204 exam. thanks! i always trust

N
nnn
7/9/2023 11:09:00 PM

good need more

U
User-sfdc
12/29/2023 7:21:00 AM

sample questions seems good

T
Tamer dam
8/4/2023 10:21:00 AM

huawei is ok

Y
YK
12/11/2023 1:10:00 AM

good one nice

D
de
8/28/2023 2:38:00 AM

please continue

D
DMZ
6/25/2023 11:56:00 PM

this exam dump just did the job. i do not want to ruffle your feathers, but your exam dumps and mock test engine are amazing.

J
Jose
8/30/2023 6:14:00 AM

nice questions

T
Tar01
7/24/2023 7:07:00 PM

the explanation are really helpful

D
DaveG
12/15/2023 4:50:00 PM

just passed my exam yesterday on my first attempt. these dumps were extremely helpful in passing first time. the questions were very, very similar to these questions!

A
A.K.
6/30/2023 6:34:00 AM

cosmos db is paas not saas

S
S Roychowdhury
6/26/2023 5:27:00 PM

what is the percentage of common questions in the gcp exam compared to these 197 dump questions? are they 100% matching the real gcp exam?

B
Bella
7/22/2023 2:05:00 AM

not able to see questions

S
Scott
9/8/2023 7:19:00 AM

by far one of the best sites for free questions. i have pass 2 exams with the help of this website.

D
donald
8/19/2023 11:05:00 AM

excellent question bank.

A
Ashwini
8/22/2023 5:13:00 AM

it really helped

S
sk
5/13/2023 2:07:00 AM

excellent material

C
Christopher
9/5/2022 10:54:00 PM

the new version of this exam which i downloaded has all the latest questions from the exam. i only saw 3 new questions in the exam which were not in this dump.

S
Sam
9/7/2023 6:51:00 AM

question 8 - can cloudtrail be used for storing jobs? based on aws - aws cloudtrail is used for governance, compliance and investigating api usage across all of our aws accounts. every action that is taken by a user or script is an api call so this is logged to [aws] cloudtrail. something seems incorrect here.

T
Tanvi Rajput
8/14/2023 10:55:00 AM

question 13 tda - c01 answer : quick table calculation -> percentage of total , compute using table down

P
PMSAGAR
9/19/2023 2:48:00 AM

please share the dump
