Google Associate Cloud Engineer Exam Questions in PDF

Free Google Associate Cloud Engineer Dumps Questions (page: 6)

You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud Storage bucket. You want to follow Google-recommended practices.
What should you do?

  1. Create a service account with an access scope. Use the access scope `https://www.googleapis.com/auth/devstorage.write_only`.
  2. Create a service account with an access scope. Use the access scope `https://www.googleapis.com/auth/cloud-platform`.
  3. Create a service account and add it to the IAM role `storage.objectCreator` for that bucket.
  4. Create a service account and add it to the IAM role `storage.objectAdmin` for that bucket.

Answer(s): C

Explanation:

https://cloud.google.com/iam/docs/understanding-service-accounts#using_service_accounts_with_compute_engine
https://cloud.google.com/storage/docs/access-control/iam-roles
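The recommended setup in option C can be sketched with the gcloud CLI. This is a sketch only: the project, bucket, service-account, and instance names below are placeholders, not names from the question.

```shell
# Sketch only: all names (bucket-writer, example-data-bucket, example-project,
# writer-vm) are placeholders.

# Create a dedicated service account for the instances.
gcloud iam service-accounts create bucket-writer \
    --display-name="Writes objects to the data bucket"

# Grant Storage Object Creator on the specific bucket only, not project-wide.
gcloud storage buckets add-iam-policy-binding gs://example-data-bucket \
    --member="serviceAccount:bucket-writer@example-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectCreator"

# Attach the service account when creating an instance; with IAM controlling
# access, the broad cloud-platform scope is the recommended scope setting.
gcloud compute instances create writer-vm \
    --service-account="bucket-writer@example-project.iam.gserviceaccount.com" \
    --scopes="https://www.googleapis.com/auth/cloud-platform"
```

The point of the question is that bucket-level IAM roles, not legacy access scopes, are the Google-recommended way to control what instances can do.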



You have sensitive data stored in three Cloud Storage buckets and have enabled data access logging. You want to verify activities for a particular user for these buckets, using the fewest possible steps. You need to verify the addition of metadata labels and which files have been viewed from those buckets.
What should you do?

  1. Using the GCP Console, filter the Activity log to view the information.
  2. Using the GCP Console, filter the Stackdriver log to view the information.
  3. View the bucket in the Storage section of the GCP Console.
  4. Create a trace in Stackdriver to view the information.

Answer(s): A

Explanation:

https://cloud.google.com/storage/docs/audit-logs https://cloud.google.com/compute/docs/logging/audit-logging#audited_operations
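The Console's Activity page is the fewest-steps answer; the equivalent query can also be sketched with the Cloud Logging CLI. This is a sketch only: the project, user email, and bucket names are placeholders.

```shell
# Sketch only: example-project, user@example.com, and the bucket names
# are placeholders.
# Read Data Access audit log entries (object reads and metadata updates)
# for one user across the three buckets.
gcloud logging read '
  logName="projects/example-project/logs/cloudaudit.googleapis.com%2Fdata_access"
  AND protoPayload.authenticationInfo.principalEmail="user@example.com"
  AND resource.type="gcs_bucket"
  AND resource.labels.bucket_name=("bucket-a" OR "bucket-b" OR "bucket-c")
' --limit=50
```

Data access logging must already be enabled (as the question states); otherwise these entries are not recorded.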



You are the project owner of a GCP project and want to delegate control to colleagues to manage buckets and files in Cloud Storage. You want to follow Google-recommended practices.
Which IAM roles should you grant your colleagues?

  1. Project Editor
  2. Storage Admin
  3. Storage Object Admin
  4. Storage Object Creator

Answer(s): B

Explanation:

Storage Admin (`roles/storage.admin`) grants full control of buckets and objects. When applied to an individual bucket, control applies only to the specified bucket and the objects within it.

The role includes the following permissions:

  - firebase.projects.get
  - resourcemanager.projects.get
  - resourcemanager.projects.list
  - storage.buckets.*
  - storage.objects.*

Ref: https://cloud.google.com/storage/docs/access-control/iam-roles
Ref: https://cloud.google.com/iam/docs/understanding-roles#storage-roles
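Granting the role at the bucket level keeps the delegation scoped, which can be sketched as follows. This is a sketch only: the bucket name and user email are placeholders.

```shell
# Sketch only: example-team-bucket and colleague@example.com are placeholders.
# Grant Storage Admin on a single bucket, so full control of buckets and
# objects applies only within that bucket.
gcloud storage buckets add-iam-policy-binding gs://example-team-bucket \
    --member="user:colleague@example.com" \
    --role="roles/storage.admin"
```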



You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data.
You want access to the content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You want to use the most secure method that requires the fewest steps.
What should you do?

  1. Create a signed URL with a four-hour expiration and share the URL with the company.
  2. Set object access to `public` and use object lifecycle management to remove the object after four hours.
  3. Configure the storage bucket as a static website and furnish the object's URL to the company.
    Delete the object from the storage bucket after four hours.
  4. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.

Answer(s): A

Explanation:

Signed URLs are used to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account. https://cloud.google.com/storage/docs/access-control/signed-urls
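Generating such a URL can be sketched in one command. This is a sketch only: the bucket, object, and key-file names are placeholders.

```shell
# Sketch only: example-bucket, sensitive-report.csv, and
# service-account-key.json are placeholders.
# Generate a V4 signed URL that expires four hours after creation;
# after that, the URL no longer grants access to the object.
gcloud storage sign-url gs://example-bucket/sensitive-report.csv \
    --private-key-file=service-account-key.json \
    --duration=4h
```

The resulting URL can be shared with the external company directly; no Google account is needed on their side.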



You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution.
What should you do?

  1. Deploy the monitoring pod in a StatefulSet object.
  2. Deploy the monitoring pod in a DaemonSet object.
  3. Reference the monitoring pod in a Deployment object.
  4. Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.

Answer(s): B

Explanation:

https://cloud.google.com/kubernetes-engine/docs/concepts/daemonset https://cloud.google.com/kubernetes-engine/docs/concepts/daemonset#usage_patterns

In GKE, DaemonSets manage groups of replicated Pods and adhere to a one-Pod-per-node model, either across the entire cluster or a subset of nodes. As you add nodes to a node pool, DaemonSets automatically add Pods to the new nodes as needed, which makes a DaemonSet the right fit for a monitoring pod that must run on every node, including nodes added later by the cluster autoscaler.

DaemonSets are useful for deploying ongoing background tasks that run on all or certain nodes and do not require user intervention. Examples include storage daemons like ceph, log collection daemons like fluentd, and node monitoring daemons like collectd. You can also run multiple DaemonSets for a single type of daemon, each using a different configuration for different hardware types and resource needs.
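A minimal DaemonSet for such an agent can be sketched as below. This is a sketch only: the names and the container image are placeholders for whatever third-party monitoring agent is in use.

```shell
# Sketch only: monitoring-agent and the image URL are placeholders.
# A DaemonSet schedules exactly one copy of this pod on every node,
# including nodes the cluster autoscaler adds later.
kubectl apply -f - <<'EOF'
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: monitoring-agent
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: monitoring-agent
  template:
    metadata:
      labels:
        app: monitoring-agent
    spec:
      containers:
      - name: agent
        image: example.com/monitoring-agent:1.0   # placeholder image
        resources:
          requests:
            cpu: 50m
            memory: 64Mi
EOF
```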


