Google Professional Security Operations Engineer Exam (page: 2)
Updated on: 12-Feb-2026

Your team is responsible for cybersecurity for a large multinational corporation. You have been tasked with identifying unknown command and control nodes (C2s) that are potentially active in your organization's environment. You need to generate a list of potential matches within the next 24 hours.
What should you do?

  1. Write a rule in Google Security Operations (SecOps) that scans historic outbound network connections against ingested threat intelligence. Run the rule as a retrohunt against the full tenant.
  2. Load network records into BigQuery to identify endpoints that are communicating with domains outside three standard deviations of normal.
  3. Review Security Health Analytics (SHA) findings in Security Command Center (SCC).
  4. Write a YARA-L rule in Google Security Operations (SecOps) that compares network traffic of endpoints to low prevalence domains against recent WHOIS registrations.

Answer(s): A

Explanation:

The fastest and most effective way to identify unknown C2 nodes within 24 hours is to write a detection rule in Google SecOps that compares historic outbound connections against ingested threat intelligence, then run it as a retrohunt across the full tenant. Retrohunt enables rapid scanning of past telemetry at scale to surface potential matches without waiting for new events to occur.
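As a minimal YARA-L 2.0 sketch of such a rule: the event type and network fields are standard UDM paths, but the entity-graph join used to reference ingested threat-intelligence domains is illustrative and would need to match how IOCs are actually ingested in your tenant.

```
rule outbound_to_known_c2 {
  meta:
    author = "secops-team"
    description = "Outbound connection to a domain present in ingested threat intel (sketch)"

  events:
    // Outbound network connections from the tenant's telemetry
    $conn.metadata.event_type = "NETWORK_CONNECTION"
    $conn.network.direction = "OUTBOUND"
    $conn.target.domain.name = $domain

    // Illustrative entity-graph join against ingested IOC domains
    $ioc.graph.metadata.entity_type = "DOMAIN_NAME"
    $ioc.graph.entity.domain.name = $domain

  match:
    $domain over 1h

  condition:
    $conn and $ioc
}
```

Running this rule as a retrohunt evaluates it against already-ingested historical events, which is what makes the 24-hour deadline achievable.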



You received an alert from Container Threat Detection that an added binary has been executed in a business-critical workload. You need to investigate and respond to this incident.
What should you do? (Choose two.)

  1. Notify the workload owner. Follow the response playbook, and ask the threat hunting team to identify the root cause of the incident.
  2. Review the finding, investigate the pod and related resources, and research the related attack and response methods.
  3. Review the finding, quarantine the cluster containing the running pod, and delete the running pod to prevent further compromise.
  4. Silence the alert in the Security Command Center (SCC) console, as the alert is a low severity finding.
  5. Keep the cluster and pod running, and investigate the behavior to determine whether the activity is malicious.

Answer(s): A,B

Explanation:

The correct response involves both notifying the workload owner and following the response playbook to ensure coordinated incident handling, and reviewing the finding while investigating the pod and related resources to understand the attack and determine the appropriate remediation. This approach ensures proper communication, structured incident response, and thorough technical investigation without prematurely deleting or silencing critical evidence.



You are reviewing the security analyst team's playbook action process. Currently, security analysts navigate to the Playbooks tab in Google Security Operations (SecOps) for each alert and manually run steps assigned to a user. You need to present all actions from alerts awaiting user input in one location for the analyst to execute.
What should you do?

  1. Enable approval links in the manual action and display them as clickable links to the user in an HTML widget in the Default Case View tab.
  2. Add a general insight in your playbook to display manual action details to the user.
  3. Use the Pending Actions widget in the Default Case View in settings.
  4. Create an Alert View with the playbook that incorporates the Pending Actions widget.

Answer(s): C

Explanation:

The correct approach is to use the Pending Actions widget in the Default Case View. This widget consolidates all manual playbook actions that require analyst input, allowing them to be executed from a single location. This streamlines the workflow, reduces manual navigation, and ensures analysts don't miss pending steps across multiple alerts.



You are managing a Google Security Operations (SecOps) implementation for a regional customer. Your customer informs you that logs are appearing in the platform after a consistent six-hour delay. After some research, you determine that there is a log time zone issue. You want to fix this problem.
What should you do?

  1. Modify the default parser and include a default time zone.
  2. Create a parser extension to correct the time zone.
  3. Create a custom parser to correct the time zone.
  4. Modify the UI settings to correct the time zone.

Answer(s): B

Explanation:

The correct fix is to create a parser extension to correct the time zone. Parser extensions let you adjust specific fields, such as timestamps, without modifying the default parser. This resolves ingestion delays caused by time zone mismatches while maintaining the integrity and upgrade compatibility of the default parser.
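Parser extensions use the same Logstash-like snippet syntax as the default parsers. A minimal sketch, assuming the raw log carries its timestamp in a field named "ts" with the format shown (both the field name and the example timezone are illustrative):

```
filter {
  date {
    match => ["ts", "yyyy-MM-dd HH:mm:ss"]
    # Source logs omit a UTC offset, so declare their local timezone here
    timezone => "America/Chicago"
  }
}
```

Because this runs as an extension, the vendor-managed default parser continues to receive updates unchanged.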



Your organization uses Google Security Operations (SecOps). You need to identify the most commonly occurring processes and applications across your organization's large number of servers so you can implement baselines and exclusion lists on a regular basis. You want to use the most efficient approach.
What should you do?

  1. Use the UDM lookup feature to identify relevant process-related UDM fields and values.
  2. Run a UDM search, and review aggregations for relevant process-related UDM fields.
  3. Review the Google SecOps SIEM Rules & Detections, and identify the most common processes appearing in alerts that are marked as false positives.
  4. Generate a Google SecOps SIEM dashboard based on relevant UDM fields, such as processes, that provides the counts for process names and files.

Answer(s): B

Explanation:

The most efficient method is to run a UDM search and use aggregations on process-related UDM fields. This allows you to quickly identify the most common processes and applications across all servers, providing accurate data to establish baselines and exclusion lists without relying only on alerts or dashboards.
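In practice the UDM search itself can be a single filter over the desired time range; the aggregation then happens in the search results pane. A minimal sketch:

```
metadata.event_type = "PROCESS_LAUNCH"
```

After the search completes, open the aggregations sidebar and group on a process-related field such as target.process.file.full_path to rank process names and file paths by occurrence count across all servers.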



You work for an organization that uses Security Command Center (SCC) with Event Threat Detection (ETD) enabled. You need to enable ETD detections for data exfiltration attempts from designated sensitive Cloud Storage buckets and BigQuery datasets. You want to minimize Cloud Logging costs.
What should you do?

  1. Enable "data read" audit logs only for the designated sensitive Cloud Storage buckets and BigQuery datasets.
  2. Enable "data read" and "data write" audit logs only for the designated sensitive Cloud Storage buckets and BigQuery datasets.
  3. Enable "data read" and "data write" audit logs for all Cloud Storage buckets and BigQuery datasets throughout the organization.
  4. Enable VPC Flow Logs for the VPC networks containing resources that access the sensitive Cloud Storage buckets and BigQuery datasets.

Answer(s): A

Explanation:

To detect data exfiltration attempts from sensitive Cloud Storage buckets and BigQuery datasets using ETD, you only need "data read" audit logs. These logs capture access and read events (which indicate potential exfiltration). Enabling them only for the designated sensitive resources minimizes Cloud Logging costs while still providing the necessary visibility for detections.
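At the project level, Data Access audit logs are controlled through the IAM policy's auditConfigs. A minimal sketch enabling only DATA_READ for Cloud Storage (note that audit configs apply per service rather than per bucket, and BigQuery Data Access logs are enabled by default):

```yaml
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: DATA_READ
```

Scoping the configuration to the project(s) holding the sensitive resources, and omitting DATA_WRITE, keeps Cloud Logging volume to the minimum ETD needs for exfiltration detection.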



Your company uses Security Command Center (SCC) and Google Security Operations (SecOps). Last week, an attacker attempted to establish persistence by generating a key for an unused service account. You need to confirm that you are receiving alerts when keys are created for unused service accounts and that newly created keys are automatically deleted. You want to minimize the amount of manual effort required.
What should you do?

  1. Generate a YARA-L rule in Google SecOps that detects when a service account key is created. Using the built-in IDE, create a custom action in Google SecOps SOAR that deletes the service account key.
  2. Use the Initial Access: Dormant Service Account Key Created finding from SCC, and ingest this finding into Google SecOps. Create a custom action in Google SecOps SOAR that is triggered on this finding. Use the built-in IDE to build code to delete the service account key.
  3. Configure a Cloud Logging sink to write logs to a Pub/Sub topic that filters for the methodName: "google.iam.admin.v1.CreateServiceAccountKey" field. Create a Cloud Run function that subscribes to the Pub/Sub topic and deletes the service account key.
  4. Use the Initial Access: Dormant Service Account Key Created finding from SCC, and write this finding to a Pub/Sub topic. Create a Cloud Run function that subscribes to the Pub/Sub topic and deletes the service account key.

Answer(s): B

Explanation:

The most efficient solution is to use the built-in SCC detection "Initial Access: Dormant Service Account Key Created", ingest the finding into Google SecOps, and automate the response with a custom SOAR action that deletes the key. This leverages existing SCC findings for accurate detection, integrates directly with Google SecOps for centralized alerting, and minimizes manual effort by automating remediation.
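As a sketch of the remediation half, the SOAR custom action's core logic would call the IAM API's keys.delete method on the key named in the finding. The function names below are illustrative, and the googleapiclient call assumes application default credentials are available in the SOAR runtime.

```python
def key_resource_name(project_id: str, sa_email: str, key_id: str) -> str:
    """Build the fully qualified IAM resource name for a service account key."""
    return f"projects/{project_id}/serviceAccounts/{sa_email}/keys/{key_id}"


def delete_service_account_key(key_name: str) -> None:
    """Delete the key via the IAM v1 REST API (projects.serviceAccounts.keys.delete)."""
    # Imported here so key_resource_name stays usable without the
    # google-api-python-client dependency installed.
    from googleapiclient import discovery

    iam = discovery.build("iam", "v1")
    iam.projects().serviceAccounts().keys().delete(name=key_name).execute()
```

The action would parse the key ID and service account email out of the ingested SCC finding, build the resource name, and call the delete function.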



Your company recently adopted Security Command Center (SCC) but is not using Google Security Operations (SecOps). Your organization has thousands of active projects. You need to detect anomalous behavior in your Google Cloud environment by windowing and aggregating data over a given time period, based on specific log events or advanced calculations. You also need to provide an interface for analysts to triage the alerts. How should you build this capability?

  1. Send the logs to Cloud SQL, and run a scheduled query against these events using a Cloud Run scheduled job. Configure an aggregated log filter to stream event-driven logs to a Pub/Sub topic. Configure a trigger to send an email alert when new events are sent to this feed.
  2. Sink the logs to BigQuery, and configure Cloud Run functions to execute a periodic job and generate normalized alerts in a Pub/Sub topic for findings. Use log-based metrics to generate event-driven alerts and send these alerts to the Pub/Sub topic. Write the alerts as findings using the SCC API.
  3. Use log-based metrics to generate event-driven alerts for the detection scenarios. Configure a Cloud Monitoring alert policy to send email alerts to your security operations team.
  4. Create a series of aggregated log sinks for each required finding, and send the normalized findings as JSON files to Cloud Storage. Use the write event to generate an alert.

Answer(s): B

Explanation:

The correct approach is to sink logs to BigQuery, where you can perform windowing and advanced aggregations over time. Then, use Cloud Run functions to periodically query BigQuery and generate normalized alerts published to a Pub/Sub topic. From there, alerts can be written back into SCC as findings via the SCC API, giving analysts a central interface for triage. This architecture supports large-scale environments, advanced calculations, and efficient integration with SCC.
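As an illustration of the normalization step, the periodic Cloud Run job might run a windowed BigQuery query and map each anomalous row onto (a subset of) the SCC v1 Finding shape before publishing to Pub/Sub and writing it back via the SCC API. The dataset, threshold, and row field names below are hypothetical assumptions.

```python
# Hypothetical windowed aggregation; dataset name and threshold are examples.
QUERY = """
SELECT resource_name, principal_email, COUNT(*) AS event_count
FROM `my_project.sec_logs.audit`
WHERE event_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
GROUP BY resource_name, principal_email
HAVING event_count > 1000
"""


def row_to_finding(row: dict, category: str) -> dict:
    """Map one aggregated query row to a subset of the SCC v1 Finding fields."""
    return {
        "state": "ACTIVE",
        "category": category,
        "resourceName": row["resource_name"],
        "sourceProperties": {
            "principal": row["principal_email"],
            "eventCount": row["event_count"],
        },
    }
```

Writing the normalized dicts as findings through the SCC API is what gives analysts a single triage surface without deploying Google SecOps.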





