WGU Managing-Cloud-Security Exam (page: 1)
WGU Managing Cloud Security (JY02)
Updated on: 24-Mar-2026

Which phase of the cloud data life cycle involves activities such as data categorization and classification, including data labeling, marking, tagging, and assigning metadata?

  A. Store
  B. Use
  C. Destroy
  D. Create

Answer(s): D

Explanation:

The cloud data life cycle defines distinct stages that data goes through from its origin until its disposal. The Create phase is the very first stage, and this is where data is generated or captured by systems, applications, or users. At this point, data does not yet have context for storage or use, so it must be appropriately categorized and classified. Activities like labeling, marking, tagging, and assigning metadata are critical because they establish the foundation for enforcing controls throughout the rest of the life cycle.

Classification ensures that data is aligned with sensitivity levels, regulatory requirements, and business value. For example, financial records may be labeled "confidential" while general marketing content may be marked "public." These distinctions guide how encryption, access controls, and monitoring will be applied in subsequent phases such as storage, sharing, or use.

According to industry frameworks, starting security at the Create phase ensures that controls "follow the data" across environments. Without proper classification at creation, organizations risk mismanaging sensitive data downstream.
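As a minimal sketch of what labeling and assigning metadata at creation can look like, the hypothetical `create_object` helper below attaches a classification label, tags, and a creation timestamp to each new data object (the names and fields are illustrative, not taken from any particular framework):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataObject:
    # Hypothetical record produced in the Create phase: classification
    # metadata travels with the data from the moment it exists.
    content: str
    classification: str               # e.g. "public", "internal", "confidential"
    tags: list = field(default_factory=list)
    created_at: str = ""

def create_object(content, classification, tags):
    # Label, tag, and timestamp the data at creation time so that
    # later phases (store, use, share) can enforce matching controls.
    return DataObject(
        content=content,
        classification=classification,
        tags=tags,
        created_at=datetime.now(timezone.utc).isoformat(),
    )

record = create_object("Q3 revenue figures", "confidential", ["finance"])
print(record.classification)   # confidential
```

Downstream controls (encryption, access rules, monitoring) can then key off `record.classification` instead of re-inspecting the content.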



Which phase of the cloud data life cycle involves the process of crypto-shredding?

  A. Destroy
  B. Create
  C. Archive
  D. Store

Answer(s): A

Explanation:

The Destroy phase of the cloud data life cycle is where information is permanently removed from systems. A common technique in cloud environments for this phase is crypto-shredding (or cryptographic erasure). Rather than physically destroying the media, crypto-shredding involves deleting or revoking encryption keys used to protect the data. Once those keys are destroyed, the encrypted data becomes mathematically unrecoverable, even if the underlying storage media remains intact.

This method is particularly useful in cloud environments where storage is virtualized and hardware cannot easily be physically destroyed. Crypto-shredding provides compliance-friendly assurance that sensitive data such as personally identifiable information (PII), financial data, or healthcare records cannot be accessed after retention periods expire or contractual obligations end.

By incorporating crypto-shredding into the Destroy phase, organizations align with standards for secure data sanitization. This ensures legal defensibility during audits and e-discovery and demonstrates proper lifecycle governance. The emphasis is on making data inaccessible while still maintaining operational efficiency and environmental responsibility.
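The mechanics can be sketched with a toy cipher. A one-time-pad XOR stands in for AES here purely for illustration; the `encrypt` helper and the key handling are not production code:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR cipher for illustration only; real systems
    # use AES via a vetted cryptographic library. XOR is its own inverse,
    # so the same function both encrypts and decrypts.
    return bytes(p ^ k for p, k in zip(plaintext, key))

data = b"patient record #4711"
key = secrets.token_bytes(len(data))      # key held separately from the data

ciphertext = encrypt(data, key)           # this is what sits on the media
assert encrypt(ciphertext, key) == data   # key present -> data recoverable

key = None  # crypto-shredding: destroy the only copy of the key
# With the key gone, the ciphertext alone is unrecoverable even though
# the storage media still physically holds it.
```

The point of the demo: nothing was done to the ciphertext itself; destroying the key is what renders the data inaccessible.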



In most redundant array of independent disks (RAID) configurations, data is stored across different disks.
Which method of storing data is described?

  A. Striping
  B. Archiving
  C. Mapping
  D. Crypto-shredding

Answer(s): A

Explanation:

The method described is striping, which is a technique used in RAID configurations to improve performance and distribute risk. Striping involves splitting data into smaller segments and writing those segments across multiple disks simultaneously. For example, if a file is divided into four parts, each part is written to a separate disk in the RAID array.

This parallelism enhances input/output (I/O) performance because multiple drives can be accessed at once. It also provides resilience depending on the RAID level.
While striping by itself (RAID 0) increases performance but not redundancy, when combined with mirroring or parity (e.g., RAID 5 or RAID 10), it offers both speed and fault tolerance.

The purpose of striping in the data management context is to optimize how data is stored, accessed, and protected. It is fundamentally different from archiving, mapping, or crypto-shredding, as those serve different objectives (long-term storage, logical placement, or secure deletion). Striping is central to high-performance storage systems and supports availability in mission-critical environments.
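A RAID 0 style split can be sketched in a few lines: fixed-size stripe units are dealt round-robin across the disks (here, byte arrays standing in for disks; no parity or mirroring):

```python
def stripe(data: bytes, num_disks: int, stripe_unit: int = 4):
    # Split data into fixed-size units and deal them out round-robin
    # across the disks (RAID 0 style: performance, no redundancy).
    disks = [bytearray() for _ in range(num_disks)]
    for i in range(0, len(data), stripe_unit):
        disks[(i // stripe_unit) % num_disks] += data[i:i + stripe_unit]
    return disks

def unstripe(disks, stripe_unit: int = 4) -> bytes:
    # Reassemble by reading one unit from each disk in turn.
    out = bytearray()
    offsets = [0] * len(disks)
    disk = 0
    while any(off < len(d) for off, d in zip(offsets, disks)):
        out += disks[disk][offsets[disk]:offsets[disk] + stripe_unit]
        offsets[disk] += stripe_unit
        disk = (disk + 1) % len(disks)
    return bytes(out)

disks = stripe(b"ABCDEFGHIJKLMNOP", 4)
print(disks)   # [bytearray(b'ABCD'), bytearray(b'EFGH'), bytearray(b'IJKL'), bytearray(b'MNOP')]
```

In real hardware the four writes happen in parallel, which is where the I/O speedup described above comes from.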



As part of training to help data center engineers understand the attack vectors that affect infrastructure, a set of information about access and availability attacks is presented. Part of the lab requires the engineers to identify different threat vectors and their names.
Which threat prohibits the use of data by preventing access to it?

  A. Brute force
  B. Encryption
  C. Rainbow tables
  D. Denial of service

Answer(s): D

Explanation:

The described threat is a Denial of Service (DoS) attack. In security contexts, a DoS attack aims to make a system, application, or data unavailable to legitimate users by overwhelming resources. Unlike brute force or rainbow table attacks, which target authentication mechanisms, or encryption, which is a defensive control, DoS focuses on disrupting availability--the "A" in the Confidentiality, Integrity, Availability (CIA) triad.

DoS can be executed in many ways: flooding a network with traffic, exhausting server memory, or overwhelming application processes.
When scaled by multiple coordinated systems, it becomes a Distributed Denial of Service (DDoS) attack. In either case, the effect is the same--authorized users cannot access critical data or services.

For cloud environments, where service uptime is crucial, DoS protections such as rate limiting, auto-scaling, and upstream filtering are essential. Training data center engineers to recognize DoS helps them understand the importance of resilience strategies and ensures continuity planning includes availability safeguards.
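One of those mitigations, rate limiting, is commonly implemented as a token bucket: each request spends a token, tokens refill at a fixed rate, and bursts beyond the bucket size are rejected instead of exhausting the server. A minimal sketch (class and parameter names are illustrative):

```python
import time

class TokenBucket:
    # Simple token-bucket rate limiter: one sketch of a DoS mitigation.
    def __init__(self, rate: float, capacity: int):
        self.rate = rate                  # tokens added per second
        self.capacity = capacity          # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill according to elapsed time, then try to spend one token.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=5)
results = [bucket.allow() for _ in range(8)]   # burst of 8 requests
print(results)   # first 5 pass, the rest are throttled
```

Legitimate traffic at or below the refill rate is unaffected, while a flood is capped at the bucket capacity.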



An engineer has been given the task of ensuring all of the keys used to encrypt archival data are securely stored according to industry standards.
Which location is a secure option for the engineer to store encryption keys for decrypting data?

  A. A repository that is made private
  B. An escrow that is kept separate from the data it is tied to
  C. An escrow that is kept local to the data it is tied to
  D. A repository that is made public

Answer(s): B

Explanation:

Industry best practice requires that encryption keys are stored separately from the data they protect. This ensures that if the data storage system is compromised, attackers cannot immediately decrypt sensitive information. The use of a secure escrow system is a recognized approach.

An escrow provides controlled storage for encryption keys, ensuring they are only accessible by authorized processes and not co-located with the protected data. Keeping keys "local" to the data creates a single point of failure. A public or private repository without specialized protection mechanisms would also be insufficient due to risks of insider threats or misconfiguration.

By placing keys in an independent escrow system, the organization enforces separation of duties, strengthens defense-in-depth, and aligns with cryptographic standards from NIST and ISO. This practice is vital when dealing with archival data, where long-term confidentiality must be preserved even as systems evolve.
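The separation can be sketched with two stores: the data store holds only ciphertext plus a key identifier, while the escrow holds the keys behind an access check. The `escrow`, `data_store`, and helper names are hypothetical, and the XOR cipher is a stand-in for real encryption:

```python
import secrets

escrow = {}       # key_id -> key material (a separate system in practice)
data_store = {}   # object_id -> (key_id, ciphertext)

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Illustrative XOR cipher; real systems use AES from a vetted library.
    return bytes(d ^ k for d, k in zip(data, key))

def archive(object_id: str, plaintext: bytes):
    # Key material goes to the escrow only; the data store never sees it.
    key_id = f"key-{object_id}"
    key = secrets.token_bytes(len(plaintext))
    escrow[key_id] = key
    data_store[object_id] = (key_id, toy_cipher(plaintext, key))

def retrieve(object_id: str, authorized: bool) -> bytes:
    # Decryption requires passing the escrow's access check; compromising
    # the data store alone yields only ciphertext and a key reference.
    key_id, ciphertext = data_store[object_id]
    if not authorized:
        raise PermissionError("escrow access denied")
    return toy_cipher(ciphertext, escrow[key_id])

archive("report-2025", b"archival payroll data")
print(retrieve("report-2025", authorized=True))
```

An attacker who copies `data_store` gets no plaintext, which is exactly the property that co-locating keys with data would forfeit.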



An organization wants to ensure that all entities trust any certificate generated internally in the organization.
What should be used to generate these certificates?

  A. Individual users' private keys
  B. The organization's certificate repository server
  C. The organization's certificate authority server
  D. Individual systems' private keys

Answer(s): C

Explanation:

Trust in digital certificates comes from their issuance by a Certificate Authority (CA). A CA is a trusted entity that validates identities and signs certificates. In internal environments, organizations often operate a private CA to issue certificates for users, systems, and services.

If certificates were generated by individual private keys or systems without central authority, there would be no unified trust chain, and validating authenticity across the organization would be impossible. A certificate repository server only distributes certificates but cannot establish trust.

By using an organizational CA server, all certificates are linked to a root of trust. Systems configured to trust the organization's CA will trust any certificate it issues. This allows secure internal communications (TLS, VPN, email signing) and ensures scalability as new services come online. It also supports compliance with enterprise PKI policies.
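A private CA of this kind can be stood up with the standard `openssl` CLI. A minimal sketch, with file names and subject names purely illustrative:

```shell
# 1. CA key and self-signed root certificate (the organizational root of trust)
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
  -days 365 -subj "/CN=Example Internal Root CA"

# 2. Server key and certificate signing request (CSR)
openssl req -newkey rsa:2048 -nodes -keyout server.key -out server.csr \
  -subj "/CN=intranet.example.local"

# 3. The CA signs the CSR, producing a certificate chained to the root
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out server.crt -days 90

# 4. Any client that trusts ca.crt now validates server.crt
openssl verify -CAfile ca.crt server.crt
```

Distributing `ca.crt` to every endpoint's trust store is what makes each newly issued certificate trusted organization-wide without any per-certificate configuration.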



A customer service representative needs to verify a customer's private information, but the representative does not need to see all the information.
Which technique should the service provider use to protect the privacy of the customer?

  A. Hashing
  B. Encryption
  C. Masking
  D. Tokenization

Answer(s): C

Explanation:

Data masking is a privacy-preserving technique that replaces sensitive fields with obfuscated or partial values while retaining usability, such as displaying only the last four digits of a Social Security number or credit card number. This allows a representative to verify identity without accessing the full data set.

Hashing and encryption protect data at rest or in transit, but they do not allow selective partial display. Tokenization substitutes sensitive data with unique tokens but is typically used for storage and processing rather than interactive verification. Masking, on the other hand, is specifically designed for scenarios where a user must work with limited but recognizable data.

By using masking, organizations enforce the principle of least privilege, reduce exposure of sensitive information, and align with privacy standards such as PCI DSS and GDPR.
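A masking helper of this kind is a few lines of code (the function name is illustrative):

```python
import re

def mask_pan(value: str) -> str:
    # Show only the last four digits; everything else is replaced so an
    # agent can confirm identity without ever seeing the full number.
    digits = re.sub(r"\D", "", value)
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_pan("4111 1111 1111 1234"))   # ************1234
```

The representative sees enough to confirm the customer's claim, and the full value never reaches their screen or session logs.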



An organization is planning for an upcoming Payment Card Industry Data Security Standard (PCI DSS) audit and wants to ensure that only relevant files are included in the audit materials.
Which process should the organization use to ensure that the relevant files are identified?

  A. Normalization
  B. Tokenization
  C. Categorization
  D. Anonymization

Answer(s): C

Explanation:

Categorization is the process of systematically identifying and classifying files according to content and relevance. In preparation for a PCI DSS audit, it is critical to identify which files fall within scope--those that contain cardholder data or impact its security.

Normalization adjusts data format, tokenization substitutes sensitive data with tokens, and anonymization removes identifiers.
While useful, none directly address the task of isolating "relevant files" for audit. Categorization ensures that files are grouped correctly, allowing auditors to focus on the proper scope and preventing unnecessary exposure of unrelated data.

This step aligns with PCI DSS requirements that limit scope to systems and data directly affecting cardholder data security. Proper categorization streamlines audits and demonstrates effective data governance.
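A first automated pass at such categorization might scan file contents for PAN-like digit runs and flag matches as in-scope. A rough sketch (the pattern and labels are illustrative; real scoping tools also validate check digits and inspect system connectivity):

```python
import re

# Rough PAN-shaped pattern: 13-16 digits, optionally separated by
# spaces or hyphens. Illustrative only; it will produce false positives.
PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def categorize(files: dict) -> dict:
    # files: {name: text content} -> {name: "in-scope" / "out-of-scope"}
    return {
        name: "in-scope" if PAN_PATTERN.search(text) else "out-of-scope"
        for name, text in files.items()
    }

files = {
    "orders.csv": "order,card\n1001,4111 1111 1111 1111",
    "blog_post.txt": "Our new product launches next week.",
}
print(categorize(files))
```

Files flagged in-scope then receive the cardholder-data handling controls, while the rest are kept out of the audit package.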


