Amazon AWS Certified Developer - Associate DVA-C02 Exam Questions in PDF

Free Amazon AWS Certified Developer - Associate DVA-C02 Dumps Questions (page: 1)

A company is implementing an application on Amazon EC2 instances. The application needs to process incoming transactions. When the application detects a transaction that is not valid, the application must send a chat message to the company's support team. To send the message, the application needs to retrieve the access token to authenticate by using the chat API.
A developer needs to implement a solution to store the access token. The access token must be encrypted at rest and in transit. The access token must also be accessible from other AWS accounts.
Which solution will meet these requirements with the LEAST management overhead?

  1. Use an AWS Systems Manager Parameter Store SecureString parameter that uses an AWS Key Management Service (AWS KMS) AWS managed key to store the access token. Add a resource-based policy to the parameter to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Parameter Store. Retrieve the token from Parameter Store with the decrypt flag enabled. Use the decrypted access token to send the message to the chat.
  2. Encrypt the access token by using an AWS Key Management Service (AWS KMS) customer managed key. Store the access token in an Amazon DynamoDB table. Update the IAM role of the EC2 instances with permissions to access DynamoDB and AWS KMS. Retrieve the token from DynamoDB. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.
  3. Use AWS Secrets Manager with an AWS Key Management Service (AWS KMS) customer managed key to store the access token. Add a resource-based policy to the secret to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Secrets Manager. Retrieve the token from Secrets Manager. Use the decrypted access token to send the message to the chat.
  4. Encrypt the access token by using an AWS Key Management Service (AWS KMS) AWS managed key. Store the access token in an Amazon S3 bucket. Add a bucket policy to the S3 bucket to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Amazon S3 and AWS KMS. Retrieve the token from the S3 bucket. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.

Answer(s): C
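Cross-account access to a Secrets Manager secret takes two pieces: a resource-based policy on the secret itself, plus IAM permissions in the consuming account. A minimal sketch of the secret's resource policy (the account ID is hypothetical):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "*"
    }
  ]
}
```

Because the secret is encrypted with a customer managed KMS key, the other account's role also needs kms:Decrypt on that key; an AWS managed key cannot be shared across accounts, which is why option C specifies a customer managed key.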



A company is running Amazon EC2 instances in multiple AWS accounts. A developer needs to implement an application that collects all the lifecycle events of the EC2 instances. The application needs to store the lifecycle events in a single Amazon Simple Queue Service (Amazon SQS) queue in the company's main AWS account for further processing.
Which solution will meet these requirements?

  1. Configure Amazon EC2 to deliver the EC2 instance lifecycle events from all accounts to the Amazon EventBridge event bus of the main account. Add an EventBridge rule to the event bus of the main account that matches all EC2 instance lifecycle events. Add the SQS queue as a target of the rule.
  2. Use the resource policies of the SQS queue in the main account to give each account permissions to write to that SQS queue. Add to the Amazon EventBridge event bus of each account an EventBridge rule that matches all EC2 instance lifecycle events. Add the SQS queue in the main account as a target of the rule.
  3. Write an AWS Lambda function that scans through all EC2 instances in the company accounts to detect EC2 instance lifecycle changes. Configure the Lambda function to write a notification message to the SQS queue in the main account if the function detects an EC2 instance lifecycle change. Add an Amazon EventBridge scheduled rule that invokes the Lambda function every minute.
  4. Configure the permissions on the main account event bus to receive events from all accounts. Create an Amazon EventBridge rule in each account to send all the EC2 instance lifecycle events to the main account event bus. Add an EventBridge rule to the main account event bus that matches all EC2 instance lifecycle events. Set the SQS queue as a target for the rule.

Answer(s): D
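As a sketch of how option D fits together: each member account gets an EventBridge rule that forwards matching events to the main account's event bus, and the main bus's resource policy permits events:PutEvents from those accounts. The event pattern that matches EC2 instance lifecycle (state-change) events is:

```json
{
  "source": ["aws.ec2"],
  "detail-type": ["EC2 Instance State-change Notification"]
}
```

The same pattern is used on the main-account rule that delivers the events to the SQS queue.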



An application is using Amazon Cognito user pools and identity pools for secure access. A developer wants to integrate the user-specific file upload and download features in the application with Amazon S3. The developer must ensure that the files are saved and retrieved in a secure manner and that users can access only their own files. The file sizes range from 3 KB to 300 MB.
Which option will meet these requirements with the HIGHEST level of security?

  1. Use S3 Event Notifications to validate the file upload and download requests and update the user interface (UI).
  2. Save the details of the uploaded files in a separate Amazon DynamoDB table. Filter the list of files in the user interface (UI) by comparing the current user ID with the user ID associated with the file in the table.
  3. Use Amazon API Gateway and an AWS Lambda function to upload and download files. Validate each request in the Lambda function before performing the requested operation.
  4. Use an IAM policy within the Amazon Cognito identity prefix to restrict users to their own folders in Amazon S3.

Answer(s): D
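Option D relies on IAM policy variables: the identity pool's authenticated role can scope S3 access to a per-user prefix via ${cognito-identity.amazonaws.com:sub}, so each user can reach only their own folder. A minimal sketch (the bucket name is hypothetical):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::example-user-files/${cognito-identity.amazonaws.com:sub}/*"
    }
  ]
}
```

This enforces isolation at the authorization layer rather than in application code, which is why it rates as the highest level of security among the options.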



A company is building a scalable data management solution by using AWS services to improve the speed and agility of development. The solution will ingest large volumes of data from various sources and will process this data through multiple business rules and transformations.
The solution requires business rules to run in sequence and to handle reprocessing of data if errors occur when the business rules run. The company needs the solution to be scalable and to require the least possible maintenance.
Which AWS service should the company use to manage and automate the orchestration of the data flows to meet these requirements?

  1. AWS Batch
  2. AWS Step Functions
  3. AWS Glue
  4. AWS Lambda

Answer(s): B
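Step Functions fits because sequencing and error handling are declarative: each business rule becomes a Task state, and Retry (or Catch) fields handle reprocessing without custom code. A minimal Amazon States Language sketch (state names and Lambda ARNs are hypothetical):

```json
{
  "StartAt": "Rule1",
  "States": {
    "Rule1": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:111122223333:function:Rule1",
      "Retry": [
        {
          "ErrorEquals": ["States.TaskFailed"],
          "IntervalSeconds": 5,
          "MaxAttempts": 3,
          "BackoffRate": 2.0
        }
      ],
      "Next": "Rule2"
    },
    "Rule2": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:111122223333:function:Rule2",
      "End": true
    }
  }
}
```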



A developer has created an AWS Lambda function that is written in Python. The Lambda function reads data from objects in Amazon S3 and writes data to an Amazon DynamoDB table. The function is successfully invoked from an S3 event notification when an object is created. However, the function fails when it attempts to write to the DynamoDB table.
What is the MOST likely cause of this issue?

  1. The Lambda function's concurrency limit has been exceeded.
  2. The DynamoDB table requires a global secondary index (GSI) to support writes.
  3. The Lambda function does not have IAM permissions to write to DynamoDB.
  4. The DynamoDB table is not running in the same Availability Zone as the Lambda function.

Answer(s): C
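The fix is to attach a policy granting write access to the table on the Lambda function's execution role. A minimal sketch (the table ARN is hypothetical):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:PutItem", "dynamodb:UpdateItem", "dynamodb:BatchWriteItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/ExampleTable"
    }
  ]
}
```

The S3 read already works (the invocation succeeds and the function reads the object), which points to the missing DynamoDB permission rather than networking: DynamoDB is a regional service, so option D's Availability Zone claim does not apply.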



A developer is creating an AWS CloudFormation template to deploy Amazon EC2 instances across multiple AWS accounts. The developer must choose the EC2 instances from a list of approved instance types.
How can the developer incorporate the list of approved instance types in the CloudFormation template?

  1. Create a separate CloudFormation template for each EC2 instance type in the list.
  2. In the Resources section of the CloudFormation template, create resources for each EC2 instance type in the list.
  3. In the CloudFormation template, create a separate parameter for each EC2 instance type in the list.
  4. In the CloudFormation template, create a parameter with the list of EC2 instance types as AllowedValues.

Answer(s): D
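AllowedValues turns the approved list into template-level validation: stack creation fails fast if the parameter value is not on the list. A minimal sketch (the instance types and AMI ID are hypothetical):

```yaml
Parameters:
  InstanceType:
    Type: String
    Default: t3.micro
    AllowedValues:
      - t3.micro
      - t3.small
      - m5.large
    Description: Choose one of the approved EC2 instance types.
Resources:
  AppInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: !Ref InstanceType
      ImageId: ami-12345678
```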



A developer has an application that makes batch requests directly to Amazon DynamoDB by using the BatchGetItem low-level API operation. The responses frequently return values in the UnprocessedKeys element.
Which actions should the developer take to increase the resiliency of the application when the batch response includes values in UnprocessedKeys? (Choose two.)

  1. Retry the batch operation immediately.
  2. Retry the batch operation with exponential backoff and randomized delay.
  3. Update the application to use an AWS software development kit (AWS SDK) to make the requests.
  4. Increase the provisioned read capacity of the DynamoDB tables that the operation accesses.
  5. Increase the provisioned write capacity of the DynamoDB tables that the operation accesses.

Answer(s): B,C
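The retry behavior the correct answers describe can be sketched as a generic helper. This is an illustrative sketch, not AWS SDK code (the AWS SDKs implement equivalent retries internally): `batch_get` stands in for a boto3 DynamoDB client's `batch_get_item` method, and the delay uses exponential backoff with full jitter.

```python
import random
import time

def retry_batch_get(batch_get, request_items, max_attempts=5, base_delay=0.1):
    """Retry a DynamoDB-style batch read until UnprocessedKeys is empty.

    batch_get is any callable with the shape of boto3's
    DynamoDB.Client.batch_get_item(RequestItems=...); leftover keys are
    retried with exponential backoff and a randomized (full-jitter) delay.
    """
    responses = {}
    for attempt in range(max_attempts):
        result = batch_get(RequestItems=request_items)
        # Accumulate whatever items came back on this attempt.
        for table, items in result.get("Responses", {}).items():
            responses.setdefault(table, []).extend(items)
        # Retry only the keys DynamoDB did not process.
        request_items = result.get("UnprocessedKeys", {})
        if not request_items:
            return responses
        # Full jitter: sleep a random time in [0, base_delay * 2**attempt).
        time.sleep(random.uniform(0, base_delay * (2 ** attempt)))
    raise RuntimeError("keys remained unprocessed after all retry attempts")
```

Retrying immediately (option A) tends to hit the same throttling again, which is why the backoff and jitter matter.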



A company is running a custom application on a set of on-premises Linux servers that are accessed using Amazon API Gateway. AWS X-Ray tracing has been enabled on the API test stage.
How can a developer enable X-Ray tracing on the on-premises servers with the LEAST amount of configuration?

  1. Install and run the X-Ray SDK on the on-premises servers to capture and relay the data to the X-Ray service.
  2. Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.
  3. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTraceSegments API call.
  4. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTelemetryRecords API call.

Answer(s): B



