Amazon AWS Certified Solutions Architect - Professional (SAP-C02) Exam
Updated on: 09-Dec-2025

Page 10 of 107

A company is building a serverless application that runs on an AWS Lambda function that is attached to a VPC. The company needs to integrate the application with a new service from an external provider. The external provider supports only requests that come from public IPv4 addresses that are in an allow list.

The company must provide a single public IP address to the external provider before the application can start using the new service.

Which solution will give the application the ability to access the new service?

  A. Deploy a NAT gateway. Associate an Elastic IP address with the NAT gateway. Configure the VPC to use the NAT gateway.
  B. Deploy an egress-only internet gateway. Associate an Elastic IP address with the egress-only internet gateway. Configure the elastic network interface on the Lambda function to use the egress-only internet gateway.
  C. Deploy an internet gateway. Associate an Elastic IP address with the internet gateway. Configure the Lambda function to use the internet gateway.
  D. Deploy an internet gateway. Associate an Elastic IP address with the internet gateway. Configure the default route in the public VPC route table to use the internet gateway.

Answer(s): A

Explanation:

A) Deploy a NAT gateway. Associate an Elastic IP address with the NAT gateway. Configure the VPC to use the NAT gateway is the correct solution. A NAT gateway lets resources in a private subnet (such as a Lambda function attached to a VPC) reach the internet or external services while keeping those resources private. Associating an Elastic IP address with the NAT gateway ensures that all outbound traffic leaves through a single, predictable public IPv4 address, which satisfies the provider's allow-list requirement. The other options do not work: an egress-only internet gateway handles IPv6 traffic only, and an Elastic IP address cannot be associated with an internet gateway, nor does an internet gateway by itself give a Lambda function's private network interfaces a public address.
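As a rough illustration of option A, the boto3 sketch below allocates an Elastic IP address, creates a NAT gateway in a public subnet, and points the private subnet's default route at it. The subnet and route table IDs are placeholders, not values from the question.

```python
# Hypothetical sketch of option A with boto3: allocate an Elastic IP,
# create a NAT gateway, and route the Lambda function's private subnet
# through it. All resource IDs below are placeholders.
import boto3

ec2 = boto3.client("ec2")

# 1. Allocate the single, stable public IPv4 address to hand to the provider.
eip = ec2.allocate_address(Domain="vpc")

# 2. Create the NAT gateway in a *public* subnet (one whose route table
#    already sends 0.0.0.0/0 to an internet gateway).
nat = ec2.create_nat_gateway(
    SubnetId="subnet-0public0000000000",   # placeholder public subnet
    AllocationId=eip["AllocationId"],
)
nat_id = nat["NatGateway"]["NatGatewayId"]
ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])

# 3. Send all outbound traffic from the private subnet (where the Lambda
#    function's ENIs live) through the NAT gateway.
ec2.create_route(
    RouteTableId="rtb-0private000000000",  # placeholder private route table
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat_id,
)

print("Give this IP to the external provider:", eip["PublicIp"])
```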



A solutions architect has developed a web application that uses an Amazon API Gateway Regional endpoint and an AWS Lambda function. The consumers of the web application are all close to the AWS Region where the application will be deployed. The Lambda function only queries an Amazon Aurora MySQL database. The solutions architect has configured the database to have three read replicas.

During testing, the application does not meet performance requirements. Under high load, the application opens a large number of database connections. The solutions architect must improve the application’s performance.

Which actions should the solutions architect take to meet these requirements? (Choose two.)

  A. Use the cluster endpoint of the Aurora database.
  B. Use RDS Proxy to set up a connection pool to the reader endpoint of the Aurora database.
  C. Use the Lambda Provisioned Concurrency feature.
  D. Move the code for opening the database connection in the Lambda function outside of the event handler.
  E. Change the API Gateway endpoint to an edge-optimized endpoint.

Answer(s): B,D

Explanation:

B) Use RDS Proxy to set up a connection pool to the reader endpoint of the Aurora database, and
D) Move the code for opening the database connection in the Lambda function outside of the event handler are the correct answers.

RDS Proxy improves database performance by managing connections through a pool, which is critical under high load, when too many direct connections could overwhelm the Aurora MySQL database. Pointing the proxy at the reader endpoint also offloads read queries from the primary instance to the three read replicas.
Moving the connection-opening code outside the Lambda function's event handler lets the connection be reused across invocations within the same execution environment, eliminating the overhead of repeatedly opening and closing connections and improving both performance and scalability.
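To make option D concrete, here is a minimal, hypothetical handler that opens its pymysql connection once per execution environment and points it at an RDS Proxy endpoint. The endpoint, credentials, and table name are assumed placeholders supplied through environment variables.

```python
# Minimal sketch of reusing a database connection across Lambda invocations.
# The proxy endpoint, credentials, and table below are placeholders.
import os
import pymysql

# Opened once per execution environment (cold start), then reused by every
# subsequent invocation that lands on this warm environment. Production code
# would also reconnect if the connection has gone stale.
connection = pymysql.connect(
    host=os.environ["RDS_PROXY_READER_ENDPOINT"],  # RDS Proxy pools connections behind this endpoint
    user=os.environ["DB_USER"],
    password=os.environ["DB_PASSWORD"],
    database=os.environ["DB_NAME"],
    connect_timeout=5,
)

def handler(event, context):
    # The handler only borrows the already-open connection; it never opens
    # or closes one itself, so high concurrency does not translate into a
    # flood of new database connections.
    with connection.cursor() as cursor:
        cursor.execute("SELECT id, value FROM sensor_readings LIMIT 10")
        rows = cursor.fetchall()
    return {"count": len(rows)}
```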



A company is planning to host a web application on AWS and wants to load balance the traffic across a group of Amazon EC2 instances. One of the security requirements is to enable end-to-end encryption in transit between the client and the web server.

Which solution will meet this requirement?

  A. Place the EC2 instances behind an Application Load Balancer (ALB). Provision an SSL certificate using AWS Certificate Manager (ACM), and associate the SSL certificate with the ALB. Export the SSL certificate and install it on each EC2 instance. Configure the ALB to listen on port 443 and to forward traffic to port 443 on the instances.
  B. Associate the EC2 instances with a target group. Provision an SSL certificate using AWS Certificate Manager (ACM). Create an Amazon CloudFront distribution and configure it to use the SSL certificate. Set CloudFront to use the target group as the origin server.
  C. Place the EC2 instances behind an Application Load Balancer (ALB). Provision an SSL certificate using AWS Certificate Manager (ACM), and associate the SSL certificate with the ALB. Provision a third-party SSL certificate and install it on each EC2 instance. Configure the ALB to listen on port 443 and to forward traffic to port 443 on the instances.
  D. Place the EC2 instances behind a Network Load Balancer (NLB). Provision a third-party SSL certificate and install it on the NLB and on each EC2 instance. Configure the NLB to listen on port 443 and to forward traffic to port 443 on the instances.

Answer(s): C

Explanation:

C) Place the EC2 instances behind an Application Load Balancer (ALB). Provision an SSL certificate using AWS Certificate Manager (ACM), and associate the SSL certificate with the ALB. Provision a third-party SSL certificate and install it on each EC2 instance. Configure the ALB to listen on port 443 and to forward traffic to port 443 on the instances is the correct solution.

This approach ensures end-to-end encryption: the ALB terminates TLS for the client connection using the ACM certificate, and the third-party certificate installed on each EC2 instance encrypts the hop between the ALB and the instances. A third-party certificate is needed on the instances because ACM does not allow its public certificates to be exported for installation on servers, which is what rules out option A.

This setup meets the security requirement while providing load balancing for traffic to the EC2 instances.
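A hedged boto3 sketch of the listener wiring for option C follows; all ARNs and the VPC ID are placeholders, not values from the question.

```python
# Hypothetical sketch of the listener wiring in option C using boto3.
# The target group must use HTTPS on port 443 so traffic is re-encrypted
# between the ALB and the instances.
import boto3

elbv2 = boto3.client("elbv2")

# Target group: the instances themselves terminate TLS (third-party cert).
tg = elbv2.create_target_group(
    Name="web-https-tg",
    Protocol="HTTPS",
    Port=443,
    VpcId="vpc-0example0000000000",  # placeholder
)

# ALB listener: terminates the client-facing TLS with the ACM certificate,
# then forwards over HTTPS, keeping every hop encrypted.
elbv2.create_listener(
    LoadBalancerArn="arn:aws:elasticloadbalancing:...:loadbalancer/app/web/xxxx",  # placeholder
    Protocol="HTTPS",
    Port=443,
    Certificates=[{"CertificateArn": "arn:aws:acm:...:certificate/xxxx"}],  # placeholder ACM cert
    DefaultActions=[{
        "Type": "forward",
        "TargetGroupArn": tg["TargetGroups"][0]["TargetGroupArn"],
    }],
)
```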



A company wants to migrate its data analytics environment from on premises to AWS. The environment consists of two simple Node.js applications. One of the applications collects sensor data and loads it into a MySQL database. The other application aggregates the data into reports. When the aggregation jobs run, some of the load jobs fail to run correctly.

The company must resolve the data loading issue. The company also needs the migration to occur without interruptions or changes for the company’s customers.

What should a solutions architect do to meet these requirements?

  A. Set up an Amazon Aurora MySQL database as a replication target for the on-premises database. Create an Aurora Replica for the Aurora MySQL database, and move the aggregation jobs to run against the Aurora Replica. Set up collection endpoints as AWS Lambda functions behind a Network Load Balancer (NLB), and use Amazon RDS Proxy to write to the Aurora MySQL database. When the databases are synced, disable the replication job and restart the Aurora Replica as the primary instance. Point the collector DNS record to the NLB.
  B. Set up an Amazon Aurora MySQL database. Use AWS Database Migration Service (AWS DMS) to perform continuous data replication from the on-premises database to Aurora. Move the aggregation jobs to run against the Aurora MySQL database. Set up collection endpoints behind an Application Load Balancer (ALB) as Amazon EC2 instances in an Auto Scaling group. When the databases are synced, point the collector DNS record to the ALB. Disable the AWS DMS sync task after the cutover from on premises to AWS.
  C. Set up an Amazon Aurora MySQL database. Use AWS Database Migration Service (AWS DMS) to perform continuous data replication from the on-premises database to Aurora. Create an Aurora Replica for the Aurora MySQL database, and move the aggregation jobs to run against the Aurora Replica. Set up collection endpoints as AWS Lambda functions behind an Application Load Balancer (ALB), and use Amazon RDS Proxy to write to the Aurora MySQL database. When the databases are synced, point the collector DNS record to the ALB. Disable the AWS DMS sync task after the cutover from on premises to AWS.
  D. Set up an Amazon Aurora MySQL database. Create an Aurora Replica for the Aurora MySQL database, and move the aggregation jobs to run against the Aurora Replica. Set up collection endpoints as an Amazon Kinesis data stream. Use Amazon Kinesis Data Firehose to replicate the data to the Aurora MySQL database. When the databases are synced, disable the replication job and restart the Aurora Replica as the primary instance. Point the collector DNS record to the Kinesis data stream.

Answer(s): C

Explanation:

C) Set up an Amazon Aurora MySQL database. Use AWS Database Migration Service (AWS DMS) to perform continuous data replication from the on-premises database to Aurora. Create an Aurora Replica for the Aurora MySQL database, and move the aggregation jobs to run against the Aurora Replica. Set up collection endpoints as AWS Lambda functions behind an Application Load Balancer (ALB), and use Amazon RDS Proxy to write to the Aurora MySQL database. When the databases are synced, point the collector DNS record to the ALB. Disable the AWS DMS sync task after the cutover from on premises to AWS is the correct answer.

This solution uses AWS DMS for continuous replication from the on-premises MySQL database to Amazon Aurora, keeping the databases in sync until cutover so customers experience no interruption. Running the aggregation jobs against an Aurora Replica offloads read traffic from the primary instance, which resolves the contention that was causing the load jobs to fail. AWS Lambda collection endpoints behind an ALB scale with incoming sensor data, and Amazon RDS Proxy pools the writers' connections so the load jobs no longer overwhelm the database. After the DNS cutover to the ALB, the DMS sync task is disabled.
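The continuous-replication piece could look roughly like the boto3 sketch below; the endpoint and replication instance ARNs are placeholders, and the table mapping simply includes every table.

```python
# Hypothetical sketch of the continuous-replication piece of option C:
# a DMS task that does a full load followed by change data capture (CDC).
# All ARNs and the table-mapping rule are placeholders.
import json
import boto3

dms = boto3.client("dms")

# Select every schema and table for replication.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="onprem-mysql-to-aurora",
    SourceEndpointArn="arn:aws:dms:...:endpoint/source",    # placeholder
    TargetEndpointArn="arn:aws:dms:...:endpoint/target",    # placeholder
    ReplicationInstanceArn="arn:aws:dms:...:rep/instance",  # placeholder
    MigrationType="full-load-and-cdc",  # full load, then ongoing change capture
    TableMappings=json.dumps(table_mappings),
)
```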



A health insurance company stores personally identifiable information (PII) in an Amazon S3 bucket. The company uses server-side encryption with S3 managed encryption keys (SSE-S3) to encrypt the objects. According to a new requirement, all current and future objects in the S3 bucket must be encrypted by keys that the company’s security team manages. The S3 bucket does not have versioning enabled.

Which solution will meet these requirements?

  A. In the S3 bucket properties, change the default encryption to SSE-S3 with a customer managed key. Use the AWS CLI to re-upload all objects in the S3 bucket. Set an S3 bucket policy to deny unencrypted PutObject requests.
  B. In the S3 bucket properties, change the default encryption to server-side encryption with AWS KMS managed encryption keys (SSE-KMS). Set an S3 bucket policy to deny unencrypted PutObject requests. Use the AWS CLI to re-upload all objects in the S3 bucket.
  C. In the S3 bucket properties, change the default encryption to server-side encryption with AWS KMS managed encryption keys (SSE-KMS). Set an S3 bucket policy to automatically encrypt objects on GetObject and PutObject requests.
  D. In the S3 bucket properties, change the default encryption to AES-256 with a customer managed key. Attach a policy to deny unencrypted PutObject requests to any entities that access the S3 bucket. Use the AWS CLI to re-upload all objects in the S3 bucket.

Answer(s): B

Explanation:

B) In the S3 bucket properties, change the default encryption to server-side encryption with AWS KMS managed encryption keys (SSE-KMS). Set an S3 bucket policy to deny unencrypted PutObject requests. Use the AWS CLI to re-upload all objects in the S3 bucket is the correct answer.

The requirement is to ensure that all current and future objects in the S3 bucket are encrypted with customer-managed keys, which means using AWS KMS managed encryption keys (SSE-KMS) instead of the default SSE-S3 encryption.

To meet this requirement:

  - Change the default encryption setting to SSE-KMS so that all future objects are encrypted with the customer managed key.
  - Set a bucket policy that denies unencrypted PutObject requests to enforce compliance for future uploads.
  - Because the bucket does not have versioning enabled, re-upload the existing objects so that they are re-encrypted under SSE-KMS.

This solution encrypts both existing and future objects with keys the security team manages, meeting the new requirement.
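As a hedged boto3 sketch of option B (using an in-place copy_object, which S3 permits when the encryption attributes change, rather than a literal download-and-re-upload), with the bucket name and KMS key ARN as placeholders:

```python
# Hypothetical sketch of option B with boto3: switch default encryption to
# SSE-KMS, deny unencrypted uploads, and re-encrypt existing objects.
# Assumes objects are under the 5 GB copy_object limit.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-pii-bucket"                        # placeholder
KEY_ARN = "arn:aws:kms:...:key/placeholder"          # placeholder customer managed key

# 1. Default encryption: new objects get SSE-KMS automatically.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={"Rules": [{
        "ApplyServerSideEncryptionByDefault": {
            "SSEAlgorithm": "aws:kms",
            "KMSMasterKeyID": KEY_ARN,
        }
    }]},
)

# 2. Bucket policy: reject any PutObject that is not SSE-KMS encrypted.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyNonKmsUploads",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
        "Condition": {"StringNotEquals": {"s3:x-amz-server-side-encryption": "aws:kms"}},
    }],
}
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))

# 3. Existing objects: copy each object onto itself with the new key.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket=BUCKET,
            Key=obj["Key"],
            CopySource={"Bucket": BUCKET, "Key": obj["Key"]},
            ServerSideEncryption="aws:kms",
            SSEKMSKeyId=KEY_ARN,
        )
```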


