Amazon SAP-C01 Exam (page: 12)
Amazon AWS Certified Solutions Architect - Professional SAP-C02
Updated on: 11-Dec-2025

Viewing Page 12 of 107

A company with several AWS accounts is using AWS Organizations and service control policies (SCPs). An administrator created the following SCP and has attached it to an organizational unit (OU) that contains AWS account 1111-1111-1111:


Developers working in account 1111-1111-1111 complain that they cannot create Amazon S3 buckets. How should the administrator address this problem?

  1. Add s3:CreateBucket with “Allow” effect to the SCP.
  2. Remove the account from the OU, and attach the SCP directly to account 1111-1111-1111.
  3. Instruct the developers to add Amazon S3 permissions to their IAM entities.
  4. Remove the SCP from account 1111-1111-1111.

Answer(s): C

Explanation:

C) Instruct the developers to add Amazon S3 permissions to their IAM entities is the correct answer.

An SCP (Service Control Policy) defines what services and actions can be used within an AWS account, but it does not grant permissions on its own. It acts as a boundary. In this case, the SCP does not explicitly deny the ability to create Amazon S3 buckets, so the issue is that the developers' IAM roles or users are not assigned the appropriate permissions to create S3 buckets.

To resolve this, the developers need to have the necessary S3 permissions (e.g., s3:CreateBucket) in their IAM roles or policies. Once the appropriate permissions are added, they will be able to create S3 buckets, as the SCP does not restrict that action.
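As an illustration of the fix, the identity-based policy attached to the developers' IAM users or roles might look like the following minimal sketch. The statement ID and resource scope are illustrative assumptions, not taken from the question's SCP:

```python
import json

# Minimal sketch (assumed, not from the question) of an identity-based
# policy granting bucket creation. SCPs only set the permission boundary;
# an Allow like this must still exist on the IAM user or role itself.
s3_create_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBucketCreation",  # illustrative statement ID
            "Effect": "Allow",
            "Action": ["s3:CreateBucket"],
            "Resource": "arn:aws:s3:::*",  # all buckets; scope down as needed
        }
    ],
}

print(json.dumps(s3_create_policy, indent=2))
```

Attaching a policy like this (or an AWS managed policy that covers S3) resolves the complaint without touching the SCP, because the SCP already permits the action.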



A company has a monolithic application that is critical to the company’s business. The company hosts the application on an Amazon EC2 instance that runs Amazon Linux 2. The company’s application team receives a directive from the legal department to back up the data from the instance’s encrypted Amazon Elastic Block Store (Amazon EBS) volume to an Amazon S3 bucket. The application team does not have the administrative SSH key pair for the instance. The application must continue to serve the users.

Which solution will meet these requirements?

  1. Attach a role to the instance with permission to write to Amazon S3. Use the AWS Systems Manager Session Manager option to gain access to the instance and run commands to copy data into Amazon S3.
  2. Create an image of the instance with the reboot option turned on. Launch a new EC2 instance from the image. Attach a role to the new instance with permission to write to Amazon S3. Run a command to copy data into Amazon S3.
  3. Take a snapshot of the EBS volume by using Amazon Data Lifecycle Manager (Amazon DLM). Copy the data to Amazon S3.
  4. Create an image of the instance. Launch a new EC2 instance from the image. Attach a role to the new instance with permission to write to Amazon S3. Run a command to copy data into Amazon S3.

Answer(s): A

Explanation:

A) Attach a role to the instance with permission to write to Amazon S3. Use the AWS Systems Manager Session Manager option to gain access to the instance and run commands to copy data into Amazon S3 is the correct answer.

This solution allows you to securely access the EC2 instance without needing the SSH key pair by using AWS Systems Manager Session Manager. Once access is gained through Session Manager, the necessary commands can be executed to copy data from the EBS volume to the Amazon S3 bucket. Attaching an IAM role with S3 write permissions to the instance ensures that the instance has the necessary permissions to upload the data to S3.

This approach does not interrupt the running application, ensuring that the application continues to serve users while meeting the backup requirement from the legal department.
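A minimal sketch of the two policy documents the instance role would carry, assuming a hypothetical backup bucket named `company-legal-backups`. (In practice the role also needs the `AmazonSSMManagedInstanceCore` managed policy so Session Manager can reach the instance.)

```python
import json

# Permissions policy: lets the instance write objects into the
# (hypothetical) backup bucket required by the legal department.
backup_bucket = "company-legal-backups"  # illustrative name

instance_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{backup_bucket}/*",
        }
    ],
}

# Trust policy: allows EC2 to assume the role, which is what makes it
# attachable to the instance as an instance profile.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(instance_role_policy, indent=2))
```

Once the role is attached, a Session Manager shell on the instance can run an `aws s3 cp` or `aws s3 sync` of the mounted volume's data into the bucket with no SSH key and no downtime.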



A solutions architect needs to copy data from an Amazon S3 bucket in an AWS account to a new S3 bucket in a new AWS account. The solutions architect must implement a solution that uses the AWS CLI.

Which combination of steps will successfully copy the data? (Choose three.)

  1. Create a bucket policy to allow the source bucket to list its contents and to put objects and set object ACLs in the destination bucket. Attach the bucket policy to the destination bucket.
  2. Create a bucket policy to allow a user in the destination account to list the source bucket’s contents and read the source bucket’s objects. Attach the bucket policy to the source bucket.
  3. Create an IAM policy in the source account. Configure the policy to allow a user in the source account to list contents and get objects in the source bucket, and to list contents, put objects, and set object ACLs in the destination bucket. Attach the policy to the user.
  4. Create an IAM policy in the destination account. Configure the policy to allow a user in the destination account to list contents and get objects in the source bucket, and to list contents, put objects, and set object ACLs in the destination bucket. Attach the policy to the user.
  5. Run the aws s3 sync command as a user in the source account. Specify the source and destination buckets to copy the data.
  6. Run the aws s3 sync command as a user in the destination account. Specify the source and destination buckets to copy the data.

Answer(s): B,D,F

Explanation:

The correct answers are:

B) Create a bucket policy to allow a user in the destination account to list the source bucket’s contents and read the source bucket’s objects. Attach the bucket policy to the source bucket.
This step ensures that the destination account has the necessary permissions to access and read the objects from the source bucket.

D) Create an IAM policy in the destination account. Configure the policy to allow a user in the destination account to list contents and get objects in the source bucket, and to list contents, put objects, and set object ACLs in the destination bucket. Attach the policy to the user.
This step grants the user in the destination account permissions to interact with both the source and destination buckets.

F) Run the aws s3 sync command as a user in the destination account. Specify the source and destination buckets to copy the data.
Running the aws s3 sync command from the destination account allows the user to copy the data from the source S3 bucket to the new S3 bucket, ensuring that the permissions set in the previous steps are applied correctly.

This combination of actions ensures that the data is copied from the source bucket in one AWS account to the destination bucket in another AWS account using the AWS CLI, with the appropriate permissions for accessing and managing the data across accounts.
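The two policy documents from steps B and D can be sketched as follows. The account ID, user name, and bucket names are illustrative placeholders, not values from the question:

```python
import json

# Illustrative placeholders -- not values from the question.
dest_account = "222222222222"
source_bucket = "source-bucket"
dest_bucket = "destination-bucket"

# Step B: bucket policy attached to the SOURCE bucket, letting a user in
# the destination account list the bucket and read its objects.
source_bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{dest_account}:user/copy-user"},
            "Action": ["s3:ListBucket", "s3:GetObject"],
            "Resource": [
                f"arn:aws:s3:::{source_bucket}",    # ListBucket target
                f"arn:aws:s3:::{source_bucket}/*",  # GetObject target
            ],
        }
    ],
}

# Step D: IAM policy attached to the copy user in the DESTINATION account,
# covering reads from the source and writes (plus ACLs) to the destination.
dest_user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetObject"],
            "Resource": [
                f"arn:aws:s3:::{source_bucket}",
                f"arn:aws:s3:::{source_bucket}/*",
            ],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:PutObject", "s3:PutObjectAcl"],
            "Resource": [
                f"arn:aws:s3:::{dest_bucket}",
                f"arn:aws:s3:::{dest_bucket}/*",
            ],
        },
    ],
}

print(json.dumps(source_bucket_policy, indent=2))
```

With both documents in place, step F is simply `aws s3 sync s3://source-bucket s3://destination-bucket` run with the destination-account user's credentials.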



A company built an application based on AWS Lambda deployed in an AWS CloudFormation stack. The last production release of the web application introduced an issue that resulted in an outage lasting several minutes. A solutions architect must adjust the deployment process to support a canary release.

Which solution will meet these requirements?

  1. Create an alias for every new deployed version of the Lambda function. Use the AWS CLI update-alias command with the routing-config parameter to distribute the load.
  2. Deploy the application into a new CloudFormation stack. Use an Amazon Route 53 weighted routing policy to distribute the load.
  3. Create a version for every new deployed Lambda function. Use the AWS CLI update-function-configuration command with the routing-config parameter to distribute the load.
  4. Configure AWS CodeDeploy and use CodeDeployDefault.OneAtATime in the Deployment configuration to distribute the load.

Answer(s): A

Explanation:

A) Create an alias for every new deployed version of the Lambda function. Use the AWS CLI update-alias command with the routing-config parameter to distribute the load is the correct answer.

This approach allows you to implement a canary release using AWS Lambda's versioning and aliases. By creating an alias for the new version and using the update-alias command with the routing-config parameter, you can gradually shift traffic to the new version of the Lambda function. This allows you to test the new version with a small percentage of users before fully rolling it out, which is a key aspect of canary releases.

This method ensures that you can detect and mitigate any issues with new Lambda function versions before they affect all users, minimizing the risk of outages or issues during deployment.
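The weighted routing the explanation describes can be sketched as below. The function name, alias name, version number, and 10% weight are all illustrative assumptions:

```python
# Fraction of invocations routed to the new Lambda version; the remainder
# continues to hit the version the alias points at.
canary_weight = 0.10
new_version = "2"  # illustrative version number

routing_config = {"AdditionalVersionWeights": {new_version: canary_weight}}

# Roughly the CLI call the explanation describes (string only, not executed):
cli_command = (
    "aws lambda update-alias --function-name my-function --name prod "
    '--routing-config \'AdditionalVersionWeights={"2"=0.10}\''
)

stable_share = 1.0 - canary_weight
print(f"{canary_weight:.0%} canary, {stable_share:.0%} stable")
```

Raising the weight in steps (and dropping it back to zero on errors) is what turns this into a gradual, reversible canary rollout.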



A finance company hosts a data lake in Amazon S3. The company receives financial data records over SFTP each night from several third parties. The company runs its own SFTP server on an Amazon EC2 instance in a public subnet of a VPC. After the files are uploaded, they are moved to the data lake by a cron job that runs on the same instance. The SFTP server is reachable on DNS sftp.example.com through the use of Amazon Route 53.

What should a solutions architect do to improve the reliability and scalability of the SFTP solution?

  1. Move the EC2 instance into an Auto Scaling group. Place the EC2 instance behind an Application Load Balancer (ALB). Update the DNS record sftp.example.com in Route 53 to point to the ALB.
  2. Migrate the SFTP server to AWS Transfer for SFTP. Update the DNS record sftp.example.com in Route 53 to point to the server endpoint hostname.
  3. Migrate the SFTP server to a file gateway in AWS Storage Gateway. Update the DNS record sftp.example.com in Route 53 to point to the file gateway endpoint.
  4. Place the EC2 instance behind a Network Load Balancer (NLB). Update the DNS record sftp.example.com in Route 53 to point to the NLB.

Answer(s): B

Explanation:

B) Migrate the SFTP server to AWS Transfer for SFTP. Update the DNS record sftp.example.com in Route 53 to point to the server endpoint hostname is the correct answer.

AWS Transfer for SFTP is a fully managed service that scales automatically and is highly reliable compared to managing an SFTP server on an EC2 instance. This migration would offload the operational burden of managing the SFTP server while providing enhanced scalability, availability, and built-in integration with Amazon S3 for direct data transfer to the data lake. By updating the DNS record in Route 53 to point to the AWS Transfer SFTP endpoint, the company ensures a smooth transition without requiring changes from the third parties uploading the data.

This solution improves both reliability and scalability without the need for manual instance management or custom scaling configurations.
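The DNS cutover can be sketched as a Route 53 change batch. The Transfer server endpoint hostname below is a made-up placeholder, not a real endpoint:

```python
import json

# Placeholder endpoint -- a real AWS Transfer Family server gets a
# hostname of the form s-<id>.server.transfer.<region>.amazonaws.com.
transfer_endpoint = "s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com"

# Change batch for route53 change-resource-record-sets, repointing the
# existing name at the managed endpoint instead of the EC2 instance.
change_batch = {
    "Changes": [
        {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "sftp.example.com",
                "Type": "CNAME",
                "TTL": 300,
                "ResourceRecords": [{"Value": transfer_endpoint}],
            },
        }
    ]
}

print(json.dumps(change_batch, indent=2))
```

Because only the record's target changes, the third parties keep connecting to sftp.example.com with no changes on their side.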






Share your comments for Amazon SAP-C01 exam with other users:

Yolostar Again 10/12/2023 3:02:00 PM

q.189 - answers are incorrect.
Anonymous


Shikha Bakra 9/10/2023 5:16:00 PM

awesome job in getting these questions
AUSTRALIA


Kevin 10/20/2023 2:01:00 AM

i can't find the aws certified practitioner clf-c01 exam on the aws website but i found the aws certified practitioner clf-c02 exam. can anyone please verify the difference between clf-c01 and clf-c02? thank you
UNITED STATES


D Mario 6/19/2023 10:38:00 PM

thank you very much. i got a satisfactory mark in my exam test today because of these exam dumps. sorry for my english.
ITALY


Bharat Kumar Saraf 10/31/2023 4:36:00 AM

some of the answers are incorrect. need to be reviewed.
HONG KONG


JP 7/13/2023 12:21:00 PM

so far so good
Anonymous


Kiky V 8/8/2023 6:32:00 PM

i am really liking it
Anonymous


trying 7/28/2023 12:37:00 PM

thanks good stuff
UNITED STATES


exampei 10/4/2023 2:40:00 PM

need dump c_tadm_23
Anonymous


Eman Sawalha 6/10/2023 6:18:00 AM

next time i will write a full review
GREECE


johnpaul 11/15/2023 7:55:00 AM

first time using this site
ROMANIA


omiornil@gmail.com 7/25/2023 9:36:00 AM

please send me oracle 1z0-1105-22 pdf
BANGLADESH


John 8/29/2023 8:59:00 PM

very helpful
Anonymous


Kvana 9/28/2023 12:08:00 PM

good info about oml
UNITED STATES


Checo Lee 7/3/2023 5:45:00 PM

very useful to practice
UNITED STATES


dixitdnoh@gmail.com 8/27/2023 2:58:00 PM

this website is very helpful.
UNITED STATES


Sanjay 8/14/2023 8:07:00 AM

good content
INDIA


Blessious Phiri 8/12/2023 2:19:00 PM

so challenging
Anonymous


PAYAL 10/17/2023 7:14:00 AM

17 should be d, for more queries it's scale out
Anonymous


Karthik 10/12/2023 10:51:00 AM

nice question
Anonymous


Godmode 5/7/2023 10:52:00 AM

yes.
NETHERLANDS


Bhuddhiman 7/30/2023 1:18:00 AM

good material
Anonymous


KJ 11/17/2023 3:50:00 PM

good practice exam
Anonymous


sowm 10/29/2023 2:44:00 PM

impressive question
Anonymous


CW 7/6/2023 7:06:00 PM

questions seem helpful
Anonymous


luke 9/26/2023 10:52:00 AM

good content
Anonymous


zazza 6/16/2023 9:08:00 AM

question 21 answer is alerts
ITALY


Abwoch Peter 7/4/2023 3:08:00 AM

am preparing for exam
Anonymous


mohamed 9/12/2023 5:26:00 AM

good one thanks
EGYPT


Mfc 10/23/2023 3:35:00 PM

only got thru 5 questions, need more to evaluate
Anonymous


Whizzle 7/24/2023 6:19:00 AM

q26 should be b
Anonymous


sarra 1/17/2024 3:44:00 AM

the aaa triad in information security is authentication, accounting and authorisation so the answer should be d 1, 3 and 5.
UNITED KINGDOM


DBS 5/14/2023 12:56:00 PM

need to attend this
UNITED STATES


Da_costa 8/1/2023 5:28:00 PM

these are free brain dumps i understand, how can one get free pdf
Anonymous