Amazon DAS-C01 Exam (page: 4)
Amazon AWS Certified Data Analytics - Specialty (DAS-C01)
Updated on: 12-Feb-2026

Viewing Page 4 of 22

A media company has a streaming playback application. The company needs to collect and analyze data to provide near-real-time feedback on playback issues within 30 seconds. The company requires a consumer application to identify playback issues, such as decreased quality during a specified time frame. The data will be streamed in JSON format. The schema can change over time.
Which solution will meet these requirements?

  A. Send the data to Amazon Kinesis Data Firehose with delivery to Amazon S3. Configure an S3 event to invoke an AWS Lambda function to process and analyze the data.
  B. Send the data to Amazon Managed Streaming for Apache Kafka. Configure Amazon Kinesis Data Analytics for SQL Application as the consumer application to process and analyze the data.
  C. Send the data to Amazon Kinesis Data Firehose with delivery to Amazon S3. Configure Amazon S3 to initiate an event for AWS Lambda to process and analyze the data.
  D. Send the data to Amazon Kinesis Data Streams. Configure an Amazon Kinesis Data Analytics for Apache Flink application as the consumer application to process and analyze the data.

Answer(s): D
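A minimal producer-side sketch of option D, using the boto3 Kinesis `put_record` API. The stream name, event fields, and partition key are hypothetical; the consumer side would be a Kinesis Data Analytics for Apache Flink application reading from the same stream, which tolerates evolving JSON schemas because the stream itself is schemaless.

```python
import json


def build_put_record(stream_name: str, event: dict) -> dict:
    """Build the request dict for kinesis.put_record(**request).

    Partitioning by session keeps one playback session's events in
    order on a single shard while spreading sessions across shards.
    """
    return {
        "StreamName": stream_name,
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": event["session_id"],
    }


# Hypothetical playback-quality event:
request = build_put_record(
    "playback-events",  # assumed stream name
    {"session_id": "s-123", "bitrate_kbps": 800, "ts": 1700000000},
)

# With AWS credentials configured, the call would be:
# import boto3
# boto3.client("kinesis").put_record(**request)
```

New JSON keys can be added to `event` at any time; only the Flink application's parsing logic needs to keep up, not the stream.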



An ecommerce company stores customer purchase data in Amazon RDS. The company wants a solution to store and analyze historical data. The most recent 6 months of data will be queried frequently for analytics workloads. This data is several terabytes large. Once a month, historical data for the last 5 years must be accessible and will be joined with the more recent data. The company wants to optimize performance and cost.
Which storage solution will meet these requirements?

  A. Create a read replica of the RDS database to store the most recent 6 months of data. Copy the historical data into Amazon S3. Create an AWS Glue Data Catalog of the data in Amazon S3 and Amazon RDS. Run historical queries using Amazon Athena.
  B. Use an ETL tool to incrementally load the most recent 6 months of data into an Amazon Redshift cluster. Run more frequent queries against this cluster. Create a read replica of the RDS database to run queries on the historical data.
  C. Incrementally copy data from Amazon RDS to Amazon S3. Create an AWS Glue Data Catalog of the data in Amazon S3. Use Amazon Athena to query the data.
  D. Incrementally copy data from Amazon RDS to Amazon S3. Load and store the most recent 6 months of data in Amazon Redshift. Configure an Amazon Redshift Spectrum table to connect to all historical data.

Answer(s): D
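A sketch of the SQL behind option D. The external schema exposes the S3 historical data through the Glue Data Catalog, and a single query can combine it with the hot 6-month table stored in Redshift. All names (schema, catalog database, IAM role, tables) are assumptions for illustration.

```python
# DDL run once in Redshift: map a Glue Data Catalog database onto an
# external (Spectrum) schema. Role ARN and names are hypothetical.
SPECTRUM_DDL = """
CREATE EXTERNAL SCHEMA IF NOT EXISTS sales_history
FROM DATA CATALOG
DATABASE 'sales_catalog'
IAM_ROLE 'arn:aws:iam::123456789012:role/spectrum-role';
"""

# The once-a-month workload can then join hot local data with the full
# 5-year history in one statement; Spectrum scans S3 only when queried.
MONTHLY_QUERY = """
SELECT product_id, SUM(amount) AS total
FROM (
    SELECT product_id, amount FROM recent_sales
    UNION ALL
    SELECT product_id, amount FROM sales_history.orders
) AS all_sales
GROUP BY product_id;
"""

# Either statement could be submitted with the Redshift Data API:
# import boto3
# boto3.client("redshift-data").execute_statement(
#     ClusterIdentifier="analytics-cluster", Database="sales",
#     DbUser="admin", Sql=SPECTRUM_DDL)
```

This keeps frequent queries fast on cluster-local storage while the historical terabytes stay in cheaper S3.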



A company leverages Amazon Athena for ad-hoc queries against data stored in Amazon S3. The company wants to implement additional controls to separate query execution and query history among users, teams, or applications running in the same AWS account to comply with internal security policies.
Which solution meets these requirements?

  A. Create an S3 bucket for each given use case, create an S3 bucket policy that grants permissions to the appropriate individual IAM users, and apply the S3 bucket policy to the S3 bucket.
  B. Create an Athena workgroup for each given use case, apply tags to the workgroup, and create an IAM policy using the tags to apply appropriate permissions to the workgroup.
  C. Create an IAM role for each given use case, assign appropriate permissions to the role for the given use case, and associate the role with Athena.
  D. Create an AWS Glue Data Catalog resource policy for each given use case that grants permissions to the appropriate individual IAM users, and apply the resource policy to the specific tables used by Athena.

Answer(s): B


Reference:

https://aws.amazon.com/athena/faqs/
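A sketch of the workgroup half of option B, using the boto3 Athena `create_work_group` API. The team name, result bucket, and tag key are hypothetical; the matching IAM policy (not shown) would use an `aws:ResourceTag` condition on that tag to scope Athena actions per team.

```python
def build_workgroup_request(team: str) -> dict:
    """Request body for athena.create_work_group(**request).

    Each team gets its own workgroup, so query history, saved queries,
    and results are separated within the same account; the tag is what
    the IAM policy conditions on.
    """
    return {
        "Name": f"{team}-wg",
        "Configuration": {
            "ResultConfiguration": {
                # assumed result-bucket layout, one prefix per team
                "OutputLocation": f"s3://example-athena-results/{team}/"
            },
            # Prevent users from overriding the result location.
            "EnforceWorkGroupConfiguration": True,
        },
        "Tags": [{"Key": "team", "Value": team}],
    }


request = build_workgroup_request("marketing")
# import boto3
# boto3.client("athena").create_work_group(**request)
```

Users then select (or are restricted to) their team's workgroup when running queries, so execution and history never mix across teams.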



A company wants to use an automatic machine learning (ML) Random Cut Forest (RCF) algorithm to visualize complex real-world scenarios, such as detecting seasonality and trends, excluding outliers, and imputing missing values. The team working on this project is non-technical and is looking for an out-of-the-box solution that will require the LEAST amount of management overhead.
Which solution will meet these requirements?

  A. Use an AWS Glue ML transform to create a forecast and then use Amazon QuickSight to visualize the data.
  B. Use Amazon QuickSight to visualize the data and then use ML-powered forecasting to forecast the key business metrics.
  C. Use a pre-built ML AMI from the AWS Marketplace to create forecasts and then use Amazon QuickSight to visualize the data.
  D. Use calculated fields to create a new forecast and then use Amazon QuickSight to visualize the data.

Answer(s): B


Reference:

https://aws.amazon.com/blogs/big-data/query-visualize-and-forecast-trufactor-web-session-intelligence-with-aws-data-exchange/



A retail company's data analytics team recently created multiple product sales analysis dashboards for the average selling price per product using Amazon QuickSight. The dashboards were created from .csv files uploaded to Amazon S3. The team is now planning to share the dashboards with the respective external product owners by creating individual users in Amazon QuickSight. For compliance and governance reasons, restricting access is a key requirement. The product owners should view only their respective product analysis in the dashboard reports.
Which approach should the data analytics team take to allow product owners to view only their products in the dashboard?

  A. Separate the data by product and use S3 bucket policies for authorization.
  B. Separate the data by product and use IAM policies for authorization.
  C. Create a manifest file with row-level security.
  D. Create dataset rules with row-level security.

Answer(s): D
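A sketch of what the dataset rules in option D look like. Row-level security in QuickSight is driven by a small permissions dataset: each row maps a QuickSight user (or group) to the field values that user may see. The user names and product values below are hypothetical.

```python
import csv
import io

# Hypothetical RLS rules: one row per product owner. This file is
# uploaded as its own QuickSight dataset and attached to the sales
# dataset as its row-level security rules dataset, after which each
# owner's dashboard view is filtered to matching `product` rows.
RULES = [
    {"UserName": "owner-widgets", "product": "widgets"},
    {"UserName": "owner-gadgets", "product": "gadgets"},
]


def rules_csv(rows):
    """Render the rules as the CSV that gets uploaded to QuickSight."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["UserName", "product"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


print(rules_csv(RULES))
```

No data duplication or per-product IAM plumbing is needed; the same dashboard serves every owner with rows filtered per user.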



A company has developed an Apache Hive script to batch process data stored in Amazon S3. The script needs to run once every day and store the output in Amazon S3. The company tested the script, and it completes within 30 minutes on a small local three-node cluster.
Which solution is the MOST cost-effective for scheduling and executing the script?

  A. Create an AWS Lambda function to spin up an Amazon EMR cluster with a Hive execution step. Set KeepJobFlowAliveWhenNoSteps to false and disable the termination protection flag. Use Amazon CloudWatch Events to schedule the Lambda function to run daily.
  B. Use the AWS Management Console to spin up an Amazon EMR cluster with Hue, Hive, and Apache Oozie. Set the termination protection flag to true and use Spot Instances for the core nodes of the cluster. Configure an Oozie workflow in the cluster to invoke the Hive script daily.
  C. Create an AWS Glue job with the Hive script to perform the batch operation. Configure the job to run once a day using a time-based schedule.
  D. Use AWS Lambda layers to load the Hive runtime into AWS Lambda and copy the Hive script. Schedule the Lambda function to run daily by creating a workflow using AWS Step Functions.

Answer(s): A
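A sketch of the request a scheduled Lambda could build for option A, using the boto3 EMR `run_job_flow` API. Cluster name, release label, instance types, and S3 paths are assumptions; the key settings are the two flags named in the answer.

```python
def build_transient_hive_cluster(script_s3: str, output_s3: str) -> dict:
    """Request body for emr.run_job_flow(**request).

    KeepJobFlowAliveWhenNoSteps=False makes the cluster terminate as
    soon as the Hive step finishes, so the company pays only for the
    ~30-minute daily run; termination protection stays off so the
    auto-termination is not blocked.
    """
    return {
        "Name": "nightly-hive-batch",
        "ReleaseLabel": "emr-6.9.0",
        "Applications": [{"Name": "Hive"}],
        "Instances": {
            "InstanceCount": 3,  # matches the tested three-node size
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "KeepJobFlowAliveWhenNoSteps": False,  # auto-terminate
            "TerminationProtected": False,         # protection flag off
        },
        "Steps": [{
            "Name": "run-hive-script",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "hive-script", "--run-hive-script",
                    "--args", "-f", script_s3,
                    "-d", f"OUTPUT={output_s3}",
                ],
            },
        }],
        "JobFlowRole": "EMR_EC2_DefaultRole",
        "ServiceRole": "EMR_DefaultRole",
    }


request = build_transient_hive_cluster(
    "s3://example-bucket/scripts/daily.q",  # assumed script path
    "s3://example-bucket/output/",          # assumed output path
)

# A daily CloudWatch Events (EventBridge) rule would invoke a Lambda
# function that calls:
# import boto3
# boto3.client("emr").run_job_flow(**request)
```

A long-running Oozie cluster (option B) or re-hosting Hive in Lambda (option D) would cost more or hit Lambda's runtime limits; the transient cluster matches the tested environment exactly.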



A company wants to improve the data load time of a sales data dashboard. Data has been collected as .csv files and stored within an Amazon S3 bucket that is partitioned by date. The data is then loaded to an Amazon Redshift data warehouse for frequent analysis. The data volume is up to 500 GB per day.
Which solution will improve the data loading performance?

  A. Compress .csv files and use an INSERT statement to ingest data into Amazon Redshift.
  B. Split large .csv files, then use a COPY command to load data into Amazon Redshift.
  C. Use Amazon Kinesis Data Firehose to ingest data into Amazon Redshift.
  D. Load the .csv files in an unsorted key order and vacuum the table in Amazon Redshift.

Answer(s): B


Reference:

https://aws.amazon.com/blogs/big-data/using-amazon-redshift-spectrum-amazon-athena-and-aws-glue-with-node-js-in-production/
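A sketch of option B: split each day's file into roughly equal parts (ideally a multiple of the cluster's slice count so every slice loads in parallel), then issue a single COPY against the common prefix. Table name, S3 paths, and IAM role are assumptions.

```python
import math


def split_lines(lines, parts):
    """Split data rows into `parts` roughly equal chunks; each chunk
    would be uploaded as its own S3 object under a shared prefix."""
    size = math.ceil(len(lines) / parts)
    return [lines[i:i + size] for i in range(0, len(lines), size)]


chunks = split_lines([f"row{i}" for i in range(10)], 4)

# After uploading the chunks as part_000, part_001, ... one COPY loads
# them all in parallel across slices (names and role are hypothetical):
COPY_SQL = """
COPY sales
FROM 's3://example-bucket/daily/2026-02-12/part_'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
CSV GZIP;
"""
```

A single 500 GB file would be loaded serially by one slice; splitting is what unlocks COPY's parallelism.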



A company has a data warehouse in Amazon Redshift that is approximately 500 TB in size. New data is imported every few hours and read-only queries are run throughout the day and evening. There is a particularly heavy load with no writes for several hours each morning on business days. During those hours, some queries are queued and take a long time to execute. The company needs to optimize query execution and avoid any downtime.
What is the MOST cost-effective solution?

  A. Enable concurrency scaling in the workload management (WLM) queue.
  B. Add more nodes using the AWS Management Console during peak hours. Set the distribution style to ALL.
  C. Use elastic resize to quickly add nodes during peak times. Remove the nodes when they are not needed.
  D. Use a snapshot, restore, and resize operation. Switch to the new target cluster.

Answer(s): A
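A sketch of how option A is configured: concurrency scaling is turned on per WLM queue through the cluster parameter group's `wlm_json_configuration` parameter. The queue layout and parameter-group name are assumptions.

```python
import json

# One WLM queue with concurrency scaling set to "auto": when read-only
# queries queue up during the morning peak, Redshift routes them to
# transient concurrency-scaling clusters with no resize and no
# downtime, and tears the extra capacity down afterwards.
WLM_CONFIG = [
    {
        "query_group": [],
        "user_group": [],
        "concurrency_scaling": "auto",
        "query_concurrency": 5,
    }
]

request = {
    "ParameterGroupName": "analytics-params",  # assumed name
    "Parameters": [{
        "ParameterName": "wlm_json_configuration",
        "ParameterValue": json.dumps(WLM_CONFIG),
    }],
}

# With credentials configured, the change would be applied via:
# import boto3
# boto3.client("redshift").modify_cluster_parameter_group(**request)
```

Unlike resizing (options B–D), this is billed per-second only while scaling clusters are active, and each cluster accrues free concurrency-scaling credits, which is what makes it the most cost-effective fit for a few peak hours per day.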





