Amazon DAS-C01 Exam (page: 2)
Amazon AWS Certified Data Analytics - Specialty (DAS-C01)
Updated on: 12-Feb-2026

Viewing Page 2 of 22

A manufacturing company has been collecting IoT sensor data from devices on its factory floor for a year and is storing the data in Amazon Redshift for daily analysis. A data analyst has determined that, at an expected ingestion rate of about 2 TB per day, the cluster will be undersized in less than 4 months. A long-term solution is needed. The data analyst has indicated that most queries only reference the most recent 13 months of data, yet there are also quarterly reports that need to query all the data generated from the past 7 years. The chief technology officer (CTO) is concerned about the costs, administrative effort, and performance of a long-term solution.
Which solution should the data analyst use to meet these requirements?

  A. Create a daily job in AWS Glue to UNLOAD records older than 13 months to Amazon S3 and delete those records from Amazon Redshift. Create an external table in Amazon Redshift to point to the S3 location. Use Amazon Redshift Spectrum to join to data that is older than 13 months.
  B. Take a snapshot of the Amazon Redshift cluster. Restore the cluster to a new cluster using dense storage nodes with additional storage capacity.
  C. Execute a CREATE TABLE AS SELECT (CTAS) statement to move records that are older than 13 months to quarterly partitioned data in Amazon Redshift Spectrum backed by Amazon S3.
  D. Unload all the tables in Amazon Redshift to an Amazon S3 bucket using S3 Intelligent-Tiering. Use AWS Glue to crawl the S3 bucket location to create external tables in an AWS Glue Data Catalog. Create an Amazon EMR cluster using Auto Scaling for any daily analytics needs, and use Amazon Athena for the quarterly reports, with both using the same AWS Glue Data Catalog.

Answer(s): A
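For readers who want to see what option A amounts to in practice, here is a minimal sketch using the Redshift Data API. Every identifier (cluster, database, table, bucket, IAM role) is a placeholder, and execute_statement runs the SQL asynchronously:

```python
import boto3

# All identifiers below are placeholders for illustration.
rsd = boto3.client("redshift-data", region_name="us-east-1")

def run_sql(sql):
    """Submit a statement to the cluster via the Redshift Data API (asynchronous)."""
    return rsd.execute_statement(
        ClusterIdentifier="iot-cluster",
        Database="analytics",
        DbUser="admin",
        Sql=sql,
    )

# One-time setup: external schema so Redshift Spectrum can query the archive in place.
run_sql("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS archive
    FROM DATA CATALOG DATABASE 'iot_archive'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS;
""")

# Daily job, step 1: archive rows older than 13 months to S3
# (Parquet keeps later Spectrum scans cheap).
run_sql("""
    UNLOAD ('SELECT * FROM sensor_data
             WHERE reading_ts < DATEADD(month, -13, GETDATE())')
    TO 's3://iot-archive/sensor_data/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
    FORMAT AS PARQUET;
""")

# Daily job, step 2: remove the archived rows to reclaim cluster storage.
run_sql("DELETE FROM sensor_data WHERE reading_ts < DATEADD(month, -13, GETDATE());")
```

Quarterly reports can then join the hot sensor_data table with the external archive tables in a single query, which is what keeps this option cheap: only 13 months of data occupies the cluster, and the S3-resident history is scanned only when the reports actually run.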



An insurance company has raw data in JSON format that is sent without a predefined schedule through an Amazon Kinesis Data Firehose delivery stream to an Amazon S3 bucket. An AWS Glue crawler is scheduled to run every 8 hours to update the schema in the data catalog of the tables stored in the S3 bucket. Data analysts analyze the data using Apache Spark SQL on Amazon EMR set up with AWS Glue Data Catalog as the metastore. Data analysts say that, occasionally, the data they receive is stale. A data engineer needs to provide access to the most up-to-date data.
Which solution meets these requirements?

  A. Create an external schema based on the AWS Glue Data Catalog on the existing Amazon Redshift cluster to query new data in Amazon S3 with Amazon Redshift Spectrum.
  B. Use Amazon CloudWatch Events with the rate(1 hour) expression to execute the AWS Glue crawler every hour.
  C. Using the AWS CLI, modify the execution schedule of the AWS Glue crawler from 8 hours to 1 minute.
  D. Run the AWS Glue crawler from an AWS Lambda function triggered by an S3:ObjectCreated:* event notification on the S3 bucket.

Answer(s): D
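Option D is a small amount of glue code. A hedged sketch of the Lambda handler, assuming a crawler named raw-json-crawler (a placeholder); start_crawler raises CrawlerRunningException if a crawl is already in flight, which the handler treats as success:

```python
import boto3

glue = boto3.client("glue")
CRAWLER_NAME = "raw-json-crawler"  # placeholder crawler name

def lambda_handler(event, context):
    """Triggered by s3:ObjectCreated:* so the catalog tracks new Firehose output."""
    try:
        glue.start_crawler(Name=CRAWLER_NAME)
    except glue.exceptions.CrawlerRunningException:
        # A crawl is already in progress; the new object will be picked up
        # by it (or by the next run), so there is nothing more to do.
        pass
    return {"status": "ok"}
```

Compared with options B and C, this keeps the catalog fresh without polling: the crawler runs only when new objects actually land in the bucket.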



A company that produces network devices has millions of users. Data is collected from the devices on an hourly basis and stored in an Amazon S3 data lake.
The company runs analyses on the last 24 hours of data flow logs for abnormality detection and to troubleshoot and resolve user issues. The company also analyzes historical logs dating back 2 years to discover patterns and look for improvement opportunities. The data flow logs contain many metrics, such as date, timestamp, source IP, and target IP. There are about 10 billion events every day.
How should this data be stored for optimal performance?

  A. In Apache ORC partitioned by date and sorted by source IP
  B. In compressed .csv partitioned by date and sorted by source IP
  C. In Apache Parquet partitioned by source IP and sorted by date
  D. In compressed nested JSON partitioned by source IP and sorted by date

Answer(s): A
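To make option A concrete, here is a minimal PySpark sketch (paths and column names are invented for illustration; it assumes the raw logs carry a dt date column plus source_ip and target_ip) that writes the flow logs as date-partitioned ORC, sorted by source IP within each partition:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("flow-log-layout").getOrCreate()

# Placeholder input location; assumes columns dt, source_ip, target_ip, ...
logs = spark.read.json("s3://example-datalake/raw/flow-logs/")

(logs
    .repartition("dt")                  # group rows by date before writing
    .sortWithinPartitions("source_ip")  # cluster related rows by source IP on disk
    .write
    .partitionBy("dt")                  # Hive-style dt=YYYY-MM-DD prefixes
    .mode("append")
    .orc("s3://example-datalake/curated/flow-logs/"))
```

Columnar ORC lets the abnormality queries read only the metrics they reference, date partitioning confines the daily job to the last 24 hours of objects, and the source-IP sort order helps queries that troubleshoot a specific user.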



A banking company is currently using an Amazon Redshift cluster with dense storage (DS) nodes to store sensitive data. An audit found that the cluster is unencrypted. Compliance requirements state that a database with sensitive data must be encrypted through a hardware security module (HSM) with automated key rotation.
Which combination of steps is required to achieve compliance? (Choose two.)

  A. Set up a trusted connection with HSM using a client and server certificate with automatic key rotation.
  B. Modify the cluster with an HSM encryption option and automatic key rotation.
  C. Create a new HSM-encrypted Amazon Redshift cluster and migrate the data to the new cluster.
  D. Enable HSM with key rotation through the AWS CLI.
  E. Enable Elliptic Curve Diffie-Hellman Ephemeral (ECDHE) encryption in the HSM.

Answer(s): A,C


Reference:

https://docs.aws.amazon.com/redshift/latest/mgmt/working-with-db-encryption.html
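The "new cluster" half of the answer maps to create_cluster parameters that reference the HSM trust established in step A. A hedged boto3 sketch; all identifiers are placeholders, and the HSM client certificate and HSM configuration must already exist:

```python
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

# Placeholder identifiers; the HSM client certificate and HSM configuration
# are created beforehand as part of setting up the trusted connection.
redshift.create_cluster(
    ClusterIdentifier="banking-encrypted",
    NodeType="ds2.xlarge",                # dense storage, as in the scenario
    NumberOfNodes=4,
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",      # use Secrets Manager in practice
    Encrypted=True,
    HsmClientCertificateIdentifier="banking-hsm-client-cert",
    HsmConfigurationIdentifier="banking-hsm-config",
)
```

Data is then migrated from the unencrypted cluster (for example, by unloading to S3 and reloading), after which the old cluster can be deleted. Modifying the existing cluster in place (option B) does not work, because switching to HSM encryption requires a new cluster.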



A company is planning to do a proof of concept for a machine learning (ML) project using Amazon SageMaker with a subset of existing on-premises data hosted in the company's 3 TB data warehouse. For part of the project, AWS Direct Connect is established and tested. To prepare the data for ML, data analysts are performing data curation. The data analysts want to perform multiple steps, including mapping, dropping null fields, resolving choice, and splitting fields. The company needs the fastest solution to curate the data for this project.
Which solution meets these requirements?

  A. Ingest data into Amazon S3 using AWS DataSync and use Apache Spark scripts to curate the data in an Amazon EMR cluster. Store the curated data in Amazon S3 for ML processing.
  B. Create custom ETL jobs on-premises to curate the data. Use AWS DMS to ingest data into Amazon S3 for ML processing.
  C. Ingest data into Amazon S3 using AWS DMS. Use AWS Glue to perform data curation and store the data in Amazon S3 for ML processing.
  D. Take a full backup of the data store and ship the backup files using AWS Snowball. Upload Snowball data into Amazon S3 and schedule data curation jobs using AWS Batch to prepare the data for ML.

Answer(s): C
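The curation steps named in the question line up one-to-one with built-in AWS Glue transforms, which is what makes option C fast. A sketch of the Glue (PySpark) job, with the catalog database, table, column names, and output path all invented for illustration:

```python
import sys
from awsglue.transforms import ApplyMapping, ResolveChoice, DropNullFields
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext.getOrCreate())
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Placeholder catalog database/table populated by the DMS load into S3.
raw = glue_ctx.create_dynamic_frame.from_catalog(
    database="warehouse_staging", table_name="customers"
)

# Mapping: rename and retype columns.
mapped = ApplyMapping.apply(
    frame=raw,
    mappings=[("cust_id", "string", "customer_id", "long"),
              ("full_name", "string", "full_name", "string")],
)

# Resolving choice: collapse ambiguous column types from the raw load.
resolved = ResolveChoice.apply(frame=mapped, choice="make_cols")

# Dropping null fields. (The "splitting fields" step maps to the
# SplitFields transform, omitted here for brevity.)
cleaned = DropNullFields.apply(frame=resolved)

glue_ctx.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-ml-curated/customers/"},
    format="parquet",
)
job.commit()
```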



A US-based sneaker retail company launched its global website. All the transaction data is stored in Amazon RDS and curated historic transaction data is stored in Amazon Redshift in the us-east-1 Region. The business intelligence (BI) team wants to enhance the user experience by providing a dashboard for sneaker trends.
The BI team decides to use Amazon QuickSight to render the website dashboards. During development, a team in Japan provisioned Amazon QuickSight in ap-northeast-1. The team is having difficulty connecting Amazon QuickSight from ap-northeast-1 to Amazon Redshift in us-east-1.
Which solution will solve this issue and meet the requirements?

  A. In the Amazon Redshift console, choose to configure cross-Region snapshots and set the destination Region as ap-northeast-1. Restore the Amazon Redshift cluster from the snapshot and connect to Amazon QuickSight launched in ap-northeast-1.
  B. Create a VPC endpoint from the Amazon QuickSight VPC to the Amazon Redshift VPC so Amazon QuickSight can access data from Amazon Redshift.
  C. Create an Amazon Redshift endpoint connection string with Region information in the string and use this connection string in Amazon QuickSight to connect to Amazon Redshift.
  D. Create a new security group for Amazon Redshift in us-east-1 with an inbound rule authorizing access from the appropriate IP address range for the Amazon QuickSight servers in ap-northeast-1.

Answer(s): D
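Option D is a single ingress rule on the cluster's security group. A hedged boto3 sketch: the security group ID is a placeholder, and the CIDR below is illustrative only; substitute the QuickSight IP address range published for ap-northeast-1 in the QuickSight documentation:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Placeholder security group attached to the Redshift cluster in us-east-1.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5439,          # default Redshift port
        "ToPort": 5439,
        "IpRanges": [{
            # Illustrative CIDR; replace with the published QuickSight
            # range for ap-northeast-1.
            "CidrIp": "198.51.100.0/27",
            "Description": "QuickSight ap-northeast-1",
        }],
    }],
)
```

This assumes the cluster is publicly accessible, since QuickSight connects across Regions from its published public IP ranges.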



An airline has .csv-formatted data stored in Amazon S3 with an AWS Glue Data Catalog. Data analysts want to join this data with call center data stored in Amazon Redshift as part of a daily batch process. The Amazon Redshift cluster is already under a heavy load. The solution must be managed, serverless, well-functioning, and minimize the load on the existing Amazon Redshift cluster. The solution should also require minimal effort and development activity.
Which solution meets these requirements?

  A. Unload the call center data from Amazon Redshift to Amazon S3 using an AWS Lambda function. Perform the join with AWS Glue ETL scripts.
  B. Export the call center data from Amazon Redshift using a Python shell in AWS Glue. Perform the join with AWS Glue ETL scripts.
  C. Create an external table using Amazon Redshift Spectrum for the call center data and perform the join with Amazon Redshift.
  D. Export the call center data from Amazon Redshift to Amazon EMR using Apache Sqoop. Perform the join with Apache Hive.

Answer(s): C
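With option C, the heavy .csv scan happens in the Spectrum layer while the join itself runs in Redshift against the already-resident call center rows. A minimal sketch of the SQL involved, again via the Redshift Data API; the schema, table, role, and column names are all placeholders:

```python
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

def run_sql(sql):
    """Submit a statement to the cluster (asynchronous)."""
    return rsd.execute_statement(
        ClusterIdentifier="airline-cluster",   # placeholder
        Database="analytics",
        DbUser="admin",
        Sql=sql,
    )

# One-time: expose the Glue-cataloged .csv data as an external schema.
run_sql("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS flights_ext
    FROM DATA CATALOG DATABASE 'airline_datalake'
    IAM_ROLE 'arn:aws:iam::123456789012:role/SpectrumRole';
""")

# Daily batch: Spectrum scans the S3 files; only the join result and the
# call center rows are processed on the cluster itself.
run_sql("""
    SELECT f.flight_id, f.route, c.complaint_count
    FROM flights_ext.flights f
    JOIN callcenter.daily_summary c ON c.flight_id = f.flight_id;
""")
```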



A data analyst is using Amazon QuickSight for data visualization across multiple datasets generated by applications. Each application stores files within a separate Amazon S3 bucket. AWS Glue Data Catalog is used as a central catalog across all application data in Amazon S3. A new application stores its data within a separate S3 bucket. After updating the catalog to include the new application data source, the data analyst created a new Amazon QuickSight data source from an Amazon Athena table, but the import into SPICE failed.
How should the data analyst resolve the issue?

  A. Edit the permissions for the AWS Glue Data Catalog from within the Amazon QuickSight console.
  B. Edit the permissions for the new S3 bucket from within the Amazon QuickSight console.
  C. Edit the permissions for the AWS Glue Data Catalog from within the AWS Glue console.
  D. Edit the permissions for the new S3 bucket from within the S3 console.

Answer(s): B


Reference:

https://aws.amazon.com/blogs/big-data/harmonize-query-and-visualize-data-from-various-providers-using-aws-glue-amazon-athena-and-amazon-quicksight/





