Snowflake SnowPro Advanced Data Engineer Exam (page: 2)
Updated on: 25-Dec-2025

A Data Engineer wants to check the status of a pipe named my_pipe. The pipe is inside a database named test and a schema named Extract (case-sensitive).

Which query will provide the status of the pipe?

  1. SELECT SYSTEM$PIPE_STATUS("test.'extract'.my_pipe");
  2. SELECT SYSTEM$PIPE_STATUS('test."Extract".my_pipe');
  3. SELECT * FROM SYSTEM$PIPE_STATUS('test."Extract".my_pipe');
  4. SELECT * FROM SYSTEM$PIPE_STATUS("test.'extract'.my_pipe");

Answer(s): B
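For reference, a minimal sketch of the correct call. SYSTEM$PIPE_STATUS takes the fully qualified pipe name as a single-quoted string, and the case-sensitive schema name must be double-quoted inside that string; the function returns a JSON string that can optionally be parsed, as shown below.

    -- Returns a JSON string describing the pipe (execution state, pending file count, etc.).
    SELECT SYSTEM$PIPE_STATUS('test."Extract".my_pipe');

    -- Optionally parse the JSON to read a single field such as executionState.
    SELECT PARSE_JSON(SYSTEM$PIPE_STATUS('test."Extract".my_pipe')):executionState;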



Company A and Company B both have Snowflake accounts. Company A's account is hosted on a different cloud provider and region than Company B's account. Companies A and B are not in the same Snowflake organization.

How can Company A share data with Company B? (Choose two.)

  1. Create a share within Company A's account and add Company B's account as a recipient of that share.
  2. Create a share within Company A's account, and create a reader account that is a recipient of the share. Grant Company B access to the reader account.
  3. Use database replication to replicate Company A's data into Company B's account. Create a share within Company B's account and grant users within Company B's account access to the share.
  4. Create a new account within Company A's organization in the same cloud provider and region as Company B's account. Use database replication to replicate Company A's data to the new account. Create a share within the new account, and add Company B's account as a recipient of that share.
  5. Create a separate database within Company A's account to contain only those data sets they wish to share with Company B. Create a share within Company A's account and add all the objects within this separate database to the share. Add Company B's account as a recipient of the share.

Answer(s): B,D
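A minimal sketch of the reader-account approach, using hypothetical object names (my_share, shared_db.public.orders, company_b_reader) and a placeholder password; the account identifier passed to ALTER SHARE is the one returned when the managed account is created.

    -- In Company A's account: create the share and grant it access to the objects to be shared.
    CREATE SHARE my_share;
    GRANT USAGE ON DATABASE shared_db TO SHARE my_share;
    GRANT USAGE ON SCHEMA shared_db.public TO SHARE my_share;
    GRANT SELECT ON TABLE shared_db.public.orders TO SHARE my_share;

    -- Create a reader (managed) account in Company A's cloud/region and give Company B its URL and credentials.
    CREATE MANAGED ACCOUNT company_b_reader
      ADMIN_NAME = b_admin, ADMIN_PASSWORD = 'Str0ngPlaceholderPwd!', TYPE = READER;

    -- Add the reader account as a consumer of the share.
    ALTER SHARE my_share ADD ACCOUNTS = <reader_account_identifier>;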



A Data Engineer is trying to load the following rows from a CSV file into a table in Snowflake with the following structure:

[Sample CSV rows and target table structure not reproduced]
The engineer is using the following COPY INTO statement:

[COPY INTO statement not reproduced]
However, the following error is received:

Number of columns in file (6) does not match that of the corresponding table (3), use file format option error_on_column_count_mismatch=false to ignore this error
  File 'address.csv.gz', line 3, character 1
  Row 1 starts at line 2, column "STGCUSTOMER"[6]
  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option.

Which file format option should be used to resolve the error and successfully load all the data into the table?

  1. ESCAPE_UNENCLOSED_FIELD = '\\'
  2. ERROR_ON_COLUMN_COUNT_MISMATCH = FALSE
  3. FIELD_DELIMITER = ','
  4. FIELD_OPTIONALLY_ENCLOSED_BY = '"'

Answer(s): D
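A sketch of the corrected load, assuming a hypothetical internal stage @csv_stage and the STGCUSTOMER table named in the error. The extra "columns" come from commas inside quoted values being treated as delimiters; declaring the optional double-quote enclosure restores three fields per row.

    COPY INTO STGCUSTOMER
      FROM @csv_stage/address.csv.gz
      FILE_FORMAT = (
        TYPE = 'CSV'
        FIELD_DELIMITER = ','
        SKIP_HEADER = 1                      -- assumption: the file has a header row
        FIELD_OPTIONALLY_ENCLOSED_BY = '"'   -- treat quoted commas as data, not delimiters
      );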



A Data Engineer is working on a continuous data pipeline which receives data from Amazon Kinesis Firehose and loads the data into a staging table which will later be used in the data transformation process. The average file size is 300-500 MB.

The Engineer needs to ensure that Snowpipe is performant while minimizing costs.

How can this be achieved?

  1. Increase the size of the virtual warehouse used by Snowpipe.
  2. Split the files before loading them and set the SIZE_LIMIT option to 250 MB.
  3. Change the file compression size and increase the frequency of the Snowpipe loads.
  4. Decrease the buffer size to trigger delivery of files sized between 100 and 250 MB in Kinesis Firehose.

Answer(s): D
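After lowering the Firehose buffer size, the resulting file sizes can be checked against Snowflake's recommended range (roughly 100-250 MB compressed) using the COPY_HISTORY table function; the staging table name STG_EVENTS below is a hypothetical stand-in.

    -- Review recent Snowpipe loads and confirm file sizes fall in the recommended range.
    SELECT file_name,
           file_size / (1024 * 1024) AS file_size_mb,
           last_load_time
    FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
           TABLE_NAME => 'STG_EVENTS',
           START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())))
    ORDER BY last_load_time DESC;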



What is a characteristic of the operations of streams in Snowflake?

  1. Whenever a stream is queried, the offset is automatically advanced.
  2. When a stream is used to update a target table, the offset is advanced to the current time.
  3. Querying a stream returns all change records and table rows from the current offset to the current time.
  4. Each committed and uncommitted transaction on the source table automatically puts a change record in the stream.

Answer(s): B
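A minimal sketch of this behavior, using hypothetical tables src and tgt: querying the stream leaves the offset untouched, while consuming it in a DML statement advances the offset when the transaction commits.

    CREATE OR REPLACE STREAM src_stream ON TABLE src;

    -- SELECTing from the stream does NOT advance the offset; the same change rows keep appearing.
    SELECT * FROM src_stream;

    -- Consuming the stream in a DML statement advances the offset once the transaction commits.
    INSERT INTO tgt
    SELECT id, value FROM src_stream
    WHERE METADATA$ACTION = 'INSERT';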



At what isolation level do Snowflake streams operate?

  1. Snapshot
  2. Repeatable read
  3. Read committed
  4. Read uncommitted

Answer(s): B
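A short sketch of what repeatable read means here, reusing the hypothetical src_stream from the previous example: inside one explicit transaction, repeated reads of the stream return the same change set even if new rows are committed to the source table in the meantime.

    BEGIN;
    -- Both reads see the stream as of the same point in time (repeatable read).
    SELECT COUNT(*) FROM src_stream;
    SELECT COUNT(*) FROM src_stream;
    COMMIT;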



A CSV file, around 1 TB in size, is generated daily on an on-premise server. A corresponding table, internal stage, and file format have already been created in Snowflake to facilitate the data loading process.

How can the process of bringing the CSV file into Snowflake be automated using the LEAST amount of operational overhead?

  1. Create a task in Snowflake that executes once a day and runs a COPY INTO statement that references the internal stage. The internal stage will read the files directly from the on-premise server and copy the newest file from the on-premise server into the Snowflake table.
  2. On the on-premise server, schedule a SQL file to run using SnowSQL that executes a PUT to push a specific file to the internal stage. Create a task that executes once a day in Snowflake and runs a COPY INTO statement that references the internal stage. Schedule the task to start after the file lands in the internal stage.
  3. On the on-premise server, schedule a SQL file to run using SnowSQL that executes a PUT to push a specific file to the internal stage. Create a pipe that runs a COPY INTO statement that references the internal stage. Snowpipe auto-ingest will automatically load the file from the internal stage when the new file lands in the internal stage.
  4. On the on-premise server, schedule a Python file that uses the Snowpark Python library. The Python script will read the CSV data into a DataFrame and generate an INSERT INTO statement that will directly load into the table. The script will bypass the need to move a file into an internal stage.

Answer(s): B
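A minimal sketch of this flow, with hypothetical names throughout (@csv_stage, my_csv_format, DAILY_DATA, load_wh): the PUT runs from the on-premise server via a scheduled SnowSQL job, and the task loads whatever has landed in the stage.

    -- On the on-premise server (e.g. a cron job running: snowsql -f load_daily.sql):
    PUT file:///data/export/daily_extract.csv @csv_stage AUTO_COMPRESS = TRUE;

    -- In Snowflake: a once-a-day task scheduled after the PUT is expected to finish.
    CREATE OR REPLACE TASK load_daily_file
      WAREHOUSE = load_wh
      SCHEDULE = 'USING CRON 0 6 * * * UTC'
    AS
      COPY INTO DAILY_DATA
      FROM @csv_stage
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format')
      PURGE = TRUE;   -- assumption: remove staged files after a successful load

    ALTER TASK load_daily_file RESUME;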



A company is using Snowpipe to bring millions of rows of Change Data Capture (CDC) data into a Snowflake staging table every day on a near real-time basis. The CDC data needs to be processed, combined with other data in Snowflake, and landed in a final table as part of the full data pipeline.

How can a Data Engineer MOST efficiently process the incoming CDC data on an ongoing basis?

  1. Create a stream on the staging table and schedule a task that transforms data from the stream, only when the stream has data.
  2. Transform the data during the data load with Snowpipe by modifying the related COPY INTO statement to include transformation steps such as CASE statements and JOINS.
  3. Schedule a task that dynamically retrieves the last time the task was run from information_schema.task_history and use that timestamp to process the delta of the new rows since the last time the task was run.
  4. Use a CREATE OR REPLACE TABLE AS statement that references the staging table and includes all the transformation SQL. Use a task to run the full CREATE OR REPLACE TABLE AS statement on a scheduled basis.

Answer(s): A
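A minimal sketch of the stream-plus-task approach, with hypothetical names (staging_cdc, cdc_stream, final_table, transform_wh) and a simplified MERGE; the WHEN clause keeps the task from spinning up the warehouse when the stream is empty.

    CREATE OR REPLACE STREAM cdc_stream ON TABLE staging_cdc;

    CREATE OR REPLACE TASK process_cdc
      WAREHOUSE = transform_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('cdc_stream')   -- skip the run entirely when there are no new changes
    AS
      MERGE INTO final_table f
      USING (SELECT * FROM cdc_stream) s
        ON f.id = s.id
      WHEN MATCHED THEN UPDATE SET f.value = s.value
      WHEN NOT MATCHED THEN INSERT (id, value) VALUES (s.id, s.value);

    ALTER TASK process_cdc RESUME;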



Viewing Page 2 of 16


