Snowflake SnowPro Advanced Data Engineer Exam (page: 3)
Updated on: 25-Dec-2025

A Data Engineer would like to define a file structure for loading and unloading data.

Where can the file structure be defined? (Choose three.)

  1. COPY command
  2. MERGE command
  3. FILE FORMAT object
  4. PIPE object
  5. STAGE object
  6. INSERT command

Answer(s): 1,3,5
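As a quick illustration, a file structure can be attached in each of the three correct places; the object and column names below are hypothetical:

```sql
-- 1. As a named FILE FORMAT object (a reusable definition).
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1;

-- 2. On a STAGE object, so loads and unloads through it inherit the format.
CREATE OR REPLACE STAGE my_stage
  FILE_FORMAT = my_csv_format;

-- 3. Directly in the COPY command, inline or by name.
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1);
```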



A Data Engineer is building a set of reporting tables to analyze consumer requests by region for each of the Data Exchange offerings annually, as well as click-through rates for each listing.

Which view is MINIMALLY needed as a data source?

  1. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_EVENTS_DAILY
  2. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_CONSUMPTION_DAILY
  3. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_TELEMETRY_DAILY
  4. SNOWFLAKE.ACCOUNT_USAGE.DATA_TRANSFER_HISTORY

Answer(s): 1



The following code is executed in a Snowflake environment with the default settings:



What will be the result of the SELECT statement?

  1. SQL compilation error: Object 'CUSTOMER' does not exist or is not authorized.
  2. John
  3. 1
  4. 1John

Answer(s): 1



Which command will load data successfully to the table named finance_dept_location?

copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat);
  2. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat) validation_mode = return_all_errors;
  3. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t limit 100) file_format = (format_name = fincsvformat);
  4. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t where t.$1 like '%austin%') file_format = (format_name = fincsvformat);

Answer(s): 1

Explanation:

The COPY INTO command loads data from a staged file into a table in Snowflake. The correct command must:
1. Reference the correct stage and file (@finstage/finance.csv.gz).
2. Map the file's positional columns ($1, $2, $3, $4) to the table columns using SELECT t.$1, t.$2, t.$3, t.$4.
3. Use a predefined file format (fincsvformat), which ensures correct parsing of the CSV file.
4. Avoid unnecessary constraints, such as LIMIT or WHERE filters, that are not supported in a COPY transformation.
In addition, VALIDATION_MODE only validates the staged files and returns errors; it does not load any data, so that option fails as well.
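A minimal end-to-end sketch of the correct option; the object names come from the question, while the column types and file-format details are assumed:

```sql
-- Illustrative setup (names from the question, column types assumed).
CREATE OR REPLACE FILE FORMAT fincsvformat TYPE = CSV SKIP_HEADER = 1;
CREATE OR REPLACE STAGE finstage FILE_FORMAT = fincsvformat;
CREATE OR REPLACE TABLE finance_dept_location
  (city STRING, zip STRING, sale_date DATE, price NUMBER(12,2));

-- Option 1: a plain positional-column transformation, which loads successfully.
COPY INTO finance_dept_location (city, zip, sale_date, price)
FROM (SELECT t.$1, t.$2, t.$3, t.$4
      FROM @finstage/finance.csv.gz t)
FILE_FORMAT = (FORMAT_NAME = fincsvformat);
```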



Which of the following grants are required for the role kafka_load_role_1 running the Snowflake Connector for Kafka, with the intent of loading data to Snowflake? (Choose three.)

(Assume this role already exists and has usage access to the schema kafka_schema in database kafka_db, the target for data loading.)

  1. grant create pipe on schema kafka_schema to role kafka_load_role_1;
  2. grant create stream on schema kafka_schema to role kafka_load_role_1;
  3. grant create stage on schema kafka_schema to role kafka_load_role_1;
  4. grant create table on schema kafka_schema to role kafka_load_role_1;
  5. grant create task on schema kafka_schema to role kafka_load_role_1;
  6. grant create external table on schema kafka_schema to role kafka_load_role_1;

Answer(s): 1,3,4

Explanation:

A pipe is necessary for Snowpipe, which continuously ingests Kafka data into Snowflake. Without this permission, the Kafka connector cannot create the ingestion pipeline.
A stage is needed as an intermediate storage location for incoming Kafka data before it is loaded into tables.
The connector requires this permission to manage the staging process.
Since the Kafka Connector loads data into Snowflake tables, the role needs permission to create tables in kafka_schema.
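Put together, the minimally required grants might look like the following sketch; the database qualification is added for clarity, and the usage grants the question assumes are omitted:

```sql
-- Pipe creation for Snowpipe ingestion.
GRANT CREATE PIPE  ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
-- Internal stage creation for buffering incoming Kafka records.
GRANT CREATE STAGE ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
-- Target table creation for the Kafka topics.
GRANT CREATE TABLE ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
```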



A Data Engineer creates a pipe called mypipe.

What command needs to be run to verify that the pipe is configured correctly and is running?

  1. show pipes like 'mypipe'
  2. select system$pipe_status('public.mypipe')
  3. describe pipe 'public.mypipe'
  4. select system$verify_pipe('public.mypipe')

Answer(s): 2

Explanation:

To check if a Snowpipe is correctly configured and running, the SYSTEM$PIPE_STATUS function is used. This function returns details about the current status of the pipe, including whether it is running, paused, or encountering errors.
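A short example of the check; the fields named in the comment are the ones SYSTEM$PIPE_STATUS typically reports:

```sql
-- Returns a JSON string describing the pipe's current state.
SELECT SYSTEM$PIPE_STATUS('public.mypipe');
-- A healthy pipe reports "executionState":"RUNNING", along with fields
-- such as "pendingFileCount" and "lastIngestedTimestamp".
```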



Which attribute is used by Snowpipe to ensure the same file is not loaded twice from a path in a stage?

  1. File partition details
  2. File checksum
  3. Creation_time of the file
  4. Filename

Answer(s): 4

Explanation:

Snowpipe tracks previously loaded files using their filenames to prevent duplicate ingestion.
When a file is processed, its filename and associated metadata are recorded in Snowflake's metadata. If the same file appears again, Snowpipe ignores it unless explicitly reloaded.
Snowflake maintains metadata on loaded files for 14 days to ensure idempotent behavior, preventing accidental duplicate loads.
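To see which files Snowpipe has already recorded (and will therefore skip), the COPY_HISTORY table function can be queried; MY_TABLE is a hypothetical target table:

```sql
-- Files listed here within the retention window are not re-ingested.
SELECT file_name, last_load_time, status
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'MY_TABLE',
  START_TIME => DATEADD('day', -14, CURRENT_TIMESTAMP())));
```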



A company called IOT Corporation has subsidiary companies in Germany and Japan.

The German subsidiary has a Snowflake account called IOTGER in the AWS region EU (Frankfurt) (Region ID: eu-central-1).
The Japanese subsidiary has a Snowflake account called IOTJPN in the Azure region Japan East (Tokyo) (Region ID: japaneast).

The subsidiaries are totally independent of one another, and their Snowflake accounts belong to different Snowflake organizations.

A Data Engineer needs to share data in a database called IOT_PROD from the German account to the Japanese account.

What steps need to be taken by the German subsidiary so that the Japanese subsidiary can use the shared data?

  1. Create share S1 into IOTGER. Add the objects to be shared from IOT_PROD in IOTGER to the share S1, then add account IOTJPN to the share S1.
  2. Replicate database IOT_PROD from IOTGER to IOTJPN with the same database name. Create share S1 into IOTJPN, add the objects to be shared from IOT_PROD in IOTJPN to the share S1, and add account IOTJPN to the share S1.
  3. Create a clone of the database IOT_PROD into IOTJPN. Create secure views into the cloned database that read data from the objects to be shared. Create share S1 into IOTJPN. Add the secure views to the share
    S1, and add account IOTJPN to the share S1.
  4. Create an additional Snowflake account IOTGER2 into region Japan East. Replicate database IOT_PROD from IOTGER to IOTGER2 with the same database name. Create share S1 into IOTGER2, and add the objects to be shared from IOT_PROD in IOTGER2 to the share S1. Then add account IOTJPN to the share S1.

Answer(s): 4

Explanation:

Snowflake's direct data sharing only works between accounts in the same cloud provider and region. Because IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo) are in different clouds and regions, the only way to share the data is to first replicate the database into the consumer's region.
Steps required for cross-cloud sharing:
1. Create an additional Snowflake account (IOTGER2) in Japan East (Azure).
- This account will act as an intermediary between IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo).
2. Replicate the database (IOT_PROD) from IOTGER (AWS) to IOTGER2 (Azure).
- Snowflake's Database Replication feature allows database copies to be synchronized across Snowflake accounts in different cloud providers.
3. Create a share (S1) in IOTGER2 and add the replicated objects.
- Once the data is available in Azure Japan East, the standard data sharing mechanism can be used.
4. Grant access to IOTJPN (Azure Tokyo) from IOTGER2.
- Since both IOTGER2 and IOTJPN exist on the same Azure cloud, data sharing is now possible.
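The steps above can be sketched as follows; myorg is a placeholder organization name and the shared table is hypothetical:

```sql
-- In IOTGER (AWS Frankfurt): allow replication to the new Azure account.
ALTER DATABASE IOT_PROD ENABLE REPLICATION TO ACCOUNTS myorg.IOTGER2;

-- In IOTGER2 (Azure Japan East): create and refresh the replica.
CREATE DATABASE IOT_PROD AS REPLICA OF myorg.IOTGER.IOT_PROD;
ALTER DATABASE IOT_PROD REFRESH;

-- Still in IOTGER2: create the share and grant the objects.
-- (Depending on the setup, the secondary database may first need to be
-- promoted to primary before its objects can be shared.)
CREATE SHARE S1;
GRANT USAGE ON DATABASE IOT_PROD TO SHARE S1;
GRANT USAGE ON SCHEMA IOT_PROD.PUBLIC TO SHARE S1;
GRANT SELECT ON TABLE IOT_PROD.PUBLIC.SENSOR_READINGS TO SHARE S1;  -- hypothetical table

-- Make the share visible to the Japanese account.
ALTER SHARE S1 ADD ACCOUNTS = IOTJPN;
```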


