Snowflake SnowPro Advanced Data Engineer Exam (page: 3)
Updated on: 31-Mar-2026

A Data Engineer would like to define a file structure for loading and unloading data.

Where can the file structure be defined? (Choose three.)

  1. COPY command
  2. MERGE command
  3. FILE FORMAT object
  4. PIPE object
  5. STAGE object
  6. INSERT command

Answer(s): 1, 3, 5
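For reference, all three answer locations accept a file format definition; the sketch below uses illustrative object names (my_csv_format, my_stage, my_table):

```sql
-- 1. As a named, reusable FILE FORMAT object:
create or replace file format my_csv_format
  type = csv
  field_delimiter = ','
  skip_header = 1;

-- 2. On a STAGE object, so loads from the stage inherit it:
create or replace stage my_stage
  file_format = my_csv_format;

-- 3. Directly in the COPY command (inline here, but a named format works too):
copy into my_table
  from @my_stage
  file_format = (type = csv field_delimiter = ',' skip_header = 1);
```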



A Data Engineer is building a set of reporting tables to analyze consumer requests by region for each of the Data Exchange offerings annually, as well as click-through rates for each listing.

Which view is needed MINIMALLY as a data source?

  1. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_EVENTS_DAILY
  2. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_CONSUMPTION_DAILY
  3. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_TELEMETRY_DAILY
  4. SNOWFLAKE.ACCOUNT_USAGE.DATA_TRANSFER_HISTORY

Answer(s): 1



The following code is executed in a Snowflake environment with the default settings:

[Code screenshot not reproduced in this text capture.]
What will be the result of the select statement?

  1. SQL compilation error: Object 'CUSTOMER' does not exist or is not authorized.
  2. John
  3. 1
  4. 1John

Answer(s): 1



Which command will load data successfully to the table named finance_dept_location?

  1. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat);
  2. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat) validation_mode = return_all_errors;
  3. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t limit 100) file_format = (format_name = fincsvformat);
  4. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t where t.$1 like '%austin%') file_format = (format_name = fincsvformat);

Answer(s): 1

Explanation:

The COPY INTO <table> command loads data from a staged file into a Snowflake table. The correct command must:
1. Reference the correct stage and file: @finstage/finance.csv.gz.
2. Map the file's positional columns ($1, $2, $3, $4) to the table columns using SELECT t.$1, t.$2, t.$3, t.$4.
3. Use a predefined file format (fincsvformat), which ensures correct parsing of the CSV file.
4. Avoid clauses that prevent a normal load: VALIDATION_MODE = RETURN_ALL_ERRORS only validates the file without loading rows, and filters such as LIMIT or WHERE are not supported in a COPY transformation query.
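As a side note on why option 2 does not load data: VALIDATION_MODE makes COPY validate the staged file and report errors without inserting any rows, and Snowflake additionally rejects VALIDATION_MODE when the COPY statement contains a SELECT transformation. A minimal validation-only sketch, reusing the question's object names:

```sql
-- Validate the staged file; no rows are inserted into the table.
copy into finance_dept_location
  from @finstage/finance.csv.gz
  file_format = (format_name = fincsvformat)
  validation_mode = return_all_errors;
```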



Which of the following grants are required for the role kafka_load_role_1 running the Snowflake Connector for Kafka, with the intent of loading data to Snowflake? (Choose three.)

(Assume this role already exists and has usage access to the schema kafka_schema in database kafka_db, the target for data loading.)

  1. grant create pipe on schema kafka_schema to role kafka_load_role_1;
  2. grant create stream on schema kafka_schema to role kafka_load_role_1;
  3. grant create stage on schema kafka_schema to role kafka_load_role_1;
  4. grant create table on schema kafka_schema to role kafka_load_role_1;
  5. grant create task on schema kafka_schema to role kafka_load_role_1;
  6. grant create external table on schema kafka_schema to role kafka_load_role_1;

Answer(s): 1, 3, 4

Explanation:

A pipe is necessary for Snowpipe, which continuously ingests Kafka data into Snowflake. Without this permission, the Kafka connector cannot create the ingestion pipeline.
A stage is needed as an intermediate storage location for incoming Kafka data before it is loaded into tables.
The connector requires this permission to manage the staging process.
Since the Kafka Connector loads data into Snowflake tables, the role needs permission to create tables in kafka_schema.



A Data Engineer creates a pipe called mypipe.

What command needs to be run to verify that the pipe is configured correctly and is running?

  1. show pipes like 'mypipe'
  2. select system$pipe_status('public.mypipe')
  3. describe pipe 'public.mypipe'
  4. select system$verify_pipe('public.mypipe')

Answer(s): 2

Explanation:

To check if a Snowpipe is correctly configured and running, the SYSTEM$PIPE_STATUS function is used. This function returns details about the current status of the pipe, including whether it is running, paused, or encountering errors.
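A minimal sketch of the check, assuming the pipe lives in the PUBLIC schema as in the answer:

```sql
-- Returns a JSON string describing the pipe's current state.
select system$pipe_status('public.mypipe');
-- The result includes fields such as "executionState"
-- (e.g. RUNNING or PAUSED) and "pendingFileCount".
```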



Which attribute is used by Snowpipe to ensure the same file is not loaded twice from a path in a stage?

  1. File partition details
  2. File checksum
  3. Creation_time of the file
  4. Filename

Answer(s): 4

Explanation:

Snowpipe tracks previously loaded files using their filenames to prevent duplicate ingestion.
When a file is processed, its filename and associated metadata are recorded in Snowflake's metadata. If the same file appears again, Snowpipe ignores it unless explicitly reloaded.
Snowflake maintains metadata on loaded files for 14 days to ensure idempotent behavior, preventing accidental duplicate loads.
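To see which files have already been recorded for a table (and would therefore be skipped), the load history can be queried; a sketch with an illustrative table name (MY_TABLE):

```sql
-- Files loaded in the last 24 hours; a re-staged file with the same
-- path/name will not produce a second row via Snowpipe.
select file_name, last_load_time, status
from table(information_schema.copy_history(
    table_name => 'MY_TABLE',
    start_time => dateadd(hour, -24, current_timestamp())));
```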



A company called IOT Corporation has subsidiary companies in Germany and Japan.

The German subsidiary has a Snowflake account called IOTGER in AWS region EU (Frankfurt) (Region ID: eu-central-1).
The Japanese subsidiary has a Snowflake account called IOTJPN in Azure region Japan East (Tokyo) (Region ID: japaneast).

The subsidiaries are totally independent of one another, and their Snowflake accounts belong to different Snowflake organizations.

A Data Engineer needs to share data in a database called IOT_PROD from the German account to the Japanese account.

What steps need to be taken by the German subsidiary so that the Japanese subsidiary can use the shared data?

  1. Create share S1 into IOTGER. Add the objects to be shared from IOT_PROD in IOTGER to the share S1, then add account IOTJPN to the share S1.
  2. Replicate database IOT_PROD from IOTGER to IOTJPN with the same database name. Create share S1 into IOTJPN, add the objects to be shared from IOT_PROD in IOTJPN to the share S1, and add account IOTJPN to the share S1.
  3. Create a clone of the database IOT_PROD into IOTJPN. Create secure views into the cloned database that read data from the objects to be shared. Create share S1 into IOTJPN. Add the secure views to the share
    S1, and add account IOTJPN to the share S1.
  4. Create an additional Snowflake account IOTGER2 into region Japan East. Replicate database IOT_PROD from IOTGER to IOTGER2 with the same database name. Create share S1 into IOTGER2, and add the objects to be shared from IOT_PROD in IOTGER2 to the share S1. Then add account IOTJPN to the share S1.

Answer(s): 4

Explanation:

Snowflake does not support direct cross-cloud data sharing between different cloud providers (AWS and Azure) or across different organizations. The only way to share data between IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo) is through database replication.
Steps required for cross-cloud sharing:
1. Create an additional Snowflake account (IOTGER2) in Japan East (Azure).
- This account will act as an intermediary between IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo).
2. Replicate the database (IOT_PROD) from IOTGER (AWS) to IOTGER2 (Azure).
- Snowflake's Database Replication feature allows database copies to be synchronized across Snowflake accounts in different cloud providers.
3. Create a share (S1) in IOTGER2 and add the replicated objects.
- Once the data is available in Azure Japan East, the standard data sharing mechanism can be used.
4. Grant access to IOTJPN (Azure Tokyo) from IOTGER2.
- Since both IOTGER2 and IOTJPN exist on the same Azure cloud, data sharing is now possible.
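The steps above can be sketched in SQL. Account identifiers such as myorg are illustrative, and this assumes the replicated database in IOTGER2 can be used as the share source once refreshed:

```sql
-- On IOTGER (AWS Frankfurt): allow IOTGER2 to replicate IOT_PROD.
alter database iot_prod enable replication to accounts myorg.iotger2;

-- On IOTGER2 (Azure Japan East): create and refresh the replica.
create database iot_prod as replica of myorg.iotger.iot_prod;
alter database iot_prod refresh;

-- Still on IOTGER2: create the share and grant IOTJPN access.
create share s1;
grant usage on database iot_prod to share s1;
grant usage on schema iot_prod.public to share s1;
grant select on all tables in schema iot_prod.public to share s1;
alter share s1 add accounts = iotjpn;
```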





