Snowflake SnowPro Advanced Data Engineer Exam (Page 3 of 16)
Updated on: 12-Feb-2026

A Data Engineer would like to define a file structure for loading and unloading data.

Where can the file structure be defined? (Choose three.)

  A. COPY command
  B. MERGE command
  C. FILE FORMAT object
  D. PIPE object
  E. STAGE object
  F. INSERT command

Answer(s): A,C,E
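As a hedged illustration, the three valid locations can be shown in SQL (object names such as my_csv_format, my_stage, and my_table are hypothetical, not from the question):

```sql
-- 1. A named FILE FORMAT object
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1;

-- 2. A STAGE object, referencing the named format (or an inline definition)
CREATE OR REPLACE STAGE my_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

-- 3. The COPY command itself, with an inline format definition
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```

By contrast, MERGE and INSERT operate on table data and take no file format, and a PIPE wraps an existing COPY statement rather than defining its own file structure.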



A Data Engineer is building a set of reporting tables to analyze annual consumer requests by region for each Data Exchange offering, as well as click-through rates for each listing.

Which view is minimally needed as a data source?

  A. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_EVENTS_DAILY
  B. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_CONSUMPTION_DAILY
  C. SNOWFLAKE.DATA_SHARING_USAGE.LISTING_TELEMETRY_DAILY
  D. SNOWFLAKE.ACCOUNT_USAGE.DATA_TRANSFER_HISTORY

Answer(s): A



The following code is executed in a Snowflake environment with the default settings:

[Code snippet not shown.]
What will be the result of the select statement?

  A. SQL compilation error: Object 'CUSTOMER' does not exist or is not authorized.
  B. John
  C. 1
  D. 1John

Answer(s): A



Which command will load data successfully to the table named finance_dept_location?

  A. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat);
  B. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t) file_format = (format_name = fincsvformat) validation_mode=return_all_errors;
  C. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t limit 100) file_format = (format_name = fincsvformat);
  D. copy into finance_dept_location(city, zip, sale_date, price) from (select t.$1, t.$2, t.$3, t.$4 from @finstage/finance.csv.gz t where t.$1 like '%austin%') file_format = (format_name = fincsvformat);

Answer(s): A

Explanation:

The COPY INTO command loads data from a staged file into a Snowflake table. The correct command must:
1. Reference the correct stage and file (@finstage/finance.csv.gz).
2. Map the file's positional columns ($1, $2, $3, $4) to the table columns using SELECT t.$1, t.$2, t.$3, t.$4.
3. Use the predefined file format (fincsvformat) so the CSV file is parsed correctly.
4. Avoid clauses that prevent loading: VALIDATION_MODE only validates the staged file and loads no rows, and LIMIT and WHERE clauses are not supported in COPY transformation queries.
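For context, a sketch of the prerequisite objects the correct command assumes (the format options shown are illustrative, not given in the question):

```sql
-- Named file format referenced by format_name = fincsvformat
CREATE OR REPLACE FILE FORMAT fincsvformat
  TYPE = CSV
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1;

-- Internal stage holding finance.csv.gz
CREATE OR REPLACE STAGE finstage;
```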



Which of the following grants are required for the role kafka_load_role_1 running the Snowflake Connector for Kafka, with the intent of loading data to Snowflake? (Choose three.)

(Assume this role already exists and has usage access to the schema kafka_schema in database kafka_db, the target for data loading.)

  A. grant create pipe on schema kafka_schema to role kafka_load_role_1;
  B. grant create stream on schema kafka_schema to role kafka_load_role_1;
  C. grant create stage on schema kafka_schema to role kafka_load_role_1;
  D. grant create table on schema kafka_schema to role kafka_load_role_1;
  E. grant create task on schema kafka_schema to role kafka_load_role_1;
  F. grant create external table on schema kafka_schema to role kafka_load_role_1;

Answer(s): A,C,D

Explanation:

CREATE PIPE: a pipe is necessary for Snowpipe, which continuously ingests Kafka data into Snowflake. Without this privilege, the Kafka connector cannot create the ingestion pipeline.
CREATE STAGE: a stage is needed as an intermediate storage location for incoming Kafka data before it is loaded into tables; the connector requires this privilege to manage the staging process.
CREATE TABLE: since the Kafka connector loads data into Snowflake tables, the role needs permission to create target tables in kafka_schema.
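Taken together, the three required grants (answers A, C, and D) look like this; the schema is qualified with its database here for clarity:

```sql
GRANT CREATE PIPE  ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
GRANT CREATE STAGE ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
GRANT CREATE TABLE ON SCHEMA kafka_db.kafka_schema TO ROLE kafka_load_role_1;
```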



A Data Engineer creates a pipe called mypipe.

What command needs to be run to verify that the pipe is configured correctly and is running?

  A. show pipes like 'mypipe'
  B. select system$pipe_status('public.mypipe')
  C. describe pipe 'public.mypipe'
  D. select system$verify_pipe('public.mypipe')

Answer(s): B

Explanation:

To check if a Snowpipe is correctly configured and running, the SYSTEM$PIPE_STATUS function is used. This function returns details about the current status of the pipe, including whether it is running, paused, or encountering errors.
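A minimal check might look like this; the JSON fields shown in the comment are illustrative of a healthy pipe, not guaranteed output:

```sql
SELECT SYSTEM$PIPE_STATUS('public.mypipe');
-- Returns a JSON string, e.g. (illustrative):
-- {"executionState":"RUNNING","pendingFileCount":0,...}
-- executionState = RUNNING indicates the pipe is configured and active.
```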



Which attribute is used by Snowpipe to ensure the same file is not loaded twice from a path in a stage?

  A. File partition details
  B. File checksum
  C. Creation_time of the file
  D. Filename

Answer(s): D

Explanation:

Snowpipe tracks previously loaded files using their filenames to prevent duplicate ingestion.
When a file is processed, its filename and associated metadata are recorded in Snowflake's metadata. If the same file appears again, Snowpipe ignores it unless explicitly reloaded.
Snowflake maintains metadata on loaded files for 14 days to ensure idempotent behavior, preventing accidental duplicate loads.
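For the bulk COPY command (as opposed to Snowpipe auto-ingest), the FORCE option bypasses this filename-based deduplication; a sketch with hypothetical names:

```sql
COPY INTO my_table
  FROM @my_stage/data.csv
  FILE_FORMAT = (FORMAT_NAME = my_csv_format)
  FORCE = TRUE;  -- bypasses load-history deduplication; may duplicate rows
```

For Snowpipe itself, staging the file again under a new name is the usual way to trigger a reload, since deduplication keys on the filename.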



A company called IOT Corporation has subsidiary companies in Germany and Japan.

The German subsidiary has a Snowflake account called IOTGER in AWS region EU (Frankfurt) (Region ID: eu-central-1).
The Japanese subsidiary has a Snowflake account called IOTJPN in Azure region Japan East (Tokyo)

(Region ID: japaneast).

The subsidiaries are totally independent of one another, and their Snowflake accounts belong to different Snowflake organizations.

A Data Engineer needs to share data in a database called IOT_PROD from the German account to the Japanese account.

What steps need to be taken by the German subsidiary so that the Japanese subsidiary can use the shared data?

  A. Create share S1 in IOTGER. Add the objects to be shared from IOT_PROD in IOTGER to the share S1, then add account IOTJPN to the share S1.
  B. Replicate database IOT_PROD from IOTGER to IOTJPN with the same database name. Create share S1 in IOTJPN, add the objects to be shared from IOT_PROD in IOTJPN to the share S1, and add account IOTJPN to the share S1.
  C. Create a clone of the database IOT_PROD in IOTJPN. Create secure views in the cloned database that read data from the objects to be shared. Create share S1 in IOTJPN. Add the secure views to the share S1, and add account IOTJPN to the share S1.
  D. Create an additional Snowflake account IOTGER2 in region Japan East. Replicate database IOT_PROD from IOTGER to IOTGER2 with the same database name. Create share S1 in IOTGER2, and add the objects to be shared from IOT_PROD in IOTGER2 to the share S1. Then add account IOTJPN to the share S1.

Answer(s): D

Explanation:

Snowflake does not support direct cross-cloud data sharing between different cloud providers (AWS and Azure) or across different organizations. The only way to share data between IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo) is through database replication.
Steps required for cross-cloud sharing:
1. Create an additional Snowflake account (IOTGER2) in Japan East (Azure).
- This account will act as an intermediary between IOTGER (AWS Frankfurt) and IOTJPN (Azure Tokyo).
2. Replicate the database (IOT_PROD) from IOTGER (AWS) to IOTGER2 (Azure).
- Snowflake's Database Replication feature allows database copies to be synchronized across Snowflake accounts in different cloud providers.
3. Create a share (S1) in IOTGER2 and add the replicated objects.
- Once the data is available in Azure Japan East, the standard data sharing mechanism can be used.
4. Grant access to IOTJPN (Azure Tokyo) from IOTGER2.
- Since both IOTGER2 and IOTJPN exist on the same Azure cloud, data sharing is now possible.
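A rough sketch of these steps in SQL (the organization name my_org and schema PUBLIC are placeholders; exact syntax and availability may vary by account edition):

```sql
-- In IOTGER (source account): enable replication of the database
ALTER DATABASE IOT_PROD ENABLE REPLICATION TO ACCOUNTS my_org.IOTGER2;

-- In IOTGER2 (new account in Azure Japan East): create and refresh the replica
CREATE DATABASE IOT_PROD AS REPLICA OF my_org.IOTGER.IOT_PROD;
ALTER DATABASE IOT_PROD REFRESH;

-- In IOTGER2: share the data with the Japanese account
CREATE SHARE S1;
GRANT USAGE ON DATABASE IOT_PROD TO SHARE S1;
GRANT USAGE ON SCHEMA IOT_PROD.PUBLIC TO SHARE S1;
GRANT SELECT ON ALL TABLES IN SCHEMA IOT_PROD.PUBLIC TO SHARE S1;
ALTER SHARE S1 ADD ACCOUNTS = IOTJPN;
```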


