Snowflake SnowPro Advanced Data Engineer Exam Questions

Updated On: 17-Feb-2026

A Data Engineer needs to load JSON output from some software into Snowflake using Snowpipe.

Which recommendations apply to this scenario? (Choose three.)

  A. Load large files (1 GB or larger).
  B. Ensure that data files are 100-250 MB (or larger) in size, compressed.
  C. Load a single huge array containing multiple records into a single table row.
  D. Verify that each value of each unique element stores a single native data type (string or number).
  E. Extract semi-structured data elements containing null values into relational columns before loading.
  F. Create data files that are less than 100 MB and stage them in cloud storage at a frequency greater than once each minute.

Answer(s): B,D,E
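The recommendations above can be illustrated with a minimal Snowpipe setup. This is a sketch only; the database, schema, stage, and table names (my_db, raw_json, json_stage, invoice_pipe) are hypothetical. Note that STRIP_OUTER_ARRAY avoids the anti-pattern in option C by splitting a top-level JSON array into one row per element:

```sql
-- Hypothetical names throughout; a sketch, not a definitive setup.
CREATE OR REPLACE FILE FORMAT my_db.public.json_fmt
  TYPE = JSON
  STRIP_OUTER_ARRAY = TRUE;  -- one row per array element, not one huge row

CREATE OR REPLACE PIPE my_db.public.invoice_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO my_db.public.raw_json (v)
  FROM @my_db.public.json_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_db.public.json_fmt');
```

Files staged for this pipe should follow the sizing guidance above (roughly 100-250 MB compressed) to balance load latency against per-file overhead.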



A Data Engineer is working on a Snowflake deployment in AWS eu-west-1 (Ireland). The Engineer is planning to load data from staged files into target tables using the COPY INTO command.

Which sources are valid? (Choose three.)

  A. Internal stage on GCP us-central1 (Iowa)
  B. Internal stage on AWS eu-central-1 (Frankfurt)
  C. External stage on GCP us-central1 (Iowa)
  D. External stage in an Amazon S3 bucket on AWS eu-west-1 (Ireland)
  E. External stage in an Amazon S3 bucket on AWS eu-central-1 (Frankfurt)
  F. SSD attached to an Amazon EC2 instance on AWS eu-west-1 (Ireland)

Answer(s): B,D,E
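A COPY INTO from a cross-region external stage might look like the sketch below. The stage, bucket, and table names are hypothetical, and the credentials are placeholders. External stages can reference cloud storage in a different region (or cloud) than the Snowflake account, while internal stages always reside with the account itself:

```sql
-- Hypothetical names and placeholder credentials; a sketch only.
CREATE OR REPLACE STAGE my_db.public.frankfurt_stage
  URL = 's3://my-bucket-eu-central-1/data/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

COPY INTO my_db.public.target_table
  FROM @my_db.public.frankfurt_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```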



What is the purpose of the BUILD_STAGE_FILE_URL function in Snowflake?

  A. It generates an encrypted URL for accessing a file in a stage.
  B. It generates a staged URL for accessing a file in a stage.
  C. It generates a permanent URL for accessing files in a stage.
  D. It generates a temporary URL for accessing a file in a stage.

Answer(s): C
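A minimal usage sketch of the function follows; the stage name and file path are hypothetical. The returned URL is permanent (it does not expire), though accessing it still requires authentication and the appropriate stage privileges:

```sql
-- Hypothetical stage and path; returns a permanent file URL.
SELECT BUILD_STAGE_FILE_URL(@my_db.public.invoice_stage, 'invoices/2024/inv_001.pdf');
```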



A Data Engineer needs to ingest invoice data in PDF format into Snowflake so that the data can be queried and used in a forecasting solution.

What is the recommended way to ingest this data?

  A. Use Snowpipe to ingest the files that land in an external stage into a Snowflake table.
  B. Use a COPY INTO command to ingest the PDF files in an external stage into a Snowflake table with a VARIANT column.
  C. Create an external table on the PDF files that are stored in a stage and parse the data into structured data.
  D. Create a Java User-Defined Function (UDF) that leverages Java-based PDF parser libraries to parse PDF data into structured data.

Answer(s): D
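A Java UDF of this kind could be sketched as below, assuming an Apache PDFBox jar has been uploaded to a stage. The function name, stage name, and jar version here are all hypothetical, and this is an illustrative sketch rather than a production implementation:

```sql
-- Sketch: assumes a PDFBox jar is staged at @jars; all names are hypothetical.
CREATE OR REPLACE FUNCTION read_pdf(file STRING)
  RETURNS STRING
  LANGUAGE JAVA
  RUNTIME_VERSION = '11'
  IMPORTS = ('@jars/pdfbox-app-2.0.28.jar')
  HANDLER = 'PdfReader.readFile'
AS
$$
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.text.PDFTextStripper;
import com.snowflake.snowpark_java.types.SnowflakeFile;
import java.io.IOException;

public class PdfReader {
    // Open the staged PDF via its file URL and extract the raw text.
    public static String readFile(String fileUrl) throws IOException {
        SnowflakeFile f = SnowflakeFile.newInstance(fileUrl);
        try (PDDocument doc = PDDocument.load(f.getInputStream())) {
            return new PDFTextStripper().getText(doc);
        }
    }
}
$$;
```

The extracted text could then be parsed into structured columns with standard SQL string functions or a follow-up transformation.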



A table is loaded using Snowpipe and truncated afterwards. Later, a Data Engineer finds that the table needs to be reloaded, but the metadata of the pipe will not allow the same files to be loaded again.

How can this issue be solved using the LEAST amount of operational overhead?

  A. Wait until the metadata expires and then reload the file using Snowpipe.
  B. Modify the file by adding a blank row to the bottom and re-stage the file.
  C. Set the FORCE=TRUE option in the Snowpipe COPY INTO command.
  D. Recreate the pipe by using the CREATE OR REPLACE PIPE command.

Answer(s): C
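The FORCE copy option instructs COPY INTO to load files even when load metadata indicates they were already loaded. A minimal manual reload sketch follows; the table and stage names are hypothetical:

```sql
-- Hypothetical names; FORCE = TRUE bypasses the load-history check,
-- so previously loaded files are reloaded (beware of duplicates).
COPY INTO my_db.public.invoices
  FROM @my_db.public.invoice_stage
  FILE_FORMAT = (TYPE = JSON)
  FORCE = TRUE;
```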


