Free Snowflake SnowPro Advanced Data Engineer Exam Braindumps (page: 2)


A Data Engineer needs to load JSON output from some software into Snowflake using Snowpipe.

Which recommendations apply to this scenario? (Choose three.)

  A. Load large files (1 GB or larger).
  B. Ensure that data files are 100-250 MB (or larger) in size, compressed.
  C. Load a single huge array containing multiple records into a single table row.
  D. Verify that each value of each unique element stores a single native data type (string or number).
  E. Extract semi-structured data elements containing null values into relational columns before loading.
  F. Create data files that are less than 100 MB and stage them in cloud storage more frequently than once per minute.

Answer(s): B,D,E
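
The recommendations above can be sketched in a pipe definition. This is an illustrative example only (the stage, table, and pipe names are assumptions): STRIP_OUTER_ARRAY splits a single outer JSON array into one row per record rather than loading it into a single row, and the staged files themselves should be sized at roughly 100-250 MB compressed.

```sql
-- Hypothetical Snowpipe definition; object names are placeholders.
-- Staged JSON files should be ~100-250 MB (or larger), compressed.
CREATE OR REPLACE PIPE my_json_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```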



A Data Engineer is working on a Snowflake deployment in AWS eu-west-1 (Ireland). The Engineer is planning to load data from staged files into target tables using the COPY INTO command.

Which sources are valid? (Choose three.)

  A. Internal stage on GCP us-central1 (Iowa)
  B. Internal stage on AWS eu-central-1 (Frankfurt)
  C. External stage on GCP us-central1 (Iowa)
  D. External stage in an Amazon S3 bucket on AWS eu-west-1 (Ireland)
  E. External stage in an Amazon S3 bucket on AWS eu-central-1 (Frankfurt)
  F. SSD attached to an Amazon EC2 instance on AWS eu-west-1 (Ireland)

Answer(s): B,D,E
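
As a minimal sketch of the cross-region case (bucket URL, credentials, and object names here are assumptions, not real values): an external stage can point at an S3 bucket in a different region from the Snowflake account, and COPY INTO can then load from it.

```sql
-- Illustrative external stage over a bucket in another region.
CREATE OR REPLACE STAGE frankfurt_stage
  URL = 's3://my-bucket-eu-central-1/data/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

COPY INTO target_table
  FROM @frankfurt_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```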



What is the purpose of the BUILD_STAGE_FILE_URL function in Snowflake?

  A. It generates an encrypted URL for accessing a file in a stage.
  B. It generates a staged URL for accessing a file in a stage.
  C. It generates a permanent URL for accessing files in a stage.
  D. It generates a temporary URL for accessing a file in a stage.

Answer(s): C
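
Usage is a simple scalar call; the stage and file path below are placeholders:

```sql
-- Returns a permanent file URL for the named staged file.
SELECT BUILD_STAGE_FILE_URL(@my_stage, 'invoices/2023/inv001.pdf');
```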



A Data Engineer needs to ingest invoice data in PDF format into Snowflake so that the data can be queried and used in a forecasting solution.

What is the recommended way to ingest this data?

  A. Use Snowpipe to ingest the files that land in an external stage into a Snowflake table.
  B. Use a COPY INTO command to ingest the PDF files in an external stage into a Snowflake table with a VARIANT column.
  C. Create an external table on the PDF files that are stored in a stage and parse the data into structured data.
  D. Create a Java User-Defined Function (UDF) that leverages Java-based PDF parser libraries to parse PDF data into structured data.

Answer(s): D
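
Registering such a Java UDF might look like the sketch below. The jar file, handler class, and method names are assumptions (e.g. a wrapper around a library like Apache PDFBox); the jar must first be uploaded to the stage referenced in IMPORTS.

```sql
-- Hypothetical Java UDF registration; jar path and handler are placeholders.
CREATE OR REPLACE FUNCTION parse_pdf(file_path STRING)
  RETURNS VARIANT
  LANGUAGE JAVA
  IMPORTS = ('@jar_stage/pdfbox-2.0.27.jar')
  HANDLER = 'PdfParser.parse';
```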



A table is loaded using Snowpipe and truncated afterwards. Later, a Data Engineer finds that the table needs to be reloaded, but the metadata of the pipe will not allow the same files to be loaded again.

How can this issue be solved using the LEAST amount of operational overhead?

  A. Wait until the metadata expires and then reload the file using Snowpipe.
  B. Modify the file by adding a blank row to the bottom and re-stage the file.
  C. Set the FORCE=TRUE option in the Snowpipe COPY INTO command.
  D. Recreate the pipe by using the CREATE OR REPLACE PIPE command.

Answer(s): C



Which Snowflake objects does the Snowflake Kafka connector use? (Choose three.)

  A. Pipe
  B. Serverless task
  C. Internal user stage
  D. Internal table stage
  E. Internal named stage
  F. Storage integration

Answer(s): A,D,E



A Data Engineer is implementing a near real-time ingestion pipeline to load data into Snowflake using the Snowflake Kafka connector. There will be three Kafka topics created.

Which Snowflake objects are created automatically when the Kafka connector starts? (Choose three.)

  A. Tables
  B. Tasks
  C. Pipes
  D. Internal stages
  E. External stages
  F. Materialized views

Answer(s): A,C,D



Which stages support external tables?

  A. Internal stages only; within a single Snowflake account
  B. Internal stages only; from any Snowflake account in the organization
  C. External stages only; from any region, and any cloud provider
  D. External stages only; only on the same region and cloud provider as the Snowflake account

Answer(s): C
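
A minimal sketch of an external table over an external stage (the stage name, path, and file format here are assumptions); the stage may reside in a different region or cloud provider than the Snowflake account:

```sql
-- Illustrative external table; with no column list, rows are exposed
-- through the VALUE column.
CREATE OR REPLACE EXTERNAL TABLE invoices_ext
  WITH LOCATION = @gcs_ext_stage/invoices/
  FILE_FORMAT = (TYPE = 'PARQUET')
  AUTO_REFRESH = FALSE;
```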





