Free SnowPro Advanced Architect Exam Braindumps (page: 8)


A media company needs a data pipeline that will ingest customer review data into a Snowflake table and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set publicly available to advertising companies that use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, operational complexity, maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.
Which design will meet these requirements?

  A. Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
  B. Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
  C. Ingest the data into Snowflake using Amazon EMR and PySpark with the Snowflake Spark connector. Apply transformations using another Spark job. Develop a Python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.
  D. Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Answer(s): B
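The correct option keeps the entire pipeline serverless and inside Snowflake: Snowpipe ingests on event notifications, a stream tracks new rows, a task orchestrates the transformation, and an external function calls Amazon Comprehend without exporting data. A minimal sketch, where all object names (reviews_raw, reviews_stage, comprehend_api_integration, the API Gateway endpoint, etc.) are illustrative assumptions:

```sql
-- 1. Continuous, event-driven ingestion from object storage (Snowpipe)
CREATE PIPE reviews_pipe AUTO_INGEST = TRUE AS
  COPY INTO reviews_raw
  FROM @reviews_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- 2. Stream tracks newly ingested rows for incremental processing
CREATE STREAM reviews_stream ON TABLE reviews_raw;

-- 3. External function that reaches Amazon Comprehend via an API integration
--    (the endpoint URL below is a placeholder, not a real address)
CREATE EXTERNAL FUNCTION detect_sentiment(review_text STRING)
  RETURNS VARIANT
  API_INTEGRATION = comprehend_api_integration
  AS 'https://example-api-gateway/sentiment';

-- 4. Task runs only when the stream has data, scoring and landing final rows
CREATE TASK score_reviews
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('REVIEWS_STREAM')
AS
  INSERT INTO reviews_final
  SELECT review_id, detect_sentiment(review_text)
  FROM reviews_stream;
```

A Marketplace listing on reviews_final then makes the de-identified data available cross-cloud and cross-region without the provider managing replication infrastructure.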



A Snowflake Architect is designing an application and tenancy strategy for an organization that requires both strong legal isolation and multi-tenancy. Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?

  A. Create accounts for each tenant in the Snowflake organization.
  B. Create an object for each tenant strategy if row level security is viable for isolating tenants.
  C. Create an object for each tenant strategy if row level security is not viable for isolating tenants.
  D. Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Answer(s): B
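Several of the options hinge on whether row level security is viable; in Snowflake, row level security is implemented with row access policies evaluated against the session context. A hedged sketch of the mechanism, where the mapping table and all names are assumptions for illustration:

```sql
-- Hypothetical mapping table: which role may see which tenant's rows
CREATE TABLE tenant_roles (role_name STRING, tenant_id STRING);

-- Row access policy: a row is visible only to roles mapped to its tenant
CREATE ROW ACCESS POLICY tenant_rap AS (tenant_id STRING)
  RETURNS BOOLEAN ->
  EXISTS (
    SELECT 1
    FROM tenant_roles m
    WHERE m.role_name = CURRENT_ROLE()
      AND m.tenant_id = tenant_id
  );

-- Attach the policy to a shared table so RBAC roles drive visibility
ALTER TABLE customer_data ADD ROW ACCESS POLICY tenant_rap ON (tenant_id);
```

Tenant-specific roles combined with such policies let multiple tenants share infrastructure while access is isolated per role.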



Which statements describe characteristics of the use of materialized views in Snowflake? (Choose two.)

  A. They can include ORDER BY clauses.
  B. They cannot include nested subqueries.
  C. They can include context functions, such as CURRENT_TIME().
  D. They can support MIN and MAX aggregates.
  E. They can support inner joins, but not outer joins.

Answer(s): B,D
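Snowflake's documented limitations for materialized views rule out ORDER BY, joins of any kind, nested subqueries, and context functions such as CURRENT_TIME(), while simple aggregates such as MIN and MAX over a single base table are supported. A sketch with illustrative table and column names:

```sql
-- Allowed: deterministic MIN/MAX aggregates over a single base table
CREATE MATERIALIZED VIEW price_bounds AS
  SELECT product_id,
         MIN(price) AS min_price,
         MAX(price) AS max_price
  FROM sales
  GROUP BY product_id;

-- Rejected constructs (each would cause the CREATE statement to fail):
--   ... ORDER BY product_id                      -- ORDER BY not allowed
--   ... WHERE price > (SELECT AVG(price) ...)    -- nested subquery not allowed
--   ... SELECT CURRENT_TIME() ...                -- context function not allowed
```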



The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements, which include:

1) Finance and Vendor Management team members who require reporting and visualization
2) Data Science team members who require access to raw data for ML model development
3) Sales team members who require engineered and protected data for data monetization

Which Snowflake data modeling approaches will meet these requirements? (Choose two.)

  A. Consolidate data in the company’s data lake and use EXTERNAL TABLES.
  B. Create a raw database for landing and persisting raw data entering the data pipelines.
  C. Create a set of profile-specific databases that aligns data with usage patterns.
  D. Create a single star schema in a single database to support all consumers’ requirements.
  E. Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.

Answer(s): B,C
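Separating a raw landing database from a set of profile-specific consumption databases lets each consumer group receive data shaped to its usage pattern: raw data for data scientists, conformed models for finance reporting, and engineered, protected data for monetization. A sketch, with every database name an illustrative assumption:

```sql
-- Landing zone: data persists exactly as ingested from the source pipelines
CREATE DATABASE raw_db;

-- Profile-specific databases aligned with consumer usage patterns
CREATE DATABASE finance_db;       -- reporting-friendly models for BI tools
CREATE DATABASE data_science_db;  -- access to raw data for ML development
CREATE DATABASE monetization_db;  -- engineered, masked data for external sale
```

A single star schema or a Data Vault as the sole endpoint would force every consumer group through one model, which conflicts with the mixed raw-access and engineered-data requirements above.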





