Free DSA-C02 Exam Braindumps (page: 6)


As a Data Scientist looking to use a reader account, which of the following are correct considerations about Reader Accounts for third-party access?

  1. Reader accounts (formerly known as "read-only accounts") provide a quick, easy, and cost-effective way to share data without requiring the consumer to become a Snowflake customer.
  2. Each reader account belongs to the provider account that created it.
  3. Users in a reader account can query data that has been shared with the reader account, but cannot perform any of the DML tasks that are allowed in a full account, such as data loading, insert, update, and similar data manipulation operations.
  4. Data sharing is only possible between Snowflake accounts.

Answer(s): D

Explanation:

Data sharing is only supported between Snowflake accounts. As a data provider, you might want to share data with a consumer who does not already have a Snowflake account or is not ready to become a licensed Snowflake customer.
To facilitate sharing data with these consumers, you can create reader accounts. Reader accounts (formerly known as "read-only accounts") provide a quick, easy, and cost-effective way to share data without requiring the consumer to become a Snowflake customer. Each reader account belongs to the provider account that created it. As a provider, you use shares to share databases with reader accounts; however, a reader account can only consume data from the provider account that created it.
So, data sharing is effectively possible between Snowflake and non-Snowflake consumers via a reader account.
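As a provider, a reader account is created with the CREATE MANAGED ACCOUNT command. A minimal sketch (the account name, admin name, and password below are illustrative placeholders):

```sql
-- Create a reader account for a consumer who is not a Snowflake customer.
-- Requires a role with the CREATE ACCOUNT privilege (e.g. ACCOUNTADMIN)
-- in the provider account.
USE ROLE accountadmin;

CREATE MANAGED ACCOUNT reader_acct1
  ADMIN_NAME = reader_admin,         -- initial admin user for the reader account
  ADMIN_PASSWORD = 'Chang3Me!Now',   -- placeholder; use a strong password
  TYPE = READER;

-- List the reader accounts owned by this provider account.
SHOW MANAGED ACCOUNTS;
```

The reader account belongs to, and is billed to, the provider account that created it, which is why its users can query shared data but cannot load or modify data.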



A Data Scientist, as a data provider, needs to allow consumers to access all databases and database objects in a share by granting a single privilege on the shared database.
Which of the following SnowSQL commands used by her for this task is incorrect?
Assuming:
A database named product_db exists with a schema named product_agg and a table named Item_agg.
The database, schema, and table will be shared with two accounts named xy12345 and yz23456.
1. USE ROLE accountadmin;
2. CREATE DIRECT SHARE product_s;
3. GRANT USAGE ON DATABASE product_db TO SHARE product_s;
4. GRANT USAGE ON SCHEMA product_db.product_agg TO SHARE product_s;
5. GRANT SELECT ON TABLE sales_db.product_agg.Item_agg TO SHARE product_s;
6. SHOW GRANTS TO SHARE product_s;
7. ALTER SHARE product_s ADD ACCOUNTS=xy12345, yz23456;
8. SHOW GRANTS OF SHARE product_s;

  1. GRANT USAGE ON DATABASE product_db TO SHARE product_s;
  2. CREATE DIRECT SHARE product_s;
  3. GRANT SELECT ON TABLE sales_db.product_agg.Item_agg TO SHARE product_s;
  4. ALTER SHARE product_s ADD ACCOUNTS=xy12345, yz23456;

Answer(s): C

Explanation:

GRANT SELECT ON TABLE sales_db.product_agg.Item_agg TO SHARE product_s; is incorrect: the scenario defines the database as product_db, so the fully qualified table name should be product_db.product_agg.Item_agg.
Note also that the share itself is created with CREATE SHARE product_s; the documented CREATE SHARE syntax does not include a DIRECT keyword.
https://docs.snowflake.com/en/user-guide/data-sharing-provider#creating-a-share-using-sql
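Under the stated setup (database product_db, schema product_agg, table Item_agg, consumer accounts xy12345 and yz23456), the corrected end-to-end sequence can be sketched as:

```sql
USE ROLE accountadmin;

-- Create the share object (CREATE SHARE, without a DIRECT keyword).
CREATE SHARE product_s;

-- Grant the minimum privileges needed to expose the table through the share.
GRANT USAGE ON DATABASE product_db TO SHARE product_s;
GRANT USAGE ON SCHEMA product_db.product_agg TO SHARE product_s;
GRANT SELECT ON TABLE product_db.product_agg.Item_agg TO SHARE product_s;

-- Verify the object grants on the share.
SHOW GRANTS TO SHARE product_s;

-- Add the two consumer accounts to the share.
ALTER SHARE product_s ADD ACCOUNTS = xy12345, yz23456;

-- Verify which accounts have been granted access to the share.
SHOW GRANTS OF SHARE product_s;
```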



Which object records data manipulation language (DML) changes made to tables, including inserts, updates, and deletes, as well as metadata about each change, so that actions can be taken using the changed data in data science pipelines?

  1. Task
  2. Dynamic tables
  3. Stream
  4. Tags
  5. Delta
  6. OFFSET

Answer(s): C

Explanation:

A stream object records data manipulation language (DML) changes made to tables, including inserts, updates, and deletes, as well as metadata about each change, so that actions can be taken using the changed data. This process is referred to as change data capture (CDC). An individual table stream tracks the changes made to rows in a source table. A table stream (also referred to as simply a "stream") makes a "change table" available of what changed, at the row level, between two transactional points of time in a table. This allows querying and consuming a sequence of change records in a transactional fashion.
Streams can be created to query change data on the following objects:
· Standard tables, including shared tables
· Views, including secure views
· Directory tables
· Event tables
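For example, a stream on a source table can be created and consumed as follows (the table and stream names are illustrative, and orders_changes is assumed to be a table whose columns match the stream's output):

```sql
-- Track row-level changes (CDC) on a source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Query the change records accumulated since the stream's current offset.
-- Querying alone does NOT advance the offset.
SELECT * FROM orders_stream;

-- Consuming the stream inside a DML statement advances its offset,
-- so each change record is processed exactly once by the pipeline.
INSERT INTO orders_changes
  SELECT * FROM orders_stream;
```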



Which of the following additional metadata columns does a Stream contain that can be used for creating efficient data science pipelines and help in transforming only the new/modified data?

  1. METADATA$ACTION
  2. METADATA$FILE_ID
  3. METADATA$ISUPDATE
  4. METADATA$DELETE
  5. METADATA$ROW_ID

Answer(s): A,C,E

Explanation:

A stream stores an offset for the source object and not any actual table columns or data. When queried, a stream accesses and returns the historic data in the same shape as the source object (i.e. the same column names and ordering), with the following additional columns:
METADATA$ACTION
Indicates the DML operation (INSERT, DELETE) recorded.
METADATA$ISUPDATE
Indicates whether the operation was part of an UPDATE statement. Updates to rows in the source object are represented as a pair of DELETE and INSERT records in the stream, with the metadata column METADATA$ISUPDATE set to TRUE.
Note that streams record the differences between two offsets. If a row is added and then updated in the current offset, the delta change is a new row; the METADATA$ISUPDATE column records a FALSE value.
METADATA$ROW_ID
Specifies the unique and immutable ID for the row, which can be used to track changes to specific rows over time.
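A typical incremental pipeline filters on these metadata columns to transform only new or modified rows. A minimal sketch (the stream name is illustrative):

```sql
-- Brand-new inserts only (not the INSERT half of an UPDATE pair).
SELECT *
FROM orders_stream
WHERE METADATA$ACTION = 'INSERT'
  AND METADATA$ISUPDATE = FALSE;

-- Updated rows: the new row images written by UPDATE statements.
-- METADATA$ROW_ID ties each change back to a specific source row over time.
SELECT METADATA$ROW_ID, *
FROM orders_stream
WHERE METADATA$ACTION = 'INSERT'
  AND METADATA$ISUPDATE = TRUE;
```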





