Free DSA-C02 Exam Braindumps (page: 6)

Page 6 of 17

Which of the following reflect an incorrect understanding of Streams by a Data Scientist?

  1. Streams on views support both local views and views shared using Snowflake Secure Data Sharing, including secure views.
  2. Streams can track changes in materialized views.
  3. A stream itself does not contain any table data.
  4. Streams do not support repeatable read isolation.

Answer(s): B,D

Explanation:

Streams on views support both local views and views shared using Snowflake Secure Data Sharing, including secure views. Currently, streams cannot track changes in materialized views. A stream itself does not contain any table data: it stores only an offset for the source object and returns CDC records by leveraging the versioning history of the source object.

When the first stream for a table is created, several hidden columns are added to the source table and begin storing change-tracking metadata. These columns consume a small amount of storage. The CDC records returned when querying a stream rely on a combination of the offset stored in the stream and the change-tracking metadata stored in the table. Note that for streams on views, change tracking must be enabled explicitly for the view and the underlying tables in order to add the hidden columns to those tables.

Streams do support repeatable read isolation. In repeatable read mode, multiple SQL statements within a transaction see the same set of records in a stream. This differs from the read committed mode supported for tables, in which statements see any changes made by previous statements executed within the same transaction, even though those changes are not yet committed. The delta records returned by a stream in a transaction cover the range from the current position of the stream up to the transaction start time. The stream position advances to the transaction start time if the transaction commits; otherwise it stays at the same position.
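The behavior above can be sketched in Snowflake SQL. This is a minimal illustration with hypothetical table and stream names (`raw_events`, `prod_events`, `raw_events_stream`), not code from the exam:

```sql
-- Hypothetical source and target tables
CREATE OR REPLACE TABLE raw_events (id INT, payload VARIANT);
CREATE OR REPLACE TABLE prod_events (id INT, payload VARIANT);

-- Creating the first stream on a table adds hidden change-tracking
-- columns to the source table; the stream itself stores only an offset
CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

INSERT INTO raw_events SELECT 1, PARSE_JSON('{"k": "v"}');

-- Querying the stream returns CDC records plus metadata columns
SELECT id, METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID
FROM raw_events_stream;

-- Consuming the stream in DML inside a transaction advances its offset
-- only on COMMIT (repeatable read within the transaction)
BEGIN;
INSERT INTO prod_events SELECT id, payload FROM raw_events_stream;
COMMIT;  -- offset advances; a ROLLBACK would leave it unchanged
```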



A Data Scientist uses streams in ELT (extract, load, transform) processes, where new data inserted into a staging table is tracked by a stream. A set of SQL statements transforms and inserts the stream contents into a set of production tables. Raw data arrives in JSON format, but for analysis it must be transformed into relational columns in the production tables. Which of the following data transformation SQL functions can he use to achieve this?

  1. Transformations cannot be applied to stream data.
  2. lateral flatten()
  3. METADATA$ACTION ()
  4. Transpose()

Answer(s): B

Explanation:

For details on the LATERAL FLATTEN SQL construct, refer to:
https://docs.snowflake.com/en/sql-reference/constructs/join-lateral#example-of-using-lateral-with-flatten
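A minimal sketch of the pattern the question describes, using hypothetical object names (`staging_json`, `staging_json_stream`, `orders_flat`) and an assumed JSON shape:

```sql
-- Hypothetical staging table receiving raw JSON, tracked by a stream
CREATE OR REPLACE TABLE staging_json (v VARIANT);
CREATE OR REPLACE STREAM staging_json_stream ON TABLE staging_json;
CREATE OR REPLACE TABLE orders_flat (order_id INT, sku STRING, qty INT);

INSERT INTO staging_json
  SELECT PARSE_JSON(
    '{"order_id": 7, "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]}');

-- LATERAL FLATTEN turns each element of the items array into its own
-- relational row, consuming the stream contents in the same statement
INSERT INTO orders_flat (order_id, sku, qty)
SELECT s.v:order_id::INT,
       f.value:sku::STRING,
       f.value:qty::INT
FROM staging_json_stream s,
     LATERAL FLATTEN(input => s.v:items) f;
```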



Which command manually triggers a single run of a scheduled task (either a standalone task or the root task in a DAG) independent of the schedule defined for the task?

  1. RUN TASK
  2. CALL TASK
  3. EXECUTE TASK
  4. RUN ROOT TASK

Answer(s): C

Explanation:

The EXECUTE TASK command manually triggers a single run of a scheduled task (either a standalone task or the root task in a DAG) independent of the schedule defined for the task. A successful run of a root task triggers a cascading run of child tasks in the DAG as their precedent task completes, as though the root task had run on its defined schedule.
This SQL command is useful for testing new or modified standalone tasks and DAGs before you enable them to execute SQL code in production.
Call this SQL command directly in scripts or in stored procedures. In addition, this command supports integrating tasks in external data pipelines. Any third-party service that can authenticate into your Snowflake account and authorize SQL actions can execute the EXECUTE TASK command to run tasks.
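A short sketch of a manual trigger and a follow-up check, with a hypothetical task name (`my_pipeline_root_task`):

```sql
-- Manually trigger one run of a standalone or root task,
-- independent of its defined schedule
EXECUTE TASK my_pipeline_root_task;

-- Inspect the outcome of recent runs in the task history
SELECT name, state, scheduled_time
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(
       TASK_NAME => 'MY_PIPELINE_ROOT_TASK'))
ORDER BY scheduled_time DESC
LIMIT 5;
```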



Which of the following Snowflake parameters can be used to automatically suspend tasks running data science pipelines after a specified number of failed runs?

  1. SUSPEND_TASK
  2. SUSPEND_TASK_AUTO_NUM_FAILURES
  3. SUSPEND_TASK_AFTER_NUM_FAILURES
  4. There is none as such available.

Answer(s): C

Explanation:

Automatically Suspend Tasks After Failed Runs
Optionally suspend tasks automatically after a specified number of consecutive runs that either fail or time out. This feature can reduce costs by suspending tasks that consume Snowflake credits but fail to run to completion. Failed task runs include runs in which the SQL code in the task body either produces a user error or times out. Task runs that are skipped, canceled, or that fail due to a system error are considered indeterminate and are not included in the count of failed task runs.

Set the SUSPEND_TASK_AFTER_NUM_FAILURES = num parameter on a standalone task or the root task in a DAG. When the parameter is set to a value greater than 0, the following behavior applies to runs of the standalone task or DAG:
Standalone tasks are automatically suspended after the specified number of consecutive task runs either fail or time out.
The root task is automatically suspended after the run of any single task in a DAG fails or times out the specified number of times in consecutive runs.
The parameter can be set when creating a task (using CREATE TASK) or later (using ALTER TASK). The setting applies to tasks that rely on either Snowflake-managed compute resources (i.e. the serverless compute model) or user-managed compute resources (i.e. a virtual warehouse).

The SUSPEND_TASK_AFTER_NUM_FAILURES parameter can also be set at the account, database, or schema level. The setting applies to all standalone or root tasks contained in the modified object. Note that explicitly setting the parameter at a lower (i.e. more granular) level overrides the parameter value set at a higher level.
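The parameter usage can be sketched as follows, with hypothetical task and schema names (`my_pipeline_root_task`, `analytics.pipelines`):

```sql
-- Suspend the root task after 3 consecutive failed or timed-out runs
ALTER TASK my_pipeline_root_task SET SUSPEND_TASK_AFTER_NUM_FAILURES = 3;

-- Or set a default at the schema level; the more granular task-level
-- setting above would override this value
ALTER SCHEMA analytics.pipelines SET SUSPEND_TASK_AFTER_NUM_FAILURES = 5;
```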






Post your Comments and Discuss Snowflake DSA-C02 exam with other Community members:

beza commented on September 25, 2024
The question and answer sample is very helpful
Anonymous

Bhuvaneswari E commented on September 25, 2024
Good for preparation
Anonymous

Mohammad commented on September 25, 2024
helpful, but i think it should be updated
Anonymous

Harish commented on September 25, 2024
Good level of questions
Anonymous

Kiran commented on September 25, 2024
Good collection
Anonymous

seb Tan commented on September 25, 2024
Very accurate and curated
AUSTRALIA

Mario commented on September 25, 2024
Passed my automation anywhere ADVANCED - RPA- PROFESSIONAL exam. Thank you website owner.
Italy

Oluwal commented on September 24, 2024
Great questions
UNITED STATES

Tanu commented on September 24, 2024
Great study material to prepare for the exam
Anonymous

Mohammed commented on September 24, 2024
Thank you for providing this exam dumps. The site is amazing and very clean. Please keep it this way and don't add any annoying ads or recaptcha validation like other sites.
GERMANY

Pranesh commented on September 24, 2024
preparing for the exam. little help might be good
UNITED STATES