Splunk SPLK-1005 Exam
Splunk Cloud Certified Admin (Page 2 )

Updated On: 1-Feb-2026

At what point in the indexing pipeline set is SEDCMD applied to data?

  1. In the aggregator queue
  2. In the parsing queue
  3. In the exec pipeline
  4. In the typing pipeline

Answer(s): D

Explanation:

In Splunk, SEDCMD (Stream Editing Commands) is applied during the Typing Pipeline of the data indexing process. The Typing Pipeline is responsible for various tasks, such as applying regular expressions for field extractions, replacements, and data transformation operations that occur after the initial parsing and aggregation steps.
Here's how the indexing process works in more detail:

Parsing Pipeline: Splunk breaks incoming data into events, identifies timestamps, and assigns metadata.
Merging Pipeline: This stage is responsible for merging event lines and handling time-based operations.
Typing Pipeline: This is where SEDCMD operations occur. It applies regular expressions and replacements, which is essential for modifying raw data before indexing. This pipeline is also responsible for field extraction and other similar operations.
Index Pipeline: Finally, the processed data is indexed and stored, where it becomes available for searching.
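As an illustration, SEDCMD entries live under a sourcetype stanza in props.conf. The stanza name and sed patterns below are hypothetical, not taken from the exam:

```ini
# props.conf -- hypothetical sourcetype; SEDCMD runs in the typing pipeline
[my_syslog]
# Mask anything that looks like a 16-digit card number before it is indexed
SEDCMD-mask_card = s/\d{16}/xxxx-MASKED-xxxx/g
# Strip a trailing debug suffix from each event
SEDCMD-strip_debug = s/ DEBUG:.*$//
```

Because these replacements happen at index time, they permanently alter the stored raw data; the original text cannot be recovered by a search.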


Reference:

To verify this information, you can refer to the official Splunk documentation on the data pipeline and indexing process, specifically focusing on the stages of the indexing pipeline and the roles they play. Splunk Docs often discuss the exact sequence of operations within the pipeline, highlighting when and where commands like SEDCMD are applied during data processing.

Source:
Splunk Docs: Managing Indexers and Clusters of Indexers
Splunk Answers: Community discussions and expert responses frequently clarify where specific operations occur within the pipeline.



Which of the following methods is valid for creating index-time field extractions?

  1. Use the UI to create a sourcetype, specify the field name and corresponding regular expression with capture statement.
  2. Create a configuration app with the index-time props.conf and/or transforms.conf, and upload the app via the UI.
  3. Use the CLI to define settings in fields.conf, and restart Splunk Cloud.
  4. Use the rex command to extract the desired field, and then save as a calculated field.

Answer(s): B

Explanation:

The valid method for creating index-time field extractions is to build a configuration app that includes the necessary props.conf and/or transforms.conf settings, then upload that app via the UI. Index-time field extractions must be defined in these configuration files so that fields are extracted during indexing rather than at search time.
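A minimal sketch of what such an app's configuration files might contain. The sourcetype, transform name, and regex are hypothetical examples, not Splunk defaults:

```ini
# props.conf -- hypothetical sourcetype; wire the sourcetype to a transform
[my_sourcetype]
TRANSFORMS-extract_user = extract_user_field

# transforms.conf -- REGEX with a capture group; WRITE_META = true makes
# this an index-time extraction written into the index metadata
[extract_user_field]
REGEX = user=(\w+)
FORMAT = user::$1
WRITE_META = true

# fields.conf -- declare the field as indexed so searches use it correctly
[user]
INDEXED = true
```

Note that each snippet goes in its own file inside the app's local or default directory; they are shown together here only for brevity.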


Reference:

Index-time field extractions



When adding a directory monitor and specifying a sourcetype explicitly, it applies to all files in the directory and subdirectories. If automatic sourcetyping is used, a user can selectively override it in which file on the forwarder?

  1. transforms.conf
  2. props.conf
  3. inputs.conf
  4. outputs.conf

Answer(s): B

Explanation:

When a directory monitor is set up with automatic sourcetyping, a user can selectively override the sourcetype assignment in the props.conf file on the forwarder. props.conf defines how data should be parsed and processed, including assigning or overriding sourcetypes for specific data inputs.
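For example, a source-based stanza in props.conf on the forwarder can override the automatic sourcetype for one file within a monitored tree. The path and sourcetype name below are hypothetical:

```ini
# props.conf on the forwarder -- hypothetical path and sourcetype
# Files matching this source pattern get an explicit sourcetype,
# overriding the automatically assigned one
[source::/var/log/app/special.log]
sourcetype = app_special
```

Source-based (`source::`) stanzas like this take precedence over automatic sourcetype detection for the matching files, while the rest of the monitored directory keeps its automatic assignment.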


Reference:

props.conf configuration



By default, which of the following capabilities are granted to the sc_admin role?

  1. indexes_edit, edit_token, admin_all_objects, delete_by_keyword
  2. indexes_edit, fsh_manage, acs_conf, list_indexes_discovery
  3. indexes_edit, fsh_manage, admin_all_objects, can_delete
  4. indexes_edit, edit_token_http, admin_all_objects, edit_limits_conf

Answer(s): C

Explanation:

By default, the sc_admin role in Splunk Cloud is granted several important capabilities, including:
indexes_edit: The ability to create, edit, and manage indexes.
fsh_manage: The ability to manage federated search configurations.
admin_all_objects: Full administrative control over all objects in Splunk.
can_delete: The ability to delete events using the delete command.
Option C correctly lists these default capabilities for the sc_admin role.
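One way to confirm a role's capabilities in Splunk Cloud is a search against the REST roles endpoint (this assumes your role has permission to call that endpoint):

```
| rest /services/authorization/roles splunk_server=local
| search title=sc_admin
| table title capabilities imported_capabilities
```

The capabilities field lists what the role grants directly, while imported_capabilities shows what it inherits from other roles.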


Reference:

User roles and capabilities



Li was asked to create a Splunk configuration to monitor syslog files stored on Linux servers at their organization. This configuration will be pushed out to multiple systems via a Splunk app using the on-prem deployment server.
The system administrators have provided Li with a directory listing for the logging locations on three syslog hosts, which are representative of the file structure for all systems collecting this data. An example from each system is shown below:

    A) [directory listing not reproduced]

    B) [directory listing not reproduced]

    C) [directory listing not reproduced]

    D) [directory listing not reproduced]

  1. Option A
  2. Option B
  3. Option C
  4. Option D

Answer(s): A

Explanation:

The correct monitor statement that will capture all variations of the syslog file paths across the different systems is:
[monitor:///var/log/network/syslog*/linux_secure/*]
This configuration works because:
syslog* matches directories whose names start with "syslog" (such as syslog01, syslog02, etc.).
The wildcard * after linux_secure/ captures all files within that directory, including different filenames such as syslog.log and syslog.log.2020090801.
This setup will ensure that all the necessary files from the different syslog hosts are monitored.
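Put into a deployable inputs.conf, the stanza might look like the sketch below. The index and sourcetype values are hypothetical placeholders, not part of the question:

```ini
# inputs.conf -- the wildcard in the path matches syslog01, syslog02, ...
# and the trailing * matches every file under linux_secure/
[monitor:///var/log/network/syslog*/linux_secure/*]
sourcetype = linux_secure
index = network
disabled = false
```

Because the sourcetype is set explicitly here, it applies to every file matched by the monitor path, which is exactly the behavior described in the earlier directory-monitor question.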


Reference:

Monitor files and directories





