SCDM CCDM Exam
Certified Clinical Data Manager (Page 5)

Updated On: 7-Feb-2026

What method is used for quality control of the query resolution process?

  A. Calculate the time from discrepancy identified to query sent.
  B. Tabulate the number of queries sent per site.
  C. Calculate the time from query sent to query resolution from the site.
  D. Perform random audits of the resolved query forms.

Answer(s): D

Explanation:

The most effective method for quality control (QC) of the query resolution process is to perform random audits of resolved query forms. This ensures that queries are being appropriately raised, addressed, and resolved in accordance with the study protocol, data management plan (DMP), and standard operating procedures (SOPs).
According to the GCDMP (Chapter: Data Validation and Cleaning), QC activities should verify that the data review and query management process maintains high accuracy and consistency. Random auditing of resolved queries enables verification that:

- Queries were raised for legitimate discrepancies,
- The site's responses were appropriate, and
- The resolution actions taken by data management were correct and well-documented.

Metrics such as turnaround time (options A and C) or query counts (option B) measure efficiency but do not assess quality. True quality control focuses on ensuring that data corrections preserve accuracy, auditability, and traceability -- the fundamental principles of data integrity in clinical research.
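As a rough illustration of the audit step described above (not prescribed by the GCDMP; query IDs, sample size, and seed are arbitrary assumptions), a reproducible random sample of resolved queries might be drawn like this:

```python
import random

def select_audit_sample(resolved_queries, sample_size, seed=None):
    """Draw a random sample of resolved queries for QC review.

    resolved_queries: list of query identifiers (e.g. form/query IDs).
    sample_size: number of queries to audit.
    seed: fixed for a reproducible, documentable audit selection.
    """
    rng = random.Random(seed)
    if sample_size >= len(resolved_queries):
        # Pool smaller than the requested sample: audit everything.
        return list(resolved_queries)
    # Sampling without replacement, so no query is audited twice.
    return rng.sample(resolved_queries, sample_size)

# Example: audit 5 of 100 resolved queries
queries = [f"QRY-{i:03d}" for i in range(1, 101)]
sample = select_audit_sample(queries, 5, seed=42)
print(sample)
```

Seeding the generator is a deliberate choice here: it lets the audit selection be re-created later, which supports the auditability the explanation emphasizes.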


Reference:

(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 5.4 - Query Management and Quality Control
ICH E6 (R2) GCP, Section 5.5.3 - Data Integrity and Validation Procedures



Which metric will identify edit checks that may not be working properly?

  A. Count by edit check of the number of times the check fired
  B. Count by site of the number of times any edit check fired
  C. Average number of edit check identified discrepancies per form
  D. Average number of times each edit check has fired

Answer(s): A

Explanation:

The best metric to identify malfunctioning or ineffective edit checks is the count by edit check of the number of times the check fired. This allows data managers to assess whether specific edit checks are performing as intended.
According to the GCDMP, Chapter: Data Validation and Cleaning, edit checks are programmed logic conditions that identify data inconsistencies or potential errors during data entry. A properly functioning edit check should trigger only when data falls outside acceptable or logical limits. If an edit check fires too frequently or not at all, it may indicate a logic error in the check's programming or configuration.
By analyzing counts by individual edit checks, data managers can:

- Identify checks that never trigger (potentially inactive or incorrectly written),
- Detect overactive checks (poorly designed parameters causing excessive false positives), and
- Optimize system performance and review efficiency.
This metric supports continuous improvement in data validation logic and contributes to cleaner, higher-quality clinical databases.
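The per-check tabulation described above can be sketched as follows. This is a hypothetical illustration, not a GCDMP-specified algorithm; the check IDs and the "overactive" threshold (a multiple of the mean firing rate) are arbitrary assumptions:

```python
from collections import Counter

def flag_suspect_checks(fired_events, all_check_ids, overactive_factor=3.0):
    """Count firings per edit check and flag suspect checks.

    fired_events: list of edit-check IDs, one entry per firing.
    all_check_ids: every programmed check, so silent checks are
        visible (a plain Counter over events would hide them).
    overactive_factor: flag checks firing more than this multiple
        of the mean firing count (threshold is an assumption).
    """
    counts = Counter(fired_events)
    totals = {cid: counts.get(cid, 0) for cid in all_check_ids}
    mean = sum(totals.values()) / len(totals)
    never_fired = [cid for cid, n in totals.items() if n == 0]
    overactive = [cid for cid, n in totals.items()
                  if n > overactive_factor * mean]
    return totals, never_fired, overactive

# Hypothetical firing log: EC02 fires constantly, EC04 never fires.
events = ["EC01"] * 2 + ["EC02"] * 40 + ["EC03"] * 3
totals, silent, noisy = flag_suspect_checks(
    events, ["EC01", "EC02", "EC03", "EC04"])
print(silent)  # ['EC04']
print(noisy)   # ['EC02'] (fires far above the mean)
```

Including `all_check_ids` is the important design point: checks that never fire are exactly the ones a count over raw events alone would never surface.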


Reference:

(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 6.2 - Edit Check Design and Performance Metrics
FDA Guidance: Computerized Systems Used in Clinical Investigations - Section on Validation of Electronic Data Systems



What should be done if the site continues to provide inconsistent data after several re-queries?

  A. Continue to re-query until the site changes the data
  B. Gently lead the site to the correct response
  C. Escalate the issue to the appropriate site contact personnel
  D. Do nothing, the data will remain inconsistent

Answer(s): C

Explanation:

If a clinical site continues to provide inconsistent or illogical data after multiple queries, the correct course of action is to escalate the issue to the appropriate site contact personnel, typically the Clinical Research Associate (CRA) or Site Monitor.
According to the Good Clinical Data Management Practices (GCDMP), persistent data discrepancies often indicate a misunderstanding of the protocol, CRF instructions, or data entry procedures at the site level. Repeatedly re-querying the same data without escalation wastes time and risks introducing bias or error.
By escalating through formal communication channels, the issue can be clarified through re-training, documentation review, or site monitoring visits. The GCDMP emphasizes that escalation ensures data accuracy, site accountability, and protocol adherence, maintaining both data quality and regulatory compliance.
Data managers must document the escalation process in the Data Management Plan (DMP) and ensure proper follow-up resolution is achieved.


Reference:

(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Communication and Issue Escalation, Section 4.2 - Handling Persistent Data Discrepancies
ICH E6 (R2) Good Clinical Practice, Section 5.18 - Monitoring and Site Communication
FDA Guidance for Industry: Oversight of Clinical Investigations - Risk-Based Monitoring, Section on Issue Escalation



Which competency is necessary for EDC system use in a study using the medical record as the source?

  A. Screening study subjects
  B. Using ePRO devices
  C. Resolving discrepant data
  D. Training on how to log into Medical Records system

Answer(s): D

Explanation:

In studies where the medical record serves as the source document, the Electronic Data Capture (EDC) system users (typically study coordinators or site personnel) must have appropriate training on how to access and log into the medical record system. This competency ensures that data abstracted from the electronic medical record (EMR) are complete, accurate, and verifiable in compliance with Good Clinical Practice (GCP) and Good Clinical Data Management Practices (GCDMP).

According to the GCDMP (Chapter: EDC Systems and Data Capture) and ICH E6(R2), all personnel involved in data entry and verification must be trained in both the EDC and the primary source systems (e.g., EMR). This ensures that the integrity of data flow--from source to EDC--is maintained, and that personnel understand system access controls, audit trails, and proper documentation of source verification.
While resolving discrepant data (C) and screening subjects (A) are part of study operations, the competency directly related to EDC system use in EMR-based studies is the ability to properly log into and navigate the medical records system to extract source data.


Reference:

(CCDM-Verified Sources)
SCDM GCDMP, Chapter: Electronic Data Capture (EDC), Section 5.1 - Source Data and System Access Requirements
ICH E6(R2) Good Clinical Practice, Section 4.9 - Source Documents and Data Handling
FDA Guidance: Use of Electronic Health Record Data in Clinical Investigations, Section 3 - Investigator Responsibilities



A Data Manager is importing data from an external facility.
Which is commonly checked first?

  A. Incoming files have the expected number of records
  B. Incoming files are conformant to the data transfer specifications
  C. Data in incoming files are consistent with existing data in the study database
  D. Data in the incoming files are internally consistent

Answer(s): B

Explanation:

When importing external data (e.g., laboratory or imaging results) into a clinical database, the first step in data import quality control is to verify that incoming files conform to the pre-specified data transfer specifications (DTS).
According to the GCDMP (Chapter: External Data Transfers and Integration), the Data Transfer Specification defines file structure, variable names, data types, delimiters, record counts, and validation rules. The initial import check confirms that the received file matches the technical and structural requirements before content or record consistency is evaluated.
Subsequent checks -- such as record counts (A), consistency with the existing study database (C), and internal logical consistency (D) -- are performed only after the file structure is validated and confirmed to match the specifications. Failure to perform this first check may cause import errors or corrupted data loads.
Thus, the first and most critical verification step is ensuring file conformity to the agreed data transfer specifications, making option B correct.
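A minimal sketch of the structural conformance check described above, assuming a CSV transfer file and an ordered column list taken from the DTS (the column names shown are illustrative CDISC-style variables, not from any real specification):

```python
import csv
import io

def check_dts_conformance(file_text, spec_columns):
    """Verify a delimited transfer file matches the DTS column layout.

    spec_columns: ordered list of column names from the DTS.
    Returns (ok, problems). Structural checks only; content checks
    (record counts, cross-checks against the study database) are
    performed later, after the structure is confirmed.
    """
    reader = csv.reader(io.StringIO(file_text))
    header = next(reader, None)
    problems = []
    if header is None:
        return False, ["file is empty"]
    if header != spec_columns:
        problems.append(
            f"header mismatch: got {header}, expected {spec_columns}")
    # Every data row must carry exactly the specified number of fields.
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(spec_columns):
            problems.append(
                f"line {lineno}: {len(row)} fields, "
                f"expected {len(spec_columns)}")
    return not problems, problems

spec = ["SUBJID", "VISIT", "LBTEST", "LBORRES"]
good = "SUBJID,VISIT,LBTEST,LBORRES\n001,SCRN,ALT,22\n"
ok, issues = check_dts_conformance(good, spec)
print(ok)  # True
```

Rejecting the file on structural grounds before any content is loaded mirrors the ordering in the explanation: a file that fails this gate never reaches the record-count or consistency checks.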


Reference:

(CCDM-Verified Sources)
SCDM GCDMP, Chapter: External Data Transfers, Section 4.2 - Data Transfer File Validation and Import Checks
ICH E6(R2) GCP, Section 5.5.3 - Validation of Computerized Systems and Data Imports



Viewing page 5 of 31
Viewing questions 21 - 25 out of 150 questions


