According to the FDA Guidance for Industry, Providing Regulatory Submissions in Electronic Format (April 2006) and Good Clinical Data Management Practices (GCDMP, May 2007), which of the following is the most acceptable for a derived field?
Answer(s): C
In clinical data management, a derived field refers to any variable that is not directly collected from the Case Report Form (CRF) but is instead calculated or inferred from one or more collected variables (for example, calculating an average blood pressure from multiple readings). Proper documentation of derived fields is essential for ensuring data traceability, transparency, and compliance with both FDA and SCDM guidelines.

According to the Good Clinical Data Management Practices (GCDMP, May 2007), all derivations and transformations applied to clinical data must be clearly defined and documented in metadata such as the dataset definition file (also referred to as data specifications, variable definition tables, or Define.xml files). The derivation algorithm should be explicitly stated in this documentation to allow independent verification, regulatory review, and reproducibility of results.

The FDA Guidance for Industry (April 2006) on electronic submissions further emphasizes that derived fields must be supported by comprehensive metadata that defines the computational method used. This documentation enables the FDA or any other regulatory body to audit and reproduce analytical results without ambiguity. Annotating or describing derivations directly on the CRF (as in options A, B, or D) is not sufficient, because CRFs are data collection instruments, not analytical documentation.

Therefore, the correct and regulatory-compliant practice is to provide the derivation algorithm for a calculated field within the dataset definition file, aligning with both FDA and GCDMP expectations for data integrity and auditability.
(CCDM-Verified Sources)
Society for Clinical Data Management (SCDM), Good Clinical Data Management Practices (GCDMP), Chapter: Data Handling and Processing, Derived and Calculated Data Fields, Section 5.3.3
FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format, April 2006, Section 3.2, Dataset Documentation Requirements
CDISC Define.xml Implementation Guide, Metadata and Algorithm Documentation for Derived Variables
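As a hedged illustration of this practice (the variable name, metadata keys, and values are invented for the example, not taken from the GCDMP or Define.xml), a derived variable can be paired with metadata that documents its algorithm, so the derivation is reproducible independently of the code:

```python
import statistics

# Hypothetical derivation metadata, in the spirit of a dataset definition
# file entry: the algorithm for the derived field is stated explicitly.
DERIVATION_METADATA = {
    "variable": "SYSBP_MEAN",
    "origin": "Derived",
    "algorithm": "Arithmetic mean of all non-missing SYSBP readings",
}

def derive_sysbp_mean(readings):
    """Apply the documented derivation to a list of collected readings."""
    non_missing = [r for r in readings if r is not None]
    if not non_missing:
        return None  # no collected values, so the derived field is missing
    return statistics.mean(non_missing)

print(derive_sysbp_mean([120, 124, None, 122]))
```

Because the algorithm lives in the metadata rather than only in the code, a reviewer can re-derive the value and verify it matches the submitted dataset.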
A study numbers subjects sequentially within each site and does not reuse site numbers. Which information is required when joining data across tables?
Answer(s): A
When subjects are numbered sequentially within each site, the subject identification numbers (Subject IDs) restart from 001 at each site. For example, Site 101 may have a Subject 001, and Site 102 may also have a Subject 001. In such cases, the subject number alone is not globally unique across the entire study. Therefore, when integrating or joining data across multiple database tables (for example, linking demographic, adverse event, and laboratory data), both the site number and the subject number are required to create a unique key that accurately identifies each record.

According to the Good Clinical Data Management Practices (GCDMP, Chapter on CRF Design and Data Collection), every data record in a clinical trial database must be uniquely and unambiguously identified. This is typically achieved through a composite key combining identifiers such as site number, subject number, and sometimes study number. The GCDMP specifies that a robust data structure must prevent duplication or mislinking of records across domains or tables.

Furthermore, FDA and CDISC standards (SDTM model) emphasize the importance of unique subject identifiers (USUBJID), which are derived by concatenating the study ID, site ID, and subject ID. This ensures traceability, integrity, and accuracy of subject-level data during database joins, data exports, and regulatory submissions.

Thus, in the described scenario, because subject numbering restarts at each site, both the site number and the subject number are required to uniquely identify and correctly join subject data across different datasets or tables.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: CRF Design and Data Collection, Section 4.1, Unique Subject Identification
CDISC SDTM Implementation Guide, Section 5.2, Subject and Site Identification (Variable: USUBJID)
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6, Data Integrity and Record Identification
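To make the composite-key idea concrete, here is a minimal Python sketch; the study ID "STUDY01" and all records are invented for illustration:

```python
# Two sites both have a Subject 001, so subject number alone is ambiguous.
demographics = [
    {"site": "101", "subject": "001", "sex": "F"},
    {"site": "102", "subject": "001", "sex": "M"},  # same subject number, different site
]
adverse_events = [
    {"site": "102", "subject": "001", "term": "Headache"},
]

def usubjid(study, row):
    # SDTM-style unique identifier: study, site, and subject concatenated.
    return f"{study}-{row['site']}-{row['subject']}"

# Join on the composite key (site + subject), never on subject alone.
demo_by_key = {(r["site"], r["subject"]): r for r in demographics}
joined = [
    {**demo_by_key[(ae["site"], ae["subject"])], **ae}
    for ae in adverse_events
]
print(joined[0]["sex"], usubjid("STUDY01", joined[0]))  # M STUDY01-102-001
```

Joining on the subject number alone would have matched the Site 101 record instead, mislinking the adverse event to the wrong person.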
Which of the following factors can be tested through a second test transfer?
Answer(s): B
In the context of database design and external data management, a test data transfer (or trial data load) is performed to verify the proper configuration, structure, and integrity of data imported from an external vendor or system. A second test transfer is specifically useful for confirming that data structures and formats are consistently aligned between the sending and receiving systems after adjustments have been made following the first test.

According to the Good Clinical Data Management Practices (GCDMP), the file format, including variables, data types, field lengths, delimiters, and encoding, must be validated during test transfers to confirm compatibility and ensure accurate loading into the target database. Once the initial test identifies and corrects errors (e.g., mismatched variable names or data types), the second transfer verifies that the corrections have been implemented correctly and that the file structure functions as intended.

Testing change management (A) involves procedural controls, not data transfers. The transfer method (C) and transfer frequency (D) are validated during initial process setup, not during subsequent test transfers. Therefore, option B (File format) is correct, as the second test transfer verifies the technical integrity of the file structure before live production transfers begin.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: External Data Transfers and Data Integration, Section 5.2, Test Transfers and File Validation
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.3, Data Import and Validation Controls
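The kind of structural check a test transfer exercises can be sketched as follows; the four-column CSV specification is a hypothetical example, not a real transfer agreement:

```python
import csv
import io

# Hypothetical transfer specification agreed with the vendor: expected
# column names for a delimited lab file. A second test transfer re-runs
# these checks to confirm corrections from the first test were applied.
EXPECTED_COLUMNS = ["SITEID", "SUBJID", "LBTEST", "LBORRES"]

def validate_transfer(file_text):
    """Return a list of structural problems found in a delimited file."""
    problems = []
    reader = csv.reader(io.StringIO(file_text))
    header = next(reader, None)
    if header != EXPECTED_COLUMNS:
        problems.append(f"header mismatch: {header}")
    for n, row in enumerate(reader, start=2):
        if len(row) != len(EXPECTED_COLUMNS):
            problems.append(f"line {n}: expected {len(EXPECTED_COLUMNS)} fields, got {len(row)}")
    return problems

good = "SITEID,SUBJID,LBTEST,LBORRES\n101,001,GLUC,5.4\n"
bad = "SITEID,SUBJID,LBTEST\n101,001,GLUC\n"
print(validate_transfer(good))  # no problems
print(validate_transfer(bad))   # header and field-count problems
```

A real validation would also check data types, field lengths, and encoding, per the agreed transfer specification.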
Which of the following statements would be BEST included in a data management plan describing the process for making self-evident corrections in a clinical database?
Answer(s): D
A self-evident correction (SEC) is a data correction that is obvious, logical, and unambiguous, such as correcting an impossible date (e.g., 31-APR-2024) or expanding a known abbreviation (e.g., "BP" to "Blood Pressure"). According to the Good Clinical Data Management Practices (GCDMP), SECs can be applied by data management staff following pre-approved conventions defined in the Data Management Plan (DMP).

The DMP should explicitly describe the criteria for SECs, including the types of errors eligible for this correction method, the required documentation, and the communication procedure for informing the investigative site. The process must maintain audit trail transparency and ensure that all changes are traceable and justified.

Options A and B suggest unauthorized or informal change procedures, which violate audit and compliance standards. Option C is too restrictive, as it prevents the efficient correction of non-clinical transcription or formatting errors. Therefore, option D is correct: "Self-evident changes may be made per the listed conventions and documented to the investigative site." This approach aligns with CCDM expectations for balancing efficiency, accuracy, and regulatory compliance.
(CCDM-Verified Sources)
SCDM GCDMP, Chapter: Data Validation and Cleaning, Section 6.2, Self-Evident Corrections
FDA 21 CFR Part 11, Electronic Records; Audit Trails and Traceability Requirements
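As a sketch only, pre-approved SEC conventions could be applied and logged like this; the abbreviation mappings are invented examples rather than an actual DMP convention list:

```python
# Hypothetical pre-approved conventions from a DMP: each maps an obvious,
# unambiguous value to its correction. Every applied correction is logged
# so it can be documented to the investigative site.
CONVENTIONS = {"BP": "Blood Pressure", "HR": "Heart Rate"}

def apply_sec(value, log):
    """Apply a self-evident correction only if a listed convention covers it."""
    if value in CONVENTIONS:
        corrected = CONVENTIONS[value]
        log.append(f"SEC applied per convention: '{value}' -> '{corrected}'")
        return corrected
    return value  # anything not covered must go through a query instead

log = []
print(apply_sec("BP", log))     # corrected per convention
print(apply_sec("Pulse", log))  # unchanged: not a listed convention
```

The key design point is that staff never improvise: values outside the listed conventions are left untouched and routed to the normal query process.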
Which document contains the details of when, to whom, and in what manner the vendor data will be sent?
Answer(s): C
A Data Transfer Agreement (DTA) defines the operational and technical details for transferring data between a sponsor and an external vendor (e.g., a central lab or ECG vendor). It is a formal, controlled document specifying what data will be sent, when transfers will occur, the transfer method, file structure, encryption or security protocols, and the recipients of the data. The DTA is developed jointly by the sponsor and vendor before production data transfers begin. According to the GCDMP, Chapter on External Data Transfers, this agreement ensures that both parties share a clear understanding of timing, responsibility, and data content, minimizing errors and ensuring regulatory compliance.

The Data Management Plan (DMP) outlines general data handling processes but does not capture the technical specifics of vendor data transfer logistics. The Project Plan (A) and Communication Plan (B) are broader operational tools and not specific to data transfer protocols. Hence, option C (Data Transfer Agreement) is the correct answer, as it precisely governs the procedural and technical framework of vendor data exchange.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: External Data Transfers, Section 4.1, Data Transfer Agreements and Specifications
ICH E6(R2) Good Clinical Practice, Section 5.5, Trial Management, Data Handling, and Record Keeping
Which metric reveals the timeliness of the site-work dimension of site performance?
Answer(s): D
The site-work dimension of site performance evaluates how efficiently sites manage and resolve data-related tasks, particularly query resolution, data entry, and correction timelines. Among the given metrics, the median and range of time from query generation to resolution (D) directly measures a site's responsiveness and data management efficiency. According to the GCDMP (Chapter on Metrics and Performance Measurement), this indicator helps identify sites that delay query resolution, which can impact overall study timelines and data quality. Tracking this metric allows the data management team to proactively provide additional training or communication to underperforming sites.

The other options measure different aspects of project progress: A reflects overall database closure speed, while B and C relate to study startup and enrollment readiness, not ongoing data work. Thus, option D accurately represents a site performance timeliness metric, aligning with CCDM principles for operational performance measurement.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Metrics and Performance Management, Section 5.4, Site Query Resolution Metrics
ICH E6(R2) Good Clinical Practice, Section 5.18, Monitoring and Site Performance Oversight
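The metric itself is simple to compute; a minimal sketch with invented per-site query resolution times (days from query generation to resolution):

```python
import statistics

# Illustrative numbers only, not from any real study: days from query
# generation to resolution, grouped by site.
resolution_days = {
    "Site 101": [2, 3, 5, 4],
    "Site 102": [10, 21, 7, 14],
}

for site, days in resolution_days.items():
    median = statistics.median(days)
    low, high = min(days), max(days)
    print(f"{site}: median {median} days, range {low}-{high} days")
```

Reporting the median alongside the range surfaces both typical responsiveness and outlier queries that may warrant site follow-up.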
What is the main reason 21 CFR Part 11 requires that EDC systems maintain an audit trail?
The primary purpose of maintaining an audit trail under 21 CFR Part 11 is to preserve data integrity. According to the U.S. FDA's regulation on electronic records and signatures, every change to electronic data must be traceable, including who made the change, when it was made, and what the change entailed.

The Good Clinical Data Management Practices (GCDMP) explains that an audit trail provides a permanent, chronological record of all modifications to clinical data. This ensures transparency and allows the course of data entry and modification to be reconstructed. The regulation aims to prevent unauthorized or undocumented data manipulation, thereby maintaining the accuracy, reliability, and validity of electronic records.

FDA 21 CFR Part 11, Section 11.10(e) explicitly mandates that systems use secure, computer-generated, time-stamped audit trails to independently record the date and time of operator entries and actions that create, modify, or delete electronic records. This keeps the data trustworthy and defensible in regulatory reviews or inspections. Therefore, the main reason for requiring an audit trail is to preserve data integrity: ensuring that all data captured, modified, or transmitted is authentic, accurate, and complete throughout the study lifecycle.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Regulatory Compliance and Data Integrity
FDA 21 CFR Part 11, Electronic Records; Electronic Signatures, Section 11.10(e)
ICH E6(R2) Good Clinical Practice, Section 5.5.3, Data Integrity and System Validation
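A minimal sketch of the who/when/what record an audit trail captures (the field names, user ID, and query reference are assumptions for illustration, not the Part 11 data model):

```python
from datetime import datetime, timezone

# Append-only list standing in for a secure, computer-generated audit trail:
# each entry records who made the change, when, and what it entailed.
audit_trail = []

def record_change(user, field, old, new, reason):
    audit_trail.append({
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),  # time-stamped
        "field": field,
        "old_value": old,
        "new_value": new,
        "reason": reason,
    })

# Hypothetical correction of a transcription error (query ID is invented).
record_change("jdoe", "SYSBP", 210, 120, "transcription error per query QRY-014")
entry = audit_trail[0]
print(entry["user"], entry["field"], entry["old_value"], "->", entry["new_value"])
```

In a compliant system these entries would be generated by the system itself, protected from edit or deletion, and retained for the record's lifetime; the point of the sketch is that the prior value is never overwritten silently.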
Electronic submission standards require that an individual subject's complete CRF be provided as what type of file?
Electronic submission standards, as established by the FDA, CDISC, and ICH, require that an individual subject's complete Case Report Form (CRF) be submitted as a Portable Document Format (.pdf) file. The PDF format is universally recognized and accepted because it preserves the structure, format, and visual fidelity of the CRF exactly as originally designed, regardless of software or hardware environment.

According to the FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format (2006) and CDISC SDTM standards, sponsors must include a subject-level CRF in PDF form for each participant in the submission. This requirement ensures that reviewers can trace data points from analysis datasets back to their source entries in the CRF, fulfilling the principles of data traceability and transparency.

The Good Clinical Data Management Practices (GCDMP) also supports this requirement, emphasizing that CRF archiving should maintain readability and regulatory accessibility. Formats such as RTF, DOCX, or SAS datasets are not acceptable substitutes for regulatory CRF submission because they may alter formatting or structure, or introduce modifiable content, violating FDA data integrity principles.
(CCDM-Verified Sources)
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Archiving and Submission
FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format, April 2006
CDISC SDTM Implementation Guide, Section 5.3, CRF Representation and Traceability
Share your comments for the SCDM CCDM exam with other users:
took the test last week, i did have about 15-20 word for word from this site on the test (only was able to cram 600 of the questions from this site, so maybe more were there i didn't review). had 4 labs: bgp, lacp, vrf with tunnels, and actually had to skip a lab due to time. lots of automation syntax questions.
no comments
nice questions bring out the best in you.
really helpful
question #50 and question #81 are exactly the same question: azure site recovery provides ________ for virtual machines. the first says the answer is fault tolerance and the second says disaster recovery. from my research, it should be disaster recovery. can anybody explain to me why? thank you
i am thankful for these exam dump questions, i would not have passed without these exam dumps.
some of the answers seem to be inaccurate. q10 for example, shouldn't it be an m custom column?
are the questions real or fake?
thank you for providing such assistance.
nice questions
my 3rd purchase from this site. these exam dumps are helpful. very helpful.
found it good
excellent material
very helpful
well explained.
i need the pdf, please.
a good source for exam preparation
i need ielts general training audio guide questions
please make this content available
content is good
latest dumps please
aside from pdf the test engine software is helpful. the interface is user-friendly and intuitive, making it easy to navigate and find the questions.
questions and options are correct, but the answers are sometimes wrong, so please check twice or refer to some other platform for the right answer
90% of the questions were there but i failed the exam. i marked the answers as per the guide but it looks like they are not accurate; otherwise i would have passed, given that i saw about 45 of 50 questions from the dump
answer to this question "what administrative safeguards should be implemented to protect the collected data while in use by manasa and her product management team? " it should be (c) for the following reasons: this administrative safeguard involves controlling access to collected data by ensuring that only individuals who need the data for their job responsibilities have access to it. this helps minimize the risk of unauthorized access and potential misuse of sensitive information. while other options such as (a) documenting data flows and (b) conducting a privacy impact assessment (pia) are important steps in data protection, implementing a "need to know" access policy directly addresses the issue of protecting data while in use by limiting access to those who require it for legitimate purposes. (d) is not directly related to safeguarding data during use; it focuses on data transfers and location.
password lockout being the correct answer for question 37 does not make sense. it should be geofencing.
for question 4, the right answer is: recover automatically from failures
question number 4's answer is 3, option c.
very good questions
i am confused about the answers to the questions. are the answers correct?
very useful
need certification.
great exam prep