What does "Compliance" refer to in the context of data, and what kind of documentation is required to demonstrate it?
Compliance refers to the degree to which data and its handling processes adhere to established ethical standards, conventions, protocols, or regulations. Documenting this is crucial to ensure transparency and accountability.
It is important to understand that the need for ethical compliance documentation is not exclusive to clinical trials. It also applies to observational studies and any other research that involves the use of secondary data.
You should include in this attribute, for example, documentation of ethics committee approvals (where required), Data Protection Impact Assessments, DPO reports, security or data protection certifications, etc.
Related Articles
Does the time to access truly reflect data quality?
While not an intrinsic quality dimension of the data itself, access time can be a critical factor in the decision to use a dataset. This metric can be measured by the HDAB, but its scope must be clearly defined. The measurement should only cover the ...
Does "Maturity Dashboard - Data Management" refer exclusively to secondary use, or also to primary use?
Proper management of data in its primary use undoubtedly contributes to data quality for secondary use. However, here we are asking exclusively about management for secondary use.
How should 'time to access' be calculated for a dataset with multiple components that have different access times (e.g., a mix of instantly available open data and restricted data that requires approval)?
For a complex dataset with components that have different access times, the reported "time to access" should always be the longest time required. In other words, the metric must represent the total time until all requested components of the dataset ...
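The rule above can be sketched in a few lines: for a composite dataset, the reported metric is the maximum access time across all requested components. This is an illustrative sketch only, not part of any QUANTUM tooling; the component names and times are hypothetical.

```python
# Hypothetical access times (in days) for the components of one dataset.
component_access_days = {
    "open_data": 0,            # instantly available
    "restricted_images": 30,   # requires approval
    "genomic_records": 90,     # requires ethics review
}

# The reported "time to access" is the longest component time,
# i.e. the time until ALL requested components are available.
time_to_access = max(component_access_days.values())
print(time_to_access)  # 90
```

In other words, instantly available open data does not lower the reported figure; the restricted component that takes longest determines it.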
Is the Quality Label agnostic to data type, or is it different depending on the type of dataset (tabular, image, text, genomic...)?
It is agnostic to data type, data usage, and data users.
Will QUANTUM provide any tools to directly measure the quality of our data, or should we use our own tools and procedures to assess the quality dimensions of our dataset?
Since the QUANTUM tool has to be agnostic to data type, usage, and user, it cannot provide you with specific tools and algorithms. This is also because such tools, to be implemented, may need to access the real data, which for many cases and users is ...