Will QUANTUM provide any tools to directly measure the quality of our data, or should we use our own tools and procedures to assess the quality dimensions of our dataset?
Since the QUANTUM tool has to be agnostic to data type, usage, and user, it cannot provide you with specific tools or algorithms. This is also because such tools would need to access the real data, which in many cases is sensitive or not accessible at all. For this reason, guidelines and recommended approaches are provided for calculating each metric.
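As an illustration only, here is a minimal sketch of how an organization might compute one such metric in-house, without the data ever leaving its environment. It assumes a tabular dataset loaded with pandas and treats completeness as the share of non-missing values per column; the file name and the choice of metric are hypothetical and not prescribed by QUANTUM.

```python
import pandas as pd

def completeness(df: pd.DataFrame) -> pd.Series:
    """Share of non-missing values per column (1.0 = fully complete)."""
    return 1.0 - df.isna().mean()

# Hypothetical local, sensitive dataset that never leaves the organization.
df = pd.read_csv("our_dataset.csv")
print(completeness(df).round(3))
```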
Related Articles
Will we be provided with a dataset to test the QUANTUM quality tool, or will we use a dataset from our organization?
No, you will need to use a dataset from your own organization.
Does the time to access truly reflect data quality?
While not an intrinsic quality dimension of the data itself, access time can be a critical factor in the decision to use a dataset. This metric can be measured by the HDAB, but its scope must be clearly defined. The measurement should only cover the ...
Does the Maturity Dashboard - Data Management refer exclusively to secondary use, or also to primary use?
Proper management of data in its primary use undoubtedly contributes to data quality for secondary use. However, here we are asking exclusively about management for secondary use.
How should 'time to access' be calculated for a dataset with multiple components that have different access times (e.g., a mix of instantly available open data and restricted data that requires approval)?
For a complex dataset with components that have different access times, the reported "time to access" should always be the longest time required. In other words, the metric must represent the total time until all requested components of the dataset ...
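A minimal sketch of this rule, assuming the access time of each component is already known; the component names and values below are hypothetical:

```python
# Hypothetical access times per dataset component, in days.
component_access_times = {
    "open_registry_extract": 0,          # instantly available open data
    "restricted_clinical_records": 45,   # restricted data requiring approval
}

# The reported 'time to access' is the longest of the component times,
# i.e. the time until all requested components are available.
time_to_access = max(component_access_times.values())
print(f"Time to access: {time_to_access} days")
```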
Is the Quality Label agnostic to data type, or is it different depending on the type of dataset (tabular, image, text, genomic...)?
It is agnostic to data type, data usage, and data users.