BUSINESS ADMINISTRATION
BUSINESS ANALYTICS
Question
- Data dictionary
- Artificial intelligence
- Data quality management (correct answer)
- Data governance and trust
Detailed explanation 1: Data quality management (DQM) is the pipeline process that checks the data for required values, valid data types, and valid codes.
Detailed explanation 2: Basic data checks: verify that formatting is correct, confirm the data is what you expect, compare values against previous ones, and watch for changes where change is not expected, such as a company's industry classification.
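The row-level checks named above (required values, valid data types, valid codes) can be sketched as a small validation function. This is a minimal illustration, not any particular DQM product's API; the field names and the `VALID_CODES` set are hypothetical.

```python
# Hypothetical row-level DQM checks: required values, valid data types,
# and membership in a valid-code list (all field names are made up).
VALID_CODES = {"NAICS-52", "NAICS-62"}  # hypothetical industry codes

def check_row(row: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    problems = []
    if not row.get("company"):                            # required value present?
        problems.append("missing company")
    if not isinstance(row.get("revenue"), (int, float)):  # valid data type?
        problems.append("revenue is not numeric")
    if row.get("industry_code") not in VALID_CODES:       # valid code?
        problems.append("unknown industry code")
    return problems

row = {"company": "Acme", "revenue": "12x", "industry_code": "NAICS-52"}
print(check_row(row))  # -> ['revenue is not numeric']
```

A real pipeline would run such checks on every incoming record and either reject, flag, or (per the configured rules) auto-correct the failing fields.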
Detailed explanation 3: As the process checks data quality, it attempts to correct problems where possible, provided the system is configured to do so. When deciding whether to correct a data quality problem, the system applies the configured data quality management (DQM) rules.
Detailed explanation 4: Data quality in the ETL layer: we check for differences in row counts (showing data has been incorrectly added or lost), partially loaded datasets (usually indicated by a high null count), and duplicated records.
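The three ETL-layer checks listed above (row-count drift, high null share, duplicates) can be sketched as a batch validation pass. The 10% row-count tolerance and 20% null threshold are illustrative assumptions, not standard values.

```python
def etl_checks(prev_rows, new_rows, null_threshold=0.2):
    """Sketch of ETL-layer data-quality checks over lists of dict records."""
    issues = []
    # 1. Row-count difference: data incorrectly added or lost
    #    (10% tolerance is an assumed, tunable threshold).
    if abs(len(new_rows) - len(prev_rows)) > 0.1 * max(len(prev_rows), 1):
        issues.append("row count changed by more than 10%")
    # 2. Partially loaded dataset: unusually high share of null fields.
    nulls = sum(1 for r in new_rows for v in r.values() if v is None)
    total = sum(len(r) for r in new_rows) or 1
    if nulls / total > null_threshold:
        issues.append("high null count")
    # 3. Duplicated records: the same field/value combination seen twice.
    seen = set()
    for r in new_rows:
        key = tuple(sorted(r.items()))
        if key in seen:
            issues.append("duplicate records")
            break
        seen.add(key)
    return issues

prev = [{"id": 1}, {"id": 2}, {"id": 3}]
new = [{"id": 1}, {"id": 1}]
print(etl_checks(prev, new))  # -> ['row count changed by more than 10%', 'duplicate records']
```

In practice such checks would run after each load and raise alerts rather than print, but the comparisons are the same.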