| Key | Value |
|---|---|
| Table | DS16 Risk Register Tasks |
| Severity | CRITICAL |
| Unique ID | 1160569 |
| Summary | Is this risk task duplicated by risk ID & task ID? |
| Error message | Count of combo of risk_ID & task_ID > 1. |
The Data Integrity and Quality (DIQ) check titled "Duplicate Risk Task" identifies instances where a risk task is duplicated in the DS16 Risk Register Tasks table. A duplicate is defined by the combination of the risk_ID and task_ID fields.
If a specific combination of risk_ID and task_ID appears more than once, the data contains a duplicate risk task. This is not expected: each risk task should appear exactly once within the dataset.
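For illustration only, the rule amounts to a simple grouping query. The sketch below mirrors the logic of the DIQ function shown later on this page; it assumes direct query access to the DS16_risk_register_tasks table.

```sql
-- Illustrative sketch: list each risk_id / task_id combination that occurs
-- more than once within a single upload.
DECLARE @upload_ID int = 0;  -- replace with the upload being checked

SELECT risk_id, task_id, COUNT(*) AS occurrences
FROM DS16_risk_register_tasks
WHERE upload_ID = @upload_ID
GROUP BY risk_id, task_id
HAVING COUNT(*) > 1;
```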
This error is most often caused by a data entry issue in which the same risk task was entered more than once. It can also result from a system error that duplicated records during data import or export.
To resolve the issue, review the data entry process to confirm that each risk task is entered only once, and check the import and export processes for anything that could be duplicating records.
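One way to isolate the surplus rows for review is sketched below. It numbers the copies within each risk_ID / task_ID combination and returns everything beyond the first; it assumes the duplicates are interchangeable and that no particular copy should be kept, and it only selects rows rather than deleting them.

```sql
-- Sketch only: flag the extra copies of each risk_id / task_id combination
-- so they can be reviewed (and removed at the source) before re-submission.
DECLARE @upload_ID int = 0;  -- replace with the upload being checked

WITH Ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY risk_id, task_id
               ORDER BY (SELECT NULL)  -- no preferred copy; adjust if one exists
           ) AS copy_number
    FROM DS16_risk_register_tasks
    WHERE upload_ID = @upload_ID
)
SELECT *
FROM Ranked
WHERE copy_number > 1;
```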
This test is being performed to ensure that there are no duplicate entries in the 'DS16 Risk Register Tasks' table for the combination of risk ID and task ID. Duplicate entries can lead to incorrect analysis and interpretation of data, which can have serious implications for project management decisions.
The severity of this test is marked as 'CRITICAL', which is the highest level of severity. This means that if any duplicate entries are found, they must be corrected before the data can be reviewed. This is crucial because duplicate entries can distort the true picture of risks and tasks in the EVMS construction project, leading to misallocation of resources, incorrect risk assessments, and other potential issues.
This check is therefore essential for maintaining the integrity and quality of the data, ensuring that each risk task is unique and accurately represented in the dataset.
CREATE FUNCTION [dbo].[fnDIQ_DS16_RR_Tasks_PK] (
    @upload_ID int = 0
)
RETURNS TABLE
AS RETURN
(
    -- Collect every risk_id / task_id combination that appears more than once in this upload.
    WITH Dupes AS (
        SELECT risk_id, task_id
        FROM DS16_risk_register_tasks
        WHERE upload_ID = @upload_ID
        GROUP BY risk_id, task_id
        HAVING COUNT(*) > 1
    )
    -- Return the full rows behind those duplicated combinations.
    SELECT
        R.*
    FROM
        DS16_risk_register_tasks R INNER JOIN Dupes D ON R.risk_ID = D.risk_ID
                                                     AND R.task_ID = D.task_ID
    WHERE
        R.upload_ID = @upload_ID
)
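Since this is an inline table-valued function, it can be queried like a table. The call below is an example only; the upload ID value is a placeholder.

```sql
-- Example invocation; 1 is a placeholder upload ID.
SELECT *
FROM dbo.fnDIQ_DS16_RR_Tasks_PK(1);
```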