Key | Value |
---|---|
Table | DS10 CC Log Detail |
Severity | CRITICAL |
Unique ID | 1100478 |
Summary | Is the transaction duplicated by transaction & CC log ID? |
Error message | Count of transaction_ID & CC_log_ID combo > 1. |
The Data Integrity and Quality (DIQ) check "Duplicate CC Log Detail Transaction" identifies instances where the same transaction has been logged more than once in the DS10 CC Log Detail table, i.e., where the combination of transaction_ID and CC_log_ID appears more than once.
If the check returns any rows, the DS10 CC Log Detail table contains duplicate entries. Duplicates typically result from a data entry error, a system glitch, or a process issue that logs the same transaction multiple times.
The fields causing the issue are transaction_ID and CC_log_ID. In a properly functioning system each combination of these two fields is unique, meaning each transaction is logged only once; a repeated combination indicates a duplicate entry. An ad hoc query to surface such combinations is sketched below.
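As a minimal sketch of the uniqueness rule, the query below lists every offending combination together with its row count, mirroring the error message's "Count of transaction_ID & CC_log_ID combo > 1". The table and column names come from the check itself; the upload_ID value is a placeholder.

```sql
-- List each (transaction_ID, CC_log_ID) pair that appears more than once
-- within a single upload. The upload ID value is a placeholder.
SELECT transaction_ID, CC_log_ID, COUNT(*) AS row_count
FROM DS10_CC_log_detail
WHERE upload_ID = 1
GROUP BY transaction_ID, CC_log_ID
HAVING COUNT(*) > 1;
```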
To resolve this issue, review the flagged records and remove the duplicates so that each transaction appears exactly once in the DS10 CC Log Detail table; one cleanup approach is sketched below.
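One way to remove the duplicates, shown here only as a sketch, is to rank the rows within each (transaction_ID, CC_log_ID) pair and delete all but one. With this ORDER BY the surviving row is arbitrary, so review the flagged rows first to confirm which copy should be kept; the upload_ID value is again a placeholder.

```sql
-- Keep one row per (transaction_ID, CC_log_ID) pair and delete the rest.
-- ORDER BY (SELECT NULL) picks an arbitrary survivor; adjust it if one
-- copy is authoritative. The upload ID value is a placeholder.
WITH Ranked AS (
    SELECT ROW_NUMBER() OVER (
               PARTITION BY transaction_ID, CC_log_ID
               ORDER BY (SELECT NULL)
           ) AS rn
    FROM DS10_CC_log_detail
    WHERE upload_ID = 1
)
DELETE FROM Ranked
WHERE rn > 1;
```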
This test checks for duplicate entries in the 'DS10 CC Log Detail' table of the EVMS construction project management data, specifically for instances where the combination of 'transaction_ID' and 'CC_log_ID' appears more than once.
The check matters for the accuracy and reliability of the data: duplicate entries distort analyses and lead to incorrect conclusions. For example, if costs are aggregated from these entries, duplicates artificially inflate the total cost, as illustrated below.
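To illustrate the inflation risk, the comparison below contrasts a raw sum with a sum taken over one row per (transaction_ID, CC_log_ID) pair. The `dollars` column is hypothetical, since the actual cost fields in DS10 CC Log Detail may be named differently.

```sql
-- With duplicates present, the raw sum double-counts; summing one row per
-- (transaction_ID, CC_log_ID) pair shows the difference. The "dollars"
-- column is hypothetical.
SELECT
    (SELECT SUM(dollars) FROM DS10_CC_log_detail) AS raw_total,
    (SELECT SUM(dollars)
     FROM (SELECT transaction_ID, CC_log_ID, MAX(dollars) AS dollars
           FROM DS10_CC_log_detail
           GROUP BY transaction_ID, CC_log_ID) AS one_per_pair) AS deduplicated_total;
```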
The severity of this test is 'CRITICAL', the highest severity level: any duplicates found must be corrected before the data can be reviewed or analyzed further.
```sql
CREATE FUNCTION [dbo].[fnDIQ_DS10_CCLogDetails_PK] (
    @upload_ID int = 0
)
RETURNS TABLE
AS RETURN
(
    -- Find each (transaction_ID, CC_log_ID) combination that appears
    -- more than once within the given upload.
    WITH Dupes AS (
        SELECT transaction_ID, CC_log_ID
        FROM DS10_CC_log_detail
        WHERE upload_ID = @upload_ID
        GROUP BY transaction_ID, CC_log_ID
        HAVING COUNT(*) > 1
    )
    -- Return every row that belongs to a duplicated combination.
    SELECT
        C.*
    FROM
        DS10_CC_log_detail C
        INNER JOIN Dupes D ON C.transaction_ID = D.transaction_ID
                          AND C.CC_log_ID = D.CC_log_ID
    WHERE
        C.upload_ID = @upload_ID
)
```
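To run the check manually, call the function with the upload of interest; the function name and parameter come from the definition above, and the upload ID value is illustrative.

```sql
-- Returns every duplicated DS10 CC Log Detail row for the given upload.
-- The upload ID value is illustrative.
SELECT *
FROM dbo.fnDIQ_DS10_CCLogDetails_PK(1);
```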