Quantexa, a data analytics firm, asked 356 IT and data experts working in the UK, USA and Canada what proportion of customer records in their respective sectors were duplicates. In the banking industry, the average estimate was ten per cent.
However, the true figure could be even higher: 29 per cent of respondents estimated that between 11 and 15 per cent of records were duplicates, while 15 per cent put the figure between 16 and 20 per cent.
When asked about the impact of duplicates, 41 per cent of respondents said they increased exposure to risk.
A third of respondents said duplicates made data reconciliation and remediation time-consuming, while 23 per cent said they made “timely and accurate decisions” harder to reach.
Quantexa’s chief product officer Dan Higgins said: “Duplicated records are a growing reason why banks are questioning the overall data quality available to them.”
“Different iterations of a name, changes in address or multiple phone numbers can all harmlessly create these replications. This becomes a costly waste of resources across data, IT, and business teams, hampering digital transformation efforts,” Higgins continued.
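To see why such variants slip past exact matching, consider a minimal sketch (hypothetical names and logic, not Quantexa’s actual matching approach): normalising and sorting name tokens catches some re-keyed entries, but other iterations of the same name still get through.

```python
import re

def name_key(raw_name: str) -> str:
    """Crude normalisation: lower-case, strip punctuation,
    and sort the tokens so 'Smith, Jon A.' and 'Jon A. Smith' collide."""
    cleaned = re.sub(r"[^a-z ]", " ", raw_name.lower())
    return " ".join(sorted(cleaned.split()))

# Hypothetical entries for one customer, keyed three different ways.
entries = ["Jon A. Smith", "Smith, Jon A.", "Jonathan Smith"]

seen: dict[str, str] = {}
for name in entries:
    key = name_key(name)
    if key in seen:
        print(f"possible duplicate: {name!r} ~ {seen[key]!r}")
    else:
        seen[key] = name

# Output: possible duplicate: 'Smith, Jon A.' ~ 'Jon A. Smith'
# 'Jonathan Smith' still slips through, which is why production
# systems rely on fuzzy matching rather than exact keys.
```

Even this toy example misses one of the three variants, hinting at the scale of the reconciliation work respondents described.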
Only 27 per cent of respondents felt that all divisions across the organisation trusted the accuracy of data to make decisions, although 60 per cent said most divisions did. A further 13 per cent said only a few divisions trusted the data’s accuracy.