According to a recent article in Harvard Business Review, dirty data costs the US alone a staggering $3 trillion. Needless to say, the worldwide figure will be far higher.
Dirty, duplicate data hampers productivity, slows decision making and hurts operations. While weighing these costs, we must also ask why the problem persists. Multiple factors play a part: the lack of a clear data policy, and the difficulty of coordinating the different departments that produce and consume data, among others.
Easy-to-use tools in the hands of business analysts can make a big difference to data quality, reduce reliance on IT and improve efficiency.
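As a minimal sketch of the kind of cleanup such tools automate, the snippet below normalizes records and drops near-duplicate entries that differ only in casing or whitespace. The field values shown (names, emails) are illustrative examples, not data from the article.

```python
def normalize(record):
    """Lowercase and strip whitespace so trivially different duplicates match."""
    return tuple(value.strip().lower() for value in record)

def deduplicate(records):
    """Keep the first occurrence of each normalized record."""
    seen = set()
    unique = []
    for record in records:
        key = normalize(record)
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

# Illustrative customer records: the second row is a duplicate of the
# first with different casing and extra whitespace.
customers = [
    ("Jane Doe", "jane@example.com"),
    ("jane doe ", "JANE@example.com"),
    ("John Smith", "john@example.com"),
]
print(deduplicate(customers))  # 2 unique records remain
```

Real tools do far more (fuzzy matching, survivorship rules, audit trails), but even this simple normalization step is exactly the sort of task an analyst should not need IT to perform.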
Posted on September 26th, 2016