The gradual improvement of processes and human behaviour is traditionally seen as a cycle: plan, act, validate and verify, and finally react and adjust. Because it is not always clear which actions will contribute most, we usually collect as much data as possible. You never know. Sooner rather than later, however, the problem of questionable data quality comes into play.
The more the information is actually used, the quicker errors pop up and the faster the quality is discussed and improved, so that confidence grows gradually. Sources that are used less often, and by fewer knowledge workers, have a much harder time being taken seriously and sometimes end up in a downward spiral. Recognising this downward spiral and starting an analytical approach to data quality can help (re-)establish confidence in the underlying data, distinguish real data errors from interpretation errors, and move things forward.
Since information systems are living systems, they will never be free of errors. Learning how to read them and how to cope with them is the way forward. Analytics is needed to distinguish genuine errors from differences in interpretation very early in the process. Hence, there is no need to wait for perfect data before starting with analytics.
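Such an analytical approach to data quality can start small. The sketch below is a minimal, illustrative profile in Python, assuming a tabular extract loaded with pandas and hypothetical columns such as order_id, order_date and amount (none of these come from the text); it merely flags rows that look like genuine data errors, such as missing keys, duplicates and impossible values, so that the remaining discussion can focus on interpretation rather than trust in the data.

# Minimal data-quality profiling sketch (illustrative only).
# Assumes a tabular extract with hypothetical columns "order_id",
# "order_date" and "amount"; adapt the checks to your own source.
import pandas as pd

def profile_quality(df: pd.DataFrame) -> pd.DataFrame:
    """Return a small report of likely data errors per check."""
    checks = {
        "missing_order_id": df["order_id"].isna().sum(),
        "duplicate_order_id": df["order_id"].duplicated().sum(),
        "missing_amount": df["amount"].isna().sum(),
        "negative_amount": (df["amount"] < 0).sum(),
        "future_order_date": (
            pd.to_datetime(df["order_date"], errors="coerce") > pd.Timestamp.today()
        ).sum(),
    }
    report = pd.DataFrame({"suspect_rows": pd.Series(checks), "total_rows": len(df)})
    report["suspect_share"] = report["suspect_rows"] / report["total_rows"]
    return report

if __name__ == "__main__":
    # Tiny made-up sample with a duplicate key, a missing key,
    # a negative amount, a future date and an unparsable date.
    data = pd.DataFrame(
        {
            "order_id": [1, 2, 2, None, 5],
            "order_date": ["2024-01-03", "2024-02-29", "2030-01-01", "2024-03-15", "bad"],
            "amount": [100.0, -5.0, 42.5, None, 18.0],
        }
    )
    print(profile_quality(data))

Running such a profile regularly, even on imperfect data, surfaces the real errors early and keeps the conversation about the numbers analytical rather than anecdotal.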
Getting started with analytical decision-making right away…
Is good enough really good enough…?
Changing a wheel on a moving train…