Objective review and data quality goals of data models
Have you ever asked yourself what score your data model would achieve? Could you imagine 90%, 95% or even 100% across ten categories of objective criteria?
No?
Yes?
Either way, whether you answered "no" or "yes", I recommend using a method to test the quality of your data model(s). For years there have been methods to test and ensure quality in software development, such as ISTQB, IEEE, RUP, ITIL, COBIT and many more. In data warehouse projects I have observed test methods covering everything: loading processes (ETL), data quality, organizational processes, security, …
But data models? Never! Why is that?
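To make the idea of a percentage score concrete, here is a minimal, hypothetical sketch of how such a review could be computed. The category names, weights, and ratings are purely illustrative assumptions of mine, not an established scoring standard:

```python
# Hypothetical sketch: scoring a data model across weighted quality
# categories. Categories, weights, and ratings are illustrative
# assumptions, not an established method.

CATEGORIES = {
    # category: weight (weights sum to 1.0)
    "naming conventions": 0.10,
    "normalization / modeling style": 0.15,
    "key and relationship integrity": 0.15,
    "data type consistency": 0.10,
    "documentation": 0.10,
    "historization support": 0.10,
    "extensibility": 0.10,
    "source traceability": 0.10,
    "business alignment": 0.05,
    "security / access design": 0.05,
}

def model_score(ratings: dict[str, float]) -> float:
    """Weighted average of per-category ratings (each 0..100)."""
    return sum(CATEGORIES[c] * ratings[c] for c in CATEGORIES)

# Example: a model rated 80 in every category scores 80%.
ratings = {c: 80.0 for c in CATEGORIES}
print(round(model_score(ratings), 1))  # 80.0
```

The point is less the arithmetic than the discipline: once categories and weights are written down, a data model review becomes repeatable and comparable, just like the established test methods above.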
Message: Thank you for signing The Data Doctrine!
What a fantastic moment. I’ve just signed The Data Doctrine. What is The Data Doctrine? In a philosophy similar to the Agile Manifesto, it offers us data geeks a data-centric culture:
Value Data Programmes1 Preceding Software Projects
Value Stable Data Structures Preceding Stable Code
Value Shared Data Preceding Completed Software
Value Reusable Data Preceding Reusable Code
While reading The Data Doctrine, I found myself looking back at all the lost options and opportunities in data warehouse projects caused by companies, project teams, or even individuals ignoring the value of data and incurring the consequences. I saw it in data warehouse projects struggling with the lack of stable data structures, in source systems as well as in the data warehouse itself. I saw it in fancy new systems where no one cares which data was generated, what it contains, and how it came to be. And, even worse for a data warehouse project, I saw the practice of keeping data locked away, with access limited to a few departmental castles and their principalities.
None of this is the way to get value out of corporate data and to leverage it for value creation.
As I advocate flexible, lean, and easily extendable data warehouse principles and practices, I support the idea of The Data Doctrine to evolve the understanding of the need for data architecture as well as for data-centric principles.
So long,
Dirk
1 To emphasize the point, we (the authors of The Data Doctrine) use the British spelling of “programme” to reinforce the difference between a data programme, which is a set of structured activities, and a software program, which is a set of instructions that tell a computer what to do (Wikipedia, 2016).
In April 2013 I was once again at the Matter programme, Data Vault Architecture, in the Netherlands, where I had the pleasure of meeting Tom Breur.
During a lively discussion about the automation of data warehousing with Data Vault and the suitability of project methods for it, Tom invited me and Oliver Cramer to visit one of his customers: BinckBank.
Tom Breur: “The best Agile BI shops I have ever seen.”
On September 24, 2013, the time had come. Together with Tom, we visited BinckBank in Amsterdam and had a look at the Agile Data Warehouse, which was built with Data Vault. We met with the BICC team to talk about its history, the implementation, the challenges, and the success factors.