One more time
"Many data warehouses are launched with much fanfare and promise but quickly fail to live up to expectations"
and indeed multiple studies have shown high failure rates for data warehouse projects. I was at a Gartner conference earlier this week when an analyst stated that "the vast majority" of business intelligence initiatives fail to deliver tangible value. Yet, as a wise colleague of mine often says:
"There is never time to do it right, but always time to do it again"
By this he means that data warehouse projects cut corners and make simplifying assumptions in their design about how the business works. It is much harder to make the design truly robust to business change, and yet this inability to deal properly with major business change is what eventually leads to problems for most data warehouses. A reorganization occurs, and it takes three months to redesign the star schema, fix up the load routines, modify the data mart production process, test all this, and so on. In the meantime the business is getting no up-to-date information. What do they do? They knock up a few spreadsheets, or perhaps something quick in MS Access, "just for now".

Then another change happens two months later: the company buys another company, which of course has different product codes, customer segmentation and cost allocation rules from the parent. Putting this new data into the warehouse is added to the task list of the data warehouse team, who have yet to finish adapting to the earlier reorganization. The business users need to see the whole business picture right now, so they extend their "temporary" spreadsheet or MS Access systems a little further. Since they control these, they start to do more with them, and after a time it hardly seems that the data warehouse is really necessary any more. Of course they let the IT people get on with it (it is not their budget, after all), but usage declines, and they give up telling the data warehouse team about the next major new requirement, as they never seem to see results in time anyway.

Eventually the data warehouse falls into disuse. Then a new manager comes in, finds the spreadsheet and MS Access mess unmanageable, and a new budget is found to have another go, either from scratch or by rewriting the old warehouse. And so the cycle begins again.
Sound familiar? The overriding issue is the need to reflect business change in the warehouse quickly, in time for the business customers to make use of it, and before they start reverting to skunkworks spreadsheets and side solutions that they can get a contractor to knock up quickly.
Until the industry starts adopting more robust, high-quality modeling and design approaches, such as those based on generic modeling, this tale will repeat itself time and time again. The average data warehouse has annual maintenance costs of around 72% of its build cost, i.e. if it costs USD 3M to build, it will cost over USD 2M to maintain, every year. This is an unsustainable figure. Still, there will always be a new financial year, and new project budgets to start again from scratch...
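To make that arithmetic concrete, here is a minimal sketch, assuming the USD 3M build cost and 72% annual maintenance ratio quoted above (illustrative numbers only, not data from any particular project):

```python
# Cumulative spend on a data warehouse, assuming the 72% annual
# maintenance figure quoted above (illustrative numbers only).
build_cost = 3_000_000           # one-off build cost in USD
maintenance_ratio = 0.72         # annual maintenance as a fraction of build cost

annual_maintenance = build_cost * maintenance_ratio  # roughly USD 2.16M per year

for year in range(1, 6):
    total = build_cost + annual_maintenance * year
    print(f"Year {year}: total spend USD {total / 1e6:.2f}M")
```

After five years, maintenance alone has cost more than three times the original build, which is why the figure is so hard to sustain.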