A survey sponsored by Oracle (http://www.computing.co.uk/itweek/news/2153695/surveys-show-bi-failing-s) hits a new low in terms of insight. The classic line is: "In Oracle's survey of 200 UK and Irish IT managers, over half of organisations said they did not have any BI systems, though 69 percent of respondents said BI was important to help senior managers run their business." Apart from the apparent conclusion that over 20% of the respondents struggle to keep two ideas in their head for more than five minutes, the notion that half of the UK's companies lack a single BI tool is pretty absurd. We have had well over a decade of Business Objects, Cognos and others pushing BI tools, and even before that there were tools like Focus and Nomad. You would have to be recently returned from the moon not to have encountered a BI software salesman as a UK IT manager.
I do wonder sometimes about the accuracy of some of these surveys. I recall, years ago at a Gartner conference, being handed a thick survey that demanded all kinds of detail: IT budget breakdown, future spending trends by area and so on. You needed to return the completed survey to have a chance of winning a prize, and I remember asking the man next to me, who had just finished his: "How on earth do you remember all of that budget info for your organisation?" The reply: "Are you kidding? I just made it up, but I really want that prize." Many surveys use incentives to get people to fill them in, and I wonder just how accurate the resulting data really is in many of them.
Separately, a more plausible insight appears in a different survey: "Meanwhile, a survey of 1,000 UK business managers at companies with over 250 staff, published by ICS, indicates a widespread need for better BI systems. The study found that over three quarters of respondents were forced to make decisions 'blind' due to late or insufficient business information." This, by contrast, is entirely believable, though not for the reason the article gives. The critical issue is that you can have as many pretty reporting tools and dashboards as you like, but those tools need accurate and timely information, which normally comes from a data warehouse (unless you are one of the few brave souls using EII). The problem is that most data warehouses are simply unable to keep up with the pace of business change (reorganisations, acquisitions and so on) and so are constantly out of date. Consider a data warehouse with just ten source systems. A major change in one of those sources will impact the warehouse schema, and fixing the schema, the load routines and the affected reports may take three months (a pretty typical figure in my experience at Shell).
A major change of this type does not happen every day, but it is almost certain to happen once a year to each of these source systems, maybe twice. That means at least ten separate sets of changes to the warehouse every year, each taking three months to absorb. Even assuming the changes are neatly spread over the year and you have enough programming resources to work on several in parallel, you still have 15 months of change to fit into 12 months: the warehouse can never catch up. You may well have more than ten sources feeding your data warehouse, in which case the problem is even worse. This is indeed what happens in reality: the data warehouse is usually out of date, so armies of Excel jockeys in finance get the answers via email and manually number-crunch anything really critical, while the warehouse lumbers on with stale information. This situation is not the fault of the BI tools; it is the fault of the data warehouses that feed them. Until companies admit that the status quo is failing and start abandoning custom-built warehouses, this problem will persist. It is like treating alcoholism: the first step is admitting that there is a problem.
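The back-of-envelope arithmetic above can be sketched in a few lines of Python. The parameter values are illustrative, and the `parallel_teams=2` assumption is mine (one way to reconcile "ten changes at three months each" with the 15-month figure); the point is only that the yearly backlog exceeds the 12 months available.

```python
# Rough model of the warehouse change backlog described above.
# All parameter values are illustrative; parallel_teams=2 is my own
# assumption to reproduce the 15-month figure, not something the text states.

def yearly_change_backlog(sources=10, major_changes_per_source=1,
                          months_per_change=3, parallel_teams=2):
    """Elapsed months of warehouse rework generated per calendar year."""
    total_effort = sources * major_changes_per_source * months_per_change
    return total_effort / parallel_teams

backlog = yearly_change_backlog()
print(f"{backlog} months of change per 12-month year")  # 15.0 with these assumptions
```

Plug in twenty sources, or two major changes per source per year, and the backlog doubles; no plausible setting of the parameters gets it under 12.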