Tuesday, July 18, 2006

Some rare common sense

Ad Stam and Mark Allin have written an excellent piece in DM Review this month covering data stewardship and master data management. They correctly point out that, with regard to business intelligence systems, “change will occur, and time to react will decrease”, and they lay out a sensible architecture for dealing with this issue. I very much like the emphasis they put on having a business unit deal with data governance as a key building block. In the article they explain the key requirements of such a group and draw an interesting analogy with logistics, which these days is usually outsourced to a separate unit or even a separate company. Similarly, they believe that master data should be managed by a non-IT business unit.

The article also correctly distinguishes between the “golden copy” data held in the data warehouse and a master data management repository, which in addition holds master data in all its stages. The master data repository should be linked to the data warehouse, but the two are not the same physical entity, since the master data repository has to handle “unclean” data whereas the data warehouse should store only fully validated data.
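To make the distinction concrete, here is a minimal sketch in Python (my own illustration, not anything from the article) of master data carrying lifecycle stages in an MDM repository, with only the fully validated “golden copy” records flowing on to the warehouse. The class and stage names are assumptions, purely for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    # The MDM repository holds master data in all its stages...
    DRAFT = "draft"
    PROPOSED = "proposed"
    APPROVED = "approved"   # ...but only this "golden copy" stage
    RETIRED = "retired"     # should ever reach the warehouse.

@dataclass
class MasterRecord:
    key: str
    attributes: dict = field(default_factory=dict)
    stage: Stage = Stage.DRAFT

def warehouse_feed(repository: list[MasterRecord]) -> list[MasterRecord]:
    """The warehouse sees only fully validated records; the repository
    keeps everything, clean or unclean."""
    return [r for r in repository if r.stage is Stage.APPROVED]

repository = [
    MasterRecord("PROD-001", {"name": "Widget"}, Stage.APPROVED),
    MasterRecord("PROD-002", {"name": "Wdgt??"}, Stage.DRAFT),  # unclean
]
print(warehouse_feed(repository))  # only PROD-001 is published
```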

It is a pleasant change to read such a sensible article on best practice in data management, but this is because Ad and Mark are real practitioners on serious enterprise-wide projects through their work at Atos Origin, e.g. at customers like Philips. They are not people who spend their lives giving slick PowerPoint presentations at conferences; they are close to the action in real-world implementations. I worry that there are too many people on the conference circuit who are eloquent speakers but haven’t actually seen a live project for a long time. I have known Ad Stam for many years and can testify that his team at Atos are an extremely competent and experienced set of practitioners who focus on customer delivery rather than self-publicity. If you have a data warehouse or MDM project then you could do a lot worse than use Ad’s team.

Monday, July 17, 2006

Vive la France

For some time I have been involved with an EU project that wrapped up last week in Brussels. Under the unpromising name Sun&Sup, it tried to identify the issues that hold back hi-tech start-ups in Europe and to make recommendations that could improve the current situation. The project invited periodic input from selected hi-tech start-up companies across the EU (along with various service providers to start-ups), and I represented the UK on this project.

Make no mistake, there is a problem: once you get beyond SAP, Business Objects and Sage you will be hard pressed to name a large European software company. Israel has done a better job than the combined resources of Europe, with companies like Check Point Software, Amdocs, Mercury Interactive and many others. Israel ranks second in the world for VC investment and, even in absolute terms, has the second-highest number of start-ups after the USA, yet it has a population of just over six million. There are many reasons for Europe’s hi-tech malaise, and few easy answers. The Sun&Sup project tried to deliver some very low-key, pragmatic services in pilot form: a self-help network of companies wishing to expand across borders, an expert system to help companies assess their business plans, and a mentoring program to provide non-executive directors for start-ups, amongst others. Its most ambitious recommendation was to lobby for replicating the US system of government procurement, which sets aside USD 100 billion of government spending for small companies. European government procurement favours large companies: 50% of economic activity in Europe comes from SMEs, yet only 30% of government spending goes to SMEs. Of course opening up more government business to SMEs would not be a panacea, but it would help, as the successful federal Small Business Act has demonstrated for many years.

The highlight of the wrap-up session in Brussels was hearing the French trade minister Christine Lagarde make an eloquent case for the need for change in public procurement. It was refreshing to an Anglo-Saxon ear to hear a small-business initiative being championed by a French minister. Ms Lagarde was an extremely impressive speaker, yet clearly faced entrenched opposition from the Commission, and indeed from several member countries, in trying to open up public procurement. Indeed, from the way that several of the modest Sun&Sup initiatives ended up being buried or transferred to other EU projects, it seems clear that the lack of hi-tech competitiveness in Europe will remain the subject of much hand-wringing for a long time to come.

Thursday, July 13, 2006

SeeWhy real-time event monitoring makes sense...

Given the consolidation in the business intelligence sector, and the recent share price dips even of leaders Cognos and Business Objects, you might wonder why anyone would bring out a new BI product. Certainly there is no shortage of reporting tools, data mining has yet to break out of its statistician niche, and visualisation tools have again failed to become a mass market. However, one area that does make sense for a new entrant to focus on is real-time event monitoring, which today is typically addressed (poorly) by the vendors of major applications.

SeeWhy Software is a UK start-up which has managed to get over the key hurdle of signing up high-class initial customers such as Diageo. It is run by Charles Nichols, previously an executive at Business Objects; Charles is a smart guy who understands the space well. The software pulls data out of real-time message queues, enabling alerts to be generated, e.g. on supply chain data in the case of Diageo. The company should continue to focus on this niche in my view, and avoid trying to be "all things to all men". For example, it would be natural to extend its capability to data mining in order to spot anomalies or trends, but it would be wise to partner with existing data mining tools to do this. Similarly, if they start to build up repository capabilities and look at trends in their customer data, they should avoid trying to compete with general-purpose data warehouse technology, or they risk undermining their message of "real-time" analysis. I have written elsewhere about how EII vendors struggle when they try to position themselves as general-purpose business intelligence tools, since fundamental issues like data quality get in the way if you do not have a persistent store of data such as a data warehouse. This has led to pioneer EII vendor Metamatrix stalling in the market, with virtually no growth in revenues last year. By concentrating on drawing data from real-time message queues, marketing to that niche, and partnering selectively in other areas, SeeWhy should be able to prosper in an apparently crowded market.
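As a rough sketch of the pattern (mine, not SeeWhy's actual implementation; the queue, event fields and threshold are all illustrative assumptions), the point is that each message is evaluated against a rule the moment it arrives, rather than being loaded into a store and queried after the fact:

```python
import queue

# Stand-in for a real-time message queue (in practice something like
# JMS or MQ feeding supply chain events).
events = queue.Queue()
for qty in (120, 95, 30, 110):
    events.put({"sku": "SKU-42", "stock_level": qty})

REORDER_POINT = 50  # hypothetical supply chain rule

def monitor(q: queue.Queue) -> None:
    """Evaluate each event as it arrives and alert immediately,
    instead of waiting for a periodic batch load into a warehouse."""
    while not q.empty():
        event = q.get()
        if event["stock_level"] < REORDER_POINT:
            print(f"ALERT: {event['sku']} stock down to {event['stock_level']}")

monitor(events)  # fires once, for the reading of 30
```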

Wednesday, July 12, 2006

Managing software risk

An interesting, common-sense debate was featured on Silicon.com last week. A panel of CIOs was asked whether they felt comfortable buying from small suppliers or whether they preferred dealing with the big players. There was a surprising degree of consensus that, in general, CIOs felt OK about small suppliers; indeed, in some cases they actively preferred them. Perhaps this is a sign of a steady recovery in economic confidence, with CIOs coming out of the shells into which they retreated in 2001 and 2002. As I have written before, buying software from small suppliers carries risks, but this is true of big suppliers too. Just because a giant company may not go bust does not stop it dropping products for any number of reasons, as I can testify from personal experience.

The one element of the article that did make me smile was the assumption that code escrow is a form of insurance against a small vendor folding. Code escrow arrangements have indeed become quite standard in contracts, and generate modest fees for the organisations that provide the service. I hate to disillusion those CIOs, but code escrow is not the panacea it may seem. Sure, you get the source code, but then what? Firstly, you have to hope that the vendor has been diligent about keeping the escrow up to date with the version of the software that you are actually using. But more to the point, the raw code itself is of limited use without the design specifications that go along with it (at least assuming you actually want to continue developing it). Even if you are looking at basic support only, how well documented is the code? I had the misfortune to try to execute an escrow contract once when I was working at Esso. The tape of source code duly turned up, and it was 3 million lines of undocumented assembler code. While my colleague (an expert in assembler) got a misty gleam in his eye as he could see a job for life coming up, we concluded that we simply couldn’t justify taking this on, and opted for a complete replacement instead. So, if you are insisting on source code escrow from your vendor, be aware of the pitfalls and ask some searching questions about documentation.

Friday, July 07, 2006

What's next for ERP?

The ERP landscape became simpler this week, when SSA was swallowed up by a private equity company called Golden Gate Capital. This group (and its subsidiary Extensity) has now absorbed Baan, Comshare, Dun & Bradstreet Software, Epiphany, Infinium, Marcam and Systems Union. As Ventana points out, this means that the choice boils down to SAP, Oracle, Microsoft and this new amalgam under Golden Gate/Extensity. It is interesting that this private equity group seems to be performing the role CA used to play: hoover up under-performing companies, slim them down and milk the support revenue stream. The article implies that this is pretty much the endgame for the ERP space, but I am not so sure. The one dimension missing here is the hosted model.

Salesforce.com showed what could be done with a hosted software model. In the ERP world we are seeing new entrants like Intacct and Ataoi which, while small so far, are making solid inroads into their chosen markets. At present this approach may appeal more to SMEs, but remember that salesforce.com started this way as well, only later taking on Siebel more directly. I know the CEO of one of these emerging hosted ERP vendors, who was amused to be in a competitive bid with SAP at one prospect: his company’s bid was less than one-tenth that of the behemoth. I am not suggesting that these hosted ERP systems compare in functionality with SAP and Oracle, but perhaps that, after all, is the point. Traditional ERP systems have become so bloated that large parts of them remain unused, and having systems hosted avoids all the environmental installation problems that ensue with traditionally installed software, where there are so many combinations of operating system, DBMS, transaction monitor etc. that the vendors have to spend as much time testing combinations of software as actually writing new functionality.

It may take some time but I think the next change in the ERP market will be via this hosted model. When you start to see defensive market offerings by the giant vendors, just as Siebel started an on-demand offering (but too late), then we will know that this prediction has been fulfilled.

Thursday, July 06, 2006

Don’t prototype: iterate

Stephen Swoyer makes a case for using EII to prototype data warehouses in his recent Enterprise Systems article. As the article reflects, there are dangers as well as benefits here, e.g. the prototype may simply be extended and a proper data warehouse never built. This is a problem because, as I have argued elsewhere, EII is suitable for only a small subset of business intelligence needs. The valid point, however, is that business users do want to see prototypes, especially in an area like business intelligence where the requirements tend to be fluid and ill-defined. But there is an alternative to buying an EII tool, knocking up a prototype and then building the proper warehouse.

These days you do not have to build a data warehouse, since you can buy one. Packaged solutions can be deployed much more rapidly than data warehouses that have to be designed and built by hand, and if they are flexible enough then an iterative approach to the warehouse can be taken. A great example of this was at Owens Corning, which deployed a data warehouse package over a 90-day period, using weekly builds of the business model. Each week a new piece of the business problem was tackled, the business model was updated in the package and the results were presented back to the users. Issues would arise, feedback would be taken, and the next week the model would be refined and a new business area started. This highly iterative approach ensured that the business users were fully engaged with the project, and could see a visible improvement in what they were going to get week by week.

After a few weeks the problems became less technical and functional and more business-related, e.g. issues of data quality and how certain business model questions were to be resolved. After 90 days the application was delivered, and this was no prototype: it was a fully working, deployed production warehouse. The insights this application gave saved Owens Corning a great deal of money, so much so that the project had a three-week payback period. Indeed, the project became an Infoworld 100 winner.

Data warehouse project leaders need to rid themselves of the notion that data warehouses have to be long, custom-built projects. At present TDWI reckons they take 16 months to deliver on average. This is far too long: the traditional waterfall methodology is the culprit here, and a more iterative approach is needed. But why build a throwaway prototype when you can implement the real thing via a data warehouse package?

Monday, July 03, 2006

The long and winding data quality road

As Rick Sherman quite rightly points out in his common-sense article in DM Review, data quality is not something that you can realistically fix before you build a data warehouse. Data quality in operational systems is often scarily low, and frequently the only way this will be highlighted is when the data is brought together in a data warehouse. People tend to assume that the data inside their ERP systems is somehow sacrosanct and immune to data quality issues, and it often comes as a big disappointment when they discover that this is not so.

In one example it turned out that a product was mis-priced in the SAP system in one region, resulting in the product being sold at cost price. This anomaly went undetected for over a year before a data warehouse project brought it to the surface (by comparing gross margin by product by country). Initially everyone assumed it was a bug in the warehouse software, but it was not. Indeed, this insight alone pretty much paid for the project.
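For the curious, the check is simple enough to sketch: compare margins across regions and flag the outliers. The snippet below is my own illustration of the idea using pandas; the column names, figures and tolerance are invented for the example.

```python
import pandas as pd

# Invented transaction data, purely for illustration.
sales = pd.DataFrame({
    "product": ["A", "A", "A", "B"],
    "country": ["UK", "DE", "FR", "UK"],
    "revenue": [100.0, 98.0, 60.0, 200.0],
    "cost":    [60.0, 60.0, 60.0, 120.0],
})

# Gross margin by product by country -- the comparison that exposed
# the mis-priced product (margin near zero in one region).
margins = sales.assign(margin=(sales.revenue - sales.cost) / sales.revenue)
by_region = margins.groupby(["product", "country"])["margin"].mean()

# Flag any product/country combination well below that product's norm.
product_avg = by_region.groupby(level="product").transform("mean")
print(by_region[by_region < product_avg - 0.15])  # product A in FR: 0.0
```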

If a data warehouse can be implemented rapidly and in an iterative fashion then it can quickly highlight business issues such as this one, which may stem from poor data quality or from new insight that was previously unavailable: the "wood for the trees" problem. Eventually data quality needs to come out of the closet and be treated as a serious business issue, dealt with by a corporate business function that has the political clout to fix the problems at source. Some progressive companies have set up such organisations, which may report into finance or another corporate function, but never into the CIO.

However, to show just how long a journey data quality can be, I can recall working with such a function at Esso UK in the mid-1980s. The issue is only now dawning on many companies, and has yet to surface in most.