Sunday 15 November 2009

The Journey from BI to Corporate Performance Management: Part 1.

In this series of posts, I will examine how organisations are making the journey from Business Intelligence to Corporate Performance Management. As this is the first in the series, let's examine the obvious question: 'what is the difference between BI and CPM?'

During my time with a market and thought leader in this field, Cognos, there were a number of views which give us a useful way of examining this question:

Firstly, that BI and CPM are the same thing. I have already given away my own view on this with the title of this post. Let's start with what BI is. A reasonable definition is that BI is the use of information for the purpose of decision making. It is distinct and different from 'reporting', which has an operational focus. BI systems are used by managers across the enterprise to drive informed and accurate business decisions. Strategic BI, using scorecards and dashboards for example, is used by the senior management team as a strategic decision making tool and comes close to the definition of CPM. However, not close enough.

Secondly, that Enterprise Planning and CPM are the same thing. This is a view held by some Planning, Budgeting and Forecasting (PB&F) specialists who think of BI as 'just reporting'. Understandably, they hold the view that driving organisational behaviour through the planning process, by making it more frequent than the traditional annual 'spreadfest', is at the heart of CPM. If we also consider that PB&F systems do make it easy for profit and cost centre managers up and down the organisation to take an active part in the process, then we are certainly getting close to CPM. They are not the same thing though.

Thirdly, that BI and Enterprise Planning combined are the same as CPM. An organisation that implements systems for measuring performance, for adapting the plan and for managing the impact on forecasted revenues is clearly looking to both measure and change performance. Implementing the systems to report, analyse, plan and monitor is certainly critical to CPM. However, BI plus PB&F is still not CPM.

Finally, the view that I subscribe to is that CPM involves the systems (BI and PB&F) but also the business processes and strategic approach to performance management as characterised by methodologies such as Economic Value Added (EVA) or the Balanced Scorecard (BSC). An organisation that implements the systems, adapts the strategic decision making process and changes business processes is truly executing on the CPM vision.

CPM is often described as a 'journey'. It is a fundamental change to the way in which many organisations manage themselves today, so there are no magic bullets, applications or software features. CPM does not come 'out of a box'. Instead, the change may begin with a series of BI implementations that cause an organisation to challenge 'established wisdom' and manage with a much deeper organisational insight. Alternatively, it may begin with a corporate initiative to adopt the Balanced Scorecard, aligning, at the highest level, strategic direction with organisational activity. What is interesting to me is that many such initiatives are described as failures because they 'didn't make CPM happen'. Of course, many of these are poorly executed implementations, but occasionally they are, in themselves, as successful as they can be. Take a step back, look at them again and they may well be a perfectly reasonable step forwards on a CPM journey that is rarely short or simple but often worthwhile.

In the next post I will describe some of the steps that might be made on the CPM Journey wherever you are today.

Wednesday 23 September 2009

Data Warehouse Design, A Two or Three Tier Approach?

A recent LinkedIn discussion on data warehouse design approaches has reassured me that, after a decade of running and managing BI projects, there is still healthy and lively debate on what the 'right' approach is. This particular discussion is focused on whether there should be two 'tiers' or three, with a tier representing a transformation step and (usually) a persisted data schema.

A two-tier approach comprises a loading or staging tier which is transformed into a second, dimensional tier. The three-tier approach includes an interim step: a relational, integrated schema.

One of the common (and understandable) concerns is that three tiers will increase development times. The logic is seemingly sound. It *must* take longer to build three things than it does to build two things. However, in reality it doesn't!

The reason it doesn't is simple. Whilst there are many successful implementations using only two tiers, there are always three critical transformations to make, three essential data problems to 'fix'.

These are:

1. Load. This is the job of extracting data from source databases quickly and without disrupting source applications.


2. Integrate. Critically, this is integrating and representing the loaded data into a single, consistent (relational) format.


3. Present. This is simplifying the enterprise model into a dimensional schema that is highly performant and is easier for end users to navigate.

The choice, then, is to do this in three simpler steps or in two more complex ones (actually one simple load step and one *really* complex integrate-and-present step).
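
To make the three transformations concrete, here is a minimal sketch in Python using pandas. The source files, table names and columns are hypothetical, and a real implementation would persist each tier in a database rather than in memory; the point is simply that load, integrate and present are separate, simpler steps.

```python
# Hypothetical three-tier flow; file, table and column names are illustrative.
import pandas as pd

# 1. Load: extract source data as-is, quickly and without transformation,
#    so the impact on the operational applications is minimal.
staging_orders = pd.read_csv("extracts/source_orders.csv")
staging_customers = pd.read_csv("extracts/crm_customers.csv")

# 2. Integrate: conform the loaded data into a single, consistent
#    relational representation (consistent keys, one record per customer).
customers = (
    staging_customers
    .rename(columns={"cust_ref": "customer_id", "cust_name": "customer_name"})
    .drop_duplicates(subset="customer_id")
)
orders = staging_orders.rename(columns={"cust_ref": "customer_id"})

# 3. Present: simplify into a dimensional schema (a dimension and a fact)
#    that performs well and is easy for end users to navigate.
dim_customer = customers[["customer_id", "customer_name", "region"]]
fact_sales = (
    orders.merge(dim_customer[["customer_id"]], on="customer_id")
          .groupby(["customer_id", "order_date"], as_index=False)["order_value"]
          .sum()
)
```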

Finally, a three-tier approach has consistently proven to be easier to maintain and support when the inevitable change requests start to come through. In fact, a three-tier approach implemented correctly can prove to be easier, and therefore more cost-effective, than a two-tier approach over the total life of an application.

Thursday 30 July 2009

BI Application Design – The Missing Step

If Amazon applied typical BI application design techniques to their web site the user experience might be very different. For example, I might be browsing the DVDs for a gift and need a little inspiration so I hit the 'reports' tab. Here I get presented with a long list of reports, one of which is 'Top n Products'. I then get prompted with a pick list of products where I select 'DVD' and finally I select '10' to get the Top 10. The list is pretty interesting, but there is nothing that grabs me so I decide to look at the next 10. I re-run the report, selecting 20 instead of 10. This time, I spy the ideal box set and go back to the main site to make my purchase. I am sure you get the picture by now. It sounds awful, clunky and not at all like the actual Amazon experience.

BI reports can sometimes get designed with little or no understanding of the decisions they support. Actually, more often than not, the requirement is communicated as a report layout and the underlying business need is at best inferred or at worst lost in a fixed specification of rows, columns, filters, sorting and grouping. Without an understanding of the audience for the report or how the information is used, the report layout communicates a general requirement in a one-size-fits-all report.

A BI specialist on a project that we are currently assisting with recently demonstrated how it should really be done.

The requirement is for a team of internal sales reps, each targeted with a number of customers to call each month. Our BI specialist was asked to provide a report which compared actual calls made with target. Rather than creating one multi-purpose report, he created three specific solutions: one for the rep, one for the sales managers and one for the senior management team. The report for the rep contains the number of calls they have made, their target and how many calls they need to make today, this week and for the remainder of the month. It is run daily. The rep can plan their daily, weekly and monthly activity based on this information. The report for the sales managers is ranked so that they can manage the individuals accordingly. This report is weekly, reflecting the frequency with which they review and action the performance of their teams. The senior management team report is monthly and is a team summary for monthly and quarterly sales performance meetings with the sales managers.
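
By way of illustration, the figures behind the rep's daily report boil down to a few simple calculations. This is only a sketch in Python with made-up numbers; the actual report was, of course, driven from the warehouse rather than hard-coded values.

```python
# Illustrative numbers only; in practice these come from the call and target data.
import math

monthly_target = 120        # calls the rep is targeted to make this month
calls_made = 70             # calls recorded so far this month
working_days_left = 8       # working days remaining in the month
working_days_this_week = 3  # working days remaining in the current week

calls_remaining = max(monthly_target - calls_made, 0)
calls_needed_today = math.ceil(calls_remaining / working_days_left)
calls_needed_this_week = min(calls_needed_today * working_days_this_week, calls_remaining)

print(f"Remaining this month: {calls_remaining}")
print(f"Needed today: {calls_needed_today}")
print(f"Needed this week: {calls_needed_this_week}")
```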

Of course, the Amazon comparison is not completely fair. Whilst it is using highly summarised information for the purpose of decision making, the decisions are discrete and simple. Which computer game should I buy? What computer games did others that bought this one also buy? It is also an application that has a single user type – customer. Not only that, there are millions of users, which makes design effort not only cost-effective but critical to generating revenue.

However, some of the design principles absolutely apply:

  1. Understand the decision (which reps do I need to coach to make more calls?)
  2. Understand the information required to make the decision (how many calls they have made, what is their target and what is the variance)
  3. Identify the actions that can be made as a result of decision making (reps may be incented, coached or disciplined)
  4. Link the information as closely as possible to the business process it supports (sales managers review activity levels every Friday morning)

Finally, this doesn't mean there is not a case for analysis. BI supports managers and not all decision making is predictable and routine. The business environment is continuously changing and each month or quarter will throw up new and interesting business challenges to solve. These can be subtle variants of historical business challenges or completely new. This is where decision makers need the flexibility to explore and analyse trends, exceptions and patterns to validate what is happening and determine the actions that they will take next.

The answer, as is often the case, lies in a combination of providing BI reports that support decision makers closest to the point at which they need to make the decision along with the flexibility we have come to expect from good OLAP tools.

Wednesday 1 July 2009

Integrating Business Intelligence and Planning

I was prompted to post on this subject by a LinkedIn question: what place do your Cognos or Hyperion Planning applications have in your Enterprise Data Warehouse?

The major vendors, including IBM (Cognos), SAP (BOBJ) and Oracle (Hyperion), have both Planning & Forecasting and BI solutions in their vast product portfolios. The questioner was focused on understanding what the challenges are in integrating data from the planning application into the enterprise data warehouse. It is, after all, common for the business to want to report, say, actual revenue along with planned or forecast revenue and a variance. This is basic corporate performance management and answers the question 'how am I doing against plan?' or 'am I going to make forecast?'

There is an understandable misconception that because both the BI tools and Planning tools come from the same vendor they 'automatically' integrate. They don't. There are points of integration including UI, security and, in Cognos Planning, functionality to create a reporting database and metadata. However, none of this represents a panacea. If you think about it for a moment, they can't. A planning or forecasting application is just that, an application. It is another source of data along with all other operational systems and the reporting requirements have to be considered carefully. And, of course, the way in which plan and actual data are integrated and reported should be driven by the requirement and not by vendor product functionality.

There are a number of considerations, including:

Data flows both ways. A planning application usually requires actual data because it is common for business managers to build this year's plan from a prior year's actuals. Or it might be that YTD actuals will help business managers pull together more accurate forecast numbers. This means that actual data needs to flow from the data warehouse into the planning application. Similarly, a planning application may require historical plans which are not stored in the planning application but might be found in the EDW or other operational sources.
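
As a rough illustration of that two-way flow, the sketch below (Python with pandas) seeds a plan from prior-year actuals held in the EDW and then brings the submitted plan back for actual-versus-plan variance reporting. The file names, columns and the 'plan' field in the submitted file are all hypothetical assumptions, not a description of any particular product's interface.

```python
# Hypothetical two-way flow between the EDW and a planning application.
import pandas as pd

actuals = pd.read_csv("edw/fact_revenue_actuals.csv")

# EDW -> planning: seed next year's plan from prior-year actuals.
plan_seed = (
    actuals[actuals["fiscal_year"] == 2008]
        .groupby(["cost_centre", "account"], as_index=False)["amount"].sum()
        .rename(columns={"amount": "prior_year_actual"})
)
plan_seed.to_csv("planning_app/plan_seed_2009.csv", index=False)

# Planning -> EDW: bring the submitted plan back for variance reporting.
submitted_plan = pd.read_csv("planning_app/submitted_plan_2009.csv")  # assumed to hold a 'plan' column
variance = (
    actuals[actuals["fiscal_year"] == 2009]
        .groupby(["cost_centre", "account"], as_index=False)["amount"].sum()
        .rename(columns={"amount": "actual"})
        .merge(submitted_plan, on=["cost_centre", "account"], how="outer")
)
variance["variance"] = variance["actual"].fillna(0) - variance["plan"].fillna(0)
```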

Data in a Cognos Planning Database is not Like Other Operational Data. Cognos planning models are incredibly flexible. However, this means that changes to the model, even seemingly trivial changes, can result in column and/or table name changes. This will 'break' the metadata and reports. Add to this the fact that planning applications, unlike other operational applications, can change each year and you will find that you are trying to hit a moving target. A level of abstraction is required to protect the reporting from these application changes, which is one of the reasons that EDWs exist in the first place.

Planning Hierarchies are not the same as Dimensions. It is not unusual for planning hierarchies to be extracted from dimensions in the EDW. However, they are not always the same thing. For example, it may be that major products are forecast individually but below a certain threshold products are forecast as 'other'. This is, perhaps, a trivial example but it confirms what we already know: that reference or master data can be subtly different across operational systems and the business has to define a common definition for the purpose of consistent management reporting.
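
Here is a small sketch of that particular case, deriving a planning hierarchy from an EDW product dimension where minor products roll up into 'Other'. The products, revenue figures and threshold are all illustrative assumptions.

```python
# Hypothetical product dimension and forecasting threshold.
import pandas as pd

dim_product = pd.DataFrame({
    "product": ["Alpha", "Bravo", "Charlie", "Delta", "Echo"],
    "annual_revenue": [5_400_000, 2_100_000, 180_000, 95_000, 40_000],
})

THRESHOLD = 250_000  # products below this are not forecast individually

# Major products keep their own planning line; the rest become 'Other'.
planning_hierarchy = dim_product.assign(
    planning_item=dim_product["product"].where(
        dim_product["annual_revenue"] >= THRESHOLD, "Other"
    )
)

# The planning hierarchy no longer matches the product dimension one-for-one.
print(planning_hierarchy.groupby("planning_item")["annual_revenue"].sum())
```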

Planning Reporting is Real-Time. There is usually an operational (and therefore real-time) requirement to report during the planning process. The finance team or profit centre managers adjust their forecast numbers across dimensions until they arrive at an overall plan that meets the objectives set for them by their senior management team. They will want to repeatedly run reports to validate the latest position. This real-time reporting requirement might drive the developers to build all reporting from within the planning application, but they will get frustrated that, as the reporting requirements grow, so will the need to import more and more actual data into the planning application. Again, this simply confirms what we already know about data warehousing. Some reporting requirements are operational and need to come from the operational application; others are summarised BI or management reporting and will be developed against the EDW.

This isn't an exhaustive list of considerations, but it does illustrate why reporting requirements for a planning application should be considered at the outset of a planning application project. It is inevitable that integrated, consistent, enterprise reporting will need extensions to the BI and EDW solution whilst other real-time, operational reports will stay within the planning application. This is why a reporting solution from a planning application should be properly architected rather than expected to appear 'out of the box' from the product vendor.

Tuesday 23 June 2009

Business Intelligence and the Semantic Web

Analytics strategist Seth Grimes was in town last week speaking in Covent Garden on the subject of Web 3.0. For those still catching up with what this means, the evolution of the web is generally thought of as:

Web 1.0

Retronym which refers to the web as largely a publishing paradigm

Web 2.0

The interactive web characterised by the rise of social networking

Web 3.0

The semantic web. Functionally rich and understandable to machines as well as people

Die Hard 4.0

Fourth instalment of the Bruce Willis franchise released in the US as Live Free or Die Hard

Whilst Web 3.0 is some way off, there are some early glimpses into the world of possibilities with search engine innovations from Google and Wolfram Alpha. Fellow BI blogger Peter Thomas, who also writes about this subject in his blog Literary Calculus, uses the example http://www.google.co.uk/search?&q=age+of+the+pope.

What interests me about the semantic web though, has much less to do with what might be described as contextually aware searching and more to do with the impact on Business Intelligence.

BI, since the 1970s, has been almost entirely focused on numbers. This is understandable given that these tend to be highly structured, organised and available in databases. Arguably though, this wasn't the original vision. In his 1958 (yes, 1958) article "A Business Intelligence System", IBM visionary Hans Peter Luhn describes the objective as "to supply suitable information to support specific activities carried out by individuals, groups, departments, divisions..." and what Luhn goes on to describe is statistical analysis of text and documents as information sources.

Seth Grimes, in his Intelligent Enterprise article BI at 50 Turns Back to the Future, makes the point that BI has latterly revisited this original vision of the analysis of text and document sources which, after all, represent the vast bulk of corporate data.

Early innovators, including my own company Artesian, are already making progress in the field of analysing documents for the purpose of business intelligence. This new breed of semantically aware business intelligence technology can "supply suitable information" to "support activity" by answering questions like:

  • 'which of my competitors are growing and which are declining?'
  • 'are my customers launching initiatives that could be supported by our products or services?'
  • 'do market behaviours indicate a declining need for our products?'
  • 'how did customers respond to our competitors when they changed their business in a way that we are also considering?'

This isn't to suggest that semantically aware BI should function like search engines. Indeed, I would strongly argue that it should not. Search engines deliver a single set of answers to a single ad-hoc question. Business processes are much more frequent, diverse, repeatable and involve wider audiences. Further, businesses require scalability and high degrees of automation. Here, the business questions need to be regularly monitored, visualised, distributed, shared and collaborated around. Interestingly this has more in common with today's best practice BI systems. So there it is. The shape of semantically aware BI is emerging through the fog of future developments and it is unsurprisingly an evolution of what is best about business intelligence as we know it today with some breakthrough thinking that will unlock meaning from the colossal volume of corporate and online documents.

Wednesday 17 June 2009

BI Project Managers and Eyebrows

Like eyebrows, you don't really notice project managers when they are there but if you are rash enough to let them go you will end up looking startled and stupid.

I point this out because over a period of more than 10 years I have had the opportunity to observe many, many BI projects and one of the most surprising patterns is the scaling back of project management largely because the project is going well!

The openly declared reason is usually cost or some other misdirection but it is invariably preceded with pointed questions about what value the project manager has been adding to a project that is going so well. Perversely, the better the project is doing, the higher the risk that there will be murmurings about things like the overhead of project reporting and that project management activity will ultimately be reduced or even removed altogether. It has become as common and predictable as it is deeply and logically flawed.

Perhaps this is one of the phenomena that explain why the trend for project failure is not getting any better. According to the latest Standish Group report, covered by Peter Taylor, author of 'The Lazy Project Manager', in his blog post 'Are your Project Managers working too hard to be successful?', instances of challenged (late, over budget or reduced deliverable) projects continue to rise.

As BI practitioners we often value technical skills, competency in the reporting tool and the deep musings of the data architect, and yet have a blind spot when it comes to project management. This may be partly because early BI projects were often departmental in scale. It may also be because many of today's BI Competency Centres originated as 'skunk works' initiatives and see project management as all methodology and meetings, but we ignore it at our peril.

It is true that project management can be at its most obviously valuable when priorities need resetting, additional resources have to be secured or controlled management escalation is called for. However, we shouldn't assume that if a Project Manager is not doing these things that they are not doing anything.

Planned projects with predictable timescales, along with accurate project reporting, are rewarded with confidence from our business sponsors. A considered set of risks, based on real-life experience of BI projects, will help prevent them becoming time-sucking issues, and properly managed issues will prevent them becoming show-stoppers.

A good Project Manager may make it look easy but don't take the lack of fire fighting and crisis meetings as an indication that nothing is being done. Look deeper for the benefits of order over chaos or be prepared to invest in an eyebrow pencil for a look that is decidedly a poor second best.

Monday 1 June 2009

Small Children, Energy and Efficient Data Warehousing

Last week, I referred to Peter Thomas and his article Using multiple BI tools in a BI implementation – Part II. In the article, Peter points out that the way to drive consistency across dimensions and measures is to define as much logic as possible in the data warehouse.

I was musing over this again this weekend (I hear the cries of 'what an interesting life you have') whilst out for lunch with friends and their small children (ages 9 and 7). On the short walk to the restaurant I was amused by how different their approach to getting from A to B was to that of the 'grown-ups'. We were focused on getting to the destination in a relatively direct and efficient way. However, the children ran to and fro, stopped, doubled back, looped a few circles and even randomly waved their arms in the air. They generally spent as much time in a state of motion as possible. Clearly the criterion for small children, when en route to a destination, is to use the maximum amount of energy possible!

This, if you will go with me on this, is rather like trying to implement data warehouse consistency in the BI tools rather than further upstream in the data warehouse. You do eventually get to the destination, but will probably be exhausted, out of breath and hungry. This is fine if you are two children out for dinner with some stuffy grown-ups but not an efficient use of the somewhat limited time of a BI practitioner.

A typical BI architecture comprises tiers that include:

  • Source Systems
  • ETL
  • Data (Data Warehouse, Data Marts, OLAP Cubes)
  • BI Metadata
  • BI Application (Reports, Scorecards, Dashboards)

A properly architected data warehouse (more on this in later blogs) should have been built against an enterprise schema and is therefore *the* consistent representation of business information. Common definitions of customers, departments, profit and products live here. If there is one good reason for this (although there are many) then it is simply that these can all be defined once in the data warehouse but would otherwise have to be defined many, many times in what can often be hundreds of reports that comprise a BI solution.

One of the reasons that we fail to do this is that inconsistency is often made visible for the first time by the BI tools. At this point the project momentum is around building metadata models and reports. Inconsistencies are fixed where the resources are focused ... in reports. Add to this that revisiting the design may need involvement from the ETL developer, the DW designer, the business analyst and, if there is a lack of clarity, the business users. It is no surprise that the report author is inclined to fix it where they stand. After all, the tools make the fix simple and it is only when the report author has built the same calculation for the tenth time that they start to question why they are doing it at all. And of course, the initial build of the BI application is only the beginning. Many more reports will be built over the life of the BI application.
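
To make the point tangible, here is a small sketch (Python with pandas, with a hypothetical fact table and business rule) of a derived measure defined once in the warehouse build rather than re-created in every report.

```python
# Hypothetical fact table and gross margin rule, defined once in the DW load.
import pandas as pd

fact_sales = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "revenue": [1200.0, 860.0, 2300.0],
    "cost_of_sale": [700.0, 610.0, 1500.0],
})

# The agreed business rule lives here, in the warehouse build...
fact_sales["gross_margin"] = fact_sales["revenue"] - fact_sales["cost_of_sale"]
fact_sales["gross_margin_pct"] = fact_sales["gross_margin"] / fact_sales["revenue"]

# ...so every report simply selects the column instead of re-deriving it
# (and subtly re-defining it) across hundreds of reports.
report = fact_sales[["order_id", "gross_margin", "gross_margin_pct"]]
print(report)
```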

So work hard to establish the correct definitions during the design of the data warehouse and it will reap productivity gains. Leave it to the BI application only if you have the carefree attitude, the free time and the energy levels of an eight-year-old.

Friday 29 May 2009

BI Tools: Just because you can ...

Peter Thomas, in his article Using multiple BI tools in a BI implementation – Part II, points out that the way to drive consistency across dimensions and measures is to define as much logic as possible in the data warehouse.

There was a challenge to this approach because it misses the opportunity to exploit some of the features of BI tools. This challenge, I think, invites the maxim that just because you can ... doesn't mean you should!

BI Software vendors frequently build functionality into their tools that may not always be appropriate to exploit regardless of what the marketing message implores us to do. As an example, many reporting tools are able to integrate multiple data sources in metadata but is this really the best place to integrate?

Recently, I was engaged by a public sector organisation because their BI vendor had implemented this very feature. They outsourced their Data Warehouse and were sensitive to the costs associated with Data Warehouse and ETL changes, so saw this as an opportunity to implement a 'virtual' warehouse and take greater control of their reporting solution whilst reducing data warehouse maintenance costs. Sensibly, before they embarked on a significant project to adopt this approach they sought external guidance on a best practice approach. Unsurprisingly, the conclusion was that it remains best practice to transform data from operational systems into a persistent store (data warehouse) rather than a transient (virtual) equivalent because:

  • Application Performance. Queries from the reporting tool will have an unpredictable and likely negative performance impact on operational systems.
  • Transformation Efficiency. Transformation in a DW architecture is typically designed to run once, usually overnight, in a controlled and predictable environment rather than many times as each report is required or run.
  • Maintainability. Transformation is better done as a series of small, manageable and auditable steps rather than a single complex process.
  • Best Practice. In our experience, persistent, multi-tier data architecture is the most commonly adopted approach by successful BI installations.

Over the last ten years, BI tools have grown up into rich, enterprise technologies. However, some boast capabilities far beyond what we, as BI practitioners, should consider using in a properly architected solution.

Wednesday 20 May 2009

Six Ways to Justify your BI Implementation

The R in ROI is Hard to Find

Most organisations seem to have the I part of this calculation well covered. In fact the investment is so well understood that it can typically be analysed by project and ongoing costs and by staffing, hardware, software, consultancy assistance and training. Indeed, we live in a climate where knowing these costs is vital.

Calculating the R is somewhat more problematic though. This is partly because the return is often a mix of the tangible and intangible but also because there are often multiple ways to justify a BI implementation. The first step in putting together a rock solid business case that will secure a green light is to consider which approach to use. What follows are some of the justification approaches that I have seen used successfully.

The Productivity Justification

One of the least ambiguous methods of calculating a return is productivity. If a monthly management report pack takes two heads (or FTEs) the whole month to prepare then there is an obvious and ongoing cost saving to be made by replacing this effort with an automated BI solution. However, even this can be a complex justification. A typical finance department deals with such a variety of tasks and activities that identifying the effort in this way is not always straightforward. Then there is all the hidden effort as countless cost centre managers produce their own subtle variations in Excel. Not only is this undermining the integrity of business information, it is also hiding the cost involved in sharing it. I have lost count of the number of business review meetings where managers spend more time debating the accuracy of their own spreadsheet than assessing what should be the single business view and making decisions. Generally speaking though, the cost of business productivity improvements can be assessed to a sufficient level of accuracy that it makes a credible business case.

However, whilst there are invariably productivity benefits to all Business Intelligence projects, they are rarely the primary ones.

The 'Go to Jail' Justification

Another justification is that the information is mandatory and defies an ROI calculation because without the information the business cannot operate. Statutory obligations under acts like Sarbanes-Oxley include titles that deal with enhanced disclosure and corporate fraud accountability that could even result in criminal penalties, including prison. There is nothing like the threat of a fall all the way from the boardroom to the prison yard to focus the mind on reporting transparency. There are often many other benefits realised as a result of meeting these obligations but the production of legally required reports and the continued liberty of senior executives is often justification enough.

The Flying Blind Justification

This typically comes from someone on the senior management team. Multiple, poorly integrated ERPs and outdated information systems comprising a thousand manually manipulated spreadsheets can create the feeling of flying a very expensive aircraft without any instrumentation. I recently worked with a customer who had a completely new management team with ambitious plans to revitalise the business. The ultimate goal was to double the share price and to do this they had a solid understanding of where they needed to be. They had worked with a consultancy partner to determine where cost savings could be made and which revenue streams to accelerate or choke depending on their profitability profile. As a team they had a tightly defined set of key performance indicators and targets but could not get the information from existing systems which were operationally strong but KPI weak. The project was entirely justified on the strategic needs of the management team but delivered consequential benefits to senior and middle managers too.

The Information as a Product or Service Justification

Using management information to add value to customer service offerings is an increasingly common project rationale. Information, typically available in a variety of operational systems, has value to the customer once it has been integrated, cleaned, sanitised, summarised and made available as an extranet solution. An environmental organisation we are currently working with provides information back to their customers on the amount of waste that is being recycled on their behalf, allowing them to demonstrate that they are meeting government and local authority targets and their duty of care. The data is a subset of information already being extracted into the corporate data warehouse and, aside from winning new business, is also becoming a mandatory requirement for some local authorities, thereby freezing out some of the less information-capable competitors.

The Big Problem Justification

The richest rewards are typically the least tangible though and are associated with having access to timely and accurate insight.

The return is a product of better decisions made as a result of having this insight, so it is difficult to generalise and problematic when it comes to articulating it in numbers. Highly visible business issues can help here. A few years ago I partnered with an online retailer who had customer returns that exceeded £10m without having a handle on exactly why. In a wafer-thin margin business, a small percentage reduction in this number was a significant enough prize to justify a project to help them understand which product categories were being returned and for what reasons.

Large, visible, business issues that require information to understand and fix have been the mother of some great innovations in the field of business intelligence.

The BYO (Build Your Own) Justification

BI Implementations don't always have a big problem to hang their hats on or a new senior management team hungry for better, faster information. Indeed most do not. For these, the justification has to come from a considered review of the types of decisions that are being supported and the financial benefits of making better ones. Dorothy Miller, in her article Measuring the Return on Investment for Business Intelligence, describes the challenges of justifying BI more clearly than I ever could and recommends a Benefits Audit approach to assessing the benefits and therefore the value.

As a minimum, there should always be a section in the requirements documentation which articulates the requirements as business questions. These might include: 'what is the order backlog?', 'what is the trend in sales wins/losses?', 'how many deals are being discounted by more than the average of 15%?' and 'what is the billable utilisation of each consulting team compared to the company average?'
Again, the impact of accurate information to improve the business based on answering these questions is highly subjective but represents a good starting point for articulating the benefits and potential returns.

Wednesday 13 May 2009

Business Intelligence ... no glaze please

When asked "what is BI?" it is so easy to elicit a look so glazed that it could compete with a Krispy Kreme donut.

The problem is we tend to explain BI in terms of the technology. Don't misunderstand me, this is perfectly understandable. After all, it was the convergence of a number of technologies over the last 20 years that made so much of what Business Intelligence is today possible. Also, the technology does some amazing things. It allows users to navigate their information with a slice, dice, drag or drop and then visualise it in any number of charts, maps or diagrams.

However, before I launch into an enthusiastic guided (or misguided) tour of OLAP, data warehousing and metadata with a detailed explanation of the difference between scorecards and dashboards I usually remind myself that it is possible to get through the BI elevator pitch after you have pressed the button and before the doors have closed.

At its simplest, Business Intelligence is business information for the purpose of decision making. That's it.

But let me just explain a little more about master data management ...