Thursday 25 November 2010

Single version of the truth, philosophy or reality?

We all hear this a lot. The purpose of our new (BI/Analytics/Data Warehouse) project is to deliver 'a single version of the truth'. In a project we are engaged with right now, the expression is 'one version of reality', or 1VOR. For UK boomers that will almost undoubtedly bring to mind a steam engine, but I digress.

I have to admit, I find the term jarring whenever I hear it because it implies something simple, attainable through a single system, which is rarely the reality.

In fact it's rarely attained, causing some of our community to question its viability or even whether it exists. Robin Bloor's 'Is there a single version of the Truth' and 'Beyond a single version of the truth' on the Obsessive Compulsive Data Quality blog are great examples.

Much has been written on this subject by data quality practitioners, and it speaks to master data management and the desire, for example, for a single and consistent view of a customer. Banks often don't understand customers, they understand accounts, and if the number of home shopping brochures I receive (err, from Hotel Chocolat, for example) is anything to go by, many retailers don't get it either. Personally, I want my bank and my chocolatier to know when I am interacting with them. I'm a name, not a number, particularly when it comes to chocolate.

This problem is also characterised by the tired and exasperated tone of a Senior Manager asking for (and sometimes begging for) a single version of the truth. This is usually because they had a 'number' (probably revenue) and went to speak to one of their Department Heads about it (probably because it was unexpectedly low), and rather than spending time understanding what the number means or what the business should do, they spent 45 minutes comparing the Senior Manager's 'number' with the Department Head's 'number'. In trying to reconcile them, they found some more 'numbers' too. It probably passed the time nicely. Make this a monthly meeting or a QBR involving a number of department heads and the 45 minutes will stretch into hours, without any real insight from which decisions might have been made.

This is partly about provenance. Ideally the number came from a single system of record (Finance, HR) or corporate BI, but it most likely came from a spreadsheet or, even worse, a presentation with a spreadsheet embedded in it.

It's also about purity (or the addition of impurities, at least). The number might have started pure, but the department head, or an analyst in their support and admin team, calculated it from an extract of the finance system and possibly some other spreadsheets. The numbers were probably adjusted for some departmental nuance. For example, in a Sales Team, the Sales Manager might include all the sales for a rep that joined part way through the year, whilst Finance left that revenue with the previous team.

It will be no comfort (or surprise) to our Senior Manager that this is a Master Data Management problem too. Revenue by product can only make sense if everyone in the organisation can agree the brands, categories and products that classify the things that are sold. Superficially this sounds simple, but even this week I have spoken with a global business that is launching a major initiative, involving hundreds of man hours, to resolve just this issue.

It's also about terminology. We sacrifice precision in language for efficiency. In most organisations we dance dangerously around synonyms and homonyms because it mostly doesn't catch us out. Net revenue ... net of what? And whilst we are on the subject ... revenue. Revenue as it was ordered, as it was delivered, as it was invoiced and as it is recognised according to GAAP rules in the finance system. By the way, does your number include credit notes? And this is a SIMPLE example. Costs are often centralised, allocated or shared in some way, all dependent on a set of rules that only a handful of people in the finance team really understand.
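
To make that concrete, here is a minimal, hypothetical sketch (toy figures and invented field names, not anyone's real chart of accounts) of how three perfectly reasonable definitions of 'revenue' give three different totals from the same orders:

```python
# A toy order book: what was ordered, what has been invoiced,
# what finance has recognised, and any credit notes raised.
orders = [
    {"ordered": 120_000, "invoiced": 100_000, "recognised": 80_000, "credits": 5_000},
    {"ordered":  60_000, "invoiced":  60_000, "recognised": 60_000, "credits":     0},
    {"ordered":  45_000, "invoiced":       0, "recognised":      0, "credits":     0},
]

ordered_revenue    = sum(o["ordered"] for o in orders)                  # the sales team's number
invoiced_revenue   = sum(o["invoiced"] - o["credits"] for o in orders)  # net of credit notes
recognised_revenue = sum(o["recognised"] for o in orders)               # the finance system's number

print(f"Ordered revenue:    {ordered_revenue:>9,}")    # 225,000
print(f"Invoiced revenue:   {invoiced_revenue:>9,}")   # 155,000
print(f"Recognised revenue: {recognised_revenue:>9,}") # 140,000
```

Three honest answers to the question 'what's our revenue?', and none of them is wrong; they are just answers to slightly different questions.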

Finally, it's about perspective. Departments in an organisation often talk about the same things but mean something subtly different because they have different perspectives. The sales team mean ordered revenue because once someone has signed hard (three copies) their job is done, whilst the SMT are probably concerned with the revenue that they share with the markets in their statutory accounts.

So is a single version of the truth just philosophy? Can it really be achieved? The answer is probably that there are multiple versions of the truth but they are, in many organisations, all wrong. Many organisations are looking at different things from differing perspectives and they are ALL inaccurate.

High performing organisations should be trying to unpick these knots, in priority order, one at a time. Eventually they will be able to look at multiple versions of the truth and understand their business from multiple perspectives. Indeed, the differences between the truths will probably tell them something they didn't know from what they used to call 'the single version of the truth'.

Friday 5 November 2010

Five Cloud Myths

I know, this is a BI blog, not a Cloud blog, but the Cloud is affecting everything, and whilst BI in the Cloud will be difficult and its journey from on-premises to cloud might be long ... it is inevitable.

In any case, I would like to dispel the following five myths, each of which I have heard at least once in the last few weeks.

Myth #1. Cloud will Dis-intermediate
I remember a keynote presentation from a presenter I deeply respected on the explosion of the web and how it would bring about dis-intermediation. The logic was sound. Consumers can connect directly with providers, so brokers, intermediaries and miscellaneous third parties appeared to be unnecessary. The growth of cloud computing has caused some to predict the same, and yet a cursory glance at the services on offer will demonstrate there is no reduction in intermediaries or intermediation opportunity. If a service, offering or product will make things easier for a consumer then they will buy it. It might be comparison, consolidation or simply cheaper prices, but if you can add something and make the connection painless then you can come to the party and make money for adding value.

Myth #2. Cloud is Bad for Services
With infrastructure issues 'solved' and the availability of commodity services that fix common problems like tax and currency exchange calculations, some are suggesting that the opportunity for consulting provision will diminish. My educated response to this is 'yeah right'. I have yet to meet the business that has solved all its issues and is not looking for more innovation to reduce cost or get a competitive leg-up. If the simple things are going to get solved in one place then fantastic, we can all move on to innovating and adding real value. We're all tired of solving the same problems, so let's get the solutions in the cloud and build some real innovation on top of them.

Myth #3. Cloud Means Blue Sky
Hmm. Cloud encourages a lot of blue sky thinking. But there are few businesses starting with a clean sheet (if you will pardon the mixed metaphor). There are on-premises systems to integrate, proprietary solutions that will not leave the office without a fight, and that's without thinking about how difficult Cloud BI can be given that it involves the complex integration of what are often large volumes of data. And multi-tenanted solutions need everyone to play nicely, so those queries from hell need to be reined in or the workload stays on-premises.

Cloud is (like Performance Management) a journey that needs careful planning whilst delivering improvements along the way.

Myth #4. Cloud is ASP
This is usually suffixed with the phrase 'but marketed differently'. Of course, it has some commonality with the principles of application service provision, but to dismiss it as 'the same as' is to miss the point as profoundly as Cloud will change our industry.

Myth #5. Cloud is a Fad
Actually, Myth #4 is often rolled out as 'evidence' of just how faddish Cloud is. In fact, Cloud is having an impact on all systems, wherever they are physically located. Users now expect their systems in the office to be as easy to use and visually appealing as the web sites they use at home. In fact, they use both in both locations. Cloud has also set the expectation that all systems will be available on all devices. It's not mobile or desktop anymore, it's a range of devices.

As final evidence that Cloud is not a fad, it's clear to me that to be a successful cloud vendor, like Salesforce, you have to continue to show value month after month or your users will go elsewhere. As Phil Wainewright, CEO of Procullux Ventures, puts it ... software used to be a holiday romance but cloud requires a long-term relationship. If that means I am married to Marc Benioff then I might need to re-think the analogy, but there is no doubt that Cloud vendors need to play the long game.

Sunday 24 October 2010

Too many choices for the modern analytics Solution Architect

We analytics practitioners have always had the luxury of alternatives to the RDBMS as part of our data architectures. OLAP of one form or another has been providing what one of my colleagues calls ‘query at the speed of thought’ for well over a decade. However, the range of options available to a solutions architect today is bordering on overwhelming.

First off, the good old RDBMS offers hashing, materialised views, bitmap indexes and other physical implementation options that don’t really require us to think too differently about the raw SQL. The columnar database, and implementations of it in products like Sybase IQ, are another option. The benefits are not necessarily obvious. We data geeks always used to think the performance issues were about joining, but then the smart people at InfoBright, Kickfire et al told us that shorter rows are the answer to really fast queries on large data volumes. There is some sense in this given that disk i/o is an absolute bottleneck, so fewer columns mean less redundant data reading. The Oracle and Microsoft hats are in the columnar ring (if you will excuse the mixed geometry and metaphor) with Exadata 2 and Gemini/Vertipaq, so they are becoming mainstream options.
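
As a hedged illustration of the disk i/o argument (pure Python, invented data, an in-memory analogy rather than a benchmark), the difference between reading whole rows and reading only the column a query needs looks something like this:

```python
# A rough analogy, not real disk I/O: in a row layout the data a query needs is
# scattered across whole records; in a column layout the one column it needs is
# already contiguous. All field names and figures are invented.

rows = [
    {"order_id": i, "customer": f"C{i % 100}", "product": f"P{i % 20}",
     "quantity": i % 5 + 1, "revenue": 100.0 + i}
    for i in range(10_000)
]

# Pivot the row store into a column store: one list per column.
columns = {key: [row[key] for row in rows] for key in rows[0]}

# Summing revenue from the row layout walks every record just to pick one field;
# on disk this would mean reading all five columns' worth of data.
row_total = sum(row["revenue"] for row in rows)

# Summing revenue from the column layout scans a single contiguous list;
# on disk only the revenue column would need to be read.
col_total = sum(columns["revenue"])

assert row_total == col_total  # same answer, far less (simulated) data touched
```

The wider the table and the fewer columns a typical analytic query touches, the bigger that saving becomes.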


Data Warehouse appliances are yet another option. The combined hardware, operating system and software solution, usually using massively parallel processing (MPP), delivers high performance on really large volumes. And by large we probably mean Peta not Tera. Sorry NCR, Tera just doesn’t impress anyone anymore. And whilst we are on the subject of Teradata, it was probably one of the first appliances, but then NCR strategically decided to go open shortly before the data warehouse appliance market really opened up. The recent IBM acquisition of Netezza and the presence of Oracle and NCR is reshaping what was once considered niche and special into the mainstream.


We have established that the absolute bottleneck is disk i/o, so in-memory options should be a serious consideration. There are in-memory BI products but the action is really where the data is. Databases include TimesTen (now Oracle’s) and IBM’s solidDB. Of course, TM1 fans will point out that they had in-memory OLAP when they were listening to Duran Duran CDs, and they would be right.

The cloud has to get a mention here because it is changing everything. We can’t ignore those databases that have grown out of the need for massive data volumes like Google’s BigTable, Amazon’s RDS and Hadoop. They might not have been built with analytics in mind but they are offering ways of dealing with unstructured and semi-structured data and this is becoming increasingly important as organisations include data from on-line editorial and social media sources in their analytics. All of that being said, large volumes and limited pipes are keeping many on-premises for now.

So, what’s the solution? Well, that is the job of the Solutions Architect. I am not sidestepping the question (well actually, I am a little). However, it’s time to examine the options and identify which information management technologies should form part of your data architecture. It is no longer enough to simply choose an RDBMS.

Friday 13 August 2010

External BI and How Weight Loss is More Interesting than Life itself

More and more we find ourselves helping customers with externally facing business intelligence as well as what we might call traditional and internally facing business analytics.

Those that work with me will know that I advocate articulating BI requirements as business questions so External BI can be characterised as answering questions like;

  • Which of my prospective customers are doing things that they might need our help with?
  • What are my competitors doing that represent threats or opportunities?
  • What is happening in my marketplace that will affect our strategy?

B2C businesses might also ask questions like;

  • What are our customers saying about us?
  • Do we have a good or poor reputation?
  • What do people think about our brand?

It is important that the answers to the first block of questions are detailed, so that you get answers from editorial sources, blogs and tweets that might include 'Our #1 competitor has changed their advertising agency' or 'One of our key accounts is rumoured to be making an acquisition'. The second block may also need detailed answers, but it is surprising how much can be determined by monitoring the volume of traffic or 'buzz'.
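
As a hedged sketch of what monitoring 'buzz' can look like in practice (toy daily counts, invented thresholds, not a real listening platform), simply counting mentions per day and flagging days that sit well above a trailing average is enough to surface the kind of spikes described in the example that follows:

```python
from statistics import mean

# Hypothetical daily mention counts for one product across news, blogs and tweets.
daily_mentions = [42, 38, 45, 40, 41, 39, 44, 310, 520, 190, 80, 47, 43, 41]

BASELINE_DAYS = 7    # how many trailing days define "normal"
SPIKE_FACTOR = 3.0   # a day counts as a spike if it is 3x the trailing average

for day, count in enumerate(daily_mentions[BASELINE_DAYS:], start=BASELINE_DAYS):
    baseline = mean(daily_mentions[day - BASELINE_DAYS:day])
    if count > SPIKE_FACTOR * baseline:
        print(f"Day {day}: {count} mentions vs baseline of {baseline:.0f} - investigate the buzz")
```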

A solution we recently built for a pharmaceutical company gave us some fascinating insights based on buzz. During the first two weeks of the implementation one of their drugs was associated with cancer. As you might imagine, the volume of posts, news stories and tweets spiked dramatically. Over the next few days, the volume decreased as the facts became known and rumour and speculation were displaced. Eventually the buzz quietened to normal patterns. Unusually, the following week, a second rumour surfaced. This time, that the very same drug was associated with weight loss. The same pattern emerged with a pronounced spike over a few days followed by a calm induced by rational and factual information dissemination. What was surprising to us was that the spike associated with weight loss was higher and longer. In other words, there was more interest, more conversation and more interaction when a product was associated with weight loss than when it was associated with a terminal disease.

Sigh.

Wednesday 14 July 2010

CRM: A great source for Sales Performance Management but a terrible substitute

The reputation of CRM implementations is a mixed one but we could say that about many IT endeavours. The good news though is that there seems to be a more balanced take now. Take a look at Lauren May's article, 'CRM is No Longer a Four Letter Word' as an example.

There are many reasons that it has a bad rep. Not least because many organisations focused on IT, systems and process but forgot that it was essentially about the, err, Customer. And the costs! In the high profile, now failed, EDS implementation for Sky, the original estimate was over £40m before the costs (allegedly) spiralled out of control.

In spite of the misses, CRM is still a purchasing priority for CIOs in 2010, according to Gartner analyst Ed Thompson, who suggests "For most organizations, the single most logical way to differentiate the business is through great customer experiences, rather than having the lowest cost or most innovative products and services".

However, I can't help thinking that some of the reputation is based on the classic problem of being oversold and under-delivered. Analytics are a good example of this, with many of the platforms promising innovative ways of analysing sales performance. They promise democracy through dashboards, but it was never going to be that simple.

CRM implementations are allowing us to capture more about sales processes and activities than ever before. However, valuable and actionable insight will not simply fall out of the CRM implementation. The biggest challenge is that they all treat their own data like an island. Useful sales analytics can only typically come from bringing together information from Sales, Marketing and Finance systems.

Ask a sales rep for the revenue number and they will tell you what was booked on the order. But we all know that a lot can happen between an order and an invoice, and the difference is a constant source of frustration to a management team trying to drive the business in the right direction using a gauge with one set of numbers and three needles, all at slightly different points. And, of course, the cost of a campaign or the profitability of customers is not information captured in a CRM platform; it requires information from the CFO and the Marketing Director too. Reporting or dashboard tools in a CRM implementation are great at reporting operational CRM, but they cannot bring information together from all those places in a meaningful way. At least, not yet.
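
As a hedged illustration of the point (invented account names and figures, not a real CRM or finance API), the essential step is a join across systems that no single CRM dashboard holds on its own:

```python
# Booked revenue as the CRM sees it, against invoiced revenue as finance sees it.
crm_bookings = {"Acme": 250_000, "Globex": 120_000, "Initech": 90_000}
finance_invoiced = {"Acme": 210_000, "Globex": 120_000, "Initech": 40_000}

for account in sorted(crm_bookings):
    booked = crm_bookings[account]
    invoiced = finance_invoiced.get(account, 0)
    gap = booked - invoiced
    print(f"{account:<8} booked {booked:>9,}  invoiced {invoiced:>9,}  gap {gap:>8,}")
```

In practice that join is done by the data warehouse behind the scenes, not by the CRM reporting layer.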

I am not being critical of CRM systems. Their existence has opened up new opportunities for the analysis of sales performance for organisations and that's a very good thing. However, those that expect integrated analytics that provide useful insight into the plan variance, demonstrate which sales tactics are working or identify which products and customers are proving more profitable may not get it from their shiny new CRM dashboards. Meaningful and actionable insight can only come from combining CRM information with other organisational data into a coherent and common framework. It can, as I have seen it done in many instances, look like it is part of the CRM implementation but the real work is done behind the scenes by smart data warehousing and enterprise business intelligence tools.

Do you know where your sales organisation is heading? Sales Leaders, take the IBM Sales Performance Management Assessment here:

www.artesiansolutions.com/bathwicksurvey

Sunday 20 June 2010

BI Requirements Should Be Challenged and Discussed not just Gathered

There are many resources remonstrating with the IT community on the importance of gathering requirements. Failing to gather requirements, they warn, will lead to a poor solution delivered late and over budget. This is largely inarguable.

However, I would warn that simply 'gathering' requirements is as big a risk. Fred Brooks, author of 'The Mythical Man-Month', once said that 'the hardest part of building a software system is deciding what to build'. And deciding what to build is a two-way process rather than the act of listening, nodding and documenting that we all too often see in Business Intelligence projects.

From time to time, I hear someone cry foul on this assertion. They argue that it seems like the tail is wagging the dog or that the business cannot compromise on the requirement. I usually point out that simply building what the user asked for doesn't happen in any other field of engineering. Architects advise on the cost of materials when planning a major new office building, city officials take advice on the best possible location for a bridge, and environmental consultants are actively engaged in deciding whether and what should be built in any major civil engineering project.

And this is exactly how we should approach business analytic requirements. As a two-way exploration of what is required, possible solutions and the implications of each. Incidentally, this is particularly difficult to do if business users are asked to gather and document their own requirements without input from their implementation team.

An example of why this is important is rooted in the fact that many BI technologies (including IBM Cognos) are tools, not programming languages. They have been built around a model to increase productivity. That is, if you understand and work with the assumptions behind the model, reports, dashboards and other BI application objects can be built very quickly. Bend the model and development times increase. Attempting to work completely around the model may result in greatly reduced productivity and therefore vastly increased development time.

So be wary of treating 'gathering' and 'analysis' as distinct and separate steps. Instead, the process should be an iterative collaboration between users and engineers. Requirements should be understood but so should the implications from a systems perspective. The resulting solution will almost undoubtedly be a better fit and it will significantly increase the chance of it being delivered on time, at the right cost and with an increased understanding between those that need the systems and those that build them.

"We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem", Russell Ackoff, 1974

Monday 31 May 2010

The Empire State Building and BI Projects

I was in New York a few weeks ago and took the family to the Empire State Building. Its opening in 1931 coincided with the Great Depression. In fact, so much of its space remained unoccupied that it became known, for a while, as the Empty State Building. Today, no one can argue with its fame or its success. Even if it is now only the 15th tallest building in the world, it stands as an encouraging reminder that many successful projects span economic cycles.

It would be foolish to argue that a BI project started today will still be in place in eight decades. Indeed, many will be replaced in as many years. However, many BI competency centres or departments have been initiated in similar circumstances. Many evolved during the last economic downturn to help businesses identify less profitable parts of the business, or simply those areas that could withstand cutbacks without reducing the business's ability to react when the economic cycle reversed. Accurate insight helps a business navigate through a recessionary market and then, once through, helps it identify the products, markets and geographies that show signs of growth first.

This goes some way towards explaining why last month's Gartner report, 'Market Share: Business Intelligence, Analytics and Performance Management', found that BI continues to outgrow other enterprise software. Year on year, 2008 to 2009, BI grew at a little over 4% to $9.3 billion. I am not convinced that anything can be called 'recession proof', but three decades of growth in a market now dominated by SAP (Business Objects), IBM (Cognos), Oracle, Microsoft and the global SAS campus doesn't appear to be over yet. My own view is that the most significant reason for consistent growth in the BI market is rooted in something a colleague of mine reminds me of from time to time. BI, he asserts, is a process not a project. BI as a market continues to grow precisely because its organisational adoption continues to grow. Successful BI implementations satisfy one set of requirements but new ones emerge.

One final parallel between the famous New York landmark and BI projects. Apparently the art deco spire of the Empire State was intended to be a mooring mast for dirigibles, but this proved wildly impractical once the building was completed. The broadcast tower that replaced the dirigible mooring is a spectacular example of adapting a project as the business around it changed and as more became known about the real environment rather than the theoretical one.

Thursday 8 April 2010

Why are targets a political football?

Battle lines are being drawn up as we head towards the election, and interestingly there is a new point of contention for the major parties. Targets. In relation to the NHS, indeed any Government department, they have become shorthand for "too many middle managers", "waste" or "red tape".

In a recent BBC article http://news.bbc.co.uk/1/hi/uk_politics/8538601.stm, Professor David Kerr, a political adviser, suggests that patient choice has been affected by a "blizzard of targets" and that the NHS has been left "bogged down by process driven targets". A cursory glance might leave many thinking that targets are a fairly bad idea: valuable resources otherwise engaged in the business of saving lives are redirected to the meaningless management of numbers. Even Radio 4 presenters are starting to suggest that a core component of "efficiency" would be to "drop all those targets".

I can't think of any human organisational endeavour where planning outcomes and taking a reading to measure progress against plan would be considered inefficient. Should we have built Terminal 5 without monitoring current and forecasting future passenger throughput? Fly to New York without (albeit automatically) monitoring the speed, altitude and current position? Start at either end of the Channel Tunnel without monitoring when (or if) the two halves will meet in the middle? I appreciate that the nature of the news can mean that complex subjects are overly simplified and that politicians need to work with sound bites, but to suggest that targets are inherently a bad thing would be to mislead in the extreme. Let's find some other shorthand for overly bureaucratic departments; saying that it is 'all those targets' doesn't cut it. Measuring the wrong things, or measuring the right things inefficiently, is wasteful, but measuring nothing means that a business (or a department) is guessing its way to the next crisis.

Sunday 14 March 2010

The Journey from BI to CPM: Part 3, Cause and Effect

So far we have established that there is no single journey from BI to CPM. Instead it is an evolution that builds on successful implementation and aligns itself closer to the business in order to drive deeper and sustainable success. In the last post I also described a process of establishing requirements which was less about reporting and more about establishing the business priorities, the objectives required to deliver against them and then, in turn, the metrics required.

In this post I want to encourage you to expand on the objectives to understand how they are going to be achieved. This will help the business identify if it is measuring the right things whilst validating what you have discovered so far. It may also uncover related or more detailed information requirements.

For example, the engineering business we talked about in part 2 was looking to increase market share. One of the objectives they identified to achieve this was to increase account share. If we continue to explore this objective further, we are likely to identify the initiatives required to achieve it. To increase account share the sales team must cross-sell. In order to cross-sell, there has to be collaboration across the sales people that sell their products and services into a single account. And for the sales team to cross-sell, the services team must also align around a strategic account plan.

So our objective to increase account share actually becomes a series of cause and effect objectives or initiatives;

INCREASE MARKET SHARE by increasing account share

  • Cross Sell
  • Align services to strategic account plan
    • Encourage global account collaboration
    • Create a compensation plan that rewards global account collaboration

Here, we can see a basic cause and effect between these objectives. We can also see that they begin to cross departments because sales, services and HR all own objectives that contribute to the ultimate senior management team objective of increasing market share.
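
One hypothetical way to make such a chain tangible, outside of any particular methodology, is to hold it as a simple tree of objectives, each with an owning department, which makes the cross-departmental ownership easy to see. A minimal sketch:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Objective:
    name: str
    owner: str                                   # department that owns the objective
    supported_by: List["Objective"] = field(default_factory=list)

strategy = Objective("Increase market share", "SMT", [
    Objective("Increase account share", "Sales", [
        Objective("Cross sell", "Sales"),
        Objective("Align services to strategic account plan", "Services", [
            Objective("Encourage global account collaboration", "Services"),
            Objective("Compensation plan that rewards collaboration", "HR"),
        ]),
    ]),
])

def walk(objective: Objective, depth: int = 0) -> None:
    """Print the cause-and-effect chain, most strategic objective first."""
    print("  " * depth + f"{objective.name} ({objective.owner})")
    for child in objective.supported_by:
        walk(child, depth + 1)

walk(strategy)
```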

There are a number of methodologies which would drive these discussions, including those introduced by Robert Kaplan and David Norton in a series of books including 'The Strategy-Focused Organization' and through their organisation, Palladium. This approach organises the cause and effect matrix of objectives across the four balanced scorecard perspectives: financial, customer, internal, and learning and growth. The completed diagram, a strategy map, is a compelling visual representation of the objectives that should be driving the organisational strategy.

Whatever methods or techniques are used, the essence is aligning organisational information with current business priorities. The alternative, to either replicate (albeit improve) existing systems or to ask the business to mock up endless cross tabs in spreadsheets to communicate their requirements, would only deliver partial success.

These last two posts have been about the impact of information systems on performance management. Whilst critical, this is only part of the story. Performance Management is also about what the business does with the insight: how it collaborates, adapts and executes the plan, which in turn requires robust planning, budgeting and forecasting. More on this in future posts.

This series of posts draws greatly from the work of Kaplan and Norton and their approach to Corporate Performance Management using the Balanced Scorecard and Strategy Maps.

Sunday 24 January 2010

The Journey from BI to CPM: Part 2, Establish Business Objectives

This part of the journey is about asking if you are measuring the right things today. This seems like an obvious point but reporting systems are often put in place around those things that are easy to measure rather than those things that are important. Over time, systems are replaced but this can make things worse not better. It is all too common for the new system to deliver more or less the same information but in a new technology. Faster, sexier, more charts, in a dashboard, around a scorecard, through a browser but essentially the same old information the organisation has always reported.

To determine what needs measuring requires a more considered approach, a step back. It requires an understanding of the few critical business goals or objectives that are important today. Obviously, this will depend on who in the business you ask. Ask the board and the response is likely to include the strategic objectives and financial outcomes that describe how the company creates value for its shareholders. Ask a functional head and the response is more likely to include those things that contribute to those financial outcomes. Sales management might be looking to "improve sales productivity", engineering to "increase the number of product launches" and marketing to "increase campaign effectiveness".

There are techniques for organising the various objectives across an enterprise into a coherent whole, but that is too large an undertaking for me to cover here. Similarly, I am not advocating a lifetime of paralysis through analysis. At this stage it doesn't matter about the level: some objectives will be strategic, some tactical, some board level and some departmental. Whichever part of the organisation you are engaged with will both constrain and influence the answers to your questions; what matters is to ask about objectives, vision and goals. For example, I recently worked with the European Management Team of a global engineering company. They were tasked with growing revenues in a market that wasn't growing. That obviously meant taking market share.

Their approach to achieving this was to go after larger deals through the direct channel whilst growing their indirect channel to pick up deals in the mid market. This strategy could be articulated as a number of objectives;

  • Grow indirect sales
  • Close more deals of €1m or more
  • Retain existing customers
  • Increase account share

The next step was to identify the metrics by which these objectives could be measured, which included the following (a rough sketch of how such metrics might be computed follows the list);

  • Grow indirect sales (% Indirect Revenue Against Total)
  • Close larger deals (#Deals >=€1m)
  • Retain existing customers (Renewal Retention%)
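
Here is that sketch (toy deal data, invented field names and figures), showing how metrics like these might be computed once the objectives are agreed:

```python
# Hypothetical closed deals for the period.
deals = [
    {"value": 1_500_000, "channel": "direct",   "renewal": False},
    {"value":   400_000, "channel": "indirect", "renewal": True},
    {"value": 2_200_000, "channel": "direct",   "renewal": True},
    {"value":   250_000, "channel": "indirect", "renewal": False},
]
renewals_due = 3  # hypothetical: contracts that came up for renewal this period

total_revenue    = sum(d["value"] for d in deals)
indirect_revenue = sum(d["value"] for d in deals if d["channel"] == "indirect")
large_deals      = sum(1 for d in deals if d["value"] >= 1_000_000)
renewals_won     = sum(1 for d in deals if d["renewal"])

print(f"% Indirect Revenue Against Total: {indirect_revenue / total_revenue:.1%}")
print(f"# Deals >= €1m:                   {large_deals}")
print(f"Renewal Retention %:              {renewals_won / renewals_due:.1%}")
```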

So it is really straightforward. Ask the business what it is trying to achieve, to articulate the objectives and then determine how they can be measured. The example above is fairly high level and strategic but the approach to more tactical requirements is exactly the same. What are you trying to achieve? How are you trying to achieve it? How do you measure it?

The next part of the journey will require that you understand the initiatives that support these objectives and then further, the metrics that drive those.

This series of posts draws greatly from the work of Kaplan and Norton and their approach to Corporate Performance Management using the Balanced Scorecard and Strategy Maps.