
How and What to Measure

Then, shalt thou count to three. No more. No less. Three shalt be the number thou shalt count, and the number of the counting shall be three. Four shalt thou not count, nor either count thou two, excepting that thou then proceed to three. Five is right out. Once the number three, being the third number, be reached, then, lobbest thou thy Holy Hand Grenade of Antioch toward thy foe. - Michael Palin's character in Monty Python and the Holy Grail

Successful businesses measure and count things. I think that's a safe assumption on top of which we can drop the following hypothesis: unsuccessful businesses measure either nothing, the wrong things, or too many things, or they measure the right things but don't communicate the measurements efficiently.

Great lessons about what and how to measure can be found, not surprisingly, within companies whose profitability is entirely built around delivering service X more efficiently than the market. Consider a performance-based marketing company such as LinkShare, Performics or the like. These companies have to religiously measure how well certain kinds of affiliate inventory convert to sales for certain kinds of marketers so that they can deliver lower Cost Per Action (CPA) results more efficiently. Since there are lots of performance-based marketing companies, only the ones that convert at the lowest CPAs will stick around. You will find when you peek inside these companies that they measure trends and exceptions, and have very efficient mechanisms for communicating both throughout the organization in a way that facilitates rapid management decision making.

Of course, every company ultimately needs to execute on its product or service more effectively than the rest of the market in order to be most successful (let's ignore monopolies and oligopolies here), so these lessons can help us understand how to handle measurements and metrics reporting within any company.

Measurements and metrics reports should serve two purposes:

  • Speed up, not slow down, communications within the business
  • Provide leading indicators about the state of the business against which management can make decisions

Metrics reports should NOT:

  • be a platform for rationalization
  • necessarily require meetings to discuss
  • facilitate the illusion of progress

The first two points about the goals of metrics reports shed some light on how our metrics reports should communicate information.

Trends and Exceptions. A year ago, one of our network operations reports was quite detailed. In order to make sure that we never missed an impending issue regarding server load or infrastructure, we had a weekly report that spit out hundreds of numbers. The so-and-so database is only 23% utilized, the so-and-so load balancer is processing N hundred million requests a day, on and on. Yikes. If you looked at this report week to week, you would quickly develop the habit of thinking "there is nothing interesting in this report against which I need to have additional conversations or make decisions", so the report existed mostly to justify a weekly meeting about the report. Bad. How do you improve communications between the network operations team and management, and improve the delivery of information that can help make decisions about the business? Instead of reporting the same hundred metrics week to week, you report only on "trends and exceptions". When you make the network infrastructure report more of a sparse matrix, in which the information showing up today is only there because it's exceptional, it becomes a much better communications and decision tool because you are only seeing things that stick out. Your eyes don't glaze over the numbers. If there aren't any exceptions to highlight, well, then you probably don't need to meet about the lack of exceptions.
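The "trends and exceptions" idea can be sketched in a few lines of Python. The metric names, the trailing-average baseline, and the 25% tolerance below are all hypothetical illustrations, not details from the actual report:

```python
# A minimal sketch of "trends and exceptions" reporting. Only metrics
# that deviate from their recent baseline make it into the report.

def exceptions_report(metrics, history, tolerance=0.25):
    """Return only the metrics whose latest value deviates from the
    trailing average by more than `tolerance` (25% by default)."""
    report = {}
    for name, value in metrics.items():
        past = history.get(name, [])
        if not past:
            continue
        baseline = sum(past) / len(past)
        if baseline and abs(value - baseline) / baseline > tolerance:
            report[name] = (value, baseline)
    return report

# Hypothetical weekly numbers: only db_utilization sticks out,
# so it is the only row anyone has to read.
history = {"db_utilization": [22, 23, 24], "lb_requests_m": [310, 305, 300]}
this_week = {"db_utilization": 61, "lb_requests_m": 302}
print(exceptions_report(this_week, history))
```

In a quiet week the report comes back empty, which is itself the signal that no meeting is needed.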

Green Yellow Red. Keeping in mind that the goal of a metrics report is to make communications more efficient and provide data against which decisions can be made, it is amazing how just adding a simple set of color codes to a detailed metrics report can accomplish both goals. Our most necessarily detailed operations report is color coded with green, yellow, red indicators. Green is "here is a good thing that happened you might want to know about", yellow is "here's an interesting issue, but the team is on top of it, no action required", and red is, as you would expect, "issue requires management attention - here's what we've done so far". Something as simple as color coding a report improves your ability to see patterns in the business and react to them more quickly.
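A minimal sketch of that green/yellow/red convention, assuming made-up line-item fields (`open_issue`, `needs_management_attention`) rather than any real operations schema:

```python
# Tag each report line item with a color so the reader's eye goes
# straight to red, per the conventions described above.

GREEN, YELLOW, RED = "green", "yellow", "red"

def color_code(item):
    """green = good news, yellow = team is on it, no action required,
    red = issue requires management attention."""
    if item["needs_management_attention"]:
        return RED
    if item["open_issue"]:
        return YELLOW
    return GREEN

items = [
    {"name": "failover test passed", "open_issue": False, "needs_management_attention": False},
    {"name": "disk pool near capacity", "open_issue": True, "needs_management_attention": False},
    {"name": "billing outage", "open_issue": True, "needs_management_attention": True},
]
for item in items:
    print(f'[{color_code(item).upper():6}] {item["name"]}')
```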

But certainly there are some kinds of metrics that can't be described in terms of trends, exceptions, and color codes. The classic sales pipeline report, for example, helps the sales team and management understand how well the company is doing against plan and, if done right, serves as a very clear leading indicator of revenue and a great communications tool between sales and management. My one general suggestion around the pipeline report is to avoid the mistake of creating a mechanism for the illusion of progress. What do I mean by that? Only that by creating very detailed levels of customer engagement reporting, ostensibly for the purpose of getting a better picture of how likely a deal is to close, what you can end up with instead is a mechanism for potentially meaningless changes in status.

The best sales pipelines I've seen are very simple and have only a couple of different stages that a lead can go through, along with a simple green/yellow/red on pipeline vs. plan that takes into account historical close percentages on qualified leads. Frequently, management wants more detailed visibility on qualified leads, particularly in big enterprise software deals, and therefore the company will add varying stages to qualified leads, which we can just call A-E or 1-5. A potential customer at stage 5 is very likely to close, and one at stage 1 has perhaps just had a first meeting. While I understand that million-dollar software license pipelines can't just be "unqualified, qualified, and closed", the pipeline report should provide only enough detail to accurately predict results; otherwise you are just facilitating the illusion of progress. I once worked with a company that had very detailed stages that a qualified lead would go through, but of course, these stages are all abstractions until you've signed on dotted lines. Management felt these very detailed stages gave it a better handle on "where this customer was in the process". What you would start to see happen, however, is that the sales team would "manage status" in order to effect the illusion of progress, not even consciously, but just as a way to show effort week to week. We might be on a Monday pipeline call, and two big deals would move from stage 2 to stage 3... "we made a lot of progress last week. I think we can bump GiantCo up from a 33% to a 50% likelihood..." and then we would hear another five-minute explanation of why this deal was now 17% more likely to close. There are two problems with this kind of measurement. First, it's not real unless management has some very detailed sense of the historical rate at which stage 3 deals close vs. stage 2 deals for these exact sales people, and second, and maybe even more importantly, the business becomes more about managing status and soaking up time in meetings around status instead of running the business.

Much better to have one-page pipeline summaries that highlight how the next month and quarter look against plan based on historical close percentages, and then sheets behind that with trends and exceptions that are color coded (e.g., we have three deals that came in last week that are 50% larger than our average deal size; the GiantCo deal has been in the pipeline for 200 days and hasn't closed, so it's yellow because once a deal is more than 200 days old we tend not to close it; etc.).
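The one-page summary might look something like this sketch; the stage names, close rates, 80%-of-plan yellow threshold, and the 200-day staleness rule are illustrative assumptions, not a real forecasting model:

```python
# Expected revenue = sum(deal size * historical close rate for its stage),
# colored against plan; deals older than 200 days are flagged.

HISTORICAL_CLOSE_RATE = {"unqualified": 0.05, "qualified": 0.40}
STALE_AFTER_DAYS = 200

def pipeline_summary(deals, plan):
    """One-page rollup: expected revenue vs. plan plus stale deals."""
    expected = sum(d["size"] * HISTORICAL_CLOSE_RATE[d["stage"]] for d in deals)
    if expected >= plan:
        status = "green"
    elif expected >= 0.8 * plan:
        status = "yellow"
    else:
        status = "red"
    stale = [d["name"] for d in deals if d["age_days"] > STALE_AFTER_DAYS]
    return {"expected": expected, "status": status, "stale": stale}

deals = [
    {"name": "GiantCo", "stage": "qualified", "size": 500_000, "age_days": 210},
    {"name": "SmallCo", "stage": "qualified", "size": 100_000, "age_days": 30},
    {"name": "NewLead", "stage": "unqualified", "size": 250_000, "age_days": 10},
]
print(pipeline_summary(deals, plan=300_000))
```

Note that the forecast comes from historical close rates, not from anyone's opinion about a deal's likelihood, which is exactly the point of the section above.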

The bottom line on metrics is that you want to provide everybody in the company with the best information about the health of the company and create a platform for decision making. By just "measuring everything" you can get into situations in which the business becomes more about the measuring and less about the communications and decision making. The things you DO measure, you should measure religiously and take decisive action when the data speak up.


Comments

Thanks for a great post. I couldn't have said it better.

Color coding describes the shift that a company's management needs to make as the business grows. Labeling a detailed report in green/yellow/red still requires someone to go through the gory details before making sense of trends and exceptions. At first, the core group of employees handles analysis from beginning to end; later, first-level analysis can be delegated and management only makes decisions.

About your description of sales pipeline management – I couldn't agree more!

See my post "You Can Only Measure 3% of What Matters": http://ben.casnocha.com/2007/01/you_can_only_me.html

I link to Tom Peters audioblog where he calls metrics "essential but useless."

When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind: it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the state of science. - Lord Kelvin

Ben, Tom Peters is "quotable but useless". I love the concept about metrics that are created for the illusion of progress. We are doing this right now in our own company and I never knew how to say it was wrong before. This is exactly the problem. We are measuring what gives us the ability to say we're making progress, not what helps the company. Great post!

One comment on the stages of a sales cycle: I agree that a few are better than many. I also think that in complex sales situations, a set of binary events (met with the key decision maker, for example) that must be accomplished before an opportunity can move to the next stage can eliminate the subjective "I think we can move to a 50% probability" BS. I also think that these large deals are binary - they either close or they don't - and assigning a probability of closing is a useless exercise. It's much better to say "We've accomplished tasks X, Y, and Z, and this opportunity can now be called 'qualified', which means we are within 60 days of closing, have developed and submitted a proposal, etc." This results in much more accurate forecasting of individual opportunities, which also makes the overall forecast better.

Every sales guy games the system, exactly as you say. The only way to solve this problem and ensure the accuracy of your pipeline metrics is to define some tangible, objective indicator to mark the progress of an opportunity through each stage of the sales pipeline.

Example - If stage X is "Problem Identification," and stage Y is "Buying Vision," you need the sales exec to get some written or electronic confirmation *from the prospect* indicating that the sales exec's specific characterization of the solution the prospect would buy is accurate. This simple requirement drives a whole set of constructive behaviors in the sales team, in addition to giving you more accurate data.

Not sure who's gaming the system? As a check, look at average close rates by pipeline stage by sales exec. You'll be able to calculate the average for each stage, identify outliers, and help those folks improve their technique or their reporting in such a way that improves overall results.

And that, at the end of the day, is what good measurement is supposed to do.
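The check described in this comment (average close rates by pipeline stage by sales exec, then find the outliers) can be sketched as follows, with hypothetical deal data and an arbitrary 20-point tolerance for calling someone an outlier:

```python
# Compute close rates per (stage, exec) pair, then flag execs whose
# rate is far from the average for that stage.

from collections import defaultdict

def close_rates(deals):
    """Return {(stage, exec): closed / total} from a list of deal dicts."""
    won = defaultdict(int)
    total = defaultdict(int)
    for d in deals:
        key = (d["stage"], d["exec"])
        total[key] += 1
        won[key] += d["closed"]
    return {k: won[k] / total[k] for k in total}

def outliers(rates, tolerance=0.2):
    """Flag (stage, exec) pairs whose close rate is more than
    `tolerance` away from the average close rate for that stage."""
    by_stage = defaultdict(list)
    for (stage, _), rate in rates.items():
        by_stage[stage].append(rate)
    avg = {s: sum(v) / len(v) for s, v in by_stage.items()}
    return [k for k, r in rates.items() if abs(r - avg[k[0]]) > tolerance]

deals = [
    {"stage": "qualified", "exec": "alice", "closed": 1},
    {"stage": "qualified", "exec": "alice", "closed": 1},
    {"stage": "qualified", "exec": "bob", "closed": 0},
    {"stage": "qualified", "exec": "bob", "closed": 0},
]
print(outliers(close_rates(deals)))
```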

Thanks for an excellent article. I particularly like the red green yellow section and the mention of the challenges showing sales pipeline metrics. I've got a few examples on The Dashboard Spy collection of Dashboard Screenshots over at http://www.enterprise-dashboard.com
