
Ron: Ed, I hate to do this to you, but you need heart surgery.

Ed: No. Are you going to do it, Ron? You're not going to do it, are you?

Ron: No, I don't specialize in that. You've gone online like a good consumer, WebMD, Mayo Clinic, and you've even checked local hospitals in your area. You've talked to some people, your doctors, nurses, friends, colleagues, all of that type of thing. You've narrowed it down to two surgeons. You find in the published literature on websites, doctor scorecards, that surgeon A has a 65% mortality rate for his heart patients, and surgeon B has a 25% mortality rate. Of course, the mortality rate is the risk of dying from the surgery. Which surgeon would you choose?

Ed: I guess the short answer would be that you'd go with the one with the lower mortality rate, but in reality you would say, "I need to ask some more questions, don't I?" Because I need to know if maybe the surgeon with the 65% mortality rate takes on the hardest cases. Therefore, that's the person I should go to. If he or she were specializing in the kind of case I had, I guess I'd still have a 65% chance, but maybe with somebody else doing that surgery the risk is way higher.

Ron: You're exactly right. In other words, in the thought experiment you don't have enough information. You would want to dive deeper. I would want to know, for instance, whether the surgeon with the 25% rate is pre-screening patients and only taking on the easy cases.

Ed: There's the other side of it. [inaudible 00:01:37] the system.

Ron: Like you say, surgeon A with the higher death rate could be taking on the far more complicated cases, the ones that are beyond hope, and he could be the more skilled surgeon. The point is that the problem with measurements is that they can not only drive out judgment, they can give us a false sense of knowledge. They can also provide the illusion of accuracy. I've equated this, Ed, to something we again borrow from the medical community, the iatrogenic illnesses that we talked about, but here I've borrowed a concept from the insurance industry called moral hazard. A moral hazard is when people have an incentive to take more risks or act more carelessly because they're insured. Fire insurance can cause arson. Unemployment insurance, there's no doubt, leads to people being unemployed longer. Life insurance can lead to suicide or even murder in extreme cases. With federal disaster insurance, people will build on a flood plain because now they're incentivized to do it. These are all what actuaries call moral risks. It's a huge problem. I think the same types of risks exist when we're talking about measurements, because just like the surgeon question, looking at those two numbers side by side, you think, I'm going to go with the lower one. You're being driven to something reckless or careless because you're not basing it on reality.
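
Ron's surgeon example is a textbook case-mix problem. Here is a minimal Python sketch, with made-up numbers, showing how a surgeon who is better in every risk category can still post the worse overall mortality rate, which is Simpson's paradox:

```python
# Hypothetical case-mix numbers (not from the show): surgeon A is better
# within every risk category, yet shows the worse overall rate because
# A takes on the hard cases. This is Simpson's paradox.

cases = {
    # risk level: list of (surgeon, deaths, patients)
    "hard": [("A", 60, 100), ("B", 7, 10)],
    "easy": [("A", 1, 10), ("B", 18, 100)],
}

totals = {"A": [0, 0], "B": [0, 0]}
for risk, rows in cases.items():
    for surgeon, deaths, patients in rows:
        print(f"{risk} cases, surgeon {surgeon}: {deaths / patients:.0%} mortality")
        totals[surgeon][0] += deaths
        totals[surgeon][1] += patients

for surgeon, (deaths, patients) in totals.items():
    # A: 61/110 = 55% overall; B: 25/110 = 23% overall, despite A being
    # better on both hard (60% vs 70%) and easy (10% vs 18%) cases.
    print(f"overall, surgeon {surgeon}: {deaths / patients:.0%} mortality")
```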

Ed: I'm reminded of a funny line that we'll throw away sometimes in presentations: 57.8% of all statistics are made up. It's important when you use that joke to do the point eight, because it's a double joke. The whole idea is that if you have to go to a decimal place on a percentage, it's the illusion of accuracy.

Ron: Yes.

Ed: Right? It's the illusion of accuracy. No one's going to go, "I really needed to know that it was 57.8%. Thanks."

Ron: This goes back, and I know we've touched on this theme before, to the McKinsey maxim, the statement that what you can measure you can manage. I know that Peter Drucker never said this.

Ed: Thank you. Thank you.

Ron: Peter Drucker never said it. He never wrote it. He didn't believe it, by the way. What Drucker said was that what you measure is what you'll get. Not quite the same thing. The McKinsey maxim, what you can measure you can manage: I used to fervently believe this. I used to say it to my customers all the time. This was my worldview. Now I think it's very, very dangerous, because it implies that if we just measure things we can manage them or control them better. This is nonsense. You don't change your weight by weighing yourself more frequently or more accurately. I think the whole problem with measurements is that we substitute measurements for thinking. It crowds out thinking.

Ed: It gives a lot of people the illusion of control. That's the phrase I like to use. You have the illusion of control because you have some measurement system in place. I think you're correct. As with many of our shows, we have to state that we're not saying any and all measurement in business is bad. We're not stating that.

Ron: I couldn't state that, Ed. The accountant in me wouldn't let me. It's a nonsense statement anyway if you think about it. We've always measured things in business; ever since commerce has been around we've been measuring things, counting things. The question is what's important and how those measurements should be structured. That's what we want to talk about.

Ed: Especially in knowledge work, especially in knowledge work, because way too often there are leaders and managers who say, "How are we going to measure that? How are we going to measure that?" We're trying to do an innovation here. We're trying to do some outside-the-box thinking, to coin a phrase. Measurement isn't really the be-all and end-all here. It's really about creativity and innovation. We have to sometimes fail. We have to feel comfortable with at least taking a shot at something, and maybe it's not going to work out. Let's not worry about it.

Ron: Right. As Drucker pointed out, the decision, in effect, is a judgment. What you start with is not so much the numbers or the measurements; you start with opinions, which are basically hypotheses, and then you test those hypotheses. What we're saying is that your measurements, whatever they might be, need to be linked to a theory, a testable hypothesis of cause and effect. It's the theory that guides what we measure; I think Einstein said that, or something close to it. It's the theory that drives the measurement, not the other way around.

Ed: We can't let the entrenched, incumbent theory rule our minds because we've always done it that way. We've always measured that, so that must be the right thing. That's what I see: there's no questioning of the measurement. There's no asking whether it's the right measurement, in a lot of cases. It's just, this is what we should measure because that's what we've been told to measure. The other thing I see is that very few times do measurements fall off the board, with the whole balanced scorecard and these dashboards, which I think are great; the company I work for sells them. The challenge is that 57 different key performance indicators means you're going to look at none of them.

Ron: Yes, 57 equals zero at that point.

Ed: The dashboard is exactly that. It's supposed to be the dashboard of the five things that you need to know when you're driving the car. Speed, oil pressure. It's a limited number of things.

Ron: They're the critical things, basically. You certainly wouldn't want to look at last month's oil pressure and fuel.

Ed: In arrears. That's good.  How much gas did I have last week at this time? 

Ron: Which is what accountants love to do, because we love to play historians with bad memories. To your point about how statistics become entrenched, one of the largest statistical undertakings ever by the US government was the attempt to measure the size of the communist economies. I just want to give the example of East Germany. In the 1989 edition of the Statistical Abstract of the United States, they reported that East Germany's economy was about the same size as West Germany's and even had a higher per capita GDP. They had thought this for a long time. Of course, 1989 was the year the Berlin Wall fell. Any taxicab driver going back and forth through Checkpoint Charlie could tell you that East Germany was obviously inferior to West Germany, yet the statistics portrayed just the opposite. It was a colossal failure of a statistical undertaking. It just shows you how statistics and numbers and measurements can absolutely mislead us.

Ed: Who is the guy? Science is the belief in the ignorance of experts. Feynman, Feynman.

Ron: Yes, yes. 

Ed: That's exactly what happened in the case of East Germany. These experts, and they were all experts, I suppose, were trying to figure this out and didn't get very far.

Ron: Not dumb people.

Ed: No, no, no. 

Ron: This drew on economists in the Census Bureau and the Department of Commerce, and even the CIA had a hand in the statistical project, but they were just way off. That's one of the things that's not talked about a lot when we talk about measurements. Like you say, you put out a number with a decimal place, or better yet two decimal places, and then it looks very, very precise. Yet there's an enormous amount of error in our measurements. Look at how many times the GDP is revised after it's first published, or the unemployment rate, or the number of jobs created. The revisions sometimes dwarf the change.

Ed: Retrospectively. That's happened in a lot of cases too, hasn't it, where they go back and restate stuff and it wasn't even close the first time.

Ron: In fact, for the first quarter of this year, wasn't it that they first reported 1.5% growth in GDP, and then they revised it to negative growth for the quarter?

Ed: Whoops.

Ron: That's a pretty big swing, and that's not all that uncommon when you're talking about statistics and measurement. One of the things I would love to do today is talk about some of the moral hazards of measurements. This is something, Ed, I wrote about in the book that you contributed to, and maybe you can talk about this from the Bill James sabermetrics standpoint. Baseball is a game that's fixated on measurements, isn't it?

Ed: Oh, it is. We love it. My 8-year-old son is now totally into it. He has the Total Baseball books that I used to have and is poring over the statistics. They actually glow when you open these books. The point is that you can't determine what any individual player is going to do on the field at any particular time against any particular pitcher, no matter what statistics you have, because in just about every baseball game I watch I say, "Never seen that before."

Ron: That's why we watch it, right?

Ed: Right.

Ron: If we knew what was going to happen, how boring would that be? This is why innovation and creativity should take us by surprise.

Ed: Exactly. Exactly. 

Ron: If we could plan it, we wouldn't need it. Folks, what we're going to do is talk about the seven moral hazards of measurement. I don't know if we're going to get to all of them, Ed, but we'll give it our best shot. This is something, again, that I wrote about in my book Measure What Matters to Customers. Ed, you contributed to that book with the discussion of sabermetrics. I wrote that book because I wanted to refute the McKinsey maxim, the idea that what you can measure you can manage. For me, it was a cathartic process because I was renouncing something I had believed my entire professional career. Now I actually think it's kind of dangerous.

Ed: Stanley Marcus was the son of one of the founders of Neiman Marcus. As he was helping the store make it through the Great Depression, he used to have a saying: a market never came into his store and bought anything, but a lot of customers sure did. That really leads us to the first moral hazard, which is we can count customers but not individuals. What did you mean by that, Ron?

Ron: I love that. He did say that: I've never seen a market walk into my store, but a lot of customers have, and they bought things and made me a rich man. His point was that we tend to aggregate people and call them consumers or customers, or give them an amorphous term like the market. He said it's really just human beings. It reminds me of what the singer Joan Baez used to say: having a relationship with 100,000 people is no problem; it's having the relationship with one person that's really, really hard. You can turn that around and look at Stalin's famous remark, and of course I don't think he actually said it, there's a lot of debate about it: one death is a tragedy, whereas a million is a statistic. Once we get past the individual human being, we just become these globs of statistics and measures and aggregates. That doesn't describe the flesh and blood of the economic system.

Ed: No. This manifests itself in interesting ways in organizations. I call it collective noun syndrome. You'll hear things like, sales just doesn't get it.

Ron: Yes.

Ed: Operations is really slowing us down, as if those collectives were entities unto themselves. Who in sales? Who in operations? There's a person here. We keep coming back to this mantra: nobody here but us people. That's the whole point of this enterprise, that we're made of people here, not just these numbers. The numbers tend to give some, I don't know, satisfaction, that we can say, "At least we measured it, at least we counted it."

Ron: Right. It's like McDonald's. Over a billion hamburgers served, or whatever it is now. Okay, great. When are they going to stop this? The way I think about it is, if you look at a company like GM or Toyota, they sell roughly, I don't know, between nine and ten million cars a year worldwide, something like that. You know what? Those are sold one at a time, in effect. Basically one at a time, one customer at a time. We can aggregate them and throw them into these analyses, but you have to deal with the human component here. I think that's the first moral hazard: it's easy to look at aggregates, but it misses the individuality. It's like that old saying I love, that I can prove on average everybody in the world has one testicle.

Ed: Statistically true.

Ron: Statistically I'm absolutely right, but if I believe that as a sentient human being, I'm an idiot. It's like if you have a conference room and some people are complaining that it's too hot while others are complaining that it's too cold. You can't sit there, aggregate them, and say, "On average, you should all be all right."
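
A tiny sketch of that conference-room point, with made-up numbers: the average says everyone should be fine while no individual actually is.

```python
# Made-up "feels-like" temperatures for six people in one room:
# three are cold, three are hot, yet the mean looks comfortable.

temps_felt = [16, 16, 17, 27, 28, 28]  # degrees C
mean = sum(temps_felt) / len(temps_felt)
comfortable = [t for t in temps_felt if 20 <= t <= 24]

print(f"average feels-like: {mean:.1f} C")                              # 22.0, "fine on average"
print(f"people comfortable: {len(comfortable)} of {len(temps_felt)}")   # 0 of 6
```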

Ed: This has manifested itself differently in more recent cases under the heading of personas. Have you gotten into any conversations with people about this? We have to develop our marketing persona.

Ron: Yes. 

Ed: Right?

Ron: Yes.

Ed: Wait a minute. I actually much prefer an exercise where we look at the current customers that we do have and name them. Fred, Ethel, Lucy and Ricky. 

Ron: Right, right. 

Ed: These are the customers; let's look at them, and then maybe try to predict and extrapolate what we can do for them tomorrow or five years from now. To come up with these personas as if it's somehow helpful, I think it's gobbledygook, in my opinion.

Ron: I agree. You know, Ed, this is one area where big data or analytics might help us, because they're getting so good at tracking us on mobile devices, even where we are and what we're searching, things like that, that they can target things to your specific situation. Now, that can get creepy, no doubt about it. But it does allow us to treat people a little more individually rather than just as masses of aggregates.

Ed: Yes. Dude, I have to tell you, so here's the deal. About six weeks ago, I'm flipping through my Facebook at night. I was born in Brooklyn, grew up on Long Island, huge New York Mets fan. We're talking about the whole baseball thing.

Ron: Sure.

Ed: I now live in Texas, right? I get an ad on my Facebook, and this is the ad, I kid you not: a New York Mets T-shirt, but the Mets logo is carved out and superimposed over the outline of the state of Texas.

Ron: Wow. Wow.

Ed: I'm like, "I have to get me this T-shirt." It's like $25 for the T-shirt. I have to have this. It turns out this is some of the stuff, and why Facebook's market cap is through the roof on this: these are things emerging out of this called dark posts.

Ron: Yes.

Ed: This is where we're getting to: we're not just talking about consumers anymore. We're looking at targeting individuals. I just thought it was a brilliant strategy, and I think that's why Facebook is in such good shape right now, because they look at this as individuals.

Ron: Sure, sure. They know where you're searching on the web. They know what you're doing, and then they can sell advertisers the ability to target exactly what you want. I know there's a fine line there with privacy and all that, but the point is that I think it's going to become easier to treat us more like individuals.

Ed: No doubt.

Ron: That was really Stanley Marcus' point. Customers don't want to be treated the same. They want to be treated individually. Look at the companies that do treat us all the same: it's usually the postal service or the cable companies. I would not hold out either of those as great models of service organizations.

Ed: Not usually.

Ron: That's our first moral hazard, folks: we can count customers but not individuals. That's the point.

Ed: Sure.

Ron: Then the second moral hazard is that you change what you measure.

Ed: Yes.

Ron: We talked about this a little bit in the prep for the show: the Heisenberg uncertainty principle, the observer effect.

Ed: Right.

Ron: The fact that when you have people in lab coats sitting around experimenting, whether it's a human experiment or even a physics experiment, they can have an influence on the measurements, can't they?

Ed: Yeah. As we looked this up, the Heisenberg principle is very specific to particle physics. It's not a one-for-one analogy here, because it says that as you're trying to observe the position of a particle, just by looking at it you're affecting the speed of the particle, the trajectory of the particle. And vice versa: if you try to pin down the trajectory, the momentum, I'm sorry, that's what it is, the momentum of the particle, you can't know the position. It's really odd. It is related to what's called the observer effect, which is that in certain experiments, especially at the extraordinarily small level, the instrument doing the measuring interferes with the measurement itself. Let's face it, if you're going to measure something to a very specific degree, say temperature, you have to use some kind of instrument to do that. The instrument itself must have a temperature.
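
For reference, the uncertainty relation pairs position with momentum: the more tightly you pin down one, the less you can know the other. In symbols,

$$\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},$$

where $\Delta x$ is the uncertainty in position, $\Delta p$ the uncertainty in momentum, and $\hbar$ the reduced Planck constant.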

Ron: Right.

Ed: Right? It's going to affect the outcome of what it is you're trying to measure. It's pretty interesting. Yes, we definitely do change what we measure. This goes back to what Drucker said: what you measure is what you'll get. That's what he did say.

Ron: Right. Central bankers have a law, Goodhart's law, which says that any target that is set quickly loses its meaning because it becomes manipulated over time. We humans are scamps. If you put a numerical target on us, we're going to find ways to game the system, and that's what we do, whether through manipulation, malfeasance, or misfeasance. We'll figure out a way to manipulate it. That is also a big part of you change what you measure.

Ed: That's right. This is why I think a salesperson compensation system is always changing. Once you put something in place, and this is not a knock on salespeople, this is just human beings responding to incentives, we've talked about that over and over again in multiple shows, they're going to respond to incentives, and they'll figure out a way to game certain things so it acts in their favor. Then, in order to fix that, to make a correction, management has to go in and change the system again, because they can't keep it from being gamed.
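
A toy model, not from the show, of what Ron and Ed are describing: once pay keys on a proxy number, effort rationally shifts toward whatever moves the number, and the real outcome the metric stood for decays even as the metric improves.

```python
# Goodhart's law as a two-line toy economy: effort can go into real work
# or into gaming the proxy, and the proxy rewards gaming twice as fast.

def observed_metric(real_effort: float, gaming_effort: float) -> float:
    # The proxy responds to both genuine work and pure gaming.
    return real_effort + 2.0 * gaming_effort

def true_value(real_effort: float) -> float:
    # Only genuine work creates value.
    return real_effort

# Before a target exists: all effort goes into real work.
print("no target  -> metric:", observed_metric(1.0, 0.0), " value:", true_value(1.0))

# After a numerical target is set: the rational move is to shift effort
# toward whatever moves the number fastest. The metric rises, value falls.
print("with target-> metric:", observed_metric(0.3, 0.7), " value:", true_value(0.3))
```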

Ron: Right, right. It also brings up an interesting point: if you have some type of change management going on in your organization, so you want to create new behaviors, and you stick with your old measurements, you're probably going to get the old behaviors and the old results.

Ed: Yup.

Ron: In other words, you have to change the measurements too. Like you said with the first moral hazard, our measurements become entrenched over time. We're not willing to dislodge them so easily.

Ed: They do, they do. Then, what does happen, and one of the great examples of this is the CompStat policing that took place in New York and other cities in the 1990s, where every day, every week, the police at the top level, the lieutenants and the captains, were reporting right to the commissioner about crimes in their area and what they were doing to take care of this, this, and this. They were raked over the coals for it. Sure enough, they went out and arrested a ton of people, because if that's what you get paid for and you're told that's what you should be doing, you're going to arrest everyone. I guess crime went down, but the rate of petty offenses, and people being locked up for long periods for relatively small nonviolent crimes, I don't know if that's all that great for society overall.

Ron: Right. It does really illustrate the point that you better be darn careful about what you're measuring because that's what you're going to end up getting.

Ed: Yup, yup.

Ron: That leads to the third moral hazard. We'll probably have to take a break before we get through all of this, but the third moral hazard is that measures crowd out intuition and insight. I think the doctor example illustrates this well. You put two very precise statistics in front of people, and they just pick the lower or higher one without asking any more questions. There's also another statistic, one the government puts out, that I believe illustrates this better than anything. That's what we're going to talk about next, Ed.

Ron: We're talking about the seven moral hazards of measurements, and so far, Ed, we've done two. Moral hazard number one is we can count customers but not individuals, and the second one was you change what you measure. The third is measures crowd out intuition and insight. I think this goes to your point, Ed, about how our measurements can become so entrenched, because one of the measures we keep as a society is the poverty rate. A lot of people don't really understand what this measurement is or, indeed, where it came from. A woman by the name of Mollie Orshansky from the Social Security Administration came up with it in the early 1960s. She decided that the poverty threshold would be set at an arbitrary three times the cost of the US Department of Agriculture's economy food plan. She took this economy food plan, I guess it was based on a certain caloric intake and the right types of food groups and all of that, multiplied it by three, and said that's the poverty line. A guy by the name of Nicholas Eberstadt, a fellow at the American Enterprise Institute, wrote a fantastic book called The Tyranny of Numbers. He talks about a lot of these problems with the measurement system. He says this is probably the single worst measurement in our government's statistical arsenal, because it looks at the income of the poor and not their consumption. If you want to figure out somebody's standard of living, you don't look at their income, you look at their consumption. Think about a kid. Kids have no income. I can't run around, though, and say your kids are poor.
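
The Orshansky construction Ron describes is a one-line formula. A sketch with hypothetical dollar figures also shows Eberstadt's objection: screening on income and screening on consumption can classify the same household differently.

```python
# The poverty threshold as Orshansky built it: an arbitrary 3x multiplier
# on the cost of the USDA economy food plan. All figures below are
# hypothetical, for illustration only.

food_plan_annual_cost = 1_000.0   # hypothetical annual food-plan cost
poverty_threshold = 3 * food_plan_annual_cost

# Eberstadt's objection: the official measure compares income, not
# consumption, so the same household can flip categories.
household_income = 2_500.0        # below the threshold -> counted as poor
household_consumption = 3_500.0   # above the threshold -> arguably not poor

print("poor by income?     ", household_income < poverty_threshold)       # True
print("poor by consumption?", household_consumption < poverty_threshold)  # False
```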

Ed: Yes. I've seen this. There's lots of dispute about this, back and forth. I've seen people argue that in the United States one of the signs of poverty has become obesity, which you have to take a step back from and go, wait, wait. I personally don't think that's necessarily statistically true, but the fact that it has even entered as a meme into our thinking, that folks who are in poverty … because they tend to eat poorly. They make poor food choices, but they're getting too many calories, or they're too sedentary. This really throws us for a loop. You're right. It crowds out this whole idea of judgment and insight. That's really the problem, I think, in businesses: they just turn to the numbers and say, "What do the numbers say?" I've heard people say that. What do the numbers say?

Ron: The obsession with, we have to have numbers, more analysis, more measurement before we can make a decision. It's like, no, it's probably not going to help you past a certain point. It's going to start clouding your thinking.

Ed: I was working with an organization recently, this very week, where this exact thing happened. This poor gal that I was working with is now on her sixth iteration of number creation around a decision that needs to be made. I was finally coaching her through this process. I said, "Look, this is not about six different ways of looking at the numbers. This is clearly the leader in this organization not wanting to make a difficult decision."

Ron: Right, right. It's almost like we need more analysis so I can put off making this decision.

Ed: That's exactly right. The sole purpose of it was … the measurement activity then not only crowds out the intuition but becomes a reason for deferral.

Ron: Right. Defensive decision making. I think Rory Sutherland talked about that as well when we spoke with him. Ed, just parenthetically, to close off this poverty rate thing: Nicholas Eberstadt, the guy who wrote The Tyranny of Numbers, recomputed the poverty rate based on consumption. Now, I believe the government keeps various poverty statistics, and one of them does do this: it looks at the poor's consumption, not their reported income. If you do it that way, the poverty rate goes down to 2% to 3%. It yields dramatically different results simply depending on how you measure it. That's part of the problem with this.

Ed: Even absolute versus relative poverty too. 

Ron: It brings up that topic as well. You've said many times, I've heard you say, that you'd rather be poor in America today than anywhere else in the world.

Ed: Right.

Ron: Yeah, I think one person in India said, "I want to go to America, where the poor people are fat."

Ed: Right.

Ron: It also brings up another point about this idea that measurement crowds out intuition and insight. If you've ever been bumped off, or it's more accurate to say bribed off, an oversold airplane flight, you have a guy by the name of Julian Simon to thank for that. He was an economist. He's no longer with us.

Ed: Brilliant guy.

Ron: Absolutely brilliant guy. He was talking to, I think, a United flight attendant. This was in the late 60s or early 70s, when the airlines were still being regulated by the Civil Aeronautics Board. She was talking about the problem of oversold flights. The airlines obviously had tons of data on this, and they knew the odds of an oversold flight, because it was a problem that fed on itself. Because there was a probability of being bumped, you would book multiple flights, maybe even under different names, because back then we didn't have the security procedures we do today. That just fed the problem, because the airlines would sell more seats as fewer people showed up, and even more flights became oversold. Julian Simon's shaving the next morning, and he comes up with a solution to this problem that had vexed the airlines for years. In fact, their theory, and it was pretty funny, was that they would bump old people and military people on the theory that they'd be the least likely to complain.

Ed: It's a theory.

Ron: It was probably a really good one, because those were probably two of the nicer segments of the population. What Simon came up with is: why don't we just do a reverse auction, give the seat to the person who values it the most, and pay the person who values it the least to get off the flight? When he wrote to the airlines with this idea, and he wrote to the Civil Aeronautics Board as well, they all told him he was nuts, that it would never work, or they denied that they overbooked. He said they wouldn't even run an experiment on it. Of course, today this is common. Once they figured it out and tested it and it worked, every airline around the world adopted it, because it works. The point is that he didn't come up with it poring through statistics and numbers. None of that would have helped him. He came up with it by having a theory about human behavior and testing it.
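
A minimal sketch of Simon's reverse-auction idea, with hypothetical passengers and bids: each volunteer names the payment they would accept to take a later flight, and the airline buys back the cheapest seats, which leaves the seats with the people who value them most.

```python
# Reverse auction for an oversold flight: rank volunteers by the
# compensation they demand, cheapest first, and pay off just enough
# of them to clear the overbooking.

def clear_overbooking(bids: dict[str, float], seats_short: int) -> list[tuple[str, float]]:
    # Sort volunteers by their asking price, lowest first.
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    return ranked[:seats_short]

# Hypothetical asking prices from four ticketed passengers.
bids = {"Alice": 150.0, "Bob": 800.0, "Carol": 250.0, "Dave": 90.0}

for name, payment in clear_overbooking(bids, seats_short=2):
    print(f"{name} gives up the seat for ${payment:.0f}")
# Dave ($90) and Alice ($150) are bribed off; Bob, who values his
# seat most, keeps it.
```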

Ed: Right. This picks up a good thought. Why can't organizations continue to do this? It's a brilliant pricing strategy too, by the way. Right?

Ron: It is.

Ed: The reason they bump you off for the $250 is that they've probably sold a $2,500 first-class ticket, which then has a cascading effect: the guy who thought he was going to get the upgrade isn't, and then the guy behind him, all the way to the back of the plane. For $250, they keep a passenger who's paying full fare, full boat, at $2,500. Pretty good deal.

Ron: Yes. You've probably sat on commuter flights or whatever that tend to be stuffed with business passengers. When they come on and try to bribe somebody off, everybody sits there with their hands folded and says, "Come on, you can do better than that." They keep raising the price.

Ed: Right, exactly. Why can't firms do this in cases where they have a high level of difficulty managing their capacity? Why not bribe off someone who's placed an order and say, "Listen, we'll just refund you 10% if you don't mind taking delivery two weeks later"?

Ron: Absolutely. I think sophisticated organizations do do this. Certainly the airlines and hotels; this is what revenue management, yield management, is all about. It's demand-driven pricing. It's one way to manage capacity through pricing.

Ed: Right.

Ron: I was really intrigued with Simon's story because he wrote about this in his autobiography. I didn't realize it was him who had the idea, or just how long it took him to even get an airline to run an experiment. I actually think he got the CAB to agree with him and force an airline to run an experiment. Of course, now every airline swears by this method.

Ed: Right, right. It just comes down to this: all measurements are really judgments in disguise.

Ron: They really are.

Ed: The judgment comes first, and then the measurement comes second. I think what we're really calling for here is not a lack of measurement, but just making sure that before you affirm a measurement, or keep an entrenched measurement around, you ask, "Is this the right thing we want to measure?" Let's go back to the judgment every so often and question whether it's the right measurement.

Ron: Yup. That brings us to the fourth moral hazard, which is that measures are unreliable. I think we've talked in the past about the idea that in this country when a sheep is born per capita GDP goes up, but when a baby's born it goes down.

Ed: That's all you need to know to understand economists are really screwed up people. 

Ron: Absolutely. We have all these economic statistics, and if they've been fudged or made up, or there are errors in them, those errors can just carry over year after year, decade after decade. Probably my favorite example of how measures can be unreliable, and this goes back to that quote, statistics lie and liars use statistics, is Bain and Company. If you go on their website, it used to be right on their homepage, it isn't anymore, I checked this morning; it's under our results or something on their web page. It says our clients outperform the market four to one. They show this little graph with the S&P 500 index, and then the Bain and Company client returns against the S&P, and of course it's four to one. I'm looking at this going: these are business consultants, strategy consultants, really smart people who have probably taken some statistics courses and probably understand that correlation and causation aren't necessarily linked. Wet streets don't cause rain. This is the equivalent of the rooster taking credit for the sunrise because he crows every morning. I'd be willing to bet that Bain's clients outperform the S&P 500, and thus they have more money to hire Bain.
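
A toy simulation of Ron's rooster point, my construction and not Bain's actual data: if a consultancy's clients are simply drawn from firms that were already outperforming, the client average beats the index with zero causal effect from the consulting.

```python
# Selection bias alone can manufacture "our clients beat the market":
# sign only the firms rich enough to afford you (the top performers)
# and apply no consulting effect whatsoever.

import random

random.seed(42)

# Hypothetical universe of annual firm returns, mean ~8% (the "index").
firms = [random.gauss(0.08, 0.05) for _ in range(10_000)]
index_return = sum(firms) / len(firms)

# The consultancy's book of business: just the top 5% of performers.
clients = sorted(firms, reverse=True)[:500]
client_return = sum(clients) / len(clients)

print(f"index  : {index_return:.1%}")
print(f"clients: {client_return:.1%}  (selection alone, no consulting effect)")
```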

Ed: In Excel, we call that a circular reference. Right? That's the challenge there. The other great example of this is something near and dear to both our hearts, which is professional firms that keep timesheets. In every session I give, I ask this question: have you ever lied on a timesheet? Every hand goes up. Most times, have you ever lied on a timesheet this week? Every hand goes up. People put down how much they think they should put down, or in some cases what they can get away with, and it's not always a bad thing. They're not always trying to pad their timesheets with extra stuff. Sometimes they're saying, if I put down that it actually took me six hours, my boss is going to think I'm a moron, so I'm only going to put three. In many cases, it goes both up and down.

Ron: Or, if I finished in half the time they budgeted for me, I'd go play golf and still write in the full budget amount, or take a longer lunch, or whatever.

Ed: Exactly.

Ron: As one of our colleagues said, these errors don't cancel themselves out.

Ed: No, they don't. They don't. Moral hazard number five is the more we measure, the less we can compare. Comparing information has a place, but it has to be tempered with a theory of what is being observed, whether there's a good reason for it, and an understanding of the underlying causes. I think the biggest place, Ron, that this manifests itself is in the not-for-profit space, believe it or not. I know you and I have both heard this, and a guy by the name of Dan Pallotta has written a great book called Uncharitable: How Restraints on Nonprofits Undermine Their Potential. Whenever something happens, a catastrophe in the world, people on Facebook go, I'm going to give to so-and-so charity, and someone will inevitably say, how much of the dollar that I give is going to get to the actual people?

Ron: Right.

Ed: Is going to go to the actual cause, because I don't want to spend my dollar and have 40% of it go to administrative salaries. Pallotta, in his TED talk and in his book, I believe, also makes this great point: if we want to get some of the best and brightest working on some of the challenging problems that really face society, then we're going to have to start paying people in line with that. He says, if you want to make 50 million dollars selling violent video games, we say absolutely go for it, but if you want to make half a million trying to cure malaria, well, then you're a parasite. You can't make half a million dollars trying to cure malaria. That's just not right.

Ron: Right. He's so right about this. The measurement crowds out the result. Just because 90 cents of every dollar goes to the cause doesn't mean the charity's effective. He throws out a thought experiment, and I think it's a very, very compelling point: if the Jonas Salk Foundation found the cure for polio but spent 80% of its donations on overhead, would you care?

Ed: Yeah. The answer should be a resounding no, but people say yes. They say, "Yeah, that would matter."

Ron: I know, but the measurement is crowding out the judgment of the results. I think that's the point: the more we measure, the less we can actually compare. Just like the doctor example: I compared a 65% to a 25% mortality rate, but there's more to it than that.

Ed: Right.

Ron: That takes us to the sixth moral hazard, Ed, which is the more intellectual the capital, the less you can measure it. We talked about this in our show The Economy in Mind, and this is why we've titled the show Business in the Knowledge Economy, because now we know 80% of the developed world's wealth resides in human capital. If you think about accounting statements, they don't measure this. They can't measure it. In fact, what they do with human capital is expense it as salaries and wages on the income statement. It never shows up as part of the wealth of a company. This is why the accounting statements, the income statement, the balance sheet, and the cash flow statement, are called the three blind mice: they don't really comprehend the fair market value of a business. All they're designed to do is record the value of a transaction that's already taken place. They can't peer into the future and look at value, because accounting is not a theory. It's simply a measurement.

Ed: It's an identity equation, as you say. It gets worse, because it gets extrapolated out. You and I have talked to thousands of accountants. In the end, in accounting, debits equal credits, right? Then that gets extrapolated: if I have a sale on my books, it's a debit or credit for me, and for somebody else it's just an expense, the same debit or credit on their side, and the amount is the same. Therefore, zero-sum thinking, and we're back to that again, because it has to be the same. There's no measurement of where any of this stuff comes from. I love what David Boyle wrote in The Sum of Our Discontent, where he says decisions by numbers are a bit like painting by numbers. They don't make for good art. I don't think they make for good decisions either.

Ron: No, they don't. If you look at data, reason, calculation, measurement, all these things, they can only produce conclusions. They don't inspire action. They're not going to inspire you to be creative, or do something new, or step out on the ledge and take a risk. Like Clayton Christensen says, data is by definition about the past, and yet if we want to peer into the future we have to have a theory, some type of hypothesis that if we do this, then we're going to create a whole new market. I'm thinking of Apple with the iPod or the iPad.

Ed: Right. Then, in the last few minutes we have left, that leads directly into moral hazard number seven, which is that measurements are lagging by definition, especially with accounting, because it is called accounting, as in accounting for yesterday.

Ron: This is something we've talked a lot about, lagging indicators. Any measurement is almost by definition lagging; unless you're getting it in real time, in which case it can become a coincident indicator. To use a lagging indicator to run your business is like using your smoke alarm to time your cookies. We have to understand that even if we're looking at benchmarking data and our competition, these are all lagging indicators, and you're only looking at the result. You're not really analyzing the cause or the process that led to that result. My favorite example of this, Ed, comes from the economist Walter Williams. He says: suppose you had three guys who play poker regularly on a weekly basis, and the first guy won 75% of the time, the second guy won 20% of the time, and the third guy won 5% of the time. What conclusions could you draw about that game just looking at the results, the numbers, the measurement of winning? He says you can't draw anything. You could say maybe A is a great poker player, or maybe C is a terrible poker player, or maybe A is cheating, but you can't draw any of those conclusions until you know something about the process.
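
A small simulation of Williams's poker point, my construction rather than his: two entirely different processes, honest skill and cheating, produce the same 75/20/5 scoreboard, so the results alone cannot tell you which one you're watching.

```python
# Two generating processes with identical win-rate results.

import random

def weekly_winner(process: str) -> str:
    if process == "skill":
        # A is simply the better player.
        return random.choices("ABC", weights=[75, 20, 5])[0]
    # "cheating": A stacks the deck to win 75% of the time, and honest
    # play decides the remainder between B and C.
    if random.random() < 0.75:
        return "A"
    return random.choices("BC", weights=[20, 5])[0]

for process in ("skill", "cheating"):
    random.seed(0)
    wins = {"A": 0, "B": 0, "C": 0}
    for _ in range(10_000):
        wins[weekly_winner(process)] += 1
    # Both processes print roughly A: 75%, B: 20%, C: 5%.
    print(process, {p: f"{n / 10_000:.0%}" for p, n in wins.items()})
```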

Ed: Right. 

Ron: It's not enough just to look at the result. You have to understand the process. That's one of the problems with the fact that all of this data and measurements are lagging indicators. 

Ed: Exactly. Let's spend a few minutes, if you will, on Shawn's email that he sent us. By the way, we do post what our next show is going to be about four or five days ahead of time at verasage.com/tsoe. If there's a topic that's of interest to you and you'd like us to answer your question on the show while we talk about it, rather than retrospectively, please pay attention to verasage.com/tsoe and feel free to send us your question. Shawn says he's very interested in this topic because he has a presentation about KPIs next week. He believes they have value, but he also believes there's an inherent risk, and this is what we were talking about, if you measure the wrong thing or rely on them too much for decision making. I love what he says here: "I believe KPIs are a good tool in a toolbox that a manager can use to oversee operations. I believe, however, that the person wielding them must have the knowledge, skill, and experience to know how to use them, when to use them, and what to do with them. Just as you wouldn't build a building alone with just a hammer, you can't monitor and manage with one and only one view."

Ron: Yup. I think that's so true. You have to understand these moral hazards of measurements, and don't let them crowd out judgment, don't let them crowd out intuition and wisdom.

Ed: Clearly. That's really, again, our mantra here: it's not that we're suggesting no measurement. That would be completely and totally foolish, but we are suggesting that you revisit the metrics you're looking at. Please don't have 27 key performance indicators, because if you have 27 of them they're not key anymore. Three to five. Right, Ron?

Ron: Yup.

Ed: They should change, because you should be testing different theories at different points in your business. There is something, I think, to be said for some consistency: if you're using Net Promoter Score as a measurement of your customer satisfaction or loyalty, there's something to be said for playing that out from a long-term perspective, a three-to-five-year period. There's still a danger in getting totally stuck on it, though, because now that Net Promoter Score has become so ubiquitous, I can't tell you how many times, and this just happened to me last week when I got my car serviced, before they sent me away they said, they're going to call you with a survey; is there any reason you can't give me a nine or a 10? It's being gamed.

Ron: They almost coerce you, or in my case, they gave me free oil changes or a certificate for a car detail if I gave them a great review. It's like, come on, what's going on here? What are we doing? It's a great comment, Shawn, and I think you're exactly right. I think that's what these seven hazards really illustrate: we have to be very careful about our measurements and not stay too committed to them. The poverty statistic, for example, is a measurement that's completely misleading, and yet we're so invested in it that we're not willing to analyze it, hold it in front of us, and challenge our assumptions about it. Those are our seven hazards. Of course, again folks, we will post all of these in our show notes.
