6 Big Data Analysis Mistakes That Hinder Lean Efforts

Published by Jeff Hajek

Data collection is a big part of any problem solving effort, and problem solving is what Lean is all about. So, it follows that the people who make the most significant gains in their continuous improvement efforts are those that can effectively turn data into actionable information.

Yep. That means that you still get to use math, even though your school days may be long behind you. But knowing how to work your way through a pile of data is only half the battle. You also have to know what not to do. So, here are a few of the biggest traps people fall into when they try to interpret their data.

  1. Confusing correlation with causation. Over the last few years, there has been a lot of media coverage about a possible connection between autism and vaccinations. Causation would say that there is something in the vaccines that triggers the condition in children. Correlation only says that there is a link; that link can exist simply because both are related to a third factor. For example, vaccines are given primarily to children, and autism is generally identified in children of vaccination age. It follows that there could be an age-related factor that links vaccinations and autism. (The first sketch after this list shows how a shared factor can produce a correlation with no causation at all.) (Note: The controversy remains. The American Medical Association and legal cases refute a causal relationship. Various autism groups are convinced of a connection.)
  2. Acting on noise. Noise is the inherent fluctuation present in a system. Special causes are the unusual, identifiable issues. Imagine you are going out on a jog. Every time you run a mile on the same course, you would probably have a similar time, but there would be some variation. That is noise. But if you got stuck at a red light, or tripped, or had to tie your shoe, you would expect to see a longer time. Obviously, those are special causes. The problem comes when people try to chase the noise in a system. Trying to stabilize a process's natural fluctuation is a huge challenge. One note, though: in most cases, the line between noise and special cause is simply a matter of how powerful a microscope you are using to look at the process. When you look closely enough, you can nearly always figure out what is causing the noise. As a practical matter, though, few processes have eliminated the special causes you can see with the naked eye. Focus on those first. (The second sketch after this list shows a simple control-limit check for drawing that line.)
  3. Ignoring special causes. This is the opposite of number 2 above: it is just as bad to ignore the data trail when it leads to a special cause by simply dismissing it as part of the system. This is most common when people become numb to a problem, or when they have a glut of small issues. Consider how often safety stock is added to material systems to cover variations in delivery times. The special causes are often rolled up into a belief that the system has natural variation in it. In truth, going after the special causes one at a time will slowly but surely reduce the variation.
  4. Stressing anecdotal evidence. One of the reasons the link between autism and vaccines is so widely believed is that there is a lot of anecdotal evidence to support it. Many people point to children who developed autism after a vaccine. Because of the personal nature of this form of data, people put a lot of weight on it. The same holds true with data supporting business processes. The issues people see with their own eyes often carry more weight than the rest of the data.
  5. Using available data instead of necessary data. One data analysis shortcut is to scrub databases for information. When the right data is present, it is a gift. But when the data is missing an important factor, there is a tendency to change the question to match what the data can answer. Sometimes it does make sense to take a good answer at no cost rather than a better answer that requires substantial data collection hours. But in many cases, the low-cost answer really isn't all that good.
  6. Self-fulfilling prophecies. When you start analyzing data already expecting a particular answer, you tend to find it. Rookies end up with the expected answer far more often than an experienced data sleuth does, because they inadvertently stack the deck toward the answer they want.
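
To make the correlation trap concrete, here is a minimal sketch in Python. The numbers are entirely invented for illustration: both the "exposure" and the "diagnosis" depend only on age, yet the two still correlate with each other, which is exactly the kind of link that gets mistaken for causation.

    import random

    # Minimal sketch with invented numbers: two outcomes that both depend on a
    # third factor (age) correlate with each other even though neither causes
    # the other.
    random.seed(42)

    ages = [random.uniform(0, 10) for _ in range(1000)]

    # Both variables depend only on whether the child is young, plus independent chance.
    exposure  = [1.0 if age < 6 and random.random() < 0.9 else 0.0 for age in ages]
    diagnosis = [1.0 if age < 6 and random.random() < 0.1 else 0.0 for age in ages]

    def correlation(xs, ys):
        """Pearson correlation coefficient, written out long-hand."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    print(correlation(exposure, diagnosis))  # clearly positive, despite no causal link

With these made-up numbers the coefficient comes out clearly positive, even though nothing in the code makes one variable cause the other; the shared age factor does all the work.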
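And here is a minimal sketch of the noise-versus-special-cause idea from items 2 and 3, using the jogging example. The mile times are invented, and the check is the standard individuals-control-chart rule of thumb: points within roughly three sigma of the mean are treated as noise, while points outside the limits are candidates for a special cause worth investigating.

    # Minimal sketch with invented mile times: separate routine noise from
    # candidate special causes using simple control limits.
    mile_times_minutes = [9.1, 9.3, 8.9, 9.2, 9.0, 9.4, 11.8, 9.1, 9.2, 8.8]

    mean = sum(mile_times_minutes) / len(mile_times_minutes)

    # Estimate short-term variation from the average moving range; dividing by
    # the standard individuals-chart constant d2 = 1.128 turns it into a sigma.
    moving_ranges = [abs(b - a) for a, b in zip(mile_times_minutes, mile_times_minutes[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128

    upper = mean + 3 * sigma
    lower = mean - 3 * sigma

    for run, minutes in enumerate(mile_times_minutes, start=1):
        if minutes > upper or minutes < lower:
            print(f"Run {run}: {minutes:.1f} min -> outside the limits; look for a special cause")
        else:
            print(f"Run {run}: {minutes:.1f} min -> within normal noise; leave the process alone")

With this made-up data, only the 11.8-minute run gets flagged for investigation (the red light), while the small run-to-run differences are left alone rather than chased.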

Learn to identify these traps before you fall into them. One good method is to bounce your data analysis methods off a mentor, or at least a peer, to get a second set of eyes on your results. Most of these traps have some psychology behind them, so someone without a stake in the outcome is generally more likely to spot the pitfalls than you are.

Of note, Velaction offers a data collection module, complete with DVD, that can help you and your team become more effective problem solvers.
