Booze, bonks and bodies | The Economist

The various Bonds are more different than you think

Michel Baudin‘s comments:

Once hailed by Edward Tufte as a purveyor of the most sophisticated graphics in the press, Britain’s “The Economist” has apparently surrendered to the dictatorship of stacked bars.


How to Really See What is Going On in Your Workplace | IndustryWeek | Jamie Flinchbaugh

“How managers can use the four levels of observation to really see what is going on in their workplace:

  1. Stories and anecdotes.
  2. Data and graphs.
  3. Pictures and diagrams.
  4. Direct observation.”


Michel Baudin‘s comments:
Deep down, I believe I agree with Jamie Flinchbaugh on observation, but I am puzzled by the way he phrases it. He describes stories and anecdotes as “the most abstract level of observation.” I see them as a means of persuasion, not observation, and concrete, not abstract.

I don’t see data as necessarily dependent on assumptions. What assumptions are there behind, say, the number of boxes of Cereal Z you sold last month? It is just a fact. While photographs are a form of data, graphs and diagrams are ways of analyzing data and presenting results, which is also downstream from observation.

For the analysis of a plant, I see three main sources of input:

  1. Direct observation of the operations.
  2. Interviews with key members of the organization.
  3. The organization’s data.

The Lean literature justifiably emphasizes direct observation. You go to where the work is being done, and then apply various mental techniques to help you notice relevant characteristics. You may even gather data in the form of photographs and videos for future analysis.

But it cannot be your only source. You also need to know what the managers’ ambitions are for the organization, what they have tried in order to realize them, and what obstacles they feel they have encountered. Their perceptions may or may not agree with what you see with your own eyes, but you need to know what they are.

Finally, any business activity leaves a data trail that should not be ignored, including product and process definitions, current status, history, and plans for the near and distant future. All of this also needs to be reviewed and checked against direct observation and human perceptions.

It’s when you present your conclusions and recommendations that you use stories, graphs, diagrams, pictures, and videos to get your point across.


Betting on Lean, or …. Analytics versus Empowerment | Bill Waddell


“Management is all about playing the odds. […]  In operations, calculate lot sizes, generate forecasts and set quality standards with enough data and increasingly sophisticated algorithms and statistical methods and you will increase the chances of coming close enough.  At least that is the theory, and the hope.

This is the basic premise of big data and ERP.  With point of sale scanning, RFID, smart phones and all of the other data collecting technologies increasingly in use, the data to feed the engines is more and more available.  The potential and the lure of the data driven, analytical approach to finding the center line and getting more decisions closer to correctness is growing.

The other approach is empowered people.  Recognizing that management cannot be involved in every one of the individual customer interactions and operational, situational, tiny decisions, those calls are left to the people on the spot.  They are expected to rely on their knowledge, understanding of company values and goals, and the information available to them in very real time to decide what to do.[…] The basic question is whether empowered people will get it right more often than big computer.”

Michel Baudin‘s insight:

In this article, Bill Waddell presents the data-driven approach to management decision making as contradictory to people empowerment. I do not see these as mutually exclusive.

In 1993, there was a group within Toyota’s logistics organization in the US that, based on weather data, thought that the Mississippi might flood the railroad routes used to ship parts from the Midwest to the NUMMI plant in California. Four days before the flood, they reserved all the trucking available in the Chicago area, for the daily cost of 6 minutes of production at NUMMI. When the flood hit, they were able to ship the parts by truck around the flood zone, and NUMMI didn’t miss a beat.

This is what a good data scientist does.

In Numbersense, Kaiser Fung points out that data analysis isn’t just about the data, but also about the assumptions people make about it. As an example, he cites the Republican polling fiasco of the 2012 election, attributing it to a combination of flawed data collection and equally flawed modeling.

In other words, it’s not a computer that comes up with answers from data, but a human being, and the quality of these answers depends as much on the human analyst’s understanding of the underlying reality as it does on the ability to collect clicks from the web or transactions from point-of-sale systems.

Good data analysis does not require petabytes of data. In statistics, a small sample is 10 points; a large sample, 100 points. The difference matters because, with small samples, you cannot use many of the convenient large-sample approximations. But 100 points is plenty for these approximations to work.

With millions of points, the tiniest wiggle in your data will show overwhelming significance in any statistical test, which means that these tests are not much use in that context. To figure out what this tiny wiggle is telling you about reality, however, you still need to understand the world the data is coming from.
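To make this concrete, here is a small sketch (not from the article; the scenario and numbers are invented for illustration) showing how, with a million points per sample, a difference of one percent of a standard deviation, which is practically negligible, still comes out as overwhelmingly significant in a two-sample test:

```python
import random
import math

random.seed(42)

def z_test_pvalue(a, b):
    """Two-sample z-test on the difference in means (large-sample approximation)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Sample variances
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)   # standard error of the difference in means
    z = (ma - mb) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from the normal distribution
    return z, p

n = 1_000_000
# Two processes whose means differ by 0.1 on a standard deviation of 10,
# i.e. 1% of a standard deviation -- practically invisible on the shop floor.
a = [random.gauss(100.0, 10.0) for _ in range(n)]
b = [random.gauss(100.1, 10.0) for _ in range(n)]

z, p = z_test_pvalue(a, b)
print(f"z = {z:.2f}, p = {p:.2e}")
```

With samples this large, the p-value lands far below any conventional threshold, even though the effect itself is trivial. The test answers "is there any difference at all?", not "does the difference matter?", and only someone who understands the underlying process can answer the second question.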

I don’t see an opposition between relying on people and relying on data, because, whether you realize it or not, you are never relying on data, only on people’s ability to make sense of it.
