The Art of the Question | Robert W. “Doc” Hall | Compression Institute

“Twenty-five years ago, I tried to coach adult college students to seek and solve problems using the classic Deming PDCA Circle. In classrooms, students were unused to identifying their own problems rather than having them pre-defined. The first time through this exercise, over half did not reflect on a problem to seek root cause. Instead, they went shopping for a gizmo, a program, or a recipe to fix the problem – a quick-fix mentality.”

Sourced through the Compression Institute

Michel Baudin’s comments: 33 years ago, Robert W. Hall wrote Zero Inventories, the first original, technically meaty book in English about Lean Manufacturing, and I have had great respect for him ever since.


Is SPC obsolete?

In the broadest sense, Statistical Process Control (SPC) is the application of statistical tools to characteristics of materials in order to achieve and maintain process capability. In this broad sense, you couldn’t say that it is obsolete, but common usage is more restrictive. The semiconductor process engineers who apply statistical design of experiments (DOE) to the same goals don’t describe what they do as SPC. When manufacturing professionals talk about SPC, they usually mean Control Charts, Histograms, Scatter Plots, and other techniques dating from the 1920s to World War II, and, in the 21st century, this body of knowledge is definitely obsolete.

Tools like Control Charts or Binomial Probability Paper have impressive theoretical foundations and were designed to work around the limitations of the information technology of the 1920s: data was recorded on paper spreadsheets, statistical parameters were looked up in books of tables, and calculations were done with slide rules, adding machines or, in some parts of Asia, abacuses (See Figure 1). In Control Charts, for example, using ranges instead of standard deviations was a way to simplify calculations. These clever tricks addressed issues we no longer have.
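To make the range shortcut concrete, here is a minimal Python sketch, with invented measurements and the tabulated constants for subgroups of 5, of both ways of setting X-bar chart limits. The range method needs only additions, subtractions, and one multiplication by a constant from a table; the standard-deviation method is the arithmetic nobody wanted to do by hand in 1925:

```python
import statistics

# Illustrative data only: 5 subgroups of 5 measurements each (invented numbers).
subgroups = [
    [10.1, 10.3, 9.9, 10.0, 10.2],
    [10.2, 10.4, 10.1, 9.8, 10.0],
    [9.9, 10.1, 10.0, 10.3, 10.2],
    [10.0, 10.2, 9.8, 10.1, 10.3],
    [10.1, 9.9, 10.2, 10.0, 10.4],
]
n = 5
d2, c4 = 2.326, 0.940        # tabulated constants for subgroups of 5
A2 = 3 / (d2 * n ** 0.5)     # ~0.577, the multiplier printed in the old tables

xbars = [statistics.mean(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
grand_mean = statistics.mean(xbars)
r_bar = statistics.mean(ranges)

# Range-based X-bar limits: the 1920s shortcut.
ucl_r = grand_mean + A2 * r_bar
lcl_r = grand_mean - A2 * r_bar

# Standard-deviation-based limits: the arithmetic the ranges were meant to avoid.
sigma_hat = statistics.mean(statistics.stdev(s) for s in subgroups) / c4
ucl_s = grand_mean + 3 * sigma_hat / n ** 0.5
lcl_s = grand_mean - 3 * sigma_hat / n ** 0.5

print(f"Range method: {lcl_r:.3f} to {ucl_r:.3f}")
print(f"Stdev method: {lcl_s:.3f} to {ucl_s:.3f}")
```

On a modern computer, of course, the second calculation costs nothing, which is exactly the point: the shortcut solved a problem we no longer have.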

Figure 1. Information technology in the 1920s

Another consideration is the manufacturing technology for which process capability needs to be achieved. Shewhart developed control charts at Western Electric, AT&T’s manufacturing arm and the high technology of the 1920s. The number of critical parameters and the tolerance requirements of its products bear no comparison with those of their descendants in 21st-century electronics. For integrated circuits in particular, the key parameters cannot be measured until testing at the end of a process that takes weeks and hundreds of operations, and the root causes of problems are often interactions between features built at multiple operations that are too complex to be understood with the tools of SPC. In addition, the quantity of data generated is much larger than anything the SPC techniques were meant to handle. If you capture 140 parameters per chip, on 400 chips/wafer and 500 wafers/day, that is 28,000,000 measurements per day. SPC dealt with a trickle of data; in current electronics manufacturing, it comes out of a fire hose, and this is still nothing compared to the daily terabytes generated in e-commerce or internet search (See Figure 2).
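For readers who want to check the arithmetic, the figures above, which are illustrative rather than taken from any particular fab, multiply out as follows:

```python
# Back-of-the-envelope check of the data volume quoted above.
# The figures are the illustrative ones from the text, not from a real fab.
parameters_per_chip = 140
chips_per_wafer = 400
wafers_per_day = 500

measurements_per_day = parameters_per_chip * chips_per_wafer * wafers_per_day
print(f"{measurements_per_day:,} measurements per day")  # 28,000,000
```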

Figure 2. Data, from trickle to flood, 1920 to 2011

What about mature industries? SPC is a form of supervisory control. It is not about telling machines what to do and making sure they do it, but about checking that the output is as expected, detecting deviations or drifts, and triggering human intervention before these anomalies have a chance to damage products. Since the 1920s, however, lower-level controls embedded in the machines have improved enough to make control charts redundant. The SPC literature recommends measurements over go/no-go checking, because measurements provide richer information, but the tables are turned once process capability is no longer the issue. The quality problems in machining or fabrication today are generated by discrete events like tool breakage or human error, including picking the wrong parts, mistyping machine settings, or selecting the wrong process program. The challenge is to detect these incidents and react promptly, and, for this purpose, go/no-go checking with special-purpose gauges is faster and better than taking measurements.

In a nutshell, SPC is yesterday’s statistical technology to solve the problems of yesterday’s manufacturing. It doesn’t have the power to address the problems of today’s high technology, and it is unnecessary in mature industries. The reason it is not completely dead is that it has found its way into standards that customers impose on their suppliers, even when they don’t comply themselves. This is why you still see Control Charts posted on hallway walls in so many plants.

But SPC has left a legacy. In many ways, Six Sigma is SPC 2.0. It has the same goals, with more modern tools and a different implementation approach to address the challenge of bringing statistical thinking to the shop floor. That TV journalists describe all changes as “significant” reveals how far the vocabulary of statistics has spread; that they use it without qualifiers shows that they don’t know what it means. They might argue that levels of significance would take too long to explain in a newscast, but, if that were the concern, they could save air time by just saying “change.” In fact, they are just using the word to make the change sound more, well, significant.

Over decades, the promoters of SPC have not succeeded in getting basic statistical concepts understood in factories. Even in plants that claimed to practice “standard SPC,” I have seen technicians arbitrarily picking parts here and there in a bin and describing it as “random sampling.” When I ask why Shewhart used averages rather than individual measurements on X-bar charts, I have yet to hear anyone answer that averages follow a bell-shaped distribution even when individual measurements don’t. I have also seen software “solutions” that checked individual measurements against control limits set for averages…
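The last two points deserve a concrete illustration. Here is a minimal simulation, with invented numbers rather than plant data, of what is at stake: the averages of subgroups of n measurements spread about √n times less than the individual measurements, so control limits computed for averages will flag a large share of perfectly in-control individuals:

```python
import random
import statistics

random.seed(0)
n = 5                       # subgroup size
num_subgroups = 10_000

# Individual measurements from a decidedly non-bell-shaped (uniform) process.
individuals = [random.uniform(9.0, 11.0) for _ in range(n * num_subgroups)]
averages = [statistics.mean(individuals[i * n:(i + 1) * n]) for i in range(num_subgroups)]

print(f"sigma of individuals: {statistics.stdev(individuals):.3f}")
print(f"sigma of averages:    {statistics.stdev(averages):.3f}  (smaller by about sqrt({n}))")

# Control limits computed for averages of n...
center = statistics.mean(individuals)
sigma_avg = statistics.stdev(averages)
ucl, lcl = center + 3 * sigma_avg, center - 3 * sigma_avg

# ...applied, wrongly, to individual measurements: a flood of false alarms.
false_alarms = sum(1 for x in individuals if x < lcl or x > ucl) / len(individuals)
print(f"in-control individuals outside the averages' limits: {false_alarms:.1%}")
```

In this made-up example, more than a fifth of the individual measurements fall outside limits that are entirely appropriate for averages of five, which is what those software “solutions” were quietly doing.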

I believe the Black Belt concept in Six Sigma was intended as a solution to this problem. The idea was to give solid statistical training to 1% of the work force and let them be a resource for the remaining 99%. The Black Belts were not expected to be statisticians at the level of academic specialists, but process engineers with enough knowledge of modern statistics to be effective in achieving process capability where it is a challenge.

Problem-solving: Dr. House versus the Shop Floor

Dr. House’s fictional team of doctors may be the most famous problem-solving group on the planet. Week after week, they solve daunting medical mysteries under an abrasive, unfeeling leader, working in their differential diagnosis sessions with nothing more than a tiny whiteboard on which to write lists of symptoms.

In real life, Steve Jobs, a man with character flaws on a par with Dr. House’s, was able to lead teams in the development of products from the Apple II to the iPad. In light of this, you may wonder why, when faced with problems like an occasionally warped plastic part or a wrong gasket, we need to have a team go through brainstorming sessions in which no idea is called stupid, draw fishbone diagrams, and formally ask five times why the defect was produced and why it escaped.

House’s team, Apple engineers and Pixar animators are in professions they chose and for which a thick skin is required. They are the product of an education, training and experience in which abuse is used to filter the uncommitted. By contrast, assemblers and machinists are there not to realize childhood dreams but because these are the best jobs they could get. In addition, if they have even a few years of experience in a non-Lean plant, they have been trained to do as they are told. Outside of work, they can be artists, do-it-yourselfers, or community leaders, but they have not been expected to use the corresponding skills at work.

Over the past decades, many manufacturers have realized that this is a mistake: there are emergency response situations that are resolved faster with the participation of the people who do the work than without it, and many small improvement opportunities that are never taken unless operators take them on. But welcoming and soliciting their help is not enough. Historically, the first attempt was the suggestion system, dating back to 1880. It is still in use at many companies, including Toyota, but, while it is part of continuous improvement, it is not an approach to problem-solving. Employees make suggestions about whatever they have ideas about; problem-solving, instead, requires a focus on a subject identified by management or by customers, and usually needs a team rather than an individual.

Kaoru Ishikawa’s concept of the Quality Circle in 1962 was a breakthrough, not only in organizing participants in small groups but also in teaching them the 7 tools of QC to solve quality problems, as well as brainstorming, PDCA, and presentation techniques. The key idea was that pulling a group of shop floor people together was not enough. Quality Circles still exist, primarily in Japan, but the ideas of providing technical tools and a structure to organize small-group activities around projects have propagated to many other areas. Setup time reduction projects, for example, can be run effectively like Quality Circles, but with the SMED methodology taught instead of the QC tools. Conversely, a Kaizen Event team working on quality issues may use the same technical tools as Quality Circles, but it is managed differently.

To an uninvolved engineer, scientist, or medical doctor, “problem-solving” as practiced by shop floor teams may appear crude and simplistic. He or she may, for example, view a fishbone diagram as a poor excuse for a fault-tree because it makes no distinction between “OR,” “XOR,” or “AND” combinations of causes. In the fishbone diagram, these details are omitted not for lack of sophistication but out of due consideration for its purpose. You can fill out a useful fishbone diagram in a brainstorming session with a problem-solving team, but you would get bogged down in details if you tried to generate a full-blown fault-tree. There are many simple techniques that could potentially be applied. The value of a problem-solving method is that, for a given range of problems, it has shown itself both sophisticated enough to work and simple enough to be applied by the teams at hand.
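For readers who have never put the two side by side, here is a toy sketch in Python, with causes invented for the warped-part example above and no claim to represent anyone’s actual method, of the structural difference: a fishbone diagram is candidate causes grouped by category, while a fault-tree combines them through explicit logic gates that can be evaluated:

```python
from dataclasses import dataclass, field
from typing import List, Union

# Fishbone: a flat grouping of candidate causes, with no combination logic.
# (Categories and causes are invented for illustration.)
fishbone = {
    "Machine": ["worn die", "unstable mold temperature"],
    "Material": ["resin moisture too high"],
    "Method": ["cycle time set too short"],
    "Man": ["wrong process program selected"],
}

# Fault tree: the top event is a logical combination of lower-level events.
# ("XOR" is omitted here to keep the sketch short.)
@dataclass
class Gate:
    kind: str                                        # "AND" or "OR"
    inputs: List[Union["Gate", str]] = field(default_factory=list)

warped_part = Gate("AND", [
    "resin moisture too high",
    Gate("OR", ["unstable mold temperature", "cycle time set too short"]),
])

def occurs(node, active):
    """Does the event occur, given the set of basic causes currently present?"""
    if isinstance(node, str):
        return node in active
    results = [occurs(child, active) for child in node.inputs]
    return all(results) if node.kind == "AND" else any(results)

print(occurs(warped_part, {"resin moisture too high", "cycle time set too short"}))  # True
print(occurs(warped_part, {"unstable mold temperature"}))                             # False
```

The second structure supports rigorous analysis, but try building and validating it live in a one-hour session with a shop floor team; the flat grouping is what you can actually fill out on a flip chart.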

In this as in every other aspect of Lean, it makes a difference whether an approach is adopted for internal reasons or to comply with an external mandate. A customer that has developed a problem-solving methodology may require suppliers to adopt it when responding to quality problem reports. The suppliers then formally comply, but it may or may not be effective in their circumstances. For example, a car company that buys chips from a semiconductor manufacturer may mandate failure analysis on all defective chips, but this analysis will provide information on process conditions as they were six months before, when the chip was made. Since then, the process that caused the defect may have gone through three engineering changes that make the results irrelevant. These results would have been relevant for mechanical parts with shorter processes and less frequent engineering changes, but the car company doesn’t differentiate between suppliers.

Steven Spear on Problem-Solving with JIT: Not Bad for an Academic Paper

Steven Spear’s The Essence of Just-in-Time: Imbedding diagnostic tests in work-systems to achieve operational excellence is a working paper from Harvard Business School in 2002 focused on the interaction between JIT and problem-solving. It is an important topic, only briefly alluded to in Lean Logistics and covered in more detail in When to Use Statistics, One-Piece Flow, or Mistake-Proofing to Improve Quality, but there are many other improvement opportunities besides product quality, and shining a light on their relationship with JIT is useful.

Spear’s paper is worth reading because he did his homework: it is based on research that involved immersion in a Toyota supplier support team, visits to seven Toyota plants and twelve suppliers in Japan and the US, and working as an assembler in a non-Toyota plant for comparison. I recommend in particular sections 4 to 7. Section 4 is a case study of mattress manufacturing at Aisin Seiki, from which the following sections draw general conclusions.

You have to look past the other sections, which mainly reflect Spear’s membership in the academia tribe. His research is described as an “ethnographic study,” which conjures up the image of an American or European spending 15 years among the Guaranis of Paraguay recording what they are willing to share of their language and culture. That this vocabulary should be used in a study of Lean reflects how alien the world of manufacturing is to academia.

As an academic, Spear is obligated to reference other academics, but not to reference non-members of the tribe, no matter how major their contributions. For example, the only Japanese author in the bibliography is Takahiro Fujimoto, from the University of Tokyo, but neither Taiichi Ohno nor Shigeo Shingo appears. Section 3, on Methods, opens with “Many scholars argue…” With all due respect, the arguments of scholars don’t amount to a hill of beans in Manufacturing, because, unlike Computer Science or Biology, it is not a field to which they have contributed much. From Taylor and Gastev to Ohno and Shingo, the key innovators in Manufacturing have almost all been self-taught, Lillian Gilbreth being the exception with a PhD. Why was Spear’s research not done in an Industrial Engineering department, where its content would normally place it? As I found in my own ethnographic studies of academia, the need for grants pushes researchers in other directions, like genetic algorithms.