Oct 27, 2023
“Today’s manufacturing systems have become more automated, data-driven, and sophisticated than ever before. Visit any modern shop floor and you’ll find a plethora of IT systems, HMIs, PLC data streams, machine controllers, engineering support, and other digital initiatives, all vying to improve manufacturing quality and efficiencies.
That begs these questions: With all this technology, is statistical process control (SPC) still relevant? Is SPC even needed anymore? Some believe manufacturing sophistication trumps SPC technologies that were invented 100 years ago. But is that true? We the authors believe that SPC is indeed relevant today and can be a vitally important aid to manufacturing.”
Michel Baudin's comments: Of the changes affecting the relevance of 100-year-old SPC to today's manufacturing, the authors consider only one: the availability of much larger data sets. They do not discuss how processes in electronics, pharmaceuticals, aerospace, automotive, and many other industries are more complex today than 100 years ago. Nor do they think it relevant that thinking in Probability and Statistics has evolved into today's Data Science over the past 100 years, producing many new tools that leverage today's IT and OT.
Points about SPC
Of the eight points they make in the article, only two are truly about SPC:
- Point 1 is that you can deploy it more easily with today’s IT than with paper and pencils. SPC courses, to this day, teach techniques developed specifically to work around the limitations of the paper-and-pencil age. Paperless SPC is a horseless carriage.
- Point 7 is about “How to set up your processes more optimally to get the most out of them.” It raises the question of who uses SPC for this. Biotechnology doesn't; it relies instead on genetic engineering. Neither do semiconductors: in the wafer process, they use Run2Run controls to tweak the next operation based on the outcome of the present one.
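To make the Run2Run idea concrete, here is a minimal sketch of an EWMA run-to-run controller, the textbook form of this approach. The linear process model, the parameter names, and the numbers are illustrative assumptions, not any fab's actual recipe logic: the controller observes each run's output, updates its estimate of the process offset by an exponentially weighted moving average, and adjusts the recipe input for the next run to stay on target.

```python
def ewma_r2r(target, gain, a0, disturbances, weight=0.3):
    """EWMA run-to-run control of a process modeled as y = a + gain * u.

    `disturbances` is the (unknown to the controller) sequence of true
    offsets a; the controller only sees each run's output y, updates its
    offset estimate a_hat by EWMA, and picks the next input u.
    Returns the list of (input, output) pairs, run by run.
    """
    a_hat = a0
    history = []
    for a_true in disturbances:
        u = (target - a_hat) / gain       # recipe input for this run
        y = a_true + gain * u             # observed output (noise-free here)
        # EWMA update of the offset estimate from the observed output
        a_hat = weight * (y - gain * u) + (1 - weight) * a_hat
        history.append((u, y))
    return history

# A step disturbance of +3 appears at run 3; the controller pulls the
# output back toward the target of 10 over the following runs.
runs = ewma_r2r(target=10.0, gain=2.0, a0=0.0,
                disturbances=[0, 0, 3, 3, 3, 3, 3, 3])
```

Note the contrast with a Shewhart chart: the chart would only flag the step as an out-of-control signal for a human to investigate, whereas the Run2Run controller automatically compensates for it on the very next run.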
Point 2 puts forward SPC as a tool to manage “high-volume data streams,” which is interesting, given that Shewhart and his followers designed SPC specifically to work with a trickle of manually collected data. The keywords you encounter in the context of high data volume in manufacturing include in-memory databases, data warehouses, data lakes, and data lakehouses, but not SPC.
Point 3 presents SPC as a means of breaking down data silos. Data silos emerge when different groups in an organization use different information models. They use different names for the same data objects, store them in different formats, and represent their relationships differently. These are difficult issues to solve, and you won’t find any help on this in the SPC literature.
Point 4 is about “Repurposing SPC data.” If it means putting the technical data on processes collected on the shopfloor to all sorts of uses, it is a worthwhile effort. If you want guidance on how to do it, you will find it under data mining, not SPC.
Point 5, “measuring data bias and inaccuracies,” is about data cleaning. Again, good luck finding guidance on how to do it in the SPC literature.
About Point 6, “instantly communicate shop floor issues across the enterprise,” other keywords, like Andon boards, come to mind.
Point 8 presents SPC as the “voice of the process.” Processes, however, speak in many other, more immediate ways than SPC.
The article concludes by saying: “your organization can benefit greatly by modernizing a 100-year old technology: statistical process control.” I couldn’t agree more.