Nov 1 2011
Design First, Lean Second: A Case Study
Via Scoop.it – lean manufacturing
For when you are not stuck with a 10-year old design…
By Michel Baudin • Press clippings, Technology 0 • Tags: Lean manufacturing, Manufacturing engineering, Manufacturing
Oct 27 2011
We have all seen the absurd situation in the featured picture above: a line of customers waiting for taxis while a line of taxis next to them waits for customers, with a barrier separating them. This particular instance is from The Hopeful Traveler blog. The cabs are from London, but the same scene could have been shot in many other major world cities.
I am sure we have all encountered similar situations in other circumstances, which may or may not be easy to resolve. One case that should be easy is the restaurant buffet. Figure 1 shows a typical scene in buffet restaurants: a line of people waiting to get food, all on one side of the table, while the food sits waiting and accessible on the opposite side.
Figure 1. A typical buffet
I think the fundamental mistake is the assumption that a buffet is like an assembly line, providing sequential access to dishes. This means that you cannot get to the Aloo Gobi until the person in front of you is done with the Tandoori. The ideal buffet would instead provide random access, meaning that each customer would have immediate access to all dishes at all times. While full random access may not be feasible, you can get much closer to it than with a linear buffet. The following picture shows an alternative organization of a buffet in circular islands that is non-sequential.
Figure 2. A buffet island at the Holiday Inn in Visalia, CA
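To see what the sequential assumption costs, here is a minimal sketch in Python that treats the linear buffet as a tandem queue, with made-up serving times and one deliberately slow dish; a customer reaches a dish only after finishing the previous one and after the customer ahead has moved on. It is an illustration of the assembly-line analogy, not a model of any actual restaurant:

import numpy as np

rng = np.random.default_rng(1)
n_customers, n_dishes = 20, 8
service = rng.uniform(5, 15, size=(n_customers, n_dishes))  # seconds per dish
service[:, 3] = 40.0  # one slow dish, say the Tandoori

# finish[i, j] = time at which customer i is done with dish j
finish = np.zeros((n_customers, n_dishes))
for i in range(n_customers):
    for j in range(n_dishes):
        done_previous_dish = finish[i, j - 1] if j > 0 else 0.0
        customer_ahead_gone = finish[i - 1, j] if i > 0 else 0.0
        finish[i, j] = max(done_previous_dish, customer_ahead_gone) + service[i, j]

gaps = np.diff(finish[:, -1])
print(f"Average interval between customers clearing the buffet: {gaps.mean():.1f} s")

The average interval comes out at roughly 40 seconds: the slowest dish paces every customer, exactly as a bottleneck station paces an assembly line. With random access, no single dish paces the whole crowd, and each customer waits only for the dishes he actually wants.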
The limitation of this concept is that replenishment by waiters can interfere with customers. To avoid this, you would want dishes to be replenished from inside the circle while customers help themselves on the outside, as in the following sketch:
Figure 3. A buffet island with replenishment from inside
One problem with the circular buffet island, however, is its lack of modularity. You can add or remove whole islands but you cannot expand or shrink an island, which you can if you use straight tables arranged in a U-shape, as in Figure 4.
Figure 4. Buffet island with straight tables
This buffet island may superficially look like a manufacturing cell, but it is radically different. Its purpose is random access to food as opposed to sequential processing of work pieces, and the materials do not flow around the cell but from the inside out.
Such are the thoughts going through my mind while munching on the Naan at Darbar.
By Michel Baudin • Technology 3 • Tags: Buffet, industrial engineering, Lean manufacturing, Restaurant
Oct 26 2011
Data mining, in general, is the retrieval of information from data collected for a different purpose, such as using sales transaction histories to infer what products tend to be bought together. By contrast, design of experiments involves the collection of observations for the purpose of confirming or refuting hypotheses.
This perspective on data mining is consistent with the literature on the question of purpose, but most authors go further: they include in their definitions that data mining is done with computers, using large databases and specific analytical tools, which I think is too restrictive. The tools they list are the ones they have found useful in analyzing the behavior of millions of users of search engines or e-commerce websites, and they are not obviously applicable in other areas, such as manufacturing.
During World War II, British analysts used the serial numbers of captured or destroyed German tanks to estimate the numbers produced. Because the serial numbers had not been assigned for this purpose, it was data mining. It used clever statistical models but, obviously, no computers.
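The arithmetic involved fits in a few lines. The following sketch uses the standard minimum-variance unbiased estimator for this problem, assuming serial numbers run from 1 to N and the captured tanks are a random sample; it is an illustration, not the analysts' actual model:

def estimate_production(serials):
    """Estimate N from serials sampled from 1..N: m(1 + 1/k) - 1,
    where m is the largest serial observed and k the sample size."""
    m, k = max(serials), len(serials)
    return m * (1 + 1 / k) - 1

# Hypothetical example: five captured tanks
print(estimate_production([19, 40, 42, 60, 114]))  # about 137

The logic: the largest observed serial number m understates N by about one average gap between observed serials, which is what the 1/k correction adds back.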
Today, PhD-level data miners at Google, eBay, or Amazon sift through the page views and click-throughs of millions of users for clues to patterns they can use. The data is automatically collected, accurate, and accumulates by the terabyte every day. This “big data” requires parallel processing on clusters of computers and lends itself to the most advanced analytical tools ever developed.
Compared to this fire hose of data, what manufacturing produces is a trickle. In a factory, the master data and technical specs, the plans and schedules, the status of operations and work in process, and the history of production over, say, 12 months usually add up to a few gigabytes. It doesn’t fit on one spreadsheet, but it often does on a memory stick. On the other hand, much of it is still manually generated and therefore contains errors, and it is often structured in ways that make it difficult to work with.
Even if manufacturing companies could hire the data miners away from their current jobs, their experience with e-commerce or web search would not have prepared them well for the different challenges of manufacturing data mining.
There is an opportunity for data mining to contribute to competitiveness in manufacturing, but the approach must start from the needs. It must not be an e-commerce cure in search of manufacturing diseases.
By Michel Baudin • Technology 0 • Tags: Data mining, Lean manufacturing, Manufacturing
Oct 14 2011
Revisiting Pareto is a paper about the application of the 80/20 law, also known as the law of the vital few and the trivial many, or the Pareto principle. It presents new ways of applying it to manufacturing operations and is scheduled for publication in Industrial Engineer Magazine in January, 2012. As a preview, I am including here the following:
Abstract of the paper as scheduled for publication in 1/2012
The Pareto principle, so named by J.M. Juran in the 1940s, is also known as the 80/20 law, the law of the vital few and the trivial many, and ABC classification. It is a simple yet powerful idea, generally accepted but still underutilized in business. First recognized by Vilfredo Pareto in the distribution of farm land in Italy around 1900, it applies in various forms to city populations as counted in the 2010 census of the US, as well as to quality defects in manufacturing, the market demand for products, the consumption of components in production, and various metrics of item quantities in warehouses.
The key issues in making it useful are (1) to properly define quantities and categories, (2) to collect or retrieve the data needed, (3) to identify the actions to take as a result of its application and (4) to present the analysis and recommendations in a compelling way to decision makers.
In the classical Pareto analysis of quality problems, defect categories only partition the defectives if no unit has more than one type of defect. When multiple defects can be found on the same unit, eliminating the tallest bar may increase the height of its runner-up. With independent defect categories and an electronic spreadsheet, this is avoided by calculating the yield of every test in the sequence instead of simply counting occurrences, and multiplying these yields to obtain the yield of a sequence of tests. We can then plot a bar chart of the relative frequency of each defect in the population that is actually tested for it and, as a cumulative chart, the probability of observing at least one of the preceding defects.
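As an illustration of this calculation, here is a minimal sketch with made-up test data, assuming that each unit goes through the tests in sequence and drops out at its first failure, so that later tests see fewer units:

# Units reaching each test and units failing it (hypothetical numbers)
tests = [("Test A", 1000, 60), ("Test B", 940, 23), ("Test C", 917, 11)]

cumulative_yield = 1.0
for name, tested, failed in tests:
    defect_rate = failed / tested          # bar-chart value for this defect
    cumulative_yield *= 1 - defect_rate    # yield of the sequence so far
    print(f"{name}: defect rate {defect_rate:.1%}, "
          f"P(at least one defect so far) = {1 - cumulative_yield:.1%}")

Because each rate is computed against the population actually tested for that defect, eliminating the defect behind the tallest bar no longer distorts the bars of its runners-up.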
When analyzing warehouse content, there are usually too many different items to count physically, and we must rely on data in the plant’s databases. The working capital tied up by each item is usually the most accessible. But physical characteristics like volume occupied, weight or part count are often scattered in multiple databases and sometimes not recorded. The challenge then is to extract, clean, and integrate the relevant data from these different sources.
In both supply chain and in-plant logistics, the design of the system should start with an analysis of part consumption by a production line in terms of frequency of use, another quantity that is not additive across items. In this application, the classical Pareto chart is replaced by the S-curve, which plots the number of shippable product units as a function of the frequency ranks of the components used, and thus provides a business basis for grouping them into A-, B- and C-items, also known as runners, repeaters and strangers.
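To make the S-curve concrete, here is a minimal sketch with hypothetical products, demands, and bills of materials (all names and quantities are made up). A product unit counts as shippable at rank k only if every component it uses is among the k most frequently used:

from collections import Counter

products = {  # product: (demand in units, set of components used)
    "P1": (500, {"C1", "C2"}),
    "P2": (300, {"C1", "C3"}),
    "P3": (150, {"C2", "C4", "C5"}),
    "P4": (50, {"C6", "C7"}),
}

# Frequency of use: product units shipped that consume each component
freq = Counter()
for demand, comps in products.values():
    for c in comps:
        freq[c] += demand

available = set()
for rank, (comp, _) in enumerate(freq.most_common(), start=1):
    available.add(comp)
    shippable = sum(d for d, comps in products.values() if comps <= available)
    print(f"Top {rank} components -> {shippable} shippable units")

The output rises steeply at first and then flattens: the S shape that separates the runners from the repeaters and the strangers.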
The presentation of the results should be focused on communication, not decoration, and use graphics only to the extent that they serve this purpose.
While the Pareto principle can be easily validated in many contexts, we are short on explanations as to why that is. We do not provide one, but we show how, in a simulation of a multiple-round roulette game in which each player’s winnings at each round determine his odds in the next one, the distribution of chips among the players eventually comes close to the 80/20 rule. This happens even though it is a pure game of chance: players start out with equal numbers of chips and have no way to influence the outcome.
Assume we use a roulette wheel to allocate chips among players in multiple rounds, with the following rules: every player starts with the same number of chips; in each round, each player’s odds of winning are set by his winnings in the previous round; whatever the winners collect comes out of the other players’ holdings; and a player left with no chips after any round is out of the game.
This is a pure game of chance. There is nothing players can do to influence the outcome, and the players who do best in one round are set to do even better in the next one. It is also a zero-sum game, in that whatever a player wins comes out of the other players’ hands. There is no safety net: a player with no chips after any round is out of the game. If you know the present state of the game (how many chips the player in each rank has), then the results of future rounds are independent of the path by which you arrived at the present state. Technically, it is a random walk and a Markov process. The concept of the game is shown in Figure 1.
It could be viewed as representing a simplified Oklahoma Land Rush, where farmers of equal talent are initially given identical plots of land. Even in the first round, some plots will have higher crop yields than others. If the average crop is needed to survive, the farmers with a surplus sell it to the farmers with a shortfall in exchange for land, and start the next season with land in proportion to the crop they had in the previous one. The players could also be functionally equivalent, similarly priced consumable products sold in the same channels, like toothpaste or soft drinks, and the chips would be the buyers of these products.
If we run this game through enough rounds, do we tend towards a distribution where 20% of the players hold 80% of the chips, or do all the chips end up in the hands of only one player? We simulated this game in software with 1 million chips to be divided among 1,000 players in 100,000 rounds, and the simulation takes about 2 minutes to run on a desktop PC.
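The wheel can be coded in more than one way. The sketch below assumes one plausible mechanism, in which every chip is reassigned each round with odds proportional to its owner’s current holdings (a multinomial redistribution); this is an assumed reading of the rules above, not necessarily the code behind the published figures:

import numpy as np

rng = np.random.default_rng(42)
N_PLAYERS, N_CHIPS = 1_000, 1_000_000
N_ROUNDS = 10_000  # the article ran 100,000; fewer rounds keep the sketch quick

# Every player starts with the same number of chips
chips = np.full(N_PLAYERS, N_CHIPS // N_PLAYERS)

for _ in range(N_ROUNDS):
    # Each chip's next owner is drawn with probability proportional to
    # current holdings, so this round's winners are favored in the next
    # round, and a player with zero chips can never win again.
    chips = rng.multinomial(N_CHIPS, chips / chips.sum())

# What fraction of the players holds 80% of the chips?
sorted_chips = np.sort(chips)[::-1]
top = int(np.searchsorted(np.cumsum(sorted_chips), 0.8 * N_CHIPS)) + 1
print(f"Top {top} players ({100 * top / N_PLAYERS:.1f}%) hold 80% of the chips")

Whatever the exact transfer mechanism, the point is the same: equal players plus proportional odds are enough to produce a heavily skewed, Pareto-like distribution.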
The results show that the “wealth distribution” is stable beyond 10,000 rounds, with 23% of the players collectively holding 80% of the chips. Figure 2 shows the distribution after 100,000 rounds, which roughly satisfies the 80/20 law, but not recursively: it takes the top 49% of the players to account for 96% of the chips, instead of 36%.
Figure 2. Chip distribution after 100,000 rounds
Figure 3 shows other indicators of the simulation results every 10,000 rounds. The proportion of the top players holding 80% of all the chips oscillates between 21.5% and 25%, with a very slow downward trend. The number of chips held by the top 10 players also varies, but is essentially flat.
Figure 3. Indicators every 10,000 rounds
This pure game of chance, which favors at every round the winners of the previous round, does not rapidly concentrate all the chips in a few hands. Instead, it produces a distribution that is as close to Pareto’s as anything we can observe in demographics, quality, or production control. This suggests two conclusions about what a Pareto chart does and does not tell you.
When a product is a Runner, by definition it deserves a dedicated production line; that is a direct consequence of its being at the top of the chart. It does not, however, imply that the product is “better” in any way than the other products. The simulation arrived at a near 80/20 distribution of the chips without assuming any difference in the players’ ability to acquire chips: the roulette wheel has no favorites, and the players can do nothing to influence the outcome.
By Michel Baudin • Technology 4 • Tags: industrial engineering, Lean manufacturing, Management, Pareto
Mar 3 2011
Based on an NWLEAN post entitled: Laws of Nature – Pareto efficiency and Pareto improvements, from 3/3/2011
In manufacturing, Italian economist Vilfredo Pareto is mostly known for the Pareto diagrams and the 80/20 law, but in economics, he is also known for the unrelated concept of Pareto efficiency, or Pareto optimality, which is also relevant to Lean. A basic tenet of Lean is that a factory can always be improved, and that, once you have achieved any level of performance, it is just the starting point for the next round of improvement. Perfection is something you never achieve but always pursue and, if you dig deep enough, you always find opportunities. This is the vocabulary you use when discussing the matter with fellow production people. If, however, you are taking college courses on the side, you might score more points with your instructor by saying, as an empirical law of nature, that a business system is never Pareto-efficient. It means the same thing, but our problem is that this way of thinking is taught neither in Engineering nor in Business school, and that few managers practice it.
A system is Pareto-efficient if you cannot improve any aspect of its performance without making something else worse. Managers who believe their factories to be Pareto-efficient think, for example, that you cannot improve quality without lengthening lead times and increasing costs; improving quality while reducing lead times and costs is exactly what Lean does. In fact, eliminating waste is synonymous with improving some dimensions of performance without degrading anything else, that is, with taking advantage of the plant’s lack of Pareto-efficiency.
When we say that a factory can always be improved it is a postulate, an assumption you start from when you walk through the gates. The overwhelming empirical evidence is that, if you make that assumption, you find improvement opportunities. Obviously, if you don’t make that assumption, you won’t find any, because you won’t be trying.
This is not a minor issue. Writing in the Harvard Business Review back in 1991 about Activity-Based Costing, Robert Kaplan stated that all the possible shop floor improvements had already been made over the previous 50 years. He was teaching his MBA students that factories were Pareto-efficient and that it was therefore pointless to try to improve them; they would do better to focus on financial engineering and outsource production.
The idea that improving factories is futile and a distraction from more “strategic” pursuits dies hard. It is expressed repeatedly in a variety of ways. The diminishing returns argument is that, as you keep reaching for fruit that hangs ever higher, the effort required starts being excessive with respect to the benefits, but there are two things to consider: the practice of improvement builds skills that bring yesterday’s high-hanging fruit within reach, and changes in products, processes, and technology keep creating new low-hanging fruit.
Another argument is that the focus on waste elimination discourages activities like R&D that do not have an immediate impact on sales. The improvement effort, however, isn’t about what we do but how we do it. Nobody in his right mind would call R&D waste, even on projects that fail. Waste in R&D comes in the form of researchers waiting for test equipment, sitting through badly organized meetings, or filling out administrative paperwork.
In manufacturing itself, some see the pursuit of improvement as a deterrent to investment in new technology. While it is clear that the improvement mindset does not lead to solving every problem by buying new machines, the practitioners of continuous improvement are in fact better informed, savvier buyers of new technology. On one side of the shop floor, you see a cell with old machines on which incremental improvements over several years have reduced staffing requirements from 5 operators to 1. On the other side of the aisle, you see a brand new, fully automatic line with a design that incorporates the lessons learned on the old one.
Others have argued that a society that pursues improvement will be slower to develop and adopt new, disruptive technology. But does the machinist improving a fixture deter the founder of the next Facebook? There is no connection. If the machinist were not making improvements, his creativity would most likely be untapped. And his improvement work does not siphon off the venture capital needed for disruptive technology.
By Michel Baudin • Laws of nature 14 • Tags: Autonomation, Continuous improvement, industrial engineering, jidoka, Line design, Manufacturing engineering
Nov 8 2011
Factory life with and without Kaizen
In Kaizen, Masaaki Imai describes Japanese executives returning in the 1970s to American plants they had visited thirty years before and being struck by the absence of change: they saw the same production lines with the same equipment operated the same way. This started me looking for photographic evidence. Overall, pickings were slim, but I did find the above pictures of the same coke oven at the Ford River Rouge plant shot in 1946 and 1976.
If there is Kaizen activity in a factory, how does it change the work life of employees at all levels? The following chart compares the breakdowns of their use of time, with and without Kaizen:
In a plant without Kaizen, operators and supervisors are fully occupied with routine daily tasks: production for the operators; expediting parts, enforcing discipline, and record keeping for the supervisors. Only middle managers and executives spend a fraction of their time on projects involving capacity changes, new production lines, or new technology. Nobody works on incremental improvements to the way the work is done today.
By contrast, Kaizen involves employees at all levels in such improvement activities, to different degrees. The boundary between improvement and the daily routine is sharp; between improvement and innovation, it is fuzzy. Over time, the cumulative effect of incremental improvement is, by itself, radical change. In addition, the skills acquired and the lessons learned from incremental improvements are incorporated into the design of new lines and plants. In some Japanese auto parts plants, I remember seeing automatic lines side by side, where one used old machines that had been gradually retrofitted with devices reducing the need for human intervention, while the other had been built from scratch with new machines to be automatic. The Kaizen work done on the former was essential to the success of the latter.
By Michel Baudin • Management, Technology 10 • Tags: Continuous improvement, industrial engineering, Kaizen, Lean manufacturing, Manufacturing engineering