Oct 19 2011
In an article on this topic in Industry Week today, Ralph Keller asserts that Continuous Improvement is focused on business processes rather than technology.
However, if you wrap tinfoil around the feet of a welding fixture to make it easier to clean, replace bolts with clamps on a machine to reduce setup time, or mount a hand tool on the machine where it is used, it usually counts as Continuous Improvement, yet it involves technical changes to work that I don’t think anyone would describe as business processes.
Yes, Continuous Improvement is done without expensive technology, but it does involve cheap technology.
Ralph Keller also reminds us that Continuous Improvement is not “rocket science,” which implies that it is easier. I agree that it is different, but not easier. I don’t know any rocket scientist with the skills to facilitate Continuous Improvement.
Oct 16 2011
Communicating face to face is supposed to be more effective than communicating electronically. Yet it is common today to see professionals travel thousands of miles to be in the same room, then engage in parallel play on their laptops while one of them gives a presentation. The speaker stands in front of tables arranged in a U, facing a wall of open laptops. The people behind them are silent and focused, and seldom comment or ask questions. They appear to be taking notes, but in fact they are completing an unrelated budget spreadsheet, checking email, or playing solitaire.
It takes the presentation skills of a Steve Jobs to compete successfully for attention with the rest of the world coming to each seat through the web, and most speakers do not have these skills, especially when presenting in a foreign language. It is also quite possible that the information they have to convey does not lend itself to an entertaining presentation, but today’s business audiences make no allowance for this. The notion of giving each person a polite hearing, as taught in elementary school, is gone.
The intrinsic discourtesy of this behavior, however, is not the main problem. By providing escapes, web-connected laptops make badly organized, boring meetings bearable, and allow them to continue unchallenged. Possibly the most tedious of all business rites, the regular staff meeting involves members of a department standing up one after the other and giving status updates on issues that are usually of interest only to themselves and the manager. Without laptops, some members would fall asleep and snore, and others would eventually rebel against this waste of time. And this would lead to a positive change. Wired into their laptops, they don’t challenge the status quo.
A few years ago, cell phones routinely disrupted meetings, their owners leaving the room to take calls that were obviously more important than anything the people present in the flesh might have to say. Today, cell phone disruptions are abating, as meeting participants usually comply with requests to turn them off or, even better, to “turn them back on at the end of the meeting, so that they don’t miss their important calls.” The current situation with laptops, however, is not sustainable. It is a technology-induced problem that technology may actually solve. Laptops are an endangered species, likely to be replaced with tablets that do not provide the same kind of visual shield as a raised laptop screen. Already today, people who bring iPads to meetings actually participate in them, as doing otherwise would be too conspicuous.
Oct 14 2011
Revisiting Pareto is a paper about the application of the 80/20 law, also known as the law of the vital few and the trivial many and as the Pareto principle. It presents new ways of applying it to manufacturing operations and is scheduled for publication in Industrial Engineer Magazine in January 2012. As a preview, I am including here the following:
- The complete abstract of the paper as it will be published.
- A section on a simulation of a game leading to the emergence of a Pareto distribution of wealth among players, which was cut from the article because it is meant to stimulate thinking and does not contain actionable recommendations.
Abstract of the paper as scheduled for publication in 1/2012
The Pareto principle, so named by J.M. Juran in the 1940s, is also known as the 80/20 law, the law of the vital few and the trivial many, and ABC classification. It is a simple yet powerful idea, generally accepted but still underutilized in business. First recognized by Vilfredo Pareto in the distribution of farm land in Italy around 1900, it applies in various forms to city populations as counted in the 2010 census of the US, as well as to quality defects in manufacturing, the market demand for products, the consumption of components in production, and various metrics of item quantities in warehouses.
The key issues in making it useful are (1) to properly define quantities and categories, (2) to collect or retrieve the data needed, (3) to identify the actions to take as a result of its application and (4) to present the analysis and recommendations in a compelling way to decision makers.
In the classical Pareto analysis of quality problems, defect categories only partition the defectives if no unit has more than one type of defect. When multiple defects can be found on the same unit, eliminating the tallest bar may increase the height of its runner-up. With independent defect categories and an electronic spreadsheet, this is avoided by calculating the yield of every test in the sequence instead of simply counting occurrences, and multiplying these yields to obtain the yield of the sequence of tests. We can then plot a bar chart of the relative frequency of each defect in the population that is actually tested for it and, as a cumulative chart, the probability of observing at least one of the preceding defects.
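As an illustration, here is a minimal sketch of this yield calculation, with made-up defect names and counts (all the numbers below are hypothetical, chosen only to show the arithmetic):

```python
# Hypothetical test sequence: each test sees only the units that passed
# all preceding tests, so its defect frequency is computed on the units
# it actually tested, and the yield of the whole sequence is the product
# of the individual test yields.
tests = [                 # (defect, units tested, units failing)
    ("scratch",       1000, 120),
    ("misalignment",   880,  44),
    ("short circuit",  836,   8),
]

sequence_yield = 1.0
for defect, tested, failed in tests:
    frequency = failed / tested     # relative frequency among units tested for this defect
    sequence_yield *= 1 - frequency
    # 1 - sequence_yield is the probability of observing at least one
    # of the defects considered so far.
    print(f"{defect}: {frequency:.1%} of tested units, "
          f"cumulative defect probability {1 - sequence_yield:.1%}")
```

Note that the bar for each defect uses its own denominator (the units actually tested for it), which is what keeps the bars from double-counting units with multiple defects.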
When analyzing warehouse content, there are usually too many different items to count physically, and we must rely on data in the plant’s databases. The working capital tied up by each item is usually the most accessible. But physical characteristics like volume occupied, weight or part count are often scattered in multiple databases and sometimes not recorded. The challenge then is to extract, clean, and integrate the relevant data from these different sources.
In both supply chain and in-plant logistics, the design of the system should start with an analysis of part consumption by a production line in terms of frequency of use, another quantity that is not additive across items. In this application, the classical Pareto chart is replaced by the S-curve, which plots the number of shippable product units as a function of the frequency ranks of the components used, and thus provides a business basis for grouping them into A-, B- and C-items, also known as runners, repeaters and strangers.
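One way to sketch the construction of such an S-curve is shown below, with invented product and component data (the names, volumes, and bills of materials are all hypothetical): components are ranked by frequency of use, and for each rank we count the product units that become shippable once all components up to that rank are covered.

```python
from collections import Counter

# Hypothetical data: product -> (units demanded per week, components used).
products = {
    "P1": (500, {"C1", "C2"}),
    "P2": (300, {"C1", "C3"}),
    "P3": (120, {"C2", "C3", "C4"}),
    "P4": ( 80, {"C4", "C5"}),
}

# Frequency of use of each component: total volume of the products using it.
freq = Counter()
for volume, comps in products.values():
    for c in comps:
        freq[c] += volume

ranked = [c for c, _ in freq.most_common()]   # components by descending frequency

# S-curve: with the top-k components covered, how many product units
# have all their components available and are therefore shippable?
s_curve = []
for k in range(1, len(ranked) + 1):
    covered = set(ranked[:k])
    shippable = sum(v for v, comps in products.values() if comps <= covered)
    s_curve.append((k, shippable))
print(s_curve)
```

The breakpoints where the curve's slope changes give a business basis for drawing the A/B/C boundaries, rather than applying arbitrary percentage cutoffs.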
The presentation of the results should be focused on communication, not decoration, and use graphics only to the extent that they serve this purpose.
Emergence of the Pareto distribution in a simulation
While the Pareto principle can be easily validated in many contexts, we are short on explanations as to why that is. We do not provide one, but we show how, in a simulation of a multiple-round roulette game in which each player’s winnings at each round determine his odds in the next one, the distribution of chips among the players eventually comes close to the 80/20 rule. This happens even though it is a pure game of chance: players start out with equal numbers of chips and have no way to influence the outcome.
Assume we use a roulette wheel to allocate chips among players in multiple rounds, with the following rules:
- On the first round, the players all have equal wedges of the roulette wheel, and the roulette is spun once for each chip.
- On each following round, each player’s wedge is proportional to the amount he won on the preceding round.
This is a pure game of chance. There is nothing players can do to influence the outcome, and the players who do best in one round are set to do even better on the next one. It is also a zero-sum game, in that whatever a player wins comes out of the other players’ hands. There is no safety net: a player with no chips after any round is out of the game. If you know the present state of the game (how many chips the player in each rank has), then the results of future rounds are independent of the path by which you arrived at that state. Technically, it is a random walk and a Markov process. The concept of the game is shown in Figure 1.
It could be viewed as representing a simplified Oklahoma Land Rush, in which farmers of equal talent are initially given identical plots of land. Even on the first round, some plots will have higher crop yields than others. If the average crop is needed to survive, the farmers with a surplus sell it to the farmers with a shortfall in exchange for land, and start the next season with land in proportion to the crop they had on the previous one. The players could also be functionally equivalent, similarly priced consumable products sold in the same channels, like toothpaste or soft drinks, and the chips would be the buyers of these products.
If we run this game through enough rounds, do we tend toward a distribution where 20% of the players hold 80% of the chips, or do all the chips end up in the hands of only one player? We simulated this game in software with 1 million chips to be divided among 1,000 players over 100,000 rounds; the simulation takes about 2 minutes to run on a desktop PC.
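The round-by-round reallocation lends itself to a compact sketch: spinning the wheel once per chip, with wedges proportional to current holdings, is equivalent to one multinomial draw over the players. The following is a reduced-scale sketch under that assumption (fewer players, chips, and rounds than the run described above, so the resulting split need not match the reported figures):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(players=500, chips=50_000, rounds=5_000):
    # Every player starts with the same number of chips.
    holdings = np.full(players, chips // players, dtype=np.int64)
    for _ in range(rounds):
        # Wedges proportional to current holdings; spinning the wheel
        # once per chip amounts to one multinomial draw. A player with
        # zero chips has probability zero and stays out of the game.
        holdings = rng.multinomial(chips, holdings / chips)
    return np.sort(holdings)[::-1]          # richest player first

holdings = simulate()
share = np.cumsum(holdings) / holdings.sum()
top = int(np.searchsorted(share, 0.80)) + 1  # players needed to reach 80% of the chips
print(f"{top} of {len(holdings)} players hold 80% of the chips")
```

At this reduced scale the inequality develops faster than in the full run, but the qualitative behavior is the same: a small minority of players ends up holding most of the chips.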
The results show that the “wealth distribution” is stable beyond 10,000 rounds, with 23% of the players collectively holding 80% of the chips. Figure 2 shows the distribution after 100,000 rounds, which roughly satisfies the 80/20 law, but not recursively: it takes the top 49% of the players to account for 96% of the chips, instead of 36%.
Figure 3 shows other indicators of the simulation results every 10,000 rounds. The proportion of the top players holding 80% of all the chips oscillates between 21.5% and 25%, with a very slow downward trend. The number of chips held by the top 10 players also varies, but is essentially flat.
This pure game of chance, which favors at every round the winners of the previous round, does not rapidly concentrate all the chips in a few hands. Instead, it produces a distribution that is as close to Pareto’s as anything we can observe in demographics, quality, or production control. This suggests the following two conclusions:
- The reason the 80/20 law is so often verified with actual data is that it does not require some categories to be “better” than others. A long random walk is sufficient to produce it, even when you start out with equal quantities assigned to each category. The products that are Runners are not necessarily better in any way than the Repeaters or the Strangers. From the point of view of Manufacturing, it does not matter why or how they became Runners: as long as they dominate the volume, they need dedicated resources.
- If some categories are intrinsically better able to attract the quantity to be distributed than others, we can expect the distribution to be more unequal than 80/20. If the players are farmers, differences in skill may skew the distribution further. If they are products competing for customers, the better products may take over the market, but it does not always happen.
The simulation has shown the limitations of the conclusions drawn from a Pareto chart. When a product is a Runner, by definition it deserves a dedicated production line. That is a direct consequence of its being at the top of the chart. It does not, however, imply that it is “better” in any way than the other products. The simulation arrived at a near 80/20 distribution of the chips without assuming any difference in the players’ ability to acquire chips: the roulette wheel has no favorites, and the players can do nothing to influence the outcome.
Jun 14 2011
On 5/24/2011, NWLEAN moderator Bill Kluck launched the richest, most vigorous debate in the 13-year history of that forum by asking members whether they were cost cutters or capacity enhancers. Over the following three weeks, 29 authors posted 68 spirited, yet courteous messages accessible from NWLEAN to anyone with a Yahoo! ID. The contributors included several well-known authors and some of my favorite sparring partners. In alphabetical order, they were Dan Barch, William Bowman, Robert Byrd, Abhijit Deshpande, Mark Graban, Jim Harrington, Jonathan Harrison, Rob Herhold, Blair Hogg, Gangadhar Joshi, Bill Kluck, Joachim Knuf, Ed Larmore, Paul Layton, Ted Mayeshiba, Jim McKechnie, Larry Miller, Joe Murli, John Nelson, Stephen P. O’Brien, Anthony Reardon, Tom Robinson, Sunita Sangra, Dale Savage, Patrick D. Smith, Tom Stogsdill, Mike Thelen, Chuck Woods, plus a few others who did not use their real names.
The message that started the thread was as follows:
“It seems more and more (especially in this recession!) that lean professionals fall into 2 camps:
– Cost cutters
– Capacity enhancers
There are some out there that are saying that you HAVE to cut costs, or you’re not getting lean.
There are others that say lean is about improving capacity, so you can enhance the customer experience.
Which are you, and why? Is this driven by the top of your organization, people who may not understand what lean is? Do we risk becoming cost cutters, or smoke-and-mirror guys?”
Although I didn’t participate initially, it was clear to me that the effect of a successful Lean implementation was limited neither to cost cutting nor to capacity enhancement. Instead, I see Lean as the pursuit of concurrent improvement in all dimensions of manufacturing performance, but I had already shared my thoughts on this matter in Lean as the End of Management Whack-a-Mole. Management Whack-a-Mole is the game illustrated in Figure 1, in which managers boost performance in one dimension at the expense of the others, shifting focus every few months without ever achieving a genuine overall improvement.
Several of the early posts, however, prompted me to jump into the fray and respond on a variety of topics, initially as follows:
- People trained in Finance and Management Accounting have had their shot at running US Manufacturing from the 1950s to the 1970s, with outcomes that should make them modest about the value of their approaches.
- Companies that have grown through Lean include Toyota since 1950, Wiremold from 1991 to 2000, and many others that are not publicized. A successful Lean effort helps Sales in specific ways, as, for example, when more flexibility enables Manufacturing to accept a sample order for a new component by a customer, who then designs it into his own products and becomes a major OEM.
- Cost reductions are a by-product of Lean but not its primary purpose. Manufacturing is a competitive sport, and Lean a strategy that makes you a stronger player, and not just right now but in the long run as well. When you hire a top coach for a sports team it’s not to cut costs but to win games.
- ROI is just one ratio, invented at DuPont 100 years ago to report in a common form the activities of multiple business units. There is nothing magic about it, and, by itself, it certainly does not give a complete picture of the health of a business, as even finance people recognize.
This caused further exchanges with Robert Byrd, who wrote: “…The most basic concept is Profit = Price – Cost. The market dictates price, so we have the best leverage for maximizing profit through controlling our cost structure…” My response:
Is this always true? What about situations in which time to market is of the essence? I remember a client that was introducing a new frozen food product. On one of our visits they had us taste the R&D prototype, which was delicious. Three months later, they had a jury-rigged production line making 900 units/minute. The line left much to be desired, but we could not argue with their focus on getting the product to market. During all that time, their focus was not on getting it done cheaply but on getting it done at all. Then they could go back and improve the line.
He also wrote: “…Which is best accomplished through the elimination of waste and problem solving…” My response:
This is the key point. We have yet to encounter a manufacturing or service organization in which the details of how work is being done contain no improvement opportunity, and this is what most managers simply do not believe. They are trained to think in terms of “big pictures,” and look for cost-cutting opportunities like eliminating entire departments or reducing every department’s budget by 5%.
And finally: “If the aim is only to increase profit by reducing cost, without focus to grow the business, then the culture will never take root and only a short-term improvement will be realized. Or the company will never achieve its true potential.”
Your point does not fit in a 30-second sound bite or tag line. Good managers are able to monitor all aspects of performance at the same time, from growth to costs. But communication is tricky. For example, if you showcase “Profit = Price – Cost,” what behaviors do you expect to stimulate, other than cost cutting?
In his second response, he added: “However, what was their business objective once they focused on improvement? I would venture to guess that it was focused on cost reduction.” My response:
I don’t recall them mentioning cost reduction explicitly. When the line first started, about 20% of the units were defective, and that was a focus of concern. Of course, quality improvements reduce costs as a side effect, but they also have other effects: quality problems have an impact on customer perceptions of the product because of variability among the units that are not defective.
In their other production lines, improvement was driven by marketing concerns. This was in Europe and, in spite of the existence of the EU, frozen food packages had different sizes by country, in addition to having labels in different languages. Based in Italy, they pursued quick changeovers in package sizes in order to be able to export to Germany, France, the UK, etc.
Their Lean program fit within their business strategy, but it is a stretch to say that this strategy was centered on cost reduction.
Apr 24 2011
Some companies subject job applicants to hands-on tests of the skills required for a position, which suggests that they are more interested in filling a capacity gap for a skill set than in recruiting people for careers. The most extreme cases are the “coding interviews” given at Silicon Valley software companies, during which candidates are asked to solve programming problems. This has spawned a whole sub-industry of coaches and books to help candidates cram for such interviews. The problems are typically the kind of textbook exercises given in college, which experienced programmers have long forgotten and which are irrelevant to their actual work. College students, for example, learn various ways of sorting records, while professional application programmers just use built-in Sort functions. Software developers with, say, 20 years of experience with databases perceive these interviews as silly and demeaning, raising the question of whether they are intended to bias the hiring process in favor of recent college graduates.
This is the opposite of the Lean approach. During a career at a company, a person would have to acquire many technical and managerial skills. With that in mind, the willingness and ability to learn are more important than what the person knows walking in. When Honda set up its Marysville facility, they deliberately hired people with no prior experience in car manufacturing, to train them from scratch in the Honda way. As an employee, the background knowledge you need is supposed to have been provided at school. Whether in the US or Japan, however, schools never work perfectly, and companies end up providing remedial training they feel they shouldn’t have to. However, if all you need today is a technical skill set, you are probably better off hiring a contractor than an employee.