Your last paragraph actually makes my point. You are saying that ideas coming from the US are taken more seriously in the UK than ideas from the UK, which has to be frustrating to creative Brits. While office work and administration have not been the focus of my work, I don't find his writings any less professional than the bulk of business writing.

As for crudity and bluntness, neither of which is my policy, Seddon's is no worse than that of many respected Japanese consultants, nicknamed "insultants."

On your other, rather bizarre, point: it's strange to allege that ideas coming from the UK are often automatically discredited. I don't find that to be true at all.

Incidentally, in the field of systems thinking, many of the originators were/are British.

The worst thing about the UK is that too many Britons think that the USA is the best thing since sliced bread, and imitate popular US trends rather than learning from the best of what the US comes up with. It's a question of focus, and it is fuelled by post-war British self-abnegation. Still, the UK's recent negative evaluation of itself is mostly not shared by foreigners, and the country remains globally influential on many counts.

However, in Germany and many other countries, IT is still presented as playing the leading role, the trigger for change through miracle-type solutions, as opposed to its supporting role! Without first reaching process stability (e.g., inserts are still inserts, and ledges are still ledges on NC tools!), we will face the same problem: even measuring 100% of parts with capable in-process measurement sensors, then processing the big data and sorting it out, will not help much to improve FPY rates or reduce the total cost of defects. Very few people talk about this, which is strange enough, and it should be the job of real manufacturing people, the ones on the shop floor…

If you could just look at the number of people, then the binomial distribution would lead to the same result as the Markov process discussed above. Both models have an 80% utilization, but the probability distribution of the number of technicians working at a given time is different.

To extend the 6-machine example, let the probability of a machine being in failure be 68.35%. The binomial distribution of the number of machines down is then:

Machines down: probability

0: 0.10%
1: 1.30%
2: 7.03%
3: 20.25%
4: 32.79%
5: 28.33%
6: 10.20%

If you take these values, you find the utilization of the workers is ~80%. The probability of having all workers working is 38.53%, which is greater than the 33% in the example.
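These figures can be reproduced in a few lines of Python, assuming (as in the example) 6 machines each independently down with probability 68.35% and 5 technicians:

```python
from math import comb

n_machines = 6
n_techs = 5
p_down = 0.6835  # probability a given machine is in failure

# Binomial probability of k machines down at any given time
dist = [comb(n_machines, k) * p_down**k * (1 - p_down)**(n_machines - k)
        for k in range(n_machines + 1)]

for k, p in enumerate(dist):
    print(f"{k} machines down: {p:.2%}")

# Each technician works on at most one machine, so with k machines
# down, min(k, n_techs) technicians are busy.
utilization = sum(min(k, n_techs) * p for k, p in enumerate(dist)) / n_techs
all_busy = sum(p for k, p in enumerate(dist) if k >= n_techs)
print(f"Technician utilization: {utilization:.1%}")  # ~80%
print(f"All technicians busy:  {all_busy:.2%}")      # ~38.5%
```

Note that "all technicians busy" covers both 5 and 6 machines down, which is why it comes to 5 + 6 of the table above rather than a single entry.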

If we could separate the technician from the work, these probabilities would be the same.

This gives you a binomial distribution for the number of available technicians at any time, with probability 0.8 that any given technician is busy, and the probability that all are busy is just 0.8 to the power of 20, or about 1%.
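Under that independence assumption, the check is a one-liner (0.8 being the per-technician busy probability from the example):

```python
p_busy = 0.8
n_techs = 20

# If each technician is independently busy with probability 0.8,
# the chance that all 20 are busy at once is:
p_all_busy = p_busy ** n_techs
print(f"P(all {n_techs} busy) = {p_all_busy:.2%}")        # ~1.15%
print(f"P(at least one free) = {1 - p_all_busy:.2%}")     # ~98.85%
```

The complement, the probability of at least one free technician, is what the service-level discussion below is about.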

Let's say there are 6 machines and 5 technicians. There is a nonzero probability of having anywhere from 0 to 6 machines currently failing. Without knowing the failure rate or distribution, you cannot calculate the probability of having 0-4 machines down.

If we start from a binomial distribution on 0-5, some of the probability has to move from 0-5 into 6. If it all came from 5, then it would be fine. However, to keep generality, it moves out of all of 0-5. This makes the probability of 5 or 6 machines down larger than the probability of just 5 in the original problem. Adding more machines obviously widens the gap from the original problem.
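The effect is easy to illustrate numerically; here, purely as an illustration, I reuse the 68.35% per-machine failure probability from the earlier example:

```python
from math import comb

def binom(n, p):
    """Binomial probability mass function as a list indexed by k."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

p_down = 0.6835  # per-machine failure probability (from earlier example)

# 5 machines, 5 technicians: all techs busy only when all 5 machines are down
five = binom(5, p_down)
p_all_busy_5 = five[5]

# 6 machines, 5 technicians: all techs busy when 5 or 6 machines are down
six = binom(6, p_down)
p_all_busy_6 = six[5] + six[6]

print(f"5 machines: P(all 5 techs busy) = {p_all_busy_5:.2%}")  # ~14.9%
print(f"6 machines: P(all 5 techs busy) = {p_all_busy_6:.2%}")  # ~38.5%
```

The probability of saturation more than doubles when the sixth machine is added, even though each machine's individual behavior is unchanged.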

If you want a 99% service level like the 20-person, 80%-utilization example, you would actually need 26 people (62% utilization) with exponentially distributed events.

The simple math I am using answers just one question: at any time, what is the probability of having at least one technician available to respond to a hypothetical emergency? It requires assumptions about the technicians' work pattern, but not about the process by which emergencies occur.
