# How to Pick the Fastest Line at the Supermarket | New York Times [Debunk]

“[…] Choose a single line that leads to several cashiers

Not all lines are structured this way, but research has largely shown that this approach, known as a serpentine line, is the fastest. The person at the head of the line goes to the first available window in a system often seen at airports or banks. […]”

Sourced through the New York Times

No! Research shows no such thing. The serpentine line does not reduce the customers’ mean time through the system. Little’s Law tells us that, in steady state, regardless of how the queue is organized:

$\text{Mean time in system} = \dfrac{\text{Mean number of customers in system}}{\text{Mean service rate}}$

The mean service rate is the rate at which customers flow through the system. If there are 5 check stands, each completing a transaction every 5 minutes on average, the service rate is 1 customer/minute for the whole system. In steady state, one new customer also arrives, on average, every minute.
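The arithmetic above can be checked in a few lines. This is a minimal sketch; the figure of 8 customers in the system is a hypothetical value chosen only to illustrate how Little's Law converts a head count into a mean time:

```python
# Little's Law: mean time in system = mean number in system / throughput.
# With 5 check stands, each averaging one transaction per 5 minutes,
# the whole system's throughput is 5 * (1/5) = 1 customer per minute.
stands = 5
minutes_per_transaction = 5.0
throughput = stands / minutes_per_transaction  # 1.0 customer/minute

# Hypothetical steady-state head count, for illustration only:
mean_customers_in_system = 8.0
mean_time_in_system = mean_customers_in_system / throughput
print(mean_time_in_system)  # 8.0 minutes, however the queue is organized
```

The result depends only on the head count and the throughput, not on how the waiting customers are arranged into lines.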

The serpentine line still makes a difference, however, by reducing the variability of waiting times. With one line per check stand, you are occasionally stuck behind another customer who takes an exceptionally long time. With a serpentine line, this is avoided: the slow customer still ties up one check stand for a long time, but the line automatically redirects the other customers to the other check stands.
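The variance-reduction effect can be illustrated with a small discrete-event simulation. This is a sketch under simplifying assumptions (Poisson arrivals, exponential service times, customers who join the shortest lane and never switch); the function name and parameters are my own, not from the post:

```python
import heapq
import random
from collections import deque

def simulate(n=5000, c=5, arrival_rate=0.8, mean_service=5.0, seed=42):
    """Push n customers through c check stands two ways: one serpentine
    line feeding all stands, vs. one FIFO lane per stand (join the
    shortest lane, no switching). Returns the times in system."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n):
        t += rng.expovariate(arrival_rate)          # Poisson arrivals
        arrivals.append(t)
    services = [rng.expovariate(1.0 / mean_service) for _ in range(n)]

    # Serpentine: head of the single line takes the first free stand.
    free = [0.0] * c                                # next-free times
    heapq.heapify(free)
    serpentine = []
    for a, s in zip(arrivals, services):
        start = max(a, heapq.heappop(free))
        heapq.heappush(free, start + s)
        serpentine.append(start + s - a)            # time in system

    # One lane per stand: join the lane with the fewest customers.
    lanes = [deque() for _ in range(c)]             # departure times
    last_dep = [0.0] * c
    parallel = []
    for a, s in zip(arrivals, services):
        for q in lanes:                             # drop finished customers
            while q and q[0] <= a:
                q.popleft()
        k = min(range(c), key=lambda i: len(lanes[i]))
        dep = max(a, last_dep[k]) + s
        last_dep[k] = dep
        lanes[k].append(dep)
        parallel.append(dep - a)
    return serpentine, parallel
```

Comparing the two samples' spreads shows the serpentine line's times in system clustering more tightly around the mean, which is the point of the post: the slow customer's delay is no longer inherited by everyone who happened to pick the same lane.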

Similar situations occur in manufacturing, with materials waiting in front of shared services like electroplating, painting or heat treatment. Manufacturers should get the math right, and so should the New York Times.

## 4 comments on “How to Pick the Fastest Line at the Supermarket | New York Times [Debunk]”

1. Kyle Harshbarger

In grad school I did research on this exact issue with Julie Niederhoff (cited in the article), and I’m on board with all of your observations. Additionally, the only time I’ve seen a choice between multiple-server queues and single-server queues is when other priority rules are involved, such as number of items or self-service. Julie’s suggestion has been taken out of context: the work is on how to set up a queue a priori, not on what customers should do when given a choice.

The single queue with multiple servers also preserves FIFO, which appears the most fair to customers. However, there is a potential psychological effect on servers in this situation. Since a server’s workload is now directly related to how every other server is working, there is a tendency to work slower. While the shared burden is the same in both queue types, a server in a single-queue system does not feel the pressure of the next customer in line keeping them accountable.

• Michel Baudin

The points in your second paragraph apply to human customers served by other humans, which is what you have in a grocery supermarket. But queueing systems do not always involve humans. In a factory’s heat treatment facility, for example, the customers are pieces of metal served by a machine; inside a computer, they are threads of execution served by one or more processors.

When you are not dealing with humans, you don’t need to worry about fairness and you can use complex sequencing rules if they perform better, without risking confusion. The rules used by computer operating systems to allocate slots on a processor among threads would be too complicated for people.

In manufacturing, the reason to use FIFO is neither fairness nor speed but quality. If a machine starts drifting, you can detect the drift and trace it through the sequence of work pieces, as long as FIFO is maintained. If you dump the parts into a heap in a bin, you scramble the message.

2. OK, so the NYT language is off. The “serpentine line” ensures that everyone gets through in something close to the mean time in system, rather than ensuring that everyone gets through faster. But a consistently flowing queue isn’t to be discounted.

• Michel Baudin

The serpentine queue also relieves you of the need to evaluate the existing queues and choose which one to join.