Little's Law

by Evan Conrad, February 25, 2018

Little's Law is a theorem in queueing theory stating that the long-term average number of items in a queue equals the average arrival rate multiplied by the average time an item spends in the queue.

Expressed in code, this looks like:

numberOfItemsInQueue = averageArrivalRate * averageTimeInQueue
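To make the formula concrete, here is a small worked example with made-up numbers: a shop where customers arrive at 10 per hour and each spends half an hour inside.

```python
# Illustrative numbers (assumed, not from the post): customers arrive
# at 10 per hour and each spends an average of 0.5 hours in the shop.
average_arrival_rate = 10      # customers per hour
average_time_in_queue = 0.5    # hours per customer

# Little's Law: average number of customers in the shop at any moment
number_of_items_in_queue = average_arrival_rate * average_time_in_queue
print(number_of_items_in_queue)  # 5.0
```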

The interesting bit

Little's Law shows us that this relationship is "not influenced by the arrival process distribution, the service distribution, the service order, or practically anything else." (src)
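We can check this distribution-independence with a quick simulation sketch. The distributions below (uniform interarrival and service times) are arbitrary choices; swap in anything you like and the identity still holds, because over a window in which every item arrives and departs, the time-average number in the system is exactly the arrival rate times the average sojourn time.

```python
import random

random.seed(42)

# Simulate a single-server FIFO queue with arbitrarily chosen
# (uniform) interarrival and service time distributions.
n = 10_000
t = 0.0
arrivals = []
for _ in range(n):
    t += random.uniform(0.5, 1.5)   # interarrival time
    arrivals.append(t)

departures = []
server_free = 0.0
for a in arrivals:
    start = max(a, server_free)                      # wait if server busy
    server_free = start + random.uniform(0.1, 0.7)   # service time
    departures.append(server_free)

T = departures[-1]                                   # observation window [0, T]
sojourns = [d - a for a, d in zip(arrivals, departures)]

# Measure L directly: integrate the number-in-system N(t) over [0, T].
events = sorted([(a, +1) for a in arrivals] + [(d, -1) for d in departures])
area, count, prev = 0.0, 0, 0.0
for time, delta in events:
    area += count * (time - prev)
    count += delta
    prev = time

L = area / T                  # time-average items in system
lam = n / T                   # average arrival rate
W = sum(sojourns) / n         # average time in system

print(f"L = {L:.4f}, lambda * W = {lam * W:.4f}")  # the two agree
```

Re-running with different distributions for the interarrival or service times changes L, lambda, and W individually, but never breaks the equality between L and lambda * W.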

Why this is important

If we know the average response time and the average number of requests per second, we can estimate the average number of concurrent requests a web server is handling. (src)
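For instance, with some hypothetical numbers (not from the post), a server receiving 500 requests per second with a 250 ms average response time is handling about 125 requests at any given moment:

```python
# Hypothetical capacity-planning numbers, for illustration only
requests_per_second = 500     # average arrival rate (lambda)
avg_response_time = 0.25      # average time in system (W), in seconds

# Little's Law: average number of in-flight requests (L)
concurrent_requests = requests_per_second * avg_response_time
print(concurrent_requests)  # 125.0
```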
