Queueing delay

In computer engineering, a queueing delay is the time a job waits in a queue until it can be executed.

This term is most often used in reference to routers. When packets arrive at a router, they have to be processed and transmitted. A router can process only one packet at a time. If packets arrive faster than the router can process them (for example, during a burst transmission), the router places them in a queue (also called a buffer) until it can transmit them.

The maximum queueing delay is proportional to buffer size: the longer the line of packets waiting to be transmitted, the longer the average waiting time. However, a larger buffer is generally preferable to a shorter one, because a short buffer overflows and discards ("drops") packets, and the resulting retransmissions lead to much longer overall transmission times.
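The effect of one-at-a-time transmission on waiting time can be illustrated with a short simulation. The following sketch (in Python, with purely illustrative values for the transmission delay and burst size) models a burst of packets arriving at an idle router at the same instant and records how long each packet waits in the buffer before its own transmission begins.

    from collections import deque

    TRANSMISSION_DELAY = 1.0   # D_t: seconds needed to transmit one packet (illustrative value)
    BURST_SIZE = 8             # number of packets arriving in a single burst (illustrative value)

    # All packets of the burst arrive at time 0 and wait in a FIFO buffer.
    buffer = deque(range(BURST_SIZE))

    clock = 0.0
    delays = []
    while buffer:
        packet = buffer.popleft()
        # Queueing delay: time spent in the buffer before this packet's transmission starts.
        delays.append(clock)
        print(f"packet {packet}: queueing delay = {clock:.1f} s")
        clock += TRANSMISSION_DELAY   # the router transmits one packet at a time

    print(f"average queueing delay = {sum(delays) / len(delays):.2f} s")

With these example values the first packet waits 0 s and the last waits 7 s, so the average is 3.5 s, matching the formula derived in the next section.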

Calculation of time

For a router

  • With transmission delay D_t
  • At 100% utilization
  • With a buffer capable of holding N packets
  • Not counting dropped packets, the average queueing delay D_q is
D_q = \frac{0 \cdot D_t + 1 \cdot D_t + 2 \cdot D_t + \cdots + (N - 1) D_t}{N}

Using the formula for the sum of an arithmetic series, 0 + 1 + \cdots + (N - 1) = N(N - 1)/2, this simplifies to:

D_q = \frac{D_t (N - 1)}{2}
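
As a quick sanity check, the closed-form expression can be compared against a direct average of the individual waiting times. The sketch below (in Python; the function names are hypothetical and chosen only for this example) evaluates both for several buffer sizes and confirms that they agree.

    def closed_form_delay(transmission_delay: float, n_packets: int) -> float:
        """Average queueing delay from the simplified formula D_q = D_t * (N - 1) / 2."""
        return transmission_delay * (n_packets - 1) / 2

    def direct_average_delay(transmission_delay: float, n_packets: int) -> float:
        """Average of the individual delays 0*D_t, 1*D_t, ..., (N - 1)*D_t."""
        delays = [k * transmission_delay for k in range(n_packets)]
        return sum(delays) / n_packets

    for n in (1, 2, 10, 100):
        assert abs(closed_form_delay(1.0, n) - direct_average_delay(1.0, n)) < 1e-9
        print(f"N = {n:3d}: average queueing delay = {closed_form_delay(1.0, n):.2f} (in units of D_t)")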

See also