Bit time

Bit time is a concept in computer networking. It is the time required for one bit to be ejected from a Network Interface Card (NIC) operating at some predefined standard speed, such as 10 Mbit/s. The time is measured from the moment the Logical Link Control (LLC) sublayer of layer 2 receives the instruction from the operating system until the bit actually leaves the NIC. Bit time is unrelated to the time it takes a bit to travel over the network medium; it concerns only the internals of the NIC.

To calculate the bit time of a NIC, use the following formula:

        bit time = 1 / NIC speed (in bit/s)

To calculate the bit time for a 10 Mbit/s NIC, apply the formula as follows:

        bit time = 1 / (10 * 10^6 bit/s)
                 = 10^-7 s
                 = 100 * 10^-9 s
                 = 100 nanoseconds

The bit time for a 10 Mbit/s NIC is 100 nanoseconds. That is, a 10 Mbit/s NIC can eject 1 bit every 100 nanoseconds.
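As an illustration, the same calculation can be written as a short Python sketch. The function name bit_time_seconds and the speeds other than 10 Mbit/s are assumptions chosen for this example, not part of any standard API.

        def bit_time_seconds(nic_speed_bps):
            # Bit time in seconds for a NIC transmitting at nic_speed_bps bits per second.
            return 1.0 / nic_speed_bps

        # Worked example from above: a 10 Mbit/s NIC has a bit time of 100 ns.
        print(bit_time_seconds(10e6))  # 1e-07 seconds, i.e. 100 nanoseconds

        # Other common Ethernet speeds, shown here only for comparison.
        for label, speed in [("10 Mbit/s", 10e6), ("100 Mbit/s", 100e6), ("1 Gbit/s", 1e9)]:
            print(f"{label}: {bit_time_seconds(speed) * 1e9:.1f} ns")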

Bit time is distinct from slot time, which is the time taken for a pulse to travel through the longest permitted length of network medium.
