The term "bandwidth" refers to the data rate a network connection or interface supports: its capacity for carrying data transfers. Higher bandwidth often translates to better performance, although overall performance also depends on other factors, such as latency and errors.
The term derives from the field of electrical engineering, where bandwidth is the range between the highest and lowest frequencies carried on a communication channel (band).
Measuring Network Bandwidth
Computer network bandwidth is measured in units of bits per second (bps). Most modern network devices support data rates of thousands and often millions or even billions of bps (units of Kbps, Mbps, and Gbps).
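Because these units use decimal prefixes (1 Kbps = 1,000 bps), converting a raw bps figure into the familiar units is a simple division. A minimal sketch, assuming a hypothetical helper named format_rate:

```python
def format_rate(bps):
    """Return a human-readable data-rate string for a value in bits per second.

    Uses decimal prefixes, as is conventional for network data rates:
    1 Kbps = 1,000 bps, 1 Mbps = 1,000,000 bps, 1 Gbps = 1,000,000,000 bps.
    """
    for factor, unit in ((1_000_000_000, "Gbps"),
                         (1_000_000, "Mbps"),
                         (1_000, "Kbps")):
        if bps >= factor:
            return f"{bps / factor:g} {unit}"
    return f"{bps} bps"

print(format_rate(56_000))        # 56 Kbps
print(format_rate(100_000_000))   # 100 Mbps
```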
Network devices each possess a bandwidth rating according to the maximum data rate they are physically capable of supporting. The network protocol technology a device uses determines its maximum data rate, and protocol designers and device makers assign bandwidth ratings to their devices accordingly. For example, old V.90 dial-up modems were rated as 56 Kbps devices, 802.11g Wi-Fi devices are rated at 54 Mbps, and Fast Ethernet links at 100 Mbps.
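A rated bandwidth also gives a theoretical floor on transfer time: payload size in bits divided by the rating in bps. A short sketch (the function name is illustrative) comparing the example links above:

```python
def transfer_time_seconds(payload_bytes, rated_bps):
    """Ideal transfer time at the rated bandwidth, assuming zero overhead.

    Real transfers take longer because of protocol overhead, errors,
    and retries, so this is a best-case estimate, not a prediction.
    """
    return payload_bytes * 8 / rated_bps

# A 1 MB (1,000,000-byte) file over the example links:
print(transfer_time_seconds(1_000_000, 56_000))        # ~142.9 s on a V.90 modem
print(transfer_time_seconds(1_000_000, 100_000_000))   # 0.08 s on Fast Ethernet
```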
Measuring Throughput versus Bandwidth
People sometimes use the terms "throughput" and "bandwidth" interchangeably. Technically, throughput represents the actual amount of useful data transferred over a network connection per unit of time, whereas bandwidth measures a theoretical maximum.
Due to network communication overheads (packing and unpacking of messages, collisions, errors, and retries), throughput is typically significantly lower than bandwidth. Internet speed tests measure the throughput of a client's Internet connection, and various utility programs, including ttcp, measure throughput on local networks.
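The calculation behind a speed-test-style measurement can be sketched as follows: throughput is the useful bytes actually delivered divided by elapsed time, and dividing that by the rated bandwidth gives an efficiency figure. The function names and the 80% example figure below are illustrative, not measurements:

```python
def throughput_bps(useful_bytes, elapsed_seconds):
    """Measured throughput in bits per second: useful payload over elapsed time."""
    return useful_bytes * 8 / elapsed_seconds

def link_efficiency(useful_bytes, elapsed_seconds, rated_bps):
    """Fraction of the rated bandwidth actually achieved (0.0 to 1.0)."""
    return throughput_bps(useful_bytes, elapsed_seconds) / rated_bps

# Hypothetical example: 10 MB of payload delivered in 1 second over a
# 100 Mbps Fast Ethernet link; overhead ate the remaining capacity.
eff = link_efficiency(10_000_000, 1.0, 100_000_000)
print(f"{eff:.0%}")  # 80%
```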