How-To Geek


Bandwidth, in computing terms, is the amount of data that can be transferred over a given connection in a set period of time. The speed of many things, such as internet connections, local network connections, and even local connections between host computers and peripherals, is frequently described in terms of bandwidth.

Bandwidth is usually noted in bits per second (or whatever multiple of bits per second is most practical for the application). Thus you will frequently hear an Internet Service Provider touting that its top-tier residential internet connection offers 50 Mbit/s, or read an article on a technology blog explaining that a new cable standard offers a 2 Gbit/s increase in available bandwidth over the previous standard.
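One common point of confusion with these units is that bandwidth is quoted in bits, while file sizes are quoted in bytes (8 bits each). A minimal sketch of the conversion, using a hypothetical helper function, shows why a "50 Mbit/s" connection downloads at roughly 6.25 megabytes per second:

```python
# Hypothetical helper: convert an advertised bandwidth in
# megabits per second into the megabytes per second a user
# would actually see on a download meter (1 byte = 8 bits).

def mbit_to_mbyte(mbit_per_s: float) -> float:
    """Convert a rate in megabits/s to megabytes/s."""
    return mbit_per_s / 8

print(mbit_to_mbyte(50))  # 6.25
```

So the advertised figure is always 8 times larger than the download speed most applications report.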

Improvements in technology have yielded increasingly higher bandwidths for wide area and local networks as well as hardware interfaces. Over the last twenty years, for example, the available internet connections in the average metropolitan area have gone from dial-up (max 56 Kbit/s), to ADSL (1.5 Mbit/s), to various cable/fiber-based offerings (100+ Mbit/s). Similar advancements have been seen in local networks (Ethernet standards have improved from 10 Mbit/s to 100 Gbit/s) and peripheral interfaces such as USB (USB 1.0 had a max transmission rate of 1.5 Mbit/s, whereas USB 3.0 maxes out at a theoretical 5 Gbit/s).
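To put those numbers in perspective, the sketch below estimates the theoretical time to transfer a 700 MB file (roughly a CD image) at the rates mentioned above. The file size is an assumption for illustration, and real-world throughput is always somewhat lower than the theoretical maximum:

```python
# Theoretical transfer times for a 700 MB file at the rates
# discussed above. Rates are in bits per second; the file size
# is converted to bits before dividing.

RATES_BITS_PER_S = {
    "Dial-up (56 Kbit/s)": 56_000,
    "ADSL (1.5 Mbit/s)": 1_500_000,
    "Cable/fiber (100 Mbit/s)": 100_000_000,
    "USB 3.0 (5 Gbit/s)": 5_000_000_000,
}

FILE_BITS = 700 * 1_000_000 * 8  # 700 MB expressed in bits

for name, rate in RATES_BITS_PER_S.items():
    seconds = FILE_BITS / rate
    print(f"{name}: {seconds:,.1f} s")
```

At 56 Kbit/s the transfer takes about 28 hours; at 5 Gbit/s it takes just over a second, which is why each jump in available bandwidth has felt so dramatic in practice.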
