From riiders's first link: "The Three Most Common Ethernet Speeds"
I almost stopped reading there. Networks are rated in bandwidth, not speed. If you're not sure of the difference, in a nutshell: speed is a rating of how quickly (or slowly) an object in motion is travelling (i.e., velocity = distance/time).
Bandwidth, on the other hand, is how much data can flow past any one point on a network in one second. 100 Mbps is not ten times faster than 10 Mbps; it carries ten times as much data. The "speed" of transmission remains constant.
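To put numbers on that, here's a rough back-of-the-envelope sketch (my own illustration, ignoring protocol overhead) showing that a bigger pipe shortens transfer time because it carries more bits per second, not because the bits travel faster:

```python
# Higher bandwidth moves more bits per second; the signal itself
# doesn't travel any "faster" down the wire.

def transfer_time_seconds(file_size_mb: float, bandwidth_mbps: float) -> float:
    """Time to push a file through a link, ignoring overhead."""
    file_size_megabits = file_size_mb * 8  # megabytes -> megabits
    return file_size_megabits / bandwidth_mbps

file_mb = 100  # a 100 MB file = 800 megabits
print(transfer_time_seconds(file_mb, 10))   # 10 Mbps link: 80.0 seconds
print(transfer_time_seconds(file_mb, 100))  # 100 Mbps link: 8.0 seconds
```

Ten times the bandwidth, one tenth the transfer time: same "speed," ten times the data per second.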
Then I read, "A network switch -- or hub -- supports..."
A switch and a hub are two different devices. A hub broadcasts all traffic to all ports, whereas a switch intelligently manages traffic. Say a device plugged into port one of a switch is "talking" to a device plugged into port four. The traffic flows only between ports one and four, whereas on a hub that same traffic would be broadcast to all ports (in both directions). I'm not sure anybody even manufactures hubs anymore, as the least intelligent switch out there is an infinitely better device to use than a hub. I know I haven't seen one in a workplace in over a decade.
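The difference can be sketched in a few lines. This is a toy model of my own (real switches build their MAC tables by learning source addresses from frames; the addresses and ports here are made up):

```python
# Toy forwarding logic: hub vs. switch (simplified illustration).

def hub_forward(in_port, all_ports):
    """A hub repeats every frame out every port except the one it arrived on."""
    return [p for p in all_ports if p != in_port]

def switch_forward(in_port, dst_mac, mac_table, all_ports):
    """A switch sends a frame for a known destination out one port only.
    Unknown destinations get flooded, hub-style, until the address is learned."""
    if dst_mac in mac_table:
        return [mac_table[dst_mac]]
    return [p for p in all_ports if p != in_port]

ports = [1, 2, 3, 4]
table = {"aa:bb:cc:dd:ee:04": 4}  # device on port 4, learned earlier

print(hub_forward(1, ports))                                 # [2, 3, 4]
print(switch_forward(1, "aa:bb:cc:dd:ee:04", table, ports))  # [4]
```

Same frame, same network: the hub wastes three ports' worth of bandwidth, the switch uses one.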
I had to stop here because the article was obviously written by a moron who didn't even spend 5 minutes researching the topic before beginning to write on it.
With regard to bandwidth usage: one can expect to realize approximately 85% of the rated bandwidth as "usable," which is to say, with a 100 Mbps connection you can expect to be able to use up to about 85 Mbps at any given time. The rest is consumed by overhead.
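The arithmetic is simple enough to write down (note the 85% figure is a rule of thumb, not a standard; real overhead varies with frame size and protocol):

```python
# Rule-of-thumb usable throughput: rated bandwidth minus ~15% overhead.

def usable_mbps(rated_mbps: float, efficiency: float = 0.85) -> float:
    """Approximate usable throughput for a link, given an efficiency factor."""
    return rated_mbps * efficiency

print(usable_mbps(100))   # about 85 Mbps usable on a 100 Mbps link
print(usable_mbps(1000))  # about 850 Mbps usable on a gigabit link
```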
Wireless, due to inherent issues with lag and latency, will never perform as well as wired. So a 1000 Mbps wired connection will always be better than a 1000 Mbps wireless connection.
It matters not how strait the gate,
How charged with punishments the scroll,
I am the master of my fate;
I am the captain of my soul.
message edited by Curt R