In the video Latency vs. Bandwidth, Duane Barnes differentiates between two concepts that form the basis of network connectivity. While these concepts are interrelated, understanding the distinction between them is critical to fostering a positive end-user experience.
Latency refers to the amount of time it takes to move data from Point A to Point B, while bandwidth measures how much data can move between those same points in a given moment. Where latency denotes delay, bandwidth denotes capacity.
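The relationship between the two can be made concrete with a small calculation. The sketch below is illustrative only; the function name and the numbers are assumptions, not figures from the video. It models total transfer time as the fixed latency plus the time the payload itself spends on the wire.

```python
def transfer_time(payload_bytes, latency_s, bandwidth_bps):
    """Total delivery time: fixed latency plus serialization time,
    i.e. the payload size divided by the link's bandwidth."""
    return latency_s + (payload_bytes * 8) / bandwidth_bps

# Example: a 1 MB file over a 100 Mbit/s link with 50 ms of latency.
t = transfer_time(1_000_000, 0.050, 100_000_000)
print(f"{t:.3f} s")  # → 0.130 s
```

Note how for small payloads the latency term dominates, while for large transfers bandwidth matters more, which is why the two must be considered separately.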
An intuitive way to explain this is the analogy of car traffic. Bandwidth in this scenario refers to the number of cars the road can carry at a particular time, while latency refers to the time it takes a single car to travel between two points.
Latency and bandwidth are important for everyday businesses that use services such as Salesforce, Office 365, or other cloud platforms. High latency and low bandwidth result in poor data transfer. Network providers and businesses mitigate this by prioritizing some packets of data over others and by establishing themselves in locations with strong bandwidth and a small number of network hops, respectively, to keep latency low.