Is it common for host machines to saturate the available bandwidth?
Although this is theoretically possible, I have never seen it happen in the real world. Typically, virtual machines are distributed across host servers according to the workload they are expected to place on the host. For example, a virtual machine that is expected to generate a lot of network traffic might be paired with virtual machines that generate very little, so that they do not compete for the same bandwidth. Another common solution is to give each virtual machine a dedicated network interface card (NIC).
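To make the pairing idea concrete, here is a minimal sketch of a workload-aware placement heuristic: sort virtual machines by their expected traffic and greedily assign each one to the host with the most spare NIC bandwidth, which naturally spreads heavy talkers out and fills the gaps with quiet VMs. The VM names, bandwidth estimates, and the `place_vms` function are all hypothetical illustrations, not the scheduler of any particular hypervisor.

```python
def place_vms(vms, hosts):
    """Greedily place VMs on hosts by expected network load.

    vms:   dict of VM name -> expected traffic in Mbps (estimates)
    hosts: dict of host name -> NIC capacity in Mbps
    Returns a dict of VM name -> host name.
    """
    load = {host: 0 for host in hosts}
    placement = {}
    # Place the heaviest VMs first so they claim the emptiest hosts.
    for vm, mbps in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
        # Pick the host with the most remaining headroom.
        host = max(load, key=lambda h: hosts[h] - load[h])
        if load[host] + mbps > hosts[host]:
            raise RuntimeError(f"No host can absorb {vm} ({mbps} Mbps)")
        load[host] += mbps
        placement[vm] = host
    return placement

if __name__ == "__main__":
    # Illustrative numbers only: two hosts with 1 Gbps NICs each.
    vms = {"db": 700, "backup": 500, "web1": 250, "web2": 250, "idle": 50}
    hosts = {"hostA": 1000, "hostB": 1000}
    for vm, host in place_vms(vms, hosts).items():
        print(f"{vm} -> {host}")
```

Running this places the two heavy VMs (`db`, `backup`) on different hosts and slots the lighter ones into the remaining headroom, which is the same intuition an administrator applies by hand when balancing chatty and quiet workloads.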