Huge or minimal performance hit running game servers on a Virtual Machine?

Damainman asked:

I have two dedicated servers to choose from, depending on which one would do a better job. I plan on upgrading the hard drive space and RAM at a later date, depending on how I move forward.

Server 1:

  • 500GB Hard Drive
  • 8GB RAM
  • 2x 64-bit Intel Xeon L5420 (Quad Core) @ 2.50GHz

Server 2:

  • 500GB Hard Drive
  • 8GB RAM
  • 2x 64-bit Intel Xeon E5420 (Quad Core) @ 2.50GHz

I want to run a virtual machine that will host about 10 game servers, with about 16 active slots per server. It will be a mix and match from:

  • Minecraft
  • Counter-Strike (1.6, Source, Global Offensive)
  • Battlefield
  • Team Fortress

I know the general consensus is that virtualization is a horrible idea if you plan on running game servers on it. The issue is, the discussions I read do not clearly state whether they mean a virtual machine running inside a host OS (e.g., VMware Player running on Windows with the game server in a VM) or a bare-metal hypervisor such as Xen Cloud Platform.

I am trying to get a definite answer on how feasible the above would be, and how much of a performance hit there might be if the VM running the game servers sits on a hypervisor such as Xen Cloud Platform. My initial research led me to believe that there wouldn't be much of a performance hit, since a bare-metal hypervisor is different from running a VM inside a host OS.


I answered:

Virtualization itself isn’t bad. In fact, it’s great. You can save a lot of money by using only as much hardware as you need.

However, there are a lot of crappy virtual machine providers.

Since it sounds like you’re going to do the virtualization yourself, rather than buy VPSes from some fly-by-night, you have the opportunity to do it well.
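If you do roll your own hypervisor host, the first sanity check is whether the CPUs expose hardware virtualization extensions (both the L5420 and E5420 support Intel VT-x, which shows up as the `vmx` flag in `/proc/cpuinfo` on Linux; AMD's equivalent is `svm`). A minimal sketch of that check, written as a helper function so it can be tested against a sample flags string; the function name is just illustrative:

```shell
# Report whether a CPU "flags" line advertises hardware virtualization
# support: "vmx" for Intel VT-x, "svm" for AMD-V.
has_virt_flag() {
  # $1: the contents of a "flags" line from /proc/cpuinfo
  case " $1 " in
    *" vmx "*|*" svm "*) echo "yes" ;;
    *)                   echo "no"  ;;
  esac
}

# On a live Linux host you would feed it the real flags line, e.g.:
#   has_virt_flag "$(grep -m1 '^flags' /proc/cpuinfo)"
has_virt_flag "fpu pae vmx sse2"   # prints: yes
```

If the flag is missing, it may simply be disabled in the BIOS rather than absent from the silicon, so check there before ruling the hardware out.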


View the full question and answer on Server Fault.

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.