New whitebox for extending my home lab

For a couple of months I have been searching for an extra whitebox host to extend my home lab environment. My current lab whitebox is Haswell based (see: link). Here is an overview of the new lab environment:


For the new whitebox I had the following requirements:

  • Hardware such as NICs must be recognized by VMware ESXi
  • Use 32 GB memory or more
  • Low power consumption
  • Expandable
  • Small form factor
  • Quiet
  • Possibility to run nested hypervisors such as VMware ESXi and Hyper-V
  • Remote Management
  • Possibility to create a VMware Cluster and use vMotion, HA, DRS and DPM with the existing Haswell host

I reviewed the following popular home lab systems:

  • Intel NUC
  • Apple Mac Mini
  • Gigabyte BRIX

The main reason NOT to choose one of the above systems is that they support only 16 GB of memory. In November 2014 I found a motherboard that meets all the requirements, after reading a review on a Dutch hardware website. The review covered the ASRock C2750D4I motherboard. After some additional research I ordered the following parts to build this whitebox:

  • ASRock C2750D4I motherboard
  • Kingston 4 x 8 GB DDR3, PC3-12800, CL11 (32 GB total)
  • be quiet! System Power 7 300 W power supply
  • Cooler Master N300 ATX midi tower

VMware ESXi boots from a USB stick and the VMs are placed on an iSCSI target, so no extra storage is needed. The above parts cost me around €735.
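Since the VMs live on an iSCSI target, only the software iSCSI adapter has to be enabled and pointed at the NAS. A minimal sketch from the ESXi shell; the adapter name (vmhba33) and the target address are examples, not my actual values:

```shell
# Enable the software iSCSI initiator (this creates a vmhba, e.g. vmhba33)
esxcli iscsi software set --enabled=true

# Add the NAS as a dynamic discovery (Send Targets) address - example IP
esxcli iscsi adapter discovery sendtarget add -A vmhba33 -a 192.168.1.50:3260

# Rescan the adapter so the iSCSI LUNs show up as datastore candidates
esxcli storage core adapter rescan --adapter vmhba33
```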

The ASRock C2750D4I motherboard has the following specifications:

  • Mini ITX motherboard
  • CPU: Intel Avoton C2750 64-bit 8-core processor (passively cooled)
  • Graphics: ASPEED AST2300 16 MB
  • Memory: 4 x DDR3 DIMM slots, max: 64 GB memory
  • Controller: Intel C2750 (2 x SATA3, 4 x SATA2), Marvell SE9172 (2 x SATA3), Marvell SE9230 (4 x SATA3). Total of 12 SATA ports.
  • NIC: Dual Intel i210 Gigabit LAN adapter
  • 1 x PCIe 2.0 x8 slot
  • Remote Management: BMC Controller with IPMI dedicated LAN adapter
  • TDP of 20 W

CPU

The Intel Avoton C2750 is an Atom-based processor with 8 cores. It is passively cooled and quiet. The Avoton processor is 64-bit and supports Intel VT-x with Extended Page Tables (EPT), so it is possible to nest hypervisors such as ESXi and Hyper-V. The Atom processor with 8 cores gives enough CPU performance for my lab environment.
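To actually pass VT-x/EPT through to a guest, nested virtualization has to be enabled per VM. A minimal sketch of the relevant .vmx settings on ESXi 5.5 (set via the vSphere Web Client or by editing the .vmx file directly; the second line is only needed for Hyper-V guests):

```
vhv.enable = "TRUE"           # expose VT-x/EPT to the guest OS
hypervisor.cpuid.v0 = "FALSE" # hide the hypervisor CPUID bit so Hyper-V will install
```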

Memory

The motherboard contains 4 memory banks with a maximum of 64 GB DDR3 memory (4 x 16 GB). I chose 4 x 8 GB Kingston DDR3, PC3-12800, CL11 DIMMs because of the price; 16 GB modules are too expensive at the moment. This gives the motherboard 32 GB of memory.

NICs

The ASRock C2750D4I system contains a dual Intel i210 Gigabit LAN adapter. The Intel i210 adapters are recognized out of the box by ESXi 5.5 and Windows Server 2012 R2; no additional modifications or drivers are needed.

Power consumption

The 300 W power supply is more than enough; the processor has a TDP of only 20 W. This whitebox consumes around 35 W with a couple of VMware VMs running on it.


The ASRock C2750D4I system is part of a VMware cluster with Distributed Power Management (DPM) enabled. When DPM kicks in, the host uses only 4 W.

Remote Management

Management and remote control are possible thanks to the BMC (Baseboard Management Controller) and IPMI (Intelligent Platform Management Interface).
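Because the board has a dedicated IPMI LAN port, the host can be controlled from any machine with ipmitool installed. A sketch; the BMC address and credentials below are placeholders:

```shell
# Query the power state via the BMC (address and credentials are examples)
ipmitool -I lanplus -H 192.168.1.60 -U admin -P secret chassis power status

# Power the host on remotely, e.g. after DPM has shut it down
ipmitool -I lanplus -H 192.168.1.60 -U admin -P secret chassis power on

# Read temperatures and fan speeds from the BMC sensors
ipmitool -I lanplus -H 192.168.1.60 -U admin -P secret sdr list
```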


VMware ESXi support

On the ASRock C2750D4I system, VMware ESXi 5.5 Update 2 with the latest updates is installed.


The Intel i210 Gigabit NICs and the Avoton AHCI controllers are recognized out of the box, so VMware VSAN (unsupported) could be an option.


The Marvell SATA 6 Gb/s controllers are not recognized by default. Follow the instructions explained in the following blog post to enable them:

  • How to make your unsupported SATA AHCI Controller work with ESXi 5.5 and 6.0, link.
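The method from that blog post boils down to installing a small community VIB (sata-xahci) that maps the Marvell controllers' PCI IDs to the generic AHCI driver. Roughly, from the ESXi shell (check the linked post for the current package details):

```shell
# Allow the host to download packages over HTTP(S)
esxcli network firewall ruleset set -e true -r httpClient

# Community VIBs require the CommunitySupported acceptance level
esxcli software acceptance set --level=CommunitySupported

# Install the sata-xahci package from the V-Front Online Depot
esxcli software vib install -n sata-xahci -d https://vibsdepot.v-front.de

# Reboot so the Marvell controllers are claimed by the AHCI driver
reboot
```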

Update March 23, 2015: Today I installed VMware ESXi 6.0 on the C2750. Everything seems to work.

Windows Hyper-V support

As a test I installed the vNext Server Technical Preview on the ASRock C2750D4I system (with an SSD as local storage) with the Hyper-V role enabled. The two Intel i210 Gigabit NICs are recognized out of the box, and performance is great.


Conclusion

The ASRock C2750D4I motherboard is a great system for building or extending a home lab environment based on VMware or Hyper-V. This board gives enough performance for a home lab and meets all the requirements I had for an additional whitebox host. I use it mainly for nesting VMware ESXi and Hyper-V hypervisors.

Enable the Intel I217-V network card in VMware ESXi

The motherboard of my Haswell whitebox contains an Intel I217-V network card. See my “Haswell low power whitebox for ESXi and Hyper-V” post for more information. The Intel I217-V network card is not recognized by VMware ESXi 5.x by default. On the VMware community forums I found a post with the Intel I217-V ESXi driver attached. To enable the Intel I217-V network card you have two options:

  • Inject the driver into the ESXi ISO using, for example, the “ESXi-Customizer” tool. See my “VMware ESXi 5 whitebox NIC support” blog post for more information.
  • Install the driver when ESXi is already installed. Note that you need a supported network card to be able to install ESXi in the first place!

Here is a quick overview of how to install the driver when ESXi is already installed:

  • Download the driver
  • Upload the VIB to a datastore using the vSphere Client
  • Start the SSH service on the ESXi server and make an SSH connection to the ESXi server
  • Put the ESXi server in maintenance mode, command: esxcli system maintenanceMode set -e true
  • Change the ESXi host acceptance level to Community Supported, command: esxcli software acceptance set --level=CommunitySupported
  • Install the VIB, command: esxcli software vib install -v /vmfs/volumes/datastore1/net-e1000e-2.3.2.x86_64.vib
  • Reboot the system after the message “The update completed successfully, but the system needs to be rebooted”, command: reboot
  • When the ESXi host is up again, make an SSH connection and exit maintenance mode, command: esxcli system maintenanceMode set -e false
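The steps above can be collected into one short ESXi shell session, assuming the VIB was uploaded to datastore1:

```shell
# Enter maintenance mode before changing the host
esxcli system maintenanceMode set -e true

# Allow community-supported VIBs, then install the e1000e driver
esxcli software acceptance set --level=CommunitySupported
esxcli software vib install -v /vmfs/volumes/datastore1/net-e1000e-2.3.2.x86_64.vib

# The installer reports that a reboot is required
reboot

# After the host is back up, reconnect over SSH and leave maintenance mode
esxcli system maintenanceMode set -e false

# Verify that the I217-V adapter now shows up
esxcli network nic list
```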

Check whether the Intel I217-V network adapter is visible in the vSphere Client or vSphere Web Client.


Haswell low power whitebox for ESXi and Hyper-V

I was searching for a new whitebox for my home lab, with the following requirements:

  • Low power consumption for 24×7 running
  • >16 GB memory
  • Expansion slots for PCI(-E) cards
  • Good performance
  • Low noise

A couple of weeks ago Intel released the new 4th generation Haswell CPUs, which consume less power. This seemed an interesting option for building a low power consumption whitebox, so I did some research and ordered the following hardware components:

  • CPU: Intel Core i5-4570S (boxed)
  • Motherboard: Gigabyte GA-Z87-D3HP
  • Memory: Corsair Vengeance 4 x 8 GB DDR3 PC3-12800 (DDR3-1600), 32 GB total
  • Power supply: Seasonic G-360 80 Plus Gold

The following components I reused:

  • The case
  • Two Intel PCI-e NICs
  • An SSD drive

CPU

Haswell is the codename for the 4th generation Intel Core processors. One of the big improvements of the Haswell CPUs is the idle power consumption. The Intel S-version is a low-power CPU: it contains 4 cores and has a 65 W TDP. The processor supports vPro, VT-x, VT-d, EPT, etc. A CPU cooler is included in the box. For the full specifications look here.

Motherboard

The Gigabyte GA-Z87-D3HP is a socket 1150 motherboard. The board has 4 memory sockets that support DDR3 memory up to 32 GB. It has onboard graphics, 6 x SATA 6 Gb/s connectors, an Intel I217-V LAN adapter, and the following expansion slots:

  • 1 x PCI Express x16 slot
  • 1 x PCI Express x16 slot, running at x4
  • 2 x PCI Express x1 slots
  • 2 x PCI slots

Memory

For the memory I chose the Corsair Vengeance CML32GX3M4A1600C10 4 x 8 GB PC3-12800 (DDR3-1600) CL10 kit. All four memory sockets on the board are filled with an 8 GB module (32 GB in total).

Power Supply

The Seasonic G-360 power supply has an 80 Plus Gold certification. This is a great power supply with high efficiency and low noise.


Hypervisor support

I tested VMware ESXi 5.1 Update 1, ESXi 5.5, and Microsoft Windows Server 2012 with the Hyper-V role installed.

VMware ESXi

The onboard Intel I217-V NIC is not recognized by ESXi 5.1 Update 1. To get the I217-V NIC working in ESXi 5.x, read the blog post found here. I reused two Intel PCI-e NICs: one for LAN and the other for iSCSI and NFS traffic. The onboard SATA controller (Lynx Point AHCI) is recognized. I use an existing SSD for booting ESXi and running some important VMs, but it is also possible to boot ESXi from a USB stick. The other VMs are on a NAS device. Passthrough is supported by the CPU and motherboard.


Microsoft Windows Server 2012 with the Hyper-V role

I tested Windows Server 2012 with the Hyper-V role enabled. The onboard Intel I217-V NIC is not recognized by default. In the “Enable the Intel I217-V NIC in Windows Server 2012” blog post I explain how to enable the I217-V NIC in Windows Server 2012.


Power consumption

The whitebox consumes at most 40 to 50 W. When idle (which is most of the time) it consumes only 28 to 29 W!


The whitebox components cost about €615, and the box will run VMs 24×7. The whitebox meets all the requirements I had.