Home lab extension with a 6th generation Intel NUC

For my home lab I bought a 6th generation Intel NUC. The Intel NUC has the following specifications:

  • Intel NUC6i3SYH
  • Intel i3-6100U (Skylake) 2.3 GHz dual-core, 3 MB cache, 15 W TDP
  • 2 memory slots for DDR4-2133 SODIMM memory, maximum is 32 GB memory
  • Intel HD Graphics 520 GPU
  • Intel I219-V Gigabit network adapter and Intel Wireless-AC 8260 Wi-Fi adapter
  • Option to install a 2.5″ HDD/SSD and an M.2 SSD card (2242 or 2280)
  • 4 USB 3.0 ports (2 in the front and 2 on the rear)
  • SD card reader (SDXC cards)
  • Case and a 19V AC-DC adapter

The Intel NUC will be used as the management server for my Software Defined DataCenter (SDDC) home lab environment. It will host VMs such as:

  • Domain Controller + DNS
  • vCenter Server Appliance
  • Virtual SAN witness appliance
  • Veeam backup
  • Etc.

The VMs are stored on a Synology NAS, which the Intel NUC accesses over an NFS connection. The NUC will not have any local disks; it boots ESXi from a USB stick.
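
As an illustration, mounting the Synology NFS export as an ESXi datastore can also be scripted with pyVmomi, the vSphere Python SDK. This is a minimal sketch, not my actual setup: the host names, export path and datastore name are placeholder values.

    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim
    import ssl

    # Lab hosts typically use self-signed certificates
    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="esxi-nuc.lab.local", user="root",
                      pwd="secret", sslContext=ctx)

    # Take the first (and only) host in the inventory
    host = si.content.rootFolder.childEntity[0].hostFolder.childEntity[0].host[0]

    # Mount the NFS export of the NAS as a datastore on the host
    spec = vim.host.NasVolume.Specification(
        remoteHost="synology.lab.local",  # placeholder NAS address
        remotePath="/volume1/VMs",        # placeholder NFS export
        localPath="nfs_synology",         # datastore name as shown in ESXi
        accessMode="readWrite",
    )
    host.configManager.datastoreSystem.CreateNasDatastore(spec)
    Disconnect(si)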

Processor

The 6th generation Intel NUC offers two CPU choices:

  • Intel i3 (Skylake), available in the NUC6i3SYH model
  • Intel i5 (Skylake), available in the NUC6i5SYH model

Both CPUs have 2 cores and support Hyper-Threading. The table below gives a quick comparison between both processors:

[Table: Intel i3-6100U vs. i5-6260U comparison]

For this configuration the Intel NUC with the i3-6100U processor is sufficient and saves €100. The i3 has 2 cores and Hyper-Threading, so 4 logical processors are displayed in the hypervisor.

Other advanced technologies such as VT-x, VT-d, and EPT are fully supported.

Memory

The Intel NUC has 2 memory slots and supports up to 32 GB of DDR4-2133 SODIMM memory. I added 2 Crucial 16 GB DDR4-2133 (CT16G4SFD8213) modules, which makes a total of 32 GB of memory.

I used the same memory as suggested on the virten.net blog (link).

Network card

The Intel NUC has an Intel I219-V Gigabit network adapter and a wireless network card. Only the Intel I219-V can be used with VMware ESXi.

Storage

The NUC has an M.2 (PCIe Gen 3 x4) slot and an Intel AHCI SATA-600 controller. It is possible to install a 2.5″ SSD or hard disk in the drive cage.

The VMs reside on a Synology NAS, so the NUC will not have any disks other than a USB drive for booting VMware ESXi.

VMware ESXi

A USB 3.0 stick is used to boot VMware ESXi. The stick has VMware ESXi 6.0 U1b (VMware-VMvisor-Installer-201601001-3380124.x86_64) installed. For creating a USB stick with ESXi 6 you can use the blog post here; only steps 1 through 3 are needed.

There is no need to add extra drivers to the ESXi image because the network and storage adapters are recognized by default.
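
A quick way to verify what ESXi recognized is to list the physical NICs and storage adapters. Below is a small pyVmomi sketch that does this; the connection details are placeholders.

    from pyVim.connect import SmartConnect
    import ssl

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="esxi-nuc.lab.local", user="root",
                      pwd="secret", sslContext=ctx)
    host = si.content.rootFolder.childEntity[0].hostFolder.childEntity[0].host[0]

    # Physical NICs and the driver ESXi bound to them
    for pnic in host.config.network.pnic:
        print(pnic.device, pnic.driver)

    # Storage adapters (the Intel AHCI controller and USB boot device show up here)
    for hba in host.config.storageDevice.hostBusAdapter:
        print(hba.device, hba.driver, hba.model)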

Passthrough and nesting

Passthrough is supported by the CPU and motherboard.

Nesting, such as running VMware in VMware or Hyper-V in VMware, is possible. Below is a screenshot of a Hyper-V server with a VM, hosted on ESXi.

[Screenshot: Hyper-V server with a VM, running on ESXi]
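
For reference, the two settings that make this work can also be applied programmatically. The sketch below uses pyVmomi to enable nested hardware virtualization on a VM and to hide the hypervisor from the guest, which Hyper-V commonly requires on ESXi of this era; the vCenter address and VM name are placeholders.

    from pyVim.connect import SmartConnect
    from pyVmomi import vim
    import ssl

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="vcenter.lab.local", user="administrator@vsphere.local",
                      pwd="secret", sslContext=ctx)

    # Find the VM by name ("hyperv01" is a placeholder)
    view = si.content.viewManager.CreateContainerView(
        si.content.rootFolder, [vim.VirtualMachine], True)
    vm = next(v for v in view.view if v.name == "hyperv01")

    spec = vim.vm.ConfigSpec()
    spec.nestedHVEnabled = True  # expose VT-x/EPT to the guest (vSphere 5.1+)
    # Hyper-V refuses to start when it detects a hypervisor, so hide it
    spec.extraConfig = [vim.option.OptionValue(key="hypervisor.cpuid.v0",
                                               value="FALSE")]
    vm.ReconfigVM_Task(spec=spec)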

Power consumption

The average power consumption of the NUC is between 20 and 30 watts with a couple of VMs active.

Costs

Component                   Amount    Total
Intel NUC NUC6i3SYH         1         € 299,00
Crucial 16 GB DDR4-2133     2         € 235,80
USB 3.0 stick 16 GB         1         € 10,00
Total                                 € 544,80

Conclusion

The 6th generation Intel NUC is a great and easy option for creating a small ESXi home lab. I use the Intel NUC as a management server with a couple of VMs. Another use case is creating a 2- or 3-node hybrid Virtual SAN (VSAN) cluster: put a Samsung 950 PRO in the M.2 slot for caching and a 2.5″ HDD as the capacity tier. Easy.

Pros and cons

Pros:

  • All-in-one package including a motherboard, processor, enclosure and power adapter
  • Supports up to 32 GB of memory
  • Easy to install
  • Small Form Factor
  • Low noise & power consumption

Cons:

  • The hardware is not on the VMware HCL
  • Need a converter to connect to a DVI or VGA monitor
  • Only 2 cores available
  • No expansion possibilities such as adding an extra network card
  • No remote management

New whitebox for extending my home lab

For a couple of months I have been searching for an extra whitebox host to extend my home lab environment. My current lab whitebox is a Haswell based whitebox (see: link). Here is an overview of the new lab environment:

[Diagram: lab environment overview]

For the new whitebox I had the following requirements:

  • Hardware such as NICs must be recognized by VMware ESXi
  • Use 32 GB memory or more
  • Low power consumption
  • Expandable
  • Small form factor
  • Quiet
  • Possibility to run nested hypervisors such as VMware ESXi and Hyper-V
  • Remote Management
  • Possibility to create a VMware Cluster and use vMotion, HA, DRS and DPM with the existing Haswell host

I reviewed the following popular home lab systems:

  • Intel NUC
  • Apple Mac mini
  • Gigabyte BRIX

The main reason not to choose one of the above systems is that they only support 16 GB of memory. In November 2014 I found a motherboard that meets all the requirements, after reading a review on a Dutch hardware website. The review was about the ASRock C2750 motherboard. After some additional research I ordered the following parts to build this whitebox:

  • ASRock C2750 motherboard
  • Kingston 4 x 8 GB DDR3 PC3-12800 CL11 memory, 32 GB in total
  • be quiet! System Power 7 300 W power supply
  • Cooler Master N300 ATX midi tower

VMware ESXi boots from a USB stick and the VMs are placed on an iSCSI target, so no extra storage is needed. The above parts cost me around € 735,00.

The ASRock C2750D4I motherboard has the following specifications:

  • Mini ITX motherboard
  • CPU: Intel Avoton C2750 64-bit 8-core processor (passively cooled)
  • Graphics: ASPEED AST2300 16 MB
  • Memory: 4 x DDR3 DIMM slots, max: 64 GB memory
  • Controllers: Intel C2750 (2 x SATA3, 4 x SATA2), Marvell SE9172 (2 x SATA3), Marvell SE9230 (4 x SATA3), a total of 12 SATA ports
  • NIC: Dual Intel i210 Gigabit LAN adapter
  • 1 x PCIe 2.0 x8 slot
  • Remote Management: BMC Controller with IPMI dedicated LAN adapter
  • TDP of 20 W

CPU

The Intel Avoton C2750 is an Atom-based processor with 8 cores. It is passively cooled and quiet. The Avoton processor is 64-bit and supports Intel VT-x with Extended Page Tables (EPT), so it is possible to nest hypervisors such as ESXi and Hyper-V. The Atom processor with 8 cores gives enough CPU performance for my lab environment.

Memory

The motherboard contains 4 memory banks with a maximum of 64 GB DDR3 memory (4 x 16 GB). I chose 4 x 8 GB Kingston DDR3 PC3-12800 CL11 DIMMs because of the price; 16 GB modules are too expensive at the moment. The motherboard now has 32 GB of memory.

NICs

The ASRock C2750D4I system contains a dual Intel i210 Gigabit LAN adapter. The Intel i210 adapters are recognized out of the box by ESXi 5.5 and Windows Server 2012 R2. No additional modifications or drivers are needed.

Power consumption

The 300 W power supply is more than enough. The processor has a TDP of 20 W. This whitebox consumes around 35 W with a couple of VMware VMs on it.

The ASRock C2750D4I system is part of a VMware cluster with Distributed Power Management (DPM) enabled. When DPM kicks in, only 4 W is used.
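
Enabling DPM on a cluster can also be done through the API. Below is a minimal pyVmomi sketch, assuming a vCenter and a cluster named "LabCluster" (both placeholders):

    from pyVim.connect import SmartConnect
    from pyVmomi import vim
    import ssl

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="vcenter.lab.local", user="administrator@vsphere.local",
                      pwd="secret", sslContext=ctx)

    # Find the cluster by name ("LabCluster" is a placeholder)
    view = si.content.viewManager.CreateContainerView(
        si.content.rootFolder, [vim.ClusterComputeResource], True)
    cluster = next(c for c in view.view if c.name == "LabCluster")

    # Enable DPM in fully automated mode
    spec = vim.cluster.ConfigSpecEx(
        dpmConfig=vim.cluster.DpmConfigInfo(enabled=True,
                                            defaultDpmBehavior="automated"))
    cluster.ReconfigureComputeResource_Task(spec, True)  # modify=True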

Remote Management

Management and remote control are possible because of the BMC (Baseboard Management Controller) and IPMI (Intelligent Platform Management Interface).

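The BMC can also be queried over the network with a standard IPMI client. The sketch below shells out to ipmitool from Python; the BMC address and credentials are placeholders.

    import subprocess

    # Base ipmitool invocation over the LAN (address/credentials are placeholders)
    BMC = ["ipmitool", "-I", "lanplus", "-H", "bmc.lab.local",
           "-U", "admin", "-P", "secret"]

    # Power state of the chassis
    print(subprocess.run(BMC + ["chassis", "power", "status"],
                         capture_output=True, text=True).stdout)

    # Temperature, fan and voltage sensor readings
    print(subprocess.run(BMC + ["sensor", "list"],
                         capture_output=True, text=True).stdout)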

VMware ESXi support

On the ASRock C2750D4I system, VMware ESXi 5.5 Update 2 with the latest updates is installed.

The Intel i210 Gigabit NICs and the Avoton AHCI controllers are recognized out of the box, so VMware VSAN (unsupported) could be an option.

The SATA 6 Gb/s controller is not recognized by default. Follow the instructions in the following blog post to enable the controller:

  • How to make your unsupported SATA AHCI Controller work with ESXi 5.5 and 6.0, link.

Update March 23, 2015: Today I installed VMware ESXi 6.0 on the C2750. Everything seems to work.

Windows Hyper-V support

As a test I installed Windows Server vNext Technical Preview on the ASRock C2750D4I system (with an SSD as local storage) and enabled the Hyper-V role. The two Intel i210 Gigabit NICs are recognized out of the box, and performance is great.

Conclusion

The ASRock C2750D4I motherboard is a great system for building or extending a home lab environment based on VMware or Hyper-V. This board gives enough performance for a home lab and meets all the requirements I had for an additional whitebox host. I use it mainly for nesting VMware ESXi and Hyper-V hypervisors.

Intel X79 whitebox for vSphere 5 and Hyper-V 3

Updates:

Update August 8, 2012: Added Microsoft Windows 2012 Hyper-V screenshots and a link to a blog post on how to enable the Intel 82579V NIC.

Update August 9, 2012: Updating to the latest BIOS enables support for DirectPath I/O in VMware vSphere. Screenshot added.

In an earlier blog post (found here) I mentioned that it is time for a new homebrew whitebox based on the Intel X79 chipset. With the X79 chipset it is possible to install 64 GB of memory (8 x 8 GB). Because 8 GB DIMMs are expensive at the moment, I decided to use 8 x 4 GB DIMMs (32 GB in total).

I decided to create one physical host for testing VMware vSphere 5, vCloud Director, VMware SRM, VMware View 5, etc. The possibility to create a physical ESXi 5 server, create virtual ESXi hosts on it, and start VMs on the virtual hosts is great! This feature is called nesting. How to do this can be found on William Lam's blog, found here.

Components used for the VMware ESXi 5 / Microsoft Windows Server 2012 whitebox:

  • Intel i7-3820 CPU, 3.60 GHz, 4 cores (8 logical processors with Hyper-Threading)
  • Zalman CNPS10X performance cooler
  • Asus P9X79 s2011 motherboard. Some specs:
      • Socket 2011
      • 8 DIMM slots, supports 64 GB memory
      • Expansion slots: 2 x PCIe 3.0 (dual x16), 1 x PCIe (x8 mode), 2 x PCIe 2.0 x1, 1 x PCI
      • 2 x SATA 6 Gb/s ports, 4 x SATA 3 Gb/s ports
      • LAN: Intel 82579V Gigabit LAN controller
  • 2 x Corsair Vengeance DDR3-1600 16 GB (4 x 4 GB) kits, 32 GB memory in total (max 64 GB)

The case, power supply, graphics card, RAID controller and extra NIC(s) are reused. Here are some photos of the configuration:

[Photos of the configuration]

When the hardware configuration was done and I tried to power on the system, nothing happened (black screen). It appeared that the BIOS of the motherboard didn’t know the i7-3820 CPU yet. The cool thing is that the motherboard has a function called “USB BIOS Flashback”. It is possible to flash the BIOS without a CPU or memory installed. Here are the steps:

  • Download the latest BIOS from the Asus site;
  • Extract the BIOS to a USB stick;
  • Rename the BIOS file, for example rename “P9X79-ASUS-0906.ROM” to “P9X79.ROM” (important);
  • Place the USB stick in the USB port with the WHITE interior on the back;
  • Press the BIOS Flashback button for 3 seconds and the light will begin to flash;
  • Don’t turn off the computer during the BIOS flash;
  • When the flashing light stops, the BIOS update is complete.

After the BIOS update was finished, the system boots and I was able to install VMware ESXi and Windows Server 2012 and enable the Hyper-V role.

vSphere 5 / ESXi 5 screenshots:

[Screenshots: Hyper-Threading gives 8 logical processors and 32 GB of memory; the onboard SATA controller is listed as a Patsburg 6 Port SATA AHCI controller.]

Software RAID does not work. The hardware RAID controller is added as an extra PCI card.

The onboard Intel 82579V NIC is not supported in ESXi 5. Use the procedure found here to add the NIC. Use at your own risk!

The Intel 82574L NIC is added as an extra PCIe card.

The latest firmware includes support for DirectPath I/O.

Microsoft Windows Server 2012

It is possible to install Microsoft Windows Server 2012 and enable the Hyper-V role. Here are some screenshots:

[Screenshots: Windows Server 2012 with the Hyper-V role]

The onboard Intel 82579V NIC is not recognized in Windows Server 2012 by default. How to enable the Intel 82579V NIC is explained in this blog post.

This whitebox is a great extension to my home lab!
