Home lab extension with an Intel NUC 6th generation

For my home lab I bought a 6th generation Intel NUC, which has the following specifications:

  • Intel NUC6i3SYH
  • Intel i3-6100U (Skylake), 2.3 GHz dual core, 3 MB cache, 15 W TDP
  • 2 memory slots for DDR4-2133 SODIMM memory, 32 GB maximum
  • Intel HD Graphics 520 GPU
  • Intel I219-V Gigabit network adapter and Intel Wireless-AC 8260 WIFI adapter
  • Option to install a 2.5″ HDD/SSD and an M.2 SSD (2242 or 2280)
  • 4 USB 3.0 ports (2 in the front and 2 on the rear)
  • SD card reader (SDXC cards)
  • Case and a 19V AC-DC adapter

[Photos: Intel NUC]

The Intel NUC will be used as the management server for my Software-Defined Data Center (SDDC) home lab environment. It will host VMs such as:

  • Domain Controller + DNS
  • vCenter Server Appliance
  • Virtual SAN witness appliance
  • Veeam backup
  • Etc.

The VMs are stored on a Synology NAS; the Intel NUC connects to it over NFS. The NUC will not have any disks and will boot ESXi from a USB stick.
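To illustrate, a minimal sketch of mounting an NFS datastore from the ESXi shell; the NAS address, share path and volume name below are placeholders:

    # Mount an NFS share on the Synology NAS as a datastore (NFS v3)
    esxcli storage nfs add --host=192.168.1.50 --share=/volume1/vmware --volume-name=synology_nfs

    # Verify the mount
    esxcli storage nfs list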

Processor

The 6th generation Intel NUC offers two CPU choices:

  • Intel i3 (Skylake), available in the NUC6i3SYH model
  • Intel i5 (Skylake), available in the NUC6i5SYH model

Both CPUs have 2 cores and support Hyper-Threading. The table below gives a quick comparison between the two processors:

[Image: CPU comparison table]

For this configuration the Intel NUC with the i3-6100U processor is sufficient and saves 100 euros. The i3 has 2 cores with Hyper-Threading, so 4 logical processors are displayed in the hypervisor.

[Screenshot: logical processors shown in the hypervisor]

Other advanced technologies, such as VT-x, VT-d and EPT, are fully supported.

Memory

The Intel NUC has 2 memory slots and supports up to 32 GB of DDR4-2133 SODIMM memory. I added 2 Crucial 16 GB DDR4-2133 (CT16G4SFD8213) modules, for a total of 32 GB.

[Photos: installing the memory modules]

I used the same memory as suggested by the virten.net blog (link).

Network card

The Intel NUC has an Intel I219-V Gigabit network adapter and a wireless network card. Only the Intel I219-V can be used with VMware ESXi.
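You can quickly verify from the ESXi shell that the adapter has been claimed; a simple check:

    # List the physical NICs ESXi has claimed; the I219-V should appear as vmnic0
    esxcli network nic list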

Storage

The NUC has an M.2 (PCIe Gen 3 x4) slot and an Intel AHCI SATA-600 controller. It is possible to install a 2.5″ SSD or hard disk in the drive cage.

[Photo: 2.5″ drive cage]

The VMs are on a Synology NAS, so the NUC will not have any disks other than a USB drive for booting VMware ESXi.

VMware ESXi

A USB 3.0 stick is used to boot VMware ESXi. VMware ESXi 6.0 U1b (VMware-VMvisor-Installer-201601001-3380124.x86_64) is installed on the stick. For creating a USB stick with ESXi 6 you can use the blog post here; only steps 1 through 3 are needed.

There is no need to add extra drivers to the ESXi image, because the network and storage adapters are recognized by default.

Passthrough and nesting

Device passthrough (VT-d) is supported by the CPU and motherboard.

[Screenshot: passthrough settings in ESXi]

Nesting, such as VMware in VMware or Hyper-V in VMware, is possible. Below is a screenshot of a Hyper-V server with a VM, hosted on ESXi.

[Screenshot: nested Hyper-V server with a VM, running on ESXi]
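For reference, these are the .vmx settings commonly used to enable nesting on ESXi 5.1 and later. The first line exposes VT-x/EPT to the guest; the second hides the hypervisor CPUID bit, which Hyper-V requires before it will install:

    vhv.enable = "TRUE"
    hypervisor.cpuid.v0 = "FALSE"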

Power consumption

The average power consumption of the NUC is between 20 and 30 watts with a couple of VMs running.

Costs

Component                    Amount   Total
Intel NUC NUC6i3SYH          1        € 299,00
Crucial 16 GB DDR4-2133      2        € 235,80
USB 3.0 stick 16 GB          1        € 10,00
Total                                 € 544,80

Conclusion

The 6th generation Intel NUC is a great and easy option for building a small ESXi home lab. I use the Intel NUC as a management server with a couple of VMs. Another use case is creating a 2- or 3-node hybrid Virtual SAN (VSAN) cluster: put a Samsung 950 PRO in the M.2 slot for the caching tier and a 2.5″ HDD as the capacity tier. Easy.

Pros and cons

Pros:

  • All-in-one package including motherboard, processor, enclosure and power adapter
  • Supports up to 32 GB of memory
  • Easy to install
  • Small form factor
  • Low noise and power consumption

Cons:

  • The hardware is not on the VMware HCL
  • A converter is needed to connect a DVI or VGA monitor
  • Only 2 cores available
  • No expansion options, such as adding an extra network card
  • No remote management

Home lab extension with a Samsung 950 PRO M.2 SSD

Last month I extended my VMware ESXi and Hyper-V home lab with a Samsung 950 PRO SSD. The Samsung 950 PRO is a next-generation SSD with the following characteristics:

  • Uses V-NAND memory and the Non-Volatile Memory Express (NVMe) protocol, which removes the 600 MB/s bandwidth limit of the SATA protocol (see the quick calculation after this list)
  • Takes advantage of the PCIe Gen 3 x4 (up to 32 Gb/s) interface
  • Available in 256 and 512 GB at the moment; larger sizes will follow in 2016
  • Has an M.2 (2280) form factor
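To put those numbers in perspective, the usable bandwidth of both interfaces after line encoding works out roughly as follows:

    SATA 3:        6 Gbit/s × 8b/10b encoding    ≈ 4.8 Gbit/s ≈ 600 MB/s
    PCIe Gen3 x4:  4 lanes × 8 GT/s × 128b/130b  ≈ 31.5 Gbit/s ≈ 3.9 GB/s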

[Photo: Samsung 950 PRO]

These improvements make it one of the fastest consumer SSDs on the market today. My current home lab doesn't have a PCIe Gen 3 x4 slot or an M.2 interface, but I found an “interface converter PCI-Express, M.2 NGFF” adapter (link). It is the same adapter as the Lycom DT-120 (link) that another great blog, TinkerTry, refers to (link). The adapter converts the M.2 interface to a PCIe slot.

[Photos: Lycom DT-120 adapter with the Samsung 950 PRO installed]

The Lycom DT-120 adapter has the following specifications:

  • Single M.2 slot (dual M.2 controllers are also available on the market)
  • Costs around € 17,00
  • Does not require any driver
  • Supports M.2 cards in the 2280, 2260 and 2242 sizes
  • Works in PCIe 1.0, 2.0 and 3.0 slots on the motherboard

Configuration 

My first step was to check whether the firmware was up to date, using the Samsung Magician software.

[Screenshot: Samsung Magician firmware check]

In Windows Server 2012 R2 the Samsung 950 PRO SSD is recognized out of the box. The latest drivers and software can be downloaded here: link.

VMware ESXi

The Samsung 950 PRO is recognized out of the box in VMware ESXi 6.0 Update 1.

[Screenshot: Samsung 950 PRO recognized in ESXi]
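A quick way to confirm this from the ESXi shell is to list the storage adapters; the 950 PRO should show up as an extra vmhba claimed by the NVMe driver:

    # List storage adapters; the Samsung 950 PRO appears as an additional vmhba
    esxcli storage core adapter list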

Benchmark

With the ATTO benchmark tool I ran a simple benchmark on a Samsung 840 SSD (based on the SATA protocol) and the Samsung 950 PRO. The ESXi host is a whitebox with the following hardware specifications:

  • Gigabyte GA-Z87-D3HP motherboard
  • Intel i5-4570S
  • 32 GB memory
  • Lycom DT-120 adapter is placed in a PCIe x16 slot
  • Samsung 840 connected via SATA
  • Samsung 950 Pro SSD placed on the Lycom DT-120 adapter

A VM with a 10 GB thick provisioned eager zeroed VMDK attached is used for the test. The disk is formatted as NTFS with the standard (4 KB) block size.
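An eager-zeroed disk can be created up front with vmkfstools, so zeroing does not skew the benchmark results; the datastore path below is a placeholder:

    # Create a 10 GB eager-zeroed thick VMDK on the 950 PRO datastore
    vmkfstools -c 10g -d eagerzeroedthick /vmfs/volumes/nvme_950pro/bench/bench.vmdk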

The left picture shows the Samsung 840 EVO, the right picture the Samsung 950 PRO.

[Screenshots: ATTO benchmark results for the Samsung 840 EVO and the Samsung 950 PRO]

As you can see, read and write performance roughly triples on the Samsung 950 PRO with the M.2 interface converter. These are pretty impressive numbers for a consumer SSD.

 

New whitebox for extending my home lab

For a couple of months I have been searching for an extra whitebox host to extend my home lab environment. My current lab whitebox is Haswell based (see: link). Here is an overview of the new lab environment:

[Diagram: lab environment overview]

For the new whitebox I had the following requirements:

  • Hardware such as NICs must be recognized by VMware ESXi
  • Use 32 GB memory or more
  • Low power consumption
  • Expandable
  • Small form factor
  • Quiet
  • Possibility to run nested hypervisors such as VMware ESXi and Hyper-V
  • Remote Management
  • Possibility to create a VMware Cluster and use vMotion, HA, DRS and DPM with the existing Haswell host

I reviewed the following popular home lab systems:

  • Intel NUC
  • Apple Mac Mini
  • Gigabyte BRIX

The main reason not to choose one of the above systems is that they only support 16 GB of memory. In November 2014, after reading a review on a Dutch hardware website, I found a motherboard that met all the requirements: the ASRock C2750D4I. After some additional research I ordered the following parts to build this whitebox:

  • ASRock C2750D4I motherboard
  • Kingston 4 x 8 GB DDR3 PC3-12800 CL11 (32 GB in total)
  • be quiet! System Power 7 300W power supply
  • Cooler Master N300 ATX Midi Tower case

VMware ESXi boots from a USB stick and the VMs are placed on an iSCSI target, so no extra storage is needed. The above parts cost me around € 735,00.
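For reference, a minimal sketch of enabling the software iSCSI initiator and pointing it at a target from the ESXi shell; the adapter name and target address below are placeholders:

    # Enable the software iSCSI initiator
    esxcli iscsi software set --enabled=true

    # Add the iSCSI target via send-targets discovery, then rescan the adapter
    esxcli iscsi adapter discovery sendtarget add --adapter=vmhba33 --address=192.168.1.50:3260
    esxcli storage core adapter rescan --adapter=vmhba33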

The ASRock C2750D4I motherboard has the following specifications:

  • Mini ITX motherboard
  • CPU: Intel Avoton C2750, 64-bit, 8 cores (passively cooled)
  • Graphics: ASPEED AST2300 16 MB
  • Memory: 4 x DDR3 DIMM slots, max: 64 GB memory
  • Controllers: Intel C2750: 2 x SATA3 and 4 x SATA2; Marvell SE9172: 2 x SATA3; Marvell SE9230: 4 x SATA3. Total of 12 SATA ports
  • NIC: Dual Intel i210 Gigabit LAN adapter
  • 1 x PCIe 2.0 x8 slot
  • Remote management: BMC controller with a dedicated IPMI LAN adapter
  • TDP of 20 watts

[Photos: ASRock C2750D4I board]

CPU

The Intel Avoton C2750 is an Atom-based processor with 8 cores. It is passively cooled and quiet. The Avoton processor is 64-bit and supports Intel VT-x with Extended Page Tables (EPT), so it is possible to nest hypervisors such as ESXi and Hyper-V. The 8-core Atom processor gives enough CPU performance for my lab environment.

Memory

The motherboard contains 4 memory banks and supports a maximum of 64 GB of DDR3 memory (4 x 16 GB). I chose 4 x 8 GB Kingston DDR3 PC3-12800 CL11 DIMMs because of the price; 16 GB modules are too expensive at the moment. This gives the board 32 GB of memory.

NICs

The ASRock C2750D4I system contains a dual Intel i210 Gigabit LAN adapter. The Intel i210 adapters are recognized out of the box by ESXi 5.5 and Windows Server 2012 R2; no additional modifications or drivers are needed.

Power consumption

The 300 W power supply is more than enough, as the processor has a TDP of 20 W. This whitebox consumes around 35 W with a couple of VMware VMs running.

[Photo: power consumption measurement]

The ASRock C2750D4I system is part of a VMware cluster with Distributed Power Management (DPM) enabled. When DPM kicks in, only 4 W is used.

Remote Management

Management and remote control are possible thanks to the BMC (Baseboard Management Controller) and IPMI (Intelligent Platform Management Interface).

[Screenshots: IPMI remote management interface]
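Because the BMC speaks standard IPMI, the board can also be controlled from any machine on the network, for example with ipmitool; the BMC address and credentials below are placeholders:

    # Query the power state and sensor readings over IPMI (lanplus interface)
    ipmitool -I lanplus -H 192.168.1.60 -U admin -P secret chassis power status
    ipmitool -I lanplus -H 192.168.1.60 -U admin -P secret sensor list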

VMware ESXi support

On the ASRock C2750D4I system, VMware ESXi 5.5 Update 2 with the latest updates is installed.

[Screenshots: ESXi host summary and CPU details]

The Intel i210 Gigabit NICs and the Avoton AHCI controllers are recognized out of the box, so VMware VSAN (unsupported) could be an option.

[Screenshots: Intel i210 NICs and storage controllers in ESXi]

The Marvell SATA 6 Gb/s controllers are not recognized by default. Follow the instructions in the blog post below to enable them:

  • How to make your unsupported SATA AHCI Controller work with ESXi 5.5 and 6.0, link.
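In short, the linked post provides a community-supported VIB (sata-xahci) that maps the unsupported SATA AHCI PCI IDs to the ahci driver; installing it comes down to roughly the following, followed by a host reboot:

    # Allow community-supported VIBs, then install the sata-xahci mapping package
    esxcli software acceptance set --level=CommunitySupported
    esxcli software vib install -d http://vibsdepot.v-front.de -n sata-xahci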

Update March 23, 2015: Today I installed VMware ESXi 6.0 on the C2750. Everything seems to work.

Windows Hyper-V support

As a test I installed vNext Server Technical Preview on the ASRock C2750D4I system (with an SSD as local storage) and enabled the Hyper-V role. The two Intel i210 Gigabit NICs are recognized out of the box, and performance is great.

[Screenshots: Hyper-V on vNext Server Technical Preview and Task Manager]

Conclusion

The ASRock C2750D4I motherboard is a great system for building or extending a home lab environment based on VMware or Hyper-V. This board gives enough performance for a home lab and meets all the requirements I had for an additional whitebox host. I use it mainly for nesting VMware ESXi and Hyper-V hypervisors.