The mobile SDDC and EUC lab environment

At the company I work for (Ictivity), we decided to develop a mobile Software Defined Data Center (SDDC) and End User Computing (EUC) lab environment. This mobile lab environment is used to demo the VMware SDDC and EUC stack with integration of third-party solutions. One of the reasons to use a physical lab environment instead of cloud services was flexibility and having no external dependencies.

The past months I have been asked which components we used to build this lab environment, so here is a quick overview. Logically, the environment looks like the picture below:

[Image: logical overview of the demo environment]

This environment contains three physical hosts with VMware ESXi installed and one switch. One ESXi host functions as the management host. On this management host the following software bits are installed:

  • vSphere 6
  • VSAN Witness
  • NSX Manager
  • Fortigate VMX
  • vRealize components
  • The End User Computing stack such as Horizon View, App Volumes, User Environment Manager and Identity Manager
  • Veeam

The other two ESXi hosts form the demo cluster. On this 2-node cluster the following software bits are installed:

  • vSphere 6
  • Virtual SAN (VSAN) All Flash (AF) configuration
  • NSX integration
  • Windows 10
  • Windows Server 2012 R2

A laptop is used to connect to the lab environment.

What components are used?

Some highlights of this lab are:

  • 4U rackmount flightcase
  • Mini-ITX motherboard
  • Intel Xeon D-1541 single socket System-on-Chip 8 core processor
  • 2 x 10 GbE Ethernet adapters
  • Only SSD storage is used
  • IPMI port

Case

The case is a robust, custom-made 19″ 4U rackmount flightcase with a removable front and back. It has two wheels so you can easily move it around. The case contains three servers and one switch. Here is a picture of the case, including all the hosts and the switch.

[Image: layout of the flightcase]

Hosts

The flightcase contains three SuperMicro SYS-5018D-FN4T 1U Rackmount hosts with the following hardware specifications:

  • Chassis: SuperMicro 19″ 1U with a 200W Gold level power supply. Optimized for Mini-ITX (SuperChassis SC505-203B)
  • Motherboard: Super X10SDV-8C-TLN4F Mini-ITX board
  • Processor: 1 x Intel Xeon D-1541 single socket System-on-Chip. This processor contains 8 cores with 16 threads (Hyper-Threading)
  • Memory: 4 x DDR4 DIMM sockets (maximum 128 GB, 4 x 32 GB DDR4 ECC memory)
  • LAN: 2 x 10GbE and 2 x 1 GbE and 1 x IPMI LAN port
  • Expansion slots: 1 x PCIe 3.0 x16 slot and a M.2 PCIe 3.0 x4
  • Video: Aspeed AST2400
  • USB: 2x USB 3.0 and 4x USB 2.0

Management host

  • Memory: 4 x 32GB = 128 GB
  • SSD: 2 x Samsung PM863 MZ-7LM1T9E Enterprise SSD, 1.92 TB, internal 2.5″, SATA 6Gb/s
  • Disk: Seagate Enterprise 6 TB disk (for backup)
  • USB Stick: Sandisk Ultra Fit USB3 16 GB (for booting ESXi)

Demo hosts 

Each host contains the following hardware:

  • Memory: 2 x 16GB = 32 GB per server
  • SSD: 1 x Intel P3500 1.2 TB PCIe 3.0 x4 (NVMe) SSD and 1 x Samsung 950 Pro V-NAND M.2 PCIe 512 GB SSD
  • USB Stick: Sandisk Ultra Fit USB3 16 GB (for booting ESXi)

Switch

  • Switch: Netgear ProSafe Plus XS708E, 8 x 10 Gbps ports (one shared with an SFP+ slot)

Cables

  • 6 x UTP CAT6 0.50 m cables
  • 1 x UTP CAT6 5m
  • 1 x UTP CAT6 10m


Network drivers

By default, the two Intel X552/X557-AT NICs are not recognized by ESXi 6.5 and lower versions. To enable the Intel X552/X557 2 x 10 GbE NICs, download the Intel driver from the VMware website (link). Extract the ZIP file and install the offline bundle using the following command:

esxcli software vib install -d /vmfs/volumes/datastore/driver/ixgbe-4.4.1-2159203-offline_bundle-3848596.zip
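
After installing the bundle and rebooting the host, you can verify that the driver is loaded and the 10 GbE NICs show up. These are generic verification commands, not part of the original driver instructions:

esxcli software vib list | grep ixgbe
esxcli network nic list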

With this mobile SDDC lab environment we achieved the following benefits:

  • Mobile and easy to carry around
  • Flexibility to install the latest VMware SDDC and third-party software
  • No external dependencies
  • Enough horsepower
  • Low noise and power consumption
  • Remote accessible from our datacenter
  • IPMI and KVM support


Home lab extension with an Intel NUC 6th generation

For my home lab I bought a 6th generation Intel NUC, which has the following specifications:

  • Intel NUC6i3SYH
  • Intel i3-6100U (Skylake), 2.3 GHz dual core, 3 MB cache, 15 W TDP
  • 2 memory slots for DDR4-2133 SODIMM memory, up to a maximum of 32 GB
  • Intel HD Graphics 520 GPU
  • Intel I219-V Gigabit network adapter and Intel Wireless-AC 8260 WIFI adapter
  • Option to install a 2.5″ HDD/SSD and an M.2 SSD card (2242 or 2280)
  • 4 USB 3.0 ports (2 in the front and 2 on the rear)
  • SD card reader (SDXC cards)
  • Case and a 19V AC-DC adapter


The Intel NUC will be used as the management server for my Software Defined Data Center (SDDC) home lab environment. It will host VMs such as:

  • Domain Controller + DNS
  • vCenter Server Appliance
  • Virtual SAN witness appliance
  • Veeam backup
  • Etc.

The VMs are stored on a Synology NAS, which the Intel NUC accesses over an NFS connection. The NUC itself will not have any disks; it boots ESXi from a USB stick.
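
As a minimal sketch of that NFS connection (the hostname, export path and datastore name below are made-up examples, not my actual configuration), mounting the Synology share on the ESXi host looks like this:

esxcli storage nfs add --host synology.lab.local --share /volume1/vmware --volume-name synology-nfs
esxcli storage nfs list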

Processor

The 6th generation Intel NUC offers two CPU choices:

  • Intel i3 (Skylake), available in the NUC6i3SYH model
  • Intel i5 (Skylake), available in the NUC6i5SYH model

Both CPUs have 2 cores and support Hyper-Threading. The table below gives a quick comparison of the two processors:

[Image: comparison table of the two processors]

For this configuration the Intel NUC with the i3-6100U processor is sufficient and saves around 100 euros. The i3 has 2 cores and Hyper-Threading, so 4 logical processors are displayed in the hypervisor.


Other advanced technologies such as VT-x, VT-d, EPT are fully supported.
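
If you want to verify this yourself, a commonly used check from the ESXi shell (a generic command, not something from the original post) is:

esxcfg-info | grep "HV Support"

A value of 3 indicates that hardware virtualization is enabled in the BIOS and usable by the hypervisor.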

Memory

The Intel NUC has 2 memory slots and supports up to 32 GB of DDR4-2133 MHz SODIMM memory. I added 2 Crucial 16 GB DDR4-2133 (CT16G4SFD8213) modules for a total of 32 GB of memory.


I used the same memory as suggested by the virten.net blog (link).

Network card

The Intel NUC has an Intel I219-V Gigabit network adapter and a wireless network card. Only the Intel I219-V can be used with VMware ESXi.

Storage

The NUC has an M.2 (PCIe Gen 3 x4) slot and an Intel AHCI SATA-600 controller. It is possible to install a 2.5″ SSD or hard disk in the drive cage.


The VMs reside on a Synology NAS, so the NUC has no disks other than a USB drive for booting VMware ESXi.

VMware ESXi

A USB 3.0 stick is used to boot VMware ESXi; it has VMware ESXi 6.0 U1b (VMware-VMvisor-Installer-201601001-3380124.x86_64) installed. For creating a USB stick with ESXi 6 you can use the blog post here; only steps 1 to 3 are needed.

There is no need to add extra drivers to the ESXi image because the network and storage adapters are recognized by default.

Passthrough and nesting

Passthrough is supported by the CPU and motherboard.


Nesting, such as VMware in VMware or Hyper-V in VMware, is possible. Below is a screenshot of a Hyper-V server with a VM, hosted on ESXi.

[Screenshot: Hyper-V server with a VM, running on ESXi]
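
For reference, nesting typically requires exposing hardware virtualization to the guest. A common approach is adding the well-known settings below to the VM's .vmx file (these are the standard parameters, not quoted from my configuration):

vhv.enable = "TRUE"
hypervisor.cpuid.v0 = "FALSE"

The first setting exposes Intel VT-x/EPT to the guest; the second hides the hypervisor CPUID bit, which Hyper-V needs before it will install inside a VM.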

Power consumption

The average power consumption of the NUC is between 20 and 30 watts with a couple of VMs active.

Costs

Component                  Amount   Total
Intel NUC NUC6i3SYH        1        € 299,00
Crucial 16 GB DDR4-2133    2        € 235,80
USB3 Stick 16 GB           1        € 10,00
Total                               € 544,80

Conclusion

The 6th generation Intel NUC is a great and easy option for creating a small ESXi home lab. I use the Intel NUC as a management server with a couple of VMs. Another use case is creating a 2- or 3-node hybrid Virtual SAN (VSAN) cluster: put a Samsung 950 PRO in the M.2 slot for caching and a 2.5″ HDD in the drive cage as the capacity tier. Easy.

Pros and cons

Pros:

  • All-in-one package including a motherboard, processor, enclosure and power adapter
  • Supports up to 32 GB of memory
  • Easy to install
  • Small Form Factor
  • Low noise & power consumption

Cons:

  • The hardware is not on the VMware HCL
  • Need a converter to connect to a DVI or VGA monitor
  • Only 2 cores available
  • No expansion possibilities, such as adding an extra network card
  • No remote management

Home lab extension with a Samsung 950 PRO M.2 SSD

Last month I extended my VMware ESXi and Hyper-V home lab with a Samsung 950 Pro SSD. The Samsung 950 Pro is a next-gen SSD with the following characteristics:

  • Uses V-NAND memory and the Non-Volatile Memory Express (NVMe) protocol, which removes the ~600 MB/s bandwidth limit of the SATA protocol
  • Takes advantage of the PCIe Gen 3 x4 (up to 32 Gb/s) interface
  • Available in 256 GB and 512 GB at the moment; larger sizes will follow in 2016
  • Has an M.2 (2280) form factor


These improvements make it one of the fastest consumer SSDs on the market today. My current home lab doesn't have a PCIe Gen 3 x4 slot or an M.2 interface, so I found an "interface converter PCI-Express, M.2 NGFF" adapter (link). It is the same adapter as the Lycom DT-120 (link) that another great blog, TinkerTry, refers to (link). The adapter converts the M.2 interface to a PCIe slot.


The Lycom DT-120 adapter has the following specifications:

  • Single M.2 slot (there are dual M.2 adapters available on the market)
  • Costs around € 17,00
  • Does not require any driver
  • Supports M.2 modules in the 2280, 2260 and 2242 form factors
  • Works in PCIe 1.0, 2.0 and 3.0 slots on the motherboard

Configuration 

My first step was to check whether the firmware was up to date, using the Samsung Magician software.


In Windows Server 2012 R2 the Samsung 950 Pro SSD is recognized out of the box. The latest drivers and software can be downloaded here (link).

VMware ESXi

The Samsung 950 PRO is recognized by default in VMware ESXi 6 Update 1.
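
To double-check from the ESXi shell, you can list the storage adapters and devices; if all is well, the NVMe controller and the Samsung device show up (generic commands, not from the original post):

esxcli storage core adapter list
esxcli storage core device list | grep -i samsung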


Benchmark

With the ATTO software, a simple benchmark was performed on a Samsung 840 SSD (based on the SATA protocol) and on the Samsung 950 Pro. The ESXi host is a whitebox with the following hardware specifications:

  • Gigabyte GA-Z87-D3HP motherboard
  • Intel i5-4570S
  • 32 GB memory
  • Lycom DT-120 adapter is placed in a PCIe x16 slot
  • Samsung 840 connected via SATA
  • Samsung 950 Pro SSD placed on the Lycom DT-120 adapter

A VM with a 10 GB Thick Provisioned Eager Zeroed VMDK disk is attached. The disk is formatted as NTFS with a standard (4 KB) block size.
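
If you want to reproduce this, an eager-zeroed test disk can be created from the ESXi shell with vmkfstools; the datastore path below is an example, not my actual layout:

vmkfstools -c 10G -d eagerzeroedthick /vmfs/volumes/datastore1/benchvm/benchdisk.vmdk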

The left picture shows the Samsung 840 EVO and the right picture the Samsung 950 Pro.

[Screenshots: ATTO benchmark results, Samsung 840 EVO (left) and Samsung 950 Pro (right)]

As you can see, read and write performance roughly triples on the Samsung 950 Pro with the M.2 interface converter. These are pretty impressive numbers for a consumer SSD.