Home Esxi Server 2015

Posted By admin On 27/11/21
  1. Vmware Esxi Server
  2. Vmware Esxi Setup Home Lab

I recently decided it was time to graduate into a more robust home lab environment, as I’ve been pushing the boundaries of what a single Dell T110 running ESXi 5 can do. I’m no longer satisfied with nesting virtual ESXi 5 servers like a set of Russian nesting dolls, although we all have to start somewhere. To that end, I have decided to go forth with some whitebox builds to upgrade the Wahl Network vSphere 5 home lab.

This is a rather old post focused on a Sandy Bridge design – you’re welcome to head over to my updated post that takes advantage of a Haswell design here.

For those interested in doing the same, this post outlines the additions I am making along with the overall design that I am working towards.


I just had to use this nesting doll photo, do you blame me?

The biggest conflict when picking a platform for a host is memory. Server memory is expensive, desktop memory is cheap, and pretty much any pre-built vendor box will charge a mint for memory. If it were just about buying sticks of RAM, I'd go with a desktop build. However, desktop builds lose out on many features that would make life easier: things like IPMI, VT-x / VT-d (support varies), ECC, internal ports, network interfaces, and so on. So, I went with a build that uses server parts.

Vmware Esxi Server

Let me briefly state that I don’t feel there is a wrong or right answer to what you ultimately choose to build with. As with any design, identify your functional requirements, the nice-to-haves, the budget, and then go forth.

I met the infamous @RootWyrm (Phillip Jaenke) at an HP Cloud Tech Day event last year; he runs a tech website that contains the build list for a whitebox server called the Baby Dragon. One thing I learned about Phil is that he's very passionate about server builds and really hates noise and heat. He updated the Baby Dragon build to version 2, which is what I based my parts list on. I've made a few changes to suit my tastes, with the end result being (per server):

  • CPU: Intel Xeon E3-1230 “Sandy Bridge” – 3.2GHz, 4 Cores, 8 Threads, 8MB (Amazon)
  • Motherboard: Supermicro X9SCM-F – Intel C204, Dual GigE, IPMI w/Virtual Media, 2x SATA-3, 4x SATA-2 (Amazon)
  • RAM: 16GB (4 x 4GB) Kingston 240-Pin DDR3 SDRAM ECC Unbuffered DDR3 1333 (PC3 10600) Server Memory Model
  • Updated! RAM: 32GB (4 x 8GB) Kingston 240 PIN DDR3 SDRAM ECC Unbuffered 1600 (PC3 12800) Server Memory Model (Amazon)
  • Disk: Lexar Echo ZX 16GB (Amazon)
  • Case: LIAN LI PC-V351B Black Aluminum MicroATX Desktop Computer Case (Amazon)
  • Fans: 2 x Scythe SY1225SL12L 120mm “Slipstream” Case Fan (Amazon)
  • Power: Seasonic 400W 80 Plus Gold Fanless ATX12V/EPS12V Power Supply (Amazon)

Hey there, sexy, want to run some ESXi 5?

Cost per server (at time of writing) is about $850.


The end result is a small form factor box that produces nearly no noise (the case fans are only 10.7 dBA @ 41 CFM), has no spinning disk (again, less heat and power), and has a dedicated out-of-band management port with the ability to do virtual media. Each box also has a pair of GbE NICs.


Downside? It only has 16 GB of RAM – there is no financially viable option for 8GB sticks of ECC UDIMMs at this time. I could have bought SSD drives for local swap cache, and may do so in the future when the price of SSDs falls further.

There does not appear to be an ESXi 5.x driver for the Intel 82579LM NIC at this time. In the meantime, use the other port, an Intel 82574L, to install the hypervisor. You can then add a custom driver to enable the 82579LM port by following the instructions found in this thread:


Install your machine(s) with the vanilla ESXi 5.0 ISO.

Log on to the console (or via ssh) of one of the machines and install the vib file by using the following commands:
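The exact commands from the thread aren't reproduced above, but as a rough sketch (assuming the community driver ships as a VIB file you've copied to a datastore on the host – the file name below is hypothetical), an ESXi 5.x VIB install typically looks like this:

```shell
# Community NIC drivers are unsigned, so lower the acceptance level first
esxcli software acceptance set --level=CommunitySupported

# Install the driver VIB from an absolute datastore path (example file name)
esxcli software vib install -v /vmfs/volumes/datastore1/net-82579lm-driver.vib
```

Note that `esxcli software vib install` requires an absolute path to the file, and the host should be rebooted afterward so the new driver module loads.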

Reboot and configure all NICs.
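After the reboot, a quick sanity check (a sketch; names and output vary per host) confirms the hypervisor now sees both onboard ports:

```shell
# List every physical NIC the host detects, with driver and link state
esxcli network nic list

# Inspect driver details for a specific uplink (vmnic0 is a typical name,
# not guaranteed on your hardware)
esxcli network nic get -n vmnic0
```

If the 82579LM shows up alongside the 82574L in the list, the driver took and you can bind both uplinks to your vSwitches.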

Per a request in the comments, I've run the VMware SiteSurvey report to verify that the build is compatible with Fault Tolerance.

Per request, here is a look at the hardware status tab in vSphere to show you some of the data collected via CIM.


I also decided to retire an old whitebox tower that contained 6 x 300 GB SATA drives in favor of a Synology DS411. I heard a lot of good feedback on the Synology, and it supports SSDs! Not much else to say beyond that. 😉

  • Enclosure: Synology DS411 Diskless System 4-bay NAS Server (Amazon)
  • SSD: 2 x Intel 320 Series SSDSA2CW120G310 2.5″ 120GB SATA II MLC Internal Solid State Drive (Amazon)
  • SATA: 2 x Seagate Barracuda Green ST2000DL003 2TB 5900 RPM 64MB Cache SATA 6.0Gb/s 3.5″ Internal Hard Drive (Amazon)

Cost for the NAS with drives (at time of writing) is about $950.

Because I can tap into SSDs, I went with the green 5900 RPM model to save heat and power. I analyzed my lab environment using Xangati and found that I peak out at about 50 IOPS in most cases.

An IOPS report from Xangati


I waffled for a long time on the storage, but ultimately the ability to use SSD won me over. I am confident that in about a year, SSD technology will be at a price point where putting 4 x 250 GB drives in a NAS box will be budget friendly. It would be neat to see a small form factor storage appliance come out from Tintri or Pure Storage as I enjoy both of their approaches to making flash sing. Wishful thinking? 🙂

Vmware Esxi Setup Home Lab

You can also get another perspective on building with a whitebox from Robert Novak, who has a very in depth post on his build process using a Shuttle box.