Florian Grehl

VMware ESXi 7.0 Update 3 on Intel NUC

VMware vSphere ESXi 7.0 Update 3 was released in October 2021, and before deploying it to production, you probably want to evaluate it in your testing environment or homelab. If you run Intel NUCs or similar hardware, be careful when updating to new ESXi releases. Always keep in mind that this is not an officially supported platform, so compatibility issues are to be expected.

In vSphere 7.0, there are ups and downs with consumer-grade network adapters. Since the deprecation of VMKlinux drivers, Realtek-based NICs can no longer be used, and previous versions had problems with the ne1000 driver. Luckily, the great Community Networking Driver for ESXi Fling adds support for a bunch of network cards, and the USB Network Native Driver Fling (vmkusb-nic-fling) always covers your back.
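Before updating, it's worth checking which driver module each uplink currently uses. A quick sketch on the ESXi shell (vmnic0 is just an example uplink name):

```shell
# List all physical NICs with the driver module each one uses
# (run in an ESXi shell via SSH or DCUI).
esxcli network nic list

# Show driver and firmware details for a single adapter:
esxcli network nic get -n vmnic0
```

If an adapter relies on a community driver, make sure the Fling is included in the image you update to, otherwise the NIC disappears after the upgrade.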

I've updated my NUC portfolio to check which NUCs are safe to update and which considerations you have to take into account before installing the update. Additionally, I'm taking a look at the consequences of the recently deprecated USB/SD-card usage for ESXi installations and some general issues in 7.0 U3.

Read More »VMware ESXi 7.0 Update 3 on Intel NUC

VMware NSX-T 3.1 Edge Node Sizing

Edge Nodes in NSX-T 3.1 are available as Virtual Machines and as Bare Metal Edges. When you deploy a Virtual Edge Node using the embedded deployment function in NSX-T, you can choose between four sizes: Small, Medium, Large, and Extra Large. In this article, I collect information about the different sizing options, what they are intended for, and how to resize Edge Nodes.

Read More »VMware NSX-T 3.1 Edge Node Sizing

How to configure Multiple TLS Certificates with SNI in NSX-T Load Balancer

When you want to use the same public IP address for multiple websites, you have to leverage the SNI extension. Server Name Indication (SNI) is an extension to the Transport Layer Security (TLS) protocol that allows a client to indicate the hostname it wants to connect to. The server can then present a matching certificate on the same IP address, so multiple secure (HTTPS) websites can be served by the same server.

The NSX-T Load Balancer supports SNI certificates on a single Virtual Server (IP address) with different Server Pools in the backend. This article explains how to configure SNI-based load balancing with three HTTPS websites on a single IP address using the NSX-T 3.1 Load Balancer.
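Once configured, you can check from any client which certificate the Virtual Server returns for a given SNI name, for example with openssl (the IP and hostname below are placeholders):

```shell
# Ask the load balancer VIP for the certificate matching the SNI name
# "site1.example.com" and print its subject. Repeat with the other
# hostnames to confirm each site gets its own certificate.
echo | openssl s_client -connect 203.0.113.10:443 \
      -servername site1.example.com 2>/dev/null \
  | openssl x509 -noout -subject
```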

Read More »How to configure Multiple TLS Certificates with SNI in NSX-T Load Balancer

Import of Let's Encrypt Certificates in NSX-T Fails With "Certificate chain validation failed"

When you try to import a Let's Encrypt SSL Server Certificate in NSX-T, the following error message is displayed:

Error: You have 1 Error(s)
Certificate chain validation failed. Make sure a valid chain is provided in order leaf,intermediate,root certificate. (Error code: 2076)
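A common cause is that the imported PEM file only contains the leaf certificate, or that the blocks are in the wrong order. A minimal sketch of assembling the import file from a certbot layout (the path is an assumption, adjust it to your domain):

```shell
# Assumed certbot directory layout: cert.pem holds the leaf certificate,
# chain.pem the intermediate(s). NSX-T expects the PEM blocks in
# leaf, intermediate, root order.
LIVE=/etc/letsencrypt/live/example.com
cat "$LIVE/cert.pem" "$LIVE/chain.pem" > nsxt-import.pem
```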

Read More »Import of Let's Encrypt Certificates in NSX-T Fails With "Certificate chain validation failed"

SSL Load Balancer in VMware Cloud Director with NSX-ALB (AVI)

With the NSX Advanced Load Balancer integration in Cloud Director 10.2 or later, you can enable SSL offloading to secure your customers' websites. This article explains how to request a Let's Encrypt certificate, import it into VMware Cloud Director, and enable SSL offloading in NSX-ALB. This allows tenants to publish websites in a secure manner.
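Requesting the certificate itself can be done with certbot in standalone mode, for example (the domain is a placeholder):

```shell
# Obtain a certificate for the tenant's hostname; certbot starts a
# temporary web server on port 80 for the HTTP-01 challenge, so the
# hostname must resolve to this machine and port 80 must be reachable.
certbot certonly --standalone -d tenant-site.example.com

# fullchain.pem (certificate + intermediates) and privkey.pem are then
# found below /etc/letsencrypt/live/tenant-site.example.com/
```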

Read More »SSL Load Balancer in VMware Cloud Director with NSX-ALB (AVI)

Shared Service Engine Groups in VMware Cloud Director with NSX Advanced Load Balancer

In the Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3 guide, I've explained how to enable "Load Balancing as a Service" in VCD with dedicated Service Engines. With this Service Engine deployment model, each Edge Gateway is statically assigned to a dedicated NSX-ALB Service Engine Group. That means for each EGW you create in VCD, you have to create a Service Engine Group, which consists of multiple Service Engines (Virtual Machines).

Service Engine Groups can also be deployed in a shared model. Shared Service Engine groups can be assigned to multiple Edge Gateways. In this deployment model, a single Service Engine (Virtual Machine) can handle traffic for multiple customers. For obvious security reasons, and to prevent problems with overlapping networks, VRFs are used inside the SE to fully separate the data traffic.

This article explains how to use Shared Service Engine Groups in VMware Cloud Director 10.3.

Read More »Shared Service Engine Groups in VMware Cloud Director with NSX Advanced Load Balancer

Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3

When you are using NSX-T as the network backend for VMware Cloud Director, you can't use the native Load Balancer included in NSX-T. Since Cloud Director 10.2, the NSX Advanced Load Balancer (ALB), previously known as the AVI Vantage Platform, has been integrated to allow customers to create self-service Load Balancers.

This article explains all steps required to integrate NSX ALB into VMware Cloud Director.

Read More »Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3

Windows 11 on VMware ESXi - This PC can't run Windows 11

The latest release of Windows 11 requires a Trusted Platform Module (TPM) 2.0 chip. When you try to install Windows 11 as a Virtual Machine on VMware ESXi, the installation fails with a "This PC can't run Windows 11" error. There is no further information on why the setup fails.

By pressing SHIFT + F10 to open a command prompt and running notepad x:\windows\panther\setuperr.log (or type x:\windows\panther\setuperr.log), you can verify that the reason for the failed setup is a missing TPM chip.

This article explains two options to install Windows 11 by either disabling the TPM check, or by adding a Virtual Trusted Platform Module (vTPM) to the Virtual Machine.
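The first option boils down to a few registry keys set from the setup's command prompt. This is a commonly used workaround, shown here as a sketch:

```shell
rem Run inside the Windows Setup command prompt (SHIFT + F10),
rem then go back one step in the setup wizard and retry.
reg add HKLM\SYSTEM\Setup\LabConfig /v BypassTPMCheck /t REG_DWORD /d 1
reg add HKLM\SYSTEM\Setup\LabConfig /v BypassSecureBootCheck /t REG_DWORD /d 1
```

The second option, adding a vTPM, additionally requires a Key Provider to be configured in vCenter.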

Read More »Windows 11 on VMware ESXi - This PC can't run Windows 11

ESXi on ASRock Industrial NUC 1100 Series (11th Gen Intel "Tiger Lake" CPU)

ASRock Industrial has a NUC-like small form factor (SFF) system in their portfolio, which is very similar to Intel's latest 11th Gen NUC Series. With the global shortage of microchips, the availability of 11th Gen NUCs, especially the dual-NIC "Pro" models, is still limited. While looking for alternatives, the ASRock Industrial NUC 1100 Series turned out to be a great alternative to the original NUC Series.

SFF systems (also known as Barebone, Nettop, SoC, or Mini-PC) like Intel's or ASRock's NUC are not officially supported by VMware, but they are very widespread in the homelab community. They are small, silent, transportable, and have very low power consumption, making them great servers in your homelab. The ASRock 1100 Series is available with an i3, i5, or i7 CPU and supports up to 64GB of memory. All models are equipped with two network adapters, one with 1 Gigabit and a second with 2.5 Gigabit support. Both adapters can be used with the latest VMware ESXi 7.0.

  • NUC BOX-1165G7 (Intel Core i7-1165G7 - 4 Core, up to 4.7 GHz)
  • NUC BOX-1135G7 (Intel Core i5-1135G7 - 4 Core, up to 4.2 GHz)
  • NUC BOX-1115G4 (Intel Core i3-1115G4 - 2 Core, up to 4.1 GHz)

Will ESXi run on the ASRock Industrial NUC 1100 Series?
Yes. It is possible to install ESXi. Due to missing i219V and i225LM drivers in the original image, you have to create a custom image using a community-created driver. Instructions on how to create the image are included in this article. This problem is not specific to ASRock's 11th Gen models. The custom image is also required for Intel's 11th Gen NUC and some older models.

Read More »ESXi on ASRock Industrial NUC 1100 Series (11th Gen Intel "Tiger Lake" CPU)