
How to configure Multiple TLS Certificates with SNI in NSX-T Load Balancer

When you want to serve multiple websites from the same public IP address, you have to leverage the SNI extension. Server Name Indication (SNI) is an extension to the Transport Layer Security (TLS) protocol that allows a client to indicate which hostname it wants to connect to. The server can then present a hostname-specific certificate on the same IP address, which allows multiple secure (HTTPS) websites to be served by the same server.

The NSX-T Load Balancer supports SNI certificates on a single Virtual Server (IP address) with different Server Pools in the backend. This article explains how to configure SNI-based load balancing for three different HTTPS websites on a single IP address with the NSX-T 3.1 Load Balancer.
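The SNI behavior itself can be observed outside NSX-T with plain OpenSSL. The sketch below (hostnames, file names, and the port are made up for illustration) starts a local test server holding two certificates and shows that the certificate presented depends on the server name the client sends:

```shell
# Generate two self-signed certificates with different CNs (throwaway names):
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=web1.example.com" \
  -keyout web1.key -out web1.crt -days 1 2>/dev/null
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=web2.example.com" \
  -keyout web2.key -out web2.crt -days 1 2>/dev/null

# Start a TLS server that answers with web2's certificate only when the
# client's SNI extension says "web2.example.com", and with web1's otherwise:
openssl s_server -accept 8443 -cert web1.crt -key web1.key \
  -servername web2.example.com -cert2 web2.crt -key2 web2.key -www &
srv=$!; sleep 1

# Same IP and port, different certificate depending on the SNI hostname:
sni1=$(echo | openssl s_client -connect 127.0.0.1:8443 \
  -servername web1.example.com 2>/dev/null | openssl x509 -noout -subject)
sni2=$(echo | openssl s_client -connect 127.0.0.1:8443 \
  -servername web2.example.com 2>/dev/null | openssl x509 -noout -subject)
echo "$sni1"
echo "$sni2"
kill $srv
```

The NSX-T Virtual Server does conceptually the same thing: it inspects the SNI hostname in the ClientHello and picks the matching certificate (and Server Pool) before completing the handshake.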

Read More »How to configure Multiple TLS Certificates with SNI in NSX-T Load Balancer

Import of Let's Encrypt Certificates in NSX-T Fails With "Certificate chain validation failed"

When you try to import a Let's Encrypt SSL Server Certificate in NSX-T, the following error message is displayed:

Error: You have 1 Error(s)
Certificate chain validation failed. Make sure a valid chain is provided in order leaf,intermediate,root certificate. (Error code: 2076)
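The fix is usually just the order of the PEM blocks in the import file. With certbot's default layout, cert.pem is the leaf and chain.pem holds the issuing CA certificates; concatenating them leaf-first produces a file in the order NSX-T expects. The sketch below simulates this with a throwaway CA so it runs anywhere (all names are made up):

```shell
# Simulate certbot's output: a CA ("chain.pem") and a leaf it signed ("cert.pem"):
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=Demo Root CA" \
  -keyout ca.key -out chain.pem -days 1 2>/dev/null
openssl req -newkey rsa:2048 -nodes -subj "/CN=www.example.com" \
  -keyout leaf.key -out leaf.csr 2>/dev/null
openssl x509 -req -in leaf.csr -CA chain.pem -CAkey ca.key \
  -CAcreateserial -out cert.pem -days 1 2>/dev/null

# NSX-T expects the import file ordered leaf, intermediate(s), root:
cat cert.pem chain.pem > nsxt-import.pem

# Sanity check: the first certificate in the file is the leaf,
# and the chain actually verifies:
openssl x509 -in nsxt-import.pem -noout -subject
openssl verify -CAfile chain.pem cert.pem
```

With the real Let's Encrypt files, the equivalent is simply `cat cert.pem chain.pem` from the certbot live directory before pasting the result into the NSX-T certificate import dialog.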

Read More »Import of Let's Encrypt Certificates in NSX-T Fails With "Certificate chain validation failed"

SSL Load Balancer in VMware Cloud Director with NSX-ALB (AVI)

With the NSX Advanced Load Balancer integration in Cloud Director 10.2 or later, you can enable SSL offloading to secure your customers' websites. This article explains how to request a Let's Encrypt certificate, import it into VMware Cloud Director, and enable SSL offloading in NSX-ALB. This allows tenants to publish websites in a secure manner.
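When importing the certificate, a common pitfall is pasting a private key that does not belong to the certificate. A quick local check, sketched here with a throwaway self-signed pair (file names are made up), compares the RSA moduli of both files:

```shell
# Throwaway key + certificate pair standing in for the Let's Encrypt files:
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=lb.example.com" \
  -keyout vcd-key.pem -out vcd-cert.pem -days 1 2>/dev/null

# The certificate and its private key must share the same RSA modulus:
cert_mod=$(openssl x509 -in vcd-cert.pem -noout -modulus)
key_mod=$(openssl rsa -in vcd-key.pem -noout -modulus)
if [ "$cert_mod" = "$key_mod" ]; then
  echo "key matches certificate"
else
  echo "MISMATCH - wrong key for this certificate"
fi
```

Running the same two `openssl` commands against the real certificate and key before the import saves a round trip through the Cloud Director UI when the pairing is wrong.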

Read More »SSL Load Balancer in VMware Cloud Director with NSX-ALB (AVI)

Shared Service Engine Groups in VMware Cloud Director with NSX Advanced Load Balancer

In the Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3 guide, I've explained how to enable "Load Balancing as a Service" in VCD with dedicated Service Engines. With this Service Engine deployment model, each Edge Gateway is statically assigned to a dedicated NSX-ALB Service Engine Group. That means for each Edge Gateway you create in VCD, you have to create a Service Engine Group, which consists of multiple Service Engines (Virtual Machines).

Service Engine Groups can also be deployed in a shared model. Shared Service Engine groups can be assigned to multiple Edge Gateways. In this deployment model, a single Service Engine (Virtual Machine) can handle traffic for multiple customers. For obvious security reasons, and to prevent problems with overlapping networks, VRFs are used inside the SE to fully separate the data traffic.

This article explains how to use Shared Service Engine Groups in VMware Cloud Director 10.3.

Read More »Shared Service Engine Groups in VMware Cloud Director with NSX Advanced Load Balancer

Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3

When you are using NSX-T as the network backend for VMware Cloud Director, you can't use the native Load Balancer included in NSX-T. Since Cloud Director 10.2, the NSX Advanced Load Balancer (ALB), previously known as the AVI Vantage Platform, has been integrated to allow customers to create self-service Load Balancers.

This article explains all steps required to integrate NSX ALB into VMware Cloud Director.

Read More »Getting Started with NSX Advanced Load Balancer Integration in VMware Cloud Director 10.3

Windows 11 on VMware ESXi - This PC can't run Windows 11

The latest release of Windows 11 requires a Trusted Platform Module (TPM) 2.0 chip. When you try to install Windows 11 as a Virtual Machine on VMware ESXi, the installation fails with a "This PC can't run Windows 11" error. There is no further information on why the setup fails.

By pressing SHIFT + F10 to open a command prompt and running notepad x:\windows\panther\setuperr.log or type x:\windows\panther\setuperr.log, you can verify that the reason for the failed setup is the missing TPM chip:

This article explains two options to install Windows 11 by either disabling the TPM check, or by adding a Virtual Trusted Platform Module (vTPM) to the Virtual Machine.

Read More »Windows 11 on VMware ESXi - This PC can't run Windows 11

ESXi on ASRock Industrial NUC 1100 Series (11th Gen Intel "Tiger Lake" CPU)

ASRock Industrial has a NUC-like small form factor (SFF) system in its portfolio that is very similar to Intel's latest 11th Gen NUC Series. With the global shortage of microchips, the availability of 11th Gen NUCs, especially the dual-NIC "Pro" models, is still limited. While looking for alternatives, the ASRock Industrial NUC 1100 Series turned out to be a great alternative to the original NUC Series.

SFF systems (also known as barebones, nettops, SoCs, or Mini-PCs) like Intel's or ASRock's NUC are not officially supported by VMware, but they are very widespread in the homelab community. They are small, silent, transportable, and have very low power consumption, making them great servers for your homelab. The ASRock NUC 1100 Series is available with an i3, i5, or i7 CPU and supports up to 64 GB of memory. All models are equipped with two network adapters, one with 1 Gigabit and a second with 2.5 Gigabit support. Both adapters can be used with the latest VMware ESXi 7.0.

  • NUC BOX-1165G7 (Intel Core i7-1165G7 - 4 Core, up to 4.7 GHz)
  • NUC BOX-1135G7 (Intel Core i5-1135G7 - 4 Core, up to 4.2 GHz)
  • NUC BOX-1115G4 (Intel Core i3-1115G4 - 2 Core, up to 4.1 GHz)

Will ESXi run on the ASRock Industrial NUC 1100 Series?
Yes, ESXi can be installed. However, because the original image is missing i219V and i225LM drivers, you have to create a custom image using a community-created driver. Instructions on how to create the image are included in this article. This problem is not specific to ASRock's 11th Gen: the custom image is also required for Intel's 11th Gen NUCs and some older models.

Read More »ESXi on ASRock Industrial NUC 1100 Series (11th Gen Intel "Tiger Lake" CPU)

Installing or removing packages fails with "BOOTx64.EFI not found" error - ESXi 7.0

While preparing my ESXi hosts for the recently released ESXi 7.0 Update 3, I came across a strange issue. When I tried to install or remove any package using esxcli, the operation failed with the following error message:

[KeyError]
"filename 'var/db/payloads/boot_loader_efi/BOOTx64.EFI' not found"
Please refer to the log file for more details.

Any updates using Update Manager / Lifecycle Manager also failed. ESXi was running 7.0.2 build-17867351, which corresponds to ESXi 7.0 Update 2a. Running "esxcli software profile list" revealed that this was a fresh installation and no packages had been updated yet. The image had been customized by adding the vmkusb-nic-fling using PowerCLI Image Builder. After some troubleshooting, I was able to reproduce the issue, which helped me identify the root cause, and I also found a solution to make updates work again.

Read More »Installing or removing packages fails with "BOOTx64.EFI not found" error - ESXi 7.0

vCenter 7.0 Update Issues

Patching the vCenter Server Appliance in vSphere 7.0 has become a gamble. When you try to update using the vCenter's VAMI, you either have greyed-out STAGE ONLY and STAGE AND INSTALL buttons:

The installation fails with "Exception occurred in install precheck phase" errors, rendering the VAMI unusable:

Or the installation gets stuck on Downloading RPM vsphere-ui-[version].rpm:

Read More »vCenter 7.0 Update Issues