Installing or removing packages fails with "BOOTx64.EFI not found" error - ESXi 7.0

While preparing my ESXi Hosts for the recently released ESXi 7.0 Update 3, I came across a strange issue. When I tried to install or remove any package using esxcli, the operation failed with the following error message:

"filename 'var/db/payloads/boot_loader_efi/BOOTx64.EFI' not found"
Please refer to the log file for more details.

Updates using Update Manager / Lifecycle Manager also failed. ESXi was running at 7.0.2 build-17867351, which corresponds to ESXi 7.0 Update 2a. Running "esxcli software profile get" revealed that this was a fresh installation and no packages had been updated yet. The image had been customized by adding the vmkusb-nic-fling using the PowerCLI Image Builder. After some troubleshooting, I was able to reproduce the issue, which helped me identify the root cause, and I also found a solution to make updates functional again.
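To check whether a host is in a comparable state, you can inspect the installed image profile and the VIB inventory with esxcli. This is a quick sketch of the commands I mean; the output naturally differs per host and per custom image:

```shell
# Show the image profile currently installed on the host.
# A fresh installation reports the unmodified (or custom) profile name.
esxcli software profile get

# List all installed VIBs; a customized image shows the
# community vmkusb-nic-fling driver here.
esxcli software vib list | grep -i vmkusb
```

If "esxcli software profile get" reports the original installation profile and the VIB list shows no updated packages, the host has never been patched, which matches the situation described above.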

Read More »Installing or removing packages fails with "BOOTx64.EFI not found" error - ESXi 7.0

vCenter 7.0 Update Issues

Patching the vCenter Server Appliance in vSphere 7.0 has become a lottery. When you try to update using the vCenter's VAMI, you either get greyed-out STAGE ONLY and STAGE AND INSTALL buttons:

Or the installation fails with "Exception occurred in install precheck phase" errors, rendering the VAMI unusable:

Or the installation gets stuck on Downloading RPM vsphere-ui-[version].rpm:

Read More »vCenter 7.0 Update Issues

VMware ESXi 7.0 Update 2 on Intel NUC

VMware vSphere ESXi 7.0 Update 2 was released this week, and before deploying it to production, you will want to evaluate it in your testing environment or homelab. If you have Intel NUCs, you should always be very careful when updating to new ESXi releases, as there might be issues. Please always keep in mind that this is not an officially supported platform.

Throughout the 7.0 releases, there have been many issues with consumer network adapters, such as the deprecation of VMKlinux drivers (and thus the missing support for Realtek NICs) and the ups and downs with the ne1000 driver.

To be on the safe side, I'm doing a quick check on which NUCs are safe to update and which considerations to take into account before installing the update. I'm also briefly explaining the options to work around the crypto64.efi issue.

Read More »VMware ESXi 7.0 Update 2 on Intel NUC

Failed to load crypto64.efi - ESXi 7.0 U2 Upgrade Error

When you try to upgrade your ESXi host to the latest 7.0 U2 release, using either the predefined update baselines or esxcli with the upgrade bundle, your ESXi host might fail to reboot with the following error message:

Loading /boot.cfg
Failed to load crypto64.efi
Fatal error: 15 (Not found)

The error cannot be solved with the Shift+R method to restore the previous ESXi version. VMware is aware of the problem and has already removed the update bundle (VMware-ESXi-7.0U2-17630552-depot.zip) and image profile (ESXi-7.0.2-17630552-standard) from its repository. Currently, you only have two options to upgrade to ESXi 7.0 Update 2. If you have already run into the "Failed to load crypto64.efi" error, you have to take option 1, which will fix the error.

[Update 2021-03-13] - VMware has also disabled the image profile for 7.0.2. If you try an online update using esxcli or want to create a custom image using Image Builder, you get the following error:

[NoMatchError] No image profile found with name 'ESXi-7.0.2-17630552-standard'
id = ESXi-7.0.2-17630552-standard
Please refer to the log file for more details.
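The error appears when the profile name no longer resolves in the online depot. For illustration, a typical online update command that now fails with the NoMatchError looks like this (the depot URL is VMware's public online depot; the profile name is the pulled 17630552 build):

```shell
# Online update against VMware's public depot. Because the
# 17630552 image profile has been withdrawn, esxcli aborts
# with [NoMatchError] instead of staging the update.
esxcli software profile update \
  -p ESXi-7.0.2-17630552-standard \
  -d https://hostupdate.vmware.com/software/VUM/PRODUCTION/main/vmw-depot-index.xml
```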

Read More »Failed to load crypto64.efi - ESXi 7.0 U2 Upgrade Error

Deploy Container Service Extension (CSE 3.0) in VMware Cloud Director 10.2

With the release of Cloud Director 10.2, the Container Service Extension 3.0 has been released. With CSE 3.0 you can extend your cloud offering by providing Kubernetes as a Service. Customers can create and manage their own K8s clusters directly in the VMware Cloud Director portal.

I've already described how to deploy vSphere with Tanzu-based Kubernetes clusters in VCD. CSE 3.0 with the "Native K8s Runtime" is a neat alternative that allows you to deploy K8s directly into the customer's Organization networks, which is currently not possible with Tanzu.

This article explains how to integrate CSE 3.0 in VMware Cloud Director 10.2.

Read More »Deploy Container Service Extension (CSE 3.0) in VMware Cloud Director 10.2

VMware vSAN on Consumer-Grade SSDs - Endurance analysis

When you are running an ESXi-based homelab, you might have considered using vSAN as the storage technology of choice. Hyperconverged storage is a growing alternative to SAN-based systems in virtual environments, so using it at home will help improve your skill set with that technology.

To get started with vSAN you need at least 3 ESXi Hosts, each equipped with 2 drives. Alternatively, you can build a 2-node vSAN Cluster using a Raspberry Pi as a witness node.

VMware maintains a special HCL that lists the drives supported for use with vSAN. In production setups, it is very important to use certified hardware. Using non-enterprise hardware might result in data loss and poor performance, caused by the lack of power-loss protection and small caches.

This article takes a look at consumer-grade SSDs and their endurance when used with vSAN. Please be aware that non-certified hardware should only be used in homelabs or for demo purposes. Do not place sensitive data on a vSAN that runs on consumer hardware.

Read More »VMware vSAN on Consumer-Grade SSDs - Endurance analysis

vSphere 7.0 Performance Counter Description

This is a list of all performance metrics available in vSphere vCenter Server 7.0. Performance counters can be viewed for Virtual Machines, Hosts, Clusters, Resource Pools, and other objects by opening Monitor > Performance in the vSphere Client.

These performance counters can also be used for performance analysis with esxcfg-perf.pl, or PowerCLI.

Read More »vSphere 7.0 Performance Counter Description

Tanzu Kubernetes Licensing in vSphere 7.0 Update 1

With the release of vSphere 7.0 Update 1, VMware introduced a new licensing model for its Tanzu Kubernetes integration. Basically, the licensing has changed from an ESXi host license to a cluster license that resembles the vSAN license, which has been in place for a couple of years. The change only affects how you have to apply the license. The entity to pay for is still a physical CPU.

In vSphere 7.0 GA, the license required to enable Kubernetes (aka. "Workload Management") was an add-on license for ESXi Hosts named "vSphere 7 Enterprise Plus with Kubernetes". With the introduction of vSphere 7.0 Update 1, which is also referred to as 7.0.1, "vSphere add-on for Kubernetes" has been rebranded and split into 4 licenses: Tanzu Basic, Tanzu Standard, Tanzu

Read More »Tanzu Kubernetes Licensing in vSphere 7.0 Update 1

Update ESXi 7.0 with VMKUSB NIC Fling to 7.0 Update 1

The USB Native Driver Fling, a popular ESXi driver by Songtao Zheng and William Lam that adds support for USB-based Network Adapters, has been updated to version 1.7. The new version has added support for vSphere 7.0 Update 1.

When you download the latest version, you will notice that there are separate versions for 7.0 and 7.0 U1. Each version is only compatible with its corresponding ESXi version, which makes direct updates a little more complex.

This article explains how to upgrade ESXi hosts with USB-based network adapters in a single step.

Read More »Update ESXi 7.0 with VMKUSB NIC Fling to 7.0 Update 1

Tips for using USB Network Adapters with VMware ESXi

Running Intel NUCs and other SFF systems with ESXi is a proven standard for virtualization home labs. One major drawback is that most available SFF systems only have a single Gigabit network adapter. This might be sufficient for a standalone ESXi host with a few VMs, but when you want to use shared storage or VMware NSX, you definitely want additional NICs.

This article explains some basics to consider when running USB-based network adapters with ESXi.

Read More »Tips for using USB Network Adapters with VMware ESXi