Florian Grehl

Realtek NIC and ESXi 7.0 - Use Passthrough to make use of the Adapter

Realtek adapters are very common in consumer hardware and SFF systems. Using SFF systems to run ESXi is a good option for home labs as they are inexpensive and have low power consumption. Unfortunately, the Realtek RTL8168, which is used in the ASUS PN50 and ZOTAC ZBOX Edge, for example, is not supported in ESXi. The problem can be solved with a community-created driver in ESXi 5.x and 6.x, but not in ESXi 7.0, due to the deprecation of the VMKLinux driver stack.

You can work around the problem by using a USB-based NIC to manage ESXi. USB NICs work fine and are stable, but at that point the embedded NIC is useless. If you still want to use it, you can add it to a virtual machine with passthrough.
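As a minimal sketch of that approach (the PCI address below is an assumption; look up your own first), passthrough can also be toggled from the ESXi shell in 7.0:

    # Find the PCI address of the Realtek NIC
    lspci | grep -i realtek

    # Enable passthrough for the device (0000:02:00.0 is an example address)
    esxcli hardware pci pcipassthru set -d 0000:02:00.0 -e true

Afterwards, add the NIC to a VM as a PCI device in the vSphere Client. The guest OS then needs its own Realtek driver.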

Read More »Realtek NIC and ESXi 7.0 - Use Passthrough to make use of the Adapter

USB Devices as VMFS Datastore in vSphere ESXi 7.0

This article explains how to add USB devices as Datastores in VMware ESXi 7.0. Adding USB devices as datastores was also possible in previous versions, but in vSphere 7 it has become even easier.

Please be aware that using USB datastores is not supported by VMware, so be careful when using this method with sensitive data.

In this example, I'm using a USB 3.0 to M.2 NGFF enclosure.
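Before the device can be used, the USB arbitrator, which normally reserves USB devices for passthrough to virtual machines, has to be stopped. A minimal sketch of this first step:

    # Stop the USB arbitrator so the host itself can claim USB storage
    /etc/init.d/usbarbitrator stop

    # Keep it disabled across reboots
    chkconfig usbarbitrator off

    # Identify the USB disk (device names vary)
    ls /dev/disks/

The datastore itself can then be created with the normal storage wizard in the Host Client.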

Read More »USB Devices as VMFS Datastore in vSphere ESXi 7.0

Mark USB Storage Devices as Flash fails with "The Disk is in use" Error in ESXi

When you try to mark USB-based Storage Devices as Flash in ESXi, the following error is displayed:

Cannot change the host configuration. Cannot mark disk mpx.vmhba33:C0:T0:L0 as "Flash". "Unable to configure the disk claim rules. The disk is in use."


The error message is misleading: the issue is not that the disk is in use. You have to configure an advanced setting in ESXi to allow USB disks to be claimed as flash.
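The exact setting is covered in the full article. As a related, commonly documented alternative (a sketch assuming the device is claimed by VMW_SATP_LOCAL, using the device ID from the error message above):

    # Tag the device as SSD/flash via a SATP claim rule
    esxcli storage nmp satp rule add -s VMW_SATP_LOCAL -d mpx.vmhba33:C0:T0:L0 -o enable_ssd

    # Reclaim the device so the new rule takes effect
    esxcli storage core claiming reclaim -d mpx.vmhba33:C0:T0:L0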

Read More »Mark USB Storage Devices as Flash fails with "The Disk is in use" Error in ESXi

ESXi on AMD Ryzen based ASUS PN50

The long-awaited AMD Ryzen-based PN50 by ASUS is finally available. The ESXi Homelab community is constantly growing. When you want to run ESXi in home labs you typically want to have a system that is small, silent, and transportable. To keep costs at a minimum, the power consumption is also a very important factor. The portfolio of Small Form Factor (SFF) Systems, also known as Barebone, Nettop, SoC, or Mini-PC, is enormous. Intel's NUC series is currently the most used system in the homelab market, but I'm always keeping my eyes on its competitors.

Today I'm going to test the ASUS PN50, which is currently being rolled out. The PN50 is available with four different embedded CPUs:

  • ASUS PN50 Ryzen 7 4800U (8 Core / 16 Threads, up to 4.2 GHz)
  • ASUS PN50 Ryzen 7 4700U (8 Core, up to 4.1 GHz)
  • ASUS PN50 Ryzen 5 4500U (6 Core, up to 4.0 GHz)
  • ASUS PN50 Ryzen 3 4300U (4 Core, up to 3.7 GHz)

Will ESXi run on the ASUS PN50?
Yes. It is possible to install ESXi on the ASUS PN50. Unfortunately, ASUS uses a Realtek RTL8168 Gigabit network adapter in the PN50, which does not work with ESXi 7.0. To install ESXi 6.x, you have to use a community-created driver. If you want to use ESXi 7.0, you have to use a USB-based network adapter.
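To see the missing NIC for yourself after installation (a quick sketch; output differs per system):

    # The onboard RTL8168 does not appear in the NIC list under ESXi 7.0
    esxcli network nic list

    # The adapter is still visible on the PCI bus, though
    lspci | grep -i realtek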

Read More »ESXi on AMD Ryzen based ASUS PN50

Tips for using USB Network Adapters with VMware ESXi

Running Intel NUCs and other SFF systems with ESXi is a proven standard for virtualization home labs. One major drawback is that most of the available SFF systems only have a single Gigabit network adapter. This might be sufficient for a standalone ESXi host with a few VMs, but when you want to use shared storage or VMware NSX, you definitely want additional NICs.

This article explains some basics to consider when running USB-based network adapters with ESXi.
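One example of such a consideration, sketched here based on the approach documented for the USB Native Driver Fling (vusb0 and vSwitch0 are assumptions; adjust to your setup): USB NICs initialize late during boot, so uplink assignments can get lost. A snippet in /etc/rc.local.d/local.sh can re-attach the uplink:

    # Wait up to 200 seconds for vusb0 to come up, then re-add it as an uplink
    vusb0_status=$(esxcli network nic get -n vusb0 | grep 'Link Status' | awk '{print $NF}')
    count=0
    while [ $count -lt 20 ] && [ "${vusb0_status}" != "Up" ]
    do
        sleep 10
        count=$(( count + 1 ))
        vusb0_status=$(esxcli network nic get -n vusb0 | grep 'Link Status' | awk '{print $NF}')
    done
    esxcli network vswitch standard uplink add -u vusb0 -v vSwitch0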

Read More »Tips for using USB Network Adapters with VMware ESXi

ESXi VMKUSB NIC Fling adds support for 2.5GBASE-T Adapters

The USB Native Driver Fling, a popular ESXi driver by Songtao Zheng and William Lam that adds support for USB-based network adapters, has been updated to version 1.6. The new version adds support for RTL8156-based 2.5GBASE-T network adapters.

Multi-Gigabit network adapters with 5GBASE-T have been available for a while, but those 5GbE adapters cost about $100 USD. The new driver allows the use of 2.5GbE adapters that are available for as low as $25 USD. The driver was released yesterday, and luckily I already own a bunch of 2.5GbE adapters, so I could give it a test drive immediately.

CableCreation USB 3.0 to 2.5 Gigabit LAN Adapter (CD0673)
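To try it yourself, installation is straightforward (a sketch; the filename is a placeholder for the actual download from the Fling page):

    # Install the Fling as an ESXi 7.0 component, then reboot the host
    esxcli software component apply -d /tmp/ESXi70-VMKUSB-NIC-FLING.zip

    # After the reboot, the 2.5GbE adapter should show up as a vusbX NIC
    esxcli network nic list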

Read More »ESXi VMKUSB NIC Fling adds support for 2.5GBASE-T Adapters

vSphere with Kubernetes - Which Supervisor Cluster Settings can be edited?

When you want to deploy Kubernetes on vSphere 7, it is crucial to plan the configuration thoroughly before enabling Workload Management. Many of the configuration parameters entered in the Workload Management wizard cannot be changed after the deployment.

The following table shows which settings can be changed after the initial deployment:

Read More »vSphere with Kubernetes - Which Supervisor Cluster Settings can be edited?

vSphere with Kubernetes Supports Multiple Tier-0 Gateways

During my first vSphere with Kubernetes tests, I had an issue where I was not able to activate Workload Management (Kubernetes) because it discovered multiple Tier-0 gateways. The configuration I used was vSphere 7.0 GA with an NSX-T 3.0-backed N-VDS. I had a previously configured Edge Cluster / Tier-0 Gateway for existing workloads and configured a new Edge Cluster / Tier-0 for Kubernetes.

In the Workload Management wizard, no cluster was compatible, so I was forced to use the previously configured Tier-0 with some routing workarounds. The error message in wcpsvc.log stated "[...]has more than one tier0 gateway[...]".
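If you run into the same issue, you can watch the Workload Management service log on the vCenter Server Appliance while the activation runs (the standard log location, shown as a hedged example):

    # Follow wcpsvc.log and filter for Tier-0 related messages
    tail -f /var/log/vmware/wcp/wcpsvc.log | grep -i tier0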

Today I tried to find a solution and noticed that there was an update to the official Kubernetes Guide:

Read More »vSphere with Kubernetes Supports Multiple Tier-0 Gateways

ESXi on Zotac ZBOX edge (M Series - 10th Gen Intel CPU)

The ESXi Homelab community is constantly growing. When you want to run ESXi in home labs you typically want to have a system that is small, silent, and transportable. To keep costs at a minimum, the power consumption is also a very important factor. The portfolio of Small Form Factor (SFF) Systems, also known as Barebone, Nettop, SoC, or Mini-PC, is enormous. Intel's NUC series is currently the most used system in the homelab market, but I'm always keeping my eyes on its competitors.

Today I'm going to test the Zotac ZBOX edge, which is the newest model in their M series. The Zotac M series is comparable to Intel's NUC when it comes to form factor, features, and performance. As always, I try to point out the similarities and differences compared to the NUC series.

The Zotac ZBOX edge is currently available with an Intel 10th Gen i3 or i5 CPU, but I assume that there will be a third model with an i7 CPU.

  • ZBOX edge MI623 (Intel Core i3-10110U - 2 Core, up to 4.1 GHz)
  • ZBOX edge MI643 (Intel Core i5-10210U - 4 Core, up to 4.2 GHz)
  • ZBOX edge MI663 (Intel Core i7-10710U - 6 Core, up to 4.7 GHz) *unconfirmed

Will ESXi run on the Zotac ZBOX edge?
Yes. ESXi 5.x and 6.x will run on the ZBOX edge using the community-created Realtek driver. Unfortunately, Zotac has equipped it with two Realtek RTL8111 NICs, which do not work in ESXi 7.0: there is no native driver available, and the community driver used in ESXi 5.x/6.x is not compatible with 7.0.

Read More »ESXi on Zotac ZBOX edge (M Series - 10th Gen Intel CPU)