vCenter 5.5 Resource Exhaustion Detected

Following an upgrade of vCenter Server from 5.0 to 5.5, the vCenter service intermittently stopped and we began to see a number of resource exhaustion events:

Event ID: 2004
Source: Resource-Exhaustion-Detect
Windows successfully diagnosed a low virtual memory condition. The following programs consumed the most virtual memory: java.exe (2116) consumed 10149273600 bytes, java.exe (2088) consumed 4624416768 bytes, and vpxd.exe (14440) consumed 4113379328 bytes.

We increased the allocated memory (from 12GB to 24GB) and the page file (from 4GB to 6GB) but continued to experience problems. I came across a similar issue in the VMware Communities, in which user Sateesh_vcloud documented the standard JVM heap settings:

The default values for a vCenter Server installation are shown in parentheses against each setting below.

Our vCenter inventory was approximately 100 hosts and 2,000 virtual machines, but we had selected the large inventory option for all services during the upgrade, so the JVM heap allocated to each service was likely larger than we required. Sateesh_vcloud also documented the locations of the configuration files for each service:

Single Sign On:
C:\Program Files\VMware\Infrastructure\SSOServer\conf\wrapper.conf
Set "-Xmx" (default: "1024M") to "256M"
Set "-XX:MaxPermSize=" (default: "512M") to "128M" (or half of the Xmx value chosen above)

Inventory Service:
C:\Program Files\VMware\Infrastructure\Inventory Service\conf\wrapper.conf
Set (default: "3072") to "384" (MB)

vCenter Server:
C:\Program Files\VMware\Infrastructure\tomcat\conf\wrapper.conf
Set "-Xmx" (default: "1024M") to "512M"-"768M"
Set "-XX:MaxPermSize" (default: "256M") to half of the Xmx value chosen above

Web Client:
C:\Program Files\VMware\Infrastructure\vSphereWebClient\server\bin\service\conf\wrapper.conf
Set (default: "1024m") to "256m"
Set (default: "1024m") to "384m"

Log Browser:
C:\Program Files\VMware\Infrastructure\vSphereWebClient\logbrowser\conf\wrapper.conf
Set (default: "512") to "256" (MB)

Profile Driven Storage:
C:\Program Files\VMware\Infrastructure\Profile-Driven Storage\conf\wrapper.conf
Set (default: "256") to "128" (MB)
Set (default: "1024") to "384" (MB)

Orchestrator:
C:\Program Files\VMware\Infrastructure\Orchestrator\app-server\bin\wrapper.conf
Set (default: "768m") to "256m"
Set (default: "2048") to "384" (MB)
Set (default: "2048") to "512" (MB)
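To sanity-check what a given wrapper.conf is currently set to before and after an edit, the two heap-related patterns shown above can be pulled out programmatically. A minimal sketch in Python (the property names in the sample fragment are illustrative, not copied from a real file):

```python
import re

# Regexes for the two heap-related JVM arguments seen in the
# wrapper.conf excerpts above: -Xmx and -XX:MaxPermSize.
XMX_RE = re.compile(r"-Xmx(\d+[mMgG]?)")
PERM_RE = re.compile(r"-XX:MaxPermSize=(\d+[mMgG]?)")

def heap_settings(conf_text):
    """Return the -Xmx and -XX:MaxPermSize values found in wrapper.conf text."""
    xmx = XMX_RE.search(conf_text)
    perm = PERM_RE.search(conf_text)
    return {
        "Xmx": xmx.group(1) if xmx else None,
        "MaxPermSize": perm.group(1) if perm else None,
    }

# A fragment shaped like the SSO wrapper.conf settings above:
sample = """
wrapper.java.additional.9=-Xmx1024M
wrapper.java.additional.10=-XX:MaxPermSize=512M
"""
print(heap_settings(sample))  # {'Xmx': '1024M', 'MaxPermSize': '512M'}
```

Running this against each service's wrapper.conf gives a quick before/after record of the heap values without opening every file by hand.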

I updated the Inventory Service configuration file to 6144MB (previously 1288MB) and restarted the service. We have not had a recurrence of the resource exhaustion, and the vCenter service has been stable.

vCenter Service Failing to Start

I recently upgraded vCenter from 5.0 U3 to 5.5 U2. The upgrade went smoothly and vCenter ran fine until our standard monthly Windows patch window, when we found the primary vCenter service would not start.

I initially flagged the issue with our database operations team and asked them to health check the SQL database for vCenter.

However, I continued investigating, and on checking the vpxd.log file I found:

[VpxdReverseProxy] Failed to create http proxy: An attempt was made to access a socket in a way forbidden by its access permissions.

This led me to a VMware knowledge base article listing troubleshooting steps for the vCenter service. Step four of the article suggested verifying the ports required by vCenter. Running 'netstat -bano' I found that port 80 appeared to be in use by process ID 4. Via Task Manager I found process ID 4 was owned by System, which was not a conclusive identifier, but it ruled out some potential suspects.
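The same kind of check can be reproduced without netstat: attempting a TCP connection to the port tells you whether something is already listening on it (though not which process). A quick sketch in Python, where port 80 stands in for any of the ports vCenter requires, run on the vCenter host itself:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. some process is already listening on that port.
        return s.connect_ex((host, port)) == 0

# e.g. port_in_use(80) would report True while WinRM (or IIS) holds the port
```

Unlike 'netstat -bano' this does not identify the owning process, so it is only a fast first check before digging into Task Manager.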

Looking at the knowledge base article again, it lists some services to specifically check for:

‘If another application, such as Microsoft Internet Information Server (IIS) (also known as Web Server (IIS) on Windows 2008 Enterprise), Routing and Remote Access Service (RAS), World Wide Web Publishing Services (W3SVC), Windows Remote Management service (WS-Management) or the Citrix Licensing Support service are utilizing any of the ports, vCenter Server cannot start.’

Reviewing the services running on the server, I found the Windows Remote Management service. I stopped the service and then retried starting vCenter, which was successful. I was then able to restart the Windows Remote Management service, and vCenter continued to run.

I subsequently found a blog called The World According to Gabe that detailed a permanent solution.

Recording the key steps here for my own future reference:

If, when you run winrm get winrm/config | find /I "http", you find that WinRM is listening on port 80 by default, run the following command:

winrm set winrm/config/listener?Address=*+Transport=HTTP @{Port="8888"}

If you want WinRM to listen on a different port, change "8888" to whatever port you wish, keeping the formatting intact.

If you find that WinRM is not listening on port 80 by default, but is still grabbing the port, run the following command:

winrm set winrm/config/service @{EnableCompatibilityHttpListener="false"}

Later still, I found another VMware knowledge base article specific to the Windows Remote Management service.