Question & Answer
Question
Why is vio_daemon consuming high memory on a PowerVM Virtual I/O Server (VIOS)?
Cause
The problem can be due to a known issue in VIOS 2.2.3.0 through 2.2.3.3, where vio_daemon has a memory leak that was fixed in 2.2.3.4 by APAR IV64508, or it can be due to incorrect VIOS settings.
Answer
To check your VIOS level, as padmin, run:
$ ioslevel
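The comparison against the fix level can be sketched in portable shell. This is an illustration only: LEVEL is hard-coded here, whereas on a real VIOS it would come from the `ioslevel` output above.

```shell
#!/bin/sh
# Sketch: decide whether a VIOS level already contains the IV64508 fix,
# which shipped in 2.2.3.4. LEVEL is hard-coded for illustration; on a
# real VIOS it would be captured from `ioslevel`.
LEVEL=2.2.3.3
FIX=2.2.3.4
# Sort the two versions field by field; whichever sorts first is older.
lowest=$(printf '%s\n%s\n' "$LEVEL" "$FIX" | sort -t. -k1,1n -k2,2n -k3,3n -k4,4n | head -1)
if [ "$lowest" = "$LEVEL" ] && [ "$LEVEL" != "$FIX" ]; then
    echo "VIOS $LEVEL is below $FIX; the vio_daemon leak fixed by IV64508 may apply"
else
    echo "VIOS $LEVEL is at or above $FIX; check the /etc/security/limits settings instead"
fi
```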
If your VIOS level is 2.2.3.4 or higher, the problem may be due to incorrect system settings in /etc/security/limits on the VIOS. If the "stack" size in the default stanza is set to "unlimited" (stack = -1), processes are allowed to pin as much stack as they request, which can cause vio_daemon to consume a large amount of memory.
$ oem_setup_env
# vi /etc/security/limits    <-- check the default stanza
default:
 core = -1
 cpu = -1
 data = -1
 rss = -1
 stack = -1
 nofiles = -1
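Checking for this condition can be scripted. The sketch below builds a sample copy of the stanza above so it is self-contained; on a real VIOS, LIMITS would simply point at /etc/security/limits.

```shell
#!/bin/sh
# Sketch: report whether the "default" stanza of a limits file sets the
# stack size to unlimited (-1). A sample file is built here for
# illustration; on a real VIOS, LIMITS would be /etc/security/limits.
LIMITS=./limits.sample
printf 'default:\n\tcore = -1\n\tstack = -1\n\tnofiles = -1\n' > "$LIMITS"
# Pull the stack value out of the default stanza only: a stanza ends at
# the next line that is not indented.
stack=$(awk '/^default:/ {in_d = 1; next}
             /^[^ \t]/   {in_d = 0}
             in_d && $1 == "stack" {print $3}' "$LIMITS")
if [ "$stack" = "-1" ]; then
    echo "stack is unlimited in the default stanza; vio_daemon may pin excessive memory"
fi
rm -f "$LIMITS"
```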
In some cases, high memory consumption by vio_daemon is first noticed after a VIOS update to 2.2.3.x. However, a VIOS update will NOT change these settings. It is strongly recommended not to modify the default values, as doing so is known to cause unpredictable results. Below is an example of the default values:
default:
 core = 2097151
 cpu = -1
 data = 262144
 rss = 65536
 stack = 65536
 nofiles = 2000
To correct the problem, change the settings back to the "default" values shown above, then reboot the VIOS at your earliest convenience.
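The correction can be sketched as follows. The script edits a sample copy of the file so it is self-contained; on a real VIOS the target is /etc/security/limits, and the supported way to change a stanza attribute on AIX is the chsec command, e.g. `chsec -f /etc/security/limits -s default -a stack=65536`.

```shell
#!/bin/sh
# Sketch: reset an unlimited stack value in the "default" stanza back to
# the AIX default of 65536 (512-byte blocks). A sample file is edited
# here for illustration; on a real VIOS the file is /etc/security/limits
# and chsec is the supported editing tool.
LIMITS=./limits.sample
printf 'default:\n\tcore = 2097151\n\tstack = -1\n' > "$LIMITS"
awk '/^default:/ {in_d = 1}
     /^[^ \t]/ && !/^default:/ {in_d = 0}
     in_d && $1 == "stack" {sub(/-1$/, "65536")}
     {print}' "$LIMITS" > "$LIMITS.new" && mv "$LIMITS.new" "$LIMITS"
result=$(grep 'stack' "$LIMITS")
echo "$result"
rm -f "$LIMITS"
```

A reboot is still required afterward for the new limits to take effect for running daemons.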
Note 1
If the stack size was added to the root and/or padmin stanzas with an unlimited setting, remove it prior to rebooting the VIOS.
Note 2
If the client partitions are not made redundant through a second VIOS, schedule a maintenance window to bring the clients down before rebooting the VIOS.
Finally, if your environment does not meet the above criteria, gather the following data to help determine whether vio_daemon is reaching a limit in the /etc/security/limits configuration or is leaking memory for some other reason. Then contact your local SupportLine representative and provide the testcase.
1. Gather svmon output while the issue is occurring.
2. Collect a snap from the VIOS.
Document Information
Modified date:
19 February 2022
UID
isg3T1022976