In our team, we use a GKE cluster on Google Cloud Platform (version: 1.19.9-gke.1400) to deploy Kubeflow and run some machine learning pipelines. Some of our tasks are memory-intensive, so we would like to increase the shared memory (`/dev/shm`) size by mounting a memory-backed `emptyDir` volume over it. However, the resulting mount is sized to half of the available memory, which is not what we want, since we set the `sizeLimit` parameter to "1Gi".
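For reference, a minimal pod spec matching the setup described above (the pod, container, and volume names here are illustrative, not our actual ones):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: shm-example
spec:
  containers:
    - name: trainer
      image: python:3.9          # placeholder image
      command: ["sleep", "infinity"]
      volumeMounts:
        - name: dshm
          mountPath: /dev/shm    # replace the default 64Mi shm mount
  volumes:
    - name: dshm
      emptyDir:
        medium: Memory           # tmpfs-backed volume
        sizeLimit: 1Gi           # the limit that appears to be ignored
```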
When I run `df -h` inside the container, I can see that my limit is ignored and half of the node's memory is reserved for the mount.
How can we enforce such a limit in GKE? Is this a bug?
Thank you for your help.