In one of my scenarios, my setup fetches data from Google Cloud Storage, processes it in a Kubernetes cluster, and loads it into BigQuery. The size of the data chunks fetched from Cloud Storage varies from 1 MB to 1 GB.
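For context, here is a minimal sketch of what each worker pod does. This is illustrative only, not my actual code: the bucket, blob, and table names and the `process_chunk` step are placeholders, and it assumes the `google-cloud-storage` and `google-cloud-bigquery` Python client libraries.

```python
# Minimal sketch of the per-pod workflow (illustrative; names are placeholders).
from google.cloud import bigquery, storage


def run(bucket_name: str, blob_name: str, table_id: str) -> None:
    # 1. Fetch a chunk (anywhere from 1 MB to 1 GB) from Cloud Storage.
    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(blob_name)
    data = blob.download_as_bytes()  # the whole chunk lands in pod memory here

    # 2. Process the chunk (placeholder for the real transformation).
    rows = process_chunk(data)

    # 3. Load the processed rows into BigQuery.
    bq_client = bigquery.Client()
    errors = bq_client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"BigQuery load failed: {errors}")


def process_chunk(data: bytes) -> list[dict]:
    # Placeholder: parse the raw bytes into JSON-serialisable rows.
    raise NotImplementedError
```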
I am using the following Kubernetes setup (a rough per-replica resource calculation is sketched after the list):
- Created a node pool with autoscaling from 1 to 4 nodes
- Machine type is ‘n1-standard-96’ with a 1000 GB boot disk per node
- Three deployments run on this node pool, each with 20 replicas (60 pods in total)
Kubernetes Version: 1.14.10-gke.17
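For scale, here is a rough back-of-envelope calculation of what each replica gets if the 60 pods spread evenly over the nodes. The n1-standard-96 figures (96 vCPUs, 360 GB RAM) come from the GCE machine-type specs; the even-spread assumption and the ~100 % utilisation it implies are mine.

```python
# Rough per-pod resource share under the setup above (even spread assumed).
VCPUS_PER_NODE = 96       # n1-standard-96
MEM_GB_PER_NODE = 360     # n1-standard-96
DISK_GB_PER_NODE = 1000   # boot disk per node
PODS_TOTAL = 3 * 20       # 3 deployments x 20 replicas = 60 pods

for nodes in (1, 4):      # autoscaler range: min 1, max 4 nodes
    pods_per_node = PODS_TOTAL / nodes
    print(
        f"{nodes} node(s): ~{pods_per_node:.0f} pods/node, "
        f"~{MEM_GB_PER_NODE / pods_per_node:.1f} GB RAM, "
        f"~{VCPUS_PER_NODE / pods_per_node:.2f} vCPU, "
        f"~{DISK_GB_PER_NODE / pods_per_node:.1f} GB disk per pod"
    )
```

That works out to roughly 6 GB RAM and 17 GB disk per pod when the pool is at 1 node, and roughly 24 GB RAM and 67 GB disk per pod at 4 nodes, before accounting for system overhead.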
Despite this, I am still getting the following errors:
- “Cannot allocate memory”
- “No disk space available”
Could someone please help me identify the bottleneck?