GKE node has over 1,000 spawned SSH connections from a Google account

On GKE, I have noticed that there is something like a management user that logs into the nodes every so often.
I assume this is so Google can perform rolling updates and various other maintenance tasks.

However, if the node is under pressure (e.g. a deployment has no CPU limit imposed and consumes the host’s entire CPU), the management user logs in again while keeping the old SSH connection open.

This has become an issue for me, as it stops the node from being able to fork processes, which in turn stops it from spinning up any new deployments.
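For context, fork() on Linux typically starts failing once the calling user’s process count hits RLIMIT_NPROC or the kernel runs out of PIDs, so dumping both alongside the live process count gives a rough picture of how close the node is. A minimal Python sketch, run on the node itself:

```python
import os
import resource

# fork() typically fails with EAGAIN once the calling user's process
# count reaches RLIMIT_NPROC, long before the global kernel.pid_max.
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)

# /proc lists thread-group leaders only, so this undercounts threads,
# which also consume PIDs; treat the numbers as a rough indicator.
procs = sum(1 for d in os.listdir("/proc") if d.isdigit())

with open("/proc/sys/kernel/pid_max") as f:
    pid_max = int(f.read())

print(f"live processes:          {procs}")
print(f"RLIMIT_NPROC soft/hard:  {soft}/{hard}")
print(f"kernel.pid_max:          {pid_max}")
```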

My current solution is to kill sshd and all SSH sessions on the node and then restart the node from the GCP console, but this feels like a bug on GKE’s side and I wonder who would be best to speak to about it.
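For reference, the cleanup step amounts to something like this rough Python sketch (run as root on the node; it simply SIGKILLs every sshd process, since the node gets rebooted from the console right afterwards):

```python
import os
import signal

# SIGKILL every sshd process: the listener plus each per-session child.
# There is no need to preserve the listener here because the node is
# rebooted from the console straight afterwards.
for entry in os.listdir("/proc"):
    if not entry.isdigit():
        continue
    try:
        with open(f"/proc/{entry}/comm") as f:
            name = f.read().strip()
    except FileNotFoundError:
        continue  # the process exited while we were scanning
    if name == "sshd":
        try:
            os.kill(int(entry), signal.SIGKILL)
        except ProcessLookupError:
            pass  # already gone
```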

I’d first make sure you’re on the latest version of GKE.
Second, are you sure you’re not being DoS’d or brute-forced?

If all else fails, before killing all the SSH connections, run netstat and see exactly where these connections are coming from.
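If netstat isn’t available on the node image, something along these lines gives the same picture: a rough Python sketch that reads /proc/net/tcp and tallies established connections to port 22 by remote address (IPv4 only, and it assumes a little-endian host):

```python
import socket
import struct
from collections import Counter

SSH_PORT = 22
ESTABLISHED = "01"  # connection state code used by /proc/net/tcp

def decode(hex_addr):
    """Decode the little-endian hex 'ADDR:PORT' pairs in /proc/net/tcp."""
    ip_hex, port_hex = hex_addr.split(":")
    ip = socket.inet_ntoa(struct.pack("<I", int(ip_hex, 16)))
    return ip, int(port_hex, 16)

peers = Counter()
with open("/proc/net/tcp") as f:
    next(f)  # skip the column-header line
    for line in f:
        fields = line.split()
        _, local_port = decode(fields[1])
        remote_ip, _ = decode(fields[2])
        if local_port == SSH_PORT and fields[3] == ESTABLISHED:
            peers[remote_ip] += 1

for ip, count in peers.most_common():
    print(f"{count:6d} established SSH connections from {ip}")
```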

I’m not sure how being on the latest version of GKE would help here, as it seems to be an issue with the underlying VM setup.

All the SSH connections come from an internal IP address not assigned within my project.