I’m new to Kubernetes and AWS, and I tried to set up a cluster there. Now I have a problem deleting the Kubernetes cluster running on AWS EC2. The cluster was set up using kOps.
Then I accidentally deleted the EC2 instance from which the cluster had been created. First I tried terminating the EC2 nodes and the master, but then recalled that the cluster keeps them alive and re-creates the instances each time they are terminated. Next I connected to the master node and installed kOps there, but I couldn’t delete the cluster from there either. I used the command
`kops delete cluster --name=*name* --state=s3://*bucket-name*` to delete the cluster, but no luck.
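For reference, this is roughly what I ran (with placeholders for my actual cluster and bucket names). I’ve since read that `kops delete cluster` only previews the deletion unless `--yes` is added, so maybe that flag is what I was missing:

```shell
# Preview what kops would delete (this is a dry run by default)
kops delete cluster --name=*name* --state=s3://*bucket-name*

# Actually perform the deletion (requires the --yes flag)
kops delete cluster --name=*name* --state=s3://*bucket-name* --yes
```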
So I thought that deleting the S3 bucket would solve my problem, but it only made things worse: now there is no state at all. I sent a message to AWS Support, but they replied only with generic instructions for deleting EC2 resources (delete the load balancer, services, etc.). So now I cannot connect to the cluster, because the credentials are lost, and I just don’t know what to do in this situation.
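Since the state store is gone, my current plan is to find the leftover resources by tag with the AWS CLI and delete them by hand. As I understand it, kOps tags the resources it creates with a `KubernetesCluster` tag matching the cluster name (I’m assuming that tag convention here; `*name*` is a placeholder for my cluster name):

```shell
# List EC2 instances that kOps tagged for this cluster
aws ec2 describe-instances \
  --filters "Name=tag:KubernetesCluster,Values=*name*" \
  --query "Reservations[].Instances[].InstanceId"

# The same tag filter should also find leftover volumes and security groups
aws ec2 describe-volumes --filters "Name=tag:KubernetesCluster,Values=*name*"
aws ec2 describe-security-groups --filters "Name=tag:KubernetesCluster,Values=*name*"
```

Is this a reasonable way to clean up, or is there a better approach when the kOps state bucket no longer exists?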
Could you please help me?
Cloud being used: AWS
Installation method: kOps
Host OS: Windows