Cannot join 2 AWS EC2 Kubernetes nodes


Cluster information:

Kubernetes version: v1.30
Cloud being used: AWS
Installation method: kubeadm=1.30.1-1.1 kubelet=1.30.1-1.1 kubectl=1.30.1-1.1
Host OS: Ubuntu
CNI and version: Cilium 1.16.1
CRI and version: containerd

I have 2 nodes (a CP node and a worker node), both set up on AWS t2.large instances in the default VPC. The SG created for the CP (new, with default parameters) was also used to create the worker node. Both have an identical Kubernetes configuration.

I have edited /etc/hosts so that it includes:

xx.xx.xx.xx k8scp #<----- cp node
xx.xx.xx.xx worker #<----- worker node
127.0.0.1 localhost
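To confirm the names resolve as expected, a quick check on each node (getent is standard on Ubuntu):

getent hosts k8scp
getent hosts worker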

Then I copy the output of the following command, run on the CP node:
sudo kubeadm token create --print-join-command

and paste it into the CLI of the worker node, appending --node-name=worker to the end of the output.
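For reference, the pasted command ends up looking roughly like this (the endpoint, token, and hash below are placeholders; the real values come from the CP output):

sudo kubeadm join k8scp:6443 --token <token> --discovery-token-ca-cert-hash sha256:<hash> --node-name=worker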

I would appreciate it if someone could help me out. Thank you.

Hi,
It's worth checking the security group to make sure that port 6443 is reachable from the worker node.
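A quick way to test this from the worker node (assuming netcat is installed; k8scp is the CP hostname from /etc/hosts):

nc -vz k8scp 6443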


@fox-md thank you.
I added another inbound rule (custom TCP, port 6443) to the SG and it worked.
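For anyone hitting the same issue, the equivalent AWS CLI call is roughly the following (the SG ID is a placeholder; using the SG itself as the source allows traffic between instances in the same group):

aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 6443 --source-group sg-0123456789abcdef0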