Can't see node via kubectl

More of a question than an issue. In two different Kubernetes installations, I am able to see the nodes associated with the cluster via the kubectl command in one, but not in the other. Which configuration parameter decides this? Any help is appreciated.

Cluster information:

Kubernetes version: 1.15.0
Cloud being used: AWS
Installation method: apt
Host OS: Ubuntu 18.04
CNI and version:
CRI and version:

That would be defined in the kubeconfig file. There are a few ways to manage the file or files you use to access your clusters; here’s a good place to start: Configure Access to Multiple Clusters - Kubernetes

Kubectx is another nice tool that can help.
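If it helps, a quick way to check which cluster kubectl is actually talking to looks something like this (a sketch assuming a standard kubeconfig at `~/.kube/config`; `my-cluster` is a placeholder for your own context name):

```shell
# List all contexts defined in your kubeconfig; the current one is starred
kubectl config get-contexts

# Show which context kubectl is currently using
kubectl config current-context

# Switch to the context for the cluster whose nodes you expect to see
kubectl config use-context my-cluster

# With kubectx installed, switching is even shorter
kubectx my-cluster
```

If the two installations point at different contexts (or different kubeconfig files via the KUBECONFIG environment variable), that alone can explain why nodes show up in one and not the other.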

Since the question about how to configure multiple contexts has been answered by @macintoshprime, I am assuming you mean that you cannot list the nodes due to a permissions issue.
That also partially answers the question: your permissions on the cluster decide what you can or cannot do.

This part of the documentation explains nicely how permissions work in general, and also how you can determine whether you can do certain things as your own user or as another user.

For interacting with a cluster-scoped resource like a Node, you will need a role that grants access to it, such as cluster-admin.
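To check whether permissions are actually the problem, `kubectl auth can-i` is handy (a sketch run against your own cluster; the `--as` user name below is a placeholder):

```shell
# Can the current user list nodes?
kubectl auth can-i list nodes

# Check the same on behalf of another user ("jane" is a placeholder)
kubectl auth can-i list nodes --as=jane
```

If the answer is "no", you would need a ClusterRole and ClusterRoleBinding granting read access to the nodes resource, or the cluster-admin role.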

Thanks @macintoshprime. I will try this.


It is most likely not a permissions issue. I will try out @macintoshprime’s suggestion. Thanks for responding, Rick.


Let me know if you hit any bumps on the way. Good luck!