I’m trying to expose a Kubernetes Service (type LoadBalancer) for a small pod deployed to a single-master cluster in AWS: one master and two worker nodes.
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    app: web
  name: web
  namespace: test
spec:
  replicas: 4
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - image: httpd:2.4.39-alpine
        imagePullPolicy: IfNotPresent
        name: httpd
        ports:
        - containerPort: 80
          protocol: TCP
```
```yaml
apiVersion: v1
kind: Service
metadata:
  labels:
    app: web
  name: web
  namespace: test
spec:
  ports:
  - name: "80"
    nodePort: 30252
    port: 80
    protocol: TCP
    targetPort: 80
  selector:
    app: web
  sessionAffinity: None
  type: LoadBalancer
status:
  loadBalancer:
    ingress:
    - hostname: xxxxxxxxxxx.us-west-2.elb.amazonaws.com
```
When I open xxxxxxxxxxx.us-west-2.elb.amazonaws.com in the browser, the page doesn’t load.
The error shown is:
I also tried to access it in the ways below, from the master node and from the worker nodes, using the shell.
None of them work; there is no response at all.
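For concreteness, the checks were along these lines (a sketch: `<node-ip>` is a placeholder for a node’s actual address, and 30252 is the NodePort from the Service spec above):

```shell
# Via the NodePort, from the master and from each worker node
curl http://<node-ip>:30252

# Via the LoadBalancer hostname reported in the Service status
curl http://xxxxxxxxxxx.us-west-2.elb.amazonaws.com
```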
So I tried again after logging in to the respective worker node, and there it works.
I’m new to Kubernetes, and I’d appreciate it if anyone could tell me why I can’t access the Service using either the NodePort or the LoadBalancer hostname.
I think the request is reaching the Kubernetes Service, but it’s not going any further.
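To check the Service-to-pod path, these are the diagnostics I understand apply here (a sketch; it assumes kubectl access on the master, and that kube-proxy was deployed by kubeadm with its standard `k8s-app=kube-proxy` label):

```shell
# Does the Service have endpoints, i.e. does its selector match running pods?
kubectl -n test get endpoints web

# Are the pods backing the Service Ready, and on which nodes do they run?
kubectl -n test get pods -l app=web -o wide

# Is kube-proxy running on every node? (It programs the NodePort rules.)
kubectl -n kube-system get pods -l k8s-app=kube-proxy -o wide
```

If `get endpoints web` shows no addresses, the selector doesn’t match the pods and traffic stops at the Service, which would match the symptom above.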
Thanks in advance.
Kubernetes version: v1.19.2
Cloud being used: AWS
Host OS: Ubuntu 18.04.5 LTS