Metrics-server on a 1.25+ k8s cluster

Hello,

I’m struggling with metrics-server on a freshly installed 1.28 cluster. I’ve deployed the latest 0.7.1 components.yaml, but as far as I can tell from the logs (I relaunched metrics-server with the --v=debug switch enabled), metrics-server is failing to scrape the kubelet API:

===Cut===
E0410 10:33:04.386093 1 scraper.go:149] "Failed to scrape node" err="request failed, status: \"403 Forbidden\"" node="k8s-wrk3.foo.bar"
I0410 10:33:04.386873 1 round_trippers.go:553] GET https://10.0.7.84:10250/metrics/resource 403 Forbidden in 1 milliseconds
===Cut===

I’m aware that since 1.24 service account token secrets are no longer created automatically, and this seemed to be exactly the issue. I’ve created the token secret manually and referenced it from the service account in the yaml:


apiVersion: v1
kind: ServiceAccount
metadata:
  labels:
    k8s-app: metrics-server
  name: metrics-server
  namespace: kube-system
secrets:
- name: metrics-server-secret
---
apiVersion: v1
kind: Secret
type: kubernetes.io/service-account-token
metadata:
  name: metrics-server-secret
  namespace: kube-system
  annotations:
    kubernetes.io/service-account.name: metrics-server
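
For what it’s worth, this is how I’ve been checking that the token secret actually gets filled in after applying the manifest above (plain kubectl; the secret name is the one from my yaml):

kubectl -n kube-system describe serviceaccount metrics-server
# a kubernetes.io/service-account-token secret should end up with ca.crt, namespace and token keys
kubectl -n kube-system get secret metrics-server-secret -o yaml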

But, unfortunately, this didn’t resolve the issue: metrics-server is still unable to scrape the kubelet API.
Can someone give me a boost please, I’m clearly missing something crucial.
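
In case it helps spot my mistake: as far as I understand, with --authorization-mode=Webhook the kubelet maps /metrics/resource to the nodes/metrics subresource and asks the API server whether the caller may get it, so I’ve also been poking at the RBAC side (names as they appear in the stock components.yaml, if I’m reading it right):

kubectl auth can-i get nodes/metrics --as=system:serviceaccount:kube-system:metrics-server
kubectl get clusterrolebinding system:metrics-server -o yaml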

Thanks.

Okay, since there’s literally no fix and no working workaround, I changed --authorization-mode=Webhook to --authorization-mode=AlwaysAllow in the kubelet startup options, and now I have my metrics.
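
For completeness, so nobody has to guess what I touched: depending on how the kubelet is launched this is either a command-line flag or the authorization section of the kubelet config file; the config-file variant looks roughly like this (only the mode line changed, the rest of my config is untouched):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# ... existing settings ...
authorization:
  # was: Webhook
  mode: AlwaysAllow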

Yup, I know exactly what I did, and I strongly discourage anyone reading this from doing the same unless they fully understand what it means.