= Prevent Google & other search engines from indexing homelab domain =

I have some services that I'm exposing through the Traefik reverse proxy. Is there a way to make sure search engines don't index the domain I'm using, so that I don't end up with hundreds of bots brute-forcing it?
I have Authelia in front of all services, but I'd rather not have my domain showing up in SERPs. Thanks!
“Warning: Don't use a robots.txt file as a means to hide your web pages from Google search results

If other pages point to your page with descriptive text, Google could still index the URL without visiting the page. If you want to block your page from search results, use another method such as password protection or noindex.”
For websites, it used to be a robots.txt file, but now you use a noindex meta tag on your pages. For other services, I don't know how you would stop them, because a simple port scan will reveal open ports, and from that the most common services can be gleaned. Malicious bots don't care about search engines anyway; they do their own reconnaissance.
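For reference, a minimal sketch of that noindex meta tag; it goes in the <head> of every page you want kept out of the index:

    <meta name="robots" content="noindex">

Note that a crawler has to be able to fetch the page to see the tag, which is why Google's warning above says not to hide the same page behind robots.txt.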

Well, you can do this in different ways. The most well-known ones, though not necessarily the best, are:

Stopping search engines from crawling, using a robots.txt file

Restricting them using a noindex tag

Adding an X-Robots-Tag HTTP header (see the sketch after this list)

Blocking/cloaking by User-Agent

Blocking/cloaking by IP address range
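Since the question already involves Traefik, the X-Robots-Tag option is probably the least effort. A minimal sketch using Traefik v2's headers middleware through the file provider; the middleware/router/service names, the host service.xyz.com, and the backend URL are all made up for illustration:

    # dynamic.yml -- add X-Robots-Tag to every response from this router
    http:
      middlewares:
        noindex-header:
          headers:
            customResponseHeaders:
              X-Robots-Tag: "noindex, nofollow"
      routers:
        my-service:
          rule: "Host(`service.xyz.com`)"
          service: my-service
          middlewares:
            - noindex-header
      services:
        my-service:
          loadBalancer:
            servers:
              - url: "http://192.168.1.10:8080"

Attaching the middleware once per router covers the whole service, so you don't have to touch the pages themselves.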

You can also change the default HTTP port. It won't have a big effect, but it will make indexing harder.
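In Traefik that would just be the entry point address in the static config; a sketch, with 8443 as an arbitrary non-standard choice:

    # traefik.yml (static config) -- serve HTTPS on a non-standard port
    entryPoints:
      websecure:
        address: ":8443"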

Everything in my homelab runs on a CNAME, and I've forwarded everything to my lab using the * wildcard.

So *.xyz.com -> lab

Since it doesn’t have an explicit DNS entry, it won’t show up on Google.
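A sketch of what that wildcard looks like in a BIND-style zone file; the target name lab.xyz.com is an assumption, substitute your own lab host:

    ; any subdomain of xyz.com resolves through the lab host
    *.xyz.com.    IN    CNAME    lab.xyz.com.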

As for the domain itself, www.xyz.com directs to a “this page is left intentionally blank” page.