To exclude all robots from the entire server:

User-agent: *
Disallow: /
To allow all robots complete access:

User-agent: *
Disallow:

(or just create an empty “/robots.txt” file, or don’t use one at all)
To exclude all robots from parts of the server:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
To exclude a single robot:

User-agent: BadBot
Disallow: /
To allow a single robot and exclude all others:

User-agent: Google
Disallow:

User-agent: *
Disallow: /
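Per-robot rules like these can be sanity-checked with Python’s standard-library urllib.robotparser, which applies the same record-matching logic a crawler would. A minimal sketch, assuming nothing beyond the record above (the host example.com is a placeholder used only to build test URLs):

from urllib import robotparser

# The "allow a single robot" file from above: Google may crawl
# everything, every other robot is excluded by the catch-all record.
rules = """\
User-agent: Google
Disallow:

User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Google", "http://example.com/page.html"))  # True
print(rp.can_fetch("BadBot", "http://example.com/page.html"))  # False

A robot obeys the record that names it; the “*” record is only the fallback for robots no other record mentions, which is why Google is allowed through while BadBot is refused.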
To exclude all files except one:

This is slightly awkward, as the original /robots.txt standard has no “Allow” field. The easy way is to put all files to be disallowed into a separate directory, say “stuff”, and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively, you can explicitly list every disallowed page:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
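The same parser can confirm that the directory trick behaves as intended. A short sketch along the same lines; the host, the bot name AnyBot, and the file name index.html are made up, standing in for the one file left above the disallowed directory:

from urllib import robotparser

rules = """\
User-agent: *
Disallow: /~joe/stuff/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The one public file sits above the disallowed directory,
# so its path does not start with the Disallow prefix.
print(rp.can_fetch("AnyBot", "http://example.com/~joe/index.html"))       # True
print(rp.can_fetch("AnyBot", "http://example.com/~joe/stuff/junk.html"))  # False

Against a live site you would instead point the parser at the real file with rp.set_url("http://example.com/robots.txt") followed by rp.read(), then query can_fetch() the same way.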