To exclude all robots from the entire server:

User-agent: *
Disallow: /
To allow all robots complete access:

User-agent: *
Disallow:
(or just create an empty “/robots.txt” file, or don’t use one at all)
To exclude all robots from parts of the server:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
To exclude a single robot:

User-agent: BadBot
Disallow: /
To allow a single robot while excluding all others:

User-agent: Google
Disallow:

User-agent: *
Disallow: /
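To sanity-check how such a record is interpreted, you can feed it to Python's standard-library urllib.robotparser. This is a minimal sketch: real crawlers vary in how they match User-agent tokens, and the path "/page.html" here is just an illustrative example.

```python
import urllib.robotparser

# The "allow a single robot" record from above.
rules = """\
User-agent: Google
Disallow:

User-agent: *
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The named robot may fetch anything; everyone else is shut out.
print(rp.can_fetch("Google", "/page.html"))  # True
print(rp.can_fetch("BadBot", "/page.html"))  # False
```

Note that an empty Disallow line means "nothing is disallowed", which is how the named robot ends up with full access.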
To exclude all files except one: this is a bit awkward, as the original standard has no "Allow" field (though many crawlers now support one). The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively, you can explicitly list every page to be disallowed:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
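The per-file approach can be verified the same way with urllib.robotparser; the paths reuse the example record above, and "/~joe/index.html" stands in for the one page left accessible.

```python
import urllib.robotparser

# The explicit per-file record from above.
rules = """\
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Listed files are blocked; anything not listed stays fetchable.
print(rp.can_fetch("*", "/~joe/junk.html"))   # False
print(rp.can_fetch("*", "/~joe/index.html"))  # True
```

The drawback, compared with the directory approach, is that every new file under /~joe/ must be added to the list by hand.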