Disallow Directive Test
Checks for disallowed paths in the `robots.txt` file.
What is this check?
A `Disallow` directive in your `robots.txt` file tells search engine crawlers not to access a specific file, directory, or section of your website.
Why is it important?
It is commonly used to keep crawlers out of non-public sections (such as admin login pages) or to prevent crawling of pages that add no value in search results (such as internal site-search result pages).
What is the impact?
A misconfigured `Disallow` directive can be extremely costly. A single wrong line, such as `Disallow: /`, tells compliant crawlers to avoid your entire website, which over time can remove it from search results.
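The effect of that single catch-all line can be demonstrated with Python's standard `urllib.robotparser` module (a minimal sketch; the domain is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt containing the dangerous catch-all rule.
rules = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every path on the site is now off-limits to compliant crawlers.
print(parser.can_fetch("*", "https://example.com/"))           # False
print(parser.can_fetch("*", "https://example.com/products/"))  # False
```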
Example Implementation
```
User-agent: *
Disallow: /admin/
Disallow: /search-results.php
```
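A quick way to verify rules like the example above is to run candidate URLs through Python's `urllib.robotparser` (a sketch; the domain and paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt rules from above.
rules = """
User-agent: *
Disallow: /admin/
Disallow: /search-results.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

base = "https://example.com"  # hypothetical site
for path in ["/", "/products/", "/admin/login", "/search-results.php"]:
    allowed = parser.can_fetch("*", base + path)
    print(path, "crawlable" if allowed else "disallowed")
```

Note that `Disallow` rules are prefix matches, so `/admin/login` is disallowed by `Disallow: /admin/` even though it is not listed explicitly.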