AWS resources are being used to crawl my website without my permission. How can I let AWS know?

First, modify your robots.txt file to stop or slow down the crawler. robots.txt is a plain-text file served from the root of your domain (for example, https://example.com/robots.txt) that lists the access rules you want crawlers to follow.

By modifying this file, you can specify which crawlers are allowed, which pages they may crawl, and, for crawlers that support it, the rate at which pages may be requested.
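For example, a robots.txt file might look like the following. The crawler name "BadBot" and the path /private/ are illustrative placeholders; substitute the user-agent string and paths that apply to your site:

```
# Block one specific crawler entirely
User-agent: BadBot
Disallow: /

# Allow all other crawlers, but keep them out of /private/
# and ask them to wait 10 seconds between requests.
# (Crawl-delay is honored by some crawlers but is not part
# of the original robots.txt standard.)
User-agent: *
Disallow: /private/
Crawl-delay: 10
```

Note that robots.txt is advisory: well-behaved crawlers respect it, but a crawler can ignore it, which is when the abuse-report step below applies.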

After you have modified your robots.txt file, if you believe a crawler running on AWS resources is still not abiding by it, submit an abuse report to AWS.

Published: 2018-09-06