How can I increase Amazon S3 request limits to avoid throttling on my Amazon S3 bucket?

Last updated: 2019-12-16

My Amazon Simple Storage Service (Amazon S3) bucket is returning 503 Slow Down errors. How can I increase the Amazon S3 request limits to avoid throttling? 

Resolution

You can send 3,500 PUT/COPY/POST/DELETE and 5,500 GET/HEAD requests per second per prefix in an S3 bucket. There is no limit to the number of prefixes in a bucket, so you can raise aggregate throughput by parallelizing reads and writes across prefixes. For example, spreading reads across 10 prefixes supports up to 55,000 GET/HEAD requests per second.
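Because the limits apply for each prefix, one common way to spread load is to hash each key into one of several prefixes. The following is a minimal Python sketch of such a scheme; the shard-N prefix names, the hash-based assignment, and the helper name sharded_key are illustrative assumptions, not part of any S3 API.

import hashlib

def sharded_key(base_key: str, num_prefixes: int = 10) -> str:
    """Assign a key to one of num_prefixes top-level prefixes.

    Each prefix carries its own request-rate limit, so hashing keys
    across prefixes multiplies the aggregate limit.
    """
    shard = int(hashlib.md5(base_key.encode()).hexdigest(), 16) % num_prefixes
    return "shard-{}/{}".format(shard, base_key)

# Example: "logs/2019/12/16/event-0001.json" might map to
# "shard-7/logs/2019/12/16/event-0001.json".
print(sharded_key("logs/2019/12/16/event-0001.json"))

Readers and writers must use the same assignment function so that a key is always looked up under the prefix it was stored in.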

If you receive only occasional 503 Slow Down errors, you can usually resolve them by implementing a retry mechanism with exponential backoff, as in the sketch that follows. If the errors persist after you implement retries, gradually scale up your S3 request rate and distribute objects and requests across multiple prefixes, as in the key-sharding sketch above. For more information, see Best Practices Design Patterns: Optimizing Amazon S3 Performance.
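A minimal backoff sketch in Python with boto3 follows; the function name, attempt count, and full-jitter strategy are assumptions for illustration, not an AWS-prescribed implementation.

import random
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def get_object_with_backoff(bucket: str, key: str, max_attempts: int = 5):
    """GET an object, retrying 503 Slow Down responses with
    exponential backoff plus full jitter."""
    for attempt in range(max_attempts):
        try:
            return s3.get_object(Bucket=bucket, Key=key)
        except ClientError as err:
            if err.response.get("Error", {}).get("Code") != "SlowDown":
                raise  # not a throttling error; surface it immediately
            if attempt == max_attempts - 1:
                raise  # out of retries
            # Sleep a random interval in [0, 2**attempt) seconds.
            time.sleep(random.uniform(0, 2 ** attempt))

Note that the AWS SDKs already retry throttled requests with backoff. In boto3, rather than hand-rolling a loop, you can raise the built-in retry budget, for example by passing botocore.config.Config(retries={"max_attempts": 10, "mode": "adaptive"}) when you create the client.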

