I'm trying to upload a large file using the Amazon S3 console. Why is the upload failing?
Last updated: 2020-01-14
I'm trying to upload a large file (1 GB or larger) to Amazon Simple Storage Service (Amazon S3) using the console. However, the upload repeatedly fails, and I sometimes get timeout errors. How can I fix this?
For large files, Amazon S3 might separate the file into multiple uploads to maximize the upload speed. However, the Amazon S3 console can time out during large uploads because of session timeouts.
Instead of using the Amazon S3 console, try uploading the file using the AWS Command Line Interface (AWS CLI) or an AWS SDK.
First, install and configure the AWS CLI. Be sure to configure the AWS CLI with the credentials of an AWS Identity and Access Management (IAM) user or role that has the required Amazon S3 permissions.
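For example, you can run the aws configure command and enter the credentials and default Region when prompted. The values shown below are placeholders, not real credentials:

aws configure
AWS Access Key ID [None]: AKIAEXAMPLEACCESSKEY
AWS Secret Access Key [None]: examplesecretaccesskey
Default region name [None]: us-east-1
Default output format [None]: json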
Then, to upload a large file, run a command similar to the following:
aws s3 cp cat.png s3://awsexamplebucket
Note: In this example, the file must be in the same directory that you're running the command from.
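If the file is stored somewhere else, you can pass its full path and, optionally, a destination key name instead. For example, the following command is a sketch that assumes a file at /tmp/large-file.zip and a bucket named awsexamplebucket:

aws s3 cp /tmp/large-file.zip s3://awsexamplebucket/backups/large-file.zip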
When you run a high-level (aws s3) command such as aws s3 cp, the AWS CLI automatically performs a multipart upload when the object is large. In a multipart upload, a large file is split into multiple parts that are uploaded to Amazon S3 separately. After all the parts are uploaded, Amazon S3 combines them into a single object. A multipart upload can result in faster uploads and a lower chance of failure with large files.
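If needed, you can adjust when the AWS CLI switches to a multipart upload and how large each part is by setting the multipart_threshold and multipart_chunksize values in the AWS CLI S3 configuration. The following commands are a sketch with example sizes; adjust the values for your files and network:

aws configure set default.s3.multipart_threshold 64MB
aws configure set default.s3.multipart_chunksize 16MB

With these example settings, objects of 64 MB or larger are uploaded in 16 MB parts.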
For more information on multipart uploads using high-level (aws s3) commands or low-level (aws s3api) commands, see How do I use the AWS CLI to perform a multipart upload of a file to Amazon S3?
Note: For a full list of AWS SDKs and programming toolkits for developing and managing applications, see Tools to Build on AWS.