
AWS S3 Cp [Copy] Command Overview with Examples


Working effectively with S3 requires moving data in and out of S3 buckets efficiently, supporting use cases such as web hosting, content distribution, backups, archiving, media storage and streaming, and more. In this article, we will explore how to use the aws s3 cp command to transfer data between your local filesystem and S3 buckets.

We will cover:

  1. What is aws s3 cp and what does it do?
  2. Examples: How to use aws s3 cp command
  3. How to use the aws s3 cp command flags
  4. Uploading a local file stream to S3
  5. Downloading an S3 object as a local file stream
  6. Uploading to an S3 access point
  7. Downloading from an S3 access point
  8. aws s3 cp vs sync

Before learning about the aws s3 cp command, make sure the AWS CLI is installed. If it is not, here’s a quick guide to help you get started.
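
If you are unsure whether the CLI is already set up, a quick sanity check from the terminal looks like this:

# Print the installed AWS CLI version
aws --version

# Configure credentials and a default region if you have not done so yet
aws configure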

What is aws s3 cp and what does it do?

The aws s3 cp command allows you to copy files to and from Amazon S3 buckets. It is used for uploading, downloading, and moving data efficiently in and across AWS S3 storage environments.

Below is the syntax of the cp command:

aws s3 cp <source> <target> [--options]

The cp command is straightforward: it requires only a source and a target, optionally followed by options.

Note: The source and destination cannot both be local, i.e., the aws s3 cp command cannot be used to copy files from one location on your local filesystem to another.

Examples: How to use aws s3 cp command

With that out of the way, let’s learn the common use cases of copying files to and from an S3 bucket.

Example 1: Copying a local file to S3

To copy a single file from the current directory to an S3 bucket, specify the filename followed by the name of the S3 bucket.

Remember, the bucket name must be prefixed with s3://.

aws s3 cp file1.txt s3://aws-s3-cp-tutorial

If you want to copy a file with a different name, simply add its name to the destination path. For example, the following command copies file1.txt as robot.txt to the aws-s3-cp-tutorial bucket:

aws s3 cp file1.txt s3://aws-s3-cp-tutorial/robot.txt

Example 2: Copying an object from one S3 bucket to another

To copy a file from one S3 bucket to another, replace the source with the source S3 bucket name followed by the path to the file, and the destination with the name of the destination S3 bucket where the file is to be copied.

Remember to prefix both the source and the destination bucket names with s3://.

aws s3 cp s3://aws-s3-cp-tutorial/file1.txt s3://aws-s3-cp-tutorial-2

If you want to copy the file with a different name, add the desired file name to the destination S3 bucket path:

aws s3 cp s3://aws-s3-cp-tutorial/file1.txt s3://aws-s3-cp-tutorial-2/robot.txt

The aws s3 cp command can also be used to rename files within an S3 bucket. Set the same bucket as both the source and the destination and add the new file name to the destination path. 

Let’s see an example:

aws s3 cp s3://aws-s3-cp-tutorial/file1.txt s3://aws-s3-cp-tutorial/robot-2.txt

Example 3: Downloading (copying) a file from S3 to local

Downloading files from S3 is simply copying files from an S3 bucket to your machine.

For example, to download the robot.txt file from the aws-s3-cp-tutorial bucket, we use the aws s3 cp command with the S3 bucket name and the path to the file as the source, and the desired location on your machine as the destination.

aws s3 cp s3://aws-s3-cp-tutorial/robot.txt .

If you want to download the file with a different name, simply add the new file name to the destination path:

aws s3 cp s3://aws-s3-cp-tutorial/robot.txt ./random.txt

With the fundamentals in place, let’s explore how to extend the capabilities of the aws s3 cp command by learning how to use option flags.

How to use the aws s3 cp command flags

The aws s3 cp command can handle various use cases, from copying multiple files to applying access control lists (ACLs) and much more. By incorporating flags with the base aws s3 cp command, we can unlock additional functionality and cater to more advanced use cases.

Below are some of the important flags that often come in handy (a combined example follows the list):

cp
<LocalPath> <S3Uri> or <S3Uri> <LocalPath> or <S3Uri> <S3Uri>
[--recursive]: Enables recursive copying for directories.
[--include <value>]: Specifies patterns to include files for copying.
[--exclude <value>]: Specifies patterns to exclude files from copying.
[--dryrun]: Simulates the command execution without actually performing the copy operation.
[--acl <value>]: Sets the ACL (Access Control List) for the copied object.
[--grants <value> [<value>...]]: Grants specific permissions to users or groups.
[--storage-class <value>]: Specifies the storage class for the copied object.
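
To get a feel for how these flags combine before we go through them one by one, here is a sketch (assuming the same aws-s3-cp-tutorial bucket used throughout this tutorial) that recursively uploads the current directory, skips log files, sets a non-default storage class, and previews the result without copying anything:

aws s3 cp . s3://aws-s3-cp-tutorial --recursive --exclude "*.log" --storage-class STANDARD_IA --dryrun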

We will look at these individually to understand when and where to use them.

1. Copy multiple files (--recursive)

Local to S3

To copy all files in a directory, use the --recursive flag. For instance, to copy all files from the current directory to the aws-s3-cp-tutorial s3 bucket, use the following command:

aws s3 cp . s3://aws-s3-cp-tutorial/ --recursive

The --recursive flag also copies files from any sub-directories. 

For example, if the directory structure is as shown below, the same directory structure would be replicated in the S3 bucket.

s3-cp-local/
|-----file1.txt
|-----file2.txt
|-----file3.txt
|-----file4.txt
|-----file5.txt
|-----random.txt
|-----robot.txt
|-----sub-directory/
     |-----file1.txt
     |-----file2.txt

Note that files inside sub-directories are also copied, preserving the directory structure in the bucket.

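You can confirm that the structure was replicated by listing the bucket recursively:

aws s3 ls s3://aws-s3-cp-tutorial --recursive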

S3 to local

Similarly, the --recursive flag can be used to copy everything from an S3 bucket to the local file system, including sub-directories:

aws s3 cp s3://aws-s3-cp-tutorial/ . --recursive

S3 to S3

The --recursive flag works similarly when copying files from one S3 bucket to another.

aws s3 cp s3://aws-s3-cp-tutorial/ s3://aws-s3-cp-tutorial-2 --recursive

2. Exclude and include specific files (--exclude and --include)

When copying multiple files, the s3 cp command allows selecting specific files to include or exclude in the copy operation.

The --exclude flag enables the exclusion of certain files from the copy operation. The --include flag lets you include specific files in the copy operation, often used in conjunction with the --exclude flag. 

Let’s explore them with examples, keeping in mind the following directory structure for the working directory:

s3-cp-local/
|-----file1.txt
|-----robot.txt
|-----random.txt

Excluding files

To exclude a particular file, use the --exclude flag followed by the name of the file to be excluded. 

For example, to exclude the random.txt file when copying the entire current directory, execute the following command:

aws s3 cp . s3://aws-s3-cp-tutorial --recursive --exclude random.txt

As expected, only the file1.txt and robot.txt files are copied, excluding random.txt.

Including a specific file

To include only a specific file while excluding everything else, use the following command. First, exclude all files with the --exclude flag, then include only the random.txt file with the --include flag:

aws s3 cp . s3://aws-s3-cp-tutorial --recursive --exclude "*" --include random.txt

As expected, only the random.txt file is copied.

Note: The order of flags is crucial in determining the final operation. For instance, switching the positions of the --include and --exclude flags alters the outcome:

aws s3 cp . s3://aws-s3-cp-tutorial --recursive --include random.txt --exclude "*"

In this case, no operation is performed as all files, including the one explicitly included, are excluded by the last --exclude flag.

3. Preview the changes made by aws s3 cp (--dryrun)

Sometimes, s3 cp operations can get complex, and you might be unsure of the expected changes. Or you might just want to double-check the changes before applying. 

In these cases, you can use the --dryrun flag. As the name suggests, it allows you to preview the changes before committing them. Simply append the --dryrun flag to any command to see the preview.

Let’s see the output of the command we ran earlier after appending the --dryrun flag.

aws s3 cp . s3://aws-s3-cp-tutorial --recursive --exclude "*" --include random.txt --dryrun

Upon execution, the command previews the output of uploading ./random.txt to s3://aws-s3-cp-tutorial/random.txt, enabling us to verify the expected results before making any changes. 
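
The dry run prints each planned operation prefixed with (dryrun), along the lines of:

(dryrun) upload: ./random.txt to s3://aws-s3-cp-tutorial/random.txt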

4. Access control using ACLs (--acl)

ACLs (Access Control Lists) are crucial in managing access to S3 buckets and the objects they contain. With the aws s3 cp command, you can set canned ACLs using the --acl flag, which accepts a range of values, including private, public-read, public-read-write, authenticated-read, aws-exec-read, bucket-owner-read, bucket-owner-full-control, and log-delivery-write.

Note: To use the --acl flag, the s3:PutObjectAcl permission must be included in the list of actions for your IAM policy. You can verify this using the following command:

aws iam get-user-policy --user-name myuser --policy-name mypolicy

The output should resemble the following:

{
   "UserName": "myuser",
   "PolicyName": "mypolicy",
   "PolicyDocument": {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Action": [
                   "s3:PutObject",
                   "s3:PutObjectAcl"
               ],
               "Resource": [
                   "arn:aws:s3:::mybucket/*"
               ],
               "Effect": "Allow",
               "Sid": "Stmt1234567891234"
           }
       ]
   }
}

Setting the Access Control List (ACL) on files being copied to an S3 bucket

To grant public read access to the files being copied to the S3 bucket, use the --acl flag to apply the public-read ACL on the file.

aws s3 cp file1.txt s3://aws-s3-cp-acl-tutorial --acl public-read

Public read access is granted to all users.

If the command fails with an access control error, make sure that the bucket allows setting ACLs for public access. Alternatively, you can try a different canned ACL that is not public.

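If ACLs are disabled on the bucket (the default for newly created buckets, which enforce bucket-owner object ownership), the two s3api calls below re-enable them. This is a minimal sketch assuming a bucket you control; relaxing public-access settings should always be a deliberate decision:

# Allow ACLs by switching Object Ownership away from BucketOwnerEnforced
aws s3api put-bucket-ownership-controls --bucket aws-s3-cp-acl-tutorial --ownership-controls 'Rules=[{ObjectOwnership=BucketOwnerPreferred}]'

# Stop blocking public ACLs at the bucket level
aws s3api put-public-access-block --bucket aws-s3-cp-acl-tutorial --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=true,RestrictPublicBuckets=true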

5. Set fine-grained grants on the files being copied (--grants)

Grants allow managing fine-grained access control in S3. The following cp command demonstrates the use of the --grants flag to grant read access to all authenticated users:

aws s3 cp file1.txt s3://aws-s3-cp-acl-tutorial --grants read=uri=http://acs.amazonaws.com/groups/global/AuthenticatedUsers

Additionally, it’s possible to apply multiple grants simultaneously. The following cp command grants read access to all authenticated users identified by a URI and full control to a specific user identified by their email address:

aws s3 cp file1.txt s3://aws-s3-cp-acl-tutorial --grants read=uri=http://acs.amazonaws.com/groups/global/AuthenticatedUsers full=emailAddress=omkar.birade@something.com

As expected, read access is granted to all authenticated users, and full control to the user identified by the email address.
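
To double-check which grants ended up on the object, you can inspect its ACL with s3api:

aws s3api get-object-acl --bucket aws-s3-cp-acl-tutorial --key file1.txt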

6. Specify the storage class for the files being copied (--storage-class)

To copy a file with a specific storage class, we can use the --storage-class flag. The accepted values for the storage class are STANDARD | REDUCED_REDUNDANCY | STANDARD_IA | ONEZONE_IA | INTELLIGENT_TIERING | GLACIER | DEEP_ARCHIVE | GLACIER_IR. STANDARD is the default storage class.

In the example below, file1.txt is copied to the aws-s3-cp-acl-tutorial bucket using the REDUCED_REDUNDANCY storage class:

aws s3 cp file1.txt s3://aws-s3-cp-acl-tutorial --storage-class REDUCED_REDUNDANCY

As expected, file1.txt is stored with the REDUCED_REDUNDANCY storage class.
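
You can verify the storage class of the uploaded object with a head-object call; the response includes a StorageClass field for objects stored in any class other than STANDARD:

aws s3api head-object --bucket aws-s3-cp-acl-tutorial --key file1.txt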

Uploading a local file stream to S3

The cp command supports uploading a local file stream from standard input to an S3 bucket. The example below copies standard input to the stream.txt object in the destination S3 bucket.

echo 'Hello! Welcome to the "aws s3 cp" tutorial!!!' | aws s3 cp - s3://aws-s3-cp-tutorial/stream.txt

It’s important to note that when uploading a local file stream larger than 50GB, the --expected-size option must be provided, or the upload may fail when it reaches the default part limit of 10,000. 

For example, for a file stream of 51GB (51 × 1024³ = 54,760,833,024 bytes), we can set the --expected-size as follows:

aws s3 cp - s3://mybucket/stream.txt --expected-size 54760833024

Downloading an S3 object as a local file stream

Similarly to uploading, we can download files from S3 as a local file stream. For example, the command below downloads the stream.txt file from an S3 bucket as a stream to the standard output.

aws s3 cp s3://aws-s3-cp-tutorial/stream.txt -

Note: Downloading as a stream is not currently compatible with the --recursive parameter.
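
Streaming to standard output is handy for piping an object straight into other tools without writing it to disk. For example, assuming a hypothetical logs.txt object in the tutorial bucket:

# Search an object's contents without saving it locally
aws s3 cp s3://aws-s3-cp-tutorial/logs.txt - | grep ERROR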

Uploading to an S3 access point

Access points are named network endpoints attached to S3 buckets. An Access Point alias provides the same functionality as an Access Point ARN and can be substituted for an S3 bucket name for data access.

The following command uploads file1.txt to S3 via the access-point-cp-tutorial access point:

aws s3 cp file1.txt s3://arn:aws:s3:eu-west-1:588626695133:accesspoint/access-point-cp-tutorial
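
Because an access point alias can stand in for a bucket name, the same upload can also be written with the alias (the alias below is a made-up placeholder; you can look up the real one with aws s3control get-access-point or in the S3 console):

aws s3 cp file1.txt s3://access-point-cp-tut-000000000000-s3alias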

Note: To successfully copy files to an S3 bucket using an access point, ensure that the access point policy allows the s3:PutObject action for your principal as shown below:

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Principal": {
               "AWS": "arn:aws:iam::123456789:user/omkar.birade"
           },
           "Action": [
               "s3:GetObject",
               "s3:PutObject"
           ],
           "Resource": "arn:aws:s3:eu-west-1:123456789:accesspoint/access-point-cp-tutorial/object/*"
       }
   ]
}

Downloading from an S3 access point

Access points can also be used to download files, provided the access point policy allows the s3:GetObject action. The following command downloads file1.txt to the local filesystem using the access-point-cp-tutorial access point:

aws s3 cp s3://arn:aws:s3:eu-west-1:213473892479:accesspoint/access-point-cp-tutorial/file1.txt .

As we approach the end of the article, one last thing to learn is how aws s3 cp is different from aws s3 sync.

aws s3 cp vs sync

The difference between aws s3 cp and aws s3 sync lies in their behavior when copying files:

aws s3 sync recursively copies new and updated files from the source directory to the destination. It does not copy existing unchanged files, and it only creates folders in the destination if they contain one or more files. aws s3 cp --recursive, on the other hand, copies all files and folders from the source to the destination, overwriting any existing files. However, it does not delete any files from the destination that no longer exist in the source.

When using aws s3 sync with the --delete flag, it deletes any files from the destination that have been deleted from the source.
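
A safe way to see the difference in practice is to dry-run both commands against the same destination (same tutorial bucket assumed):

# sync uploads only new or changed files; --delete also removes files missing from the source
aws s3 sync . s3://aws-s3-cp-tutorial --delete --dryrun

# cp --recursive re-uploads everything, overwriting existing objects
aws s3 cp . s3://aws-s3-cp-tutorial --recursive --dryrun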

In summary, the sync command is more efficient when you want the destination to reflect the exact changes made in the source, while cp is more suitable when you simply want to copy and overwrite files to the destination.

Key points

aws s3 cp is a robust command for transferring data to and from S3 buckets, providing a wide range of capabilities, from recursive copying to applying ACLs. Mastering the aws s3 cp command can significantly boost your productivity when working with S3.

In the meantime, go ahead and learn how a platform like Spacelift can help you and your organization fully manage cloud resources within minutes.

Spacelift is a CI/CD platform for infrastructure-as-code that supports tools like Terraform, Pulumi, Kubernetes, and more. For example, it enables policy-as-code, which lets you define policies and rules that govern your infrastructure automatically. You can even invite your security and compliance teams to collaborate on and approve certain workflows and policies for parts that require a more manual approach. You can check it for free by creating a trial account or booking a demo with one of our engineers.
