Migrating Data Between S3 Storage Buckets Using Rclone

Rclone is a command-line program for managing files on cloud storage. It is a feature-rich alternative to cloud vendors' web storage interfaces, supporting over 40 storage products, including Amazon S3.

What can Rclone do for you?

  • Backup (and encrypt) files to cloud storage

  • Restore (and decrypt) files from cloud storage

  • Mirror cloud data to other cloud services or locally

  • Migrate data to the cloud, or between cloud storage vendors

  • Mount multiple, encrypted, cached or diverse cloud storage as a disk

  • Analyze and account for data held on cloud storage using lsf, lsjson, size, ncdu

  • Union file systems together to present multiple local and/or cloud file systems as one
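
A few of these capabilities map directly to single commands. As a rough sketch (the remote name, bucket, and drive letter below are placeholders, and mounting on Windows additionally requires WinFsp):

  :: back up a local folder to a bucket, copying only new or changed files
  rclone copy C:\data remote:backup-bucket

  :: mount a remote as a local drive
  rclone mount remote:bucket X: --vfs-cache-mode writes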


Download

Navigate to the Rclone downloads page (https://rclone.org/downloads/) and download Rclone to your local machine.


Usage

The following documentation provides an example of using Rclone on a Windows machine.

Command prompt

  1. Go to the downloaded archive and extract it to the location where you want Rclone to reside on the computer.

  2. cd into the extracted directory in the command prompt, e.g. cd C:\**\**\***\***\rclone\rclone-v1.59.0-windows-amd64

  3. Type any of the rclone commands, e.g. rclone config
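
For example, assuming Rclone was extracted to C:\rclone\rclone-v1.59.0-windows-amd64 (a placeholder path), a first session might look like this:

  cd C:\rclone\rclone-v1.59.0-windows-amd64

  :: confirm the binary runs and print its version
  rclone version

  :: start the interactive configuration wizard
  rclone config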

Rclone GUI

  1. Go to the downloaded archive and extract it to the location where you want Rclone to reside on the computer.

  2. cd into the extracted directory in the command prompt, e.g. cd C:\**\**\***\***\rclone\rclone-v1.59.0-windows-amd64

  3. Type rclone rcd --rc-web-gui to launch the web GUI.
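
By default this starts a local web server, downloads the GUI assets on first run, and opens the GUI in your browser. If you would rather set your own login credentials than use the generated ones, rclone accepts --rc-user and --rc-pass (the values below are placeholders):

  rclone rcd --rc-web-gui --rc-user admin --rc-pass s3cret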

Using Rclone with S3

Note: To transfer objects between S3 buckets, follow steps 1 - 18 below twice: once to create a remote for the source bucket and once for the destination bucket.

  1. cd into the rclone directory

  2. Enter rclone config and you will be asked whether you want to create a new remote, delete a remote, rename a remote, copy a remote, etc.

  3. Enter the remote name

  4. Choose your storage option number. Since we are working with S3, we select number 5 and enter it at the prompt

  5. Choose your S3 provider; in this scenario, we pick number 1

  6. You will be asked whether you want to provide S3 credentials or have them fetched from the environment

  7. Provide the access key

  8. Provide the secret key

  9. Provide the region

  10. Provide your S3 endpoint

  11. Provide the location constraint. This is needed only if you are creating a new bucket; otherwise, leave it blank

  12. Provide access control options

  13. Provide the server-side encryption option

  14. Provide the SSE-KMS key ID

  15. Provide the S3 storage class

  16. You will be asked whether you want to edit the advanced config. We will not do that in this documentation, but if you want to, enter y

  17. You are done with the setup. Enter y for yes to save your setup

  18. Quit the configuration setup
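
After you quit, rclone saves the remote to its configuration file, which you can inspect with rclone config show. A sketch of what a saved S3 remote might look like (the remote name, keys, and region below are placeholders):

  [formulatorsource]
  type = s3
  provider = AWS
  access_key_id = AKIA...
  secret_access_key = ****
  region = us-east-1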


Note: You will use the remotes created above with the commands below, e.g. to list all objects: rclone ls <remote-name>:<bucket-name>

You can use any of the commands below to copy from one S3 bucket to another (see the example after this list)

  • rclone sync: make source and dest identical, modifying destination only

    • e.g. rclone sync <source-remote>:<source-bucket-name> <destination-remote>:<destination-bucket-name> (rclone sync formulatorsource:formulatorazure formulatordestination:formulatora)

  • rclone copy: copy files from source to dest, skipping identical files

  • rclone move: move files from source to dest

  • rclone copyto: copy files from source to dest, skipping identical files; unlike copy, it can give a single file a new name at the destination
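
Because sync makes the destination match the source, deleting anything extra, it is worth previewing it first. A sketch using the remotes and buckets from the example above:

  :: preview what sync would change, without modifying anything
  rclone sync formulatorsource:formulatorazure formulatordestination:formulatora --dry-run

  :: run the sync for real, showing live progress
  rclone sync formulatorsource:formulatorazure formulatordestination:formulatora -P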

You can use any of the commands below to delete files and paths (Warning: these commands destroy data; do not run them without permission). A preview example follows the list.

  • rclone delete: remove the contents of path

    • e.g. rclone delete <remote-name>:<bucket-name> (rclone delete formulatorsource:todeleteaa)

  • rclone deletefile: remove a single file from the remote

  • rclone purge: remove the path and all of its contents

  • rclone rmdir: remove the path if it is empty

  • rclone rmdirs: remove any empty directories under the path
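
Since these commands destroy data, the same --dry-run flag is useful here too. A sketch using the example bucket from above (the object path in the second command is a placeholder):

  :: list what delete would remove, without removing anything
  rclone delete formulatorsource:todeleteaa --dry-run

  :: remove a single object
  rclone deletefile formulatorsource:todeleteaa/reports/old.csv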

You can use any of the commands below to list bucket objects (examples follow the list)

  • rclone ls: list the objects in the path with size and path

  • rclone lsd: list all directories/containers/buckets in the path

  • rclone lsf: list directories and objects in remote:path formatted for parsing

  • rclone lsjson: list the directories and objects in the path in JSON format

  • rclone lsl: list the objects in path with modification time, size and path
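
A sketch using the example remote and bucket from above:

  :: list every object in the bucket with its size in bytes
  rclone ls formulatorsource:formulatorazure

  :: list the buckets visible to the remote
  rclone lsd formulatorsource:

  :: machine-readable listing, useful for scripting
  rclone lsjson formulatorsource:formulatorazure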

You can confirm that the source bucket and destination bucket are in sync using this command

  • rclone check: check that the files in the source and destination match

    • e.g. 1: rclone check testa:formulatorazure testa:todeleteaa (in this example, the two buckets are not in sync)

    • e.g. 2: rclone check formulatorsource:formulatorazure formulatordestination:formulatora (in this example, the two buckets are in sync)
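
check prints a report of any missing or differing files and returns a non-zero exit code when the buckets differ. If you only need to verify that everything in the source made it to the destination, rclone check also accepts a --one-way flag:

  :: compare both directions (the default)
  rclone check formulatorsource:formulatorazure formulatordestination:formulatora

  :: only verify that files in the source exist in the destination
  rclone check formulatorsource:formulatorazure formulatordestination:formulatora --one-way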

To read more about rclone commands, see the Commands page on rclone.org: https://rclone.org/commands/