Migrating Between S3 Buckets
Migrating Data Between S3 Storage Buckets Using Rclone
Rclone is a command-line program to manage files on cloud storage. It is a feature-rich alternative to cloud vendors' web storage interfaces, supporting over 40 storage products including Amazon S3.
What can Rclone do for you?
Backup (and encrypt) files to cloud storage
Restore (and decrypt) files from cloud storage
Mirror cloud data to other cloud services or locally
Migrate data to the cloud, or between cloud storage vendors
Mount multiple, encrypted, cached or diverse cloud storage as a disk
Analyze and account for data held on cloud storage using lsf, lsjson, size, ncdu
Union file systems together to present multiple local and/or cloud file systems as one
Download
Navigate to the Rclone downloads page and download Rclone on your local machine.
Usage
The following documentation provides an example of using Rclone on a Windows machine.
Command prompt
Extract the downloaded archive to the folder where you want rclone to reside on the computer, then change into that folder in the command prompt:
cd <path-to-rclone-folder>
e.g. cd C:\**\**\***\***\rclone\rclone-v1.59.0-windows-amd64
Then type any of the rclone commands, e.g.
rclone config
Rclone GUI
Change into the extracted rclone folder in the command prompt, as in the previous section, e.g.
cd C:\**\**\***\***\rclone\rclone-v1.59.0-windows-amd64
and type
rclone rcd --rc-web-gui
Using Rclone with S3
Note: To transfer objects between S3 buckets, you will need to complete the configuration steps below for both the source bucket and the destination bucket.
In the command prompt, change into the rclone folder and enter
rclone config
and you will be asked whether you want to create a new remote, delete a remote, rename a remote, copy a remote, etc., as shown in the image below. Enter a name for the new remote
Choose your storage option number. Since we are working with S3, we select number 5 and enter it, as shown in the second image
Choose your S3 provider; in this scenario, we pick number 1
You will be asked whether you want to provide S3 credentials or want the credentials to be fetched from the environment
Provide access key
Provide secret key
Provide region
Provide your S3 endpoint
Provide a location constraint. This is only needed if you are creating a new bucket; otherwise, leave it blank
Provide access control options
Provide server-side encryption option
Provide the SSE-KMS key ID
Provide the S3 storage class
You will be asked whether you want to edit the advanced config. We will not do that in this documentation, but if you want to edit the advanced config, enter 'y'
You are done with the setup. Enter 'y' for yes to save your setup
Quit the configuration setup
Note: You will be using the remote created above with the commands below, e.g. to list all objects: rclone ls <remote-name>:<bucket-name>
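Once you quit, the wizard writes the remote into rclone's config file (run rclone config file to see its location). As a rough sketch, assuming a remote named formulatorsource on AWS with placeholder keys and an example region, the saved entry looks something like this:

```ini
[formulatorsource]
type = s3
provider = AWS
access_key_id = XXXXXXXXXXXXXXXXXXXX
secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXX
region = us-east-1
```

The exact keys present depend on the answers you gave during setup.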
You can use any of the commands below to copy from one S3 bucket to another:
rclone sync: make source and dest identical, modifying destination only, e.g.
rclone sync <source-remote-name>:<source-bucket-name> <destination-remote-name>:<destination-bucket-name> (rclone sync formulatorsource:formulatorazure formulatordestination:formulatora)
rclone copy: copy files from source to dest, skipping already copied
rclone move: move files from source to dest
rclone copyto: copy files from source to dest, skipping already copied
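Before a real migration, it is worth previewing the transfer. Below is a minimal sketch using the hypothetical remote and bucket names from the sync example above; the commands are echoed rather than executed so the snippet is safe to run as-is (and runs without rclone installed), and rclone's --dry-run flag keeps even the real invocation read-only:

```shell
# Hypothetical remote:bucket pairs -- substitute the names you configured.
SRC="formulatorsource:formulatorazure"
DST="formulatordestination:formulatora"

# --dry-run makes rclone report what it would copy or delete without doing it.
PREVIEW="rclone sync $SRC $DST --dry-run"

# Once the preview looks right, sync for real with progress output.
SYNC="rclone sync $SRC $DST --progress"

# Echoed here so the sketch runs without rclone installed; drop the echo to execute.
echo "$PREVIEW"
echo "$SYNC"
```

Running the dry run first is the simplest way to confirm you have the remote and bucket names the right way around before anything is modified.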
You can use any of the commands below to delete files and paths (Warning: please do not use any of these commands without permission):
rclone delete: remove the contents of path, e.g.
rclone delete <remote-name>:<bucket-name> (rclone delete formulatorsource:todeleteaa)
rclone deletefile: remove a single file from remote
rclone purge: remove the path and all of its contents
rclone rmdir: remove the path
rclone rmdirs: remove any empty directories under the path
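Because delete and purge are destructive, it helps to preview them too. A minimal sketch, assuming the hypothetical bucket todeleteaa from the example above; the command is echoed rather than executed so the snippet is safe to run as-is, and --dry-run keeps even the real invocation harmless:

```shell
# Hypothetical remote:bucket from the example above -- substitute your own.
TARGET="formulatorsource:todeleteaa"

# --dry-run makes rclone report what it would delete without deleting anything.
DEL="rclone delete $TARGET --dry-run"

# Echoed here so the sketch runs without rclone installed; drop the echo to execute.
echo "$DEL"
```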
You can use any of the commands below to list bucket objects:
rclone ls: list the objects in the path with size and path
rclone lsd: list all directories/containers/buckets in the path
rclone lsf: list directories and objects in remote:path formatted for parsing
rclone lsjson: list the directories and objects in the path in JSON format
rclone lsl: list the objects in path with modification time, size and path
You can confirm that the source bucket and destination bucket are in sync using this command:
rclone check: checks that the files in the source and destination match
e.g. 1:
rclone check testa:formulatorazure testa:todeleteaa
In the image below the two buckets are not in sync.
e.g. 2:
rclone check formulatorsource:formulatorazure formulatordestination:formulatora
In the image below the two buckets are in sync.
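rclone check exits with a non-zero status when the two sides differ, so a migration script can use it as a gate. A sketch assuming the hypothetical remote and bucket names from the examples above; the command is echoed so the snippet runs without rclone installed:

```shell
# Hypothetical remote:bucket pairs -- substitute your own.
SRC="formulatorsource:formulatorazure"
DST="formulatordestination:formulatora"

# --one-way only verifies that every source file exists in the destination,
# which is usually what you want right after a one-directional migration.
CHECK="rclone check $SRC $DST --one-way"
echo "$CHECK"

# In a real script you might gate on the exit status, e.g.:
#   if rclone check "$SRC" "$DST"; then echo in-sync; else echo out-of-sync; fi
```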
To read more about rclone commands, see Commands (rclone.org).