If you don’t have too many objects in your bucket (say, under 1 million), you can follow the approach below and save yourself 90% of the headache of following the official AWS guide.

The official AWS guide is here. It works great if you can follow it without problems.

The less complicated way is simply to copy the files down to your computer from source account A, then copy them back up to destination account B.
1. Install and configure the AWS CLI
If you have already downloaded and configured access keys for the AWS CLI, you can skip this step.
- Follow Amazon’s documentation to install AWS CLI version 2 for your operating system. It is strongly recommended to use version 2; while version 1 still receives updates, it lacks support for many newer features.
- In order to use the AWS CLI, you will need to generate access keys for your account. This gives your CLI installation programmatic access to your AWS account. In the AWS console, click on your username in the top-right and select My Security Credentials.
- Select Access keys (access key ID and secret access key), then click Create New Access Key.
- Select Download Key File and store the file somewhere safe. This file contains your access key ID and secret access key, which we will use in the next step. You will not be able to retrieve this file again from AWS after exiting the popup, so make sure to keep your local copy safe.
- Open the command line and run `aws configure` to set up your CLI installation. You will need the following information:
  - AWS Access Key ID: This is stored in the first line of the access key file you downloaded earlier.
  - AWS Secret Access Key: This is stored in the second line of the access key file you downloaded earlier.
  - Default region name: This is used when creating new AWS resources from the CLI (e.g. creating a new S3 bucket). You can set this to your preferred AWS region, or leave it blank if you do not intend to create resources from the command line.
  - Default output format: This sets the output format of command results. You can choose json, yaml, yaml-stream, text, or table. We recommend json for programmatic consumption or table for human-readability.
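Since we will be working with two accounts, it can help to configure a named profile for each one rather than relying on the default profile. As a sketch (the profile name, key values, and region below are placeholders, not real credentials):

```shell
# Configure a named profile for the source account.
# `aws configure` prompts interactively for each value:
aws configure --profile source-account
# AWS Access Key ID [None]: AKIA...          <- first line of the downloaded key file
# AWS Secret Access Key [None]: wJalr...     <- second line of the downloaded key file
# Default region name [None]: us-east-1
# Default output format [None]: json
```

Any later command can then target that account by appending `--profile source-account`.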
2. Sync files using the AWS CLI
Now that AWS CLI is configured in the local environment, let’s see how to sync files with an S3 bucket directly from the command line.
- In your terminal, navigate to your project’s output directory. This will be the root directory for a plain HTML/CSS/JS website (such as the one made in our S3 static site tutorial), `build/` for a React project, or `dist/` for a Vue project.
- Run `aws s3 sync . s3://<your-bucket-name>/`. This will sync all files from the current directory to your bucket’s root directory, uploading any that are outdated or missing in the bucket.
- The previous command does not delete any files that are present in the S3 bucket but not in your local directory. This prevents accidental file deletion but requires you to manually delete files from your bucket to free up space. Adding the `--delete` flag to the command disables this behavior; all files missing in the local directory but present in the S3 bucket will be deleted.
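Because `--delete` is destructive, it is worth previewing what a sync will do before running it. The AWS CLI supports a `--dryrun` flag for this; the bucket name below is a placeholder:

```shell
# Preview the sync without changing anything in the bucket
aws s3 sync . s3://my-static-site/ --delete --dryrun

# If the listed uploads and deletions look right, run it for real
aws s3 sync . s3://my-static-site/ --delete
```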
Congratulations! Your S3 bucket contents should now be synced with your local directory. Be sure your updates are reflected on your static site by viewing the site in your browser. You may need to clear your browser cache for the new static content to show up. If you are using AWS CloudFront as a content delivery network (CDN), you may need to invalidate the caches in your CloudFront distribution for the changes to show up on your site.
Remember to add `--profile myprofile1` to the command, where myprofile1 is a profile with access to source account A.

Similarly, we can move the files from your local directory up to account B, using a profile with access to that account.
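Putting the two halves together, the full cross-account migration is just two sync commands. As a sketch, with placeholder bucket names and a hypothetical myprofile2 for the destination account:

```shell
# 1. Copy everything from the bucket in source account A down to a local directory
aws s3 sync s3://source-bucket ./bucket-copy --profile myprofile1

# 2. Copy that local directory up to the bucket in destination account B
aws s3 sync ./bucket-copy s3://destination-bucket --profile myprofile2
```

Make sure you have enough local disk space for the full bucket contents before starting the download.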
In this article, we discussed using the AWS CLI to programmatically synchronize files between an AWS S3 bucket and a local directory, which makes it simple to move a bucket’s contents from one AWS account to another.