As companies move to the cloud, each cloud provider offers tools that make working with your cloud environment easier. Today I want to show you a use case for one of those tools, the AWS CLI: copying files from a server in my local data center to my AWS S3 bucket.



The AWS CLI (Command Line Interface) is a tool for managing your AWS services from the command prompt or PowerShell. This post assumes you've already installed the AWS CLI, but if you haven't, you can easily do so from the following link.



https://aws.amazon.com/cli/



Once the AWS CLI is installed, there are a few configuration steps to complete before you can start managing your resources. Running `aws configure` will prompt you for an AWS Access Key ID, the associated Secret Access Key, a default region name, and a default output format. In my example I am using PowerShell, but the same commands work from the command prompt.
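A minimal configuration session looks something like this (the key values and region below are placeholders, not real credentials — substitute your own):

```shell
PS> aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json
```

These values are stored in your user profile (under `~/.aws/` or `%UserProfile%\.aws\`) so you only need to enter them once.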






Now that we have the AWS CLI configured, we can start copying files! The AWS CLI makes working with S3 very easy via the aws s3 cp command, which uses the following syntax:



aws s3 cp <source> <destination>



The source and destination arguments can be local paths or S3 locations, so you can use this command to copy between your local and S3 or even between different S3 locations.
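For example, an object can be copied directly from one bucket to another without ever downloading it locally. A sketch, reusing the backup file from this post and a hypothetical second bucket named rdxarchive:

```shell
# Server-side copy between two S3 locations; no local download occurs.
# "rdxarchive" is a hypothetical destination bucket for illustration.
aws s3 cp s3://rdxtestbackup/SQLBackup/TestSpace_FULL_20190101_214230_1.bak s3://rdxarchive/SQLBackup/TestSpace_FULL_20190101_214230_1.bak
```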



You can copy an individual file…



aws s3 cp D:\SQLBackup\TestSpace\Full\TestSpace_FULL_20190101_214230_1.bak s3://rdxtestbackup/SQLBackup/TestSpace_FULL_20190101_214230_1.bak






…or all files in a folder.



aws s3 cp D:\SQLBackup\TestSpace\Full\ s3://rdxtestbackup/SQLBackup/ --recursive
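If the folder contains files you don't want to upload, the cp command also accepts --exclude and --include filters. A sketch using the same paths as above, uploading only .bak files:

```shell
# Filters are evaluated in order, so exclude everything first,
# then add back the .bak files.
aws s3 cp D:\SQLBackup\TestSpace\Full\ s3://rdxtestbackup/SQLBackup/ --recursive --exclude "*" --include "*.bak"
```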






Thanks for reading! I hope the AWS CLI makes it easier for you to copy files to your S3 bucket. Leave a comment below if you've used the tool before, and stay tuned for more AWS tips soon.