Gradle plugin that uploads and downloads S3 objects.
New way:

```groovy
plugins {
    id "com.github.mgk.gradle.s3" version "1.6.0"
}
```
Old way:

```groovy
buildscript {
    repositories {
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }
    dependencies {
        classpath "gradle.plugin.com.github.mgk.gradle:s3:1.6.0"
    }
}

apply plugin: "com.github.mgk.gradle.s3"
```
This project uses semantic versioning. See the Gradle plugin page for other versions.
The S3 plugin searches for credentials in the same order as the AWS default credentials provider chain. Additionally, you can specify a credentials profile to use by setting the project's `s3.profile` property:

```groovy
s3 {
    profile = 'my-profile'
}
```
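The `s3` extension can also set a default bucket that tasks fall back to when they do not set one themselves. A minimal sketch combining both settings (the bucket name is a placeholder):

```groovy
s3 {
    bucket = 'my-bucket'     // hypothetical bucket; used by tasks that omit `bucket`
    profile = 'my-profile'
}
```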
Setting the environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` is one way to provide your S3 credentials. See the AWS Docs for details on credentials.
The following Gradle tasks are provided. See the tests for examples.
`S3Upload` - upload a file to S3. Properties:

- `bucket` - S3 bucket to use (optional, defaults to the project's `s3` configured bucket)
- `file` - path of the file to be uploaded
- `key` - key of the S3 object to create
- `overwrite` - (optional, default is `false`); if `true`, the S3 object is created, or overwritten if it already exists

By default `S3Upload` does not overwrite the S3 object if it already exists. Set `overwrite` to `true` to upload the file even if it exists.
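As a sketch, an upload task using these properties might look like this (the bucket, file, and key values are placeholders):

```groovy
task uploadReport(type: S3Upload) {
    bucket = 'my-bucket'               // optional if the s3 extension sets a bucket
    file = 'build/reports/report.txt'  // local file to upload
    key = 'reports/report.txt'         // S3 object key to create
    overwrite = true                   // replace the object if it already exists
}
```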
`S3Download` - download one or more S3 objects. This task has two modes of operation: single-file download and recursive download. Properties that apply to both modes:

- `bucket` - S3 bucket to use (optional, defaults to the project's `s3` configured bucket)
For a single-file download:

- `key` - key of the S3 object to download
- `file` - local path of the file to save the download to
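For instance, a single-file download task might look like this (the bucket, key, and file values are placeholders):

```groovy
task downloadReport(type: S3Download) {
    bucket = 'my-bucket'                  // optional if the s3 extension sets a bucket
    key = 'reports/report.txt'            // S3 object to fetch
    file = 'build/downloads/report.txt'   // where to save it locally
}
```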
For a recursive download:

- `keyPrefix` - S3 prefix of the objects to download
- `destDir` - local directory to download the objects to
Nota Bene: recursive downloads create a sparse directory tree containing the full `keyPrefix` under `destDir`. So with an S3 bucket containing the object keys:

```
top/foo/bar
top/README
```

a recursive download:

```groovy
task downloadRecursive(type: S3Download) {
    keyPrefix = "top/foo/"
    destDir = "local-dir"
}
```
results in this local tree:

```
local-dir/
└── foo
    └── bar
```
So only files under `top/foo` are downloaded, but their full S3 paths are appended to `destDir`. This differs from the behavior of the AWS CLI `aws s3 cp --recursive` command, which prunes the root of the downloaded objects. Use the flexible Gradle `Copy` task to prune the tree after downloading it. See `example/build.gradle` for an example.
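One way to prune the tree, sketched here with a hypothetical task name (the paths match the `downloadRecursive` example above; the actual example in `example/build.gradle` may differ):

```groovy
task pruneDownload(type: Copy) {
    dependsOn downloadRecursive
    // copy the contents of local-dir/foo into pruned-dir,
    // dropping the leading key-prefix directories
    from 'local-dir/foo'
    into 'pruned-dir'
}
```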
Downloads report percentage progress at the Gradle INFO level. Run Gradle with the `-i` option to see download progress.
The example project pulls the build from Maven Local. Run `./gradlew publishToMavenLocal` to deploy the plugin to the Maven Local repository. It also writes the current build version to `build/VERSION`. The file `example/build.gradle` serves as both tests and docs. The tests use the local build of the plugin.

The tests use a generated unique path in a test bucket to exercise all of the plugin's features. The test bucket has an AWS Object Expiration policy that automatically removes objects older than one day, so yay, no cleanup required.
The automated build uses an IAM access key that allows `listBucket`, `getObject`, and `putObject` on the test bucket only. The AWS credentials are configured as Travis environment variables. The access key ID is not a secret, so it is shown in the logs, while the secret access key is secret and not shown.