diff --git a/R/compose.R b/R/compose.R index 68541f4..96d43a9 100644 --- a/R/compose.R +++ b/R/compose.R @@ -11,7 +11,7 @@ #' @import assertthat #' @importFrom googleAuthR gar_api_generator #' @export -#' @seealso \href{Compose objects}{https://cloud.google.com/storage/docs/json_api/v1/objects/compose} +#' @seealso \href{https://cloud.google.com/storage/docs/json_api/v1/objects/compose}{Compose objects} #' #' @examples #' diff --git a/docs/CONTRIBUTING.html b/docs/CONTRIBUTING.html new file mode 100644 index 0000000..d2cf816 --- /dev/null +++ b/docs/CONTRIBUTING.html @@ -0,0 +1,162 @@ + + + +
+ + + + +Contributions to googleCloudStorageR are welcome from anyone and are best sent as pull requests on the GitHub repository. This page provides some instructions to potential contributors about how to add to the package.
+Contributions can be submitted as a pull request on GitHub by forking or cloning the repo, making changes and submitting the pull request.
The cloudyr project follows a consistent style guide across all of its packages. Please refer to this when editing package code.
Pull requests should involve only one commit per substantive change. This means if you change multiple files (e.g., code and documentation), these changes should be committed together. If you don’t know how to do this (e.g., you are making changes in the GitHub web interface) just submit anyway and the maintainer will clean things up.
All contributions must be submitted consistent with the package license (MIT).
Non-trivial contributions need to be noted in the Authors@R
field in the DESCRIPTION. Just follow the format of the existing entries to add your name (and, optionally, email address). Substantial contributions should also be noted in inst/CITATION
.
The cloudyr project uses roxygen code and documentation markup, so changes should be made to roxygen comments in the source code .R
files. If changes are made, roxygen needs to be re-run. The easiest way to do this is a command line call to: Rscript -e 'devtools::document()'
. Please resolve any roxygen errors before submitting a pull request.
Please run R CMD build googleCloudStorageR
and R CMD check googleCloudStorageR_VERSION.tar.gz
before submitting the pull request to check for any errors.
Some specific types of changes that you might make are:
+Bug fixes. Great!
Documentation-only changes (e.g., to Rd files, README, vignettes). This is great! All contributions are welcome.
New functionality. This is fine, but should be discussed on the GitHub issues page before submitting a pull request.
Changes requiring a new package dependency should also be discussed on the GitHub issues page before submitting a pull request.
Message translations. These are very appreciated! The format is a pain, but if you’re doing this I’m assuming you’re already familiar with it.
Any questions you have can be opened as GitHub issues or directed to thosjleeper (at) gmail.com.
+ + +YEAR: 2017 +COPYRIGHT HOLDER: Sunholo Ltd. ++ +
Edmondson M (????). googleCloudStorageR: R Interface with Google Cloud Storage. -R package version . +R package version 0.5.0.
@Manual{, title = {googleCloudStorageR: R Interface with Google Cloud Storage}, author = {Mark Edmondson}, - note = {R package version }, + note = {R package version 0.5.0}, }
Mark Edmondson. Author, maintainer. +
gcs_upload()
will use file extension of name
in its temporary file (#91)gcs_copy_object()
+gcs_compose_objects()
+gcs_list_objects()
to googleAuthR > 0.7 gar_api_page()
+gcs_save_all
and gcs_load_all
which will zip, save/load and upload/download a directory_gcssave.yaml
file to control gcs_first/last
behaviourgs://
style URLs for object names (#57 - thanks seandavi)gcs_get_bucket()
to only expect length 1 character vectors for bucket name. (#60)gcs_object_list
(#58 - thanks @G3rtjan)saveToDisk
option to gcs_load
(#52 - thanks @tomsing1)gcs_get_object()
(#63 - thanks @nkeriks)prefix
and delimiter
in gcs_object_list
to filter objects listed (#68)gcs_save
to store R session data in cloudgcs_load
to restore session data stored with gcs_save
options(googleAuthR.rawResponse = TRUE)
when using gcs_get_object
object_name
in gcs_get_object
etc.Object.Rd
Object Object
+Object(acl = NULL, bucket = NULL, cacheControl = NULL, componentCount = NULL, contentDisposition = NULL, contentEncoding = NULL, contentLanguage = NULL, contentType = NULL, crc32c = NULL, customerEncryption = NULL, etag = NULL, generation = NULL, id = NULL, md5Hash = NULL, mediaLink = NULL, - metadata = NULL, metageneration = NULL, name = NULL, owner = NULL, - selfLink = NULL, size = NULL, storageClass = NULL, timeCreated = NULL, - timeDeleted = NULL, updated = NULL)+ metadata = NULL, metageneration = NULL, name = NULL, + owner = NULL, selfLink = NULL, size = NULL, storageClass = NULL, + timeCreated = NULL, timeDeleted = NULL, updated = NULL) -
new_user | -If TRUE, reauthenticate via Google login screen |
- ||
---|---|---|---|
no_auto | -Will ignore auto-authentication settings if TRUE |
+ json_file | +Authentication json file you have downloaded from your Google Project |
Invisibly, the token that has been saved to the session
-If you have set the environment variable GCS_AUTH_FILE
to a valid file location,
- the function will look there for authentication details.
-Otherwise it will look in the working directory for the `.httr-oauth` file, which if not present
- will trigger an authentication flow via Google login screen in your browser.
If GCS_AUTH_FILE
is specified, then gcs_auth()
will be called upon loading the package
- via library(googleCloudStorageR)
,
- meaning that calling this function yourself at the start of the session won't be necessary.
- GCS_AUTH_FILE
can be either a token generated by gar_auth or
- service account JSON ending with file extension .json
The best way to authenticate is to use an environment variable pointing at your authentication file.
+Set the file location of your downloaded Google Project JSON file in a GCS_AUTH_FILE
environment variable
Then, when you load the library you should auto-authenticate
+However, you can authenticate directly using this function pointing at your JSON auth file.
++# NOT RUN { +library(googleCloudStorageR) +gcs_auth("location_of_json_file.json") +# }
gcs_compose_objects.Rd
This merges objects stored on Cloud Storage into one object.
+ +gcs_compose_objects(objects, destination, + bucket = gcs_get_global_bucket())+ +
objects | +A character vector of object names to combine |
+
---|---|
destination | +Name of the new object. |
+
bucket | +The bucket where the objects sit |
+
Object metadata
+ +Other object functions: gcs_copy_object
,
+ gcs_delete_object
,
+ gcs_get_object
,
+ gcs_list_objects
,
+ gcs_metadata_object
++# NOT RUN { + gcs_global_bucket("your-bucket") + objs <- gcs_list_objects() + + compose_me <- objs$name[1:30] + + gcs_compose_objects(compose_me, "composed/test.json") + +# }
gcs_copy_object.Rd
Copies an object to a new destination
+ +gcs_copy_object(source_object, destination_object, + source_bucket = gcs_get_global_bucket(), + destination_bucket = gcs_get_global_bucket(), rewriteToken = NULL, + destinationPredefinedAcl = NULL)+ +
source_object | +The name of the object to copy, or a |
+
---|---|
destination_object | +The name of where to copy the object to, or a |
+
source_bucket | +The bucket of the source object |
+
destination_bucket | +The bucket of the destination |
+
rewriteToken | +Include this field (from the previous rewrite response) on each rewrite request after the first one, until the rewrite response 'done' flag is true. |
+
destinationPredefinedAcl | +Apply a predefined set of access controls to the destination object. If not NULL must be one of the predefined access controls such as |
+
If successful, a rewrite object.
+ +Other object functions: gcs_compose_objects
,
+ gcs_delete_object
,
+ gcs_get_object
,
+ gcs_list_objects
,
+ gcs_metadata_object
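As a hedged sketch of a copy between locations (not run: it needs an authenticated session, and the bucket and object names below are placeholders):

```r
library(googleCloudStorageR)

# NOT RUN: requires authentication and an existing object.
# Bucket and object names are placeholders.
gcs_global_bucket("my-bucket")

# copy within the default bucket; returns a rewrite object when done
gcs_copy_object("reports/2019.csv", "backup/reports/2019.csv")
```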
gcs_create_bucket.Rd
Create a new bucket in your project
+gcs_create_bucket(name, projectId, location = "US", - storageClass = c("MULTI_REGIONAL", "REGIONAL", "STANDARD", "NEARLINE", + storageClass = c("MULTI_REGIONAL", "REGIONAL", "STANDARD", "NEARLINE", "COLDLINE", "DURABLE_REDUCED_AVAILABILITY"), - predefinedAcl = c("projectPrivate", "authenticatedRead", "private", + predefinedAcl = c("projectPrivate", "authenticatedRead", "private", "publicRead", "publicReadWrite"), - predefinedDefaultObjectAcl = c("bucketOwnerFullControl", "bucketOwnerRead", - "authenticatedRead", "private", "projectPrivate", "publicRead"), - projection = c("noAcl", "full"), versioning = FALSE, lifecycle = NULL)+ predefinedDefaultObjectAcl = c("bucketOwnerFullControl", + "bucketOwnerRead", "authenticatedRead", "private", "projectPrivate", + "publicRead"), projection = c("noAcl", "full"), versioning = FALSE, + lifecycle = NULL) -
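A minimal not-run sketch of creating a bucket (the bucket name and project-id are placeholders, and bucket names must be globally unique):

```r
library(googleCloudStorageR)

# NOT RUN: requires authentication; names are placeholders.
gcs_create_bucket("my-unique-bucket-name", projectId = "my-project",
                  location = "EU")
```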
topic | +The pub/sub topic name |
+
---|---|
project | +The project-id that has the pub/sub topic |
+
bucket | +The bucket for notifications |
+
event_types | +What events to activate, leave at default for all |
+
Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects.
+As a minimum you will need: the Cloud Pub/Sub API activated for the project;
+sufficient permissions on the bucket you wish to monitor;
+sufficient permissions on the project to receive notifications;
+an existing pub/sub topic;
+have given your service account at least pubsub.publisher
permission.
https://cloud.google.com/storage/docs/reporting-changes
+Other pubsub functions: gcs_delete_pubsub
,
+ gcs_get_service_email
,
+ gcs_list_pubsub
++# NOT RUN { +project <- "myproject" +bucket <- "mybucket" + +# get the email to give access +gcs_get_service_email(project) + +# once email has access, create a new pub/sub topic for your bucket +gcs_create_pubsub("gcs_r", project, bucket) + +# }+ +
gcs_delete_bucket.Rd
Delete the bucket, and all its objects
+gcs_delete_bucket(bucket, ifMetagenerationMatch = NULL, ifMetagenerationNotMatch = NULL)-
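Since this call is destructive, a hedged not-run sketch only (the bucket name is a placeholder):

```r
library(googleCloudStorageR)

# NOT RUN: irreversible -- removes the bucket and its contents.
# The bucket name is a placeholder.
gcs_delete_bucket("old-bucket")
```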
object_name | -Object to be deleted |
+ Object to be deleted, or a |
||||||||||||||||||||||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
bucket | @@ -132,9 +155,11 @@
config_name | +A name of a configuration |
+
---|---|
bucket | +The bucket for notifications |
+
Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects.
+As a minimum you will need: the Cloud Pub/Sub API activated for the project;
+sufficient permissions on the bucket you wish to monitor;
+sufficient permissions on the project to receive notifications;
+an existing pub/sub topic; have given your service account at least pubsub.publisher
permission.
https://cloud.google.com/storage/docs/reporting-changes
+Other pubsub functions: gcs_create_pubsub
,
+ gcs_get_service_email
,
+ gcs_list_pubsub
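A not-run sketch of removing a notification configuration (the configuration and bucket names are placeholders):

```r
library(googleCloudStorageR)

# NOT RUN: names are placeholders.
# list existing notification configurations first
gcs_list_pubsub(bucket = "my-bucket")

# then delete one by its configuration name
gcs_delete_pubsub("config-name", bucket = "my-bucket")
```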
gcs_download_url.Rd
Create the download URL for objects in buckets
+gcs_download_url(object_name, bucket = gcs_get_global_bucket(), public = FALSE)-
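A not-run sketch of building download URLs (object and bucket names are placeholders):

```r
library(googleCloudStorageR)

# NOT RUN: names are placeholders.
gcs_global_bucket("my-bucket")

# authenticated-style download URL
gcs_download_url("report.csv")

# public URL, for objects whose ACL grants allUsers READER access
gcs_download_url("report.csv", public = TRUE)
```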
project | +The project name containing the bucket |
+
---|
This service email can be different from the email in the service JSON. Give this
+pubsub.publisher
permission in the Google Cloud console.
Other pubsub functions: gcs_create_pubsub
,
+ gcs_delete_pubsub
,
+ gcs_list_pubsub
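A not-run sketch (the project-id is a placeholder):

```r
library(googleCloudStorageR)

# NOT RUN: project-id is a placeholder.
# Returns the service email to grant pubsub.publisher permission to.
gcs_get_service_email("my-project")
```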
gcs_global_bucket.Rd
Set a bucket name used for this R session
+gcs_global_bucket(bucket)-
bucket | +The bucket for notifications |
+
---|
Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects.
+As a minimum you will need: the Cloud Pub/Sub API activated for the project;
+sufficient permissions on the bucket you wish to monitor;
+sufficient permissions on the project to receive notifications;
+an existing pub/sub topic; have given your service account at least pubsub.publisher
permission.
https://cloud.google.com/storage/docs/reporting-changes
+Other pubsub functions: gcs_create_pubsub
,
+ gcs_delete_pubsub
,
+ gcs_get_service_email
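A not-run sketch of setting the session default bucket (the name is a placeholder):

```r
library(googleCloudStorageR)

# NOT RUN: bucket name is a placeholder.
gcs_global_bucket("my-bucket")

# later calls such as gcs_list_objects() can now omit the bucket argument
gcs_get_global_bucket()
```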
gcs_load.Rd
Load R objects that have been saved using gcs_save or gcs_save_image
+gcs_load(file = ".RData", bucket = gcs_get_global_bucket(), envir = .GlobalEnv, saveToDisk = file, overwrite = TRUE)-
object_name | -Name of the object. GCS uses this version if also set elsewhere. |
+ Name of the object. GCS uses this version if also set elsewhere, or a |
+
---|---|---|
metadata | +User-provided metadata, in key/value pairs |
+ |
md5Hash | +MD5 hash of the data; encoded using base64 |
+ |
crc32c | +CRC32c checksum, as described in RFC 4960, Appendix B; encoded using base64 in big-endian byte order |
+ |
contentLanguage | +Content-Language of the object data |
+ |
contentEncoding | +Content-Encoding of the object data |
+ |
contentDisposition | +Content-Disposition of the object data |
+ |
cacheControl | +Cache-Control directive for the object data |
Other object functions: gcs_delete_object
,
+
Other object functions: gcs_compose_objects
,
+ gcs_copy_object
,
+ gcs_delete_object
,
gcs_get_object
,
- gcs_list_objects
gcs_list_objects
gcs_parse_download.Rd
Wrapper for httr
's content. This is the default function used in gcs_get_object
Wrapper for httr
's content. This is the default function used in gcs_get_object
gcs_parse_download(object, encoding = "UTF-8")-
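A not-run sketch of supplying a custom parse function in place of this default (object and bucket names are placeholders; the parseFunction argument is assumed per gcs_get_object's documentation):

```r
library(googleCloudStorageR)

# NOT RUN: names are placeholders.
# A custom parser receives the httr response object.
parse_as_text <- function(object) {
  httr::content(object, as = "text", encoding = "UTF-8")
}

gcs_get_object("data.csv", bucket = "my-bucket",
               parseFunction = parse_as_text)
```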
expiration_ts | -A timestamp of class |
+ A timestamp of class |
|||||
---|---|---|---|---|---|---|---|
verb | @@ -140,9 +164,9 @@
... | -Passed to source |
+ Passed to source |
---|
Other R session data functions: gcs_load
,
+
Other R session data functions: gcs_load
,
gcs_save_all
, gcs_save_image
,
- gcs_save
gcs_save
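The R session data family above can be sketched together (not run; the bucket name is a placeholder):

```r
library(googleCloudStorageR)

# NOT RUN: bucket name is a placeholder.
gcs_global_bucket("my-bucket")

obj <- data.frame(x = 1:3)
gcs_save(obj, file = "obj.RData")   # save named objects to the bucket
rm(obj)
gcs_load("obj.RData")               # restore them into the global environment
```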
gcs_update_object_acl.Rd
Updates Google Cloud Storage ObjectAccessControls
+gcs_update_object_acl(object_name, bucket = gcs_get_global_bucket(), - entity = "", entity_type = c("user", "group", "domain", "project", - "allUsers", "allAuthenticatedUsers"), role = c("READER", "OWNER"))+ entity = "", entity_type = c("user", "group", "domain", "project", + "allUsers", "allAuthenticatedUsers"), role = c("READER", "OWNER")) -
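A not-run sketch of granting public read access to one object (names are placeholders):

```r
library(googleCloudStorageR)

# NOT RUN: names are placeholders.
gcs_update_object_acl("report.csv", bucket = "my-bucket",
                      entity_type = "allUsers", role = "READER")
```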
bucket | +gcs bucket |
+
---|---|
action | +"status", "enable", "disable", or "list" |
+
versioned_objects dataframe #only if "list" action
+ + +googleCloudStorageR.Rd
Interact with Google Cloud Storage API in R. Part of the 'cloudyr' project.
+