diff --git a/R/compose.R b/R/compose.R
index 68541f4..96d43a9 100644
--- a/R/compose.R
+++ b/R/compose.R
@@ -11,7 +11,7 @@
 #' @import assertthat
 #' @importFrom googleAuthR gar_api_generator
 #' @export
-#' @seealso \href{Compose objects}{https://cloud.google.com/storage/docs/json_api/v1/objects/compose}
+#' @seealso \href{https://cloud.google.com/storage/docs/json_api/v1/objects/compose}{Compose objects}
 #'
 #' @examples
 #'

diff --git a/docs/CONTRIBUTING.html b/docs/CONTRIBUTING.html
new file mode 100644
index 0000000..d2cf816
--- /dev/null
+++ b/docs/CONTRIBUTING.html
Contributions to googleCloudStorageR are welcome from anyone and are best sent as pull requests on the GitHub repository. This page provides some instructions to potential contributors about how to add to the package.

  1. Contributions can be submitted as a pull request on GitHub by forking or cloning the repo, making changes and submitting the pull request.
  2. The cloudyr project follows a consistent style guide across all of its packages. Please refer to this when editing package code.
  3. Pull requests should involve only one commit per substantive change. This means if you change multiple files (e.g., code and documentation), these changes should be committed together. If you don't know how to do this (e.g., you are making changes in the GitHub web interface) just submit anyway and the maintainer will clean things up.
  4. All contributions must be submitted consistent with the package license (MIT).
  5. Non-trivial contributions need to be noted in the Authors@R field in the DESCRIPTION. Just follow the format of the existing entries to add your name (and, optionally, email address). Substantial contributions should also be noted in inst/CITATION.
  6. The cloudyr project uses roxygen code and documentation markup, so changes should be made to the roxygen comments in the source code .R files. If changes are made, roxygen needs to be rerun. The easiest way to do this is a command-line call to Rscript -e 'devtools::document()' (see the sketch after this list). Please resolve any roxygen errors before submitting a pull request.
  7. Please run R CMD build googleCloudStorageR and R CMD check googleCloudStorageR_VERSION.tar.gz before submitting the pull request to check for any errors.
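For example, a minimal sketch from an R session in the package root (assuming devtools is installed):

    ## regenerate the Rd files from the roxygen comments
    devtools::document()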

Some specific types of changes that you might make are:

  1. Bug fixes. Great!
  2. Documentation-only changes (e.g., to Rd files, README, vignettes). This is great! All contributions are welcome.
  3. New functionality. This is fine, but should be discussed on the GitHub issues page before submitting a pull request.
  4. Changes requiring a new package dependency should also be discussed on the GitHub issues page before submitting a pull request.
  5. Message translations. These are very appreciated! The format is a pain, but if you're doing this I'm assuming you're already familiar with it.

Any questions you have can be opened as GitHub issues or directed to thosjleeper (at) gmail.com.

diff --git a/docs/LICENSE-text.html b/docs/LICENSE-text.html
new file mode 100644
index 0000000..47e8d9e
--- /dev/null
+++ b/docs/LICENSE-text.html
+YEAR: 2017
+COPYRIGHT HOLDER: Sunholo Ltd.
+
diff --git a/docs/articles/googleCloudStorageR.html b/docs/articles/googleCloudStorageR.html
index 49e8224..fe44b3c 100644
--- a/docs/articles/googleCloudStorageR.html
+++ b/docs/articles/googleCloudStorageR.html

2019-07-27


R library for interacting with the Google Cloud Storage JSON API (api docs).

Setup

Google Cloud Storage charges you for storage (prices here).

You can use your own Google Project with a credit card added to create buckets, where the charges will apply. This can be done in the Google API Console.

Configuring your own Google Project

The instructions below apply when you visit the Google API console (https://console.developers.google.com/apis/).

For local use

  1. Click 'Create a new Client ID', and choose "Installed Application".
  2. Download the client ID JSON.
  3. Set the client ID via googleAuthR::gar_set_client():

     googleAuthR::gar_set_client("your-json-file.json")
For Shiny use

  1. Click 'Create a new Client ID', and choose "Web Application".
  2. Download the client ID JSON.
  3. Add the URL of where your Shiny app will run, with no port number, e.g. https://mark.shinyapps.io/searchConsoleRDemo/
  4. And/or also add localhost or 127.0.0.1 with a port number for local testing. Remember the port number you use, as you will need it later to launch the app, e.g. http://127.0.0.1:1221
  5. Set the web client ID via googleAuthR::gar_set_client():

     googleAuthR::gar_set_client(web_json = "your-json-file.json")

  6. To run the app locally, specify the port number you used in step 4, e.g. shiny::runApp(port = 1221), or set a shiny option to default to it: options(shiny.port = 1221), then launch via the Run App button in RStudio.
  7. Running on your Shiny Server will work only for the URL from step 3.
Activate API

  1. Click on "APIs".
  2. Select and activate the Cloud Storage JSON API.
  3. After loading the package via library(googleCloudStorageR), it will look to see if "https://www.googleapis.com/auth/devstorage.full_control" is set in getOption("googleAuthR.scopes.selected") and, if it is not, set it by adding to the existing scopes.
  4. Alternatively, set the googleAuthR option for the Google Cloud Storage scope after the library has been loaded but before authentication:

     options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")

Setting environment variables

By default, all cloudyr packages look for authentication credentials in environment variables. You can also use these to specify a default bucket and to auto-authenticate upon attaching the library. For example:

-
Sys.setenv("GCS_CLIENT_ID" = "mykey",
-           "GCS_CLIENT_SECRET" = "mysecretkey",
-           "GCS_WEB_CLIENT_ID" = "my-shiny-key",
-           "GCS_WEB_CLIENT_SECRET" = "my-shiny-secret-key",
-           "GCS_DEFAULT_BUCKET" = "my-default-bucket",
-           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")
+
Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket",
+           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")

These can alternatively be set on the command line or via an Renviron.site or .Renviron file (https://cran.r-project.org/web/packages/httr/vignettes/api-packages.html).

-
-
-

-Authentication

-

Authentication can be carried out each session via gcs_auth. The first time you run this you will be sent to a Google login prompt in your browser to allow the googleCloudStorageR project access (or the Google project you configure).

-

Once authenticated a file named .httr-oauth is saved to your working directory. On subsequent authentication this file will hold your authentication details, and you won’t need to go via the browser. Deleting this file, or setting new_user=TRUE will start the authentication flow again.

-
library(googleCloudStorageR)
-## first time this will send you to the browser to authenticate
-gcs_auth()
-
-## to authenticate with a fresh user, delete .httr-oauth or run with new_user=TRUE
-gcs_auth(new_user = TRUE)
-
-...call functions...etc...
-

Each new R session will need to run gcs_auth() to authenticate future API calls.

Auto-authentication

-

Alternatively, you can specify the location of a service account JSON file taken from your Google Project, or the location of a previously created .httr-oauth token in a system environment:

-
    Sys.setenv("GCS_AUTH_FILE" = "/fullpath/to/auth.json")
+

The best method for authentication is to use your own Google Cloud Project. You can specify the location of a service account JSON file taken from your Google Project, or the location of a previously created gcs.oauth token, in a system environment variable:

+
    Sys.setenv("GCS_AUTH_FILE" = "/fullpath/to/auth.json")

This file will then be used for authentication via gcs_auth() when you load the library:

-
## GCS_AUTH_FILE set so auto-authentication
-library(googleCloudStorageR)
-
-## no need for gcs_auth()
-gcs_get_bucket("your-bucket")
+ +
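A minimal sketch of the auto-authentication flow, assuming GCS_AUTH_FILE points at a valid service account key:

    ## GCS_AUTH_FILE is set, so authentication happens on load
    library(googleCloudStorageR)

    ## no explicit gcs_auth() call needed
    gcs_get_bucket("your-bucket")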

If using your own gcs.oauth file you will also need to set the client ID and secret. This is most easily done using googleAuthR::gar_set_client(); see its help for details.

@@ -137,103 +179,97 @@

Setting a default Bucket

To avoid specifying the bucket in the functions below, you can set the name of your default bucket via environment variables or via the function gcs_global_bucket(). See the Setting environment variables section for more details.

-
## set bucket via environment
-Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket")
-
-library(googleCloudStorageR)
-
-## optional, if you haven't set environment argument GCS_AUTH_FILE
-## gcs_auth()
-
-## check what the default bucket is
-gcs_get_global_bucket()
-[1] "my-default-bucket"
-
-## you can also set a default bucket after loading the library for that session
-gcs_global_bucket("your-default-bucket-2")
-gcs_get_global_bucket()
-[1] "my-default-bucket-2"
+
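A short sketch (the bucket names are placeholders):

    ## set the default bucket via an environment variable before loading
    Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket")
    library(googleCloudStorageR)

    ## check what the default bucket is
    gcs_get_global_bucket()
    #> [1] "my-default-bucket"

    ## or set a default bucket for the session after loading
    gcs_global_bucket("my-default-bucket-2")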

Downloading objects from Google Cloud storage

Once you have a Google project and created a bucket with an object in it, you can download it as below:

-
library(googleCloudStorageR)
-
-## optional, if you haven't set environment argument GCS_AUTH_FILE
-## gcs_auth()
-
-## get your project name from the API console
-proj <- "your-project"
-
-## get bucket info
-buckets <- gcs_list_buckets(proj)
-bucket <- "your-bucket"
-bucket_info <- gcs_get_bucket(bucket)
-bucket_info
-
-==Google Cloud Storage Bucket==
-Bucket:          your-bucket 
-Project Number:  1123123123 
-Location:        EU 
-Class:           STANDARD 
-Created:         2016-04-28 11:39:06 
-Updated:         2016-04-28 11:39:06 
-Meta-generation: 1 
-eTag:            Cxx=
-
-
-## get object info in the default bucket
-objects <- gcs_list_objects()
-
-## save directly to an R object (warning, don't run out of RAM if its a big object)
-## the download type is guessed into an appropriate R object
-parsed_download <- gcs_get_object(objects$name[[1]])
-
-## if you want to do your own parsing, set parseObject to FALSE
-## use httr::content() to parse afterwards
-raw_download <- gcs_get_object(objects$name[[1]], 
-                               parseObject = FALSE)
-
-## save directly to a file in your working directory
-## parseObject has no effect, it is a httr::content(req, "raw") download
-gcs_get_object(objects$name[[1]], saveToDisk = "csv_downloaded.csv")
+
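A sketch of the common download patterns, assuming a default bucket is set:

    library(googleCloudStorageR)

    ## get object info in the default bucket
    objects <- gcs_list_objects()

    ## download directly into an R object - the type is guessed and parsed
    ## (take care with RAM if it is a big object)
    parsed_download <- gcs_get_object(objects$name[[1]])

    ## set parseObject = FALSE to get the raw response for your own parsing
    raw_download <- gcs_get_object(objects$name[[1]], parseObject = FALSE)

    ## or save directly to a file in your working directory
    gcs_get_object(objects$name[[1]], saveToDisk = "csv_downloaded.csv")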

Uploading objects < 5MB

-

Objects can be uploaded via files saved to disk, or passed in directly if they are data frames or list type R objects. By default, data frames will be converted to CSV via write.csv(), lists to JSON via jsonlite::toJSON.

+

Objects can be uploaded via files saved to disk, or passed in directly if they are data frames or list type R objects. By default, data frames will be converted to CSV via write.csv(), lists to JSON via jsonlite::toJSON.

If you want to use other functions for transforming R objects, for example setting row.names = FALSE or using write.csv2, pass the function in via object_function.

-
## upload a file - type will be guessed from file extension or supply type  
-write.csv(mtcars, file = filename)
-gcs_upload(filename)
-
-## upload an R data.frame directly - will be converted to csv via write.csv
-gcs_upload(mtcars)
-
-## upload an R list - will be converted to json via jsonlite::toJSON
-gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))
-
-## upload an R data.frame directly, with a custom function
-## function should have arguments 'input' and 'output'
-## safest to supply type too
-f <- function(input, output) write.csv(input, row.names = FALSE, file = output)
-
-gcs_upload(mtcars, 
-           object_function = f,
-           type = "text/csv")
+
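A sketch covering the main upload variants:

    ## upload a file - the type is guessed from the file extension
    write.csv(mtcars, file = "mtcars.csv")
    gcs_upload("mtcars.csv")

    ## upload an R data.frame directly - converted to csv via write.csv()
    gcs_upload(mtcars)

    ## upload an R list - converted to JSON via jsonlite::toJSON()
    gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))

    ## custom transformation: the function must take 'input' and 'output'
    ## arguments; it is safest to supply the type too
    f <- function(input, output) write.csv(input, row.names = FALSE, file = output)
    gcs_upload(mtcars, object_function = f, type = "text/csv")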

Upload metadata

You can pass metadata with an object via the function gcs_metadata_object().

The name you pass to the metadata object will override the name if it is also set elsewhere.

-
meta <- gcs_metadata_object("mtcars.csv",
-                             metadata = list(custom1 = 2,
-                                             custom_key = 'dfsdfsdfsfs))
-                                             
-gcs_upload(mtcars, object_metadata = meta)
+
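For example (the custom metadata values are placeholders):

    meta <- gcs_metadata_object("mtcars.csv",
                                metadata = list(custom1 = 2,
                                                custom_key = "xyz"))

    gcs_upload(mtcars, object_metadata = meta)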

@@ -241,197 +277,193 @@

If the file/object is under 5MB, simple uploads are used.

For files > 5MB, resumable uploads are used. This allows you to upload up to 5TB.

If the connection is interrupted while uploading, gcs_upload will retry 3 times. If it still fails, it returns a Retry object that lets you try again later from where the upload stopped, via gcs_retry_upload().

-
## write a big object to a file
-big_file <- "big_filename.csv"
-write.csv(big_object, file = big_file)
-
-## attempt upload
-upload_try <- gcs_upload(big_file)
-
-## if successful, upload_try is an object metadata object
-upload_try
-==Google Cloud Storage Object==
-Name:            "big_filename.csv" 
-Size:            8.5 Gb 
-Media URL        https://www.googleapis.com/download/storage/v1/b/xxxx 
-Bucket:          your-bucket 
-ID:              your-bucket/"test.pdf"/xxxx
-MD5 Hash:        rshao1nxxxxxY68JZQ== 
-Class:           STANDARD 
-Created:         2016-08-12 17:33:05 
-Updated:         2016-08-12 17:33:05 
-Generation:      1471023185977000 
-Meta Generation: 1 
-eTag:            CKi90xxxxxEAE= 
-crc32c:          j4i1sQ== 
-
-
-## if unsuccessful after 3 retries, upload_try is a Retry object
-==Google Cloud Storage Upload Retry Object==
-File Location:     big_filename.csv
-Retry Upload URL:  http://xxxx
-Created:           2016-08-12 17:33:05 
-Type:              csv
-File Size:        8.5 Gb
-Upload Byte:      4343
-Upload remaining: 8.1 Gb
-
-## you can retry to upload the remaining data using gcs_retry_upload()
-try2 <- gcs_retry_upload(upload_try)
+
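A sketch of the retry flow (the file name is a placeholder):

    ## attempt to upload a large file
    upload_try <- gcs_upload("big_filename.csv")

    ## if it failed after 3 retries, upload_try is a Retry object;
    ## resume the remaining bytes from where the upload stopped
    try2 <- gcs_retry_upload(upload_try)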

Updating user access to objects

You can change who can access objects via gcs_update_object_acl(), setting the role to READER or OWNER for a user, group, domain or project, or for the public (all users or all authenticated users).

By default you are “OWNER” of all the objects and buckets you upload and create.

-
## update access of object to READER for all public
-gcs_update_object_acl("your-object.csv", entity_type = "allUsers")
-
-## update access of object for user joe@blogs.com to OWNER
-gcs_update_acl("your-object.csv", 
-               entity = "joe@blogs.com", 
-               role = "OWNER")
-
-## update access of object for googlegroup users to READER
-gcs_update_object_acl("your-object.csv", 
-                      entity = "my-group@googlegroups.com", 
-                      entity_type = "group")
-
-## update access of object for all users to OWNER on your Google Apps domain
-gcs_update_object_acl("your-object.csv", 
-                      entity = "yourdomain.com", 
-                      entity_type = "domain", 
-                      role = OWNER)
+
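Some sketches of ACL updates (object and entity names are placeholders):

    ## make an object readable by everyone
    gcs_update_object_acl("your-object.csv", entity_type = "allUsers")

    ## grant OWNER to a specific user
    gcs_update_object_acl("your-object.csv",
                          entity = "joe@blogs.com",
                          role = "OWNER")

    ## grant READER to a Google Group
    gcs_update_object_acl("your-object.csv",
                          entity = "my-group@googlegroups.com",
                          entity_type = "group")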

Deleting an object

Delete an object by passing its name (and bucket, if not the default).

-
## returns TRUE is successful, a 404 error if not found
-gcs_delete_object("your-object.csv")
+
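For example:

    ## returns TRUE if successful, or a 404 error if the object is not found
    gcs_delete_object("your-object.csv")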

Viewing current access level to objects

Use gcs_get_object_acl() to see what the current access is for an entity + entity_type.

-
## default entity_type is user
-acl <- gcs_get_object_acl("your-object.csv", 
-                         entity = "joe@blogs.com")
-acl$role 
-[1] "OWNER"
-
-## for allUsers and allAuthenticated users, you don't need to supply entity
-acl <- gcs_get_object_acl("your-object.csv", 
-                          entity_type = "allUsers")
-acl$role 
-[1] "READER"
+
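For example (names and roles are placeholders):

    ## the default entity_type is "user"
    acl <- gcs_get_object_acl("your-object.csv", entity = "joe@blogs.com")
    acl$role
    #> [1] "OWNER"

    ## for allUsers and allAuthenticatedUsers no entity is needed
    acl <- gcs_get_object_acl("your-object.csv", entity_type = "allUsers")
    acl$role
    #> [1] "READER"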

R Session helpers

-

Versions of save.image(), save() and load() are implemented as gcs_save_image(), gcs_save() and gcs_load(). These functions save and load the global R session to the cloud.

-
## save the current R session including all objects
-gcs_save_image()
-
-### wipe environment
-rm(list = ls())
-
-## load up environment again
-gcs_load()
+

Versions of save.image(), save() and load() are implemented as gcs_save_image(), gcs_save() and gcs_load(). These functions save and load the global R session to the cloud.

+

Save specific objects:

-
cc <- 3
-d <- "test1"
-gcs_save("cc","d", file = "gcs_save_test.RData")
-
-## remove the objects saved in cloud from local environment
-rm(cc,d)
-
-## load them back in from GCS
-gcs_load(file = "gcs_save_test.RData")
-cc == 3
-[1] TRUE
-d == "test1"
-[1] TRUE
+
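A sketch of the session helpers:

    ## save the current R session, including all objects
    gcs_save_image()

    ## wipe the local environment
    rm(list = ls())

    ## load the environment back from GCS
    gcs_load()

    ## save and restore specific objects
    cc <- 3
    gcs_save("cc", file = "gcs_save_test.RData")
    rm(cc)
    gcs_load(file = "gcs_save_test.RData")
    cc == 3
    #> [1] TRUE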

You can also upload .R code files and source them directly using gcs_source:

-
## make a R source file and upload it
-cat("x <- 'hello world!'\nx", file = "example.R")
-gcs_upload("example.R", name = "example.R")
-
-## source the file to run its code
-gcs_source("example.R")
-
-## the code from the upload file has run
-x
-[1] "hello world!"
+
## make a R source file and upload it
+cat("x <- 'hello world!'\nx", file = "example.R")
+gcs_upload("example.R", name = "example.R")
+
+## source the file to run its code
+gcs_source("example.R")
+
+## the code from the upload file has run
+x
+[1] "hello world!"

Uploading via a Shiny app

The library is also compatible with Shiny authentication flows, so you can create Shiny apps that let users log in and upload their own data.

An example of that is shown below:

-
library("shiny")
-library("googleAuthR")
-library("googleCloudStorageR")
-options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")
-## optional, if you want to use your own Google project
-# options("googleAuthR.client_id" = "YOUR_CLIENT_ID")
-# options("googleAuthR.client_secret" = "YOUR_CLIENT_SECRET")
-
-## you need to start Shiny app on port 1221
-## as thats what the default googleAuthR project expects for OAuth2 authentication
-
-## options(shiny.port = 1221)
-## print(source('shiny_test.R')$value) or push the "Run App" button in RStudio
-
-shinyApp(
-  ui = shinyUI(
-      fluidPage(
-        googleAuthR::googleAuthUI("login"),
-        fileInput("picture", "picture"),
-        textInput("filename", label = "Name on Google Cloud Storage",value = "myObject"),
-        actionButton("submit", "submit"),
-        textOutput("meta_file")
-      )
-  ),
-  server = shinyServer(function(input, output, session){
-
-    access_token <- shiny::callModule(googleAuth, "login")
-
-    meta <- eventReactive(input$submit, {
-
-      message("Uploading to Google Cloud Storage")
-      
-      # from googleCloudStorageR
-      with_shiny(gcs_upload,  
-                 file = input$picture$datapath,
-                 # enter your bucket name here
-                 bucket = "gogauth-test",  
-                 type = input$picture$type,
-                 name = input$filename,
-                 shiny_access_token = access_token())
-
-    })
-
-    output$meta_file <- renderText({
-      
-      req(meta())
-
-      str(meta())
-
-      paste("Uploaded: ", meta()$name)
-
-    })
-
-  })
-)
+
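A condensed sketch of such an app (the bucket name is a placeholder, and the app must run on the port registered with your OAuth2 client):

    library(shiny)
    library(googleAuthR)
    library(googleCloudStorageR)

    shinyApp(
      ui = fluidPage(
        googleAuthUI("login"),
        fileInput("picture", "picture"),
        actionButton("submit", "submit")
      ),
      server = function(input, output, session) {

        access_token <- callModule(googleAuth, "login")

        observeEvent(input$submit, {
          ## with_shiny() wraps gcs_upload() so it uses the logged-in
          ## user's token rather than a local one
          with_shiny(gcs_upload,
                     file = input$picture$datapath,
                     bucket = "your-bucket",
                     type = input$picture$type,
                     shiny_access_token = access_token())
        })
      }
    )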

@@ -447,75 +479,13 @@

Object administration

You can get metadata about an object by passing meta = TRUE to gcs_get_object.

-
gcs_get_object("your-object", "your-bucket", meta = TRUE)
+
gcs_get_object("your-object", "your-bucket", meta = TRUE)

Explanation of Google Project access

googleCloudStorageR has its own Google project which is used to call the Google Cloud Storage API, but does not have access to the objects or buckets in your Google Project unless you give permission for the library to access your own buckets during the OAuth2 authentication process.

No other user, including the owner of the Google Cloud Storage API project, has access unless you have given them access, but you may want to switch to using your own Google Project (which may or may not be the same as the one that holds your buckets).

Configuring your own Google Project

The instructions below apply when you visit the Google API console (https://console.developers.google.com/apis/).

For local use

  1. Click 'Create a new Client ID', and choose "Installed Application".
  2. Note your Client ID and secret.
  3. Add them by modifying your .Renviron file, or set them with the following entries:

     Sys.setenv("GCS_CLIENT_ID" = "mykey",
                "GCS_CLIENT_SECRET" = "mysecretkey")

  4. Alternatively, modify these options after googleAuthR has been loaded:

     options("googleAuthR.client_id" = "YOUR_CLIENT_ID")
     options("googleAuthR.client_secret" = "YOUR_CLIENT_SECRET")

For Shiny use

  1. Click 'Create a new Client ID', and choose "Web Application".
  2. Note your Client ID and secret.
  3. Add the URL of where your Shiny app will run, with no port number, e.g. https://mark.shinyapps.io/searchConsoleRDemo/
  4. And/or also add localhost or 127.0.0.1 with a port number for local testing. Remember the port number you use, as you will need it later to launch the app, e.g. http://127.0.0.1:1221
  5. Add them by modifying your .Renviron file, or set them with the following entries:

     Sys.setenv("GCS_WEB_CLIENT_ID" = "mykey",
                "GCS_WEB_CLIENT_SECRET" = "mysecretkey")

  6. Alternatively, in your Shiny script modify these options:

     options("googleAuthR.webapp.client_id" = "YOUR_CLIENT_ID")
     options("googleAuthR.webapp.client_secret" = "YOUR_CLIENT_SECRET")

  7. To run the app locally, specify the port number you used in step 4, e.g. shiny::runApp(port = 1221), or set a shiny option to default to it: options(shiny.port = 1221), then launch via the Run App button in RStudio.
  8. Running on your Shiny Server will work only for the URL from step 3.

Activate API

  1. Click on "APIs".
  2. Select and activate the Cloud Storage JSON API.
  3. After loading the package via library(googleCloudStorageR), it will look to see if "https://www.googleapis.com/auth/devstorage.full_control" is set in getOption("googleAuthR.scopes.selected") and, if it is not, set it by adding to the existing scopes.
  4. Alternatively, set the googleAuthR option for the Google Cloud Storage scope after the library has been loaded but before authentication:

     options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")
@@ -525,7 +495,7 @@

Contents

@@ -550,11 +519,12 @@

-

Site built with pkgdown.

+

Site built with pkgdown 1.3.0.

diff --git a/docs/articles/index.html b/docs/articles/index.html
index 08bb3a9..6936050 100644
--- a/docs/articles/index.html
+++ b/docs/articles/index.html
@@ -95,12 +111,12 @@
-
+
+ +

All vignettes

@@ -118,11 +134,13 @@

All vignettes

-

Site built with pkgdown.

+

Site built with pkgdown 1.3.0.

-
diff --git a/docs/authors.html b/docs/authors.html
index 4ca31e2..6a268ee 100644
--- a/docs/authors.html
+++ b/docs/authors.html
@@ -95,21 +111,21 @@ -
+
-

Edmondson M (????). googleCloudStorageR: R Interface with Google Cloud Storage.
-R package version .
+R package version 0.5.0.

@Manual{,
   title = {googleCloudStorageR: R Interface with Google Cloud Storage},
   author = {Mark Edmondson},
-  note = {R package version },
+  note = {R package version 0.5.0},
 }
-

Site built with pkgdown.

+

Site built with pkgdown 1.3.0.

-
diff --git a/docs/docsearch.css b/docs/docsearch.css
new file mode 100644
index 0000000..e5f1fe1
--- /dev/null
+++ b/docs/docsearch.css
(new file: vendored Algolia docsearch stylesheet, MIT license)

diff --git a/docs/docsearch.js b/docs/docsearch.js
new file mode 100644
index 0000000..b35504c
--- /dev/null
+++ b/docs/docsearch.js
(new file: vendored Algolia docsearch script for search-term highlighting)

diff --git a/docs/index.html b/docs/index.html
index 5b02c76..af2a5fb 100644
--- a/docs/index.html
+++ b/docs/index.html
-
- +
+

R library for interacting with the Google Cloud Storage JSON API (api docs).

Setup

Google Cloud Storage charges you for storage (prices here).

You can use your own Google Project with a credit card added to create buckets, where the charges will apply. This can be done in the Google API Console.

Configuring your own Google Project

The instructions below apply when you visit the Google API console (https://console.developers.google.com/apis/).

For local use

  1. Click 'Create a new Client ID', and choose "Installed Application".
  2. Download the client ID JSON.
  3. Set the client ID via googleAuthR::gar_set_client():

     googleAuthR::gar_set_client("your-json-file.json")

For Shiny use

  1. Click 'Create a new Client ID', and choose "Web Application".
  2. Download the client ID JSON.
  3. Add the URL of where your Shiny app will run, with no port number, e.g. https://mark.shinyapps.io/searchConsoleRDemo/
  4. And/or also add localhost or 127.0.0.1 with a port number for local testing. Remember the port number you use, as you will need it later to launch the app, e.g. http://127.0.0.1:1221
  5. Set the web client ID via googleAuthR::gar_set_client():

     googleAuthR::gar_set_client(web_json = "your-json-file.json")

  6. To run the app locally, specify the port number you used in step 4, e.g. shiny::runApp(port = 1221), or set a shiny option to default to it: options(shiny.port = 1221), then launch via the Run App button in RStudio.
  7. Running on your Shiny Server will work only for the URL from step 3.

Activate API

  1. Click on "APIs".
  2. Select and activate the Cloud Storage JSON API.
  3. After loading the package via library(googleCloudStorageR), it will look to see if "https://www.googleapis.com/auth/devstorage.full_control" is set in getOption("googleAuthR.scopes.selected") and, if it is not, set it by adding to the existing scopes.
  4. Alternatively, set the googleAuthR option for the Google Cloud Storage scope after the library has been loaded but before authentication:

     options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")

Setting environment variables

By default, all cloudyr packages look for authentication credentials in environment variables. You can also use these to specify a default bucket and to auto-authenticate upon attaching the library. For example:

-
Sys.setenv("GCS_CLIENT_ID" = "mykey",
-           "GCS_CLIENT_SECRET" = "mysecretkey",
-           "GCS_WEB_CLIENT_ID" = "my-shiny-key",
-           "GCS_WEB_CLIENT_SECRET" = "my-shiny-secret-key",
-           "GCS_DEFAULT_BUCKET" = "my-default-bucket",
-           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")
+
Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket",
+           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")

These can alternatively be set on the command line or via an Renviron.site or .Renviron file (see here for instructions).

-
-
-

-Authentication

-

Authentication can be carried out each session via gcs_auth. The first time you run this you will be sent to a Google login prompt in your browser to allow the googleCloudStorageR project access (or the Google project you configure).

-

Once authenticated a file named .httr-oauth is saved to your working directory. On subsequent authentication this file will hold your authentication details, and you won’t need to go via the browser. Deleting this file, or setting new_user=TRUE will start the authentication flow again.

-
library(googleCloudStorageR)
-## first time this will send you to the browser to authenticate
-gcs_auth()
-
-## to authenticate with a fresh user, delete .httr-oauth or run with new_user=TRUE
-gcs_auth(new_user = TRUE)
-
-...call functions...etc...
-

Each new R session will need to run gcs_auth() to authenticate future API calls.

Auto-authentication

-

Alternatively, you can specify the location of a service account JSON file taken from your Google Project, or the location of a previously created .httr-oauth token in a system environment:

-
    Sys.setenv("GCS_AUTH_FILE" = "/fullpath/to/auth.json")
-

This file will then be used for authentication via gcs_auth() when you load the library:

-
## GCS_AUTH_FILE set so auto-authentication
-library(googleCloudStorageR)
-
-## no need for gcs_auth()
-gcs_get_bucket("your-bucket")
+

To authenticate, you specify the location of a service account JSON file taken from your Google Project:

+
    Sys.setenv("GCS_AUTH_FILE" = "/fullpath/to/auth.json")
@@ -129,103 +162,97 @@

Setting a default Bucket

To avoid specifying the bucket in the functions below, you can set the name of your default bucket via environment variables or via the function gcs_global_bucket(). See the Setting environment variables section for more details.

-
## set bucket via environment
-Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket")
-
-library(googleCloudStorageR)
-
-## optional, if you haven't set environment argument GCS_AUTH_FILE
-## gcs_auth()
-
-## check what the default bucket is
-gcs_get_global_bucket()
-[1] "my-default-bucket"
-
-## you can also set a default bucket after loading the library for that session
-gcs_global_bucket("your-default-bucket-2")
-gcs_get_global_bucket()
-[1] "my-default-bucket-2"
+

Downloading objects from Google Cloud storage

Once you have a Google project and created a bucket with an object in it, you can download it as below:

-
library(googleCloudStorageR)
-
-## optional, if you haven't set environment argument GCS_AUTH_FILE
-## gcs_auth()
-
-## get your project name from the API console
-proj <- "your-project"
-
-## get bucket info
-buckets <- gcs_list_buckets(proj)
-bucket <- "your-bucket"
-bucket_info <- gcs_get_bucket(bucket)
-bucket_info
-
-==Google Cloud Storage Bucket==
-Bucket:          your-bucket
-Project Number:  1123123123
-Location:        EU
-Class:           STANDARD
-Created:         2016-04-28 11:39:06
-Updated:         2016-04-28 11:39:06
-Meta-generation: 1
-eTag:            Cxx=
-
-
-## get object info in the default bucket
-objects <- gcs_list_objects()
-
-## save directly to an R object (warning, don't run out of RAM if its a big object)
-## the download type is guessed into an appropriate R object
-parsed_download <- gcs_get_object(objects$name[[1]])
-
-## if you want to do your own parsing, set parseObject to FALSE
-## use httr::content() to parse afterwards
-raw_download <- gcs_get_object(objects$name[[1]],
-                               parseObject = FALSE)
-
-## save directly to a file in your working directory
-## parseObject has no effect, it is a httr::content(req, "raw") download
-gcs_get_object(objects$name[[1]], saveToDisk = "csv_downloaded.csv")
+

Uploading objects < 5MB

-

Objects can be uploaded via files saved to disk, or passed in directly if they are data frames or list type R objects. By default, data frames will be converted to CSV via write.csv(), lists to JSON via jsonlite::toJSON.

+

Objects can be uploaded via files saved to disk, or passed in directly if they are data frames or list type R objects. By default, data frames will be converted to CSV via write.csv(), lists to JSON via jsonlite::toJSON.

If you want to use other functions for transforming R objects, for example setting row.names = FALSE or using write.csv2, pass the function in via object_function.

-
## upload a file - type will be guessed from file extension or supply type
-write.csv(mtcars, file = filename)
-gcs_upload(filename)
-
-## upload an R data.frame directly - will be converted to csv via write.csv
-gcs_upload(mtcars)
-
-## upload an R list - will be converted to json via jsonlite::toJSON
-gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))
-
-## upload an R data.frame directly, with a custom function
-## function should have arguments 'input' and 'output'
-## safest to supply type too
-f <- function(input, output) write.csv(input, row.names = FALSE, file = output)
-
-gcs_upload(mtcars,
-           object_function = f,
-           type = "text/csv")
+

Upload metadata

You can pass metadata with an object via the function gcs_metadata_object().

The name you pass to the metadata object will override the name if it is also set elsewhere.

-
meta <- gcs_metadata_object("mtcars.csv",
-                             metadata = list(custom1 = 2,
-                                             custom_key = 'dfsdfsdfsfs))
-
-gcs_upload(mtcars, object_metadata = meta)
+

@@ -233,208 +260,208 @@

If the file/object is under 5MB, simple uploads are used.

For files > 5MB, resumable uploads are used. This allows you to upload up to 5TB.

If the connection is interrupted while uploading, gcs_upload will retry 3 times. If it still fails, it returns a Retry object that lets you try again later from where the upload stopped, via gcs_retry_upload().

-
## write a big object to a file
-big_file <- "big_filename.csv"
-write.csv(big_object, file = big_file)
-
-## attempt upload
-upload_try <- gcs_upload(big_file)
-
-## if successful, upload_try is an object metadata object
-upload_try
-==Google Cloud Storage Object==
-Name:            "big_filename.csv"
-Size:            8.5 Gb
-Media URL        https://www.googleapis.com/download/storage/v1/b/xxxx
-Bucket:          your-bucket
-ID:              your-bucket/"test.pdf"/xxxx
-MD5 Hash:        rshao1nxxxxxY68JZQ==
-Class:           STANDARD
-Created:         2016-08-12 17:33:05
-Updated:         2016-08-12 17:33:05
-Generation:      1471023185977000
-Meta Generation: 1
-eTag:            CKi90xxxxxEAE=
-crc32c:          j4i1sQ==
-
-
-## if unsuccessful after 3 retries, upload_try is a Retry object
-==Google Cloud Storage Upload Retry Object==
-File Location:     big_filename.csv
-Retry Upload URL:  http://xxxx
-Created:           2016-08-12 17:33:05
-Type:              csv
-File Size:        8.5 Gb
-Upload Byte:      4343
-Upload remaining: 8.1 Gb
-
-## you can retry to upload the remaining data using gcs_retry_upload()
-try2 <- gcs_retry_upload(upload_try)
+

Updating user access to objects

You can change who can access objects via gcs_update_object_acl(), setting the role to READER or OWNER for a user, group, domain or project, or for the public (all users or all authenticated users).

By default you are “OWNER” of all the objects and buckets you upload and create.

-
## update access of object to READER for all public
-gcs_update_object_acl("your-object.csv", entity_type = "allUsers")
-
-## update access of object for user joe@blogs.com to OWNER
-gcs_update_acl("your-object.csv",
-               entity = "joe@blogs.com",
-               role = "OWNER")
-
-## update access of object for googlegroup users to READER
-gcs_update_object_acl("your-object.csv",
-                      entity = "my-group@googlegroups.com",
-                      entity_type = "group")
-
-## update access of object for all users to OWNER on your Google Apps domain
-gcs_update_object_acl("your-object.csv",
-                      entity = "yourdomain.com",
-                      entity_type = "domain",
-                      role = OWNER)
+

Deleting an object

Delete an object by passing its name (and bucket, if not the default).

-
## returns TRUE is successful, a 404 error if not found
-gcs_delete_object("your-object.csv")
+

Viewing current access level to objects

Use gcs_get_object_acl() to see what the current access is for an entity + entity_type.

-
## default entity_type is user
-acl <- gcs_get_object_acl("your-object.csv",
-                         entity = "joe@blogs.com")
-acl$role
-[1] "OWNER"
-
-## for allUsers and allAuthenticated users, you don't need to supply entity
-acl <- gcs_get_object_acl("your-object.csv",
-                          entity_type = "allUsers")
-acl$role
-[1] "READER"
+

Signed URLs

You can create temporary links for users who may not have a Google account, while the object itself stays private. This is achieved using the gcs_signed_url() function, to which you pass a meta object.

-
obj <- gcs_get_object("your_file", meta = TRUE)
-
-signed <- gcs_signed_url(obj)
+
obj <- gcs_get_object("your_file", meta = TRUE)
+
+signed <- gcs_signed_url(obj)

The default is for the link to be accessible for an hour, but you can alter that:

-
## a link that will expire in 24 hours (86400 seconds) from now.
-24hours_signed <- gcs_signed_url(obj, expiration_ts = Sys.time() + 86400)
+
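For example:

    ## a link that will expire in 24 hours (86400 seconds) from now
    signed_24h <- gcs_signed_url(obj, expiration_ts = Sys.time() + 86400)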

R Session helpers

-

Versions of save.image(), save() and load() are implemented as gcs_save_image(), gcs_save() and gcs_load(). These functions save and load the global R session to the cloud.

-
## save the current R session including all objects
-gcs_save_image()
-
-### wipe environment
-rm(list = ls())
-
-## load up environment again
-gcs_load()
+

Versions of save.image(), save() and load() are implemented as gcs_save_image(), gcs_save() and gcs_load(). These functions save and load the global R session to the cloud.

+

Save specific objects:

-
cc <- 3
-d <- "test1"
-gcs_save("cc","d", file = "gcs_save_test.RData")
-
-## remove the objects saved in cloud from local environment
-rm(cc,d)
-
-## load them back in from GCS
-gcs_load(file = "gcs_save_test.RData")
-cc == 3
-[1] TRUE
-d == "test1"
-[1] TRUE
+

You can also upload .R code files and source them directly using gcs_source:

-
## make a R source file and upload it
-cat("x <- 'hello world!'\nx", file = "example.R")
-gcs_upload("example.R", name = "example.R")
-
-## source the file to run its code
-gcs_source("example.R")
-
-## the code from the upload file has run
-x
-[1] "hello world!"
+
## make a R source file and upload it
+cat("x <- 'hello world!'\nx", file = "example.R")
+gcs_upload("example.R", name = "example.R")
+
+## source the file to run its code
+gcs_source("example.R")
+
+## the code from the upload file has run
+x
+[1] "hello world!"

Uploading via a Shiny app

The library is also compatible with Shiny authentication flows, so you can create Shiny apps that let users log in and upload their own data.

An example of that is shown below:

-
library("shiny")
-library("googleAuthR")
-library("googleCloudStorageR")
-options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")
-## optional, if you want to use your own Google project
-# options("googleAuthR.client_id" = "YOUR_CLIENT_ID")
-# options("googleAuthR.client_secret" = "YOUR_CLIENT_SECRET")
-
-## you need to start Shiny app on port 1221
-## as thats what the default googleAuthR project expects for OAuth2 authentication
-
-## options(shiny.port = 1221)
-## print(source('shiny_test.R')$value) or push the "Run App" button in RStudio
-
-shinyApp(
-  ui = shinyUI(
-      fluidPage(
-        googleAuthR::googleAuthUI("login"),
-        fileInput("picture", "picture"),
-        textInput("filename", label = "Name on Google Cloud Storage",value = "myObject"),
-        actionButton("submit", "submit"),
-        textOutput("meta_file")
-      )
-  ),
-  server = shinyServer(function(input, output, session){
-
-    access_token <- shiny::callModule(googleAuth, "login")
-
-    meta <- eventReactive(input$submit, {
-
-      message("Uploading to Google Cloud Storage")
-
-      # from googleCloudStorageR
-      with_shiny(gcs_upload,
-                 file = input$picture$datapath,
-                 # enter your bucket name here
-                 bucket = "gogauth-test",
-                 type = input$picture$type,
-                 name = input$filename,
-                 shiny_access_token = access_token())
-
-    })
-
-    output$meta_file <- renderText({
-
-      req(meta())
-
-      str(meta())
-
-      paste("Uploaded: ", meta()$name)
-
-    })
-
-  })
-)
+

@@ -450,129 +477,72 @@

Object administration

You can get metadata about an object by passing meta = TRUE to gcs_get_object.

-
gcs_get_object("your-object", "your-bucket", meta = TRUE)
-
Explanation of Google Project access

googleCloudStorageR has its own Google project which is used to call the Google Cloud Storage API, but it does not have access to the objects or buckets in your Google Project unless you give the library permission to access your own buckets during the OAuth2 authentication process.

No other user, including the owner of the Google Cloud Storage API project, has access unless you have given them access, but you may want to switch to using your own Google Project (which may or may not be the same as the one that holds your buckets).

Configuring your own Google Project

The instructions below apply when you visit the Google API console (https://console.developers.google.com/apis/).

For local use

  1. Click 'Create a new Client ID', and choose "Installed Application".
  2. Note your Client ID and secret.
  3. Add them by modifying your .Renviron file, or set them with the following entries:

     Sys.setenv("GCS_CLIENT_ID" = "mykey",
                "GCS_CLIENT_SECRET" = "mysecretkey")

  4. Alternatively, modify these options after googleAuthR has been loaded:

     options("googleAuthR.client_id" = "YOUR_CLIENT_ID")
     options("googleAuthR.client_secret" = "YOUR_CLIENT_SECRET")

For Shiny use

  1. Click 'Create a new Client ID', and choose "Web Application".
  2. Note your Client ID and secret.
  3. Add the URL of where your Shiny app will run, with no port number, e.g. https://mark.shinyapps.io/searchConsoleRDemo/
  4. And/or also add localhost or 127.0.0.1 with a port number for local testing. Remember the port number you use, as you will need it later to launch the app, e.g. http://127.0.0.1:1221
  5. Add them by modifying your .Renviron file, or set them with the following entries:

     Sys.setenv("GCS_WEB_CLIENT_ID" = "mykey",
                "GCS_WEB_CLIENT_SECRET" = "mysecretkey")

  6. Alternatively, in your Shiny script modify these options:

     options("googleAuthR.webapp.client_id" = "YOUR_CLIENT_ID")
     options("googleAuthR.webapp.client_secret" = "YOUR_CLIENT_SECRET")

  7. To run the app locally, specify the port number you used in step 4, e.g. shiny::runApp(port = 1221), or set a shiny option to default to it: options(shiny.port = 1221), then launch via the Run App button in RStudio.
  8. Running on your Shiny Server will work only for the URL from step 3.

Activate API

  1. Click on "APIs".
  2. Select and activate the Cloud Storage JSON API.
  3. After loading the package via library(googleCloudStorageR), it will look to see if "https://www.googleapis.com/auth/devstorage.full_control" is set in getOption("googleAuthR.scopes.selected") and, if it is not, set it by adding to the existing scopes.
  4. Alternatively, set the googleAuthR option for the Google Cloud Storage scope after the library has been loaded but before authentication:

     options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")
gcs_get_object("your-object", "your-bucket", meta = TRUE)

Installation

CRAN Build Status codecov.io

This package is on CRAN:

-
# latest stable version
-install.packages("googleCloudStorageR")
+

Or, to pull a potentially unstable version directly from GitHub:

-
if(!require("ghit")){
-    install.packages("ghit")
-}
-ghit::install_github("cloudyr/googleCloudStorageR")
+

cloudyr project logo

- +
diff --git a/docs/news/index.html b/docs/news/index.html
index ada72b1..721655d 100644
--- a/docs/news/index.html
+++ b/docs/news/index.html
-All news • googleCloudStorageR
+Changelog • googleCloudStorageR
@@ -95,17 +111,34 @@ -
- -
+
+
-
-
-

-googleCloudStorageR 0.3.0.9000

+
+

+googleCloudStorageR 0.5.0 Unreleased

+ +
+
+

+googleCloudStorageR 0.4.0 2017-11-17 +

Major changes

@@ -118,12 +151,15 @@

  • Add gcs_save_all and gcs_load_all which will zip, save/load and upload/download a directory
  • Add use of _gcssave.yaml file to control gcs_first/last behaviour
  • Allow passing a bucket object to functions that expect a bucket name (#76)
  • +
  • remove now unsupported travis environment argument
  • +
  • Add support for subscribing bucket changes to Google Pub/Sub
  • -

    -googleCloudStorageR 0.3.0

    +

    +googleCloudStorageR 0.3.0 2017-05-27 +

    Major changes

    @@ -133,9 +169,9 @@

  • Add support for gs:// style URLs for object names (#57 - thanks seandavi)
  • Add what the public URL would be to an objects print method (#59 - thanks mwhitaker)
  • Add check to gcs_get_bucket() to only expect length 1 character vectors for bucket name. (#60)
  • -
  • Add paging for gcs_object_list (#58 - thanks @G3rtjan)
  • -
  • Add saveToDisk option to gcs_load (#52 - thanks @tomsing1)
  • -
  • Supply your own parse function to gcs_get_object() (#63 - thanks @nkeriks)
  • +
  • Add paging for gcs_object_list (#58 - thanks @G3rtjan)
  • +
  • Add saveToDisk option to gcs_load (#52 - thanks @tomsing1)
  • +
  • Supply your own parse function to gcs_get_object() (#63 - thanks @nkeriks)
  • Support for prefix and delimiter in gcs_object_list to filter objects listed (#68)
  • Default permissions on new buckets are projectPrivate, which lets you write to them afterwards (#62)
  • @@ -145,8 +181,9 @@

  • -

    -googleCloudStorageR 0.2.0

    +

    +googleCloudStorageR 0.2.0 2016-09-11 +

    Major changes

    @@ -156,7 +193,7 @@

  • Add gcs_save to store R session data in cloud
  • Add gcs_load to restore session data stored with gcs_save
  • -
  • Fix resetting of options(googleAuthR.rawResponse = TRUE) when using gcs_get_object +
  • Fix resetting of options(googleAuthR.rawResponse = TRUE) when using gcs_get_object
  • Add URLencoding to object_name in gcs_get_object etc.
  • @@ -178,8 +215,9 @@

  • -

    -googleCloudStorageR 0.1.0

    +

    +googleCloudStorageR 0.1.0 2016-08-07 +

    Major changes

    @@ -199,14 +237,14 @@

    -
diff --git a/docs/pkgdown.css b/docs/pkgdown.css
index 209ce57..c03fb08 100644
--- a/docs/pkgdown.css
+++ b/docs/pkgdown.css
(regenerated pkgdown stylesheet: sticky footer, typography, copy-button and search-highlight styles)

diff --git a/docs/pkgdown.js b/docs/pkgdown.js
index 4b81713..eb7e83d 100644
--- a/docs/pkgdown.js
+++ b/docs/pkgdown.js
(regenerated pkgdown script: navbar path matching, tooltips and clipboard copy buttons)

diff --git a/docs/pkgdown.yml b/docs/pkgdown.yml
index 4f6d1d5..7c9267d 100644
--- a/docs/pkgdown.yml
+++ b/docs/pkgdown.yml
@@ -1,6 +1,9 @@
+pandoc: 2.3.1
+pkgdown: 1.3.0
+pkgdown_sha: ~
+articles:
+  googleCloudStorageR: googleCloudStorageR.html
 urls:
   reference: https://cloudyr.github.io/googleCloudStorageR//reference
   article: https://cloudyr.github.io/googleCloudStorageR//articles
-articles:
-  googleCloudStorageR: googleCloudStorageR.html

diff --git a/docs/reference/Object.html b/docs/reference/Object.html
index a2d4258..5659985 100644
--- a/docs/reference/Object.html
+++ b/docs/reference/Object.html
@@ -95,26 +114,30 @@
    +
    +

    Object Object

    +
    Object(acl = NULL, bucket = NULL, cacheControl = NULL,
       componentCount = NULL, contentDisposition = NULL,
       contentEncoding = NULL, contentLanguage = NULL, contentType = NULL,
       crc32c = NULL, customerEncryption = NULL, etag = NULL,
       generation = NULL, id = NULL, md5Hash = NULL, mediaLink = NULL,
    -  metadata = NULL, metageneration = NULL, name = NULL, owner = NULL,
    -  selfLink = NULL, size = NULL, storageClass = NULL, timeCreated = NULL,
    -  timeDeleted = NULL, updated = NULL)
    + metadata = NULL, metageneration = NULL, name = NULL, + owner = NULL, selfLink = NULL, size = NULL, storageClass = NULL, + timeCreated = NULL, timeDeleted = NULL, updated = NULL) -

    Arguments

    +

    Arguments

    @@ -248,11 +271,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_auth.html b/docs/reference/gcs_auth.html index a060868..6124562 100644 --- a/docs/reference/gcs_auth.html +++ b/docs/reference/gcs_auth.html @@ -1,32 +1,45 @@ - + -Authenticate this session — gcs_auth • googleCloudStorageR +Authenticate with Google Cloud Storage API — gcs_auth • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,58 +114,55 @@ -
    +
    +
    -

    A wrapper for gar_auth and gar_auth_service

    +

    Authenticate with Google Cloud Storage API

    +
    -
    gcs_auth(new_user = FALSE, no_auto = FALSE)
    +
    gcs_auth(json_file)
    -

    Arguments

    +

    Arguments

    - - - - - - + +
    new_user

    If TRUE, reauthenticate via Google login screen

    no_auto

    Will ignore auto-authentication settings if TRUE

    json_file

Authentication JSON file you have downloaded from your Google Project

    -

    Value

    - -

    Invisibly, the token that has been saved to the session

    -

    Details

    -

    If you have set the environment variable GCS_AUTH_FILE to a valid file location, - the function will look there for authentication details. -Otherwise it will look in the working directory for the `.httr-oauth` file, which if not present - will trigger an authentication flow via Google login screen in your browser.

    -

    If GCS_AUTH_FILE is specified, then gcs_auth() will be called upon loading the package - via library(googleCloudStorageR), - meaning that calling this function yourself at the start of the session won't be necessary. - GCS_AUTH_FILE can be either a token generated by gar_auth or - service account JSON ending with file extension .json

    +

The best way to authenticate is to use an environment variable pointing at your authentication file.

    +

Set the file location of your downloaded Google Project JSON file in the GCS_AUTH_FILE environment variable

    +

Then, when you load the library, you should auto-authenticate

    +

    However, you can authenticate directly using this function pointing at your JSON auth file.

    +

    Examples

    +
    +
    # NOT RUN { +library(googleCloudStorageR) +gcs_auth("location_of_json_file.json") +# }
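A minimal sketch of the auto-authentication route described above; the file path is a placeholder, not a real key:

# set before the package loads, e.g. in .Renviron (hypothetical path)
Sys.setenv(GCS_AUTH_FILE = "/home/me/auth/my-project-key.json")

# loading the package should now authenticate without an explicit gcs_auth() call
library(googleCloudStorageR)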
    @@ -157,11 +173,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    -
    + + + diff --git a/docs/reference/gcs_compose_objects.html b/docs/reference/gcs_compose_objects.html new file mode 100644 index 0000000..2921860 --- /dev/null +++ b/docs/reference/gcs_compose_objects.html @@ -0,0 +1,207 @@ + + + + + + + + +Compose up to 32 objects into one — gcs_compose_objects • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    +
    + + + +
    + +
    +
    + + +
    + +

    This merges objects stored on Cloud Storage into one object.

    + +
    + +
    gcs_compose_objects(objects, destination,
    +  bucket = gcs_get_global_bucket())
    + +

    Arguments

    + + + + + + + + + + + + + + +
    objects

    A character vector of object names to combine

    destination

    Name of the new object.

    bucket

    The bucket where the objects sit

    + +

    Value

    + +

    Object metadata

    + +

    See also

    + + + + +

    Examples

    +
    +
    # NOT RUN { + gcs_global_bucket("your-bucket") + objs <- gcs_list_objects() + + compose_me <- objs$name[1:30] + + gcs_compose_objects(compose_me, "composed/test.json") + +# }
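Since the limit is 32 objects per compose, larger merges can be done through intermediate objects; a hedged sketch, with the bucket, prefix, and intermediate names all placeholders:

objs <- gcs_list_objects(bucket = "your-bucket", prefix = "shards/")$name
batches <- split(objs, ceiling(seq_along(objs) / 32))

# compose each batch of up to 32 objects into a hypothetical intermediate
intermediates <- vapply(seq_along(batches), function(i) {
  dest <- sprintf("tmp/intermediate_%03d", i)
  gcs_compose_objects(batches[[i]], dest, bucket = "your-bucket")
  dest
}, character(1))

# final merge, assuming no more than 32 intermediates remain
gcs_compose_objects(intermediates, "composed/all.json", bucket = "your-bucket")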
    +
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_copy_object.html b/docs/reference/gcs_copy_object.html new file mode 100644 index 0000000..b09bf6a --- /dev/null +++ b/docs/reference/gcs_copy_object.html @@ -0,0 +1,207 @@ + + + + + + + + +Copy an object — gcs_copy_object • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    +
    + + + +
    + +
    +
    + + +
    + +

    Copies an object to a new destination

    + +
    + +
    gcs_copy_object(source_object, destination_object,
    +  source_bucket = gcs_get_global_bucket(),
    +  destination_bucket = gcs_get_global_bucket(), rewriteToken = NULL,
    +  destinationPredefinedAcl = NULL)
    + +

    Arguments

    + + + + + + + + + + + + + + + + + + + + + + + + + + +
    source_object

    The name of the object to copy, or a gs:// URL

    destination_object

    The name of where to copy the object to, or a gs:// URL

    source_bucket

    The bucket of the source object

    destination_bucket

    The bucket of the destination

    rewriteToken

    Include this field (from the previous rewrite response) on each rewrite request after the first one, until the rewrite response 'done' flag is true.

    destinationPredefinedAcl

Apply a predefined set of access controls to the destination object. If not NULL, must be one of the predefined access controls such as "bucketOwnerFullControl"

    + +

    Value

    + +

    If successful, a rewrite object.

    + +

    See also

    + + + + +
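The page ships no example, so a usage sketch may help; all bucket and object names below are placeholders:

# copy between buckets
gcs_copy_object("backup/2019-01-01.csv", "archive/2019-01-01.csv",
                source_bucket = "my-live-bucket",
                destination_bucket = "my-archive-bucket")

# gs:// URLs are accepted for both source and destination
gcs_copy_object("gs://my-live-bucket/backup/2019-01-01.csv",
                "gs://my-archive-bucket/archive/2019-01-01.csv")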
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_create_bucket.html b/docs/reference/gcs_create_bucket.html index 4c96c25..5998cf7 100644 --- a/docs/reference/gcs_create_bucket.html +++ b/docs/reference/gcs_create_bucket.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Create a new bucket — gcs_create_bucket • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + +
    @@ -95,26 +114,31 @@ -
    +
    +

    Create a new bucket in your project

    +
    gcs_create_bucket(name, projectId, location = "US",
    -  storageClass = c("MULTI_REGIONAL", "REGIONAL", "STANDARD", "NEARLINE",
    +  storageClass = c("MULTI_REGIONAL", "REGIONAL", "STANDARD", "NEARLINE",
       "COLDLINE", "DURABLE_REDUCED_AVAILABILITY"),
    -  predefinedAcl = c("projectPrivate", "authenticatedRead", "private",
    +  predefinedAcl = c("projectPrivate", "authenticatedRead", "private",
       "publicRead", "publicReadWrite"),
    -  predefinedDefaultObjectAcl = c("bucketOwnerFullControl", "bucketOwnerRead",
    -  "authenticatedRead", "private", "projectPrivate", "publicRead"),
    -  projection = c("noAcl", "full"), versioning = FALSE, lifecycle = NULL)
    + predefinedDefaultObjectAcl = c("bucketOwnerFullControl", + "bucketOwnerRead", "authenticatedRead", "private", "projectPrivate", + "publicRead"), projection = c("noAcl", "full"), versioning = FALSE, + lifecycle = NULL) -

    Arguments

    +

    Arguments

    @@ -161,12 +185,12 @@

    Details

    See also

    -

    Other bucket functions: gcs_create_lifecycle, +

    @@ -189,11 +213,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_create_bucket_acl.html b/docs/reference/gcs_create_bucket_acl.html index 7076815..56ae15b 100644 --- a/docs/reference/gcs_create_bucket_acl.html +++ b/docs/reference/gcs_create_bucket_acl.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Create a Bucket Access Controls — gcs_create_bucket_acl • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    Create a new access control at the bucket level

    +
    gcs_create_bucket_acl(bucket = gcs_get_global_bucket(), entity = "",
    -  entity_type = c("user", "group", "domain", "project", "allUsers",
    -  "allAuthenticatedUsers"), role = c("READER", "OWNER"))
    + entity_type = c("user", "group", "domain", "project", "allUsers", + "allAuthenticatedUsers"), role = c("READER", "OWNER")) -

    Arguments

    +

    Arguments

    @@ -137,9 +160,9 @@

    Value

    See also

    -

    Other Access control functions: gcs_get_bucket_acl, +

    @@ -162,11 +185,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_create_lifecycle.html b/docs/reference/gcs_create_lifecycle.html index 3fea53b..9320e65 100644 --- a/docs/reference/gcs_create_lifecycle.html +++ b/docs/reference/gcs_create_lifecycle.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Create a lifecycle condition — gcs_create_lifecycle • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +

Use this to set rules for how long objects last in a bucket, for use in gcs_create_bucket

    +
    gcs_create_lifecycle(age = NULL, createdBefore = NULL,
       numNewerVersions = NULL, isLive = NULL)
    -

    Arguments

    +

    Arguments

    @@ -125,24 +148,35 @@

    Ar

    -
    isLive

    If TRUE deletes all live objects, if FALSE deletes all archived versions - -numNewerVersions and isLive works only for buckets with object versioning

    +

If TRUE, deletes all live objects; if FALSE, deletes all archived versions

    +

numNewerVersions and isLive work only for buckets with object versioning

    For multiple conditions, pass this object in as a list.

    See also

    -

    Lifecycle documentation https://cloud.google.com/storage/docs/lifecycle

    + +

    Examples

    +
    # NOT RUN {
    +  lifecycle <- gcs_create_lifecycle(age = 30)
    +
    +  gcs_create_bucket("your-bucket-lifecycle",
    +                     projectId = "your-project",
    +                     location = "EUROPE-NORTH1",
    +                     storageClass = "REGIONAL",
    +                     lifecycle = list(lifecycle))
    +
    +
    +# }
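For the multiple-conditions case mentioned above, each condition is its own lifecycle object, passed together in the list; the ages and names here are illustrative only:

delete_old  <- gcs_create_lifecycle(age = 365)
trim_copies <- gcs_create_lifecycle(numNewerVersions = 3)

gcs_create_bucket("your-bucket-lifecycle",
                  projectId = "your-project",
                  lifecycle = list(delete_old, trim_copies))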
    @@ -161,11 +197,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_create_pubsub.html b/docs/reference/gcs_create_pubsub.html new file mode 100644 index 0000000..7be4282 --- /dev/null +++ b/docs/reference/gcs_create_pubsub.html @@ -0,0 +1,218 @@ + + + + + + + + +Create a pub/sub notification for a bucket — gcs_create_pubsub • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    +
    + + + +
    + +
    +
    + + +
    + +

    Add a notification configuration that sends notifications for all supported events.

    + +
    + +
    gcs_create_pubsub(topic, project, bucket = gcs_get_global_bucket(),
    +  event_types = NULL)
    + +

    Arguments

    + + + + + + + + + + + + + + + + + + +
    topic

    The pub/sub topic name

    project

    The project-id that has the pub/sub topic

    bucket

    The bucket for notifications

    event_types

What events to activate; leave at default for all

    + +

    Details

    + +

Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects. +As a minimum you will need: the Cloud Pub/Sub API activated for the project; +sufficient permissions on the bucket you wish to monitor; +sufficient permissions on the project to receive notifications; +an existing pub/sub topic; +have given your service account at least pubsub.publisher permission.

    + +

    See also

    + + + + +

    Examples

    +
    +
# NOT RUN { +project <- "myproject" +bucket <- "mybucket" + +# get the email to give access +gcs_get_service_email(project) + +# once email has access, create a new pub/sub topic for your bucket +gcs_create_pubsub("gcs_r", project, bucket) + +# }
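If only some events are wanted, the event_types argument can restrict the notification; this sketch reuses the names from the example above, and the event names should be checked against the Cloud Storage API docs:

# hypothetical: only notify on new and deleted objects
gcs_create_pubsub("gcs_r", project, bucket,
                  event_types = c("OBJECT_FINALIZE", "OBJECT_DELETE"))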
    + +
    +
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_delete_bucket.html b/docs/reference/gcs_delete_bucket.html index 3eb5259..d3e0381 100644 --- a/docs/reference/gcs_delete_bucket.html +++ b/docs/reference/gcs_delete_bucket.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Delete a bucket — gcs_delete_bucket • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +

    Delete the bucket, and all its objects

    +
    gcs_delete_bucket(bucket, ifMetagenerationMatch = NULL,
       ifMetagenerationNotMatch = NULL)
    -

    Arguments

    +

    Arguments

    @@ -127,12 +150,12 @@

    Ar

    See also

    -

    Other bucket functions: gcs_create_bucket, +

    @@ -153,11 +176,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_delete_object.html b/docs/reference/gcs_delete_object.html index 3e137e5..daa0001 100644 --- a/docs/reference/gcs_delete_object.html +++ b/docs/reference/gcs_delete_object.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Delete an object — gcs_delete_object • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,25 +114,29 @@ -
    +
    +

    Deletes an object from a bucket

    +
    gcs_delete_object(object_name, bucket = gcs_get_global_bucket(),
       generation = NULL)
    -

    Arguments

    +

    Arguments

    - + @@ -132,9 +155,11 @@

    Value

    See also

    -

    Other object functions: gcs_get_object, +

    @@ -157,11 +182,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_delete_pubsub.html b/docs/reference/gcs_delete_pubsub.html new file mode 100644 index 0000000..2f9d8a7 --- /dev/null +++ b/docs/reference/gcs_delete_pubsub.html @@ -0,0 +1,191 @@ + + + + + + + + +Delete pub/sub notifications for a bucket — gcs_delete_pubsub • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    +
    + + + +
    + +
    +
    + + +
    + +

    Delete notification configurations for a bucket.

    + +
    + +
    gcs_delete_pubsub(config_name, bucket = gcs_get_global_bucket())
    + +

    Arguments

    +
    object_name

    Object to be deleted

    Object to be deleted, or a gs:// URL

    bucket
    + + + + + + + + + +
    config_name

    A name of a configuration

    bucket

    The bucket for notifications

    + +

    Details

    + +

Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects. +As a minimum you will need: the Cloud Pub/Sub API activated for the project; +sufficient permissions on the bucket you wish to monitor; +sufficient permissions on the project to receive notifications; +an existing pub/sub topic; have given your service account at least pubsub.publisher permission.

    + +

    See also

    + + + + +
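A sketch pairing this with gcs_list_pubsub to find the configuration name first; the bucket name, and the assumption that the listing exposes an id per configuration, are both placeholders:

configs <- gcs_list_pubsub(bucket = "my-bucket")

# assuming the listing returns an id column for each configuration
gcs_delete_pubsub(configs$id[[1]], bucket = "my-bucket")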
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_download_url.html b/docs/reference/gcs_download_url.html index 3dc8685..ed93288 100644 --- a/docs/reference/gcs_download_url.html +++ b/docs/reference/gcs_download_url.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Get the download URL — gcs_download_url • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +

    Create the download URL for objects in buckets

    +
    gcs_download_url(object_name, bucket = gcs_get_global_bucket(),
       public = FALSE)
    -

    Arguments

    +

    Arguments

    @@ -139,8 +162,8 @@

    Details

    See also

    -

    Other download functions: gcs_parse_download, - gcs_signed_url

    +

    Other download functions: gcs_parse_download, + gcs_signed_url

    @@ -165,11 +188,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_first.html b/docs/reference/gcs_first.html index 251c264..68612b6 100644 --- a/docs/reference/gcs_first.html +++ b/docs/reference/gcs_first.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Save your R session to the cloud on startup/exit — gcs_first • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    Place within your .Rprofile to load and save your session data automatically

    +
    -
    gcs_first(bucket = Sys.getenv("GCS_SESSION_BUCKET"))
    +    
    gcs_first(bucket = Sys.getenv("GCS_SESSION_BUCKET"))
     
    -gcs_last(bucket = Sys.getenv("GCS_SESSION_BUCKET"))
    +gcs_last(bucket = Sys.getenv("GCS_SESSION_BUCKET"))
    -

    Arguments

    +

    Arguments

    @@ -120,14 +143,15 @@

    Ar

    Details

    -

    The folder you want to save to Google Cloud Storage will also need to have a yaml file called _gcssave.yaml in the root of the directory. It can hold the following arguments:

      +

      The folder you want to save to Google Cloud Storage will also need to have a yaml file called _gcssave.yaml in the root of the directory. It can hold the following arguments:

      +
      • [Required] bucket - the GCS bucket to save to

      • [Optional] loaddir - if the folder name is different to the current, where to load the R session from

      • [Optional] pattern - a regex of what files to save at the end of the session

      • [Optional] load_on_startup - if FALSE will not attempt to load on startup

The bucket name is also set via the environment variable GCS_SESSION_BUCKET. The yaml bucket name will take precedence if both are set.

      -

      The folder is named on GCS the full working path to the working directory e.g. /Users/mark/dev/your-r-project which is what is looked for on startuæ. If you create a new R project with the same filepath and bucket as an existing saved set, the files will download automatically when you load R from that folder (when starting an RStudio project).

      +

      The folder is named on GCS the full working path to the working directory e.g. /Users/mark/dev/your-r-project which is what is looked for on startup. If you create a new R project with the same filepath and bucket as an existing saved set, the files will download automatically when you load R from that folder (when starting an RStudio project).

      If you load from a different filepath (e.g. with loadir set in yaml), when you exit and save the files will be saved under your new present working directory.

      Files with the same name will not be overwritten. If you want them to be, delete or rename them then reload the R session.

This function does not act like git, nor is it intended as a replacement; its main use is for running RStudio Server within disposable Docker containers on Google Compute Engine (e.g. via googleComputeEngineR)
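A minimal _gcssave.yaml, written from R for convenience; the bucket name is a placeholder and the keys are the ones listed above:

writeLines(c(
  "bucket: my-bucket-for-r-projects",  # [Required] bucket to save to
  "load_on_startup: TRUE",             # [Optional] skip loading if FALSE
  "pattern: \"\\\\.R$\""               # [Optional] only save .R files
), "_gcssave.yaml")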

      @@ -135,43 +159,29 @@

Details the easiest way is to make sure your authentication file is available in the environment variable GCS_AUTH_FILE, or if on Google Compute Engine it will reuse the Google Cloud authentication - via gar_gce_auth

      + via gar_gce_auth

      See also

      -

      gcs_save_all and gcs_load_all that these functions call

      +

      gcs_save_all and gcs_load_all that these functions call


      Examples

      -
      # NOT RUN { -## within your .Rprofile file -.First <- function(){ - cat("\n# Welcome Mark! Today is ", date(), "\n") - - ## will look for download if GCS_SESSION_BUCKET env arg set - googleCloudStorageR::gcs_first() +# NOT RUN { +.First <- function(){ + googleCloudStorageR::gcs_first() } -.Last <- function(){ - # will only upload if a _gcssave.yaml in directory with bucketname - googleCloudStorageR::gcs_last() - message("\nGoodbye Mark at ", date(), "\n") +.Last <- function(){ + googleCloudStorageR::gcs_last() } -### example _gcssave.yaml contents ------------- -# The GCS bucket to save/load R workspace from -bucket: my-bucket-for-r-projects -# set to FALSE if you don't want to load on R session startup -load_on_startup: TRUE -# on first load, whether to look for a different directory on GCS than present getwd() -loaddir: /Users/mark/other-computer/projectname -# regex to only save these files to GCS -pattern: "\\.R$" - -# } -
      + +# }
      +
      -

      Site built with pkgdown.

      +

      Site built with pkgdown 1.3.0.

      - + + + diff --git a/docs/reference/gcs_get_bucket.html b/docs/reference/gcs_get_bucket.html index 1574efb..d9b98ed 100644 --- a/docs/reference/gcs_get_bucket.html +++ b/docs/reference/gcs_get_bucket.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Get bucket info — gcs_get_bucket • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
      +
      +

      Meta data about the bucket

      +
      gcs_get_bucket(bucket = gcs_get_global_bucket(),
         ifMetagenerationMatch = NULL, ifMetagenerationNotMatch = NULL,
      -  projection = c("noAcl", "full"))
      + projection = c("noAcl", "full")) -

      Arguments

      +

      Arguments

    @@ -136,12 +159,12 @@

    Value

    See also

    -

    Other bucket functions: gcs_create_bucket, +

    Examples

    @@ -176,11 +199,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_get_bucket_acl.html b/docs/reference/gcs_get_bucket_acl.html index 1416014..2211629 100644 --- a/docs/reference/gcs_get_bucket_acl.html +++ b/docs/reference/gcs_get_bucket_acl.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Get Bucket Access Controls — gcs_get_bucket_acl • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    Returns the ACL entry for the specified entity on the specified bucket

    +
    gcs_get_bucket_acl(bucket = gcs_get_global_bucket(), entity = "",
    -  entity_type = c("user", "group", "domain", "project", "allUsers",
    +  entity_type = c("user", "group", "domain", "project", "allUsers",
       "allAuthenticatedUsers"))
    -

    Arguments

    +

    Arguments

    @@ -133,9 +156,9 @@

    Value

    See also

    -

    Other Access control functions: gcs_create_bucket_acl, +

    @@ -158,11 +181,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_get_global_bucket.html b/docs/reference/gcs_get_global_bucket.html index aa09db3..8362b39 100644 --- a/docs/reference/gcs_get_global_bucket.html +++ b/docs/reference/gcs_get_global_bucket.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Get global bucket name — gcs_get_global_bucket • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,15 +114,19 @@ -
    +
    +

Bucket name set for this session to use by default

    +
    gcs_get_global_bucket()
    @@ -117,12 +140,12 @@

    Details

    See also

    -

    Other bucket functions: gcs_create_bucket, +

    @@ -146,11 +169,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    -
    + + + diff --git a/docs/reference/gcs_get_object.html b/docs/reference/gcs_get_object.html index a75863b..816a043 100644 --- a/docs/reference/gcs_get_object.html +++ b/docs/reference/gcs_get_object.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Get an object in a bucket directly — gcs_get_object • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    This retrieves an object directly.

    +
    -
    gcs_get_object(object_name, bucket = gcs_get_global_bucket(), meta = FALSE,
    -  saveToDisk = NULL, overwrite = FALSE, parseObject = TRUE,
    -  parseFunction = gcs_parse_download)
    +
    gcs_get_object(object_name, bucket = gcs_get_global_bucket(),
    +  meta = FALSE, saveToDisk = NULL, overwrite = FALSE,
    +  parseObject = TRUE, parseFunction = gcs_parse_download)
    -

    Arguments

    +

    Arguments

    @@ -149,18 +172,20 @@

    Value

    Details

    This differs from providing downloads via a download link as you can - do via gcs_download_url - object_name can use a gs:// URI instead, + do via gcs_download_url

    +

    object_name can use a gs:// URI instead, in which case it will take the bucket name from that URI and bucket argument will be overridden. The URLs should be in the form gs://bucket/object/name
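A sketch of the two equivalent call forms just described; the bucket and object names are placeholders:

# the gs:// form carries the bucket, overriding the bucket argument
gcs_get_object("mtcars.csv", bucket = "my-bucket")
gcs_get_object("gs://my-bucket/mtcars.csv")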

    -

    By default if you want to get the object straight into an R session the parseFunction is gcs_parse_download which wraps httr's content.

    +

    By default if you want to get the object straight into an R session the parseFunction is gcs_parse_download which wraps httr's content.

    If you want to use your own function (say to unzip the object) then supply it here. The first argument should take the downloaded object.

    See also

    -

    Other object functions: gcs_delete_object, +

    Examples

    @@ -180,7 +205,7 @@

    Examp ## default gives a warning about missing column name. ## custom parse function to suppress warning f <- function(object){ - suppressWarnings(httr::content(object, encoding = "UTF-8")) + suppressWarnings(httr::content(object, encoding = "UTF-8")) } ## get mtcars csv with custom parse function. @@ -212,11 +237,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_get_object_acl.html b/docs/reference/gcs_get_object_acl.html index 4e2b4a7..09b808e 100644 --- a/docs/reference/gcs_get_object_acl.html +++ b/docs/reference/gcs_get_object_acl.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Check the access control settings for an object for one entity — gcs_get_object_acl • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    Returns the default object ACL entry for the specified entity on the specified bucket.

    +
    gcs_get_object_acl(object_name, bucket = gcs_get_global_bucket(),
    -  entity = "", entity_type = c("user", "group", "domain", "project",
    +  entity = "", entity_type = c("user", "group", "domain", "project",
       "allUsers", "allAuthenticatedUsers"), generation = NULL)
    -

    Arguments

    +

    Arguments

    @@ -136,9 +159,9 @@

    Ar

    See also

    -

    Other Access control functions: gcs_create_bucket_acl, +

    @@ -159,11 +182,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_get_service_email.html b/docs/reference/gcs_get_service_email.html new file mode 100644 index 0000000..6edc224 --- /dev/null +++ b/docs/reference/gcs_get_service_email.html @@ -0,0 +1,183 @@ + + + + + + + + +Get the email of service account associated with the bucket — gcs_get_service_email • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    +
    + + + +
    + +
    +
    + + +
    + +

    Use this to get the right email so you can give it pubsub.publisher permission.

    + +
    + +
    gcs_get_service_email(project)
    + +

    Arguments

    +
    + + + + + +
    project

    The project name containing the bucket

    + +

    Details

    + +

This service email can be different from the email in the service JSON. Give this email +pubsub.publisher permission in the Google Cloud console.

    + +

    See also

    + + + + +
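A sketch of the permission workflow the details describe; the project and bucket names are placeholders:

svc <- gcs_get_service_email("my-project")

# grant svc the pubsub.publisher role in the Cloud console, then:
gcs_create_pubsub("gcs_r", "my-project", "my-bucket")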
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_global_bucket.html b/docs/reference/gcs_global_bucket.html index 6a25ff8..7f47527 100644 --- a/docs/reference/gcs_global_bucket.html +++ b/docs/reference/gcs_global_bucket.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Set global bucket name — gcs_global_bucket • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,19 +114,23 @@ -
    +
    +

    Set a bucket name used for this R session

    +
    gcs_global_bucket(bucket)
    -

    Arguments

    +

    Arguments

    @@ -127,12 +150,12 @@

    Details

    See also

    -

    Other bucket functions: gcs_create_bucket, +

    @@ -157,11 +180,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_list_buckets.html b/docs/reference/gcs_list_buckets.html index ca4978f..82da134 100644 --- a/docs/reference/gcs_list_buckets.html +++ b/docs/reference/gcs_list_buckets.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ List buckets — gcs_list_buckets • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +

    List the buckets your projectId has access to

    +
    -
    gcs_list_buckets(projectId, prefix = "", projection = c("noAcl", "full"),
    -  maxResults = 1000, detail = c("summary", "full"))
    +
    gcs_list_buckets(projectId, prefix = "", projection = c("noAcl",
    +  "full"), maxResults = 1000, detail = c("summary", "full"))
    -

    Arguments

    +

    Arguments

    @@ -139,19 +162,20 @@

    Value

    Details

    -

    Columns returned by detail are:

      +

      Columns returned by detail are:

      +
      • summary - name, storageClass, location ,updated

      • full - as above plus: id, selfLink, projectNumber, timeCreated, metageneration, etag

      See also

      -

      Other bucket functions: gcs_create_bucket, +

      Examples

      @@ -188,11 +212,13 @@

      Contents

      -

      Site built with pkgdown.

      +

      Site built with pkgdown 1.3.0.

      - + + + diff --git a/docs/reference/gcs_list_objects.html b/docs/reference/gcs_list_objects.html index bf418ab..9b509f7 100644 --- a/docs/reference/gcs_list_objects.html +++ b/docs/reference/gcs_list_objects.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ List objects in a bucket — gcs_list_objects • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,25 @@ -
      +
      +

      List objects in a bucket

      +
      -
      gcs_list_objects(bucket = gcs_get_global_bucket(), detail = c("summary",
      -  "more", "full"), prefix = NULL, delimiter = NULL)
      +
      gcs_list_objects(bucket = gcs_get_global_bucket(),
      +  detail = c("summary", "more", "full"), prefix = NULL,
      +  delimiter = NULL)
      -

      Arguments

      +

      Arguments

    @@ -135,7 +159,8 @@

    Value

    Details

    -

    Columns returned by detail are:

      +

      Columns returned by detail are:

      +
      • summary - name, size, updated

      • more - as above plus: bucket, contentType, storageClass, timeCreated

      • full - as above plus: id, selfLink, generation, metageneration, md5Hash, mediaLink, crc32c, etag

      • @@ -150,9 +175,11 @@

        Details

        See also

        -

        Other object functions: gcs_delete_object, +

        @@ -177,11 +204,13 @@

        Contents

        -

        Site built with pkgdown.

        +

        Site built with pkgdown 1.3.0.

        - + + + diff --git a/docs/reference/gcs_list_pubsub.html b/docs/reference/gcs_list_pubsub.html new file mode 100644 index 0000000..6d0c286 --- /dev/null +++ b/docs/reference/gcs_list_pubsub.html @@ -0,0 +1,187 @@ + + + + + + + + +List pub/sub notifications for a bucket — gcs_list_pubsub • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
        +
        + + + +
        + +
        +
        + + +
        + +

        List notification configurations for a bucket.

        + +
        + +
        gcs_list_pubsub(bucket = gcs_get_global_bucket())
        + +

        Arguments

        +
    + + + + + +
    bucket

    The bucket for notifications

    + +

    Details

    + +

Cloud Pub/Sub notifications allow you to track changes to your Cloud Storage objects. +As a minimum you will need: the Cloud Pub/Sub API activated for the project; +sufficient permissions on the bucket you wish to monitor; +sufficient permissions on the project to receive notifications; +an existing pub/sub topic; have given your service account at least pubsub.publisher permission.

    + +

    See also

    + + + + +
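A usage sketch with a placeholder bucket:

# inspect existing notification configurations
gcs_list_pubsub(bucket = "my-bucket")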
    + +
    + + +
    + + + + + + diff --git a/docs/reference/gcs_load.html b/docs/reference/gcs_load.html index 098587c..fd08546 100644 --- a/docs/reference/gcs_load.html +++ b/docs/reference/gcs_load.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Load .RData objects or sessions from the Google Cloud — gcs_load • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +

    Load R objects that have been saved using gcs_save or gcs_save_image

    +
    gcs_load(file = ".RData", bucket = gcs_get_global_bucket(),
       envir = .GlobalEnv, saveToDisk = file, overwrite = TRUE)
    -

    Arguments

    +

    Arguments

    @@ -146,9 +169,9 @@

    Details

    See also

    -

    Other R session data functions: gcs_save_all, +

    Other R session data functions: gcs_save_all, gcs_save_image, gcs_save, - gcs_source

    + gcs_source

    @@ -173,11 +196,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_metadata_object.html b/docs/reference/gcs_metadata_object.html index 2e218cd..586704c 100644 --- a/docs/reference/gcs_metadata_object.html +++ b/docs/reference/gcs_metadata_object.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Make metadata for an object — gcs_metadata_object • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,26 +114,59 @@ -
    +
    +

Use this to pass metadata to uploads in gcs_upload

    +
    -
    gcs_metadata_object(object_name = NULL, metadata = NULL, md5Hash = NULL,
    -  crc32c = NULL, contentLanguage = NULL, contentEncoding = NULL,
    -  contentDisposition = NULL, cacheControl = NULL)
    +
    gcs_metadata_object(object_name = NULL, metadata = NULL,
    +  md5Hash = NULL, crc32c = NULL, contentLanguage = NULL,
    +  contentEncoding = NULL, contentDisposition = NULL,
    +  cacheControl = NULL)
    -

    Arguments

    +

    Arguments

    - + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
    object_name

    Name of the object. GCS uses this version if also set elsewhere.

    Name of the object. GCS uses this version if also set elsewhere, or a gs:// URL

    metadata

    User-provided metadata, in key/value pairs

    md5Hash

    MD5 hash of the data; encoded using base64

    crc32c

    CRC32c checksum, as described in RFC 4960, Appendix B; encoded using base64 in big-endian byte order

    contentLanguage

    Content-Language of the object data

    contentEncoding

    Content-Encoding of the object data

    contentDisposition

    Content-Disposition of the object data

    cacheControl

    Cache-Control directive for the object data

    @@ -124,9 +176,11 @@

    Value

    See also

    -

    Other object functions: gcs_delete_object, +

    @@ -149,11 +203,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    -
    + + + diff --git a/docs/reference/gcs_parse_download.html b/docs/reference/gcs_parse_download.html index 33021b0..d94dfc1 100644 --- a/docs/reference/gcs_parse_download.html +++ b/docs/reference/gcs_parse_download.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Parse downloaded objects straight into R — gcs_parse_download • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,19 +114,23 @@ -
    +
    +
    -

    Wrapper for httr's content. This is the default function used in gcs_get_object

    +

    Wrapper for httr's content. This is the default function used in gcs_get_object

    +
    gcs_parse_download(object, encoding = "UTF-8")
    -

    Arguments

    +

    Arguments

    @@ -122,9 +145,9 @@

    Ar

    See also

    -

    gcs_get_object

    +

    gcs_get_object

    Other download functions: gcs_download_url, - gcs_signed_url

    + gcs_signed_url

    @@ -145,11 +168,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_retry_upload.html b/docs/reference/gcs_retry_upload.html index 11c95e1..efa6d72 100644 --- a/docs/reference/gcs_retry_upload.html +++ b/docs/reference/gcs_retry_upload.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,38 @@ Retry a resumeable upload — gcs_retry_upload • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +115,25 @@ -
    +
    +

Used internally in gcs_upload; you can also use this for failed uploads within one week of generating the upload URL

    +
    gcs_retry_upload(retry_object = NULL, upload_url = NULL, file = NULL,
       type = NULL)
    -

    Arguments

    +

    Arguments

    @@ -156,11 +180,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_save.html b/docs/reference/gcs_save.html index e3274c5..b6f1a9f 100644 --- a/docs/reference/gcs_save.html +++ b/docs/reference/gcs_save.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Save .RData objects to the Google Cloud — gcs_save • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +
    -

    Performs save then saves it to Google Cloud Storage.

    +

    Performs save then saves it to Google Cloud Storage.

    +
    gcs_save(..., file, bucket = gcs_get_global_bucket(),
    -  envir = parent.frame())
    + envir = parent.frame()) -

    Arguments

    +

    Arguments

    @@ -135,17 +158,17 @@

    Value

    Details

    -

    For all session data use gcs_save_image instead. - gcs_save(ob1, ob2, ob3, file = "mydata.RData") will save the objects specified to an .RData file then save it to Cloud Storage, to be loaded later using gcs_load.

    +

    For all session data use gcs_save_image instead.

    +

    gcs_save(ob1, ob2, ob3, file = "mydata.RData") will save the objects specified to an .RData file then save it to Cloud Storage, to be loaded later using gcs_load.

For any other use, it's better to use gcs_upload and gcs_get_object instead.

    Restore the R objects using gcs_load(bucket = "your_bucket")

    This will overwrite any data within your local environment with the same name.

    See also

    -

    Other R session data functions: gcs_load, +

    Other R session data functions: gcs_load, gcs_save_all, gcs_save_image, - gcs_source

    + gcs_source

    @@ -170,11 +193,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_save_all.html b/docs/reference/gcs_save_all.html index 6f4e80e..ac6bc30 100644 --- a/docs/reference/gcs_save_all.html +++ b/docs/reference/gcs_save_all.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Save/Load all files in directory to Google Cloud Storage — gcs_save_all • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,25 +114,29 @@ -
    +
    +

This function takes all the files in the directory, zips them, and saves, loads, or deletes them in the cloud. The upload name will be the directory name.

    +
    -
    gcs_save_all(directory = getwd(), bucket = gcs_get_global_bucket(),
    +    
    gcs_save_all(directory = getwd(), bucket = gcs_get_global_bucket(),
       pattern = "")
     
    -gcs_load_all(directory = getwd(), bucket = gcs_get_global_bucket(),
    +gcs_load_all(directory = getwd(), bucket = gcs_get_global_bucket(),
       exdir = directory, list = FALSE)
     
    -gcs_delete_all(directory = getwd(), bucket = gcs_get_global_bucket())
    +gcs_delete_all(directory = getwd(), bucket = gcs_get_global_bucket())
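A round-trip sketch under assumed directory and bucket names:

gcs_save_all(directory = "my-analysis", bucket = "my-bucket")

# later, perhaps on another machine
gcs_load_all(directory = "my-analysis", bucket = "my-bucket")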
    -

    Arguments

    +

    Arguments

    @@ -148,9 +171,9 @@

    Details

    See also

    -

    Other R session data functions: gcs_load, +

    Other R session data functions: gcs_load, gcs_save_image, gcs_save, - gcs_source

    + gcs_source

    @@ -175,11 +198,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_save_image.html b/docs/reference/gcs_save_image.html index 3798381..0550cb2 100644 --- a/docs/reference/gcs_save_image.html +++ b/docs/reference/gcs_save_image.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Save an R session to the Google Cloud — gcs_save_image • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,20 +114,24 @@ -
    +
    +
    -

    Performs save.image then saves it to Google Cloud Storage.

    +

    Performs save.image then saves it to Google Cloud Storage.

    +
    gcs_save_image(file = ".RData", bucket = gcs_get_global_bucket(),
    -  saveLocation = NULL, envir = parent.frame())
    + saveLocation = NULL, envir = parent.frame()) -

    Arguments

    +

    Arguments

    @@ -142,9 +165,9 @@

    Details

    See also

    -

    Other R session data functions: gcs_load, +

    Other R session data functions: gcs_load, gcs_save_all, gcs_save, - gcs_source

    + gcs_source

    @@ -169,11 +192,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_signed_url.html b/docs/reference/gcs_signed_url.html index 433825d..8919605 100644 --- a/docs/reference/gcs_signed_url.html +++ b/docs/reference/gcs_signed_url.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,38 @@ Create a signed URL — gcs_signed_url • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +115,25 @@ -
    +
    +

    This creates a signed URL which you can share with others who may or may not have a Google account. The object will be available until the specified timestamp.

    +
    -
    gcs_signed_url(meta_obj, expiration_ts = Sys.time() + 3600, verb = "GET",
    -  md5hash = NULL, includeContentType = FALSE)
    +
    gcs_signed_url(meta_obj, expiration_ts = Sys.time() + 3600,
    +  verb = "GET", md5hash = NULL, includeContentType = FALSE)
    -

    Arguments

    +

    Arguments

    @@ -118,7 +142,7 @@

    Ar

    - + @@ -140,9 +164,9 @@

    Details

    See also

    -

    https://cloud.google.com/storage/docs/access-control/signed-urls

    +

    Examples

    @@ -152,11 +176,11 @@

    Examp signed <- gcs_signed_url(obj) -temp <- tempfile() -on.exit(unlink(temp)) +temp <- tempfile() +on.exit(unlink(temp)) -download.file(signed, destfile = temp) -file.exists(temp) +download.file(signed, destfile = temp) +file.exists(temp) # }
    @@ -182,11 +206,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    - + + + diff --git a/docs/reference/gcs_source.html b/docs/reference/gcs_source.html index 17444d4..3c3c285 100644 --- a/docs/reference/gcs_source.html +++ b/docs/reference/gcs_source.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Source an R script from the Google Cloud — gcs_source • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,19 +114,23 @@ -
    +
    +
    -

    Download an R script and run it immediately via source

    +

    Download an R script and run it immediately via source

    +
    gcs_source(script, bucket = gcs_get_global_bucket(), ...)
    -

    Arguments

    +

    Arguments

    expiration_ts

    A timestamp of class "POSIXct" such as from Sys.time() or a numeric in seconds from Unix Epoch. Default is 60 mins.

    A timestamp of class "POSIXct" such as from Sys.time() or a numeric in seconds from Unix Epoch. Default is 60 mins.

    verb
    @@ -120,7 +143,7 @@

    Ar

    - +
    ...

    Passed to source

    Passed to source

    @@ -130,9 +153,9 @@

    Value

    See also

    -

    Other R session data functions: gcs_load, +

    Other R session data functions: gcs_load, gcs_save_all, gcs_save_image, - gcs_save

    + gcs_save

    @@ -155,11 +178,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    -
    + + + diff --git a/docs/reference/gcs_update_object_acl.html b/docs/reference/gcs_update_object_acl.html index 76292a1..bb53320 100644 --- a/docs/reference/gcs_update_object_acl.html +++ b/docs/reference/gcs_update_object_acl.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Change access to an object in a bucket — gcs_update_object_acl • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,21 +114,25 @@ -
    +
    +

    Updates Google Cloud Storage ObjectAccessControls

    +
    gcs_update_object_acl(object_name, bucket = gcs_get_global_bucket(),
    -  entity = "", entity_type = c("user", "group", "domain", "project",
    -  "allUsers", "allAuthenticatedUsers"), role = c("READER", "OWNER"))
    + entity = "", entity_type = c("user", "group", "domain", "project", + "allUsers", "allAuthenticatedUsers"), role = c("READER", "OWNER")) -

    Arguments

    +

    Arguments

    @@ -140,13 +163,15 @@

    Value

    Details

    -

    An entity is an identifier for the entity_type.

      +

      An entity is an identifier for the entity_type.

      +
      • entity="user" may have userId or email

      • entity="group" may have groupId or email

      • entity="domain" may have domain

      • entity="project" may have team-projectId

      -

      For example:

        +

        For example:

        +
        • entity="user" could be jane@doe.com

        • entity="group" could be example@googlegroups.com

        • entity="domain" could be example.com which is a Google Apps for Business domain.

        • @@ -154,10 +179,10 @@

          Details

          See also

          -

          objectAccessControls on Google API reference
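A sketch using the illustrative entities from the lists above; the object and bucket names are placeholders:

gcs_update_object_acl("mtcars.csv", bucket = "my-bucket",
                      entity = "jane@doe.com", entity_type = "user",
                      role = "READER")

# make the object readable by everyone
gcs_update_object_acl("mtcars.csv", bucket = "my-bucket",
                      entity_type = "allUsers", role = "READER")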

          + @@ -182,11 +207,13 @@

          Contents

          -

          Site built with pkgdown.

          +

          Site built with pkgdown 1.3.0.

          - + + + diff --git a/docs/reference/gcs_upload.html b/docs/reference/gcs_upload.html index 015e850..b04beec 100644 --- a/docs/reference/gcs_upload.html +++ b/docs/reference/gcs_upload.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ Upload a file of arbitrary type — gcs_upload • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,23 +114,28 @@ -
          +
          +

          Upload up to 5TB

          +
          gcs_upload(file, bucket = gcs_get_global_bucket(), type = NULL,
          -  name = deparse(substitute(file)), object_function = NULL,
          -  object_metadata = NULL, predefinedAcl = c("private", "authenticatedRead",
          -  "bucketOwnerFullControl", "bucketOwnerRead", "projectPrivate", "publicRead", "default"),
          -  upload_type = c("simple", "resumable"))
          + name = deparse(substitute(file)), object_function = NULL, + object_metadata = NULL, predefinedAcl = c("private", + "authenticatedRead", "bucketOwnerFullControl", "bucketOwnerRead", + "projectPrivate", "publicRead", "default"), upload_type = c("simple", + "resumable")) -

          Arguments

          +

          Arguments

    @@ -161,9 +185,10 @@

    Details

    By default the upload_type will be 'simple' if under 5MB, 'resumable' if over 5MB. 'Multipart' upload is used if you provide a object_metadata.

    If object_function is NULL and file is not a character filepath, - the defaults are:

      -
    • file's class is data.frame - write.csv

    • -
    • file's class is list - toJSON

    • + the defaults are:

      +

      If object_function is not NULL and file is not a character filepath, then object_function will be applied to the R object specified @@ -197,12 +222,12 @@

      Examp ## when looping, its best to specify the name else it will take ## the deparsed function call e.g. X[[i]] -my_files <- list.files("my_uploads") -lapply(my_files, function(x) gcs_upload(x, name = x)) +my_files <- list.files("my_uploads") +lapply(my_files, function(x) gcs_upload(x, name = x)) ## you can supply your own function to transform R objects before upload f <- function(input, output){ - write.csv2(input, file = output) + write.csv2(input, file = output) } gcs_upload(mtcars, name = "mtcars_csv2.csv", object_function = f) @@ -235,11 +260,13 @@

      Contents

      -

      Site built with pkgdown.

      +

      Site built with pkgdown 1.3.0.

      - + + + diff --git a/docs/reference/gcs_version_bucket.html b/docs/reference/gcs_version_bucket.html new file mode 100644 index 0000000..8ee7906 --- /dev/null +++ b/docs/reference/gcs_version_bucket.html @@ -0,0 +1,184 @@ + + + + + + + + +Turn bucket versioning on or off, check status (default), or +list archived versions of objects in the bucket and view their generation numbers. — gcs_version_bucket • googleCloudStorageR + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
      +
      + + + +
      + +
      +
      + + +
      + +

      Turn bucket versioning on or off, check status (default), or +list archived versions of objects in the bucket and view their generation numbers.

      + +
      + +
      gcs_version_bucket(bucket, action = c("status", "enable", "disable",
      +  "list"))
      + +

      Arguments

      +

    + + + + + + + + + +
    bucket

    gcs bucket

    action

    "status", "enable", "disable", or "list"

    + +

    Value

    + +

versioned_objects dataframe (only if using the "list" action)

    + + +
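No example ships with this page, so a usage sketch; the bucket name is a placeholder:

gcs_version_bucket("my-bucket", action = "enable")
gcs_version_bucket("my-bucket")                     # default: "status"
archived <- gcs_version_bucket("my-bucket", action = "list")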
    + +
    + + +
    + + + + + + diff --git a/docs/reference/googleCloudStorageR.html b/docs/reference/googleCloudStorageR.html index 3a7aec9..6cac190 100644 --- a/docs/reference/googleCloudStorageR.html +++ b/docs/reference/googleCloudStorageR.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,37 @@ googleCloudStorageR — googleCloudStorageR • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + + + + - + + @@ -95,15 +114,19 @@ -
    +
    +

    Interact with Google Cloud Storage API in R. Part of the 'cloudyr' project.

    +
    @@ -122,11 +145,13 @@

    Contents

    -

    Site built with pkgdown.

    +

    Site built with pkgdown 1.3.0.

    -
    + + + diff --git a/docs/reference/index.html b/docs/reference/index.html index 06b2735..993ff61 100644 --- a/docs/reference/index.html +++ b/docs/reference/index.html @@ -1,6 +1,6 @@ - + @@ -9,24 +9,34 @@ Function reference • googleCloudStorageR - + - + - + - + + + + + + - - - + + + + + + + - + +
    @@ -95,184 +111,199 @@ -
    -
    +
    +
    -
    - +
    - - - - + + + + + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -

    Authentication

    -

    -
    -

    gcs_auth

    -

    Authenticate this session

    -

    Buckets

    -

    Working with Google Cloud Storage buckets

    -
    -

    gcs_create_bucket_acl

    -

    Create a Bucket Access Controls

    -

    gcs_create_bucket

    -

    Create a new bucket

    -

    gcs_delete_bucket

    -

    Delete a bucket

    -

    gcs_get_bucket_acl

    -

    Get Bucket Access Controls

    -

    gcs_get_bucket

    -

    Get bucket info

    -

    gcs_get_global_bucket

    -

    Get global bucket name

    -

    gcs_global_bucket

    -

    Set global bucket name

    -

    gcs_list_buckets

    -

    List buckets

    -

    Objects

    -

    Working with objects inside the buckets

    -
    -

    gcs_delete_object

    -

    Delete an object

    -

    gcs_get_object_acl

    -

    Check the access control settings for an object for one entity

    -

    gcs_get_object

    -

    Get an object in a bucket directly

    -

    gcs_list_objects

    -

    List objects in a bucket

    -

    gcs_metadata_object

    -

    Make metadata for an object

    -

    gcs_update_object_acl

    -

    Change access to an object in a bucket

    -

    Session helpers

    -

    Working with R session objects

    -
    -

    gcs_first gcs_last

    -

    Save your R session to the cloud on startup/exit

    -

    gcs_load

    -

    Load .RData objects or sessions from the Google Cloud

    -

    gcs_save

    -

    Save .RData objects to the Google Cloud

    -

    gcs_save_all gcs_load_all gcs_delete_all

    -

    Save/Load all files in directory to Google Cloud Storage

    -

    gcs_save_image

    -

    Save an R session to the Google Cloud

    -

    gcs_source

    -

    Source an R script from the Google Cloud

    -
    + + + +

[Added: the regenerated table, now showing call syntax with parentheses and three new functions:]
+    Authentication
+      gcs_auth(): Authenticate with Google Cloud Storage API
+    Buckets: Working with Google Cloud Storage buckets
+      gcs_create_bucket(): Create a new bucket
+      gcs_create_bucket_acl(): Create a Bucket Access Controls
+      gcs_delete_bucket(): Delete a bucket
+      gcs_get_bucket(): Get bucket info
+      gcs_get_bucket_acl(): Get Bucket Access Controls
+      gcs_get_global_bucket(): Get global bucket name
+      gcs_global_bucket(): Set global bucket name
+      gcs_list_buckets(): List buckets
+      gcs_version_bucket(): Turn bucket versioning on or off, check status (default), or list archived versions of objects in the bucket and view their generation numbers.
+    Objects: Working with objects inside the buckets
+      gcs_compose_objects(): Compose up to 32 objects into one
+      gcs_copy_object(): Copy an object
+      gcs_delete_object(): Delete an object
+      gcs_get_object(): Get an object in a bucket directly
+      gcs_get_object_acl(): Check the access control settings for an object for one entity
+      gcs_list_objects(): List objects in a bucket
+      gcs_metadata_object(): Make metadata for an object
+      gcs_update_object_acl(): Change access to an object in a bucket
+    Session helpers: Working with R session objects
+      gcs_first() gcs_last(): Save your R session to the cloud on startup/exit
+      gcs_load(): Load .RData objects or sessions from the Google Cloud
+      gcs_save(): Save .RData objects to the Google Cloud
+      gcs_save_all() gcs_load_all() gcs_delete_all(): Save/Load all files in directory to Google Cloud Storage
+      gcs_save_image(): Save an R session to the Google Cloud
+      gcs_source(): Source an R script from the Google Cloud
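The Session helpers listed above are typically wired into a project's startup files. A minimal sketch of how gcs_first()/gcs_last() might be hooked up in a project .Rprofile; the exact wiring is an assumption based on their shared reference page, so confirm with ?gcs_first before relying on it:

    # Sketch of a project .Rprofile (assumed wiring; see ?gcs_first)
    .First <- function() {
      # Restore a previously saved workspace for this project, if one exists
      googleCloudStorageR::gcs_first()
    }

    .Last <- function() {
      # Save the workspace back to the cloud before R exits
      googleCloudStorageR::gcs_last()
    }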
[Same file, page footer:]
-    Site built with pkgdown.
+    Site built with pkgdown 1.3.0.
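The regenerated index documents the functions new in this changeset: gcs_compose_objects(), gcs_copy_object(), and gcs_version_bucket(). A hedged sketch of how the three combine; the bucket and object names are hypothetical and the argument names are assumptions, so verify the signatures against ?gcs_compose_objects and friends:

    library(googleCloudStorageR)

    gcs_global_bucket("my-example-bucket")   # hypothetical bucket name

    # Merge up to 32 source objects into a single destination object
    gcs_compose_objects(objects = c("part-1.csv", "part-2.csv"),
                        destination = "combined.csv")

    # Copy the result elsewhere in the same bucket
    gcs_copy_object(source_object = "combined.csv",
                    destination_object = "backup/combined.csv")

    # Check versioning status (the default action), then enable it
    gcs_version_bucket("my-example-bucket")
    gcs_version_bucket("my-example-bucket", action = "enable")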
diff --git a/docs/sitemap.xml b/docs/sitemap.xml
new file mode 100644
index 0000000..ea48ad8
--- /dev/null
+++ b/docs/sitemap.xml
@@ -0,0 +1,117 @@
[New sitemap (XML tags stripped in extraction): one <url><loc> entry per page under https://cloudyr.github.io/googleCloudStorageR//, covering index.html, the googleCloudStorageR vignette, and every reference page from Object.html and gcs_auth.html through gcs_version_bucket.html and googleCloudStorageR.html, including the new gcs_compose_objects.html and gcs_copy_object.html.]
diff --git a/man/gcs_compose_objects.Rd b/man/gcs_compose_objects.Rd
index 0d75933..d4929b4 100644
--- a/man/gcs_compose_objects.Rd
+++ b/man/gcs_compose_objects.Rd
@@ -33,7 +33,7 @@ This merges
 objects stored on Cloud Storage into one object.
 }
 }
 \seealso{
-\href{Compose objects}{https://cloud.google.com/storage/docs/json_api/v1/objects/compose}
+\href{https://cloud.google.com/storage/docs/json_api/v1/objects/compose}{Compose objects}

 Other object functions: \code{\link{gcs_copy_object}},
 \code{\link{gcs_delete_object}},
diff --git a/tests/testthat.R b/tests/testthat.R
new file mode 100644
index 0000000..664b788
--- /dev/null
+++ b/tests/testthat.R
@@ -0,0 +1,4 @@
+library(testthat)
+library(googleCloudStorageR)
+
+test_check("googleCloudStorageR")
diff --git a/tests/testthat/test-io.R b/tests/testthat/test-io.R
new file mode 100644
index 0000000..56188c9
--- /dev/null
+++ b/tests/testthat/test-io.R
@@ -0,0 +1,7 @@
+context("Authentication")
+
+
+
+test_that("Authentication", {
+  expect_true(is.function(gcs_auth))
+})
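The added file only smoke-tests that the auth function exists. A slightly fuller, still-offline sketch of what tests/testthat/test-io.R could grow into; the GCS_DEFAULT_BUCKET environment variable is hypothetical, used here only to gate an optional integration check:

    library(testthat)
    library(googleCloudStorageR)

    context("Authentication")

    test_that("gcs_auth is exported and callable", {
      # is.function() keeps the check offline; no OAuth flow is triggered
      expect_true(is.function(gcs_auth))
    })

    test_that("the global bucket setting round-trips", {
      skip_on_cran()
      # Hypothetical opt-in variable for integration runs
      skip_if_not(nzchar(Sys.getenv("GCS_DEFAULT_BUCKET")))
      bucket <- Sys.getenv("GCS_DEFAULT_BUCKET")
      gcs_global_bucket(bucket)
      expect_equal(gcs_get_global_bucket(), bucket)
    })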