fix(GC Storage Nodejs Client): Updated imports and client initialization for GC Storage #213

Closed
wants to merge 1 commit into from
3 changes: 3 additions & 0 deletions .env-sample
@@ -9,3 +9,6 @@ SLACKBOT_S3_BUCKET=my-bucket
SLACKBOT_S3_BUCKET_REGION=us-east-1
AWS_ACCESS_KEY_ID=ABCDEFGHJKL
AWS_SECRET_ACCESS_KEY=abcdefghjkl
+GOOGLE_CLOUD_BUCKET=my-bucket-name
+GOOGLE_CLOUD_PROJECT=my-project-id
+GOOGLE_CLOUD_CREDENTIALS_JSON=<path to .json (credential) file>
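
As the sample value suggests, `GOOGLE_CLOUD_CREDENTIALS_JSON` is expected to hold a path to a service-account key file rather than the raw JSON contents, because it is handed to the client's `keyFilename` option. A minimal sketch of how that value is consumed (not the exact Lookerbot code):

```ts
// Sketch only — assumes the v2 @google-cloud/storage client.
const { Storage } = require("@google-cloud/storage")

const storage = new Storage({
  // The env var points at the key file on disk, so it maps
  // directly onto the client's keyFilename option.
  keyFilename: process.env.GOOGLE_CLOUD_CREDENTIALS_JSON,
})
```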
2 changes: 2 additions & 0 deletions README.md
@@ -120,6 +120,8 @@ There are a couple environment variables that can be used to tweak behavior:

###### Google Cloud Storage

+Requires that the Cloud Storage bucket is not blocking public access, that `Access Control` is set to `Fine-Grained`, and that `Object-level ACLs enabled` is turned on.
+
- `GOOGLE_CLOUD_BUCKET` (optional) - If you want to use Google Cloud to store visualization images posted by Lookerbot, provide the name of your bucket.

If Lookerbot is running on Google Compute Engine, [no further information should be needed if the appropriate API scopes are set up](https://github.com/GoogleCloudPlatform/google-cloud-node#on-google-cloud-platform).
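
As a rough illustration of that fallback with the v2 client (not Lookerbot's exact code), the key file only needs to be supplied when the environment variable is set; otherwise the client can pick up the platform's default credentials:

```ts
// Illustrative sketch, assuming the v2 @google-cloud/storage client.
const { Storage } = require("@google-cloud/storage")

// On Google Compute Engine the client can authenticate with the instance's
// default credentials, so keyFilename is only passed when the env var is set.
const storage = process.env.GOOGLE_CLOUD_CREDENTIALS_JSON
  ? new Storage({ keyFilename: process.env.GOOGLE_CLOUD_CREDENTIALS_JSON })
  : new Storage()
```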
10 changes: 6 additions & 4 deletions src/stores/google_cloud_store.ts
@@ -2,7 +2,8 @@ import * as fs from "fs"
import { ReadableStreamBuffer } from "stream-buffers"
import { Store } from "./store"

-const gcs = require("@google-cloud/storage")
+// post v2.*, the Node.js Cloud Storage client requires the named import below
+const {Storage} = require("@google-cloud/storage")

export class GoogleCloudStore extends Store {

@@ -16,9 +17,10 @@ export class GoogleCloudStore extends Store {
blobStream.put(buffer)
blobStream.stop()

-const storage = gcs({
-  credentials: process.env.GOOGLE_CLOUD_CREDENTIALS_JSON ? JSON.parse(process.env.GOOGLE_CLOUD_CREDENTIALS_JSON) : undefined,
-  projectId: process.env.GOOGLE_CLOUD_PROJECT,
+// updated for the v2.* Node.js client changes
+const storage = new Storage({
+  // when keyFilename is supplied, projectId is no longer required
+  keyFilename: process.env.GOOGLE_CLOUD_CREDENTIALS_JSON
})

const bucketName = process.env.GOOGLE_CLOUD_BUCKET
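
For readers following the diff, here is a hedged sketch of how the rest of the store method could put the v2 client to use — the function name, object key, and return value below are illustrative assumptions, not the code hidden by the truncated hunk:

```ts
const { Storage } = require("@google-cloud/storage")

// Hypothetical helper showing the v2 upload flow end to end.
async function uploadImage(buffer: Buffer, key: string): Promise<string> {
  const storage = new Storage({
    keyFilename: process.env.GOOGLE_CLOUD_CREDENTIALS_JSON,
  })
  const bucketName = process.env.GOOGLE_CLOUD_BUCKET!
  const file = storage.bucket(bucketName).file(key)

  // File#save accepts a Buffer directly, avoiding manual stream plumbing.
  await file.save(buffer, { resumable: false })

  // Requires fine-grained (object-level) ACLs on the bucket, as noted in the README change.
  await file.makePublic()
  return `https://storage.googleapis.com/${bucketName}/${key}`
}
```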