2023-10-28 | MAIN --> PROD | DEV (1daa2ed) --> STAGING #2644

Merged 4 commits on Oct 28, 2023
2 changes: 1 addition & 1 deletion .github/workflows/add-bpmn-renders.yml
@@ -23,7 +23,7 @@ jobs:
        uses: actions/checkout@v4

      - name: Setup Node
-       uses: actions/setup-node@v3
+       uses: actions/setup-node@v4

      - name: Install bpmn-to-image
        run: npm install -g bpmn-to-image
101 changes: 101 additions & 0 deletions .github/workflows/database-backups.yml
@@ -0,0 +1,101 @@
---
name: Perform Media and Database Backups
on:
  workflow_dispatch:
    inputs:
      environment:
        required: true
        type: string

jobs:
  backup-media:
    if: ${{ inputs.environment == 'dev' }}
    name: Perform Media Backups
    runs-on: ubuntu-latest
    environment: ${{ inputs.environment }}
    env:
      space: ${{ inputs.environment }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Unbind the private s3 bucket
        uses: cloud-gov/cg-cli-tools@main
        with:
          cf_username: ${{ secrets.CF_USERNAME }}
          cf_password: ${{ secrets.CF_PASSWORD }}
          cf_org: gsa-tts-oros-fac
          cf_space: ${{ env.space }}
          command: cf unbind-service gsa-fac fac-private-s3

      - name: Rebind the private s3 bucket with backups bucket as an additional instance
        uses: cloud-gov/cg-cli-tools@main
        with:
          cf_username: ${{ secrets.CF_USERNAME }}
          cf_password: ${{ secrets.CF_PASSWORD }}
          cf_org: gsa-tts-oros-fac
          cf_space: ${{ env.space }}
          command: |
            cf bind-service gsa-fac fac-private-s3 -c '{"additional_instances": ["backups"]}'

      - name: Restart the app
        uses: cloud-gov/cg-cli-tools@main
        with:
          cf_username: ${{ secrets.CF_USERNAME }}
          cf_password: ${{ secrets.CF_PASSWORD }}
          cf_org: gsa-tts-oros-fac
          cf_space: ${{ env.space }}
          command: cf restart gsa-fac

      - name: Backup media files
        uses: cloud-gov/cg-cli-tools@main
        with:
          cf_username: ${{ secrets.CF_USERNAME }}
          cf_password: ${{ secrets.CF_PASSWORD }}
          cf_org: gsa-tts-oros-fac
          cf_space: ${{ env.space }}
          command: cf run-task gsa-fac -k 2G -m 2G --name media_backup --command "./s3-sync.sh"

  backup-dev-database:
    if: ${{ inputs.environment == 'dev' }}
    name: Perform Dev Database Backups
    runs-on: ubuntu-latest
    environment: ${{ inputs.environment }}
    env:
      space: ${{ inputs.environment }}
    steps:
      - name: Backup Dev Database
        uses: cloud-gov/cg-cli-tools@main
        with:
          cf_username: ${{ secrets.CF_USERNAME }}
          cf_password: ${{ secrets.CF_PASSWORD }}
          cf_org: gsa-tts-oros-fac
          cf_space: ${{ env.space }}
          command: cf run-task gsa-fac -k 2G -m 2G --name pg_backup --command "./backup_database.sh ${{ env.space }}"

  # backup-prod-database:
  #   if: ${{ inputs.environment == 'production' }}
  #   name: Perform Prod Database Backups
  #   runs-on: ubuntu-latest
  #   environment: ${{ inputs.environment }}
  #   env:
  #     space: ${{ inputs.environment }}
  #   steps:
  #     - name: Bind backup s3 bucket to prod app
  #       uses: cloud-gov/cg-cli-tools@main
  #       with:
  #         cf_username: ${{ secrets.CF_USERNAME }}
  #         cf_password: ${{ secrets.CF_PASSWORD }}
  #         cf_org: gsa-tts-oros-fac
  #         cf_space: ${{ env.space }}
  #         command: cf bind-service gsa-fac backups -w

  #     - name: Backup the database (Prod Only)
  #       uses: cloud-gov/cg-cli-tools@main
  #       with:
  #         cf_username: ${{ secrets.CF_USERNAME }}
  #         cf_password: ${{ secrets.CF_PASSWORD }}
  #         cf_org: gsa-tts-oros-fac
  #         cf_space: ${{ env.space }}
  #         command: cf run-task gsa-fac -k 2G -m 2G --name pg_backup --command "./backup_database.sh ${{ env.space }}"
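Since this workflow only runs on workflow_dispatch, it has to be kicked off by hand. A minimal sketch of dispatching it from the GitHub CLI, assuming gh is authenticated against this repository (the file name and the environment input come from the workflow above):

# Trigger a dev media and database backup run (manual dispatch)
gh workflow run database-backups.yml -f environment=dev

# Confirm the run was queued
gh run list --workflow=database-backups.yml --limit 1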

2 changes: 1 addition & 1 deletion .github/workflows/end-to-end-test.yml
@@ -20,7 +20,7 @@ jobs:
      - name: Checkout
        uses: actions/checkout@v4

-     - uses: actions/setup-node@v3
+     - uses: actions/setup-node@v4
        with:
          node-version: 18

2 changes: 1 addition & 1 deletion .github/workflows/linting.yml
@@ -83,7 +83,7 @@ jobs:
            fac-build-npm-
            fac-build-

-     - uses: actions/setup-node@v3
+     - uses: actions/setup-node@v4
        with:
          node-version: 18

4 changes: 2 additions & 2 deletions .github/workflows/testing-from-build.yml
@@ -21,7 +21,7 @@ jobs:
      - name: Checkout
        uses: actions/checkout@v4

-     - uses: actions/setup-node@v3
+     - uses: actions/setup-node@v4
        with:
          node-version: 18

@@ -70,7 +70,7 @@ jobs:
  #       DISABLE_AUTH: True
  #   steps:
  #     - uses: actions/checkout@v4
- #     - uses: actions/setup-node@v3
+ #     - uses: actions/setup-node@v4
  #       with:
  #         node-version: 18
  #     - name: Start services
4 changes: 2 additions & 2 deletions .github/workflows/testing-from-ghcr.yml
@@ -22,7 +22,7 @@ jobs:
      - name: Checkout
        uses: actions/checkout@v4

-     - uses: actions/setup-node@v3
+     - uses: actions/setup-node@v4
        with:
          node-version: 18

@@ -72,7 +72,7 @@ jobs:
  #       DISABLE_AUTH: True
  #   steps:
  #     - uses: actions/checkout@v4
- #     - uses: actions/setup-node@v3
+ #     - uses: actions/setup-node@v4
  #       with:
  #         node-version: 16
  #     - name: Start services
4 changes: 2 additions & 2 deletions .github/workflows/trivy.yml
@@ -39,7 +39,7 @@ jobs:
        run: docker build -t ${{ env.DOCKER_NAME }}:${{ steps.date.outputs.date }} .

      - name: Run Trivy vulnerability scanner
-       uses: aquasecurity/trivy-action@0.12.0
+       uses: aquasecurity/trivy-action@0.13.0
        with:
          image-ref: '${{ env.DOCKER_NAME }}:${{ steps.date.outputs.date }}'
          scan-type: 'image'
@@ -74,7 +74,7 @@ jobs:
        run: docker pull ${{ matrix.image.name }}

      - name: Run Trivy vulnerability scanner on Third Party Images
-       uses: aquasecurity/trivy-action@0.12.0
+       uses: aquasecurity/trivy-action@0.13.0
        with:
          image-ref: '${{ matrix.image.name }}'
          scan-type: 'image'
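The version bump only changes the action pin; the scan itself is unchanged. For a rough local equivalent of what the action runs against a freshly built image, a hedged sketch with the trivy CLI (the image tag is illustrative):

# Scan a locally built image for vulnerabilities, roughly mirroring the workflow's image scan
trivy image --severity CRITICAL,HIGH gsa-fac:2023-10-28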
1 change: 0 additions & 1 deletion backend/backup_database.sh
@@ -4,4 +4,3 @@ echo "Environment set as: $1"
export PATH=/home/vcap/deps/0/apt/usr/lib/postgresql/15/bin:$PATH
date=$(date '+%Y-%m-%d-%H%M')
python manage.py dbbackup -o "$1-db-backup-$date.psql.bin"
- python manage.py mediabackup -o "$1-media-backup-$date.tar"
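With the mediabackup call removed (media now goes through s3-sync.sh below), this script only handles the database. A sketch of what a dev invocation produces, based on the lines above (the timestamp is illustrative):

./backup_database.sh dev
# Environment set as: dev
# -> writes dev-db-backup-2023-10-28-1200.psql.bin via the dbbackup management command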
50 changes: 50 additions & 0 deletions backend/s3-sync.sh
@@ -0,0 +1,50 @@
#!/bin/bash

# This requires: cf bind-service gsa-fac fac-private-s3 -c '{"additional_instances": ["backups"]}'

# Grab AWS cli
unset https_proxy
curl -L "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip && rm awscliv2.zip
./aws/install -i ~/usr -b ~/bin
/home/vcap/app/bin/aws --version

# Get the fac-private-s3 bucket
export S3CREDS="$(echo $VCAP_SERVICES|jq -r '.s3')"
export FACPRIVS3="$(echo $S3CREDS|jq '.[]|select(.name=="fac-private-s3")'|jq '.credentials')"
export AWS_ACCESS_KEY_ID="$(echo "$FACPRIVS3"|jq -r '.access_key_id')"
export AWS_SECRET_ACCESS_KEY="$(echo "$FACPRIVS3"|jq -r '.secret_access_key')"
export FAC_MEDIA_BUCKET="$(echo "$FACPRIVS3"|jq -r '.bucket')"
export AWS_DEFAULT_REGION='us-gov-west-1'

# Get the backups bucket
export FACBACKUPS="$(echo $S3CREDS|jq '.[]|select(.name=="backups")'|jq '.credentials')"
export BACKUPS_BUCKET="$(echo "$FACBACKUPS"|jq -r '.bucket')"

date=$(date +%Y%m%d%H%M)

# Grab the s3 tar binary
curl -L "https://github.com/awslabs/amazon-s3-tar-tool/releases/download/v1.0.14/s3tar-linux-amd64.zip" -o "s3tar-linux-amd64.zip"
unzip s3tar-linux-amd64.zip && rm s3tar-linux-amd64.zip

# Create a single tar in the source bucket
./s3tar-linux-amd64 --region $AWS_DEFAULT_REGION -cvf s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/archive.tar s3://${FAC_MEDIA_BUCKET} --storage-class INTELLIGENT_TIERING

# List contents of source bucket
/home/vcap/app/bin/aws s3 ls s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/

# Move the tar to the backups bucket
/home/vcap/app/bin/aws s3 sync s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/ s3://${BACKUPS_BUCKET}/mediabackups/$date/ --storage-class INTELLIGENT_TIERING
# Share the Tar to dest and extract (without including the tar)
#./s3tar-linux-amd64 --region $AWS_DEFAULT_REGION -cvf s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/archive.tar -C s3://${BACKUPS_BUCKET}/mediabackups/$date/ --storage-class INTELLIGENT_TIERING

# List contents of destination bucket
/home/vcap/app/bin/aws s3 ls s3://${BACKUPS_BUCKET}/mediabackups/$date/

# Cleanup the source bucket so older backups don't get added to the tar
/home/vcap/app/bin/aws s3 rm s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/archive.tar
/home/vcap/app/bin/aws s3 rm s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/
/home/vcap/app/bin/aws s3 rm s3://${FAC_MEDIA_BUCKET}/mediabackups/

# List contents of source bucket to ensure everything was deleted properly
/home/vcap/app/bin/aws s3 ls s3://${FAC_MEDIA_BUCKET}/mediabackups/$date/
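For completeness, a hedged sketch of pulling media files back out of one of these archives with the same s3tar binary, assuming its documented extract flags (-x into a -C destination); the date prefix is illustrative:

# Extract a previous archive from the backups bucket back into the media bucket
./s3tar-linux-amd64 --region $AWS_DEFAULT_REGION -xvf s3://${BACKUPS_BUCKET}/mediabackups/202310281200/archive.tar -C s3://${FAC_MEDIA_BUCKET}/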
5 changes: 0 additions & 5 deletions tools/workbook-generator/.gitignore

This file was deleted.

73 changes: 0 additions & 73 deletions tools/workbook-generator/README.md

This file was deleted.
