Add spiffworkflow module #58

Open
wants to merge 30 commits into base: main
Commits
f2060c1 - SpiffWorkflow module (mogul, Dec 8, 2024)
6b9aaff - Minimal testing module (mogul, Dec 8, 2024)
16b0b7c - Make sure a changed hash for an imageref label results in a new deploy (mogul, Dec 9, 2024)
0752ee5 - terraform fmt (mogul, Dec 9, 2024)
0079b2e - Avoid default route collisions (mogul, Dec 9, 2024)
d4c67ae - Add workflow to publish to GHCR (asteel-gsa, Mar 3, 2025)
454538b - Build the Trivy DB Cache (asteel-gsa, Mar 3, 2025)
85fbff7 - Run a scan (asteel-gsa, Mar 3, 2025)
4a1a1e5 - Add optional params to Trivy (asteel-gsa, Mar 3, 2025)
2760480 - Publish findings to GH Security Tab (asteel-gsa, Mar 3, 2025)
ca8377f - Publish images (asteel-gsa, Mar 3, 2025)
08eac17 - Disable on push event (asteel-gsa, Mar 3, 2025)
9e21a0d - Supply default ClamAV Image (asteel-gsa, Mar 3, 2025)
f9d9ec0 - Change default image (asteel-gsa, Mar 3, 2025)
f671b41 - terraform fmt (asteel-gsa, Mar 3, 2025)
3953d28 - Add tests (asteel-gsa, Mar 4, 2025)
0e895e8 - Disable default, force a ssh key (asteel-gsa, Mar 4, 2025)
cdf37ed - Whitespace (asteel-gsa, Mar 4, 2025)
e447326 - Disable database creation for tests (asteel-gsa, Mar 4, 2025)
b9c4f73 - Add spiff to test matrix (asteel-gsa, Mar 4, 2025)
f314714 - Add database service instance id as a variable (asteel-gsa, Mar 4, 2025)
dc7b332 - Pass the DB as a dependency (asteel-gsa, Mar 5, 2025)
dfe4d3c - Readme Update (asteel-gsa, Mar 5, 2025)
830b581 - Cleanup (asteel-gsa, Mar 5, 2025)
a232c44 - Fix Tests (asteel-gsa, Mar 5, 2025)
e0a7dd0 - Update Readme (asteel-gsa, Mar 5, 2025)
ebf5247 - Enable reporting scans to GH security tab (asteel-gsa, Mar 5, 2025)
09a6ec6 - Adds sha256 test (asteel-gsa, Mar 5, 2025)
cfb9eeb - Process Models are working again (asteel-gsa, Mar 6, 2025)
c5d8cb1 - Update Readme (asteel-gsa, Mar 6, 2025)
82 changes: 82 additions & 0 deletions .github/workflows/pull-and-publish-images.yml
@@ -0,0 +1,82 @@
---
name: Pull Third Party Containers, Scan, and Publish to GHCR
on:
  workflow_dispatch:
  schedule:
    - cron: '0 5 * * 0'

jobs:
  pull-and-scan:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    env:
      GH_REPO: gsa-tts/terraform-cloudgov
    strategy:
      fail-fast: false
      matrix:
        image:
          - name: ghcr.io/gsa-tts/spiffworkflow-backend:deploy-to-cloud-gov-latest
            short-name: spiffarena-backend
          - name: ghcr.io/gsa-tts/spiffworkflow-frontend:deploy-to-cloud-gov-latest
            short-name: spiffarena-frontend
          - name: ghcr.io/gsa-tts/connector-proxy-demo:deploy-to-cloud-gov-latest
            short-name: spiffarena-connector
          - name: ghcr.io/gsa-tts/clamav-rest/clamav:latest
            short-name: clamav-proxy-support

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v3

      - name: Pull Docker Image
        run: docker pull ${{ matrix.image.name }}

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/[email protected]
        env:
          TRIVY_DB_REPOSITORY: public.ecr.aws/aquasecurity/trivy-db,ghcr.io/aquasecurity/trivy-db
          TRIVY_JAVA_DB_REPOSITORY: public.ecr.aws/aquasecurity/trivy-java-db,ghcr.io/aquasecurity/trivy-java-db
          TRIVY_SKIP_DB_UPDATE: true
          TRIVY_SKIP_JAVA_DB_UPDATE: true
          TRIVY_DISABLE_VEX_NOTICE: true
        with:
          image-ref: '${{ matrix.image.name }}'
          scan-type: 'image'
          hide-progress: true
          format: 'sarif'
          output: 'trivy-results.sarif'
          exit-code: 0
          severity: 'CRITICAL,HIGH'
          scanners: 'vuln'
          timeout: 15m0s
          ignore-unfixed: true

      - name: Upload Trivy scan results to GitHub Security tab for Third Party Images
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: 'trivy-results.sarif'

      # - name: Scan Image
      #   run: docker run aquasec/trivy:latest image --db-repository public.ecr.aws/aquasecurity/trivy-db,ghcr.io/aquasecurity/trivy-db --java-db-repository public.ecr.aws/aquasecurity/trivy-java-db,ghcr.io/aquasecurity/trivy-java-db --timeout 5m --scanners vuln --exit-code 1 --severity CRITICAL,HIGH ${{ matrix.image.name }}

      - name: Tag Image
        run: |
          date=$(date +%Y%m%d)
          docker tag ${{ matrix.image.name }} ghcr.io/${{ env.GH_REPO }}/${{ matrix.image.short-name }}:latest
          docker tag ${{ matrix.image.name }} ghcr.io/${{ env.GH_REPO }}/${{ matrix.image.short-name }}:$date

      - name: Login to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Push Image
        run: docker push --all-tags ghcr.io/${{ env.GH_REPO }}/${{ matrix.image.short-name }}
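
The images published here are the ones the spiffworkflow and clamav modules in this PR consume. Below is a minimal sketch of pointing the spiffworkflow module at the GHCR-published images through its imageref variables; the variable names come from spiffworkflow/main.tf, while the module source ref, the required arguments shown, and the choice of the :latest tag are assumptions.

```
module "SpiffWorkflow" {
  source        = "github.com/GSA-TTS/terraform-cloudgov//spiffworkflow?ref=v2.3.0"
  cf_org_name   = var.cf_org_name
  cf_space_name = var.cf_space_name

  # Images pushed by this workflow; the module resolves each tag to its sha256 digest
  # via the docker_registry_image data source, so a re-published tag triggers a new deploy.
  backend_imageref   = "ghcr.io/gsa-tts/terraform-cloudgov/spiffarena-backend:latest"
  frontend_imageref  = "ghcr.io/gsa-tts/terraform-cloudgov/spiffarena-frontend:latest"
  connector_imageref = "ghcr.io/gsa-tts/terraform-cloudgov/spiffarena-connector:latest"

  process_models_ssh_key         = var.process_models_ssh_key
  database_service_instance_name = "spiffworkflow-db"
}
```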
2 changes: 1 addition & 1 deletion .github/workflows/test.yml
@@ -13,7 +13,7 @@ jobs:
       fail-fast: false
       max-parallel: 2
       matrix:
-        module: ["s3", "database", "redis", "cg_space", "domain", "clamav"]
+        module: ["s3", "database", "redis", "cg_space", "domain", "clamav", "spiffworkflow"]
     steps:
       - uses: actions/checkout@v4

38 changes: 38 additions & 0 deletions .github/workflows/update-trivy-cache.yml
@@ -0,0 +1,38 @@
# In your scan workflow, set TRIVY_SKIP_DB_UPDATE=true and TRIVY_SKIP_JAVA_DB_UPDATE=true.
name: Update Trivy Cache

on:
  schedule:
    - cron: '0 0 * * *' # Run daily at midnight UTC
  workflow_dispatch: # Allow manual triggering

jobs:
  update-trivy-db:
    runs-on: ubuntu-latest
    steps:
      - name: Setup oras
        uses: oras-project/setup-oras@v1

      - name: Get current date
        id: date
        run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_OUTPUT

      - name: Download and extract the vulnerability DB
        run: |
          mkdir -p $GITHUB_WORKSPACE/.cache/trivy/db
          oras pull ghcr.io/aquasecurity/trivy-db:2
          tar -xzf db.tar.gz -C $GITHUB_WORKSPACE/.cache/trivy/db
          rm db.tar.gz

      - name: Download and extract the Java DB
        run: |
          mkdir -p $GITHUB_WORKSPACE/.cache/trivy/java-db
          oras pull ghcr.io/aquasecurity/trivy-java-db:1
          tar -xzf javadb.tar.gz -C $GITHUB_WORKSPACE/.cache/trivy/java-db
          rm javadb.tar.gz

      - name: Cache DBs
        uses: actions/cache/save@v4
        with:
          path: ${{ github.workspace }}/.cache/trivy
          key: cache-trivy-${{ steps.date.outputs.date }}
27 changes: 27 additions & 0 deletions README.md
@@ -202,6 +202,33 @@ module "drupal" {
}
```

### SpiffWorkflow
SpiffWorkflow is a workflow engine implemented in pure Python. BPMN lets non-developers describe complex workflow processes in a visual diagram, coupled with a powerful Python script engine that works seamlessly within the diagrams. SpiffWorkflow can parse these diagrams and execute them. The ability for businesses to create clear, coherent diagrams that drive an application has far-reaching potential. More information can be found on the creator's [GitHub page](https://github.com/sartography/SpiffWorkflow).
```
module "SpiffWorkflow" {
  source        = "github.com/GSA-TTS/terraform-cloudgov//spiffworkflow?ref=v2.3.0"
  cf_org_name   = var.cf_org_name
  cf_space_name = var.cf_space_name

  # You must have a valid git key pair. Generate one with `ssh-keygen -t rsa -b 4096 -C "my-git@email"` and add the
  # public key to https://github.com/settings/keys. var.process_models_ssh_key is the private key. When you store
  # process_models_ssh_key in a .tfvars file, ensure the file uses "LF" line endings.
  process_models_ssh_key = var.process_models_ssh_key

  # source_branch_for_example_models = "" # This should be a non-main branch used to load the example models. Otherwise, edits will be pushed directly to main.
  # target_branch_for_saving_changes = "" # This should be an existing branch in the model repo. New models will be pushed here.

  database_service_instance_name = "spiffworkflow-db"
  tags                           = ["SpiffWorkflow"]
  depends_on                     = [module.Database]
}

module "Database" {
  source        = "github.com/gsa-tts/terraform-cloudgov//database?ref=v2.2.0"
  cf_space_id   = data.cloudfoundry_space.space.id
  name          = "my-db-name"
  tags          = ["rds", "SpiffWorkflow"]
  rds_plan_name = "small-psql"
}
```
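
The module also publishes the application URLs and app IDs as outputs (see spiffworkflow/outputs.tf below). A minimal sketch of surfacing them from the calling root module, assuming the module block is named `SpiffWorkflow` as in the example above:

```
output "spiffworkflow_frontend_url" {
  value     = module.SpiffWorkflow.frontend_url
  sensitive = true # the module marks its URL outputs as sensitive, so this output must be too
}

output "spiffworkflow_backend_app_id" {
  value = module.SpiffWorkflow.backend_app_id
}
```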

## Testing


2 changes: 2 additions & 0 deletions clamav/variables.tf
@@ -16,6 +16,8 @@ variable "name" {
variable "clamav_image" {
  type        = string
  description = "Docker image to deploy the clamav api app"
  # Uses the https://github.com/GSA-TTS/clamav-rest clamav image, which is maintained by GSA-TTS and supports a proxy.
  default     = "ghcr.io/gsa-tts/terraform-cloudgov/clamav-proxy-support:latest"
}

variable "clamav_memory" {
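
Because the default now points at the GHCR copy published by pull-and-publish-images.yml, callers of the clamav module get the proxy-capable image with no extra configuration. A hedged sketch of pinning a dated tag instead of tracking :latest follows; the module source ref and the example date tag are assumptions, and the other required clamav module inputs are omitted.

```
module "clamav" {
  source = "github.com/GSA-TTS/terraform-cloudgov//clamav?ref=v2.3.0"

  # ...org/space, app name, and memory arguments for the clamav module go here...

  # The publish workflow tags each image with $(date +%Y%m%d) as well as :latest;
  # pinning the dated tag (hypothetical value shown) avoids surprise upgrades.
  clamav_image = "ghcr.io/gsa-tts/terraform-cloudgov/clamav-proxy-support:20250305"
}
```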
195 changes: 195 additions & 0 deletions spiffworkflow/main.tf
@@ -0,0 +1,195 @@
locals {
  route_prefix = (var.route_prefix != "" ? var.route_prefix : random_pet.route_prefix.id)

  backend_route   = "${local.frontend_route}/api"
  connector_route = "${local.route_prefix}-connector.apps.internal"
  frontend_route  = "${local.route_prefix}.app.cloud.gov"

  username = random_uuid.username.result
  password = random_password.password.result

  # backend_url   = "https://${local.username}:${local.password}@${local.backend_route}"
  # connector_url = "https://${local.connector_route}:61443"
  # frontend_url  = "https://${local.username}:${local.password}@${local.frontend_route}"

  backend_url   = "https://${local.backend_route}"
  connector_url = "https://${local.connector_route}:61443"
  frontend_url  = "https://${local.frontend_route}"

  backend_app_id      = cloudfoundry_app.backend.id
  connector_app_id    = cloudfoundry_app.connector.id
  frontend_app_id     = cloudfoundry_app.frontend.id
  tags                = setunion(["terraform-cloudgov-managed"], var.tags)
  backend_baseimage   = split(":", var.backend_imageref)[0]
  frontend_baseimage  = split(":", var.frontend_imageref)[0]
  connector_baseimage = split(":", var.connector_imageref)[0]
}

resource "random_uuid" "username" {}
resource "random_password" "password" {
length = 16
special = false
}

resource "random_pet" "route_prefix" {
prefix = "spiffworkflow"
}

resource "random_password" "backend_flask_secret_key" {
length = 32
special = true
}

resource "random_password" "backend_openid_secret" {
length = 32
special = true
}

data "docker_registry_image" "backend" {
name = var.backend_imageref
}
resource "cloudfoundry_app" "backend" {
name = "${var.app_prefix}-backend"
org_name = var.cf_org_name
space_name = var.cf_space_name
docker_image = "${local.backend_baseimage}@${data.docker_registry_image.backend.sha256_digest}"
memory = var.backend_memory
instances = var.backend_instances
disk_quota = "3G"
strategy = "rolling"
command = <<-COMMAND
# Get the postgres URI from the service binding. (SQL Alchemy insists on "postgresql://".🙄)
export SPIFFWORKFLOW_BACKEND_DATABASE_URI=$( echo $VCAP_SERVICES | jq -r '.["aws-rds"][].credentials.uri' | sed -e s/postgres/postgresql/ )

# Make sure the Cloud Foundry-provided CA is recognized when making TLS connections
cat /etc/cf-system-certificates/* > /usr/local/share/ca-certificates/cf-system-certificates.crt
/usr/sbin/update-ca-certificates

# Verify that this is working. It should return '{"ok": true}'
# curl https://spiffworkflow((slug))-connector.apps.internal:61443/liveness

/app/bin/clone_process_models
/app/bin/boot_server_in_docker
COMMAND
health_check_type = "http"
health_check_http_endpoint = "/api/v1.0/status"
service_bindings = [
{ service_instance = var.database_service_instance_name }
]
routes = [{
route = local.backend_route
protocol = "http1"
}]

environment = {
APPLICATION_ROOT : "/"
FLASK_SESSION_SECRET_KEY : random_password.backend_flask_secret_key.result
FLASK_DEBUG : "0"
REQUESTS_CA_BUNDLE : "/etc/ssl/certs/ca-certificates.crt"

# All of the configuration variables are documented here:
# spiffworkflow-backend/src/spiffworkflow_backend/config/default.py
SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR : "/app/process_models"
SPIFFWORKFLOW_BACKEND_CHECK_FRONTEND_AND_BACKEND_URL_COMPATIBILITY : "false"
SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL : local.connector_url
SPIFFWORKFLOW_BACKEND_DATABASE_TYPE : "postgres"
SPIFFWORKFLOW_BACKEND_ENV : "local_docker"
SPIFFWORKFLOW_BACKEND_EXTENSIONS_API_ENABLED : "true"

# This branch needs to exist, otherwise we can't clone it at startup and startup fails
SPIFFWORKFLOW_BACKEND_GIT_COMMIT_ON_SAVE : "true"
SPIFFWORKFLOW_BACKEND_GIT_PUBLISH_CLONE_URL : var.process_models_repository
SPIFFWORKFLOW_BACKEND_GIT_PUBLISH_TARGET_BRANCH : var.target_branch_for_saving_changes
SPIFFWORKFLOW_BACKEND_GIT_SOURCE_BRANCH : var.source_branch_for_example_models
SPIFFWORKFLOW_BACKEND_GIT_SSH_PRIVATE_KEY : var.process_models_ssh_key
SPIFFWORKFLOW_BACKEND_LOAD_FIXTURE_DATA : "false"
SPIFFWORKFLOW_BACKEND_LOG_LEVEL : "INFO"

# TODO: We should make these configurable with variables so
# you can specify an external OIDC IDP.
SPIFFWORKFLOW_BACKEND_OPEN_ID_CLIENT_ID : "spiffworkflow-backend"
SPIFFWORKFLOW_BACKEND_OPEN_ID_CLIENT_SECRET_KEY : random_password.backend_openid_secret.result
SPIFFWORKFLOW_BACKEND_OPEN_ID_SERVER_URL : "${local.backend_url}/openid"

# TODO: static creds are in this path in the image:
# /config/permissions/example.yml
# We should probably generate credentials only for the admin
# and have everything else be specified via DMN as described here:
# https://spiff-arena.readthedocs.io/en/latest/DevOps_installation_integration/admin_and_permissions.html#site-administration
SPIFFWORKFLOW_BACKEND_PERMISSIONS_FILE_NAME : "example.yml"

SPIFFWORKFLOW_BACKEND_PORT : "8080"
SPIFFWORKFLOW_BACKEND_RUN_BACKGROUND_SCHEDULER_IN_CREATE_APP : "true"
SPIFFWORKFLOW_BACKEND_UPGRADE_DB : "true"
SPIFFWORKFLOW_BACKEND_URL : local.backend_url
SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND : local.frontend_url
SPIFFWORKFLOW_BACKEND_USE_WERKZEUG_MIDDLEWARE_PROXY_FIX : "true"
SPIFFWORKFLOW_BACKEND_WSGI_PATH_PREFIX : "/api"
}
}

resource "random_password" "connector_flask_secret_key" {
length = 32
special = true
}

data "docker_registry_image" "connector" {
name = var.connector_imageref
}

resource "cloudfoundry_app" "connector" {
name = "${var.app_prefix}-connector"
org_name = var.cf_org_name
space_name = var.cf_space_name
docker_image = "${local.connector_baseimage}@${data.docker_registry_image.connector.sha256_digest}"
memory = var.connector_memory
instances = var.connector_instances
disk_quota = "3G"
strategy = "rolling"
command = <<-COMMAND
# Make sure the Cloud Foundry-provided CA is recognized when making TLS connections
cat /etc/cf-system-certificates/* > /usr/local/share/ca-certificates/cf-system-certificates.crt
/usr/sbin/update-ca-certificates
/app/bin/boot_server_in_docker
COMMAND
health_check_type = "http"
health_check_http_endpoint = "/liveness"
routes = [{
route = local.connector_route
protocol = "http1"
}]

environment = {
FLASK_DEBUG : "0"
FLASK_SESSION_SECRET_KEY : random_password.connector_flask_secret_key.result
CONNECTOR_PROXY_PORT : "8080"
REQUESTS_CA_BUNDLE : "/etc/ssl/certs/ca-certificates.crt"
}
}

data "docker_registry_image" "frontend" {
name = var.frontend_imageref
}

resource "cloudfoundry_app" "frontend" {
name = "${var.app_prefix}-frontend"
org_name = var.cf_org_name
space_name = var.cf_space_name
docker_image = "${local.frontend_baseimage}@${data.docker_registry_image.frontend.sha256_digest}"
memory = var.frontend_memory
instances = var.frontend_instances
strategy = "rolling"
health_check_type = "port"
routes = [{
route = local.frontend_route
protocol = "http1"
}]

environment = {
APPLICATION_ROOT : "/"
PORT0 : "80"
SPIFFWORKFLOW_FRONTEND_RUNTIME_CONFIG_APP_ROUTING_STRATEGY : "path_based"
SPIFFWORKFLOW_FRONTEND_RUNTIME_CONFIG_BACKEND_BASE_URL : local.backend_url
BACKEND_BASE_URL : local.backend_url
}
}
27 changes: 27 additions & 0 deletions spiffworkflow/outputs.tf
@@ -0,0 +1,27 @@
output "backend_url" {
value = local.backend_url
sensitive = true
}

output "connector_url" {
value = local.connector_url
sensitive = true
}

output "frontend_url" {
value = local.frontend_url
sensitive = true
}

output "backend_app_id" {
value = local.backend_app_id
}

output "connector_app_id" {
value = local.connector_app_id
}

output "frontend_app_id" {
value = local.frontend_app_id
}
