This repo is part of a multi-part guide that shows how to configure and deploy the example.com reference architecture described in the Google Cloud security foundations guide (PDF). The following table lists the parts of the guide.

| Part | Description |
|------|-------------|
| 0-bootstrap | Bootstraps a Google Cloud organization, creating all the required resources and permissions to start using the Cloud Foundation Toolkit (CFT). This step also configures a CI/CD pipeline for foundations code in subsequent stages. |
| 1-org (this file) | Sets up top-level shared folders, monitoring and networking projects, and organization-level logging, and sets baseline security settings through organizational policy. |
| 2-environments | Sets up development, non-production, and production environments within the Google Cloud organization that you've created. |
| 3-networks | Sets up base and restricted shared VPCs with default DNS, NAT (optional), Private Service networking, VPC Service Controls, on-premises Dedicated Interconnect, and baseline firewall rules for each environment. It also sets up the global DNS hub. |
| 4-projects | Sets up a folder structure, projects, and an application infrastructure pipeline for applications, which are connected as service projects to the shared VPC created in the previous stage. |
| 5-app-infra | Deploys a simple Compute Engine instance in one of the business unit projects using the infra pipeline set up in 4-projects. |
For an overview of the architecture and the parts, see the terraform-example-foundation README.
The purpose of this step is to set up top-level shared folders, monitoring and networking projects, organization-level logging, and baseline security settings through organizational policies.
- 0-bootstrap executed successfully.
- Cloud Identity / Google Workspace group for security admins.
- Membership in the security admins group for the user running Terraform.
- Security Command Center notifications require that you choose a Security Command Center tier and create and grant permissions for the Security Command Center service account, as outlined in Setting up Security Command Center.
- Ensure that you have requested sufficient project quota, as the Terraform scripts will create multiple projects from this point onwards. For more information, see the FAQ.
Note: Make sure that you use the same version of Terraform throughout this series, otherwise you might experience Terraform state snapshot lock errors.
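For example, you can confirm the installed version on each machine you run these steps from before continuing (a minimal check; compare it with the version you used in 0-bootstrap):

```
# Print the Terraform version in use on this machine.
terraform version
```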
Please refer to troubleshooting if you run into issues during this step.
Disclaimer: This step enables Data Access logs for all services in your organization. Enabling Data Access logs might result in your project being charged for the additional logs usage. For details on costs you might incur, go to Pricing. You can choose not to enable the Data Access logs by setting the variable `data_access_logs_enabled` to `false`.
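If you want to opt out, a minimal sketch of that change (assuming you have already prepared `envs/shared/terraform.tfvars` as described in the deployment steps below) is:

```
# Show the current setting, if any, and append it set to false when it is missing.
# Edit the existing line instead if the variable is already set in the file.
grep -n 'data_access_logs_enabled' envs/shared/terraform.tfvars \
  || echo 'data_access_logs_enabled = false' >> envs/shared/terraform.tfvars
```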
Note: This module creates a sink to export all logs to Cloud Storage. It also creates sinks to export a subset of security-related logs to BigQuery and Pub/Sub. This will result in additional charges for those copies of logs. You can change the filters and sinks by modifying the configuration in `envs/shared/log_sinks.tf`.
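After the apply, you can review the sinks that were created; a quick check (assuming the sinks are created at the organization level, as in the default layout) is:

```
# List organization-level log sinks; replace YOUR_ORGANIZATION_ID.
# If you deployed under a folder instead, use --folder=YOUR_FOLDER_ID.
gcloud logging sinks list --organization=YOUR_ORGANIZATION_ID
```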
Note: Currently, this module does not enable a bucket retention policy for the organization logs; please enable it if needed.
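If you do need one, a sketch of enabling it with gsutil (the bucket name is a placeholder; use the log export bucket created by this step in your environment, and pick a retention period that matches your requirements):

```
# Set and then verify a 30-day retention policy on the log export bucket.
gsutil retention set 30d gs://YOUR_ORG_LOG_EXPORT_BUCKET
gsutil retention get gs://YOUR_ORG_LOG_EXPORT_BUCKET
```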
Note: It is possible to enable an organization policy for OS Login with this module. OS Login has some limitations. If those limitations do not apply to your workload/environment, you can choose to enable the OS Login policy by setting the variable `enable_os_login_policy` to `true` (see the sketch after the next note).
Note: You need to set the variable `enable_hub_and_spoke` to `true` to use the hub-and-spoke architecture detailed in the Networking section of the Google Cloud security foundations guide.
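A minimal sketch for both of the settings above, assuming you have already prepared `envs/shared/terraform.tfvars` as described in the deployment steps below:

```
# Check whether the variables are already present in the tfvars file...
grep -nE 'enable_os_login_policy|enable_hub_and_spoke' envs/shared/terraform.tfvars
# ...then set the desired values, for example by appending them
# (edit the existing lines instead if they are already there):
echo 'enable_os_login_policy = true' >> envs/shared/terraform.tfvars
echo 'enable_hub_and_spoke = true' >> envs/shared/terraform.tfvars
```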
Note: If you are using macOS, replace `cp -RT` with `cp -R` in the relevant commands. The `-T` flag is needed for Linux, but causes problems for macOS.
Note: This module creates a Security Command Center notification. The notification name must be unique in the organization. The suggested name in the `terraform.tfvars` file is `scc-notify`. To check if it already exists, run:
gcloud scc notifications describe <scc_notification_name> --organization=<org_id>
FIs should configure the `log_export_storage_retention_policy` variable in `terraform.tfvars` to set a minimum retention period for org-level logs for compliance use cases.
In addition to the other Organization Policies, FIs should review `org_policy_mas_abs.tf` to evaluate if the following constraints are needed:
- `constraints/compute.restrictNonConfidentialComputing` to restrict non-Confidential Computing resources from being created (by default, this is disabled, although FIs can enable it and add specific folders or projects to be excluded from this constraint)
- `constraints/storage.retentionPolicySeconds` to enforce a Cloud Storage retention policy from a list of retention periods (by default, 1 hour and 30 days are configured)
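Before setting `log_export_storage_retention_policy`, it can help to check its declaration for the expected attribute names and to locate the MAS/ABS constraints file; a quick sketch (the paths are assumptions, adjust them to where the files live in your copy of the 1-org code):

```
# Inspect the variable declaration to see which attributes it expects.
grep -rn -A 10 'variable "log_export_storage_retention_policy"' envs/shared/
# Locate and review the FI-specific organization policy constraints.
find . -name 'org_policy_mas_abs.tf' -exec cat {} \;
```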
- Clone the policy repo based on the Terraform output from the previous section. Clone the repo at the same level as the `terraform-example-foundation` folder; the next instructions assume that layout. Run `terraform output cloudbuild_project_id` in the `0-bootstrap` folder to see the project ID again.
gcloud source repos clone gcp-policies --project=YOUR_CLOUD_BUILD_PROJECT_ID
- Navigate into the repo. All subsequent steps assume you are running them from the gcp-policies directory. If you run them from another directory, adjust your copy paths accordingly.
cd gcp-policies
- Copy contents of policy-library to new repo.
cp -RT ../terraform-example-foundation/policy-library/ .
- Commit changes.
git add .
git commit -m 'Your message'
- Push your master branch to the new repo.
git push --set-upstream origin master
- Navigate out of the repo.
cd ..
- Clone the gcp-org repo.
gcloud source repos clone gcp-org --project=YOUR_CLOUD_BUILD_PROJECT_ID
- Navigate into the repo and change to a non-production branch. All subsequent steps assume you are running them from the gcp-org directory. If you run them from another directory, adjust your copy paths accordingly.
cd gcp-org
git checkout -b plan
- Copy contents of foundation to new repo (Terraform variables will be updated in a future step).
cp -RT ../terraform-example-foundation/1-org/ .
- Copy Cloud Build configuration files for Terraform. You may need to modify the command to reflect your current directory.
cp ../terraform-example-foundation/build/cloudbuild-tf-* .
- Copy the Terraform wrapper script to the root of your new repository (modify accordingly based on your current directory).
cp ../terraform-example-foundation/build/tf-wrapper.sh .
- Ensure the wrapper script can be executed.
chmod 755 ./tf-wrapper.sh
- Check if your organization already has an Access Context Manager Policy.
gcloud access-context-manager policies list --organization YOUR_ORGANIZATION_ID --format="value(name)"
- Rename `./envs/shared/terraform.example.tfvars` to `./envs/shared/terraform.tfvars` and update the file with values from your environment and bootstrap step (you can re-run `terraform output` in the 0-bootstrap directory to find these values). Make sure that `default_region` is set to a valid BigQuery dataset region. Also, if the previous step showed a numeric value, make sure to un-comment the variable `create_access_context_manager_access_policy = false`. See the shared folder README.md for additional information on the values in the `terraform.tfvars` file. (A shell sketch of this step appears after this list.)
- Commit changes.
git add .
git commit -m 'Your message'
- Push your plan branch to trigger a plan. For this command, the branch `plan` is not a special one. Any branch whose name is different from `development`, `non-production` or `production` will trigger a Terraform plan.
git push --set-upstream origin plan
- Review the plan output in your Cloud Build project. https://console.cloud.google.com/cloud-build/builds?project=YOUR_CLOUD_BUILD_PROJECT_ID
- Merge changes to the production branch.
git checkout -b production
git push origin production
- Review the apply output in your Cloud Build project. https://console.cloud.google.com/cloud-build/builds?project=YOUR_CLOUD_BUILD_PROJECT_ID
- You can now move to the instructions in the 2-environments step.
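The `terraform.tfvars` preparation step above, expressed as shell commands (a sketch only, assuming you are in the gcp-org repo and that `terraform-example-foundation` sits next to it):

```
# Rename the example tfvars and gather the values to fill in.
mv ./envs/shared/terraform.example.tfvars ./envs/shared/terraform.tfvars
(cd ../terraform-example-foundation/0-bootstrap && terraform output)
# If the next command prints a numeric policy name, keep
# create_access_context_manager_access_policy = false un-commented in terraform.tfvars.
gcloud access-context-manager policies list --organization YOUR_ORGANIZATION_ID --format="value(name)"
```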
Troubleshooting: If you received a `PERMISSION_DENIED` error running the `gcloud access-context-manager` or the `gcloud scc notifications` commands, you can append
--impersonate-service-account=org-terraform@<SEED_PROJECT_ID>.iam.gserviceaccount.com
to run the command as the Terraform service account.
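For example, with placeholder values:

```
# Run the Access Context Manager check as the Terraform service account.
gcloud access-context-manager policies list \
  --organization YOUR_ORGANIZATION_ID \
  --format="value(name)" \
  --impersonate-service-account=org-terraform@SEED_PROJECT_ID.iam.gserviceaccount.com
```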
- Clone the repo you created manually in 0-bootstrap.
git clone <YOUR_NEW_REPO-1-org>
- Navigate into the repo and change to a non-production branch. All subsequent steps assume you are running them from your 1-org repo directory. If you run them from another directory, adjust your copy paths accordingly.
cd YOUR_NEW_REPO_CLONE-1-org
git checkout -b plan
- Copy contents of foundation to new repo.
cp -RT ../terraform-example-foundation/1-org/ .
- Copy contents of policy-library to new repo.
cp -RT ../terraform-example-foundation/policy-library/ ./policy-library
- Copy the Jenkinsfile script to the root of your new repository.
cp ../terraform-example-foundation/build/Jenkinsfile .
- Update the variables located in the `environment {}` section of the `Jenkinsfile` with values from your environment:
  _TF_SA_EMAIL
  _STATE_BUCKET_NAME
  _PROJECT_ID (the CICD project ID)
- Copy the Terraform wrapper script to the root of your new repository.
cp ../terraform-example-foundation/build/tf-wrapper.sh .
- Ensure the wrapper script can be executed.
chmod 755 ./tf-wrapper.sh
- Check if your organization already has an Access Context Manager Policy.
gcloud access-context-manager policies list --organization YOUR_ORGANIZATION_ID --format="value(name)"
- Rename `./envs/shared/terraform.example.tfvars` to `./envs/shared/terraform.tfvars` and update the file with values from your environment and bootstrap. You can re-run `terraform output` in the 0-bootstrap directory to find these values. Make sure that `default_region` is set to a valid BigQuery dataset region. Also, if the previous step showed a numeric value, make sure to un-comment the variable `create_access_context_manager_access_policy = false`. See the shared folder README.md for additional information on the values in the `terraform.tfvars` file.
- Commit changes.
git add .
git commit -m 'Your message'
- Push your plan branch. The branch `plan` is not a special one. Any branch whose name is different from `development`, `non-production` or `production` will trigger a Terraform plan.
  - Assuming you configured an automatic trigger in your Jenkins Master (see the Jenkins sub-module README), this will trigger a plan. You can also trigger a Jenkins job manually. Given the many options to do this in Jenkins, it is out of the scope of this document; see the Jenkins website for more details.
git push --set-upstream origin plan
- Review the plan output in your Master's web UI.
- Merge changes to the production branch.
git checkout -b production
git push origin production
- Review the apply output in your Master's web UI. (You might want to use the "Scan Multibranch Pipeline Now" option in your Jenkins Master UI.)
- Change into the 1-org folder.
- Run `cp ../build/tf-wrapper.sh .`
- Run `chmod 755 ./tf-wrapper.sh`
- Change into the 1-org/envs/shared/ folder.
- Rename `terraform.example.tfvars` to `terraform.tfvars` and update the file with values from your environment and bootstrap.
- Obtain your bucket name by running the following command in the 0-bootstrap folder.
terraform output gcs_bucket_tfstate
- Update `backend.tf` with your bucket from bootstrap.
for i in `find -name 'backend.tf'`; do sed -i 's/UPDATE_ME/<YOUR-BUCKET-NAME>/' $i; done
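After the sed command, an optional quick check that no placeholder remains:

```
# Should print nothing if every backend.tf was updated with your bucket name.
grep -rn 'UPDATE_ME' --include='backend.tf' .
```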
We will now deploy our environment (production) using this script. When using Cloud Build or Jenkins as your CI/CD tool, each environment corresponds to a branch in the repository for the 1-org step, and only the corresponding environment is applied.
To use the `validate` option of the `tf-wrapper.sh` script, please follow the instructions in the Install Terraform Validator section and install version `v0.4.0` in your system. You will also need to rename the binary from `terraform-validator-<your-platform>` to `terraform-validator`, and the `terraform-validator` binary must be in your `PATH`.
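On Linux, for example, the rename and PATH placement might look like this (the downloaded file name and install directory are assumptions; adjust them for your platform and setup):

```
# Rename the downloaded binary, make it executable, and put it on your PATH.
mv terraform-validator-linux-amd64 terraform-validator
chmod +x terraform-validator
sudo mv terraform-validator /usr/local/bin/
command -v terraform-validator   # should print the install path
```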
- Run `./tf-wrapper.sh init production`.
- Run `./tf-wrapper.sh plan production` and review the output.
- Run `./tf-wrapper.sh validate production $(pwd)/../policy-library <YOUR_CLOUD_BUILD_PROJECT_ID>` and check for violations.
- Run `./tf-wrapper.sh apply production`.
If you received any errors or made any changes to the Terraform config or `terraform.tfvars`, you must re-run `./tf-wrapper.sh plan production` before running `./tf-wrapper.sh apply production`.