Merge pull request #1329 from DFE-Digital/669-git-api-clean-up-code
[669] Clean up code after migration
saliceti authored Dec 13, 2023
2 parents 0732742 + 4ce4bb7 commit 58c951d
Showing 39 changed files with 87 additions and 832 deletions.
3 changes: 0 additions & 3 deletions .github/common_environment.yml
@@ -1,10 +1,7 @@
DOCKER_REPOSITORY: ghcr.io/dfe-digital/get-into-teaching-api
DOMAIN: london.cloudapps.digital
APPLICATION: Get Into Teaching API Service
PAAS_APPLICATION_NAME: get-into-teaching-api
SLACK_FAILURE: '#ff0000'
SLACK_SUCCESS: '#00ff00'
SLACK_ICON: https://raw.githubusercontent.com/DFE-Digital/get-into-teaching-api/master/.github/image.png?size=48
SLACK_USERNAME: GiT Workflows
SLACK_FOOTER: Get Into Teaching API Service

19 changes: 12 additions & 7 deletions .github/labeler.yml
@@ -1,18 +1,23 @@
---

VisualCSharp:
- GetIntoTeachingApi/**/*.cs
- changed-files:
- any-glob-to-any-file: 'GetIntoTeachingApi/**/*.cs'

Test:
- GetIntoTeachingApiTests/**/*
- changed-files:
- any-glob-to-any-file: 'GetIntoTeachingApiTests/**/*'

Monitoring:
- monitoring/**/*
- changed-files:
- any-glob-to-any-file: 'monitoring/**/*'

DevOps:
- terraform/**/*
- .github/**/*.yml
- changed-files:
- any-glob-to-any-file: 'terraform/**/*'
- any-glob-to-any-file: '.github/**/*.yml'

Docker:
- Dockerfile
- docker-compose.yml
- changed-files:
- any-glob-to-any-file: 'Dockerfile'
- any-glob-to-any-file: 'docker-compose.yml'
29 changes: 16 additions & 13 deletions .github/workflows/actions/deploy_v2/action.yml
@@ -12,7 +12,7 @@ inputs:
required: true
outputs:
deploy-url:
value: ${{ steps.set_env_var.outputs.deploy_url }}
value: https://${{ steps.set_env_var.outputs.deploy_url }}
runs:
using: composite
steps:
@@ -22,15 +22,16 @@ runs:
run: |
tf_vars_file=terraform/aks/config/${{ inputs.environment }}.tfvars.json
terraform_version=$(awk '/{/{f=/^terraform/;next}f' terraform/aks/provider.tf | grep -o [0-9\.]*)
echo "cluster=$(jq -r '.cluster' ${tf_vars_file})" >> $GITHUB_ENV
echo "aks_app_environment=$(jq -r '.environment' ${tf_vars_file})" >> $GITHUB_ENV
cluster=$(jq -r '.cluster' ${tf_vars_file})
aks_app_environment=$(jq -r '.environment' ${tf_vars_file})
echo "TERRAFORM_VERSION=$terraform_version" >> $GITHUB_ENV
echo "namespace=$(jq -r '.namespace' ${tf_vars_file})" >> $GITHUB_ENV
if [[ $cluster == 'production' ]]; then
echo "deploy_url=https://getintoteachingapi-${{ env.aks_app_environment }}.teacherservices.cloud" >> $GITHUB_OUTPUT
else
echo "deploy_url=https://getintoteachingapi-${{ env.aks_app_environment }}.${cluster}.teacherservices.cloud" >> $GITHUB_OUTPUT
app_hostname=getintoteachingapi-${aks_app_environment}.teacherservices.cloud
else
app_hostname=getintoteachingapi-${aks_app_environment}.${cluster}.teacherservices.cloud
fi
echo "app_hostname=${app_hostname}" >> $GITHUB_OUTPUT
- name: Use Terraform ${{ env.TERRAFORM_VERSION }}
uses: hashicorp/setup-terraform@v2
@@ -41,14 +42,16 @@ runs:
with:
azure-credentials: ${{ inputs.azure-credentials }}

- name: Print Sha
id: print-sha
shell: bash
run: |
echo "${{ inputs.sha }}"
- name: Terraform init, plan & apply
shell: bash
run: make ci ${{ inputs.environment }} terraform-apply
env:
IMAGE_TAG: ${{ inputs.sha }}

- name: Smoke tests
shell: bash
run: |
tests/confidence/healthcheck.sh "${APP_HOSTNAME}" "${IMAGE_TAG#sha-}"
env:
APP_HOSTNAME: ${{ steps.set_env_var.outputs.app_hostname }}
IMAGE_TAG: ${{ inputs.sha }}
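
To make the refactor above easier to follow: the `set_env_var` step now keeps `cluster` and `aks_app_environment` as plain shell variables rather than writing them to `$GITHUB_ENV`, publishes a single `app_hostname` step output, and the composite action's `deploy-url` output prepends `https://` to it; the new smoke-test step then feeds that hostname and the de-prefixed image tag to the healthcheck script. A rough local sketch of the same logic (the environment name, tfvars values and image tag below are illustrative assumptions, not values taken from this commit):

```
# Sketch of the refactored set_env_var + smoke-test logic, run from the repo root.
environment=development_aks   # hypothetical; whatever the workflow passes as inputs.environment
tf_vars_file=terraform/aks/config/${environment}.tfvars.json

cluster=$(jq -r '.cluster' "${tf_vars_file}")                  # e.g. "test" or "production"
aks_app_environment=$(jq -r '.environment' "${tf_vars_file}")  # e.g. "development"

if [[ $cluster == 'production' ]]; then
  app_hostname=getintoteachingapi-${aks_app_environment}.teacherservices.cloud
else
  app_hostname=getintoteachingapi-${aks_app_environment}.${cluster}.teacherservices.cloud
fi
echo "app_hostname=${app_hostname}"   # exposed as a step output; deploy-url adds https://

# Smoke test: assumes image tags follow the sha-<commit> pattern implied by ${IMAGE_TAG#sha-}
IMAGE_TAG=sha-0732742
tests/confidence/healthcheck.sh "${app_hostname}" "${IMAGE_TAG#sha-}"
```
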
2 changes: 2 additions & 0 deletions .github/workflows/build-and-deploy.yml
@@ -96,6 +96,7 @@ jobs:
runs-on: ubuntu-latest
environment:
name: development_aks
url: ${{ steps.deploy.outputs.deploy-url }}
steps:
- name: Check out the repo
uses: actions/checkout@v4
@@ -167,6 +168,7 @@ jobs:
runs-on: ubuntu-latest
environment:
name: test_aks
url: ${{ steps.deploy.outputs.deploy-url }}
steps:
- name: Check out the repo
uses: actions/checkout@v4
65 changes: 0 additions & 65 deletions .github/workflows/fix-network-policy.yml

This file was deleted.

3 changes: 2 additions & 1 deletion .github/workflows/manual.yml
@@ -16,7 +16,8 @@ jobs:
name: Deploy to ${{github.event.inputs.environment}}
runs-on: ubuntu-latest
environment:
name: ${{github.event.inputs.environment}}
name: ${{github.event.inputs.environment}}
url: ${{ steps.deploy.outputs.deploy-url }}
concurrency: ${{github.event.inputs.environment}}
defaults:
run:
50 changes: 1 addition & 49 deletions Makefile
@@ -9,29 +9,6 @@ ifndef VERBOSE
.SILENT:
endif

help:
echo "Secrets:"
echo " This makefile gives the user the ability to safely display and edit azure secrets which are used by this project. "
echo ""
echo "Commands:"
echo " edit-app-secrets - Edit Application specific Secrets."
echo " print-app-secrets - Display Application specific Secrets."
echo " edit-monitoring-secrets - Edit Monitoring specific Secrets."
echo " print-monitoring-secrets - Display Monitoring specific Secrets."
echo " edit-infrastructure-secrets - Edit Infrastructure specific Secrets."
echo " print-infrastructure-secrets - Display Infrastructure specific Secrets."
echo ""
echo "Parameters:"
echo "All commands take the parameter development|review|test|production"
echo ""
echo "Examples:"
echo ""
echo "To edit the Application secrets for Development"
echo " make development edit-app-secrets"
echo ""
echo "To print the Monitoring secrets for Production"
echo " make production print-monitoring-secrets"

MONITORING_SECRETS=MONITORING-KEYS
APPLICATION_SECRETS=API-KEYS
INFRASTRUCTURE_SECRETS=INFRA-KEYS
@@ -50,11 +27,6 @@ bin/terrafile: ## Install terrafile to manage terraform modules
bin/yaq:
mkdir -p bin | curl -sL https://github.com/uk-devops/yaq/releases/download/v0.0.3/yaq_linux_amd64_v0.0.3.zip -o yaq.zip && unzip -o yaq.zip -d ./bin/ && rm yaq.zip

.PHONY: development
development:
$(eval export KEY_VAULT=s146d01-kv)
$(eval export AZ_SUBSCRIPTION=s146-getintoteachingwebsite-development)

development_aks:
$(eval include global_config/development_aks.sh)

@@ -73,26 +45,6 @@ set-key-vault-names:
$(eval KEY_VAULT_APPLICATION_NAME=$(AZURE_RESOURCE_PREFIX)-$(SERVICE_SHORT)-$(CONFIG_SHORT)-app-kv)
$(eval KEY_VAULT_INFRASTRUCTURE_NAME=$(AZURE_RESOURCE_PREFIX)-$(SERVICE_SHORT)-$(CONFIG_SHORT)-inf-kv)

.PHONY: local
local:
$(eval export KEY_VAULT=s146d01-local2-kv)
$(eval export AZ_SUBSCRIPTION=s146-getintoteachingwebsite-development)

.PHONY: review
review:
$(eval export KEY_VAULT=s146d01-kv)
$(eval export AZ_SUBSCRIPTION=s146-getintoteachingwebsite-development)

.PHONY: test
test:
$(eval export KEY_VAULT=s146t01-kv)
$(eval export AZ_SUBSCRIPTION=s146-getintoteachingwebsite-test)

.PHONY: production
production:
$(eval export KEY_VAULT=s146p01-kv)
$(eval export AZ_SUBSCRIPTION=s146-getintoteachingwebsite-production)

.PHONY: ci
ci: ## Run in automation environment
$(eval export DISABLE_PASSCODE=true)
@@ -121,7 +73,7 @@ terraform-init: bin/terrafile set-azure-account
-backend-config=key=${CONFIG}.tfstate

$(if $(IMAGE_TAG), , $(error The IMAGE_TAG variable must be provided))
$(eval export TF_VAR_paas_app_docker_image=ghcr.io/dfe-digital/get-into-teaching-api:$(IMAGE_TAG))
$(eval export TF_VAR_app_docker_image=ghcr.io/dfe-digital/get-into-teaching-api:$(IMAGE_TAG))
$(eval export TF_VAR_azure_resource_prefix=$(AZURE_RESOURCE_PREFIX))
$(eval export TF_VAR_config_short=$(CONFIG_SHORT))
$(eval export TF_VAR_service_short=$(SERVICE_SHORT))
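
The `Terraform init, plan & apply` step above drives this Makefile with `make ci <environment> terraform-apply` plus an `IMAGE_TAG` environment variable; with the PaaS targets gone, the surviving `*_aks` targets (such as `development_aks`) select the environment, and the Terraform image variable is now `TF_VAR_app_docker_image`. An equivalent manual run might look like the sketch below (it assumes you are already authenticated against Azure and that the tag exists in ghcr.io/dfe-digital/get-into-teaching-api; the tag value is illustrative):

```
# Hypothetical manual equivalent of the CI deployment step for the AKS development environment.
IMAGE_TAG=sha-0732742 make ci development_aks terraform-apply

# The guard in terraform-init still stops the run early if IMAGE_TAG is not supplied,
# aborting with an error like: *** The IMAGE_TAG variable must be provided.  Stop.
make ci development_aks terraform-apply
```
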
39 changes: 13 additions & 26 deletions README.md
@@ -107,25 +107,25 @@ Quick start steps:

## Useful Links

As the API is service-facing it has no user interface, but in non-production environments you can access a [dashboard for Hangfire](https://get-into-teaching-api-dev.london.cloudapps.digital/hangfire/) and the [Swagger UI](https://get-into-teaching-api-dev.london.cloudapps.digital/swagger/index.html). You will need the basic auth credentials to access these dashboards.
As the API is service-facing it has no user interface, but in non-production environments you can access a [dashboard for Hangfire](https://getintoteachingapi-development.test.teacherservices.cloud/hangfire/) and the [Swagger UI](https://getintoteachingapi-development.test.teacherservices.cloud/swagger/index.html). You will need the basic auth credentials to access these dashboards.

## Deployment

### Environments

The API is deployed to GOV.UK PAAS. We currently have three hosted environments; `development`, `test` and `production`. This can get confusing because our ASP.NET Core environments are `development`, `staging`, `test` and `production` (we should look to address this as part of the migration away from GOV.UK PAAS!). Here is a table to try and make sense of the combinations:
The API is deployed to [AKS](https://github.com/DFE-Digital/teacher-services-cloud/). We currently have three hosted environments; `development`, `test` and `production`. This can get confusing because our ASP.NET Core environments are `development`, `staging`, `test` and `production`. Here is a table to try and make sense of the combinations:

| Environment | ASP.NET Core Environment | URL |
| ----------------------- | ------------------------ | ----------------------------------------------------------- |
| development (PAAS) | staging | https://get-into-teaching-api-dev.london.cloudapps.digital |
| test (PAAS) | staging | https://get-into-teaching-api-test.london.cloudapps.digital |
| production (PASS) | production | https://get-into-teaching-api-prod.london.cloudapps.digital |
| development (local) | development | localhost |
| test (local) | test | n/a |
| Environment | ASP.NET Core Environment | URL |
| ----------------------- | ------------------------ | ----------------------------------------------------------------- |
| development (AKS) | staging | https://getintoteachingapi-development.test.teacherservices.cloud/|
| test (AKS) | staging | https://getintoteachingapi-test.test.teacherservices.cloud/ |
| production (AKS) | production | https://getintoteachingapi-production.teacherservices.cloud/ |
| development (local) | development | localhost |
| test (local) | test | n/a |

### Process

When you merge a branch to `master` it will automatically be deployed to the [development](https://get-into-teaching-api-dev.london.cloudapps.digital/) and [test](https://get-into-teaching-api-test.london.cloudapps.digital/) environments via GitHub Actions and a tagged release will be created (the tag will use the PR number). You can then test the changes using the corresponding dev/test environments of the other GiT services. Once you're happy and want to ship to [production](https://get-into-teaching-api-prod.london.cloudapps.digital/) you need to note the tag of your release and go to the `Manual Release` GitHub Action; from there you can select `Run workflow`, choose the `Production` environment and enter your release number.
When you merge a branch to `master` it will automatically be deployed to the [development](#environments) and [test](#environments) environments via GitHub Actions and a tagged release will be created (the tag will use the PR number). You can then test the changes using the corresponding dev/test environments of the other GiT services. Once you're happy and want to ship to [production](#environments) you need to note the tag of your release and go to the `Manual Release` GitHub Action; from there you can select `Run workflow`, choose the `Production` environment and enter your release number.

### Rollbacks

Expand All @@ -150,19 +150,6 @@ Then **set properties of the created env.local to "Always copy"**.

Other environment variables are available (see the `IEnv` interface) but are not necessary to run the bare-bones application.

The Postgres connections (for Hangfire and our database) are setup dynamically from the `VCAP_SERVICES` environment variable provided by GOV.UK PaaS. If you want to connect to a Postgres instance running in PaaS instead of the one in Docker - such as the test environment instance - you can do so by creating a conduit to it using Cloud Foundry:

```
cf conduit get-into-teaching-api-dev-pg-svc
```

You then need to update the `VCAP_SERVICES` environment variable (in `env.local`) to reflect the connection details for your conduit session:

```
{\"postgres\": [{\"instance_name\": \"rdsbroker_277c8858_eb3a_427b_99ed_0f4f4171701e\",\"credentials\": {\"host\": \"127.0.0.1\",\"name\": \"rdsbroker_277c8858_eb3a_427b_99ed_0f4f4171701e\",\"username\": \"******\",\"password\": \"******\",\"port\": \"7080\"}}]}
```

### Secrets

Secrets are stored in Azure keyvaults. There is a Makefile that should be used to view or edit the secrets, for example:
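
The README's own example sits in the collapsed lines that follow; purely for illustration, an invocation built from the targets referenced elsewhere in this diff (the exact environment/secret-target pairing is an assumption, not shown here) would be:

```
# Hypothetical examples: the *_aks environment targets come from the Makefile above,
# and the print/edit secret targets from the help text removed in this commit.
make development_aks print-app-secrets
make development_aks edit-app-secrets
```
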
@@ -311,15 +298,15 @@ We use [logit.io](https://kibana.logit.io/app/kibana) to host a Kibana instance

### Metrics

We use [Prometheus](https://prometheus-prod-get-into-teaching.london.cloudapps.digital/) to collect our metrics into an InfluxDB instance. Metrics are exposed to Prometheus on the `/metrics` endpoint; [prometheus-net](https://github.com/prometheus-net/prometheus-net) is used for collecting and exposing the metrics.
We use Prometheus to collect our metrics into an InfluxDB instance. Metrics are exposed to Prometheus on the `/metrics` endpoint; [prometheus-net](https://github.com/prometheus-net/prometheus-net) is used for collecting and exposing the metrics.

The metrics are presented using [Grafana](https://grafana-prod-get-into-teaching.london.cloudapps.digital/). All the configuration/infrastructure is currently configured in the terraform files.
The metrics are presented using Grafana. All the configuration/infrastructure is currently configured in the terraform files.

Note that if you change the Grafana dashboard **it will not persist** and you need to instead export the dashboard and [updated it in the GitHub repository](https://github.com/DFE-Digital/get-into-teaching-api/tree/master/monitoring/grafana/dashboards). These are re-applied on API deployment.

### Alerts

We use [Prometheus Alert Manager](https://alertmanager-prod-get-into-teaching.london.cloudapps.digital/#/alerts) to notify us when something has gone wrong. It will post to the relevant Slack channel and contain a link to the appropriate Grafana dashboard and/or runbook.
We use Prometheus Alert Manager to notify us when something has gone wrong. It will post to the relevant Slack channel and contain a link to the appropriate Grafana dashboard and/or runbook.

You can add/configure alerts in the [alert.rules file](https://github.com/DFE-Digital/get-into-teaching-api/blob/master/monitoring/prometheus/alert.rules).

8 changes: 4 additions & 4 deletions docs/runbooks/client-approaching-rate-limit.md
@@ -1,18 +1,18 @@
# ClientApproachingRateLimit
# ClientApproachingRateLimit

MEDIUM
MEDIUM

## Description

Alerts when the API is reviving a lot of requests to rate limited endpoints from a client (and is in danger of returning a 429 response soon).
Alerts when the API is reviving a lot of requests to rate limited endpoints from a client (and is in danger of returning a 429 response soon).

## Potential Causes

See the potential causes in the TooManyRequests alert.

## Resolutions

Check the [Grafana panel](https://grafana-prod-get-into-teaching.london.cloudapps.digital/d/28EURzZGz/get-into-teaching-api?viewPanel=60&orgId=1&var-App=get-into-teaching-api-prod) for an indication of what’s going on.
Check the Grafana panel for an indication of what’s going on.

If it was a caused by a spike that did not get very close to the critical threshold then it can be safely ignored for now and monitored going forward.
