PR changes
Tope Emmanuel committed Nov 6, 2023
1 parent afbd430 commit 5a6efaa
Showing 1 changed file with 14 additions and 4 deletions: docs/documents/powerbi.md

PowerBI dashboard data

The PowerBI db backup pipeline was migrated from the old PaaS environment to the new AKS environment.

The old pipeline was implemented using an Azure DevOps classic release pipeline. At the time of writing this document, the old pipeline was located here:
https://dev.azure.com/dfe-ssp/S105-School-Experience/_release?_a=releases&view=mine&definitionId=64

The new pipeline is also implemented using an Azure DevOps classic release pipeline and is located here:
https://dev.azure.com/dfe-ssp/S105-School-Experience/_release?_a=releases&view=mine&definitionId=66

The pipeline works by taking a backup of the Kubernetes application database in AKS. The database is accessed by port forwarding using Konduit.

There are two sets of credentials used in the pipeline, namely admin and perfappuser.
The admin credentials are used to delete and recreate the perf analysis database.
These are the admin credentials created when the database was initialised.

The perfappuser credentials are used when importing the database; this user is created when the perf analysis database is being created.

Konduit is installed using the Makefile from the repository.
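
As an illustration, the install step might amount to something like the sketch below, assuming konduit.sh is fetched from the DfE teacher-services-cloud repository; the source path shown is an assumption, not taken from the repository's Makefile.

```sh
# Hypothetical sketch of what the Makefile install step might do.
# The script's source location is an assumption.
mkdir -p bin
curl -s https://raw.githubusercontent.com/DFE-Digital/teacher-services-cloud/main/scripts/konduit.sh -o bin/konduit.sh
chmod +x bin/konduit.sh
```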

A dump command is then run on the Postgres database to dump the db into a file on the pipeline agent.
The command that is run is:
``bin/konduit.sh -k $(KV) -d gse-production get-school-experience-production -- pg_dump -E utf8 --clean --if-exists --no-owner --verbose --no-password -f schoolexperiencedb_temp.sql``
where $(KV) is the key vault name, the -d option refers to the database name, and get-school-experience-production refers to the pod name.

The performance db is then deleted and recreated in the next task, and the exported db in schoolexperiencedb_temp.sql is then imported into the performance database. The recreation is done using the script available at
https://raw.githubusercontent.com/DFE-Digital/school-experience-devops/master/createdb.sql
In this script the perfappuser credentials are used, with the password coming from the variable 'postgresUserPassword' that is stored in the pipeline.
The perfappuser credentials are the ones used in the data source.
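
A minimal sketch of what these two tasks might amount to is shown below, using the same $(variable) pipeline-variable style as $(KV) above. The host, database and variable names are placeholders, and passing the password into createdb.sql as a psql variable is an assumption rather than the pipeline's actual mechanism.

```sh
# Hypothetical sketch only: host, database and variable names are placeholders.

# Recreate the performance database using the admin credentials created when the db was initialised.
# createdb.sql is the script linked above; the perfappuser password comes from the pipeline
# variable 'postgresUserPassword' (shown here as a psql variable, which is an assumption).
PGPASSWORD=$(adminPassword) psql "host=$(perfDbHost) user=$(adminUser) dbname=postgres" \
  -v postgresUserPassword="$(postgresUserPassword)" \
  -f createdb.sql

# Import the dump produced by pg_dump into the recreated database as perfappuser.
PGPASSWORD=$(postgresUserPassword) psql "host=$(perfDbHost) user=perfappuser dbname=$(perfDbName)" \
  -f schoolexperiencedb_temp.sql
```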

ogr2ogr is also used to load GeoJSON files (eer.json and topo_lad.json) into the PostgreSQL database.
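
As an illustration, an ogr2ogr invocation for loading one of these files might look like the following; the connection details and destination table name are placeholders, not the pipeline's actual task definition.

```sh
# Hypothetical sketch: connection details and destination table name are placeholders.
# -f PostgreSQL writes into Postgres, -nln names the destination table,
# and -overwrite replaces the table if it already exists.
ogr2ogr -f PostgreSQL \
  PG:"host=$(perfDbHost) dbname=$(perfDbName) user=perfappuser password=$(postgresUserPassword)" \
  eer.json -nln eer -overwrite
```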
