In order for the shovel to work, it needs a Postgres database with the correct database schema.

- In the `.env` of your Apollos API, add the Postgres connection URI of your database as `DATABASE_URL=<Your Postgres Connection URI>`, as well as `DATABASE_CONTENT=true` (see the snippet below).
- In the API folder, run `yarn migrator up`.

You should be able to see that the migrations were applied and that the database schema was updated.
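For reference, the two `.env` additions might look like the following sketch; the connection URI is a placeholder, so substitute your own host, credentials, and database name.

```
# .env of your Apollos API (placeholder values — use your own connection details)
DATABASE_URL=postgres://<username>:<password>@<host>:5432/<database>
DATABASE_CONTENT=true
```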
- In the shovel's `dags` folder, duplicate `core-rock-content-item-dag.py`, `core-rock-people-dag.py`, and `core-rock-tags-dag.py`, and rename each file to match the name of the church.
- Go through each of the newly created files and replace `core` with the church name of your choice (see the sketch below). Be consistent, though, as the shovel uses this name to access the church's variables that will be added in Airflow.

At this point, the DAGs should be visible from the Airflow console. If not, restart Astronomer with `astro dev stop && astro dev start`. If you are using a Mac M1 machine, you will need to run `DOCKER_BUILDKIT=0 astro dev start`.
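As a rough sketch, assuming a hypothetical church name of `mychurch`, the duplication and rename could be scripted from the `dags` folder like this (double-check the result afterwards, since `core` may also appear in strings you do not want changed):

```sh
# Copy the three core DAG files under the new church's name (hypothetical name "mychurch")
cp core-rock-content-item-dag.py mychurch-rock-content-item-dag.py
cp core-rock-people-dag.py mychurch-rock-people-dag.py
cp core-rock-tags-dag.py mychurch-rock-tags-dag.py

# Replace "core" with the church name inside the new files
# (macOS sed shown; on Linux drop the empty '' argument)
sed -i '' 's/core/mychurch/g' mychurch-rock-*.py
```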
- Under the `Admin` tab in Airflow, go to `Connections`.
- Add your database's connection using the naming format `<church_name>_apollos_postgres`. Select `Postgres` as the Connection Type. It is critical to keep the `church_name` consistent throughout the entire shovel. (A CLI alternative is sketched below.)
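If you prefer the command line over the Airflow UI, the same connection can be registered with the Airflow CLI; everything below other than the connection name pattern is a placeholder, assuming the hypothetical church name `mychurch`:

```sh
# Register the Postgres connection for "mychurch" (placeholder host/credentials)
airflow connections add mychurch_apollos_postgres \
  --conn-type postgres \
  --conn-host <host> \
  --conn-schema <database> \
  --conn-login <username> \
  --conn-password <password> \
  --conn-port 5432
```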
- Under the `Admin` tab in Airflow, go to `Variables`.
- Add the following variables (an illustrative `rock_config` example follows this list):
  - `<church_name>_rock_api` - the Rock API URL
  - `<church_name>_rock_token` - the Rock API token
  - `<church_name>_rock_config` - an object that contains more variables associated with the Rock instance. You can see an example in the `core_rock_config` variable; this is explained in more detail in `CONTENT_SHOVEL_MIGRATIONS.md`. Required keys:
    - `CONTENT_MAPPINGS` - related to how the `config.yml` is set up
    - `PERSONA_CATEGORY_ID` - the ID of the correct persona category in Rock
    - `SERIES_CATEGORY_ORIGIN_IDS` - an array of the content item categories that are series
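As a purely illustrative sketch of the shape of `<church_name>_rock_config`: the three keys come from the list above, but the nested `CONTENT_MAPPINGS` entry and all values are placeholder assumptions, so copy the real structure from the `core_rock_config` variable and `CONTENT_SHOVEL_MIGRATIONS.md`.

```
{
  "CONTENT_MAPPINGS": {
    "<content item type from your config.yml>": { "<Rock filter field>": [<Rock id>] }
  },
  "PERSONA_CATEGORY_ID": <Rock persona category id>,
  "SERIES_CATEGORY_ORIGIN_IDS": [<series category id>, <series category id>]
}
```

Because each DAG looks these variables up by the same `<church_name>` prefix you used when renaming the files, the prefix has to match exactly.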
Once everything is added correctly, turn on each DAG related to the church and watch the magic happen!