Merge pull request #112 from mlverse/updates
Figuring out the Spark cache
edgararuiz authored Apr 20, 2024
2 parents 6691373 + a9e0aa7 · commit 06bbcd9
Showing 2 changed files with 3 additions and 2 deletions.
.github/workflows/spark-tests.yaml: 3 changes (2 additions & 1 deletion)
@@ -16,7 +16,7 @@ jobs:
       fail-fast: false
       matrix:
         config:
-          - {spark: '3.4.1', pyspark: '3.4', hadoop: '3', name: 'PySpark 3.4'}
+          - {spark: '3.5.1', pyspark: '3.5', hadoop: '3', name: 'PySpark 3.5'}

     env:
       GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
@@ -63,6 +63,7 @@ jobs:
         if: steps.cache-spark.outputs.cache-hit != 'true'
         run: |
           sparklyr::spark_install(version = Sys.getenv("SPARK_VERSION"))
+          print(sparklyr::spark_install_find(Sys.getenv("SPARK_VERSION"))$sparkVersionDir)
         shell: Rscript {0}

       - name: Cache Scala
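
The added print() call logs the directory where sparklyr actually unpacked Spark, which is the piece of information a cache step needs for its path. Below is a minimal sketch of how the cache and install steps could fit together; the actions/cache step, its id cache-spark, and the path value are assumptions (the diff only references steps.cache-spark, and the printed sparkVersionDir is presumably there to confirm the real location):

    # Hypothetical cache step; path is a placeholder until the printed
    # sparkVersionDir from the install step confirms the real install location.
    - name: Cache Spark
      id: cache-spark
      uses: actions/cache@v4
      with:
        path: ~/spark
        key: sparklyr-spark-${{ env.SPARK_VERSION }}-bin-hadoop3

    - name: Install Spark (via sparklyr)
      if: steps.cache-spark.outputs.cache-hit != 'true'
      run: |
        sparklyr::spark_install(version = Sys.getenv("SPARK_VERSION"))
        # Log the directory sparklyr used, so the cache path above can be checked
        print(sparklyr::spark_install_find(Sys.getenv("SPARK_VERSION"))$sparkVersionDir)
      shell: Rscript {0}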
.github/workflows/test-coverage.yaml: 2 changes (1 addition & 1 deletion)
@@ -44,7 +44,7 @@ jobs:
         key: sparklyr-spark-3.5.0-bin-hadoop3-2

       - name: Install Spark (via sparklyr)
-        if: steps.cache-spark.outputs.cache-hit != 'true'
+        if: steps.cache-spark.outputs.cache-2-hit != 'true'
         run: |
           sparklyr::spark_install(version = "3.5.0")
         shell: Rscript {0}
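
actions/cache sets only one output, cache-hit, so steps.cache-spark.outputs.cache-2-hit evaluates to an empty string and the condition is always true: the install step now runs on every job, effectively bypassing the cache while leaving the step in place. A sketch of how the surrounding steps plausibly read after this change, assuming cache-spark is an actions/cache step (only the if: line is changed by this commit; the path is a placeholder):

    - name: Cache Spark
      id: cache-spark
      uses: actions/cache@v4
      with:
        path: ~/spark   # placeholder; the real path is not shown in this diff
        key: sparklyr-spark-3.5.0-bin-hadoop3-2

    - name: Install Spark (via sparklyr)
      # cache-2-hit is not an output actions/cache defines, so this comparison
      # is always true and spark_install() runs on every workflow run.
      if: steps.cache-spark.outputs.cache-2-hit != 'true'
      run: |
        sparklyr::spark_install(version = "3.5.0")
      shell: Rscript {0}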
