
Add required deps to project #33

Merged
merged 2 commits into from
Sep 14, 2023

Conversation

@holdenk (Contributor) commented Sep 13, 2023

Description

Adds the missing dependencies so a fresh install of spark-expectations can work :)

Related Issue

#32

Motivation and Context

Bug fix

How Has This Been Tested?

Created fresh conda env, see stream https://www.youtube.com/watch?v=FxqFdYWW2jo around 40 minute mark.

Types of changes

  • [x] Bug fix (non-breaking change which fixes an issue)
  • [ ] New feature (non-breaking change which adds functionality)
  • [ ] Breaking change (fix or feature that would cause existing functionality to change)

Checklist:

  • [x] My code follows the code style of this project.
  • [ ] My change requires a change to the documentation.
  • [ ] I have updated the documentation accordingly.
  • [x] I have read the CONTRIBUTING document.
  • [ ] I have added tests to cover my changes.
  • [ ] All new and existing tests passed.

…in regular installs. Otherwise you get an error like `ImportError: cannot import name 'configure_spark_with_delta_pip' from 'delta' (/home/holden/miniconda3/lib/python3.10/site-packages/delta/__init__.py)`
@asingamaneni (Collaborator) left a comment

Having pyspark and delta-spark as hard dependencies could cause problems on Databricks, but for now this LGTM; we will modify it if needed in upcoming PRs while we re-architect the readers and writers.

@holdenk Thanks for your first contribution to this project :)

codecov bot commented Sep 14, 2023

Codecov Report

Patch and project coverage have no change.

Comparison is base (915fffc) 100.00% compared to head (b002f58) 100.00%.
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main       #33   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           22        22           
  Lines         1362      1362           
=========================================
  Hits          1362      1362           


@asingamaneni asingamaneni merged commit 1bcce5a into Nike-Inc:main Sep 14, 2023
6 checks passed
@newfront commented

> Having pyspark and delta-spark as hard dependencies could cause problems on Databricks, but for now this LGTM; we will modify it if needed in upcoming PRs while we re-architect the readers and writers.
>
> @holdenk Thanks for your first contribution to this project :)

pyspark = "^3.0.0"
delta-spark = "^2.1.0"
requests = "^2.28.1"

If it is a problem, we could also add the `optional = true` flag to the delta-spark and requests packages, and call these out as extra packages under `[tool.poetry.extras]`:

poetry install -E delta-spark -E requests
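
A minimal sketch of what that suggestion could look like in `pyproject.toml` (the section names follow Poetry's extras convention; the version pins are copied from the comment above and the `python` constraint is illustrative):

```toml
[tool.poetry.dependencies]
python = "^3.8"
pyspark = "^3.0.0"
# Marked optional so they are only installed when the matching extra is requested
delta-spark = { version = "^2.1.0", optional = true }
requests = { version = "^2.28.1", optional = true }

[tool.poetry.extras]
delta-spark = ["delta-spark"]
requests = ["requests"]
```

With this layout, `pyspark` is always installed, while users opt in to the other two via the `poetry install -E …` invocation shown above.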

I also think we should have pyspark in the main package dependencies, so it is a good and safe addition.

In the case where folks may not be using delta in the future, it could be good to make that optional; I'm just thinking of this like supporting <scope>provided</scope> in a typical Maven project.
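
For comparison, here is what the Maven analogue mentioned above looks like: a `provided`-scope dependency is available at compile time but not bundled into the artifact, which is roughly what optional extras achieve in Poetry (the Delta Lake coordinates below are an illustrative example, not taken from this project):

```xml
<dependency>
  <groupId>io.delta</groupId>
  <artifactId>delta-core_2.12</artifactId>
  <version>2.1.0</version>
  <!-- Compiled against, but expected to be supplied by the runtime (e.g. the cluster) -->
  <scope>provided</scope>
</dependency>
```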
