
Does this connector support spark 3? #305

Open

flxzaj opened this issue Jul 22, 2020 · 3 comments
flxzaj commented Jul 22, 2020

Hi, I ran the connector successfully on Spark 2.4.5, but it failed on Spark 3.0.0. Is there any support for Spark 3? Thanks

@nicolaszhang

I'm hitting the same issue. Any update on this?

@dvdgnzlz-maths

Hi everyone,

I am trying to create a Delta table and use Apache Atlas:
I have been able to create a Hive table from a Delta table. CREATE EXTERNAL TABLE delta_tbl(date int, delay int, distance int, origin string, destination string) ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe' WITH SERDEPROPERTIES("path" = "/tmp/departureDelays.delta") STORED BY 'io.delta.hive.DeltaStorageHandler' LOCATION 'file:///tmp/departureDelays.delta';

This works with Delta Lake 0.6.1, but I would like to use the latest version (0.7), which requires Spark 3.0. I cloned this project to work on that, but it has a hard dependency on "spark-atlas-connector-main_2.11", and I cannot find that project anywhere.
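For reference, Delta Lake 0.7.0 only runs on Spark 3.0 and requires the new SQL extension and catalog settings; a minimal spark-defaults.conf sketch (the Delta coordinates are the published 0.7.0 artifacts, everything else is standard Spark configuration):

```
# spark-defaults.conf — minimal settings for Delta Lake 0.7.0 on Spark 3.0
spark.jars.packages              io.delta:delta-core_2.12:0.7.0
spark.sql.extensions             io.delta.sql.DeltaSparkSessionExtension
spark.sql.catalog.spark_catalog  org.apache.spark.sql.delta.catalog.DeltaCatalog
```

Note this only configures Spark itself; it does not resolve the spark-atlas-connector's Spark 2.x / Scala 2.11 dependency, which is the blocker discussed in this thread.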

Where can I find the code for "spark-atlas-connector-main_2.11"?

Thanks in advance!
Best,

@sbbagal13

Has anyone found a solution for this?


4 participants