I used SAC for a select -> insert statement in Spark, like:
```scala
val df = spark.read.parquet(source_path)
df.write.parquet(target_path)
```
As a result, in Atlas I get tons of hdfs_path entities (one for each file in "source_path", which is a folder), but just one entity for "target_path" (also a folder).
Is this normal behaviour?
Spark version: 2.4.4
Scala version: 2.11.12
Atlas version: 1.1.0.3.1.0.0-78
SAC: spark-atlas-connector-assembly-0.1.0-SNAPSHOT.jar
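For context, a minimal self-contained sketch of the job as submitted (paths and the application name are placeholders; the listener settings follow the SAC README and assume the SAC assembly jar is on the driver classpath):

```scala
import org.apache.spark.sql.SparkSession

object SacParquetCopy {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sac-parquet-copy") // placeholder name
      // SAC hooks, as documented in the spark-atlas-connector README
      .config("spark.extraListeners",
        "com.hortonworks.spark.atlas.SparkAtlasEventTracker")
      .config("spark.sql.queryExecutionListeners",
        "com.hortonworks.spark.atlas.SparkAtlasEventTracker")
      .getOrCreate()

    val source_path = "hdfs:///data/source" // placeholder: folder of parquet files
    val target_path = "hdfs:///data/target" // placeholder: output folder

    // The read/write pair from the report: SAC registers the lineage
    // as hdfs_path entities in Atlas.
    val df = spark.read.parquet(source_path)
    df.write.parquet(target_path)

    spark.stop()
  }
}
```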