With Spark 3.x it's recommended to use java.time.Instant over java.sql.Timestamp (reference: https://databricks.com/blog/2020/07/22/a-comprehensive-look-at-dates-and-timestamps-in-apache-spark-3-0.html):

spark.conf.set("spark.sql.datetime.java8API.enabled", "true")

However, with that turned on, the upload step fails, most likely because the spark-redshift lib still handles only java.sql.Timestamp-related formats: https://github.com/spark-redshift-community/spark-redshift/blob/master/src/main/scala/io/github/spark_redshift_community/spark/redshift/RedshiftWriter.scala#L240

Eager to see this improved and aligned with Spark 3.x releases.
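To make the type mismatch concrete, here is a minimal JVM-level sketch (not spark-redshift code): with the java8API flag enabled, Spark hands TimestampType values to external code as java.time.Instant rather than java.sql.Timestamp, so a writer that matches only on the legacy type misses them. The two representations convert losslessly via the standard JDK API:

```scala
import java.sql.Timestamp
import java.time.Instant

// With spark.sql.datetime.java8API.enabled = true, Spark surfaces
// TimestampType values as java.time.Instant. A writer matching only
// java.sql.Timestamp would need an explicit bridge like this:
val instant: Instant = Instant.parse("2021-01-01T00:00:00Z")

val legacy: Timestamp = Timestamp.from(instant) // java.time -> legacy type
val back: Instant     = legacy.toInstant        // legacy -> java.time

assert(back == instant) // round-trip is lossless
```

A fix in RedshiftWriter could presumably match on both types and normalize with such a conversion before formatting.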
Makes sense to me. Thanks for providing links 😄
Do you have interest in submitting a PR? I'm happy to review and help get this merged/released.
@jsleight pleasure. A quick one: #95