While upgrading the Spark pools in Synapse Studio to PySpark 3.4, we hit this error:
```
An error occurred while calling o6063.save.
: java.lang.NoSuchMethodError: 'org.apache.spark.sql.types.StructType org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(java.sql.ResultSet, org.apache.spark.sql.jdbc.JdbcDialect, boolean)'
	at com.microsoft.sqlserver.jdbc.spark.BulkCopyUtils$.matchSchemas(BulkCopyUtils.scala:305)
	at com.microsoft.sqlserver.jdbc.spark.BulkCopyUtils$.getColMetaData(BulkCopyUtils.scala:266)
	at com.microsoft.sqlserver.jdbc.spark.Connector.write(Connector.scala:79)
	at com.microsoft.sqlserver.jdbc.spark.DefaultSource.createRelation(DefaultSource.scala:66)
	at
```
We have also upgraded the pyspark (3.4.0) and delta-spark (2.4.0) versions in the requirements.txt file, but the issue remains.
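For context, here is a minimal sketch of the kind of write that triggers this. The format name comes from the stack trace above; the server, database, table, and credentials are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# On Spark 3.4 this save() fails inside BulkCopyUtils.matchSchemas with the
# NoSuchMethodError above, because the connector was compiled against the
# three-argument JdbcUtils.getSchema that existed through Spark 3.3.
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("overwrite")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net;databaseName=<db>")
    .option("dbtable", "dbo.example_table")
    .option("user", "<user>")
    .option("password", "<password>")
    .save())
```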
I am facing the same issue. It looks like an extra parameter, isTimestampNTZ, was added to the getSchema method in Spark 3.4.0: https://github.com/apache/spark/blob/87a5442f7ed96b11051d8a9333476d080054e5a0/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L279
I see a 1.4.0-BETA release, but it is not available on Maven.
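Until a connector build compiled against Spark 3.4 lands on Maven, one possible stopgap is Spark's built-in JDBC writer, which does not go through the connector's BulkCopyUtils, so it avoids the removed getSchema overload at the cost of bulk-copy performance. Connection values are placeholders:

```python
# Fallback through Spark's generic JDBC data source instead of the
# com.microsoft.sqlserver.jdbc.spark connector. `df` is the DataFrame
# you were writing; connection details are placeholders.
(df.write
    .format("jdbc")
    .mode("append")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net;databaseName=<db>")
    .option("dbtable", "dbo.example_table")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .save())
```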