MERGE INTO a merge-on-read table throws NoSuchMethodError #11821

Open
1 of 3 tasks
lordk911 opened this issue Dec 19, 2024 · 2 comments
Labels
bug Something isn't working

Comments

lordk911 commented Dec 19, 2024

Apache Iceberg version

1.6.1

Query engine

Spark

Please describe the bug 🐞

env:

spark 3.4.4
iceberg-spark-runtime-3.4_2.12-1.6.1.jar

table ddl:

create table if not exists test.ice_test
(
  id                    string          comment '',
  order_id              string          comment '',
  lock_id               string          comment ''
)
using iceberg
TBLPROPERTIES (
 'write.metadata.delete-after-commit.enabled'=true,
 'write.metadata.previous-versions-max'=10,
 'format-version'=2,
 'write.merge.mode'='merge-on-read');

sql to merge table:

MERGE INTO test.ice_test a USING test.product_tmp b ON a.id = b.id when matched then delete;
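
For context, the source table test.product_tmp is not defined in this report; a minimal sketch of the scenario from spark-shell (assuming product_tmp is an ordinary table that only needs an id column to join on) would be:

// Sketch only: the real schema of test.product_tmp is not shown in this report.
spark.sql("CREATE TABLE IF NOT EXISTS test.product_tmp (id string) USING parquet")
spark.sql("INSERT INTO test.product_tmp VALUES ('1')")

// Fails with the NoSuchMethodError below while write.merge.mode is merge-on-read.
spark.sql("""
  MERGE INTO test.ice_test a
  USING test.product_tmp b
  ON a.id = b.id
  WHEN MATCHED THEN DELETE
""")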

Exception:

java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.ProjectingInternalRow.<init>(Lorg/apache/spark/sql/types/StructType;Lscala/collection/Seq;)V
  at org.apache.spark.sql.catalyst.analysis.RewriteRowLevelIcebergCommand.newLazyProjection(RewriteRowLevelIcebergCommand.scala:121)
  at org.apache.spark.sql.catalyst.analysis.RewriteRowLevelIcebergCommand.buildDeltaProjections(RewriteRowLevelIcebergCommand.scala:86)
  at org.apache.spark.sql.catalyst.analysis.RewriteRowLevelIcebergCommand.buildDeltaProjections$(RewriteRowLevelIcebergCommand.scala:75)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$.buildDeltaProjections(RewriteMergeIntoTable.scala:75)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$.buildDeltaProjections(RewriteMergeIntoTable.scala:412)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$.org$apache$spark$sql$catalyst$analysis$RewriteMergeIntoTable$$buildWriteDeltaPlan(RewriteMergeIntoTable.scala:315)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$$anonfun$apply$1.applyOrElse(RewriteMergeIntoTable.scala:156)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$$anonfun$apply$1.applyOrElse(RewriteMergeIntoTable.scala:84)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:170)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:31)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning(AnalysisHelper.scala:99)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning$(AnalysisHelper.scala:96)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsWithPruning(LogicalPlan.scala:31)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:76)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:75)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:31)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$.apply(RewriteMergeIntoTable.scala:84)
  at org.apache.spark.sql.catalyst.analysis.RewriteMergeIntoTable$.apply(RewriteMergeIntoTable.scala:75)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:222)

If write.merge.mode is changed to copy-on-write, the MERGE INTO statement executes successfully.

Willingness to contribute

  • I can contribute a fix for this bug independently
  • I would be willing to contribute a fix for this bug with guidance from the Iceberg community
  • I cannot contribute a fix for this bug at this time
lordk911 added the bug label Dec 19, 2024
RussellSpitzer (Member) commented:

This usually signifies a version mismatch on the runtime classpath for Spark. Make sure there are no other iceberg-spark-runtime jars.
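
One way to check which jars the classes in that stack trace are actually loaded from, for example from spark-shell (an illustrative check, not a fix):

// ProjectingInternalRow comes from spark-catalyst; RewriteRowLevelIcebergCommand
// is bundled in the iceberg-spark-runtime jar. A NoSuchMethodError between the
// two points at mismatched Spark and Iceberg builds on the classpath.
Seq(
  "org.apache.spark.sql.catalyst.ProjectingInternalRow",
  "org.apache.spark.sql.catalyst.analysis.RewriteRowLevelIcebergCommand"
).foreach { name =>
  val src = Class.forName(name).getProtectionDomain.getCodeSource
  println(s"$name -> ${if (src == null) "bootstrap" else src.getLocation}")
}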


lordk911 commented Jan 3, 2025

ll $SPARK_HOME/jars/*iceberg*

lrwxrwxrwx 1 bigtop bigtop 79 Dec 26 11:19 jars/iceberg-spark-runtime-3.4_2.12-1.6.1.jar -> /data/soft/extentions4spark3.4/iceberg/iceberg-spark-runtime-3.4_2.12-1.6.1.jar

ll /data/soft/extentions4spark3.4/iceberg/iceberg-spark-runtime-3.4_2.12-1.6.1.jar

-rw-rw-r-- 1 bigtop bigtop 42035638 Dec 27 10:37 /data/soft/extentions4spark3.4/iceberg/iceberg-spark-runtime-3.4_2.12-1.6.1.jar

cat $SPARK_HOME/conf/spark-defaults.conf | grep spark.yarn.jars

spark.yarn.jars hdfs:///share/sparkjars344/*

hdfs dfs -ls hdfs:///share/sparkjars344/*iceberg*

-rw-r--r--   3 hdfs hdfs   42035638 2024-12-27 14:26 hdfs:///share/sparkjars344/iceberg-spark-runtime-3.4_2.12-1.6.1.jar

> This usually signifies a version mismatch on the runtime classpath for Spark. Make sure there are no other iceberg-spark-runtime jars.

I'm sure there is only one iceberg-spark-runtime jar, and the local and remote copies are the same.

And if I run:

ALTER TABLE test.ice_test SET TBLPROPERTIES ('write.merge.mode'='copy-on-write');

then the MERGE INTO statement executes successfully.
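
For reference, the current merge mode can be checked per table and switched back once the underlying issue is resolved (sketch):

// Show the table's current write properties, then restore merge-on-read.
spark.sql("SHOW TBLPROPERTIES test.ice_test").show(false)
spark.sql("ALTER TABLE test.ice_test SET TBLPROPERTIES ('write.merge.mode'='merge-on-read')")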
