Apache Hudi 0.11.0: New Features Explained

1. Introduction to the new multi-modal index (Multi-Modal Index) in Apache Hudi 0.11.0 (a configuration sketch follows this outline)

2. New Spark SQL features and improvements to the Flink integration (a SQL sketch follows the slides)

3. A quick tour of other features and improvements

(The PDF is available for download at the end of this article.)
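To make item 1 concrete, here is a minimal, hypothetical sketch of how the multi-modal index can be switched on from the Spark DataFrame API: the metadata table is enabled together with its column-stats and bloom-filter index partitions at write time, and data skipping is enabled at read time. The paths, table name, and field names are placeholders, and the option keys follow the 0.11.0 configuration names; verify them against the release documentation for your version.

import org.apache.spark.sql.SaveMode

// Placeholder input data and target path -- replace with your own.
val basePath = "hdfs:///tmp/hudi/trips_cow"
val df = spark.read.parquet("hdfs:///tmp/source/trips")

df.write.format("hudi").
  option("hoodie.table.name", "trips_cow").
  option("hoodie.datasource.write.recordkey.field", "uuid").
  option("hoodie.datasource.write.precombine.field", "ts").
  option("hoodie.datasource.write.partitionpath.field", "dt").
  // Multi-modal index: build the column-stats and bloom-filter
  // partitions inside the metadata table.
  option("hoodie.metadata.enable", "true").
  option("hoodie.metadata.index.column.stats.enable", "true").
  option("hoodie.metadata.index.bloom.filter.enable", "true").
  mode(SaveMode.Append).
  save(basePath)

// At query time, file pruning ("data skipping") uses the column-stats index.
val pruned = spark.read.format("hudi").
  option("hoodie.metadata.enable", "true").
  option("hoodie.enable.data.skipping", "true").
  load(basePath).
  where("dt = '2022-05-01'")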

[Slides 1–42 of the Apache Hudi 0.11.0 presentation PDF]
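Returning to item 2 of the outline, Spark SQL DDL/DML on Hudi tables runs through the Hudi session extension (and, on newer Spark versions, the HoodieCatalog) that also appears in the spark-shell invocation quoted in the comments below. The following is a small, hypothetical sketch under those settings; the table and column names are invented for illustration.

// Assumes the shell was started with, e.g.:
//   --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
//   --conf spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension
//   --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog

// Create a copy-on-write Hudi table managed through SQL.
spark.sql("""
  CREATE TABLE IF NOT EXISTS hudi_trips (
    uuid STRING, rider STRING, fare DOUBLE, ts BIGINT, dt STRING
  ) USING hudi
  PARTITIONED BY (dt)
  TBLPROPERTIES (type = 'cow', primaryKey = 'uuid', preCombineField = 'ts')
""")

// Row-level DML goes through the same SQL interface.
spark.sql("INSERT INTO hudi_trips VALUES ('id-1', 'rider-A', 19.10, 1652313600, '2022-05-12')")
spark.sql("UPDATE hudi_trips SET fare = 25.0 WHERE uuid = 'id-1'")
spark.sql("DELETE FROM hudi_trips WHERE uuid = 'id-1'")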

PDF download for this article:

Post any comment in the comment section below to get the PDF download.



This article is reposted from onehouse and DataFun; original link:



8 comments
anderson (in reply to xiaozhch5, 1 year ago):

Learning Hudi 0.11

用户6937 (1 year ago):

Learning Hudi 0.11

edward (1 year ago):

pdf

anderson (1 year ago):

Learning Hudi 0.11

anderson (1 year ago):

Learning Hudi 0.11

1273 (1 year ago):

I want to get the slide. Thanks.

1273 (1 year ago):

[hadoop@header script]$
[hadoop@header script]$ $SPARK_HOME/bin/spark-shell --master yarn --deploy-mode client \
>  --jars /home/hadoop/hudi-jars/hudi-spark3.3-bundle_2.12-0.12.0.jar,/home/hadoop/hudi-jars/hudi-hadoop-mr-bundle-0.12.0.jar,/home/hadoop/hudi-jars/hudi-utilities-slim-bundle_2.12-0.12.0.jar \
>  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
>  --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
>  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
>  --conf spark.driver.extraJavaOptions="-Dlog4j.configuration=file:log4j.properties" \
>  --conf spark.default.parallelism=720 \
>  --conf spark.sql.shuffle.partitions=720
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/spark-3.3.0-bin-hadoop3/jars/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-3.3.4/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://header:4040
Spark context available as 'sc' (master = yarn, app id = application_1663641776218_0019).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/

Using Scala version 2.12.15 (Java HotSpot(TM) 64-Bit Server VM, Java 17.0.4.1)
Type in expressions to have them evaluated.
Type :help for more information.

scala> def log(str: => String): Unit = {
   |  println(s"${java.time.LocalDateTime.now} $str")
   | }
log: (str: => String)Unit

scala>

scala> val scaleFactor = 2000;
scaleFactor: Int = 2000

scala> val format = "parquet";
format: String = parquet

scala> val fsdir = s"hdfs://header:9000";
fsdir: String = hdfs://header:9000

scala> val hudi_path = s"${fsdir}/hudi/tpcds_${format}/${scaleFactor}";
hudi_path: String = hdfs://header:9000/hudi/tpcds_parquet/2000

scala>

scala> val tableName = "store_sales";
tableName: String = store_sales

scala> val targetLocation = s"${hudi_path}/${tableName}";
targetLocation: String = hdfs://header:9000/hudi/tpcds_parquet/2000/store_sales

scala>

scala> :paste
// Entering paste mode (ctrl-D to finish)

val df = spark.read.format("hudi")
     .option("hoodie.metadata.enable", "true")      // read through the metadata table
     .option("hoodie.enable.data.skipping", "true") // prune files with the column-stats index
     .load(targetLocation);

// Exiting paste mode, now interpreting.

df: org.apache.spark.sql.DataFrame = [_hoodie_commit_time: string, _hoodie_commit_seqno: string … 26 more fields]

scala> val viewName = "vew_store_sales";
viewName: String = vew_store_sales

scala> df.createOrReplaceTempView(viewName);

scala>

scala> val sql4 = s""" select *
   | | from (select avg(ss_list_price) B1_LP
   | |      ,count(ss_list_price) B1_CNT
   | |      ,count(distinct ss_list_price) B1_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 0 and 5
   | |    and (ss_list_price between 8 and 8+10
   | |      or ss_coupon_amt between 459 and 459+1000
   | |      or ss_wholesale_cost between 57 and 57+20)) B1 cross join
   | |  (select avg(ss_list_price) B2_LP
   | |      ,count(ss_list_price) B2_CNT
   | |      ,count(distinct ss_list_price) B2_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 6 and 10
   | |    and (ss_list_price between 90 and 90+10
   | |      or ss_coupon_amt between 2323 and 2323+1000
   | |      or ss_wholesale_cost between 31 and 31+20)) B2 cross join
   | |  (select avg(ss_list_price) B3_LP
   | |      ,count(ss_list_price) B3_CNT
   | |      ,count(distinct ss_list_price) B3_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 11 and 15
   | |    and (ss_list_price between 142 and 142+10
   | |      or ss_coupon_amt between 12214 and 12214+1000
   | |      or ss_wholesale_cost between 79 and 79+20)) B3 cross join
   | |  (select avg(ss_list_price) B4_LP
   | |      ,count(ss_list_price) B4_CNT
   | |      ,count(distinct ss_list_price) B4_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 16 and 20
   | |    and (ss_list_price between 135 and 135+10
   | |      or ss_coupon_amt between 6071 and 6071+1000
   | |      or ss_wholesale_cost between 38 and 38+20)) B4 cross join
   | |  (select avg(ss_list_price) B5_LP
   | |      ,count(ss_list_price) B5_CNT
   | |      ,count(distinct ss_list_price) B5_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 21 and 25
   | |    and (ss_list_price between 122 and 122+10
   | |      or ss_coupon_amt between 836 and 836+1000
   | |      or ss_wholesale_cost between 17 and 17+20)) B5 cross join
   | |  (select avg(ss_list_price) B6_LP
   | |      ,count(ss_list_price) B6_CNT
   | |      ,count(distinct ss_list_price) B6_CNTD
   | |   from ${viewName}
   | |   where ss_quantity between 26 and 30
   | |    and (ss_list_price between 154 and 154+10
   | |      or ss_coupon_amt between 7326 and 7326+1000
   | |      or ss_wholesale_cost between 7 and 7+20)) B6
   | | limit 100 """
sql4: String =
" select *
 from (select avg(ss_list_price) B1_LP
      ,count(ss_list_price) B1_CNT
      ,count(distinct ss_list_price) B1_CNTD
   from vew_store_sales
   where ss_quantity between 0 and 5
    and (ss_list_price between 8 and 8+10
      or ss_coupon_amt between 459 and 459+1000
      or ss_wholesale_cost between 57 and 57+20)) B1 cross join
  (select avg(ss_list_price) B2_LP
      ,count(ss_list_price) B2_CNT
      ,count(distinct ss_list_price) B2_CNTD
   from vew_store_sales
   where ss_quantity between 6 and 10
    and (ss_list_price between 90 and 90+10
      or ss_coupon_amt between 2323 and 2323+1000
      or ss_wholesale_cost between 31 and 31+20)) B2 cross join
  (select avg(…

scala> spark.sql(sql4).show();
00:33 WARN: [kryo] Unable to load class [Lorg.apache.hudi.common.model.DeleteRecord; with kryo's ClassLoader. Retrying with current..
22/09/21 07:54:28 ERROR AbstractHoodieLogRecordReader: Got exception when reading log file
com.esotericsoftware.kryo.KryoException: Unable to find class: [Lorg.apache.hudi.common.model.DeleteRecord;
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:804) ~[kryo-shaded-4.0.2.jar:?]
    at org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:104) ~[?:?]
    at org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:78) ~[?:?]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106) ~[?:?]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91) ~[?:?]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:473) ~[?:?]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:343) ~[?:?]
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.getRecordsByKeyPrefixes(HoodieMetadataMergedLogRecordReader.java:109) ~[?:?]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.readLogRecords(HoodieBackedTableMetadata.java:267) ~[?:?]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeyPrefixes$7539c171$1(HoodieBackedTableMetadata.java:180) ~[?:?]
    at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38) ~[?:?]
    at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:124) ~[?:?]
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
Caused by: java.lang.ClassNotFoundException: org.apache.hudi.common.model.DeleteRecord
    at jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641) ~[?:?]
    at jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) ~[?:?]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:520) ~[?:?]
    at java.lang.Class.forName0(Native Method) ~[?:?]
    at java.lang.Class.forName(Class.java:467) ~[?:?]
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154) ~[kryo-shaded-4.0.2.jar:?]
    … 27 more
22/09/21 07:54:28 ERROR HoodieFileIndex: Failed to lookup candidate files in File Index
org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: Error occurs when executing map
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
    at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:480) ~[?:?]
    at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:562) ~[?:?]
    at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:591) ~[?:?]
    at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:689) ~[?:?]
    at java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927) ~[?:?]
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:?]
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
    at org.apache.hudi.common.data.HoodieListPairData.groupByKey(HoodieListPairData.java:110) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.ColumnStatsIndexSupport.transpose(ColumnStatsIndexSupport.scala:224) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.ColumnStatsIndexSupport.$anonfun$loadTransposed$1(ColumnStatsIndexSupport.scala:119) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.HoodieCatalystUtils$.withPersistedData(HoodieCatalystUtils.scala:60) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.ColumnStatsIndexSupport.loadTransposed(ColumnStatsIndexSupport.scala:118) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.HoodieFileIndex.$anonfun$lookupCandidateFilesInMetadataTable$1(HoodieFileIndex.scala:214) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
    at org.apache.hudi.HoodieFileIndex.lookupCandidateFilesInMetadataTable(HoodieFileIndex.scala:191) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.HoodieFileIndex.listFiles(HoodieFileIndex.scala:117) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.selectedPartitions$lzycompute(DataSourceScanExec.scala:254) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.selectedPartitions(DataSourceScanExec.scala:249) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.dynamicallySelectedPartitions$lzycompute(DataSourceScanExec.scala:284) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.dynamicallySelectedPartitions(DataSourceScanExec.scala:265) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD$lzycompute(DataSourceScanExec.scala:463) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD(DataSourceScanExec.scala:448) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FileSourceScanExec.doExecuteColumnar(DataSourceScanExec.scala:547) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeColumnar$1(SparkPlan.scala:221) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:232) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:229) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeColumnar(SparkPlan.scala:217) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.InputAdapter.doExecuteColumnar(WholeStageCodegenExec.scala:521) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeColumnar$1(SparkPlan.scala:221) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:232) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:229) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeColumnar(SparkPlan.scala:217) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.ColumnarToRowExec.inputRDDs(Columnar.scala:205) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:238) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:51) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.aggregate.AggregateCodegenSupport.inputRDDs(AggregateCodegenSupport.scala:89) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.aggregate.AggregateCodegenSupport.inputRDDs$(AggregateCodegenSupport.scala:88) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:47) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:751) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:194) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:232) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:229) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:190) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.inputRDD$lzycompute(ShuffleExchangeExec.scala:135) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.inputRDD(ShuffleExchangeExec.scala:135) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture$lzycompute(ShuffleExchangeExec.scala:140) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture(ShuffleExchangeExec.scala:139) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.$anonfun$submitShuffleJob$1(ShuffleExchangeExec.scala:68) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:232) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:229) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob(ShuffleExchangeExec.scala:68) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob$(ShuffleExchangeExec.scala:67) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.submitShuffleJob(ShuffleExchangeExec.scala:115) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture$lzycompute(QueryStageExec.scala:174) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture(QueryStageExec.scala:174) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.doMaterialize(QueryStageExec.scala:176) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.QueryStageExec.materialize(QueryStageExec.scala:82) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5(AdaptiveSparkPlanExec.scala:258) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5$adapted(AdaptiveSparkPlanExec.scala:256) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
    at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
    at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:256) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:367) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.executeCollect(AdaptiveSparkPlanExec.scala:340) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3868) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.$anonfun$head$1(Dataset.scala:2863) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3858) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3856) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:109) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3856) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.head(Dataset.scala:2863) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.take(Dataset.scala:3084) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.getRows(Dataset.scala:288) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.showString(Dataset.scala:327) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.show(Dataset.scala:808) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.show(Dataset.scala:767) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.sql.Dataset.show(Dataset.scala:776) ~[spark-sql_2.12-3.3.0.jar:3.3.0]
    at $line26.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:24) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:28) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:30) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:32) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw$$iw$$iw.<init>(<console>:34) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw$$iw.<init>(<console>:36) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw$$iw.<init>(<console>:38) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$$iw.<init>(<console>:40) ~[scala-library-2.12.15.jar:?]
    at $line26.$read.<init>(<console>:42) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$.<init>(<console>:46) ~[scala-library-2.12.15.jar:?]
    at $line26.$read$.<clinit>(<console>) ~[scala-library-2.12.15.jar:?]
    at $line26.$eval$.$print$lzycompute(<console>:7) ~[scala-library-2.12.15.jar:?]
    at $line26.$eval$.$print(<console>:6) ~[scala-library-2.12.15.jar:?]
    at $line26.$eval.$print(<console>) ~[scala-library-2.12.15.jar:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568) ~[scala-compiler-2.12.15.jar:?]
    at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36) ~[scala-reflect-2.12.15.jar:?]
    at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116) ~[scala-reflect-2.12.15.jar:?]
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41) ~[scala-reflect-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:865) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:733) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:435) ~[scala-compiler-2.12.15.jar:?]
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:456) ~[scala-compiler-2.12.15.jar:?]
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:239) ~[spark-repl_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.repl.Main$.doMain(Main.scala:78) ~[spark-repl_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.repl.Main$.main(Main.scala:58) ~[spark-repl_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.repl.Main.main(Main.scala) ~[spark-repl_2.12-3.3.0.jar:3.3.0]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055) ~[spark-core_2.12-3.3.0.jar:3.3.0]
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.12-3.3.0.jar:3.3.0]
Caused by: org.apache.hudi.exception.HoodieException: Error occurs when executing map
    at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:40) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:124) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
Caused by: org.apache.hudi.exception.HoodieException: Exception when reading log file
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:352) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.getRecordsByKeyPrefixes(HoodieMetadataMergedLogRecordReader.java:109) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.readLogRecords(HoodieBackedTableMetadata.java:267) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeyPrefixes$7539c171$1(HoodieBackedTableMetadata.java:180) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:124) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: [Lorg.apache.hudi.common.model.DeleteRecord;
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:804) ~[kryo-shaded-4.0.2.jar:?]
    at org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:104) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:78) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:473) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:343) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.getRecordsByKeyPrefixes(HoodieMetadataMergedLogRecordReader.java:109) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.readLogRecords(HoodieBackedTableMetadata.java:267) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeyPrefixes$7539c171$1(HoodieBackedTableMetadata.java:180) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:124) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
Caused by: java.lang.ClassNotFoundException: org.apache.hudi.common.model.DeleteRecord
    at jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641) ~[?:?]
    at jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) ~[?:?]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:520) ~[?:?]
    at java.lang.Class.forName0(Native Method) ~[?:?]
    at java.lang.Class.forName(Class.java:467) ~[?:?]
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693) ~[kryo-shaded-4.0.2.jar:?]
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:804) ~[kryo-shaded-4.0.2.jar:?]
    at org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:104) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:78) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:473) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:343) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.getRecordsByKeyPrefixes(HoodieMetadataMergedLogRecordReader.java:109) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.readLogRecords(HoodieBackedTableMetadata.java:267) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeyPrefixes$7539c171$1(HoodieBackedTableMetadata.java:180) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:124) ~[hudi-spark3.3-bundle_2.12-0.12.0.jar:0.12.0]
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
