Exceptions when integrating Hudi into CDH 6.3.2. First fetch the Hudi source code: git clone https://github.com/apache/hudi.git. When building the jars from the hudi root directory, two problems come up: a jar dependency that is not imported, and an OpenJDK version issue. The two errors are shown below.
1. Build command
Run the following from the Hudi root directory:
mvn clean package -DskipTests -Dcheckstyle.skip -Dspark3.4 -Dflink1.18 -Dscala-2.12 -Dhadoop.version=3.3.5 -Pflink-bundle-shade-hive3
-Dspark3.4 builds against Spark 3.4
-Dflink1.18 builds against Flink 1.18
-Dhadoop.version=3.3.5 builds against Hadoop 3.3.5
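If you rebuild often with different component versions, the version flags can be kept in one place with a small wrapper script. This is a sketch: it only assembles and prints the exact command shown above, so you can review it before running it with eval.

```shell
# Sketch: assemble the Hudi build command from the version choices above.
SPARK=3.4
FLINK=1.18
SCALA=2.12
HADOOP=3.3.5

CMD="mvn clean package -DskipTests -Dcheckstyle.skip"
CMD="$CMD -Dspark$SPARK -Dflink$FLINK -Dscala-$SCALA"
CMD="$CMD -Dhadoop.version=$HADOOP -Pflink-bundle-shade-hive3"

echo "$CMD"   # review the command, then run it with: eval "$CMD"
```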
2. Missing jar dependency
[INFO]
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ hudi-common ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 543 source files to /Users/carter/Desktop/study/hudi/hudi-common/target/classes
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[30,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[31,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[32,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[33,33] package org.codehaus.jackson.util does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[63,11] cannot find symbol
symbol:   class JsonGenerator
location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[83,26] cannot find symbol
symbol:   class JsonGenerator
location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[105,18] cannot find symbol
symbol:   class JsonGenerator
location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[179,33] cannot find symbol
symbol:   class JsonGenerator
location: class org.apache.hudi.avro.JsonEncoder
[INFO] 8 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Hudi 1.0.0-SNAPSHOT:
[INFO]
[INFO] Hudi ............................................... SUCCESS [ 1.715 s]
[INFO] hudi-tests-common .................................. SUCCESS [ 2.741 s]
[INFO] hudi-io ............................................ SUCCESS [ 4.245 s]
[INFO] hudi-hadoop-common ................................. SUCCESS [ 2.177 s]
[INFO] hudi-common ........................................ FAILURE [ 6.635 s]
[INFO] hudi-hadoop-mr ..................................... SKIPPED
[INFO] hudi-sync-common ................................... SKIPPED
[INFO] hudi-hive-sync ..................................... SKIPPED
[INFO] hudi-aws ........................................... SKIPPED
[INFO] hudi-timeline-service .............................. SKIPPED
[INFO] hudi-client ........................................ SKIPPED
[INFO] hudi-client-common ................................. SKIPPED
[INFO] hudi-spark-client .................................. SKIPPED
[INFO] hudi-spark-datasource .............................. SKIPPED
[INFO] hudi-spark-common_2.12 ............................. SKIPPED
[INFO] hudi-spark3-common ................................. SKIPPED
[INFO] hudi-spark3.2plus-common ........................... SKIPPED
[INFO] hudi-spark3.4.x_2.12 ............................... SKIPPED
[INFO] hudi-java-client ................................... SKIPPED
[INFO] hudi-spark_2.12 .................................... SKIPPED
[INFO] hudi-gcp ........................................... SKIPPED
[INFO] hudi-utilities_2.12 ................................ SKIPPED
[INFO] hudi-utilities-bundle_2.12 ......................... SKIPPED
[INFO] hudi-cli ........................................... SKIPPED
[INFO] hudi-flink-client .................................. SKIPPED
[INFO] hudi-datahub-sync .................................. SKIPPED
[INFO] hudi-adb-sync ...................................... SKIPPED
[INFO] hudi-sync .......................................... SKIPPED
[INFO] hudi-hadoop-mr-bundle .............................. SKIPPED
[INFO] hudi-datahub-sync-bundle ........................... SKIPPED
[INFO] hudi-hive-sync-bundle .............................. SKIPPED
[INFO] hudi-aws-bundle .................................... SKIPPED
[INFO] hudi-gcp-bundle .................................... SKIPPED
[INFO] hudi-spark3.4-bundle_2.12 .......................... SKIPPED
[INFO] hudi-presto-bundle ................................. SKIPPED
[INFO] hudi-utilities-slim-bundle_2.12 .................... SKIPPED
[INFO] hudi-timeline-server-bundle ........................ SKIPPED
[INFO] hudi-trino-bundle .................................. SKIPPED
[INFO] hudi-examples ...................................... SKIPPED
[INFO] hudi-examples-common ............................... SKIPPED
[INFO] hudi-examples-spark ................................ SKIPPED
[INFO] hudi-flink-datasource .............................. SKIPPED
[INFO] hudi-flink1.18.x ................................... SKIPPED
[INFO] hudi-flink ......................................... SKIPPED
[INFO] hudi-examples-flink ................................ SKIPPED
[INFO] hudi-examples-java ................................. SKIPPED
[INFO] hudi-flink1.14.x ................................... SKIPPED
[INFO] hudi-flink1.15.x ................................... SKIPPED
[INFO] hudi-flink1.16.x ................................... SKIPPED
[INFO] hudi-flink1.17.x ................................... SKIPPED
[INFO] hudi-kafka-connect ................................. SKIPPED
[INFO] hudi-flink1.18-bundle .............................. SKIPPED
[INFO] hudi-kafka-connect-bundle .......................... SKIPPED
[INFO] hudi-cli-bundle_2.12 ............................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18.704 s
[INFO] Finished at: 2024-02-18T14:24:57+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project hudi-common: Compilation failure: Compilation failure:
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[30,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[31,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[32,28] package org.codehaus.jackson does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[33,33] package org.codehaus.jackson.util does not exist
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[63,11] cannot find symbol
[ERROR] symbol:   class JsonGenerator
[ERROR] location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[83,26] cannot find symbol
[ERROR] symbol:   class JsonGenerator
[ERROR] location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[105,18] cannot find symbol
[ERROR] symbol:   class JsonGenerator
[ERROR] location: class org.apache.hudi.avro.JsonEncoder
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/avro/JsonEncoder.java:[179,33] cannot find symbol
[ERROR] symbol:   class JsonGenerator
[ERROR] location: class org.apache.hudi.avro.JsonEncoder
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :hudi-common
(base) carter@MacBook-Pro hudi %
Solution: the module is missing the pre-2.x (codehaus) Jackson artifact that JsonEncoder.java imports; add the following dependency:
<dependency>
  <groupId>org.codehaus.jackson</groupId>
  <artifactId>jackson-core-asl</artifactId>
  <version>1.9.13</version>
</dependency>
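For placement, a sketch of where the dependency goes. Targeting hudi-common/pom.xml is an assumption based on hudi-common being the failing module; adding it to the root pom's <dependencies> section would also make it visible to all modules.

```xml
<!-- hudi-common/pom.xml (assumed location, since hudi-common is the failing module) -->
<dependencies>
  <!-- ... existing dependencies ... -->
  <dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-core-asl</artifactId>
    <version>1.9.13</version>
  </dependency>
</dependencies>
```

After editing the pom, the build can be resumed from the failed module with `mvn <args> -rf :hudi-common`, as the Maven output above suggests.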
3. OpenJDK issue
This build requires the official Oracle JDK; even an OpenJDK build of 1.8 fails. Error output:
[INFO] hudi-cli-bundle_2.12 ............................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30.773 s
[INFO] Finished at: 2024-02-18T10:20:11+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project hudi-common: Compilation failure: Compilation failure:
[ERROR] cannot access org.apache.hadoop.shaded.org.apache.avro.reflect.Stringable
[ERROR] class file for org.apache.hadoop.shaded.org.apache.avro.reflect.Stringable not found
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/metadata/HoodieTableMetadataUtil.java:[282,9] no suitable method found for collect(java.util.stream.Collector<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>,capture#1 of ?,java.util.Map<java.lang.String,org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>>>)
[ERROR] method java.util.stream.Stream.<R>collect(java.util.function.Supplier<R>,java.util.function.BiConsumer<R,? super org.apache.hudi.common.model.HoodieColumnRangeMetadata>,java.util.function.BiConsumer<R,R>) is not applicable
[ERROR] (cannot infer type-variable(s) R
[ERROR] (actual and formal argument lists differ in length))
[ERROR] method java.util.stream.Stream.<R,A>collect(java.util.stream.Collector<? super org.apache.hudi.common.model.HoodieColumnRangeMetadata,A,R>) is not applicable
[ERROR] (cannot infer type-variable(s) R,A
[ERROR] (argument mismatch; java.util.stream.Collector<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>,capture#1 of ?,java.util.Map<java.lang.String,org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>>> cannot be converted to java.util.stream.Collector<? super org.apache.hudi.common.model.HoodieColumnRangeMetadata,A,R>))
[ERROR] /Users/carter/Desktop/study/hudi/hudi-common/src/main/java/org/apache/hudi/common/util/ParquetUtils.java:[369,11] no suitable method found for collect(java.util.stream.Collector<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>,capture#2 of ?,java.util.Map<java.lang.String,java.util.List<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>>>>)
[ERROR] method java.util.stream.Stream.<R>collect(java.util.function.Supplier<R>,java.util.function.BiConsumer<R,? super org.apache.hudi.common.model.HoodieColumnRangeMetadata>,java.util.function.BiConsumer<R,R>) is not applicable
[ERROR] (cannot infer type-variable(s) R
[ERROR] (actual and formal argument lists differ in length))
[ERROR] method java.util.stream.Stream.<R,A>collect(java.util.stream.Collector<? super org.apache.hudi.common.model.HoodieColumnRangeMetadata,A,R>) is not applicable
[ERROR] (cannot infer type-variable(s) R,A
[ERROR] (argument mismatch; java.util.stream.Collector<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>,capture#2 of ?,java.util.Map<java.lang.String,java.util.List<org.apache.hudi.common.model.HoodieColumnRangeMetadata<java.lang.Comparable>>>> cannot be converted to java.util.stream.Collector<? super org.apache.hudi.common.model.HoodieColumnRangeMetadata,A,R>))
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :hudi-common
(base) carter@MacBook-Pro hudi %
Solution: build with the official Oracle JDK, not an open-source build; any JDK whose java -version output contains "openjdk" will fail.
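Before starting the build you can check which kind of JDK is on the PATH. A minimal sketch (the function name is ours; it just pattern-matches the version banner for "openjdk" in any capitalization):

```shell
# Sketch: detect an OpenJDK build from a `java -version` banner.
# Any banner containing "openjdk" (case-insensitive) means the Hudi build
# above will fail; switch to the official Oracle JDK first.
is_openjdk() {
  case "$1" in
    *[oO][pP][eE][nN][jJ][dD][kK]*) return 0 ;;  # OpenJDK banner detected
    *) return 1 ;;                               # presumably an Oracle JDK
  esac
}

# Usage (against the JDK on your PATH):
# if is_openjdk "$(java -version 2>&1)"; then
#   echo "OpenJDK detected - install the Oracle JDK before running mvn" >&2
# fi
```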
4. Related big data learning demo repository: