Environment
spark-sql is Spark version 3.2
// the Spark cluster itself is version 3.5
Paimon 1.2
Launch command
sudo -i spark-sql \
--master local[*] \
--conf spark.sql.catalogImplementation=hive \
--conf spark.driver.memory=4g \
--conf spark.executor.memory=5g \
--conf spark.executor.cores=2 \
--conf spark.executor.instances=2 \
--jars /opt/resource/paimon/paimon-spark-3.2-1.2.0.jar \
--conf spark.sql.catalog.paimon=org.apache.paimon.spark.SparkCatalog \
--conf spark.sql.catalog.paimon.warehouse=s3a://aaaaaaa/paimon_lc/ \
--conf spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions \
--conf spark.sql.cli.print.header=true
To make this persist across sessions, it is enough to always launch with:
--conf spark.sql.catalogImplementation=hive
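Rather than passing the flags on every launch, the same settings can be persisted in Spark's defaults file. A minimal sketch, mirroring the CLI flags above (the file path is Spark's standard location; adjust the jar path and warehouse bucket to your deployment):

```
# $SPARK_HOME/conf/spark-defaults.conf — sketch of the launch flags above
spark.sql.catalogImplementation    hive
spark.sql.catalog.paimon           org.apache.paimon.spark.SparkCatalog
spark.sql.catalog.paimon.warehouse s3a://aaaaaaa/paimon_lc/
spark.sql.extensions               org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
spark.jars                         /opt/resource/paimon/paimon-spark-3.2-1.2.0.jar
```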
Symptom
create view v1 as select * from paimon.db1.table1 limit 1; // this creates a view in spark_catalog that points at a Paimon table.
However, after restarting spark-sql, querying the v1 view directly fails with:
When using Paimon, it is necessary to configure spark.sql.extensions and ensure that it includes org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions. You can disable this check by configuring spark.paimon.requiredSparkConfsCheck.enabled to false, but it is strongly discouraged to do so.
However, after switching catalogs back and forth with use paimon; use spark_catalog;, querying view v1 works without any error.
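The workaround just described, written out as a session transcript (view and table names are the ones from this report):

```sql
-- fresh spark-sql session: querying the view directly raises the
-- requiredSparkConfsCheck error quoted above
-- SELECT * FROM v1;

-- workaround: touch the Paimon catalog first, then switch back
USE paimon;
USE spark_catalog;
SELECT * FROM v1;   -- now resolves without error
```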
Root cause unknown. (One unverified guess: resolving the view directly in spark_catalog may happen before the Paimon session state is fully initialized, and switching to the paimon catalog forces that initialization.)