Running
spark.sql("CALL sys.create_tag(`table` => 'pipeline.bigdata_biz.tb1', tag => 'tag_${last1day_dt}')")
fails with:
java.lang.RuntimeException: spark_catalog is not a ProcedureCatalog.
at org.apache.paimon.spark.catalyst.analysis.PaimonProcedureResolver$CatalogValidator.asProcedureCatalog(PaimonProcedureResolver.scala:237)
at org.apache.paimon.spark.catalyst.analysis.PaimonProcedureResolver$$anonfun$apply$1.applyOrElse(PaimonProcedureResolver.scala:54)
at org.apache.paimon.spark.catalyst.analysis.PaimonProcedureResolver$$anonfun$apply$1.applyOrElse(PaimonProcedureResolver.scala:52)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
Solution
The CALL statement is resolved against the current catalog, which defaults to spark_catalog, and spark_catalog is not a Paimon ProcedureCatalog. Before creating the tag, you must switch to the catalog that contains the table:
spark.sql("refresh table pipeline.bigdata_biz.tb1")
spark.sql("use pipeline")  # without switching to the table's catalog, the tag cannot be created
spark.sql("CALL sys.create_tag(`table` => 'pipeline.bigdata_biz.tb1', tag => 'tag_${last1day_dt}')")
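For the `use pipeline` statement to succeed, a Paimon catalog named `pipeline` must already be registered with the Spark session. A minimal sketch of such a registration is shown below; the warehouse path is a placeholder, not taken from the original setup:

```properties
# Register a Paimon catalog named "pipeline" (warehouse path is a placeholder)
spark.sql.catalog.pipeline=org.apache.paimon.spark.SparkCatalog
spark.sql.catalog.pipeline.warehouse=hdfs:///path/to/paimon/warehouse
# Enables Paimon SQL extensions, including CALL procedures such as sys.create_tag
spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
```

With this configuration in place, `spark.sql("use pipeline")` makes the Paimon catalog current, so `sys.create_tag` resolves against a ProcedureCatalog instead of spark_catalog.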