Fixing the plugins loading failure for Flink on YARN

Flink version: 1.13.6

1. Problem

A Flink job running on a YARN cluster fails to load its plugins, so fetching job parameters through the external resource framework no longer works.
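For context, the job parameters are fetched at runtime through Flink's external resource framework, whose drivers are discovered by the plugins mechanism; when the plugins directory is not loaded, the lookup simply comes back empty. A minimal sketch of such a read, assuming an illustrative resource name "task-params" and property key "value" (not taken from the original setup):

java
import java.util.Optional;
import java.util.Set;

import org.apache.flink.api.common.externalresource.ExternalResourceInfo;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

/** Sketch of a task that pulls a parameter from an external resource. */
public class ParamReadingMapper extends RichMapFunction<String, String> {

    private transient String param;

    @Override
    public void open(Configuration parameters) {
        // The infos come from the external resource driver, which is loaded as a plugin.
        // If the plugins directory is not picked up, this set is empty and the parameter is lost.
        Set<ExternalResourceInfo> infos =
                getRuntimeContext().getExternalResourceInfos("task-params");
        param =
                infos.stream()
                        .map(info -> info.getProperty("value"))
                        .filter(Optional::isPresent)
                        .map(Optional::get)
                        .findFirst()
                        .orElse(null);
    }

    @Override
    public String map(String input) {
        return input + "|" + param;
    }
}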

2. Diagnosis

  1. Check the jars and plugin files shipped to the YARN container: the job jars are uploaded correctly.

  2. Trace the source code:

    Entry point for loading plugins, in TaskManagerRunner.class:

    PluginUtils.createPluginManagerFromRootFolder

    Entry point for loading the external resource parameters, also in TaskManagerRunner.class:

    ExternalResourceUtils.createStaticExternalResourceInfoProviderFromConfig

    Review the log output

    Locate the PluginConfig source (see the sketch below)
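Tracing into PluginConfig explains the failure: in stock Flink 1.13 the plugins directory is resolved only from the FLINK_PLUGINS_DIR environment variable, so nothing placed in the Flink configuration is ever consulted. A paraphrased sketch of the relevant upstream method (not the verbatim source):

java
// Paraphrased from Flink 1.13's PluginConfig#getPluginsDir(): only the environment is consulted.
public static Optional<File> getPluginsDir() {
    String pluginsDir =
            System.getenv()
                    .getOrDefault(
                            ConfigConstants.ENV_FLINK_PLUGINS_DIR,       // "FLINK_PLUGINS_DIR"
                            ConfigConstants.DEFAULT_FLINK_PLUGINS_DIRS); // "plugins"

    File pluginsDirFile = new File(pluginsDir);
    if (!pluginsDirFile.isDirectory()) {
        // This is the warning to look for in the container logs when plugin loading is skipped.
        LOG.warn("The plugins directory [{}] does not exist.", pluginsDirFile);
        return Optional.empty();
    }
    return Optional.of(pluginsDirFile);
}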

3. Solution

Override the cluster's PluginConfig.java so that the plugins directory is read from the configuration first, falling back to the environment variable:

java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.flink.core.plugin;

import org.apache.commons.lang3.StringUtils;

import org.apache.flink.configuration.ConfigConstants;
import org.apache.flink.configuration.ConfigOptions;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.CoreOptions;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.File;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Optional;

/**
 * Stores the configuration for the plugins mechanism. Modified copy that additionally honors the
 * {@code flink_plugins_dir} key from the Flink configuration.
 */
@SuppressWarnings("OptionalUsedAsFieldOrParameterType")
public class PluginConfig {
    private static final Logger LOG = LoggerFactory.getLogger(PluginConfig.class);

    /** Custom configuration key used to resolve the plugins directory from the Flink configuration. */
    private static final String PUPU_FLINK_PLUGINS_DIR = "flink_plugins_dir";

    private final Optional<Path> pluginsPath;

    private final String[] alwaysParentFirstPatterns;

    private PluginConfig(Optional<Path> pluginsPath, String[] alwaysParentFirstPatterns) {
        pluginsPath.ifPresent(path -> LOG.info("pluginsPath: {}", path));
        LOG.info("alwaysParentFirstPatterns: {}", Arrays.stream(alwaysParentFirstPatterns).toArray());
        this.pluginsPath = pluginsPath;
        this.alwaysParentFirstPatterns = alwaysParentFirstPatterns;
    }

    public Optional<Path> getPluginsPath() {
        return pluginsPath;
    }

    public String[] getAlwaysParentFirstPatterns() {
        return alwaysParentFirstPatterns;
    }

    public static PluginConfig fromConfiguration(Configuration configuration) {
        return new PluginConfig(
                getPluginsDir(configuration).map(File::toPath),
                CoreOptions.getPluginParentFirstLoaderPatterns(configuration));
    }

    public static Optional<File> getPluginsDir(Configuration configuration) {
        // Changed behavior: read the plugins directory from the Flink configuration first
        // (key "flink_plugins_dir"), so it can be set per job/cluster on YARN.
        String pluginsDir =
                configuration.get(
                        ConfigOptions.key(PUPU_FLINK_PLUGINS_DIR).stringType().noDefaultValue());
        if (StringUtils.isBlank(pluginsDir)) {
            // Fall back to the stock Flink behavior: the FLINK_PLUGINS_DIR environment variable,
            // defaulting to the "plugins" directory.
            pluginsDir =
                    System.getenv()
                            .getOrDefault(
                                    ConfigConstants.ENV_FLINK_PLUGINS_DIR,
                                    ConfigConstants.DEFAULT_FLINK_PLUGINS_DIRS);
        }
        File pluginsDirFile = new File(pluginsDir);
        if (!pluginsDirFile.isDirectory()) {
            LOG.warn("The plugins directory [{}] does not exist.", pluginsDirFile);
            return Optional.empty();
        }
        return Optional.of(pluginsDirFile);
    }

}
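To apply the fix, the rebuilt class has to shadow the PluginConfig shipped with the cluster, for example by rebuilding it into the distribution or placing it ahead of the flink-dist jar on the classpath, keeping the original org.apache.flink.core.plugin package. The plugins directory can then be supplied through the flink_plugins_dir key, either in flink-conf.yaml or as a dynamic property at submission time; the exact flag depends on the deployment mode, so treat this as an outline rather than a drop-in recipe.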