Fixing the error in the official Flink 1.16+ Elasticsearch 7 connector example

When Flink writes to Elasticsearch 7, the sink is built with the Elasticsearch7SinkBuilder class, which differs from the ES 6 connection style; see the official Flink Elasticsearch connector documentation.

The official Flink example for connecting to ES 7

Dependency:

xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7</artifactId>
    <version>3.0.1-1.16</version>
</dependency>

The official ES 7 connection code:

java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.streaming.api.datastream.DataStream;

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.HashMap;
import java.util.Map;

DataStream<String> input = ...;

input.sinkTo(
    new Elasticsearch7SinkBuilder<String>()
        .setBulkFlushMaxActions(1) // Instructs the sink to emit after every element, otherwise they would be buffered
        .setHosts(new HttpHost("127.0.0.1", 9200, "http"))
        .setEmitter(
        (element, context, indexer) ->
        indexer.add(createIndexRequest(element)))
        .build());

private static IndexRequest createIndexRequest(String element) {
    Map<String, Object> json = new HashMap<>();
    json.put("data", element);

    return Requests.indexRequest()
        .index("my-index")
        .id(element)
        .source(json);
}

Running it locally throws:

java.lang.IllegalStateException: The elasticsearch emitter must be serializable.

The error message is clear: the emitter is not serializable. Looking at the ElasticsearchEmitter source, the interface does ultimately extend Serializable, so the guess is that the lambda cannot be serialized by itself here; the fix is to write a class that implements the ElasticsearchEmitter interface and explicitly implements Serializable.
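As background, in plain Java a lambda or method reference is serializable only when its target interface extends Serializable. A minimal sketch of that rule, using the same serialize-and-see check style that Flink's builder performs (SerializableFn and isSerializable are hypothetical helpers written for this sketch, not Flink APIs):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class LambdaSerializationDemo {
    // Hypothetical stand-in for an emitter-style interface that extends Serializable
    interface SerializableFn<T, R> extends Function<T, R>, Serializable {}

    // Try to serialize the object and report whether it succeeded
    static boolean isSerializable(Object o) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        SerializableFn<String, Integer> ok = String::length; // target type extends Serializable
        Function<String, Integer> plain = String::length;    // plain Function does not
        System.out.println(isSerializable(ok));    // true
        System.out.println(isSerializable(plain)); // false
    }
}
```

By this rule a lambda targeting ElasticsearchEmitter (which extends Serializable) would normally serialize as well, so the exact trigger may depend on compiler or capture details; a named class sidesteps the question entirely.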

Modified code:

java
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.connector.elasticsearch.sink.ElasticsearchEmitter;
import org.apache.flink.connector.elasticsearch.sink.RequestIndexer;
import org.apache.flink.streaming.api.datastream.DataStream;

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

DataStream<String> input = ...;

input.sinkTo(
    new Elasticsearch7SinkBuilder<String>()
        .setBulkFlushMaxActions(1) // Instructs the sink to emit after every element, otherwise they would be buffered
        .setHosts(new HttpHost("127.0.0.1", 9200, "http"))
        .setEmitter(new MyElasticsearchEmitter()) // use the named, serializable emitter instead of the lambda
        .build());

private static IndexRequest createIndexRequest(String element) {
    Map<String, Object> json = new HashMap<>();
    json.put("data", element);

    return Requests.indexRequest()
        .index("my-index")
        .id(element)
        .source(json);
}

// Custom class implementing ElasticsearchEmitter and explicitly implementing Serializable
static class MyElasticsearchEmitter implements ElasticsearchEmitter<String>, Serializable {

    @Override
    public void emit(String element, SinkWriter.Context context, RequestIndexer indexer) {
        indexer.add(createIndexRequest(element));
    }
}

After restarting, the job was verified to run successfully.
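Before deploying, the builder's check can be mimicked locally: write the emitter through an ObjectOutputStream and, for a stateful emitter, read it back and confirm its configuration survives the round trip, since that is what happens when Flink ships the sink to task managers. A minimal sketch with stand-in types (Emitter, MyEmitter, and roundTrip are hypothetical names for this sketch, not the real Flink/ES classes):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class EmitterRoundTripDemo {
    // Hypothetical stand-in for ElasticsearchEmitter, kept minimal
    interface Emitter<T> extends Serializable { void emit(T element); }

    // The pattern from the fix: a named class carrying its configuration as a field
    static class MyEmitter implements Emitter<String>, Serializable {
        final String index;
        MyEmitter(String index) { this.index = index; }
        @Override public void emit(String element) { /* build an IndexRequest here */ }
    }

    // Serialize then deserialize, returning the reconstructed object
    static <T extends Serializable> T roundTrip(T obj) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            T copy = (T) ois.readObject();
            return copy;
        }
    }

    public static void main(String[] args) throws Exception {
        MyEmitter copy = roundTrip(new MyEmitter("my-index"));
        System.out.println(copy.index); // my-index
    }
}
```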

A complete utility class extracted for connecting Flink to ES 7

java
import com.alibaba.fastjson.JSON;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.connector.elasticsearch.sink.*;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.io.Serializable;

@Slf4j
public class Es7SinkFactory {

    public static Sink<String> createSink(String host, Integer port, String schema, String user, String pwd, String index) {
        try {

            ElasticsearchSink<String> elasticsearchSink = new Elasticsearch7SinkBuilder<String>()
                    .setHosts(new HttpHost(host, port, schema))
                    .setConnectionUsername(user)
                    .setConnectionPassword(pwd)
                    .setEmitter(new MyElasticsearchEmitter(index))
                    .setBulkFlushMaxSizeMb(5)       // max buffered size in MB before flushing
                    .setBulkFlushInterval(5000)     // flush interval in milliseconds
                    .setBulkFlushMaxActions(1000)   // max number of buffered actions per bulk request
                    .setBulkFlushBackoffStrategy(FlushBackoffType.CONSTANT, 5, 100) // on failure, retry up to 5 times with a constant 100 ms delay
                    .build();

            return elasticsearchSink;
        } catch (Exception e) {
            log.error("Es7SinkFactory.createSink error", e);
        }

        return null;
    }

    private static IndexRequest createIndexRequest(String element, String index) {
        return Requests.indexRequest()
                .index(index)
                .source(JSON.parseObject(element));
    }


    // Anonymous inner classes / lambdas do not come with serialization; implement Serializable explicitly
    static class MyElasticsearchEmitter implements ElasticsearchEmitter<String>, Serializable {

        private String index;

        MyElasticsearchEmitter(String index) {
            this.index = index;
        }

        @Override
        public void emit(String element, SinkWriter.Context context, RequestIndexer indexer) {
            indexer.add(createIndexRequest(element, index));
        }

    }


}