Flink 1.16+: fixing the error in the official Elasticsearch 7 connector example

When writing from Flink to Elasticsearch, the sink for ES 7 is built with Elasticsearch7SinkBuilder, which differs from the ES 6 connection style. See the official Flink Elasticsearch connector documentation.

The official example for connecting Flink to ES 7

Dependency:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7</artifactId>
    <version>3.0.1-1.16</version>
</dependency>
```

The official ES 7 connection code:

```java
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.streaming.api.datastream.DataStream;

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.HashMap;
import java.util.Map;

DataStream<String> input = ...;

input.sinkTo(
    new Elasticsearch7SinkBuilder<String>()
        // Emit after every element; otherwise requests are buffered
        .setBulkFlushMaxActions(1)
        .setHosts(new HttpHost("127.0.0.1", 9200, "http"))
        .setEmitter(
            (element, context, indexer) ->
                indexer.add(createIndexRequest(element)))
        .build());

private static IndexRequest createIndexRequest(String element) {
    Map<String, Object> json = new HashMap<>();
    json.put("data", element);

    return Requests.indexRequest()
        .index("my-index")
        .id(element)
        .source(json);
}
```

Running it locally fails with:

```
java.lang.IllegalStateException: The elasticsearch emitter must be serializable.
```

The message is clear enough: the emitter is not considered serializable. Looking at the ElasticsearchEmitter source, the interface itself does ultimately extend Serializable, so my guess was that the lambda form fails the builder's serializability check. The workaround is to write an explicit class that implements ElasticsearchEmitter and also declares Serializable.
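To see why an explicit class is the safer shape, here is a minimal JDK-only sketch (no Flink dependencies; the `Emitter` interface and class names below are hypothetical, not Flink's API). A general rule of Java serialization is that a lambda is serializable only if its target functional interface extends Serializable, whereas an explicit class can simply declare it:

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializableLambdaDemo {

    // Hypothetical callback interface that does NOT extend Serializable
    interface Emitter<T> {
        void emit(T element);
    }

    // Explicit static class implementing both the callback interface and
    // Serializable -- the same shape as the MyElasticsearchEmitter fix
    static class SerializableEmitter implements Emitter<String>, Serializable {
        @Override
        public void emit(String element) { /* no-op for the demo */ }
    }

    // Returns true if the object survives Java serialization, similar in
    // spirit to the check the sink builder performs on the emitter
    static boolean isSerializable(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        Emitter<String> lambda = element -> { };
        Emitter<String> explicit = new SerializableEmitter();

        System.out.println(isSerializable(lambda));   // false: target interface is not Serializable
        System.out.println(isSerializable(explicit)); // true: class declares Serializable
    }
}
```

In Flink's case ElasticsearchEmitter does extend Serializable, so the lambda should in principle pass; in practice, though, switching to an explicit class that declares Serializable made the check succeed.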

The fixed code:

```java
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.connector.elasticsearch.sink.ElasticsearchEmitter;
import org.apache.flink.connector.elasticsearch.sink.RequestIndexer;
import org.apache.flink.streaming.api.datastream.DataStream;

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

DataStream<String> input = ...;

input.sinkTo(
    new Elasticsearch7SinkBuilder<String>()
        // Emit after every element; otherwise requests are buffered
        .setBulkFlushMaxActions(1)
        .setHosts(new HttpHost("127.0.0.1", 9200, "http"))
        // Use the explicit emitter class instead of the lambda
        .setEmitter(new MyElasticsearchEmitter())
        .build());

private static IndexRequest createIndexRequest(String element) {
    Map<String, Object> json = new HashMap<>();
    json.put("data", element);

    return Requests.indexRequest()
        .index("my-index")
        .id(element)
        .source(json);
}

// New class implementing ElasticsearchEmitter and explicitly declaring Serializable
static class MyElasticsearchEmitter implements ElasticsearchEmitter<String>, Serializable {

    @Override
    public void emit(String element, SinkWriter.Context context, RequestIndexer indexer) {
        indexer.add(createIndexRequest(element));
    }
}
```

After starting the job again, it runs successfully.

A complete utility class extracted for the Flink → ES 7 sink

```java
import com.alibaba.fastjson.JSON;
import lombok.extern.slf4j.Slf4j;
import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.connector.elasticsearch.sink.*;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.io.Serializable;

@Slf4j
public class Es7SinkFactory {

    public static Sink<String> createSink(String host, Integer port, String schema, String user, String pwd, String index) {
        try {
            ElasticsearchSink<String> elasticsearchSink = new Elasticsearch7SinkBuilder<String>()
                    .setHosts(new HttpHost(host, port, schema))
                    .setConnectionUsername(user)
                    .setConnectionPassword(pwd)
                    .setEmitter(new MyElasticsearchEmitter(index))
                    .setBulkFlushMaxSizeMb(5)      // max buffer size in MB
                    .setBulkFlushInterval(5000)    // flush interval in ms
                    .setBulkFlushMaxActions(1000)  // max number of buffered bulk actions
                    .setBulkFlushBackoffStrategy(FlushBackoffType.CONSTANT, 5, 100)
                    .build();

            return elasticsearchSink;
        } catch (Exception e) {
            log.error("Es7SinkFactory.createSink error e=", e);
        }

        return null;
    }

    private static IndexRequest createIndexRequest(String element, String index) {
        return Requests.indexRequest()
                .index(index)
                .source(JSON.parseObject(element));
    }

    // Anonymous classes / lambdas do not pass the serializability check,
    // so the emitter is an explicit class that implements Serializable
    static class MyElasticsearchEmitter implements ElasticsearchEmitter<String>, Serializable {

        private final String index;

        MyElasticsearchEmitter(String index) {
            this.index = index;
        }

        @Override
        public void emit(String element, SinkWriter.Context context, RequestIndexer indexer) {
            indexer.add(createIndexRequest(element, index));
        }
    }
}
```