Flink Kafka Sink

  • Code (Java)
package com.jin.demo;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.kafka.clients.producer.ProducerConfig;

import java.util.Properties;

/**
 * @Author: J
 * @Version: 1.0
 * @CreateTime: 2023/6/29
 * @Description: Test
 
 **/
public class FlinkKafkaSink {
    public static void main(String[] args) throws Exception {
        // Create the stream execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Set the parallelism to 1
        env.setParallelism(1);
        // Add the data source (CustomizeSource is a custom source used here for testing)
        SingleOutputStreamOperator<String> mapStream = env.addSource(new CustomizeSource()).map(bean -> bean.toString());
        // Set the producer transaction timeout (only relevant for EXACTLY_ONCE delivery)
        Properties properties = new Properties();
        properties.setProperty(ProducerConfig.TRANSACTION_TIMEOUT_CONFIG, "10000");
        // Build the KafkaSink
        KafkaSink<String> kafkaSink = KafkaSink.<String>builder()
                // Kafka broker address
                .setBootstrapServers("lx01:9092")
                // Record serializer
                .setRecordSerializer(KafkaRecordSerializationSchema.<String>builder()
                        // Target Kafka topic
                        .setTopic("tpc-02")
                        // Value serialization schema
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build()
                )
                // Pass the producer properties to the sink; without this call the
                // transaction timeout configured above is never applied
                .setKafkaProducerConfig(properties)
                // Delivery guarantee
                .setDeliverGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                // Transactional ID prefix (required for EXACTLY_ONCE; unused under AT_LEAST_ONCE)
                .setTransactionalIdPrefix("JL-")
                .build();
        // Write the stream to Kafka
        mapStream.sinkTo(kafkaSink);
        env.execute("Kafka Sink");
    }
}
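The example above uses AT_LEAST_ONCE delivery. For end-to-end exactly-once semantics, the sink additionally needs checkpointing enabled (transactions are committed on checkpoint) and a transaction timeout no larger than the broker's `transaction.max.timeout.ms` (15 minutes by default). A sketch of the parts that would change, assuming the same broker and topic as the original code:

```java
// Sketch only: the EXACTLY_ONCE variant of the sink above.
// Assumes the same broker (lx01:9092) and topic (tpc-02) as in the example.
env.enableCheckpointing(5000);  // mandatory: transactions commit on checkpoint completion

Properties props = new Properties();
// Must not exceed the broker's transaction.max.timeout.ms (default 15 min), and should
// comfortably exceed the checkpoint interval so in-flight transactions do not expire.
props.setProperty(ProducerConfig.TRANSACTION_TIMEOUT_CONFIG, "600000");

KafkaSink<String> exactlyOnceSink = KafkaSink.<String>builder()
        .setBootstrapServers("lx01:9092")
        .setRecordSerializer(KafkaRecordSerializationSchema.<String>builder()
                .setTopic("tpc-02")
                .setValueSerializationSchema(new SimpleStringSchema())
                .build())
        .setKafkaProducerConfig(props)
        .setDeliverGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
        // A prefix unique per job is required so a restarted job can abort orphaned transactions
        .setTransactionalIdPrefix("JL-")
        .build();
```

Note that downstream consumers only see the exactly-once effect when they read with `isolation.level=read_committed`; the default `read_uncommitted` also returns records from aborted transactions.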

Result data

[root@lx01 bin]# ./kafka-console-consumer.sh --bootstrap-server lx01:9092 --topic tpc-02
CustomizeBean(name=AAA-274, age=64, gender=W, hobbit=钓鱼爱好者)
CustomizeBean(name=AAA-973, age=45, gender=W, hobbit=钓鱼爱好者)
CustomizeBean(name=AAA-496, age=71, gender=W, hobbit=非遗文化爱好者)
CustomizeBean(name=AAA-263, age=45, gender=M, hobbit=天文知识爱好者)
CustomizeBean(name=AAA-790, age=77, gender=W, hobbit=书法爱好者)
CustomizeBean(name=AAA-806, age=38, gender=M, hobbit=非遗文化爱好者)
CustomizeBean(name=AAA-498, age=58, gender=M, hobbit=篮球运动爱好者)
CustomizeBean(name=AAA-421, age=63, gender=M, hobbit=书法爱好者)
CustomizeBean(name=AAA-938, age=56, gender=W, hobbit=乒乓球运动爱好者)
CustomizeBean(name=AAA-278, age=18, gender=M, hobbit=乒乓球运动爱好者)
CustomizeBean(name=AAA-614, age=74, gender=W, hobbit=钓鱼爱好者)
CustomizeBean(name=AAA-249, age=67, gender=W, hobbit=天文知识爱好者)
CustomizeBean(name=AAA-690, age=72, gender=W, hobbit=网吧战神)
CustomizeBean(name=AAA-413, age=69, gender=M, hobbit=美食爱好者)
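The lines above are the `toString()` output of the beans emitted by `CustomizeSource`, whose code is not shown in the post. As a plain-Java illustration (class and field names are inferred from the output; the Flink `SourceFunction` wrapper is omitted), the bean and its record generator might look like this:

```java
import java.util.Random;

// Hypothetical reconstruction of the bean behind the console output above;
// only the fields visible in the output (name, age, gender, hobbit) are assumed.
public class CustomizeBean {
    private final String name;
    private final int age;
    private final String gender;
    private final String hobbit;

    public CustomizeBean(String name, int age, String gender, String hobbit) {
        this.name = name;
        this.age = age;
        this.gender = gender;
        this.hobbit = hobbit;
    }

    // Matches the Lombok-style format seen in the consumer output
    @Override
    public String toString() {
        return "CustomizeBean(name=" + name + ", age=" + age
                + ", gender=" + gender + ", hobbit=" + hobbit + ")";
    }

    private static final Random RANDOM = new Random();
    private static final String[] HOBBITS = {"钓鱼爱好者", "书法爱好者", "非遗文化爱好者"};

    // Random-record generator; inside a Flink SourceFunction's run() method,
    // each generated bean would be emitted via ctx.collect(...)
    public static CustomizeBean next() {
        return new CustomizeBean(
                "AAA-" + (100 + RANDOM.nextInt(900)),
                18 + RANDOM.nextInt(60),
                RANDOM.nextBoolean() ? "M" : "W",
                HOBBITS[RANDOM.nextInt(HOBBITS.length)]);
    }
}
```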