Exactly-once delivery from Kafka to Kafka and from Kafka to Doris with Flink

1 Flow diagram

The pipeline: per-city Kafka topics are unioned by one Flink job into a central-platform Kafka topic, and a second Flink job writes that topic into Doris.

2 Modeling the Flink source tables

sql
-- Source: city topic
CREATE TABLE NJ_QL_JC_SSJC_SOURCE (
  record string
) WITH (
  'connector' = 'kafka',
  'topic' = 'QL_JC_SSJC',
  'properties.bootstrap.servers' = '172.*.*.*:9092',
  'properties.group.id' = 'QL_JC_SSJC_NJ_QL_JC_SSJC_SOURCE',
  'scan.startup.mode' = 'group-offsets',
  'properties.isolation.level' = 'read_committed',
  'properties.auto.offset.reset' = 'earliest',
  'format' = 'raw'
);
-- Source: central-platform Kafka topic
CREATE TABLE ODS_QL_JC_SSJC_SOURCE (
  sscsdm string,
  extract_time TIMESTAMP,
  record string
) WITH (
  'connector' = 'kafka',
  'topic' = 'ODS_QL_JC_SSJC',
  'properties.bootstrap.servers' = '172.*.*.*:21007,172.*.*.*:21007,172.*.*.*:21007',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.kerberos.service.name' = 'kafka',
  'properties.kerberos.domain.name' = 'hadoop.hadoop.com',
  'properties.group.id' = 'ODS_QL_JC_SSJC_SOURCE_ODS_QL_JC_SSJC_SOURCE',
  'scan.startup.mode' = 'group-offsets',
  'properties.auto.offset.reset' = 'earliest',
  'properties.isolation.level' = 'read_committed',
  'format' = 'json'
);
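
Note that exactly-once on the read side depends on checkpointing: with `scan.startup.mode` = `group-offsets`, consumer offsets are only committed back to Kafka when a checkpoint completes, and the transactional Kafka sink likewise only commits on checkpoints. A minimal sketch of the job configuration, assuming submission through the Flink SQL client (the interval value is illustrative):

sql

```sql
-- Without checkpointing enabled, neither offsets nor sink transactions ever commit
SET 'execution.checkpointing.interval' = '1min';
SET 'execution.checkpointing.mode' = 'EXACTLY_ONCE';
```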

3 Modeling the Flink sink tables

sql
-- Sink: central-platform Kafka topic
CREATE TABLE KAFKA_ODS_QL_JC_SSJC_SINK (
  sscsdm string,
  extract_time TIMESTAMP,
  record string
) WITH (
  'connector' = 'kafka',
  'topic' = 'ODS_QL_JC_SSJC',
  'properties.bootstrap.servers' = '172.*.*.*:21007,172.*.*.*:21007,172.*.*.*:21007',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.kerberos.service.name' = 'kafka',
  'properties.kerberos.domain.name' = 'hadoop.hadoop.com',
  'sink.semantic' = 'exactly-once',
  'properties.transaction.timeout.ms' = '900000',
  'format' = 'json'
);
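
The 900000 ms transaction timeout must stay at or below the broker's `transaction.max.timeout.ms` (15 minutes by default), and above the checkpoint interval, or the transactional producer is rejected at startup. On Flink 1.14+ the `sink.semantic` option is deprecated; a sketch of the equivalent options there (the transactional-id prefix is an illustrative name, not from the original job):

sql

```sql
'sink.delivery-guarantee' = 'exactly-once',
-- required for exactly-once on 1.14+; must be unique among jobs on the same cluster
'sink.transactional-id-prefix' = 'ods_ql_jc_ssjc_tx',
```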
-- Sink: Doris table
CREATE TABLE DORIS_ODS_QL_JC_SSJC_SINK (
  sscsdm STRING,
  extract_time TIMESTAMP,
  record STRING
) WITH (
  'connector' = 'doris',
  'fenodes' = '3.*.*.*:8030,3.*.*.*:8030,3.*.*.*:8030',
  'table.identifier' = 'doris_d.ods_ql_jc_ssjc',
  'username' = 'root',
  'password' = '********',
  'sink.properties.two_phase_commit' = 'true'
);
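
With `two_phase_commit` enabled, each Stream Load is pre-committed when a checkpoint is taken and only committed once the checkpoint completes, which is what ties the Doris side into Flink's exactly-once guarantee. Recent versions of the doris-flink-connector also expect a job-unique load-label prefix when 2PC is on; a sketch (the prefix name is illustrative):

sql

```sql
-- Labels derived from this prefix let Doris deduplicate replayed loads on recovery
'sink.label-prefix' = 'ods_ql_jc_ssjc',
```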

4 Flink SQL from the city topics to the central-platform topic

sql
insert into
  KAFKA_ODS_QL_JC_SSJC_SINK
SELECT
  '320100' as sscsdm,
  CURRENT_TIMESTAMP as extract_time,
  record
FROM
  NJ_QL_JC_SSJC_SOURCE
UNION ALL
SELECT
  '320200' as sscsdm,
  CURRENT_TIMESTAMP as extract_time,
  record
FROM
  WX_QL_JC_SSJC_SOURCE
-- ... (one SELECT per remaining city source, elided) ...
UNION ALL
SELECT
  '320583' as sscsdm,
  CURRENT_TIMESTAMP as extract_time,
  record
FROM
  KS_QL_JC_SSJC_SOURCE

5 Flink SQL from the central-platform topic to Doris

sql
insert into DORIS_ODS_QL_JC_SSJC_SINK
SELECT
  sscsdm,
  CURRENT_TIMESTAMP as extract_time,
  record
FROM
  ODS_QL_JC_SSJC_SOURCE   