Exactly-once Kafka-to-Kafka and Kafka-to-Doris delivery with Flink

1 Flow diagram
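
The original flow diagram is not reproduced here. Based on the jobs below, the pipeline is: per-city Kafka topics → Flink SQL job in section 4 → middle-platform Kafka topic ODS_QL_JC_SSJC → Flink SQL job in section 5 → Doris table doris_d.ods_ql_jc_ssjc, with exactly-once delivery at both hops.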

2 Flink source table DDL

-- Source: city topic (Nanjing)
CREATE TABLE NJ_QL_JC_SSJC_SOURCE (
    record STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'QL_JC_SSJC',
    'properties.bootstrap.servers' = '172.*.*.*:9092',
    'properties.group.id' = 'QL_JC_SSJC_NJ_QL_JC_SSJC_SOURCE',
    'scan.startup.mode' = 'group-offsets',              -- resume from the committed consumer-group offsets
    'properties.isolation.level' = 'read_committed',    -- skip uncommitted/aborted transactional records
    'properties.auto.offset.reset' = 'earliest',        -- fall back to the earliest offset when no committed offset exists
    'format' = 'raw'
);
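
Only the Nanjing (NJ) source table is shown; the other city source tables referenced in section 4 (WX_QL_JC_SSJC_SOURCE, KS_QL_JC_SSJC_SOURCE, and so on) are presumably defined the same way, each pointing at its own city brokers, topic and properties.group.id.
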
-- Source: middle-platform Kafka topic
CREATE TABLE ODS_QL_JC_SSJC_SOURCE (
    sscsdm STRING,
    extract_time TIMESTAMP,
    record STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'ODS_QL_JC_SSJC',
    'properties.bootstrap.servers' = '172.*.*.*:21007,172.*.*.*:21007,172.*.*.*:21007',
    'properties.security.protocol' = 'SASL_PLAINTEXT',
    'properties.sasl.kerberos.service.name' = 'kafka',
    'properties.kerberos.domain.name' = 'hadoop.hadoop.com',
    'properties.group.id' = 'ODS_QL_JC_SSJC_SOURCE_ODS_QL_JC_SSJC_SOURCE',
    'scan.startup.mode' = 'group-offsets',
    'properties.auto.offset.reset' = 'earliest',
    'properties.isolation.level' = 'read_committed',    -- only read records committed by the upstream transactional sink
    'format' = 'json'
);
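
Reading with group-offsets and read_committed only yields exactly-once end to end when checkpointing is enabled, because committed consumer offsets and the transactional/2PC sinks in section 3 are all driven by checkpoint completion. Below is a minimal sketch of the job-level settings, assuming the jobs are submitted through the Flink SQL client; the interval is illustrative and the same keys can also live in flink-conf.yaml.

-- Sketch: checkpointing must be enabled for the exactly-once sinks below (values are illustrative).
SET 'execution.checkpointing.mode' = 'EXACTLY_ONCE';
SET 'execution.checkpointing.interval' = '60s';
-- Keep the last completed checkpoint after a cancel so the job can be restored from it.
SET 'execution.checkpointing.externalized-checkpoint-retention' = 'RETAIN_ON_CANCELLATION';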

3 Flink sink table DDL

-- Sink: middle-platform Kafka topic
CREATE TABLE KAFKA_ODS_QL_JC_SSJC_SINK (
    sscsdm STRING,
    extract_time TIMESTAMP,
    record STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'ODS_QL_JC_SSJC',
    'properties.bootstrap.servers' = '172.*.*.*:21007,172.*.*.*:21007,172.*.*.*:21007',
    'properties.security.protocol' = 'SASL_PLAINTEXT',
    'properties.sasl.kerberos.service.name' = 'kafka',
    'properties.kerberos.domain.name' = 'hadoop.hadoop.com',
    'format' = 'json',
    'sink.semantic' = 'exactly-once',                   -- write inside Kafka transactions, committed on checkpoint completion
    'properties.transaction.timeout.ms' = '900000'      -- must not exceed the broker's transaction.max.timeout.ms
);
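
The 900000 ms (15 minute) transaction timeout matches the Kafka broker default for transaction.max.timeout.ms: the producer-side value must not exceed the broker cap or the sink fails on startup, and it should comfortably exceed the checkpoint interval so in-flight transactions survive recovery. Because this sink writes transactionally, every consumer of ODS_QL_JC_SSJC, including the ODS_QL_JC_SSJC_SOURCE table in section 2, must read with isolation.level = read_committed, otherwise it would also see records from aborted transactions.
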
-- Sink: Doris table
CREATE TABLE DORIS_ODS_QL_JC_SSJC_SINK (
    sscsdm STRING,
    extract_time TIMESTAMP,
    record STRING
) WITH (
    'connector' = 'doris',
    'fenodes' = '3.*.*.*:8030,3.*.*.*:8030,3.*.*.*:8030',
    'table.identifier' = 'doris_d.ods_ql_jc_ssjc',
    'username' = 'root',
    'password' = '********',
    'sink.properties.two_phase_commit' = 'true'         -- stream-load two-phase commit, finalized when the checkpoint completes
);
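
two_phase_commit is passed through to Doris stream load, so each checkpoint's batch is pre-committed and only becomes visible once the checkpoint completes. Depending on the doris-flink-connector version, exactly-once writes typically also expect the connector-level options sketched below; the option names assume a 1.x connector and the label prefix is an illustrative value that merely has to be unique per job so Doris can deduplicate replayed loads.

-- Sketch: extra WITH options commonly required for exactly-once Doris writes (assumes a 1.x connector).
'sink.enable-2pc' = 'true',
'sink.label-prefix' = 'ods_ql_jc_ssjc_doris'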

4 Flink SQL: city topics to the middle-platform topic

INSERT INTO KAFKA_ODS_QL_JC_SSJC_SINK
SELECT
  '320100' AS sscsdm,
  CURRENT_TIMESTAMP AS extract_time,
  record
FROM
  NJ_QL_JC_SSJC_SOURCE
UNION ALL
SELECT
  '320200' AS sscsdm,
  CURRENT_TIMESTAMP AS extract_time,
  record
FROM
  WX_QL_JC_SSJC_SOURCE
-- ... (UNION ALL branches for the remaining city source tables elided) ...
UNION ALL
SELECT
  '320583' AS sscsdm,
  CURRENT_TIMESTAMP AS extract_time,
  record
FROM
  KS_QL_JC_SSJC_SOURCE;
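
Each branch tags the payload with the administrative code of the city it came from (320100 for the NJ source, 320200 for WX, 320583 for KS) and stamps the extraction time. Because the Kafka sink is transactional, these rows only become visible in ODS_QL_JC_SSJC when a checkpoint of this job completes; after a failure the job resumes from the committed group offsets and the aborted transaction is discarded, so the middle-platform topic receives each city record exactly once.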

5 Flink SQL: middle-platform topic to Doris

INSERT INTO DORIS_ODS_QL_JC_SSJC_SINK
SELECT
  sscsdm,
  CURRENT_TIMESTAMP AS extract_time,
  record
FROM
  ODS_QL_JC_SSJC_SOURCE;
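
Once both jobs are running, an illustrative sanity check is to compare per-city row counts in Doris with what the city topics produced, before and after forcing a failover of either job; duplicates or gaps would indicate the exactly-once chain is broken somewhere.

-- Illustrative check on the target table (query written against the Doris table defined above).
SELECT sscsdm, COUNT(*) AS cnt
FROM doris_d.ods_ql_jc_ssjc
GROUP BY sscsdm
ORDER BY sscsdm;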