Filtering Kafka data with PyFlink

The short script below uses the PyFlink Table API to read JSON records from a Kafka source topic, keep only the rows that match a filter condition, and write the selected columns to a Kafka sink topic.

from pyflink.table import (TableEnvironment, EnvironmentSettings)

# Input columns, output columns, and the filter condition.
# The column lists are placeholders; fill in the JSON field names of your topics.
columns_in = [
    ...
]

columns_out = [
    ...
]

# Keep only rows where name is '蒋介石' (Chiang Kai-shek) and sex is '男' (male);
# both fields must therefore be present in columns_in.
filter_condition = "name = '蒋介石' and sex = '男'"


# Create a streaming TableEnvironment.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Make the Kafka SQL connector jar available to the job; adjust the path to your setup.
t_env.get_config().get_configuration().set_string(
    "pipeline.jars", "file:///work/flink-sql-connector-kafka-3.2.0-1.19.jar"
)

source_topic = "foo"
sink_topic = "baa"
kafka_servers = "kafka:9092"
kafka_consumer_group_id = "flink consumer"

# Source table: each input column is declared as VARCHAR and parsed from JSON.
columnstr = ', '.join([f"`{col}` VARCHAR" for col in columns_in])
source_ddl = f"""
    CREATE TABLE kafka_source ({columnstr}) WITH (
        'connector' = 'kafka',
        'topic' = '{source_topic}',
        'properties.bootstrap.servers' = '{kafka_servers}',
        'properties.group.id' = '{kafka_consumer_group_id}',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
"""

# Sink table: only the output columns, written back to Kafka as JSON.
# 'properties.group.id' and 'scan.startup.mode' are read-side options, so they are omitted here.
columnstr2 = ', '.join([f"`{col}` VARCHAR" for col in columns_out])
sink_ddl = f"""
    CREATE TABLE kafka_sink ({columnstr2}) WITH (
        'connector' = 'kafka',
        'topic' = '{sink_topic}',
        'properties.bootstrap.servers' = '{kafka_servers}',
        'format' = 'json'
    )
"""
# Build the filtering query: project the output columns and keep only the rows
# that satisfy the filter condition.
select_list = ', '.join([f"`{col}`" for col in columns_out])
filtersql = f"""
    INSERT INTO kafka_sink
    SELECT {select_list}
    FROM kafka_source
    WHERE {filter_condition}
"""
# Register the source and sink tables before running the INSERT statement.
t_env.execute_sql(source_ddl)
t_env.execute_sql(sink_ddl)

# execute_sql submits the INSERT job asynchronously; wait() keeps the script alive.
t_env.execute_sql(filtersql).wait()
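
To sanity-check the job, you can publish a test record to the source topic and watch the sink topic. The snippet below is a minimal sketch, not part of the job itself: it assumes the third-party kafka-python package is installed and that the broker at kafka:9092 is reachable from where you run it.

import json

from kafka import KafkaProducer  # kafka-python, assumed to be installed separately

# Serialize dicts as JSON so the Flink 'json' format can parse them.
producer = KafkaProducer(
    bootstrap_servers="kafka:9092",
    value_serializer=lambda v: json.dumps(v, ensure_ascii=False).encode("utf-8"),
)

# This record matches the filter condition and should appear on the sink topic.
producer.send("foo", {"name": "蒋介石", "sex": "男"})

# This record does not match and should be dropped by the WHERE clause.
producer.send("foo", {"name": "someone else", "sex": "女"})

producer.flush()

Records that pass the filter should show up on the sink topic ("baa") containing only the columns listed in columns_out.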