Dinky FlinkSQL: Reading from and Writing to Doris

Before running the jobs below, enable global variables in Dinky so that the expression 'sink.sink.label-prefix' = '${idUtil.simpleUUID()}' can be used.
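
With global variables enabled for the task, Dinky substitutes the expression before the statement is submitted, so every run gets a fresh Stream Load label prefix (Doris requires load labels to be unique within a database). A minimal sketch of the substitution; the UUID value below is made up:

-- As written in the Dinky task:
'sink.sink.label-prefix' = '${idUtil.simpleUUID()}'
-- What Flink actually receives after substitution (hypothetical value):
'sink.sink.label-prefix' = 'f3a9c2d14b7e48f19c0d5e6a7b8c9d0e'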

Syncing MySQL to Doris - testMysqlCdcDoris:

EXECUTE CDCSOURCE demo_doris WITH (
  'connector' = 'mysql-cdc',
  'hostname' = '172.xxx',
  'port' = '3306',
  'username' = 'xxx',
  'password' = 'xxx',
  'checkpoint' = '10000',
  'scan.startup.mode' = 'initial',
  'parallelism' = '1',
  'database-name' = 'test',
  'table-name' = 'test\.student',
  'sink.connector' = 'doris',
  'sink.fenodes' = '172.xxx:8130',
  'sink.username' = 'xxx',
  'sink.password' = 'xxx',
  'sink.doris.batch.size' = '1000',
  'sink.sink.max-retries' = '1',
  'sink.sink.db' = 'test',
  'sink.sink.enable-delete' = 'true',
  'sink.sink.properties.format' = 'json',
  'sink.sink.properties.read_json_by_line' = 'true',
  'sink.table.prefix' = 'test_',
  'sink.table.identifier' = '#{schemaName}.#{tableName}',
  'sink.sink.label-prefix' = '${idUtil.simpleUUID()}'
);
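
The Doris target tables generally need to exist before the job runs, and because 'sink.sink.enable-delete' = 'true' is set, they should use the Unique Key model (the Doris connector only supports deletes against Unique tables). A minimal sketch of a matching target table, assuming a hypothetical MySQL source table test.student(id, name, age); with 'sink.table.prefix' = 'test_' it is written to test.test_student:

-- Hypothetical Doris DDL; column names, sizes, buckets, and replication are assumptions, not from the original post.
CREATE TABLE test.test_student (
  id INT,
  name VARCHAR(64),
  age INT
)
UNIQUE KEY(id)
DISTRIBUTED BY HASH(id) BUCKETS 1
PROPERTIES ("replication_num" = "1");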

Reading from Doris - testDorisRead:

CREATE TABLE flink_doris_source (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
) 
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8130',
  'table.identifier' = 'test.aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx'
);

select * from flink_doris_source;
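
For reference, the column names suggest test.aggregate_table is an Aggregate Key model table. A hypothetical Doris DDL matching the Flink schema above (the aggregation types, column sizes, buckets, and replication are assumptions):

CREATE TABLE test.aggregate_table (
  aggregate_id INT,
  replace_data VARCHAR(64) REPLACE,
  max_data VARCHAR(64) REPLACE,
  agg_item INT SUM,
  max_item INT MAX,
  min_item INT MIN
)
AGGREGATE KEY(aggregate_id)
DISTRIBUTED BY HASH(aggregate_id) BUCKETS 1
PROPERTIES ("replication_num" = "1");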

Syncing Doris to Doris - testDorisCdcDoris:

-- doris source
CREATE TABLE flink_doris_source (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
) 
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8130',
  'table.identifier' = 'test.aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx'
);

-- enable checkpoint
SET 'execution.checkpointing.interval' = '10s';

-- doris sink
CREATE TABLE flink_doris_sink (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
)
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8030',
  'table.identifier' = 'test.test_aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx',
  'sink.label-prefix' = '${idUtil.simpleUUID()}'
);

-- submit insert job
INSERT INTO flink_doris_sink SELECT aggregate_id, replace_data, max_data, agg_item, max_item, min_item FROM flink_doris_source;
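
With checkpointing enabled, the Doris sink typically commits data when a checkpoint completes (two-phase commit), so rows appear in the target table shortly after each checkpoint. A simple sanity check from any Doris (MySQL-protocol) client; the expected count depends entirely on your source data:

-- Hypothetical verification query.
SELECT COUNT(*) FROM test.test_aggregate_table;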

References

Flink Doris Connector - Apache Doris

Doris + Flink + DolphinScheduler + Dinky: Building an Open-Source Data Platform (CSDN Blog)

Whole-Database Sync Overview | Dinky
