Dinky FlinkSQL: Reading from and Writing to Doris

Enable global variables in Dinky before running the jobs, so that the expression 'sink.sink.label-prefix' = '${idUtil.simpleUUID()}' can be used. Doris Stream Load requires a unique label per load, so generating a fresh UUID prefix avoids label conflicts when a job is resubmitted.
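With global variables enabled, Dinky resolves the expression before the statement reaches Flink. A minimal illustration (the UUID value below is made up):

'sink.sink.label-prefix' = '${idUtil.simpleUUID()}'
-- is rewritten at submit time to something like
'sink.sink.label-prefix' = 'c4ca4238a0b923820dcc509a6f75849b'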

MySQL to Doris synchronization - testMysqlCdcDoris:

EXECUTE CDCSOURCE demo_doris WITH (
  'connector' = 'mysql-cdc',
  'hostname' = '172.xxx',
  'port' = '3306',
  'username' = 'xxx',
  'password' = 'xxx',
  'checkpoint' = '10000',
  'scan.startup.mode' = 'initial',
  'parallelism' = '1',
  'database-name' = 'test',
  'table-name' = 'test\.student',
  'sink.connector' = 'doris',
  'sink.fenodes' = '172.xxx:8130',
  'sink.username' = 'xxx',
  'sink.password' = 'xxx',
  'sink.doris.batch.size' = '1000',
  'sink.sink.max-retries' = '1',
  'sink.sink.db' = 'test',
  'sink.sink.enable-delete' = 'true',
  'sink.sink.properties.format' ='json',
  'sink.sink.properties.read_json_by_line' ='true',
  'sink.table.prefix' = 'test_',
  'sink.table.identifier' = '#{schemaName}.#{tableName}',
  'sink.sink.label-prefix' = '${idUtil.simpleUUID()}'
);
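The statement above assumes a MySQL database test containing a student table; its schema is not shown in the original, so the DDL below is only an illustrative sketch. With 'sink.table.prefix' = 'test_' and 'sink.table.identifier' = '#{schemaName}.#{tableName}', changes from test.student would land in a Doris table named roughly test.test_student.

-- Hypothetical MySQL source table; the column layout is an assumption:
CREATE TABLE test.student (
  id   INT PRIMARY KEY,
  name VARCHAR(64),
  age  INT
);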

Reading from Doris - testDorisRead:

CREATE TABLE flink_doris_source (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
) 
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8130',
  'table.identifier' = 'test.aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx'
);

SELECT * FROM flink_doris_source;
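The read example assumes a Doris aggregate-model table test.aggregate_table whose columns match the Flink DDL above. Its exact definition is not part of the original; the sketch below is one plausible layout, with aggregation types guessed from the column names:

-- Hypothetical Doris table backing the read example:
CREATE TABLE test.aggregate_table (
  aggregate_id INT,
  replace_data VARCHAR(64) REPLACE,
  max_data     VARCHAR(64) REPLACE,  -- may use MAX in the real table
  agg_item     INT SUM,
  max_item     INT MAX,
  min_item     INT MIN
)
AGGREGATE KEY(aggregate_id)
DISTRIBUTED BY HASH(aggregate_id) BUCKETS 1
PROPERTIES ("replication_num" = "1");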

Doris to Doris synchronization - testDorisCdcDoris:

-- doris source
CREATE TABLE flink_doris_source (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
) 
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8130',
  'table.identifier' = 'test.aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx'
);

-- enable checkpoint
SET 'execution.checkpointing.interval' = '10s';

-- doris sink
CREATE TABLE flink_doris_sink (
  aggregate_id int,
  replace_data string,
  max_data string,
  agg_item int,
  max_item int,
  min_item int
)
WITH (
  'connector' = 'doris',
  'fenodes' = '172.xxx:8030',
  'table.identifier' = 'test.test_aggregate_table',
  'username' = 'xxx',
  'password' = 'xxx',
  'sink.label-prefix' = '${idUtil.simpleUUID()}'
);

-- submit insert job
INSERT INTO flink_doris_sink SELECT aggregate_id, replace_data, max_data, agg_item, max_item, min_item FROM flink_doris_source;
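The sink table test.test_aggregate_table must already exist in Doris before the insert job starts. One hedged way to create it and to spot-check the result (Doris supports CREATE TABLE ... LIKE; with two-phase commit enabled by default, data only becomes visible after the job completes a checkpoint):

-- Create the sink table with the same layout as the source (assumption):
CREATE TABLE test.test_aggregate_table LIKE test.aggregate_table;

-- After at least one successful checkpoint, verify in Doris:
SELECT COUNT(*) FROM test.test_aggregate_table;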

References

Flink Doris Connector - Apache Doris documentation

Doris + Flink + DolphinScheduler + Dinky: Building an Open-Source Data Platform (CSDN blog)

Whole-Database Synchronization Overview - Dinky documentation
