DataX JSON Templates (hdfsToMysql, hdfsToOracle, mysqlToHdfs, oracleToDoris) [Complete]

Table of Contents

  • Preface
    • 1. HDFS to MySQL (hdfsToMysql)
    • 2. HDFS to Oracle (hdfsToOracle)
    • 3. MySQL to HDFS (mysqlToHdfs)
    • 4. Oracle to Doris (oracleToDoris)
  • Summary

Preface

DataX is an open-source data synchronization tool from Alibaba. It ships with a rich set of data source plugins and supports synchronization between many sources, including but not limited to MySQL, Oracle, and HDFS. Below are JSON template examples for several common synchronization scenarios.
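
All four templates below parameterize the date with ${...} placeholders (such as ${dt1} and ${dt}). DataX fills these in at runtime through the -p "-Dkey=value" option of datax.py. A minimal launch sketch in Python; the DataX install path and job file name are assumptions, adjust them to your environment:

import subprocess
from datetime import date, timedelta

DATAX_HOME = "/opt/datax"          # assumed install path
JOB_FILE = "hdfs_to_mysql.json"    # one of the templates below, saved to a file

# Fill ${dt1} with yesterday's partition, e.g. 2024-01-01
dt1 = (date.today() - timedelta(days=1)).isoformat()

# datax.py replaces ${dt1} in the job JSON with the value passed via -p "-Ddt1=..."
subprocess.run(
    ["python", f"{DATAX_HOME}/bin/datax.py", "-p", f"-Ddt1={dt1}", JOB_FILE],
    check=True,
)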


1. HDFS to MySQL (hdfsToMysql)

  • When you need to synchronize data from HDFS to MySQL, you can use the following JSON template:
{
  "job": {
    "content": [
      {
        "reader": {
          "name": "hdfsreader",
          "parameter": {
            "path": "/user/hive/warehouse/ads.db/ads_zhy_site_stat_di/date=${dt1}",
            "defaultFS": "hdfs://dn22:8020",
            "fileType": "parquet",
            "skipHeader": false,
            "column": [
              {"index": "0", "type": "string"},
              {"index": "1", "type": "string"},
              {"index": "2", "type": "int"},
              {"index": "3", "type": "int"},
              {"name": "date", "type": "string", "value": "${dt1}"}
            ]
          }
        },
        "writer": {
          "name": "mysqlwriter",
          "parameter": {
            "writeMode": "insert",
            "username": "cnooc_fuse",
            "password": "root",
            "column": [
              "`site_id`",
              "`fuel_type`",
              "`vehicle_num`",
              "`through_vehicle_num`",
              "`date`"
            ],
            "connection": [
              {
                "table": [
                  "t_site_flow_stat_day"
                ],
                "jdbcUrl": "jdbc:mysql://172.0.0.1:8086/cnooc_fuse?useUnicode=true&characterEncoding=utf-8&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&dontTrackOpenResources=true"
              }
            ]
          }
        }
      }
    ],
    "setting": {
      "speed": {
        "channel": "2"
      }
    }
  }
}
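
After the job finishes, it is worth sanity-checking the target table. A minimal verification sketch using pymysql (an assumption; any MySQL client works), reusing the connection details from the mysqlwriter above:

import pymysql  # assumed available: pip install pymysql

# Connection details mirror the mysqlwriter block above
conn = pymysql.connect(host="172.0.0.1", port=8086, user="cnooc_fuse",
                       password="root", database="cnooc_fuse", charset="utf8mb4")

dt1 = "2024-01-01"  # the same value that was passed to DataX via -p "-Ddt1=..."
with conn.cursor() as cur:
    # Count the rows loaded for the given partition date
    cur.execute("SELECT COUNT(*) FROM t_site_flow_stat_day WHERE `date` = %s", (dt1,))
    print("rows loaded:", cur.fetchone()[0])
conn.close()

Note that writeMode is insert here; if the same date can be loaded more than once, consider clearing that day's rows with a preSql or switching writeMode to avoid duplicates.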

2. HDFS to Oracle (hdfsToOracle)

  • The JSON template for synchronizing data from HDFS to Oracle is as follows:
{
  "job": {
    "content": [
      {
        "reader": {
          "name": "hdfsreader",
          "parameter": {
            "path": "/user/hive/warehouse/mid.db/hky_veh_run_time/dt=${dt}",
            "defaultFS": "hdfs://nn01:8020",
            "fileType": "parquet",
            "skipHeader": false,
            "column": [
                            {"index":"0","type":"long"},
                            {"index":"1","type":"int"},
                            {"index":"2","type":"int"},
                            { "name": "dt", 
                                "type": "long",
                                "value":"${dt}"
                            }
            ]
          }
        },
        "writer": {
          "name": "oraclewriter",
          "parameter": {
            "username": "root",
            "password": "root",
            "column": [
                    "TRANS",
                    "AD_CODE",
                    "RUN_TIME",
                    "DT"
            ],
            "connection": [
              {
                "table": [
                "HKY_VEH_RUN_TIME"
                ],
                "jdbcUrl": "jdbc:oracle:thin:@172.0.0.1:1521:rdt1"
              }
            ]
          }
        }
      }
    ],
    "setting": {
      "speed": {
        "channel": "2"
      }
    }
  }
}
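
Because the partition is driven by ${dt}, the same template can be reused for backfills by launching one DataX run per day. A minimal loop sketch; the install path, job file name, and the yyyyMMdd format of dt are assumptions:

import subprocess
from datetime import date, timedelta

DATAX_HOME = "/opt/datax"           # assumed install path
JOB_FILE = "hdfs_to_oracle.json"    # the template above, saved to a file

start, end = date(2024, 1, 1), date(2024, 1, 7)   # example backfill window

d = start
while d <= end:
    dt = d.strftime("%Y%m%d")  # assumed to match the dt=${dt} partition naming
    subprocess.run(
        ["python", f"{DATAX_HOME}/bin/datax.py", "-p", f"-Ddt={dt}", JOB_FILE],
        check=True,  # stop the backfill as soon as one day fails
    )
    d += timedelta(days=1)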

3. MySQL to HDFS (mysqlToHdfs)

  • An example JSON template for synchronizing data from MySQL to HDFS:
{
    "job": {
        "content": [
            {
		"reader": {
			"name": "mysqlreader",
			"parameter": {
				"username": "cnooc_fuse",
				"password": "root",
				"connection": [{
					"querySql": [
						"SELECT id,site_name,area_type,province,city,site_type,company_name,tel,lon,lat,is_cnooc,address, section_code,road_type,road_name,site_state,
recommend_type, manage_type, remark, note, create_user,create_time, update_user, update_time, is_deleted
	FROM t_site_new "
					],
					"jdbcUrl": [
						"jdbc:mysql://172.0.0.1:8086/cnooc_fuse?allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false"
					]
				}]
			}
		},
                "writer": {
                    "name": "hdfswriter",
                    "parameter": {
                        "column": [
					{
						"name": "id",
						"type": "int"
					},
					{
						"name": "site_name",
						"type": "string"
					},
					{
						"name": "area_type",
						"type": "int"
					},
					{
						"name": "province",
						"type": "int"
					},
					{
						"name": "city",
						"type": "int"
					},
					{
						"name": "site_type",
						"type": "int"
					},
					{
						"name": "company_name",
						"type": "string"
					},
					{
						"name": "tel",
						"type": "string"
					},
					{
						"name": "lon",
						"type": "double"
					},
					{
						"name": "lat",
						"type": "double"
					},
					{
						"name": "is_cnooc",
						"type": "int"
					},
					{
						"name": "address",
						"type": "string"
					},
					{
						"name": "section_code",
						"type": "string"
					},
					{
						"name": "road_type",
						"type": "string"
					},
					{
						"name": "road_name",
						"type": "string"
					},
					{
						"name": "site_state",
						"type": "string"
					},
					{
						"name": "recommend_type",
						"type": "string"
					},
					{
						"name": "manage_type",
						"type": "string"
					},
					{
						"name": "remark",
						"type": "string"
					},
					{
						"name": "note",
						"type": "string"
					},
					{
						"name": "create_user",
						"type": "string"
					},
					{
						"name": "update_user",
						"type": "string"
					},
					{
						"name": "create_time",
						"type": "string"
					},
					{
						"name": "update_time",
						"type": "string"
					},
					{
						"name": "is_deleted",
						"type": "int"
					}
                        ],
						"compress": "snappy",
						"defaultFS": "hdfs://nn01:8020",
						"fieldDelimiter":",",
						"fileName": "datax",
						"fileType": "parquet",
                        "path": "/user/hive/warehouse/ods.db/ods_tran_logistics_site/",
                        "writeMode": "truncate"
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "2"
            }
        }
    }
}
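
Hand-writing a 25-entry hdfswriter column list is tedious and easy to get out of sync with the SELECT. A hedged generator sketch that builds the column array from MySQL's information_schema (pymysql and the simplified type mapping are assumptions; adjust the mapping to your table):

import json
import pymysql  # assumed available: pip install pymysql

# Rough MySQL -> hdfswriter type mapping (simplified assumption)
TYPE_MAP = {
    "tinyint": "int", "smallint": "int", "int": "int", "bigint": "bigint",
    "float": "float", "double": "double", "decimal": "double",
    "char": "string", "varchar": "string", "text": "string",
    "date": "string", "datetime": "string", "timestamp": "string",
}

conn = pymysql.connect(host="172.0.0.1", port=8086, user="cnooc_fuse",
                       password="root", database="information_schema")
with conn.cursor() as cur:
    cur.execute(
        "SELECT column_name, data_type FROM columns "
        "WHERE table_schema = %s AND table_name = %s ORDER BY ordinal_position",
        ("cnooc_fuse", "t_site_new"),
    )
    columns = [{"name": name, "type": TYPE_MAP.get(dtype, "string")}
               for name, dtype in cur.fetchall()]
conn.close()

# Paste the output into the hdfswriter "column" section
print(json.dumps(columns, indent=4))

DataX maps reader and writer columns by position, so keep this generated list in the same order as the SELECT in the reader above.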

4. Oracle to Doris (oracleToDoris)

  • The JSON template for synchronizing data from Oracle to Doris:
{
  "job": {
    "setting": {
      "speed": {
        "channel": 1
      },
      "errorLimit": {
        "record": 0,
        "percentage": 0
      }
    },
    "content": [
      {
        "reader": {
          "name": "oraclereader",
          "parameter": {
            "column": [
"SERIAL_ID",
"CYCLE",
"REGION_CODE",
"TYPE",
"SRC_REGION",
"DST_REGION",
"SRC_TIME",
"HB_TIME",
"STATUS",
"CREATE_TIME"
            ],
            "connection": [
              {
                "jdbcUrl": [
                  "jdbc:oracle:thin:@//172.0.0.1:32021/s_tsshprod"
                ],
                "table": [
                  "HEART_BEAT_INFO"
                ]
              }
            ],
            "password": "root",
            "splitPk": "",
            "username": "tssh",
            "where": "1=1 and CREATE_TIME >= TO_DATE('${dt}', 'YYYY-MM-DD HH24:MI:SS') - INTERVAL '1:10' HOUR TO MINUTE"
          }
        },
        "writer": {
          "name": "doriswriter",
          "parameter": {
            "loadUrl": [
              "172.0.0.1:8030"
            ],
            "loadProps": {
              "format": "json",
              "strip_outer_array": true
            },
            "column": [
"serial_id",
"cycle",
"region_code",
"type",
"src_region",
"dst_region",
"src_time",
"hb_time",
"status",
"create_time"
            ],
            "username": "yunwei",
            "password": "root",
            "postSql": [],
            "preSql": [],
            "flushInterval": 30000,
            "connection": [
              {
                "jdbcUrl": "jdbc:mysql://172.0.0.1:9030/collect",
                "selectedDatabase": "collect",
                "table": [
                  "dl_01_heart_beat_info"
                ]
              }
            ]
          }
        }
      }
    ]
  }
}
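
doriswriter loads data through the FE (loadUrl on port 8030 above), and the result can be checked over Doris's MySQL-compatible protocol on the query port from the jdbcUrl (9030). A minimal verification sketch using pymysql (an assumption; any MySQL client works):

import pymysql  # assumed available: pip install pymysql

# Doris FE speaks the MySQL protocol on the query port used in the jdbcUrl above
conn = pymysql.connect(host="172.0.0.1", port=9030, user="yunwei",
                       password="root", database="collect")
with conn.cursor() as cur:
    # Rough check: heartbeat rows received in the last 2 hours (example window,
    # roughly matching the reader's incremental where clause)
    cur.execute(
        "SELECT COUNT(*) FROM dl_01_heart_beat_info "
        "WHERE create_time >= DATE_SUB(NOW(), INTERVAL 2 HOUR)"
    )
    print("rows in the last 2 hours:", cur.fetchone()[0])
conn.close()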

Summary

If this article helped you, a follow, like, bookmark, or comment would be greatly appreciated. Thank you!

If anything here is incorrect, please point it out!
