DataX JSON Templates (hdfsToMysql, hdfsToOracle, mysqlToHdfs, oracleToDoris) [Complete]

Table of Contents

  • Preface
    • 1. HDFS to MySQL (hdfsToMysql)
    • 2. HDFS to Oracle (hdfsToOracle)
    • 3. MySQL to HDFS (mysqlToHdfs)
    • 4. Oracle to Doris (oracleToDoris)
  • Summary

Preface

DataX is an open-source data synchronization tool from Alibaba. It ships with a rich set of data source plugins and supports synchronization across many sources, including but not limited to MySQL, Oracle, and HDFS. Below are JSON template examples for several common synchronization scenarios.
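All of the templates below reference runtime variables such as ${dt} and ${dt1}, which DataX substitutes at launch time through its -p option. As a minimal sketch of how one of these jobs is typically started (the DataX install path and job file name are my own assumptions, not part of the templates):

# Pass the ${dt1} variable to the job via -p "-Dkey=value"; adjust paths to your installation.
python /opt/datax/bin/datax.py -p "-Ddt1=2024-06-01" /opt/datax/job/hdfsToMysql.json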


1. HDFS to MySQL (hdfsToMysql)

  • When you need to sync data from HDFS into MySQL, you can use the following JSON template:
{
  "job": {
    "content": [
      {
        "reader": {
          "name": "hdfsreader",
          "parameter": {
            "path": "/user/hive/warehouse/ads.db/ads_zhy_site_stat_di/date=${dt1}",
            "defaultFS": "hdfs://dn22:8020",
            "fileType": "parquet",
            "skipHeader": false,
            "column": [
              {"index": "0", "type": "string"},
              {"index": "1", "type": "string"},
              {"index": "2", "type": "int"},
              {"index": "3", "type": "int"},
              {"name": "date", "type": "string", "value": "${dt1}"}
            ]
          }
        },
        "writer": {
          "name": "mysqlwriter",
          "parameter": {
            "writeMode": "insert",
            "username": "cnooc_fuse",
            "password": "root",
            "column": [
              "`site_id`",
              "`fuel_type`",
              "`vehicle_num`",
              "`through_vehicle_num`",
              "`date`"
            ],
            "connection": [
              {
                "table": [
                  "t_site_flow_stat_day"
                ],
                "jdbcUrl": "jdbc:mysql://172.0.0.1:8086/cnooc_fuse?useUnicode=true&characterEncoding=utf-8&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&dontTrackOpenResources=true"
              }
            ]
          }
        }
      }
    ],
    "setting": {
      "speed": {
        "channel": "2"
      }
    }
  }
}
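In a daily schedule, ${dt1} is usually computed from the calendar rather than typed by hand. A minimal wrapper-script sketch, assuming yesterday's partition is the one being loaded and the same paths as above:

#!/bin/bash
# Load yesterday's date=${dt1} partition from HDFS into MySQL.
dt1=$(date -d "-1 day" +%Y-%m-%d)
python /opt/datax/bin/datax.py -p "-Ddt1=${dt1}" /opt/datax/job/hdfsToMysql.json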

2. HDFS to Oracle (hdfsToOracle)

  • The JSON template for syncing data from HDFS to Oracle is as follows:
{
  "job": {
    "content": [
      {
        "reader": {
          "name": "hdfsreader",
          "parameter": {
            "path": "/user/hive/warehouse/mid.db/hky_veh_run_time/dt=${dt}",
            "defaultFS": "hdfs://nn01:8020",
            "fileType": "parquet",
            "skipHeader": false,
            "column": [
                            {"index":"0","type":"long"},
                            {"index":"1","type":"int"},
                            {"index":"2","type":"int"},
                            { "name": "dt", 
                                "type": "long",
                                "value":"${dt}"
                            }
            ]
          }
        },
        "writer": {
          "name": "oraclewriter",
          "parameter": {
            "username": "root",
            "password": "root",
            "column": [
                    "TRANS",
                    "AD_CODE",
                    "RUN_TIME",
                    "DT"
            ],
            "connection": [
              {
                "table": [
                "HKY_VEH_RUN_TIME"
                ],
                "jdbcUrl": "jdbc:oracle:thin:@172.0.0.1:1521:rdt1"
              }
            ]
          }
        }
      }
    ],
    "setting": {
      "speed": {
        "channel": "2"
      }
    }
  }
}
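A note on the JDBC URL: jdbc:oracle:thin:@host:port:sid (used here) addresses the database by SID, while jdbc:oracle:thin:@//host:port/service_name (used by the Oracle reader in section 4) addresses it by service name. Which form you need depends on how the Oracle listener is configured, so check that first if connections are refused.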

3. MySQL to HDFS (mysqlToHdfs)

  • An example JSON template for syncing data from MySQL to HDFS:
{
    "job": {
        "content": [
            {
		"reader": {
			"name": "mysqlreader",
			"parameter": {
				"username": "cnooc_fuse",
				"password": "root",
				"connection": [{
					"querySql": [
						"SELECT id,site_name,area_type,province,city,site_type,company_name,tel,lon,lat,is_cnooc,address, section_code,road_type,road_name,site_state,
recommend_type, manage_type, remark, note, create_user,create_time, update_user, update_time, is_deleted
	FROM t_site_new "
					],
					"jdbcUrl": [
						"jdbc:mysql://172.0.0.1:8086/cnooc_fuse?allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false"
					]
				}]
			}
		},
                "writer": {
                    "name": "hdfswriter",
                    "parameter": {
                        "column": [
					{
						"name": "id",
						"type": "int"
					},
					{
						"name": "site_name",
						"type": "string"
					},
					{
						"name": "area_type",
						"type": "int"
					},
					{
						"name": "province",
						"type": "int"
					},
					{
						"name": "city",
						"type": "int"
					},
					{
						"name": "site_type",
						"type": "int"
					},
					{
						"name": "company_name",
						"type": "string"
					},
					{
						"name": "tel",
						"type": "string"
					},
					{
						"name": "lon",
						"type": "double"
					},
					{
						"name": "lat",
						"type": "double"
					},
					{
						"name": "is_cnooc",
						"type": "int"
					},
					{
						"name": "address",
						"type": "string"
					},
					{
						"name": "section_code",
						"type": "string"
					},
					{
						"name": "road_type",
						"type": "string"
					},
					{
						"name": "road_name",
						"type": "string"
					},
					{
						"name": "site_state",
						"type": "string"
					},
					{
						"name": "recommend_type",
						"type": "string"
					},
					{
						"name": "manage_type",
						"type": "string"
					},
					{
						"name": "remark",
						"type": "string"
					},
					{
						"name": "note",
						"type": "string"
					},
					{
						"name": "create_user",
						"type": "string"
					},
					{
						"name": "update_user",
						"type": "string"
					},
					{
						"name": "create_time",
						"type": "string"
					},
					{
						"name": "update_time",
						"type": "string"
					},
					{
						"name": "is_deleted",
						"type": "int"
					}
                        ],
						"compress": "snappy",
						"defaultFS": "hdfs://nn01:8020",
						"fieldDelimiter":",",
						"fileName": "datax",
						"fileType": "parquet",
                        "path": "/user/hive/warehouse/ods.db/ods_tran_logistics_site/",
                        "writeMode": "truncate"
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "2"
            }
        }
    }
}
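One caveat: hdfswriter expects the target path to already exist, so for a new Hive table create the table (and thus the warehouse directory) before the first run. Also, with writeMode set to truncate, each run first deletes previously written files under the path whose names start with the configured fileName, which keeps the job re-runnable. A minimal sketch, assuming the Hive DDL for ods_tran_logistics_site already matches the column list above:

# Make sure the target directory exists before the first run (hdfswriter will not create it).
hdfs dfs -mkdir -p /user/hive/warehouse/ods.db/ods_tran_logistics_site/
python /opt/datax/bin/datax.py /opt/datax/job/mysqlToHdfs.json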

4. Oracle to Doris (oracleToDoris)

  • The JSON template for syncing data from Oracle to Doris:
{
  "job": {
    "setting": {
      "speed": {
        "channel": 1
      },
      "errorLimit": {
        "record": 0,
        "percentage": 0
      }
    },
    "content": [
      {
        "reader": {
          "name": "oraclereader",
          "parameter": {
            "column": [
"SERIAL_ID",
"CYCLE",
"REGION_CODE",
"TYPE",
"SRC_REGION",
"DST_REGION",
"SRC_TIME",
"HB_TIME",
"STATUS",
"CREATE_TIME"
            ],
            "connection": [
              {
                "jdbcUrl": [
                  "jdbc:oracle:thin:@//172.0.0.1:32021/s_tsshprod"
                ],
                "table": [
                  "HEART_BEAT_INFO"
                ]
              }
            ],
            "password": "root",
            "splitPk": "",
            "username": "tssh",
            "where": "1=1 and CREATE_TIME >= TO_DATE('${dt}', 'YYYY-MM-DD HH24:MI:SS') - INTERVAL '1:10' HOUR TO MINUTE"
          }
        },
        "writer": {
          "name": "doriswriter",
          "parameter": {
            "loadUrl": [
              "172.0.0.1:8030"
            ],
            "loadProps": {
              "format": "json",
              "strip_outer_array": true
            },
            "column": [
"serial_id",
"cycle",
"region_code",
"type",
"src_region",
"dst_region",
"src_time",
"hb_time",
"status",
"create_time"
            ],
            "username": "yunwei",
            "password": "root",
            "postSql": [],
            "preSql": [],
            "flushInterval": 30000,
            "connection": [
              {
                "jdbcUrl": "jdbc:mysql://172.0.0.1:9030/collect",
                "selectedDatabase": "collect",
                "table": [
                  "dl_01_heart_beat_info"
                ]
              }
            ]
          }
        }
      }
    ]
  }
}
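The where clause implements a rolling incremental window: each run pulls rows whose CREATE_TIME falls within the 1 hour 10 minutes before ${dt}, so consecutive hourly runs overlap slightly instead of leaving gaps. Note the two Doris endpoints as well: loadUrl is the FE HTTP port (8030 by default), which doriswriter uses for Stream Load, while jdbcUrl goes through the FE MySQL-protocol port (9030). Because ${dt} is a full timestamp containing a space, the launch command needs an extra layer of quoting; a minimal sketch that has worked for me with the stock datax.py (paths are assumptions, and the exact quoting can vary by DataX version):

# Single quotes inside the -p string keep the space in the timestamp intact.
dt=$(date +"%Y-%m-%d %H:%M:%S")
python /opt/datax/bin/datax.py -p "-Ddt='${dt}'" /opt/datax/job/oracleToDoris.json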

Summary

If this article helped you, I'd appreciate a follow, like, bookmark, or comment. Thank you all very much!

If anything here is wrong, please point it out!
