Wrapping Databricks job status polling on the server

Jobs configured on Azure Databricks need variable parameters passed in at launch, so I deploy the task launch scripts on the server. But whether a task succeeded can only be determined by repeatedly fetching its run status, so I wrapped that status-polling logic into a reusable script.
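At its core the wrapper is a poll-with-timeout loop. A minimal sketch of that pattern, where `check_done` is a hypothetical stand-in for querying the job state (here it just probes a temp file so the demo terminates immediately):

```shell
#!/bin/bash
# Hypothetical readiness probe standing in for a Databricks status query.
check_done() { [ -f /tmp/poll_demo_done ]; }
touch /tmp/poll_demo_done          # make the demo succeed on the first check

deadline=$((SECONDS + 10))         # overall timeout in seconds
until check_done; do
  [ "$SECONDS" -ge "$deadline" ] && { echo "timeout"; exit 1; }
  sleep 1
done
echo "done"
rm -f /tmp/poll_demo_done
```

The real wrapper below follows the same shape, with `databricks runs get` as the probe and a 7200-second timeout.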

The wrapper script is as follows:

azurework:

bash
#!/bin/bash
source /etc/profile
source ~/.bashrc
JQ_EXEC=$(which jq)
if [ $# -eq 0 ]; then
    echo "usage: $0 jobId [sparkSubmitParams]"
    exit 1
fi
starttime=$(date +'%Y-%m-%d %H:%M:%S')
start_seconds=$(date --date="$starttime" +%s)
jobId=$1
params=$2
echo "azure job Id : ${jobId}"
echo "azure param : ${params}"
run_output=$(databricks jobs run-now --job-id ${jobId} --spark-submit-params "${params}")
run_id=$(echo "${run_output}" | ${JQ_EXEC} -r .run_id)
echo $run_id
wait_time=7200
for((i=1;i<=10000;i++));
do
   sleep 5s
   echo "databricks runs get --run-id ${run_id}"
   job_info=$(databricks runs get --run-id ${run_id})
   life_cycle_state=$(echo "${job_info}" | jq -r '.state.life_cycle_state')
   endtime=$(date +'%Y-%m-%d %H:%M:%S')
   end_seconds=$(date --date="$endtime" +%s)
   interval=$((end_seconds-start_seconds))
   echo $interval
   echo $life_cycle_state
   if [ "$life_cycle_state" == "TERMINATED" ]; then
      echo "job_info :: $job_info"
      echo "execute time = ${interval}s"
      result_state=$(echo "${job_info}" | jq -r '.state.result_state')
      echo $result_state
      if [ "$result_state" == "SUCCESS" ];then
          echo "Job Finish "
          exit 0
      else
          echo "Job Failed"
          exit 1
      fi
   fi
   if [ "$life_cycle_state" == "INTERNAL_ERROR" ]; then
      echo "Job hit INTERNAL_ERROR"
      exit 1
   fi
   if [ "$life_cycle_state" == "SKIPPED" ]; then
      echo "Another run of this job is already active; this run was SKIPPED"
      exit 1
   fi
   if [ $interval -ge $wait_time ]; then
      echo "timed out after ${interval}s, giving up"
      exit 1
   fi
   if [ $((i%3)) -eq 0 ]; then
      echo "RUNNING wait = ${interval}s"
   fi

done
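The loop keys off two fields of the Runs API response, `.state.life_cycle_state` and `.state.result_state`. A minimal sketch of the jq extraction against a sample payload (the JSON below is an assumed shape modeled on those two fields, not a real API response):

```shell
#!/bin/bash
# Sample payload modeled on the Databricks Runs API response shape (assumed).
job_info='{"state":{"life_cycle_state":"TERMINATED","result_state":"SUCCESS"}}'

# Same extraction the wrapper performs; -r strips the JSON quotes.
life_cycle_state=$(echo "$job_info" | jq -r '.state.life_cycle_state')
result_state=$(echo "$job_info" | jq -r '.state.result_state')
echo "$life_cycle_state $result_state"
```

Note that `result_state` is only present once the run reaches a terminal lifecycle state, which is why the wrapper reads it only after seeing `TERMINATED`.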

With this in place, running a task reduces to: scriptPath/azurework jobId "$params"
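Because the wrapper exits 0 only when the run ends in SUCCESS, downstream steps can be chained directly on its exit code. A sketch, with a stub function standing in for the real `azurework` call (hypothetical, for illustration only):

```shell
#!/bin/bash
# Stub standing in for: scriptPath/azurework jobId "$params"
# Here it pretends the job failed, so the failure branch runs.
run_job() { return 1; }

if run_job; then
  status="ok"        # e.g. kick off the next pipeline stage
else
  status="failed"    # e.g. send an alert, retry, or abort
fi
echo "$status"
```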

Example:

bash
#!/bin/bash
source /etc/profile
source ~/.bashrc
if [ $# -eq 1 ];then
  pt=$1
else
  pt=$(date -d "yesterday" +%Y%m%d)
  echo "daily running"
fi
echo "$pt"

jobId=630195968789849
jarPath=dbfs:/FileStore/tables/cdp-1.0.0.jar
scriptPath=/home/eshen2/cdp/bigdata
confPath=dbfs:/FileStore/tables
classPath=org.deloitte.cdp.batch.engine.Job_Combine
cdp=dbw_corp_crm_etl_dev_southeastasia.cdp
tags_daily_table=tags_daily
tags_history_table=tags_history
tags_routine_table=tags_routine
id_access_table=id_access
params="[\"--class\",\"$classPath\",\"$jarPath\",\"--date\",\"${pt}\",\"--cdp\",\"$cdp\",\"--entity\",\"obj\",\"--tags_daily_table\",\"$tags_daily_table\",\"--tags_history_table\",\"$tags_history_table\",\"--tags_routine_table\",\"$tags_routine_table\",\"--id_access_table\",\"$id_access_table\",\"--confPath\",\"$confPath\"]"
echo $params

$scriptPath/azurework $jobId "$params"
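The hand-escaped `params` string above is easy to get wrong. An alternative sketch that builds the same JSON array with `jq -n`, so quoting is handled automatically (variable values here are hypothetical, trimmed to three arguments for brevity):

```shell
#!/bin/bash
# Hypothetical values mirroring the example script above.
classPath=org.deloitte.cdp.batch.engine.Job_Combine
jarPath=dbfs:/FileStore/tables/cdp-1.0.0.jar
pt=20240101

# jq -n builds JSON from scratch; --arg passes shell values in safely,
# and -c emits the compact single-line form run-now expects.
params=$(jq -cn --arg c "$classPath" --arg j "$jarPath" --arg d "$pt" \
  '["--class", $c, $j, "--date", $d]')
echo "$params"
```

The full params list from the example can be built the same way by adding one `--arg` per table name.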