flink-cdc in Practice: Oracle Issue Notes 01

Notes on problems I ran into, in the hope they help someone else too.

Likes, comments, and follows are welcome.

The job failed during the LogMiner streaming phase with the following error:
2024-01-26 11:02:56,168 ERROR  Oracle|oracle_logminer|streaming  Mining session stopped due to the {}   [io.debezium.connector.oracle.logminer.LogMinerHelper]
io.debezium.DebeziumException: Supplemental logging not configured for table ORCL.AA.A.  Use command: ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS
	at io.debezium.connector.oracle.logminer.LogMinerHelper.checkSupplementalLogging(LogMinerHelper.java:407)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:132)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
2024-01-26 11:02:56,170 ERROR  Oracle|oracle_logminer|streaming  Producer failure   [io.debezium.pipeline.ErrorHandler]
io.debezium.DebeziumException: Supplemental logging not configured for table ORCL.AA.A.  Use command: ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS
	at io.debezium.connector.oracle.logminer.LogMinerHelper.checkSupplementalLogging(LogMinerHelper.java:407)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:132)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
2024-01-26 11:02:56,170 INFO   Oracle|oracle_logminer|streaming  startScn=3460974, endScn=null, offsetContext.getScn()=3460974   [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Transactional buffer dump:    [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Streaming metrics dump: OracleStreamingChangeEventSourceMetrics{currentScn=null, oldestScn=null, committedScn=null, offsetScn=null, logMinerQueryCount=0, totalProcessedRows=0, totalCapturedDmlCount=0, totalDurationOfFetchingQuery=PT0S, lastCapturedDmlCount=0, lastDurationOfFetchingQuery=PT0S, maxCapturedDmlCount=0, maxDurationOfFetchingQuery=PT0S, totalBatchProcessingDuration=PT0S, lastBatchProcessingDuration=PT0S, maxBatchProcessingDuration=PT0S, maxBatchProcessingThroughput=0, currentLogFileName=null, minLogFilesMined=0, maxLogFilesMined=0, redoLogStatus=null, switchCounter=0, batchSize=20000, millisecondToSleepBetweenMiningQuery=1000, recordMiningHistory=false, hoursToKeepTransaction=0, networkConnectionProblemsCounter0, batchSizeDefault=20000, batchSizeMin=1000, batchSizeMax=100000, sleepTimeDefault=1000, sleepTimeMin=0, sleepTimeMax=3000, sleepTimeIncrement=200, totalParseTime=PT0S, totalStartLogMiningSessionDuration=PT0S, lastStartLogMiningSessionDuration=PT0S, maxStartLogMiningSessionDuration=PT0S, totalProcessTime=PT0S, minBatchProcessTime=PT0S, maxBatchProcessTime=PT0S, totalResultSetNextTime=PT0S, lagFromTheSource=DurationPT0S, maxLagFromTheSourceDuration=PT0S, minLagFromTheSourceDuration=PT0S, lastCommitDuration=PT0S, maxCommitDuration=PT0S, activeTransactions=0, rolledBackTransactions=0, committedTransactions=0, abandonedTransactionIds=[], rolledbackTransactionIds=[], registeredDmlCount=0, committedDmlCount=0, errorCount=1, warningCount=0, scnFreezeCount=0}   [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Finished streaming   [io.debezium.pipeline.ChangeEventSourceCoordinator]
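The root cause is that Debezium's LogMiner-based capture needs supplemental logging so that the redo log entries carry complete column data for the captured tables; the check fails as soon as the streaming phase starts. For context, the kind of source this error comes from can be declared in Flink SQL roughly as in the sketch below, where the column list, hostname, and credentials are placeholders and only the database ORCL, schema AA, and table A are taken from the log above:

-- Hypothetical oracle-cdc source definition; columns, hostname and credentials are placeholders.
CREATE TABLE oracle_source (
    ID INT,
    NAME STRING,
    PRIMARY KEY (ID) NOT ENFORCED
) WITH (
    'connector' = 'oracle-cdc',
    'hostname' = 'localhost',
    'port' = '1521',
    'username' = 'flinkuser',
    'password' = 'flinkpw',
    'database-name' = 'ORCL',
    'schema-name' = 'AA',
    'table-name' = 'A'
);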

Solution:

-- Enable supplemental logging in Oracle.

-- Check supplemental logging and force logging:
select supplemental_log_data_all, force_logging from v$database;

alter database add supplemental log data (all) columns;
alter database force logging;

-- Example session:
SQL> select supplemental_log_data_all, force_logging from v$database;

SUP FORCE_LOGGING
--- ---------------------------------------
NO  NO

SQL> alter database add supplemental log data (all) columns;

Database altered.

SQL> alter database force logging;

Database altered.

SQL> select supplemental_log_data_all, force_logging from v$database;

SUP FORCE_LOGGING
--- ---------------------------------------
YES YES
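If enabling (ALL) COLUMNS supplemental logging for the whole database is too broad, the error message itself points at a table-level alternative. A minimal sketch for the AA.A table from the log; note that Debezium still requires at least minimal database-level supplemental logging:

-- Table-level alternative (command taken from the error message above)
alter table AA.A add supplemental log data (all) columns;

-- Minimal database-level supplemental logging must still be enabled
alter database add supplemental log data;

-- Verify the current state
select supplemental_log_data_min, supplemental_log_data_all from v$database;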