flink-cdc in practice: Oracle problem log 01

Recording problems, warming you and me.

Likes, comments, and follows are welcome.

The Flink CDC Oracle job (Debezium LogMiner streaming) stopped with the following error:

2024-01-26 11:02:56,168 ERROR  Oracle|oracle_logminer|streaming  Mining session stopped due to the {}   [io.debezium.connector.oracle.logminer.LogMinerHelper]
io.debezium.DebeziumException: Supplemental logging not configured for table ORCL.AA.A.  Use command: ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS
	at io.debezium.connector.oracle.logminer.LogMinerHelper.checkSupplementalLogging(LogMinerHelper.java:407)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:132)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
2024-01-26 11:02:56,170 ERROR  Oracle|oracle_logminer|streaming  Producer failure   [io.debezium.pipeline.ErrorHandler]
io.debezium.DebeziumException: Supplemental logging not configured for table ORCL.AA.A.  Use command: ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS
	at io.debezium.connector.oracle.logminer.LogMinerHelper.checkSupplementalLogging(LogMinerHelper.java:407)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:132)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
2024-01-26 11:02:56,170 INFO   Oracle|oracle_logminer|streaming  startScn=3460974, endScn=null, offsetContext.getScn()=3460974   [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Transactional buffer dump:    [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Streaming metrics dump: OracleStreamingChangeEventSourceMetrics{currentScn=null, oldestScn=null, committedScn=null, offsetScn=null, logMinerQueryCount=0, totalProcessedRows=0, totalCapturedDmlCount=0, totalDurationOfFetchingQuery=PT0S, lastCapturedDmlCount=0, lastDurationOfFetchingQuery=PT0S, maxCapturedDmlCount=0, maxDurationOfFetchingQuery=PT0S, totalBatchProcessingDuration=PT0S, lastBatchProcessingDuration=PT0S, maxBatchProcessingDuration=PT0S, maxBatchProcessingThroughput=0, currentLogFileName=null, minLogFilesMined=0, maxLogFilesMined=0, redoLogStatus=null, switchCounter=0, batchSize=20000, millisecondToSleepBetweenMiningQuery=1000, recordMiningHistory=false, hoursToKeepTransaction=0, networkConnectionProblemsCounter0, batchSizeDefault=20000, batchSizeMin=1000, batchSizeMax=100000, sleepTimeDefault=1000, sleepTimeMin=0, sleepTimeMax=3000, sleepTimeIncrement=200, totalParseTime=PT0S, totalStartLogMiningSessionDuration=PT0S, lastStartLogMiningSessionDuration=PT0S, maxStartLogMiningSessionDuration=PT0S, totalProcessTime=PT0S, minBatchProcessTime=PT0S, maxBatchProcessTime=PT0S, totalResultSetNextTime=PT0S, lagFromTheSource=DurationPT0S, maxLagFromTheSourceDuration=PT0S, minLagFromTheSourceDuration=PT0S, lastCommitDuration=PT0S, maxCommitDuration=PT0S, activeTransactions=0, rolledBackTransactions=0, committedTransactions=0, abandonedTransactionIds=[], rolledbackTransactionIds=[], registeredDmlCount=0, committedDmlCount=0, errorCount=1, warningCount=0, scnFreezeCount=0}   [io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource]
2024-01-26 11:02:56,171 INFO   Oracle|oracle_logminer|streaming  Finished streaming   [io.debezium.pipeline.ChangeEventSourceCoordinator]
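
For context, the failing pipeline streams ORCL.AA.A through the flink-cdc oracle-cdc connector. A minimal Flink SQL source definition looks roughly like the sketch below; the column names and connection values are placeholders, not taken from the actual job.

-- Hypothetical Flink SQL DDL for the oracle-cdc source (all values are placeholders)
CREATE TABLE oracle_source (
    ID   INT,
    NAME STRING,
    PRIMARY KEY (ID) NOT ENFORCED
) WITH (
    'connector'     = 'oracle-cdc',
    'hostname'      = 'your-oracle-host',
    'port'          = '1521',
    'username'      = 'flink_user',
    'password'      = 'flink_pw',
    'database-name' = 'ORCL',
    'schema-name'   = 'AA',
    'table-name'    = 'A'
);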

Solution: enable supplemental logging (and force logging) on the Oracle database.

The error message suggests the table-level command (ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS); here supplemental logging is enabled for the whole database instead.

-- Check whether supplemental logging and force logging are enabled
select supplemental_log_data_all, force_logging from v$database;

-- Enable supplemental logging (all columns) and force logging
alter database add supplemental log data (all) columns;
alter database force logging;

The sqlplus session:

SQL> select supplemental_log_data_all,force_logging from v$database;

SUP FORCE_LOGGING
--- ---------------------------------------
NO  NO

SQL> alter database add supplemental log data(all) columns;

Database altered.

SQL> alter database force logging;

Database altered.

SQL> select supplemental_log_data_all,force_logging from v$database;

SUP FORCE_LOGGING
--- ---------------------------------------
YES YES
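
If enabling supplemental logging for the whole database is not desirable, the table-level form quoted in the error message should also satisfy the connector. A sketch of that alternative, with a verification query against the standard DBA_LOG_GROUPS data dictionary view (the AA.A names come from the error above):

-- Table-level alternative (the exact command quoted in the error message)
ALTER TABLE AA.A ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

-- Verify: expect LOG_GROUP_TYPE = 'ALL COLUMN LOGGING' for the table
select owner, table_name, log_group_type
from dba_log_groups
where owner = 'AA' and table_name = 'A';

Note that Debezium's Oracle connector also expects at least minimal database-level supplemental logging (ALTER DATABASE ADD SUPPLEMENTAL LOG DATA), so on a freshly configured database the table-level command alone may not be enough.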