openh264 Source Code Analysis: The WelsEncoderEncodeExt Function

Function Purpose

  1. Purpose: the core SVC video encoding routine; it takes a source picture as input and produces the encoded bitstream.
  2. Prototype:
cpp
int32_t WelsEncoderEncodeExt (sWelsEncCtx* pCtx, SFrameBSInfo* pFbi, const SSourcePicture* pSrcPic) 
  3. Parameters:
  • sWelsEncCtx* pCtx: encoder context
  • SFrameBSInfo* pFbi: output bitstream info
  • const SSourcePicture* pSrcPic: input source picture
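
Applications do not call WelsEncoderEncodeExt directly: they create an ISVCEncoder through the public API, and ISVCEncoder::EncodeFrame eventually lands in this function. Below is a minimal sketch of that path, assuming the standard public header; the include path, resolution and bitrate values are placeholders, not taken from this source file.

cpp
// Minimal driver sketch: EncodeFrame() is the public entry that reaches
// WelsEncoderEncodeExt() internally.
#include <wels/codec_api.h>
#include <cstring>

int EncodeOneFrame (const unsigned char* pI420, int iWidth, int iHeight) {
  ISVCEncoder* pEncoder = nullptr;
  if (WelsCreateSVCEncoder (&pEncoder) != 0 || pEncoder == nullptr)
    return -1;

  SEncParamBase param;
  memset (&param, 0, sizeof (param));
  param.iUsageType     = CAMERA_VIDEO_REAL_TIME;
  param.iPicWidth      = iWidth;
  param.iPicHeight     = iHeight;
  param.fMaxFrameRate  = 30.0f;
  param.iTargetBitrate = 500000;
  pEncoder->Initialize (&param);

  SSourcePicture pic;                  // corresponds to pSrcPic
  memset (&pic, 0, sizeof (pic));
  pic.iColorFormat = videoFormatI420;
  pic.iPicWidth    = iWidth;
  pic.iPicHeight   = iHeight;
  pic.iStride[0]   = iWidth;
  pic.iStride[1]   = pic.iStride[2] = iWidth / 2;
  pic.pData[0]     = const_cast<unsigned char*> (pI420);
  pic.pData[1]     = pic.pData[0] + iWidth * iHeight;
  pic.pData[2]     = pic.pData[1] + iWidth * iHeight / 4;

  SFrameBSInfo info;                   // corresponds to pFbi
  memset (&info, 0, sizeof (info));
  int iRet = pEncoder->EncodeFrame (&pic, &info);   // -> WelsEncoderEncodeExt

  pEncoder->Uninitialize();
  WelsDestroySVCEncoder (pEncoder);
  return (iRet == cmResultSuccess) ? info.iFrameSizeInBytes : -1;
}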

Function Call Graph

How the Function Works

  1. Process
  • Check whether the input pCtx is NULL; if so, return a memory-allocation error;
  • Initialize a number of local variables;
  • Call GetTimestampForRc to obtain a timestamp uiTimeStamp suitable for rate control (RC);
  • In a for loop, initialize the frame type and NAL-unit count of every layer in the pFbi structure;
  • Call BuildSpatialPicList to obtain the number of spatial layers iSpatialNum; this function performs color-space conversion, denoising, downsampling and padding to generate the spatial layers;
  • Call pfWelsUpdateMaxBrWindowStatus to update the maximum-bitrate window status;
  • If iSpatialNum is less than 1, there is no valid spatial layer: skip the current frame, advance the coding index and return;
  • Call InitBitStream to initialize the bitstream buffer and the NAL-length array;
  • If bSimulcastAVC is false,
    • call PrepareEncodeFrame to prepare the frame for encoding; it decides the frame type, sets encoding parameters, and so on;
  • otherwise,
    • loop over all spatial layers and call GetTemporalLevel to compute the temporal ID iTemporalId, advancing the coding index when the temporal ID is invalid (a simplified model of this dyadic mapping is sketched after this walkthrough);
  • In a while loop over all spatial layers,
    • spatial-layer setup: fetch the current layer's configuration and set pCurDqLayer, uiDependencyId, etc.;
    • if bSimulcastAVC is enabled (true),
      • call PrepareEncodeFrame to compute the frame type eFrameType;
      • if eFrameType is skip,
        • set eFrameType in the bitstream info to videoFrameTypeSkip;
        • advance iSpatialIdx and continue with the next spatial layer;
    • call InitFrameCoding to initialize frame coding;
    • call AnalyzeSpatialPic to analyze the spatial picture;
    • set encoding parameters such as width, height, POC and picture type;
    • switch on uiSliceMode to select the encoding logic (the slice mode is chosen by the application through SEncParamExt; see the configuration sketch after this walkthrough),
      • SM_FIXEDSLCNUM_SLICE: fixed-slice-count mode
        • if multithreading is enabled, load balancing is in use, and the thread count is greater than or equal to the slice count,
          • call AdjustEnhanceLayer to adjust the enhancement-layer parameters when the current dependency ID (iCurDid) is greater than 0; otherwise call AdjustBaseLayer to adjust the base-layer parameters;
      • SM_SIZELIMITED_SLICE: size-limited slice mode
        • call PicPartitionNumDecision to decide the number of picture partitions (iPicIPartitionNum);
        • set the number of active threads (pCtx->iActiveThreadsNum) equal to the number of picture partitions, so the encoder tries to process the partitions in parallel with one thread each;
        • call WelsInitCurrentDlayerMltslc to initialize multi-slice coding for the current layer, passing iPicIPartitionNum to indicate how many slices to create;
    • derive the NAL type eNalType from the frame type eFrameType;
    • derive the NAL reference priority eNalRefIdc from the temporal ID iCurTid and the slice type eSliceType (a distilled sketch of this mapping follows this walkthrough);
    • set the type and POC of the decoded picture pDecPic;
    • call WelsInitCurrentLayer to initialize the state of the current coding layer;
    • call MarkPic to mark the current picture;
    • call BuildRefList to build the reference list; if it fails, force the current frame type to IDR;
    • if the frame being encoded is a P frame, call AfterBuildRefList to update the reference list;
    • if rate control is not disabled,
      • call AnalyzePictureComplexity to analyze the picture complexity;
    • call WelsUpdateRefSyntax to update the reference syntax, i.e. obtain the reordering syntax used for writing the slice header and pass it on;
    • call PrefetchReferencePicture to update the reference picture for the current DQ layer;
    • call pfWelsRcPictureInit to initialize the rate-control parameters for the current frame;
    • call PreprocessSliceCoding to run the pre-encoding slice processing; this step must come after the rate-control initialization and the current-layer initialization;
    • when uiSliceMode is SM_SINGLE_SLICE (single-slice mode),
      • if bNeedPrefixNalFlag is set, call AddPrefixNal to add a prefix NAL unit and update iLayerSize and iPayloadSize;
      • call WelsLoadNal to load a NAL unit in preparation for encoding;
      • call SetSliceBoundaryInfo to set the coding-boundary info for the current slice;
      • call WelsCodeOneSlice to encode the single slice;
      • after encoding, call WelsUnloadNal to unload the NAL unit;
      • call WelsEncodeNal to write the NAL unit into the bitstream (a generic model of this packaging follows this walkthrough) and update iLayerSize, pCtx->iPosBsBuffer and pLayerBsInfo;
      • update the layer info: uiLayerType, uiSpatialId, uiTemporalId, iNalCount, eFrameType, etc.; this is what the caller later reads back from SFrameBSInfo (see the consumption sketch after this walkthrough);
    • when uiSliceMode is SM_SIZELIMITED_SLICE and iMultipleThreadIdc is at most 1, i.e. dynamic slicing running in single-threaded mode,
      • obtain the total number of macroblocks (MBs) in the current frame and store it in kiLastMbInFrame;
      • call WelsCodeOnePicPartition to encode the current picture partition;
      • set pLayerBsInfo->eFrameType to the current frame type eFrameType;
      • call GetSubSequenceId to obtain the sub-sequence ID;
    • otherwise,
      • when uiSliceMode is not SM_SIZELIMITED_SLICE and iMultipleThreadIdc is greater than 1, i.e. multi-threaded, non-size-limited multi-slice coding,
        • call GetCurrentSliceNum to obtain the slice count;
        • initialize the layer bitstream info pLayerBsInfo;
        • call pTaskManage->ExecuteTasks() to run the multi-threaded tasks;
        • call AppendSliceToFrameBs to append all slices to the frame bitstream and update iLayerSize;
      • when uiSliceMode is SM_SIZELIMITED_SLICE and iMultipleThreadIdc is greater than 1,
        • fetch the layer bitstream info;
        • call InitAllSlicesInThread to initialize all slice info in the worker threads;
        • call pTaskManage->ExecuteTasks() to run the multi-threaded tasks;
        • call SliceLayerInfoUpdate to update the slice-layer info;
        • call GetCurrentSliceNum to obtain the slice count iSliceCount;
        • call AppendSliceToFrameBs to obtain the encoded layer bitstream size iLayerSize;
      • otherwise, i.e. single-threaded coding in a non-dynamic multi-slice mode,
        • call GetCurrentSliceNum to obtain the total slice count of the current layer;
        • loop over all slices with a while loop,
          • if a prefix NAL unit is needed, call AddPrefixNal and update iLayerSize and iPayloadSize;
          • call WelsLoadNal to load the NAL unit for the current slice;
          • call SetSliceBoundaryInfo to set the slice-boundary info;
          • call WelsCodeOneSlice to encode the current slice;
          • after encoding, call WelsUnloadNal to unload the NAL unit;
          • call WelsEncodeNal to write the NAL unit into the bitstream and update iLayerSize, pCtx->iPosBsBuffer and pLayerBsInfo;
        • update the layer info: uiLayerType, uiSpatialId, uiTemporalId, iNalCount, eFrameType, etc.;
    • call pfWelsRcPostFrameSkipping to check whether the frame should be skipped after encoding; if so,
      • call StackBackEncoderStatus to roll back the encoder state;
      • call ClearFrameBsInfo to clear the bitstream info;
      • call WelsRcPostFrameSkippedUpdate to update the rate-control state after the skipped frame, and return;
    • if bDeblockingParallelFlag is false (deblocking was not already done per slice in parallel), call PerformDeblockingFilter to apply the deblocking filter;
    • call pfWelsRcPictureInfoUpdate to update the rate-control info, passing the current layer size iLayerSize;
    • update in-loop statistics (accumulated frame size, average QP, etc.);
    • post-encoding handling: NAL-unit counting, padding, multi-thread load balancing, reference-frame updating and encoder-state rollback on error;
  • Clean-up, state updates and other finishing work;
  2. Internal function relationships and principles
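
The temporal ID returned by GetTemporalLevel decides which temporal layer a frame belongs to inside the GOP. openh264 derives it from the coding index through an internal lookup table; the sketch below is only a simplified model of the resulting dyadic pattern (it assumes uiGopSize is a power of two) and is not the actual implementation.

cpp
// Simplified model of dyadic temporal layering: for a GOP of size
// uiGopSize = 2^N, the frame at position k inside the GOP sits in
// temporal layer N minus the number of trailing zero bits of k, with
// the GOP anchor (k == 0) in layer 0. Illustrative only.
#include <cstdint>

int TemporalIdOfGopPosition (uint32_t uiCodingIndex, uint32_t uiGopSize) {
  const uint32_t k = uiCodingIndex & (uiGopSize - 1);  // position inside the GOP
  if (k == 0)
    return 0;                                          // GOP anchor: base layer T0
  int iStages = 0;                                     // log2(uiGopSize)
  for (uint32_t g = uiGopSize; g > 1; g >>= 1)
    ++iStages;
  int iTrailingZeros = 0;
  for (uint32_t v = k; (v & 1) == 0; v >>= 1)
    ++iTrailingZeros;
  return iStages - iTrailingZeros;                     // e.g. GOP 4: T0 T2 T1 T2
}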
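
The uiSliceMode values that the switch statement dispatches on come from the application: each spatial layer's sSliceArgument in the public SEncParamExt selects the slice mode. Below is a minimal configuration sketch assuming the standard public API; the resolution, thread count and slice numbers are placeholder values.

cpp
// Where uiSliceMode comes from: per-spatial-layer slice configuration
// in the public SEncParamExt (values below are placeholders).
#include <wels/codec_api.h>
#include <cstring>

void ConfigureSliceMode (ISVCEncoder* pEncoder, int iWidth, int iHeight) {
  SEncParamExt param;
  memset (&param, 0, sizeof (param));
  pEncoder->GetDefaultParams (&param);

  param.iPicWidth          = iWidth;
  param.iPicHeight         = iHeight;
  param.iSpatialLayerNum   = 1;
  param.iMultipleThreadIdc = 4;            // > 1 enables the multi-threaded branches

  SSpatialLayerConfig& layer = param.sSpatialLayers[0];
  layer.iVideoWidth  = iWidth;
  layer.iVideoHeight = iHeight;

  // SM_FIXEDSLCNUM_SLICE: a fixed number of slices per picture.
  layer.sSliceArgument.uiSliceMode = SM_FIXEDSLCNUM_SLICE;
  layer.sSliceArgument.uiSliceNum  = 4;

  // Alternative: SM_SIZELIMITED_SLICE caps each slice at a byte budget,
  // which is the "dynamic slicing" branch in WelsEncoderEncodeExt.
  // layer.sSliceArgument.uiSliceMode           = SM_SIZELIMITED_SLICE;
  // layer.sSliceArgument.uiSliceSizeConstraint = 1500;

  pEncoder->InitializeExt (&param);
}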
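
The eNalRefIdc decision is compact enough to restate on its own. The sketch below distills the branch that appears in the source further down; the enum values merely mirror the internal EWelsNalRefIdc constants for illustration.

cpp
// Distilled from WelsEncoderEncodeExt: the NAL reference priority is a
// function of the frame's temporal ID and the depth of the temporal
// decomposition. Local enum mirrors EWelsNalRefIdc for illustration.
enum NalRefIdc { NRI_PRI_LOWEST = 0, NRI_PRI_LOW = 1, NRI_PRI_HIGH = 2, NRI_PRI_HIGHEST = 3 };

NalRefIdc DecideNalRefIdc (int iCurTid, int iDecompositionStages, bool bIsISlice) {
  if (iCurTid == 0 || bIsISlice)
    return NRI_PRI_HIGHEST;           // base temporal layer or intra: always referenced
  if (iCurTid == iDecompositionStages)
    return NRI_PRI_LOWEST;            // top temporal layer: never used as a reference
  if (iCurTid + 1 == iDecompositionStages)
    return NRI_PRI_LOW;               // second-highest temporal layer
  return NRI_PRI_HIGHEST;             // remaining middle layers
}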
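
WelsEncodeNal takes the slice data produced by WelsCodeOneSlice and writes it into the frame bitstream. Conceptually this is annex-B NAL packaging: a start code, the NAL header byte, and emulation-prevention bytes inserted into the payload. The sketch below is a generic model of that packaging, not the openh264 implementation.

cpp
// Generic annex-B packaging model: prepend a start code, write the NAL
// header byte, then insert emulation-prevention bytes (0x03) so that no
// 0x000000 / 0x000001 / 0x000002 pattern appears inside the payload.
#include <cstdint>
#include <vector>

std::vector<uint8_t> PackAnnexBNal (uint8_t uiNalHeader,                 // nal_ref_idc | nal_unit_type
                                    const std::vector<uint8_t>& rbsp) {  // raw payload bytes
  std::vector<uint8_t> out = {0x00, 0x00, 0x00, 0x01, uiNalHeader};      // 4-byte start code + header
  int iZeroRun = 0;
  for (uint8_t b : rbsp) {
    if (iZeroRun >= 2 && b <= 0x03) { // a third byte of 00/01/02/03 would be illegal or ambiguous
      out.push_back (0x03);           // emulation-prevention byte
      iZeroRun = 0;
    }
    out.push_back (b);
    iZeroRun = (b == 0x00) ? iZeroRun + 1 : 0;
  }
  return out;
}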
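
The per-layer fields filled in above (uiLayerType, uiSpatialId, uiTemporalId, iNalCount, pNalLengthInByte, pBsBuf) are exactly what a caller reads back from SFrameBSInfo once EncodeFrame returns. A minimal consumption sketch, assuming the public API structures:

cpp
// Walk the SFrameBSInfo produced by WelsEncoderEncodeExt. Each
// SLayerBSInfo holds iNalCount NAL units laid out back to back in
// pBsBuf, with their individual sizes in pNalLengthInByte.
#include <wels/codec_api.h>
#include <cstdint>
#include <vector>

std::vector<uint8_t> CollectAnnexBStream (const SFrameBSInfo& info) {
  std::vector<uint8_t> stream;
  if (info.eFrameType == videoFrameTypeSkip)        // nothing was produced for this frame
    return stream;
  for (int iLayer = 0; iLayer < info.iLayerNum; ++iLayer) {
    const SLayerBSInfo& layer = info.sLayerInfo[iLayer];
    int iLayerSize = 0;
    for (int iNal = 0; iNal < layer.iNalCount; ++iNal)
      iLayerSize += layer.pNalLengthInByte[iNal];   // NAL units already carry start codes
    stream.insert (stream.end(), layer.pBsBuf, layer.pBsBuf + iLayerSize);
  }
  return stream;
}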

Source Code

cpp
/*!
 * \brief   core svc encoding process
 *
 * \pParam  pCtx            sWelsEncCtx*, encoder context
 * \pParam  pFbi            FrameBSInfo*
 * \pParam  pSrcPic         Source Picture
 * \return  EFrameType (videoFrameTypeIDR/videoFrameTypeI/videoFrameTypeP)
 */
int32_t WelsEncoderEncodeExt (sWelsEncCtx* pCtx, SFrameBSInfo* pFbi, const SSourcePicture* pSrcPic) {
  if (pCtx == NULL) {
    return ENC_RETURN_MEMALLOCERR;
  }
  SLayerBSInfo* pLayerBsInfo            = &pFbi->sLayerInfo[0];
  SWelsSvcCodingParam* pSvcParam        = pCtx->pSvcParam;
  SSpatialPicIndex* pSpatialIndexMap = &pCtx->sSpatialIndexMap[0];
#if defined(ENABLE_FRAME_DUMP) || defined(ENABLE_PSNR_CALC)
  SPicture* fsnr                = NULL;
#endif//ENABLE_FRAME_DUMP || ENABLE_PSNR_CALC
  SPicture* pEncPic             = NULL; // to be decided later
#if defined(MT_DEBUG)
  int32_t iDidList[MAX_DEPENDENCY_LAYER] = {0};
#endif
  int32_t iLayerNum             = 0;
  int32_t iLayerSize            = 0;
  int32_t iSpatialNum           =
    0; // available count number of spatial layers due to frame size changed in this given frame
  int32_t iSpatialIdx           = 0; // iIndex of spatial layers due to frame size changed in this given frame
  int32_t iFrameSize            = 0;
  int32_t iNalIdxInLayer        = 0;
  int32_t iCountNal             = 0;
  EVideoFrameType eFrameType    = videoFrameTypeInvalid;
  int32_t iCurWidth             = 0;
  int32_t iCurHeight            = 0;
  EWelsNalUnitType eNalType     = NAL_UNIT_UNSPEC_0;
  EWelsNalRefIdc eNalRefIdc     = NRI_PRI_LOWEST;
  int8_t iCurDid                = 0;
  int32_t iCurTid                = 0;
  bool bAvcBased                = false;
  SLogContext* pLogCtx = & (pCtx->sLogCtx);
#if defined(ENABLE_PSNR_CALC)
  float fSnrY = .0f, fSnrU = .0f, fSnrV = .0f;
#endif//ENABLE_PSNR_CALC

#if defined(_DEBUG)
  int32_t i = 0, j = 0, k = 0;
#endif//_DEBUG
  pCtx->iEncoderError = ENC_RETURN_SUCCESS;
  pCtx->bCurFrameMarkedAsSceneLtr = false;
  pFbi->eFrameType = videoFrameTypeSkip;
  pFbi->iLayerNum = 0; // for initialization
  pFbi->uiTimeStamp = GetTimestampForRc (pSrcPic->uiTimeStamp, pCtx->uiLastTimestamp,
                                         pCtx->pSvcParam->sSpatialLayers[pCtx->pSvcParam->iSpatialLayerNum - 1].fFrameRate);
  for (int32_t iNalIdx = 0; iNalIdx < MAX_LAYER_NUM_OF_FRAME; iNalIdx++) {
    pFbi->sLayerInfo[iNalIdx].eFrameType = videoFrameTypeSkip;
    pFbi->sLayerInfo[iNalIdx].iNalCount  = 0;
  }
  // perform csc/denoise/downsample/padding, generate spatial layers
  iSpatialNum = pCtx->pVpp->BuildSpatialPicList (pCtx, pSrcPic);
  if (iSpatialNum == -1) {
    WelsLog (& (pCtx->sLogCtx), WELS_LOG_ERROR, "Failed in allocating memory in BuildSpatialPicList");
    return ENC_RETURN_MEMALLOCERR;
  }

  if (pCtx->pFuncList->pfRc.pfWelsUpdateMaxBrWindowStatus) {
    pCtx->pFuncList->pfRc.pfWelsUpdateMaxBrWindowStatus (pCtx, iSpatialNum, pFbi->uiTimeStamp);
  }

  if (iSpatialNum < 1) {
    for (int32_t iDidIdx = 0; iDidIdx < pSvcParam->iSpatialLayerNum; iDidIdx++) {
      SSpatialLayerInternal* pParamInternal = &pSvcParam->sDependencyLayers[iDidIdx];
      pParamInternal->iCodingIndex ++;
    }
    pFbi->eFrameType = videoFrameTypeSkip;
    pLayerBsInfo->eFrameType = videoFrameTypeSkip;
    WelsLog (& (pCtx->sLogCtx), WELS_LOG_DEBUG,
             "[Rc] Frame timestamp = %lld, skip one frame due to preprocessing return (temporal layer settings or else)",
             pSrcPic->uiTimeStamp);
    return ENC_RETURN_SUCCESS;
  }

  InitBitStream (pCtx);
  pLayerBsInfo->pBsBuf = pCtx->pFrameBs ;
  pLayerBsInfo->pNalLengthInByte = pCtx->pOut->pNalLen;
  iCurDid = pSpatialIndexMap->iDid;
  pCtx->pCurDqLayer             = pCtx->ppDqLayerList[iCurDid];
  pCtx->pCurDqLayer->pRefLayer  = NULL;
  if (!pSvcParam->bSimulcastAVC) {
    eFrameType = PrepareEncodeFrame (pCtx, pLayerBsInfo, iSpatialNum, iCurDid, iCurTid, iLayerNum, iFrameSize,
                                     pFbi->uiTimeStamp);
    if (eFrameType == videoFrameTypeSkip) {
      pFbi->eFrameType = videoFrameTypeSkip;
      pLayerBsInfo->eFrameType = videoFrameTypeSkip;
      return ENC_RETURN_SUCCESS;
    }
  } else {
    for (int32_t iDidIdx = 0; iDidIdx < pSvcParam->iSpatialLayerNum; iDidIdx++) {
      SSpatialLayerInternal* pParamInternal = &pSvcParam->sDependencyLayers[iDidIdx];
      int32_t iTemporalId =  GetTemporalLevel (pParamInternal, pParamInternal->iCodingIndex,
                             pSvcParam->uiGopSize);
      if (iTemporalId == INVALID_TEMPORAL_ID)
        pParamInternal->iCodingIndex ++;
    }
  }

  while (iSpatialIdx < iSpatialNum) {
    iCurDid  = (pSpatialIndexMap + iSpatialIdx)->iDid;
    SSpatialLayerConfig* pParam = &pSvcParam->sSpatialLayers[iCurDid];
    SSpatialLayerInternal* pParamInternal = &pSvcParam->sDependencyLayers[iCurDid];
    int32_t  iDecompositionStages = pSvcParam->sDependencyLayers[iCurDid].iDecompositionStages;
    pCtx->pCurDqLayer           = pCtx->ppDqLayerList[iCurDid];
    pCtx->uiDependencyId        =  iCurDid;

    if (pSvcParam->bSimulcastAVC) {
      eFrameType = PrepareEncodeFrame (pCtx, pLayerBsInfo, iSpatialNum, iCurDid, iCurTid, iLayerNum, iFrameSize,
                                       pFbi->uiTimeStamp);
      if (eFrameType == videoFrameTypeSkip) {
        pLayerBsInfo->eFrameType = videoFrameTypeSkip;
        ++iSpatialIdx;
        continue;
      }
    }
    InitFrameCoding (pCtx, eFrameType, iCurDid);
    pCtx->pVpp->AnalyzeSpatialPic (pCtx, iCurDid);

    pCtx->pEncPic               = pEncPic = (pSpatialIndexMap + iSpatialIdx)->pSrc;
    pCtx->pEncPic->iPictureType = pCtx->eSliceType;
    pCtx->pEncPic->iFramePoc    = pParamInternal->iPOC;

    iCurWidth   = pParam->iVideoWidth;
    iCurHeight  = pParam->iVideoHeight;
#if defined(MT_DEBUG)
    iDidList[iSpatialIdx]       = iCurDid;
#endif
    // Encoding this picture might mulitiple sQualityStat layers potentially be encoded as followed
    switch (pParam->sSliceArgument.uiSliceMode) {
    case SM_FIXEDSLCNUM_SLICE: {
      if ((pSvcParam->iMultipleThreadIdc > 1) &&
          (pSvcParam->bUseLoadBalancing
           && pSvcParam->iMultipleThreadIdc >= pSvcParam->sSpatialLayers[iCurDid].sSliceArgument.uiSliceNum)
         ) {
        if (iCurDid > 0)
          AdjustEnhanceLayer (pCtx, iCurDid);
        else
          AdjustBaseLayer (pCtx);
      }

      break;
    }
    case SM_SIZELIMITED_SLICE: {
      int32_t iPicIPartitionNum = PicPartitionNumDecision (pCtx);
      // MT compatibility
      pCtx->iActiveThreadsNum =
        iPicIPartitionNum; // we try to active number of threads, equal to number of picture partitions
      WelsInitCurrentDlayerMltslc (pCtx, iPicIPartitionNum);
      break;
    }
    default: {
      break;
    }
    }

    /* coding each spatial layer, only one sQualityStat layer within spatial support */
    int32_t iSliceCount = 1;
    if (iLayerNum >= MAX_LAYER_NUM_OF_FRAME) { // check available layer_bs_info writing as follows
      WelsLog (pLogCtx, WELS_LOG_ERROR, "WelsEncoderEncodeExt(), iLayerNum(%d) overflow(max:%d)!", iLayerNum,
               MAX_LAYER_NUM_OF_FRAME);
      return ENC_RETURN_UNSUPPORTED_PARA;
    }

    iNalIdxInLayer  = 0;
    bAvcBased       = ((pSvcParam->bSimulcastAVC) || (iCurDid == BASE_DEPENDENCY_ID));
    pCtx->bNeedPrefixNalFlag    = ((!pSvcParam->bSimulcastAVC) && (bAvcBased &&
                                   (pSvcParam->bPrefixNalAddingCtrl ||
                                    (pSvcParam->iSpatialLayerNum > 1))));

    if (eFrameType == videoFrameTypeP) {
      eNalType = bAvcBased ? NAL_UNIT_CODED_SLICE : NAL_UNIT_CODED_SLICE_EXT;
    } else if (eFrameType == videoFrameTypeIDR) {
      eNalType = bAvcBased ? NAL_UNIT_CODED_SLICE_IDR : NAL_UNIT_CODED_SLICE_EXT;
    }
    if (iCurTid == 0 || pCtx->eSliceType == I_SLICE)
      eNalRefIdc = NRI_PRI_HIGHEST;
    else if (iCurTid == iDecompositionStages)
      eNalRefIdc = NRI_PRI_LOWEST;
    else if (1 + iCurTid == iDecompositionStages)
      eNalRefIdc = NRI_PRI_LOW;
    else // more details for other temporal layers?
      eNalRefIdc = NRI_PRI_HIGHEST;
    pCtx->eNalType = eNalType;
    pCtx->eNalPriority = eNalRefIdc;

    pCtx->pDecPic               = pCtx->ppRefPicListExt[iCurDid]->pNextBuffer;
#if defined(ENABLE_FRAME_DUMP) || defined(ENABLE_PSNR_CALC)
    fsnr                        = pCtx->pDecPic;
#endif//#if defined(ENABLE_FRAME_DUMP) || defined(ENABLE_PSNR_CALC)
    pCtx->pDecPic->iPictureType = pCtx->eSliceType;
    pCtx->pDecPic->iFramePoc    = pParamInternal->iPOC;

    WelsInitCurrentLayer (pCtx, iCurWidth, iCurHeight);

    pCtx->pReferenceStrategy->MarkPic();
    if (!pCtx->pReferenceStrategy->BuildRefList (pParamInternal->iPOC, 0)) {
      WelsLog (pLogCtx, WELS_LOG_WARNING,
               "WelsEncoderEncodeExt(), WelsBuildRefList failed for P frames, pCtx->iNumRef0= %d. ForceCodingIDR!",
               pCtx->iNumRef0);
      eFrameType = videoFrameTypeIDR;
      pCtx->iEncoderError = ENC_RETURN_CORRECTED;
      break;
    }
    if (pCtx->eSliceType != I_SLICE) {
      pCtx->pReferenceStrategy->AfterBuildRefList();
    }
#ifdef LONG_TERM_REF_DUMP
    DumpRef (pCtx);
#endif
    if (pSvcParam->iRCMode != RC_OFF_MODE)
      pCtx->pVpp->AnalyzePictureComplexity (pCtx, pCtx->pEncPic, ((pCtx->eSliceType == P_SLICE)
                                            && (pCtx->iNumRef0 > 0)) ? pCtx->pRefList0[0] : NULL,
                                            iCurDid, (pCtx->eSliceType == P_SLICE) && pSvcParam->bEnableBackgroundDetection);
    WelsUpdateRefSyntax (pCtx,  pParamInternal->iPOC,
                         eFrameType); //get reordering syntax used for writing slice header and transmit to encoder.
    PrefetchReferencePicture (pCtx, eFrameType); // update reference picture for current pDq layer
    pCtx->pFuncList->pfRc.pfWelsRcPictureInit (pCtx, pFbi->uiTimeStamp);
    PreprocessSliceCoding (pCtx); // MUST be called after pfWelsRcPictureInit() and WelsInitCurrentLayer()

    //TODO Complexity Calculation here for screen content
    iLayerSize = 0;
    if (SM_SINGLE_SLICE == pParam->sSliceArgument.uiSliceMode) { // only one slice within a sQualityStat layer
      int32_t iSliceSize   = 0;
      int32_t iPayloadSize = 0;
      SSlice* pCurSlice    = &pCtx->pCurDqLayer->sSliceBufferInfo[0].pSliceBuffer[0];

      if (pCtx->bNeedPrefixNalFlag) {
        pCtx->iEncoderError = AddPrefixNal (pCtx, pLayerBsInfo, &pLayerBsInfo->pNalLengthInByte[0], &iNalIdxInLayer, eNalType,
                                            eNalRefIdc,
                                            iPayloadSize);
        WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)
        iLayerSize += iPayloadSize;
      }

      WelsLoadNal (pCtx->pOut, eNalType, eNalRefIdc);
      assert (0 == (int) pCurSlice->iSliceIdx);
      pCtx->iEncoderError   = SetSliceBoundaryInfo (pCtx->pCurDqLayer, pCurSlice, 0);
      WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)

      pCtx->iEncoderError   = WelsCodeOneSlice (pCtx, pCurSlice, eNalType);
      WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)

      WelsUnloadNal (pCtx->pOut);

      pCtx->iEncoderError = WelsEncodeNal (&pCtx->pOut->sNalList[pCtx->pOut->iNalIndex - 1],
                                           &pCtx->pCurDqLayer->sLayerInfo.sNalHeaderExt,
                                           pCtx->iFrameBsSize - pCtx->iPosBsBuffer,
                                           pCtx->pFrameBs + pCtx->iPosBsBuffer,
                                           &pLayerBsInfo->pNalLengthInByte[iNalIdxInLayer]);
      WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)
      iSliceSize = pLayerBsInfo->pNalLengthInByte[iNalIdxInLayer];

      iLayerSize += iSliceSize;
      pCtx->iPosBsBuffer               += iSliceSize;
      pLayerBsInfo->uiLayerType         = VIDEO_CODING_LAYER;
      pLayerBsInfo->uiSpatialId         = iCurDid;
      pLayerBsInfo->uiTemporalId        = iCurTid;
      pLayerBsInfo->uiQualityId         = 0;
      pLayerBsInfo->iNalCount           = ++ iNalIdxInLayer;
      pLayerBsInfo->eFrameType          = eFrameType;
      pLayerBsInfo->iSubSeqId = GetSubSequenceId (pCtx, eFrameType);
    }
    // for dynamic slicing single threading..
    else if ((SM_SIZELIMITED_SLICE == pParam->sSliceArgument.uiSliceMode) && (pSvcParam->iMultipleThreadIdc <= 1)) {
      const int32_t kiLastMbInFrame = pCtx->pCurDqLayer->sSliceEncCtx.iMbNumInFrame;
      pCtx->iEncoderError = WelsCodeOnePicPartition (pCtx, pFbi, pLayerBsInfo, &iNalIdxInLayer, &iLayerSize, 0,
                            kiLastMbInFrame - 1, 0);
      pLayerBsInfo->eFrameType = eFrameType;
      pLayerBsInfo->iSubSeqId = GetSubSequenceId (pCtx, eFrameType);
      WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)
    } else {
      //other multi-slice uiSliceMode
      // THREAD_FULLY_FIRE_MODE/THREAD_PICK_UP_MODE for any mode of non-SM_SIZELIMITED_SLICE
      if ((SM_SIZELIMITED_SLICE != pParam->sSliceArgument.uiSliceMode) && (pSvcParam->iMultipleThreadIdc > 1)) {
        iSliceCount = GetCurrentSliceNum (pCtx->pCurDqLayer);
        if (iLayerNum + 1 >= MAX_LAYER_NUM_OF_FRAME) { // check available layer_bs_info for further writing as followed
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), iLayerNum(%d) overflow(max:%d) at iDid= %d uiSliceMode= %d, iSliceCount= %d!",
                   iLayerNum, MAX_LAYER_NUM_OF_FRAME, iCurDid, pParam->sSliceArgument.uiSliceMode, iSliceCount);
          return ENC_RETURN_UNSUPPORTED_PARA;
        }
        if (iSliceCount <= 1) {
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), iSliceCount(%d) from GetCurrentSliceNum() is untrusted due stack/heap crupted!",
                   iSliceCount);
          return ENC_RETURN_UNEXPECTED;
        }
        //note: the old codes are removed at commit: 3e0ee69
        pLayerBsInfo->pBsBuf = pCtx->pFrameBs + pCtx->iPosBsBuffer;
        pLayerBsInfo->uiLayerType   = VIDEO_CODING_LAYER;
        pLayerBsInfo->uiSpatialId   = pCtx->uiDependencyId;
        pLayerBsInfo->uiTemporalId  = pCtx->uiTemporalId;
        pLayerBsInfo->uiQualityId   = 0;
        pLayerBsInfo->iNalCount     = 0;
        pLayerBsInfo->eFrameType    = eFrameType;
        pLayerBsInfo->iSubSeqId = GetSubSequenceId (pCtx, eFrameType);

        pCtx->pTaskManage->ExecuteTasks();
        if (pCtx->iEncoderError) {
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), multi-slice (mode %d) encoding error!",
                   pParam->sSliceArgument.uiSliceMode);
          return pCtx->iEncoderError;
        }

        iLayerSize = AppendSliceToFrameBs (pCtx, pLayerBsInfo, iSliceCount);
      }
      // THREAD_FULLY_FIRE_MODE && SM_SIZELIMITED_SLICE
      else if ((SM_SIZELIMITED_SLICE == pParam->sSliceArgument.uiSliceMode) && (pSvcParam->iMultipleThreadIdc > 1)) {
        const int32_t kiPartitionCnt = pCtx->iActiveThreadsNum;

        //TODO: use a function to remove duplicate code here and ln3994
        int32_t iLayerBsIdx       = pCtx->pOut->iLayerBsIndex;
        SLayerBSInfo* pLbi        = &pFbi->sLayerInfo[iLayerBsIdx];
        pLbi->pBsBuf = pCtx->pFrameBs + pCtx->iPosBsBuffer;
        pLbi->uiLayerType   = VIDEO_CODING_LAYER;
        pLbi->uiSpatialId   = pCtx->uiDependencyId;
        pLbi->uiTemporalId  = pCtx->uiTemporalId;
        pLbi->uiQualityId   = 0;
        pLbi->iNalCount     = 0;
        pLbi->eFrameType = eFrameType;
        pLbi->iSubSeqId = GetSubSequenceId (pCtx, eFrameType);
        int32_t iIdx = 0;
        while (iIdx < kiPartitionCnt) {
          pCtx->pSliceThreading->pThreadPEncCtx[iIdx].pFrameBsInfo = pFbi;
          pCtx->pSliceThreading->pThreadPEncCtx[iIdx].iSliceIndex  = iIdx;
          ++ iIdx;
        }

        int32_t iRet = InitAllSlicesInThread (pCtx);
        if (iRet) {
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), multi-slice (mode %d) InitAllSlicesInThread() error!",
                   pParam->sSliceArgument.uiSliceMode);
          return ENC_RETURN_UNEXPECTED;
        }
        pCtx->pTaskManage->ExecuteTasks();

        if (pCtx->iEncoderError) {
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), multi-slice (mode %d) encoding error = %d!",
                   pParam->sSliceArgument.uiSliceMode, pCtx->iEncoderError);
          return pCtx->iEncoderError;
        }

        iRet = SliceLayerInfoUpdate (pCtx, pFbi, pLayerBsInfo, pParam->sSliceArgument.uiSliceMode);
        if (iRet) {
          WelsLog (pLogCtx, WELS_LOG_ERROR,
                   "WelsEncoderEncodeExt(), multi-slice (mode %d) InitAllSlicesInThread() error!",
                   pParam->sSliceArgument.uiSliceMode);
          return ENC_RETURN_UNEXPECTED;
        }

        iSliceCount = GetCurrentSliceNum (pCtx->pCurDqLayer);
        iLayerSize  = AppendSliceToFrameBs (pCtx, pLayerBsInfo, iSliceCount);
      } else { // for non-dynamic-slicing mode single threading branch..
        const bool bNeedPrefix = pCtx->bNeedPrefixNalFlag;
        int32_t iSliceIdx    = 0;
        SSlice* pCurSlice    = NULL;

        iSliceCount = GetCurrentSliceNum (pCtx->pCurDqLayer);
        while (iSliceIdx < iSliceCount) {
          int32_t iSliceSize    = 0;
          int32_t iPayloadSize  = 0;

          if (bNeedPrefix) {
            pCtx->iEncoderError = AddPrefixNal (pCtx, pLayerBsInfo, &pLayerBsInfo->pNalLengthInByte[0], &iNalIdxInLayer, eNalType,
                                                eNalRefIdc,
                                                iPayloadSize);
            WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)
            iLayerSize += iPayloadSize;
          }

          WelsLoadNal (pCtx->pOut, eNalType, eNalRefIdc);

          pCurSlice = &pCtx->pCurDqLayer->sSliceBufferInfo[0].pSliceBuffer[iSliceIdx];
          assert (iSliceIdx == pCurSlice->iSliceIdx);
          pCtx->iEncoderError   = SetSliceBoundaryInfo (pCtx->pCurDqLayer, pCurSlice, iSliceIdx);

          pCtx->iEncoderError = WelsCodeOneSlice (pCtx, pCurSlice, eNalType);
          WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)

          WelsUnloadNal (pCtx->pOut);

          pCtx->iEncoderError = WelsEncodeNal (&pCtx->pOut->sNalList[pCtx->pOut->iNalIndex - 1],
                                               &pCtx->pCurDqLayer->sLayerInfo.sNalHeaderExt,
                                               pCtx->iFrameBsSize - pCtx->iPosBsBuffer,
                                               pCtx->pFrameBs + pCtx->iPosBsBuffer, &pLayerBsInfo->pNalLengthInByte[iNalIdxInLayer]);
          WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)
          iSliceSize = pLayerBsInfo->pNalLengthInByte[iNalIdxInLayer];

          pCtx->iPosBsBuffer += iSliceSize;
          iLayerSize         += iSliceSize;

#if defined(SLICE_INFO_OUTPUT)
          fprintf (stderr,
                   "@slice=%-6d sliceType:%c idc:%d size:%-6d\n",
                   iSliceIdx,
                   (pCtx->eSliceType == P_SLICE ? 'P' : 'I'),
                   eNalRefIdc,
                   iSliceSize);
#endif//SLICE_INFO_OUTPUT
          ++ iNalIdxInLayer;
          ++ iSliceIdx;
        }

        pLayerBsInfo->uiLayerType       = VIDEO_CODING_LAYER;
        pLayerBsInfo->uiSpatialId       = iCurDid;
        pLayerBsInfo->uiTemporalId      = iCurTid;
        pLayerBsInfo->uiQualityId       = 0;
        pLayerBsInfo->iNalCount         = iNalIdxInLayer;
        pLayerBsInfo->eFrameType        = eFrameType;
        pLayerBsInfo->iSubSeqId         = GetSubSequenceId (pCtx, eFrameType);
      }
    }

    if (NULL != pCtx->pFuncList->pfRc.pfWelsRcPostFrameSkipping
        && pCtx->pFuncList->pfRc.pfWelsRcPostFrameSkipping (pCtx, iCurDid, pFbi->uiTimeStamp)) {

      StackBackEncoderStatus (pCtx, eFrameType);
      ClearFrameBsInfo (pCtx, pFbi);

      iFrameSize = 0;
      iLayerSize = 0;
      iLayerNum = 0;

      if (pCtx->pFuncList->pfRc.pfWelsUpdateBufferWhenSkip) {
        pCtx->pFuncList->pfRc.pfWelsUpdateBufferWhenSkip (pCtx, iSpatialNum);
      }

      WelsRcPostFrameSkippedUpdate (pCtx, iCurDid);
      pCtx->iEncoderError = ENC_RETURN_SUCCESS;
      return ENC_RETURN_SUCCESS;
    }

    // deblocking filter
    if (
      (!pCtx->pCurDqLayer->bDeblockingParallelFlag) &&
#if !defined(ENABLE_FRAME_DUMP)
      ((eNalRefIdc != NRI_PRI_LOWEST) && (pSvcParam->sDependencyLayers[iCurDid].iHighestTemporalId == 0
                                          || iCurTid < pSvcParam->sDependencyLayers[iCurDid].iHighestTemporalId)) &&
#endif//!ENABLE_FRAME_DUMP
      true
    ) {
      PerformDeblockingFilter (pCtx);
    }

    pCtx->pFuncList->pfRc.pfWelsRcPictureInfoUpdate (pCtx, iLayerSize);
    iFrameSize += iLayerSize;
    RcTraceFrameBits (pCtx, pFbi->uiTimeStamp, iFrameSize);
    pCtx->pDecPic->iFrameAverageQp = pCtx->pWelsSvcRc[iCurDid].iAverageFrameQp;

    //update scc related
    pCtx->pFuncList->pfUpdateFMESwitch (pCtx->pCurDqLayer);

    // reference picture list update
    if (eNalRefIdc != NRI_PRI_LOWEST) {
      if (!pCtx->pReferenceStrategy->UpdateRefList()) {
        WelsLog (pLogCtx, WELS_LOG_WARNING, "WelsEncoderEncodeExt(), WelsUpdateRefList failed. ForceCodingIDR!");
        //the above is to set the next frame to be IDR
        pCtx->iEncoderError = ENC_RETURN_CORRECTED;
        break;
      }
    }


    //check MinCr
    {
      int32_t iMinCrFrameSize = (pParam->iVideoWidth * pParam->iVideoHeight * 3) >> 2; //MinCr = 2;
      if (pParam->uiLevelIdc == LEVEL_3_1 || pParam->uiLevelIdc == LEVEL_3_2 || pParam->uiLevelIdc == LEVEL_4_0)
        iMinCrFrameSize >>= 1; //MinCr = 4
      if (iFrameSize > iMinCrFrameSize)
        WelsLog (pLogCtx, WELS_LOG_WARNING,
                 "WelsEncoderEncodeExt()MinCr Checking,codec bitstream size is larger than Level limitation");
    }
#ifdef ENABLE_FRAME_DUMP
    {
      DumpDependencyRec (fsnr, &pSvcParam->sDependencyLayers[iCurDid].sRecFileName[0], iCurDid,
                         pCtx->bDependencyRecFlag[iCurDid], pCtx->pCurDqLayer, pSvcParam->bSimulcastAVC);
      pCtx->bDependencyRecFlag[iCurDid] = true;
    }
#endif//ENABLE_FRAME_DUMP

#if defined(ENABLE_PSNR_CALC)
    fSnrY = WelsCalcPsnr (fsnr->pData[0],
                          fsnr->iLineSize[0],
                          pEncPic->pData[0],
                          pEncPic->iLineSize[0],
                          iCurWidth,
                          iCurHeight);
    fSnrU = WelsCalcPsnr (fsnr->pData[1],
                          fsnr->iLineSize[1],
                          pEncPic->pData[1],
                          pEncPic->iLineSize[1],
                          (iCurWidth >> 1),
                          (iCurHeight >> 1));
    fSnrV = WelsCalcPsnr (fsnr->pData[2],
                          fsnr->iLineSize[2],
                          pEncPic->pData[2],
                          pEncPic->iLineSize[2],
                          (iCurWidth >> 1),
                          (iCurHeight >> 1));
#endif//ENABLE_PSNR_CALC

#if defined(LAYER_INFO_OUTPUT)
    fprintf (stderr, "%2s %5d: %-5d %2s   T%1d D%1d Q%-2d  QP%3d   Y%2.2f  U%2.2f  V%2.2f  %8d bits\n",
             (iSpatialIdx == 0) ? "#AU" : "   ",
             pParamInternal->iPOC,
             pParamInternal->iFrameNum,
             (eFrameType == videoFrameTypeI || eFrameType == videoFrameTypeIDR) ? "I" : "P",
             iCurTid,
             iCurDid,
             0,
             pCtx->pWelsSvcRc[pCtx->uiDependencyId].iAverageFrameQp,
             fSnrY,
             fSnrU,
             fSnrV,
             (iLayerSize << 3));
#endif//LAYER_INFO_OUTPUT

#if defined(STAT_OUTPUT)

#if defined(ENABLE_PSNR_CALC)
    {
      pCtx->sStatData[iCurDid][0].sQualityStat.rYPsnr[pCtx->eSliceType] += fSnrY;
      pCtx->sStatData[iCurDid][0].sQualityStat.rUPsnr[pCtx->eSliceType] += fSnrU;
      pCtx->sStatData[iCurDid][0].sQualityStat.rVPsnr[pCtx->eSliceType] += fSnrV;
    }
#endif//ENABLE_PSNR_CALC

#if defined(MB_TYPES_CHECK) //091025, frame output
    if (pCtx->eSliceType == P_SLICE) {
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Intra4x4] += pCtx->sPerInfo.iMbCount[P_SLICE][Intra4x4];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Intra16x16] += pCtx->sPerInfo.iMbCount[P_SLICE][Intra16x16];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Inter16x16] += pCtx->sPerInfo.iMbCount[P_SLICE][Inter16x16];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Inter16x8] += pCtx->sPerInfo.iMbCount[P_SLICE][Inter16x8];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Inter8x16] += pCtx->sPerInfo.iMbCount[P_SLICE][Inter8x16];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][Inter8x8] += pCtx->sPerInfo.iMbCount[P_SLICE][Inter8x8];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][PSkip] += pCtx->sPerInfo.iMbCount[P_SLICE][PSkip];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][8] += pCtx->sPerInfo.iMbCount[P_SLICE][8];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][9] += pCtx->sPerInfo.iMbCount[P_SLICE][9];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][10] += pCtx->sPerInfo.iMbCount[P_SLICE][10];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[P_SLICE][11] += pCtx->sPerInfo.iMbCount[P_SLICE][11];
    } else if (pCtx->eSliceType == I_SLICE) {
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[I_SLICE][Intra4x4] += pCtx->sPerInfo.iMbCount[I_SLICE][Intra4x4];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[I_SLICE][Intra16x16] += pCtx->sPerInfo.iMbCount[I_SLICE][Intra16x16];
      pCtx->sStatData[iCurDid][0].sSliceData.iMbCount[I_SLICE][7] += pCtx->sPerInfo.iMbCount[I_SLICE][7];
    }

    memset (pCtx->sPerInfo.iMbCount[P_SLICE], 0, 18 * sizeof (int32_t));
    memset (pCtx->sPerInfo.iMbCount[I_SLICE], 0, 18 * sizeof (int32_t));

#endif//MB_TYPES_CHECK
    {
      ++ pCtx->sStatData[iCurDid][0].sSliceData.iSliceCount[pCtx->eSliceType]; // for multiple slices coding
      pCtx->sStatData[iCurDid][0].sSliceData.iSliceSize[pCtx->eSliceType] += (iLayerSize << 3); // bits
    }
#endif//STAT_OUTPUT

    iCountNal = pLayerBsInfo->iNalCount;
    ++ iLayerNum;
    ++ pLayerBsInfo;
    ++ pCtx->pOut->iLayerBsIndex;
    pLayerBsInfo->pBsBuf = pCtx->pFrameBs + pCtx->iPosBsBuffer;
    pLayerBsInfo->pNalLengthInByte = (pLayerBsInfo - 1)->pNalLengthInByte + iCountNal;

    if (pSvcParam->iPaddingFlag && pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingSize > 0) {
      int32_t iPaddingNalSize = 0;
      pCtx->iEncoderError =  WritePadding (pCtx, pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingSize, iPaddingNalSize);
      WELS_VERIFY_RETURN_IFNEQ (pCtx->iEncoderError, ENC_RETURN_SUCCESS)

#if GOM_TRACE_FLAG
      WelsLog (pLogCtx, WELS_LOG_INFO, "[RC] dependency ID = %d,encoding_qp = %d Padding: %d", pCtx->uiDependencyId,
               pCtx->iGlobalQp,
               pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingSize);
#endif
      if (iPaddingNalSize <= 0)
        return ENC_RETURN_UNEXPECTED;

      pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingBitrateStat += pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingSize;

      pCtx->pWelsSvcRc[pCtx->uiDependencyId].iPaddingSize = 0;

      pLayerBsInfo->uiSpatialId         = 0;
      pLayerBsInfo->uiTemporalId        = 0;
      pLayerBsInfo->uiQualityId         = 0;
      pLayerBsInfo->uiLayerType         = NON_VIDEO_CODING_LAYER;
      pLayerBsInfo->iNalCount           = 1;
      pLayerBsInfo->pNalLengthInByte[0] = iPaddingNalSize;
      pLayerBsInfo->eFrameType          = eFrameType;
      pLayerBsInfo->iSubSeqId = GetSubSequenceId (pCtx, eFrameType);
      ++ pLayerBsInfo;
      ++ pCtx->pOut->iLayerBsIndex;
      pLayerBsInfo->pBsBuf           = pCtx->pFrameBs + pCtx->iPosBsBuffer;
      pLayerBsInfo->pNalLengthInByte = (pLayerBsInfo - 1)->pNalLengthInByte + 1;
      ++ iLayerNum;

      iFrameSize += iPaddingNalSize;
    }

    if ((pParam->sSliceArgument.uiSliceMode == SM_FIXEDSLCNUM_SLICE)
        && pSvcParam->bUseLoadBalancing
        && pSvcParam->iMultipleThreadIdc > 1 &&
        pSvcParam->iMultipleThreadIdc >= pParam->sSliceArgument.uiSliceNum) {
      CalcSliceComplexRatio (pCtx->pCurDqLayer);
#if defined(MT_DEBUG)
      TrackSliceComplexities (pCtx, iCurDid);
#endif//#if defined(MT_DEBUG)
    }

    pCtx->eLastNalPriority[iCurDid] = eNalRefIdc;
    ++ iSpatialIdx;

    if (iCurDid + 1 < pSvcParam->iSpatialLayerNum) {
      //for next layer, note that iSpatialIdx has been ++ so it is pointer to next layer
      WelsSwapDqLayers (pCtx, (pSpatialIndexMap + iSpatialIdx)->iDid);
    }

    if (pCtx->pVpp->UpdateSpatialPictures (pCtx, pSvcParam, iCurTid, iCurDid) != 0) {
      ForceCodingIDR (pCtx, iCurDid);
      WelsLog (pLogCtx, WELS_LOG_WARNING,
               "WelsEncoderEncodeExt(), Logic Error Found in Preprocess updating. ForceCodingIDR!");
      //the above is to set the next frame IDR
      pFbi->eFrameType = eFrameType;
      pLayerBsInfo->eFrameType = eFrameType;
      return ENC_RETURN_CORRECTED;
    }

    if (pSvcParam->bEnableLongTermReference && ((pCtx->pLtr[pCtx->uiDependencyId].bLTRMarkingFlag
        && (pCtx->pLtr[pCtx->uiDependencyId].iLTRMarkMode == LTR_DIRECT_MARK)) || eFrameType == videoFrameTypeIDR)) {
      pCtx->bRefOfCurTidIsLtr[iCurDid][iCurTid] = true;
    }
    if (pSvcParam->bSimulcastAVC)
      ++ pParamInternal->iCodingIndex;
  }//end of (iSpatialIdx/iSpatialNum)

  if (!pSvcParam->bSimulcastAVC) {
    for (int32_t i = 0; i < pSvcParam->iSpatialLayerNum; i++) {
      SSpatialLayerInternal* pParamInternal = &pSvcParam->sDependencyLayers[i];
      pParamInternal->iCodingIndex ++;
    }
  }

  if (ENC_RETURN_CORRECTED == pCtx->iEncoderError) {
    pCtx->pVpp->UpdateSpatialPictures (pCtx, pSvcParam, iCurTid, (pSpatialIndexMap + iSpatialIdx)->iDid);
    ForceCodingIDR (pCtx, (pSpatialIndexMap + iSpatialIdx)->iDid);
    WelsLog (pLogCtx, WELS_LOG_ERROR, "WelsEncoderEncodeExt(), Logic Error Found in temporal level. ForceCodingIDR!");
    //the above is to set the next frame IDR
    pFbi->eFrameType = eFrameType;
    pLayerBsInfo->eFrameType = eFrameType;
    return ENC_RETURN_CORRECTED;
  }

#if defined(MT_DEBUG)
  TrackSliceConsumeTime (pCtx, iDidList, iSpatialNum);
#endif//MT_DEBUG

  // to check number of layers / nals / slices dependencies
  if (iLayerNum > MAX_LAYER_NUM_OF_FRAME) {
    WelsLog (& pCtx->sLogCtx, WELS_LOG_ERROR, "WelsEncoderEncodeExt(), iLayerNum(%d) > MAX_LAYER_NUM_OF_FRAME(%d)!",
             iLayerNum, MAX_LAYER_NUM_OF_FRAME);
    return 1;
  }


  pFbi->iLayerNum = iLayerNum;

  WelsLog (pLogCtx, WELS_LOG_DEBUG, "WelsEncoderEncodeExt() OutputInfo iLayerNum = %d,iFrameSize = %d",
           iLayerNum, iFrameSize);
  for (int32_t i = 0; i < iLayerNum; i++)
    WelsLog (pLogCtx, WELS_LOG_DEBUG,
             "WelsEncoderEncodeExt() OutputInfo iLayerId = %d,iNalType = %d,iNalCount = %d, first Nal Length=%d,uiSpatialId = %d,uiTemporalId = %d,iSubSeqId = %d",
             i,
             pFbi->sLayerInfo[i].uiLayerType, pFbi->sLayerInfo[i].iNalCount, pFbi->sLayerInfo[i].pNalLengthInByte[0],
             pFbi->sLayerInfo[i].uiSpatialId, pFbi->sLayerInfo[i].uiTemporalId, pFbi->sLayerInfo[i].iSubSeqId);
  WelsEmms();

  pLayerBsInfo->eFrameType = eFrameType;
  pFbi->iFrameSizeInBytes = iFrameSize;
  pFbi->eFrameType = eFrameType;
  for (int32_t k = 0; k < pFbi->iLayerNum; k++) {
    if (pFbi->eFrameType != pFbi->sLayerInfo[k].eFrameType) {
      pFbi->eFrameType = videoFrameTypeIPMixed;
    }
  }
#ifdef _DEBUG
  if (pFbi->iLayerNum > MAX_LAYER_NUM_OF_FRAME) {
    WelsLog (& pCtx->sLogCtx, WELS_LOG_ERROR, "WelsEncoderEncodeExt(), iLayerNum(%d) > MAX_LAYER_NUM_OF_FRAME(%d)!",
             pFbi->iLayerNum, MAX_LAYER_NUM_OF_FRAME);
    return ENC_RETURN_UNEXPECTED;
  }

  int32_t iTotalNal = 0;
  for (int32_t k = 0; k < pFbi->iLayerNum; k++) {
    iTotalNal += pFbi->sLayerInfo[k].iNalCount;

    if ((pCtx->iActiveThreadsNum > 1) && (MAX_NAL_UNITS_IN_LAYER < pFbi->sLayerInfo[k].iNalCount)) {
      WelsLog (& pCtx->sLogCtx, WELS_LOG_ERROR,
               "WelsEncoderEncodeExt(), iCountNumNals(%d) > MAX_NAL_UNITS_IN_LAYER(%d) under multi-thread(%d) NOT supported!",
               pFbi->sLayerInfo[k].iNalCount, MAX_NAL_UNITS_IN_LAYER, pCtx->iActiveThreadsNum);
      return ENC_RETURN_UNEXPECTED;
    }
  }

  if (iTotalNal > pCtx->pOut->iCountNals) {
    WelsLog (& pCtx->sLogCtx, WELS_LOG_ERROR, "WelsEncoderEncodeExt(), iTotalNal(%d) > iCountNals(%d)!",
             iTotalNal, pCtx->pOut->iCountNals);
    return ENC_RETURN_UNEXPECTED;
  }
#endif
  return ENC_RETURN_SUCCESS;
}