Analysis of the Android Camera2 createCaptureSession Flow

This analysis is based on Android P.

createCaptureSession sequence diagram: (figure omitted)

Relationships among the classes involved in the whole flow: (figure omitted)

When openCamera finishes, the onOpened() callback fires and hands back a CameraDevice instance, which means openCamera has completed; the next step is to call createCaptureSession().

With the obtained CameraDevice instance we create a session, which establishes the channel between the application and the framework.

1.CameraDeviceImpl.createCaptureSession()

@Override
public void createCaptureSession(List<Surface> outputs,
                                 CameraCaptureSession.StateCallback callback, Handler handler)
        throws CameraAccessException {
    List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
    for (Surface surface : outputs) {
        outConfigurations.add(new OutputConfiguration(surface));
    }
    createCaptureSessionInternal(null, outConfigurations, callback,
            checkAndWrapHandler(handler), /*operatingMode*/ICameraDeviceUser.NORMAL_MODE,
            /*sessionParams*/ null);
}

The first parameter is a List of Surfaces. The Surfaces are what the streams are created from; in the common case, with no special requirements, we only need two Surfaces: one for preview and one for still capture.

The preview Surface backs the camera preview area: during buffer rotation, the preview buffers are obtained from this Surface. This Surface must be valid, otherwise session creation fails and the preview area stays black.

For the capture Surface we usually use an ImageReader. ImageReader is a system-provided class whose creation already sets up a Surface for us, so we simply use that Surface as the capture Surface; a minimal sketch is shown below.
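
As a reference, here is a minimal, hypothetical sketch of obtaining a capture Surface from an ImageReader; the size, format, and the backgroundHandler are illustrative assumptions, not values taken from this article.

import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

class CaptureSurfaceFactory {
    // Returns a Surface backed by an ImageReader, suitable as the still-capture output.
    // Note: the caller should keep a reference to the ImageReader alive while capturing.
    static Surface createCaptureSurface(Handler backgroundHandler) {
        // Hypothetical size; in real code pick one from
        // StreamConfigurationMap#getOutputSizes(ImageFormat.JPEG).
        ImageReader reader = ImageReader.newInstance(4000, 3000, ImageFormat.JPEG, /*maxImages*/ 2);
        reader.setOnImageAvailableListener(r -> {
            try (Image image = r.acquireNextImage()) {
                // Consume the JPEG buffer here (e.g. save it to a file).
            }
        }, backgroundHandler);
        return reader.getSurface();
    }
}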

The second parameter, StateCallback, works just like in openCamera: once the session is created successfully, the framework calls back through this interface's public abstract void onConfigured(@NonNull CameraCaptureSession session) method and hands us a CameraCaptureSession object; the actual implementation is a CameraCaptureSessionImpl.

The third parameter, Handler, plays the same role as in openCamera: it keeps the callbacks on the expected thread. Whichever worker thread of the app process calls createCaptureSession, the framework posts its callbacks through this handler onto that handler thread's Looper; the sketch after this paragraph shows how these three parameters typically come together in application code.
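
To tie the three parameters together, here is a minimal sketch of a typical application-side call; previewSurface (from a SurfaceView/TextureView), captureSurface (from an ImageReader as above) and backgroundHandler are assumed to exist and are not taken from this article.

import java.util.Arrays;
import java.util.List;

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.os.Handler;
import android.view.Surface;

class SessionStarter {
    static void createSession(CameraDevice cameraDevice, Surface previewSurface,
            Surface captureSurface, Handler backgroundHandler) throws CameraAccessException {
        List<Surface> outputs = Arrays.asList(previewSurface, captureSurface);
        cameraDevice.createCaptureSession(outputs, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                // The session is ready; preview/capture requests can be submitted now.
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                // Usually caused by an invalid Surface or an unsupported stream combination.
            }
        }, backgroundHandler);
    }
}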

In this method, the Surfaces to be bound are passed into CameraDeviceImpl as a List; each Surface is then wrapped in an OutputConfiguration object and collected into a List:

new OutputConfiguration(surface);
package android.hardware.camera2;
​
interface ICameraDeviceUser
{
    ......
    /**
     * The standard operating mode for a camera device; all API guarantees are in force
     */
    const int NORMAL_MODE = 0;
    
    /**
     * High-speed recording mode; only two outputs targeting preview and video recording may be
     * used, and requests must be batched.
     */
    const int CONSTRAINED_HIGH_SPEED_MODE = 1;
}

The OutputConfiguration constructor:

@SystemApi
public static final int ROTATION_0 = 0;
​
@SystemApi
public static final int ROTATION_90 = 1;
​
@SystemApi
public static final int ROTATION_180 = 2;
​
@SystemApi
public static final int ROTATION_270 = 3;
​
/**
 * Invalid surface group ID.
 *
 *<p>An {@link OutputConfiguration} with this value indicates that the included surface
 *doesn't belong to any surface group.</p>
 */
public static final int SURFACE_GROUP_ID_NONE = -1;
​
public OutputConfiguration(@NonNull Surface surface) {
    this(SURFACE_GROUP_ID_NONE, surface, ROTATION_0);
}
​
@SystemApi
public OutputConfiguration(int surfaceGroupId, @NonNull Surface surface, int rotation) {
    checkNotNull(surface, "Surface must not be null");
    checkArgumentInRange(rotation, ROTATION_0, ROTATION_270, "Rotation constant");
    mSurfaceGroupId = surfaceGroupId;
    mSurfaceType = SURFACE_TYPE_UNKNOWN;
    mSurfaces = new ArrayList<Surface>();
    mSurfaces.add(surface);
    mRotation = rotation;
    mConfiguredSize = SurfaceUtils.getSurfaceSize(surface);
    mConfiguredFormat = SurfaceUtils.getSurfaceFormat(surface);
    mConfiguredDataspace = SurfaceUtils.getSurfaceDataspace(surface);
    mConfiguredGenerationId = surface.getGenerationId();
    mIsDeferredConfig = false;
    mIsShared = false;
    mPhysicalCameraId = null;
}

OutputConfiguration is a class that describes one camera output: it carries the Surface plus the settings specific to how the capture session uses that output.

Next, createCaptureSessionInternal() is called:

private void createCaptureSessionInternal(InputConfiguration inputConfig,
                                          List<OutputConfiguration> outputConfigurations,
                                          CameraCaptureSession.StateCallback callback, Executor executor,
                                          int operatingMode, CaptureRequest sessionParams) throws CameraAccessException {
    synchronized(mInterfaceLock) {
        if (DEBUG) {
            Log.d(TAG, "createCaptureSessionInternal");
        }
        // Check for error conditions, e.g. the camera has already been closed or is in an error state
        checkIfCameraClosedOrInError();
​
        boolean isConstrainedHighSpeed =
                (operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
        // Constrained high-speed restriction; in this flow inputConfig is null, so this branch is not taken
        if (isConstrainedHighSpeed && inputConfig != null) {
            throw new IllegalArgumentException("Constrained high speed session doesn't support"
                    + " input configuration yet.");
        }
​
        // Notify current session that it's going away, before starting camera operations
        // After this call completes, the session is not allowed to call into CameraDeviceImpl
        if (mCurrentSession != null) {
            mCurrentSession.replaceSessionClose();
        }
​
        // TODO: dont block for this
        boolean configureSuccess = true;
        CameraAccessException pendingException = null;
        Surface input = null;
        try {
            // Configure the surfaces; configureSuccess is true if configuration succeeds, false otherwise
            // configure streams and then block until IDLE
            configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
                    operatingMode, sessionParams);
            if (configureSuccess == true && inputConfig != null) {
                input = mRemoteDevice.getInputSurface();
            }
        } catch (CameraAccessException e) {
            configureSuccess = false;
            pendingException = e;
            input = null;
            if (DEBUG) {
                Log.v(TAG, "createCaptureSession - failed with exception ", e);
            }
        }
​
        // Fire onConfigured if configureOutputs succeeded, fire onConfigureFailed otherwise.
        CameraCaptureSessionCore newSession = null;
        if (isConstrainedHighSpeed) {
            ArrayList<Surface> surfaces = new ArrayList<>(outputConfigurations.size());
            for (OutputConfiguration outConfig : outputConfigurations) {
                surfaces.add(outConfig.getSurface());
            }
            StreamConfigurationMap config =
                    getCharacteristics().get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            SurfaceUtils.checkConstrainedHighSpeedSurfaces(surfaces, /*fpsRange*/null, config);
​
            newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
                    callback, executor, this, mDeviceExecutor, configureSuccess,
                    mCharacteristics);
        } else {
            newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
                    callback, executor, this, mDeviceExecutor, configureSuccess);
        }
​
        // TODO: wait until current session closes, then create the new session
        mCurrentSession = newSession;
​
        if (pendingException != null) {
            throw pendingException;
        }
​
        mSessionStateCallback = mCurrentSession.getDeviceStateCallback();
    }
}

1.1 mCurrentSession.replaceSessionClose()

if (mCurrentSession != null) {
    mCurrentSession.replaceSessionClose();
}

This checks whether the current session is null; if it is not, CameraCaptureSessionCore.replaceSessionClose() is called.

CameraCaptureSessionCore is an interface; its implementation is CameraCaptureSessionImpl.

  • replaceSessionClose() calls close();

  • close() calls CameraDeviceImpl.stopRepeating();

  • stopRepeating() calls ICameraDeviceUser.cancelRequest();

  • As the camera open flow already showed, the other end of this AIDL interface is CameraDeviceClient, so its cancelRequest() is invoked;

    Status CameraService::connectDevice(
            const sp<hardware::camera2::ICameraDeviceCallbacks>& cameraCb,
            const String16& cameraId,
            const String16& clientPackageName,
            int clientUid,
            /*out*/
            sp<hardware::camera2::ICameraDeviceUser>* device) {
        sp<CameraDeviceClient> client = nullptr;
        ......
        *device = client;
        return ret;
    }

    The snippet above shows that the C++-side implementation of ICameraDeviceUser.aidl is CameraDeviceClient.cpp, and the binding is established in CameraService::connectDevice();

  • Camera3Device::clearStreamingRequest() is then called, which ultimately invokes a clear method; clear() empties the repeating-request list and then iterates over the queue and clears its buffers.

Once these steps have completed, the previously running session has been closed.

2.configureStreamsChecked()

if (outputs == null) {
    outputs = new ArrayList<OutputConfiguration>();
}
if (outputs.size() == 0 && inputConfig != null) {
    throw new IllegalArgumentException("cannot configure an input stream without " +
            "any output streams");
}
​
checkInputConfiguration(inputConfig);
// Check the camera state; if the camera is closed or in error, an exception is thrown
checkIfCameraClosedOrInError();
// Streams to create
// addSet is the set of output streams that are about to be created
HashSet<OutputConfiguration> addSet = new HashSet<OutputConfiguration>(outputs);
// Streams to delete
// deleteList is the list of streamIds that are about to be deleted
List<Integer> deleteList = new ArrayList<Integer>();
​
// Determine which streams need to be created, which to be deleted
// Make sure the output streams currently held in mConfiguredOutputs are up to date and usable
for (int i = 0; i < mConfiguredOutputs.size(); ++i) {
    int streamId = mConfiguredOutputs.keyAt(i);
    OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);
​
    // isDeferredConfiguration(): whether this configured output uses a deferred surface
    if (!outputs.contains(outConfig) || outConfig.isDeferredConfiguration()) {
        // Always delete the deferred output configuration when the session
        // is created, as the deferred output configuration doesn't have unique surface
        // related identifies.
        deleteList.add(streamId);
    } else {
        addSet.remove(outConfig);  // Don't create a stream previously created
    }
}

This loop walks the currently configured outputs and uses the two collections, addSet and deleteList, to stage the updates: it determines which streams need to be created and which need to be deleted.

  • addSet: initialized with the OutputConfigurations of the Surfaces passed into createCaptureSession;
  • deleteList: initialized as an empty list;

Both collections are used later, when mConfiguredOutputs is updated; that logic runs after mRemoteDevice.beginConfigure().

// Delete all streams first (to free up HW resources)
for (Integer streamId : deleteList) {
    mRemoteDevice.deleteStream(streamId);
    mConfiguredOutputs.delete(streamId);
}
​
// Add all new streams
for (OutputConfiguration outConfig : outputs) {
    if (addSet.contains(outConfig)) {
        int streamId = mRemoteDevice.createStream(outConfig);
        mConfiguredOutputs.put(streamId, outConfig);
    }
}

Together, the logic above validates the output/input configurations and works out how the configured stream set has to change.

2.1 mDeviceExecutor.execute(mCallOnBusy)

// Signal that surface configuration is about to begin
mDeviceExecutor.execute(mCallOnBusy);
// Stop any repeating request of the currently running session
stopRepeating();

private final Runnable mCallOnBusy = new Runnable() {
    @Override
    public void run() {
        StateCallbackKK sessionCallback = null;
        synchronized(mInterfaceLock) {
            if (mRemoteDevice == null) return; // Camera already closed
​
            sessionCallback = mSessionStateCallback;
        }
        if (sessionCallback != null) {
            sessionCallback.onBusy(CameraDeviceImpl.this);
        }
    }
};

mCallOnBusy synchronizes on mInterfaceLock, the same lock held by configureStreamsChecked(), so its run() body effectively waits until configureStreamsChecked() has finished. StateCallbackKK is a static abstract inner class of CameraDeviceImpl that carries session-related state callbacks; it is not the same as CameraCaptureSession.StateCallback.

2.2 waitUntilIdle()

private void waitUntilIdle() throws CameraAccessException {
​
    synchronized(mInterfaceLock) {
        checkIfCameraClosedOrInError();
​
        if (mRepeatingRequestId != REQUEST_ID_NONE) {
            throw new IllegalStateException("Active repeating request ongoing");
        }
​
        mRemoteDevice.waitUntilIdle();
    }
}

This method is used when there are outstanding stream requests: it waits until the device becomes idle.

mRemoteDevice is of type ICameraDeviceUserWrapper, and the mRemoteDevice inside ICameraDeviceUserWrapper is the ICameraDeviceUser binder proxy whose server side is CameraDeviceClient; so calling mRemoteDevice.waitUntilIdle() ends up in waitUntilIdle() in CameraDeviceClient.cpp.

private ICameraDeviceUserWrapper mRemoteDevice;
public class ICameraDeviceUserWrapper {
    private final ICameraDeviceUser mRemoteDevice;
    
    public ICameraDeviceUserWrapper(ICameraDeviceUser remoteDevice) {
        if (remoteDevice == null) {
            throw new NullPointerException("Remote device may not be null");
        }
        mRemoteDevice = remoteDevice;
    }
}
binder::Status CameraDeviceClient::waitUntilIdle()
{
    ATRACE_CALL();
    ALOGV("%s", __FUNCTION__);
​
    binder::Status res;
    if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
​
    Mutex::Autolock icl(mBinderSerializationLock);
​
    if (!mDevice.get()) {
        return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
    }
​
    // FIXME: Also need check repeating burst.
    Mutex::Autolock idLock(mStreamingRequestIdLock);
    if (mStreamingRequestId != REQUEST_ID_NONE) {
        String8 msg = String8::format(
            "Camera %s: Try to waitUntilIdle when there are active streaming requests",
            mCameraIdStr.string());
        ALOGE("%s: %s", __FUNCTION__, msg.string());
        return STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
    }
    status_t err = mDevice->waitUntilDrained();
    if (err != OK) {
        res = STATUS_ERROR_FMT(CameraService::ERROR_INVALID_OPERATION,
                "Camera %s: Error waiting to drain: %s (%d)",
                mCameraIdStr.string(), strerror(-err), err);
    }
    ALOGV("%s Done", __FUNCTION__);
    return res;
}

Finally, CameraDeviceClient calls mDevice->waitUntilDrained(); mDevice is a Camera3Device.

status_t Camera3Device::waitUntilDrained() {
    ATRACE_CALL();
    Mutex::Autolock il(mInterfaceLock);
    nsecs_t maxExpectedDuration = getExpectedInFlightDuration();
    Mutex::Autolock l(mLock);
​
    return waitUntilDrainedLocked(maxExpectedDuration);
}
​
status_t Camera3Device::waitUntilDrainedLocked(nsecs_t maxExpectedDuration) {
    switch (mStatus) {
        case STATUS_UNINITIALIZED:
        case STATUS_UNCONFIGURED:
            ALOGV("%s: Already idle", __FUNCTION__);
            return OK;
        case STATUS_CONFIGURED:
            // To avoid race conditions, check with tracker to be sure
        case STATUS_ERROR:
        case STATUS_ACTIVE:
            // Need to verify shut down
            break;
        default:
            SET_ERR_L("Unexpected status: %d",mStatus);
            return INVALID_OPERATION;
    }
    ALOGV("%s: Camera %s: Waiting until idle (%" PRIi64 "ns)", __FUNCTION__, mId.string(),
            maxExpectedDuration);
    status_t res = waitUntilStateThenRelock(/*active*/ false, maxExpectedDuration);
    if (res != OK) {
        SET_ERR_L("Error waiting for HAL to drain: %s (%d)", strerror(-res),
                res);
    }
    return res;
}
status_t Camera3Device::waitUntilStateThenRelock(bool active, nsecs_t timeout) {
    status_t res = OK;
​
    size_t startIndex = 0;
    if (mStatusWaiters == 0) {
        // Clear the list of recent statuses if there are no existing threads waiting on updates to
        // this status list
        mRecentStatusUpdates.clear();
    } else {
        // If other threads are waiting on updates to this status list, set the position of the
        // first element that this list will check rather than clearing the list.
        startIndex = mRecentStatusUpdates.size();
    }
​
    mStatusWaiters++;
​
    bool stateSeen = false;
    do {
        if (active == (mStatus == STATUS_ACTIVE)) {
            // Desired state is current
            break;
        }
​
        // Block and wait for a status change
        res = mStatusChanged.waitRelative(mLock, timeout);
        if (res != OK) break;
​
        // This is impossible, but if not, could result in subtle deadlocks and invalid state
        // transitions.
        LOG_ALWAYS_FATAL_IF(startIndex > mRecentStatusUpdates.size(),
                "%s: Skipping status updates in Camera3Device, may result in deadlock.",
                __FUNCTION__);
​
        // Encountered desired state since we began waiting
        for (size_t i = startIndex; i < mRecentStatusUpdates.size(); i++) {
            if (active == (mRecentStatusUpdates[i] == STATUS_ACTIVE) ) {
                stateSeen = true;
                break;
            }
        }
    } while (!stateSeen);
​
    mStatusWaiters--;
​
    return res;
}

The core of this method is the do...while loop: it blocks on mStatusChanged until the desired state has been observed. For waitUntilDrained() the caller passes active == false, so the loop exits once the device is seen to be idle rather than ACTIVE. An analogous plain-Java sketch of this wait-until-state idiom follows.
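
The sketch below is only an analogy in plain Java, not the Camera3Device code: it illustrates the same idiom of blocking on a condition variable and scanning only the status updates that arrived after the wait began (the class and method names are invented for illustration).

import java.util.ArrayList;
import java.util.List;

class StatusWaiter {
    private final Object lock = new Object();
    private final List<Boolean> recentActiveUpdates = new ArrayList<>();

    // Called by the "device" thread whenever the active/idle state changes.
    void onStatusChanged(boolean active) {
        synchronized (lock) {
            recentActiveUpdates.add(active);
            lock.notifyAll();
        }
    }

    // Blocks until the desired state is observed among the updates recorded
    // after this call started; returns false on timeout.
    boolean waitForState(boolean wantActive, long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        synchronized (lock) {
            int startIndex = recentActiveUpdates.size();
            while (true) {
                for (int i = startIndex; i < recentActiveUpdates.size(); i++) {
                    if (recentActiveUpdates.get(i) == wantActive) {
                        return true;
                    }
                }
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    return false;
                }
                lock.wait(remaining);
            }
        }
    }
}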

2.3 CameraDeviceClient.createStream(outConfig)

Everything between mRemoteDevice.beginConfigure() and mRemoteDevice.endConfigure(operatingMode, null) is IPC that tells the service side that input/output streams are currently being configured.

mRemoteDevice.beginConfigure();
// inputConfig is usually null, so we can go straight to createStream
// reconfigure the input stream if the input configuration is different.
InputConfiguration currentInputConfig = mConfiguredInput.getValue();
if (inputConfig != currentInputConfig &&
        (inputConfig == null || !inputConfig.equals(currentInputConfig))) {
    if (currentInputConfig != null) {
        mRemoteDevice.deleteStream(mConfiguredInput.getKey());
        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                REQUEST_ID_NONE, null);
    }
    if (inputConfig != null) {
        int streamId = mRemoteDevice.createInputStream(inputConfig.getWidth(),
                inputConfig.getHeight(), inputConfig.getFormat());
        mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
                streamId, inputConfig);
    }
}
​
// Apply deleteList and addSet to update mConfiguredOutputs
for (Integer streamId : deleteList) {
    mRemoteDevice.deleteStream(streamId);
    mConfiguredOutputs.delete(streamId);
}
​
for (OutputConfiguration outConfig : outputs) {
    if (addSet.contains(outConfig)) {
        int streamId = mRemoteDevice.createStream(outConfig);
        mConfiguredOutputs.put(streamId, outConfig);
    }
}
​
if (sessionParams != null) {
    mRemoteDevice.endConfigure(operatingMode, sessionParams.getNativeCopy());
} else {
    mRemoteDevice.endConfigure(operatingMode, null);
}
​
success = true;
for (OutputConfiguration outConfig : outputs) {
    if (addSet.contains(outConfig)) {
        int streamId = mRemoteDevice.createStream(outConfig);
        mConfiguredOutputs.put(streamId, outConfig);
    }
}

This iterates over all outConfigs (the Surfaces passed into createCaptureSession), creates a stream for each, and stores each streamId/outConfig pair in the mConfiguredOutputs container.

binder::Status CameraDeviceClient::createStream(
        const hardware::camera2::params::OutputConfiguration &outputConfiguration,
        /*out*/
        int32_t* newStreamId) {
    ATRACE_CALL();
​
    // A series of state/validity checks
    binder::Status res;
    if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
​
    Mutex::Autolock icl(mBinderSerializationLock);
​
    // Get the GraphicBufferProducers from which the corresponding Surfaces will be created
    const std::vector<sp<IGraphicBufferProducer>>& bufferProducers =
            outputConfiguration.getGraphicBufferProducers();
    size_t numBufferProducers = bufferProducers.size();
    bool deferredConsumer = outputConfiguration.isDeferred();
    bool isShared = outputConfiguration.isShared();
    String8 physicalCameraId = String8(outputConfiguration.getPhysicalCameraId());
​
    // Must not exceed the maximum supported number of surfaces per stream
    // MAX_SURFACES_PER_STREAM is defined in CameraDeviceClient.h with a value of 2
    if (numBufferProducers > MAX_SURFACES_PER_STREAM) {
        ALOGE("%s: GraphicBufferProducer count %zu for stream exceeds limit of %d",
              __FUNCTION__, bufferProducers.size(), MAX_SURFACES_PER_STREAM);
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, "Surface count is too high");
    }
    bool deferredConsumerOnly = deferredConsumer && numBufferProducers == 0;
    int surfaceType = outputConfiguration.getSurfaceType();
    bool validSurfaceType = ((surfaceType == OutputConfiguration::SURFACE_TYPE_SURFACE_VIEW) ||
            (surfaceType == OutputConfiguration::SURFACE_TYPE_SURFACE_TEXTURE));
​
    if (deferredConsumer && !validSurfaceType) {
        ALOGE("%s: Target surface is invalid: bufferProducer = %p, surfaceType = %d.",
                __FUNCTION__, bufferProducers[0].get(), surfaceType);
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, "Target Surface is invalid");
    }
​
    if (!mDevice.get()) {
        return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
    }
​
    if (physicalCameraId.size() > 0) {
        std::vector<std::string> physicalCameraIds;
        std::string physicalId(physicalCameraId.string());
        bool logicalCamera =
                CameraProviderManager::isLogicalCamera(mDevice->info(), &physicalCameraIds);
        if (!logicalCamera ||
                std::find(physicalCameraIds.begin(), physicalCameraIds.end(), physicalId) ==
                physicalCameraIds.end()) {
            String8 msg = String8::format("Camera %s: Camera doesn't support physicalCameraId %s.",
                    mCameraIdStr.string(), physicalCameraId.string());
            ALOGE("%s: %s", __FUNCTION__, msg.string());
            return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
        }
    }
    std::vector<sp<Surface>> surfaces;
    std::vector<sp<IBinder>> binders;
    status_t err;
​
    // Create stream for deferred surface case.
    // Create the stream for the deferred-surface case
    if (deferredConsumerOnly) {
        return createDeferredSurfaceStreamLocked(outputConfiguration, isShared, newStreamId);
    }
​
    OutputStreamInfo streamInfo;
    
    // Iterate over the bufferProducers; this part ties into Android's Surface/BufferQueue machinery
    bool isStreamInfoValid = false;
    for (auto& bufferProducer : bufferProducers) {
        // Don't create multiple streams for the same target surface
        sp<IBinder> binder = IInterface::asBinder(bufferProducer);
        ssize_t index = mStreamMap.indexOfKey(binder);
        if (index != NAME_NOT_FOUND) {
            String8 msg = String8::format("Camera %s: Surface already has a stream created for it "
                    "(ID %zd)", mCameraIdStr.string(), index);
            ALOGW("%s: %s", __FUNCTION__, msg.string());
            return STATUS_ERROR(CameraService::ERROR_ALREADY_EXISTS, msg.string());
        }
​
        sp<Surface> surface;
        // streamInfo is also filled in at this step, together with the surface
        res = createSurfaceFromGbp(streamInfo, isStreamInfoValid, surface, bufferProducer);
​
        if (!res.isOk())
            return res;
​
        if (!isStreamInfoValid) {
            isStreamInfoValid = true;
        }
​
        binders.push_back(IInterface::asBinder(bufferProducer));
        // These surfaces are presumably what later shows up as mConsumer
        surfaces.push_back(surface);
    }
​
    // Core step: call Camera3Device::createStream()
    int streamId = camera3::CAMERA3_STREAM_ID_INVALID;
    std::vector<int> surfaceIds;
    err = mDevice->createStream(surfaces, deferredConsumer, streamInfo.width,
            streamInfo.height, streamInfo.format, streamInfo.dataSpace,
            static_cast<camera3_stream_rotation_t>(outputConfiguration.getRotation()),
            &streamId, physicalCameraId, &surfaceIds, outputConfiguration.getSurfaceSetID(),
            isShared);
​
    if (err != OK) {
        res = STATUS_ERROR_FMT(CameraService::ERROR_INVALID_OPERATION,
                "Camera %s: Error creating output stream (%d x %d, fmt %x, dataSpace %x): %s (%d)",
                mCameraIdStr.string(), streamInfo.width, streamInfo.height, streamInfo.format,
                streamInfo.dataSpace, strerror(-err), err);
    } else {
        int i = 0;
        for (auto& binder : binders) {
            ALOGV("%s: mStreamMap add binder %p streamId %d, surfaceId %d",
                    __FUNCTION__, binder.get(), streamId, i);
            mStreamMap.add(binder, StreamSurfaceId(streamId, surfaceIds[i]));
            i++;
        }
​
        mConfiguredOutputs.add(streamId, outputConfiguration);
        mStreamInfoMap[streamId] = streamInfo;
​
        ALOGV("%s: Camera %s: Successfully created a new stream ID %d for output surface"
                    " (%d x %d) with format 0x%x.",
                  __FUNCTION__, mCameraIdStr.string(), streamId, streamInfo.width,
                  streamInfo.height, streamInfo.format);
​
        // Set transform flags to ensure preview to be rotated correctly.
        res = setStreamTransformLocked(streamId);
​
        *newStreamId = streamId;
    }
​
    return res;
}

The key parts of this method are the for (auto& bufferProducer : bufferProducers) loop and the mDevice->createStream() call.

const std::vector<sp<IGraphicBufferProducer>>& bufferProducers =
        outputConfiguration.getGraphicBufferProducers();

Inside the for loop, the GraphicBufferProducers obtained from outputConfiguration.getGraphicBufferProducers() are used to create the corresponding Surfaces.

for (auto& bufferProducer : bufferProducers) {
    ......
    sp<Surface> surface;
    res = createSurfaceFromGbp(streamInfo, isStreamInfoValid, surface, bufferProducer);
            
    if (!res.isOk())
        return res;
​
    if (!isStreamInfoValid) {
        isStreamInfoValid = true;
    }
​
    // bufferProducer is the producer side, i.e. the Surface handed down from the application
    binders.push_back(IInterface::asBinder(bufferProducer));
    // surface here is the consumer-side Surface wrapped around that bufferProducer, created by createSurfaceFromGbp()
    surfaces.push_back(surface);
}

Each of these Surface objects is also validated; the valid ones are added to the surfaces vector, and then mDevice->createStream() is called to carry out the actual stream creation.

2.4 Camera3Device::createStream()

status_t Camera3Device::createStream(const std::vector<sp<Surface>>& consumers,
        bool hasDeferredConsumer, uint32_t width, uint32_t height, int format,
        android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
        const String8& physicalCameraId,
        std::vector<int> *surfaceIds, int streamSetId, bool isShared, uint64_t consumerUsage) {
    ......
}

The first parameter, consumers, is the vector of Surfaces built from the GraphicBufferProducers returned by outputConfiguration.getGraphicBufferProducers(), i.e. the surfaces collected via surfaces.push_back(surface).

The third and fourth parameters, width and height, are the dimensions to configure for the surface, passed in as streamInfo.width and streamInfo.height.

The fifth parameter, format, is the surface format (one of the pixel formats listed further below); it can loosely be thought of as the camera data format.

The sixth parameter, dataSpace, of type android_dataspace, describes how the buffer contents should be interpreted (their color space/encoding) as buffers rotate through the pipeline.

The seventh parameter, rotation, is the rotation to apply.

Two parts of createStream() are worth analyzing:

status_t res;
bool wasActive = false;
​
switch (mStatus) {
    case STATUS_ERROR:
        CLOGE("Device has encountered a serious error");
        return INVALID_OPERATION;
    case STATUS_UNINITIALIZED:
        CLOGE("Device not initialized");
        return INVALID_OPERATION;
    case STATUS_UNCONFIGURED:
    case STATUS_CONFIGURED:
        // OK
        break;
    case STATUS_ACTIVE:
        ALOGV("%s: Stopping activity to reconfigure streams", __FUNCTION__);
        res = internalPauseAndWaitLocked(maxExpectedDuration);
        if (res != OK) {
            SET_ERR_L("Can't pause captures to reconfigure streams!");
            return res;
        }
        wasActive = true;
        break;
    default:
        SET_ERR_L("Unexpected status: %d", mStatus);
        return INVALID_OPERATION;
}
assert(mStatus != STATUS_ACTIVE);
​
sp<Camera3OutputStream> newStream;
​
if (consumers.size() == 0 && !hasDeferredConsumer) {
    ALOGE("%s: Number of consumers cannot be smaller than 1", __FUNCTION__);
    return BAD_VALUE;
}
​
if (hasDeferredConsumer && format != HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) {
    ALOGE("Deferred consumer stream creation only support IMPLEMENTATION_DEFINED format");
    return BAD_VALUE;
}

The switch inspects mStatus. mStatus is assigned in Camera3Device::internalUpdateStatusLocked(); when the Camera3Device is first created, initialize() calls initializeCommonLocked(), and the resulting internalUpdateStatusLocked() call sets mStatus to STATUS_UNCONFIGURED, so the code above simply executes break and leaves the switch.

// HAL_PIXEL_FORMAT_BLOB stores BLOB data (e.g. JPEG) and corresponds to the capture stream; roughly, the stream behind the ImageReader surface
if (format == HAL_PIXEL_FORMAT_BLOB) {
    ssize_t blobBufferSize;
    if (dataSpace != HAL_DATASPACE_DEPTH) {
        blobBufferSize = getJpegBufferSize(width, height);
        if (blobBufferSize <= 0) {
            SET_ERR_L("Invalid jpeg buffer size %zd", blobBufferSize);
            return BAD_VALUE;
        }
    } else {
        blobBufferSize = getPointCloudBufferSize();
        if (blobBufferSize <= 0) {
            SET_ERR_L("Invalid point cloud buffer size %zd", blobBufferSize);
            return BAD_VALUE;
        }
    }
    newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
            width, height, blobBufferSize, format, dataSpace, rotation,
            mTimestampOffset, physicalCameraId, streamSetId);
// HAL_PIXEL_FORMAT_RAW_OPAQUE: opaque raw sensor data (preview streams normally use HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, 0x22)
} else if (format == HAL_PIXEL_FORMAT_RAW_OPAQUE) {
    ssize_t rawOpaqueBufferSize = getRawOpaqueBufferSize(width, height);
    if (rawOpaqueBufferSize <= 0) {
        SET_ERR_L("Invalid RAW opaque buffer size %zd", rawOpaqueBufferSize);
        return BAD_VALUE;
    }
    newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
            width, height, rawOpaqueBufferSize, format, dataSpace, rotation,
            mTimestampOffset, physicalCameraId, streamSetId);
} else if (isShared) {
    newStream = new Camera3SharedOutputStream(mNextStreamId, consumers,
            width, height, format, consumerUsage, dataSpace, rotation,
            mTimestampOffset, physicalCameraId, streamSetId);
} else if (consumers.size() == 0 && hasDeferredConsumer) {
    newStream = new Camera3OutputStream(mNextStreamId,
            width, height, format, consumerUsage, dataSpace, rotation,
            mTimestampOffset, physicalCameraId, streamSetId);
} else {
    newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
            width, height, format, dataSpace, rotation,
            mTimestampOffset, physicalCameraId, streamSetId);
}
​
........................
res = mOutputStreams.add(mNextStreamId, newStream);

If none of the checks above fail, stream creation proceeds.

Depending on the stream type, a different constructor is used.

*id = mNextStreamId++;

After the stream has been created, its id is assigned.

The id is simply the previous value plus one, so with multiple streams the ids are consecutive numbers. At this point one stream is complete; it is added to mOutputStreams, and later in the flow the stream objects in mOutputStreams are copied into the streams list used for configuration.

Streams can be told apart by their format. When CameraServer configures the surfaces and creates the streams, it logs each stream's information including the format, e.g. 0x21 (HAL_PIXEL_FORMAT_BLOB, still capture), 0x22 (HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, typically preview) and 0x23 (HAL_PIXEL_FORMAT_YCBCR_420_888, YUV);

system\core\libsystem\include\system\graphics-base.h

#ifndef HIDL_GENERATED_ANDROID_HARDWARE_GRAPHICS_COMMON_V1_0_EXPORTED_CONSTANTS_H_
#define HIDL_GENERATED_ANDROID_HARDWARE_GRAPHICS_COMMON_V1_0_EXPORTED_CONSTANTS_H_
 
#ifdef __cplusplus
extern "C" {
#endif
 
typedef enum {
    HAL_PIXEL_FORMAT_RGBA_8888 = 1,
    HAL_PIXEL_FORMAT_RGBX_8888 = 2,
    HAL_PIXEL_FORMAT_RGB_888 = 3,
    HAL_PIXEL_FORMAT_RGB_565 = 4,
    HAL_PIXEL_FORMAT_BGRA_8888 = 5,
    HAL_PIXEL_FORMAT_RGBA_1010102 = 43, // 0x2B
    HAL_PIXEL_FORMAT_RGBA_FP16 = 22, // 0x16
    HAL_PIXEL_FORMAT_YV12 = 842094169, // 0x32315659
    HAL_PIXEL_FORMAT_Y8 = 538982489, // 0x20203859
    HAL_PIXEL_FORMAT_Y16 = 540422489, // 0x20363159
    HAL_PIXEL_FORMAT_RAW16 = 32, // 0x20
    HAL_PIXEL_FORMAT_RAW10 = 37, // 0x25
    HAL_PIXEL_FORMAT_RAW12 = 38, // 0x26
    HAL_PIXEL_FORMAT_RAW_OPAQUE = 36, // 0x24
    HAL_PIXEL_FORMAT_BLOB = 33, // 0x21
    HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 34, // 0x22
    HAL_PIXEL_FORMAT_YCBCR_420_888 = 35, // 0x23
    HAL_PIXEL_FORMAT_YCBCR_422_888 = 39, // 0x27
    HAL_PIXEL_FORMAT_YCBCR_444_888 = 40, // 0x28
    HAL_PIXEL_FORMAT_FLEX_RGB_888 = 41, // 0x29
    HAL_PIXEL_FORMAT_FLEX_RGBA_8888 = 42, // 0x2A
    HAL_PIXEL_FORMAT_YCBCR_422_SP = 16, // 0x10
    HAL_PIXEL_FORMAT_YCRCB_420_SP = 17, // 0x11
    HAL_PIXEL_FORMAT_YCBCR_422_I = 20, // 0x14
    HAL_PIXEL_FORMAT_JPEG = 256, // 0x100
} android_pixel_format_t;
 
typedef enum {
    HAL_TRANSFORM_FLIP_H = 1, // 0x01
    HAL_TRANSFORM_FLIP_V = 2, // 0x02
    HAL_TRANSFORM_ROT_90 = 4, // 0x04
    HAL_TRANSFORM_ROT_180 = 3, // 0x03
    HAL_TRANSFORM_ROT_270 = 7, // 0x07
} android_transform_t;
 
typedef enum {
    HAL_DATASPACE_UNKNOWN = 0, // 0x0
    HAL_DATASPACE_ARBITRARY = 1, // 0x1
    HAL_DATASPACE_STANDARD_SHIFT = 16,
    HAL_DATASPACE_STANDARD_MASK = 4128768, // (63 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_UNSPECIFIED = 0, // (0 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT709 = 65536, // (1 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT601_625 = 131072, // (2 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT601_625_UNADJUSTED = 196608, // (3 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT601_525 = 262144, // (4 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT601_525_UNADJUSTED = 327680, // (5 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT2020 = 393216, // (6 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT2020_CONSTANT_LUMINANCE = 458752, // (7 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_BT470M = 524288, // (8 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_FILM = 589824, // (9 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_DCI_P3 = 655360, // (10 << STANDARD_SHIFT)
    HAL_DATASPACE_STANDARD_ADOBE_RGB = 720896, // (11 << STANDARD_SHIFT)
    HAL_DATASPACE_TRANSFER_SHIFT = 22,
    HAL_DATASPACE_TRANSFER_MASK = 130023424, // (31 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_UNSPECIFIED = 0, // (0 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_LINEAR = 4194304, // (1 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_SRGB = 8388608, // (2 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_SMPTE_170M = 12582912, // (3 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_GAMMA2_2 = 16777216, // (4 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_GAMMA2_6 = 20971520, // (5 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_GAMMA2_8 = 25165824, // (6 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_ST2084 = 29360128, // (7 << TRANSFER_SHIFT)
    HAL_DATASPACE_TRANSFER_HLG = 33554432, // (8 << TRANSFER_SHIFT)
    HAL_DATASPACE_RANGE_SHIFT = 27,
    HAL_DATASPACE_RANGE_MASK = 939524096, // (7 << RANGE_SHIFT)
    HAL_DATASPACE_RANGE_UNSPECIFIED = 0, // (0 << RANGE_SHIFT)
    HAL_DATASPACE_RANGE_FULL = 134217728, // (1 << RANGE_SHIFT)
    HAL_DATASPACE_RANGE_LIMITED = 268435456, // (2 << RANGE_SHIFT)
    HAL_DATASPACE_RANGE_EXTENDED = 402653184, // (3 << RANGE_SHIFT)
    HAL_DATASPACE_SRGB_LINEAR = 512, // 0x200
    HAL_DATASPACE_V0_SRGB_LINEAR = 138477568, // ((STANDARD_BT709 | TRANSFER_LINEAR) | RANGE_FULL)
    HAL_DATASPACE_V0_SCRGB_LINEAR = 406913024, // ((STANDARD_BT709 | TRANSFER_LINEAR) | RANGE_EXTENDED)
    HAL_DATASPACE_SRGB = 513, // 0x201
    HAL_DATASPACE_V0_SRGB = 142671872, // ((STANDARD_BT709 | TRANSFER_SRGB) | RANGE_FULL)
    HAL_DATASPACE_V0_SCRGB = 411107328, // ((STANDARD_BT709 | TRANSFER_SRGB) | RANGE_EXTENDED)
    HAL_DATASPACE_JFIF = 257, // 0x101
    HAL_DATASPACE_V0_JFIF = 146931712, // ((STANDARD_BT601_625 | TRANSFER_SMPTE_170M) | RANGE_FULL)
    HAL_DATASPACE_BT601_625 = 258, // 0x102
    HAL_DATASPACE_V0_BT601_625 = 281149440, // ((STANDARD_BT601_625 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
    HAL_DATASPACE_BT601_525 = 259, // 0x103
    HAL_DATASPACE_V0_BT601_525 = 281280512, // ((STANDARD_BT601_525 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
    HAL_DATASPACE_BT709 = 260, // 0x104
    HAL_DATASPACE_V0_BT709 = 281083904, // ((STANDARD_BT709 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
    HAL_DATASPACE_DCI_P3_LINEAR = 139067392, // ((STANDARD_DCI_P3 | TRANSFER_LINEAR) | RANGE_FULL)
    HAL_DATASPACE_DCI_P3 = 155844608, // ((STANDARD_DCI_P3 | TRANSFER_GAMMA2_6) | RANGE_FULL)
    HAL_DATASPACE_DISPLAY_P3_LINEAR = 139067392, // ((STANDARD_DCI_P3 | TRANSFER_LINEAR) | RANGE_FULL)
    HAL_DATASPACE_DISPLAY_P3 = 143261696, // ((STANDARD_DCI_P3 | TRANSFER_SRGB) | RANGE_FULL)
    HAL_DATASPACE_ADOBE_RGB = 151715840, // ((STANDARD_ADOBE_RGB | TRANSFER_GAMMA2_2) | RANGE_FULL)
    HAL_DATASPACE_BT2020_LINEAR = 138805248, // ((STANDARD_BT2020 | TRANSFER_LINEAR) | RANGE_FULL)
    HAL_DATASPACE_BT2020 = 147193856, // ((STANDARD_BT2020 | TRANSFER_SMPTE_170M) | RANGE_FULL)
    HAL_DATASPACE_BT2020_PQ = 163971072, // ((STANDARD_BT2020 | TRANSFER_ST2084) | RANGE_FULL)
    HAL_DATASPACE_DEPTH = 4096, // 0x1000
    HAL_DATASPACE_SENSOR = 4097, // 0x1001
} android_dataspace_t;
 
typedef enum {
    HAL_COLOR_MODE_NATIVE = 0,
    HAL_COLOR_MODE_STANDARD_BT601_625 = 1,
    HAL_COLOR_MODE_STANDARD_BT601_625_UNADJUSTED = 2,
    HAL_COLOR_MODE_STANDARD_BT601_525 = 3,
    HAL_COLOR_MODE_STANDARD_BT601_525_UNADJUSTED = 4,
    HAL_COLOR_MODE_STANDARD_BT709 = 5,
    HAL_COLOR_MODE_DCI_P3 = 6,
    HAL_COLOR_MODE_SRGB = 7,
    HAL_COLOR_MODE_ADOBE_RGB = 8,
    HAL_COLOR_MODE_DISPLAY_P3 = 9,
} android_color_mode_t;
 
typedef enum {
    HAL_COLOR_TRANSFORM_IDENTITY = 0,
    HAL_COLOR_TRANSFORM_ARBITRARY_MATRIX = 1,
    HAL_COLOR_TRANSFORM_VALUE_INVERSE = 2,
    HAL_COLOR_TRANSFORM_GRAYSCALE = 3,
    HAL_COLOR_TRANSFORM_CORRECT_PROTANOPIA = 4,
    HAL_COLOR_TRANSFORM_CORRECT_DEUTERANOPIA = 5,
    HAL_COLOR_TRANSFORM_CORRECT_TRITANOPIA = 6,
} android_color_transform_t;
 
typedef enum {
    HAL_HDR_DOLBY_VISION = 1,
    HAL_HDR_HDR10 = 2,
    HAL_HDR_HLG = 3,
} android_hdr_t;

At this point the createStream logic is done; recall that it is invoked from the for loop on the framework (Java) side.

2.5 CameraDeviceImpl.configureStreamsChecked() -- endConfigure()

if (sessionParams != null) {
    mRemoteDevice.endConfigure(operatingMode, sessionParams.getNativeCopy());
} else {
    mRemoteDevice.endConfigure(operatingMode, null);
}

In CameraDeviceImpl.configureStreamsChecked(), once the mRemoteDevice.createStream() loop has finished, mRemoteDevice.endConfigure() is called.

binder::Status CameraDeviceClient::endConfigure(int operatingMode,
        const hardware::camera2::impl::CameraMetadataNative& sessionParams) {
    ATRACE_CALL();
    ALOGV("%s: ending configure (%d input stream, %zu output surfaces)",
            __FUNCTION__, mInputStream.configured ? 1 : 0,
            mStreamMap.size());
    ......
    status_t err = mDevice->configureStreams(sessionParams, operatingMode);
    if (err == BAD_VALUE) {
        String8 msg = String8::format("Camera %s: Unsupported set of inputs/outputs provided",
                mCameraIdStr.string());
        ALOGE("%s: %s", __FUNCTION__, msg.string());
        res = STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
    } else if (err != OK) {
        String8 msg = String8::format("Camera %s: Error configuring streams: %s (%d)",
                mCameraIdStr.string(), strerror(-err), err);
        ALOGE("%s: %s", __FUNCTION__, msg.string());
        res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
    }
​
    return res;
}

The core of this method is the call to Camera3Device::configureStreams().

status_t Camera3Device::configureStreams(const CameraMetadata& sessionParams, int operatingMode) {
    ATRACE_CALL();
    ALOGV("%s: E", __FUNCTION__);
​
    Mutex::Autolock il(mInterfaceLock);
    Mutex::Autolock l(mLock);
​
    // In case the client doesn't include any session parameter, try a
    // speculative configuration using the values from the last cached
    // default request.
    if (sessionParams.isEmpty() &&
            ((mLastTemplateId > 0) && (mLastTemplateId < CAMERA3_TEMPLATE_COUNT)) &&
            (!mRequestTemplateCache[mLastTemplateId].isEmpty())) {
        ALOGV("%s: Speculative session param configuration with template id: %d", __func__,
                mLastTemplateId);
        return filterParamsAndConfigureLocked(mRequestTemplateCache[mLastTemplateId],
                operatingMode);
    }
​
    return filterParamsAndConfigureLocked(sessionParams, operatingMode);
}
status_t Camera3Device::filterParamsAndConfigureLocked(const CameraMetadata& sessionParams,
        int operatingMode) {
    //Filter out any incoming session parameters
    const CameraMetadata params(sessionParams);
    camera_metadata_entry_t availableSessionKeys = mDeviceInfo.find(
            ANDROID_REQUEST_AVAILABLE_SESSION_KEYS);
    CameraMetadata filteredParams(availableSessionKeys.count);
    camera_metadata_t *meta = const_cast<camera_metadata_t *>(
            filteredParams.getAndLock());
    set_camera_metadata_vendor_id(meta, mVendorTagId);
    filteredParams.unlock(meta);
    if (availableSessionKeys.count > 0) {
        for (size_t i = 0; i < availableSessionKeys.count; i++) {
            camera_metadata_ro_entry entry = params.find(
                    availableSessionKeys.data.i32[i]);
            if (entry.count > 0) {
                filteredParams.update(entry);
            }
        }
    }
​
    return configureStreamsLocked(operatingMode, filteredParams);
}

This in turn calls Camera3Device::configureStreamsLocked():

// Copy the stream info from mOutputStreams into the camera3_stream_configuration variable config
config.streams = streams.editArray();
​
// Do the HAL configuration; will potentially touch stream
// max_buffers, usage, priv fields.
​
const camera_metadata_t *sessionBuffer = sessionParams.getAndLock();
// Hand the stream information to the HAL through the HAL interface, so the HAL performs its configure_streams
res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
sessionParams.unlock(sessionBuffer);
​
if (res == BAD_VALUE) {
    // HAL rejected this set of streams as unsupported, clean up config
    // attempt and return to unconfigured state
    CLOGE("Set of requested inputs/outputs not supported by HAL");
    cancelStreamsConfigurationLocked();
    return BAD_VALUE;
} else if (res != OK) {
    // Some other kind of error from configure_streams - this is not
    // expected
    SET_ERR_L("Unable to configure streams with HAL: %s (%d)",
            strerror(-res), res);
    return res;
}
​
// Finish configuring the input stream
if (mInputStream != NULL && mInputStream->isConfiguring()) {
    res = mInputStream->finishConfiguration();
    if (res != OK) {
        CLOGE("Can't finish configuring input stream %d: %s (%d)",
                mInputStream->getId(), strerror(-res), res);
        cancelStreamsConfigurationLocked();
        if ((res == NO_INIT || res == DEAD_OBJECT) && mInputStream->isAbandoned()) {
            return DEAD_OBJECT;
        }
        return BAD_VALUE;
    }
}
​
// Finish all stream configuration immediately.
// TODO: Try to relax this later back to lazy completion, which should be
// faster
// Finish configuring the output streams
for (size_t i = 0; i < mOutputStreams.size(); i++) {
    sp<Camera3OutputStreamInterface> outputStream =
            mOutputStreams.editValueAt(i);
    if (outputStream->isConfiguring() && !outputStream->isConsumerConfigurationDeferred()) {
        res = outputStream->finishConfiguration();
        if (res != OK) {
            CLOGE("Can't finish configuring output stream %d: %s (%d)",
                    outputStream->getId(), strerror(-res), res);
            cancelStreamsConfigurationLocked();
            if ((res == NO_INIT || res == DEAD_OBJECT) && outputStream->isAbandoned()) {
                return DEAD_OBJECT;
            }
            return BAD_VALUE;
        }
    }
}
​
// Request thread needs to know to avoid using repeat-last-settings protocol
// across configure_streams() calls
if (notifyRequestThread) {
    mRequestThread->configurationComplete(mIsConstrainedHighSpeedConfiguration, sessionParams);
}
​
char value[PROPERTY_VALUE_MAX];
property_get("camera.fifo.disable", value, "0");
int32_t disableFifo = atoi(value);
if (disableFifo != 1) {
    // Boost priority of request thread to SCHED_FIFO.
    pid_t requestThreadTid = mRequestThread->getTid();
    res = requestPriority(getpid(), requestThreadTid,
            kRequestThreadPriority, /*isForApp*/ false, /*asynchronous*/ false);
    if (res != OK) {
        ALOGW("Can't set realtime priority for request processing thread: %s (%d)",
                strerror(-res), res);
    } else {
        ALOGD("Set real time priority for request queue thread (tid %d)", requestThreadTid);
    }
}
​
// Update device state
const camera_metadata_t *newSessionParams = sessionParams.getAndLock();
const camera_metadata_t *currentSessionParams = mSessionParams.getAndLock();
bool updateSessionParams = (newSessionParams != currentSessionParams) ? true : false;
sessionParams.unlock(newSessionParams);
mSessionParams.unlock(currentSessionParams);
if (updateSessionParams)  {
    mSessionParams = sessionParams;
}
​
mNeedConfig = false;
​
internalUpdateStatusLocked((mDummyStreamId == NO_STREAM) ?
        STATUS_CONFIGURED : STATUS_UNCONFIGURED);
​
ALOGV("%s: Camera %s: Stream configuration complete", __FUNCTION__, mId.string());
​
// tear down the deleted streams after configure streams.
mDeletedStreams.clear();
​
// Resume the preparer thread so that it can continue working
auto rc = mPreparerThread->resume();
if (rc != OK) {
    SET_ERR_L("%s: Camera %s: Preparer thread failed to resume!", __FUNCTION__, mId.string());
    return rc;
}
​
return OK;

This method mainly fills in buffer sizes for the created streams and calls res = mInterface->configureStreams(...) to let the HAL perform the actual configuration, since the HAL-side logic differs from vendor to vendor.

After the input and output streams have been configured, internalUpdateStatusLocked() sets the global mStatus to STATUS_CONFIGURED, and mNeedConfig is set to false, indicating that no further configuration is needed.

Once configuration has succeeded, outputStream->finishConfiguration() is called to wrap up each stream's configuration; this method is implemented in Camera3Stream.

status_t Camera3Stream::finishConfiguration() {
    ATRACE_CALL();
    Mutex::Autolock l(mLock);
    switch (mState) {
        case STATE_ERROR:
            ALOGE("%s: In error state", __FUNCTION__);
            return INVALID_OPERATION;
        case STATE_IN_CONFIG:
        case STATE_IN_RECONFIG:
            // OK
            break;
        case STATE_CONSTRUCTED:
        case STATE_CONFIGURED:
            ALOGE("%s: Cannot finish configuration that hasn't been started",
                    __FUNCTION__);
            return INVALID_OPERATION;
        case STATE_IN_IDLE:
            //Skip configuration in this state
            return OK;
        default:
            ALOGE("%s: Unknown state", __FUNCTION__);
            return INVALID_OPERATION;
    }
​
    // Register for idle tracking
    sp<StatusTracker> statusTracker = mStatusTracker.promote();
    if (statusTracker != 0) {
        mStatusId = statusTracker->addComponent();
    }
​
    // Check if the stream configuration is unchanged, and skip reallocation if
    // so. As documented in hardware/camera3.h:configure_streams().
    if (mState == STATE_IN_RECONFIG &&
            mOldUsage == mUsage &&
            mOldMaxBuffers == camera3_stream::max_buffers) {
        mState = STATE_CONFIGURED;
        return OK;
    }
​
    // Reset prepared state, since buffer config has changed, and existing
    // allocations are no longer valid
    mPrepared = false;
    mStreamUnpreparable = false;
​
    status_t res;
    res = configureQueueLocked();
    // configureQueueLocked could return error in case of abandoned surface.
    // Treat as non-fatal error.
    if (res == NO_INIT || res == DEAD_OBJECT) {
        ALOGE("%s: Unable to configure stream %d queue (non-fatal): %s (%d)",
                __FUNCTION__, mId, strerror(-res), res);
        mState = STATE_ABANDONED;
        return res;
    } else if (res != OK) {
        ALOGE("%s: Unable to configure stream %d queue: %s (%d)",
                __FUNCTION__, mId, strerror(-res), res);
        mState = STATE_ERROR;
        return res;
    }
​
    mState = STATE_CONFIGURED;
​
    return res;
}

The key call here is configureQueueLocked(), which is implemented in Camera3OutputStream.

frameworks\av\services\camera\libcameraservice\device3\Camera3OutputStream.cpp

status_t Camera3OutputStream::configureQueueLocked() {
    status_t res;
​
    mTraceFirstBuffer = true;
    if ((res = Camera3IOStreamBase::configureQueueLocked()) != OK) {
        return res;
    }
​
    if ((res = configureConsumerQueueLocked()) != OK) {
        return res;
    }
​
    // Set dequeueBuffer/attachBuffer timeout if the consumer is not hw composer or hw texture.
    // We need skip these cases as timeout will disable the non-blocking (async) mode.
    if (!(isConsumedByHWComposer() || isConsumedByHWTexture())) {
        mConsumer->setDequeueTimeout(kDequeueBufferTimeout);
    }
​
    return OK;
}

This method in turn calls configureConsumerQueueLocked():

status_t Camera3OutputStream::configureConsumerQueueLocked() {
    status_t res;
​
    mTraceFirstBuffer = true;
​
    ALOG_ASSERT(mConsumer != 0, "mConsumer should never be NULL");
​
    // Configure consumer-side ANativeWindow interface. The listener may be used
    // to notify buffer manager (if it is used) of the returned buffers.
    // mBufferReleasedListener notifies the buffer manager (if one is used) of returned buffers
    // mBufferReleasedListener is created when the Camera3OutputStream is constructed
    res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA,
            /*listener*/mBufferReleasedListener,
            /*reportBufferRemoval*/true);
    if (res != OK) {
        ALOGE("%s: Unable to connect to native window for stream %d",
                __FUNCTION__, mId);
        return res;
    }
    ......
}

mConsumer->connect() is executed; mConsumer is the Surface created when the stream was configured. The connection is made here, and the required buffer space is then allocated.

If the Surface's GraphicBufferProducer is already connected by someone else, this configuration fails as well.

std::vector<sp<Surface>> consumers;

The connect() call lands in Surface::connect():

frameworks/native/libs/gui/Surface.cpp

int Surface::connect(
        int api, const sp<IProducerListener>& listener, bool reportBufferRemoval) {
    ATRACE_CALL();
    ALOGV("Surface::connect");
    Mutex::Autolock lock(mMutex);
    // The producer side
    IGraphicBufferProducer::QueueBufferOutput output;
    mReportRemovedBuffers = reportBufferRemoval;
    int err = mGraphicBufferProducer->connect(listener, api, mProducerControlledByApp, &output);
    if (err == NO_ERROR) {
        mDefaultWidth = output.width;
        mDefaultHeight = output.height;
        mNextFrameNumber = output.nextFrameNumber;
​
        // Disable transform hint if sticky transform is set.
        if (mStickyTransform == 0) {
            mTransformHint = output.transformHint;
        }
​
        mConsumerRunningBehind = (output.numPendingBuffers >= 2);
    }
    if (!err && api == NATIVE_WINDOW_API_CPU) {
        mConnectedToCpu = true;
        // Clear the dirty region in case we're switching from a non-CPU API
        mDirtyRegion.clear();
    } else if (!err) {
        // Initialize the dirty region for tracking surface damage
        mDirtyRegion = Region::INVALID_REGION;
    }
​
    return err;
}

When the camera preview stays black and a Surface-related error shows up, these are two likely causes: either the GraphicBufferProducer was not built correctly (the gbp passed down is null), or the Surface's GraphicBufferProducer is already connected elsewhere; a simple app-side sanity check is sketched below.
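
As a hedged, app-side illustration of avoiding the first pitfall, the sketch below validates the preview Surface before createCaptureSession is called; SurfaceTexture, previewSize and the helper class name are assumptions invented for this example.

import android.graphics.SurfaceTexture;
import android.util.Size;
import android.view.Surface;

class SurfaceSanityCheck {
    // Build the preview Surface only when its SurfaceTexture is ready, and make
    // sure the resulting Surface is still valid before handing it to the session.
    static Surface buildPreviewSurface(SurfaceTexture texture, Size previewSize) {
        if (texture == null) {
            throw new IllegalStateException("SurfaceTexture is not ready yet");
        }
        // Match the buffer size to a supported preview size to avoid stream
        // configuration failures.
        texture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
        Surface surface = new Surface(texture);
        if (!surface.isValid()) {
            throw new IllegalStateException("Preview surface is not valid");
        }
        return surface;
    }
}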

With this step done, configureStreamsChecked() is essentially complete.

3 CameraDeviceImpl.createCaptureSessionInternal -- newSession

CameraCaptureSessionCore newSession = null;
if (isConstrainedHighSpeed) {
    ArrayList<Surface> surfaces = new ArrayList<>(outputConfigurations.size());
    for (OutputConfiguration outConfig : outputConfigurations) {
        surfaces.add(outConfig.getSurface());
    }
    StreamConfigurationMap config =
            getCharacteristics().get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    SurfaceUtils.checkConstrainedHighSpeedSurfaces(surfaces, /*fpsRange*/null, config);
​
    newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
            callback, executor, this, mDeviceExecutor, configureSuccess,
            mCharacteristics);
} else {
    newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
            callback, executor, this, mDeviceExecutor, configureSuccess);
}
​
// TODO: wait until current session closes, then create the new session
mCurrentSession = newSession;
​
if (pendingException != null) {
    throw pendingException;
}
​
mSessionStateCallback = mCurrentSession.getDeviceStateCallback();

Once the streams have been created, the session itself is created.

First a CameraCaptureSessionCore instance, newSession, is constructed.

public class CameraCaptureSessionImpl extends CameraCaptureSession
        implements CameraCaptureSessionCore {
​
    /**
     * Create a new CameraCaptureSession.
     *
     * <p>The camera device must already be in the {@code IDLE} state when this is invoked.
     * There must be no pending actions
     * (e.g. no pending captures, no repeating requests, no flush).</p>
     */
    CameraCaptureSessionImpl(int id, Surface input,
                             CameraCaptureSession.StateCallback callback, Executor stateExecutor,
                             android.hardware.camera2.impl.CameraDeviceImpl deviceImpl,
                             Executor deviceStateExecutor, boolean configureSuccess) {
        if (callback == null) {
            throw new IllegalArgumentException("callback must not be null");
        }
​
        mId = id;
        mIdString = String.format("Session %d: ", mId);
​
        mInput = input;
        mStateExecutor = checkNotNull(stateExecutor, "stateExecutor must not be null");
        mStateCallback = createUserStateCallbackProxy(mStateExecutor, callback);
​
        mDeviceExecutor = checkNotNull(deviceStateExecutor,
                "deviceStateExecutor must not be null");
        mDeviceImpl = checkNotNull(deviceImpl, "deviceImpl must not be null");
​
        /*
         * Use the same handler as the device's StateCallback for all the internal coming events
         *
         * This ensures total ordering between CameraDevice.StateCallback and
         * CameraDeviceImpl.CaptureCallback events.
         */
        mSequenceDrainer = new TaskDrainer<>(mDeviceExecutor, new SequenceDrainListener(),
                /*name*/"seq");
        mIdleDrainer = new TaskSingleDrainer(mDeviceExecutor, new IdleDrainListener(),
                /*name*/"idle");
        mAbortDrainer = new TaskSingleDrainer(mDeviceExecutor, new AbortDrainListener(),
                /*name*/"abort");
​
        // CameraDevice should call configureOutputs and have it finish before constructing us
​
        if (configureSuccess) {
            mStateCallback.onConfigured(this);
            if (DEBUG) Log.v(TAG, mIdString + "Created session successfully");
            mConfigureSuccess = true;
        } else {
            mStateCallback.onConfigureFailed(this);
            mClosed = true; // do not fire any other callbacks, do not allow any other work
            Log.e(TAG, mIdString + "Failed to create capture session; configuration failed");
            mConfigureSuccess = false;
        }
    }
}

Apart from the fields it assigns directly, the interesting part of this constructor is createUserStateCallbackProxy(), which wraps the app's StateCallback in a SessionStateCallbackProxy object, a subclass of CameraCaptureSession.StateCallback.

/**
 * Post calls into a CameraCaptureSession.StateCallback to the user-specified {@code executor}.
 */
private StateCallback createUserStateCallbackProxy(Executor executor, StateCallback callback) {
    return new CallbackProxies.SessionStateCallbackProxy(executor, callback);
}
public class CallbackProxies {
    public static class SessionStateCallbackProxy
            extends CameraCaptureSession.StateCallback {
        private final Executor mExecutor;
        private final CameraCaptureSession.StateCallback mCallback;
​
        public SessionStateCallbackProxy(Executor executor,
                CameraCaptureSession.StateCallback callback) {
            mExecutor = checkNotNull(executor, "executor must not be null");
            mCallback = checkNotNull(callback, "callback must not be null");
        }
        
        @Override
        public void onConfigured(CameraCaptureSession session) {
            final long ident = Binder.clearCallingIdentity();
            try {
                mExecutor.execute(() -> mCallback.onConfigured(session));
            } finally {
                Binder.restoreCallingIdentity(ident);
            }
        }
        ......
    }
    ......
}

The proxy returned here is assigned to mStateCallback. Looking at SessionStateCallbackProxy.onConfigured(), mStateCallback ultimately invokes the callback passed into new SessionStateCallbackProxy, i.e. the callback created by the app and handed down.

At the end of the CameraCaptureSessionImpl constructor, configureSuccess decides the outcome: if everything was configured correctly, StateCallback.onConfigured() is invoked.

At this point the session has been fully created.

Summary:

The whole createCaptureSession flow does three main things:

  1. builds the OutputConfigurations;
  2. creates the streams;
  3. creates the session and fires the callback.

In short: a stream is created for each Surface, the Surfaces are connected so that buffers can flow to and from the HAL side, and finally a session is created between the application and CameraServer. The CameraCaptureSessionImpl constructor takes a CameraDeviceImpl parameter precisely so that later calls such as setRepeatingRequest() go through it, keeping the app-to-server communication working; a minimal app-side continuation is sketched below.
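
For context, here is a minimal sketch of what an application typically does once onConfigured() fires; the template choice, previewSurface and backgroundHandler are assumptions rather than something taken from this article.

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

class PreviewStarter {
    // Typically called from CameraCaptureSession.StateCallback#onConfigured().
    static void startPreview(CameraDevice device, CameraCaptureSession session,
            Surface previewSurface, Handler backgroundHandler) throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        // setRepeatingRequest keeps resubmitting this request, which drives preview frames.
        session.setRepeatingRequest(builder.build(), /*callback*/ null, backgroundHandler);
    }
}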

Additional notes:

The Android display system

On Android, the buffers that are ultimately drawn to the screen are allocated in graphics memory; everything else is allocated in ordinary system memory.

Two modules are involved in buffer management: the framebuffer and gralloc. The framebuffer is used to put rendered buffers onto the screen, while gralloc is responsible for allocating the buffers.

Binding the preview Surface dynamically

One approach is to bind the surface dynamically via OpenGL.

The other approach is the finalizeOutputConfigurations() method added to CameraCaptureSession in the Android 8.0 sources. It is a clever way to shorten camera startup time. Normally, creating the session requires the preview Surface to be supplied up front; with this API you can instead build an OutputConfiguration with only the preview size, send it down to the CameraServer process, and attach the real preview Surface once it becomes ready. Preview still works correctly, and the time spent waiting for the preview Surface is taken off the startup path, which shortens camera launch time. A hedged sketch of this deferred-surface pattern follows.
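
A minimal sketch of the deferred-surface pattern (API level 26+), assuming a SurfaceHolder-backed preview; previewSize, captureSurface and the class/method names are placeholders, not code from this article.

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.params.OutputConfiguration;
import android.os.Handler;
import android.util.Size;
import android.view.Surface;
import android.view.SurfaceHolder;

class DeferredPreviewConfig {
    // Step 1: create the session knowing only the preview size; the preview
    // Surface itself is attached later.
    static OutputConfiguration createSession(CameraDevice device, Size previewSize,
            Surface captureSurface, CameraCaptureSession.StateCallback callback,
            Handler handler) throws CameraAccessException {
        OutputConfiguration deferredPreview =
                new OutputConfiguration(previewSize, SurfaceHolder.class);
        List<OutputConfiguration> outputs = Arrays.asList(
                deferredPreview, new OutputConfiguration(captureSurface));
        device.createCaptureSessionByOutputConfigurations(outputs, callback, handler);
        return deferredPreview;
    }

    // Step 2: once the preview Surface is ready, attach it and finalize the
    // deferred output configuration.
    static void attachPreview(CameraCaptureSession session,
            OutputConfiguration deferredPreview, Surface previewSurface)
            throws CameraAccessException {
        deferredPreview.addSurface(previewSurface);
        session.finalizeOutputConfigurations(Collections.singletonList(deferredPreview));
    }
}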

On camera startup time: creating a session is an expensive operation that can take several hundred milliseconds, because it configures the camera device's internal pipelines and allocates memory buffers for sending images to the requested targets.

This method can also be called when several Surfaces share the same OutputConfiguration and one of them becomes available only after the CameraCaptureSession was created. In that case the application must first create the OutputConfiguration with the Surface that is available, and enable further surface sharing via OutputConfiguration#enableSurfaceSharing before creating the CameraCaptureSession. After the session is created, once the extra Surface becomes available, the application must call OutputConfiguration#addSurface before completing the configuration with this method.

If the OutputConfigurations provided here are unchanged from the ones used when the session was created, this call has no effect. The method can only be called once for a given output configuration.

After this call returns, the output Surfaces contained in the OutputConfiguration list can be used as CaptureRequest targets.

Devices at CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY do not support this method.

CameraCaptureSession.java

@Override
public void finalizeOutputConfigurations(
        List<OutputConfiguration> outputConfigs) throws CameraAccessException {
    mDeviceImpl.finalizeOutputConfigs(outputConfigs);
}

This finalizes output configurations that now contain their deferred and/or extra Surfaces.

For camera use cases that need a preview plus other outputs configured, the preview Surface may take a while to become ready. For example, if the preview Surface comes from a SurfaceView, the SurfaceView is only ready after the UI layout has completed, which can delay camera startup.

To speed up camera startup, the application can configure the CameraCaptureSession with the final preview size (via OutputConfiguration#OutputConfiguration(Size, Class)) and defer the preview output configuration until the Surface is ready. Once the CameraCaptureSession has been created successfully with this deferred output and the other normal outputs, the application can start submitting requests as long as they do not target the deferred output Surface. When the deferred Surface becomes ready, the application adds it to the deferred output configuration with OutputConfiguration#addSurface and then updates the deferred configuration through this method; only after that can capture requests target this output.

CameraDeviceImpl.java

public void finalizeOutputConfigs(List<OutputConfiguration> outputConfigs)
        throws CameraAccessException {
    if (outputConfigs == null || outputConfigs.size() == 0) {
        throw new IllegalArgumentException("deferred config is null or empty");
    }
​
    synchronized(mInterfaceLock) {
        for (OutputConfiguration config : outputConfigs) {
            int streamId = -1;
            for (int i = 0; i < mConfiguredOutputs.size(); i++) {
                // Have to use equal here, as createCaptureSessionByOutputConfigurations() and
                // createReprocessableCaptureSessionByConfigurations() do a copy of the configs.
                if (config.equals(mConfiguredOutputs.valueAt(i))) {
                    streamId = mConfiguredOutputs.keyAt(i);
                    break;
                }
            }
            if (streamId == -1) {
                throw new IllegalArgumentException("Deferred config is not part of this "
                        + "session");
            }
​
            if (config.getSurfaces().size() == 0) {
                throw new IllegalArgumentException("The final config for stream " + streamId
                        + " must have at least 1 surface");
            }
            mRemoteDevice.finalizeOutputConfigurations(streamId, config);
            mConfiguredOutputs.put(streamId, config);
        }
    }
}

CameraDeviceClient.cpp

binder::Status CameraDeviceClient::finalizeOutputConfigurations(int32_t streamId,
        const hardware::camera2::params::OutputConfiguration &outputConfiguration) {
    ATRACE_CALL();
​
    binder::Status res;
    if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
​
    Mutex::Autolock icl(mBinderSerializationLock);
​
    const std::vector<sp<IGraphicBufferProducer> >& bufferProducers =
            outputConfiguration.getGraphicBufferProducers();
​
    if (bufferProducers.size() == 0) {
        ALOGE("%s: bufferProducers must not be empty", __FUNCTION__);
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, "Target Surface is invalid");
    }
​
    // streamId should be in mStreamMap if this stream already has a surface attached
    // to it. Otherwise, it should be in mDeferredStreams.
    bool streamIdConfigured = false;
    ssize_t deferredStreamIndex = NAME_NOT_FOUND;
    for (size_t i = 0; i < mStreamMap.size(); i++) {
        if (mStreamMap.valueAt(i).streamId() == streamId) {
            streamIdConfigured = true;
            break;
        }
    }
    for (size_t i = 0; i < mDeferredStreams.size(); i++) {
        if (streamId == mDeferredStreams[i]) {
            deferredStreamIndex = i;
            break;
        }
​
    }
    if (deferredStreamIndex == NAME_NOT_FOUND && !streamIdConfigured) {
        String8 msg = String8::format("Camera %s: deferred surface is set to a unknown stream"
                "(ID %d)", mCameraIdStr.string(), streamId);
        ALOGW("%s: %s", __FUNCTION__, msg.string());
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
    }
​
    if (mStreamInfoMap[streamId].finalized) {
        String8 msg = String8::format("Camera %s: finalizeOutputConfigurations has been called"
                " on stream ID %d", mCameraIdStr.string(), streamId);
        ALOGW("%s: %s", __FUNCTION__, msg.string());
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
    }
​
    if (!mDevice.get()) {
        return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
    }
​
    std::vector<sp<Surface>> consumerSurfaces;
    for (auto& bufferProducer : bufferProducers) {
        // Don't create multiple streams for the same target surface
        ssize_t index = mStreamMap.indexOfKey(IInterface::asBinder(bufferProducer));
        if (index != NAME_NOT_FOUND) {
            ALOGV("Camera %s: Surface already has a stream created "
                    " for it (ID %zd)", mCameraIdStr.string(), index);
            continue;
        }
​
        sp<Surface> surface;
        res = createSurfaceFromGbp(mStreamInfoMap[streamId], true /*isStreamInfoValid*/,
                surface, bufferProducer);
​
        if (!res.isOk())
            return res;
​
        consumerSurfaces.push_back(surface);
    }
​
    // Gracefully handle case where finalizeOutputConfigurations is called
    // without any new surface.
    if (consumerSurfaces.size() == 0) {
        mStreamInfoMap[streamId].finalized = true;
        return res;
    }
​
    // Finish the deferred stream configuration with the surface.
    status_t err;
    std::vector<int> consumerSurfaceIds;
    err = mDevice->setConsumerSurfaces(streamId, consumerSurfaces, &consumerSurfaceIds);
    if (err == OK) {
        for (size_t i = 0; i < consumerSurfaces.size(); i++) {
            sp<IBinder> binder = IInterface::asBinder(
                    consumerSurfaces[i]->getIGraphicBufferProducer());
            ALOGV("%s: mStreamMap add binder %p streamId %d, surfaceId %d", __FUNCTION__,
                    binder.get(), streamId, consumerSurfaceIds[i]);
            mStreamMap.add(binder, StreamSurfaceId(streamId, consumerSurfaceIds[i]));
        }
        if (deferredStreamIndex != NAME_NOT_FOUND) {
            mDeferredStreams.removeItemsAt(deferredStreamIndex);
        }
        mStreamInfoMap[streamId].finalized = true;
        mConfiguredOutputs.replaceValueFor(streamId, outputConfiguration);
    } else if (err == NO_INIT) {
        res = STATUS_ERROR_FMT(CameraService::ERROR_ILLEGAL_ARGUMENT,
                "Camera %s: Deferred surface is invalid: %s (%d)",
                mCameraIdStr.string(), strerror(-err), err);
    } else {
        res = STATUS_ERROR_FMT(CameraService::ERROR_INVALID_OPERATION,
                "Camera %s: Error setting output stream deferred surface: %s (%d)",
                mCameraIdStr.string(), strerror(-err), err);
    }
​
    return res;
}