
Camera2 API and beyond

Tomoaki Imai
November 21, 2018


Android development is becoming easier to get into day by day. There are many convenient development tools, such as Jetpack. However, when it comes to Camera2, there is still a big hurdle for newcomers to understand the whole picture of how it works and what is possible.

One reason Camera2 confuses developers is the number of callbacks we have to manage just to take a single picture.

In this session, I'm giving you a guide to understanding this complex system so that you can develop and customize your own camera app. I'll explain the step-by-step photo-shooting flow by looking into how each callback interacts with the camera device. Then I'll dive into more advanced usage, such as customizing settings and adding OpenGL shading to the process of shooting a photo or video.



Transcript

  1. Understand the basic / Break the basic / Beyond the basic
     - Flows, Classes, Callbacks
     - Preview calculation
     - Adjusting preview size
     - Controlling Auto Exposure/Focus
     - Using filters with OpenGL
  2. Revisiting the architecture of Camera2
     Your code → Java Camera API (Java) → Binder Interface → CameraService (C++) → Hardware Abstraction Layer → Hardware
     CameraDevice, CameraDevice.StateCallback, CaptureSession.StateCallback, CaptureSession.CaptureCallback,
     ImageReader.OnImageAvailable, CameraConstrainedHighSpeedCallback, CameraDevice AvailabilityCallback
  3. Revisiting the architecture of Camera2 (the callback classes highlighted)
     Your code → Java Camera API (Java) → Binder Interface → CameraService (C++) → Hardware Abstraction Layer → Hardware
     Callbacks: CameraDevice.StateCallback, CaptureSession.StateCallback, CaptureSession.CaptureCallback,
     ImageReader.OnImageAvailable, CameraConstrainedHighSpeedCallback, CameraDevice AvailabilityCallback
  4. TextureView / CameraManager / SurfaceTexture / SurfaceTextureListener
     val surfaceTextureListener = object : TextureView.SurfaceTextureListener {
         override fun onSurfaceTextureAvailable(
             texture: SurfaceTexture, width: Int, height: Int) {
             openCamera(width, height)
         }
     }
  5. override fun onCaptureCompleted(session: CameraCaptureSession,
         request: CaptureRequest, result: TotalCaptureResult) {
         // Check for Auto Focus
         val afState = result.get(CaptureResult.CONTROL_AF_STATE)
         if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
         }
     }
  6. override fun onCaptureCompleted(session: CameraCaptureSession,
         request: CaptureRequest, result: TotalCaptureResult) {
         // Check for Auto Focus
         val afState = result.get(CaptureResult.CONTROL_AF_STATE)
         if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
             // Check for Auto Exposure status
             val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
             if (aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
             }
         }
     }
  7. override fun onCaptureCompleted(session: CameraCaptureSession,
         request: CaptureRequest, result: TotalCaptureResult) {
         // Check for Auto Focus
         val afState = result.get(CaptureResult.CONTROL_AF_STATE)
         if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
             // Check for Auto Exposure status
             val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
             if (aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                 // Auto Exposure is ready
                 captureStillPicture()
             } else if (aeState != CaptureResult.CONTROL_AE_STATE_INACTIVE) {
                 runPreCapture()
             }
         }
     }
  8. override fun onCaptureCompleted(session: CameraCaptureSession,
         request: CaptureRequest, result: TotalCaptureResult) {
         // Check for Auto Focus
         val afState = result.get(CaptureResult.CONTROL_AF_STATE)
         if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
             // Check for Auto Exposure status
             val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
             if (aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                 // Auto Exposure is ready
                 captureStillPicture()
             } else if (aeState != CaptureResult.CONTROL_AE_STATE_INACTIVE) {
                 runPreCapture()
             }
         } else if (afState == null) {
             // Auto Focus is null on the first request; in that case
             // request Auto Exposure until Auto Focus returns
             runPreCapture()
         }
     }
  9. override fun onCaptureCompleted(session: CameraCaptureSession,
         request: CaptureRequest, result: TotalCaptureResult) {
         // Check for Auto Focus
         val afState = result.get(CaptureResult.CONTROL_AF_STATE)
         if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
             // Check for Auto Exposure status
             val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
             // Check null first: null != INACTIVE would otherwise swallow this case
             if (aeState == null) {
                 // Some devices do not support Auto Exposure; in that case
                 // skip Auto Exposure and request the capture
                 captureStillPicture()
             } else if (aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                 // Auto Exposure is ready
                 captureStillPicture()
             } else if (aeState != CaptureResult.CONTROL_AE_STATE_INACTIVE) {
                 runPreCapture()
             }
         } else if (afState == null) {
             // Auto Focus is null on the first request; in that case
             // request Auto Exposure until Auto Focus returns
             runPreCapture()
         }
     }
  10. override fun onCaptureCompleted(session: CameraCaptureSession,
          request: CaptureRequest, result: TotalCaptureResult) {
          // Check for Auto Focus
          val afState = result.get(CaptureResult.CONTROL_AF_STATE)
          if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED) {
              // Check for Auto Exposure status
              val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
              // Check null first: null != INACTIVE would otherwise swallow this case
              if (aeState == null) {
                  // Some devices do not support Auto Exposure; in that case
                  // skip Auto Exposure and request the capture
                  captureStillPicture()
              } else if (aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED
                  // Auto Exposure might request flash,
                  // but we can't wait for that...
                  || aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                  // Auto Exposure is ready
                  captureStillPicture()
              } else if (aeState != CaptureResult.CONTROL_AE_STATE_INACTIVE) {
                  runPreCapture()
              }
          } else if (afState == null) {
              // Auto Focus is null on the first request; in that case
              // request Auto Exposure until Auto Focus returns
              runPreCapture()
          }
      }
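The branching above can be distilled into a pure decision function that is easy to unit-test off-device. This is a sketch: the enums mirror the `CaptureResult` AF/AE constants used on the slides, and `CaptureAction`/`decideAction` are illustrative names, not part of the Camera2 API.

```kotlin
// Illustrative mirrors of the CaptureResult AF/AE constants used on the slides.
enum class AfState { FOCUSED_LOCKED, NOT_FOCUSED_LOCKED, INACTIVE, SCANNING }
enum class AeState { CONVERGED, FLASH_REQUIRED, INACTIVE, SEARCHING }
enum class CaptureAction { CAPTURE_STILL, RUN_PRECAPTURE, WAIT }

// Same decision order as the callback: null AF -> precapture; locked AF ->
// capture when AE is converged/flash-required or absent, otherwise precapture.
fun decideAction(afState: AfState?, aeState: AeState?): CaptureAction = when {
    afState == null -> CaptureAction.RUN_PRECAPTURE
    afState == AfState.FOCUSED_LOCKED -> when {
        aeState == null -> CaptureAction.CAPTURE_STILL
        aeState == AeState.CONVERGED || aeState == AeState.FLASH_REQUIRED ->
            CaptureAction.CAPTURE_STILL
        aeState != AeState.INACTIVE -> CaptureAction.RUN_PRECAPTURE
        else -> CaptureAction.WAIT
    }
    else -> CaptureAction.WAIT
}
```

Keeping the decision separate from the callback makes the awkward null cases (first result, AE-less devices) straightforward to cover with tests.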
  12. Keep track of the request state
      private enum class State {
          PREVIEW, WAITING_LOCK, WAITING_PRECAPTURE, WAITING_NON_PRECAPTURE, TAKEN
      }
      override fun onCaptureCompleted(session: CameraCaptureSession,
          request: CaptureRequest, result: TotalCaptureResult) {
          when (state) {
              State.WAITING_LOCK -> {
                  // Check the Auto Focus state
                  val afState = result.get(CaptureResult.CONTROL_AF_STATE)
              }
              State.WAITING_PRECAPTURE -> {
                  // Check the Auto Exposure state
                  val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
              }
              …
          }
      }
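The state machine can be documented as a small, testable mapping. A sketch: `watchedKey` is an illustrative helper (not from the talk's code) showing which `CaptureResult` key each state inspects in `onCaptureCompleted`.

```kotlin
// The states from the slide.
enum class State { PREVIEW, WAITING_LOCK, WAITING_PRECAPTURE, WAITING_NON_PRECAPTURE, TAKEN }

// Which CaptureResult key each state inspects when a capture completes.
fun watchedKey(state: State): String? = when (state) {
    State.WAITING_LOCK -> "CONTROL_AF_STATE"
    State.WAITING_PRECAPTURE, State.WAITING_NON_PRECAPTURE -> "CONTROL_AE_STATE"
    State.PREVIEW, State.TAKEN -> null
}
```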
  13. focus
      private fun lockFocus() {
          state = State.WAITING_LOCK
          val builder = createPreviewRequestBuilder()
          captureSession?.capture(builder?.build(), captureCallback, backgroundHandler)
      }
      private fun createPreviewRequestBuilder(): CaptureRequest.Builder? {
          …
          if (characteristics.isContinuousAutoFocusSupported()) {
              builder.set(CaptureRequest.CONTROL_AF_MODE,
                  CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE)
          } else {
              builder.set(CaptureRequest.CONTROL_AF_MODE,
                  CaptureRequest.CONTROL_AF_MODE_AUTO)
          }
      }
  14. focus
      private fun lockFocus() {
          state = State.WAITING_LOCK
          val builder = createPreviewRequestBuilder()
          captureSession?.capture(builder?.build(), captureCallback, backgroundHandler)
      }
      State.WAITING_LOCK -> {
          val afState = result.get(CaptureResult.CONTROL_AF_STATE)
          if (CaptureResult.CONTROL_AF_STATE_INACTIVE == afState ||
              CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
              CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
              // Request Auto Exposure
              runPreCapture()
          } else {
              captureStillPicture()
          }
      }
  15. focus → precapture
      fun runPreCapture() {
          state = State.WAITING_PRECAPTURE
          val builder = createPreviewRequestBuilder()
          builder?.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
              CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START)
          captureSession?.capture(builder?.build(), captureCallback, backgroundHandler)
      }
  17. focus → precapture → capture
      private fun captureStillPicture() {
          state = State.TAKEN
          val builder = cameraDevice?.createCaptureRequest(
              CameraDevice.TEMPLATE_STILL_CAPTURE)
          // Bind imageReader so that we can receive the result
          builder?.addTarget(imageReader?.surface)
          builder?.addTarget(surface)
          captureSession?.stopRepeating()
          captureSession?.capture(builder?.build(), {…}, backgroundHandler)
      }
  18. focus → precapture → capture → take picture
      fun takePicture(handler: ImageHandler) {
          imageReader?.setOnImageAvailableListener(
              object : ImageReader.OnImageAvailableListener {
                  override fun onImageAvailable(reader: ImageReader) {
                      val image = reader.acquireNextImage()
                      backgroundHandler?.post(handler.handleImage(image = image))
                  }
              }, backgroundHandler)
          lockFocus()
      }
      object : ImageHandler {
          override fun handleImage(image: Image): Runnable {
              // Save your image
          }
      }
  19. Adjusting preview size
      - aspect ratio: 1.33, width: 1080, height: 1436
      - aspect ratio: 1.22, width: 1080, height: 1436
      - aspect ratio: 1.77, width: 1000, height: 1778
  20. Steps to adjust the preview size
      • Calculate the view rotation
      • Pick a supported preview size from the camera
      • Adjust the TextureView based on the preview size
  21. Calculate the view rotation
      characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION)
      Sensor orientation
      - The angle the image needs to be rotated to appear upright on the device
      - Depends on how the sensor is mounted on the device
      e.g. the camera sensor captures a landscape frame (w x h); with SENSOR_ORIENTATION = 90,
      the image output must be rotated 90° for a portrait display
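The rotation needed to upright the image combines `SENSOR_ORIENTATION` with the current display rotation. Below is a sketch of the standard Camera2 orientation arithmetic; `requiredRotation` is an illustrative name, and `displayRotationDegrees` is assumed to be the `Surface.ROTATION_*` value already converted to degrees.

```kotlin
// Back camera: subtract the display rotation; front camera: add it
// (the front image is mirrored, so the sign flips).
fun requiredRotation(sensorOrientation: Int, displayRotationDegrees: Int,
                     frontFacing: Boolean): Int =
    if (frontFacing)
        (sensorOrientation + displayRotationDegrees) % 360
    else
        (sensorOrientation - displayRotationDegrees + 360) % 360
```

For example, a back sensor mounted at 90° needs no extra rotation once the device itself is rotated 90°.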
  22. Calculate the view rotation: the camera-device world (width x height) and the
      display world (displaySize.x, displaySize.y) have swapped axes, so pass the
      swapped values when choosing a size:
      previewSize = cameraCharacteristics.chooseOptimalSize(
          textureViewWidth = textureHeight,
          textureViewHeight = textureWidth,
          maxWidth = displaySize.y,
          maxHeight = displaySize.x,
          aspectRatio = largest)
  23. Fetch the supported sizes
      val map: StreamConfigurationMap = cameraCharacteristics.get(
          CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
      val choices = map.getOutputSizes(SurfaceTexture::class.java)
      // [4032x3024, 4000x3000, 3840x2160, 4000x2000, 3264x2448, 3200x2400,
      //  2688x1512, 2592x1944, 2048x1536, 1920x1440, 1920x1080, 1600x1200,
      //  1920x960, 1280x960, 1280x768, 1280x720, 1024x768, 800x400, 800x600,
      //  800x480, 720x480, 640x400, 640x480, 640x360, 352x288, 320x240,
      //  176x144, 160x120]
  24. Look for supported sizes: 1) smaller than a defined max size
      val choices = map.getOutputSizes(SurfaceTexture::class.java)
      for (op in choices) {
          if (op.width <= MAX_PREVIEW_WIDTH && op.height <= MAX_PREVIEW_HEIGHT) {
          }
      }
  25. Look for supported sizes: 2) pick the ones whose aspect ratio matches
      How to decide the aspect ratio of the preview? Pick the largest supported size,
      e.g. 4032 / 3024 = 1.33
      val w = aspectRatio.width  // 4032
      val h = aspectRatio.height // 3024
      for (op in choices) {
          if (op.width <= _maxWidth && op.height <= _maxHeight &&
              op.height == op.width * h / w) {
          }
      }
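The aspect-ratio filter can be tried out off-device. A minimal sketch, with a local `Size` class standing in for `android.util.Size` and `filterByAspect` as an illustrative name:

```kotlin
data class Size(val width: Int, val height: Int)

// Keep candidates within the max bounds whose aspect ratio matches `aspect`,
// using the slide's integer test: height == width * h / w.
fun filterByAspect(choices: List<Size>, aspect: Size,
                   maxWidth: Int, maxHeight: Int): List<Size> =
    choices.filter {
        it.width <= maxWidth && it.height <= maxHeight &&
            it.height == it.width * aspect.height / aspect.width
    }
```

With a 4032x3024 (4:3) target, only the 4:3 candidates under the bounds survive; 16:9 sizes like 1920x1080 are dropped.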
  27. Look for supported sizes: 3) pick the closest one you can get
      // textureWidth = 1080, textureHeight = 2080
      if (op.width >= textureWidth && op.height >= textureHeight) {
          bigEnough.add(op)
      } else {
          notBigEnough.add(op)
      }
      - If candidates are larger than the texture size, pick the smallest
      - If candidates are smaller than the texture size, pick the largest
      return when {
          bigEnough.size > 0 -> bigEnough.sortedWith(area()).first()
          notBigEnough.size > 0 -> notBigEnough.sortedWith(area()).last()
      }
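The "closest one you can get" rule is pure logic, so it can be tested without a camera. A sketch under the same assumptions as above (local `Size` class, illustrative function name):

```kotlin
data class Size(val width: Int, val height: Int) {
    val area: Long get() = width.toLong() * height
}

// Smallest candidate that still covers the texture; otherwise the largest
// candidate that does not, exactly as in the bigEnough/notBigEnough split above.
fun pickPreviewSize(candidates: List<Size>, textureWidth: Int, textureHeight: Int): Size {
    val (bigEnough, notBigEnough) = candidates.partition {
        it.width >= textureWidth && it.height >= textureHeight
    }
    return when {
        bigEnough.isNotEmpty() -> bigEnough.minByOrNull { it.area }!!
        notBigEnough.isNotEmpty() -> notBigEnough.maxByOrNull { it.area }!!
        else -> error("no preview size candidates")
    }
}
```

Picking the smallest covering size avoids scaling down a needlessly large stream; falling back to the largest non-covering size minimizes upscaling.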
  28. Adjust the TextureView size with the preview ratio
      TextureView: 1080 x 1436
      ratioWidth = 1280, ratioHeight = 960
      // aspect = ratioWidth / ratioHeight = 1.33
      override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
          val width = MeasureSpec.getSize(widthMeasureSpec)
          val height = MeasureSpec.getSize(heightMeasureSpec)
          setMeasuredDimension(width, width * ratioHeight / ratioWidth)
      }
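The `onMeasure` arithmetic is a one-liner worth checking by hand. A sketch (`measuredHeight` is an illustrative name): note that for a portrait view the preview's width/height are swapped before being set as the ratio, which is presumably how the slide's 1080-wide view ends up roughly 1440 tall.

```kotlin
// Height that keeps a view at the preview's aspect ratio
// (the setMeasuredDimension arithmetic from onMeasure above).
fun measuredHeight(width: Int, ratioWidth: Int, ratioHeight: Int): Int =
    width * ratioHeight / ratioWidth
```

With the landscape ratio as-is the view would be 1080x810; with the ratio swapped for portrait it becomes 1080x1440.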
  29. Full-screen preview (TextureView: 1080 x 2160)
      Define the aspect ratio from the window size:
      val realSize = Point()
      activity?.windowManager?.defaultDisplay?.getRealSize(realSize)
      val aspectRatio = realSize.x.toFloat() / realSize.y.toFloat() // 2.0
  30. Full-screen preview
      Define the aspect ratio from the window size, then pick the supported size
      with the closest aspect ratio:
      val realSize = Point()
      activity?.windowManager?.defaultDisplay?.getRealSize(realSize)
      val aspectRatio = realSize.x.toFloat() / realSize.y.toFloat() // 2.0
  31. Full-screen preview
      Aspect ratio = 2.0, with height < 1960 and width < 1080 as the bounds:
      val w = aspectRatio.width  // 4000
      val h = aspectRatio.height // 2000
      for (op in choices) {
          if (op.width <= _maxWidth && op.height <= _maxHeight &&
              op.height == op.width * h / w) {
          }
      }
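Selecting the supported size whose aspect ratio is closest to the display's can be sketched as a pure function (local `Size` class and `closestAspect` name are illustrative):

```kotlin
import kotlin.math.abs

data class Size(val width: Int, val height: Int)

// Of the candidates within bounds, pick the one whose aspect ratio
// is closest to the display's target ratio.
fun closestAspect(choices: List<Size>, targetRatio: Float,
                  maxWidth: Int, maxHeight: Int): Size? =
    choices.filter { it.width <= maxWidth && it.height <= maxHeight }
        .minByOrNull { abs(it.width.toFloat() / it.height - targetRatio) }
```

For a 2.0 display ratio, 4000x2000 is an exact match when allowed by the bounds; otherwise 16:9 sizes are the nearest available.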
  32. Controlling camera modes
      • Know which modes are supported on your device
      • Add parameters to the CaptureRequest.Builder and call
        CameraCaptureSession.setRepeatingRequest
  33. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
  34. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
      // list: {0, 1, 2, 3, 4}
  35. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
      // list: {0, 1, 2, 3, 4}
      https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#CONTROL_AE_AVAILABLE_MODES
  36. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
      // list: {0, 1, 2, 3, 4}
      https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.html#CONTROL_AE_MODE
  37. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
      // list: {0, 1, 2, 3, 4}
      https://developer.android.com/reference/android/hardware/camera2/CameraMetadata#CONTROL_AE_MODE_ON
  38. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      val list = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_MODES)
      // list: {0, 1, 2, 3, 4} = {OFF, ON, AUTO_FLASH, ALWAYS_FLASH, AUTO_FLASH_REDEYE}
      https://developer.android.com/reference/android/hardware/camera2/CameraMetadata#CONTROL_AE_MODE_ON
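Decoding the returned integers into readable names is a small, testable mapping. A sketch; the integer values match the public `CameraMetadata` `CONTROL_AE_MODE_*` constants, while `aeModeNames`/`describeAeModes` are illustrative helpers:

```kotlin
// Public CONTROL_AE_MODE_* values from CameraMetadata:
// OFF=0, ON=1, ON_AUTO_FLASH=2, ON_ALWAYS_FLASH=3, ON_AUTO_FLASH_REDEYE=4.
val aeModeNames = mapOf(
    0 to "OFF",
    1 to "ON",
    2 to "ON_AUTO_FLASH",
    3 to "ON_ALWAYS_FLASH",
    4 to "ON_AUTO_FLASH_REDEYE"
)

// Turn the IntArray returned by CONTROL_AE_AVAILABLE_MODES into names for logging.
fun describeAeModes(modes: IntArray): List<String> =
    modes.map { aeModeNames[it] ?: "UNKNOWN($it)" }
```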
  39. Know which modes are supported
      CameraCharacteristics.get(Key<IntArray> modes)
      - Returns CameraMetadata values
      - Use CaptureRequest (a subclass of CameraMetadata) when requesting a capture session
      // You can write it like this, but don't:
      builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
          CameraMetadata.CONTROL_AF_TRIGGER_START)
      // Do:
      builder.set(CaptureRequest.CONTROL_AF_TRIGGER,
          CaptureRequest.CONTROL_AF_TRIGGER_START)
  40. Requesting camera modes (CameraFragment)
      override fun onClick(view: View) {
          when (view.id) {
              R.id.sun -> {
                  camera?.wbMode = WBMode.SUNNY
                  openCamera(textureView.width, textureView.height) // re-request
              }
          }
      }
  41. Requesting camera modes
      CameraFragment:
      override fun onClick(view: View) {
          when (view.id) {
              R.id.sun -> {
                  camera?.wbMode = WBMode.SUNNY
                  openCamera(textureView.width, textureView.height) // re-request
              }
          }
      }
      Camera:
      val builder = cameraDevice?.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
      when (wbMode) {
          WBMode.SUNNY -> {
              builder.set(CaptureRequest.CONTROL_AWB_MODE,
                  CaptureRequest.CONTROL_AWB_MODE_DAYLIGHT)
          }
      }
      captureSession?.setRepeatingRequest(builder?.build(), captureCallback, backgroundHandler)
  42. Understand camera metering
      val rect = Rect(focusLeft, focusBottom,
          focusLeft + areaSize, focusBottom + areaSize)
      val meteringRectangle = MeteringRectangle(rect, /* weight = */ 500)
      builder.set(CaptureRequest.CONTROL_AF_REGIONS, arrayOf(meteringRectangle))
      builder.set(CaptureRequest.CONTROL_AE_REGIONS, arrayOf(meteringRectangle))
      val numOfMaxRegionsAF = cameraCharacteristics.get(
          CameraCharacteristics.CONTROL_MAX_REGIONS_AF)
      // weight range: METERING_WEIGHT_MIN (0) to METERING_WEIGHT_MAX (1000)
  43. Understand camera metering
      Image sensor: 4032 x 3024; screen: 1080 x 2160
      3024 * 500 / 1080 = 1400
      4032 * 1000 / 2160 = 1866
      val rect = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE)
      val right = rect.right
      val bottom = rect.bottom
      val sensorX = right * screenX / screenWidth
      val sensorY = bottom * screenY / screenHeight
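The screen-to-sensor mapping above is plain integer scaling. A sketch matching the slide's worked numbers: this variant swaps the axes, which is what a 90° sensor orientation requires (a portrait-screen touch maps onto the landscape sensor array); the function name and parameter names are illustrative.

```kotlin
// Map a portrait-screen touch point onto the landscape sensor's active array
// (sensor orientation 90 degrees), matching the slide's worked numbers.
fun screenToSensor(screenX: Int, screenY: Int,
                   screenWidth: Int, screenHeight: Int,
                   sensorWidth: Int, sensorHeight: Int): Pair<Int, Int> {
    val sensorX = sensorWidth * screenY / screenHeight   // 4032 * 1000 / 2160
    val sensorY = sensorHeight * screenX / screenWidth   // 3024 * 500  / 1080
    return sensorX to sensorY
}
```

A tap at (500, 1000) on a 1080x2160 screen lands at (1866, 1400) on a 4032x3024 sensor, the two values computed on the slide.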
  44. How OpenGL and the camera work
      TextureView / WindowSurface (EGLBaseSurface) / Texture[] / SurfaceTexture
      (a special Surface that interacts with OpenGL) / camera: CameraDevice
      Camera stream:
      https://github.com/google/grafika/blob/master/app/src/main/java/com/android/grafika/gles/WindowSurface.java
  45. Flow: Activity (Fragment): onResume → startBackgroundThread; TextureView:
      SurfaceTextureListener.onSurfaceTextureAvailable → openCamera → startPreview →
      setCameraOutput → configureTransform → startCamera
      Camera: set up EGL and WindowSurface (EGLBaseSurface)
      OpenGL layer: EGL is the interface between OpenGL and the native window system
  46. Flow (continued): after setting up EGL and WindowSurface, set up the Textures
      Textures render graphics onto the WindowSurface (EGLBaseSurface, Texture[])
  47. Flow (continued): set up the SurfaceTexture
      // set texture[0] as the camera texture
      GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
      GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texturesIds[0])
      previewSurfaceTexture = SurfaceTexture(texturesIds[0])
  48. Flow (continued): load the shaders
      Shaders (written in GLSL)
      - Vertex shader: renders geometry using the given texture coordinates and transform matrix
      - Fragment shader: renders the color of each pixel
  49. Flow (continued): set up EGL and WindowSurface → set up Textures →
      set up SurfaceTexture → load shaders → onRendererReady
  50. Wrap up
      • Learn the callbacks!
      • Understand that the camera sensor and the screen are different worlds
      • Write a sample camera app yourself