Smile, it's CameraX!

These days, people take more than 1 trillion photos a year, and research shows that taking photos increases the enjoyment of experiences. That is a good reason to focus on how to develop an app that takes pictures, and CameraX was one of the most popular releases at Google I/O 2019.
CameraX is a Jetpack support library built to help us make camera app development easier. In this presentation we discuss the simplified usage of this new API, which solves many bugs and pain points by offering consistency across devices.


Magda Miu

June 10, 2020

Transcript

  1. Smile, it’s CameraX! Magda Miu @magdamiu Squad Lead Developer at

    Orange, Android Google Developer Expert
  2. Birthday party 2015

  3. Christmas 2016

  4. Tančící dům 2017

  5. Pizzaaaaa

  6. It's Cocktail Time

  7. mDevCamp 2018

  8. Just happiness

  9. Discovering that the food order has 1 hour delay

  10. The last_last_last_..._last_selfie A Picture Is Worth a Thousand Words Photos

    help us to tell stories
  11. The last_last_last_..._last_selfie A Picture Is Worth a Thousand Words Photos

    help us to tell stories Self-portraits are about self-image “looking-glass self”
  12. Total photos taken yearly

  13. Digital camera vs Phone vs Tablet 87.5% Phone 10% Digital

    camera 2.5% Tablet 89.8% Phone 8.2% Digital camera 2% Tablet 90.9% Phone 7.3% Digital camera 1.8% Tablet
  14. Challenges OS flavors Platform fragmentation Camera API complexity

  15. Camera APIs 01 Legacy android.hardware.Camera 02 Camera 2 android.hardware.camera2 03

    CameraX androidx.camera
  16. Challenge Solution OS flavors Backward compatible with L+ devices

  17. Platform fragmentation Consistent behavior across devices Challenge Solution

  18. Easy to use API Camera API complexity Challenge Solution

  19. 70% fewer lines of code by using CameraX vs Camera2

  20. CameraX Lifecycle Lifecycle awareness

  21. Use-case-driven approach UseCase ImageCapture ImageAnalysis Preview
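    As a first taste of the use-case-driven approach, here is a minimal sketch (assuming the cameraProvider and cameraSelector that are set up in the following slides) of how the three use cases are attached to a single lifecycle:

    val preview = Preview.Builder().build()
    val imageCapture = ImageCapture.Builder().build()
    val imageAnalysis = ImageAnalysis.Builder().build()

    // one call binds all the use cases to the camera and to the lifecycle owner
    val camera = cameraProvider.bindToLifecycle(
        this as LifecycleOwner, cameraSelector, preview, imageCapture, imageAnalysis
    )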

  22. Preview 01 Get an image on the display

  23. Preview - implementation steps Step 1 Add gradle dependencies Step

    2 Permission handling Step 3 Add PreviewView in a layout Step 4 Get an instance of ProcessCameraProvider Step 5 Select a camera and bind the lifecycle
  24. compileOptions { sourceCompatibility JavaVersion.VERSION_1_8 targetCompatibility JavaVersion.VERSION_1_8 } Step 1: gradle

    setup
  25. def camerax = "1.0.0-beta04" implementation "androidx.camera:camera-camera2:${camerax}" implementation "androidx.camera:camera-lifecycle:${camerax}" implementation 'androidx.camera:camera-view:1.0.0-alpha11'

    implementation 'androidx.camera:camera-extensions:1.0.0-alpha11' Step 1: gradle setup
  26. <uses-feature android:name="android.hardware.camera.any" /> <uses-permission android:name="android.permission.CAMERA" /> if (areAllPermissionsGranted()) { startCamera()

    } else { ActivityCompat.requestPermissions( this, PERMISSIONS, CAMERA_REQUEST_PERMISSION_CODE) } Step 2: permission handling
  27. <uses-feature android:name="android.hardware.camera.any" /> <uses-permission android:name="android.permission.CAMERA" /> if (areAllPermissionsGranted()) { startCamera()

    } else { ActivityCompat.requestPermissions( this, PERMISSIONS, CAMERA_REQUEST_PERMISSION_CODE) } Step 2: permission handling
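    The helpers referenced on this slide are not shown in the deck; a minimal sketch of what they might look like inside the Activity (PERMISSIONS, CAMERA_REQUEST_PERMISSION_CODE, areAllPermissionsGranted() and startCamera() are the names used on the slide):

    private val PERMISSIONS = arrayOf(Manifest.permission.CAMERA)
    private val CAMERA_REQUEST_PERMISSION_CODE = 10

    // true only if every permission in PERMISSIONS has already been granted
    private fun areAllPermissionsGranted() = PERMISSIONS.all {
        ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED
    }

    override fun onRequestPermissionsResult(
        requestCode: Int, permissions: Array<String>, grantResults: IntArray
    ) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        if (requestCode == CAMERA_REQUEST_PERMISSION_CODE && areAllPermissionsGranted()) {
            startCamera()
        }
    }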
  28. <FrameLayout android:id="@+id/container" android:layout_width="match_parent" android:layout_height="match_parent"> <androidx.camera.view.PreviewView android:id="@+id/preview" android:layout_width="match_parent" android:layout_height="match_parent" > </androidx.camera.view.PreviewView>

    </FrameLayout> Step 3: add PreviewView in a layout
  29. PreviewView is a FrameLayout (a ViewGroup) that picks its implementation based on CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL: a TextureView on LEGACY devices (YES), a SurfaceView otherwise (NO)
  30. val previewView = findViewById<PreviewView>(R.id.preview) // initialize the Preview object (current

    use case) val preview = Preview.Builder().build() Step 3: add PreviewView in a layout
  31. val cameraProviderFuture = ProcessCameraProvider.getInstance(this) // used to bind the lifecycle

    of camera to the lifecycle owner val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get() // add a listener to the cameraProviderFuture val cameraExecutor = ContextCompat.getMainExecutor(this) cameraProviderFuture.addListener(Runnable {}, cameraExecutor) Step 4: instance of ProcessCameraProvider
  32. val cameraProviderFuture = ProcessCameraProvider.getInstance(this) // used to bind the lifecycle

    of camera to the lifecycle owner val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get() // add a listener to the cameraProviderFuture val cameraExecutor = ContextCompat.getMainExecutor(this) cameraProviderFuture.addListener(Runnable {}, cameraExecutor) Step 4: instance of ProcessCameraProvider
  33. val cameraProviderFuture = ProcessCameraProvider.getInstance(this) // used to bind the lifecycle

    of camera to the lifecycle owner val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get() // add a listener to the cameraProviderFuture val cameraExecutor = ContextCompat.getMainExecutor(this) cameraProviderFuture.addListener(Runnable {}, cameraExecutor) Step 4: instance of ProcessCameraProvider
  34. val backCamera = CameraSelector.LENS_FACING_BACK; val frontCamera = CameraSelector.LENS_FACING_FRONT; val cameraSelector

    = CameraSelector.Builder().requireLensFacing(backCamera).build() Step 5: select camera
  35. // unbind use cases before rebinding cameraProvider.unbindAll(); // bind the

    preview use case to camera camera = cameraProvider.bindToLifecycle( this as LifecycleOwner, cameraSelector, preview ) preview?.setSurfaceProvider( previewView.createSurfaceProvider(camera?.cameraInfo)) Step 5: bind the lifecycle (try-catch) *Runnable - try catch
  36. // unbind use cases before rebinding cameraProvider.unbindAll(); // bind the

    preview use case to camera camera = cameraProvider.bindToLifecycle( this as LifecycleOwner, cameraSelector, preview ) preview?.setSurfaceProvider( previewView.createSurfaceProvider(camera?.cameraInfo)) Step 5: bind the lifecycle (try-catch) *Runnable - try catch
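    The *Runnable - try catch note means the binding runs inside the listener's Runnable and should be wrapped in a try-catch; a minimal sketch of that pattern with the same API as above:

    cameraProviderFuture.addListener(Runnable {
        try {
            // unbind use cases before rebinding
            cameraProvider.unbindAll()
            camera = cameraProvider.bindToLifecycle(
                this as LifecycleOwner, cameraSelector, preview
            )
            preview?.setSurfaceProvider(
                previewView.createSurfaceProvider(camera?.cameraInfo)
            )
        } catch (exception: Exception) {
            // use case binding failed, e.g. the camera is unavailable
        }
    }, cameraExecutor)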
  37. val previewView = findViewById<PreviewView>(R.id.preview)

    val preview = Preview.Builder().build()
    val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
    val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
    val cameraExecutor = ContextCompat.getMainExecutor(this)
    cameraProviderFuture.addListener(Runnable {
        cameraProvider.unbindAll()
        camera = cameraProvider.bindToLifecycle(
            this as LifecycleOwner, cameraSelector, preview
        )
        preview?.setSurfaceProvider(
            previewView.createSurfaceProvider(camera?.cameraInfo))
    }, cameraExecutor) *the Runnable body is wrapped in a try-catch, as in the sketch above
  38. None
  39. Capture 02 Save high-quality images

  40. Capture - implementation steps Step 1 Create ImageCapture reference Step

    2 Add orientation event listener Step 3 Image file management Step 4 Call takePicture() Step 5 Update the call to bind lifecycle
  41. val imageCapture = ImageCapture.Builder().build() Step 1: create ImageCapture reference

  42. // to optimize photo capture for quality val captureMode =

    ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY // to optimize photo capture for latency (default) val captureMode = ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY imageCapture = ImageCapture.Builder() .setCaptureMode(captureMode) .build() Step 1: create ImageCapture reference *photo capture mode
  43. // flash will always be used when taking a picture

    val flashMode = ImageCapture.FLASH_MODE_ON // flash will never be used when taking a picture (default) val flashMode = ImageCapture.FLASH_MODE_OFF // flash will be used according to the camera system's determination val flashMode = ImageCapture.FLASH_MODE_AUTO imageCapture = ImageCapture.Builder() .setFlashMode(flashMode) .build() Step 1: create ImageCapture reference *flash mode
  44. // 16:9 standard aspect ratio val aspectRatio = AspectRatio.RATIO_16_9 //

    4:3 standard aspect ratio (default) val aspectRatio = AspectRatio.RATIO_4_3 imageCapture = ImageCapture.Builder() .setTargetAspectRatio(aspectRatio) .build() Step 1: create ImageCapture reference *aspect ratio
  45. val metrics = DisplayMetrics().also { previewView.display.getRealMetrics(it) } val screenSize =

    Size(metrics.widthPixels, metrics.heightPixels) imageCapture = ImageCapture.Builder() .setTargetResolution(screenSize) .setTargetName("CameraConference") .build() Step 1: create ImageCapture reference *target resolution and target name
  46. val orientationEventListener = object : OrientationEventListener(this as Context) { override

    fun onOrientationChanged(orientation: Int) { val rotation: Int = when (orientation) { in 45..134 -> Surface.ROTATION_270 in 135..224 -> Surface.ROTATION_180 in 225..314 -> Surface.ROTATION_90 else -> Surface.ROTATION_0 } // default => Display.getRotation() imageCapture.targetRotation = rotation } } orientationEventListener.enable() Step 2: add orientation event listener
  47. val file = File( externalMediaDirs.first(), "${System.currentTimeMillis()}.jpg" ) val outputFileOptions =

    ImageCapture.OutputFileOptions.Builder(file).build() Step 3: image file management
  48. imageCapture.takePicture(outputFileOptions, cameraExecutor,

        object : ImageCapture.OnImageSavedCallback {
            override fun onImageSaved(outputFileResult: ImageCapture.OutputFileResults) {
                // yey!!! :)
            }
            override fun onError(exception: ImageCaptureException) {
                // ohhh!!! :(
            }
        }) Step 4: call takePicture()
  49. // bind the image capture use case to camera camera

    = cameraProvider.bindToLifecycle( this as LifecycleOwner, cameraSelector, preview, imageCapture ) Step 5: update the call to bind lifecycle
  50. takePicture implementation options: takePicture(Executor, OnImageCapturedCallback) delivers an in-memory

    buffer of the captured image; takePicture(OutputFileOptions, Executor, OnImageSavedCallback) saves the captured image to the provided file location
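    The in-memory variant is not shown anywhere else in the deck; a minimal sketch of it (what you do with the ImageProxy is up to you):

    imageCapture.takePicture(cameraExecutor,
        object : ImageCapture.OnImageCapturedCallback() {
            override fun onCaptureSuccess(image: ImageProxy) {
                // the captured frame is delivered as an in-memory ImageProxy
                // ... process it here ...
                image.close()
            }

            override fun onError(exception: ImageCaptureException) {
                // capture failed
            }
        })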
  51. val imageCapture = ImageCapture.Builder().build()

    val file = File(externalMediaDirs.first(), name)
    val output = ImageCapture.OutputFileOptions.Builder(file).build()
    imageCapture.takePicture(output, executor,
        object : ImageCapture.OnImageSavedCallback {
            override fun onImageSaved(ofr: ImageCapture.OutputFileResults) {
                // yey!!! :)
            }
            override fun onError(exception: ImageCaptureException) {
                // ohhh!!! :(
            }
        })
    camera = cameraProvider.bindToLifecycle(
        this as LifecycleOwner, cameraSelector, preview, imageCapture
    )
  52. None
  53. Analysis 03 CPU-accessible image for image processing, computer vision, ML

  54. Analysis - implementation steps Step 1 Create ImageAnalysis reference Step

    2 Define a custom analyser Step 3 Implement analyze() method Step 4 Set the custom analyser Step 5 Update the call to bind lifecycle
  55. val imageAnalysis = ImageAnalysis.Builder().build() Step 1: create ImageAnalysis reference

  56. // the executor receives the frames sequentially val blocking =

    ImageAnalysis.STRATEGY_BLOCK_PRODUCER // the executor receives the last available frame (default) val nonBlocking = ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST val imageAnalysis = ImageAnalysis.Builder() .setBackpressureStrategy(nonBlocking) .build() Step 1: create ImageAnalysis reference
  57. Image analysis working modes

    blocking mode: setBackpressureStrategy(ImageAnalysis.STRATEGY_BLOCK_PRODUCER) • The executor receives the frames sequentially • We could use getImageQueueDepth() to return the number of images available in the pipeline, including the image currently analysed • If analyze() takes longer than the latency of a single frame at the current framerate => new frames are blocked until the method returns
    non-blocking mode: setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST) • The default strategy • The executor receives the last available frame • If analyze() takes longer than the latency of a single frame at the current framerate => some frames might be skipped and the method will get the last frame available in the camera pipeline
  58. ImageAnalysis.Builder methods setTargetAspectRatio(int aspectRatio) • aspectRatio could be RATIO_16_9 or

    RATIO_4_3 • Default = RATIO_4_3 setTargetName(String targetName) • Debug purpose • Default = class canonical name and random UUID setTargetResolution(Size resolution) • Set the resolution of the intended target • Default = 640x480 setTargetRotation(int rotation) • Set the rotation of the intended target • Default = Display.getRotation()
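    A minimal sketch that combines the builder methods listed above (the target name is an arbitrary example; note that target aspect ratio and target resolution cannot be set on the same builder, so only the aspect ratio is used here):

    val imageAnalysis = ImageAnalysis.Builder()
        .setTargetAspectRatio(AspectRatio.RATIO_16_9)
        .setTargetRotation(previewView.display.rotation)
        .setTargetName("PurpleAnalysis")
        .build()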
  59. class PurpleColorAnalyser() : ImageAnalysis.Analyzer { override fun analyze(image: ImageProxy) {

    TODO("Not yet implemented") } } Step 2: define a custom analyser
  60. private var lastAnalyzedTimestamp = 0L override fun analyze(image: ImageProxy) {

    val timestamp = System.currentTimeMillis() val oneSecond = TimeUnit.SECONDS.toMillis(1) if (elapsedOneSecond(timestamp, oneSecond)) { val buffer = image.planes[0].buffer val data = buffer.toByteArray() val pixels = data.map { it.toInt() and 0x9370DB } val averagePurplePixels = pixels.average() lastAnalyzedTimestamp = timestamp } image.close() } Step 3: implement analyze() method
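    The toByteArray() and elapsedOneSecond() helpers used above are not defined in the deck; a minimal sketch of what they might look like:

    // copies the remaining bytes of a java.nio.ByteBuffer plane into a ByteArray
    private fun ByteBuffer.toByteArray(): ByteArray {
        rewind()
        val data = ByteArray(remaining())
        get(data)
        return data
    }

    // true if at least one second has passed since the last analysed frame
    private fun elapsedOneSecond(timestamp: Long, oneSecond: Long) =
        timestamp - lastAnalyzedTimestamp >= oneSecond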
  61. imageAnalysis.setAnalyzer(executor, PurpleColorAnalyser()) Step 4: set the custom analyser

  62. // bind the image analysis use case to camera camera

    = cameraProvider.bindToLifecycle( this as LifecycleOwner, cameraSelector, imageAnalysis, preview ) Step 5: update the call to bind lifecycle
  63. val imageAnalysis = ImageAnalysis.Builder()

        .setBackpressureStrategy(nonBlocking)
        .build()
    val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
    val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
    val executor = ContextCompat.getMainExecutor(this)
    imageAnalysis.setAnalyzer(executor, PurpleColorAnalyser())
    camera = cameraProvider.bindToLifecycle(
        this as LifecycleOwner, cameraSelector, imageAnalysis, preview
    )
  64. None
  65. Image format CameraX produces images in YUV_420_888 format.

  66. YUV color encoding This scheme assigns both brightness and color

    values to each pixel.
  67. RGB vs YUV RGB => R = red G =

    green B = blue YUV => Y = Luminance U = Chrominance of blue V = Chrominance of red YUV = YCbCr • Luminance = refers to the brightness of the pixel = Y (grayscale image) • Chrominance = refers to the color = UV
  68. RGB to YUV • Y = 0.299R + 0.587G +

    0.114B • U = 0.492(B - Y) => blueness of the pixel • V = 0.877(R - Y) => redness of the pixel
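    A small sketch that applies the RGB-to-YUV coefficients from this slide to a single pixel, just to make the formulas concrete:

    data class Yuv(val y: Double, val u: Double, val v: Double)

    fun rgbToYuv(r: Double, g: Double, b: Double): Yuv {
        val y = 0.299 * r + 0.587 * g + 0.114 * b   // luminance (brightness)
        val u = 0.492 * (b - y)                     // blueness of the pixel
        val v = 0.877 * (r - y)                     // redness of the pixel
        return Yuv(y, u, v)
    }

    // sanity check: pure white (255, 255, 255) maps to y ≈ 255 and u ≈ v ≈ 0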
  69. YUV to RGB • R = 1.164 * Y +

    1.596 * V • G = 1.164 * Y - 0.392 * U - 0.813 * V • B = 1.164 * Y + 2.017 * U
  70. None
  71. Chroma Subsampling: the human eye sees brightness (luma) well and color (chroma) not so well

  72. 4:4:4 4:2:2 4:2:0

  73. None
  74. 4:4:4

  75. 4:4:4

  76. 4:4:4

  77. 4:4:4

  78. 4:4:4

  79. 4:4:4

  80. 4:2:2

  81. 4:2:2

  82. 4:2:2

  83. 4:2:2

  84. 4:2:2

  85. 4:2:2

  86. 4:2:0

  87. 4:2:0

  88. 4:2:0

  89. 4:2:0

  90. 4:2:0

  91. Chroma Subsampling 4:4:4 4:2:2 4:2:0 Luma Chroma Luma + Chroma

  92. Data savings 50%
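    The 50% figure can be checked with simple counting (a worked example, not from the slides): for a 2×2 block of pixels, 4:4:4 stores 4 Y + 4 U + 4 V = 12 samples, while 4:2:0 stores 4 Y + 1 U + 1 V = 6 samples, i.e. half the data.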

  93. Camera Controls cancelFocusAndMetering() • Cancels current FocusMeteringAction and clears AF/AE/AWB

    regions • automatic focus (AF), automatic exposure (AE), and automatic white-balance (AWB) enableTorch(torch: Boolean) • Enable the torch or disable the torch. setLinearZoom(@FloatRange(0.0, 1.0) linearZoom: Float) • Sets current zoom by a linear zoom value ranging from 0f to 1f. setZoomRatio(ratio: Float) • Sets current zoom by ratio. startFocusAndMetering(@NonNull action: FocusMeteringAction) • Starts a focus and metering action configured by the FocusMeteringAction.
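    A minimal sketch of driving these controls (the focus point coordinates are arbitrary example values, and the FocusMeteringAction.Builder constructor is the one available in recent CameraX versions):

    val cameraControl = camera.cameraControl

    // zoom to the middle of the supported range (0f = min zoom, 1f = max zoom)
    cameraControl.setLinearZoom(0.5f)

    // turn the torch on
    cameraControl.enableTorch(true)

    // tap-to-focus style metering on a normalized point
    val factory = SurfaceOrientedMeteringPointFactory(1f, 1f)
    val point = factory.createPoint(0.5f, 0.5f)
    cameraControl.startFocusAndMetering(FocusMeteringAction.Builder(point).build())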
  94. val cameraControl = camera.cameraControl val cameraInfo = camera.cameraInfo cameraInfo.torchState.observe(this, Observer

    { state -> if (state == TorchState.ON) { // state on } else { // state off } })
  95. None
  96. CameraView: a FrameLayout (ViewGroup) • A view that displays a

    preview of the camera with methods like: ◦ takePicture(ImageCapture.OutputFileOptions, Executor, OnImageSavedCallback) ◦ startRecording(File, Executor, OnVideoSavedCallback) ◦ stopRecording() • Must be opened/closed, since it consumes a high amount of power, and these actions can be handled by using bindToLifecycle • A CaptureMode is used to set up the capture mode ◦ IMAGE ◦ MIXED ◦ VIDEO
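    A minimal sketch of using CameraView with the methods named above (assuming the alpha androidx.camera.view.CameraView API from the time of the talk; the layout id is an example):

    val cameraView = findViewById<CameraView>(R.id.camera_view)

    // opening/closing the camera is handled by tying the view to the lifecycle
    cameraView.bindToLifecycle(this as LifecycleOwner)
    cameraView.captureMode = CameraView.CaptureMode.IMAGE

    val file = File(externalMediaDirs.first(), "${System.currentTimeMillis()}.jpg")
    cameraView.takePicture(
        ImageCapture.OutputFileOptions.Builder(file).build(),
        ContextCompat.getMainExecutor(this),
        object : ImageCapture.OnImageSavedCallback {
            override fun onImageSaved(outputFileResults: ImageCapture.OutputFileResults) {
                // image saved
            }
            override fun onError(exception: ImageCaptureException) {
                // capture failed
            }
        }
    )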
  97. CameraX lab-tested devices

  98. Extensions 04 Dedicated API for optional effects like HDR, portrait,

    night-mode
  99. Extensions architecture Image source: https://developer.android.com/training/camerax/vendor-extensions

  100. ImageCaptureExtender +enableExtension(cameraSelector) +isExtensionAvailable(cameraSelector) BeautyImageCaptureExtender NightImageCaptureExtender AutoImageCaptureExtender BokehImageCaptureExtender HdrImageCaptureExtender

  101. What is bokeh? The bokeh effect is produced when the

    foreground and/or background is intentionally blurred around a subject. Bokeh means "blur" in Japanese.
  102. Extensions - implementation steps Step 1 Create an Extender object

    Step 2 Enable the extension
  103. val builder = ImageCapture.Builder() val beautyExtender = BeautyImageCaptureExtender.create(builder) Step 1:

    create an Extender object
  104. if (beautyExtender.isExtensionAvailable(cameraSelector)) { beautyExtender.enableExtension(cameraSelector) } Step 2: enable the extension
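    Putting the two steps together, a minimal sketch (reusing the cameraProvider, cameraSelector and preview from the earlier slides):

    val builder = ImageCapture.Builder()
    val beautyExtender = BeautyImageCaptureExtender.create(builder)

    // enable the effect only if the vendor implementation supports it
    if (beautyExtender.isExtensionAvailable(cameraSelector)) {
        beautyExtender.enableExtension(cameraSelector)
    }

    // the extender configured the builder, so build the use case afterwards
    val imageCapture = builder.build()
    camera = cameraProvider.bindToLifecycle(
        this as LifecycleOwner, cameraSelector, preview, imageCapture
    )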

  105. CameraX recap Backward compatible with L+ devices Consistent behavior across

    devices Easy to use API Lifecycle awareness Use-case driven approach
  106. Learn more... • Official documentation ◦ CameraX overview • Medium

    Articles @androiddevelopers ◦ Core principles behind CameraX Jetpack Library - Android Developers ◦ Android’s CameraX Jetpack Library is now in Beta! • Podcast ◦ https://androidbackstage.blogspot.com/2019/06/episode-116-camerax.html • Check latest updates about CameraX ◦ https://developer.android.com/jetpack/androidx/releases/camera • Lab-tested devices ◦ https://developer.android.com/training/camerax/devices
  107. Thanks! magdamiu.com @magdamiu