
Smile, it's CameraX!

These days, humans take more than 1 trillion photos a year, and there is research showing that taking photos increases the enjoyment of experiences. That is a good reason to look at how to develop an app that takes pictures, and CameraX was one of the most popular releases at Google I/O 2019.
CameraX is a Jetpack support library built to make camera app development easier. In this presentation we discuss the simplified usage of this new API, which solves many bugs and pain points by offering consistency across devices.

Magda Miu

June 10, 2020
Transcript

  1. The last_last_last_..._last_selfie
     A Picture Is Worth a Thousand Words
     Photos help us to tell stories
     Self-portraits are about self-image: the “looking-glass self”
  2. Digital camera vs Phone vs Tablet
     87.5% Phone | 10% Digital camera | 2.5% Tablet
     89.8% Phone | 8.2% Digital camera | 2% Tablet
     90.9% Phone | 7.3% Digital camera | 1.8% Tablet
  3. Preview - implementation steps
     Step 1: Add Gradle dependencies
     Step 2: Permission handling
     Step 3: Add PreviewView in a layout
     Step 4: Get an instance of ProcessCameraProvider
     Step 5: Select a camera and bind the lifecycle
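Step 1 (the Gradle dependencies) has no snippet in the deck; a minimal sketch, assuming the beta artifacts current around the time of this talk (the exact version strings are assumptions — check the CameraX releases page for current ones):

```groovy
// Module-level build.gradle — CameraX core, the Camera2 implementation,
// lifecycle integration, and the view artifact for PreviewView/CameraView.
def camerax_version = "1.0.0-beta03"   // assumed version, verify before use
implementation "androidx.camera:camera-core:$camerax_version"
implementation "androidx.camera:camera-camera2:$camerax_version"
implementation "androidx.camera:camera-lifecycle:$camerax_version"
implementation "androidx.camera:camera-view:1.0.0-alpha10"
```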
  4. <uses-feature android:name="android.hardware.camera.any" />
     <uses-permission android:name="android.permission.CAMERA" />

     if (areAllPermissionsGranted()) {
         startCamera()
     } else {
         ActivityCompat.requestPermissions(
             this, PERMISSIONS, CAMERA_REQUEST_PERMISSION_CODE)
     }
     Step 2: permission handling
  6. val previewView = findViewById<PreviewView>(R.id.preview)

     // initialize the Preview object (current use case)
     val preview = Preview.Builder().build()
     Step 3: add PreviewView in a layout
  7. val cameraProviderFuture = ProcessCameraProvider.getInstance(this)

     // used to bind the lifecycle of the camera to the lifecycle owner
     val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()

     // add a listener to the cameraProviderFuture
     val cameraExecutor = ContextCompat.getMainExecutor(this)
     cameraProviderFuture.addListener(Runnable {}, cameraExecutor)
     Step 4: instance of ProcessCameraProvider
  10. val backCamera = CameraSelector.LENS_FACING_BACK
      val frontCamera = CameraSelector.LENS_FACING_FRONT
      val cameraSelector = CameraSelector.Builder()
          .requireLensFacing(backCamera)
          .build()
      Step 5: select camera
  11. // unbind use cases before rebinding
      cameraProvider.unbindAll()

      // bind the preview use case to the camera
      camera = cameraProvider.bindToLifecycle(
          this as LifecycleOwner, cameraSelector, preview
      )
      preview?.setSurfaceProvider(
          previewView.createSurfaceProvider(camera?.cameraInfo))
      Step 5: bind the lifecycle (inside the Runnable, wrapped in try-catch)
  13. val previewView = findViewById<PreviewView>(R.id.preview)
      val preview = Preview.Builder().build()
      val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
      val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
      val cameraExecutor = ContextCompat.getMainExecutor(this)
      cameraProviderFuture.addListener(Runnable {
          try {
              cameraProvider.unbindAll()
              camera = cameraProvider.bindToLifecycle(
                  this as LifecycleOwner, cameraSelector, preview
              )
              preview?.setSurfaceProvider(
                  previewView.createSurfaceProvider(camera?.cameraInfo))
          } catch (e: Exception) {
              // handle binding failures, e.g. log the exception
          }
      }, cameraExecutor)
  14. Capture - implementation steps
      Step 1: Create ImageCapture reference
      Step 2: Add orientation event listener
      Step 3: Image file management
      Step 4: Call takePicture()
      Step 5: Update the call to bind lifecycle
  15. // to optimize photo capture for quality
      val captureMode = ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY

      // to optimize photo capture for latency (default)
      val captureMode = ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY

      imageCapture = ImageCapture.Builder()
          .setCaptureMode(captureMode)
          .build()
      Step 1: create ImageCapture reference (photo capture mode)
  16. // flash will always be used when taking a picture
      val flashMode = ImageCapture.FLASH_MODE_ON

      // flash will never be used when taking a picture (default)
      val flashMode = ImageCapture.FLASH_MODE_OFF

      // flash will be used according to the camera system's determination
      val flashMode = ImageCapture.FLASH_MODE_AUTO

      imageCapture = ImageCapture.Builder()
          .setFlashMode(flashMode)
          .build()
      Step 1: create ImageCapture reference (flash mode)
  17. // 16:9 standard aspect ratio
      val aspectRatio = AspectRatio.RATIO_16_9

      // 4:3 standard aspect ratio (default)
      val aspectRatio = AspectRatio.RATIO_4_3

      imageCapture = ImageCapture.Builder()
          .setTargetAspectRatio(aspectRatio)
          .build()
      Step 1: create ImageCapture reference (aspect ratio)
  18. val metrics = DisplayMetrics().also { previewView.display.getRealMetrics(it) }
      val screenSize = Size(metrics.widthPixels, metrics.heightPixels)

      imageCapture = ImageCapture.Builder()
          .setTargetResolution(screenSize)
          .setTargetName("CameraConference")
          .build()
      Step 1: create ImageCapture reference (target resolution and target name)
  19. val orientationEventListener = object : OrientationEventListener(this as Context) {
          override fun onOrientationChanged(orientation: Int) {
              val rotation: Int = when (orientation) {
                  in 45..134 -> Surface.ROTATION_270
                  in 135..224 -> Surface.ROTATION_180
                  in 225..314 -> Surface.ROTATION_90
                  else -> Surface.ROTATION_0
              } // default => Display.getRotation()
              imageCapture.targetRotation = rotation
          }
      }
      orientationEventListener.enable()
      Step 2: add orientation event listener
  20. val file = File(
          externalMediaDirs.first(),
          "${System.currentTimeMillis()}.jpg"
      )
      val outputFileOptions = ImageCapture.OutputFileOptions.Builder(file).build()
      Step 3: image file management
  21. imageCapture.takePicture(outputFileOptions, cameraExecutor,
          object : ImageCapture.OnImageSavedCallback {
              override fun onImageSaved(outputFileResult: ImageCapture.OutputFileResults) {
                  // yey!!! :)
              }
              override fun onError(exception: ImageCaptureException) {
                  // ohhh!!! :(
              }
          })
      Step 4: call takePicture()
  22. // bind the image capture use case to the camera
      camera = cameraProvider.bindToLifecycle(
          this as LifecycleOwner, cameraSelector, preview, imageCapture
      )
      Step 5: update the call to bind lifecycle
  23. val imageCapture = ImageCapture.Builder().build()
      val file = File(externalMediaDirs.first(), name)
      val output = ImageCapture.OutputFileOptions.Builder(file).build()
      imageCapture.takePicture(output, executor,
          object : ImageCapture.OnImageSavedCallback {
              override fun onImageSaved(ofr: ImageCapture.OutputFileResults) {
                  // yey!!! :)
              }
              override fun onError(exception: ImageCaptureException) {
                  // ohhh!!! :(
              }
          })
      camera = cameraProvider.bindToLifecycle(
          this as LifecycleOwner, cameraSelector, preview, imageCapture
      )
  24. Analysis - implementation steps
      Step 1: Create ImageAnalysis reference
      Step 2: Define a custom analyser
      Step 3: Implement analyze() method
      Step 4: Set the custom analyser
      Step 5: Update the call to bind lifecycle
  25. // the executor receives the frames sequentially
      val blocking = ImageAnalysis.STRATEGY_BLOCK_PRODUCER

      // the executor receives the last available frame (default)
      val nonBlocking = ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST

      val imageAnalysis = ImageAnalysis.Builder()
          .setBackpressureStrategy(nonBlocking)
          .build()
      Step 1: create ImageAnalysis reference
  26. Image analysis working modes
      setBackpressureStrategy(ImageAnalysis.STRATEGY_BLOCK_PRODUCER) (blocking mode)
      • The executor receives the frames sequentially
      • We could use getImageQueueDepth() to return the number of images available in the pipeline, including the image currently being analysed
      • If analyze() takes longer than the latency of a single frame at the current framerate => new frames are blocked until the method returns
      setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST) (non-blocking mode)
      • The default strategy
      • The executor receives the last available frame
      • If analyze() takes longer than the latency of a single frame at the current framerate => some frames might be skipped and the method will get the last frame available in the camera pipeline
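The non-blocking strategy above can be sketched in plain Kotlin. This is an illustration of the idea only, not the CameraX internals, and the class name is mine:

```kotlin
import java.util.concurrent.atomic.AtomicReference

// Sketch of the STRATEGY_KEEP_ONLY_LATEST idea: the camera side keeps
// overwriting a single slot, so a slow analyzer never queues up old frames;
// it simply misses them and always works on the newest one.
class LatestFrameHolder<T> {
    private val latest = AtomicReference<T?>(null)

    // Called by the producer for every frame; older undelivered frames are dropped.
    fun onNewFrame(frame: T) = latest.set(frame)

    // Called by the analyzer when ready; returns null if nothing new arrived.
    fun takeLatest(): T? = latest.getAndSet(null)
}
```

With STRATEGY_BLOCK_PRODUCER the producer would instead wait until the analyzer drains the queue, which is why that mode can stall the camera pipeline.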
  27. ImageAnalysis.Builder methods
      setTargetAspectRatio(int aspectRatio)
      • aspectRatio could be RATIO_16_9 or RATIO_4_3
      • Default = RATIO_4_3
      setTargetName(String targetName)
      • Debug purpose
      • Default = class canonical name and random UUID
      setTargetResolution(Size resolution)
      • Set the resolution of the intended target
      • Default = 640x480
      setTargetRotation(int rotation)
      • Set the rotation of the intended target
      • Default = Display.getRotation()
  28. class PurpleColorAnalyser : ImageAnalysis.Analyzer {
          override fun analyze(image: ImageProxy) {
              TODO("Not yet implemented")
          }
      }
      Step 2: define a custom analyser
  29. private var lastAnalyzedTimestamp = 0L

      override fun analyze(image: ImageProxy) {
          val timestamp = System.currentTimeMillis()
          val oneSecond = TimeUnit.SECONDS.toMillis(1)
          if (elapsedOneSecond(timestamp, oneSecond)) {
              val buffer = image.planes[0].buffer
              val data = buffer.toByteArray()
              val pixels = data.map { it.toInt() and 0x9370DB }
              val averagePurplePixels = pixels.average()
              lastAnalyzedTimestamp = timestamp
          }
          image.close()
      }
      Step 3: implement analyze() method
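The snippet above relies on two helpers that are not shown in the deck. A plausible sketch of both (the implementations are my assumption; only their call sites appear on the slide):

```kotlin
import java.nio.ByteBuffer

// Assumed state used by elapsedOneSecond(); mirrors the slide's property.
private var lastAnalyzedTimestamp = 0L

// True when at least `interval` ms have passed since the last analysed frame.
fun elapsedOneSecond(timestamp: Long, interval: Long): Boolean =
    timestamp - lastAnalyzedTimestamp >= interval

// Copies the remaining bytes of the plane buffer into a ByteArray.
fun ByteBuffer.toByteArray(): ByteArray {
    rewind()                          // read from the start of the buffer
    val data = ByteArray(remaining())
    get(data)                         // bulk copy into the array
    return data
}
```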
  30. // bind the image analysis use case to the camera
      camera = cameraProvider.bindToLifecycle(
          this as LifecycleOwner, cameraSelector, imageAnalysis, preview
      )
      Step 5: update the call to bind lifecycle
  31. val imageAnalysis = ImageAnalysis.Builder()
          .setBackpressureStrategy(nonBlocking)
          .build()
      val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
      val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
      val executor = ContextCompat.getMainExecutor(this)
      imageAnalysis.setAnalyzer(executor, PurpleColorAnalyser())
      camera = cameraProvider.bindToLifecycle(
          this as LifecycleOwner, cameraSelector, imageAnalysis, preview
      )
  32. RGB vs YUV
      RGB => R = red, G = green, B = blue
      YUV => Y = luminance, U = chrominance of blue, V = chrominance of red
      YUV = YCbCr
      • Luminance refers to the brightness of the pixel = Y (grayscale image)
      • Chrominance refers to the color = UV
  33. RGB to YUV
      • Y = 0.299R + 0.587G + 0.114B
      • U = 0.492(B - Y) => blueness of the pixel
      • V = 0.877(R - Y) => redness of the pixel
  34. YUV to RGB
      • R = 1.164 * Y + 1.596 * V
      • G = 1.164 * Y - 0.392 * U - 0.813 * V
      • B = 1.164 * Y + 2.017 * U
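The RGB-to-YUV formulas from slide 33 translate directly into code; a quick sketch in plain Kotlin (the function name is mine). Note that the two slides use different coefficient conventions (full-range analog YUV versus video-range BT.601 YCbCr), so the functions on the two slides are not exact inverses of each other:

```kotlin
// RGB -> YUV using the full-range coefficients from the slide above.
// Inputs are expected in 0.0..255.0.
fun rgbToYuv(r: Double, g: Double, b: Double): Triple<Double, Double, Double> {
    val y = 0.299 * r + 0.587 * g + 0.114 * b   // luminance (brightness)
    val u = 0.492 * (b - y)                      // blueness of the pixel
    val v = 0.877 * (r - y)                      // redness of the pixel
    return Triple(y, u, v)
}
```

For pure white (255, 255, 255) the luminance coefficients sum to 1, so Y = 255 and U = V = 0: a bright pixel carrying no color information.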
  35. Camera Controls
      cancelFocusAndMetering()
      • Cancels the current FocusMeteringAction and clears the AF/AE/AWB regions
      • automatic focus (AF), automatic exposure (AE), and automatic white-balance (AWB)
      enableTorch(torch: Boolean)
      • Enables or disables the torch.
      setLinearZoom(@FloatRange(0.0, 1.0) linearZoom: Float)
      • Sets the current zoom by a linear zoom value ranging from 0f to 1f.
      setZoomRatio(ratio: Float)
      • Sets the current zoom by ratio.
      startFocusAndMetering(@NonNull action: FocusMeteringAction)
      • Starts a focus and metering action configured by the FocusMeteringAction.
  36. val cameraControl = camera.cameraControl
      val cameraInfo = camera.cameraInfo

      cameraInfo.torchState.observe(this, Observer { state ->
          if (state == TorchState.ON) {
              // torch is on
          } else {
              // torch is off
          }
      })
  37. CameraView (extends FrameLayout, which extends ViewGroup)
      • A view that displays a preview of the camera, with methods like:
        ◦ takePicture(ImageCapture.OutputFileOptions, Executor, OnImageSavedCallback)
        ◦ startRecording(File, Executor, OnVideoSavedCallback)
        ◦ stopRecording()
      • Must be opened/closed, since it consumes a high amount of power; these actions can be handled by using bindToLifecycle
      • A capture mode is used to set up what the view records:
        ◦ IMAGE
        ◦ MIXED
        ◦ VIDEO
  38. What is bokeh?
      The bokeh effect is produced when the foreground and/or background is intentionally blurred around a subject. Bokeh means "blur" in Japanese.
  39. CameraX recap
      • Backward compatible with L+ (API 21+) devices
      • Consistent behavior across devices
      • Easy to use API
      • Lifecycle awareness
      • Use-case driven approach
  40. Learn more...
      • Official documentation
        ◦ CameraX overview
      • Medium articles @androiddevelopers
        ◦ Core principles behind CameraX Jetpack Library - Android Developers
        ◦ Android’s CameraX Jetpack Library is now in Beta!
      • Podcast
        ◦ https://androidbackstage.blogspot.com/2019/06/episode-116-camerax.html
      • Check latest updates about CameraX
        ◦ https://developer.android.com/jetpack/androidx/releases/camera
      • Lab-tested devices
        ◦ https://developer.android.com/training/camerax/devices