VIDEO EDITING ON ANDROID

There is nothing better than sharing visual content in the sleekest form possible, which is why more and more apps now need offline video editing features. Ever tried synchronizing FFmpeg and ExoPlayer on Android? If not, why should you?

Michal Jenicek

January 23, 2020

Transcript

  1. Michal Jenicek, Software Engineer at STRV
    VIDEO EDITING
    I would like to share our experience with video editing on Android.

  2. 2
    We have been working on a project called Opkix.

    Opkix is an IoT device that looks like an egg, with two cameras.

  3. 3
    The Opkix device has its own application for firmware and video management.

    You manage the egg over WiFi and BLE.

  4. 4
    The user also wants the ability to combine clips and create a final movie.

  5. 5
    01
    CUSTOM VIDEO
    EDITING FEATURES


  6. 6
    WYSIWYG
    • Visible result of every step
    Real-Time
    • Don’t wait for each step processing
    Offline
    • Use just WiFi & BLE
    FEATURE REQUIREMENTS
    WYSIWYG - what you see is what you get. The user knows how each edit step affects the video.

    REAL-TIME - see the result immediately (don't wait for the real video processing).

    OFFLINE - the app communicates with the egg only via WiFi or BLE.

  7. 7
    EXPORTABLE
    • Ability to export final movie
    • Export shouldn’t block the app
    FEATURE REQUIREMENTS


  8. 8
    • Media object is defining all edits.
    • View layer represents media object in real time.
    • Export layer exports final movie.
    IMPLEMENTATION STRATEGY
    We know the requirements, so let's define a clear implementation strategy.

    Media object - edits can be persisted easily, without any need to touch the source video file.

  9. 9
    iOS
    • Platform native media object
    IMPLEMENTATION STRATEGY
    iOS - a media object whose behaviour can be adjusted.

  10. 10
    iOS
    • Platform native media object
    • Platform native view layer
    IMPLEMENTATION STRATEGY
    iOS - a media object that can be adjusted and played.

  11. 11
    iOS
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    IMPLEMENTATION STRATEGY
    iOS - a media object that can be adjusted, played, and also exported.

  12. 12
    iOS
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    Android
    • Platform native media object
    IMPLEMENTATION STRATEGY ANDROID REALITY
    iOS - a media object that can be adjusted, played, and also exported.

  13. 13
    iOS
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    Android
    • Platform native media object
    • Platform native view layer
    IMPLEMENTATION STRATEGY ANDROID REALITY
    iOS - a media object that can be adjusted, played, and also exported.

  14. 14
    iOS
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    Android
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    IMPLEMENTATION STRATEGY ANDROID REALITY
    iOS - a media object that can be adjusted, played, and also exported.

  15. 15
    iOS
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    Android
    • Platform native media object
    • Platform native view layer
    • Platform native export layer
    • Fake them all!
    IMPLEMENTATION STRATEGY ANDROID REALITY
    iOS - a media object that can be adjusted, played, and also exported.

    Android - the framework doesn't provide these layers, so we have to fake them ourselves.

  16. • MediaPlayer (view layer)
    16
    CHOOSE THE FAKE TOOLS
    MediaPlayer
    MediaPlayer is simply a black box, so we didn't pick it.

  17. ❌ MediaPlayer
    • Android Media API (Extractor, Muxer, …)
    17
    CHOOSE THE FAKE TOOLS
    android.media
    MediaPlayer
    android.media
    The Android Media API is complicated and lacks good samples.

    We had tried android.media for a different task and decided not to use it this time.

  18. ❌ MediaPlayer
    ❌ Android Media API (Extractor, Muxer, …)
    • Commercial edit libraries (export layer)
    18
    CHOOSE THE FAKE TOOLS
    android.media
    MediaPlayer
    We didn’t want to use any paid solutions.


  19. ✅ Kotlin data class (media object)
    ❌ MediaPlayer
    ❌ Android Media API (Extractor, Muxer, …)
    ❌ Commercial edit libraries
    19
    CHOOSE THE FAKE TOOLS
    android.media
    MediaPlayer

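    As a minimal sketch, the Kotlin data class media object could look roughly like this - the field names mirror the ones used in the later snippets, while the real Opkix model is richer:

    data class VideoItem(
        val videoPath: String,
        val startPositionInMillis: Long,
        val endPositionInMillis: Long,
        val zoom: Float = 1f,
        val rotation: Float = 0f,
        val translationX: Float = 0f,
        val translationY: Float = 0f,
        val speedMultiplier: Float = 1f,
        val volumeMultiplier: Float = 1f,
        val effect: VideoEffectType = VideoEffectType.NONE
    )

    data class EditedMovie(
        val timelineList: List<VideoItem>,
        val backgroundMusicPath: String? = null
    )

    Every edit step just copies this immutable object, so it can be persisted, rendered by the view layer, and translated into an FFmpeg command by the export layer.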

  20. ✅ Kotlin data class (media object)
    ✅ ExoPlayer (view layer)
    ❌ MediaPlayer
    ❌ Android Media API (Extractor, Muxer, …)
    ❌ Commercial edit libraries
    20
    CHOOSE THE FAKE TOOLS
    android.media
    MediaPlayer
    ExoPlayer
    An open-source library by Google.

    Good documentation, including samples and plenty of solutions on Stack Overflow.

  21. ✅ Kotlin data class (media object)
    ✅ ExoPlayer (view layer)
    ✅ FFmpeg (export layer)
    ❌ MediaPlayer
    ❌ Android Media API (Extractor, Muxer, …)
    ❌ Commercial edit libraries
    21
    CHOOSE THE FAKE TOOLS
    android.media
    MediaPlayer
    FFmpeg
    ExoPlayer
    A powerful open-source library with a broad community.

    It is usable on Android as well.

  22. • ExoPlayer (view layer)
    • FFmpeg (export layer)
    22
    FAKE TOOLS
    FFmpeg
    ExoPlayer
    ExoPlayer usage is straightforward.

  23. • ExoPlayer (view layer)
    • FFmpeg (export layer)
    23
    FAKE TOOLS
    FFmpeg
    ExoPlayer
    FFmpeg is not targeted at Android, so its usage there is less obvious.

    Let's talk about it a bit.

  24. 24
    02
    FFMPEG
    ON ANDROID


  25. 25
    • Audio/Video processing library
    FFMPEG IS


  26. 26
    • Audio/Video processing library
    • C language
    FFMPEG IS


  27. 27
    • Audio/Video processing library
    • C language
    • Opensource
    FFMPEG IS


  28. 28
    • Audio/Video processing library
    • C language
    • Opensource
    FFMPEG IS
    read and write media streams
    decode and encode media streams
    scale images
    utility functions for all other libraries
    libavformat
    libavcodec
    libswscale
    libavutil
    FFmpeg is built from four core libraries (together about 16 MB).

  29. 29
    • Audio/Video processing library
    • C language
    • Opensource
    • Lot of external libraries
    • x264
    • x265
    • xvidcore
    • vid.stab
    • …
    FFMPEG IS
    read and write media streams
    decode and encode media streams
    scale images
    utility functions for all other libraries
    libavformat
    libavcodec
    libswscale
    libavutil
    There are lots of external libraries available.

  30. 30
    • Audio/Video processing library
    • C language
    • Opensource
    • Lot of external libraries
    • x264
    • x265
    • xvidcore
    • vid.stab
    • …
    FFMPEG IS
    read and write media streams
    decode and encode media streams
    scale images
    utility functions for all other libraries
    libavformat
    libavcodec
    libswscale
    libavutil
    FFmpeg also offers three command-line tools (ffmpeg, ffprobe, ffplay).

  31. 31
    • Audio/Video processing library
    • C language
    • Opensource
    • Lot of external libraries
    • x264
    • x265
    • xvidcore
    • vid.stab
    • …
FFMPEG STARTS AT 16 MB
    read and write media streams
    decode and encode media streams
    scale images
    utility functions for all other libraries
    libavformat
    libavcodec
    libswscale
    libavutil
    To write a basic video editor, the ffmpeg tool plus a few external libraries is enough.

    For a basic video editor this takes about 20 MB.

  32. 32
    Build FFmpeg yourself
    • Compile
    • Connect extension libraries
    • Integrate (JNI)
    FFMPEG ON ANDROID
    You have several options for getting an FFmpeg build running on Android.

  33. 33
    Build FFmpeg yourself
    • Compile
    • Connect extension libraries
    • Integrate (JNI)
    Use existing prebuilt library
    • TANERSENER (mobile-ffmpeg)
    • …
    FFMPEG ON ANDROID


  34. 34
    Build FFmpeg yourself
    • Compile
    • Connect extension libraries
    • Integrate (JNI)
    Use existing prebuilt library
    • TANERSENER (mobile-ffmpeg)
    • …
    FFMPEG ON ANDROID
    Specific function
    • Function in C language (on top of FFmpeg build)
    • Expose function via JNI
    Specific function - good for exposing your code as a library (a minimal sketch follows).
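    A minimal sketch of the "specific function" option - the library and function names here are purely illustrative: a C function written on top of the FFmpeg libraries, exposed to Kotlin through JNI.

    object NativeEditor {
        init {
            // Hypothetical .so that contains your C code linked against libavformat, libavcodec, etc.
            System.loadLibrary("native-editor")
        }

        // Implemented in C via JNI; returns 0 on success.
        external fun trimClip(inputPath: String, outputPath: String, startMs: Long, endMs: Long): Int
    }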

  35. 35
    Build FFmpeg yourself
    • Compile
    • Connect extension libraries
    • Integrate (JNI)
    Use existing prebuilt library
    • TANERSENER (mobile-ffmpeg)
    • …
    FFMPEG ON ANDROID
    Generic function
    • Runtime.getRuntime().exec(…)
    • Expose generic call via JNI
    Specific function
    • Function in C language (on top of FFmpeg build)
    • Expose function via JNI
    Generic function - send any set of ffmpeg arguments as a single command string.

  36. 36
    Build FFmpeg yourself
    • Compile
    • Connect extension libraries
    • Integrate (JNI)
    Use existing prebuilt library
    • TANERSENER (mobile-ffmpeg)
    • …
    FFMPEG ON ANDROID OUR CHOICE
    Generic function
    • Runtime.getRuntime().exec(…)
    • Expose generic call via JNI
    Specific function
    • Function in C language (on top of FFmpeg build)
    • Expose function via JNI


  37. 37
    FFMPEG TANERSENER TO THE RESCUE
    • https://github.com/tanersener/mobile-ffmpeg


  38. 38
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    https://github.com/tanersener/mobile-ffmpeg

  39. 39
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    https://github.com/tanersener/mobile-ffmpeg

  40. 40
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    • x86_64 support
    https://github.com/tanersener/mobile-ffmpeg

  41. 41
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    • x86_64 support
    • API Level 16+
    https://github.com/tanersener/mobile-ffmpeg

  42. 42
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    • x86_64 support
    • API Level 16+
    • MobileFFmpeg wrapper library
    https://github.com/tanersener/mobile-ffmpeg

  43. 43
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    • x86_64 support
    • API Level 16+
    • MobileFFmpeg wrapper library
    • 8 types of pre-built library variants (or fork and build a custom one)
    https://github.com/tanersener/mobile-ffmpeg

  44. 44
    FFMPEG TANERSENER TO THE RESCUE
    • Supports many FFmpeg releases (active development)
    • 32 external libraries
    • x86_64 support
    • API Level 16+
    • Generic MobileFFmpeg wrapper library
    • 8 types of pre-built library variants (or fork and build a custom one)
    https://github.com/tanersener/mobile-ffmpeg
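    A minimal sketch of the generic call through the MobileFFmpeg wrapper (assuming the com.arthenica.mobileffmpeg API of the 4.x releases):

    import com.arthenica.mobileffmpeg.Config
    import com.arthenica.mobileffmpeg.FFmpeg

    // Passes a whole ffmpeg argument string to the library, which runs it off the main thread.
    fun runFFmpeg(command: String, onFinished: (Boolean) -> Unit) {
        FFmpeg.executeAsync(command) { _, returnCode ->
            onFinished(returnCode == Config.RETURN_CODE_SUCCESS)
        }
    }

    The same entry point can run every command shown in the next section.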

  45. 45
    03
    IMPLEMENT
    FEATURES


  46. 46
    < FEATURE > < LAYER >
    For each feature and layer, a piece of code shows the main trick.

  47. 47
    TIMELINE EXOPLAYER
    val mediaSourceList: List<MediaSource> = timelineList.map {
        ClippingMediaSource(
            ProgressiveMediaSource.Factory(videoDataSourceFactory)
                .createMediaSource(Uri.parse(it.videoPath)),
            it.startPositionInMillis,
            it.endPositionInMillis)
    }
    exoPlayer.prepare(
        ConcatenatingMediaSource(/* isAtomic = */ true, *mediaSourceList.toTypedArray())
    )


  52. 52
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -c copy -f mpegts intermediate1.ts
    ffmpeg -i input2.mp4 -c copy -f mpegts intermediate2.ts
    ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy output.mp4
    Concat protocol



  54. 54
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -c copy -f mpegts intermediate1.ts
    ffmpeg -i input2.mp4 -c copy -f mpegts intermediate2.ts
    ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy output.mp4
    Concat protocol
    -f mpegts converts the mp4 into an MPEG-2 transport stream, which can be concatenated.


  59. 59
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -c copy -f mpegts intermediate1.ts
    ffmpeg -i input2.mp4 -c copy -f mpegts intermediate2.ts
    ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy output.mp4
    Concat protocol
    The concat protocol is fast because it doesn't require re-encoding.

  60. 60
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -c copy -f mpegts intermediate1.ts
    ffmpeg -i input2.mp4 -c copy -f mpegts intermediate2.ts
    ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy output.mp4
    Concat protocol
    Its usage is limited by the intermediate files - you have to manage them, and they take extra storage during processing.

    Another downside: it's hard to report overall progress.

  61. 61
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter

  62. 62
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter
    video and audio streams of input 1

  63. 63
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter
    video and audio streams of input 2

  64. 64
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter
    video and audio streams of input 3

  65. 65
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter
    The concat filter turns all inputs into streams, so you can write all operations as a single command.


  67. 67
    TIMELINE FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
    -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0]
    concat=n=3:v=1:a=1[outv][outa]" \
    -map "[outv]" -map "[outa]" output.mp4
    Concat filter
    The concat filter turns all inputs into streams, so you can write all operations as a single command.

    This makes it easy to show overall progress (see the sketch below), and it can combine different video file types.

    Even though it requires re-encoding and is therefore quite slow, we picked it as the usable solution.

    The export, of course, runs in a foreground service so it doesn't block the app.
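    Reporting overall progress from such a single command can be done with the wrapper's statistics callback - a sketch, assuming the mobile-ffmpeg Config/Statistics API and a totalDurationMillis value summed from the trimmed clips of the media object:

    import com.arthenica.mobileffmpeg.Config

    fun trackExportProgress(totalDurationMillis: Long, onProgress: (Int) -> Unit) {
        Config.enableStatisticsCallback { statistics ->
            // statistics.time = timestamp (in ms) of the most recently encoded frame
            val percent = (statistics.time * 100L / totalDurationMillis).toInt().coerceIn(0, 100)
            onProgress(percent)
        }
    }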

  68. 68
    TRIM EXOPLAYER
    val mediaSourceList: List<MediaSource> = timelineList.map {
        ClippingMediaSource(
            ProgressiveMediaSource.Factory(videoDataSourceFactory)
                .createMediaSource(Uri.parse(it.videoPath)),
            it.startPositionInMillis,
            it.endPositionInMillis)
    }
    exoPlayer.prepare(
        ConcatenatingMediaSource(/* isAtomic = */ true, *mediaSourceList.toTypedArray())
    )
    Trimming is handled by the start and end position parameters of ClippingMediaSource.

  69. 69
    TRIM EXOPLAYER
    val mediaSourceList: List<MediaSource> = timelineList.map {
        ClippingMediaSource(
            ProgressiveMediaSource.Factory(videoDataSourceFactory)
                .createMediaSource(Uri.parse(it.videoPath)),
            it.startPositionInMillis,
            it.endPositionInMillis)
    }
    exoPlayer.prepare(
        ConcatenatingMediaSource(/* isAtomic = */ true, *mediaSourceList.toTypedArray())
    )
    The same code covers both TRIM and CONCAT of the timeline.

  70. 70
    TRIM FFMPEG
    ffmpeg -i input1.mp4 -i input2.mov
    -filter_complex
    [0:v:0]trim=0.0:20.624, setpts=PTS-STARTPTS [video00]
    [1:v:0]trim=0.0:21.625, setpts=PTS-STARTPTS [video10]
    [0:a:0]atrim=0.0:20.624, asetpts=PTS-STARTPTS [audio00]
    [1:a:0]atrim=0.0:21.625, asetpts=PTS-STARTPTS [audio10]
    [video00][audio00][video10][audio10] concat=n=2:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4


  73. 73
    ROTATE/ZOOM/MOVE EXOPLAYER
    import android.graphics.Matrix
    val matrix = Matrix()
    matrix.setScale(video.zoom, video.zoom, pivotPointX, pivotPointY)
    matrix.postScale(cropScale, cropScale, pivotPointX, pivotPointY)
    matrix.postRotate(video.rotation, pivotPointX, pivotPointY)
    matrix.postTranslate(translationX * cropScale, translationY * cropScale)
    (binding.playerView.videoSurfaceView as TextureView).setTransform(matrix)



  78. 78
    ZOOM/MOVE FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]
    scale=w=(1.0*max(iw*1080/ih\,1920)):h=(1.0*max(1080\,ih*1920/iw)),
    crop=w=1920:h=1080:x=(iw-ow)/2-((iw*0.5)/1920.0):y=(ih-oh)/2-((ih*0.0)/1080.0)
    [video00]
    [0:a:0]anull [audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    1.0 = ${zoom}
    0.5 = ${translation.translationX}
    0.0 = ${translation.translationY}


  81. 81
    ROTATE FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]
    rotate=90*PI/180:ow=min(iw,ih)/sqrt(2):oh=ow:c=none
    [video00]
    [0:a:0]anull [audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    ScaleType.CROP
    rotate turns the input video clockwise by an angle expressed in radians.

    c=none is used when no background is ever shown, which improves performance.


  86. 86
    COLOR FILTER EXOPLAYER
    <com.google.android.exoplayer2.ui.PlayerView
        ...
        app:surface_type="texture_view" />

  87. 87
    COLOR FILTER EXOPLAYER
    1)
    val contentFrame = binding.root.findViewById
    (com.google.android.exoplayer2.ui.R.id.exo_content_frame) as AspectRatioFrameLayout
    2)
    val textureView = GLTextureView(context)
    contentFrame.addView(textureView)
    3)
    val exoPlayerRenderer = ExoPlayerRenderer(context, textureView, viewModel.getPlayer())
    textureView.setRenderer(exoPlayerRenderer)
    4)
    exoPlayerRenderer.setTransform(Matrix())
    app:surface_type="texture_view" />


  91. 91
    COLOR FILTER EXOPLAYER
    1)
    val contentFrame = binding.root.findViewById
    (com.google.android.exoplayer2.ui.R.id.exo_content_frame) as AspectRatioFrameLayout
    2)
    val textureView = GLTextureView(context)
    contentFrame.addView(textureView)
    3)
    val exoPlayerRenderer = ExoPlayerRenderer(context, textureView, viewModel.getPlayer())
    textureView.setRenderer(exoPlayerRenderer)
    4)
    exoPlayerRenderer.setTransform(Matrix())
    app:surface_type="texture_view" />
    With this mechanism you can apply any OpenGL shader program you want.

  92. 92
    COLOR FILTER FFMPEG
    VideoEffectType.NONE -> ""
    VideoEffectType.BLACK_WHITE -> "hue=s=0"
    VideoEffectType.VIGNETTE -> "vignette=PI/4"
    VideoEffectType.SEPIA -> "colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131"
    VideoEffectType.INVERT -> "negate"
    VideoEffectType.GAMMA -> "eq=gamma=5.0"
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]
    colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131
    [video00]
    [0:a:0]anull[audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    FFmpeg offers operations over the color channel matrix.

    This is a sample of the SEPIA effect.

  93. 93
    COLOR FILTER FFMPEG
    VideoEffectType.NONE -> ""
    VideoEffectType.BLACK_WHITE -> "hue=s=0"
    VideoEffectType.VIGNETTE -> "vignette=PI/4"
    VideoEffectType.SEPIA -> "colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131"
    VideoEffectType.INVERT -> "negate"
    VideoEffectType.GAMMA -> "eq=gamma=5.0"
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]
    colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131
    [video00]
    [0:a:0]anull[audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    For well-known filters we can simply use FFmpeg's existing tools.

  94. 94
    SPEED EXOPLAYER
    simpleExoPlayer.playbackParameters =
        PlaybackParameters(currentItem.speedMultiplier)

  95. 95
    SPEED FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0] setpts=${1/speedMultiplier}*PTS[video00]
    [0:a:0] atempo=${speedMultiplier} [audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    setpts changes the PTS (presentation timestamp) of the input frames.

    setpts=0.5*PTS gives fast motion, setpts=2.0*PTS gives slow motion.

    atempo=0.8 means 80% audio tempo; atempo speeds the audio up or down while keeping the pitch.

    Using asetpts on the audio instead would just shift timestamps and end up dropping (audio) frames.
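    One caveat when generating the audio part from speedMultiplier: a single atempo instance only accepts values from roughly 0.5 to 2.0 in many FFmpeg builds, so larger changes have to be chained. A sketch (the helper name is ours):

    fun atempoChain(speedMultiplier: Float): String {
        val parts = mutableListOf<String>()
        var remaining = speedMultiplier
        // Split e.g. 4.0 into "atempo=2.0,atempo=2.0" to stay inside the supported range.
        while (remaining > 2.0f) { parts += "atempo=2.0"; remaining /= 2.0f }
        while (remaining < 0.5f) { parts += "atempo=0.5"; remaining /= 0.5f }
        parts += "atempo=$remaining"
        return parts.joinToString(",")
    }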


  98. 98
    VOLUME EXOPLAYER
    simpleExoPlayer.volume = currentItem.volumeMultiplier


  99. 99
    VOLUME EXOPLAYER
    simpleExoPlayer.volume = currentItem.volumeMultiplier
    The platform doesn't provide support for volume greater than 1.0f
    (amplification).
    It could be implemented as an AudioProcessor that allows the
    application to set a volume multiplier for each channel.
    ChannelMappingAudioProcessor (ExoPlayer/issues/2659)



  102. 102
    VOLUME FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0] null[video00]
    [0:a:0] volume=${volumeMultiplier}[audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    The default value for volume is "1.0" = unity gain.

    Amplification (values over 1.0) works well too.


  104. 104
    TEXT EXOPLAYER
    fun StaticLayout.draw(canvas: Canvas?, x: Float, y: Float, scale: Float) {
        canvas?.withTranslation(x, y) {
            withScale(x = scale, y = scale, pivotX = 0f, pivotY = 0f) {
                draw(this)
            }
        }
    }
    class TextCanvasView(context: Context) : View(context) {
        var layout: StaticLayout? = null
        var scale = 1f
        override fun onDraw(canvas: Canvas?) {
            super.onDraw(canvas)
            layout?.draw(canvas, translationX, translationY, scale)
        }
    }
    It's about drawing the text into a StaticLayout and rendering it onto the canvas with translation and scale.


  107. 107
    TEXT FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]
    drawtext=fontfile=Chewy.ttf
    :fontsize=100
    :fontcolor=white
    :x=\(w-text_w\)/2
    :y=\(h-text_h\)/2
    :text=Text
    [video00]
    [0:a:0] anull[audio00]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    Title-based text strongly depends on the characters supported by the font.

    FFmpeg has poor support for fallback fonts.

    So text drawn on the Android canvas might differ from the text in the final movie when special characters are not supported by the font.


  109. 109
    TEXT FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]null[video00]
    movie=bitmap_text.png[text00]
    [video00][text00]overlay=x=0:y=0, setpts=PTS-STARTPTS[videotext00]
    [0:a:0] anull[audio00]
    [videotext00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4

  110. 110
    TEXT FFMPEG
    ffmpeg -i input1.mp4 -filter_complex
    [0:v:0]null[video00]
    movie=bitmap_text.png[text00]
    [video00][text00]overlay=x=0:y=0, setpts=PTS-STARTPTS[videotext00]
    [0:a:0] anull[audio00]
    [videotext00][audio00] concat=n=1:v=1:a=1 [outv][outa]
    -map "[outv]" -map "[outa]" output.mp4
    overlay at 0,0 (the top-left corner) expects the bitmap to match the video resolution (see the sketch below).
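    To keep the WYSIWYG promise, the bitmap can be produced by drawing the very same StaticLayout used in the view layer onto a canvas that has the video resolution. A sketch - the draw(canvas, x, y, scale) extension is the one from the TEXT EXOPLAYER slide, the remaining names are illustrative:

    import android.graphics.Bitmap
    import android.graphics.Canvas
    import android.text.StaticLayout
    import java.io.File
    import java.io.FileOutputStream

    fun renderTextBitmap(layout: StaticLayout, x: Float, y: Float, scale: Float,
                         videoWidth: Int, videoHeight: Int, outFile: File) {
        val bitmap = Bitmap.createBitmap(videoWidth, videoHeight, Bitmap.Config.ARGB_8888)
        layout.draw(Canvas(bitmap), x, y, scale)
        // PNG keeps the alpha channel, so the overlay filter draws only the text over the video.
        FileOutputStream(outFile).use { bitmap.compress(Bitmap.CompressFormat.PNG, 100, it) }
        bitmap.recycle()
    }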


  112. 112
    BACKGROUND MUSIC EXOPLAYER
    private val exoPlayer: SimpleExoPlayer
    private val exoPlayerBackground: SimpleExoPlayer
    exoPlayerBackground.prepare(
    ProgressiveMediaSource.Factory(dataSourceFactory)
    .createMediaSource(Uri.parse(requireNotNull(currentBackgroundMusic).path))
    )
    exoPlayer.addListener(object : Player.EventListener {
    override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
    when (exoPlayer.playbackState) {
    Player.STATE_READY -> {
    if (!exoPlayer.playWhenReady) {
    exoPlayerBackground.playWhenReady = false
    } else {
    exoPlayerBackground.playWhenReady = currentBackgroundMusic != null
    }
    }
    }
    }
    })



  114. 114
    BACKGROUND MUSIC EXOPLAYER
    private val exoPlayer: SimpleExoPlayer
    private val exoPlayerBackground: SimpleExoPlayer
    exoPlayerBackground.prepare(
    ProgressiveMediaSource.Factory(dataSourceFactory)
    .createMediaSource(Uri.parse(requireNotNull(currentBackgroundMusic).path))
    )
    exoPlayer.addListener(object : Player.EventListener {
    override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
    when (exoPlayer.playbackState) {
    Player.STATE_READY -> {
    if (!exoPlayer.playWhenReady) {
    exoPlayerBackground.playWhenReady = false
    } else {
    exoPlayerBackground.playWhenReady = currentBackgroundMusic != null
    }
    }
    }
    }
    })
    Synchronize the two players: the background player follows the play/pause state of the main player.


  117. 117
    BACKGROUND MUSIC FFMPEG
    ffmpeg
    -i input1.mp4
    -filter_complex [0:v:0]null[video00][0:a:0]anull[audio00]
    amovie=background_music.mp3:loop=0,asetpts=PTS-STARTPTS,volume=0.3[outm]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outav]
    [outav][outm]amerge,asetpts=PTS-STARTPTS[outa]
    -map "[outv]" -map "[outa]" output.mp4


  119. 119
    BACKGROUND MUSIC FFMPEG
    ffmpeg
    -i input1.mp4
    -filter_complex [0:v:0]null[video00][0:a:0]anull[audio00]
    amovie=background_music.mp3:loop=0,asetpts=PTS-STARTPTS,volume=0.3[outm]
    [video00][audio00] concat=n=1:v=1:a=1 [outv][outav]
    [outav][outm]amerge,asetpts=PTS-STARTPTS[outa]
    -map "[outv]" -map "[outa]" output.mp4
    null and anull pass the video and audio streams of input 0 through unchanged.

  120. 120
    04
    LAST BUT NOT
    LEAST

  121. 121
    AUDIO FOCUS
    import com.google.android.exoplayer2.C
    import com.google.android.exoplayer2.ExoPlayerFactory
    import com.google.android.exoplayer2.audio.AudioAttributes
    exoPlayer.setAudioAttributes(
    AudioAttributes.Builder()
    .setUsage(C.USAGE_MEDIA)
    .setContentType(C.CONTENT_TYPE_MOVIE).build(),
    /* handleAudioFocus = */ true)
    C.CONTENT_TYPE_MOVIE //reduce volume while notification is playing
    C.CONTENT_TYPE_SPEECH //pause player while notification is playing



  124. 124
    AUDIO FOCUS
    import com.google.android.exoplayer2.C
    import com.google.android.exoplayer2.ExoPlayerFactory
    import com.google.android.exoplayer2.audio.AudioAttributes
    exoPlayer.setAudioAttributes(
    AudioAttributes.Builder()
    .setUsage(C.USAGE_MEDIA)
    .setContentType(C.CONTENT_TYPE_MOVIE).build(),
    /* handleAudioFocus = */ true)
    C.CONTENT_TYPE_MOVIE //reduce volume while notification is playing
    C.CONTENT_TYPE_SPEECH //pause player while notification is playing
    If several ExoPlayers play at the same time, mark just one of them (the main one) as the audio focus handler. Otherwise they will affect each other :-)
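    Applied to the two-player background-music setup from earlier, only the main player should get the focus flag (a sketch; audioAttributes is the instance built above):

    // The main (video) player handles audio focus...
    exoPlayer.setAudioAttributes(audioAttributes, /* handleAudioFocus = */ true)
    // ...while the background music player does not, so the two don't duck or pause each other.
    exoPlayerBackground.setAudioAttributes(audioAttributes, /* handleAudioFocus = */ false)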

  125. 125
    Hopefully some ideas mentioned
    here will help or inspire you.


  126. Michal Jenicek / [email protected]
    THANK YOU!


  127. QUESTIONS
