VIDEO EDITING ON ANDROID

There is nothing better than sharing visual content in the sleekest form possible, which is why more and more apps now need offline video editing features. Have you ever tried combining FFmpeg and ExoPlayer on Android? If not, why should you?

Michal Jenicek

January 23, 2020

Transcript

  1. Michal Jenicek, Software Engineer at STRV. VIDEO EDITING. I would like to share my experience with video editing.
  2. 2 We have been working on a project called Opkix. Opkix is an IoT device that looks like an egg and has two cameras.
  3. 3 The Opkix device has its own application for firmware and video management. You manage the egg over WiFi and BLE.
  4. 4 The user also wants the ability to combine clips and create a final movie.
  5. 6 WYSIWYG • Visible result of every step Real-Time • Don't wait for each step's processing Offline • Use just WiFi & BLE FEATURE REQUIREMENTS WYSIWYG - What you see is what you get: the user knows how each edit step affects the video. REAL-TIME - See the result immediately (don't wait for the real video processing). OFFLINE - the app communicates with the egg only via WiFi or BLE.
  6. 7 EXPORTABLE • Ability to export the final movie • Export shouldn't block the app FEATURE REQUIREMENTS
  7. 8 • The media object defines all edits. • The view layer represents the media object in real time. • The export layer exports the final movie. IMPLEMENTATION STRATEGY We know the requirements, so let's define a clear implementation strategy. Media object: the edits can simply be persisted, without any need to adjust the source video file.
  8. 9 iOS • Platform native media object IMPLEMENTATION STRATEGY iOS - a media object whose behaviour can be adjusted.
  9. 10 iOS • Platform native media object • Platform native view layer IMPLEMENTATION STRATEGY iOS - a media object whose behaviour can be adjusted and that can be played.
  10. 11 iOS • Platform native media object • Platform native view layer • Platform native export layer IMPLEMENTATION STRATEGY iOS - a media object whose behaviour can be adjusted, that can be played and that is also exportable.
  11. 12 iOS • Platform native media object • Platform native view layer • Platform native export layer Android • Platform native media object IMPLEMENTATION STRATEGY ANDROID REALITY
  12. 13 iOS • Platform native media object • Platform native view layer • Platform native export layer Android • Platform native media object • Platform native view layer IMPLEMENTATION STRATEGY ANDROID REALITY
  13. 14 iOS • Platform native media object • Platform native view layer • Platform native export layer Android • Platform native media object • Platform native view layer • Platform native export layer IMPLEMENTATION STRATEGY ANDROID REALITY
  14. 15 iOS • Platform native media object • Platform native view layer • Platform native export layer Android • Platform native media object • Platform native view layer • Platform native export layer • Fake them all! IMPLEMENTATION STRATEGY ANDROID REALITY On Android we have to build this framework ourselves.
  15. • MediaPlayer (view layer) 16 CHOOSE THE FAKE TOOLS MediaPlayer MediaPlayer is simply a black box. We didn't pick it.
  16. ❌ MediaPlayer • Android Media API (Extractor, Muxer, …) 17 CHOOSE THE FAKE TOOLS android.media MediaPlayer The Android Media API is complicated and lacks good samples. We had tried android.media for a different task and decided not to use it this time.
  17. ❌ MediaPlayer ❌ Android Media API (Extractor, Muxer, …) •

    Commercial edit libraries (export layer) 18 CHOOSE THE FAKE TOOLS android.media MediaPlayer We didn’t want to use any paid solutions.
  18. ✅ Kotlin data class (media object) ❌ MediaPlayer ❌ Android

    Media API (Extractor, Muxer, …) ❌ Commercial edit libraries 19 CHOOSE THE FAKE TOOLS android.media MediaPlayer
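    The media object can stay a plain Kotlin data class. Here is a minimal sketch of what it might look like; the class and field names are assumptions for this write-up (they mirror the names used in the later code slides), not the actual Opkix model:
      // Illustrative media object: one entry per clip on the timeline.
      // Field names mirror the ones used on later slides, but the class itself is an assumption.
      data class TimelineItem(
          val videoPath: String,
          val startPositionInMillis: Long,
          val endPositionInMillis: Long,
          val rotation: Float = 0f,
          val zoom: Float = 1f,
          val volumeMultiplier: Float = 1f
      )

      data class MediaObject(
          val timeline: List<TimelineItem>,
          val backgroundMusicPath: String? = null
      )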
  19. ✅ Kotlin data class (media object) ✅ ExoPlayer (view layer) ❌ MediaPlayer ❌ Android Media API (Extractor, Muxer, …) ❌ Commercial edit libraries 20 CHOOSE THE FAKE TOOLS android.media MediaPlayer ExoPlayer An open-source library by Google, with good documentation including samples and solutions on Stack Overflow.
  20. ✅ Kotlin data class (media object) ✅ ExoPlayer (view layer) ✅ FFmpeg (export layer) ❌ MediaPlayer ❌ Android Media API (Extractor, Muxer, …) ❌ Commercial edit libraries 21 CHOOSE THE FAKE TOOLS android.media MediaPlayer FFmpeg ExoPlayer A powerful open-source library with a broad community, usable on Android as well.
  21. • ExoPlayer (view layer) • FFmpeg (export layer) 22 FAKE TOOLS FFmpeg ExoPlayer ExoPlayer usage is straightforward. FFmpeg, however, does not target Android, and its usage there might be less obvious, so let's talk about it a bit.
  23. 28 FFMPEG IS • Audio/Video processing library • C language • Open source FFmpeg stands on four core libraries (together about 16 MB): libavformat - read and write media streams, libavcodec - decode and encode media streams, libswscale - scale images, libavutil - utility functions for all the other libraries.
  24. 29 FFMPEG IS • Audio/Video processing library • C language • Open source • Lots of external libraries • x264 • x265 • xvidcore • vid.stab • … There are lots of external libraries available.
  25. 30 FFMPEG IS • Audio/Video processing library • C language • Open source • Lots of external libraries • x264 • x265 • xvidcore • vid.stab • … FFmpeg offers three tools (ffmpeg, ffprobe and ffplay).
  26. 31 FFMPEG STARTS AT 16 MB • Audio/Video processing library • C language • Open source • Lots of external libraries • x264 • x265 • xvidcore • vid.stab • … To write a basic video editor, the ffmpeg tool and a few external libraries are enough; such a build takes about 20 MB.
  27. 32 Build FFmpeg yourself • Compile • Connect extension libraries • Integrate (JNI) FFMPEG ON ANDROID You have several options for getting an FFmpeg build onto Android.
  28. 33 Build FFmpeg yourself • Compile • Connect extension libraries

    • Integrate (JNI) Use existing prebuilt library • TANERSENER (mobile-ffmpeg) • … FFMPEG ON ANDROID
  29. 34 Build FFmpeg yourself • Compile • Connect extension libraries

    • Integrate (JNI) Use existing prebuilt library • TANERSENER (mobile-ffmpeg) • … FFMPEG ON ANDROID Specific function • Function in C language (on top of FFmpeg build) • Expose function via JNI Specific function - good for exposing your code as a library.
  30. 35 Build FFmpeg yourself • Compile • Connect extension libraries • Integrate (JNI) Use existing prebuilt library • TANERSENER (mobile-ffmpeg) • … FFMPEG ON ANDROID Generic function • Runtime.getRuntime().exec(…) • Expose generic call via JNI Specific function • Function in C language (on top of FFmpeg build) • Expose function via JNI Generic function: send the whole set of ffmpeg arguments as a single command string.
  31. 36 Build FFmpeg yourself • Compile • Connect extension libraries

    • Integrate (JNI) Use existing prebuilt library • TANERSENER (mobile-ffmpeg) • … FFMPEG ON ANDROID OUR CHOICE Generic function • Runtime.getRuntime().exec(…) • Expose generic call via JNI Specific function • Function in C language (on top of FFmpeg build) • Expose function via JNI
  32. 38 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) https://github.com/tanersener/mobile-ffmpeg
  33. 39 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries https://github.com/tanersener/mobile-ffmpeg
  34. 40 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries • x86_64 support https://github.com/tanersener/mobile-ffmpeg
  35. 41 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries • x86_64 support • API Level 16+ https://github.com/tanersener/mobile-ffmpeg
  36. 42 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries • x86_64 support • API Level 16+ • MobileFFmpeg wrapper library https://github.com/tanersener/mobile-ffmpeg
  37. 43 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries • x86_64 support • API Level 16+ • MobileFFmpeg wrapper library • 8 types of pre-built library variants (or fork the custom one) https://github.com/tanersener/mobile-ffmpeg
  38. 44 FFMPEG TANERSENER TO THE RESCUE • Support many FFmpeg

    releases (Active development) • 32 external libraries • x86_64 support • API Level 16+ • Generic mobileFFmpeg wrapper library • 8 types of pre-built library variants (or fork the custom one) https://github.com/tanersener/mobile-ffmpeg
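    To illustrate the generic-function approach with this library, a minimal sketch of a call through the MobileFFmpeg wrapper; the Gradle coordinates, variant and version are illustrative, and FFmpeg.execute / Config.RETURN_CODE_SUCCESS are assumed to be the wrapper API of that era:
      // build.gradle - pick one of the pre-built variants, e.g. (version is illustrative):
      // implementation 'com.arthenica:mobile-ffmpeg-full:4.3.1'

      import com.arthenica.mobileffmpeg.Config
      import com.arthenica.mobileffmpeg.FFmpeg

      // Runs a whole ffmpeg command line as a single string and reports success.
      fun runFFmpeg(command: String): Boolean =
          FFmpeg.execute(command) == Config.RETURN_CODE_SUCCESS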
  39. 46 < FEATURE > < LAYER > For each feature and layer, a piece of code shows the main trick.
  40. 47 TIMELINE EXOPLAYER
     val mediaSourceList: List<ClippingMediaSource> = timelineList.map {
         ClippingMediaSource(
             ProgressiveMediaSource.Factory(videoDataSourceFactory)
                 .createMediaSource(Uri.parse(it.videoPath)),
             it.startPositionInMillis * 1_000,  // ClippingMediaSource takes positions in microseconds
             it.endPositionInMillis * 1_000)
     }
     exoPlayer.prepare(
         ConcatenatingMediaSource(/* isAtomic = */ true, *mediaSourceList.toTypedArray())
     )
  45. 52 TIMELINE FFMPEG Concat protocol
     ffmpeg -i input1.mp4 -c copy -f mpegts intermediate1.ts
     ffmpeg -i input2.mp4 -c copy -f mpegts intermediate2.ts
     ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy output.mp4
     mpegts - converts the mp4 to an MPEG-2 transport stream, which is concatenable. The concat protocol is fast since it doesn't require re-encoding. Its usage is limited by the intermediate files (you have to handle them, and they take extra storage during processing); another downside is that it is hard to show overall progress.
  54. 61 TIMELINE FFMPEG Concat filter
     ffmpeg -i input1.mp4 -i input2.mov -i input3.ts \
       -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0][2:v:0][2:a:0] concat=n=3:v=1:a=1 [outv][outa]" \
       -map "[outv]" -map "[outa]" output.mp4
     [0:v:0][0:a:0] are the video and audio streams of input 1, [1:v:0][1:a:0] of input 2 and [2:v:0][2:a:0] of input 3. The concat filter maps all inputs to streams, so you can write all operations as a single command. That makes it easy to show overall progress, and it can combine different video file types. Even though it requires re-encoding, and is therefore quite slow, we picked it as the usable solution. The export of course runs in a foreground service so it doesn't block the app.
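    As a rough sketch of how such a command can be assembled per export, reusing the illustrative TimelineItem class from the media-object sketch and the runFFmpeg helper above (both assumptions of this write-up, not the project's actual code):
      // Builds e.g.: -i a.mp4 -i b.mp4 -filter_complex "[0:v:0][0:a:0][1:v:0][1:a:0] concat=n=2:v=1:a=1 [outv][outa]"
      //              -map "[outv]" -map "[outa]" output.mp4
      fun buildConcatCommand(timeline: List<TimelineItem>, outputPath: String): String {
          val inputs = timeline.joinToString(" ") { "-i ${it.videoPath}" }
          val streamPads = timeline.indices.joinToString("") { "[$it:v:0][$it:a:0]" }
          val filter = "$streamPads concat=n=${timeline.size}:v=1:a=1 [outv][outa]"
          return "$inputs -filter_complex \"$filter\" -map \"[outv]\" -map \"[outa]\" $outputPath"
      }
    A real implementation would also quote paths and splice in the trim, scale and rotate fragments shown on the following slides.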
  61. 68 TRIM EXOPLAYER
     val mediaSourceList: List<ClippingMediaSource> = timelineList.map {
         ClippingMediaSource(
             ProgressiveMediaSource.Factory(videoDataSourceFactory)
                 .createMediaSource(Uri.parse(it.videoPath)),
             it.startPositionInMillis * 1_000,  // microseconds
             it.endPositionInMillis * 1_000)
     }
     exoPlayer.prepare(
         ConcatenatingMediaSource(/* isAtomic = */ true, *mediaSourceList.toTypedArray())
     )
     The trim is ensured by the start and end position parameters of the media source. This is the same code as before, covering both TRIM and CONCAT.
  63. 70 TRIM FFMPEG
     ffmpeg -i input1.mp4 -i input2.mov -filter_complex
       "[0:v:0]trim=0.0:20.624,setpts=PTS-STARTPTS[video00]
        [0:a:0]atrim=0.0:20.624,asetpts=PTS-STARTPTS[audio00]
        [1:v:0]trim=0.0:21.625,setpts=PTS-STARTPTS[video10]
        [1:a:0]atrim=0.0:21.625,asetpts=PTS-STARTPTS[audio10]
        [video00][audio00][video10][audio10] concat=n=2:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
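    The per-clip trim fragments lend themselves to the same string building; a small sketch, again using the illustrative TimelineItem fields rather than the project's real model:
      // Emits the video/audio trim pair for clip `index`, e.g.
      // "[0:v:0]trim=0.0:20.624,setpts=PTS-STARTPTS[video00] [0:a:0]atrim=0.0:20.624,asetpts=PTS-STARTPTS[audio00]"
      fun buildTrimFragment(index: Int, item: TimelineItem): String {
          val start = item.startPositionInMillis / 1000.0
          val end = item.endPositionInMillis / 1000.0
          return "[$index:v:0]trim=$start:$end,setpts=PTS-STARTPTS[video${index}0] " +
              "[$index:a:0]atrim=$start:$end,asetpts=PTS-STARTPTS[audio${index}0]"
      }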
  66. 73 ROTATE/ZOOM/MOVE EXOPLAYER
     import android.graphics.Matrix

     val matrix = Matrix()
     matrix.setScale(video.zoom, video.zoom, pivotPointX, pivotPointY)
     matrix.postScale(cropScale, cropScale, pivotPointX, pivotPointY)
     matrix.postRotate(video.rotation, pivotPointX, pivotPointY)
     matrix.postTranslate(translationX * cropScale, translationY * cropScale)
     (binding.playerView.videoSurfaceView as TextureView).setTransform(matrix)
  71. 78 ZOOM/MOVE FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0] scale=w=(1.0*max(iw*1080/ih\,1920)):h=(1.0*max(1080\,ih*1920/iw)),
        crop=w=1920:h=1080:x=(iw-ow)/2-((iw*0.5)/1920.0):y=(ih-oh)/2-((ih*0.0)/1080.0) [video00]
        [0:a:0]anull [audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     In this command, 1.0 stands for ${zoom}, 0.5 for ${translation.translationX} and 0.0 for ${translation.translationY}.
  74. 81 ROTATE FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0] rotate=90*PI/180:ow=min(iw,ih)/sqrt(2):oh=ow:c=none [video00]
        [0:a:0]anull [audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     ScaleType.CROP. rotate turns the input video clockwise, expressed in radians. c=none is used when no background is ever shown, to improve performance.
  79. 87 COLOR FILTER EXOPLAYER
     1) val contentFrame = binding.root.findViewById<View>(com.google.android.exoplayer2.ui.R.id.exo_content_frame)
            as AspectRatioFrameLayout
     2) val textureView = GLTextureView(context)
        contentFrame.addView(textureView)
     3) val exoPlayerRenderer = ExoPlayerRenderer(context, textureView, viewModel.getPlayer())
        textureView.setRenderer(exoPlayerRenderer)
     4) exoPlayerRenderer.setTransform(Matrix())
     <com.google.android.exoplayer2.ui.PlayerView app:surface_type="texture_view" />
     With this mechanism you can apply any OpenGL program you want.
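    As one example of "any OpenGL program": a grayscale fragment shader such a renderer could compile, sketched as a Kotlin constant. GLTextureView and ExoPlayerRenderer are the project's own classes (not part of ExoPlayer), and the varying/uniform names below are assumptions about that renderer:
      // Fragment shader for the external OES texture that ExoPlayer renders into;
      // converts each frame to grayscale using BT.601 luminance weights.
      const val GRAYSCALE_FRAGMENT_SHADER = """
          #extension GL_OES_EGL_image_external : require
          precision mediump float;
          varying vec2 vTextureCoord;
          uniform samplerExternalOES sTexture;
          void main() {
              vec4 color = texture2D(sTexture, vTextureCoord);
              float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
              gl_FragColor = vec4(vec3(gray), color.a);
          }
      """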
  84. 92 COLOR FILTER FFMPEG
     VideoEffectType.NONE -> ""
     VideoEffectType.BLACK_WHITE -> "hue=s=0"
     VideoEffectType.VIGNETTE -> "vignette=PI/4"
     VideoEffectType.SEPIA -> "colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131"
     VideoEffectType.INVERT -> "negate"
     VideoEffectType.GAMMA -> "eq=gamma=5.0"

     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0] colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131 [video00]
        [0:a:0]anull[audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     FFmpeg offers operations over the colour channel matrix; this sample shows the SEPIA effect. In terms of well-known filters, we can use FFmpeg's existing tools.
  86. 95 SPEED FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0] setpts=${1/speedMultiplier}*PTS [video00]
        [0:a:0] atempo=${speedMultiplier} [audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     setpts changes the PTS (presentation timestamp) of the input frames: setpts=0.5*PTS gives fast motion, setpts=2.0*PTS gives slow motion. atempo=0.8 means 80% speed; atempo actually slows down or speeds up the audio, whereas asetpts would just drop (audio) frames.
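    A sketch of deriving both fragments from a single speed multiplier. Note that atempo historically accepts values between 0.5 and 2.0 only, so larger or smaller factors may need to be chained; this helper is an illustration, not the project's actual code:
      // For speed 2.0 this returns "setpts=0.5*PTS" and "atempo=2.0".
      fun buildSpeedFilters(speedMultiplier: Float): Pair<String, String> {
          val videoFilter = "setpts=${1 / speedMultiplier}*PTS"
          // Chain atempo when the multiplier falls outside the 0.5 to 2.0 range.
          var remaining = speedMultiplier
          val tempoSteps = mutableListOf<Float>()
          while (remaining > 2f) { tempoSteps += 2f; remaining /= 2f }
          while (remaining < 0.5f) { tempoSteps += 0.5f; remaining /= 0.5f }
          tempoSteps += remaining
          val audioFilter = tempoSteps.joinToString(",") { "atempo=$it" }
          return videoFilter to audioFilter
      }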
  89. 99 VOLUME EXOPLAYER
     simpleExoPlayer.volume = currentItem.volumeMultiplier
     The platform doesn't provide support for volume greater than 1.0f (amplification). It could be implemented as an AudioProcessor that allows the application to set a volume multiplier for each channel, like ChannelMappingAudioProcessor (ExoPlayer/issues/2659).
  92. 102 VOLUME FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0]null[video00]
        [0:a:0] volume=${volumeMultiplier}[audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     The default value for volume is "1.0" = unity gain. Amplification (values over 1.0) works well too.
  94. 104 TEXT EXOPLAYER
     fun StaticLayout.draw(canvas: Canvas?, x: Float, y: Float, scale: Float) {
         canvas?.withTranslation(x, y) {
             withScale(x = scale, y = scale, pivotX = 0f, pivotY = 0f) {
                 draw(this)
             }
         }
     }

     class TextCanvasView(context: Context) : View(context) {
         override fun onDraw(canvas: Canvas?) {
             super.onDraw(canvas)
             layout.draw(canvas, translationX, translationY, scale)
         }
     }
     It's about drawing the text via a StaticLayout onto a plain canvas View.
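    To get exactly the same text into the exported movie, one option (matching the bitmap overlay approach a few slides below) is to render that same StaticLayout into a transparent bitmap at the export resolution and hand it to FFmpeg. A sketch, reusing the draw extension from the slide above:
      import android.graphics.Bitmap
      import android.graphics.Canvas
      import android.text.StaticLayout
      import java.io.File
      import java.io.FileOutputStream

      // Renders the layout into an ARGB bitmap sized like the exported video,
      // so FFmpeg can overlay it at x=0:y=0.
      fun renderTextBitmap(layout: StaticLayout, width: Int, height: Int,
                           translationX: Float, translationY: Float, scale: Float, outFile: File) {
          val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
          layout.draw(Canvas(bitmap), translationX, translationY, scale)
          FileOutputStream(outFile).use { bitmap.compress(Bitmap.CompressFormat.PNG, 100, it) }
          bitmap.recycle()
      }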
  97. 107 TEXT FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0] drawtext=fontfile=Chewy.ttf:fontsize=100:fontcolor=white:x=\(w-text_w\)/2:y=\(h-text_h\)/2:text=Text [video00]
        [0:a:0] anull[audio00]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     Title-based text depends heavily on the characters supported by the font, and FFmpeg has poor support for fallback fonts. Text drawn on the Android canvas can therefore differ from the text in the final movie when special characters are not supported by the font.
  99. 109 TEXT FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0]null[video00]
        movie=bitmap_text.png[text00]
        [video00][text00]overlay=x=0:y=0,setpts=PTS-STARTPTS[videotext00]
        [0:a:0] anull[audio00]
        [videotext00][audio00] concat=n=1:v=1:a=1 [outv][outa]"
       -map "[outv]" -map "[outa]" output.mp4
     Overlay at 0,0 (the upper-left corner) expects the bitmap to be in the video resolution.
  102. 112 BACKGROUND MUSIC EXOPLAYER
     private val exoPlayer: SimpleExoPlayer
     private val exoPlayerBackground: SimpleExoPlayer

     exoPlayerBackground.prepare(
         ProgressiveMediaSource.Factory(dataSourceFactory)
             .createMediaSource(Uri.parse(requireNotNull(currentBackgroundMusic).path))
     )

     exoPlayer.addListener(object : Player.EventListener {
         override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
             when (exoPlayer.playbackState) {
                 Player.STATE_READY -> {
                     if (!exoPlayer.playWhenReady) {
                         exoPlayerBackground.playWhenReady = false
                     } else {
                         exoPlayerBackground.playWhenReady = currentBackgroundMusic != null
                     }
                 }
             }
         }
     })
     Synchronize the two players.
  107. 117 BACKGROUND MUSIC FFMPEG
     ffmpeg -i input1.mp4 -filter_complex
       "[0:v:0]null[video00] [0:a:0]anull[audio00]
        amovie=background_music.mp3:loop=0,asetpts=PTS-STARTPTS,volume=0.3[outm]
        [video00][audio00] concat=n=1:v=1:a=1 [outv][outav]
        [outav][outm]amerge,asetpts=PTS-STARTPTS[outa]"
       -map "[outv]" -map "[outa]" output.mp4
     Do nothing with the video and audio streams of input 0.
  110. 121 AUDIO FOCUS
     import com.google.android.exoplayer2.C
     import com.google.android.exoplayer2.ExoPlayerFactory
     import com.google.android.exoplayer2.audio.AudioAttributes

     exoPlayer.setAudioAttributes(
         AudioAttributes.Builder()
             .setUsage(C.USAGE_MEDIA)
             .setContentType(C.CONTENT_TYPE_MOVIE)
             .build(),
         /* handleAudioFocus = */ true)

     C.CONTENT_TYPE_MOVIE  // reduce volume while a notification is playing
     C.CONTENT_TYPE_SPEECH // pause the player while a notification is playing
     If more ExoPlayers are playing at the same time, you should mark just one of them (the main one) as the audio focus handler. Otherwise they will affect each other :-)
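    A minimal sketch of that setup with the two players from the background-music slides: only the main player handles audio focus, while the background player simply follows its playWhenReady state.
      val mediaAttributes = AudioAttributes.Builder()
          .setUsage(C.USAGE_MEDIA)
          .setContentType(C.CONTENT_TYPE_MOVIE)
          .build()

      // The main player asks the system for audio focus; the background-music player does not.
      exoPlayer.setAudioAttributes(mediaAttributes, /* handleAudioFocus = */ true)
      exoPlayerBackground.setAudioAttributes(mediaAttributes, /* handleAudioFocus = */ false)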