
android – Display two videos together then output as a merged video on a single screen

Posted by: admin May 14, 2020


This question may sound a little complex or ambiguous, but I'll try to make it as clear as I can. I have done a lot of Googling and spent a lot of time, but didn't find anything relevant for Windows.

I want to play two videos on a single screen: one full-screen in the background, and one on top of it in a small window in the right corner. Then I want an output that consists of both videos playing together on a single screen.

So basically one video overlays another and then I want that streamed as output so the user can play that stream later.

I am not asking you to write the whole code; just tell me what to do, how to do it, or which tool or third-party SDK I should use to make it happen.

I have tried lots of solutions:

1. Xuggler – doesn't support Android.

2. JavaCV or JJMPEG – couldn't find any tutorial that shows how to do this.

Now I'm looking at FFmpeg. I searched for a long time but couldn't find any tutorial that shows how to do this in code; I only found the command-line way to do it. So can anyone suggest or point me to an FFmpeg tutorial, or tell me any other way to achieve this?

Answers:

I would start with JavaCV. It's quite good and flexible. It should allow you to grab frames, composite them and write them back to a file. Use the FFmpegFrameGrabber and FFmpegFrameRecorder classes. The composition can be done manually.

The rest of the answer depends on a few things:

  • do you want to read from a file/mem/url?
  • do you want to save to a file/mem/url?
  • do you need realtime processing?
  • do you need something more than simple picture-in-picture?
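Whatever the answers to those questions, the per-frame composition mentioned above can be done manually on raw pixel buffers. Here is a minimal, self-contained sketch of that step on plain ARGB `int[]` arrays; in a real JavaCV pipeline you would fill these buffers from `FFmpegFrameGrabber.grab()` and pass the result to `FFmpegFrameRecorder.record()` (the class and method names around the buffers are illustrative, not tied to any specific JavaCV version).

```java
import java.util.Arrays;

// Sketch: overlay a small ARGB frame into the bottom-right corner of a
// full-screen ARGB frame, producing a picture-in-picture composite.
public final class PipCompositor {

    /** Copies the overlay into the bottom-right corner of the background, in place. */
    public static void compose(int[] bg, int bgW, int bgH,
                               int[] overlay, int ovW, int ovH) {
        int startX = bgW - ovW;   // left edge of the small window
        int startY = bgH - ovH;   // top edge of the small window
        for (int y = 0; y < ovH; y++) {
            for (int x = 0; x < ovW; x++) {
                bg[(startY + y) * bgW + (startX + x)] = overlay[y * ovW + x];
            }
        }
    }

    public static void main(String[] args) {
        int[] bg = new int[8 * 8];     // 8x8 "full screen" frame, all black
        int[] ov = new int[2 * 2];     // 2x2 overlay, all opaque white
        Arrays.fill(ov, 0xFFFFFFFF);
        compose(bg, 8, 8, ov, 2, 2);
        // The bottom-right pixel now comes from the overlay.
        System.out.println(Integer.toHexString(bg[8 * 8 - 1]));
    }
}
```

In a real pipeline you would call `compose` once per grabbed frame pair before recording, so both grabbers should be read in lockstep.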


You could use OpenGL to do the trick. Note, however, that you will need two render steps: one rendering the first video into an FBO, and a second rendering the second video on top of it, using the FBO's texture as TEXTURE0 and the video's SurfaceTexture as an EXTERNAL_TEXTURE.

Blending, and all the other effects you want, can be done by OpenGL.

You can check the source code here: Using SurfaceTexture in Android, and some important information here: Android OpenGL combination of SurfaceTexture (external image) and ordinary texture.

The only thing I'm not sure about is what happens when two instances of MediaPlayer are running in parallel. I guess it should not be a problem.
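To make the second render step concrete, here is a sketch of the fragment shader it could use, held as a Java string the way a GLSurfaceView.Renderer typically embeds it. The uniform names and the `uPipRect` picture-in-picture rectangle are illustrative assumptions; the real requirements are only the `GL_OES_EGL_image_external` extension and the `samplerExternalOES` sampler for the SurfaceTexture input.

```java
// Sketch of the second-pass fragment shader: sample the FBO (video 1) as an
// ordinary 2D texture, and the SurfaceTexture (video 2) as an external
// texture inside a small corner rectangle.
public final class PipShaders {
    public static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform sampler2D uBackground;        // FBO texture, unit 0\n" +
            "uniform samplerExternalOES uOverlay;  // SurfaceTexture, external\n" +
            "uniform vec4 uPipRect;                // x, y, w, h of small window\n" +
            "void main() {\n" +
            "  vec4 color = texture2D(uBackground, vTexCoord);\n" +
            "  vec2 rel = (vTexCoord - uPipRect.xy) / uPipRect.zw;\n" +
            "  if (all(greaterThanEqual(rel, vec2(0.0))) &&\n" +
            "      all(lessThanEqual(rel, vec2(1.0)))) {\n" +
            "    color = texture2D(uOverlay, rel); // picture-in-picture region\n" +
            "  }\n" +
            "  gl_FragColor = color;\n" +
            "}\n";

    public static void main(String[] args) {
        // Sanity check: both sampler kinds are declared.
        System.out.println(FRAGMENT_SHADER.contains("samplerExternalOES")
                && FRAGMENT_SHADER.contains("sampler2D"));
    }
}
```

To capture the merged result rather than just display it, the same two-pass draw can target a Surface obtained from a MediaCodec encoder instead of the screen.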


FFmpeg is a very active project, with lots of changes and releases all the time.

You should look at the Xuggler project, this provides a Java API for what you want to do, and they have tight integration with ffmpeg.


Should you choose to go down the Runtime.exec() path, this Red5 thread should be useful:
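For the Runtime.exec() route, the picture-in-picture merge the question describes is a single `overlay` filter invocation. Below is a hedged sketch that only builds the command line; the binary path and file names are placeholders, and only the `-filter_complex` syntax is actual FFmpeg usage (scale the second input down, then overlay it 10 px from the bottom-right corner).

```java
import java.util.Arrays;
import java.util.List;

// Build an ffmpeg picture-in-picture command suitable for ProcessBuilder.
public final class FfmpegOverlay {

    public static List<String> buildCommand(String ffmpeg, String bigVideo,
                                            String smallVideo, String output) {
        return Arrays.asList(
                ffmpeg,
                "-i", bigVideo,    // input 0: full-screen background video
                "-i", smallVideo,  // input 1: video for the small corner window
                // Scale input 1 to 320 px wide (keeping aspect), then overlay
                // it 10 px in from the bottom-right corner of input 0.
                "-filter_complex",
                "[1:v]scale=320:-1[pip];[0:v][pip]overlay=W-w-10:H-h-10",
                output);
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("ffmpeg", "bg.mp4", "small.mp4", "out.mp4");
        System.out.println(String.join(" ", cmd));
        // To actually run it:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

The resulting file plays as one stream with both videos composited, which matches the "user can play that stream later" requirement.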