Update: it seems the issue is that this device needs dimensions that are a multiple of 32, not just a multiple of 16. The thing is, I still have no idea how to determine what qualifies as a problematic device. This one does use a qcom encoder (which I've seen be problematic on Android 4.x, but this is Android 5.1.1).
I’m encoding a video in my Android app. I’m using Android MediaCodec to do so, converting RGB of each frame to YUV and passing in the pixels.
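For context, the RGB→YUV step is a standard color-space conversion. This is a minimal sketch of the common BT.601 studio-swing fixed-point conversion (not the actual code from the app; `rgbToYuv` and `clamp` are names I made up for illustration):

```java
// Convert one RGB pixel (0..255 per channel) to BT.601 studio-swing YUV,
// using the usual 8-bit fixed-point integer coefficients.
static int clamp(int v) {
    return Math.max(0, Math.min(255, v));
}

static int[] rgbToYuv(int r, int g, int b) {
    int y = (( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16; // luma: 16..235
    int u = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128; // chroma centered at 128
    int v = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
    return new int[]{clamp(y), clamp(u), clamp(v)};
}
```

For a semi-planar format like the one these encoders use, the Y values form one plane and the U/V values are interleaved in a second plane at half resolution.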
The code I use has been in place for a while, and works across any device I’ve come across.
A user came to me with a bug report that their MP4s were coming out weird. The device is a Samsung T337A (a Galaxy Tab 4).
Here are what MP4 exports look like:
NOTE – For whatever odd reason, it doesn't happen at all resolutions. It's confirmed to happen at 768×432 and 1280×720, but it does not happen at 640×352, for example (my app makes sure all resolutions are divisible by 16 by default).
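Worth noting: in the failing resolutions, one dimension is not divisible by 32 (432 and 720), while both dimensions of the working one are (640 and 352). If 32-alignment turns out to be the constraint, a helper like this (a hypothetical `align`, not from the app) would round dimensions up:

```java
// Round value up to the nearest multiple of alignment (alignment > 0).
static int align(int value, int alignment) {
    return ((value + alignment - 1) / alignment) * alignment;
}
```

For example, `align(432, 32)` gives 448 and `align(720, 32)` gives 736, while 640 and 352 pass through unchanged.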
On a Nexus 5X (which uses the same semi-planar YUV format) the output works at all resolutions.
So it’s something with this device, and maybe other devices I don’t know about?
I've inspected all of the codec's output values, and they look normal and identical to the Nexus 5X mentioned above (which works 100% of the time).
MediaCodecInfo being used is OMX.qcom.video.encoder.avc, color format being used is 2135033992 (which is COLOR_FormatYUV420Flexible). So basically, nothing weird.
The code is fairly lengthy, so I'll post it if necessary, but I'm just looking for general ideas about why this happens. I'd understand if it were more common, but the same code works on a vast array of other devices, so something funky is going on…
Make sure that you use the right stride and buffer offset values. With `COLOR_FormatYUV420Flexible`, the layout the encoder actually expects is only knowable per buffer (e.g. via `MediaCodec.getInputImage()` on API 21+, whose `Image.Plane` objects expose the row and pixel strides): the row stride can be larger than the frame width, and some encoders align it to 32 or more, which would explain why only certain resolutions break.
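The core of this is copying each source row to the encoder's stride, rather than assuming the buffer is packed at the frame width. A minimal sketch of that row-by-row copy in plain Java (the `copyPlane` helper and its parameter names are hypothetical; in the app the destination would be the encoder's input buffer and `rowStride` would come from the codec):

```java
// Copy a tightly packed width*height plane into a destination buffer whose
// rows are rowStride bytes apart (rowStride >= width). The same pattern
// applies to the interleaved UV plane of a semi-planar format, with the
// plane's own offset and stride.
static void copyPlane(byte[] src, int width, int height,
                      byte[] dst, int offset, int rowStride) {
    for (int row = 0; row < height; row++) {
        System.arraycopy(src, row * width, dst, offset + row * rowStride, width);
    }
}
```

If the code instead copies `width * height` bytes in one block, the frame shears progressively on any device whose stride exceeds the width — which matches the "weird" output appearing only at some resolutions.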