I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol.
From searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android:
- Use ffmpeg with libstagefright (overview of libstagefright), or use libstagefright in the OS directly, like here.
- Use OpenMAX on a specific hardware platform, like here for Samsung devices and here for the Qualcomm Snapdragon series.
- Some people mentioned PVPlayer.
Some people “say” libstagefright is the only way, while the Qualcomm guys have obviously had success.
Currently I am not sure which of these could work, and I am a little confused. If they all could work, I would certainly prefer a hardware-independent method.
I have tested the H/W acceleration of a few video players on a Galaxy Tab 7.7 (Android 3.2, Exynos): VLC, MoboPlayer, RockPlayer, and VPlayer. RockPlayer and MoboPlayer work fine, VLC doesn’t work, and VPlayer seems to have a rendering bug that hurts its performance.
Anyway, I did an ‘operation’ on RockPlayer and deleted all its .so libs in data\data\com.redirecting\rockplayer: software decoding crashes, while HW decoding still works fine! I wonder how they did that. It appears to me that HW acceleration could be independent of the hardware platform.
Can someone nail this problem down, or provide any reference with additional information or better detail?
To answer the above question, let me introduce a few concepts related to Android.
Android uses OpenMAX for its codec interface. Hence all native codecs (hardware-accelerated or otherwise) provide an OpenMAX interface. This interface is used by StageFright (the player framework) to decode media using the codec.
Android allows Java applications to interact with underlying C/C++ native libraries via the NDK. This requires using JNI (the Java Native Interface).
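As a minimal sketch of that JNI bridge (all names here — the class, the library, the method — are hypothetical, not from the question): the Java side declares a `native` method and loads a shared library built with the NDK, and the C/C++ side provides the matching symbol.

```java
// Hypothetical JNI bridge to a native decoder library.
public class NativeDecoder {
    static {
        // Loads libnativedecoder.so, which you would build with the NDK
        // (the library name is an assumption for this sketch).
        System.loadLibrary("nativedecoder");
    }

    // Declared here, implemented in C/C++ with the matching JNI signature:
    // JNIEXPORT jint JNICALL
    // Java_NativeDecoder_decodeFrame(JNIEnv *env, jobject thiz, jbyteArray nal);
    public native int decodeFrame(byte[] nalUnit);
}
```

The native implementation is where you would talk to the OMX decoder directly on pre-4.1 devices, which is exactly the non-trivial part described below.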
Now, coming to your question:
How to tap native decoder to decode raw video bitstream?
In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code to interact directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and how to map OMX to the application using the NDK.
In 4.1 (Jelly Bean), Android provides access to hardware-accelerated decoders at the application level through Java APIs (the new MediaCodec class). More details about the new APIs at http://developer.android.com/about/versions/android-4.1.html#Multimedia
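A rough sketch of feeding raw H.264 access units from a custom protocol into MediaCodec (API 16+) could look like the following. The class name, the width/height handling, and the assumption that your protocol delivers one complete access unit at a time are all illustrative, not part of the original answer:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.nio.ByteBuffer;

// Sketch: decode raw H.264 access units with the hardware decoder and
// render them to a Surface. Assumes the protocol delivers whole frames.
public class H264Decoder {
    private final MediaCodec codec;

    public H264Decoder(Surface outputSurface, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, outputSurface, null, 0);
        codec.start();
    }

    // Queue one access unit received from the custom protocol.
    public void queueFrame(byte[] accessUnit, long presentationTimeUs) {
        int inIndex = codec.dequeueInputBuffer(10_000 /* µs timeout */);
        if (inIndex >= 0) {
            ByteBuffer input = codec.getInputBuffers()[inIndex];
            input.clear();
            input.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
        }
        // Release any decoded frame to the Surface for display.
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 0);
        if (outIndex >= 0) {
            codec.releaseOutputBuffer(outIndex, true /* render to Surface */);
        }
    }
}
```

Because the decoder is selected by MIME type ("video/avc") rather than by vendor, this is exactly the hardware-independent route the question asks for: the same Java code runs on Exynos, Snapdragon, etc., with the platform supplying its own OMX-backed codec underneath.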
Another option is ExoPlayer. It’s a Google-sponsored open-source project that replaces the platform’s MediaPlayer. Each component in the pipeline is extensible, from the sample source (how the H.264 frames are extracted from your custom protocol) to the rendering (to a Surface, SurfaceTexture, etc.).
It includes a nice demo app showing usage.