34

I have been trying to get H264 encoding to work with input captured by the camera on an Android tablet, using the new low-level MediaCodec API. I have run into some difficulties, since the MediaCodec API is poorly documented, but I finally have something working.

I'm setting up the camera as follows:

        Camera.Parameters parameters = mCamera.getParameters();
        parameters.setPreviewFormat(ImageFormat.YV12); // <1>
        parameters.setPreviewFpsRange(4000,60000);
        parameters.setPreviewSize(640, 480);            
        mCamera.setParameters(parameters);

For the encoding part, I'm instantiating the MediaCodec object as follows:

    mediaCodec = MediaCodec.createEncoderByType("video/avc");
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar); // <2>
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
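
For completeness, the preview bytes are pushed into the encoder with the synchronous buffer API, roughly as sketched below (this is a simplified sketch rather than my exact code; error handling, preview-buffer management and the timestamp computation are illustrative):

    mCamera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1); // block until a buffer is free
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = mediaCodec.getInputBuffers()[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(data); // raw YUV bytes straight from the camera preview
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length,
                        System.nanoTime() / 1000, 0); // presentation time in microseconds
            }
        }
    });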

The final goal is to create an RTP stream (and communicate with Skype), but so far I am only streaming the raw H264 directly to my desktop. There I use the following GStreamer pipeline to show the result:

gst-launch udpsrc port=5555 ! video/x-h264,width=640,height=480,framerate=15/1 ! ffdec_h264 ! autovideosink
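
On the Android side, the encoded output is drained and pushed out over UDP more or less like this (again a simplified sketch: socket and desktopAddress are illustrative names, and the INFO_* return codes of dequeueOutputBuffer are not handled):

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        ByteBuffer outputBuffer = mediaCodec.getOutputBuffers()[outputBufferIndex];
        outputBuffer.position(bufferInfo.offset);
        outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
        byte[] outData = new byte[bufferInfo.size];
        outputBuffer.get(outData);
        // Raw Annex-B H264 bytes, sent as-is (no RTP packetization yet)
        socket.send(new DatagramPacket(outData, outData.length, desktopAddress, 5555));
        mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }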

Everything works well, except for the colors. I need to set two color formats in the code: one for the camera preview (the line tagged <1>) and one for the MediaCodec object (the line tagged <2>).

To determine the acceptable values for line <1>, I used parameters.getSupportedPreviewFormats(). From this I know that the only formats supported by the camera are ImageFormat.NV21 and ImageFormat.YV12.

For <2>, I retrieved the MediaCodecInfo.CodecCapabilities object for type video/avc, which lists the integer values 19 (corresponding to MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) and 2130708361 (which doesn't correspond to any named constant in MediaCodecInfo.CodecCapabilities).

Any value other than these two results in a crash.
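
For reference, these values can be enumerated with the MediaCodecList / CodecCapabilities API (available since API 16), roughly like this:

    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) {
            continue;
        }
        for (String type : info.getSupportedTypes()) {
            if (type.equals("video/avc")) {
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                for (int colorFormat : caps.colorFormats) {
                    Log.d("CodecCaps", info.getName() + " supports color format " + colorFormat);
                }
            }
        }
    }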

Combining these settings gives different results, which I'll show below. Here's the screenshot on the Android tablet (i.e. the "real" colors), followed by the results as shown by GStreamer for each combination:

<1> = NV21, <2> = COLOR_FormatYUV420Planar

<1> = NV21, <2> = 2130708361

<1> = YV12, <2> = COLOR_FormatYUV420Planar

<1> = YV12, <2> = 2130708361

As can be seen, none of these is satisfactory. The YV12 colorspace looks the most promising, but it appears that red (Cr) and blue (Cb) are swapped. The NV21 output looks interlaced, I guess (though I'm no expert in this field).

Since the purpose is to communicate with Skype, I assume I shouldn't change the decoder (i.e. the GStreamer command), right? Is this to be solved on the Android side, and if so: how? Or can it be solved by adding certain RTP payload information? Any other suggestions?

  • Looks like 2130708361 corresponds to COLOR_FormatSurface. You must be using API version > 17.
    – Ryan
    Nov 6, 2013 at 15:35

5 Answers

7

I solved it by swapping the byte planes myself at the Android level, using a simple function:

public byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    int ySize = width * height;
    int chromaSize = ySize / 4;
    // The Y plane is identical in both layouts
    System.arraycopy(yv12bytes, 0, i420bytes, 0, ySize);
    // YV12 stores V before U; I420 expects U before V, so swap the two chroma planes
    System.arraycopy(yv12bytes, ySize + chromaSize, i420bytes, ySize, chromaSize); // U
    System.arraycopy(yv12bytes, ySize, i420bytes, ySize + chromaSize, chromaSize); // V
    return i420bytes;
}
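
For reference, the idea is to call this on every preview frame (the byte[] delivered to onPreviewFrame) and hand the returned I420 buffer to the encoder's input buffer instead of the raw YV12 data.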
  • FWIW, some devices accept COLOR_FormatYUV420Planar, some accept COLOR_FormatYUV420SemiPlanar.
    – fadden
    Jun 25, 2013 at 20:00
  • In 4.3 (API 18) you can use Surface input rather than a ByteBuffer. This avoids the need to correct the color planes, and it is portable. See the CameraToMpegTest sample on bigflake.com/mediacodec (a minimal sketch of that configure/createInputSurface flow follows these comments).
    – fadden
    Jul 24, 2013 at 19:29
  • yeah, 500, 1000 msec, please delete this answer
    – user25
    Jun 22, 2018 at 20:39
  • @user25, This question and answer are more than 5 years old. We all knew the image format was wrong, but the function to change it wasn't documented; it possibly didn't even exist. If you are still having a similar problem, you should write a new answer, because other people are sure to be having the same problem.
    Jun 24, 2018 at 21:37
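
To illustrate the Surface-input approach mentioned in the comment above (this needs API 18; a minimal sketch, not the full CameraToMpegTest code, and the SurfaceTexture/GLES plumbing is left out):

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // the mysterious 2130708361

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface(); // must be called between configure() and start()
    encoder.start();
    // Camera frames are then rendered onto inputSurface (via a SurfaceTexture and GLES),
    // so no byte-level color conversion is needed at all.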
6

I think it's more efficient to swap the values in place.

        int wh4 = input.length / 6; // wh4 = width*height/4, the size of one chroma plane
        byte tmp;
        // Swap the first chroma plane (V in YV12) with the second one (U) in place
        for (int i = wh4 * 4; i < wh4 * 5; i++) {
            tmp = input[i];
            input[i] = input[i + wh4];
            input[i + wh4] = tmp;
        }

Maybe even better, you can instead replace

            inputBuffer.put(input);

with the 3 planar slices in the correct order:

            inputBuffer.put(input, 0, wh4 * 4);   // Y plane
            inputBuffer.put(input, wh4 * 5, wh4); // U plane (stored second in YV12)
            inputBuffer.put(input, wh4 * 4, wh4); // V plane (stored first in YV12)

I think that should add only a tiny overhead.

  • With cameraParameters.setPreviewFormat(ImageFormat.YV12), your solution (putting the 3 planar slices in the correct order) does nothing
    – user25
    Jun 22, 2018 at 20:36
3

It seems Android is transmitting in YV12, but the format expected downstream is plain YUV420 planar (I420). These formats are identical except that the U and V planes are in the opposite order, which explains the swapping of red and blue.

The best fix would of course be on the Android side, but if there is no way to set compatible settings for the camera and the encoder, you will have to force the format on the GStreamer side.

This can be done by adding a capssetter element after ffdec_h264:

... ! ffdec_h264 ! capssetter caps="video/x-raw-yuv, format=(fourcc)YV12" ! colorspace ! ...
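
Combined with the pipeline from the question, the receiving side would then look something like this (untested, GStreamer 0.10 element names):

gst-launch udpsrc port=5555 ! video/x-h264,width=640,height=480,framerate=15/1 ! ffdec_h264 ! capssetter caps="video/x-raw-yuv, format=(fourcc)YV12" ! colorspace ! autovideosink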

  • Could I also fix this by adding a GStreamer pipeline on the Android side? Given that the above already uses up to 50% of the CPU, do you think there would be enough resources left to do that, and eventually to do some H264 decoding in parallel?
    – gleerman
    Dec 6, 2012 at 9:43
  • Android's default is NV21, not YV12
    – user25
    Jun 22, 2018 at 21:20
0

With ImageFormat.NV21 set on the camera and COLOR_FormatYUV420Planar on the encoder, I see a similar blue shadow overlapping the image. As I understand it, the swap function above cannot be used in my case; any suggestions for an algorithm that would work here? PS: I get a completely black screen at the decoder when the camera preview format is set to YV12.

  • The above swap will only work if you set ImageFormat.YV12 on the camera side. Conversion from NV21 will be a lot more complex, since NV21 stores the chroma as a single interleaved plane (see fourcc.org/yuv.php). If I understand correctly, that interleaved plane has swapped U and V values as well, so the swap function may come in handy anyway (slightly modified, since the width and height will be only half); a sketch of a full NV21-to-I420 conversion follows this comment.
    – gleerman
    Dec 17, 2012 at 16:26
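
For what it's worth, a de-interleaving conversion along those lines could look like this (a minimal sketch, not taken from any of the answers above; it assumes an NV21 buffer of size width*height*3/2 with no row padding):

    // NV21: Y plane followed by interleaved chroma samples in V,U order (VUVU...).
    // I420 (COLOR_FormatYUV420Planar): Y plane, then the full U plane, then the full V plane.
    public static byte[] convertNV21toI420(byte[] nv21, int width, int height) {
        int ySize = width * height;
        int chromaSize = ySize / 4;
        byte[] i420 = new byte[nv21.length];
        System.arraycopy(nv21, 0, i420, 0, ySize); // Y plane is identical in both layouts
        for (int i = 0; i < chromaSize; i++) {
            i420[ySize + i] = nv21[ySize + 2 * i + 1];          // U
            i420[ySize + chromaSize + i] = nv21[ySize + 2 * i]; // V
        }
        return i420;
    }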
0

I've used the code from here to convert the camera images to video using the media encoder, and it caused the same kind of issue.

So, referring to this Wikipedia article, I modified the code and the output now seems to work fine: instead of the interleaved uvuv... byte order, I changed the bytes to the planar uuuu...vvvv... order.
