Create video from screen grabs in Android





I would like to record user interaction in a video that people can then upload to their social media sites.

For example, the Talking Tom Cat Android app has a little camcorder icon. The user can press the camcorder icon, then interact with the app, press the icon again to stop the recording, and then the video is processed/converted ready for upload.

I think I can use setDrawingCacheEnabled(true) to save images but don't know how to add audio or make a video.

Update: After further reading, I think I will need to use the NDK and ffmpeg. I would prefer not to, but if there are no other options, does anyone know how to do this?

Does anyone know how to do this in Android?

Relevant links...

Android Screen capturing or make video from images

how to record screen video as like Talking Tomcat application does in iphone?



In case someone wants to implement the same, I figured it out myself. First of all, to my surprise I found that Talking Tom Cat is not a 3D game app; it uses frame animations for all movements. If someone wants to capture that kind of view, they can use the following code:

UIGraphicsBeginImageContext(self.view.bounds.size); // or self.view.window.frame.size
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()]; // render the view's layer into the context
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext(); // grab the rendered frame
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil); // save the frame to the photo album

Then use AVAssetWriter to create the video from those frames. Of course you can find code for that in some other post. For me it's not useful, as I have to capture the 3D part. Cheers.


Use the MediaCodec API with CONFIGURE_FLAG_ENCODE to set it up as an encoder. No ffmpeg required :)

You've already found how to grab the screen in the other question you linked to; now you just need to feed each captured frame to MediaCodec, setting the appropriate format flags, timestamp, etc.

EDIT: Sample code for this was hard to find, but here it is, hat tip to Martin Storsjö. Quick API walkthrough:

MediaFormat inputFormat = MediaFormat.createVideoFormat("video/avc", width, height);
inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 75);
inputFormat.setInteger("stride", stride);
inputFormat.setInteger("slice-height", sliceHeight);

encoder = MediaCodec.createByCodecName("OMX.TI.DUCATI1.VIDEO.H264E"); // name is chipset-specific; find it in the media codec list (see the enumeration sketch below)

encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
encoderInputBuffers = encoder.getInputBuffers();
encoderOutputBuffers = encoder.getOutputBuffers();

byte[] inputFrame = new byte[frameSize];

while ( ... have data ... ) {
    int inputBufIndex = encoder.dequeueInputBuffer(timeout);

    if (inputBufIndex >= 0) {
        ByteBuffer inputBuf = encoderInputBuffers[inputBufIndex];
        inputBuf.clear();

        // HERE: fill in the input frame in the correct color format, taking strides into account.
        // This is an example for I420 (planar Y plane, then U, then V; chroma subsampled 2x2)
        for (int i = 0; i < height; i++) {      // i = row
            for (int j = 0; j < width; j++) {   // j = column
                inputFrame[ i * stride + j ] = ...; // Y[i][j]
                inputFrame[ stride * sliceHeight + (i/2) * (stride/2) + j/2 ] = ...; // U[i/2][j/2]
                inputFrame[ stride * sliceHeight * 5/4 + (i/2) * (stride/2) + j/2 ] = ...; // V[i/2][j/2]
            }
        }

        inputBuf.put(inputFrame);

        encoder.queueInputBuffer(
            inputBufIndex,
            0 /* offset */,
            frameSize /* number of bytes filled in above */,
            presentationTimeUs,
            0);
    }

    int outputBufIndex = encoder.dequeueOutputBuffer(info, timeout);

    if (outputBufIndex >= 0) {
        ByteBuffer outputBuf = encoderOutputBuffers[outputBufIndex];

        // HERE: read the encoded data from outputBuf (see info.offset and info.size)

        encoder.releaseOutputBuffer(
            outputBufIndex, 
            false);
    }
    else {
        // Handle MediaCodec.INFO_TRY_AGAIN_LATER, INFO_OUTPUT_FORMAT_CHANGED and
        // INFO_OUTPUT_BUFFERS_CHANGED (re-fetch encoderOutputBuffers) here
    }
}

There are also some open issues.

EDIT: You'd feed the data in as a byte buffer in one of the supported pixel formats, for example I420 or NV12. Unfortunately there is no perfect way to determine which formats work on a particular device; however, the formats you can get from the camera typically also work with the encoder.
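
To pick the codec name and colorFormat used above, one option is to enumerate the device's encoders. This is a minimal sketch using the pre-API-21 MediaCodecList methods (matching the era of the code above); the class name EncoderProbe is just for illustration:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class EncoderProbe {
    // Log every encoder that supports the given MIME type and its advertised color formats.
    public static void dumpEncoders(String mime) {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase(mime)) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                StringBuilder formats = new StringBuilder();
                for (int colorFormat : caps.colorFormats) {
                    // e.g. 19 = COLOR_FormatYUV420Planar (I420), 21 = COLOR_FormatYUV420SemiPlanar
                    formats.append(colorFormat).append(' ');
                }
                Log.i("EncoderProbe", info.getName() + " -> " + mime + " color formats: " + formats);
            }
        }
    }
}

Calling dumpEncoders("video/avc") and picking one of the logged codec names and color formats is one way to fill in the chipset-specific values above.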


As long as you have bitmaps, you can turn them into video using JCodec ( http://jcodec.org ).

Here's a sample image sequence encoder: https://github.com/jcodec/jcodec/blob/master/src/main/java/org/jcodec/api/SequenceEncoder.java . You can modify it for your purposes by replacing BufferedImage with Bitmap.

Use these helper methods:

public static Picture fromBitmap(Bitmap src) {
  Picture dst = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB); // org.jcodec.common.model.ColorSpace
  fromBitmap(src, dst);
  return dst;
}

public static void fromBitmap(Bitmap src, Picture dst) {
  int[] dstData = dst.getPlaneData(0);
  int[] packed = new int[src.getWidth() * src.getHeight()];

  src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());

  for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
    for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
      int rgb = packed[srcOff];
      dstData[dstOff]     = (rgb >> 16) & 0xff;
      dstData[dstOff + 1] = (rgb >> 8) & 0xff;
      dstData[dstOff + 2] = rgb & 0xff;
    }
  }
}

public static Bitmap toBitmap(Picture src) {
  Bitmap dst = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
  toBitmap(src, dst);
  return dst;
}

public static void toBitmap(Picture src, Bitmap dst) {
  int[] srcData = src.getPlaneData(0);
  int[] packed = new int[src.getWidth() * src.getHeight()];

  for (int i = 0, dstOff = 0, srcOff = 0; i < src.getHeight(); i++) {
    for (int j = 0; j < src.getWidth(); j++, dstOff++, srcOff += 3) {
      packed[dstOff] = 0xff000000 | (srcData[srcOff] << 16) | (srcData[srcOff + 1] << 8) | srcData[srcOff + 2]; // opaque alpha + RGB
    }
  }
  dst.setPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
}
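
Putting it together, here is a minimal sketch of the per-frame loop, assuming the linked SequenceEncoder exposes encodeNativeFrame(Picture) and finish() (method names may differ between JCodec releases) and that the fromBitmap helper above is in scope (e.g. via static import):

import java.io.File;
import java.io.IOException;

import android.graphics.Bitmap;

import org.jcodec.api.SequenceEncoder;

public class BitmapSequenceEncoder {
    private final SequenceEncoder encoder;

    public BitmapSequenceEncoder(File out) throws IOException {
        encoder = new SequenceEncoder(out);
    }

    // Feed one captured frame; fromBitmap is the helper defined above.
    public void addFrame(Bitmap frame) throws IOException {
        encoder.encodeNativeFrame(fromBitmap(frame));
    }

    // Call once after the last frame to flush and close the output file.
    public void finish() throws IOException {
        encoder.finish();
    }
}

You would call addFrame() for every screen grab and finish() when the user stops recording.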

You can also wait for the JCodec team to implement full Android support; they are working on it according to this: http://jcodec.org/news/no_deps.html



Yes, you can, with no rooting and no ROM modification. The best way to do this is to build a virtual app that runs the other app as a plugin, so that you can modify anything in the target app. There is a lot of work involved, but the good news is that several open source projects already do this. After that, the next step is not so difficult: you only have to hook the few libs in /system/lib that affect camera recording. In fact, I have done this on my device, but I modified the system libs directly, so it has to be rooted, of course. It works well with almost all apps, except some that use a service to capture video; for those we would have to modify the service lib, which is a little more difficult.


The question isn't new, but I thought I'd pitch in:

We provide an SDK called "Everyplay" that allows you to do exactly what you're looking for. It's free to use, and is lightweight.

We provide out-of-the-box integrations for Unity3D, cocos2d (1.x, 2.x) and cocos2d-x, and you can of course integrate it with a custom OpenGL-based game engine.

The documentation is available at https://developers.everyplay.com/doc

The documentation contains an example app key to use when developing, but you can of course sign up for your own client key at https://developers.everyplay.com/


How to create a fake Camera

I think what you are looking for is a way to encode videos to H.264 in a way similar to what MediaRecorder does but not from the camera. You do not particularly care whether this is done with a "fake camera" or in some other way, correct? In that case...

You can use the MediaCodec API, available in Android 4.1 and later. You just give it a series of images and it will create video encoded with the hardware encoder (where available). Some sample code: Create video from screen grabs in android and Encoding H.264 from camera with Android MediaCodec


You can use the following code for screen capture in Android:

ImageView v1 = (ImageView) findViewById(R.id.mImage);
v1.setDrawingCacheEnabled(true);  // enable the view's drawing cache
Bitmap bm = v1.getDrawingCache(); // grab the view's contents as a Bitmap
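
Note that the drawing-cache APIs were deprecated in later Android versions; a minimal sketch of the Canvas-based alternative (drawing any laid-out View into a Bitmap yourself; the class name ViewCapture is just for illustration):

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

public final class ViewCapture {
    // Render an already laid-out View into a Bitmap without using the drawing cache.
    public static Bitmap capture(View view) {
        Bitmap bitmap = Bitmap.createBitmap(
                view.getWidth(), view.getHeight(), Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bitmap);
        view.draw(canvas); // draws the view hierarchy into the bitmap-backed canvas
        return bitmap;
    }
}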

For creating a video from the images, visit this link.