
Android Video Processing: A Dynamic Time Watermark Effect

July 24, 2019 | 移动技术网IT编程


A recent project handed me a painful requirement: while recording video on Android, dynamically overlay time information, accurate to the second, like a surveillance-camera feed. Crucially, showing the time in the player UI is not enough — it has to be recorded into the video itself, so that the MP4 shows the time on every frame even when played back on a PC.
The approach I finally settled on was to post-process the video after recording finishes.

I went through a lot of material along the way; the most useful options were FFmpeg and the newer MediaCodec family of APIs. Since FFmpeg is implemented in C and drags in a pile of NDK work I'm not familiar with, I focused on MediaCodec.

I referred to a blog post whose flow diagram makes the logic clear at a glance. The overall MediaCodec encode/decode flow and the main function-call sequence were illustrated there (diagram not reproduced here).

The three APIs MediaExtractor, MediaCodec, and MediaMuxer already cover a lot of media-processing work: MediaExtractor + MediaMuxer gives you audio/video clipping, MediaCodec + MediaMuxer gives you a custom recorder, and all three together enable effects editing, filters, and the like.

Adding the time watermark

The catch is that the decoded frames arrive in a YUV format that varies with the capture settings. In my case it was NV21, i.e. YUV420SP. After getting an NV21 frame, I convert it to RGB for rendering, then convert it back to NV21 for the encoder. It looks clumsy and is very time-consuming, but I haven't found a better way yet.

 private Bitmap first;

 private void handleFrameData(byte[] data, MediaCodec.BufferInfo info) {
     // NV21 (YUV420SP) -> RGB, 5-60ms
     ByteArrayOutputStream out = new ByteArrayOutputStream();
     YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, srcWidth, srcHeight, null);
     yuvImage.compressToJpeg(new Rect(0, 0, srcWidth, srcHeight), 100, out);
     byte[] imageBytes = out.toByteArray();

     // Rotate the image; this also fixes the 90-degree rotation when played on a PC, 20-50ms
     Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
     Bitmap bitmap = rotaingImageView(videoRotation, image);
     image.recycle();

     // Draw the timestamp text, 0-1ms
     Canvas canvas = new Canvas(bitmap);
     canvas.drawText(videoTimeFormat.format(videoFirstTime + info.presentationTimeUs / 1000), 10, 30, paint);

     // Preview the processed frame, 0-5ms
     first = bitmap;
     handler.sendEmptyMessage((int) (info.presentationTimeUs / 1000));

     synchronized (MediaCodec.class) { // remember to lock
         timeDataContainer.add(new Frame(info, bitmap));
     }
 }

 /**
  * Rotate a bitmap.
  *
  * @param angle  rotation angle in degrees
  * @param bitmap source bitmap
  * @return the rotated bitmap
  */
 public Bitmap rotaingImageView(int angle, Bitmap bitmap) {
     Matrix matrix = new Matrix();
     matrix.postRotate(angle);
     // Create the rotated copy
     return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
 }
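The timestamp string drawn above is just the recording's start time plus the frame's presentation time, converted from microseconds to milliseconds. A minimal sketch of that arithmetic (the method name `stamp` is mine; the format string mirrors the one used in the service, and the UTC timezone is set here only to make the output deterministic):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class WatermarkTime {
    // Mirrors videoTimeFormat.format(videoFirstTime + info.presentationTimeUs / 1000)
    public static String stamp(long videoFirstTimeMs, long presentationTimeUs) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // deterministic for this demo
        return fmt.format(new Date(videoFirstTimeMs + presentationTimeUs / 1000));
    }

    public static void main(String[] args) {
        // A frame 5 seconds into a video whose first frame is at the epoch
        System.out.println(stamp(0L, 5_000_000L)); // 1970/01/01 00:00:05
    }
}
```

Because the presentation time is per-frame, every frame gets the correct wall-clock second burned in, which is exactly the surveillance-camera look.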

Then converting back to NV21:

 /**
  * Get the next frame with the timestamp drawn in.
  */
 private Frame getFrameData() {
     synchronized (MediaCodec.class) { // remember to lock

         if (timeDataContainer.isEmpty()) {
             return null;
         }

         // Take the frame out of the queue; removing it preserves the output
         // order and releases memory promptly
         Frame frame = timeDataContainer.remove(0);

         // Convert back to YUV420SP, 120-160ms
         frame.data = getNV21(dstWidth, dstHeight, frame.bitmap);

         return frame;
     }
 }

 public static byte[] getNV21(int width, int height, Bitmap scaled) {
     int[] argb = new int[width * height];
     scaled.getPixels(argb, 0, width, 0, 0, width, height);

     byte[] yuv = new byte[width * height * 3 / 2];
     encodeYUV420SP(yuv, argb, width, height);

     scaled.recycle();
     return yuv;
 }


 /**
  * Convert the ARGB data from a Bitmap into YUV420SP (NV21).
  * This YUV420SP buffer can be handed straight to MediaCodec for AVC encoding.
  *
  * @param yuv420sp output buffer for the YUV420SP data
  * @param argb     input ARGB pixels
  * @param width    image width
  * @param height   image height
  */
 public static void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
     final int frameSize = width * height;

     int yIndex = 0;
     int uvIndex = frameSize;

     int r, g, b, y, u, v;
     int index = 0;
     for (int j = 0; j < height; j++) {
         for (int i = 0; i < width; i++) {

             // alpha ((argb[index] & 0xff000000) >>> 24) is not used
             r = (argb[index] & 0xff0000) >> 16;
             g = (argb[index] & 0xff00) >> 8;
             b = (argb[index] & 0xff);

             // well-known RGB to YUV conversion
             y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
             u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
             v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;

             // NV21 has a full-resolution Y plane followed by interleaved VU pairs,
             // subsampled by 2 in both directions: for every 4 Y samples there is
             // 1 V and 1 U, taken every other pixel on every other scanline.
             yuv420sp[yIndex++] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y));
             if (j % 2 == 0 && index % 2 == 0) {
                 yuv420sp[uvIndex++] = (byte) ((v < 0) ? 0 : ((v > 255) ? 255 : v));
                 yuv420sp[uvIndex++] = (byte) ((u < 0) ? 0 : ((u > 255) ? 255 : u));
             }

             index++;
         }
     }
 }
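As a quick sanity check of the coefficients above: they are the BT.601 "studio swing" constants, so pure white should map to Y = 235 and any neutral gray should map to U = V = 128 (zero chroma). A self-contained check (the one-pixel helper `rgbToYuv` is my own, extracted from the loop body):

```java
public class Yuv420Check {
    // Same integer math as one iteration of encodeYUV420SP; returns {y, u, v}
    public static int[] rgbToYuv(int r, int g, int b) {
        int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
        int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
        return new int[]{y, u, v};
    }

    public static void main(String[] args) {
        int[] white = rgbToYuv(255, 255, 255);
        int[] black = rgbToYuv(0, 0, 0);
        System.out.println(white[0] + " " + white[1] + " " + white[2]); // 235 128 128
        System.out.println(black[0] + " " + black[1] + " " + black[2]); // 16 128 128
    }
}
```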

Given the per-stage timings noted in the code, processing in real time while recording is simply impossible; even in a background service, a 3-second 720×480 clip takes about 20 seconds.
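The 20-second figure is plausible when you add things up: an NV21 frame is width × height × 3/2 bytes, a 3-second clip at the 27 fps set below holds about 81 frames, and the per-frame timings in the comments sum to roughly 150-250 ms. A sketch of the arithmetic (the per-frame cost is my estimate from those comments, not a measured constant):

```java
public class CostEstimate {
    // NV21: full-resolution Y plane plus a half-resolution interleaved VU plane
    public static int nv21FrameBytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(nv21FrameBytes(720, 480)); // 518400 bytes per frame
        // ~81 frames in a 3 s clip at 27 fps; at ~250 ms of processing per frame
        // that is ~20 s of wall-clock time, matching the observation above.
        System.out.println(3 * 27); // 81
    }
}
```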

There are also plenty of pitfalls in the decode and encode stages. For example, some phones' encoders don't support a given color format, so to cover more devices the color-format setting will need to change later.

 /**
  * Initialize the encoder.
  */
 private void initMediaEncode(String mime) {
     try {
         MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, dstWidth, dstHeight);
         format.setInteger(MediaFormat.KEY_BIT_RATE, 1024 * 512);
         format.setInteger(MediaFormat.KEY_FRAME_RATE, 27);
         format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
//       format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
         format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

         mediaEncode = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
         mediaEncode.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
     } catch (IOException e) {
         e.printStackTrace();
     }

     if (mediaEncode == null) {
         JLog.e(TAG, "create mediaEncode failed");
         return;
     }
     mediaEncode.start();
 }

Addendum: the color format that matches most phones is MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar = 21. This color format can be read from the decoder's first output buffer, but oddly that format carries no bit rate, key-frame interval, or fps — those you still have to set yourself. Why did I use YUV420Flexible at first? Because the Android sources mark YUV420SemiPlanar as deprecated:

 /** @deprecated Use {@link #COLOR_FormatYUV420Flexible}. */
 public static final int COLOR_FormatYUV420SemiPlanar = 21;

Still, the actual format can now be read out when decoding the source file's first buffer:

 case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
     MediaFormat format = mediaDecode.getOutputFormat();
     Log.d(TAG, "new format " + format);
     if (format != null && format.containsKey(MediaFormat.KEY_COLOR_FORMAT)) {
         videoColorFormat = format.getInteger(MediaFormat.KEY_COLOR_FORMAT);
         Log.d(TAG, "decode extract get videoColorFormat = " + videoColorFormat); // color format of the decoded video
     }
     initMediaEncode(videoColorFormat); // initialize the encoder with that color format
     break;
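For logging, it helps to translate that raw integer into a readable name. A small lookup (the constant values are taken from MediaCodecInfo.CodecCapabilities; the helper class itself is my own):

```java
public class ColorFormatName {
    // Values from MediaCodecInfo.CodecCapabilities
    public static final int COLOR_FormatYUV420Planar = 19;
    public static final int COLOR_FormatYUV420SemiPlanar = 21;
    public static final int COLOR_FormatYUV420Flexible = 0x7F420888;

    public static String name(int colorFormat) {
        switch (colorFormat) {
            case COLOR_FormatYUV420Planar:     return "YUV420Planar (I420)";
            case COLOR_FormatYUV420SemiPlanar: return "YUV420SemiPlanar";
            case COLOR_FormatYUV420Flexible:   return "YUV420Flexible";
            default:                           return "unknown (" + colorFormat + ")";
        }
    }

    public static void main(String[] args) {
        System.out.println(name(21)); // YUV420SemiPlanar
    }
}
```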

Full source:

import android.annotation.TargetApi;
import android.app.Service;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMetadataRetriever;
import android.media.MediaMuxer;
import android.os.Binder;
import android.os.Build;
import android.os.IBinder;
import android.os.Message;
import android.support.annotation.Nullable;
import android.widget.Toast;


import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.lang.ref.WeakReference;
import java.nio.ByteBuffer;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.List;

/**
 * Created by user on 2016/8/13.
 */
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
public class TestCodecService extends Service {

 private MediaExtractor extractor;
 private MediaMuxer muxer;
 private final static String TAG = "px";
 private final String tag = this.getClass().getSimpleName();
 private MediaFormat format;

 private int videoMaxInputSize = 0, videoRotation = 0;
 private long videoDuration;

 private boolean decodeOver = false, encoding = false, mCancel, mDelete;

 // index of the video track in the stream
 private int videoTrackIndex = -1;

 private MediaCodec mediaDecode, mediaEncode;
 private ByteBuffer[] decodeInputBuffers, decodeOutputBuffers;

 private ArrayList<Frame> timeDataContainer; // queue of decoded frames

 private MediaCodec.BufferInfo decodeBufferInfo;

 private int srcWidth, srcHeight, dstWidth, dstHeight;

 private SimpleDateFormat videoTimeFormat;

 private int mProgress, mMax;
 private VideoCodecDao codecDao;

 // paint used to draw the timestamp
 private Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

 @Override
 public void onCreate() {
     super.onCreate();
     JLog.d(TAG, "onCreate");
     // display format of the video timestamp
     videoTimeFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");

     timeDataContainer = new ArrayList<>();

     // initialize the paint
     paint.setColor(Color.WHITE);
     paint.setTextSize(20);
     codecDao = VideoCodecDao.getInstance(JingRuiApp.getJRApplicationContext());
 }

 @Override
 public void onDestroy() {
     super.onDestroy();
     JLog.d(TAG, "onDestroy");
     decodeOver = true;
     encoding = false;
 }

 private void init(String srcPath, String dstPath) {
     MediaMetadataRetriever mmr = new MediaMetadataRetriever();
     mmr.setDataSource(srcPath);
     try {
         srcWidth = Integer.parseInt(mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH));
         srcHeight = Integer.parseInt(mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT));
     } catch (IllegalArgumentException e) {
         e.printStackTrace();
     } catch (IllegalStateException e) {
         e.printStackTrace();
     }

     try {
         extractor = new MediaExtractor();
         extractor.setDataSource(srcPath);

         String mime = null;
         for (int i = 0; i < extractor.getTrackCount(); i++) {
             // detailed format/configuration of each track
             MediaFormat format = extractor.getTrackFormat(i);
             mime = format.getString(MediaFormat.KEY_MIME);
             if (mime.startsWith("video/")) {
                 videoTrackIndex = i;
                 this.format = format;
             } else {
                 continue; // audio and other tracks are skipped
             }
         }

         extractor.selectTrack(videoTrackIndex); // read only the video track

         srcWidth = format.getInteger(MediaFormat.KEY_WIDTH);
         srcHeight = format.getInteger(MediaFormat.KEY_HEIGHT);
         videoMaxInputSize = format.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
         videoDuration = format.getLong(MediaFormat.KEY_DURATION);
         //videoRotation = format.getInteger(MediaFormat.KEY_ROTATION);
         videoRotation = 90; // older API levels cannot read the rotation, so it is hardcoded here

         if (videoRotation == 90) {
             dstWidth = srcHeight;
             dstHeight = srcWidth;
         } else if (videoRotation == 0) {
             dstWidth = srcWidth;
             dstHeight = srcHeight;
         }
         mMax = (int) (videoDuration / 1000);
         //int bit = this.format.getInteger(MediaFormat.KEY_BIT_RATE);

         JLog.d(TAG, "videoWidth=" + srcWidth + ",videoHeight=" + srcHeight + ",videoMaxInputSize=" + videoMaxInputSize + ",videoDuration=" + videoDuration + ",videoRotation=" + videoRotation);

         // muxer that writes the output file
         muxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
         // the video track is added to the muxer later, once the encoder reports its output format
         //videoTrackIndex = muxer.addTrack(format);

         MediaCodec.BufferInfo videoInfo = new MediaCodec.BufferInfo();
         videoInfo.presentationTimeUs = 0;

         initMediaDecode(mime);
         initMediaEncode(mime);

     } catch (IOException e) {
         e.printStackTrace();
     }
 }


 // pull out one frame at a time
 @TargetApi(Build.VERSION_CODES.LOLLIPOP)
 private void extract() {
     int inputIndex = mediaDecode.dequeueInputBuffer(-1); // get a free input buffer; -1 waits forever, 0 returns immediately. -1 is recommended here to avoid dropping frames

     if (inputIndex < 0) {
         JLog.d("px", "=========== code over =======");
         return;
     }

     ByteBuffer inputBuffer = decodeInputBuffers[inputIndex]; // the input buffer itself
     inputBuffer.clear();

     int length = extractor.readSampleData(inputBuffer, 0); // read one sample into the decode queue

     if (length < 0) {
         JLog.d("px", "extract over");
         decodeOver = true;
         return;
     } else {
         MediaCodec.BufferInfo videoInfo = new MediaCodec.BufferInfo();
         videoInfo.offset = 0;
         videoInfo.size = length;
         // the sample flags only tell us whether this is a sync (I) frame
         videoInfo.flags = extractor.getSampleFlags();
         videoInfo.presentationTimeUs = extractor.getSampleTime(); // sample timestamp

         // decode the sample
         decode(videoInfo, inputIndex);
         extractor.advance(); // move to the next sample
     }
 }


 private void handleFrameData(byte[] data, MediaCodec.BufferInfo info) {
     // NV21 (YUV420SP) -> RGB, 5-60ms
     ByteArrayOutputStream out = new ByteArrayOutputStream();
     YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, srcWidth, srcHeight, null);
     yuvImage.compressToJpeg(new Rect(0, 0, srcWidth, srcHeight), 100, out);
     byte[] imageBytes = out.toByteArray();

     // rotate the image, 20-50ms
     Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
     Bitmap bitmap = rotaingImageView(videoRotation, image);
     image.recycle();

     // draw the timestamp text, 0-1ms
     Canvas canvas = new Canvas(bitmap);
     canvas.drawText(videoTimeFormat.format(mVideo.videoCreateTime + info.presentationTimeUs / 1000), 10, 30, paint);

     // report progress, 0-5ms
     mProgress = (int) (info.presentationTimeUs / 1000);
     if (mListener != null) {
         mListener.onProgress(mProgress, mMax);
     }

     synchronized (MediaCodec.class) { // remember to lock
         timeDataContainer.add(new Frame(info, bitmap));
     }
 }


 public static byte[] getNV21(int width, int height, Bitmap scaled) {
     int[] argb = new int[width * height];
     scaled.getPixels(argb, 0, width, 0, 0, width, height);

     byte[] yuv = new byte[width * height * 3 / 2];
     encodeYUV420SP(yuv, argb, width, height);

     scaled.recycle();
     return yuv;
 }


 /**
  * Convert the ARGB data from a Bitmap into YUV420SP (NV21).
  * This YUV420SP buffer can be handed straight to MediaCodec for AVC encoding.
  *
  * @param yuv420sp output buffer for the YUV420SP data
  * @param argb     input ARGB pixels
  * @param width    image width
  * @param height   image height
  */
 public static void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
     final int frameSize = width * height;

     int yIndex = 0;
     int uvIndex = frameSize;

     int r, g, b, y, u, v;
     int index = 0;
     for (int j = 0; j < height; j++) {
         for (int i = 0; i < width; i++) {

             // alpha ((argb[index] & 0xff000000) >>> 24) is not used
             r = (argb[index] & 0xff0000) >> 16;
             g = (argb[index] & 0xff00) >> 8;
             b = (argb[index] & 0xff);

             // well-known RGB to YUV conversion
             y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
             u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
             v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;

             // NV21 has a full-resolution Y plane followed by interleaved VU pairs,
             // subsampled by 2 in both directions: for every 4 Y samples there is
             // 1 V and 1 U, taken every other pixel on every other scanline.
             yuv420sp[yIndex++] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y));
             if (j % 2 == 0 && index % 2 == 0) {
                 yuv420sp[uvIndex++] = (byte) ((v < 0) ? 0 : ((v > 255) ? 255 : v));
                 yuv420sp[uvIndex++] = (byte) ((u < 0) ? 0 : ((u > 255) ? 255 : u));
             }

             index++;
         }
     }
 }

 /**
  * Get the next frame with the timestamp drawn in.
  */
 private Frame getFrameData() {
     synchronized (MediaCodec.class) { // remember to lock

         if (timeDataContainer.isEmpty()) {
             return null;
         }

         // Take the frame out of the queue; removing it preserves the output
         // order and releases memory promptly
         Frame frame = timeDataContainer.remove(0);

         // Convert back to YUV420SP, 120-160ms
         frame.data = getNV21(dstWidth, dstHeight, frame.bitmap);

         return frame;
     }
 }

 /**
  * Rotate a bitmap.
  *
  * @param angle  rotation angle in degrees
  * @param bitmap source bitmap
  * @return the rotated bitmap
  */
 public Bitmap rotaingImageView(int angle, Bitmap bitmap) {
     Matrix matrix = new Matrix();
     matrix.postRotate(angle);
     // Create the rotated copy
     return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
 }

 /**
  * Initialize the decoder.
  */
 private void initMediaDecode(String mime) {
     try {
         mediaDecode = MediaCodec.createDecoderByType(mime);
         mediaDecode.configure(format, null, null, 0);
     } catch (IOException e) {
         e.printStackTrace();
     }

     if (mediaDecode == null) {
         JLog.e(TAG, "create mediaDecode failed");
         return;
     }
     mediaDecode.start();
     decodeInputBuffers = mediaDecode.getInputBuffers();
     decodeOutputBuffers = mediaDecode.getOutputBuffers();
     decodeBufferInfo = new MediaCodec.BufferInfo(); // describes each decoded data chunk
 }

 /**
  * Initialize the encoder.
  */
 private void initMediaEncode(String mime) {
     try {
         MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, dstWidth, dstHeight);
         format.setInteger(MediaFormat.KEY_BIT_RATE, 1024 * 512);
         format.setInteger(MediaFormat.KEY_FRAME_RATE, 27);
         format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
//       format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
         format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

         mediaEncode = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
         mediaEncode.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
     } catch (IOException e) {
         e.printStackTrace();
     }

     if (mediaEncode == null) {
         JLog.e(TAG, "create mediaEncode failed");
         return;
     }
     mediaEncode.start();
 }

 @TargetApi(Build.VERSION_CODES.LOLLIPOP)
 private void decode(MediaCodec.BufferInfo videoInfo, int inputIndex) {
     mediaDecode.queueInputBuffer(inputIndex, 0, videoInfo.size, videoInfo.presentationTimeUs, videoInfo.flags); // hand the sample to the decoder

     // dequeueOutputBuffer's timeout is in microseconds: -1 waits forever, 0 returns immediately.
     // Do not pass -1 here: sometimes there is no output yet and the call would block forever.
     MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
     int outputIndex = mediaDecode.dequeueOutputBuffer(bufferInfo, 50000);

     switch (outputIndex) {
         case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
             JLog.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
             decodeOutputBuffers = mediaDecode.getOutputBuffers();
             break;
         case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
             JLog.d(TAG, "new format " + mediaDecode.getOutputFormat());
             break;
         case MediaCodec.INFO_TRY_AGAIN_LATER:
             JLog.d(TAG, "dequeueOutputBuffer timed out!");
             break;
         default:
             ByteBuffer outputBuffer;
             byte[] frame;
             while (outputIndex >= 0) { // one input may yield several outputs, so loop until the decoder is drained
                 outputBuffer = decodeOutputBuffers[outputIndex]; // buffer holding the raw frame data
                 frame = new byte[bufferInfo.size]; // bufferInfo carries the size of this chunk
                 outputBuffer.get(frame); // copy the data out into the byte array
                 outputBuffer.clear(); // clear the buffer: MediaCodec reuses them, and a stale buffer would return old data next time

                 handleFrameData(frame, videoInfo); // hands the frame to the encoder thread (shown above)

                 mediaDecode.releaseOutputBuffer(outputIndex, false); // must be released, or the codec runs out of buffers and stops producing output
                 outputIndex = mediaDecode.dequeueOutputBuffer(decodeBufferInfo, 50000); // fetch again; outputIndex becomes negative when drained, ending the loop
             }
             break;
     }
 }


 /**
  * Encode one frame.
  */
 private void encode() {
     // fetch a frame produced by the decoder thread
     Frame frame = getFrameData();
     if (frame == null) {
         return;
     }
     byte[] chunkTime = frame.data;
     int inputIndex = mediaEncode.dequeueInputBuffer(-1); // same semantics as the decoder

     if (inputIndex < 0) {
         JLog.d("px", "dequeueInputBuffer return inputIndex " + inputIndex + ",then break");
         mediaEncode.signalEndOfInputStream();
         return; // without a valid index there is no buffer to fill
     }

     ByteBuffer inputBuffer = mediaEncode.getInputBuffers()[inputIndex]; // same as the decoder
     inputBuffer.clear();
     inputBuffer.put(chunkTime); // fill the input buffer with the YUV data
     inputBuffer.limit(frame.videoInfo.size);

     mediaEncode.queueInputBuffer(inputIndex, 0, chunkTime.length, frame.videoInfo.presentationTimeUs, frame.videoInfo.flags); // ask the encoder to encode it

     MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
     int outputIndex = mediaEncode.dequeueOutputBuffer(bufferInfo, 50000); // same as the decoder

     switch (outputIndex) {
         case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
             JLog.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
             break;
         case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
             MediaFormat outputFormat = mediaEncode.getOutputFormat();
             outputFormat.setInteger(MediaFormat.KEY_ROTATION, videoRotation);
             JLog.d(TAG, "mediaEncode find new format " + outputFormat);
             // add the video track to the muxer now that its format is known
             videoTrackIndex = muxer.addTrack(outputFormat);
             muxer.start();
             break;
         case MediaCodec.INFO_TRY_AGAIN_LATER:
             JLog.d(TAG, "dequeueOutputBuffer timed out!");
             break;
         default:
             ByteBuffer outputBuffer;
             while (outputIndex >= 0) { // same as the decoder
                 outputBuffer = mediaEncode.getOutputBuffers()[outputIndex]; // the encoded output buffer
                 muxer.writeSampleData(videoTrackIndex, outputBuffer, bufferInfo);
//               JLog.d("px", "writeSampleData:" + bufferInfo.size);
                 mediaEncode.releaseOutputBuffer(outputIndex, false);
                 outputIndex = mediaEncode.dequeueOutputBuffer(bufferInfo, 50000);
             }
             break;
     }
 }

 private void release() {
     // release the muxer and extractor once everything is written
     extractor.release();
     mediaDecode.release();
     mediaEncode.release();
     muxer.stop();
     muxer.release();
 }

 private DecodeRunnable decodeRunnable;
 private EncodeRunnable encodeRunnable;

 /**
  * Decoding thread.
  */
 private class DecodeRunnable extends Thread {

     @Override
     public void run() {
         decodeOver = false;
         while (!decodeOver) {
             try {
                 extract();
             } catch (Exception e) {
                 // catches the exception thrown when the file is deleted mid-run
                 JLog.e("px", e.toString());
             }
             synchronized (encodeRunnable) {
                 encodeRunnable.notify();
             }
         }
     }
 }

 /**
  * Encoding thread.
  */
 private class EncodeRunnable extends Thread {

     @Override
     public void run() {
         encoding = true;
         while (encoding) {
             if (timeDataContainer.isEmpty()) {
                 if (decodeOver) { // decoding finished and the queue is drained
                     break;
                 }
                 try {
                     synchronized (encodeRunnable) {
                         wait();
                     }
                 } catch (InterruptedException e) {
                     e.printStackTrace();
                 }
             } else {
                 encode();
             }
         }
         release();
         encoding = false;
         handler.sendEmptyMessage(-2); // report that the task is done
     }
 }

 android.os.Handler handler = new android.os.Handler() {
     @Override
     public void handleMessage(Message msg) {
         switch (msg.what) {
             case -2:
                 onComplete();
                 break;
             default:
                 break;
         }
     }
 };

 public void onComplete() {
     if (mDelete) { // a delete request implies a preceding cancel
         mDelete = false;
         new File(mVideo.srcPath).delete(); // explicit delete: remove the source file and the database row
         codecDao.deleteItem(mVideo);
         JLog.d("px", "delete file " + mVideo.srcPath);
     } else {
         mVideo.finish = mCancel ? 0 : 100;
         codecDao.createOrUpdate(mVideo); // mark the row finished, or back to idle
     }
     if (mCancel) { // cancelled midway
         mCancel = false;
         new File(mVideo.dstPath).delete(); // cancelled: remove the output file
         JLog.d("px", "delete file " + mVideo.dstPath);
     } else { // finished normally
         new File(mVideo.srcPath).delete(); // success: remove the source file
         JLog.d("px", "delete file " + mVideo.srcPath);
     }
     if (mListener != null) {
         mListener.onCodecFinish(mVideo);
     }
     if (!videos.isEmpty()) {
         VideoCodecModel video = videos.remove(0);
         start(video);
     }
 }

 class Frame {
     MediaCodec.BufferInfo videoInfo;
     byte[] data;
     Bitmap bitmap;

     public Frame(MediaCodec.BufferInfo videoInfo, Bitmap bitmap) {
         this.videoInfo = videoInfo;
         this.bitmap = bitmap;
     }
 }

 private long getInterval() {
     // derive the frame interval from the first two frames of the source video
     long videoSampleTime;
     ByteBuffer buffer = ByteBuffer.allocate(1024 * 512);
     extractor.readSampleData(buffer, 0);
     // skip the first I frame
     if (extractor.getSampleFlags() == MediaExtractor.SAMPLE_FLAG_SYNC)
         extractor.advance();
     extractor.readSampleData(buffer, 0);
     long firstVideoPTS = extractor.getSampleTime();
     extractor.advance();
     extractor.readSampleData(buffer, 0);
     long secondVideoPTS = extractor.getSampleTime();
     videoSampleTime = Math.abs(secondVideoPTS - firstVideoPTS);
     JLog.d(TAG, "videoSampleTime is " + videoSampleTime);
     return videoSampleTime;
 }

 @Override
 public int onStartCommand(Intent intent, int flags, int startId) {
     JLog.d(TAG, "onStartCommand");
     super.onStartCommand(intent, flags, startId);
     if (intent == null) {
         return START_NOT_STICKY;
     }
     int action = intent.getIntExtra("action", 0);
     if (action == REQUEST_CODEC) {
         VideoCodecModel video = (VideoCodecModel) intent.getSerializableExtra("video");
         video = codecDao.addItem(video);
         if (!encoding) {
             start(video);
         } else {
             videos.add(video);
         }
     } else if (action == REQUEST_CODEC_CANCEL) {
         VideoCodecModel video = (VideoCodecModel) intent.getSerializableExtra("video");
         mDelete = intent.getBooleanExtra("delete", false); // whether to delete the old file as well
         JLog.d("px", "----- onStartCommand action " + action + " is delete?" + mDelete);
         mBinder.cancel(video);
     }
     return START_NOT_STICKY;
 }

 @Nullable
 @Override
 public IBinder onBind(Intent intent) {
     JLog.d(TAG, "onBind");
     return mBinder;
 }

 private CodecBinder mBinder = new CodecBinder();
 private VideoCodecModel mVideo;
 // queue of pending watermark tasks
 private List<VideoCodecModel> videos = new ArrayList<>();

 public static final int REQUEST_CODEC = 0x183;
 public static final int REQUEST_CODEC_CANCEL = 0x184;

 public class CodecBinder extends Binder {
     /**
      * @param video the task to run
      * @return whether it starts immediately (true) or waits in the queue (false)
      */
     public boolean start(VideoCodecModel video) {
         video = codecDao.addItem(video);
         if (!encoding) {
             TestCodecService.this.start(video);
         } else {
             videos.add(video);
         }
         return !encoding;
     }

     public void setOnProgressChangeListener(OnProgressChangeListener l) {
         mListener = l;
     }

     public VideoCodecModel getCurrentVideo() {
         return mVideo;
     }

     public void cancel(VideoCodecModel video) {
         if (mVideo.equals(video)) { // currently being processed
             decodeOver = true; // stops the decode thread
             encoding = false;  // stops the encode thread
             mCancel = true;    // triggers file cleanup afterwards
         } else { // not currently being processed
             boolean flag = videos.remove(video);
             if (flag) {
                 JLog.d("px", "cancel render task success");
             } else {
                 // no such task in the queue
                 JLog.d("px", "cancel render task fail, seems this video is not in the rendering queue");
             }
             // delete the source file if requested
             if (mDelete) {
                 mDelete = false;
                 new File(video.srcPath).delete();
                 codecDao.deleteItem(video);
             }
         }
     }

     public List<VideoCodecModel> getVideoList() {
         return videos;
     }

     public void removeListener() {
         mListener = null;
     }
 }

 private void start(VideoCodecModel video) {
     if (video == null) {
         return;
     }
     if (!new File(video.srcPath).exists()) {
         Toast.makeText(this, "该视频缓存文件可能已经被删除", Toast.LENGTH_SHORT).show(); // "the cached video file may have been deleted"
         video.finish = -100;
         codecDao.createOrUpdate(video);
         return;
     }
     mVideo = video;
     if (mListener != null) {
         mListener.onCodecStart(mVideo);
     }
     mVideo.finish = 50; // mark as processing
     codecDao.createOrUpdate(mVideo);

     Runnable runnable = new Runnable() {
         @Override
         public void run() {
             init(mVideo.srcPath, mVideo.dstPath);
             decodeRunnable = new DecodeRunnable();
             decodeRunnable.start();
             encodeRunnable = new EncodeRunnable();
             encodeRunnable.start();
         }
     };
     AsyncTaskExecutor.getExecutor().execute(runnable);
 }

 private OnProgressChangeListener mListener;

 public interface OnProgressChangeListener {
     void onProgress(int progress, int max);

     void onCodecStart(VideoCodecModel video);

     void onCodecFinish(VideoCodecModel video);
 }

}


// The model class:

import com.j256.ormlite.field.DatabaseField;
import com.j256.ormlite.table.DatabaseTable;

import java.io.Serializable;

/**
 * Created by user on 2016/8/29.
 */
@DatabaseTable(tableName = "video_codec_task")
public class VideoCodecModel implements Serializable {

    private static final long serialVersionUID = -1307249622002520298L;
    @DatabaseField
    public String srcPath;
    @DatabaseField
    public String dstPath;
    @DatabaseField
    public long videoCreateTime;
    @DatabaseField(generatedId = true)
    public int id;
    // 0 = blocked, 50 = rendering or queued, 100 = finished, -100 = deleted
    @DatabaseField
    public int finish = 0;
    @DatabaseField
    public String serNo;

    // used only while operating on the list; not persisted
    public boolean select;

    public VideoCodecModel(String srcPath, String dstPath, long videoCreateTime) {
        this.srcPath = srcPath;
        this.videoCreateTime = videoCreateTime;
        this.dstPath = dstPath;
    }

    public VideoCodecModel() {
    }

    public String getSrcPath() {
        return srcPath;
    }

    public void setSrcPath(String srcPath) {
        this.srcPath = srcPath;
    }

    public String getDstPath() {
        return dstPath;
    }

    public void setDstPath(String dstPath) {
        this.dstPath = dstPath;
    }

    public long getVideoCreateTime() {
        return videoCreateTime;
    }

    public void setVideoCreateTime(long videoCreateTime) {
        this.videoCreateTime = videoCreateTime;
    }

    public boolean isSelect() {
        return select;
    }

    public void setSelect(boolean select) {
        this.select = select;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof VideoCodecModel)) return false;

        VideoCodecModel that = (VideoCodecModel) o;

        if (videoCreateTime != that.videoCreateTime) return false;
        if (!srcPath.equals(that.srcPath)) return false;
        return dstPath.equals(that.dstPath);
    }
}


// Activity used to check watermark-task status and monitor the service.
// Whether this activity is open or not does not affect the service.

import android.annotation.TargetApi;
import android.app.ProgressDialog;
import android.content.ComponentName;
import android.content.Context;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.ServiceConnection;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Build;
import android.os.Bundle;
import android.os.IBinder;
import android.os.Message;
import android.support.annotation.Nullable;
import android.util.Log;
import android.view.Gravity;
import android.view.MenuItem;
import android.view.View;
import android.view.ViewGroup;
import android.widget.AdapterView;
import android.widget.BaseAdapter;
import android.widget.CheckBox;
import android.widget.CompoundButton;
import android.widget.LinearLayout;
import android.widget.ListView;
import android.widget.PopupMenu;
import android.widget.ProgressBar;
import android.widget.TextView;

import ...

import java.io.File;
import java.lang.ref.WeakReference;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

/**
 * Created by user on 2016/8/29.
 */
public class showcodecactivity extends baseactivity implements testcodecservice.onprogresschangelistener, view.onclicklistener {

 private textview nonetipsview;
 private list<videocodecmodel> videos = new arraylist<>(), cordingvideos;
 private listview listview;
 private baseadapter adapter;

 private view firsttips;
 @nullable
 videocodecmodel curshowvideo, currendervideo;
 testcodecservice.codecbinder binder;
 private progressbar progressbar;
 serviceconnection connection;

 videocodecdao codecdao;
 private simpledateformat dateformat = new simpledateformat("yyyy/mm/dd hh:mm:ss");

 private boolean meditmode = false;

 @override
 protected void oncreate(bundle savedinstancestate) {
 super.oncreate(savedinstancestate);
 setcontentview(r.layout.activity_show_codec);
 settitle("签约视频列表");
 initview();
 if (getintent() != null) {
  curshowvideo = (videocodecmodel) getintent().getserializableextra("video");
 }
 codecdao = videocodecdao.getinstance(this);
 final intent intent = new intent(this, testcodecservice.class);
 connection = new serviceconnection() {
  @override
  public void onserviceconnected(componentname name, ibinder service) {
  log.d("px", "onserviceconnected");
  binder = (testcodecservice.codecbinder) service;
  binder.setonprogresschangelistener(showcodecactivity.this);
  videos.clear();
  currendervideo = binder.getcurrentvideo();
  cordingvideos = binder.getvideolist();
  videos.addall(codecdao.queryall());

  notifychange();

  }

  @override
  public void onservicedisconnected(componentname name) {

  }
 };
 bindservice(intent, connection, context.bind_auto_create);
 }

 private void notifychange() {
 if (adapter == null) {
  adapter = new baseadapter() {
  @override
  public int getcount() {
   return videos.size();
  }

  @override
  public videocodecmodel getitem(int position) {
   return videos.get(position);
  }

  @override
  public long getitemid(int position) {
   return 0;
  }

  @override
  public view getview(int position, view convertview, viewgroup parent) {
   final holder holder;
   if (convertview == null) {
   convertview = view.inflate(showcodecactivity.this, r.layout.item_show_codec, null);
   holder = new holder();
   holder.bar = (progressbar) convertview.findviewbyid(r.id.pb_codec);
   holder.status = (textview) convertview.findviewbyid(r.id.status);
   holder.serno = (textview) convertview.findviewbyid(r.id.serno);
   holder.select = convertview.findviewbyid(r.id.select);
   holder.time = (textview) convertview.findviewbyid(r.id.time);
   holder.operate = (textview) convertview.findviewbyid(r.id.operate);
   holder.checkbox = (checkbox) convertview.findviewbyid(r.id.cb_select);
   convertview.settag(holder);
   } else {
   holder = (holder) convertview.gettag();
   }
   final videocodecmodel video = getitem(position);
   if (video.finish == 100) {
   holder.status.settext("已完成");
   holder.operate.setvisibility(view.visible);
   holder.operate.settext("操作");
   } else if (video.finish == -100) {
   holder.status.settext("已删除");
   holder.operate.setvisibility(view.invisible);
   } else if (video.equals(currendervideo)) {
   progressbar = holder.bar;
   holder.status.settext("处理中");
   holder.operate.setvisibility(view.invisible);
   } else if (cordingvideos.contains(video)) {
   holder.status.settext("等待中");
   holder.operate.setvisibility(view.visible);
   holder.operate.settext("取消");
   } else {
   holder.status.settext("未处理");
   holder.operate.setvisibility(view.visible);
   holder.operate.settext("开始");
   }
   holder.operate.setonclicklistener(new view.onclicklistener() {
   @override
   public void onclick(view v) {
    if (video.finish == 100) {
    operate(holder.status, video);
    } else if (video.finish == -100) {
    return;
     } else if (video.equals(currendervideo)) {//already being encoded; not operable
     return;
     } else if (cordingvideos.contains(video)) {//already in the encoding queue; can be cancelled
    binder.cancel(video);
    holder.status.settext("未处理");
    holder.operate.setvisibility(view.visible);
    holder.operate.settext("开始");
    } else {
    boolean immedia = binder.start(video);
    if (immedia) {
     holder.status.settext("处理中");
     holder.operate.setvisibility(view.invisible);
    } else {
     holder.status.settext("等待中");
     holder.operate.setvisibility(view.visible);
     holder.operate.settext("取消");
    }
    }
   }
   });
   holder.select.setvisibility(video.equals(curshowvideo) ? view.visible : view.gone);
   holder.serno.settext(video.serno);
   holder.time.settext(dateformat.format(new date(video.videocreatetime)));

   holder.checkbox.setvisibility(meditmode ? view.visible : view.gone);
   holder.checkbox.setchecked(video.isselect());
   holder.checkbox.setoncheckedchangelistener(new compoundbutton.oncheckedchangelistener() {
   @override
   public void oncheckedchanged(compoundbutton buttonview, boolean ischecked) {
    video.setselect(ischecked);
   }
   });
   return convertview;
  }
  };
  listview.setadapter(adapter);
 } else {
  adapter.notifydatasetchanged();
 }
 nonetipsview.setvisibility(videos.isempty() ? view.visible : view.gone);
 more.setvisibility(meditmode ? view.visible : view.gone);
 back.setvisibility(meditmode ? view.invisible : view.visible);
 checkbox.setvisibility(meditmode ? view.visible : view.gone);
 }

 class holder {
 progressbar bar;
 textview status, serno, time, operate;
 view select;
 checkbox checkbox;
 }

 private void initview() {
 listview = (listview) findviewbyid(r.id.lv_codec);
 nonetipsview = (textview) findviewbyid(r.id.tv_none);
 listview.setonitemclicklistener(new adapterview.onitemclicklistener() {
  @override
  public void onitemclick(adapterview<?> parent, view view, int position, long id) {
  videocodecmodel video = videos.get(position);
  operate(view, video);
  }
 });
 listview.setonitemlongclicklistener(new adapterview.onitemlongclicklistener() {
  @override
  public boolean onitemlongclick(adapterview<?> parent, view view, int position, long id) {
  if (meditmode) {
   return false;
  }
  meditmode = true;
   //entering edit mode does not keep any previous selection state
  for (videocodecmodel video : videos) {
   if (video.select)
   video.select = false;
  }
  checkbox.setchecked(false);
  notifychange();
  return true;
  }
 });
 firsttips = findviewbyid(r.id.ll_tips);
 boolean visable = preferences.getboolean("firstshowcodec", true);
 firsttips.setvisibility(visable ? view.visible : view.gone);
 if (visable)
  findviewbyid(r.id.btn_noshow).setonclicklistener(new view.onclicklistener() {
  @override
  public void onclick(view v) {
   preferences.put("firstshowcodec", false);
   firsttips.setvisibility(view.gone);
  }
  });
 checkbox.setoncheckedchangelistener(new compoundbutton.oncheckedchangelistener() {
  @override
  public void oncheckedchanged(compoundbutton buttonview, boolean ischecked) {
  for (videocodecmodel model : videos) {
   model.setselect(ischecked);
  }
  notifychange();
  }
 });
 more.settext("操作");
 more.setonclicklistener(this);
 }

 private void operate(view view, final videocodecmodel video) {
 if (video.finish != 100) {
  return;
 }
 popupmenu popupmenu = new popupmenu(showcodecactivity.this, view);
 popupmenu.getmenu().add(1, 0, 0, "预览或发送");
 popupmenu.getmenu().add(1, 1, 1, "删除");
 popupmenu.setonmenuitemclicklistener(new popupmenu.onmenuitemclicklistener() {
  @override
  public boolean onmenuitemclick(menuitem item) {

  switch (item.getitemid()) {
   case 0:
   previewvideo(video.dstpath);
   break;
   case 1:
   file file = new file(video.dstpath);
   if (file.exists()) {
    file.delete();
   }
   codecdao.deleteitem(video);
   videos.remove(video);
   if (cordingvideos.contains(video)) {
    binder.cancel(video);
   }
   notifychange();
   break;
  }

  return true;
  }
 });
 popupmenu.show();
 }


 @override
 public void onprogress(int progress, int max) {
 if (progressbar != null) {
  progressbar.setmax(max);
  progressbar.setprogress(progress);
 }
 }

 @override
 public void oncodecstart(videocodecmodel video) {
 jlog.d("px", "oncodecstart");
 currendervideo = video;
 int index = videos.indexof(video);
  if (index >= 0) {
   //getchildat() takes the index of a *visible* child, so offset the
   //adapter position by the first visible position; off-screen rows have no view
   view child = listview.getchildat(index - listview.getfirstvisibleposition());
   if (child != null) {
    holder holder = (holder) child.gettag();
    holder.status.settext("处理中");
    holder.operate.setvisibility(view.invisible);
    progressbar = holder.bar;
   }
  }
 }

 @override
 public void oncodecfinish(videocodecmodel video) {
 jlog.d("px", "oncodecfinish");
 if (progressbar != null) {
  progressbar.setprogress(0);
 }
  int index = videos.indexof(video);
  if (index >= 0) {//check the index before accessing the list, or get(-1) would throw
   videos.get(index).finish = 100;
   view child = listview.getchildat(index - listview.getfirstvisibleposition());
   if (child != null) {//off-screen rows have no view to update
    holder holder = (holder) child.gettag();
    holder.status.settext("已完成");
    holder.operate.setvisibility(view.visible);
    holder.operate.settext("操作");
   }
   progressbar = null;
  }
 }

 @override
 protected void ondestroy() {
 if (binder != null)
  binder.removelistener();
 unbindservice(connection);
 super.ondestroy();
 }

 private void previewvideo(string filepath) {
  //preview the recording; note that on android 7.0+ exposing a file:// uri
  //to another app throws fileuriexposedexception, so a fileprovider
  //content:// uri would be needed there
  intent intent = new intent(intent.action_view);
  string type = "video/mp4";
  uri uri = uri.parse("file://" + filepath);
  intent.setdataandtype(uri, type);
  startactivity(intent);
 }

 @override
 public void onbackpressed() {
 if (meditmode) {
  meditmode = false;
  notifychange();
  return;
 }
 super.onbackpressed();
 }

 @override
 public void onclick(view v) {
 switch (v.getid()) {
  case r.id.more:
  popupmenu menu = new popupmenu(this, v);
//  menu.getmenu().add(1, 0, 0, "发送");
  menu.getmenu().add(1, 1, 1, "删除");
  menu.getmenu().add(1, 2, 2, "取消");
  menu.setonmenuitemclicklistener(new popupmenu.onmenuitemclicklistener() {
   @override
   public boolean onmenuitemclick(menuitem item) {
   switch (item.getitemid()) {
    case 0:
    break;
    case 1:
    deleteselect();
    break;
    case 2:
    meditmode = false;
    notifychange();
    break;
   }
   return true;
   }
  });
  menu.show();
  break;
 }
 }

 //delete the selected items
 private void deleteselect() {
 final progressdialog dialog = progressdialog.show(this, null, null);
 asynctask<string, string, boolean> task = new asynctask<string, string, boolean>() {
  @override
  protected boolean doinbackground(string... params) {
   boolean has = false;//whether any deletable item was selected; the user may have selected none
  for (videocodecmodel video : videos) {
   if (video.select) {
   file file;
   if (video.finish == 100) {
    file = new file(video.dstpath);
   } else {
    file = new file(video.srcpath);
   }
   if (file.exists()) {
    file.delete();
   }
   codecdao.deleteitem(video);
    has = true;
   }
  }
  if (has) {
   videos.clear();
   videos.addall(codecdao.queryall());
  }

  return has;
  }

  @override
  protected void onpostexecute(boolean s) {
  meditmode = false;
  notifychange();
  dialog.dismiss();
  }
 };
 task.executeonexecutor(asynctaskexecutor.getexecutor());
 }
}

That's all for this article. I hope it helps with your learning, and thank you for supporting 移动技术网.
