I used juv-client-client.jar mainly to finish the live-video feature of my graduation project (streaming real-time video from a phone to a web page) as quickly as possible. Since my internship overlapped with the project, the whole thing took only a bit over a month.
(Curse the formalities: reformatting the thesis ate an enormous amount of my time.) So the code has some rough edges, and it only implements live camera video; live microphone audio is not implemented.
I don't think this project is worth much on its own, but it may be useful as a reference, so I'm recording the implementation here as a keepsake!
How it works:
juv-client-client.jar provides many operations for interacting with Red5, such as connecting, publishing stream data, and mutual method invocation.
Before publishing live video data, we need to establish an RTMP connection between the phone and the server,
using the library's NetConnection class.
The key code:
private void connectRed5() {
    // A free evaluation key can be requested on the official site:
    // http://www.smaxe.com/order.jsf#request_evaluation_key
    License.setKey("63140-D023C-D7420-00B15-91FC7");
    connection = new NetConnection();
    // Configure the connection
    connection.configuration().put(NetConnection.Configuration.INACTIVITY_TIMEOUT, -1);
    connection.configuration().put(NetConnection.Configuration.RECEIVE_BUFFER_SIZE, 256 * 1024);
    connection.configuration().put(NetConnection.Configuration.SEND_BUFFER_SIZE, 256 * 1024);
    connection.client(new ClientHandler());
    connection.addEventListener(new NetConnectionListener());
    connection.connect(red5_url);
}
Here ClientHandler extends Object; the methods defined in it can be invoked by the server.
NetConnectionListener can either extend NetConnection.ListenerAdapter or implement the Listener interface; it handles and reports the network status while the RTMP connection is being established.
For example:
private class ClientHandler extends Object {
    public ClientHandler() {}
    public void fun1() {}
    public void fun2() {}
}

private class NetConnectionListener extends NetConnection.ListenerAdapter {
    public NetConnectionListener() {}

    @Override
    public void onAsyncError(final INetConnection source, final String message, final Exception e) {
        System.out.println("NetConnection#onAsyncError: " + message + " " + e);
    }

    @Override
    public void onIOError(final INetConnection source, final String message) {
        System.out.println("NetConnection#onIOError: " + message);
    }

    @Override
    public void onNetStatus(final INetConnection source, final Map<String, Object> info) {
        System.out.println("NetConnection#onNetStatus: " + info);
        final Object code = info.get("code");
        if (NetConnection.CONNECT_SUCCESS.equals(code)) {
            // connection established; safe to start publishing
        }
    }
}
That is the whole connection setup. To tell whether the connection was actually established, watch the message printed by the onNetStatus handler above: when info.get("code") equals NetConnection.CONNECT_SUCCESS, the RTMP connection is up.
Once the RTMP connection is established, we can capture video with Android's Camera class and send it in real time.
I have to point out that video capture is more complicated on the Android side than on the web side: the camera and microphone types this library provides are abstract classes or interfaces that you must implement yourself, whereas the web side gets ready-made camera and microphone objects that are simple to call.
My approach was to implement the library's AbstractCamera abstract class. Since Android already ships its own Camera class, I used composition plus multiple interface implementation and set out to write an AndroidCamera class.
One question here: why implement AbstractCamera at all?
Because it has a protected fireOnVideoData method available to subclasses. As far as I can tell, its job is to wrap the individual data packets into stream data.
Continuing with AndroidCamera, my design (originally shown as a class diagram, image omitted here):
I composed Android's Camera, SurfaceView, and SurfaceHolder classes into my own AndroidCamera class, which also implements the SurfaceHolder.Callback interface and Camera's PreviewCallback interface.
There are two reasons for this: 1. to show a live preview; 2. while previewing, to grab live frame data through PreviewCallback's onPreviewFrame method, then transcode and package it into stream data. (Note that I do no video encoding or compression here; time and ability were limited.)
Straight to the code:
public class AndroidCamera extends AbstractCamera implements SurfaceHolder.Callback, Camera.PreviewCallback {

    private SurfaceView surfaceView;
    private SurfaceHolder surfaceHolder;
    private Camera camera;

    private int width;
    private int height;
    private boolean init;

    int blockWidth;
    int blockHeight;
    int timeBetweenFrames; // 1000 / frameRate
    int frameCounter;
    byte[] previous;

    // connection, netStream, aCamera, VideoName and active are members defined elsewhere in the Activity.

    public AndroidCamera(Context context) {
        surfaceView = (SurfaceView) ((Activity) context).findViewById(R.id.surfaceView);
        // I pass the Activity's context in and look up the SurfaceView from it;
        // you could also just pass the SurfaceView itself into the constructor.
        surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(AndroidCamera.this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        width = 320;
        height = 240;
        init = false;
        Log.d("DEBUG", "AndroidCamera()");
    }

    private void startVideo() {
        Log.d("DEBUG", "startVideo()");
        netStream = new NetStream(connection);
        netStream.addEventListener(new NetStream.ListenerAdapter() {
            @Override
            public void onNetStatus(final INetStream source, final Map<String, Object> info) {
                System.out.println("Publisher#NetStream#onNetStatus: " + info);
                Log.d("DEBUG", "Publisher#NetStream#onNetStatus: " + info);
                final Object code = info.get("code");
                if (NetStream.PUBLISH_START.equals(code)) {
                    if (aCamera != null) {
                        netStream.attachCamera(aCamera, -1 /* snapshotMilliseconds */);
                        Log.d("DEBUG", "aCamera.start()");
                        aCamera.start();
                    } else {
                        Log.d("DEBUG", "camera == null");
                    }
                }
            }
        });
        netStream.publish(VideoName, NetStream.RECORD);
    }

    public void start() {
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        if (!active) return;
        if (!init) {
            blockWidth = 32;
            blockHeight = 32;
            timeBetweenFrames = 100; // 1000 / frameRate
            frameCounter = 0;
            previous = null;
            init = true;
        }
        final long ctime = System.currentTimeMillis();
        byte[] current = RemoteUtil.decodeYUV420SP2RGB(data, width, height);
        try {
            final byte[] packet = RemoteUtil.encode(current, previous, blockWidth, blockHeight, width, height);
            Log.d("DEBUG", packet.toString());
            fireOnVideoData(new MediaDataByteArray(timeBetweenFrames, new ByteArray(packet)));
            previous = current;
            // Every 10th frame, drop the reference frame to force a full frame.
            if (++frameCounter % 10 == 0) previous = null;
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Pace the stream: sleep away whatever is left of the frame interval.
        final int spent = (int) (System.currentTimeMillis() - ctime);
        try {
            Thread.sleep(Math.max(0, timeBetweenFrames - spent));
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        startVideo();
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        try {
            camera.setPreviewDisplay(surfaceHolder);
            camera.setPreviewCallback(this);
            Camera.Parameters params = camera.getParameters();
            params.setPreviewSize(width, height);
            camera.setParameters(params);
        } catch (IOException e) {
            e.printStackTrace();
            camera.release();
            camera = null;
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (camera != null) {
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }
} // AndroidCamera
The implementation above is based on the ExDesktopPublisher.java sample that ships with the library, so there are parts even I can't fully explain (I don't know much about multimedia).
Worth noting: the live video is published through the library's NetStream.publish method, and before any frames flow the stream needs a video source set on it with attachCamera (in the code above this happens inside the NetStream.Publish.Start handler).
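Pulling the pieces together, the hand-off is event-driven: connect first, and only after NetConnection.Connect.Success arrives call publish; only after NetStream.Publish.Start arrives attach the camera and start the preview. A minimal plain-Java sketch of that ordering (the status-code strings match the library's events; everything else here is illustrative scaffolding, not the juv API):

```java
import java.util.ArrayList;
import java.util.List;

public class PublishFlow {
    // Records the steps so the ordering across the two callbacks is explicit.
    static List<String> steps = new ArrayList<>();

    // Stand-in for the onNetStatus(...) handlers in the listeners above.
    static void onNetStatus(String code) {
        if ("NetConnection.Connect.Success".equals(code)) {
            steps.add("publish");                       // safe to publish only after the RTMP handshake
            onNetStatus("NetStream.Publish.Start");     // the server then answers with this event
        } else if ("NetStream.Publish.Start".equals(code)) {
            steps.add("attachCamera");                  // set the video source on the stream
            steps.add("startPreview");                  // begin feeding frames via onPreviewFrame
        }
    }

    public static void main(String[] args) {
        steps.add("connect");
        onNetStatus("NetConnection.Connect.Success");
        System.out.println(steps); // [connect, publish, attachCamera, startPreview]
    }
}
```

Calling publish before the connection succeeds, or attachCamera before Publish.Start, is the usual reason nothing shows up on the web side.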
RemoteUtil.decodeYUV420SP2RGB converts the YUV420SP frame data that onPreviewFrame receives into RGB; without the conversion the displayed image would likely be wrong. The algorithm:
public static byte[] decodeYUV420SP2RGB(byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    byte[] rgbBuf = new byte[frameSize * 3];
    if (yuv420sp == null) throw new NullPointerException("buffer 'yuv420sp' is null");
    if (yuv420sp.length < frameSize * 3 / 2) throw new IllegalArgumentException("buffer 'yuv420sp' size " + yuv420sp.length + " < minimum " + frameSize * 3 / 2);
    int i = 0, y = 0;
    int uvp = 0, u = 0, v = 0;
    int y1192 = 0, r = 0, g = 0, b = 0;
    for (int j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) { // each V,U pair is shared by two horizontally adjacent pixels
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);
            // clamp to the 18-bit intermediate range before shifting down to 8 bits
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgbBuf[yp * 3] = (byte) (r >> 10);
            rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
            rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
        }
    } // for
    return rgbBuf;
} // decodeYUV420SP2RGB
The RemoteUtil.encode algorithm is taken from ExDesktopPublisher.java; it appears to do the RTMP packaging of the video data. I won't paste it here; you can take it from the sample files and use it directly.
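As a quick sanity check of the YUV-to-RGB conversion, here is a self-contained example (the class name and the trimmed helper are mine, not part of the project): a 2x2 frame whose luma bytes are all 235 (video-range white) with neutral chroma (128) should decode to 254 on every channel, since 1192 * (235 - 16) >> 10 = 254.

```java
public class Yuv420Check {
    // Same conversion as decodeYUV420SP2RGB, trimmed to the core loop.
    public static byte[] decode(byte[] yuv, int width, int height) {
        final int frameSize = width * height;
        byte[] rgb = new byte[frameSize * 3];
        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & yuv[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {           // one V,U pair covers two adjacent pixels
                    v = (0xff & yuv[uvp++]) - 128;
                    u = (0xff & yuv[uvp++]) - 128;
                }
                int y1192 = 1192 * y;
                int r = Math.min(262143, Math.max(0, y1192 + 1634 * v));
                int g = Math.min(262143, Math.max(0, y1192 - 833 * v - 400 * u));
                int b = Math.min(262143, Math.max(0, y1192 + 2066 * u));
                rgb[yp * 3]     = (byte) (r >> 10);
                rgb[yp * 3 + 1] = (byte) (g >> 10);
                rgb[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
        return rgb;
    }

    public static void main(String[] args) {
        // 2x2 frame: 4 luma bytes at 235, then one V,U pair at 128 (neutral chroma).
        byte[] yuv = { (byte) 235, (byte) 235, (byte) 235, (byte) 235, (byte) 128, (byte) 128 };
        byte[] rgb = decode(yuv, 2, 2);
        System.out.println(0xff & rgb[0]); // prints 254
    }
}
```

If the colors come out wrong on the web side, this kind of small fixed-input check is the fastest way to rule the conversion in or out as the culprit.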
The writing above is disorganized, because I only got this working the night before my thesis defense, so the code is messy too and hard to present clearly. Still, that's the principle. In the end live video really did work, although, probably due to the conversion algorithm, the colors were off.
A keepsake of the last piece of software I built in my university years!