Learning Android: A First Program Using OpenCV on Android

I have just started learning Android. Since I was already fairly familiar with OpenCV, I decided to begin by trying to get OpenCV running on Android.

===================================================================================

1. Environment Setup

  • JDK
  • Eclipse
  • ADT
  • CDT
  • Android SDK
  • Android NDK
  • Cygwin
  • OpenCV for Android 2.4.9

This part is covered extensively online, so I will not repeat it here. See, for example: http://blog.csdn.net/pwh0996/article/details/8957764

2. Project Preparation

Two things to note:

  • Newer versions of the ADT project wizard generate two layout XML files, activity_main.xml and fragment_main.xml. If you are not used to this, you can handle it as follows:
  1. Delete fragment_main.xml entirely.
  2. In activity_main.xml, delete its contents, then switch to the Graphical Layout view and drop in a LinearLayout.
  3. In MainActivity.java, delete the fragment-related code and change MainActivity extends ActionBarActivity to MainActivity extends Activity (see the minimal sketch after this list).
  4. (For more on the activity_main.xml vs. fragment_main.xml issue, see: http://bbs.csdn.net/topics/390740123)
  • Import the OpenCV library

    In the Package Explorer, right-click the project and choose Properties. In the Properties window, select Android on the left, click the Add button at the lower right, select OpenCV Library 2.4.9, and click OK. The OpenCV library is then added under the project's (here, GrayProcess) Android Dependencies.
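
As a reference for step 3 above, here is a minimal sketch of what the stripped-down MainActivity.java might look like after the cleanup (using the com.example.camera03 package that appears later in the manifest):

package com.example.camera03;

import android.app.Activity;
import android.os.Bundle;

// After the fragment-related code is removed, the activity simply
// extends Activity and inflates the cleaned-up layout.
public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }
}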

3. Writing the Program

The goal is to process the frames captured by the camera with OpenCV for Android and display the result on the phone screen through a SurfaceView.

The OpenCV Android library wraps Android's own camera APIs, which makes it very convenient to use:

  • CameraBridgeViewBase.enableView()
  • Once the SurfaceView is available:
    • CameraBridgeViewBase.setVisibility(SurfaceView.VISIBLE)
    • CameraBridgeViewBase.setCvCameraViewListener(this)

After that, the following callbacks become available:

  • onCameraViewStarted
  • onCameraViewStopped

The image processing itself is written in:

  • public Mat onCameraFrame(CvCameraViewFrame inputFrame)

Java file:

package com.example.camera03;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.Toast;

public class MainActivity extends Activity implements CvCameraViewListener2 {
    private static final String TAG = "OCVSample::Activity";

    private CameraBridgeViewBase mOpenCvCameraView;
    private boolean mIsJavaCamera = true;
    private MenuItem mItemSwitchCamera = null;
    private Mat mRgba;
    private Button mBtn = null;
    private boolean	 isProcess = false;

//Callback for the asynchronous connection to the OpenCV Manager service
    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

//Constructor
    public MainActivity() {
        Log.i(TAG, "Instantiated new " + this.getClass());
    }

    /** Called when the activity is first created. */
//onCreate
    @Override
    public void onCreate(Bundle savedInstanceState) {
        Log.i(TAG, "called onCreate");
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

        setContentView(R.layout.activity_main);

//Pick the Java or native camera view defined in the layout
        if (mIsJavaCamera)
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
        else
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);

        mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);

        mOpenCvCameraView.setCvCameraViewListener(this);

        mBtn = (Button) findViewById(R.id.buttonGray);
        mBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                isProcess = !isProcess;
            }
        });
    }

    @Override
    public void onPause()
    {
        super.onPause();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onResume()
    {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_9, this, mLoaderCallback);
    }

    public void onDestroy() {
        super.onDestroy();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        Log.i(TAG, "called onCreateOptionsMenu");
        mItemSwitchCamera = menu.add("Toggle Native/Java camera");
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        String toastMesage = new String();
        Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);

        if (item == mItemSwitchCamera) {
            mOpenCvCameraView.setVisibility(SurfaceView.GONE);
            mIsJavaCamera = !mIsJavaCamera;

            if (mIsJavaCamera) {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
                toastMesage = "Java Camera";
            } else {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);
                toastMesage = "Native Camera";
            }

            mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
            mOpenCvCameraView.setCvCameraViewListener(this);
            mOpenCvCameraView.enableView();
            Toast toast = Toast.makeText(this, toastMesage, Toast.LENGTH_LONG);
            toast.show();
        }

        return true;
    }

    public void onCameraViewStarted(int width, int height) {
     mRgba = new Mat(height, width, CvType.CV_8UC4);

    }

    public void onCameraViewStopped() {
     mRgba.release();
    }

    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
     if(isProcess)
      Imgproc.cvtColor(inputFrame.gray(), mRgba, Imgproc.COLOR_GRAY2RGBA, 4);
     else
       mRgba = inputFrame.rgba();
     return mRgba;
    }
}
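
The original post does not show the layout file. A minimal activity_main.xml that would match the view IDs referenced above (tutorial1_activity_java_surface_view, tutorial1_activity_native_surface_view, buttonGray) could look roughly like this; the exact attributes are an assumption based on the stock OpenCV 2.4 tutorial layouts:

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:opencv="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >

    <!-- OpenCV Java camera preview; starts hidden and is made VISIBLE from code -->
    <org.opencv.android.JavaCameraView
        android:id="@+id/tutorial1_activity_java_surface_view"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        android:visibility="gone"
        opencv:camera_id="any"
        opencv:show_fps="true" />

    <!-- Native camera preview, used when mIsJavaCamera is false -->
    <org.opencv.android.NativeCameraView
        android:id="@+id/tutorial1_activity_native_surface_view"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        android:visibility="gone"
        opencv:camera_id="any" />

    <!-- Button that toggles the processing mode -->
    <Button
        android:id="@+id/buttonGray"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Gray" />
</LinearLayout>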

Manifest file:

The camera permission must be added:

<uses-permission android:name="android.permission.CAMERA"/> 

Note: the frames captured by the Android camera usually come out with the wrong orientation.

In plain Android development, this is usually handled with:

mCamera.setDisplayOrientation(90);
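
For context, here is a rough sketch of where that call would sit when driving the camera directly with the old android.hardware.Camera API (the helper class and names are hypothetical, not part of this project):

// Hypothetical plain-Android preview setup using the pre-Camera2 API.
import java.io.IOException;

import android.hardware.Camera;
import android.view.SurfaceHolder;

public class PreviewHelper {
    private Camera mCamera;

    public void startPreview(SurfaceHolder holder) throws IOException {
        mCamera = Camera.open();            // open the default back-facing camera
        mCamera.setDisplayOrientation(90);  // rotate the preview for a portrait activity
        mCamera.setPreviewDisplay(holder);  // draw preview frames into the SurfaceView
        mCamera.startPreview();
    }

    public void stopPreview() {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }
}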

When developing with OpenCV for Android, add the following to the activity in the manifest instead:

android:screenOrientation="landscape"
android:configChanges="keyboardHidden|orientation"

The complete manifest file:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.camera03"
    android:versionCode="1"
    android:versionName="1.0" >

    <supports-screens
        android:resizeable="true"
        android:smallScreens="true"
        android:normalScreens="true"
        android:largeScreens="true"
        android:anyDensity="true" />

    <uses-sdk
        android:minSdkVersion="9"
        android:targetSdkVersion="19" />

    <uses-permission android:name="android.permission.CAMERA"/>

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name"
        android:theme="@android:style/Theme.NoTitleBar.Fullscreen" >
        <activity
            android:name="com.example.camera03.MainActivity"
            android:label="@string/app_name"
            android:screenOrientation="landscape"
            android:configChanges="keyboardHidden|orientation" >
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

(Screenshots: the original image and the grayscale result.)

Java program 2:

This version implements, in turn:

  1. Original image (no processing)
  2. Grayscale
  3. Canny edge detection
  4. Hist: histogram calculation
  5. Sobel edge detection
  6. SEPIA (tone transform): applies a matrix transform to every pixel
  7. ZOOM: magnifier
  8. PIXELIZE: pixelation
  9. POSTERIZE: posterization

package com.example.camera03;

import java.util.Arrays;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfFloat;
import org.opencv.core.MatOfInt;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.Toast;

public class MainActivity extends Activity implements CvCameraViewListener2 {
    private static final String TAG = "OCVSample::Activity";

    private CameraBridgeViewBase mOpenCvCameraView;
    private boolean mIsJavaCamera = true;
    private MenuItem mItemSwitchCamera = null;
    private Mat mRgba;
    private Mat mGray;
    private Mat mTmp;

    private Size mSize0;
    private Mat mIntermediateMat;
    private MatOfInt mChannels[];
    private MatOfInt mHistSize;
    private int mHistSizeNum = 25;
    private Mat mMat0;
    private float[] mBuff;
    private MatOfFloat mRanges;
    private Point mP1;
    private Point mP2;
    private Scalar mColorsRGB[];
    private Scalar mColorsHue[];
    private Scalar mWhilte;
    private Mat mSepiaKernel;
    private Button mBtn = null;
    private int	 mProcessMethod = 0;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    public MainActivity() {
        Log.i(TAG, "Instantiated new " + this.getClass());
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        Log.i(TAG, "called onCreate");
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

        setContentView(R.layout.activity_main);

        if (mIsJavaCamera)
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
        else
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);

        mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);

        mOpenCvCameraView.setCvCameraViewListener(this);

        mBtn = (Button) findViewById(R.id.buttonGray);
        mBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                mProcessMethod++;
                if (mProcessMethod > 8) mProcessMethod = 0;
            }
        });
    }

    @Override
    public void onPause()
    {
        super.onPause();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onResume()
    {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_9, this, mLoaderCallback);
    }

    public void onDestroy() {
        super.onDestroy();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        Log.i(TAG, "called onCreateOptionsMenu");
        mItemSwitchCamera = menu.add("Toggle Native/Java camera");
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        String toastMesage = new String();
        Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);

        if (item == mItemSwitchCamera) {
            mOpenCvCameraView.setVisibility(SurfaceView.GONE);
            mIsJavaCamera = !mIsJavaCamera;

            if (mIsJavaCamera) {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
                toastMesage = "Java Camera";
            } else {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);
                toastMesage = "Native Camera";
            }

            mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
            mOpenCvCameraView.setCvCameraViewListener(this);
            mOpenCvCameraView.enableView();
            Toast toast = Toast.makeText(this, toastMesage, Toast.LENGTH_LONG);
            toast.show();
        }

        return true;
    }

    public void onCameraViewStarted(int width, int height) {
     mRgba = new Mat(height, width, CvType.CV_8UC4);
     mGray = new Mat(height, width, CvType.CV_8UC1);
     mTmp = new Mat(height, width, CvType.CV_8UC4);

      mIntermediateMat = new Mat();
         mSize0 = new Size();
         mChannels = new MatOfInt[] { new MatOfInt(0), new MatOfInt(1), new MatOfInt(2) };
         mBuff = new float[mHistSizeNum];
         mHistSize = new MatOfInt(mHistSizeNum);
         mRanges = new MatOfFloat(0f, 256f);
         mMat0 = new Mat();
         mColorsRGB = new Scalar[] { new Scalar(200, 0, 0, 255), new Scalar(0, 200, 0, 255), new Scalar(0, 0, 200, 255) };
         mColorsHue = new Scalar[] {
                 new Scalar(255, 0, 0, 255), new Scalar(255, 60, 0, 255), new Scalar(255, 120, 0, 255), new Scalar(255, 180, 0, 255), new Scalar(255, 240, 0, 255),
                 new Scalar(215, 213, 0, 255), new Scalar(150, 255, 0, 255), new Scalar(85, 255, 0, 255), new Scalar(20, 255, 0, 255), new Scalar(0, 255, 30, 255),
                 new Scalar(0, 255, 85, 255), new Scalar(0, 255, 150, 255), new Scalar(0, 255, 215, 255), new Scalar(0, 234, 255, 255), new Scalar(0, 170, 255, 255),
                 new Scalar(0, 120, 255, 255), new Scalar(0, 60, 255, 255), new Scalar(0, 0, 255, 255), new Scalar(64, 0, 255, 255), new Scalar(120, 0, 255, 255),
                 new Scalar(180, 0, 255, 255), new Scalar(255, 0, 255, 255), new Scalar(255, 0, 215, 255), new Scalar(255, 0, 85, 255), new Scalar(255, 0, 0, 255)
         };
         mWhilte = Scalar.all(255);
         mP1 = new Point();
         mP2 = new Point();

         // Fill sepia kernel
         mSepiaKernel = new Mat(4, 4, CvType.CV_32F);
         mSepiaKernel.put(0, 0, /* R */0.189f, 0.769f, 0.393f, 0f);
         mSepiaKernel.put(1, 0, /* G */0.168f, 0.686f, 0.349f, 0f);
         mSepiaKernel.put(2, 0, /* B */0.131f, 0.534f, 0.272f, 0f);
         mSepiaKernel.put(3, 0, /* A */0.000f, 0.000f, 0.000f, 1f);
    }

    public void onCameraViewStopped() {
     mRgba.release();
     mGray.release();
     mTmp.release();
    }

    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {

     mRgba = inputFrame.rgba();
     Size sizeRgba = mRgba.size();
     int rows = (int) sizeRgba.height;
        int cols = (int) sizeRgba.width;
        Mat rgbaInnerWindow;

        int left = cols / 8;
        int top = rows / 8;

        int width = cols * 3 / 4;
        int height = rows * 3 / 4;
        //Grayscale
     if(mProcessMethod==1)
      Imgproc.cvtColor(inputFrame.gray(), mRgba, Imgproc.COLOR_GRAY2RGBA, 4);
     //Canny edge detection
     else if(mProcessMethod==2)
     {
      mRgba = inputFrame.rgba();
      Imgproc.Canny(inputFrame.gray(), mTmp, 80, 100);
      Imgproc.cvtColor(mTmp, mRgba, Imgproc.COLOR_GRAY2RGBA, 4);
     }
     //Hist
     else if(mProcessMethod==3)
     {
       Mat hist = new Mat();
             int thikness = (int) (sizeRgba.width / (mHistSizeNum + 10) / 5);
             if(thikness > 5) thikness = 5;
             int offset = (int) ((sizeRgba.width - (5*mHistSizeNum + 4*10)*thikness)/2);

   // RGB
             for(int c=0; c<3; c++) {
                 Imgproc.calcHist(Arrays.asList(mRgba), mChannels[c], mMat0, hist, mHistSize, mRanges);
                 Core.normalize(hist, hist, sizeRgba.height/2, 0, Core.NORM_INF);
                 hist.get(0, 0, mBuff);
                 for(int h=0; h<mHistSizeNum; h++) {
                     mP1.x = mP2.x = offset + (c * (mHistSizeNum + 10) + h) * thikness;
                     mP1.y = sizeRgba.height-1;
                     mP2.y = mP1.y - 2 - (int)mBuff[h];
                     Core.line(mRgba, mP1, mP2, mColorsRGB[c], thikness);
                 }
             }
             // Value and Hue
             Imgproc.cvtColor(mRgba, mTmp, Imgproc.COLOR_RGB2HSV_FULL);
             // Value
             Imgproc.calcHist(Arrays.asList(mTmp), mChannels[2], mMat0, hist, mHistSize, mRanges);
             Core.normalize(hist, hist, sizeRgba.height/2, 0, Core.NORM_INF);
             hist.get(0, 0, mBuff);
             for(int h=0; h<mHistSizeNum; h++) {
                 mP1.x = mP2.x = offset + (3 * (mHistSizeNum + 10) + h) * thikness;
                 mP1.y = sizeRgba.height-1;
                 mP2.y = mP1.y - 2 - (int)mBuff[h];
                 Core.line(mRgba, mP1, mP2, mWhilte, thikness);
             }
     }
     //inner Window Sobel
     else if(mProcessMethod==4)
     {
      Mat gray = inputFrame.gray();
            Mat grayInnerWindow = gray.submat(top, top + height, left, left + width);
            rgbaInnerWindow = mRgba.submat(top, top + height, left, left + width);
            Imgproc.Sobel(grayInnerWindow, mIntermediateMat, CvType.CV_8U, 1, 1);
            Core.convertScaleAbs(mIntermediateMat, mIntermediateMat, 10, 0);
            Imgproc.cvtColor(mIntermediateMat, rgbaInnerWindow, Imgproc.COLOR_GRAY2BGRA, 4);
            grayInnerWindow.release();
            rgbaInnerWindow.release();
     }
     //SEPIA
     else if(mProcessMethod==5)
     {
      rgbaInnerWindow = mRgba.submat(top, top + height, left, left + width);
            Core.transform(rgbaInnerWindow, rgbaInnerWindow, mSepiaKernel);
            rgbaInnerWindow.release();
     }
     //ZOOM
     else if(mProcessMethod==6)
     {
      Mat zoomCorner = mRgba.submat(0, rows / 2 - rows / 10, 0, cols / 2 - cols / 10);
            Mat mZoomWindow = mRgba.submat(rows / 2 - 9 * rows / 100, rows / 2 + 9 * rows / 100, cols / 2 - 9 * cols / 100, cols / 2 + 9 * cols / 100);
            Imgproc.resize(mZoomWindow, zoomCorner, zoomCorner.size());
            Size wsize = mZoomWindow.size();
            Core.rectangle(mZoomWindow, new Point(1, 1), new Point(wsize.width - 2, wsize.height - 2), new Scalar(255, 0, 0, 255), 2);
            zoomCorner.release();
            mZoomWindow.release();
     }
     //PIXELIZE
     else if(mProcessMethod==7)
     {
      rgbaInnerWindow = mRgba.submat(top, top + height, left, left + width);
            Imgproc.resize(rgbaInnerWindow, mIntermediateMat, mSize0, 0.1, 0.1, Imgproc.INTER_NEAREST);
            Imgproc.resize(mIntermediateMat, rgbaInnerWindow, rgbaInnerWindow.size(), 0., 0., Imgproc.INTER_NEAREST);
            rgbaInnerWindow.release();
     }
     //POSTERIZE
     else if(mProcessMethod==8)
     {
      rgbaInnerWindow = mRgba.submat(top, top + height, left, left + width);
            Imgproc.Canny(rgbaInnerWindow, mIntermediateMat, 80, 90);
            rgbaInnerWindow.setTo(new Scalar(0, 0, 0, 255), mIntermediateMat);
            Core.convertScaleAbs(rgbaInnerWindow, mIntermediateMat, 1./16, 0);
            Core.convertScaleAbs(mIntermediateMat, rgbaInnerWindow, 16, 0);
            rgbaInnerWindow.release();
     }
     else
      mRgba = inputFrame.rgba();
     return mRgba;
    }
}

(Result screenshots, one per mode: original, grayscale, Canny edge detection, histogram, Sobel edge detection, SEPIA, ZOOM, PIXELIZE, POSTERIZE.)
