I finally solved a problem that had been bothering me for a long time and had seriously delayed the project schedule, which I feel rather embarrassed about. I was stuck on a whole series of issues and had to work through them on my own, so it has been a frustrating stretch. Here I summarize the problems I ran into last month while building the QR-code feature, along with their solutions. Now that it finally works and this feature can be wrapped up, I'm writing this up to share some of Unity's pitfalls so that fellow developers can avoid the same detours — at least where QR codes are concerned it should be instructive. The good news is that over the next few days I can finally get back to the work I'm actually supposed to be doing.

Result screenshot:

First, a quick summary of the problems I ran into:

1. The Unity project's screen orientation must match the Android project's

I had already tested the Android scanning feature successfully on its own — see my earlier post: http://blog.csdn.net/dingxiaowei2013/article/details/24677795. But after adding that plugin to the existing project and exporting the APK, the app crashed on launch every time. After stripping things down and comparing step by step (I was at it until 3 a.m. on May 1st), I found the root cause: the Android project's screen orientation was set to portrait, while for project reasons the Unity build had to be exported as landscapeLeft. When the two orientations don't match, the app crashes on startup. My fix is admittedly the blunt one: I rebuilt the Android plugin as a landscape project so that its orientation matches Unity's. My original idea was to export Unity with the orientation set to auto and control the orientation from code, but that never worked reliably, so I had to change approach.

There are plenty of Android scanning projects online, most of them based on zxing, but they carry a lot of redundant code and are not beginner-friendly — especially for an Android beginner like me. I did find a nicely trimmed-down scanning project suitable for getting started (introduced in my previous post), but it is portrait-only, which doesn't fit my requirements, so I set out to convert it to landscape. I assumed tweaking the XML configuration would be enough, but it wasn't: with only that change, the preview gets squashed and movement feels awkward while scanning, so the project code itself has to be modified. See http://dingxiaowei2013.blog.163.com/blog/static/21965310720144595534507/ for a landscape-to-portrait conversion guide; I applied those steps in reverse and still hit problems, such as crashes halfway through a scan. After redoing it several times it finally worked, and I will share the trimmed-down project source below. Getting here took one export and one test run after another — in short, perseverance is a quality every programmer needs.
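
For reference, the "control the orientation from code" idea mentioned above looks roughly like the sketch below on the Android side. This is only a minimal illustration (ExampleActivity is a hypothetical name), not the approach I ended up shipping — I ultimately rebuilt the plugin itself as a landscape project.

import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.os.Bundle;

// Hypothetical Activity used only to illustrate locking the orientation in code.
public class ExampleActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Force landscape so the Android view matches Unity's landscapeLeft build;
        // equivalent to android:screenOrientation="landscape" in AndroidManifest.xml.
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    }
}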

2. Scene switching between Unity and Android

Unity–Android interaction is not just about calling functions and exchanging data; a large part of it is switching views and scenes between the two sides cleanly. Scene switching is still done through interface calls, but it hinges on one crucial Activity: UnityPlayerActivity. The Android-facing view extends UnityPlayerActivity, which acts as the bridge between Unity and Android; the class comes from Unity's Android interface library under the Unity installation directory (see my previous post for details). The Activity that extends UnityPlayerActivity serves as the shared view between Unity and Android and is the entry point of the Android plugin: whether you switch from Unity to an Android screen or from an Android screen back to Unity, it must go through this Activity — and I do mean must. I learned this the hard way: I tried calling Unity's interface functions from other Activities to switch Unity scenes, and every attempt failed. These conclusions only came after running the experiments over and over again.
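
To make the bridge concrete, here is a stripped-down sketch of the contract (the full MainActivity is listed further below). On the Unity side the usual pattern is to fetch com.unity3d.player.UnityPlayer's currentActivity through AndroidJavaClass and call a public method such as Show() on it; results travel back through UnityPlayer.UnitySendMessage. Apart from UnityPlayerActivity and UnityPlayer, the names here (BridgeActivity, reportToUnity) are placeholders of my own.

import android.os.Bundle;

import com.unity3d.player.UnityPlayer;
import com.unity3d.player.UnityPlayerActivity;

// Minimal bridge sketch: the one Activity both Unity and Android talk through.
public class BridgeActivity extends UnityPlayerActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState); // keeps the UnityPlayer view alive
    }

    // Called from Unity (via the current Activity) to open an Android screen,
    // e.g. startActivityForResult(new Intent(this, SomeCaptureActivity.class), 1);
    public void Show() {
    }

    // Called from Android code to push a result string back into the Unity scene.
    public void reportToUnity(String payload) {
        // The scene must contain a GameObject named "GameObject" with a GetString method.
        UnityPlayer.UnitySendMessage("GameObject", "GetString", payload);
    }
}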

3. LG phones seem problematic as Unity test devices (purely a personal impression)

I tested the simplest OnGUI method with Chinese text in the GUI, and LG phones would not display it. Since nearly all of the company's Android devices and my own phone are LG, and none of them showed the Chinese characters, it seems a custom font is required — is it because they are Korean devices?! That is pure guesswork on my part. The workaround is to add a font asset to the project and build a GUISkin for GUI to use; other phones, such as my battered ZTE and other non-LG handsets, display the text fine. This landscape QR-scanning plugin also has issues when tested on LG: the screen stutters while scanning. I don't know whether that is caused by a memory spike — it would be worth profiling with Eclipse to check.

Below is the modified, trimmed-down Android scanning project code (only the modified scripts are included):

AndroidManifest.xml

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.qr_codescan"
    android:versionCode="1"
    android:versionName="1.0" >

    <uses-sdk
        android:minSdkVersion="8"
        android:targetSdkVersion="16" />

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name"
        android:theme="@android:style/Theme.NoTitleBar" >
           <activity
            android:name="com.example.qr_codescan.MainActivity"
            android:screenOrientation="landscape"
            android:label="@string/app_name" >
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <activity
            android:name=".MipcaActivityCapture"
            android:configChanges="orientation|keyboardHidden"
            android:screenOrientation="landscape"
            android:windowSoftInputMode="stateAlwaysHidden" >
        </activity>
    </application>

    <uses-permission android:name="android.permission.VIBRATE" />
    <uses-permission android:name="android.permission.CAMERA" />

    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />

</manifest>

MainActivity.java

package com.example.qr_codescan;

import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;

import com.unity3d.player.UnityPlayer;
import com.unity3d.player.UnityPlayerActivity;

public class MainActivity extends UnityPlayerActivity {
	private final static int SCANNIN_GREQUEST_CODE = 1;
	/**
	 * Displays the scan result
	 */
//	private TextView mTextView ;
	/**
	 * Displays the image captured during the scan
	 */
//	private ImageView mImageView;

	@Override
	protected void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
//		setContentView(R.layout.activity_main);
//
//		mTextView = (TextView) findViewById(R.id.result);
//		mImageView = (ImageView) findViewById(R.id.qrcode_bitmap);
//
//		//Tapping the button jumps to the QR-scan screen; startActivityForResult is used
//		//so that control returns to this screen once scanning is done
//		Button mButton = (Button) findViewById(R.id.button1);
//		mButton.setOnClickListener(new OnClickListener() {
//
//			@Override
//			public void onClick(View v) {
//				Intent intent = new Intent();
//				intent.setClass(MainActivity.this, MipcaActivityCapture.class);
//				intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
//				startActivityForResult(intent, SCANNIN_GREQUEST_CODE);
//			}
//		});
	}

	// Called from the Unity side (through this Activity) to launch the scanning screen.
	public void Show()
	{
		Intent intent = new Intent();
		intent.setClass(MainActivity.this, MipcaActivityCapture.class);
		intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
		startActivityForResult(intent, SCANNIN_GREQUEST_CODE);
	}

	@Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        switch (requestCode) {
		case SCANNIN_GREQUEST_CODE:
			if(resultCode == RESULT_OK){
				Bundle bundle = data.getExtras();
				//Show the scanned content
//				mTextView.setText(bundle.getString("result"));
				//Show the captured image
//				mImageView.setImageBitmap((Bitmap) data.getParcelableExtra("bitmap"));
				//Send the result back to Unity: the scene must contain a GameObject named
				//"GameObject" with a GetString(string) method on one of its scripts.
				UnityPlayer.UnitySendMessage("GameObject","GetString",bundle.getString("result"));
			}
			break;
		}
    }	

}

MipcaActivityCapture.java

Change the character encoding so Chinese content decodes correctly:

characterSet = "GBK";
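
For context, this characterSet string is handed to ZXing as a decode hint; exactly where it is wired up varies between forks of this demo (usually in a DecodeThread/DecodeHandler), but conceptually it amounts to the sketch below. DecodeHintsExample and buildReader are hypothetical names of mine.

import java.util.EnumMap;
import java.util.Map;

import com.google.zxing.DecodeHintType;
import com.google.zxing.MultiFormatReader;

// Hypothetical helper showing how the characterSet value reaches the decoder.
final class DecodeHintsExample {
    static MultiFormatReader buildReader(String characterSet) {
        Map<DecodeHintType, Object> hints =
                new EnumMap<DecodeHintType, Object>(DecodeHintType.class);
        // "GBK" lets ZXing decode QR payloads that contain Chinese text.
        hints.put(DecodeHintType.CHARACTER_SET, characterSet);
        MultiFormatReader reader = new MultiFormatReader();
        reader.setHints(hints);
        return reader;
    }
}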

CameraConfigurationManager.java

/*
 * Copyright (C) 2010 ZXing authors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.mining.app.zxing.camera;

import android.content.Context;
import android.graphics.Point;
import android.hardware.Camera;
import android.os.Build;
import android.util.Log;
import android.view.Display;
import android.view.WindowManager;

import java.lang.reflect.Method;
import java.util.regex.Pattern;

final class CameraConfigurationManager {

  private static final String TAG = CameraConfigurationManager.class.getSimpleName();

  private static final int TEN_DESIRED_ZOOM = 27;
  private static final int DESIRED_SHARPNESS = 30;

  private static final Pattern COMMA_PATTERN = Pattern.compile(",");

  private final Context context;
  private Point screenResolution;
  private Point cameraResolution;
  private int previewFormat;
  private String previewFormatString;

  CameraConfigurationManager(Context context) {
    this.context = context;
  }

  /**
   * Reads, one time, values from the camera that are needed by the app.
   */
  void initFromCameraParameters(Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    previewFormat = parameters.getPreviewFormat();
    previewFormatString = parameters.get("preview-format");
    Log.d(TAG, "Default preview format: " + previewFormat + ‘/‘ + previewFormatString);
    WindowManager manager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
    Display display = manager.getDefaultDisplay();
    screenResolution = new Point(display.getWidth(), display.getHeight());
    Log.d(TAG, "Screen resolution: " + screenResolution);
    cameraResolution = getCameraResolution(parameters, screenResolution);
    Log.d(TAG, "Camera resolution: " + screenResolution);
  }

  /**
   * Sets the camera up to take preview images which are used for both preview and decoding.
   * We detect the preview format here so that buildLuminanceSource() can build an appropriate
   * LuminanceSource subclass. In the future we may want to force YUV420SP as it's the smallest,
   * and the planar Y can be used for barcode scanning without a copy in some cases.
   */
  void setDesiredCameraParameters(Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    Log.d(TAG, "Setting preview size: " + cameraResolution);
    parameters.setPreviewSize(cameraResolution.x, cameraResolution.y);
    setFlash(parameters);
    setZoom(parameters);
    //setSharpness(parameters);
    //modify here: the portrait demo rotates the preview 90 degrees; for this landscape
    //version the rotation is left disabled so the preview matches the screen orientation.

//    camera.setDisplayOrientation(90);
    //pre-2.2 fallback that calls setDisplayOrientation via reflection
    //setDisplayOrientation(camera, 90);
    camera.setParameters(parameters);
  }

  Point getCameraResolution() {
    return cameraResolution;
  }

  Point getScreenResolution() {
    return screenResolution;
  }

  int getPreviewFormat() {
    return previewFormat;
  }

  String getPreviewFormatString() {
    return previewFormatString;
  }

  private static Point getCameraResolution(Camera.Parameters parameters, Point screenResolution) {

    String previewSizeValueString = parameters.get("preview-size-values");
    // saw this on Xperia
    if (previewSizeValueString == null) {
      previewSizeValueString = parameters.get("preview-size-value");
    }

    Point cameraResolution = null;

    if (previewSizeValueString != null) {
      Log.d(TAG, "preview-size-values parameter: " + previewSizeValueString);
      cameraResolution = findBestPreviewSizeValue(previewSizeValueString, screenResolution);
    }

    if (cameraResolution == null) {
      // Ensure that the camera resolution is a multiple of 8, as the screen may not be.
      cameraResolution = new Point(
          (screenResolution.x >> 3) << 3,
          (screenResolution.y >> 3) << 3);
    }

    return cameraResolution;
  }

  private static Point findBestPreviewSizeValue(CharSequence previewSizeValueString, Point screenResolution) {
    int bestX = 0;
    int bestY = 0;
    int diff = Integer.MAX_VALUE;
    for (String previewSize : COMMA_PATTERN.split(previewSizeValueString)) {

      previewSize = previewSize.trim();
      int dimPosition = previewSize.indexOf('x');
      if (dimPosition < 0) {
        Log.w(TAG, "Bad preview-size: " + previewSize);
        continue;
      }

      int newX;
      int newY;
      try {
        newX = Integer.parseInt(previewSize.substring(0, dimPosition));
        newY = Integer.parseInt(previewSize.substring(dimPosition + 1));
      } catch (NumberFormatException nfe) {
        Log.w(TAG, "Bad preview-size: " + previewSize);
        continue;
      }

      int newDiff = Math.abs(newX - screenResolution.x) + Math.abs(newY - screenResolution.y);
      if (newDiff == 0) {
        bestX = newX;
        bestY = newY;
        break;
      } else if (newDiff < diff) {
        bestX = newX;
        bestY = newY;
        diff = newDiff;
      }

    }

    if (bestX > 0 && bestY > 0) {
      return new Point(bestX, bestY);
    }
    return null;
  }

  private static int findBestMotZoomValue(CharSequence stringValues, int tenDesiredZoom) {
    int tenBestValue = 0;
    for (String stringValue : COMMA_PATTERN.split(stringValues)) {
      stringValue = stringValue.trim();
      double value;
      try {
        value = Double.parseDouble(stringValue);
      } catch (NumberFormatException nfe) {
        return tenDesiredZoom;
      }
      int tenValue = (int) (10.0 * value);
      if (Math.abs(tenDesiredZoom - value) < Math.abs(tenDesiredZoom - tenBestValue)) {
        tenBestValue = tenValue;
      }
    }
    return tenBestValue;
  }

  private void setFlash(Camera.Parameters parameters) {
    // FIXME: This is a hack to turn the flash off on the Samsung Galaxy.
    // And this is a hack-hack to work around a different value on the Behold II
    // Restrict Behold II check to Cupcake, per Samsung's advice
    //if (Build.MODEL.contains("Behold II") &&
    //    CameraManager.SDK_INT == Build.VERSION_CODES.CUPCAKE) {
    if (Build.MODEL.contains("Behold II") && CameraManager.SDK_INT == 3) { // 3 = Cupcake
      parameters.set("flash-value", 1);
    } else {
      parameters.set("flash-value", 2);
    }
    // This is the standard setting to turn the flash off that all devices should honor.
    parameters.set("flash-mode", "off");
  }

  private void setZoom(Camera.Parameters parameters) {

    String zoomSupportedString = parameters.get("zoom-supported");
    if (zoomSupportedString != null && !Boolean.parseBoolean(zoomSupportedString)) {
      return;
    }

    int tenDesiredZoom = TEN_DESIRED_ZOOM;

    String maxZoomString = parameters.get("max-zoom");
    if (maxZoomString != null) {
      try {
        int tenMaxZoom = (int) (10.0 * Double.parseDouble(maxZoomString));
        if (tenDesiredZoom > tenMaxZoom) {
          tenDesiredZoom = tenMaxZoom;
        }
      } catch (NumberFormatException nfe) {
        Log.w(TAG, "Bad max-zoom: " + maxZoomString);
      }
    }

    String takingPictureZoomMaxString = parameters.get("taking-picture-zoom-max");
    if (takingPictureZoomMaxString != null) {
      try {
        int tenMaxZoom = Integer.parseInt(takingPictureZoomMaxString);
        if (tenDesiredZoom > tenMaxZoom) {
          tenDesiredZoom = tenMaxZoom;
        }
      } catch (NumberFormatException nfe) {
        Log.w(TAG, "Bad taking-picture-zoom-max: " + takingPictureZoomMaxString);
      }
    }

    String motZoomValuesString = parameters.get("mot-zoom-values");
    if (motZoomValuesString != null) {
      tenDesiredZoom = findBestMotZoomValue(motZoomValuesString, tenDesiredZoom);
    }

    String motZoomStepString = parameters.get("mot-zoom-step");
    if (motZoomStepString != null) {
      try {
        double motZoomStep = Double.parseDouble(motZoomStepString.trim());
        int tenZoomStep = (int) (10.0 * motZoomStep);
        if (tenZoomStep > 1) {
          tenDesiredZoom -= tenDesiredZoom % tenZoomStep;
        }
      } catch (NumberFormatException nfe) {
        // continue
      }
    }

    // Set zoom. This helps encourage the user to pull back.
    // Some devices like the Behold have a zoom parameter
    if (maxZoomString != null || motZoomValuesString != null) {
      parameters.set("zoom", String.valueOf(tenDesiredZoom / 10.0));
    }

    // Most devices, like the Hero, appear to expose this zoom parameter.
    // It takes on values like "27" which appears to mean 2.7x zoom
    if (takingPictureZoomMaxString != null) {
      parameters.set("taking-picture-zoom", tenDesiredZoom);
    }
  }

	public static int getDesiredSharpness() {
		return DESIRED_SHARPNESS;
	}

	/**
	 * Compatibility helper for old Android versions (1.6): invokes
	 * Camera#setDisplayOrientation via reflection, so the call is simply
	 * skipped where the method is unavailable.
	 * @param camera the camera whose preview should be rotated
	 * @param angle  rotation angle in degrees
	 */
	protected void setDisplayOrientation(Camera camera, int angle){
        Method downPolymorphic;
        try
        {
            downPolymorphic = camera.getClass().getMethod("setDisplayOrientation", new Class[] { int.class });
            if (downPolymorphic != null)
                downPolymorphic.invoke(camera, new Object[] { angle });
        }
        catch (Exception e1)
        {
            // setDisplayOrientation is not available on this Android version; ignore
        }
   }  

}

CameraManager.java

/*
 * Copyright (C) 2008 ZXing authors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.mining.app.zxing.camera;

import java.io.IOException;

import android.content.Context;
import android.graphics.PixelFormat;
import android.graphics.Point;
import android.graphics.Rect;
import android.hardware.Camera;
import android.os.Build;
import android.os.Handler;
import android.util.Log;
import android.view.SurfaceHolder;

/**
 * This object wraps the Camera service object and expects to be the only one talking to it. The
 * implementation encapsulates the steps needed to take preview-sized images, which are used for
 * both preview and decoding.
 *
 */
public final class CameraManager {

  private static final String TAG = CameraManager.class.getSimpleName();

  private static final int MIN_FRAME_WIDTH = 240;
  private static final int MIN_FRAME_HEIGHT = 240;
  private static final int MAX_FRAME_WIDTH = 480;
  private static final int MAX_FRAME_HEIGHT = 360;

  private static CameraManager cameraManager;

  static final int SDK_INT; // Later we can use Build.VERSION.SDK_INT
  static {
    int sdkInt;
    try {
      sdkInt = Integer.parseInt(Build.VERSION.SDK);
    } catch (NumberFormatException nfe) {
      // Just to be safe
      sdkInt = 10000;
    }
    SDK_INT = sdkInt;
  }

  private final Context context;
  private final CameraConfigurationManager configManager;
  private Camera camera;
  private Rect framingRect;
  private Rect framingRectInPreview;
  private boolean initialized;
  private boolean previewing;
  private final boolean useOneShotPreviewCallback;
  /**
   * Preview frames are delivered here, which we pass on to the registered handler. Make sure to
   * clear the handler so it will only receive one message.
   */
  private final PreviewCallback previewCallback;
  /** Autofocus callbacks arrive here, and are dispatched to the Handler which requested them. */
  private final AutoFocusCallback autoFocusCallback;

  /**
   * Initializes this static object with the Context of the calling Activity.
   *
   * @param context The Activity which wants to use the camera.
   */
  public static void init(Context context) {
    if (cameraManager == null) {
      cameraManager = new CameraManager(context);
    }
  }

  /**
   * Gets the CameraManager singleton instance.
   *
   * @return A reference to the CameraManager singleton.
   */
  public static CameraManager get() {
    return cameraManager;
  }

  private CameraManager(Context context) {

    this.context = context;
    this.configManager = new CameraConfigurationManager(context);

    // Camera.setOneShotPreviewCallback() has a race condition in Cupcake, so we use the older
    // Camera.setPreviewCallback() on 1.5 and earlier. For Donut and later, we need to use
    // the more efficient one shot callback, as the older one can swamp the system and cause it
    // to run out of memory. We can't use SDK_INT because it was introduced in the Donut SDK.
    //useOneShotPreviewCallback = Integer.parseInt(Build.VERSION.SDK) > Build.VERSION_CODES.CUPCAKE;
    useOneShotPreviewCallback = Integer.parseInt(Build.VERSION.SDK) > 3; // 3 = Cupcake

    previewCallback = new PreviewCallback(configManager, useOneShotPreviewCallback);
    autoFocusCallback = new AutoFocusCallback();
  }

  /**
   * Opens the camera driver and initializes the hardware parameters.
   *
   * @param holder The surface object which the camera will draw preview frames into.
   * @throws IOException Indicates the camera driver failed to open.
   */
  public void openDriver(SurfaceHolder holder) throws IOException {
    if (camera == null) {
      camera = Camera.open();
      if (camera == null) {
        throw new IOException();
      }
      camera.setPreviewDisplay(holder);

      if (!initialized) {
        initialized = true;
        configManager.initFromCameraParameters(camera);
      }
      configManager.setDesiredCameraParameters(camera);

      //FIXME
 //     SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
      //whether to use the front light
//      if (prefs.getBoolean(PreferencesActivity.KEY_FRONT_LIGHT, false)) {
//        FlashlightManager.enableFlashlight();
//      }
      FlashlightManager.enableFlashlight();
    }
  }

  /**
   * Closes the camera driver if still in use.
   */
  public void closeDriver() {
    if (camera != null) {
      FlashlightManager.disableFlashlight();
      camera.release();
      camera = null;
    }
  }

  /**
   * Asks the camera hardware to begin drawing preview frames to the screen.
   */
  public void startPreview() {
    if (camera != null && !previewing) {
      camera.startPreview();
      previewing = true;
    }
  }

  /**
   * Tells the camera to stop drawing preview frames.
   */
  public void stopPreview() {
    if (camera != null && previewing) {
      if (!useOneShotPreviewCallback) {
        camera.setPreviewCallback(null);
      }
      camera.stopPreview();
      previewCallback.setHandler(null, 0);
      autoFocusCallback.setHandler(null, 0);
      previewing = false;
    }
  }

  /**
   * A single preview frame will be returned to the handler supplied. The data will arrive as byte[]
   * in the message.obj field, with width and height encoded as message.arg1 and message.arg2,
   * respectively.
   *
   * @param handler The handler to send the message to.
   * @param message The what field of the message to be sent.
   */
  public void requestPreviewFrame(Handler handler, int message) {
    if (camera != null && previewing) {
      previewCallback.setHandler(handler, message);
      if (useOneShotPreviewCallback) {
        camera.setOneShotPreviewCallback(previewCallback);
      } else {
        camera.setPreviewCallback(previewCallback);
      }
    }
  }

  /**
   * Asks the camera hardware to perform an autofocus.
   *
   * @param handler The Handler to notify when the autofocus completes.
   * @param message The message to deliver.
   */
  public void requestAutoFocus(Handler handler, int message) {
    if (camera != null && previewing) {
      autoFocusCallback.setHandler(handler, message);
      //Log.d(TAG, "Requesting auto-focus callback");
      camera.autoFocus(autoFocusCallback);
    }
  }

  /**
   * Calculates the framing rect which the UI should draw to show the user where to place the
   * barcode. This target helps with alignment as well as forces the user to hold the device
   * far enough away to ensure the image will be in focus.
   *
   * @return The rectangle to draw on screen in window coordinates.
   */
  public Rect getFramingRect() {
    Point screenResolution = configManager.getScreenResolution();
    if (framingRect == null) {
      if (camera == null) {
        return null;
      }
      int width = screenResolution.x * 3 / 4;
      if (width < MIN_FRAME_WIDTH) {
        width = MIN_FRAME_WIDTH;
      } else if (width > MAX_FRAME_WIDTH) {
        width = MAX_FRAME_WIDTH;
      }
      int height = screenResolution.y * 3 / 4;
      if (height < MIN_FRAME_HEIGHT) {
        height = MIN_FRAME_HEIGHT;
      } else if (height > MAX_FRAME_HEIGHT) {
        height = MAX_FRAME_HEIGHT;
      }
      int leftOffset = (screenResolution.x - width) / 2;
      int topOffset = (screenResolution.y - height) / 2;
      framingRect = new Rect(leftOffset, topOffset, leftOffset + width, topOffset + height);
      Log.d(TAG, "Calculated framing rect: " + framingRect);
    }
    return framingRect;
  }

  /**
   * Like {@link #getFramingRect} but coordinates are in terms of the preview frame,
   * not UI / screen.
   */
  public Rect getFramingRectInPreview() {
    if (framingRectInPreview == null) {
      Rect rect = new Rect(getFramingRect());
      Point cameraResolution = configManager.getCameraResolution();
      Point screenResolution = configManager.getScreenResolution();
      //modify here: map screen coordinates straight onto the preview frame (landscape).
      rect.left = rect.left * cameraResolution.x / screenResolution.x;
      rect.right = rect.right * cameraResolution.x / screenResolution.x;
      rect.top = rect.top * cameraResolution.y / screenResolution.y;
      rect.bottom = rect.bottom * cameraResolution.y / screenResolution.y;
      //The portrait demo swaps the axes instead:
      //rect.left = rect.left * cameraResolution.y / screenResolution.x;
      //rect.right = rect.right * cameraResolution.y / screenResolution.x;
      //rect.top = rect.top * cameraResolution.x / screenResolution.y;
      //rect.bottom = rect.bottom * cameraResolution.x / screenResolution.y;
      framingRectInPreview = rect;
    }
    return framingRectInPreview;
  }

  /**
   * Converts the result points from still resolution coordinates to screen coordinates.
   *
   * @param points The points returned by the Reader subclass through Result.getResultPoints().
   * @return An array of Points scaled to the size of the framing rect and offset appropriately
   *         so they can be drawn in screen coordinates.
   */
  /*
  public Point[] convertResultPoints(ResultPoint[] points) {
    Rect frame = getFramingRectInPreview();
    int count = points.length;
    Point[] output = new Point[count];
    for (int x = 0; x < count; x++) {
      output[x] = new Point();
      output[x].x = frame.left + (int) (points[x].getX() + 0.5f);
      output[x].y = frame.top + (int) (points[x].getY() + 0.5f);
    }
    return output;
  }
   */

  /**
   * A factory method to build the appropriate LuminanceSource object based on the format
   * of the preview buffers, as described by Camera.Parameters.
   *
   * @param data A preview frame.
   * @param width The width of the image.
   * @param height The height of the image.
   * @return A PlanarYUVLuminanceSource instance.
   */
  public PlanarYUVLuminanceSource buildLuminanceSource(byte[] data, int width, int height) {
    Rect rect = getFramingRectInPreview();
    int previewFormat = configManager.getPreviewFormat();
    String previewFormatString = configManager.getPreviewFormatString();
    switch (previewFormat) {
      // This is the standard Android format which all devices are REQUIRED to support.
      // In theory, it's the only one we should ever care about.
      case PixelFormat.YCbCr_420_SP:
      // This format has never been seen in the wild, but is compatible as we only care
      // about the Y channel, so allow it.
      case PixelFormat.YCbCr_422_SP:
        return new PlanarYUVLuminanceSource(data, width, height, rect.left, rect.top,
            rect.width(), rect.height());
      default:
        // The Samsung Moment incorrectly uses this variant instead of the 'sp' version.
        // Fortunately, it too has all the Y data up front, so we can read it.
        if ("yuv420p".equals(previewFormatString)) {
          return new PlanarYUVLuminanceSource(data, width, height, rect.left, rect.top,
            rect.width(), rect.height());
        }
    }
    throw new IllegalArgumentException("Unsupported picture format: " +
        previewFormat + '/' + previewFormatString);
  }

	public Context getContext() {
		return context;
	}

}

PlanarYUVLuminanceSource.java

/*
 * Copyright 2009 ZXing authors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.mining.app.zxing.camera;

import com.google.zxing.LuminanceSource;

import android.graphics.Bitmap;

/**
 * This object extends LuminanceSource around an array of YUV data returned from the camera driver,
 * with the option to crop to a rectangle within the full data. This can be used to exclude
 * superfluous pixels around the perimeter and speed up decoding.
 *
 * It works for any pixel format where the Y channel is planar and appears first, including
 * YCbCr_420_SP and YCbCr_422_SP.
 *
 * @author dswitkin@google.com (Daniel Switkin)
 */
public final class PlanarYUVLuminanceSource extends LuminanceSource {
  private final byte[] yuvData;
  private final int dataWidth;
  private final int dataHeight;
  private final int left;
  private final int top;

  public PlanarYUVLuminanceSource(byte[] yuvData, int dataWidth, int dataHeight, int left, int top,
      int width, int height) {
    super(width, height);

    if (left + width > dataWidth || top + height > dataHeight) {
      throw new IllegalArgumentException("Crop rectangle does not fit within image data.");
    }

    this.yuvData = yuvData;
    this.dataWidth = dataWidth;
    this.dataHeight = dataHeight;
    this.left = left;
    this.top = top;
  }

  @Override
  public byte[] getRow(int y, byte[] row) {
    if (y < 0 || y >= getHeight()) {
      throw new IllegalArgumentException("Requested row is outside the image: " + y);
    }
    int width = getWidth();
    if (row == null || row.length < width) {
      row = new byte[width];
    }
    int offset = (y + top) * dataWidth + left;
    System.arraycopy(yuvData, offset, row, 0, width);
    return row;
  }

  @Override
  public byte[] getMatrix() {
    int width = getWidth();
    int height = getHeight();

    // If the caller asks for the entire underlying image, save the copy and give them the
    // original data. The docs specifically warn that result.length must be ignored.
    if (width == dataWidth && height == dataHeight) {
      return yuvData;
    }

    int area = width * height;
    byte[] matrix = new byte[area];
    int inputOffset = top * dataWidth + left;

    // If the width matches the full width of the underlying data, perform a single copy.
    if (width == dataWidth) {
      System.arraycopy(yuvData, inputOffset, matrix, 0, area);
      return matrix;
    }

    // Otherwise copy one cropped row at a time.
    byte[] yuv = yuvData;
    for (int y = 0; y < height; y++) {
      int outputOffset = y * width;
      System.arraycopy(yuv, inputOffset, matrix, outputOffset, width);
      inputOffset += dataWidth;
    }
    return matrix;
  }

  @Override
  public boolean isCropSupported() {
    return true;
  }

  public int getDataWidth() {
    return dataWidth;
  }

  public int getDataHeight() {
    return dataHeight;
  }

  public Bitmap renderCroppedGreyscaleBitmap() {
    int width = getWidth();
    int height = getHeight();
    int[] pixels = new int[width * height];
    byte[] yuv = yuvData;
    int inputOffset = top * dataWidth + left;

    for (int y = 0; y < height; y++) {
      int outputOffset = y * width;
      for (int x = 0; x < width; x++) {
        int grey = yuv[inputOffset + x] & 0xff;
        pixels[outputOffset + x] = 0xFF000000 | (grey * 0x00010101);
      }
      inputOffset += dataWidth;
    }

    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
    return bitmap;
  }
}

Solving this series of problems owed a lot to Brother Long's guidance — many thanks!

It's rather late now, so I won't post the original project tonight; I'll share it tomorrow!

If anyone knows a better solution to any of the problems above, please share it so we can learn from each other — Aladdin thanks you in advance!

My Weibo: http://weibo.com/dingxiaowei2013

Project source: http://down.51cto.com/data/1164585
